
Archives for October 2011

Prototyping Weeknotes #83


Tristan Ferne | 10:06 UK time, Monday, 31 October 2011

Our team's quite big now and these weeknotes have been getting quite long, so here are just some snippets from this week...

Duncan got a custom scheduler app using the API up and running on our servers and has started firing HTTP requests to this so we can send notifications.
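
Not Duncan's actual code, but as a rough illustration of the pattern – a scheduled job firing an HTTP request at a notification service – something like the following Python sketch captures the idea. The endpoint URL and payload fields are purely hypothetical.

    # A minimal sketch of a scheduler job notifying a service over HTTP.
    # The endpoint URL and the JSON payload fields are hypothetical.
    import requests

    NOTIFY_URL = "https://example.invalid/notifications"  # placeholder endpoint

    def send_notification(programme_id, message):
        """POST a simple JSON notification and fail loudly if it is rejected."""
        response = requests.post(
            NOTIFY_URL,
            json={"programme": programme_id, "message": message},
            timeout=5,
        )
        response.raise_for_status()

    if __name__ == "__main__":
        send_notification("some-programme-id", "Your programme starts in five minutes")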
Pete's been working with Laura to consolidate the UX thinking around the Programme List alerts and wireframe some basic preferences options.

Sean was in Prague on Monday and Tuesday this week. One theme that emerged was that streaming protocol authors have recognised the need to include timestamp information in-band for synchronisation.

Andrew dusted off the Autumnwatch second-screen content to see how we can repurpose it for the LIMO project.

Read the rest of this entry

Prototyping Weeknotes #82


Barbara Zambrini | 09:50 UK time, Monday, 24 October 2011

It has already been over a month since Barbara joined the team, and she feels ready to give writing the weeknotes a go – with an Italian flavour ;-) This week everybody has been really busy while munching on the sweets Joanne brought back from her trip to Mexico – apparently outdoor spaces there have free Wi-Fi and electric socket points. At our R&D North Lab, Vicky, Penny and Max had some interesting discussions on background research for an audio recognition interface.

Read the rest of this entry

The inaugural Networked Electronic Media Art and Design Contest


Rowena Goldman | 13:02 UK time, Tuesday, 18 October 2011

The inaugural Networked Electronic Media Art and Design Contest, held in Barcelona last year, yielded five joint winners, one of which was BBC R&D's very own collaborative installation "The Cut Up".

Read the rest of this entry

Dual Screen Experiences: the Secret Fortune Pilot and the Sync API


Steve Jolly | 10:30 UK time, Tuesday, 18 October 2011

Our team here at BBC R&D has spent the last couple of years working on ways to take advantage of the increasing number of smartphones, tablets and laptops in the homes of BBC audience members to enhance our TV and radio programmes and the way people interact with them. I've blogged previously about this area of work, including the Universal Control API, and the ways in which it could replace remote controls and enable what my colleague Jerry Kramskoy has called "orchestrated media" experiences, which consist of media presented on multiple devices that are synchronised to one another.

In that earlier blog post, I mentioned that technologies already exist that are being used to synchronise media on mobile devices to television programmes, such as audio watermarking, and delivering synchronisation information via the Internet. The advantage of all these solutions is that they require no modifications to the set-top box or television. A common disadvantage is that content on other devices can only follow what happens on the television, and not vice versa. In the longer term, we believe that a technology like Universal Control offers very significant advantages in this regard, but we recently took advantage of an opportunity to work with colleagues from across the BBC to investigate some of these existing methods of synchronisation, to see what kinds of "dual screen" experience might be possible today.
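
To make that one-way limitation concrete, here is a rough Python sketch (not any of our actual implementations) of a companion app that simply follows a timing signal published over the Internet. The feed URL and the JSON field names are assumptions for illustration only.

    # Illustrative only: a companion app polling an Internet-delivered timing
    # feed and following the broadcast. The feed URL and field names are invented.
    import time
    import requests

    SYNC_FEED = "https://example.invalid/sync/some-programme"  # hypothetical feed

    def follow_broadcast(on_position):
        """Poll the timing feed and pass each reported playback position to a callback."""
        while True:
            timing = requests.get(SYNC_FEED, timeout=5).json()
            on_position(timing["position_seconds"])  # the companion follows the TV...
            time.sleep(timing.get("poll_interval", 2))
            # ...but nothing here lets the companion move the broadcast:
            # the synchronisation is strictly one-way.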

Read the rest of this entry

Prototyping Weeknotes #81


Chris Godbert | 11:59 UK time, Monday, 17 October 2011

We put our engineering house in order with the housekeeping work last week, and everyone's now back on project work. We kicked off some new projects this week, including our final development iteration on the and some research around a notifications framework. Laura, a design trainee, joined the team this week and will be with us until the end of October, working across a number of our projects. The world also sadly said goodbye to Dennis Ritchie, the co-creator of both the C programming language *and* UNIX, who died on Wednesday.

Read the rest of this entry

The Carbon Footprint of Watching Television


Jigna Chandaria | 11:10 UK time, Tuesday, 11 October 2011

In the Green Technology project here at BBC R&D we’re looking at the environmental impact of media technology and how it could be reduced. In order to get an overview of where the biggest impacts are, and so where to focus our efforts, we recently did some work to estimate the carbon footprint of the end-to-end television chain from a programme’s production all the way through to watching it at home.

Read the rest of this entry

Multimedia Classification


Sam Davies | 07:45 UK time, Tuesday, 11 October 2011

As John Zubrzycki mentioned yesterday, this project, running as part of BBC R&D’s Archive Research Section, is developing new ways in which to open up the BBC’s archive to the public. The aim of the project is to allow people to search and browse the archive to find content that they want to watch or listen to, but didn’t know existed.

Read the rest of this entry

Prototyping Weeknotes #80


George Wright | 17:01 UK time, Monday, 10 October 2011

As part of our housekeeping week, where we fix or tweak areas of our code and deployment infrastructure, Andrew and Dan have been creating a baseline virtual machine that we can use in future projects. This involved investigating loads of oddly named technologies such as Chef, Knife and Puppet. They managed to get a series of recipes that can be used to build machines locally or in production, using a streamlined method for deploying applications via Git post-receive hooks, so setting up a local development version of a project should take just two commands. As part of this work, Duncan created and documented a multi-core Solr instance, and continued work on the web interface to our internal Git repository.
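
For readers unfamiliar with the technique: a Git post-receive hook is just a script that the server-side repository runs after a push, reading one "old-sha new-sha refname" line per updated ref from stdin. A deployment hook along these lines might look like the Python sketch below – this is not Andrew and Dan's actual setup, and the work-tree path and restart command are invented for illustration.

    #!/usr/bin/env python
    # Sketch of a server-side Git post-receive hook that deploys on push.
    # Git feeds "old-sha new-sha refname" lines on stdin, one per updated ref.
    import subprocess
    import sys

    WORK_TREE = "/srv/app"                               # hypothetical deployment checkout
    RESTART_CMD = ["sudo", "service", "app", "restart"]  # hypothetical restart command

    def main():
        for line in sys.stdin:
            old_sha, new_sha, refname = line.split()
            if refname == "refs/heads/master":           # only deploy pushes to master
                subprocess.check_call(
                    ["git", "--work-tree", WORK_TREE, "checkout", "-f", new_sha])
                subprocess.check_call(RESTART_CMD)

    if __name__ == "__main__":
        main()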

Read the rest of this entry

BBC R&D, the BBC Archive and digital public space: an overview of our work on the archive from preservation to multimedia classifications


John Zubrzycki | 11:51 UK time, Monday, 10 October 2011

The BBC has about a million hours of video and audio content, plus a wealth of documents, including the original scripts. Most of this content is still on magnetic tape, film, records or paper and so needs to be digitised and made searchable before it can be contributed to the Digital Public Space, which was the subject of a recent article from the Guardian. BBC R&D has a long track record of developing innovative technology for the BBC’s archives, including the Ingex digitisation process for D3 videotape [BBC R&D White Paper WHP 155], and Reverse Standards Conversion, which reverse-engineers the processes applied by the pioneering standards converters of the 1960s to programmes of that era provided to broadcasters abroad and lost from our own archives. Research is continuing to extend the digitisation process to other types of videotape and to develop automated methods to detect picture and sound faults, to ensure good quality digitisation and to assist restoration.

Read the rest of this entry

Orchestrated Media and genres


Jerry Kramskoy | 13:30 UK time, Friday, 7 October 2011

Last week, we looked at the background to Orchestrated Media (OM) and brought you up to date with the work that R&D has been doing along these lines. Here we look at how OM may enhance the experiences around various programme genres. As you can imagine, this extends well beyond technology considerations ...

The OM team looks at the technology and enablers required to create OM experiences, discusses possible editorial propositions with colleagues beyond R&D, and considers the future technology landscape and market activities and how they may affect OM.

Accessibility research has stimulated a lot of thought around OM experiences. The OM team worked very closely with the Accessibility team to create the Universal Control API, initially for DTV accessibility, which we then extended for synchronising media across the TV and companion devices, and for interactivity.

Synchronising content across broadcast and IP is a big topic in its own right, which again various colleagues in R&D have been considering for a while, and which the OM team is continuing, for example using IP-based events or audio watermarking.

Other teams in R&D put together prototypes, such as Autumnwatch, to better understand the possible experiences and to consider their user experience and behavioural aspects. The OM team piggy-backed on this Autumnwatch work, adding in Universal Control to enable a two-way TV and companion device experience.

R&D have also worked with the Royal College of Art on media services for the home.

Beyond R&D, other colleagues are considering how to design editorial propositions around dual-screen experiences.


Playing with time

Some set-top boxes support the ability to play back content at various speeds, forwards or backwards (including speed 0, i.e. paused). The content may be VoD (pay-per-view and catch-up) or off-air content that has been recorded locally for watching later (time-shifted).

The box may support similar functionality for content delivered via the currently tuned-in channel (pause and rewind, and fast-forward until it tries to exceed the current point in the broadcast).

This capability raises interesting questions for OM experiences, both technical and editorial. For example, if a game is synced to the TV and the viewer rewinds, should the companion device follow or not? You don't want to re-present an already answered question. The designer of the experience needs to decide what makes sense, and the system would need configuring as to how it reacts. If there is an intimate link between the TV and the companion device, whereby the companion knows at all times exactly where the TV content's play head currently is, then more control over the experience is possible. But if the synchronisation is based on audio watermarking, for example, then decisions are needed on how to handle this technically ... sit and wait for the next expected event? Monitor for previous events?

The way this is handled will be genre-specific, both technically and editorially.

If the companion device wishes to control the TV's rewind and so on, for experiences that use symmetric synchronisation, then the TV must expose this functionality somehow. In R&D we use our Universal Control API for this purpose.
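
As a purely illustrative sketch of the rewind question above (not how any of our prototypes actually work), a companion application could keep track of the last programme timecode it detected – whether from a watermark event or a reported play-head position – and apply a configurable policy when the timeline jumps backwards. The class and policy names below are invented for the example.

    # Illustration only: reacting to a rewind detected from programme timecodes.

    class CompanionSync:
        """Tracks the TV timeline and applies a policy when the viewer rewinds."""

        def __init__(self, policy="wait_for_next"):
            self.policy = policy            # "wait_for_next" or "replay_previous"
            self.last_timecode = None
            self.shown = set()              # quiz questions already presented

        def on_timecode(self, timecode, question_times):
            """question_times maps a question id to the timecode it appears at."""
            if self.last_timecode is not None and timecode < self.last_timecode:
                # The viewer has rewound the TV content.
                if self.policy == "replay_previous":
                    for qid, t in question_times.items():
                        if timecode <= t <= self.last_timecode:
                            self.shown.discard(qid)   # allow it to be presented again
                # With "wait_for_next" we do nothing: already-answered questions
                # stay hidden until playback reaches a question not yet shown.
            self.last_timecode = timecode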

Applicability of OM to genres

The BBC's charter is to entertain, educate and inform through its content, to bring public value.

Games are an obvious form of social entertainment ... a TV content-aware game can synchronise its own content, and the interactions it offers, with game segments in the TV or radio show. As the show unfolds, you play against the studio audience and your family and friends. This requires asymmetric synchronisation: the game content on a companion device slavishly follows the TV programme. We refer to this sort of OM experience as a dual-screen experience. It's very powerful, and synchronisation can be achieved without additional software on the TV. If the TV content is played, the content-aware game follows. The current Secret Fortune pilot, which we discussed in last week's OM blog, is an example of this.

Live sports programmes are another great form of social entertainment, where synchronised content can really enhance the experience, such as offering statistics about players, athletes and teams during the event, and offering the ability to "video-mark" goals and so on for sharing and later replay via social OM ... the latter would require symmetric synchronisation.


Symmetric synchronisation opens up many more opportunities, especially for educating and informing. If you think about these activities, they typically involve your own (and maybe your friends') deeper inquiry into the subject matter, be that news, wildlife, science, music and so on. There are often mental pauses, rewinds and diversions to consult related topics. Hence, if you could control the TV content that stimulates these activities in an analogous manner, this may enhance the TV experience and provide more value to you.

Symmetric synchronisation enables this to affect the TV's actions. With suitable TV content-aware services on your companion device and control software on the TV, you can deeply explore topics related to the TV content, back and forth in time and in a semantically deeper or shallower manner. The TV content follows the implied navigation back and forth, and the TV displays other content selections for the wider exploration. Imagine if the BBC's archive were fully digitised and made available for search and selection (sadly, content rights stand in the way of this happening in its entirety, not to mention the sheer labour involved in the process).
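
By way of illustration only – this is not the real Universal Control API, and the box address, resource path and message format below are invented – the companion side of such a symmetric experience might reduce to sending the TV a "move your play head" request:

    # Hypothetical sketch: a companion app asking the TV to reposition playback.
    # The address, path and JSON body are placeholders, not a real API.
    import requests

    TV_BOX = "http://192.168.1.50:8080"       # placeholder set-top box address

    def jump_tv_to(offset_seconds):
        """Ask the TV to move its play head to the moment the viewer is exploring."""
        response = requests.put(
            TV_BOX + "/playback/position",    # placeholder resource
            json={"offset_seconds": offset_seconds},
            timeout=5,
        )
        response.raise_for_status()

    # e.g. tapping an item about a topic first mentioned 12.5 minutes into the show:
    # jump_tv_to(750.0)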

The challenge here is one of standardisation: the missing functions need to be built into TV and radio devices, and simple access to those functions is needed to make life easy for developers working with content creators.

Social OM

Linear TV creates a strong social experience. People enjoy sharing the viewing experience. Live sport really engages friends and family in a big way. But the other genres mentioned above also have the capability to support shared educating and informing.

If there is social chatter surrounding a show (or segments of it), then the TV content can be repositioned back and forth. The companion's TV content-aware service, linked with social media, can filter that chatter down to discussions around the segment of the moment on the TV. This could be a dual-screen experience. For example, after a holiday, you could watch catch-up TV and automatically follow what your friends had to say about it on your mobile.

With symmetric synchronisation in the equation, we can do the opposite ... embed content markers into the social chatter, so that replaying the chatter makes the TV replay the associated content.
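
A minimal sketch of that idea, under the assumption that each piece of chatter is stored with the programme timecode it refers to (the data shape and the callbacks below are invented for illustration):

    # Illustration: replaying time-marked social chatter drives the TV back to
    # the matching moments. "seek_tv" and "show_message" are hypothetical callbacks.

    def replay_chatter(messages, seek_tv, show_message):
        """messages: iterable of dicts like {"text": "...", "timecode": seconds}."""
        for msg in sorted(messages, key=lambda m: m["timecode"]):
            seek_tv(msg["timecode"])     # symmetric sync: the TV follows the chatter
            show_message(msg["text"])    # while the companion shows what was said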

Inter-device application interactions

An area of future exploration lies in understanding the opportunities around TV apps and (remote) companion device apps interacting with each other. This enables family and social interactions, potentially aggregating the results on the TV, for example individual scores for a family game played against a broadcast game show.

Next blog

We'll take a look at the current state of in-home digital services, and in particular the problems of device and service interoperability.

Women in Technology


Becky Gregory-Clarke | 09:35 UK time, Wednesday, 5 October 2011

A few weeks ago, R&D took part in FM’s first Women’s Networking Event, held at MediaCityUK for women working in technology. The event had demos from various techie departments of the Beeb, some talks from women in senior positions in FM such as Victoria Jaye (Head of IPTV and TV Online) and Mary McCarthy (Executive Product Manager for Core Services), as well as talks from Daniel Danker (General Manager for On Demand) and Phil Fearnley (General Manager for News and Knowledge). It was attended by around 55 women in the industry from outside the BBC.

The R&D Engineers attending the Women in Technology Event


Flying the flag for R&D were Penny Allen, Liz Valentine, Maxine Glancy and myself, plus some help from Yameen Rasul and Ian Forrester as the token R&D males. We took along Daleketta, our hugely popular talking Dalek, who put on a very good show indeed demonstrating some of our Universal Control research. We got to chat to some extremely nice and interesting women in the industry, who seemed very encouraged not only by our research, but also to know that there was a relatively strong female presence within FM. We could always do with more though, so perhaps this event might encourage a few more to join us.

All in all it was a very nicely put-together event, which I hope will run again in the future. Maybe next time we can take a bigger Dalek...


R&D's Dalek demo
