³ÉÈË¿ìÊÖ

Archives for August 2012

IRFS Weeknotes #121


Tristan Ferne | 16:10 UK time, Friday, 24 August 2012

This is the 121st issue of weeknotes from R&D's Internet Research & Future Services team where we write about the progress of our projects to give you some insight into what goes on in just one part of R&D (see other posts in this blog and our website for our other work).

First, there's an opening in our team for a senior Ruby engineer with experience of team leadership and knowledge of speech recognition, audio analysis, streaming A/V, machine learning or search technologies. If you're interested, or know someone who might be, then you can apply or get in touch at irfs@bbc.co.uk.

There's lots of activity on Thursday morning when I start writing these notes: there's an engineering team meeting, Sean is hosting a meeting with reps from the EBU and Frontier Silicon, we're meeting the University of Lancaster to talk about their TV testbed, and there's our regular team demo session. Unfortunately the last three events are all at the same time, so the team splits. Though I note that Chris Needham seems to go to all of them. And after all that, Graham drops in to demo his parent-and-child RepRap printers, "humanity's first general-purpose self-replicating manufacturing machine".

Graham's RepRap printer

Read the rest of this entry

ViSTA-TV: Linked Open Data, Statistics and Recommendations for Live TV


Libby Miller | 11:00 UK time, Friday, 24 August 2012

In this post, Libby Miller and Chris Newell introduce the ViSTA-TV project, which has just started.

ViSTA-TV (Video Stream Analytics for Viewers in the TV Industry) is a two-year collaborative research project about linked open data, statistics and recommendations for live TV, involving online TV viewing data, programme metadata and other external sources of data. We are working with three research institutions (University of Zurich, TU Dortmund University, and the VU University Amsterdam) and two companies (Zattoo and Rapid-I) to create:

  • Real-time TV recommendations for viewers
  • Highly accurate low-latency audience research
  • A high-quality, linked open dataset about TV
  • A marketplace for audience metrics

For us, it's an opportunity to find interesting uses for the streams of data produced by iPlayer showing (anonymously) what channels people are watching and when they start and finish watching. Part of the project is to analyse - in aggregate - whether there are observable events within the video, audio or text streams that make people change channel, and to combine this with external features such as people's interests or their actions on social media.

It also gives us the opportunity to learn more about machine learning techniques, particularly data mining over large streams of data. We hope to draw on the expertise of Rapid-I and the Universities of Zurich and Dortmund to get a greater understanding of how much the various forms of data contribute to the quality of the information we get back in return. So, for example, we might find out whether it is worth doing video processing to look for events of interest, or whether subtitles are a better route - or perhaps whether working with both datasets gives us quantifiably better results.
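
To make that kind of stream analysis a little more concrete, here is a minimal sketch (in Python, with pandas) of one way you might compare channel-switching behaviour in the seconds after a subtitle cue mentioning some event of interest. The file names, column names and keyword are illustrative assumptions, not the project's actual data or pipeline.

    # Sketch: do channel switches cluster in the 30s after a subtitle event?
    # The CSV files, their columns and the keyword "goal" are invented for illustration.
    import pandas as pd

    switches = pd.read_csv("switches.csv", parse_dates=["timestamp"])   # timestamp, channel
    subs = pd.read_csv("subtitles.csv", parse_dates=["timestamp"])      # timestamp, channel, text

    events = subs[subs["text"].str.contains("goal", case=False, na=False)].copy()
    window = pd.Timedelta(seconds=30)

    def switches_after(event_time, channel):
        """Count switch events on this channel in the window after the cue."""
        mask = (
            (switches["channel"] == channel)
            & (switches["timestamp"] >= event_time)
            & (switches["timestamp"] < event_time + window)
        )
        return int(mask.sum())

    events["switches_after"] = [
        switches_after(t, c) for t, c in zip(events["timestamp"], events["channel"])
    ]
    print(events[["timestamp", "channel", "switches_after"]].head())

In aggregate, comparing those counts against a baseline switching rate for the same channel would give a first, crude indication of whether a given kind of event is associated with people changing channel.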

A major goal for some of the partners is to produce a marketplace for audience metrics, laying the groundwork for offering characterisations of audiences in real time for sale. ³ÉÈË¿ìÊÖ R&D is interested primarily in the other end of the spectrum - being able to create and use linked open data sources for TV-related data (such as /programmes), and to this end the work of the VU University Amsterdam in the project will be very interesting, as it builds on work begun in the NoTube project on collating and enriching metadata about TV.

The ³ÉÈË¿ìÊÖ's role in the project, like Zattoo's, is a dual one: that of data provider (anonymised streams of behavioural data from iPlayer simulcast streams, subtitles, metadata, video and audio) and application maker. Our goal is not just to find out more about the data but to give audiences something useful back, such as recommendations of what to watch right now, or other applications that make interesting use of the results of the project and enable us to evaluate them. We are very pleased to be able to follow up aspects of the MyMedia and NoTube projects, which in different ways addressed data for recommendations - this project takes those results and pushes them in new directions, towards real-time analytics.

As I mentioned in weeknotes last week, we'll be holding a workshop on possible applications, so if this is something that interests you, let me know and I'll try and get you an invite.

IRFS Weeknotes #120


Libby Miller | 11:17 UK time, Tuesday, 21 August 2012

For much of the team this has been an FI-Content / egBox week. To recap, egBox is a prototyping platform which can play back TV using a DVB-T tuner, overlay a user interface, and also be controlled using an HTTP API. It's currently being developed as part of the FI ('Future Internet') project, to demonstrate authorisation and tagging on TVs. The plan is to demo it at an event in mid-October. Previously in egBox: in a dramatic U-turn a couple of weeks ago, the team decided to move away from the full HTML5 implementation (because the required transcoding meant the machine spec needed to be too high) and, for now, has moved to VLC with a custom WebKit layer over the top...
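
Since egBox exposes an HTTP control API, driving it from another device can be as simple as a few HTTP requests. The endpoints below are purely hypothetical - the post doesn't document the real API - so this is only a sketch of what such a remote-control client might look like in Python.

    # Hypothetical client for an egBox-style HTTP control API.
    # The base URL and endpoint paths are invented for illustration only.
    import requests

    EGBOX = "http://egbox.local:8080"

    def change_channel(service):
        """Ask the box to tune its DVB-T tuner to a service (hypothetical endpoint)."""
        r = requests.post(f"{EGBOX}/channel", json={"service": service}, timeout=5)
        r.raise_for_status()
        return r.json()

    def show_overlay(text):
        """Ask the WebKit UI layer to draw a message over the video (hypothetical endpoint)."""
        r = requests.post(f"{EGBOX}/overlay", json={"text": text}, timeout=5)
        r.raise_for_status()

    if __name__ == "__main__":
        change_channel("³ÉÈË¿ìÊÖ ONE")
        show_overlay("Hello from the web-app remote control")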

Read the rest of this entry

Olympic Diving "Splashometer"


Robert Dawes | 15:05 UK time, Thursday, 16 August 2012

On the morning of Saturday 11th you may have seen ³ÉÈË¿ìÊÖ diving commentator Leon Taylor talking to Mishal Husain about Tom Daley's below-par performance in the men's 10m platform diving preliminary round. During the discussion they played videos of two of Daley's dives with graphics added to describe the size of the splash and the angle of entry.

Graphics overlaid on footage of Tom Daley diving

The system producing this analysis has come out of the biomechanics project, from which we also recently developed the augmented reality athletics tool. The analysis is automatic and works live - the system examines the video frame by frame to extract the diver and splash from the rest of the scene. It then measures the size of the splash and the angle the diver is making as he or she enters the water and displays the results to the viewer.
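
As a rough illustration of how frame-by-frame analysis like this can work, the sketch below (Python with OpenCV) separates moving foreground from a static background, takes the largest moving blob as the diver-plus-splash region, measures its area as a crude splash metric, and fits a line to it to approximate the entry angle. It is a simplified stand-in, not the actual system described in this post, and "dive.mp4" is a placeholder input.

    # Simplified splash / entry-angle estimation; not the ³ÉÈË¿ìÊÖ R&D system itself.
    import math
    import cv2

    cap = cv2.VideoCapture("dive.mp4")          # placeholder input clip
    bg = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fg = bg.apply(frame)                    # moving pixels: diver and splash
        fg = cv2.medianBlur(fg, 5)
        contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        blob = max(contours, key=cv2.contourArea)
        splash_area = cv2.contourArea(blob)     # crude "splashometer" measure

        # Fit a straight line through the blob to approximate the body angle
        # relative to vertical as the diver enters the water.
        vx, vy, _, _ = cv2.fitLine(blob, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
        angle = abs(math.degrees(math.atan2(vx, vy)))
        print(f"splash area: {splash_area:.0f} px, entry angle: {angle:.1f} deg from vertical")

    cap.release()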

Read the rest of this entry

Sibyl Recommender Prototype


Chris Newell | 10:50 UK time, Thursday, 16 August 2012

There are so many programmes to choose from these days that it can be hard to decide what to watch. Personalised recommender systems can help, but they usually require knowledge of your previous viewing history or preferences. Unfortunately, some users may be new to our services, may not log in, or may not want their viewing history to be tracked. For these reasons we have been exploring the concept of a "standalone recommender" that does not depend on historical viewing records.
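
The post doesn't go into how Sibyl itself works, but one common way to recommend without any user history is content-based similarity over programme metadata. The sketch below (Python with scikit-learn) illustrates that general idea - not the Sibyl implementation - and the tiny set of synopses is made up.

    # Content-based "more like this" with no viewing history; illustrative data only.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    programmes = {
        "The Sky at Night": "astronomy programme exploring the night sky and space",
        "Horizon": "science documentary covering physics, space and biology",
        "Springwatch": "natural history series following British wildlife in spring",
        "Match of the Day": "football highlights and analysis from the day's matches",
    }

    titles = list(programmes)
    tfidf = TfidfVectorizer(stop_words="english")
    similarity = cosine_similarity(tfidf.fit_transform(programmes.values()))

    def more_like_this(title, n=2):
        i = titles.index(title)
        ranked = similarity[i].argsort()[::-1]           # most similar first
        return [titles[j] for j in ranked if j != i][:n]

    print(more_like_this("Horizon"))                     # likely ['The Sky at Night', ...]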

Read the rest of this entry

IRFS Weeknotes #119


Olivier Thereaux | 16:10 UK time, Friday, 10 August 2012

It may be summer, and a major sports competition may have a few of us glued to several live streams at a time (that's our new four-screens strategy), but projects have been humming along nicely for this 119th instalment of our weeknotes.

³ÉÈË¿ìÊÖ office

James' new home office setup suffers from surprisingly poor internet connectivity.

The project team continues looking at TV authentication: Chris Needham has been building an OAuth web application and working with James on some UML sequence diagrams to model the "Authorization Code Grant" flow. Meanwhile, Barbara happily reports that the initial UI for the smart app has been reviewed and that implementation is starting next week. Recruitment for the ethnographic study is also making progress.
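
For anyone unfamiliar with the flow being modelled, the Authorization Code Grant is the OAuth 2.0 flow in which the user approves access at the authorisation server and the client then exchanges a short-lived code for an access token. Below is a bare-bones, client-side sketch in Python; the URLs and credentials are placeholders and this is not the team's prototype.

    # Minimal OAuth 2.0 Authorization Code Grant, client side (placeholder values).
    from urllib.parse import urlencode
    import requests

    AUTHORIZE_URL = "https://auth.example.org/oauth/authorize"   # placeholder
    TOKEN_URL = "https://auth.example.org/oauth/token"           # placeholder
    CLIENT_ID = "my-tv-app"
    CLIENT_SECRET = "not-a-real-secret"
    REDIRECT_URI = "https://tv-app.example.org/callback"

    # Step 1: send the user to the authorisation server to approve access.
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "profile",
        "state": "random-anti-csrf-value",
    }
    print("Visit:", AUTHORIZE_URL + "?" + urlencode(params))

    # Step 2: the server redirects back to REDIRECT_URI with ?code=...
    # and the client exchanges that code for an access token.
    def exchange_code(code):
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        }, timeout=10)
        resp.raise_for_status()
        return resp.json()["access_token"]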

Libby and Chris Newell are still working on the foundations of the ViSTA-TV project. Libby has been exploring the viewing data - merging it by series and brand, experimenting with different partitions and generating graphs to try and learn more about our month's worth of anonymised viewing data. She's also been installing our TV-cum-HTML5 prototype egBox to see if it could be used in the ViSTA-TV project.
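
For a flavour of what that kind of merging and partitioning can look like, here is a small pandas sketch; the file and column names are assumptions and the data is not the real anonymised iPlayer stream.

    # Illustrative aggregation of anonymised viewing records by brand and series.
    # The CSV file and its column names are hypothetical.
    import pandas as pd

    views = pd.read_csv("viewing_events.csv", parse_dates=["start", "end"])
    # assumed columns: session_id, episode_pid, series_pid, brand_pid, start, end

    views["duration_min"] = (views["end"] - views["start"]).dt.total_seconds() / 60

    # Partition one way by brand, another by series, and compare the pictures.
    by_brand = views.groupby("brand_pid")["duration_min"].agg(["count", "mean", "sum"])
    by_series = views.groupby("series_pid")["duration_min"].agg(["count", "mean", "sum"])

    print(by_brand.sort_values("sum", ascending=False).head(10))
    print(by_series.sort_values("sum", ascending=False).head(10))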

The ABC-IP World Service / automated tagging project is still running its user trial. Chris Lowis has been working on an interface to allow listeners to improve the automated tags in the prototype, as well as a few other tweaks and bug fixes. Andrew N did a bit of work on a more advanced speaker segmentation interface for the ABC World Service prototype and spent the rest of the week getting to grips with a framework for creating JavaScript web apps, which he will be using for the web-app remote control for egBox.

Cards everywhere

Our show and tell meeting this week was fairly quiet, but we did get a visitor on Monday, who came to talk about his experiments with pop-up DAB radio with crowd-based DJing, and his work creating radio cases with kits.

As for me, it's been a typically quiet summer period on the W3C front, including for the working group I chair... I'm looking forward to seeing activity pick up again!

In the meantime, I have been collaborating with Theo, Michael and Sean on an update to our R&D website, meeting people, sorting keywords and generally invading a significant portion of our office with cards, stickies and printouts. Not quite ready for the paperless office yet.

Little Sun - ³ÉÈË¿ìÊÖ R&D create participatory art experience at Tate Modern


Matthew Brooks | 14:00 UK time, Wednesday, 8 August 2012


³ÉÈË¿ìÊÖ R&D have joined forces with Studio Olafur Eliasson to create a participatory art experience at the Tate Modern. In 18 days, we created an installation and launched a website, bringing thousands of artworks created by the public to anyone with a modern web browser. In this blog post, I'll give you a rundown of how we managed to do this in just under three weeks.

Olafur Eliasson's recent project is concerned with bringing light to the 1.6 billion people in the world who don't have access to electricity. To engage people with the importance of access to light, Olafur wanted to create an installation in which people could create light paintings. Not only that, he wanted all of those light paintings to create a sun that could be explored on the internet. There would be a blacked-out room at the Tate in which to create the light graffiti, a live feed of the creation process, and an online digital sun allowing the light graffiti to be explored. Olafur explains his concept much more eloquently than I do.

Our Chief Scientist Brandon Butterworth had been in conversation with Olafur about the project for some time. Olafur had created The Weather Project in the Tate Modern's Turbine Hall in 2003, an artwork I remember being fascinated by. When Brandon sent an email around asking for volunteers to sign up for a project involving Olafur, I jumped at it, as did several other brave and foolhardy R&D engineers.

Read the rest of this entry

IRFS Weeknotes #118


Pete Warren | 15:17 UK time, Tuesday, 7 August 2012

A bumper R&D IRFS weeknotes this Olympic™ week, as the activities from the past two weeks are condensed into a single entry, a bit like two steaming meaty patties crammed into a single delicious burger.

Image of the ³ÉÈË¿ìÊÖ studio at the Olympic Park

NASA Curiosity touches down on Mars... oh hang on. No, it's the ³ÉÈË¿ìÊÖ studio inside the Olympic Park

Read the rest of this entry

The Olympics in Super Hi-Vision


John Zubrzycki | 13:00 UK time, Wednesday, 1 August 2012

The Broadcasting House Radio Theatre with Super Hi-Vision installed for the Olympics

I sat with a crowd of people watching the Olympic Opening Ceremony last Friday evening. They were having a very good time, clapping and cheering throughout the evening. We all had the best seats in the stadium, but we weren't in the Olympic Stadium - we were four miles away in ³ÉÈË¿ìÊÖ Broadcasting House. Other groups of people were in Bradford, Glasgow, the USA and Japan. Why the fuss? TV has been doing this for many years. The reason is that we were viewing using an Ultra-HD system called Super Hi-Vision (SHV), developed by NHK (the Japanese national broadcaster). SHV has sixteen times as many pixels as HDTV, making a picture 7680 pixels across by 4320 pixels down. It was displayed on an 8-metre wide screen, accompanied by a 22.2 multichannel 3-dimensional sound system. The combined effect was to transport the people in the viewing theatre right into the stadium - telepresence comes of age.
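
The "sixteen times" figure follows directly from the two resolutions, as this quick check shows:

    # Super Hi-Vision versus HDTV pixel counts
    shv = 7680 * 4320     # 33,177,600 pixels
    hd = 1920 * 1080      #  2,073,600 pixels
    print(shv / hd)       # 16.0 - sixteen times as many pixels as HDTV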

Read the rest of this entry
