Big data is a big trend in 2013, but what does it mean for the TV industry and content creators? That was the question explored in a MIPCube panel session this afternoon.
The panelists: Jeroen Doucet, Managing Director, ExMachina Strategies & Concepts; Marc Hernandez, Managing Director Europe, Trendrr; and Owen Hanks, General Manager Europe, YuMe. The panel was moderated by Mike Dicks.
Doucet started off by talking about his company’s work on second-screen strategies for broadcasters. “We’re seeing that data is becoming an increasingly big part of any second-screen or social TV strategy,” he said.
The key challenge: big data in the second-screen world isn’t just about measuring viewers of a TV show; it’s just as much about measuring their interactions on Twitter, Facebook and other services.
“We increasingly see that this data is crucial in monetising whatever you’re doing on your second screen, and over time we believe it will also be increasingly important for monetising what you’re doing on your first screen,” he said. Which is why broadcasters need a data strategy: defining beforehand what data they want to collect, and why.
He talked about a couple of case studies. First, an elections show for Dutch broadcaster RTL, where 130,000 people voted on who should get the last 30 minutes of a TV debate. “The numbers we were bringing in were more accurate than any of the [other] polls,” he said.
The second case study was talent show The Voice, with a second-screen smartphone app used by 650,000 people in the Netherlands; the app is now rolling out to other territories where the show airs.
Over to Trendrr’s Marc Hernandez, who showed off the company’s dashboard of data from social networks on popular TV shows. It works with Twitter, Facebook, GetGlue and other second-screen apps. “We can really do the clear measurements on who is engaging with the show, and how they are.”
Why is this important? “We are now able to deliver exact metrics on what is happening on social media regarding a show, a character, a celebrity. We can show what people are thinking minute by minute… and this is of course the first step to transfer this data into revenue.”
YuMe is a digital video advertising company whose PQI – Placement Quality Index – helps brands buy “smarter” ads around digital video by giving any piece of content a 0-10 score for its performance.
“We were telling some of the world’s largest publishers that they were not performing very well from a brand’s efficiency and effectiveness viewpoint,” said Hanks, before suggesting that the PQI idea has since caught on.
Pitches completed, the panelists sat down to discuss the issues around big data. Is it too fragmented, Dicks wondered? Hanks talked about the need to measure audiences as they disperse across different devices, while Hernandez said brands need clear metrics on which to base their buying decisions.
“The people who are buying, putting money on the second screen around social TV, they need to get something clear for their return on investment. This is the next step,” he said.
Are we monitoring the noise around a TV show? “Noise is already in the past,” said Hernandez, noting that it’s not enough to say there were three million tweets around a specific show. It’s important to dig into sentiment (were they positive or negative?) and engagement (how many people were tweeting, and how often?), among other metrics.
Doucet said that advertisers don’t need reach – they want people to like their brand more and buy more products. “Reach isn’t an end: it’s a means.”
Hanks disagreed, saying that brands do still need reach, although Doucet said that reach can only be judged by the results that it yields.
To finish, the panelists offered their advice for MIPCube attendees. “You need to decide whether you actually need the data,” said Hanks. “Not collating data just to send the wrong message out to the wrong audience.”