Note: this post is a continuation of my previous article, The Gambler.
In my last post I wrote about my work on a recently launched live mass-participation game show. Since then the show has had three reasonably successful broadcasts, running 9pm to midnight, Thursday through Saturday. I'm grateful to the forgiving pool of beta testers who excused the earlier technical problems; the recent shows have gone almost entirely without a hitch.
As a hybrid format, the project aims to balance building a social gaming community with extending the achievements of the original television channel, so it's interesting that many of the inevitable teething problems we've encountered have in fact fallen outside the technical scope. They have instead been part of the process of migrating to the new adaptation. The beta test group has grown to over five hundred users, and with that, some unexpected and quite interesting trends have begun to emerge.
Initially, and understandably given the success of the original programme, the focus of the project was the live video content: the presenter-led broadcast with an eccentric compere who would introduce and commentate on the games in progress. This was key; second came his interaction with the players, a communication only made possible by the move to an online platform. That view, held quite innocently, saw the show as a direct port, simply delivered online with a change of viewing medium. What wasn't predicted, however, was how contestants' behaviour would change now that their level of participation has been radically reversed by the re-imagining, perhaps so much so that this priority may need to be reconsidered.
Previously, the only way of answering questions, for example, was a single, individual phone-in at the (literal) cost of the player. Calls could only be taken one at a time, and presumably one could not know, until an answer was declared, how long a single game might last. Now all players have free access and the opportunity to answer every question, so there is no single 'spotlight'. Short of a low-threshold profanity filter, there are no barriers holding back players' voices. As a result, the integrated chat client is furiously active, and an intriguing place. The users are already building the intended community: as much a place of social interaction as of competition, and after the final question of each game the players can be relied upon to begin comparing answers and scores, offering the most sincere congratulations and commiserations.
They are offering each other as much entertainment as, if not more than, the assumed centre of attention: the presenter. In some cases the production team, not grudgingly, succumbs to demands for more immediate and more frequent games. The players, too, are beginning to see a shift in who really controls the show. This confirms all the more the need for our technical system to be stable and reliable: the more demanding online users become, the less compromise they (quite rightly) feel the need to offer in pursuit of their own entertainment.
With most of the bugs and glitches ironed out of the game engine, which now ticks over steadily as it should, the only remaining problem of any substance is the delivery of the live video broadcast. We acknowledged early on that the audio and visual quality would not equal the digital television picture previously broadcast, but we endeavoured to get as near to it as possible. We also had to consider the amount of data transferred to achieve this. Surely our maths must be wrong: with video at 300kbps, audio at a variable 128kbps, for three hours, for three nights, for up to one thousand users... we abandoned the calculations when the totals began needing measurement in terabytes.
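The back-of-the-envelope arithmetic is easy to reproduce. Using the figures above (300kbps video, 128kbps audio, three-hour shows, three nights, a thousand viewers), a quick sketch in Python shows where the totals land:

```python
# Back-of-the-envelope bandwidth estimate for the live stream.
# Figures from the post: 300 kbps video + 128 kbps audio,
# 3-hour broadcasts, 3 nights a week, up to 1,000 viewers.

VIDEO_KBPS = 300
AUDIO_KBPS = 128
HOURS_PER_NIGHT = 3
NIGHTS_PER_WEEK = 3
USERS = 1000

stream_kbps = VIDEO_KBPS + AUDIO_KBPS                 # 428 kbps per viewer
bits_per_user_night = stream_kbps * 1000 * HOURS_PER_NIGHT * 3600
gb_per_user_night = bits_per_user_night / 8 / 1e9     # roughly 0.58 GB
gb_per_night = gb_per_user_night * USERS              # roughly 578 GB for a full house
tb_per_week = gb_per_night * NIGHTS_PER_WEEK / 1000   # roughly 1.7 TB over three nights

print(f"{gb_per_night:.0f} GB per night, {tb_per_week:.1f} TB per week")
```

That is payload only; protocol overhead and any viewers reconnecting mid-show would push the real figure higher.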
Instead we use a specialist content distribution network (CDN), designed specifically to deliver large amounts of rich media to a mass audience. We push to a single URI, essentially another Flash Media Server (the Flash Streaming Server counterpart to the Interactive Media Server), and the stream is bounced around various worldwide locations, to which each client subscribes. We've had intermittent success with Limelight Networks, but this week we will be changing over to Akamai, who claim leadership in streaming media services, hopefully to solve our remaining issues.
The beta phase will continue for a further three weeks, over which time the audience figures should steadily rise. Current projections call for the system to run reliably for up to one thousand users, a figure we are confident is attainable. Scaling beyond that stage, only now beginning to be considered, will need more significant architectural changes. We've discussed using multiple Flash Media Servers for the game engine, whether sharing the load or sharing the connections, or constructing some hierarchy of master and slave servers.
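One of the simpler options for sharing the connections would be to shard clients deterministically across a pool of servers. This is only a sketch of the idea, not anything we have built; the server addresses below are placeholders, and a real deployment would also need health checks and failover:

```python
# Hypothetical sketch: sharding client connections across several
# Flash Media Server instances by hashing a stable user id.
# The pool addresses are illustrative placeholders.
import hashlib

FMS_POOL = [
    "rtmp://fms1.example.com/game",
    "rtmp://fms2.example.com/game",
    "rtmp://fms3.example.com/game",
]

def server_for(user_id: str) -> str:
    """Map a user id to the same server on every connection attempt."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return FMS_POOL[int(digest, 16) % len(FMS_POOL)]
```

Because the mapping is stable, a returning player always lands on the same server, which keeps any per-user state (scores, chat room membership) in one place without a coordinating master.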
Although Adobe's release of Flash Media Server 3 has dramatically reduced the licensing costs, the overheads will mount regardless. With that in mind, I will start to look at running Red5, an open source Flash RTMP server (essentially the open source equivalent of Adobe's FMS), as it becomes more stable and nears its first public release.