In Australia we have the National Broadband Network. In Kansas, they have Google Fiber. These services deliver super-fast broadband to the masses at very reasonable prices. With these services now within reach of some media organisations, I felt it was time to put on the thinking cap and work out what to actually do with speeds in excess of 100Mbps. I think there is a lot more potential here than just buffer-free YouTube clips.
Here are five possibilities for media organisations with a super-fast broadband service:
Cloud-based Transcoding

If you’re dealing with delivery of video content to various platforms, you would know all about the different formats you need to supply in order to reach the widest variety of devices. Transcoding to all of these formats is CPU-intensive, and if you do it a lot then you probably have a computer dedicated to it.
Having ready access to ample bandwidth removes the need to perform CPU-intensive activities within your own four walls. Why not outsource your transcoding to someone else and take advantage of their economies of scale? Zencoder makes this possible starting at $0.05 per minute of video. Amazon’s Elastic Transcoder does the same for $0.03 per minute (excluding transfer and storage).
A lot of these providers also offer API access, so there is the potential to automate your workflow from the point of creating your master file to the point of publishing it on the web.
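As a back-of-the-envelope check on those rates, a few lines of Python make the comparison concrete. The per-minute figures are the ones quoted above; the function name is my own, and transfer and storage costs are ignored:

```python
# Rates quoted above: Zencoder $0.05/min, Amazon Elastic Transcoder $0.03/min
# (excluding transfer and storage). Function name is illustrative only.

RATES_PER_MINUTE = {
    "zencoder": 0.05,
    "elastic_transcoder": 0.03,
}

def transcode_cost(minutes, provider):
    """Estimated cost in dollars to transcode `minutes` of video."""
    return round(minutes * RATES_PER_MINUTE[provider], 2)

# A one-hour program rendered into three output formats:
for provider in RATES_PER_MINUTE:
    print(provider, transcode_cost(60 * 3, provider))
```

Even a modest output, multiplied across formats and episodes, adds up quickly, which is exactly where the providers' economies of scale pay off.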
Cloud-based Playout System
Playout systems have traditionally required large file servers, database servers, fancy sound cards, and dedicated workstations. Imagine if you could do away with all of this while still maintaining the same level of reliability and functionality.
As far as I can tell, such a system doesn’t exist yet. Sourcefabric’s Airtime comes close, but it isn’t yet geared towards professional broadcasters: it lacks a live playout interface, focusing instead on pre-scheduled content playback.
Still, the emergence of super-fast broadband opens up the possibility for such a system to be developed and actually be commercially viable. Doing so would dramatically simplify the broadcast IT environment and also introduce an enormous level of flexibility.
Every studio would be controlled by a slick web-based interface, making remote control easy. Disaster recovery would be far simpler. Outside broadcasts could be run the same way as in-studio broadcasts. The on-site support burden could be reduced. Material could be archived indefinitely without juggling file storage requirements. Patching servers wouldn’t be a hassle, because you wouldn’t have any servers to patch.
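To make the pre-scheduled side of such a system concrete, here is a minimal sketch of the core scheduling question any playout engine must answer: given a list of (start time, item) entries, what should be on air right now? All names and the schedule itself are hypothetical:

```python
from datetime import datetime

# Minimal pre-scheduled playout lookup (all names hypothetical):
# a schedule is a list of (start_time, item) pairs; each item plays
# until the next one starts.

def on_air(schedule, now):
    """Return the item whose slot covers `now`, or None if nothing has started."""
    current = None
    for start, item in sorted(schedule):
        if start <= now:
            current = item
        else:
            break
    return current

schedule = [
    (datetime(2013, 6, 1, 6, 0), "breakfast_show.mp3"),
    (datetime(2013, 6, 1, 9, 0), "morning_news.mp3"),
    (datetime(2013, 6, 1, 12, 0), "midday_music.mp3"),
]

print(on_air(schedule, datetime(2013, 6, 1, 10, 30)))  # morning_news.mp3
```

A real cloud playout system would layer live-assist controls, redundancy, and audio routing on top, but the schedule lookup is the piece that systems like Airtime already do well.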
Online Backup and Disaster Recovery
Online backup services are currently being pushed as a solution for individuals. Consumer-oriented services such as Backblaze, CrashPlan and Carbonite are taking off because they are cheap and provide a level of protection you can’t get with local backups alone. Nothing beats fully automated, foolproof, offsite backups.
Business-grade services are significantly more expensive, partly due to the storage requirements but also due to the speed at which recovery is required. If your business premises burn down, you can’t really wait a few days for your data to be shipped to you on a hard drive.
I recently heard of a local business here in Western Sydney which offers not only an online backup service, but also an innovative way to recover data: customers can choose to restore their backups as virtual servers managed by the backup company. This means you can have near-instant access to everything you need.
Super-fast broadband is only going to make this better and cheaper. Imagine instantly restoring your servers to VMs in a datacenter, then, as replacement hardware arrives, migrating them back to your own ‘local cloud’. With fast broadband and the ongoing development of virtualised servers and storage, this sort of service must be just around the corner.
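To see why link speed matters so much for recovery, a quick calculation of raw transfer time helps. This ignores protocol overhead and assumes the link is the bottleneck; the function name and the 500 GB figure are my own illustrations:

```python
def transfer_hours(gigabytes, megabits_per_second):
    """Hours to move `gigabytes` of data over a link of the given speed,
    ignoring protocol overhead (1 GB = 8,000 megabits here)."""
    megabits = gigabytes * 8000
    return round(megabits / megabits_per_second / 3600, 1)

# Pulling down a hypothetical 500 GB server image:
print(transfer_hours(500, 10))   # ~111 hours on a 10 Mbps link
print(transfer_hours(500, 100))  # ~11 hours on a 100 Mbps service
```

On an ordinary connection a full restore is a multi-day job, which is why vendors ship hard drives; at 100Mbps and beyond, restoring over the wire starts to look practical.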
Limitless, Automated Program Archives
Maintaining archives of everything you put to air (or output online, etc.) is something most people would agree is important. Yet many stations don’t bother archiving all of their material: once the short-term, legally required logger files are deleted, that content is gone forever. Those that do archive are juggling tapes, hard drives, and who knows what else, and getting at the material means rummaging through archive boxes until you find what you need.
I’ve done the maths, and it would be possible for a radio station to archive everything that goes to air at a high bitrate for around $30/month (this factors in small economies of scale; it would become more efficient with each station that starts using such a service). Such a service could offer on-demand access to all of this material to whomever within the station wants it. Making best-of shows and compilations becomes easy.
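For a sense of the volume involved, here is the storage side of that calculation. The 256 kbps bitrate is my assumed figure for "high bitrate" audio, not one from the article:

```python
def archive_gb_per_month(kbps, days=30):
    """GB of audio produced by recording around the clock at `kbps`
    kilobits per second (1 GB = 8,000,000 kilobits here)."""
    kilobits = kbps * days * 24 * 3600
    return round(kilobits / 8000000, 1)

print(archive_gb_per_month(256))  # ~83 GB/month at 256 kbps
print(archive_gb_per_month(128))  # ~41 GB/month at 128 kbps
```

A continuously recording station therefore accumulates on the order of 1 TB a year at 256 kbps, which is well within the reach of commodity cloud storage pricing.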
Outside Broadcasts over IP

3G and now 4G technology has been a miracle for those wishing to do outside broadcasts. Products such as Tieline codecs, the Telos Z/IP One, and LiveU have allowed these widely available networks to be used for broadcasting. The trouble is, as the networks become more popular with consumers, reliability drops: packet loss and reduced bandwidth can kill a broadcast. That’s why a lot of broadcasters still turn to ISDN or satellite to get their program back to base.
As super-fast broadband connections become more widely available, it should become easier to get a reliable connection. Combine your broadband with something like a Ubiquiti PTP link and you have a highly reliable IP connection for your OB.
Not only can you have a reliable path for your program, but you can also use it for other auxiliary services such as video feeds, multiple comms channels and remote control of the gear (mix that live band from within the comfort of your studio).
These are just five exciting possibilities which come to mind. Do you have any other ideas?