In 2017, I worked on a project to syndicate the entire music library (and metadata, including categories and audio markers) from a capital-city radio station running RCS Zetta and Powergold Music Scheduling to a number of small regional radio stations running mAirList. This article explains why we built the system, the architecture behind it, and how it works.
Project Requirements
This project was designed to support a collection of very small community radio stations in Regional NSW. We needed a way to continually provide copies of our music library, along with metadata such as title, artist, category and audio markers, to each of these stations without creating any additional work for either the originating station or the receiving stations.
At the originating station, we already had RCS Zetta containing the audio files and Powergold containing the category data. Zetta uses a Microsoft SQL Server backend, while Powergold uses Paradox DB (first released for Windows in 1992, Paradox mostly died off once Microsoft Access was bundled with the Office suite in 1995 – but you’d be shocked to see who else is still using it as the backend for their WinForms apps).
At the receiving stations, we were starting from scratch – able to select and deploy a playout system that would meet the specific requirements of this project and the ongoing needs of the stations.
Automation System Options
While the key technology at the originating station was already in place (one of the project goals was to minimise the impact on the originating station), we had some flexibility at the receiving stations.
We considered a full Zetta deployment, with either Site Replication or ZCast to do the audio file transfer. However, we wanted a completely transparent system, at a very low cost per station, with some specific business rules around how the content syndication would work. The stations themselves also didn’t have very complex operating requirements.
We then began evaluating other automation and database systems. Amongst those we investigated were StationPlaylist, Simian, mAirList, and RadioDJ. We evaluated the first two as they already had a significant user base within the community radio sector in Australia. mAirList was trialled based on recommendations from an industry internet forum, and we tried RadioDJ as I’d used it previously on an internet radio project. We also looked at some other systems based on their capabilities around media syndication, but we could not find an automation system with built-in interoperability with other vendors’ systems.
After putting each of these systems through their paces, mAirList eventually won. One of the significant deciding factors was that mAirList is backed by a Postgres database, rather than the various flat-file systems used by StationPlaylist and Simian. It’s also a very powerful automation system, which the various stakeholders in this project have since been taking advantage of.
Integrating Everything?
One of the key factors in selecting an automation system was how easy it would be to integrate with the other systems at the originating end. mAirList’s use of a Postgres database, as well as some of its flexibility with file structures and metadata, made this much easier.
As we had not found a pre-existing package to integrate these different applications, we started designing our own in-house system.
This flowchart shows each of the components we used and their respective data flows:
At the heart of the system sits OnAirHope, a pre-existing web app that manages now-playing data, schedules, streaming stats, and on-demand audio. This seemed like the logical place to store all metadata, as it already integrated with Zetta and had a well-structured API.
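As a rough illustration only, a metadata record pushed into OnAirHope for a single track might look something like the snippet below. The endpoint path, field names and values are all hypothetical (OnAirHope's actual API is internal), but it shows the kind of record the downstream stations depend on.

```python
import requests

# Hypothetical example only: the endpoint path and field names are
# illustrative, not OnAirHope's actual API.
track = {
    "media_id": 123456,            # Zetta media identifier (example value)
    "title": "Example Song",
    "artist": "Example Artist",
    "category": "Gold",            # category assignment from Powergold
    "markers_ms": {                # audio markers from Zetta, in milliseconds
        "trim_in": 150,
        "trim_out": 214500,
        "segue": 209800,
    },
    "s3_key": "audio/123456.wav",  # where the conformed WAV lives in S3
}

requests.post(
    "https://onairhope.example/api/syndication/tracks",
    json=track,
    timeout=30,
).raise_for_status()
```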
A Python application was developed to talk to Zetta and Powergold on a regular basis, fetch the latest category data, and find any audio files needed by the remote stations. Categories, titles, and artists are taken from Powergold and compared against Zetta to find the actual audio file and audio markers (trim-in, trim-out, segue point, etc.).
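A much-simplified sketch of that merge step follows. It assumes tracks are matched on title and artist, which is purely illustrative; the production matching key and the real Zetta/Powergold field names are not covered in this article.

```python
from dataclasses import dataclass

@dataclass
class SyndicatedTrack:
    """One track as the remote stations see it: Powergold supplies the
    scheduling metadata, Zetta supplies the audio file and markers."""
    title: str
    artist: str
    category: str      # from Powergold
    media_path: str    # from Zetta
    trim_in_ms: int
    trim_out_ms: int
    segue_ms: int

def merge(powergold_rows, zetta_rows):
    # Illustrative matching on (title, artist); the production join key
    # and column names are assumptions for this sketch.
    zetta_index = {
        (r["title"].lower(), r["artist"].lower()): r for r in zetta_rows
    }
    for pg in powergold_rows:
        z = zetta_index.get((pg["title"].lower(), pg["artist"].lower()))
        if z is None:
            continue  # no matching audio in Zetta, so skip (or log) it
        yield SyndicatedTrack(
            title=pg["title"],
            artist=pg["artist"],
            category=pg["category"],
            media_path=z["file_path"],
            trim_in_ms=z["trim_in"],
            trim_out_ms=z["trim_out"],
            segue_ms=z["segue"],
        )
```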
This on-premise Python application uses ODBC to communicate with both Microsoft SQL Server (Zetta) and Paradox (Powergold). It also uses FFmpeg to conform all the audio files and strip out any unnecessary file headers.
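The snippet below is a rough sketch of those two pieces: the ODBC connections (the server names, data paths and driver choices are examples only, and the Zetta/Powergold schemas are not shown) and an FFmpeg conform step.

```python
import subprocess
import pyodbc

# Connection strings are examples only; actual server names, data paths
# and the Zetta/Powergold table schemas will differ.
zetta = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=zetta-sql;DATABASE=ZettaDB;Trusted_Connection=yes;"
)
powergold = pyodbc.connect(
    r"DRIVER={Microsoft Paradox Driver (*.db )};"
    r"DefaultDir=C:\Powergold\Data;"
)

def conform(src_path: str, dst_path: str) -> None:
    """Re-encode to plain 16-bit PCM WAV and strip embedded metadata and
    unnecessary file headers before upload."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src_path,
            "-map_metadata", "-1",   # drop embedded metadata chunks
            "-c:a", "pcm_s16le",     # conform to 16-bit PCM
            "-ar", "44100",          # conform to a single sample rate
            dst_path,
        ],
        check=True,
    )
```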
All audio assets are uploaded to Amazon’s S3 storage service, using its time-limited (presigned) upload URL mechanism. AWS was selected as it provides a cheap and scalable way to store this moderate amount of audio data – we don’t need to provision or manage any storage servers for this project, so there are no spinning disks to fill up.
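A minimal sketch of that upload flow is shown below. The bucket name, object key and expiry are assumptions; one common arrangement is for the server-side component to issue the time-limited URL, so the on-premise uploader never holds long-lived AWS credentials.

```python
import boto3
import requests

# Sketch only: the bucket, key and expiry are assumptions for this example.
s3 = boto3.client("s3")
upload_url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": "example-syndication-audio", "Key": "audio/123456.wav"},
    ExpiresIn=3600,  # the URL stops working after an hour
)

# Upload the conformed WAV using the time-limited URL.
with open("123456.wav", "rb") as audio:
    requests.put(upload_url, data=audio, timeout=600).raise_for_status()
```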
At the receiving end, another Python application was built to poll OnAirHope on a regular basis, compare the remote media and metadata against the local mAirList database, and sync any changes.
As these stations are in remote locations with patchy internet connections, the sync is differential and designed to use as little bandwidth as possible (the bulk of the bandwidth goes to downloading new PCM WAV files).
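A condensed sketch of that sync loop might look like the following. The OnAirHope endpoint, the local state table, the column names and the helper functions are all hypothetical placeholders for the real application's logic.

```python
import psycopg2
import requests

# Condensed sketch of the differential sync. The endpoint, state table,
# column names and helper functions are hypothetical placeholders.
API = "https://onairhope.example/api/syndication/tracks"

def sync() -> None:
    remote = {t["media_id"]: t for t in requests.get(API, timeout=60).json()}

    conn = psycopg2.connect("dbname=mairlist user=sync")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT media_id, version FROM sync_state")
        local = dict(cur.fetchall())

        for media_id, track in remote.items():
            # Only items that are new or changed upstream are touched, so
            # checking an unchanged library costs almost no bandwidth.
            if local.get(media_id) == track["version"]:
                continue
            download_audio(track)               # pull the PCM WAV from S3
            upsert_into_mairlist(conn, track)   # update categories/markers

def download_audio(track: dict) -> None:
    ...  # placeholder: stream the WAV into the local audio folder

def upsert_into_mairlist(conn, track: dict) -> None:
    ...  # placeholder: insert/update the item in the mAirList database
```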
Rollout
The final deployment model consisted of a single refurbished Dell R610 at each station, running Windows Server 2012 R2, mAirList, and our custom syndication software. We selected this hardware as it had dual power supplies, ECC RAM, and hot-swappable hard drives, and was rack-mountable. Honestly, with 48GB of RAM and far too many CPU cores it’s complete overkill – but it was very cost effective and is super simple to support remotely.
The rollout to the first four stations was completed by my colleagues.
Results
All of this is completely automatic, and does not require intervention by anyone. mAirList is set to auto-schedule music based on local clock structures, and some stations also choose to add their own local music categories for variety and localism. These stations are still 100% locally run, but this system removes the guesswork involved in music selection and categorisation.
Each of the stations has been reporting positive results and a very low error rate. Indeed, this system has now been running for twelve months, with the only intervention needed from the originating end being a change to the category configuration when our local clocks change. Our local music directors can continue on each day without needing to worry about updating anything at the ‘network’ stations.