I’ve been working on a building project for my local church – taking the lead on designing and overseeing the roll-out of the audio, visual, lighting, IT, and security systems for the church. No one can go it alone – and I’m fortunate to have the chance to work with some excellent builders, architects, engineers, project managers, and electrical and AV companies. My role is mainly to be the glue between the needs of the local congregation and these third-party consultants – plus a bit of vision-casting thrown in to make sure we’re planning for the future.
While the project is still ongoing (due to finish August 2020), the tech plans are well and truly drawn up, the DSP/control programming is well advanced – and it’s pretty much ready to roll out as soon as the building schedule allows.
I wanted to take the time to write up some of the concepts that we have for the fit out. This is by no means a comprehensive write-up of everything that’s going on, but it does cover some of the key things we decided to include.
I’m not going to be posting all the as-builts here, partially because they haven’t been finished, and also because I don’t think I have the rights to do that.
Future articles will hopefully contain some photos & thoughts about the final outcome, and some lessons learned along the way. The purpose of this article is just to explain & document some of the design decisions we have made.

For an idea of how the project has been going, check out the ‘project update’ videos on the YouTube channel.
The Space
The earliest plans for this move were drawn up in October 2017, but the desire to move had really been an on-again, off-again proposition for the church since at least the 80’s. I’ve attended the church since around 2013, and I remember talk about a possible move in my first 12 months (and I think I was on a building committee for another project in the 2nd year I attended?).
This particular move has seen us vacate our old Parramatta CBD premises in March 2018, occupy ‘temporary’ space in Granville in the interim, and soon move into a beautifully renovated two-storey commercial & retail building in North Parramatta.

What we have for the church is essentially one split-level floor, divided up into a number of key areas.
- Worship Hall
- Function Hall
- Kids Room
- Parents Room
- Adaptable Room
- Foyer
- Waiting Area
- Meeting Room
- Office
Here is a very, very, very rough mud map of the layout:

This list & map excludes a lot of details – such as bathrooms, storage rooms, rack rooms, corridors, counselling rooms, green room, etc. etc. But the rooms I’ve included on this map are the main ones for the purposes of this article.
All of these rooms (plus a few more) have AV requirements which had to be carefully considered.
We also have a dedicated AV Rack Room, which contains two racks, a distribution board, dimmers/distro, lighting patch point system, and A/C.
There is a small AV room where we can direct, mix, and broadcast feeds (I really should say ‘live stream’, but ‘broadcast’ sounds way cooler).
We also have two further racks sitting in storerooms, and a couple of small racks side-stage for patching stuff on stage.
General Concepts
There were some key principles we kept in mind when starting the AV design process. These were derived from meeting with various stakeholders inside the church, chatting to people at other churches, reading various write-ups, and applying my own experience.
Audio Everywhere
Almost every space in the building needed to be able to hear something – whether that be a feed from the main worship hall, background music, or audio for a local meeting or event. We decided the majority of spaces needed in-ceiling speakers (and the main hall required a proper PA, obviously).
Video Everywhere
Almost every space in the building also needed to be able to see something – either a local laptop input, a camera feed from the main worship hall, digital signage, or even something cast from a mobile device or web browser.
In addition to being able to see, we also decided a number of rooms needed to be able to provide video back into the main video system – mostly for the purposes of streaming. We decided early on that we wanted several meeting rooms and breakout spaces to be able to run a live stream if necessary. A common use-case for this would be providing training events to a national audience, or live streaming an event that would feel out of place in the 250-seater auditorium.
Flexible Connectivity & Control
We wanted every space to be controllable, in some capacity, by someone with very limited technical expertise. From the worship hall down to the smallest meeting room, and everything in between – I wanted to be able to confidently book events and have people rock up and control all the key systems without it being a hassle or a risk.
Additionally, we considered how we wanted people to be able to connect into the systems. This resulted in a lot of Cat6a and SDI video cable throughout the building, along with a limited amount of analog audio cable, fibre, and HDMI patch cable. Better still, it also included consideration for wirelessly connecting into the audio & video systems – mainly via AirPlay, Chromecast, Bluetooth audio, and radio mics.
Wherever practical, we chose to use IP audio instead of anything analog. We’ve tried to keep analog audio to a minimum (it’s mostly used in breakout spaces), and all analog inputs get digitised as soon as possible (basically, everything gets patched directly into an I/O box).
Maintainability
We needed to select products that are not only cost-effective to buy, but also easy to maintain. This means platforms such as Q-SYS, and off-the-shelf IT hardware, are perfect – we have all the skills we need in-house to maintain them, and they’re not so exotic that you couldn’t call up an AV company and pay them for support if needed.
Future Proof
No one wants to have a building with antiquated systems. This desire partially drove the decisions around how much cable we spec’d (why is there SDI in the ground floor foyer? Well, you never know when you may need to plug a camera in down there…).
As I’ve seen how complicated the wall and ceiling cavities are, and how tight some can get after the plasterboard goes up, I’m very glad there should be no need to run any extra cable in the foreseeable future.
Apart from all this, there were some personal goals I wanted to achieve – including eliminating the need for TVs on trolleys, getting WiFi available to everyone everywhere, and implementing a very neat and robust set of solutions across the whole building. My end-goal is to not only have a really functional and flexible facility, but to provide this as a blueprint to be rolled out to other churches – showing what can be done in a modern church.
Decisions for every room
Designing anything for a building requires a lot of attention to detail. To make this easier, and to help with support and maintenance, we tried to come up with some common building blocks to apply across every room.
Small Meeting/Activity Rooms
These spaces all have 1x 65″ TV, 4-6x mic outlets, 4-6x SDI outlets, 4-6x Cat6 outlets, HDMI patch cable to the TV, a Bluetooth audio receiver, an AirPlay/Chromecast receiver, an antenna connection, and in-ceiling speakers.

Common Areas
TVs are either 65″ or 42″ (depending on the location), with 2x Cat6, a digital signage player, antenna connection, and in-ceiling speakers. All technology in common areas automatically gets turned on/off on a schedule, with some simple overrides available for out of hours usage.
Function Hall
2x projectors with motorised screens (on 2x different walls to account for the ‘L’ shape of the room), in-ceiling speakers, mic outlets, HDMI outlets, a Bluetooth audio receiver, an AirPlay/Chromecast receiver, a hearing aid loop, and so on.
Worship Hall
The worship hall is the biggest and most versatile space, so we have by far the most equipment in here: a dedicated tech desk, 3x motorised lighting trusses, 5x smaller fixed lighting bars, a flown PA, projectors with fixed-frame screens (with provision to fly an LED wall when budget permits), a hearing aid loop, and a fair bit more.
Systems design
With these concepts in mind, and some high-level decisions made, we moved onto a more detailed design phase.
Much of the early work simply involved ensuring the requirements were communicated correctly to the engineers (so they could put hundreds of symbols on dozens of drawings), liaising with the architects to ensure locations were going to work for them, and generally paying a lot of attention to every revision of the plans that came through.
There was also the need to author some short documents outlining more detailed requirements for the eventual winner of the tender. This included items such as rigging, main PA requirements, security system requirements, etc.
The way our project has ended up (partially due to the skills available in-house, mainly due to budget) is that we only contracted out some of the key parts of the fit-out to an external AV company. These were the items that had to be done before the building was completed, because they would be too difficult to do later:
- Rigging
- Hearing Aid Loops
- Main PA in the worship hall
- In-ceiling speakers
- Cabling
Really, we could only afford to have these big-ticket items contracted out. While I initially had a lot more in scope, this got cut when the prices started coming back. Most of the rest of the equipment is to be supplied, configured, and installed by our own volunteer team.
Tech at the heart of the design
We selected some key systems to sit at the heart of the AV:
All AV control & DSP is handled by QSC Q-SYS. We have a Core 510i at the heart of the network (with a redundant core to come before too long), talking Dante/AES67 & TCP/IP to all the other devices. Much of the Dante I/O will be via Yamaha Tio boxes, but we also have many computers running Dante DVS, and a main Allen & Heath console in the worship hall with a Dante card.
Core video routing is via a Blackmagic Design VideoHub, connecting all key SDI I/O. We can’t afford a huge SDI router, so much of the SDI will be directly connected between sources/destinations initially. The VideoHub is controllable via IP in Q-SYS, and will be connected via a normalised BNC patchbay, so resetting anything ‘special’ is as simple as pulling out the leads on the front.
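For the curious, the VideoHub speaks a simple plain-text protocol over TCP port 9990, which is what the IP control in Q-SYS drives under the hood. Here’s a rough illustration in Python of what a route change looks like – the IP address is a made-up placeholder, and real code would properly parse the status dump the unit sends on connect:

```python
import socket

VIDEOHUB_IP = "192.168.1.50"   # placeholder address, not our actual install
VIDEOHUB_PORT = 9990           # Blackmagic Videohub Ethernet protocol port


def route(output: int, source: int) -> None:
    """Route a Videohub input to an output (both zero-indexed)."""
    with socket.create_connection((VIDEOHUB_IP, VIDEOHUB_PORT), timeout=5) as sock:
        sock.recv(4096)  # discard (part of) the status dump sent on connect
        # A routing change is the header line, one "output input" pair per line,
        # then a blank line to terminate the block.
        sock.sendall(f"VIDEO OUTPUT ROUTING:\n{output} {source}\n\n".encode("ascii"))
        print(sock.recv(4096).decode("ascii"))  # expect "ACK" on success


if __name__ == "__main__":
    route(0, 3)  # e.g. send input 4 to output 1 (zero-indexed)
```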
We have three video switchers – all various models of Blackmagic ATEMs. One switcher is for the main projection feed, one for a ‘broadcast’ feed (the live stream, plus distribution around the building), and a final one for the stage display (current & next slides will be shown when applicable, but automatically switched to a PGM feed during periods such as preaching, etc. The clock will always be DSK’d over the output, which is why I can’t simply use an AUX bus on another switcher).
All network switching will be handled by Cisco 3750s (an oldie, but suitable for our needs, and available at a fantastic price). These will run as two separate stacks (one in each half of the building), linked via multiple redundant fibre connections. Separate VLANs will exist for Servers, Office, AV, Dante, Building Services, and Untrusted (W)LAN. A standalone 3750 will also handle KVM switching & the Dante secondary LAN.
We’ll have a pair of Dell PowerEdge servers running VMware ESXi as the core hypervisor. We’re not licensing the vSAN functionality initially, so all the VM workloads we have selected will run redundantly across both hosts at an application level.
I’m still working out which core router to use. I like the MikroTik RouterBoard devices, but am also considering pfSense hardware and FortiGate (I do miss their UI and the ease of setting up WAN failover & load balancing).
Our AirPlay/Chromecast receivers will be Intel Compute Sticks running AirServer. Bluetooth receivers will be Raspberry Pis, talking via IP to the Media Receiver block on the Q-SYS Core (we have a plugin in the works to control these – hopefully we’ll open source it eventually).
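As a sketch of how a Pi can talk to the Core: Q-SYS exposes an external control protocol (QRC) – JSON-RPC 2.0 over TCP port 1710, with each message terminated by a NUL byte. The addresses and control names below are placeholders for illustration only, not what our plugin actually uses:

```python
import json
import socket

QSYS_CORE_IP = "192.168.1.10"  # placeholder Core address
QRC_PORT = 1710                # Q-SYS External Control (QRC) port


def set_named_control(name: str, value) -> None:
    """Push a value to a Named Control on the Core via QRC (JSON-RPC over TCP)."""
    message = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "Control.Set",
        "params": {"Name": name, "Value": value},
    }
    with socket.create_connection((QSYS_CORE_IP, QRC_PORT), timeout=5) as sock:
        # QRC frames each JSON-RPC message with a trailing NUL byte.
        sock.sendall(json.dumps(message).encode("utf-8") + b"\x00")


if __name__ == "__main__":
    # Hypothetical control names exposed by the media-receiver design.
    set_named_control("BluetoothRoomA.Connected", 1)
    set_named_control("BluetoothRoomA.TrackName", "Now playing from phone")
```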
Digital signage will be run from PiSignage, and most TVs will be Sony Bravia commercial displays (we are re-using a couple of older TVs with IP2IR boxes for remote control, but honestly the RESTful API in the Sonys is fantastic, and I hope to replace the older TVs eventually).
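To give an idea of why I like that API: power control on the Bravias is a single authenticated JSON POST. This is only an illustrative sketch – the IP address and pre-shared key are placeholders, and in our building the scheduling will be driven from Q-SYS rather than a script like this:

```python
import requests

TV_IP = "192.168.1.60"      # placeholder display address
PRE_SHARED_KEY = "0000"     # PSK configured on the display under IP Control settings


def set_power(on: bool) -> None:
    """Turn a Bravia display on or off via its REST IP-control API."""
    payload = {
        "method": "setPowerStatus",
        "id": 55,
        "params": [{"status": on}],
        "version": "1.0",
    }
    resp = requests.post(
        f"http://{TV_IP}/sony/system",
        json=payload,
        headers={"X-Auth-PSK": PRE_SHARED_KEY},
        timeout=5,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    set_power(True)  # e.g. the scheduled ‘on’ at the start of the day
```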

Our tech desk in the worship hall will have a large number of monitors, an audio console, and not much else. KVM receivers will connect every monitor back to the computers in the rack room – no computers will live at the tech desk. This is helpful for security, noise, power, cooling, and space. I’ve trialled an IP KVM extender I found on Alibaba (pretty happy so far), and we’ve built a Q-SYS plugin to allow switching computers around at the tech desk via these KVM extenders – I plan to write more about this in the future. Being able to turn on the correct computers along with the rest of the AV system will be fantastic.
The main audio console in the auditorium is an Allen & Heath GLD-80 with Dante. An eventual upgrade to an Avantis would be nice, moving the GLD to broadcast mix duty. All outputs are via Dante – when the system is in ‘simple’ mode, a Q-SYS UCI will control all front-of-house audio as well as foldback, hearing aid loops, recording, etc. When the ‘advanced’ system is turned on, an IP relay will power up the GLD and switch all audio feeds to that console.
Shure ULX-D will run all radio mics in the auditorium, giving us native Dante I/O & monitoring via Wireless Workbench.
ProPresenter 7 & Resolume will drive on-screen content, depending on the needs of each event. There will be HDMI inputs at the tech desk & on-stage, as well as AirPlay/Chromecast for conference scenarios. All this will be switchable via Q-SYS of course.
We will have DALI controlled house lighting, with a DMX to DALI interface giving us some simple control over this from a lighting console (LSC Clarity to start with). A bevy of theatrical lighting fixtures around the room will eventually give us some great looks in a variety of colours in the body of the hall, without the need to use this DALI house lighting.
Production power will be switched by LSC APS units, plus some Lindy IP power strips for rack & tech desk power. We have selected DMXking sACN-to-DMX interfaces. Q-SYS will control power, and has the option to control lighting too – we’re using sACN priorities to allow a lighting console or lighting software to override the Q-SYS sACN stream.
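To illustrate the priority idea: every sACN (E1.31) source transmits a per-universe priority, and receivers follow whichever source has the highest priority on that universe. If the Q-SYS ‘house’ stream is sent below the default priority of 100, any console transmitting at the default simply takes over. Here’s a rough Python sketch of that behaviour, assuming the third-party `sacn` package – the universe number and channel values are purely illustrative:

```python
import time

import sacn  # third-party python-sacn package (pip install sacn)

UNIVERSE = 1
HOUSE_PRIORITY = 50  # below the E1.31 default of 100, so a console wins

sender = sacn.sACNsender()
sender.start()
sender.activate_output(UNIVERSE)
sender[UNIVERSE].multicast = True
sender[UNIVERSE].priority = HOUSE_PRIORITY

# A simple static 'house lights on' look on the first few channels.
sender[UNIVERSE].dmx_data = (255, 255, 255, 255) + (0,) * 508

time.sleep(10)  # keep transmitting for a while
sender.stop()
```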
In an effort to be thrifty, we’ve procured some second-hand Dell OptiPlex 5050 & Dell Precision workstations to be used for all production PCs. These come in very cheap, and allowed me to spend some money on DeckLink SDI cards & good GPUs for the video workstations.
Next Steps
The next couple of months will be pretty critical. The structural work is nearing completion, and the builder is moving into a phase of completing fittings and finishes. As we get right near the end of the project timeline, we have a window of opportunity to take a team on-site to fit out the racks, install the TVs, set all the parameters in the DSP, and generally commission the system.
There’s still a lot to do, but I’m looking forward to my church community being able to jump in and start enjoying the space. If I’ve done my job right, they won’t have to think about a lot of the complexities involved in making it happen.