IP Audio Begins Interoperability Journey
The Audio Engineering Society has developed a new standard for the interoperability of audio-over-IP (AoIP) systems, known as AES67. The standard is the result of the work of the AES-X192 task group, which completed its work in April 2013. After ratification through the AES standards due process, it was published on Sept. 11, 2013.
You'll hear more about this standard as time goes by. But what is AES67, and how can it benefit a radio station? To start, let's use the AES67-2013 standard itself for the definition of its scope:
"This standard defines an interoperability mode for transport of high-performance audio over networks based on the Internet Protocol. For the purposes of the standard, high-performance audio refers to audio with full bandwidth and low noise. These requirements imply linear PCM coding with a sampling frequency of 44.1kHz and higher and resolution of 16 bits and higher. High performance also implies a low-latency capability compatible with live sound applications. The standard considers latency performance of 10 milliseconds or less."
The AES67-2013 standard also points out that current AoIP systems are not interoperable, despite the fact that they all have a common basis in IP. AES67 simply gives them a way to talk to one another using existing protocols; no new protocols were developed in the process. The AES expects this standard to be useful in fixed and touring live-sound applications, as well as music production and post-production, and, of course, broadcasting.
Many of us have used AoIP systems in which timing was irrelevant; streaming is the most obvious example. When using UDP/IP (a best-effort transport), it doesn't matter precisely when the audio is heard on the far end. Using audio over IP for remote broadcasts is another example: usually we just accept and work with whatever the delay is between points A and B. It's generally out of our control, and not a problem unless it becomes exceptionally long.
If you work in live sound, however, it's not hard to imagine how important timing becomes. One could not allow timing to drift for front-of-house or stage applications; using AoIP there demands fixed time delays. After reading AES67, this seems to me to be one of the most important aspects of the new standard. From the AES67-2013 standard:
"The ability for network participants to share an accurate common clock distinguishes high-performance media streaming from its lower-performance brethren such as Internet radio and IP telephony. Using a common clock, receivers anywhere on the network can synchronize their playback with one another. A common clock allows for a fixed and determinable latency between sender and receiver. A common clock assures that all streams are sampled and presented at exactly the same rate. Streams running at the same rate may be readily combined in receivers. This property is critical for efficient implementation of networked audio devices such as digital mixing consoles."
Let me give you a hypothetical broadcasting application, though. Let's say you use AoIP for the main STL of an FM station, and let's also say you want to build an on-channel booster for that same station. In this application you'll want to maintain a fixed (but configurable) delay between the two, yet you're also using IP for your connection to the booster site. How would you ensure the time delay remains constant? AES67 might provide the answer: It uses IEEE 1588-2008, otherwise known as Precision Time Protocol. PTP basically works like this:
■ A grandmaster clock lives on the network and receives its time data via GPS.
■ Time-stamped synchronization messages are sent out on the network and received at each end point (slave), where the time is read.
■ The slave can (optionally) send a delay-request message back to the grandmaster.
■ The grandmaster in turn sends a delay response back to the slave that requested it, with an updated time stamp.
■ The slave takes the difference between when it sent the request and when it received the response, and divides it by two. The calculated propagation delay is then added to the clock at the end point.
So what's particularly neat about this is that not only do you synchronize the clocks at the end points, but you also continually measure the propagation delay, and corrections are made as necessary.
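The exchange above can be sketched in a few lines of Python. This shows the classic four-timestamp IEEE 1588 arithmetic (which assumes a symmetric network path); the function name and the timestamp values are hypothetical, chosen purely for illustration, and are not taken from the standard's text.

```python
# Sketch of the IEEE 1588 (PTP) timing arithmetic described above.
# Hypothetical timestamps, in milliseconds:
#   t1: master sends the Sync message (read on the master clock)
#   t2: slave receives Sync (read on the slave clock)
#   t3: slave sends Delay_Req (slave clock)
#   t4: master receives Delay_Req (master clock, returned in Delay_Resp)

def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic PTP calculation; assumes the path delay is symmetric."""
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way propagation delay
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    return offset, delay

# Example: the slave's clock runs 3 ms fast and the true one-way delay
# is 5 ms. A Sync sent at t1=100 arrives at slave time t2=108, and a
# Delay_Req sent at slave time t3=120 arrives at master time t4=122.
offset, delay = ptp_offset_and_delay(100.0, 108.0, 120.0, 122.0)
print(offset, delay)  # 3.0 5.0
```

Because the slave knows both its offset from the grandmaster and the propagation delay, it can lock its clock to the master; and because the exchange repeats continuously, any change in propagation delay is tracked and corrected, which is what keeps the end-to-end latency fixed.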