For decades the broadcast industry relied on a standard and interoperable set of technologies for media creation and processing: SDI. People across production, post-production and transmission all used the same set of standards, which delivered the high degree of compliance and interoperability necessary to ensure that two devices equipped with SDI interfaces would work together.
The broadcast industry is migrating successfully from SDI to IP in large part due to SMPTE ST 2110, a suite of open standards that enables the use of commercial off-the-shelf (COTS) IT infrastructure equipment while benefiting from the flexibility, agility, and scalability of IP networks.
A few weeks before his untimely death on July 26, David Chiappini, executive vice president, research and development at Matrox, delivered what would turn out to be his last commentary piece for Installation, on the future of AVoIP.
AIMS Techfest Presents: IP Showcase, which highlighted the great strides made toward an all-IP ecosystem built on open standards, was a great success.
Presentations and case studies demonstrated how the SMPTE ST 2110 suite of standards and the AMWA NMOS technology stack are improving media workflows for large and small, broadcast and Pro AV deployments alike.
If you were unable to attend the live event — or if you’re just in the mood for fresh perspectives on media over IP — the recorded presentations are all available now to view at your leisure.
Check them out here!
One of the many ways in which AIMS supports the adoption of media over IP across both the broadcast and Pro AV industries is by featuring products designed to ease the transition from SDI to IP.
AIMS members have not only enhanced existing products to support IP but have also introduced new solutions that offer critical capabilities for IP-based media workflows.
Check out our collection of products and solutions on our solutions page, where you’ll find an extensive listing of products compliant with the AIMS roadmap, delivering essential functionality for broadcasters and other organizations making the move to IP.
For a deep dive into EDID support in IPMX, check out this recent blog post by Andrew Starks. Andrew does a fantastic job of outlining the challenges and new solutions in the tricky universe of practical EDID implementations amid the ever-increasing range of displays.
By Andrew Starks.
Plug your monitor into your laptop, and you expect to see video on the screen and hear audio through your speakers. Your laptop and your monitor negotiate the optimal settings and even let you override that selection with an alternate mode your display supports, should you choose. Thanks in part to EDID, and its successor DisplayID, this works perfectly… almost always.
However, throw in a distribution amplifier or multicast with AV over IP to drive multiple monitors, and you’ll begin to understand the pain and frustration of your average Pro AV installer. This is because EDID is designed for simple one-to-one connections, and everything beyond this is done with hacks that are not based on standards or best practices. As a result, there is a small cottage industry of donkey-knuckles (aka dongles and doohickeys) designed to fake out source devices, replacing the EDID information that the source device would normally receive with whatever the installer wants. Believe it or not, these kinds of workarounds represent the state of the art in AV over IP installations as of 2021.
For IPMX, AIMS recognized the opportunity to improve this situation dramatically by thoroughly supporting content negotiation between sources and displays, especially in multicast scenarios. With that goal in mind, AMWA is developing a new specification called NMOS IS-11 Sink Metadata Processing. Along with NMOS Receiver Capabilities (BCP-004-01), the work of a prior AMWA group, and NMOS EDID to Receiver Capabilities Mapping (BCP-005-01), IS-11 defines how NMOS Senders negotiate with one or more Receiver devices connected to displays using EDID and, eventually, DisplayID.
To illustrate how it works, consider the simplest scenario: one Sender and one Receiver device. The Receiver device detects when a monitor is attached to its output using HDMI’s signaling facilities and reads the EDID information. The Receiver then maps the EDID’s timing, video, and audio profile data to receiver capabilities endpoints (URLs) on one or more NMOS Receivers. These Receivers now contain a list of all the video or audio modes supported by the attached monitor. When the user connects a Sender to one of these Receivers, the NMOS controller gets this information from the Receiver and gives it to the Sender. If the Sender can support one or more of the modes it receives, it’s configured and ready to make a connection. If a controller attempts to change the flow to something that is not supported, the Sender shuts down the flow and returns an error.
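As a rough illustration of this single-Sender, single-Receiver case (not taken from the IS-11 or BCP-005-01 specifications), the sketch below maps a few parsed EDID video modes into a BCP-004-01-style constraint set that a Receiver could expose; the EdidMode type, helper function, and the exact subset of constraint keys shown are simplified assumptions for illustration only.

```python
# Illustrative sketch only: maps parsed EDID video timings to a
# BCP-004-01-style Receiver Capabilities structure. The EdidMode type
# and the constraint keys shown here are simplified assumptions, not a
# literal reproduction of the IS-11 / BCP-005-01 mapping rules.
from dataclasses import dataclass

@dataclass
class EdidMode:
    width: int            # active horizontal pixels
    height: int           # active vertical lines
    frame_rate_num: int   # frame rate numerator, e.g. 60000
    frame_rate_den: int   # frame rate denominator, e.g. 1001

def modes_to_constraint_sets(modes: list) -> dict:
    """Build one constraint set per EDID video mode, in the spirit of BCP-004-01."""
    constraint_sets = []
    for m in modes:
        constraint_sets.append({
            "urn:x-nmos:cap:format:frame_width": {"enum": [m.width]},
            "urn:x-nmos:cap:format:frame_height": {"enum": [m.height]},
            "urn:x-nmos:cap:format:grain_rate": {
                "enum": [{"numerator": m.frame_rate_num,
                          "denominator": m.frame_rate_den}]
            },
        })
    return {"constraint_sets": constraint_sets}

# Example: a monitor whose EDID advertises 1080p60 and 2160p30
caps = modes_to_constraint_sets([
    EdidMode(1920, 1080, 60, 1),
    EdidMode(3840, 2160, 30, 1),
])
```

A controller that reads this structure from the Receiver can then check whether the Sender supports any of the advertised modes before making the connection, which is the negotiation step the paragraph above describes.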
Now imagine there are multiple monitors. In the case of video, the NMOS controller retrieves the receiver capabilities of each connected Receiver. Typically, the controller would combine them into a constrained set that all monitors support. However, it could alternately use custom logic to determine how mismatched capabilities are handled or further constrain the choices, as desired by the user. With this constrained set, the Sender can pick a mode that every monitor can support or fail if it can’t.
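To make the multi-monitor case concrete, here is a deliberately simplified sketch of the intersection step a controller might perform. Real controllers operate on full constraint sets rather than the flat mode tuples used here, and the function name is invented for illustration.

```python
# Simplified sketch of a controller combining the capabilities of several
# Receivers into the set of modes every attached monitor supports.
# Each Receiver's capabilities are reduced to a set of
# (width, height, frame_rate) tuples purely for illustration.
from functools import reduce

def common_modes(per_receiver_modes: list) -> set:
    """Return only the modes supported by every Receiver (set intersection)."""
    if not per_receiver_modes:
        return set()
    return reduce(lambda acc, modes: acc & modes, per_receiver_modes)

# Example: two monitors; only 1080p60 is supported by both.
receiver_a = {(1920, 1080, 60), (3840, 2160, 30)}
receiver_b = {(1920, 1080, 60), (1280, 720, 60)}

usable = common_modes([receiver_a, receiver_b])
if usable:
    chosen = max(usable)  # e.g. pick the highest-resolution common mode
    print("Sender will be configured for:", chosen)
else:
    print("No common mode: the connection attempt fails with an error")
```

Custom logic, as the paragraph above notes, could replace the plain intersection with any policy the user prefers for handling mismatched capabilities.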
Hidden in these examples are the gory details and exceptions that the IS-11 group must address, thanks to the rocky world of real-world EDID implementations and the ever-increasing variety of displays. Currently, they are in Phase 3, which includes validation and testing of the design with implemented contributions from Pebble, Matrox, and Macnica. Guided by the feedback received from integrators, customers, and manufacturers, the group is confident that they can deliver a robust and deterministic way of managing one of Pro AV’s thorniest issues.
A universal standard for AV-over-IP deployment could be the greatest gift the broadcast industry can give to its cousins in AV. The only issue is which one, says Paul Bray.
The streaming market is defined by technology, but as that technology advances it becomes increasingly transparent to the user, both easing the delivery of media and improving the user experience.
The AES67 standard has been at the heart of audio over IP since its publication in 2013. Defining a minimum set of requirements essential to interoperability at the IP layer, AES67 met (and still meets) the needs of professional audio while remaining easy to adopt.
Over the past year, it has become clear that IPMX, which builds on SMPTE ST 2110 and AMWA’s NMOS APIs, is shaping up not only to meet the needs of Pro AV but also to become the standard used in both production and presentation workflows. With one open standard for low-latency video and audio over any network, for any purpose, the effect of IPMX will not be just like other AV over IP solutions, only bigger. It will be transformative.
Sound & Communications’ Editor, Dan Ferrisi, wanted to know more about the latest developments with the IPMX protocol, which AIMS champions. He also wanted to get all the details about AIMS TechFest 2021, which is taking place March 9 and 10. So, Dan spoke to AIMS’ Pro AV Working Group Chair, David Chiappini, earlier this week to bring all of us up to speed.