ISE 2023: Boundless Integration

After visiting IBC last year, I wanted to get a view of the technology and broadcast industry from another perspective — systems integration. Manufacturers are always keen to announce upcoming products and showcase new innovations, leaving the integrators to solve the puzzle and fit all the pieces together. Integrated Systems Europe (ISE) 2023 was the ideal place to explore this.

Despite the disruption over the past two years from the pandemic, this was the largest ISE ever with over 58,000 people attending the Barcelona event. This illustrates the industry’s demand for in-person events, and rightly so — there were a lot of technical demonstrations, and exhibitors were really looking to push the envelope and gauge demand for new technology.

Enhancing the audio experience was a big theme at ISE 2023. This matters greatly for the future of the metaverse: extended reality applications often try to enhance the senses, so audio plays a huge part. Spatial audio through Dolby Atmos has been around for a while with Sky Cinema and the Premier League, but in broadcast the dialogue needs to be heard clearly by all listeners, which can restrict where other audio effects are placed in order to prioritize speech.

However, music doesn’t have that restriction. Electronic music composer and producer Jean-Michel Jarre was at the show and talked about how he mastered his new album, Oxymore, in spatial audio. Mr Jarre said that “in real life, stereo doesn’t exist, our audio field is 360 degrees”, and “musicians are egocentric, we put the sounds where we want to really enhance the experience”. I listened to two of his tracks on Coda Audio’s Space Panels, which provide 360-degree audio in speaker panels just 70 mm deep, and it was an incredible experience.
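Jarre's point that "our audio field is 360 degrees" can be made concrete with a toy example. One common way to place a mono source at an arbitrary horizontal angle is first-order ambisonic (B-format) encoding; the sketch below is a generic illustration of that idea, not a description of how Coda Audio's Space Panels or Jarre's mastering chain actually work.

```python
import math

def encode_b_format(sample: float, azimuth_deg: float):
    """Encode a mono sample into horizontal first-order ambisonic
    B-format (W, X, Y). Azimuth is measured anticlockwise from front.
    """
    az = math.radians(azimuth_deg)
    w = sample / math.sqrt(2)   # omnidirectional component
    x = sample * math.cos(az)   # front-back component
    y = sample * math.sin(az)   # left-right component
    return w, x, y

# A sound placed hard left (90 degrees) has no front-back component:
w, x, y = encode_b_format(1.0, 90.0)
```

A decoder then mixes W, X and Y into whatever speaker layout is available, which is what lets the same master play back over anything from a headphone binaural render to a wall of panels.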

I also attended several interesting keynotes about a range of topics. Virtual production and in-camera virtual effects (ICVFX) were covered by many, but two speakers stood out: David Gray from Lux Machina and BK Johannessen from Epic Games.

Mr Gray went over the history and range of ICVFX, noting that Lux Machina has worked on productions such as The Mandalorian, Oblivion and Barbie. Discussing the benefits of a fully virtual production, he explained that in-camera effects can save over 15% of a film’s budget, especially in longer sequences. Integration with Epic’s Unreal Engine 5 (UE5) is a massive benefit. For example, in e-sports, if the game is built in UE5, video and augmented reality effects can be triggered by in-game events. The main message I took from the keynote was the importance of getting directors of photography (DoPs) and writers involved early in the production process so all technical avenues can be explored and tested.
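The e-sports scenario Mr Gray described is essentially a publish/subscribe pattern: the game emits events and the broadcast layer reacts to them. The sketch below shows that pattern in miniature; the event name and payload are hypothetical, and a real UE5 pipeline would use the engine's own event dispatchers rather than this standalone bus.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe bus: in-game events drive
    broadcast graphics or AR overlays subscribed to them."""

    def __init__(self):
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, **payload) -> None:
        for handler in self._handlers[event]:
            handler(**payload)

cues = []
bus = EventBus()
# The broadcast layer listens for a (hypothetical) game event
# and queues an AR effect for the video mix.
bus.subscribe("player_eliminated",
              lambda player: cues.append(f"AR burst: {player}"))
bus.publish("player_eliminated", player="Team A #7")
```

The attraction of building the game in UE5 is that the broadcast graphics can subscribe directly to gameplay state instead of an operator triggering effects by hand.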

Continuing the theme, Mr Johannessen led a keynote titled All Paths Lead to Real Time. Following on from the implementation showcased by Lux Machina earlier in the day, he acknowledged Epic’s massive lead in the virtual production space. The Epic ecosystem provides an open platform for content creators — UE5 is free in most contexts, so it has become the base standard in the industry. Over 300 TV and film projects have used it since 2016, with that number expected to grow rapidly. Additionally, Epic’s Fortnite game, which runs in UE5, has over 80 million monthly active users, most of whom are 18 to 24 years old. Demand for ICVFX is only growing, putting Epic in a very good position.

I asked both speakers about the future — could Epic keep hold of its crown as market leader after the announcements of Nvidia’s Omniverse and Unity’s virtual production tools? Mr Gray said it would take a lot to overtake Epic, and its current pace of innovation shows no signs of slowing. Mr Johannessen, however, was more philosophical, noting that Epic welcomes competitors because “competition breeds innovation”. This is a smart strategy from Epic: the firm has learnt from the cautionary tale of the dominant company that puts on its blinders and gets sideswiped by the competition. I don’t think this corner of the industry will be a one-horse race for much longer.

Rounding off the keynote presentations, I attended two talks about systems integration in a control room environment. Jochen Bauer from Guntermann & Drunck spoke about the evolution of mission-critical control rooms, using fully remote air traffic control as his example. Integrating all of an airport’s systems, especially older ones, is difficult, but it can be eased by combining hybrid cloud platforms with remote KVM switches to control devices that cannot be moved to a cloud environment, such as local emergency systems. This allows remote supervision of smaller airfields that would otherwise need airspace restrictions because of staffing requirements, with remote operators handling multiple sites either full-time or as needed.
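The hybrid split Mr Bauer described boils down to a routing decision per system: migrated services are reached through the cloud platform, while devices that must stay on site are reached over a remote KVM session. A minimal sketch of that decision, with an entirely hypothetical system inventory:

```python
# Hypothetical inventory: which airport systems have moved to the
# cloud and which must stay on local hardware behind a KVM switch.
SYSTEMS = {
    "flight_data": "cloud",
    "weather": "cloud",
    "emergency_lighting": "local",  # safety-critical, stays on site
}

def access_path(system: str) -> str:
    """Pick the supervision path for a remote operator: hybrid cloud
    for migrated services, remote KVM for devices kept local."""
    placement = SYSTEMS.get(system)
    if placement == "cloud":
        return "hybrid-cloud session"
    if placement == "local":
        return "remote KVM session"
    raise KeyError(f"unknown system: {system}")
```

Because both paths terminate at the same operator desk, one controller can supervise several airfields without every underlying system having to be cloud-capable.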

Tara McLaughlin from Ajar Technology covered the modernization and streamlining of Heathrow Airport’s control rooms using technology similar to that described by Mr Bauer. The airport went from 29 separate control rooms in 2014 to three in 2022. This was a huge undertaking with lots of people involved at all levels. To add to the challenge, downtime wasn’t an option as the airport runs 24/7.

She explained how all the different systems had to be assessed and integrated; several, such as those of Border Force and the Metropolitan Police, had to be kept in secure enclaves. Using a hybrid cloud approach with a redundant local data centre, Ajar reduced the number of monitors by 50% and implemented single sign-on so any operator can use any desk. This also gives terminal controllers much more oversight and access to real-time analytics to keep on top of problems as they occur.

Quick access to data and system resilience through redundancy are pillars of system design. By exploring all options and the latest innovations, especially hybrid cloud workflows, designers are pushing boundaries and forcing a rethink across the board.

Semiconductors were also a big part of the show, with chipmakers showing off the latest chips that can adapt to different video transport approaches, be that Network Device Interface (NDI) or SMPTE standards, or push many high-bandwidth streams down a single cable. The desire for flexibility was clear.

Xilinx, a division of AMD, demonstrated Internet-based audio-visual transmission across its product line. The company has developed multiple platforms to cover most AV-over-IP uses — for example, its Kira system-on-module can be used as a hardware implementation of NDI or, with an IPMX IP core, for SMPTE 2110 workflows. It also showed a machine learning software development kit running on its Zynq UltraScale+ platform, demonstrating real-time face tracking in a 4K video stream.
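Face tracking of this kind typically runs a detector on every frame and then associates each new detection with an existing track. The association step can be sketched with a simple intersection-over-union (IoU) matcher; this is a generic illustration of the technique, not the algorithm in Xilinx's SDK.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    ix = max(0, min(ax2, bx2) - max(a[0], b[0]))
    iy = max(0, min(ay2, by2) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def track(prev_boxes, detections, threshold=0.3):
    """Greedily match this frame's detections to last frame's tracks,
    returning {detection index: previous-box index}."""
    matches = {}
    for i, d in enumerate(detections):
        best = max(range(len(prev_boxes)),
                   key=lambda j: iou(prev_boxes[j], d), default=None)
        if best is not None and iou(prev_boxes[best], d) >= threshold:
            matches[i] = best
    return matches

# A face that moved a few pixels between frames stays on the same track:
matches = track([(100, 100, 50, 50)], [(105, 102, 50, 50)])
```

The detector itself is where the FPGA fabric earns its keep at 4K frame rates; the matching logic above is cheap by comparison.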

Valens Semiconductor, founder of the HDBaseT Alliance, also showcased some interesting chips. Its VA7000 series carries uncompressed video over a single twisted pair, scalable from one to four video streams. Originally developed by its automotive arm, the new solution is expected to be very useful in the videoconferencing market. However, I think that’s just scratching the surface, with a lot of opportunities still to be explored in healthcare applications.
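To appreciate what "uncompressed video over a single twisted pair" implies, a quick back-of-the-envelope calculation helps. The sketch below counts payload bits only, ignoring blanking intervals and transport overhead, and assumes 8-bit 4:4:4 colour (24 bits per pixel) purely for illustration.

```python
def uncompressed_bitrate_gbps(width: int, height: int, fps: int,
                              bits_per_pixel: int = 24) -> float:
    """Raw video payload rate in Gbit/s, ignoring blanking
    and any transport-layer overhead."""
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 60 frames per second, 8-bit 4:4:4 colour:
rate = uncompressed_bitrate_gbps(3840, 2160, 60)
```

That works out to roughly 12 Gbit/s for a single stream, which puts the engineering challenge of carrying even one such stream, let alone four, over a lone twisted pair into perspective.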

ISE 2023 provided an excellent overview of the latest systems integration developments across the industry. Audio and video capabilities were very prominent, as well as the move to hybrid and full cloud solutions in most workflows. Above all, it was really impressive to see how manufacturers are pushing the boundaries of integration and exploring ways to overcome barriers through technological innovation.