Hybrid live concerts: Bringing artists and music fans closer together

Live concerts? We've had those forever. But picture this: Live concerts in which the musicians, instead of standing together on a single stage, are connected online and able to play their music without any perceptible delay. Concertgoers at different venues could participate in the action in person, and artists would hear their fans’ applause in real time. Now, that would truly be something new.

With cinemas and theaters closed down and long-awaited concerts cancelled, the COVID-19 pandemic and subsequent lockdowns have made life difficult for event managers and artists alike. Researchers from three institutions – Fraunhofer FOKUS, Fraunhofer HHI, and Fraunhofer IIS – are working together to get the industry back on track and lend it as much support as possible. To do this, they established the Virtual LiVe project, funded by Fraunhofer’s “KMU akut” program, which focuses on research for small and medium-sized enterprises (SMEs) burdened by the coronavirus pandemic. The aim of the project is to make it easier for SMEs to access cutting-edge technologies, all while fostering the growth of new and hitherto unimaginable formats that offer long-term benefits across the industry.

“We are combining a range of existing Fraunhofer technologies in a single toolkit,” explains Christian Weissig from Fraunhofer HHI. SMEs can access the tools they require and combine these with their own technologies in any way they need. The research team began by conducting workshops with companies and external partners to get a better understanding of the SMEs’ requirements. “In further stages, we plan to carry out feasibility studies and validation projects in which we work together with the SMEs to test the toolkit in specific applications,” reports Dr. Siegfried Foessel from Fraunhofer IIS. Companies also have access to a monthly colloquium where they can network and share information about their technologies.

A hybrid concert in Berlin’s Kesselhaus
“The best way to understand the Virtual LiVe project is to consider one of the initiative’s validation projects – in this case, the upcoming hybrid concert at the Kesselhaus in Berlin,” explains Stephan Steglich from Fraunhofer FOKUS. On December 11, 2021, Billy Andrews, also known as The Dark Tenor, will hold a concert at the venue. This event will have three very special features. First, Billy Andrews will be accompanied live by the Queenz of Piano. Unlike Andrews, however, the Queenz of Piano will not be playing at the Kesselhaus – instead, the performers will be connected via the Internet. Second, the concert is set to be streamed live to the Planetarium Bochum, where the audience can enjoy a 360-degree audiovisual experience – almost like the live show itself. Audio and video will not only stream from the Kesselhaus to the planetarium; data will be transmitted in both directions, allowing the performers in Berlin to hear synthesized applause from the audience in Bochum. This applause is generated from emojis that participants send via a web app. Cinemas may serve as additional locations, and initial tests are already underway. Finally, the concert will also be streamed live to viewers at home. Here, the idea is to let remote viewers enjoy the concert in a way that comes as close as possible to the live experience. Viewers will be able to interact with the artist and, with a little luck, even be invited on stage by Andrews himself. In the future, it may even be possible to sell special online tickets for these kinds of extra features.
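The emoji-to-applause idea can be illustrated with a minimal sketch: reactions collected over a short time window are aggregated and mapped to an applause intensity that could drive a synthesizer at the venue. The emoji names, weights, and saturation constant below are illustrative assumptions, not the project’s actual protocol.

```python
from collections import Counter

# Hypothetical weights per emoji type (assumed for illustration).
EMOJI_WEIGHT = {"clap": 1.0, "fire": 0.9, "heart": 0.8, "smile": 0.5}

def applause_level(reactions, max_level=1.0):
    """Map one time window of emoji reactions to an intensity in [0, max_level]."""
    counts = Counter(reactions)
    raw = sum(EMOJI_WEIGHT.get(name, 0.3) * n for name, n in counts.items())
    # Soft saturation: a huge burst of reactions approaches, but never
    # exceeds, max_level, so the synthesized applause cannot clip.
    return min(max_level, raw / (raw + 25.0))

quiet = applause_level(["clap", "smile"])          # a few scattered reactions
loud = applause_level(["clap"] * 200 + ["fire"] * 50)  # an enthusiastic crowd
```

A real implementation would run this per window on the server side and stream the resulting level to the audio engine in Berlin; the saturation curve is one of many possible design choices.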

Finally, completing the toolkit is MPEG-H, an audiovisual coding standard. MPEG-H 3D Audio, which is developed in large part by Fraunhofer IIS, is used to produce three-dimensional sound that improves upon existing surround sound technologies. “This 3D sound is not only perfectly suited for the planetarium in Bochum – it is also ideal for home sound systems or headphones when you’re on the go,” explains Dr. Ulli Scuda from Fraunhofer IIS, adding, “and the audio only needs to be produced once.”

The MPEG-H decoder integrated in commercially available AV receivers renders the signal for the specific end device in each application, producing high-quality playback in any setting.
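The “produce once, render everywhere” principle can be sketched as a simple dispatch: one object-based audio production, with a rendering target chosen per end device. The device names and layout labels below are assumptions for illustration; the actual rendering happens inside the MPEG-H decoder, not in application code like this.

```python
# Illustrative mapping from end device to rendering target for a single
# audio production (names are assumed, not MPEG-H's actual API).
RENDER_TARGETS = {
    "planetarium": "multi-loudspeaker dome rendering",
    "home_av_receiver": "5.1.4 loudspeaker layout",
    "soundbar": "virtualized 3D rendering",
    "headphones": "binaural rendering",
}

def choose_rendering(device: str) -> str:
    """Pick a playback rendering for a device; fall back to a stereo downmix."""
    return RENDER_TARGETS.get(device, "stereo downmix")
```

The point of the sketch is the asymmetry: the production side stays constant, while only the final rendering step varies with the playback setting.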

The virtual concert venue
In a second validation project, the Fraunhofer research team plans to take another major step forward – this time, joined by Radar Media and High Road Stories, the companies behind FANTAVENTURA, a virtual reality experience created for the German hip-hop ensemble Die Fantastischen Vier. In this validation project, titled “The Other Room: Martin Kohlstedt,” the team makes a volumetric recording of a concert, for which the necessary processing steps are performed offline. The pre-processed recording can then be streamed in real time to remote viewers. These viewers can experience a live performance by an artist inside a virtual space through which they are able to move freely. Viewers wearing a VR headset will be able to take a few steps forward, backward, or to the side and turn their heads in different directions. The necessary video data is transmitted to the VR headset, offering viewers what experts refer to as “six degrees of freedom,” or 6DoF for short. In this field test, between 50 and 100 viewers will be able to simultaneously experience the virtual concert, complemented by immersive, spatial audio playback. “We want to use MPEG-DASH to send scalable three-dimensional data so that participants can even use smartphones as end devices, for instance,” says Steglich. In the long run, the team’s aim is to establish a complete live end-to-end volumetric workflow, from capture to playback.
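The scalable-delivery idea behind MPEG-DASH can be sketched as client-side adaptation: the player picks the richest representation its measured throughput can sustain, so a smartphone and a tethered VR headset can consume the same stream at different detail levels. The representation labels and bitrates below are invented for illustration; in real MPEG-DASH, the available representations are described in an MPD manifest.

```python
# Assumed representations of the volumetric stream, ordered best-first:
# (label, required bandwidth in kbit/s) - values are illustrative only.
REPRESENTATIONS = [
    ("full-detail volumetric video", 25_000),
    ("reduced-detail volumetric video", 10_000),
    ("low-detail preview", 3_000),
]

def pick_representation(measured_kbps: float) -> str:
    """Return the best representation the measured connection can sustain."""
    for label, required in REPRESENTATIONS:
        if measured_kbps >= required * 1.2:  # 20% safety margin against stalls
            return label
    return REPRESENTATIONS[-1][0]  # always fall back to the lowest tier
```

In practice this decision is re-evaluated continuously per segment, which is what lets the same live stream serve both headsets and smartphones.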

Among other technologies, the team plans to incorporate the MPEG-I standard MPEG Immersive Video (MIV), which is currently in development, as an alternative way to transmit the required video data. The advantage of MIV is that the decoder comes with a built-in renderer that can generate any novel view needed. Instead of transmitting the entire 3D model and rendering it on the viewer’s side as usual, the required view would be rendered from a set of 2D videos. The researchers at Fraunhofer IIS are currently exploring whether the MPEG-I MIV standard is suitable for this type of application and how many virtual cameras would need to be placed throughout the virtual scene. All of this raises the question: What can we achieve using current technology? Virtual LiVe aims to find the answer.



For more information, questions, or requests, please contact Michel Bätz via e-mail: