FMT#2: Enabling the immersive media experience for all
- 5G-MAG

- Sep 17
Updated: Sep 22
Here you can find the presentations and summaries from the Future Media Townhall session on "The Future of Media Experiences".
Please note that these are quick summaries generated with AI tools. We apologize for any inaccuracies; please refer to the speakers' presentations for the details.
Find other content from the event at the following links:
Towards the on-line-only future of media delivery: https://www.5g-mag.com/post/fmt-1-towards-the-on-line-only-future-of-media-delivery
5G Broadcast, ready for launch: https://www.5g-mag.com/post/fmt-3-5g-broadcast-ready-for-launch
5G-MAG Members' Demos and Paper Pitches: https://www.5g-mag.com/post/5g-mag-members-demos-and-papers-shine-at-the-fmt

Presentation from Disney Studios, by Eddie Drake
Disney is reimagining storytelling through immersive technologies that span multiple devices and platforms. Their research focuses on creating the building blocks of mixed reality, including engines, assets, and technical standards to ensure interoperability. Content consumption is evolving with formats like 180°/360° video, stereoscopic and spatial storytelling, and both linear and interactive experiences. Disney’s strategy is to optimize distribution of XR content across devices—headsets, glasses, and handhelds—while integrating with Disney+. Real-time rendering and streaming play a central role, ensuring engaging, cross-device XR experiences that bring consistent storytelling wherever fans choose to connect.
5 Takeaways
Disney is investing in fundamental XR research, covering engines, assets, and standards.
Storytelling formats now include 180°/360°, stereoscopic, spatial, linear, and non-linear.
Distribution strategy emphasizes multi-device support: handhelds, headsets, and AR glasses.
Disney+ will serve as a key platform for integrating immersive experiences.
Real-time rendering and streaming are crucial to delivering consistent cross-platform XR.
Presentation from Accedo, by José Somolinos
The presentation explores how XR (Extended Reality) can transform sports and entertainment by merging the best of watching at home with the immersion of being at the stadium. Fans want freedom of choice—camera angles, stats, and replays—while also craving live atmosphere, spatial audio, and closeness to the action. Delivering this requires synchronized video/data feeds, ultra-high-resolution immersive video, advanced codecs, edge computing, and next-generation connectivity. Future innovations include AI-driven companions, spatial stats, interactive overlays, and social engagement layers. The ultimate goal: an experience more immersive, engaging, and interactive than either traditional broadcasting or attending in person.
5 Takeaways
Fans seek both comfort and immersion: customizable home viewing plus stadium atmosphere.
Technology enablers: 8K+ immersive video, spatial audio, edge computing, and high-speed networks.
Efficiency innovations: variable pixel focus and tiled streaming reduce bandwidth by up to 75% (see the sketch after this list).
Fan of the future: interactivity through AI companions, 3D overlays, gamification, and social layers.
Standardization is critical: shared frameworks for synchronization, spatial stats, and cross-platform experiences.
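
To make the tiled-streaming takeaway concrete, here is a minimal sketch in C of viewport-adaptive tile selection for 360° video: tiles inside the viewer's field of view are fetched at high quality, the rest at a low-bitrate fallback. The 8x4 tile grid, bitrates, and field-of-view values are invented for illustration; none of the numbers come from the presentation.

```c
#include <stdbool.h>
#include <stdio.h>

#define TILE_COLS 8        /* hypothetical 8x4 tile grid over the frame */
#define TILE_ROWS 4
#define HQ_KBPS 2500.0     /* per-tile bitrate, high quality */
#define LQ_KBPS 300.0      /* per-tile bitrate, low-quality fallback */

/* True if the tile's centre (yaw/pitch in degrees) falls inside a simple
 * rectangular field of view around the viewer's gaze direction. */
static bool tile_in_viewport(int col, int row, double yaw, double pitch,
                             double fov_h, double fov_v) {
    double tile_yaw = (col + 0.5) * 360.0 / TILE_COLS - 180.0;
    double tile_pitch = 90.0 - (row + 0.5) * 180.0 / TILE_ROWS;
    double dyaw = tile_yaw - yaw;
    while (dyaw > 180.0) dyaw -= 360.0;   /* wrap across the +/-180 seam */
    while (dyaw < -180.0) dyaw += 360.0;
    double dpitch = tile_pitch - pitch;
    return dyaw > -fov_h / 2 && dyaw < fov_h / 2 &&
           dpitch > -fov_v / 2 && dpitch < fov_v / 2;
}

int main(void) {
    const double all_hq = TILE_COLS * TILE_ROWS * HQ_KBPS;
    double mixed = 0.0;
    for (int r = 0; r < TILE_ROWS; r++)
        for (int c = 0; c < TILE_COLS; c++)
            mixed += tile_in_viewport(c, r, 0.0, 0.0, 110.0, 90.0)
                         ? HQ_KBPS : LQ_KBPS;
    printf("all-HQ: %.0f kbps, viewport-adaptive: %.0f kbps (%.0f%% saved)\n",
           all_hq, mixed, 100.0 * (1.0 - mixed / all_hq));
    return 0;
}
```

With these example numbers, only 4 of the 32 tiles are fetched at high quality and the stream drops from 80,000 to 18,400 kbps, about 77% saved, in the same range as the figure quoted above.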
Presentation from Collabora, by Frederic Plourde
This presentation highlights the role of open standards and open-source technology in shaping the future of XR for sports and entertainment. XR enables fans to experience live content with greater immersion and interactivity, making it a natural driver of adoption in media. However, challenges include balancing bandwidth with fidelity, future-proofing against rapidly evolving technology, and avoiding vendor lock-in. OpenXR, developed by the Khronos Group, provides a standard API for cross-device compatibility. Monado, an open-source runtime, and Electric-Maple, an XR streaming platform, demonstrate how open ecosystems foster innovation, interoperability, and trust, ensuring scalable and future-proof XR experiences.
5 Takeaways
XR enhances fan engagement by creating immersive experiences in sports and entertainment settings.
OpenXR is a widely adopted open standard that provides a common set of APIs for developing XR applications that run across a wide range of AR and VR devices (a minimal example follows this list).
Monado offers an open-source XR runtime, supporting safe experimentation on Linux, Windows, and Android platforms.
Electric-Maple enables remote XR streaming, integrating Monado and GStreamer with support for major GPU vendors.
Open-source ecosystems promote trust and flexibility, helping creators avoid vendor lock-in and future-proof their XR solutions.
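
As a minimal illustration of the cross-device portability mentioned above, the following C snippet creates an OpenXR instance and asks whichever runtime the loader selected (Monado or any other conforming runtime) to identify itself; the same application code runs unchanged across runtimes. The application name is a placeholder, and error handling is kept to a bare minimum.

```c
/* Build against the OpenXR headers and loader, e.g.:
 *   gcc probe.c -lopenxr_loader */
#include <openxr/openxr.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Describe the application to the runtime. */
    XrInstanceCreateInfo create_info = {XR_TYPE_INSTANCE_CREATE_INFO};
    strncpy(create_info.applicationInfo.applicationName, "fmt-demo",
            XR_MAX_APPLICATION_NAME_SIZE);
    create_info.applicationInfo.applicationVersion = 1;
    create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    /* The loader dispatches to the active runtime (Monado, a vendor
     * runtime, ...) with no application-side changes. */
    XrInstance instance = XR_NULL_HANDLE;
    XrResult result = xrCreateInstance(&create_info, &instance);
    if (XR_FAILED(result)) {
        fprintf(stderr, "xrCreateInstance failed: %d\n", (int)result);
        return 1;
    }

    /* Ask the active runtime to identify itself. */
    XrInstanceProperties props = {XR_TYPE_INSTANCE_PROPERTIES};
    xrGetInstanceProperties(instance, &props);
    printf("Active OpenXR runtime: %s\n", props.runtimeName);

    xrDestroyInstance(instance);
    return 0;
}
```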
Presentation from Qualcomm, by Thomas Stockhammer
The presentation focuses on how 6G and AI agents will revolutionize media experiences, XR, and connectivity at scale. As interaction shifts from app-based, user-initiated sessions to persistent, context-aware AI agents, consumers will rely on seamless natural interfaces (voice, glasses, wearables). These AI-driven experiences demand ultra-low latency, distributed computing, and intelligent networks capable of adapting in real time. 6G will provide tiered inferencing across device, edge, and cloud, optimizing performance and power. New applications include digital twins, immersive XR, personalized media, and avatar communication. Ultimately, 6G aims to enable richer user experiences, collaborative communications, and next-generation digital services.
5 Takeaways
AI agents will dominate: shifting user interaction from apps to natural, context-aware assistants.
6G enables scale: supporting XR, IoT, robotics, and immersive media with low latency and high reliability.
Tiered AI inferencing: balancing compute between devices, edge, and cloud to optimize performance (see the sketch after this list).
Network intelligence: UX-aware RAN and collaborative communications adapt resources based on user and app context.
New media possibilities: full-body avatars, AI-driven media workflows, immersive XR, and trusted communications.
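
As a rough sketch of what tiered inferencing could look like, the following C example routes a request to the device, edge, or cloud tier based on its latency budget, model size, and battery headroom. The tiers, thresholds, and request profiles are invented for illustration; they are not from the presentation or any Qualcomm or 3GPP specification.

```c
#include <stdio.h>

typedef enum { TIER_DEVICE, TIER_EDGE, TIER_CLOUD } Tier;

typedef struct {
    double latency_budget_ms;  /* how long the application can wait */
    double model_gparams;      /* model size, billions of parameters */
    int battery_pct;           /* remaining battery on the device */
} Request;

static Tier route(const Request *r) {
    /* Small models with battery headroom run locally for minimum latency. */
    if (r->model_gparams <= 3.0 && r->battery_pct > 20)
        return TIER_DEVICE;
    /* Mid-size models under a tight latency budget go to the network edge. */
    if (r->model_gparams <= 30.0 && r->latency_budget_ms < 100.0)
        return TIER_EDGE;
    /* Everything else can afford the round trip to the cloud. */
    return TIER_CLOUD;
}

int main(void) {
    const char *names[] = {"device", "edge", "cloud"};
    Request xr_pose = {20.0, 1.0, 80};      /* latency-critical, tiny model */
    Request avatar = {80.0, 13.0, 80};      /* conversational avatar */
    Request batch_gen = {2000.0, 70.0, 80}; /* offline media generation */
    printf("XR pose -> %s\n", names[route(&xr_pose)]);
    printf("avatar  -> %s\n", names[route(&avatar)]);
    printf("batch   -> %s\n", names[route(&batch_gen)]);
    return 0;
}
```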
Presentation from 3GPP SA4, by Gilles Teniou
This presentation provides an update on 3GPP SA4’s media-related activities and their role in paving the way toward 6G. Building on Release 19, advancements include network slicing, QoS improvements, support for new media formats, avatar communication, and DRM in DASH. Release 20 will push forward with Ultra Low Bitrate Coding (especially for satellite), AI/ML integration in IMS, and studies on energy efficiency. Future directions also explore 3D Gaussian Splats, advanced media formats (JPEG-AI, stereo+depth), and media delivery over QUIC. Together, these evolutions aim to support richer, more efficient, and immersive media experiences for the 5G-to-6G era.
5 Takeaways
Release 19 achievements: improvements in media delivery, DRM, avatar communication, and advanced codecs.
QoS advancements: better congestion handling, monitoring, and efficient media distribution.
Release 20 focuses: ultra-low bitrate audio, AI/ML support, and energy efficiency analysis.
Future studies: 3D Gaussian Splats (see the sketch after this list), depth-based video, and dynamic traffic characterization.
Road to 6G: foundational work leading to interoperability, scalability, and advanced immersive media services.
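
For readers unfamiliar with the term, a 3D Gaussian Splat scene is essentially a large array of records along the lines of the C struct below, rasterized in depth order. The field layout follows the representation common in published Gaussian-splatting work; the formats actually under study in 3GPP may differ.

```c
/* One "splat": an anisotropic 3D Gaussian with appearance data. A captured
 * scene consists of up to millions of these. Layout is illustrative. */
typedef struct {
    float position[3];   /* centre of the Gaussian in scene space */
    float scale[3];      /* per-axis extent of the Gaussian */
    float rotation[4];   /* orientation as a quaternion */
    float opacity;       /* alpha used when blending overlapping splats */
    float sh_coeffs[48]; /* view-dependent colour: 3 channels x 16
                            spherical-harmonic coefficients (degree 3) */
} GaussianSplat;
```

At roughly 236 bytes per splat uncompressed, scenes of a few million splats reach hundreds of megabytes, which is why compressing and delivering this format is a study topic.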