- 5G-MAG announces Steering Group leaders for the 2025-2027 term
The 5G-MAG General Assembly has elected a new Steering Group to lead the activities of the association for the next two years. The new Steering Group members are listed on the Structure page of the 5G-MAG website. About 5G-MAG - The Media Connectivity Association: 5G-MAG is a not-for-profit association fostering collaboration to shape the future of connected media experiences. We focus on solutions for internet media and multimedia services, building on open standards and accelerating adoption through open-source software. At 5G-MAG, our members bridge standards and implementation by turning specifications into usable reference tools and open-source enablers, and foster collaboration across industries and academia to solve real-world challenges in internet media applications. By combining standards with open source, we ensure that services, applications and networks are interoperable and future-proof.
- New Software! - MBS User Service: MBS Transport Function
We are pleased to announce the release of version 1.1.0 of the MBS Transport Function (part of the MBS User Services). This release improves the composition of the FLUTE session for streaming-mode Distribution Sessions, adds the Distribution Session state transitions, and improves error responses for calls to the API. Please see the tutorial for examples of how to use the MBSTF: https://lnkd.in/d9BzugJZ The full release notes are available in the 5G-MAG rt-mbs-transport-function repository: https://lnkd.in/dGdyqMJb We'd like to thank BBC's Richard Bradbury, David Waring and Dev Audsin for their contributions to the 5G-MAG Reference Tools, and the community for supporting testing. This release starts to close the gap with the implementation of MBS for the 5GC and the initial support for the NG-RAN CU contributed by Jaime Sánchez Roldán, Borja Iñesta Hernández and David Gomez-Barquero of the iTEAM Research Institute, Universitat Politècnica de València (UPV). Find more information about the project at https://5g-mag.github.io/Getting-Started/pages/5g-multicast-broadcast-services/ or contact Daniel Silhavy directly.
- New Software! - 5G Core Service Consumers (for BSF, PCF and MB-SMF)
We are pleased to announce the release of the 5G Core Service Consumers v2.0.1. The v2 release of the service consumers adds the MB-SMF Service Consumer library, libscmbsmf, and two example tools that use the new library to exercise the Nmbsmf_TMGI and Nmbsmf_MBSSession APIs. The release also includes an uplift of the existing PCF and BSF Service Consumer libraries to be compatible with Open5GS v2.7.2. If you still need compatibility with Open5GS v2.6.4, please use the v1.0.x branch of the 5G Core Service Consumers or any v1 release. For examples of how to use the new tools for libscmbsmf, please see the tutorials: https://hub.5g-mag.com/Getting-Started/pages/5g-core-service-consumers/tutorials.html The release can be found in the 5G-MAG rt-5gc-service-consumers repository at: https://github.com/5G-MAG/rt-5gc-service-consumers/releases/tag/rt-5gc-service-consumers-2.0.1 We'd like to thank BBC's Richard Bradbury, David Waring and Dev Audsin for their contributions to the 5G-MAG Reference Tools, and the community for supporting testing. Find more information about the project at https://hub.5g-mag.com/Getting-Started/pages/5g-core-service-consumers/
- Time synchronization services for media production over 5G networks
Download the 5G-MAG Report (PDF) Download the supplementary document on ST2110 (PDF) ABOUT THE REPORT This is a report produced by the 5G-MAG Workgroup CP (Content Production - Standards and Architecture). Current version of the report: v1.0 Date of publication: 22nd December 2023 ABSTRACT The 5G-MAG report "Towards a comprehensive 5G-based toolbox for live media production" identified a series of high-level scenarios where time synchronization is a required feature. Time synchronization is key to aligning different media sources, whether within the same location, when carried over different network domains (fixed/wireless), or to support remote operations and control. This report provides: information on the ability of 5G technologies to support time services in relevant deployment scenarios; and guidelines on time information distribution and exposure to applications.
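As background to the report's scenarios, the core calculation behind two-way time transfer (the mechanism underlying NTP- and PTP-style synchronization) can be sketched as follows. This is an illustrative example, not taken from the report; the function name and timestamp values are ours.

```python
# Illustrative sketch (not from the 5G-MAG report): the classic two-way
# time-transfer calculation used by NTP/PTP-style protocols to estimate
# the offset between a local clock and a reference clock.
def clock_offset_and_delay(t1, t2, t3, t4):
    """t1: request sent (client clock), t2: request received (server clock),
    t3: reply sent (server clock), t4: reply received (client clock).
    Assumes symmetric one-way path delays."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)            # round-trip network delay
    return offset, delay

# Example: client clock runs 5 ms behind the server, 10 ms one-way delay.
offset, delay = clock_offset_and_delay(t1=100.000, t2=100.015,
                                       t3=100.016, t4=100.021)
print(offset, delay)  # offset ≈ 0.005 s, delay ≈ 0.020 s
```

The offset estimate is exact only when the forward and reverse path delays are equal; asymmetry between network domains (fixed/wireless) is precisely why the report's deployment guidance matters.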
- New Software! - 5MBS Core NFs and NG-RAN CU implementations
We are very happy to announce the availability of new software for 5G Multicast Broadcast Services (MBS). The new developments encompass the 5G core, as an extension of Open5GS, and the Central Unit (CU) of the gNodeB, as an extension of the srsRAN_Project. Both components implement functionalities for establishing MBS Broadcast Sessions. At this stage, Multicast Services support is not yet included, and the Distributed Unit (DU) and User Equipment (UE) components are not provided. This therefore constitutes a first MVP that enables exploration and evaluation of MBS control and session management from the 5GC up to the early stages of the NG-RAN, with a stable and extensible foundation for future work to add the remaining components and multicast capability. These contributions have been funded by the European Union-funded project 6G-SANDBOX, and are made available to the industry and the wider community via the 5G-MAG Reference Tools Developer Community. 💬 The software is available under our forks of: open5gs / 5mbs - https://github.com/5G-MAG/open5gs/tree/5mbs srsRAN_Project / 5mbs - https://github.com/5G-MAG/rt-srsRAN_Project/tree/5mbs A big thank you to the main contributors of this release: Borja Iñesta Hernández and Jaime Sanchez Roldan, from the iTEAM Research Institute of the Universitat Politècnica de València (UPV). Find more information about the project at https://5g-mag.github.io/Getting-Started/pages/5g-multicast-broadcast-services/
- 17+24.10.2025 - Workshop: 3GPP Release 19, from Specification to Implementation
With 3GPP Release 19 now frozen and the specifications complete, the next step is implementation. And what better way to dive in than a 5G-MAG Workshop where spec creators meet open-source builders? Let's get insights straight from the doers and start turning Rel-19 into reality.
PART I - 17th October 2025. Get the invite (with an ICS file) for FREE: eveeno.com/5g-mag
- AMD: Extensions to 5G Media Streaming protocols (Discussion lead: Thomas Stockhammer, Qualcomm): Common Media Client Data (CMCD); multi-access media delivery; media delivery from multiple service endpoints/locations, including CMMF and Content Steering; distributing DRM-encrypted and high-value content; improved QoS support for Media Streaming services, including ECN marking for L4S and QoS monitoring; Media Streaming aspects of Network Slicing
- 5G_RTP: 5G Real-time Media Transport Protocol for XR Services (Discussion lead: Saba Ahsan, Nokia)
- Split Rendering with IMS (Discussion lead: Saba Ahsan, Nokia)
- Wrap-up
PART II - 24th October 2025. Get the invite (with an ICS file) for FREE: eveeno.com/5g-mag
- AMD: Extensions to MBS protocols (Discussion lead: Thomas Stockhammer, Qualcomm): in-session unicast repair for MBS Object Distribution; selected MBMS functionalities not supported in MBS
- XR Media / Avatar (Discussion lead: Thomas Stockhammer, Qualcomm) - teaser
🔍 What to Expect
- Deep dives into key Release 19 features relevant to content delivery and immersive media experiences
- Identification of implementation priorities and integration pathways
- Collaborative discussions on aligning specifications with open-source development
- Opportunities to contribute to the evolution of connected media technologies
Whether you're a developer, researcher, or standards contributor, this workshop offers a unique opportunity to bridge specification and deployment, and to help shape the future of interoperable, standards-based solutions.
📅 Date: 17th October at 13:00 CEST + 24th October at 13:00 CEST 📍 Location: Online via Zoom 🔗 Register: https://eveeno.com/5g-mag Find all the information about the Reference Tools at: developer.5g-mag.com
- 28.10.2025 - 5G-MAG at 6G-XR Impact Day
Join our Head of Technology, Jordi J. Giménez, at the 6G-XR Impact Day. He will be speaking about standardization efforts and open-source software for Connected Media applications. On top of this, we have been kindly invited to bring some of our 5G-MAG Reference Tools demos! EVENT: 6G-XR Impact Day DATE: 28th October 2025 LOCATION: 5TONIC – IMDEA Networks Institute Madrid INFORMATION ABOUT THE EVENT AND REGISTRATION: https://6g-xr.eu/event/6g-xr-impact-day
- 21/22.10.2025 - 5G-MAG at NEM Summit
Join Alexander Zoubarev, who had to step in last minute for Daniel Silhavy, at the NEM Summit. Alex will be presenting our work on Standards and Software for Connected Media Experiences, Open-Source, Standardization impact and Industry Collaboration. EVENT: NEM Summit DATE: 21st and 22nd October 2025 LOCATION: Fraunhofer Fokus, Berlin. INFORMATION ABOUT THE EVENT AND REGISTRATION: https://nem-initiative.org/nem-summit-2025/
- New Releases! - Content Delivery Protocols (FLUTE)
We are happy to announce the release of version 0.11.0 of our FLUTE library. The release includes various new features, improvements, and bug fixes, such as the ability to transmit via a UDP tunnel and XML namespace support for FDT handling. 💬 Release Notes: https://lnkd.in/dF-MxVae A big thank you to the main contributors of this release: David Waring (BBC), Klaus Kühnhammer (Bitstem) and Alexander Zoubarev (Fraunhofer FOKUS). 👏 A special thank you to Yannick Poirier (SES Satellites) for his guidance in finding and fixing multiple issues in the library.
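For readers unfamiliar with the File Delivery Table (FDT) that the release's namespace support concerns, the sketch below builds a minimal namespaced FDT Instance. This is a Python illustration under our own assumptions, not the C++ API of the FLUTE library itself; the namespace URI and attribute names follow the FDT schema in RFC 6726, while the function name and example values are ours.

```python
# Illustrative sketch (not the rt-libflute API): building a minimal FLUTE
# File Delivery Table (FDT) Instance with an explicit XML namespace,
# similar to the namespaced FDTs the 0.11.0 release now handles.
import xml.etree.ElementTree as ET

FDT_NS = "urn:ietf:params:xml:ns:fdt"  # FDT namespace defined in RFC 6726

def build_fdt_instance(expires: str, files: list) -> bytes:
    ET.register_namespace("", FDT_NS)  # serialize as the default namespace
    fdt = ET.Element(f"{{{FDT_NS}}}FDT-Instance", {"Expires": expires})
    for f in files:
        ET.SubElement(fdt, f"{{{FDT_NS}}}File", {
            "TOI": str(f["toi"]),               # Transport Object Identifier
            "Content-Location": f["location"],  # URI of the delivered file
            "Content-Length": str(f["length"]), # size in bytes
            "Content-Type": f.get("type", "application/octet-stream"),
        })
    return ET.tostring(fdt, encoding="utf-8", xml_declaration=True)

# Example FDT describing one file (Expires is an illustrative value).
xml_bytes = build_fdt_instance("331129600", [
    {"toi": 1, "location": "https://example.com/media/init.mp4",
     "length": 1024, "type": "video/mp4"},
])
print(xml_bytes.decode())
```

A namespace-aware parser matches elements by `{urn:ietf:params:xml:ns:fdt}File` rather than the bare tag name, which is why explicit namespace handling matters for interoperable FDT processing.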
- New Releases! - V3C Immersive Platform
We are very happy to announce the release of version 1.1 of the V3C Immersive Platform. The release includes an MPEG V3C DASH packager and tools and scripts to generate V-PCC bitstreams. In addition, we added the source code for Haptic (decoder and synthesizer) and V-PCC (synthesizer) components and improved the V-PCC renderer. The links to the detailed release notes can be found below. We would like to thank all the contributors to this release, especially Bertrand Leroy, Patrick Fontaine, Céline Guede, Cyril Quinquis, Bart Kroon (Philips) and Nils Duval (Motion Spell), who contributed to the development, testing and deployment of these tools. 💬 Release Notes: - V3C Unity Player: https://lnkd.in/dhREXZEG - V3C Decoder Plugin: https://lnkd.in/d48Fcpdp - V3C Content: https://lnkd.in/d4Vi5-Vg 💥 Getting Started To get started, check out our V3C tutorial here: https://lnkd.in/dyrt6RjM You can directly contact Daniel Silhavy, Nils Duval and Jordi J. Giménez for more info.
- 9.10.2025 - Demonstrators at MPEG 152 Meeting in Geneva
Join 5G-MAG at the MPEG 152 Meeting in Geneva with our Reference Tools demonstrators on XR Media with MPEG-I Scene Description and the V3C Immersive Platform . Find all the information about Reference Tools at: developer.5g-mag.com
- FMT#2: Enabling the immersive media experience for all
Find here the presentations and summaries of the Future Media Townhall session on "The Future of Media Experiences". Please note that these are quick summaries generated with AI tools; we apologize for any inaccuracies and refer you to the speakers' presentations for the details. Find other content in the following links:
- Towards the on-line-only future of media delivery: https://www.5g-mag.com/post/fmt-1-towards-the-on-line-only-future-of-media-delivery
- 5G Broadcast, ready for launch: https://www.5g-mag.com/post/fmt-3-5g-broadcast-ready-for-launch
- 5G-MAG Members' Demos and Paper Pitches: https://www.5g-mag.com/post/5g-mag-members-demos-and-papers-shine-at-the-fmt

Presentation from Disney Studios, by Eddie Drake
Go to the Slides: https://drive.google.com/file/d/14tMPxVjw5A-5nY5S8dG3xdD5WdFF_tgm/preview
Disney is reimagining storytelling through immersive technologies that span multiple devices and platforms. Their research focuses on creating the building blocks of mixed reality, including engines, assets, and technical standards to ensure interoperability. Content consumption is evolving with formats like 180°/360° video, stereoscopic and spatial storytelling, and both linear and interactive experiences. Disney's strategy is to optimize distribution of XR content across devices (headsets, glasses, and handhelds) while integrating with Disney+. Real-time rendering and streaming play a central role, ensuring engaging, cross-device XR experiences that bring consistent storytelling wherever fans choose to connect.
5 Takeaways:
- Disney is investing in fundamental XR research, covering engines, assets, and standards.
- Storytelling formats now include 180°/360°, stereoscopic, spatial, linear, and non-linear.
- Distribution strategy emphasizes multi-device support: handhelds, headsets, and AR glasses.
- Disney+ will serve as a key platform for integrating immersive experiences.
- Real-time rendering and streaming are crucial to delivering consistent cross-platform XR.

Presentation from Accedo, by José Somolinos
Go to the Slides: https://drive.google.com/file/d/1gQKFH6rbUrEoTZbHfW6h05B6EWnTc6O7/preview
The presentation explores how XR (Extended Reality) can transform sports and entertainment by merging the best of watching at home with the immersion of being at the stadium. Fans want freedom of choice (camera angles, stats, and replays) while also craving live atmosphere, spatial audio, and closeness to the action. Delivering this requires synchronized video/data feeds, ultra-high-resolution immersive video, advanced codecs, edge computing, and next-generation connectivity. Future innovations include AI-driven companions, spatial stats, interactive overlays, and social engagement layers. The ultimate goal: an experience more immersive, engaging, and interactive than either traditional broadcasting or attending in person.
5 Takeaways:
- Fans seek both comfort and immersion: customizable home viewing plus stadium atmosphere.
- Technology enablers: 8K+ immersive video, spatial audio, edge computing, and high-speed networks.
- Efficiency innovations: variable pixel focus and tiled streaming reduce bandwidth by up to 75%.
- Fan of the future: interactivity through AI companions, 3D overlays, gamification, and social layers.
- Standardization is critical: shared frameworks for synchronization, spatial stats, and cross-platform experiences.

Presentation from Collabora, by Frederic Plourde
Go to the Slides: https://drive.google.com/file/d/1GNouLR_fIV0HJ9jCAWXwqn-76Lqb6dd-/preview
This presentation highlights the role of open standards and open-source technology in shaping the future of XR for sports and entertainment. XR enables fans to experience live content with greater immersion and interactivity, making it a natural driver of adoption in media. However, challenges include balancing bandwidth with fidelity, future-proofing against rapidly evolving technology, and avoiding vendor lock-in. OpenXR, developed by the Khronos Group, provides a standard API for cross-device compatibility. Monado, an open-source runtime, and Electric Maple, an XR streaming platform, demonstrate how open ecosystems foster innovation, interoperability, and trust, ensuring scalable and future-proof XR experiences.
5 Takeaways:
- XR enhances fan engagement by creating immersive experiences in sports and entertainment settings.
- OpenXR is a widely adopted open standard that provides a common set of APIs for developing XR applications that run across a wide range of AR and VR devices.
- Monado offers an open-source XR runtime, supporting safe experimentation on Linux, Windows, and Android platforms.
- Electric Maple enables remote XR streaming, integrating Monado and GStreamer with support for major GPU vendors.
- Open-source ecosystems promote trust and flexibility, helping creators avoid vendor lock-in and future-proof their XR solutions.

Presentation from Qualcomm, by Thomas Stockhammer
Go to the Slides: https://drive.google.com/file/d/1v9czWxmkduixH5RI9A6iLZHoTG9c3PrX/preview
The presentation focuses on how 6G and AI agents will revolutionize media experiences, XR, and connectivity at scale. Shifting from app-based, user-initiated interactions to persistent, context-aware AI agents, consumers will rely on seamless natural interfaces (voice, glasses, wearables). These AI-driven experiences demand ultra-low latency, distributed computing, and intelligent networks capable of adapting in real time. 6G will provide tiered inferencing across device, edge, and cloud, optimizing performance and power. New applications include digital twins, immersive XR, personalized media, and avatar communication. Ultimately, 6G aims to enable richer user experiences, collaborative communications, and next-generation digital services.
5 Takeaways:
- AI agents will dominate, shifting user interaction from apps to natural, context-aware assistants.
- 6G enables scale, supporting XR, IoT, robotics, and immersive media with low latency and high reliability.
- Tiered AI inferencing balances compute between devices, edge, and cloud to optimize performance.
- Network intelligence: UX-aware RAN and collaborative communications adapt resources based on user and app context.
- New media possibilities: full-body avatars, AI-driven media workflows, immersive XR, and trusted communications.

Presentation from 3GPP SA4, by Gilles Teniou
Go to the Slides: https://drive.google.com/file/d/14Wo8-iBAihvtjdYtFjI2z4De3MJkEdwa/preview
This presentation provides an update on 3GPP SA4's media-related activities and their role in paving the way toward 6G. Building on Release 19, advancements include network slicing, QoS improvements, support for new media formats, avatar communication, and DRM in DASH. Release 20 will push forward with Ultra Low Bitrate Coding (especially for satellite), AI/ML integration in IMS, and studies on energy efficiency. Future directions also explore 3D Gaussian Splats, advanced media formats (JPEG-AI, stereo+depth), and media delivery over QUIC. Together, these evolutions aim to support richer, more efficient, and immersive media experiences for the 5G-to-6G era.
5 Takeaways:
- Release 19 achievements: improvements in media delivery, DRM, avatar communication, and advanced codecs.
- QoS advancements: better congestion handling, monitoring, and efficient media distribution.
- Release 20 focuses: ultra-low bitrate audio, AI/ML support, and energy efficiency analysis.
- Future studies: 3D Gaussian Splats, depth-based video, and dynamic traffic characterization.
- Road to 6G: foundational work leading to interoperability, scalability, and advanced immersive media services.