
New 5G-MAG WI on "XR and 3D Assets in 5G-based communication"

The 5G-MAG Steering Group has approved the Work Item "XR and 3D Assets in 5G-based communication" which kicks off a new area of work on Immersive Media.



The scope of the work

The tasks in this work item involve:

  • Providing an overview of and introduction to XR and 3D experiences over 5G, based on 3GPP Rel-18 specifications;

  • Developing implementation guidelines for XR and 3D experiences over 5G, based on 3GPP Rel-18 specifications, for selected use cases;

  • Creating a new cluster for XR and 3D asset distribution, ready to include features beyond 3GPP SA4 Release 18 such as additional MAF pipelines, immersive codecs developed in MPEG-I, or new media types.


In addition, the 5G-MAG Reference Tools have adopted several open-source libraries to support 3GPP-based 3D and XR services built on MPEG-I Scene Description (SD), as defined in ISO/IEC 23090-14.


A full Work Item description is available to members on the Members' platform. Please check www.5g-mag.com/workitems#wi-011


Related work in 3GPP

The initially available libraries are closely aligned with the Rel-18 work in 3GPP SA4, namely:

  • MeCAR: TS 26.119 defines the media capabilities of AR devices (covering various form factors) when connected to 3GPP networks. The working assumptions connect AR media processing to an XR runtime, with OpenXR used as the reference API (i.e., ensuring at least compatibility with OpenXR), and support MPEG-I Scene Description as an entry point, optionally using V3C (ISO/IEC 23090-5).

  • Split Rendering MSE: TS 26.565 is aimed at supporting split-rendering workflows and builds on TS 26.119.

  • PROMISE: 5G-Advanced media profiles for messaging services; this work item creates a new specification, TS 26.143, addressing, among other things, 3D message formats.


In addition, other Rel-18 work items related to XR are covered in RAN and SA2.


Related work in MPEG

Related work in MPEG involves two workgroups:

  • ISO/IEC Moving Picture Experts Group (MPEG) Working Group 3 (WG03) defines a scene description framework in part 14 of the MPEG-I series of standards (ISO/IEC 23090-14). It serves as an entry point to rich, dynamic, and temporal 3D scenes, enabling immersion, fusion with the real world, and rich interactivity, while providing real-time delivery of media and scene updates. The standard also defines an architecture together with an application programming interface (API) that allows an application to separate access to immersive timed media content from the rendering of that media. This separation, and the definition of the API, enable a wide range of optimization techniques, such as adapting the retrieved media to network conditions, partial retrieval, access at different levels of detail, and adjustment of content quality.

  • ISO/IEC Moving Picture Experts Group (MPEG) Working Group 7 (WG07) defines a group of standards under the umbrella of Visual Volumetric Video-based Coding (V3C). These standards aim to efficiently code, store, and transport immersive content with six degrees of freedom. The V3C family currently consists of three documents: ISO/IEC 23090-5 defines the generic concepts of volumetric video-based coding and its application to dynamic point cloud data; ISO/IEC 23090-12 specifies another application that enables compression of volumetric video content captured by multiple cameras; and ISO/IEC 23090-10 describes how to store and deliver V3C-compressed volumetric video content. Each standard leverages the capabilities of traditional 2D video coding and delivery solutions, allowing re-use of existing infrastructure and facilitating fast deployment of volumetric video.

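To make the scene description entry point more concrete: an MPEG-I Scene Description is a glTF 2.0 document extended with MPEG-defined extensions such as MPEG_media, which references timed media (e.g., a DASH manifest) as an alternative delivery source for scene content. The sketch below builds a minimal fragment of that shape in Python; the extension and property names follow ISO/IEC 23090-14, but the URI and values are placeholders for illustration, not a normative example.

```python
import json

# Illustrative sketch of a minimal glTF 2.0 document carrying the
# MPEG_media extension from MPEG-I Scene Description (ISO/IEC 23090-14).
# The URI and names below are placeholders, not a normative example.
scene_description = {
    "asset": {"version": "2.0"},
    "extensionsUsed": ["MPEG_media"],
    "extensions": {
        "MPEG_media": {
            "media": [
                {
                    "name": "timed-texture",  # placeholder name
                    "alternatives": [
                        {
                            # One delivery alternative: a DASH manifest
                            "uri": "https://example.com/stream.mpd",  # placeholder URI
                            "mimeType": "application/dash+xml",
                        }
                    ],
                }
            ]
        }
    },
    "scenes": [{"nodes": []}],
    "scene": 0,
}

doc = json.dumps(scene_description, indent=2)
print(doc)
```

A presentation engine parsing such a document would hand the MPEG_media entries to the media access layer for retrieval and decoding, keeping rendering independent of how the timed media is fetched, which is the separation the API in ISO/IEC 23090-14 formalizes.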

Additional references

References and additional information for MPEG-I Scene Description


References and additional information for V3C
