Special Session on Multimedia in 5G Network Architectures
- Submission deadline: February 2, 2018 (extended from December 15, 2017)
- Acceptance notification: February 21, 2018
- Camera ready deadline: April 19, 2018
- Online submission: https://submissions.mmsys2018.org/5gmultimedia
- Submission format: 6-12 pages, using ACM style format (double-blind). Please see the submission guidelines for more information about the process.
- Reproducibility: obtain an ACM reproducibility badge by making datasets and code available (authors will be contacted to make their artifacts available after paper acceptance)
Special Session chairs
- Imed Bouazizi, Samsung Research America, United States
- Yong Liu, New York University, United States
- Rufael Mekuria, Unified Streaming, The Netherlands
Scope & goals
The current mobile network faces numerous challenges as a platform for media delivery: high bandwidth demand is fuelled by online video streaming services and the advent of ultra-high-definition (UHD) video, high dynamic range (HDR), virtual reality and other advanced media content (e.g. point clouds, light fields); widespread use of social media fosters instant sharing of user-generated video; and rising consumer interest in augmented and/or virtual reality (AR/VR) underscores the need to support richer media forms with lower latency. In the meantime, the design of the mobile network itself is evolving towards the next-generation 5G envisioned for the 2020s. Recent advances in network function virtualization (NFV), where parts of the mobile network run as software on top of virtualized hardware, are increasing deployment agility and flexibility. In addition, software-defined networking (SDN), mobile edge computing (MEC) technologies, small/micro cells and novel millimetre-wave techniques are emerging as building blocks for the next-generation mobile network. This special session aims to highlight research that investigates core 5G technologies through the prism of their most prevalent application: multimedia. It raises intriguing research questions such as: How can NFV support advanced video streaming services? What is the most efficient distribution mechanism for social sharing of user-generated video? What are proper performance metrics for novel networked multimedia applications based on augmented/virtual reality, and what requirements would 5G need to target to support the next generation of multimedia services? What advances in 5G and future network architectures will enable new media services and applications? How should the development of 5G incorporate an intrinsic understanding of video streaming and the most popular multimedia standards to achieve demanding key performance indicators (KPIs)?
What will be the role of orchestration of cloud and network resources under the NFV paradigm for multimedia applications? How will error-correcting codes, such as Raptor codes or other codes based on fountain coding principles, influence multimedia? How will 5G support the large variety of media services, ranging from SD to HD, HDR, VR and AR-like services, without exploding bandwidth and storage requirements? How will network-based media processing (NBMP) enable more ubiquitous support for different media applications from single-source content? How will media ingest work in network architectures with network-distributed media processing? What security issues will be important to consider? What are the different standardization activities in this area (ETSI NFV, ETSI MEC, IETF, MPEG, DASH-IF, 3GPP), and which issues remain unresolved?
Topics of Interest:
- 5G Architectures and their intrinsic support for multimedia
- Software Defined Networking for Multimedia applications
- Network Function Virtualization Technologies for Multimedia applications (vertical and horizontal scaling, resource allocation, fast instantiation of network functions)
- The role of network-based media processing in future network architectures (e.g. super resolution, trans-multiplexing, content insertion, trans-DRM, etc.)
- Transcoding in future 5G networks and its role in advanced media services (e.g. guided transcoding)
- Cloud and mobile edge computing (MEC) technologies and their use for multimedia
- Virtualization and container technologies and their application in multimedia
- Resource allocation for multimedia in 5G network architectures
- Security and privacy of media contents in cloud and 5G architectures
- Interoperability issues and standardization developments in network distributed media
- Cloud RAN integration with multimedia applications
- Wireless techniques (e.g. millimetre waves) and their application to supporting multimedia
- New applications of multimedia in 5G (e.g. VR/AR)
- Intrinsic support of new media data types in 5G networks (e.g. light fields, point clouds, sensor data)
- Standardization of media formats for future network architectures (MPEG, DASH-IF, 3GPP, ETSI, IETF)
The members of the technical program committee include selected experts from academia and industry who have been active in research and standardization at the intersection of multimedia and networking.
Papers should be between six and twelve pages long (in PDF format), including references, prepared in the ACM style and written in English. This page budget allows authors to present entire multimedia systems, or research that builds on considerable amounts of earlier work, in a self-contained manner. MMSys papers are published in the ACM Digital Library; they remain available for just as long as journal papers, and authors should not feel compelled by space limitations to publish extended works on top of an MMSys paper. Authors who submit very specific, detailed research work are encouraged to use fewer than 12 pages. Papers are double-blind reviewed.