diff --git a/pages/Network_APIs/Content_Production/Production_Contribution_Introduction.md b/pages/Network_APIs/Content_Production/Production_Contribution_Introduction.md
new file mode 100644
index 00000000..5f9511f2
--- /dev/null
+++ b/pages/Network_APIs/Content_Production/Production_Contribution_Introduction.md
@@ -0,0 +1,73 @@
+---
+layout: default
+title: Content Production & Contribution
+parent: Network APIs
+nav_order: 0
+has_children: true
+---
+
+
+
+{: .warning }
+This documentation is currently **under development and subject to change**. It reflects outcomes elaborated by 5G-MAG members. If you are interested in becoming a member of the 5G-MAG and actively participating in shaping this work, please contact the [Project Office](https://www.5g-mag.com/contact)
+
+# Introduction: Content Production and Contribution over Mobile Networks
+
+Wireless connectivity plays a key role in content production and contribution scenarios such as production in studios, coverage of live in-venue events (a football match) or on-the-move events (the Tour de France), commentary stands (at a convention), and newsgathering (breaking news in the street).
+
+These different setups may have unique infrastructure and equipment needs, and may benefit from the provision of connectivity with varying QoS requirements.
+
+How wireless connectivity is used often reflects a trade-off between the quality (or importance) of the content and the reliability of the connection.
+
+The choice to use wireless connectivity is not limited to specific scenarios or production levels. Instead, it is a strategic trade-off, balancing cost, tolerance for technical glitches, risk of failure, and the quality and importance of the content. Examples are given below.
+
+## Diversity in connectivity needs in the same deployment scenario
+
+
+
+
+
Media production scenarios often require a mix of connectivity solutions to meet a variety of needs. For example, during a football match, a production team uses high-quality cameras for the main broadcast, while a commentator stand might have additional wireless cameras for pre-game interviews. Wireless cameras are also deployed outside the stadium to capture interviews with the crowd at the entrance. Similarly, a major event like the coronation of King Charles III brought together numerous TV producers. They used a combination of dedicated, high-quality streams for the main ceremony and various other setups for newsgathering and interviews from journalists deployed around the site. This demonstrates how a single event can have multiple connectivity needs, from high-bandwidth main broadcasts to more flexible, on-the-go reporting. This is independent of the overall cost or budget of the whole event.
+
+
+
+## Immediacy over Quality
+
+
+
+
+
When a sudden street event unfolds, the only way to cover it is with smartphones on a best-effort connection. Getting any live footage is far more valuable than dismissing the connection due to its unreliability. While the video might not be broadcast-quality, the immediate, raw footage from the scene is critical for covering the event as it happens.
+
+
+
+## Agility over Cost
+
+
+
+
+
For both sudden and partially-planned events, cellular bonding systems have emerged as cost-effective alternatives to, for example, dedicated satellite feeds, making live reporting from a wider range of locations economically viable. The equipment itself may still be a major investment: the backpacks, modems and SIM cards are not inexpensive, and the news organization has to pay for a data plan for each SIM card plus a service fee to the external company that provides the bonding infrastructure. Cellular bonding is needed because a single best-effort public mobile network cannot guarantee reliability. The news organization plans where to send the journalists that will report from remote locations during the news programme. Covering sudden events with cellular bonding equipment is also common, and may achieve better reliability than connectivity via a single smartphone.
+
+
+
+## Dynamic Footage over Signal Stability
+
+
+
+
+
High-mobility cameras introduce the unique challenge of seamlessly mixing their footage (generally highly engaging) into a high-quality production that includes wired cameras with reliable connections. The wireless setup therefore needs to be as stable as possible, yet constant motion, changing environments and potential signal obstructions make that difficult, leading to frequent signal fades or brief drops in connectivity. Despite these issues, the value of the unique camera perspective is prioritized. A camera on a referee provides an on-field view of the action and is critical for live replays and for enhancing the narrative of the game. A camera on a motorbike in the Tour de France provides up-close views of the riders that a stationary camera could never capture.
+
+
+
+## Beyond best-effort connectivity: exposure of network capabilities
+
+The fundamental trade-off in using wireless connectivity for media is that the more reliable a connection becomes, the higher the quality of the content that can be delivered over it.
+Historically, media was uplinked using highly reliable satellite and RF technologies. Today, the widespread availability of public mobile networks and LEO satellite constellations has triggered a shift toward more agile tools like smartphones, cellular bonding packs or wireless modems.
+The exposure of network capabilities to applications represents an opportunity to exploit advanced network features beyond best-effort connectivity. Examples of network capabilities may include on-demand quality, user equipment (UE) management and precise time synchronization. Accessing and utilizing the desired features can be intricate and inconsistent across different networks. Several initiatives are taking shape to explore the opportunities behind Network APIs (exposing network capabilities to API consumers), offering high-level abstractions of underlying network functionalities to simplify resource utilization for non-network experts.
+
+# What we are doing
+
+At 5G-MAG we’re investigating the possibility of using network APIs to meet certain requirements for Content Production and Contribution scenarios over mobile networks.
+
+Please go to the following sections:
+* [Reference Scenarios](./Production_Contribution_Scenarios.html)
+* [Workflows](./Production_Contribution_Workflows.html)
+* [Using Network APIs](./Production_Contribution_UsingCAMARAAPIs.html)
diff --git a/pages/Network_APIs/Content_Production/Production_Contribution_Scenarios.md b/pages/Network_APIs/Content_Production/Production_Contribution_Scenarios.md
index 5efac84d..abdecbf0 100644
--- a/pages/Network_APIs/Content_Production/Production_Contribution_Scenarios.md
+++ b/pages/Network_APIs/Content_Production/Production_Contribution_Scenarios.md
@@ -1,7 +1,8 @@
---
layout: default
-title: Content Production & Contribution
-parent: Network APIs
+title: Reference Scenarios
+grand_parent: Network APIs
+parent: Content Production & Contribution
nav_order: 0
has_children: true
---
@@ -11,79 +12,13 @@ has_children: true
{: .warning }
This documentation is currently **under development and subject to change**. It reflects outcomes elaborated by 5G-MAG members. If you are interested in becoming a member of the 5G-MAG and actively participating in shaping this work, please contact the [Project Office](https://www.5g-mag.com/contact)
-# Scenarios and Use Cases: Content Production and Contribution over Mobile Networks
+# Scenarios & Use Cases: Content Production and Contribution over Mobile Networks
This section contains information on:
-* [**A general introduction about the context**](#introduction)
-* [**Reference Scenarios**](#reference-scenarios), including:
+* **Reference Scenarios**, including:
* [**Single-device Connectivity**](#single-device-connectivity-single-camera-live-video-production-mobile-journalism-mojo-newsgathering-uplink-video)
* [**Multi-device Connectivity**](#multi-device-connectivity-outside-broadcast-small-scale-video-production-remote-production)
-## Introduction
-
-Wireless connectivity plays a key role in content production and contribution scenarios such as production in studios, coverage of live in-venue (a football match) or on-the-move (the Tour-de-France) events, commentary stands (in a convention), newsgathering (breaking news in the street),...
-
-These different setups may have unique infrastructure and equipment needs, and may benefit from the provision of connectivity with variyng QoS requirements.
-
-The use of wireless connectivity may differ as a trade-off between quality (or importance) of the content and connection reliability.
-
-The choice to use wireless connectivity is not limited to specific scenarios or production levels. Instead, it is a strategic trade-off, balancing cost, tolerance for technical glitches, risk of failure, and the quality and importance of the content. Examples are given below.
-
-### Diversity in connectivity needs in the same deployment scenario
-
-
-
-
-
Media production scenarios often require a mix of connectivity solutions to meet a variety of needs. For example, during a football match, a production team uses high-quality cameras for the main broadcast, while a commentator stand might have additional wireless cameras for pre-game interviews. Wireless cameras are also deployed outside the stadium to capture interviews with the crowd at the entrance of the stadium. Similarly, a major event like the coronation of King Charles III brought together numerous TV producers. They used a combination of dedicated, high-quality streams for the main ceremony and various other setups for newsgathering and interviews from journalists deployed around the site. This demonstrates how a single event can have multiple connectivity needs, from high-bandwidth main broadcasts to more flexible, on-the-go reporting. This is independent of the overall cost or budget of the whole event.
-
-
-
-### Immediacy Over Quality
-
-
-
-
-
When a sudden street event unfolds, the only way to cover it is with smartphones on a best-effort connection. Getting any live footage is far more valuable than dismissing the connection due to its unreliability. While the video might not be broadcast-quality, the immediate, raw footage from the scene is critical for covering the event as it happens.
-
-
-
-### Agility over Cost
-
-
-
-
-
For both sudden and partially-planned events, cellular bonding systems have emerged as cost-effective solutions to eliminate the need for e.g. dedicated satellite feeds, making live reporting from a wider range of locations economically viable. The equipment itself may be a major investment. The backpacks, modems, and SIM cards are not inexpensive and the news organization has to pay for a data plan for each SIM card and a service fee to the external company that provides the bonding infrastructure. Cellular bonding is needed as a single best-effort public mobile network cannot guarantee reliability. The news organization will plan where to send the different journalists that will provide reports from remote locations during the news programme. Covering sudden events with cellular bonding equipment is also usual, which may achieve better reliability that the connectivity via a single smartphone.
-
-
-
-### Dynamic Footage over Signal Stability
-
-
-
-
-
High-mobility cameras introduce the unique challenge of seamlessly mixing their footage (generally highly engaging) into a high-quality production that includes wired cameras with reliable connections. This means the wireless setup needs to be as stable as possible, whereas the nature of its constant motion, changing environments, and potential signal obstructions makes that challenging with frequent signal fades or brief drops in connectivity. Despite these issues, the value of the unique camera perspective is prioritized. A camera on a referee provides an on-field view of the action and is critical for live replays and enhancing the narrative of the game. A camera on a motorbike in the Tour de France provides up-close views of the riders that a stationary camera could never capture.
-
-
-
-### Beyond best-effort connectivity: exposure of network capabilities
-
-The fundamental trade-off in using wireless connectivity for media is that as a connection becomes more reliable, it enables more high-quality content to be delivered on it.
-Historically, media was uplinked using highly reliable satellite and RF technologies. Today, the widespread availability of public mobile networks or LEO satellite constellations triggered a shift toward more agile tools like smartphones, cellular bonding packs or wireless modems.
-
-The exposure of network capabilities to applications representes an opportunity to exploit advanced network features beyond best-effort connectivity. Examples of network capabilities maz include on-demand quality, user equipment (UE) management, precise time synchronization,... Accessing and utilizing the desired features can be intricate and inconsistent across different networks. Several initiatives are taking shape to explore the opportunities behind Network APIs (exposing network capabilities to API consumers), offering high-level abstractions of underlying network functionalities to simplify resource utilization for non-network experts.
-
-### Considerations on Devices
-
-The devices in these scenarios may involve the following:
-
- - A **single UE (e.g. a smartphone or any piece of equipment with a single UE)** equipped with a single SIM card (or eSIM) connected to the mobile network.
-
- - A **single device (e.g. a smartphone) equipped with 2 UEs** each with 1 SIM card (or eSIM) connected to a different carrier of the same mobile network or different mobile networks. Note that multi-SIM devices enable users to utilize multiple cellular connections simultaneously. Dual-SIM Dual-Active (DSDA) enable this use case with two SIM cards. This is different to Dual-SIM Dual-Standby (DSDS), which allows only one SIM to stay connected with active data at a time. DSDA enhances data performance for end users by enabling the use of two data connections concurrently across SIM1 and SIM2, with the option to choose the best of them or aggregate both, if necessary, to reach higher data throughput.
-
- - A **device with multiple UEs (e.g. a cellular bonding backpack)** equipment with multiple SIM cards each one connected to a different carrier of the same mobile network or connected to different mobile networks.
-
-# Reference Scenarios
-
## Single-device Connectivity (Single Camera Live Video Production, Mobile Journalism (MoJo), Newsgathering, Uplink Video)
A media producer (e.g. journalist in the field or at a venue) is interested in connectivity for capturing and contributing (uplinking) content to an application server located in the cloud or remote premises. This is a small-size Live Video Production, where practical equipment for immediacy is used, e.g. mobile devices (smartphones) or cameras connected to backpack solutions (specialized devices).
@@ -152,3 +87,13 @@ The network functions and applications involved are:
- **Aggregator API Platform (optional)**, located in the path between the Network API Platforms and the API Invoker. It grants access to Network API Platforms from different Network Providers.
- **API Consumer / Invoker**, used by the Production equipment (functions) to interact with the Network API Platform of a Network Provider.
- **Media Servers**, typically located in the Studio Production Hub (operated by the Production Manager) and interact with the production devices, e.g. receiving video or audio streams.
+
+## Considerations on Devices
+
+The devices in these scenarios may involve the following:
+
+ - A **single UE (e.g. a smartphone or any piece of equipment with a single UE)** equipped with a single SIM card (or eSIM) connected to the mobile network.
+
+ - A **single device (e.g. a smartphone) equipped with 2 UEs** each with 1 SIM card (or eSIM) connected to a different carrier of the same mobile network or different mobile networks. Note that multi-SIM devices enable users to utilize multiple cellular connections simultaneously. Dual-SIM Dual-Active (DSDA) enables this use case with two SIM cards. This is different from Dual-SIM Dual-Standby (DSDS), which allows only one SIM to stay connected with active data at a time. DSDA enhances data performance for end users by enabling the use of two data connections concurrently across SIM1 and SIM2, with the option to choose the best of them or aggregate both, if necessary, to reach higher data throughput.
+
+ - A **device with multiple UEs (e.g. a cellular bonding backpack)** equipped with multiple SIM cards, each connected to a different carrier of the same mobile network or connected to different mobile networks.
diff --git a/pages/Network_APIs/Content_Production/Production_Contribution_Workflows.md b/pages/Network_APIs/Content_Production/Production_Contribution_Workflows.md
index 02ce8f0d..5de09d9e 100644
--- a/pages/Network_APIs/Content_Production/Production_Contribution_Workflows.md
+++ b/pages/Network_APIs/Content_Production/Production_Contribution_Workflows.md
@@ -45,6 +45,91 @@ The **Network API Platform** of a Network Operator is accessed via an **Aggregat
+## Consolidation of requirements on network interactions
+The basic requirements for this scenario are:
+
+### **Ability to DISCOVER network resources**, at a given location and time/duration.
+ * This step is required to determine whether network resources can be reserved (and used) for the intended location and time/duration.
+ * A QoS template may be used to define the required QoS parameters between the application (device) and application server.
+ * It should be possible to indicate an aggregate of network resources corresponding to the number of devices with the same QoS requirements. For a single device, the aggregate is just one device.
+
+
+
+
DISCOVERY API
+
+
+
Invoked with: QoS template, location/area, time/duration, number of devices intended to use resources concurrently
+
+
+
Response: Whether or not such resources can be reserved.
+
+
+
+Remarks on the **location**:
+* The user should be able to indicate the location where the network resources are to be used by means of coordinates for an area (array of points) or a single point.
+* It is unlikely that the user-defined area corresponds to the operator-defined area.
+* A more general API could be invoked with the user-defined area as input and an operator-defined area identifier as output.
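The DISCOVERY inputs listed above can be sketched as a single request body. A minimal sketch only: every field name below is an illustrative assumption, since no API schema is defined here.

```python
def build_discovery_request(qos_template, points, start, duration_s, device_count):
    """Assemble the DISCOVERY inputs into one request body (assumed shape).

    points: list of (lat, lon) tuples; one tuple describes a single point,
    several describe a user-defined area.
    """
    return {
        "qosTemplate": qos_template,  # e.g. required uplink rate and latency
        "location": {
            "type": "AREA" if len(points) > 1 else "POINT",
            "points": [{"lat": lat, "lon": lon} for lat, lon in points],
        },
        "schedule": {"start": start, "durationSeconds": duration_s},
        # aggregate of devices intended to use the resources concurrently
        "deviceCount": device_count,
    }

req = build_discovery_request(
    qos_template={"uplinkMbps": 20, "maxLatencyMs": 50},
    points=[(47.3769, 8.5417)],      # a single point
    start="2025-06-01T14:00:00Z",
    duration_s=7200,
    device_count=4,
)
```

The response would then indicate whether such resources can be reserved, per the description above.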
+
+### **Ability to RESERVE network resources**, by indicating location and time/duration.
+ * Network resources can be reserved for the intended location and time/duration.
+ * A QoS template may be used to define the required QoS parameters between the application (device) and application server.
+ * It should be possible to reserve an aggregate of network resources corresponding to the number of devices with the same QoS requirements. For a single device, the aggregate is just one device.
+
+
+
+
RESERVATION API
+
+
+
Invoked with: QoS template, location, time/duration, number of devices intended to use resources concurrently
+
+
+
Response: Effective reservation of resources for the specified location and duration. A range of reservation IDs corresponding to the number of devices which can concurrently use such resources.
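The response above returns a range of reservation IDs covering the requested device count. A small sketch of expanding such a range into one ID per device slot, assuming (purely as an illustration) that the range is returned as a base ID plus a count:

```python
def expand_reservation_ids(response):
    """Expand an assumed {base, count} range into one reservation ID per
    device slot, so each slot can later be assigned to a concrete device."""
    base = response["reservationIdBase"]
    count = response["deviceCount"]
    return [f"{base}-{n}" for n in range(1, count + 1)]

slots = expand_reservation_ids({"reservationIdBase": "res-42", "deviceCount": 3})
# slots → ["res-42-1", "res-42-2", "res-42-3"]
```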
+
+
+
+### **Ability to ASSIGN the device to the reserved network resources**, by linking a _reservation ID_ with a _device ID_.
+ * The devices for which resources are reserved are known in advance. However, flexibility improves if the resources are not linked to a specific device at reservation time: the device that finally uses the network resources may change between the reservation and their actual usage, and a change of device may also be needed while in operation (e.g. replacement by a back-up device).
+
+
+
+
ASSIGNMENT API
+
+
+
Invoked with: reservation ID and device ID
+
+
+
Response: ACK and an assignment ID per device.
+
+
+
+### **Ability to activate/deactivate the USAGE of the network resources**, either automatically when the device is connected to the network or manually. Activating the network resources automatically as soon as the device obtains connectivity is not always ideal. For instance, if a device develops a problem and needs to be exchanged, it should fall back to best-effort connectivity while the replacement device is assigned the network resources.
+
+
+
+
USAGE API
+
+
+
Invoked with: Assignment ID
+
+
+
Response: ACK the activation of the resources for the current assignment
+
+
+
+### **Ability to activate/deactivate NOTIFICATIONS on the usage of the network resources**.
+
+
+
+
NOTIFICATIONS API
+
+
+
Invoked with: Assignment ID, periodicity of notifications, sink for notifications
+
+
+
Response: ACK the activation of the notification
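Taken together, the five interactions above form a lifecycle: discover, reserve, assign, activate usage, subscribe to notifications. The sequence can be sketched with a stub client; all method names, payloads and canned responses below are assumptions standing in for a real Network API Platform, not a defined interface.

```python
class StubNetworkApi:
    """Stand-in for a Network API Platform; records call order and returns
    canned responses so the lifecycle can be shown end to end."""

    def __init__(self):
        self.calls = []

    def discover(self, **params):
        self.calls.append("DISCOVERY")
        return {"reservable": True}

    def reserve(self, **params):
        self.calls.append("RESERVATION")
        return {"reservationIds": ["res-1", "res-2"]}

    def assign(self, reservation_id, device_id):
        self.calls.append("ASSIGNMENT")
        return {"assignmentId": f"asg-{device_id}"}

    def activate(self, assignment_id):
        self.calls.append("USAGE")
        return {"active": True}

    def subscribe(self, assignment_id, period_s, sink):
        self.calls.append("NOTIFICATIONS")
        return {"subscribed": True}

api = StubNetworkApi()
if api.discover(area=[(47.4, 8.5)], duration_s=7200, devices=2)["reservable"]:
    slots = api.reserve(devices=2)["reservationIds"]
    # resources are reserved per slot, not per device; the device is linked later
    asg = api.assign(slots[0], device_id="cam-01")["assignmentId"]
    api.activate(asg)
    api.subscribe(asg, period_s=60, sink="https://example.invalid/notify")
```

The point of the stub is the ordering: a device is only linked to resources (ASSIGNMENT) after a reservation exists, and usage and notifications are activated against the assignment ID, not the device ID.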
+
+
+
# Single-device Connectivity (Single Camera Live Video Production, Mobile Journalism (MoJo), Newsgathering, Uplink Video)
## Before the Event
@@ -70,10 +155,25 @@ The **Network API Platform** of a Network Operator is accessed via an **Aggregat
Through the Network API Platform:
-1. The production crew (already on location or while traveling to the event) can discover the capabilities the network can offer in a particular location and at a particular time (for which the production company is eligible for). Example: QoD available, connectivity monitoring available.
+
+1. The production crew (already on location or while traveling to the event) can discover the capabilities the network can offer in a particular location and at a particular time. Example: QoD available, connectivity monitoring available.
+
+
+
+
Requirement to invoke DISCOVERY API
+
+
+
2. The production crew requests network services for the devices (identified by their SIM cards) in advance. The booking of resources is done based on:
* Geographical area
* Schedule (starting time and closing time, or duration, of the event)
+
+
+
+
Requirement to invoke RESERVATION API
+
+
+
3. The production manager receives a booking reference responding to the service request.
4. The production manager accepts the service booking offer (involving payment).
5. The production manager receives **network access IDs** to be used by the production device UEs to access the network and the requested capabilities for the specified location and duration.
@@ -94,8 +194,21 @@ Through the Network API Platform:
### Phase C: Configuration and Usage of the network capabilities
1. The production crew arrives at the event and can start using the booked network services (See phase B).
+
+
+
+
Requirement to invoke ASSIGNMENT API
+
+
+
2. The production device makes use of the network capabilities according to the network access IDs received. The media-related parameters can be adapted using an application-specific API, citing the network access IDs delivered in step B.5).
+
+
+
Requirement to invoke USAGE API
+
+
+
Practical example
@@ -110,6 +223,12 @@ Through the Network API Platform:
### Phase D: Location teardown
1. Through the Network API Platform, the production crew releases the booked resources when the event finishes.
+
+
+
Requirement to invoke RESERVATION API
+
+
+
---
## Multi-device connectivity (Outside Broadcast, Small-Scale Video Production, Remote Production)
@@ -141,6 +260,12 @@ Through the Network API Platform:
Through the Network API Platform:
1. The production crew (on location or from the production centre) can discover the capabilities the network can offer in a particular location and at a particular time (for which the production company is eligible). Example: QoD available, connectivity monitoring available, Timing as a service available, edge compute instantiation, etc.
+
+
+
Requirement to invoke DISCOVERY API
+
+
+
2. The production crew requests network services for the devices (identified by their SIM cards) in advance. Possible services (network capabilities) are:
1. *Quality-on-Demand*
* One or several QoS profiles for each SIM card (QoS profiles are mapped to 5QIs)
@@ -151,7 +276,13 @@ Through the Network API Platform:
The booking of resources is done based on:
* Geographical area
* Schedule (starting time and closing time, or duration, of the event)
-
+
+
+
+
Requirement to invoke RESERVATION API
+
+
+
3. The production manager receives a booking reference responding to the service request.
4. The production manager accepts the service booking offer (involving payment/contract/SLA aspects).
5. The production manager receives **network access IDs** to be used by the production device UEs to access the network and the requested capabilities for the specified location and duration.
@@ -172,13 +303,26 @@ Through the Network API Platform:
### Phase C: Configuration and Usage of the network capabilities
1. The production crew arrives at the venue, plugs in the SIM cards and turns on the devices; connectivity is enabled based on the booked network services (see Phase B).
-2. The production crew initiates the setup of the location production by interacting with the production network orchestrator.
-3. The production network orchestrator configures the production device nodes using an application-specific API, citing the network access IDs delivered in step B.5).
+
+
+
+
Requirement to invoke ASSIGNMENT API
+
+
+
+3. The production crew initiates the setup of the location production by interacting with the production network orchestrator.
+4. The production network orchestrator configures the production device nodes using an application-specific API, citing the network access IDs delivered in step B.5).
* Example: QoD service: A camera for which one video + one audio is pre-booked. The application-specific API is used to properly configure the bitrate of the audio and video output, and the provided IDs.
* Example: Time Sync service: A camera for which access to global clock is requested. The application-specific API is used to properly configure the time parameters and the provided IDs.
-4. The production device makes use of the network capabilities according to the network access IDs reveived.
+5. The production device makes use of the network capabilities according to the network access IDs received.
+
+
+
+
Requirement to invoke USAGE API
+
+
@@ -204,17 +348,42 @@ A series of actions can be expected "During the Event" as changes, reconfigurati
* The production crew should use the Network API Platform to monitor that the flows are arriving and properly using the reserved resources.
* The production crew should receive notifications through the Network API Platform indicating potential issues (throughput, delay, etc.).
+
+
+
Requirement to invoke NOTIFICATION API
+
+
+
## Reconfiguration for a given device
* The production crew, through the Network API Platform, should request a change of the current configuration assigned to a device.
* The production crew, through the Network API Platform, should request an update/modification of the originally booked resources (e.g. increase or decrease the throughput associated with an existing profile). The same validation steps as from B.2 to B.5 will be conducted after requesting the change. Note that the network access IDs are not expected to change when a reconfiguration occurs.
+
+
+
Requirement to invoke RESERVATION API
+
+
+
## Back-up devices
* The production crew, through the Network API Platform, should be able to switch to or update a device while continuing to use the booking originally made for a different device.
+
+
+
Requirement to invoke ASSIGNMENT API
+
+
+
## Dynamic prioritization of QoS for different media flows
In a setup with multiple cameras, the media producer would like to ensure that there is always a subset of them prioritized with the highest QoS profile. While each individual camera should be entitled to exploit such a high QoS profile (i.e. the original booking should take into account that X devices will be requesting QoS profile Y), not all of them will be using the profile concurrently. Therefore:
* The production crew, through the Network API Platform, should dynamically attach/detach a device to/from a QoS profile.
+
+
+
+
Requirement to invoke ASSIGNMENT API
+
+
+
* The network operator should ensure that a subset of devices can concurrently request a given QoS profile and that all other devices remain eligible to access the profile when it is no longer in use.
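The rule above (X devices entitled, only a subset holding the profile at once) can be modelled locally as a concurrency cap. A sketch under that assumption; the class and names are illustrative, not part of any defined API:

```python
class QosProfilePool:
    """Local model of dynamic attach/detach against a concurrency cap:
    every device stays eligible, but only `max_concurrent` may hold the
    high-QoS profile at any one time."""

    def __init__(self, max_concurrent):
        self.max_concurrent = max_concurrent
        self.holders = set()

    def attach(self, device_id):
        if len(self.holders) >= self.max_concurrent:
            return False  # cap reached; the device stays eligible for later
        self.holders.add(device_id)
        return True

    def detach(self, device_id):
        self.holders.discard(device_id)

pool = QosProfilePool(max_concurrent=2)
pool.attach("cam-1")
pool.attach("cam-2")
third = pool.attach("cam-3")   # refused while the cap is reached
pool.detach("cam-1")
retry = pool.attach("cam-3")   # succeeds once a slot frees up
```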
### Basic example: Definition of QoS Profiles
diff --git a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Introduction.md b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Introduction.md
new file mode 100644
index 00000000..103415ed
--- /dev/null
+++ b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Introduction.md
@@ -0,0 +1,40 @@
+---
+layout: default
+title: Live Media Distribution
+parent: Network APIs
+nav_order: 1
+has_children: true
+---
+
+
+
+{: .warning }
+This documentation is currently **under development and subject to change**. It reflects outcomes elaborated by 5G-MAG members. If you are interested in becoming a member of the 5G-MAG and actively participating in shaping this work, please contact the [Project Office](https://www.5g-mag.com/contact)
+
+# Introduction: Live Media Distribution
+
+For many people, the internet is becoming the primary means of accessing TV and radio content; for some, it is already the only one they use.
+
+As a result, content providers are increasingly interested in understanding how well their content is being delivered online, particularly when it is offered "over the top" (OTT) and therefore with no ability to influence how networks treat the content to meet particular quality of service (QoS) requirements. Similarly, network operators want insight into how well their networks are meeting consumers’ needs and expectations.
+
+## Monitoring QoS
+When it comes to monitoring QoS, content providers typically have sight of the two ends of the distribution chain: the source (often a CDN in this context) and the application running on a user device, such as a smartphone. For services delivered over Dynamic Adaptive Streaming over HTTP (MPEG-DASH), for example, it appears that QoS may be sufficiently described by the statistics of the time taken to deliver each segment, i.e. the time to last byte (TTLB). This data can be readily obtained in the client app by timing how long it takes from each segment request to its receipt in full [ref].
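The per-segment TTLB described above can be collected in the client with a simple timer around each segment download. A minimal sketch; the `fetch` callable and segment name are placeholders for a real HTTP client and DASH segment URL:

```python
import time

def measure_ttlb(fetch, segment_url):
    """Time from issuing the segment request to receipt of the last byte.

    `fetch` is any callable returning the full segment body as bytes (e.g.
    a wrapper around an HTTP GET); injecting it keeps the timing logic
    independent of the transport.
    """
    t0 = time.monotonic()
    body = fetch(segment_url)
    ttlb_s = time.monotonic() - t0
    return ttlb_s, len(body)

# usage with a stand-in fetcher; a real player would download the segment
ttlb_s, size = measure_ttlb(lambda url: b"\x00" * 1024, "seg-001.m4s")
```

Aggregating these samples (e.g. percentiles of TTLB per session) gives the QoS statistics the content provider can later share.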
+
+If poor performance is detected the content provider can often rule out the CDN as a cause by inspecting the CDN logs.
+
+Although content providers can also record radio access network (RAN) performance indicators such as cell ID, signal strength and quality, they have no direct visibility of the underlying network. Any issues not attributable to the CDN or insufficient signal are effectively lumped into the ‘network’, which appears to them as ‘The Cloud’.
+
+**Figure goes here**
+
+Content providers can measure ‘end-to-end’ performance via the client app and the CDN endpoints. Network operators have greater visibility of their network but may not see the entire distribution chain.
+
+Network operators, on the other hand, have far greater visibility of what is happening in their networks. However, for third-party services where operators don’t have access to the client app, they too typically lack visibility of the full distribution chain. In these cases, for example, interference on the uplink may prevent segment requests from reaching the network. Such events cannot be recorded by the network as an issue. From the operator’s perspective, the request simply does not exist, making it impossible to diagnose or attribute the underlying cause of any subsequent degraded experience.
+
+# What we are doing
+
+At 5G-MAG we’re investigating the possibility of logging relevant performance data through the client application and sharing it with the network operator. Standard ways of feeding back and sharing data would make the process easier and encourage such collaboration.
+
+Please go to the following sections:
+* [Reference Scenarios](./Live_Media_Distribution_Scenarios.html)
+* [Workflows](./Live_Media_Distribution_Workflows.html)
+* [Using Network APIs](./Live_Media_Distribution_UsingCAMARAAPIs.html)
diff --git a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Scenarios.md b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Scenarios.md
index c51d48b1..5fc3500a 100644
--- a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Scenarios.md
+++ b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Scenarios.md
@@ -1,8 +1,9 @@
---
layout: default
-title: Live Media Distribution
-parent: Network APIs
-nav_order: 1
+title: Reference Scenarios
+parent: Live Media Distribution
+grand_parent: Network APIs
+nav_order: 0
has_children: true
---
@@ -13,12 +14,6 @@ This documentation is currently **under development and subject to change**. It
# Scenarios and Use Cases: Live Media Distribution
-## Introduction
-
-Today, video/audio streaming services are delivered over-the-top, i.e. without any defined quality of service (QoS). Measurements of such a service over today's mobile networks indicate that the QoS experienced by viewers and listeners may fall short of their expectations (e.g. interruption-free playback delivered with a delay comparable with linear broadcast radio/television).
-
-# Reference Scenario
-
### Actors
diff --git a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_UsingCAMARAAPIs.md b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_UsingCAMARAAPIs.md
index 4069553b..d5a0917b 100644
--- a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_UsingCAMARAAPIs.md
+++ b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_UsingCAMARAAPIs.md
@@ -3,7 +3,7 @@ layout: default
title: Using CAMARA APIs
parent: Live Media Distribution
grand_parent: Network APIs
-nav_order: 1
+nav_order: 2
has_children: false
---
diff --git a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Workflows.md b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Workflows.md
index 9d605958..5dc8c4dc 100644
--- a/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Workflows.md
+++ b/pages/Network_APIs/Live_Media_Distribution/Live_Media_Distribution_Workflows.md
@@ -3,7 +3,7 @@ layout: default
title: Workflow
parent: Live Media Distribution
grand_parent: Network APIs
-nav_order: 0
+nav_order: 1
has_children: false
---
@@ -12,7 +12,7 @@ has_children: false
{: .warning }
This documentation is currently **under development and subject to change**. It reflects outcomes elaborated by 5G-MAG members. If you are interested in becoming a member of the 5G-MAG and actively participating in shaping this work, please contact the [Project Office](https://www.5g-mag.com/contact)
-# Workflows and Requirements for Live Media Production
+# Workflows and Requirements for Live Media Distribution
[Scenarios and Use Cases](../Live_Media_Distribution_Scenarios.html) describe the reference scenario. The workflows in relation to the booking and usage of network capabilities are described here with a focus on quality of service (QoS).
diff --git a/pages/network_apis.md b/pages/network_apis.md
index 573fc458..2748c5d1 100644
--- a/pages/network_apis.md
+++ b/pages/network_apis.md
@@ -33,7 +33,8 @@ The following resources are available:
### Network Capability Exposure for Content Production and Contribution Scenarios
-* [**Scenarios and Use Cases**](./Network_APIs/Content_Production/Production_Contribution_Scenarios.html). This is a selection of scenarios and use cases that may benefit from the use of network services (exposed via APIs).
+* [**Introduction**](./Network_APIs/Content_Production/Production_Contribution_Introduction.html). Introduction to the work on Network APIs for Media Production.
+* [**Scenarios & Use Cases**](./Network_APIs/Content_Production/Production_Contribution_Scenarios.html). This is a selection of scenarios and use cases that may benefit from the use of network services (exposed via APIs).
* [**Workflows and Requirements to exploit network capabilities**](./Network_APIs/Content_Production/Production_Contribution_Workflows.html). This describes generic workflows and interactions to exploit network capabilities and provides insight into devices and requirements.
* [**Using CAMARA APIs**](./Network_APIs/Content_Production/Production_Contribution_UsingCAMARAAPIs.html). This contains several examples of instantiations of the workflows and scenarios above when using CAMARA APIs.
diff --git a/pages/xr/mpeg-i-scene-description.md b/pages/xr/mpeg-i-scene-description.md
index 26d5e7eb..e245c536 100644
--- a/pages/xr/mpeg-i-scene-description.md
+++ b/pages/xr/mpeg-i-scene-description.md
@@ -37,19 +37,19 @@ MPEG-I SD defined the following reference architecture.
A first set of extensions (green in the figure) enable the timed framework including:
-* [**MPEG_media**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_media/README.md), which enables the referencing of external media streams that are delivered over protocols such as RTP/SRTP, MPEG-DASH, or others
-* [**MPEG_accessor_timed**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_accessor_timed/README.md), used in a scene that contains timed media and/or metadata to describe access to the dynamically changing data
-* [**MPEG_buffer_circular**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_buffer_circular/README.md), to extend the buffer into a circular buffer
+* [MPEG_media](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_media/README.md), which enables the referencing of external media streams that are delivered over protocols such as RTP/SRTP, MPEG-DASH, or others
+* [MPEG_accessor_timed](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_accessor_timed/README.md), used in a scene that contains timed media and/or metadata to describe access to the dynamically changing data
+* [MPEG_buffer_circular](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_buffer_circular/README.md), to extend the buffer into a circular buffer
A second group of extensions (blue in the figure) enables the inclusion of dynamic and temporal media including:
-* [**MPEG_texture_video**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_texture_video/README.md). provides the possibility to link a texture object defined in glTF 2.0 to media and its respective track
-* [**MPEG_audio_spatial**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_audio_spatial/README.md), to support spatial audio
-* [**MPEG_mesh_linking**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_mesh_linking/README.md), provides the possibility to link a mesh to another mesh in a glTF asset
-* [**MPEG_scene_dynamic**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_scene_dynamic/README.md), [**MPEG_viewport_recommended**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_viewport_recommended/README.md), and [**MPEG_animation_timing**](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_animation_timing/README.md), which indicate that a particular form of timed data is provided to the Presentation Engine during the consumption of the scene and that it shall adapt to the changing information.
+* [MPEG_texture_video](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_texture_video/README.md), which provides the possibility to link a texture object defined in glTF 2.0 to media and its respective track
+* [MPEG_audio_spatial](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_audio_spatial/README.md), to support spatial audio
+* [MPEG_mesh_linking](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_mesh_linking/README.md), which provides the possibility to link a mesh to another mesh in a glTF asset
+* [MPEG_scene_dynamic](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_scene_dynamic/README.md), [MPEG_viewport_recommended](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_viewport_recommended/README.md), and [MPEG_animation_timing](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/MPEG_animation_timing/README.md), which indicate that a particular form of timed data is provided to the Presentation Engine during the consumption of the scene and that it shall adapt to the changing information.
A third group of extensions enables the distribution of real-time immersive and interactive media content including:
-* Augmented Reality anchor (**MPEG_scene_anchor**, **MPEG_node_anchor**), to support AR experiences where virtual content is inserted into the user's real environment
-* Interactivity (**MPEG_scene_interactivity**, **MPEG_node_interactivity**), to describe interactivity at runtime with support for interactions between user and virtual objects and between virtual objects, with triggers based on proximity, visibility, collision or user input.
-* Avatar (**MPEG_node_avatar**), to support the representation of 3D avatars.
-* Lighting (**MPEG_light**), to provide a realistic user experience including shadows and lighting.
-* Haptics (**MPEG_haptic**, **MPEG_material_haptic**), to support haptics based on the MPEG standard for Coded representation of Haptics by attaching haptic information to a node or to a mesh.
+* Augmented Reality anchor (MPEG_anchor), to support AR experiences where virtual content is inserted into the user's real environment
+* Interactivity (MPEG_scene_interactivity, MPEG_node_interactivity), to describe interactivity at runtime with support for interactions between user and virtual objects and between virtual objects, with triggers based on proximity, visibility, collision or user input.
+* Avatar (MPEG_avatar), to support the representation of 3D avatars.
+* Lighting (MPEG_lights_texture_based, MPEG_light_punctual), to provide a realistic user experience including shadows and lighting.
+* Haptics (MPEG_haptic), to support haptics based on the MPEG standard for Coded representation of Haptics by attaching haptic information to a node or to a mesh.