1. INTRODUCTION

MIST Studio can be controlled by a third-party system using Marquise Technologies' STORM API.

There are two ways of communicating with the application:

  • A Command Line Interface

  • A REST Interface

This document describes the STORM API.

1.1. Legal Information

1.1.1. Copyright Notice

All rights reserved. No part of this document may be reproduced, copied or transmitted in any form by any means electronic, mechanical or otherwise without the permission of Marquise Technologies sàrl. If you are interested in receiving permissions for reproduction or excerpts, please contact us at contact@marquise-tech.com

1.1.2. Trademarks

Marquise Technologies, the logo and MIST are trademarks of Marquise Technologies sàrl.

All other trademarks mentioned here within are the property of their respective owners.

1.1.3. Notice of Liability

The information in this document is distributed and provided "as is" without warranty.

While care has been taken during the writing of this document to make the information as accurate as possible, neither the author nor Marquise Technologies sàrl shall be held responsible for losses or damages to any person or entity resulting from the use of the instructions given in this document.

1.1.4. Disclaimer of Warranty

*THIS SOFTWARE AND THE ACCOMPANYING FILES ARE PROVIDED "AS IS" AND WITHOUT WARRANTIES AS TO PERFORMANCE OR MERCHANTABILITY OR ANY OTHER WARRANTIES WHETHER EXPRESSED OR IMPLIED.

NO WARRANTY OF FITNESS FOR A PARTICULAR PURPOSE IS OFFERED.*

STORM API

Version 2021

Application Programming Interface

1. About STORM API

STORM is the name of the engine powering all the software applications of Marquise Technologies. Its API is described in the following sections:

  • Operational Contexts

  • Command Line Interface

  • REST API

  • STORM API References

2. Operational contexts

Each job supported by STORM will require an operational context as the main parameter. The context defines the kind of task that must be performed.

When using the command line interface, the context is the first parameter and is mandatory. It also determines which further parameters and options are accepted. Below is an example of the usage of a context on the command line:

2.1. Command Line Interface example

stormserver mediainfo -i "E:/SOURCE/testFile.mxf" -format pdf -o "E:/RESULT/testFileInfo.pdf"

The exact syntax for the command line interface is explained in the following sections.

2.2. REST API example

When adding a job, the JSON parameters must contain the context definition, otherwise the job will not be added to the queue. Here is an example of how a job context is identified in the JSON parameters:

{
  "context" : "mediainfo",
  "input"   : "E:/SOURCE/testFile.mxf",
  "output"  : "E:/RESULT/testFileInfo.pdf",
  "format"  : "pdf",
  ...
}

2.3. Operational Contexts

The values for the context are described in the table below:

Table 1. Operational Contexts

  Context       Details
  analyze       Creates a content analysis report
  enum          Enumerates the various entities supported by the API, such as mastering formats, codecs, etc.
  filecopy      Copies a file from source to destination
  filemove      Moves a file from source to destination
  ftpdownload   Downloads a file from an FTP source to a local repository
  ftpupload     Uploads a file from a local repository to an FTP destination
  help          Displays the command line interface usage instructions
  masterinfo    Creates a mastering information report in various file formats
  mediainfo     Creates a media information report in various file formats
  validate      Creates a QC report for a master in various file formats
  xcode         Transcodes a file, a composition or a package into various file formats
  websrv        Starts the STORM API server as a webservice
  webwrk        Starts the STORM API server as a webworker

For each of the above operational contexts, a set of parameters is defined.

2.3.1. xcode Operational Context parameters

Table 2. xcode context parameters

  Parameter   Required/Optional   Details
  input       Required            Absolute file name of the source file in one of the remote mounted volumes (e.g. "F:/MEDIA/test.mxf").
  output      Required            Absolute file name of the destination file in one of the remote mounted volumes (e.g. "G:/RENDERS/result.mov").
  format      Optional            Defines the container format (e.g. "quicktime").
  preset      Optional            Absolute file name of the XML preset to be used for the output file. It is mutually exclusive with the format parameter.
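
As an illustration, a minimal xcode invocation on the command line, using the format parameter instead of a preset, might look like this (file names are placeholders):

stormserver xcode -i "E:/SOURCE/testFile.mxf" -format quicktime -o "E:/RESULT/result.mov"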

2.3.2. websrv Operational Context parameters

Table 3. websrv context parameters

  Parameter   Required/Optional   Details
  workers     Optional            Specifies the number of worker nodes to be launched as local sub-processes by this node.
  service     Optional            If specified, the node can be installed as a system service.

2.3.3. webwrk Operational Context parameters

Table 4. webwrk context parameters

  Parameter     Required/Optional   Details
  address       Required            Specifies this node’s URL, e.g. "http://127.0.0.1"
  port          Required            Specifies this node’s communication port.
  nodeaddress   Required            Specifies the URL of the manager node with which the worker node should register, e.g. "http://127.0.0.1:8080"

2.3.4. Enumerable entities

Table 5. Enumerables

  Option           Details
  masters          Displays a list of available master format names
  formats          Displays a list of available output file format names
  specifications   Displays a list of available specifications
  shims            Displays a list of available shims
  shimversions     Displays a list of available shim versions

3. Command Line Interface API

Storm can be used via the command line interface (CLI) just like any other operating system command. This interface is simple and practical for integration into scripts for batch processing and is supported by most interpreted languages (e.g. PHP).

The syntax of the command line interface follows a structure based on contexts, actions, objects and options. The general syntax can be described like this:

stormserver [context] [action] [object 1] .. [object n] [option 1] .. [option n]

Example 1:

stormserver enum -masters

The example above enumerates the possible mastering formats.

Example 2:

stormserver mediainfo -i "E:/video.mov" -format "xml" -o "E:/metadata.xml"

The example above extracts the metadata from the file "E:/video.mov" into the file "E:/metadata.xml".

Please note that the double quotes around options are only necessary when an option contains white space or characters that would otherwise be interpreted by the operating system's command line. In most cases they can be omitted; however, it is always safe to use them.
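
Because the CLI behaves like any other command, it can be driven from a simple batch script. The sketch below (a Windows batch file with hypothetical folder names) generates a PDF media information report for every MXF file in a source folder:

@echo off
rem Hypothetical folders; adjust the paths to your installation.
for %%F in (E:\SOURCE\*.mxf) do (
    stormserver mediainfo -i "%%F" -format pdf -o "E:\RESULT\%%~nF_info.pdf"
)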

3.1. Using the CLI on Microsoft Windows systems

To use the command line interface on Microsoft Windows systems, the following steps must be followed:

  1. Press the Windows key and R (for run)

  2. When the prompt window appears (next to the Windows Start button), type cmd (for command) and press Enter. The console/command line tool should be displayed.

  3. Go to the installation directory using the cd command (change directory):

E:\>cd "Marquise Technologies\x64"

Note that if the installation drive is different from the current drive, the current drive (volume) must be changed by typing the drive/volume letter followed by the colon character:

Change current drive

C:\Users\marquise>e:

Change current directory

E:\>cd "Marquise Technologies\x64"

The last step is to type the STORM Server command as illustrated above, followed by the operational context and the list of parameters for that context. After this sequence, STORM Server will start in the specified operational context and continue with the requested processing.

4. REST Interface

The STORM API can be accessed via the REST (REpresentational State Transfer) interface, which allows a scalable, service-oriented use. This interface is simple and practical for integration into scripts for batch processing and is supported by most interpreted languages (e.g. PHP).

The STORM API REST interface is implemented via the HTTP protocol. Commands are sent as JSON structured requests.

In the following documentation, cURL is used as a tool to demonstrate the HTTP requests to the STORM API server. cURL is a free, multi-platform tool; it can be obtained from https://curl.se.

4.1. Starting the server

In order to start STORM Server as a REST web service, you must use the following command:

stormserver websrv

The command above will start the STORM server as a web service listening for HTTP requests on port 8080 by default. This will result in the following:

STORM v2021.1.5.0
(C) 2009-2021 Marquise Technologies
Release Date Jan 5 2021
start threadpool
starting up server... done!
listening on port 8080
waiting requests...

At this point the server will keep running until the /control/shutdown request is received or the command line process is killed.

If you wish to change the communication port, the following command must be used:

stormserver websrv -port 1234

STORM Server can also be installed and started as a system service. In order to do so you must specify the -service flag when installing the server as a service.

The following example uses the Windows sc.exe service manager to install then start STORM Server as a service:

sc create "STORMSERVER" binPath= "stormserver websrv -service"
sc start "STORMSERVER"

4.2. How jobs are managed

Storm processes media via a list of queued jobs. These jobs can be created directly via the API, or can be spawned automatically by a watchfolder that is actively monitored by a specific node.

Regardless of the manner in which the job has been created, once the job has been added to the queue of jobs, it can be further manipulated like any other regular job.
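
For instance, a job can be created directly with a single REST call; the jobs endpoint is described in detail in the STORM API Reference:

curl -X POST -T "E:/job_parameters.json" "http://127.0.0.1:8080/jobs"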

4.3. Processing graphs

Processing graphs are complex XML descriptions of various image processing, audio processing and transcoding functions that Storm can apply to the media that needs to be processed.

The processing graph can describe one or more outputs, each output being the result of a list of daisy-chained processing nodes.

The following example shows the basics of a processing graph document (incomplete):

<?xml version="1.0" encoding="UTF-8" ?>
<mtprocessinggraph>
  <nodelist>
    <node uuid="12d42cf7-76ea-42ba-bc39-40e684176b08" type="source">
    ...
    </node>
  </nodelist>
  <linklist>
    <link target="13015df7-a165-47a5-9061-50ce646dcfed" input="0" source="12d42cf7-76ea-42ba-bc39-40e684176b08" output="0"/>
    ...
  </linklist>
</mtprocessinggraph>

Each node has a unique id, a type and a list of type-specific parameters.

4.3.1. Node linking

The processing order of the nodes is dictated by a list of links that connect the various nodes involved in the processing of an output. This mechanism allows a node to be used in multiple processing combinations within the same graph.

Each link has a target node id (i.e. the next node in the processing chain), as well as an input number. The number of inputs depends on the target, however most nodes have only one input and therefore the input number is usually 0.

In addition to the target description, each link has a source node id (i.e. the previous node in the processing chain), as well as an output number. The number of outputs depends on the source, however most nodes have only one output and therefore the output number is usually 0.
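
As a sketch of this reuse mechanism (the target UUIDs below are placeholders), a single source node can feed two separate processing chains by appearing as the source of two links:

<linklist>
  <!-- the same source node feeds two different target nodes -->
  <link target="aaaaaaaa-0000-0000-0000-000000000001" input="0" source="12d42cf7-76ea-42ba-bc39-40e684176b08" output="0"/>
  <link target="aaaaaaaa-0000-0000-0000-000000000002" input="0" source="12d42cf7-76ea-42ba-bc39-40e684176b08" output="0"/>
</linklist>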

4.3.2. Processable nodes

Only output nodes are actually processed by Storm. Nodes that are not connected to an output are ignored and do not generate any processing.

4.3.3. Processing Graph example

The following example describes a full processing graph with two processing nodes and one output node. Also, please note the top node of type "source", which describes the input media.

<?xml version="1.0" encoding="UTF-8" ?>
<mtprocessinggraph>
  <nodelist>
    <node uuid="12d42cf7-76ea-42ba-bc39-40e684176b08" type="source"/>
    <node uuid="13015df7-a165-47a5-9061-50ce646dcfed" type="lut">
      <annotation>Untitled Output</annotation>
      <filename>E:\Marquise Technologies\x64\luts/HDR 1000 nits to Gamma 2.4.cube</filename>
    </node>
    <node uuid="97ce96df-08f4-4e9e-8fb9-9e4222c687d2" type="imageResize">
      <annotation>Untitled Output</annotation>
      <width>1920</width>
      <height>1080</height>
      <aspectratio>0</aspectratio>
      <fit>3</fit>
      <flip>false</flip>
      <flop>false</flop>
      <expandmethod>0</expandmethod>
      <compressmethod>0</compressmethod>
    </node>
    <node uuid="29b7adb8-24b8-4d30-9420-b436eb6b44fa" type="output">
      <annotation>Untitled Output</annotation>
      <directory>$ProjectName$/$CompositionName$</directory>
      <filename>$ClipName$</filename>
      <container>mxfop1a</container>
      <video>
        <channel>0</channel>
        <codec>dnxhd</codec>
        <quality>hqx</quality>
        <bitdepth>10</bitdepth>
        <colorprimaries>CIEXYZ</colorprimaries>
        <transfercharacteristic>SMPTE.ST2084</transfercharacteristic>
        <codingequations>ITU-R.BT.2020</codingequations>
        <coderange>full</coderange>
        <ST2086Metadata enabled="false">
          <MaximumLuminance>0</MaximumLuminance>
          <MinimumLuminance>0</MinimumLuminance>
          <WhitePointChromaticity>
            <x>0.333333</x>
            <y>0.333333</y>
          </WhitePointChromaticity>
          <Primaries>
            <ColorPrimaryRed>
              <x>0.333333</x>
              <y>0.333333</y>
            </ColorPrimaryRed>
            <ColorPrimaryGreen>
              <x>0.333333</x>
              <y>0.333333</y>
            </ColorPrimaryGreen>
            <ColorPrimaryBlue>
              <x>0.333333</x>
              <y>0.333333</y>
            </ColorPrimaryBlue>
          </Primaries>
        </ST2086Metadata>
      </video>
      <audio>
        <byteorder>little</byteorder>
        <soundfieldlist>
          <soundfield config="51" downmix="no">
            <name>surround mix</name>
            <language>ja</language>
            <mapping/>
          </soundfield>
          <soundfield config="ST" downmix="no">
            <name>stereo downmix</name>
            <language>ja</language>
            <mapping/>
          </soundfield>
        </soundfieldlist>
      </audio>
      <timecode>
        <tcorigin>0</tcorigin>
        <tcstart>00:00:00:00</tcstart>
      </timecode>
    </node>
  </nodelist>
  <linklist>
    <link target="13015df7-a165-47a5-9061-50ce646dcfed" input="0" source="12d42cf7-76ea-42ba-bc39-40e684176b08" output="0"/>
    <link target="97ce96df-08f4-4e9e-8fb9-9e4222c687d2" input="0" source="13015df7-a165-47a5-9061-50ce646dcfed" output="0"/>
    <link target="29b7adb8-24b8-4d30-9420-b436eb6b44fa" input="0" source="97ce96df-08f4-4e9e-8fb9-9e4222c687d2" output="0"/>
  </linklist>
</mtprocessinggraph>

4.4. Working with watchfolders

Storm can handle a list of watchfolders that are actively monitored.

Every new entry (i.e. a new file or a new directory) in one of the monitored watchfolders is checked by Storm against a list of filters attached to the watchfolder.

If the entry matches one of the filters, then a new job is automatically added to the queue of jobs. The job is described by a processing graph associated with the watchfolder.

4.4.1. How watchfolders spawn jobs

As described previously, watchfolders monitor specific file system locations for incoming media. Once a media file has been accepted by one of the filters associated with the watchfolder, a job is automatically spawned.

The job will process the media using the processing graph or preset that was associated with the watchfolder when the watchfolder was created.

Once the job is spawned, it is automatically queued to the list of jobs and therefore can be managed as any regular job.

4.4.2. Input and output folders of watchfolder jobs

Each watchfolder descriptor has a definition for an input and an output folder. The input folder is the folder being monitored, the output folder is the location where the results are generated.

4.5. Compositions

Compositions are XML timeline descriptions of various audio, video, subtitle and data elements synchronized together. Storm can process complex compositions, apply various processing operators to them and generate multiple outputs.

4.5.1. Composition example

The following example shows a simple composition document (incomplete):

<?xml version="1.0" encoding="UTF-8"?>
<axf>
  <project>
    <name>Simple</name>
    <format>
      <width>3840</width>
      <height>2160</height>
      <aspectratio>1:1</aspectratio>
      <framerate>24:1</framerate>
      <cms>
      ...
      </cms>
    </format>
    <composition>
      <name>simple</name>
        <type>2</type>
        <in>0</in>
        <out>750</out>
        <format>
          <width>1920</width>
          <height>1080</height>
          <aspectratio>1:1</aspectratio>
          <framerate>25:1</framerate>
          <cms>
          ...
          </cms>
        </format>
        <soundfieldlist>
          <soundfield>
            <audioconfig>LtRt</audioconfig>
            <mapping>
            ...
            </mapping>
          </soundfield>
        </soundfieldlist>
        <uid>0c85ed19-fb97-405d-ac6b-8c7057cee84a</uid>
        <media>
          <video>
            <layer>
              <name>V1</name>
              <v1>
                <segment type="clip">
                  <name>Clearcast 30sec AD FINAL_1080p_full</name>
                  <uid>46c307d6-e5b5-47b0-ac7a-b7d2f0307861</uid>
                  <in>0</in>
                  <out>750</out>
                  <source>
                    <refname>/Clearcast 30sec AD FINAL_1080p_full.mov</refname>
                    <url>F:\Adstream\Clearcast 30sec AD FINAL_1080p_full.mov</url>
                    <in>0</in>
                    <out>750</out>
                  </source>
                </segment>
              </v1>
            </layer>
          </video>
          <audio>
            <layer>
              <name>Lt</name>
              <routing>
                <channel>0</channel>
              </routing>
              <audiotrack>
                <segment type="clip">
                  <name>Clearcast 30sec AD FINAL_1080p_full</name>
                  <uid>43564d4b-2ddc-4938-b52e-755867df3bc1</uid>
                  <in>0</in>
                  <out>750</out>
                  <groupid>0</groupid>
                  <source>
                    <refname>/Clearcast 30sec AD FINAL_1080p_full.mov</refname>
                    <url>F:\Adstream\Clearcast 30sec AD FINAL_1080p_full.mov</url>
                    <in>0</in>
                    <out>750</out>
                    <routing>
                      <channel>0</channel>
                    </routing>
                  </source>
                </segment>
              </audiotrack>
            </layer>
            <layer>
            ...
            </layer>
          </audio>
          <subtitles>
          ...
          </subtitles>
          <auxdata>
          ...
          </auxdata>
        </media>
      </composition>
    </project>
</axf>

Compositions can have multiple tracks, multiple layers and can deal with complex editing information, just like any NLE timeline.

4.6. Output presets

Output presets are XML descriptions that define the output format for the input media passed to Storm server. They are templates that can be reused when transcoding.

The following example shows an output preset that describes an MP4 container with h264-encoded video at 720p resolution:

<?xml version="1.0" encoding="UTF-8" ?>
<MasterDeliverySpecification>
  <Name>720p_H264_MP4_noTCStart</Name>
  <Type>MPEG4</Type>
  <Specification>custom</Specification>
  <Shim>isobmff.mp4</Shim>
  <ShimVersion>default</ShimVersion>
  <Presentation>monoscopic</Presentation>
  <Video>
    <FrameWidthList>
      <FrameWidth>1280</FrameWidth>
    </FrameWidthList>
    <FrameHeightList>
      <FrameHeight>720</FrameHeight>
    </FrameHeightList>
    <FrameRateList>
      <FrameRate>24:1</FrameRate>
    </FrameRateList>
    <Codec>
      <Id>h264</Id>
      <ColorEncoding>YCbCr.4:2:2</ColorEncoding>
      <Profile>baseline</Profile>
    </Codec>
    <Colorimetry>
      <ColorSpace>custom</ColorSpace>
      <CodeRange>head</CodeRange>
    </Colorimetry>
  </Video>
  <Audio>
    <Codec>
      <Id>aac</Id>
    </Codec>
    <SoundfieldList/>
  </Audio>
</MasterDeliverySpecification>

For a detailed description of the Output Preset file format see: MT-TN-20 Output Preset File Format Specification
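
As an illustration, an output preset can be referenced from a transcoding job through its file path; the sketch below (paths are placeholders) uses the preset defined above via the preset parameter of an xcode job:

{
    "context" : "xcode",
    "input"   : "E:/SOURCE/testFile.mxf",
    "output"  : "E:/RESULT/result.mp4",
    "preset"  : "E:/PRESETS/720p_H264_MP4_noTCStart.xml"
}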

4.7. Understanding logs

There are various objects that can be manipulated by TORNADO. These objects are entities upon which the process can execute an action, such as jobs, output presets, watchfolders or workers. Moreover, each object has its own dedicated manager. The role of the manager is to handle the object’s lifecycle.

The actions that happen at the manager level and at the object level can generate events. This difference is reflected in the way logs are accessed via the REST API: the /<object_type>/log endpoint lists all the events that happened at the manager level, while /<object_type>/<object_UUID>/log lists the events related to a specific object. These events can be used to trace and understand the activity of the system.

For example, in the case of job processing, the object is the job itself while the manager handles the different stages required for executing the type of task represented by the job.

When a job is created, as in the case of a transcoding job, a series of steps is executed in order to validate its properties and create the delivery engine.

If an error was made in the request structure, such as using the wrong execution context, then an error is raised only at the manager level and an event is generated. The event can be consulted by calling /jobs/log:

{
    "events": [
        {
            "timestamp": "2022-02-07T11:13:53.111+00:00",
            "severity": "error",
            "message": "Processing not supported for context",
            "code": 8
        }
    ],
    "status": "ok"
}

If an error was made at the codec quality level when configuring a preset, then an error is raised while validating the codec constraints. This error will generate one or more events at the job object level, which can be consulted by calling /jobs/<job_UUID>/log:

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "severity": "error",
            "message": "Video codec quality not supported!",
            "code": 8
        },
        {
            "timestamp": "2022-02-07T12:29:44.820+00:00",
            "severity": "error",
            "message": "Cannot create delivery engine",
            "info": "D:/TEST_FOOTAGE/DCP/DCP/sintel-with-markers",
            "code": 2
        }
    ],
    "status": "ok"
}

The first event is generated by the codec quality validation failure, which in turn causes an error in creating the delivery engine, generating another event. These errors will cause the job process to fail. Because job processing includes the validation and engine creation procedures, that event is logged at the manager level and can be consulted by calling /jobs/log:

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:47.237+00:00",
            "severity": "error",
            "message": "Failed during job processing",
            "code": 8
        }
    ],
    "status": "ok"
}

5. About STORM node architecture

5.1. Manager and Worker nodes

When starting STORM Server in webserver mode, via the websrv operational context, it will automatically accept and handle jobs via an internally managed job queue. This means that the process handles REST requests and jobs on the same node. In this case the job queue is processed sequentially, one job at a time.

STORM Server can also be used as a manager node. In this case it has a registered list of worker nodes that handle the received jobs while the manager node is used to distribute jobs and control the worker pool.

5.2. Creating Worker nodes

A STORM Server worker node can be started as a subprocess of a manager node or as a standalone webserver.

A subprocess worker node can only be created by a manager node and it can only run locally on the same machine as the manager.

A standalone worker node can only be created by an external process and it can run locally or remotely.

In both cases a worker always needs to register with a manager node when started.

5.2.1. Starting Worker nodes as subprocesses

In order to start workers as subprocesses, a manager node must first be started, via the CLI, with the -workers command option followed by the number of workers to be spawned.

Example starting a manager node with 4 workers as subprocesses:

stormserver websrv -workers 4

5.2.2. Starting Worker nodes manually

A worker node can be started manually or via an automated script used by an external application. In this case, prior knowledge of existing manager nodes and their addresses is required, as a worker can only be started if it can register with a manager node.

Example starting a worker node with a given manager:

stormserver webwrk -address "http://192.168.1.10" -port 6060 -nodeaddress "http://192.168.10.1:8080"

The above command will start a worker on a machine with the given address and on the specified port. The worker will try to register with a manager node having the address specified by the nodeaddress parameter.

5.3. Managing workers

In a standard configuration, STORM Server processes media via a list of queued jobs managed on a single, local node. These jobs are handled sequentially, one at a time.

There are cases when multiple nodes are required to handle jobs. In that case some of those nodes will be workers.

Workers can be managed by using the dedicated workers REST API available on the manager node on which they are registered.

STORM API Reference

1. Compositions

1.1. Obtain the list of compositions

Storm maintains a list of compositions that can be obtained at any time. The following example shows you how to obtain the list of compositions registered on a given node:

Table 6. Request structure

  URL            /compositions
  Query Params   To be added
  Method         GET

1.1.1. Request

Request example (curl)

curl "http://127.0.0.1:8080/compositions"

1.1.2. Response

Response structure:
{
	"status" : "ok",
	"compositions" :
	[
		{
			"uuid" : "0326122f-3273-4700-af80-4c4e5d34fcdb",
			"name" : "single clip edit"
		},
		{
			"uuid" : "ab737d47-ce37-401b-8886-f483a9b9904b",
			"name" : "main program with color bars inserted",
		}
	]
}

1.2. Add a new composition

The following example shows you how to add a new composition to the list managed by a specific node:

Table 7. Request structure

  URL      /compositions
  Method   POST

1.2.1. Request

Request example:
curl -X POST -H "Content-Type: text/xml" -T E:/simple_composition.xml "http://127.0.0.1:8080/compositions"

1.2.2. Parameters

The command sends an XML composition to be added to the list of managed compositions. No other parameters are associated with this command.

1.2.3. Response

The response contains the status of the operation as well as the different timestamps associated with the request.

A UUID is assigned to the new composition and included in the response. This unique ID can be used for retrieval and further management of the composition after its creation.

Response example:
{
    "uuid": "ae81f492-04f5-43d1-b766-64dcbb97dc3b",
    "name": "SimpleComposition",
    "filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/compositions/ae81f492-04f5-43d1-b766-64dcbb97dc3b.axf"
}

1.3. Delete a composition

The following example shows you how to delete a composition from the list of compositions managed by a specific node:

Table 8. Request structure

  URL      /compositions/{composition UUID}
  Method   DELETE

1.3.1. Request

Request example:
curl -X DELETE "http://127.0.0.1:8080/compositions/13ff097c-d4c9-43c0-a272-fb69a7a062e0"

1.3.2. Response

The response is the status of the operation.

Response example:
{
  "status" : "ok",
}

The following table shows the possible status codes that this function can return:

Table 9. Status codes

  Status   Details
  ok       The composition was successfully deleted.
  error    The composition cannot be found in the list of compositions managed by this node.
  busy     The composition is currently in use and cannot be deleted.

2. Color Management System

2.1. Using the CMS parameters

The Color Management System in TORNADO defines the working color space of the composition. It can be used via the CLI or the REST API for performing colorimetric transformations, like color space conversions when transcoding files.

The CMS works in two steps. The first step is determining the source’s properties: TORNADO automatically detects the input’s properties on a best-effort basis. This step is automatic and can be modified only by using a CutList or an AXF file. The second step is the definition of the target Color Management settings. If these are not specified, the native settings of the composition, detected during the first step, are used.

In order to override the default target Color Management System, a series of parameters must be specified. Any parameter left unspecified falls back to the detected source value.

2.2. Color Management System Parameters

Table 10. CMS Parameters

Each parameter below is of type tag: a fixed text string value defined in a controlled vocabulary (see Enums).

  Parameter   Defines
  system      The working CMS system to be used.
  version     The version of the CMS system to be used.
  workflow    The HDR workflow to be used.
  primaries   The Color Primaries to be used.
  eotf        The Electro-Optical Transfer Function (also called gamma curve) to be used.
  matrix      The Coding Equations to be used.
  cat         The Chromatic Adaptation to be used.
  dtm         The Dynamic Tone Mapping to be used.
  dtmtarget   The Dynamic Tone Mapping Target to be used.

Below is an example of using CMS override parameters, via the CLI or REST APIs, to define the target CMS for a DCP composition used to generate an MPEG4 h264 proxy:

2.3. Command Line Interface example

stormserver xcode -i "E:/SOURCE/DCP" -preset "E:/PRESETS/preset_proxy_h264.xml" -cmssystem "MTCMS" -cmsworkflow "custom" -cmsprimaries "ITU.BT.709" -cmseotf "ITU-R.BT.709" -cmscat "bradford" -o "E:/RESULT/dcp_proxy.mp4"

2.4. REST API example

{
    "context" : "xcode",
	"input" : "E:/SOURCE/DCP",
    "cms" : {
        "system" : "MTCMS",
        "workflow" : "custom",
        "primaries" : "ITU.BT.709",
        "eotf" : "ITU-R.BT.709",
        "cat" : "bradford"
    },
    "preset" : "E:/PRESETS/preset_proxy_h264.xml",
	"output" : "E:/RESULT/dcp_proxy.mp4"
}

This example will override the detected native CMS settings by defining the target CMS. This will cause a colorimetric transformation using the provided values. In this specific case the proxy will be encoded to the Rec709 color space.

The first step is to specify the CMS system. By default this property has the value NATIVE, indicating that the composition’s detected CMS should be used. This parameter must be overridden in order to define a custom workflow. If you select MTCMS, then you must specify the working color space (primaries), the transfer curve (eotf) and the chromatic adaptation (cat). These parameters define the target CMS.

The transformation to the target CMS is made based on the source parameters. If no metadata is available at the moment of source characterisation then Rec709 is used by default.
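
The accepted values for each CMS parameter can be listed via the corresponding enumerable objects (see Enumerables), for example:

curl -X GET "http://127.0.0.1:8080/enum/cmsprimaries"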

3. Enumerables

3.1. Enumerable objects and their use

Certain values in the file formats specific to TORNADO, like Output Presets or AXFs, use a predefined controlled vocabulary. These values are reflected in the various enumerable objects accessible via the CLI or the REST API. The list of the currently available objects is subject to a continuous revision and extension.

While the structure of an enumerable object might vary slightly, there are element types that are common to all enum objects. One of these elements is the tag type. This type is of particular interest because it is referenced throughout the different technical notes describing the structure of specific file formats like Output Presets or composition AXF files.
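
For example, a tag returned by an enumerable object can be used verbatim as a parameter value elsewhere in the API. The sketch below (paths are placeholders) uses the "AIT" tag from the masters enumeration as the master parameter of a job:

{
    "context" : "xcode",
    "input"   : "E:/SOURCE/testFile.mxf",
    "output"  : "E:/RESULT/package",
    "master"  : "AIT"
}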

3.2. Enumerate objects

The following example shows you how to list the currently available enumerable objects:

Table 11. Request structure

  URL      /enum/<enum_object_name>
  Method   GET

3.2.1. Request

curl -X GET "http://127.0.0.1:8080/enum/<enum_object_name>"

3.2.2. Enumerable object names

  Object Name      Description
  masters          Enumerates the available mastering formats
  specifications   Enumerates the available specifications
  shims            Enumerates the available shims
  shimversions     Enumerates the available shim versions
  formats          Enumerates the available input and output formats
  colorspaces      Enumerates the available color spaces
  cmssystem        Enumerates the available CMS systems
  cmsworkflow      Enumerates the available CMS workflows
  cmsprimaries     Enumerates the available CMS color primaries
  cmseotf          Enumerates the available CMS electro-optical transfer functions
  cmsmatrix        Enumerates the available CMS coding equations
  cmscat           Enumerates the available CMS chromatic adaptations
  cmsdtm           Enumerates the available CMS dynamic tone mappings
  cmsdtmt          Enumerates the available CMS dynamic tone mapping targets

Request example:

The following request lists the available masters.

curl -X GET "http://127.0.0.1:8080/enum/masters"

3.2.3. Response

The response is the status of the operation as well as an array containing the available masters:

{
    "masters": [
        {
            "name": "AICP Master",
            "description": "Export AICP Master",
            "tag": "AICP"
        },
        {
            "name": "Apple iTunes Package",
            "description": "Export Apple iTunes Package",
            "tag": "AIT"
        },
        ...
        {
            "name": "YUV4MPEG2 Master",
            "description": "Export YUV4MPEG2 Master",
            "tag": "Y4M"
        }
    ],
    "status": "ok"
}

4. Nodes Control

4.1. Controlling nodes via the REST API

STORM Server nodes can be controlled via the dedicated REST API. The API can be used to put in place automated polling mechanisms that check the status of nodes, shut them down or restart them.

4.2. Check the status of a node

The following example shows you how to check the status of a node:

Table 12. Request structure

  URL      /control/status
  Method   GET

4.2.1. Request

curl -X GET "http://127.0.0.1:8080/control/status"

4.2.2. Response

The response is the status of the operation:

{
    "status": "ok"
}

If an error occurs in the communication, the node will not respond.

4.3. Shutting down a node

Shutting down a node will terminate the process and all managed workers registered with this node.

This operation blocks until all registered nodes have shut down. This should be taken into account when implementing timeouts for this call.
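
When scripting this call, the client-side timeout can be sized accordingly; for example, with cURL (the 300-second value is arbitrary):

curl --max-time 300 -X POST "http://127.0.0.1:8080/control/shutdown"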

The following example shows you how to shut down a node:

Table 13. Request structure

  URL      /control/shutdown
  Method   POST

4.3.1. Request

curl -X POST "http://127.0.0.1:8080/control/shutdown"

4.3.2. Response

The response is the status of the operation:

{
    "status": "ok"
}

If an error occurs in the communication, the node will not respond.

5. Jobs

5.1. Obtain the list of jobs

Storm maintains a list of jobs that can be obtained at any time. This is the easiest method to get the status of all the running or pending jobs. The following example shows you how to obtain the list of jobs currently running or pending on a given node:

Table 14. Request structure

  URL      /jobs
  Method   GET

5.1.1. Request

curl "http://127.0.0.1:8080/jobs"

5.1.2. Response

{
	"status" : "ok",
	"jobs" :
	[
		{
			"uuid" : "977a2c65-b4d4-4848-9464-8aefd52538e7",
			"status" : "queued",
			"progress" : "0"
		},
		{
			"uuid" : "887b12ee-30f1-4059-afff-db3e89a4e270",
			"status" : "queued",
			"progress" : "0"
		}
	]
}

5.2. Get job description

The following example shows you how to retrieve the details for a given job:

Table 15. Request structure

  URL      /jobs/<job_UUID>
  Method   GET

5.2.1. Request

curl -X GET "http://127.0.0.1:8080/jobs/<job_UUID>"

5.2.2. Response

The response contains the status of the operation as well as the full description of the requested job:

{
    "job": {
        "uuid": "b9206f48-42a3-4557-a80e-1912160b776e",
        "priority": 0,
        "status": "queued",
        "requesttimestamp": "2021-02-17T21:44:49+00:00",
        "input": "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99.mxf",
        "output": "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99_metadata.xml",
        "format": "xml",
        "context": "mediainfo"
    },
    "status": "ok"
}

5.3. Add a new job

The following example shows you how to add a new job to the list of jobs of a specific node:

Table 16. Request structure

  URL      /jobs
  Method   POST

5.3.1. Request

curl -X POST -T "e:/job_parameters.json" "http://127.0.0.1:8080/jobs"

5.3.2. Parameters

  Parameter   Value    Description
  context     string   Mandatory. Defines the operational context of the job. See Operational Contexts for more information.
  input       string   Mandatory. The path of the input file or folder for the job.
  output      string   Mandatory. The path of the output file or folder for the job.
  master      string   Optional. A fixed text string value defined in a controlled vocabulary. It cannot be used together with the preset parameter.
  format      string   Optional. A fixed text string value defined in a controlled vocabulary that defines the container format (e.g. "quicktime"). It cannot be used together with the preset parameter.
  preset      string   Optional. The path of the XML file containing the parameter definition. It cannot be used together with the master or format parameters. See MT-TN-20 Output Preset File Format Specification for more information on the document structure.
  cms         object   Optional. A JSON object defining the composition’s Color Management System and its properties.

Request example:

The following request exports the file metadata in XML. See MT-TN-10 Report XML File Format Specification for more information on the document structure.

{
	"context" : "mediainfo",
	"input" : "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99.mxf",
	"output" : "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99_metadata.mxf",
	"format" : "xml"
}

5.3.3. Response

The response is the status of the operation as well as the UUID assigned to the new job. This unique ID can be used for further job manipulation requests:

{
    "uuid": "b9206f48-42a3-4557-a80e-1912160b776e",
    "priority": 0,
    "status": "queued",
    "requesttimestamp": "2021-02-17T21:44:49+00:00",
    "input": "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99.mxf",
    "output": "F:/ALEXA_Mini_LF_MXFARRIRAW/F004C003_190925_MN99_metadata.xml",
    "format": "xml",
    "context": "mediainfo"
}

5.4. Abort and delete a running job

Sending a DELETE HTTP request once will abort the job; sending it a second time will completely remove the job from the internal list.

The following example shows you how to abort or delete a running job for a specific node:

Table 17. Request structure

  URL      /jobs/<job_UUID>
  Method   DELETE

5.4.1. Request

curl -X DELETE "http://127.0.0.1:8080/jobs/887b12ee-30f1-4059-afff-db3e89a4e270"

5.4.2. Response

The first time the DELETE request is sent the response status is "aborted" if the job exists.

{
	"status" : "aborted"
}

The second time the DELETE request is sent the response status is "ok" if the job exists.

{
	"status" : "ok"
}

Table 18. Status codes

  Status    Details
  aborted   The command has been successfully received and the job is being aborted.
  ok        The job was successfully deleted.
  error     The job cannot be found in the list of jobs on this node.

5.5. Obtaining a job’s log

The following example shows you how to obtain a job’s log for a specific object:

Table 19. Request structure

  URL      /jobs/<job_UUID>/log
  Method   GET

5.5.1. Request

curl "http://127.0.0.1:8080/jobs/887b12ee-30f1-4059-afff-db3e89a4e270/log"

5.5.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Master type not supported",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"

}

An empty "events" array may be returned when there is nothing to report.

{
    "events": []
}

5.6. Obtaining logs for all jobs

The following example shows you how to obtain the list of events that happened at the job manager’s level:

Table 20. Request structure

  URL      /jobs/log
  Method   GET

5.6.1. Request

curl "http://127.0.0.1:8080/jobs/log"

5.6.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Master type not supported",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Master type not supported",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Master type not supported",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": [],
    "status": "ok"
}

6. Processing Graphs

6.1. Obtain the list of processing graphs

Storm maintains a list of processing graphs that can be obtained at any time.

The following example shows you how to obtain the list of processing graphs currently registered on a given node:

Table 21. Request structure

  URL      /processinggraphs
  Method   GET

6.1.1. Request

curl -X GET "http://127.0.0.1:8080/processinggraphs"

6.1.2. Response

{
	"status" : "ok",
	"processinggraphs" :
	[
		{
			"uuid" : "8e8d789d-5f03-4deb-9d0a-ebe9ef9923a8",
			"name" : "Convert to MXF Op1a DNxHD"
		},
		{
			"uuid" : "31c082ce-a38c-473a-b0c3-1e1708b9fb53",
			"name" : "multiple HD1080 ouputs",
		}
	]
}

6.2. Add a new processing graph

The following example shows you how to add a new processing graph to the list managed by a specific node:

Table 22. Request structure

  URL      /processinggraphs
  Method   POST

6.2.1. Request

curl -X POST -H "Content-Type: text/xml" -T E:/processing_graph_example.xml "http://127.0.0.1:8080/processinggraphs"

6.2.2. Parameters

The command sends an XML processing graph to be added to the list of managed processing graphs. No other parameters are associated with this command.

6.2.3. Response

The response is the status of the operation as well as the UUID assigned to the new processing graph. This unique ID can be used for further managing the processing graph:

{
	"uuid" : "7d16441a-45ec-4891-8f78-37e582f18c5f",
	"name" : "ProcessingGraphExample",
	"filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/processinggraphs/7d16441a-45ec-4891-8f78-37e582f18c5f.xml"
}

6.3. Delete a processing graph

The following example shows you how to delete a processing graph from the list of processing graphs managed by a specific node:

Table 23. Request structure

  URL      /processinggraphs/<processinggraph_UUID>
  Method   DELETE

6.3.1. Request

curl -X DELETE "http://127.0.0.1:8080/processinggraphs/13ff097c-d4c9-43c0-a272-fb69a7a062e0"

6.3.2. Response

The response is the status of the operation.

{
  "status" : "ok",
}

The following table shows the possible status codes that this function can return:

Table 24. Status codes

  Status   Details
  ok       The processing graph was successfully deleted.
  error    The processing graph cannot be found in the list of processing graphs managed by this node.
  busy     The processing graph is currently in use and cannot be deleted.

6.4. Obtaining a processing-graph’s log

The following example shows you how to obtain a processing-graph’s log for a specific object:

Table 25. Request structure

  URL      /processinggraphs/<UUID>/log
  Method   GET

6.4.1. Request

curl "http://127.0.0.1:8080/processinggraphs/887b12ee-30f1-4059-afff-db3e89a4e270/log"

6.4.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": []
}

6.5. Obtaining logs for all processing-graphs

The following example shows you how to obtain the list of events that happened at the processing-graph manager’s level:

Table 26. Request structure

  URL      /processinggraphs/logs
  Method   GET

6.5.1. Request

curl "http://127.0.0.1:8080/processinggraphs/logs"

6.5.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot write Processing Graph file",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": [],
    "status": "ok"
}

7. Output Presets

7.1. Obtain the list of Output Presets

Storm maintains a list of output presets that can be obtained at any time.

The following example shows you how to obtain the list of output presets currently registered on a given node:

Table 27. Request structure

  URL      /outputpresets
  Method   GET

7.1.1. Request

curl -X GET "http://127.0.0.1:8080/outputpresets"

7.1.2. Response

{
	"status" : "ok",
	"outputpresets" :
	[
		{
			"uuid" : "8e8d789d-5f03-4deb-9d0a-ebe9ef9923a8",
			"name" : "720p_H264_MP4"
		},
		{
			"uuid" : "31c082ce-a38c-473a-b0c3-1e1708b9fb53",
			"name" : "1080p_DNxHD_LB_MXFOp1A",
		}
	]
}

7.2. Add a new Output Preset

The following example shows you how to add a new output preset to the list:

Table 28. Request structure

  URL      /outputpresets
  Method   POST

7.2.1. Request

curl -X POST -H "Content-Type: text/xml" -T E:/output_preset_example.xml "http://127.0.0.1:8080/outputpresets"

7.2.2. Parameters

The command sends an XML output preset to be added to the list of managed presets. No other parameters are associated with this command.

7.2.3. Response

The response is the status of the operation as well as the UUID assigned to the new output preset. This unique ID can be used for further managing the preset.

{
    "uuid": "a51f1483-217d-4ddb-a926-4d13d6ffc309",
    "name": "TestPreset",
    "filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/outputpresets/a51f1483-217d-4ddb-a926-4d13d6ffc309.xml"
}

7.3. Delete an Output Preset

The following example shows you how to delete an output preset from the list of output presets managed by a specific node:

Table 29. Request structure

  URL      /outputpresets/<outputpreset_UUID>
  Method   DELETE

7.3.1. Request

curl -X DELETE "http://127.0.0.1:8080/outputpresets/13ff097c-d4c9-43c0-a272-fb69a7a062e0"

7.3.2. Response

The response is the status of the operation.

{
  "status" : "ok",
}

The following table shows the possible status codes that this function can return:

Table 30. Status codes

  Status   Details
  ok       The output preset was successfully deleted.
  error    The output preset cannot be found in the list of output presets managed by this node.

7.4. Obtaining an output preset’s log

The following example shows you how to obtain an output-preset’s log for a specific object:

Table 31. Request structure

  URL      /outputpresets/<UUID>/log
  Method   GET

7.4.1. Request

curl -X GET "http://127.0.0.1:8080/outputpresets/887b12ee-30f1-4059-afff-db3e89a4e270/log"

7.4.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": []
}

7.5. Obtaining logs for all output presets

The following example shows you how to obtain the list of events that happened at the output-preset manager’s level:

Table 32. Request structure

  URL      /outputpresets/logs
  Method   GET

7.5.1. Request

curl -X GET "http://127.0.0.1:8080/outputpresets/logs"

7.5.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot write Processing Graph file",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": [],
    "status": "ok"
}

8. Overlays

8.1. Overlays

Overlays are XML descriptions that define the format of the text and image elements to be overlaid on top of a transcoding job’s result.

The following example shows an overlay that describes a text element, using a tag element to capture the clip’s name. The element is positioned 5% from the left side and 5% from the top of the image container, left-aligned horizontally and bottom-aligned vertically:

<?xml version="1.0" encoding="UTF-8" ?>
<MTOverlaysTemplate>
	<Text hpos="5" vpos="5" halign="left" valign="bottom">NAME: $ClipName$</Text>
</MTOverlaysTemplate>

For a detailed description of the Overlay file format see: MT-TN-50 Overlay File Format Specification

8.2. Add a new Overlay

The following example shows you how to add a new overlay to the list:

Table 33. Request structure

  URL      /overlays
  Method   POST

8.2.1. Request

curl -X POST -H "Content-Type: text/xml" -T E:/overlay_example.xml "http://127.0.0.1:8080/overlays"

8.2.2. Parameters

The command sends an XML overlay to be added to the list of managed overlays. No other parameters are associated with this command.

8.2.3. Response

The response is the status of the operation as well as the UUID assigned to the new overlay. This unique ID can be used for further managing the overlay.

{
    "uuid": "dad697dd-da6a-4b5b-befe-85142e46e79f",
    "name": "TestOverlay",
    "filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/overlays/dad697dd-da6a-4b5b-befe-85142e46e79f.xml"
}

8.3. Delete an Overlay

The following example shows you how to delete an overlay from the managed list for a specific node:

Table 34. Request structure

  URL      /overlays/<overlay_UUID>
  Method   DELETE

8.3.1. Request

curl -X DELETE "http://127.0.0.1:8080/overlays/dad697dd-da6a-4b5b-befe-85142e46e79f"

8.3.2. Response

The response is the status of the operation.

{
  "status" : "ok",
}

The following table shows the possible status codes that this function can return:

Table 35. Status codes

  Status   Details
  ok       The overlay was successfully deleted.
  error    The overlay cannot be found in the managed list for this node.

8.4. Obtain the list of Overlays

Storm maintains a list of overlays that can be obtained at any time.

The following example shows you how to obtain the list of overlays currently registered on a given node:

Table 36. Request structure

  URL      /overlays
  Method   GET

8.4.1. Request

curl -X GET "http://127.0.0.1:8080/overlays"

8.4.2. Response

{
    "overlays": [
        {
            "uuid": "7370d3d8-7bc8-4900-a304-9e24ebcbe1c6",
            "name": "",
            "filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/overlays/7370d3d8-7bc8-4900-a304-9e24ebcbe1c6.xml"
        },
        {
            "uuid": "d39a5944-065f-470d-8b5f-fb914f2e03d8",
            "name": "",
            "filename": "<user-home>/AppData/Roaming/Marquise Technologies/mediabase/TORNADO/session/overlays/d39a5944-065f-470d-8b5f-fb914f2e03d8.xml"
        }
    ],
    "status": "ok"
}

8.5. Obtaining an overlay’s log

The following example shows you how to obtain an overlay’s log for a specific object:

Table 37. Request structure

  URL      /overlays/<UUID>/log
  Method   GET

8.5.1. Request

curl -X GET "http://127.0.0.1:8080/overlays/dad697dd-da6a-4b5b-befe-85142e46e79f/log"

8.5.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": [],
    "status": "ok"
}

8.6. Obtaining logs for all overlays

The following example shows you how to obtain the list of events that happened at the overlay manager’s level:

Table 38. Request structure

  URL      /overlays/logs
  Method   GET

8.6.1. Request

curl -X GET "http://127.0.0.1:8080/overlays/logs"

8.6.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid request ContentType",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot create Overlay (not enough memory)",
            "severity": "error",
            "code": 1
        }
    ],
    "status": "ok"
}

An empty "events" array may be returned when there is nothing to report.

{
    "events": [],
    "status": "ok"
}

9. Watchfolders

9.1. Obtain the list of watchfolders

Storm maintains a list of watchfolders that can be obtained at any time. This is the easiest method to get the list of all monitored watchfolders along with their status.

The following example shows you how to obtain the list of watchfolders currently registered on a given node:

Table 39. Request structure

  URL      /watchfolders
  Method   GET

9.1.1. Request

curl "http://127.0.0.1:8080/watchfolders"

9.1.2. Response

{
	"status" : "ok",
	"watchfolders" :
	[
		{
			"uuid" : "4e64937f-b254-4e52-af73-48f8a4674feb",
			"status" : "active",
			"processgraphid" : "509f9f3a-4c4d-43a7-a2ac-a85c764f63c3",
			"input" : "E:/MEDIA_SOURCE",
			"output" : "E:/MEDIA_DESTINATION"
		},
		{
			"uuid" : "e0e5c68d-18bf-49b3-aa11-cc0ce838c8fc",
			"status" : "paused",
			"processgraphid" : "fadd6541-f81c-4be4-add6-f5b265b69411",
			"input" : "E:/RECEIVED_FILES",
			"output" : "E:/PROCESSED_FILES"
		}
	]
}

9.2. Add a new watchfolder

The following example shows you how to add a new watchfolder to the list of watchfolders monitored by a specific node:

Table 40. Request structure

  URL      /watchfolders
  Method   POST

9.2.1. Request

curl -X POST -T E:/watchfolder_parameters.json "http://127.0.0.1:8080/watchfolders"

9.2.2. Parameters

  Parameter       Value    Description
  name            string   Mandatory. Custom-defined name of the watchfolder.
  preset          string   Mandatory; mutually exclusive with a processing graph. The path to the preset that should be applied to the current watchfolder.
  input           string   Mandatory. The path to the folder that should be used as input.
  output          string   Mandatory. The path to the folder that should be used as output.
  context         string   Mandatory. Defines the operational context of the jobs to be handled by the current watchfolder. See Operational Contexts for more information.
  filters         array    Optional. List of extension filter objects to be applied to the input folder.
  fileextension   string   Mandatory in a filter object. A fixed text string value defined in a controlled vocabulary that defines the file format.

Request example:

The following example creates a watchfolder named "MOV_MXF_folder" that transcodes all files matching the extension filters, using the passed preset and context to define the job properties.

{
    "name" : "MOV_MXF_folder",
	"preset" : "E:/presets/preset_IMF_NFX_2016_1080p_stereo.xml",
	"input" : "E:/RECEIVED_FILES",
	"output" : "E:/PROCESSED_FILES",
	"context" : "xcode",
	"filters" :
	[
		{
			"fileextension" : "mxf"
		},
		{
			"fileextension" : "mov"
		}
	]
}

9.2.3. Response

The response is the status of the operation as well as the UUID assigned to the new watchfolder. This unique ID can be used for further managing the watchfolder.

{
    "uuid": "3b3fc808-0053-4a20-beae-c8344e8aa384",
    "priority": 0,
    "status": "active",
    "name": "MOV_MXF_folder",
    "requesttimestamp": "2021-12-01T17:17:44+00:00",
    "input": "E:/RECEIVED_FILES",
    "output": "E:/PROCESSED_FILES",
    "preset": "E:/presets/preset_IMF_NFX_2016_1080p_stereo.xml",
    "context": "xcode"
}
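The whole operation can also be scripted. Below is a hedged Python sketch (assuming the requests library, and that the node accepts the parameters as a JSON request body, as the curl upload above suggests) that creates the watchfolder and keeps the returned UUID for later management calls:

import requests

# Parameters taken from the request example above.
parameters = {
    "name": "MOV_MXF_folder",
    "preset": "E:/presets/preset_IMF_NFX_2016_1080p_stereo.xml",
    "input": "E:/RECEIVED_FILES",
    "output": "E:/PROCESSED_FILES",
    "context": "xcode",
    "filters": [{"fileextension": "mxf"}, {"fileextension": "mov"}],
}
resp = requests.post("http://127.0.0.1:8080/watchfolders", json=parameters)
resp.raise_for_status()
watchfolder_uuid = resp.json()["uuid"]
print("created watchfolder", watchfolder_uuid)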

9.3. Delete a watchfolder

The following example shows you how to delete a watchfolder from the list of monitored watchfolders on a specific node:

Table 41. Request structure
Access Details

URL

/watchfolders/{watchfolder UUID}

METHOD

DELETE

9.3.1. Request

curl -X DELETE "http://127.0.0.1:8080/watchfolders/13ff097c-d4c9-43c0-a272-fb69a7a062e0"

9.3.2. Response

The response is the status of the operation.

{
  "status" : "ok"
}

The following table shows the possible status codes that this function can return:

Table 42. Status codes
Status Details

ok

The watchfolder was successfully deleted.

error

The watchfolder cannot be found in the list of monitored watchfolders by this node.

9.4. Changing the status of a watchfolder

The following example shows you how to change the status of a watchfolder monitored by a specific node:

Table 43. Request structure:
Access Details

URL

/watchfolders/<watchfolder_uuid>

Method

PUT

9.4.1. Request

curl -X PUT -H "Content-Type: application/json" -d '{"status":"paused"}' "http://127.0.0.1:8080/watchfolders/887b12ee-30f1-4059-afff-db3e89a4e270"

9.4.2. Parameters

Parameter Name Value Description

status

string

Mandatory parameter. A fixed text string value defined in a controlled vocabulary that defines the possible states of a watchfolder. Possible values: active, paused.

9.4.3. Response

{
	"status" : "ok"
}
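A pause/resume cycle can then be driven from a script. Below is a minimal sketch (assuming the requests library and a watchfolder UUID obtained as in the previous sections; the UUID here is the placeholder from the example above):

import requests

uuid = "887b12ee-30f1-4059-afff-db3e89a4e270"  # placeholder watchfolder UUID
url = "http://127.0.0.1:8080/watchfolders/" + uuid

# Pause the watchfolder, then reactivate it.
for status in ("paused", "active"):
    resp = requests.put(url, json={"status": status})
    resp.raise_for_status()
    print("set status to", status, "->", resp.json()["status"])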

10. Workers

10.1. Obtaining the list of workers

STORM Server maintains a list of managed workers that can be obtained at any time. The following example shows you how to obtain the list of workers being managed by the current node:

Table 44. Request structure
Access Details

URL

/workers

METHOD

GET

10.1.1. Request

curl "http://127.0.0.1:8080/workers"

10.1.2. Response

The response is an array containing the workers descriptors as well as the status of the operation:

{
	"status" : "ok",
	"workers": [
        {
            "uuid": "8c80d6d9-f8a6-445f-828d-6f0bc8edcc5b",
            "address": "http://localhost",
            "port": 65535
        },
        {
            "uuid": "77038177-5fa4-471b-a5ec-e18d1f3cb9e9",
            "address": "http://localhost",
            "port": 65534
        },
        {
            "uuid": "556e68b6-b185-4980-91ac-b57bdefe4152",
            "address": "http://192.168.1.250",
            "port": 9090
        },
        {
            "uuid": "a016a364-5a86-4f4d-98f4-abe53e0a8275",
            "address": "http://192.168.1.150",
            "port": 9090
        }
    ]
}

10.2. Add a new worker

The following example describes the API for adding a new worker to the manager’s worker list.

This API is automatically called when a worker is either spawned locally or created remotely (see Manager and Worker nodes). It should not be used by an external application, save for testing purposes.

Table 45. Request structure:
Access Details

URL

/workers

Method

POST

10.2.1. Request

curl -X POST -T "e:/worker_parameters.json" “http://127.0.0.1:8080/workers”

10.2.2. Parameters

Parameter Name Value Description

address

string

Mandatory parameter. Defines the address of the worker node.

port

number

Mandatory parameter. The port on which the worker node will communicate.

pid

number

Mandatory parameter. The Process ID of the worker node assigned by the Operating System.

Request example:
{
	"address": "http://localhost",
	"port": 6060,
	"pid": 10252
}

10.2.3. Response

The response is the status of the operation as well as the UUID assigned to the new worker. This unique ID can be used for further worker manipulation requests. If an error occurred, the response contains only the status of the operation, set to "error":

{
    "uuid": "b9206f48-42a3-4557-a80e-1912160b776e",
    "status": "ok"
}
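For testing only (as noted above, this endpoint is normally called by STORM itself), the registration can be exercised with a short script. The sketch below assumes the requests library; address, port and pid are the illustrative values from the request example:

import requests

# Illustrative worker descriptor; address, port and pid are placeholders.
worker = {"address": "http://localhost", "port": 6060, "pid": 10252}
resp = requests.post("http://127.0.0.1:8080/workers", json=worker)
resp.raise_for_status()
body = resp.json()
if body["status"] == "ok":
    print("registered worker", body["uuid"])
else:
    print("registration failed:", body["status"])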

10.3. Delete a running worker

The following example shows you how to delete a specific worker from the manager’s node list.

Using this API will also call the control shutdown API (see Shutting down a node) with the worker’s parameters. As such, the status code of this operation will be the logical conjunction of the delete and shutdown operations.

Table 46. Request structure
Access Details

URL

/workers/{worker UUID}

METHOD

DELETE

10.3.1. Request

curl -X DELETE "http://127.0.0.1:8080/workers/887b12ee-30f1-4059-afff-db3e89a4e270"

10.3.2. Response

The response is the status of the operation.

{
  "status" : "ok"
}

The following table shows the possible status codes that this function can return:

Table 47. Status codes
Status Details

ok

The worker was successfully deleted.

error

The worker cannot be found in this node’s list.

10.4. Obtaining a worker’s log

The following example shows you how to obtain a worker’s log for a specific object:

Table 48. Request structure:
Access Details

URL

/workers/<worker_UUID>/log

Method

GET

10.4.1. Request

curl "http://127.0.0.1:8080/workers/887b12ee-30f1-4059-afff-db3e89a4e270/log"

10.4.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Invalid UUID.",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot shutdown Worker (worker unreachable)",
            "severity": "error",
            "code": 7
        }
    ],
    "status": "ok"
}

An empty "events" array is returned when there is nothing to report:

{
    "events": [],
    "status": "ok"
}

10.5. Obtaining logs for all workers

The following example shows you how to obtain the list of events that happened at the worker manager’s level:

Table 49. Request structure:
Access Details

URL

/workers/logs

Method

GET

10.5.1. Request

curl "http://127.0.0.1:8080/workers/logs"

10.5.2. Response

{
    "events": [
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot find Worker",
            "severity": "error",
            "code": 2
        },
        {
            "timestamp": "2022-02-07T12:29:41.604+00:00",
            "message": "Cannot register Worker (not enough memory)",
            "severity": "error",
            "code": 1
        }
    ],
    "status": "ok"
}

An empty "events" array is returned when there is nothing to report:

{
    "events": [],
    "status": "ok"
}

TECHNICAL NOTES

1. MT-TN-10 Report XML File Format Specification

1.1. Introduction

1.1.1. Overview

This document describes the Marquise Technologies Report XML file format used for various reporting purposes across the Marquise Technologies product line. Such reports are generated when validating various file formats, or simply to obtain detailed information about them.

The file format is based on the eXtensible Markup Language (XML) and uses its conventions and structures.

1.1.2. Scope

This specification is intended to give a reference to developers and implementers of this file format. As the format evolves to include new features, this specification is subject to frequent updates.

1.1.3. Document Organization

The specification is divided in various sections with a strong focus on implementation.

The last section of the document provides a detailed description of the various types used by the file format. These types are either simple types or complex types. Complex types contain a description of the sub-elements they contain.

The types are listed in alphabetic order.

1.1.4. Document Notation and Conventions

1.1.5. Language code conventions

This specification makes an extensive use of language codes for different purposes. All codes are assumed to use the ISO 639-1 or 639-2 notation.

1.1.6. XML Conventions

The document always uses the UTF-8 encoding.

Currently this specification does not follow the rules of an XML schema (XSD), however this may change in the future.

1.2. AssetDescriptorType

1.2.1. Elements

Table 50. AssetDescriptorType elements
Element Name Type Description

AssetKind

AssetKindEnum

a tag uniquely identifying the type of the asset

SourceDescriptor

SourceDescriptorType

a source descriptor that describes the location and storage properties of the asset

1.2.2. AssetKindEnum

The AssetKindEnum is an enumeration of fixed UTF-8 values that indicate the kind of the asset. Typically the kind of the asset must be compatible with the type of the essence referenced in a composition playlist track; otherwise it is considered an error.

Table 51. AssetKindEnum elements
Value Description

unknown

any asset for which the kind cannot be determined has this value

picture

the asset contains a single video track.

picture.s3d

the asset contains a single stereoscopic video track.

sound

the asset contains a single audio track.

timedtext

the asset contains a single timed text track.

immersiveaudio

the asset contains a single immersive audio track.

data

the asset contains a single data track.

combined

the asset contains multiple tracks with various essences.

pkl

the asset is a packing list.

cpl

the asset is a composition playlist.

opl

the asset is an output profile list.

scm

the asset is a sidecar composition map.

1.3. AcquisitionMetadataType

1.3.1. Elements

1.4. AssetDescriptorListType

The AssetDescriptorList element is defined by the AssetDescriptorListType and contains an ordered list of AssetDescriptor elements. Each AssetDescriptor element is defined by the AssetDescriptorType.

Below is an example of an AssetDescriptorList:

...
<AssetDescriptorList>
  <AssetDescriptor>
  ...
  </AssetDescriptor>
  ...
  <AssetDescriptor>
  ...
  </AssetDescriptor>
</AssetDescriptorList>
...

1.5. AudioDescriptorType

1.5.1. Elements

Table 52. AudioDescriptorType elements
Element Name Type Description

ChannelCount

UInt32

a value representing the number of audio channels in the essence referenced by the parent track.

SoundfieldConfiguration

UTF-8 string

a tag representing the configuration formed by the audio channels in the track. The tag is based on the SMPTE ST377-4 Multichannel Audio framework MCATagSymbol, using the "sg" prefix.

Annotation

UTF-8 string

An optional user-defined annotation text string

Title

UTF-8 string

An optional text string as specified in SMPTE ST377-41

TitleVersion

UTF-8 string

An optional text string as specified in SMPTE ST377-41

TitleSubVersion

UTF-8 string

An optional text string as specified in SMPTE ST377-41

Episode

UTF-8 string

An optional text string as specified in SMPTE ST377-41

ContentKind

UTF-8 string

A tag representing the content kind as specified in SMPTE ST377-41

ElementKind

UTF-8 string

A tag representing the element kind as specified in SMPTE ST377-41

PartitionKind

UTF-8 string

A tag representing the partition kind as specified in SMPTE ST377-41

PartitionNumber

UTF-8 string

A text string representing the partition number as specified in SMPTE ST377-41

ContentType

UTF-8 string

A tag representing the content type as specified in SMPTE ST377-41

ContentSubType

UTF-8 string

A tag representing the content subtype as specified in SMPTE ST377-41

ContentDifferentiator

UTF-8 string

An optional user-defined text string representing the content differentiator as specified in SMPTE ST377-41

UseClass

UTF-8 string

A tag representing the use class as specified in SMPTE ST377-41

SpokenLanguageAttribute

UTF-8 string

A tag representing the attribute of the main spoken language as specified in SMPTE ST377-41

1.6. CIExyType

1.6.1. Elements

Table 53. CIExyType elements
Element Name Type Description

x

Float32

a floating point value representing the x coordinate of a CIExy chromaticity vector

y

Float32

a floating point value representing the y coordinate of a CIExy chromaticity vector

1.7. CompositionDescriptorType

1.7.1. Elements

Table 54. CompositionDescriptorType elements
Element Name Type Description

Name

UTF-8 string

a human readable text string containing the name of the composition as stored in the subject

Annotation

UTF-8 string

A human readable text containing comments associated with the composition

Issuer

UTF-8 string

A human readable text that identifies the entity that produced the composition

IssueDate

xs:dateTime

creation date and time of the composition

Creator

UTF-8 string

a human readable text string containing the name of the tool used to create the composition

Language

ISO639LanguageType

main composition language

Duration

UInt64

A 64-bit unsigned integer containing the number of edit units in the composition

EditRate

Rational

A rational containing the edit rate of the composition (e.g. frame rate)

TimecodeDescriptor

TimecodeDescriptorType

A timecode descriptor defining the start timecode of the composition

1.8. ComplianceTestPlanListType

The ComplianceTestPlanList element is defined by the ComplianceTestPlanListType and contains an ordered list of ComplianceTestPlan elements. Each ComplianceTestPlan element is defined by the ComplianceTestPlanType.

Below is an example of a ComplianceTestPlanList:

...
<ComplianceTestPlanList>
  <ComplianceTestPlan>
  ...
  </ComplianceTestPlan>
  ...
  <ComplianceTestPlan>
  ...
  </ComplianceTestPlan>
</ComplianceTestPlanList>
...

1.9. ComplianceTestPlanType

The ComplianceTestPlan element is defined by the ComplianceTestPlanType and contains an ordered list of TestSequence elements. Each TestSequence element is defined by the TestSequenceType.

Below is an example of a ComplianceTestPlan:

...
<ComplianceTestPlan type="<type>" version="<version>">
  <TestSequence type="<type>">
  ...
  </TestSequence>
  ...
  <TestSequence type="<type>">
  ...
  </TestSequence>
</ComplianceTestPlan>
...

1.9.1. Attributes

Table 55. ComplianceTestPlanType attributes
Attribute Name Type Description

type

UTF-8 string

A fixed text string value defined in a controlled vocabulary

version

UTF-8 string

A fixed text string value defined in a controlled vocabulary

1.10. ContentLightLevelsType

1.10.1. Elements

Table 56. ContentLightLevelsType elements
Element Name Type Description

MaxCLL

Float32

Maximum Content Light Level as defined in CEA-861.3 2015, Annex A

MaxFALL

Float32

Maximum Frame Average Light Level as defined in CEA-861.3 2015, Annex A

1.11. CryptographicDescriptorType

1.11.1. Elements

Table 57. CryptographicDescriptorType elements
Element Name Type Description

UsesHMAC

Boolean

A flag indicating the usage of Hash-based message authentication code

KeyDescriptor

KeyDescriptorType

A complex type describing the encryption key properties

1.12. DataRateDescriptorType

1.12.1. Elements

Table 58. DataRateDescriptorType elements
Element Name Type Description

AverageBytesPerSecond

UInt32

A 32-bit non-null unsigned integer containing the average bytes per second

AverageSampleSize

UInt32

A 32-bit non-null unsigned integer containing the average sample size in bytes

MaximumBytesPerSecond

UInt32

A 32-bit non-null unsigned integer containing the maximum bytes per second

MaximumSampleSize

UInt32

A 32-bit non-null unsigned integer containing the maximum sample size in bytes

MinimumSampleSize

UInt32

A 32-bit non-null unsigned integer containing the minimum sample size in bytes

1.13. EncodingDescriptorType

1.13.1. Elements

Table 59. EncodingDescriptorType elements
Element Name Type Description

Codec

UTF-8 string

a tag uniquely identifying the codec used by the parent essence descriptor

Profile

UTF-8 string

a tag uniquely identifying the codec profile used by the parent essence descriptor

Level

UTF-8 string

a tag uniquely identifying the codec profile level used by the parent essence descriptor

1.14. EssenceDescriptorType

1.14.1. Elements

Table 60. EssenceDescriptorType elements
Element Name Type Description

Duration

UInt64

A 64-bit unsigned integer containing the number of edit units in the underlying essence.

SampleRate

Rational

A rational containing the sample rate of the underlying essence.

Language

ISO639LanguageType

main essence language

EncodingDescriptor

EncodingDescriptorType

A complex type describing the codec parameters used to encode the essence

DataRateDescriptor

DataRateDescriptorType

An optional complex type describing the essence data rate

1.14.2. Attributes

Table 61. EssenceDescriptorType attributes
Attribute Name Type Description

type

UTF-8 string

a tag uniquely identifying the type of essence described

1.14.3. About SampleRate

For video essence descriptors, the sample rate is equal to the edit rate of the track that the essence descriptor belongs to. However, if the essence is stereoscopic, the sample rate is double the track’s edit rate. For instance, a stereoscopic video track with an edit rate of 24 frames per second will have its essence sample rate set to 48, indicating that 2 samples (i.e. 2 frames) are played for every edit unit in the track.

For audio essence descriptors, the sample rate represents the audio sample rate (typically 48 kHz).

Unless specified otherwise, other tracks have their sample rate specified in video frame units.
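The rule above can be expressed in a few lines. The sketch below is illustrative Python (not part of the format) that derives a video essence sample rate from the track edit rate:

from fractions import Fraction

def video_sample_rate(edit_rate, stereoscopic):
    # A stereoscopic essence plays 2 samples (frames) per edit unit,
    # so its sample rate is double the track's edit rate.
    return edit_rate * 2 if stereoscopic else edit_rate

print(video_sample_rate(Fraction(24, 1), stereoscopic=True))          # 48
print(video_sample_rate(Fraction(30000, 1001), stereoscopic=False))   # 30000/1001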

1.15. ImmersiveAudioDescriptorType

1.15.1. Elements

Table 62. ImmersiveAudioDescriptorType elements
Element Name Type Description

SoundfieldConfiguration

UTF-8 string

a tag representing the immersive audio configuration in the track. The tag is based on the SMPTE ST377-4 Multichannel Audio framework MCATagSymbol (typically IAB).

Annotation

UTF-8 string

An optional user-defined annotation text string

Title

UTF-8 string

An optional text string as specified in SMPTE ST377-41

TitleVersion

UTF-8 string

An optional text string as specified in SMPTE ST377-41

TitleSubVersion

UTF-8 string

An optional text string as specified in SMPTE ST377-41

Episode

UTF-8 string

An optional text string as specified in SMPTE ST377-41

ContentKind

UTF-8 string

A tag representing the content kind as specified in SMPTE ST377-41

ElementKind

UTF-8 string

A tag representing the element kind as specified in SMPTE ST377-41

PartitionKind

UTF-8 string

A tag representing the partition kind as specified in SMPTE ST377-41

PartitionNumber

UTF-8 string

A text string representing the partition number as specified in SMPTE ST377-41

ContentType

UTF-8 string

A tag representing the content type as specified in SMPTE ST377-41

ContentSubType

UTF-8 string

A tag representing the content subtype as specified in SMPTE ST377-41

ContentDifferentiator

UTF-8 string

An optional user-defined text string representing the content differentiator as specified in SMPTE ST377-41

UseClass

UTF-8 string

A tag representing the use class as specified in SMPTE ST377-41

SpokenLanguageAttribute

UTF-8 string

A tag representing the attribute of the main spoken language as specified in SMPTE ST377-41

1.16. JobDescriptorType

The JobDescriptor element is a complex element containing various sub-elements that describe the nature of the job that produced the XML report.

1.16.1. Elements

Table 63. JobDescriptorType elements
Element Name Type Description

Type

UTF-8 string

a tag uniquely identifying the type of the job that has produced the report

Status

UTF-8 string

a tag that represents the status of the job that has produced the report

Creator

UTF-8 string

A human readable text that identifies the device that has produced the report

1.17. ISO639LanguageType

A UTF-8 text string containing the ISO 639-1 or ISO 639-2 language code

1.18. MasteringDisplayDescriptorType

1.18.1. Elements

Table 64. MasteringDisplayDescriptorType elements
Element Name Type Description

MaximumLuminance

Float32

a value representing the maximum luminance of the reference display in cd/m2

MinimumLuminance

Float32

a value representing the minimum luminance of the reference display in cd/m2

CIERedPrimary

CIExyType

the CIExy coordinates of the red chromaticity of the reference display

CIEGreenPrimary

CIExyType

the CIExy coordinates of the green chromaticity of the reference display

CIEBluePrimary

CIExyType

the CIExy coordinates of the blue chromaticity of the reference display

CIEWhitePoint

CIExyType

the CIExy coordinates of the white point of the reference display

1.19. SourceDescriptorType

1.19.1. Elements

Table 65. SourceDescriptorType elements
Element Name Type Description

Path

UTF-8 string

a string defining the file system location of the object (i.e. file name or directory name)

Length

UInt64

The number of bytes occupied by the object on the file system

CreationDate

xs:dateTime

creation date and time of the object on the file system

LastAccessDate

xs:dateTime

last access date and time of the object on the file system

LastModifiedDate

xs:dateTime

last modification date and time of the object on the file system

1.20. SubjectDescriptorType

The SubjectDescriptor element is a complex element containing various sub-elements that contain information on the subject that the XML report was produced for.

The values for the various sub-elements are extracted or deduced from the subject’s metadata.

1.20.1. Elements

Table 66. SubjectDescriptorType elements
Element Name Type Description

Type

UTF-8 string

a tag uniquely identifying the type of the subject (i.e. file format or package format)

Name

UTF-8 string

A human readable text containing the subject name as stored in the underlying subject format

Annotation

UTF-8 string

A human readable text containing comments associated with the subject’s file or package

Issuer

UTF-8 string

A human readable text that identifies the entity that produced the file or package

Specification

UTF-8 string

a tag uniquely identifying the type of the job that has produced the report

SourceDescriptor

SourceDescriptorType

a source descriptor that describes the location and storage properties of the subject

1.21. TestSequenceType

The TestSequenceType contains an ordered list of Test elements. Each Test element is defined by the TestType.

Below is an example of a TestSequence:

...
<TestSequence type="ID_AssetMap">
  ...
  <Test id="ID_AssetMap_Existence">
    ...
  </Test>
  <Test id="ID_AssetMap_XMLSchema">
    ...
  </Test>
  ...
</TestSequence>
...

1.21.1. Attributes

Table 67. TestSequenceType attributes
Attribute Name Type Description

type

UTF-8 string

a tag uniquely identifying the type of test sequence

1.22. TimecodeDescriptorType

1.22.1. Elements

Table 68. TimecodeDescriptorType elements
Element Name Type Description

StartTimecode

UInt64

a value representing the start frame that the timecode corresponds to

RoundedTimecodeBase

UInt32

a value representing the rounded timecode frame rate base.

DropFrame

Boolean

a boolean value that indicates whether the timecode skips frames at certain frame numbers. This value is typically set to TRUE for NTSC-like frame rates.

1.22.2. About RoundedTimecodeBase

For integer frame rate, this value is the actual frame rate.

For non-integer frame rates (e.g. NTSC-like frame rates), this value is the frame rate rounded up to the nearest integer, i.e. the smallest integer larger than the frame rate. For instance, for a 29.97 frame rate, the RoundedTimecodeBase will be equal to 30.
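In code this is simply the ceiling of the frame rate, as the illustrative Python sketch below shows:

import math
from fractions import Fraction

def rounded_timecode_base(frame_rate):
    # Integer frame rates are unchanged; non-integer rates round up.
    return math.ceil(frame_rate)

print(rounded_timecode_base(Fraction(24, 1)))        # 24
print(rounded_timecode_base(Fraction(30000, 1001)))  # 30, for 29.97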

1.23. TimedTextDescriptorType

1.23.1. Elements

Table 69. TimedTextDescriptorType elements
Element Name Type Description

Usage

UTF-8 string

a tag uniquely identifying the usage or role of the timed text (closed captions, forced narrative, etc)

1.24. TrackDescriptorType

1.24.1. Elements

Table 70. TrackDescriptorType elements
Element Name Type Description

Name

UTF-8 string

A human readable text containing the track name as stored in the underlying subject format

Language

UTF-8 string

a ISO 639-1 or ISO 639-2 language code

Duration

UInt64

A 64-bit unsigned integer containing the number of edit units in the track

EditRate

Rational

A rational containing the edit rate of the track (e.g. frame rate for video tracks)

EssenceDescriptor

EssenceDescriptorType

A complex type describing the essence properties used by the track

1.25. VideoDescriptorType

1.25.1. Elements

Table 71. VideoDescriptorType elements
Element Name Type Description

Width

UInt32

A non-null 32-bit unsigned integer containing the frame width in pixels

Height

UInt32

A non-null 32-bit unsigned integer containing the frame height in pixels

ChannelCount

UInt32

If present, a non-null 32-bit unsigned integer representing the number of video channels in the essence referenced by the parent track. A value of 1 is used for monoscopic content and a value of 2 for stereoscopic content. If the element is absent, the content is considered monoscopic

FrameLayout

UTF-8 string

a tag uniquely identifying the frame layout as progressive or interlaced

Orientation

UTF-8 string

a tag uniquely identifying the frame stored orientation

PixelAspectRatio

Rational

A rational containing the pixel aspect ratio

ColorSpace

UTF-8 string

a tag uniquely identifying the signaled color space

ColorPrimaries

UTF-8 string

a tag uniquely identifying the signaled color primaries

TransferCharacteristic

UTF-8 string

a tag uniquely identifying the signaled EOTF

CodingEquations

UTF-8 string

a tag uniquely identifying the coding equations required for RGB to YCbCr and YCbCr to RGB conversions

ColorRange

UTF-8 string

a tag uniquely identifying the range of values used by the content pixels

DynamicToneMapping

UTF-8 string

a tag uniquely identifying the dynamic tone mapping technology required for content interpretation

ACESInputDeviceTransform

UTF-8 string

a tag uniquely identifying the ACES Input Device Transform associated with the content

MasteringDisplayDescriptor

MasteringDisplayDescriptorType

If present, a complex type containing the SMPTE ST 2086 HDR Static Metadata

ContentLightLevelsDescriptor

ContentLightLevelsType

If present, a complex type containing the CEA 861.3-2015 HDR Content Light Levels

2. MT-TN-20 Output Preset File Format Specification

2.1. Introduction

2.1.1. Overview

This document describes the Marquise Technologies' Preset XML file format used for specifying and reusing various parameters for delivery jobs.

The file format is based on the eXtensible Markup Language (XML) and uses its conventions and structures.

2.1.2. Scope

This specification is intended to give a reference to developers and implementers of this file format. As the format evolves to include new features, this specification is subject to frequent updates.

2.1.3. Document Organization

The specification is divided in various sections with a strong focus on implementation.

The last section of the document provides a detailed description of the various types used by the file format. These types are either simple types or complex types. Complex types contain a description of the sub-elements they contain.

The types are listed in the order they appear in the XML document.

2.1.4. Document Notation and Conventions

2.1.5. Language code conventions

This specification makes an extensive use of language codes for different purposes. All codes are assumed to use the ISO 639-1 or 639-2 notation.

2.1.6. XML Conventions

The document always uses the UTF-8 encoding.

Currently this specification does not follow the rules of an XML schema (XSD), however this may change in the future.

2.2. About Output Presets

When an input file is passed to TORNADO, an internal, virtual composition is built, based on the properties found by decoding the file and the additional parameters specified by the user.

Output Presets are XML files containing elements that define parameters affecting the output of a transcoding job. Their purpose is to specify and reuse the same set of properties for a given master.

2.3. Output Preset Document Structure

The Output Preset document structure is defined with the starting MasterDeliverySpecification root element.

Below is an example of a basic structure of an Output Preset document:

<MasterDeliverySpecification>
  <Name/>
  <Type/>
  <Shim/>
  <ShimVersion/>
  <Container/>
  <TimecodeStart/>
  <Presentation/>
  <Video>
    ...
  </Video>
  <Audio>
    ...
  </Audio>
  <TimedText>
    ...
  </TimedText>
</MasterDeliverySpecification>

Depending on the desired output, not all elements need to be specified.

2.3.1. Elements

Element Name Type Mandatory Description

Name

simple

no

a human readable text string containing the name of the composition.

Type

tag

yes

A fixed text string value, defined in a controlled vocabulary, see Enums

Specification

tag

no

A fixed text string value, defined in a controlled vocabulary, see Enums

Shim

tag

yes

A fixed text string value, defined in a controlled vocabulary, see Enums

ShimVersion

tag

no

A fixed text string value, defined in a controlled vocabulary, see Enums

Container

tag

yes

A fixed text string value, defined in a controlled vocabulary, see Enums

TimecodeStart

simple

no

A timecode descriptor defining the start timecode of the composition.

Presentation

keyword

no

An image descriptor defining the presentation type of the picture. Can have the values monoscopic or stereoscopic.

Video

complex

yes

see VideoType

Audio

complex

yes

see AudioType

TimedText

complex

no

see TimedTextType

2.3.2. About Specification, Shim and ShimVersion

The Specification element, also known as a delivery specification, defines a set of constraints for the output file.

The Shim adds an additional layer of constraints applicable to a given Specification.

The ShimVersion identifies which version of the Shim is to be used.

For example, a broadcaster like the BBC has a delivery specification: BBC AS-11 UK DPP HD 1.1.

This delivery specification contains file naming conventions, containers, codecs and other mandatory parameters necessary for the delivery.

The Specification is then further refined by the Shim AS-11 UK DPP HD, because there is another variant for SD.

Lastly, the ShimVersion is 1.1 because there is another, deprecated variant, 1.0.

2.3.3. About TimecodeStart

When the TimecodeStart element is present and a value is specified then that value is used as the start timecode for the output file using the preset.

If the element is not present then the start timecode of the composition is used as the start timecode of the output file.

2.4. VideoType

The Video element is a complex type using various sub-type elements that describe the video track’s properties. Below is the basic structure of the VideoType:

<Video>
  <FrameWidthList>
    ...
  </FrameWidthList>
  <FrameHeightList>
    ...
  </FrameHeightList>
  <FrameRateList>
    ...
  </FrameRateList>
  <Codec>
    ...
  </Codec>
  <Colorimetry>
    ..
  </Colorimetry>
</Video>

2.4.1. Elements

Element Name Type Mandatory Description

FrameWidthList

complex

no

see FrameWidthListType

FrameHeightList

complex

no

see FrameHeightListType

FrameRateList

complex

no

see FrameRateListType

Codec

complex

yes

see CodecType

Colorimetry

complex

no

see ColorimetryType

2.4.2. FrameWidthListType

The FrameWidthList element is defined by the FrameWidthListType and contains an ordered list of FrameWidth elements that define the allowed frame widths for the current specification. Each FrameWidth element is defined by the FrameWidthType.

Below is an example of a FrameWidthList:

...
<FrameWidthList>
  <FrameWidth>1920</FrameWidth>
  ...
  <FrameWidth>1280</FrameWidth>
  ...
</FrameWidthList>
...

2.4.3. Elements

Element Name Type Mandatory Description

FrameWidth

simple

yes

An unsigned 32-bit integer representing the frame width

2.4.4. FrameHeightListType

The FrameHeightList element is defined by the FrameHeightListType and contains an ordered list of FrameHeight elements that define the allowed frame heights for the current specification. Each FrameHeight element is defined by the FrameHeightType.

Below is an example of a FrameHeightList:

...
<FrameHeightList>
  <FrameHeight>1080</FrameHeight>
  ...
  <FrameHeight>720</FrameHeight>
  ...
</FrameHeightList>
...

2.4.5. Elements

Element Name Type Mandatory Description

FrameHeight

simple

yes

An unsigned 32-bit integer representing the frame height

2.4.6. FrameRateListType

The FrameRateList element is defined by the FrameRateListType and contains an ordered list of FrameRate elements that define the allowed frame rates for the current specification. Each FrameRate element is defined by the FrameRateType.

Below is an example of a FrameRateList:

...
<FrameRateList>
  <FrameRate>24:1</FrameRate>
  ...
  <FrameRate>25:1</FrameRate>
  ...
</FrameRateList>
...

2.4.7. Elements

Element Name Type Mandatory Description

FrameRate

simple

yes

a rational representing the video playback frame rate

2.5. CodecType

The Codec element is a complex type using various sub-type elements that describe the video and audio track’s encoding properties.

Below is an example of a definition for the H264 codec using the baseline profile with a bit depth of 8 and a bit rate of 100 Mbit/s:

<Codec>
  <Id>h264</Id>
  <ColorEncoding>YCbCr.4:2:2</ColorEncoding>
  <Quality>lossyvbr</Quality>
  <Profile>h264:baseline</Profile>
  <Level>auto</Level>
  <BitDepth>8</BitDepth>
  <BitRate>100</BitRate>
</Codec>

2.5.1. Elements

Element Name Type Mandatory Description

Id

tag

yes

A fixed text string value, defined in a controlled vocabulary, see Enums

ColorEncoding

tag

yes if Video

A fixed text string value, defined in a controlled vocabulary, see Enums

Quality

tag

yes if Video

A fixed text string value, defined in a controlled vocabulary, see Enums

Profile

tag

yes if Video

A fixed text string value, defined in a controlled vocabulary, see Enums

Level

tag

yes if Video

A fixed text string value, defined in a controlled vocabulary, see Enums

BitRate

simple

no

An unsigned 32-bit integer representing the codec’s supported bit-rate

BitDepth

simple

no

An unsigned 32-bit integer representing the codec’s supported bit-depth

2.5.2. About Codec

Much like the Specification, Shim and ShimVersion, the values defining the Codec form a set of constraints where one or more values depend on those chosen previously. In the context of a Codec, the first level of constraint is defined by the Id; the definition is then further refined by the Profile, then the Level, and so on. Furthermore, not all elements are applicable to all Codec definitions.
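These layered constraints lend themselves to a simple lookup structure. The Python sketch below is illustrative only (the actual controlled vocabulary is defined in the Enums section; only the combination used in the example above is listed):

# Assumed subset of the codec vocabulary, for illustration only.
CODEC_CONSTRAINTS = {
    "h264": {"h264:baseline": {"auto"}},
}

def codec_definition_is_valid(codec_id, profile, level):
    # The Id restricts the valid Profiles; each Profile restricts the valid Levels.
    profiles = CODEC_CONSTRAINTS.get(codec_id, {})
    return level in profiles.get(profile, set())

print(codec_definition_is_valid("h264", "h264:baseline", "auto"))  # True
print(codec_definition_is_valid("h264", "h264:high", "auto"))      # False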

2.6. ColorimetryType

The Colorimetry element defines the color-space and the black to white range for the delivery specification. These values are not used for colorimetric transformations but rather to signal the color-space of the delivery specification.

2.6.1. Elements

Element Name Type Mandatory Description

ColorSpace

tag

no

A fixed text string value, defined in a controlled vocabulary, see Enums

CodeRange

tag

no

A fixed text string value, defined in a controlled vocabulary, see Enums

2.7. AudioType

The Audio element is a complex type using various sub-type elements that describe the audio track’s properties. Below is the basic structure of an Audio element:

<Audio>
  <Codec>
    ..
  </Codec>
  <SoundfieldList>
    ...
  </SoundfieldList>
</Audio>

2.7.1. Elements

Element Name Type Mandatory Description

Codec

complex

yes

see CodecType

SoundfieldList

complex

no

see SoundfieldListType

2.8. SoundfieldListType

The SoundfieldList element is a complex type containing a list of Soundfield elements.

2.8.1. Elements

Element Name Type Mandatory Description

Soundfield

complex

no

see SoundfieldType

2.9. SoundfieldType

The Soundfield element is a complex type using various sub-type elements that describe the properties of a soundfield.

Below is a Soundfield definition where the first 2 channels found in the first audio track of the source composition are mapped to a stereo configuration in the output file.

<Soundfield>
  <Configuration>ST</Configuration>
  <ChannelMap>
    <Channel trackType="audio" trackIndex="0" symbol="L">0</Channel>
    <Channel trackType="audio" trackIndex="0" symbol="R">1</Channel>
  </ChannelMap>
</Soundfield>

2.9.1. Elements

Element Name Type Mandatory Description

Title

simple

no

User defined value setting the Title metadata element when applicable

TitleVersion

simple

no

User defined value setting the TitleVersion metadata element when applicable

TitleSubVersion

simple

no

User defined value setting the TitleSubVersion metadata element when applicable

Episode

simple

no

User defined value setting the Episode metadata element when applicable

ContentKind

simple

no

User defined value setting the ContentKind metadata element when applicable

ElementKind

simple

no

User defined value setting the ElementKind metadata element when applicable

Language

tag

no

A fixed text string value, defined in a controlled vocabulary, see Enums

Annotation

simple

no

User defined value setting the Annotation metadata element when applicable

Configuration

tag

yes

A fixed UTF-8 string value, defined in a controlled vocabulary, see Enums

ChannelMap

complex

yes

see ChannelMapType

2.10. ChannelMapType

The ChannelMap element is a complex type containing an ordered list of Channel elements.

Below is an example of a ChannelMap for a 5.1 track configuration:

<ChannelMap>
  <Channel trackType="audio" trackIndex="0" symbol="L">0</Channel>
  <Channel trackType="audio" trackIndex="0" symbol="R">1</Channel>
  <Channel trackType="audio" trackIndex="0" symbol="C">2</Channel>
  <Channel trackType="audio" trackIndex="0" symbol="LFE">3</Channel>
  <Channel trackType="audio" trackIndex="0" symbol="Ls">4</Channel>
  <Channel trackType="audio" trackIndex="0" symbol="Rs">5</Channel>
</ChannelMap>

2.10.1. Elements

Element Name Type Mandatory Description

Channel

simple

yes

An unsigned 32-bit integer value designating the position index of a channel in the channel map

2.11. Channel

The Channel element is an unsigned 32-bit integer specifying the source channel to be used. In addition a number of attributes can be specified:

2.11.1. Attributes

Attribute Name Type Description

trackType

keyword

A keyword that describes the track type. Can be audio or aux.

trackIndex

simple

A zero-based index value, defining the track index in the source composition.

symbol

tag

A text string value, defined in a controlled vocabulary, designating the label of the channel used for determining the destination index

2.11.2. About Configuration and ChannelMap

The Soundfield Configuration value determines the number of Channel elements in a ChannelMap.

For example, where Configuration has the value ST, representing a stereo audio configuration, the ChannelMap shall contain and define 2 Channel elements, while a Configuration with the value 51 shall contain and define 6 Channel elements.
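A consistency check therefore only needs a table mapping configurations to their channel counts. The Python sketch below is illustrative (only the two configurations mentioned above are listed; the full vocabulary is defined in the Enums section):

# Expected number of Channel elements per Soundfield Configuration.
EXPECTED_CHANNELS = {"ST": 2, "51": 6}

def channel_map_is_consistent(configuration, channel_count):
    expected = EXPECTED_CHANNELS.get(configuration)
    return expected is not None and expected == channel_count

print(channel_map_is_consistent("ST", 2))  # True
print(channel_map_is_consistent("51", 2))  # False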

2.12. TimedTextType

The TimedText element is a complex type using various sub-type elements that describe the properties of timed text tracks.

Below is the basic structure of a TimedText element:

<TimedText>
  <Container/>
  <TrackMap/>
</TimedText>

2.12.1. Elements

Element Name Type Mandatory Description

Container

tag

yes

A fixed text string value, defined in a controlled vocabulary, see Enums

TrackMap

complex

yes

see TrackMapType

2.13. TrackMapType

The TrackMap element is a complex type containing an ordered list of Track elements.

<TrackMap>
  <Track/>
  ...
  <Track/>
  ...
</TrackMap>

2.14. ExtendedPropertiesType

The ExtendedProperties element is a complex type using various sub-type elements that describe the additional metadata that should be written into the delivery. Below is the basic structure of an ExtendedProperties element:

<ExtendedProperties>
  <PropertyList>
    ...
  </PropertyList>
</ExtendedProperties>

2.14.1. Elements

Element Name Type Description

PropertyList

complex

see PropertyListType

2.15. PropertyListType

The PropertyList element is a complex type containing a list of Property elements.

2.15.1. Elements

Element Name Type Description

Property

complex

see PropertyType

2.16. PropertyType

The Property element is a complex type defining a metadata or additional master delivery property entry as a name/value pair.

2.16.1. Elements

Element Name Type Description

Name

tag

A text string value, defined in a controlled vocabulary, that determines the field value for a metadata or master delivery property entry

Value

simple

User defined, text string value, corresponding to the value for the metadata or property entry

3. MT-TN-30 Authoring eXchange Format (AXF) XML File Format Specification

3.1. Introduction

3.1.1. Overview

This document describes the Marquise Technologies Authoring eXchange Format (AXF) XML file format used for exchanging projects and compositions among the Marquise Technologies product line.

The file format is based on the eXtensible Markup Language (XML) and uses its conventions and structures.

3.1.2. Scope

This specification is intended to give a reference to developers and implementers of this file format. As the format evolves to include new features, this specification is subject to frequent updates.

3.1.3. Document Organization

The specification is divided in various sections with a strong focus on implementation.

The last section of the document provides a detailed description of the various types used by the file format. These types are either simple types or complex types. Complex types contain a description of the sub-elements they contain.

The types are listed in alphabetic order.

3.1.4. Document Notation and Conventions

3.1.5. XML Conventions

The document always uses the UTF-8 encoding.

Currently this specification does not follow the rules of an XML schema (XSD), however this may change in the future.

3.2. AXF Document Structure

The AXF document structure is defined with the starting MTAXF root element, containing an unordered list of Project elements. The most common usage is to store a single Project element per AXF document. Each Project element contains an optional AssetList and an optional CompositionList. Below is the basic structure of an AXF document:

<MTAXF>
  <Project>
    <AssetList>
    ...
    </AssetList>
    <CompositionList>
    ...
    </CompositionList>
  </Project>
</MTAXF>

The optional AssetList element is used to reference the assets necessary for the project and the CompositionList element contains an unordered list of compositions. Each composition describes the temporal organization of the assets in a synchronized manner in order to produce a timeline, also known as a playlist.

3.2.1. About assets

In AXF, the AssetList contains an unordered list of assets that are available to all compositions in the project. If no compositions are present in the project, then the AssetList simply represents a structured bin of clips. For instance, the following AXF example contains a single project with only a list of assets:

<MTAXF>
  <Project>
    <AssetList>
      <Asset>
        <Id>urn:uuid:d0c3e019-0877-4877-9921-bf1517bf522a</Id>
        <Reference>/MySimpleImageSequence</Reference>
        <URL>F:\SourceMaterial\TIFF\1920x1080\sequence_#######.tiff</URL>
      </Asset>
      <Asset>
        <Id>urn:uuid:0c8e782d-689a-4ed4-94dd-4c86d6d8e1ea</Id>
        <Reference>/MyClip</Reference>
        <URL>F:\SourceMaterial\testClip.mov</URL>
      </Asset>
    </AssetList>
  </Project>
</MTAXF>

In this example, the list of assets has two entries, one referencing an image sequence and another referencing a QuickTime Movie clip. Note that the URL element of the asset provides the file system location of the assets. For image sequences, the part of the file name that is used for the number of a frame inside a sequence is replaced by # characters, one per digit.
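Resolving such a URL to a concrete frame is then a matter of replacing the run of # characters with the zero-padded frame number. The Python sketch below is illustrative (it assumes a single contiguous run of # characters, as in the example above):

def resolve_sequence_url(url, frame):
    # Each '#' stands for one digit of the zero-padded frame number.
    digits = url.count("#")
    return url.replace("#" * digits, str(frame).zfill(digits))

print(resolve_sequence_url(r"F:\SourceMaterial\TIFF\1920x1080\sequence_#######.tiff", 42))
# F:\SourceMaterial\TIFF\1920x1080\sequence_0000042.tiff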

3.2.2. About compositions

In AXF, a composition is a description of a timeline (or playlist) containing one or more kinds of essence, synchronized for playback. The properties of the composition, such as the frame rate and resolution, the audio sample rate and the color management parameters, are stored as well. Many other properties required for interactive editing (e.g. insert or replace modes, safe guides, etc) are also stored.

A project can carry zero or more compositions. When one or more compositions are present in the project, the compositions are listed in a CompositionList element:

<MTAXF>
  <Project>
    <AssetList>
    ...
    </AssetList>
    <CompositionList>
      <Composition>
      ...
      </Composition>
      ...
      <Composition>
      ...
      </Composition>
    </CompositionList>
  </Project>
</MTAXF>

3.2.3. About media, tracks and stacks

An AXF composition describes the synchronization of various pieces of material (essence) via a Media object. The Media object is a complex structure made of a collection of stacks, each stack being a collection of tracks. Below is a simplified example of a Media object containing multiple stacks and tracks:

...
<Media>
  <Video>
    <Track>
      ...
    </Track>
  </Video>
  <Audio>
    <Track>
      ...
    </Track>
    <Track>
      ...
    </Track>
  </Audio>
</Media>
...

A stack is made up of one or more tracks. The role of a stack is to provide information on the type of essence that its tracks represent as well as the logical relationship that may exist between the tracks of the same stack. Stacks may or may not exist in the tool or device used to edit or play a composition. This is entirely implementation dependent. The following table shows the currently defined stack types:

Table 72. Stack types
Element Name Description

Audio

used for audio essence tracks

Auxiliary

used for other non-conventional essence tracks, such as immersive audio, dynamic metadata, etc

TimedText

used for subtitles and captions

Video

used for video essence tracks

The most obvious example is an Audio stack that describes a group of audio tracks which jointly participate in the correct reproduction of a sound mix. Typically a stereo mix is made up of two audio channels, namely the left and right channels. Each channel is represented in AXF by a track. In order to signal the logical grouping of these tracks as a stereo mix, a stack is used. The stack carries the group type information as well as the configuration, in this case a stereo mix configuration. Each track of the stereo stack then represents the essence for a channel and also carries the information on the role of the track within the stack. In AXF such an Audio stack would be represented like this:

...
<Audio config="LtRt">
  <Track channel="Lt">
    <SegmentList>
      <Segment type="clip">
        <Id>urn:uuid:c902bdb2-0016-49a0-8f61-f0b28d134305</Id>
        <Name>testAudio_left</Name>
        <Duration>750</Duration>
        ...
      </Segment>
    </SegmentList>
  </Track>
  <Track channel="Rt">
    <SegmentList>
      <Segment type="clip">
        <Id>urn:uuid:ae7f6d1b-f66b-43f9-b571-23a96b4781ed</Id>
        <Name>testAudio_right</Name>
        <Duration>750</Duration>
        ...
      </Segment>
    </SegmentList>
  </Track>
</Audio>
...

In the example above, the Audio stack has a config attribute that describes the audio configuration for the stack. Each Track element of the Audio stack has a channel attribute that describes its role within the Audio stack. Implementations that do not support the concept of stacks can disregard the stacks when reading AXF compositions and treat the tracks as independent structures.

Tracks are then made up of a list of segments, representing the assembly of material (essence) for each track. This is described in the following section.

3.2.4. About tracks and segments

Tracks represent the assembly of material portions in order to form a coherent edit. The assembly is represented by a collection of segments. Segments can be of various kinds such as clips, gaps, generators and transitions.

Each track describes an independent list of consecutive segments. Segments are primarily defined by their duration, expressed in edit units. The edit units are based on the frame rate described in the composition properties. In other words, each edit unit is presented during playback for a duration of 1/frame rate.

Segments do not have a starting point. Each segment immediately follows the previous one. This implies that a list of segments does not have gaps between segments. In order to specify a gap within a track, a filler segment must be used with the duration of the gap. The following table represents the various segment types:

Table 73. Segment types
Type Description

clip

a segment representing a section of material (essence)

transition

a segment representing a transition between the previous and the next segments

generator

a segment representing a generated essence (e.g. color bars, silence, etc)

filler

a segment representing a gap between other segments

Each segment has a collection of properties that depend on its type. These properties are detailed in the SegmentType definition, in the AXF Document Elements section.
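Because segments are strictly consecutive, a reader derives each segment's start by summing the durations of the segments before it. The Python sketch below is illustrative only (the durations and edit rate are assumed values):

from fractions import Fraction

def segment_starts(durations):
    # Returns each segment's start (in edit units) and the total track length.
    starts, position = [], 0
    for duration in durations:
        starts.append(position)
        position += duration
    return starts, position

durations = [750, 250, 500]          # e.g. clip, filler (gap), clip
starts, total = segment_starts(durations)
edit_rate = Fraction(25, 1)          # assumed composition frame rate
print(starts, "total:", float(total / edit_rate), "seconds")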

3.3. AXF Document Elements

The following section describes the various element and sub-element types used in this specification.

Table 74. root element
Element Name Type Description

MTAXF

complex

AXF document root element

Table 75. A
Element Name Type Description

ACESInputDeviceTransform

simple

ACESOutputDeviceTransform

simple

ActiveArea

simple

Anchor

simple

Angle

simple

AspectRatio

simple

Asset

complex

see AssetType

AssetId

simple

AssetList

complex

see AssetListType

AssetSourceChannel

simple

Audio

complex

see AudioType

AudioDownmixType

simple

AudioRoutingMap

complex

see [AudioRoutingMapType]

AudioRoutingMapChannel

complex

see [AudioRoutingMapChannelType]

AudioSampleRate

simple

Auxiliary

complex

see [AuxiliaryType]

Table 76. B
Element Name Type Description

BitDepth

simple

Blanking

complex

see BlankingType

BlueWeights

simple

BottomLeft

simple

BottomRight

simple

Brightness

simple

BurnInList

complex

see [BurnInListType]

BurnInText

simple

Table 77. C
Element Name Type Description

CanvasAspectRatio

simple

ChannelIndex

simple

ChromaFormat

simple

ChromaticAdaptationMethod

simple

CIEBluePrimary

simple

CIEGreenPrimary

simple

CIERedPrimary

simple

CIEWhitePoint

simple

CMS

complex

see CMSType

CodingEquations

simple

Color

simple

ColorBalance

simple

ColorCorrectionPrimaryIn

complex

see [ColorCorrectionPrimaryInType]

ColorCorrectionPrimaryOut

complex

see [ColorCorrectionPrimaryOutType]

ColorCorrectionSecondary

complex

see [ColorCorrectionSecondaryType]

ColorEncoding

simple

ColorEncodingFormat

simple

ColorPrimaries

simple

ColorRange

simple

ColorSpace

simple

ColorVolume

complex

see [ColorVolumeType]

Comment

simple

Composition

complex

see CompositionType

CompositionList

complex

see CompositionListType

ContentKind

simple

ContentLightLevels

complex

see ContentLightLevelsType

ContentVersionIdentifiers

complex

see [ContentVersionIdentifiersType]

Contrast

simple

Table 78. D
Element Name Type Description

DataType

simple

DiagonalSize

simple

DMCVT

complex

see [DMCVTType]

DolbyVision

complex

see [DolbyVisionType]

Duration

simple

DurationRange

complex

see [DurationRangeType]

DynamicToneMapping

complex

see [DynamicToneMappingType]

Table 79. E
Element Name Type Description

EditingParameters

complex

see [EditingParametersType]

ElementKind

simple

EOTF

simple

Episode

simple

Table 80. F
Element Name Type Description

FilmType

simple

Flip

simple

Flop

simple

Font

complex

see [FontType]

FontFace

simple

FontList

complex

see [FontListType]

Format

complex

see FormatType

FrameRate

simple

Table 81. G
Element Name Type Description

Gain

simple

GreenWeights

simple

Table 82. H
Element Name Type Description

HDR10Plus

complex

see [HDR10PlusType]

HFactor

simple

Height

simple

HighlightClipping

simple

Highlights

simple

Hue

simple

HueVectorField

simple

Table 83. I
Element Name Type Description

Id

simple

ImageAspectRatio

simple

ImageCharacter

simple

ImmersiveAudio

complex

see [ImmersiveAudioType]

ImportDescriptor

complex

see ImportDescriptorType

In

simple

Table 84. K
Element Name Type Description

KeyFrame

complex

see [KeyFrameType]

KeyFrameList

complex

see [KeyFrameListType]

Table 85. L
Element Name Type Description

Language

simple

Level1

complex

see [Level1Type]

Level2

complex

see [Level2Type]

Level3

complex

see [Level3Type]

Level4

complex

see [Level4Type]

Level5

complex

see [Level5Type]

Level8

complex

see [Level8Type]

Level9

complex

see [Level9Type]

Locator

complex

see [LocatorType]

LocatorList

complex

see [LocatorListType]

Table 86. M
Element Name Type Description

Marker

complex

see [MarkerType]

MarkerList

complex

see [MarkerListType]

MarkRange

complex

see [MarkRangeType]

MasteringDisplay

complex

see [MasteringDisplayType]

MaxCLL

simple

MaxFALL

simple

MaximumLuminance

simple

Media

complex

see MediaType

Metadata

simple

MetadataEntry

simple

MetadataEntryList

complex

see [MetadataEntryListType]

MidContrastBias

simple

Midtones

simple

MidTonesOffset

simple

MinimumLuminance

simple

Mode

simple

Table 87. N
Element Name Type Description

Name

simple

Table 88. O
Element Name Type Description

Offset

simple

Opacity

simple

Origin

simple

Out

simple

Table 89. P
Element Name Type Description

PanAndScan

complex

see [PanAndScanType]

Parameters

complex

see [ParametersType]

PARInvert

simple

PARValue

simple

PARVertical

simple

PartitionKind

simple

PartitionNumber

simple

Picture

complex

see [PictureType]

PictureList

complex

see [PictureListType]

PixelAspectRatio

simple

Playhead

complex

see [PlayheadType]

PlayheadList

complex

see [PlayheadListType]

PlayheadRange

complex

see [PlayheadRangeType]

Position

simple

Power

simple

ProcessingGraph

complex

see [ProcessingGraphType]

ProcessingGraphList

complex

see [ProcessingGraphListType]

ProcessingPipeline

complex

see [ProcessingPipelineType]

Project

complex

see ProjectType

Properties

complex

see PropertiesType

Table 90. R
Element Name Type Description

RangeMax

simple

RangeMin

simple

RAWImageProcessingSettings

complex

see [RAWImageProcessingSettingsType]

RedWeights

simple

Reel

complex

see ReelType

ReelList

complex

see ReelListType

Reference

simple

Resource

complex

see [ResourceType]

ResourceId

simple

ResourceList

complex

see [ResourceListType]

RGBMixer

complex

see [RGBMixerType]

Table 91. S
Element Name Type Description

SafeArea

complex

see [SafeAreaType]

SafeAreaList

complex

see [SafeAreaListType]

SafeColorVolume

complex

see [SafeColorVolumeType]

SafeGuides

complex

see [SafeGuidesType]

Saturation

simple

SaturationVectorField

simple

Scale

simple

Segment

complex

see [SegmentType]

SegmentList

complex

see [SegmentListType]

Shadows

simple

Shear

simple

SidecarAsset

complex

see [SidecarAssetType]

SidecarAssetList

complex

see [SidecarAssetListType]

Slope

simple

Source

complex

see [SourceType]

SourceColorModel

simple

SourceColorPrimary

simple

SourceList

complex

see [SourceListType]

SpeedFactor

simple

StereoscopicConvergence

simple

System

simple

Table 92. T
Element Name Type Description

TargetId

simple

TargetDisplayId

simple

Temperature

simple

TimedText

complex

see TimedTextType

TimedTextColorMapping

complex

see [TimedTextColorMappingType]

Tint

simple

Title

simple

TitleVersion

simple

TitleSubVersion

simple

TopLeft

simple

TopRight

simple

Track

complex

see TrackType

TrackIndex

simple

TrackType

simple

Trim

simple

Type

simple

Table 93. U
Element Name Type Description

URI

simple

URL

simple

Table 94. V
Element Name Type Description

VFactor

simple

Version

simple

Video

complex

see VideoType

ViewportLUT

simple

Table 95. W
Element Name Type Description

Width

simple

Workflow

simple

Table 96. Y
Element Name Type Description

YRGB

complex

see [YRGBType]

Table 97. Z
Element Name Type Description

Zoom

simple

3.4. AudioType

TBD

3.5. AssetListType

The AssetList element is defined by the AssetListType and contains an ordered list of Asset elements. Each Asset element is defined by the AssetType.

Below is an example of an AssetList:

...
<AssetList>
  <Asset>
  ...
  </Asset>
  ...
  <Asset>
  ...
  </Asset>
</AssetList>
...

3.6. AssetType

The AssetType is a complex element type describing the location of the asset within the project’s own library as well as the corresponding location on the file system. In addition to these properties, a globally unique identifier (GUID) is assigned to the asset.

3.6.1. Elements

Table 98. AssetType elements
Element Name Type Description

Id

UUID

a unique ID following the convention described in RFC 4122

Reference

URI

a local URI that describes the location of the asset in the project’s asset library

URL

URI

a system-wide URI that describes the location of the asset in the file system

3.6.2. Example

Below are a few simple examples describing the usage of the Asset elements.

Image Sequence asset
...
<Asset>
  <Id>urn:uuid:d0c3e019-0877-4877-9921-bf1517bf522a</Id>
  <Reference>/MySimpleImageSequence</Reference>
  <URL>F:\SourceMaterial\TIFF\1920x1080\sequence_#######.tiff</URL>
</Asset>
...
Container clip asset
...
<Asset>
  <Id>urn:uuid:0c8e782d-689a-4ed4-94dd-4c86d6d8e1ea</Id>
  <Reference>/MyClip</Reference>
  <URL>F:\SourceMaterial\testClip.mov</URL>
</Asset>
...

3.7. BlankingType

The BlankingType is a complex element type describing the blanking properties used to burn in a letter-box or pillar-box black matte on top of the canvas after all video tracks have been rendered.

3.7.1. Elements

Table 99. BlankingType elements
Element Name Type Description

Opacity

Real

a 0 to 100 percentage value that controls the blanking opacity

Table 100. BlankingType attributes
Attribute Name Type Description

enabled

Boolean

set to TRUE if the blanking matte is active (painted), FALSE otherwise

3.7.2. Example

...
<Blanking enabled="false">
  <Opacity>100</Opacity>
</Blanking>
...

3.8. CMSType

The CMSType is a complex element type describing the various properties used by the color management engine when processing the composition’s video frames.

3.8.1. Elements

Table 101. CMSType elements
Element Name Type Description

System

tag

a UTF-8 string defining the color management system used for processing

Version

tag

a UTF-8 string defining the version of the color management system used for processing

Workflow

tag

a UTF-8 string defining the workflow of the color management system used for processing

ColorPrimaries

tag

a UTF-8 string defining the color primaries of the color space used

EOTF

tag

a UTF-8 string defining the electro-optical transfer function used

CodingEquations

tag

a UTF-8 string defining the coding equations (matrix coefficients) for RGB to YCbCr and YCbCr to RGB conversions

ChromaticAdaptationMethod

tag

a UTF-8 string defining the chromatic adaptation method used for color space conversion
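
3.8.2. Example

Below is a hypothetical CMS fragment. The tag values shown are illustrative assumptions only; the actual controlled vocabularies depend on the color management system in use (color-related tags are listed in the Color Information Signaling specification):

...
<CMS>
  <System>ACES</System>
  <Version>1.3</Version>
  <Workflow>default</Workflow>
  <ColorPrimaries>ITU-R-BT.709</ColorPrimaries>
  <EOTF>BT.1886</EOTF>
  <CodingEquations>ITU-R.BT.709</CodingEquations>
  <ChromaticAdaptationMethod>Bradford</ChromaticAdaptationMethod>
</CMS>
...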

3.9. CompositionListType

The CompositionListType is a complex element type describing the various compositions that are part of the project.

3.9.1. Elements

Table 102. CompositionListType elements
Element Name Type Description

Composition

CompositionType

one or more occurrences of a complex element describing a composition
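
3.9.2. Example

Below is a hypothetical skeleton of a CompositionList; the contents of each Composition element are elided:

...
<CompositionList>
  <Composition>
  ...
  </Composition>
  ...
  <Composition>
  ...
  </Composition>
</CompositionList>
...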

3.10. CompositionType

The CompositionType is a complex element type describing the various properties and synchronization of media inside a timeline (playlist).

3.10.1. Elements

Table 103. CompositionType elements
Element Name Type Description

Id

UUID

a unique ID following the convention described in RFC 4122

Name

UTF-8 String

a UTF-8 string containing the name of the composition

ImportDescriptor

ImportDescriptorType

a complex element containing the source master type and location

Media

MediaType

a complex element containing the stacks for various kinds of media

Properties

PropertiesType

a complex element containing the composition properties

ReelList

ReelListType

a list of reels present in the composition
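
3.10.2. Example

Below is a hypothetical skeleton of a Composition element assembled from the elements listed above; child contents are elided and the Id and Name values are purely illustrative:

...
<Composition>
  <Id>urn:uuid:00000000-0000-0000-0000-000000000000</Id>
  <Name>MyComposition</Name>
  <ImportDescriptor>
  ...
  </ImportDescriptor>
  <Properties>
  ...
  </Properties>
  <Media>
  ...
  </Media>
  <ReelList>
  ...
  </ReelList>
</Composition>
...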

3.11. FormatType

The FormatType is a complex element type describing the frame geometry, frame rate and audio rate properties of a composition.

3.11.1. Elements

Table 104. FormatType elements
Element Name Type Description

Mode

keyword

a keyword that describes the presentation mode, the allowed values are monoscopic and stereoscopic

Width

UInt32

an unsigned 32-bit integer representing the frame width

Height

UInt32

an unsigned 32-bit integer representing the frame height

PixelAspectRatio

Rational

a rational representing the pixel aspect ratio

ActiveArea

UInt32Array

four unsigned 32-bit integer values representing the active width, height, X and Y offsets of the frame active area

FrameRate

Rational

a rational representing the video playback frame rate

AudioSampleRate

Rational

a rational representing the audio playback sample rate

FilmType

keyword

a keyword that describes the film format corresponding to the composition

3.11.2. Example

Below is a simple example describing the format of a composition:

...
<Format>
  <Mode>monoscopic</Mode>
  <Width>1920</Width>
  <Height>1080</Height>
  <PixelAspectRatio>1:1</PixelAspectRatio>
  <ActiveArea>1920 1080 0 0</ActiveArea>
  <FrameRate>25:1</FrameRate>
  <AudioSampleRate>48000:1</AudioSampleRate>
  <FilmType>undefined</FilmType>
</Format>
...

3.12. ImportDescriptorType

The ImportDescriptorType is a complex element type describing the source that the composition was derived from.

3.12.1. Elements

Table 105. ImportDescriptorType elements
Element Name Type Description

Type

tag

a tag that identifies the source master type (e.g. DCP, IMF, etc)

URL

URI

a system-wide URI that describes the location of the source master
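
3.12.2. Example

Below is a hypothetical ImportDescriptor fragment. The Type tag uses one of the master types mentioned above (IMF), and the path is an illustrative assumption:

...
<ImportDescriptor>
  <Type>IMF</Type>
  <URL>F:\Masters\MyTitle_IMF_Package</URL>
</ImportDescriptor>
...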

3.13. MediaType

The MediaType is a complex element type describing the various stacks of time-synchronized material parts inside the composition.

3.13.1. Elements

Table 106. MediaType elements
Element Name Type Description

Audio

AudioType

a complex element containing one or more audio stacks of tracks to be played synchronously

TimedText

TimedTextType

a complex element containing one or more timed text stacks of tracks to be played synchronously

Video

VideoType

a complex element containing one or more video stacks of tracks to be played synchronously
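
3.13.2. Example

Below is a hypothetical skeleton of a Media element; the contents of each stack are elided:

...
<Media>
  <Video>
  ...
  </Video>
  <Audio>
  ...
  </Audio>
  <TimedText>
  ...
  </TimedText>
</Media>
...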

3.14. ProjectType

The ProjectType is a complex element type describing the various core objects belonging to a project such as assets and compositions.

3.14.1. Elements

Table 107. ProjectType elements
Element Name Type Description

AssetList

AssetListType

a list of assets available for all compositions

CompositionList

CompositionListType

a list of compositions present in the project
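
3.14.2. Example

Below is a hypothetical skeleton of a Project element; list contents are elided (see the AssetList example above for a complete fragment):

...
<Project>
  <AssetList>
  ...
  </AssetList>
  <CompositionList>
  ...
  </CompositionList>
</Project>
...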

3.15. PropertiesType

The PropertiesType is a complex element type describing the various composition properties.

3.15.1. Elements

Table 108. PropertiesType elements
Element Name Type Description

Format

FormatType

a complex element containing the composition audio-visual format properties

Blanking

BlankingType

a complex element containing the frame blanking parameters

CMS

CMSType

a complex element containing the color management system parameters

ContentLightLevels

ContentLightLevelsType

a complex element containing the HDR content light levels such as MaxCLL and MaxFALL

SafeGuides

SafeGuidesType

a complex element containing the safe guides

EditingParameters

EditingParametersType

a complex element containing the editing parameters
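
3.15.2. Example

Below is a hypothetical skeleton of a Properties element combining the sub-elements listed above; contents are elided (see the Format and Blanking examples for complete fragments):

...
<Properties>
  <Format>
  ...
  </Format>
  <Blanking enabled="false">
  ...
  </Blanking>
  <CMS>
  ...
  </CMS>
  <ContentLightLevels>
  ...
  </ContentLightLevels>
  <SafeGuides>
  ...
  </SafeGuides>
  <EditingParameters>
  ...
  </EditingParameters>
</Properties>
...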

3.16. ReelListType

The ReelListType is a complex element type describing the reels in the composition.

3.16.1. Elements

Table 109. ReelListType elements
Element Name Type Description

Reel

ReelType

one or more occurrences of a complex element describing a reel

3.16.2. Example

...
<ReelList>
  <Reel>
    <Name>First Reel</Name>
    <Color>#33A8C7</Color>
    <In>0</In>
    <Duration>100</Duration>
  </Reel>
  ...
  <Reel>
    <Name>Last Reel</Name>
    <Color>#ff0000</Color>
    <In>500</In>
    <Duration>280</Duration>
  </Reel>
</ReelList>
...

3.17. ReelType

The ReelType is a complex element type describing the various properties of a composition reel.

3.17.1. Elements

Table 110. ReelType elements
Element Name Type Description

Id

UUID

a unique ID following the convention described in RFC 4122

Name

UTF-8 String

a UTF-8 string containing the name of the reel

Color

UTF-8 String

a UTF-8 string containing the color name or value of the reel using the HTML/CSS syntax

In

UInt32

an unsigned 32-bit numerical value indicating the start of the reel in edit units (usually video frames)

Duration

UInt32

an unsigned 32-bit numerical value indicating the duration of the reel in edit units (usually video frames)

3.17.2. Example

...
<Reel>
  <Name>First Reel</Name>
  <Color>#33A8C7</Color>
  <In>0</In>
  <Duration>100</Duration>
</Reel>
...

3.18. TimedTextType

TBD

3.19. TrackType

The TrackType is a complex element type describing the various properties of a track.

3.19.1. Elements

Table 111. TrackType elements
Element Name Type Description

Name

UTF-8 String

name of the track

DataType

keyword

a keyword defining the class/role of the essence represented by the track

SegmentList

SegmentListType

a complex element containing an ordered list of Segment elements

Table 112. TrackType attributes
Attribute Name Type Description

enabled

Boolean

set to TRUE if the track is enabled for playback/rendering, FALSE otherwise

displayHeight

UInt32

a number of pixels that represent the height of a track when displayed in an editor

locked

Boolean

set to TRUE if the track is locked against editing, FALSE otherwise

selected

Boolean

set to TRUE if the track is selected, FALSE otherwise
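
3.19.2. Example

Below is a hypothetical Track fragment. The DataType keyword (video) and the attribute values are illustrative assumptions, since the keyword vocabulary is not listed here:

...
<Track enabled="true" locked="false" selected="false" displayHeight="48">
  <Name>V1</Name>
  <DataType>video</DataType>
  <SegmentList>
  ...
  </SegmentList>
</Track>
...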

3.20. VideoType

TBD

4. MT-TN-40 Cut List XML File Format Specification

4.1. Introduction

4.1.1. Overview

This document describes the Marquise Technologies Cut List XML file format used to describe simple cut-based edit lists across the Marquise Technologies product line.

The file format is based on the eXtensible Markup Language (XML) and uses its conventions and structures.

4.1.2. Scope

This specification is intended to give a reference to developers and implementers of this file format. As the format evolves to include new features, this specification is subject to frequent updates.

4.1.3. Document Organization

The specification is divided into several sections with a strong focus on implementation.

The last section of the document provides a detailed description of the various types used by the file format. These types are either simple types or complex types. Complex types contain a description of the sub-elements they contain.

The types are listed in alphabetical order.

4.1.4. Document Notation and Conventions

4.1.5. XML Conventions

The document always uses the UTF-8 encoding.

Currently this specification does not follow the rules of an XML schema (XSD), however this may change in the future.

4.2. About Marquise Technologies' Cut Lists

The Cut Lists are used to define the properties of a package consisting of multiple assets. This is especially useful when assets generated outside the Marquise Technologies ecosystem need to be used with our different products.

4.2.1. MTCTL Example

Below is a complete example of a package consisting of one EXR image sequence and six WAV audio files, grouped into one segment with one image track and six mono audio tracks:

<?xml version="1.0" encoding="UTF-8" ?>
<MTCTL xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Name>openEXR_MTCTL</Name>
  <Issuer>Marquise Technologies</Issuer>
  <IssueDate>2021-09-02T13:26:32+00:00</IssueDate>
  <Creator>STORM</Creator>
  <Specification>custom</Specification>
  <Shim>islongplay</Shim>
  <ShimVersion>default</ShimVersion>
  <Properties>
    <Format>
      <Width>1920</Width>
      <Height>1080</Height>
      <FrameRate>24:1</FrameRate>
      <AudioSampleRate>48000:1</AudioSampleRate>
      <ColorSpace>ACES</ColorSpace>
    </Format>
  </Properties>
  <AssetList>
    <Asset id="pic_reel.1">Image/#######.exr</Asset>
    <Asset id="snd_reel.1.track.1">Audio/Channel_L.wav</Asset>
	<Asset id="snd_reel.1.track.2">Audio/Channel_R.wav</Asset>
	<Asset id="snd_reel.1.track.3">Audio/Channel_C.wav</Asset>
	<Asset id="snd_reel.1.track.4">Audio/Channel_LFE.wav</Asset>
	<Asset id="snd_reel.1.track.5">Audio/Channel_Ls.wav</Asset>
	<Asset id="snd_reel.1.track.6">Audio/Channel_Rs.wav</Asset>
  </AssetList>
  <SegmentList>
    <Segment duration="550">
      <Picture in="0" sourceChannel="0">pic_reel.1</Picture>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.1</Sound>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.2</Sound>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.3</Sound>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.4</Sound>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.5</Sound>
      <Sound in="0" sourceChannel="1">snd_reel.1.track.6</Sound>
    </Segment>
  </SegmentList>
</MTCTL>

4.3. MTCTL

The MTCTL is the cut-list’s root element.

4.3.1. Elements

Element Name Type Description

Name

UTF-8 string

A user defined, human readable text string, containing the name of the composition.

Issuer

UTF-8 string

A user defined, human readable text string, containing the name of the issuer.

IssueDate

UTF-8 string

A text string value, containing the issue date/time following the ISO 8601 standard

Creator

UTF-8 string

A user defined, human readable text string, containing the name of the document’s creator.

Specification

UTF-8 string

A fixed text string value, defined in a controlled vocabulary, that defines a collection of predefined values for all the elements found in a MasterDeliverySpecification.

Shim

UTF-8 string

A fixed text string value, defined in a controlled vocabulary, that defines a collection of predefined values applicable to a given Specification.

ShimVersion

UTF-8 string

A fixed text string value, defined in a controlled vocabulary, that defines a collection of predefined values applicable to a given Shim.

Properties

PropertiesType

Complex type containing various cut-list format properties

AssetList

AssetListType

Complex type containing a list of Asset elements

SegmentList

SegmentListType

Complex type containing a list of Segment elements

4.4. Properties

The Properties element is a complex type containing various sub elements describing the cut-list’s properties.

4.4.1. Elements

Element Name Type Description

Format

FormatType

Complex type containing sub elements

TimecodeStart

UTF-8 string

A text string value defining the start timecode of the first segment, following the SMPTE Timecode syntax
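
4.4.2. Example

Below is a hypothetical Properties fragment. The Format values are taken from the MTCTL example above; the start timecode is an illustrative value following the SMPTE Timecode syntax:

<Properties>
  <Format>
    <Width>1920</Width>
    <Height>1080</Height>
    <FrameRate>24:1</FrameRate>
    <AudioSampleRate>48000:1</AudioSampleRate>
    <ColorSpace>ACES</ColorSpace>
  </Format>
  <TimecodeStart>01:00:00:00</TimecodeStart>
</Properties>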

4.5. Format

The Format element is a complex type containing various sub elements describing the cut-list’s format.

4.5.1. Elements

Element Name Type Description

Width

UInt32

An unsigned 32bit integer representing the image width

Height

UInt32

An unsigned 32bit integer representing the image height

FrameRate

Rational

A rational number representing the cut-list’s edit rate

AudioSampleRate

Rational

A rational number representing the cut-list’s audio sampling rate

ColorSpace

UTF-8 string

A fixed text string value, defined in a controlled vocabulary

4.5.2. About ColorSpace

The ColorSpace element indicates the working color-space for all the image assets defined in the AssetList complex type.

4.6. AssetList

The AssetList element is a complex type containing a list of Asset elements.

4.6.1. Elements

Element Name Type Description

Asset

URI

A local URI that describes the location of the asset relative to the cut-list file location

4.7. Asset

The Asset element is a complex type containing a list of attributes. Below is an example of an image sequence Asset element:

<Asset id="pic_reel.1">Image/NGCT-ACES-P3D6048nits/#######.exr</Asset>

4.7.1. Attributes

Attribute Name Type Description

id

UTF-8 string

A unique, user defined text string value, defining the id of the asset

4.8. SegmentList

The SegmentList element is a complex type containing a list of Segment elements.

4.8.1. Elements

Element Name Type Description

Segment

SegmentType

Complex type containing various elements that describe the segment type

4.9. Segment

The Segment element is a complex type containing various elements that describe the segment type.

4.9.1. Attributes

Attribute Name Type Description

duration

UTF-8 string

A text string value defining the duration of the segment, in edit units or as a SMPTE timecode.

4.9.2. Elements

Element Name Type Description

Picture

UTF-8 string

A unique, user defined text string value, previously defined as the id attribute of an existing Asset element

Sound

UTF-8 string

A unique, user defined text string value, previously defined as the id attribute of an existing Asset element

TimedText

UTF-8 string

A unique, user defined text string value, previously defined as the id attribute of an existing Asset element

4.10. Picture

The Picture element is defined as a sub-element of a Segment element. Below is an example of an image sequence Picture element:

<Picture in="0" sourceChannel="0">pic_reel.1</Picture>

4.10.1. Attributes

Attribute Name Type Description

in

UInt32

The number of video frames, greater than or equal to zero, defining the start position within the referenced source material.

sourceChannel

UInt32

A zero-based index value, defining the referenced Asset source channel index

4.11. Sound

The Sound element is defined as a sub-element of a Segment element. Below is an example of one mono-channel Sound element:

<Sound in="0" sourceChannel="1">snd_reel.1.track.1</Sound>

4.11.1. Attributes

Attribute Name Type Description

in

UInt32

The number of samples, greater than or equal to zero, defining the start position within the referenced source material.

sourceChannel

UInt32

A zero-based index value, defining the referenced Asset source channel index

4.12. TimedText

The TimedText element is defined as a sub-element of a Segment element. Below is an example of one TimedText element:

<TimedText in="0" sourceChannel="1">tt_reel.1.track.1</TimedText>

4.12.1. Attributes

Attribute Name Type Description

in

UInt32

The number of video frames, greater than or equal to zero, defining the start position within the referenced source material.

sourceChannel

UInt32

A zero-based index value, defining the referenced Asset source channel index

5. MT-TN-50 Overlay XML File Format Specification

5.1. Introduction

5.1.1. Overview

This document describes the Marquise Technologies Overlay XML file format used to describe overlays and their properties across the Marquise Technologies product line.

The file format is based on the eXtensible Markup Language (XML) and uses its conventions and structures.

5.1.2. Scope

This specification is intended to give a reference to developers and implementers of this file format. As the format evolves to include new features, this specification is subject to frequent updates.

5.1.3. Document Organization

The specification is divided into several sections with a strong focus on implementation.

The last section of the document provides a detailed description of the various types used by the file format. These types are either simple types or complex types. Complex types contain a description of the sub-elements they contain.

The types are listed in alphabetical order.

5.1.4. Document Notation and Conventions

5.1.5. XML Conventions

The document always uses the UTF-8 encoding.

Currently this specification does not follow the rules of an XML schema (XSD), however this may change in the future.

5.2. About Marquise Technologies' Overlays

Overlay XML files are used to describe the properties of text or image elements superimposed over the image and bounded by the composition’s container.

5.2.1. Overlay Example

Below is a complete example of an overlay defining the properties of two text elements using tags to capture the clip’s name and the timecode. The second Text element uses a Rich Text Markup command to specify the color and transparency of the text. In this case the text will be red with an alpha (transparency) of approximately 50%.

<?xml version="1.0" encoding="UTF-8" ?>
<MTOverlaysTemplate>
	<Text hpos="5" vpos="0" halign="left" valign="center">TCR: $Timecode$</Text>
	<Text hpos="5" vpos="5" halign="left" valign="bottom">{\col(#ff000088) NAME: $ClipName$}</Text>
</MTOverlaysTemplate>

5.3. MTOverlaysTemplate

The MTOverlaysTemplate is the overlay’s root element.

5.3.1. Elements

Element Name Type Description

Text

simple

A user defined, human readable text string, containing the value of the text. This value can be combined with different tags.

5.4. Text

The Text element is a simple type that defines, via its attributes, the positioning as well as the text to be displayed as an overlay. It can make use of tags to capture various data fields available in the clip’s metadata as well as make use of Marquise Technologies rich-text commands.

5.4.1. Attributes

Attribute Name Type Description

hpos

simple

Horizontal position, expressed as a percentage of the composition’s container horizontal dimension, where 0 aligns the text block’s inline-start position with the container’s left edge. The progression direction is from left to right.

vpos

simple

Vertical position, expressed as a percentage of the composition’s container vertical dimension, where 0 aligns the text block’s baseline with the container’s top edge. The progression direction is from top to bottom.

halign

keyword

Horizontal alignment. Defines the text block’s alignment relative to the composition’s container where the origin is represented by the text block’s inline-start position. Possible values: left, right, center.

valign

keyword

Vertical alignment. Defines the text block’s alignment relative to the composition’s container where the origin is represented by the text block’s inline-end position. Possible values: top, bottom, center.

5.4.2. About positioning

The baseline is a line that follows the inline text line box. It is used to position and guide the individual glyphs from different fonts or font sizes when typesetting. The baseline is defined by the font face baseline table and different writing systems might prefer different baseline tables. Text overlays use the alphabetic baseline that typically aligns with the bottom of uppercase glyphs in the case of Latin scripts.

The inline is defined by the content (glyphs) that flows within a line of text. Its dimension is parallel to the flow of text within a line. It can be a horizontal dimension for horizontal writing modes (Latin scripts) or a vertical one in the case of vertical writing modes. The inline-start and inline-end are thus determined by the inline direction which, in turn, is determined by the writing mode of the used script.

It is important to understand these concepts when positioning text. You might find that a line of text using a Latin script, left-aligned and top-aligned with its horizontal and vertical positions set to 0, does not appear in the composition’s container (i.e. it is outside the visible bounds of the image). This is because the alignment uses the font’s baseline and the inline start/end coordinates to position the block of text. In this example the baseline is aligned with the top edge of the composition’s container and, as previously mentioned, in the case of a Latin script the baseline usually sits at the bottom of an uppercase glyph. By increasing the vertical position (vpos), the text block is descended and brought into the visible region of the container, as shown below.
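
For instance, the following hypothetical Text element descends a top-aligned text block by 3% of the container height so that the glyphs become fully visible below the top edge; the value 3 is illustrative and the required amount depends on the font size:

<Text hpos="0" vpos="3" halign="left" valign="top">$ClipName$</Text>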

5.5. Tags

Tag

Description

CameraManufacturer

CAM MANUFACTURER

CameraModel

CAM MODEL

CameraSerialNumber

CAM S/N

CameraId

CAM ID

CameraSensorId

CAM SENSOR

ClipName

CLIP NAME

Codec

CODEC

CompositionName

COMPOSITION NAME

CompositionResolution

COMPOSITION RESOLUTION

ContentKind

CONTENT KIND

Edgecode

EDGE CODE

EpisodeNumber

EPISODE NUMBER

EpisodeTitle

EPISODE TITLE

FrameNumber

FRAME NUMBER

FrameRate

FRAME RATE

ISDCF

ISDCF D-CINEMA NAMING CONVENTION

Language

LANGUAGE

Name

NAME

PackageName

PKG NAME

ProjectName

PROJECT NAME

Production

PRODUCTION NAME

Agency

AGENCY NAME

Producer

PRODUCER NAME

Director

DIRECTOR NAME

Title

PROJECT TITLE

CreationDate

PROJECT CREATION DATE

ModificationDate

PROJECT LAST MODIFICATION DATE

Client

CLIENT NAME

ClientCode

CLIENT CODE

Colorist

COLORIST NAME

DoP

DIRECTOR OF PHOTOGRAPHY NAME

Facility

FACILITY NAME

Studio

STUDIO NAME

ReelIndex

REEL INDEX

ReelName

REEL NAME

ReelNumber

REEL NUMBER

Timecode

TIMECODE

UUID

UUID

Version

VERSION

5.5.1. About Tags

Tags can be used within a Text element’s value in order to capture various information about the media. This is done by reading the available clip and/or composition metadata. For example, if an overlay is created in order to burn in the camera manufacturer of a source clip but that information is not available in the metadata, and no other text was provided, the overlay will be empty and no visible data will be written on top of the image.

6. MT-TN-60 Rich Text Markup Specification

6.1. Introduction

6.1.1. Overview

This document describes the Marquise Technologies Rich Text Markup used across all the products to apply advanced styling to text elements.

6.1.2. Scope

This specification is intended to give a reference to end-users on how the Rich Text Markup can be used to apply advanced styling during text rendering for subtitles and captions as well as any other place where user-defined text is rendered.

6.1.3. Syntax Structure

Rich Text Markup (RTM) uses a compact structure that supports styling commands and nested blocks called spans. Each span can have its own list of styling commands. Nested spans inherit the styling commands from the parent spans.

Text Stream

A text stream is a stream of characters that may contain Rich Text Markup as described in this document. The text stream does not contain any document layout properties such as page management or paragraph management. These concepts are out of scope and depend on the context where Rich Text Markup is used. However they may influence the default rendering of the text (e.g. default text color, default font size, etc).

Text Span

A text span is delimited by enclosing the text between curly brackets { and }. These delimiting characters have the Unicode values of U+007B and U+007D respectively. The top-level span in a nested group of spans may or may not be enclosed using the curly brackets. In that case the RTM processor assumes that the start and the end of the top-level span are defined by the start and the end of the entire block of text.

Styling Commands

Styling commands must immediately follow the beginning of a span, each one starting with the backslash character \ (Unicode character U+005C). No whitespace character may exist between the start of a span and the list of commands, nor between commands. The RTM parser stops interpreting styling commands when encountering the first whitespace after the start of a span or a styling command.
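
As an illustration, the following hypothetical text stream contains a top-level span that sets the text color and a nested span that inherits the color while adding a bold weight; the 8-digit RGBA color value is illustrative:

{\col(#ffffffff) This line is white and {\fontWeight(bold) this part is also bold}.}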

Table 113. Styling Commands
Command Name Description

b

changes the current text background color, see Text Background Color

bidi

col

changes the current text color, see Text Color

dir

font

fontFamily

fontSize

changes the current font size, see Font Size

fontStyle

changes the current font slanting style (e.g. Normal, Italic, etc), see Font Style

fontWeight

changes the current font weight (e.g. Normal, Bold, etc), see Font Weight

opacity

ruby

textCombine

textDecoration

textEmphasis

textOrientation

textOutline

textShadow

Font Weight

fontWeight(weight)

The fontWeight command changes the weight of the current font within the current span.

Table 114. Parameters
Parameter Description

normal

sets the standard weight for the font

bold

sets the bold weight for the font

Table 115. Properties
Property Description

Inherited

yes

Font Size

The fontSize command changes the size of the current font within the current span.

Table 116. Properties
Property Description

Inherited

yes

Font Style

fontStyle(style)

The fontStyle command changes the slanting style of the current font within the current span.

Table 117. Parameters
Parameter Description

normal

sets the normal slanting for the font

italic

sets the italic slanting for the font

oblique

sets the oblique slanting for the font (if the font does not support the style, italic is used)

Table 118. Properties
Property Description

Inherited

yes

Text Background Color

The b command changes the current text background color within the current span.

Table 119. Properties
Property Description

Inherited

yes

Text Color

The col command changes the current text color within the current span.

Table 120. Properties
Property Description

Inherited

yes
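
As with all styling commands, col can be combined with other commands at the start of a span, with no whitespace between them. A hypothetical combined usage, reusing the 8-digit RGBA syntax from the overlay example in MT-TN-50:

{\col(#ff0000ff)\fontStyle(italic) red italic text}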

7. MT-TN-70 Color Information Signaling Specification

7.1. Introduction

7.1.1. Overview

This document describes the Marquise Technologies Color Information Signaling used across all the products.

7.1.2. Scope

This specification describes the controlled vocabulary used to signal color information such as color spaces, color primaries, transfer functions and more.

7.2. Color Space definitions

The following table defines the tags used for color space signaling.

Tag Symbol Reference

custom

Custom color space definition

ACES

ACES AP0 linear

ACES.cc

ACEScc

ACES.cct

ACEScct

ACES.cg

ACEScg

ACES.proxy

ACESproxy

ARRI.WideGamut3

ARRI Wide Gamut 3

ARRI.WideGamut4

ARRI Wide Gamut 4

CIEXYZ

CIE XYZ

DCDM.P3.DCI

DCDM P3 DCI

DCDM.P3.D60

DCDM P3 D60

DCDM.P3.D65

DCDM P3 D65

DCDM.P3.D65.ST2084

DCDM P3 D65 ST2084

DCDM.P7.DCI

DCDM P7 DCI

DCDM.P7.D60

DCDM P7 D60

ITU-R-BT.601.PAL

ITU-R-BT.601 PAL

ITU-R-BT.601.NTSC

ITU-R-BT.601 NTSC

ITU-R-BT.709

ITU-R BT.709

ITU-R-BT.709.D60

ITU-R BT.709 D60

ITU-R-BT.709.D75

ITU-R BT.709 D75

ITU-R-BT.709.D93

ITU-R BT.709 D93

ITU-R-BT.2020

ITU-R BT.2020

ITU-R-BT.2020.ST2084

ITU-R BT.2020 ST2084

ITU-R-BT.2020.ST2084@1000

ITU-R BT.2020 ST2084 1000 nits

ITU-R-BT.2020.ST2084@2000

ITU-R BT.2020 ST2084 2000 nits

ITU-R-BT.2020.ST2084@4000

ITU-R BT.2020 ST2084 4000 nits

ITU-R-BT.2020.HLG

ITU-R BT.2020 HLG

ITU-R-BT.2020.HLG@1000

ITU-R BT.2020 HLG 1000 nits

ITU-R-BT.2020.HLG@2000

ITU-R BT.2020 HLG 2000 nits

ITU-R-BT.2020.HLG@4000

ITU-R BT.2020 HLG 4000 nits

ITU-R-BT.2020.ST2115.CameraLogS3

ITU-R BT.2020 ST2115 Camera Log S3

ITU-R-BT.2020.ST2115.CameraLogV

ITU-R BT.2020 ST2115 Camera Log V

ITU-R-BT.2020.ST2115.CameraLogC2

ITU-R BT.2020 ST2115 Camera Log C2

ITU-R-BT.2020.ST2115.CameraLogC3

ITU-R BT.2020 ST2115 Camera Log C3

P3.DCI

P3 DCI

P3.D60

P3 D60

P3.D60.ST2084@1000

P3 D60 ST2084 1000 nits

P3.D60.ST2084@2000

P3 D60 ST2084 2000 nits

P3.D60.ST2084@4000

P3 D60 ST2084 4000 nits

P3.D65

P3 D65

P3.D65.ST2084

P3 D65 ST2084

P3.D65.ST2084@1000

P3 D65 ST2084 1000 nits

P3.D65.ST2084@2000

P3 D65 ST2084 2000 nits

P3.D65.ST2084@4000

P3 D65 ST2084 4000 nits

P7.DCI

P7 DCI

P7.D60

P7 D60

sRGB.D65

sRGB D65

sRGB.D50

sRGB D50

Sony.SGamut

Sony S-Gamut

Sony.SGamut3

Sony S-Gamut3

Sony.SGamut3.Cine

Sony S-Gamut3 Cine

xvYCC709

xvYCC 709

DCIMovies.HDR

DCI Movies HDR

Dolby.Cinema.DCI

Dolby Cinema DCI

Dolby.Cinema.EDR

Dolby Cinema EDR

EclairColor.EC1.EDR

EclairColor EC1 EDR

EclairColor.EC1.HDR

EclairColor EC1 HDR

Canon.CinemaGamut

Canon Cinema Gamut

Panasonic.Varicam

Panasonic Varicam

RED.WideGamut

RED Wide Gamut

BMD.FilmWideGamutGen5

BMD Film Wide Gamut Gen5

BMD.DaVinciWideGamut

BMD DaVinci Wide Gamut

ST2067.40

Cinema Mezzanine XYZ Linear

ST2067.40.DCDM

Cinema Mezzanine DCDM

Nikon

Nikon

7.3. Color Matrix / Coding Equations definitions

The following table defines the tags used for color matrix (aka coding equations) signaling.

Tag Symbol Reference

none

N/A

ITU-R.BT.601

ITU-R BT.601

ITU-R.BT.709

ITU-R BT.709

ITU-R.BT.2020

ITU-R BT.2020

SMPTE.240M

SMPTE 240M

YCgCo

YCgCo

GBR

GBR

ITU-R.BT.2100.ICtCp

ITU-R BT.2100 ICtCp

Dolby.IPTPQc2

Dolby IPTPQc2

8. MT-TN-80 Audio Information Signaling Specification

8.1. Introduction

8.1.1. Overview

This document describes the Marquise Technologies Audio Information Signaling used across all the products.

8.1.2. Scope

This specification describes the controlled vocabulary used to signal audio information such as audio channels, soundfields, audio metadata and more.

8.1.3. Soundfield Configuration Signaling

The following table defines the tags used for soundfield configuration signaling.

Name Symbol Definition

1.0 Monaural

M

C

Lt-Rt

LtRt

Lt, Rt

Standard Stereo

ST

L, R

Dual Mono

DM

M1, M2

Discrete Numbered Sources

DNS

NSC001, NSC002, …

3.0

30

L, C, R

4.0

40

L, C, R, S

5.0

50

L, C, R, Ls, Rs

5.1

51

L, C, R, Ls, Rs, LFE

5.1EX

51EX

L, C, R, Lst, Rst, LFE

6.0

60

L, C, R, Ls, Rs, Cs

7.0DS

70

L, C, R, Lss, Rss, Lrs, Rrs

7.1DS

71

L, C, R, Lss, Rss, Lrs, Rrs, LFE

7.1SDS

SDS

L, Lc, C, Rc, R, Ls, Rs, LFE

6.1

61

L, R, C, Lss, Rss, Cs, LFE

5.1.4

514

L, C, R, Ls, Rs, LFE, Ltfs, Rtfs, Ltrs, Rtrs

7.1.4

714

L, C, R, Lss, Rss, Lrs, Rrs, LFE, Ltfs, Rtfs, Ltrs, Rtrs

9.1.4

914

L, C, R, Lw, Rw, Lss, Rss, Lrs, Rrs, LFE, Ltfs, Rtfs, Ltrs, Rtrs

9.1.6

916

L, C, R, Lw, Rw, Lss, Rss, Lrs, Rrs, LFE, Ltfs, Rtfs, Ltms, Rtms, Ltrs, Rtrs

6.0 Height

60HT

L, R, Ls, Rs, Lh, Rh

8.0 Height

80HT

L, R, Ls, Rs, Lh, Rh, Lsh, Rsh

9.1 Overhead

91OH

L, C, R, Lss, Rss, Lrs, Rrs, LFE, Lts, Rts

9.1 Height

91HT

L, C, R, Ls, Rs, LFE, Lh, Rh, Lsh, Rsh

10.1 Height

101HT

L, C, R, Ls, Rs, LFE, Lh, Rh, Lsh, Rsh, Ts

11.1 Height

111HT

L, C, R, Ls, Rs, LFE, Lh, Ch, Rh, Lsh, Rsh, Ts

13.1 Height

131HT

L, C, R, Lss, Rss, Lrs, Rrs, LFE, Lh, Ch, Rh, Lsh, Rsh, Ts

15.1 Height

151HT

L, C, R, Lss, Rss, Lrs, Rrs, LFE, Lh, Ch, Rh, Lssh, Rssh, Lrsh, Rrsh, Ts

12.0 Center Height

120CH

L, C, R, Lss, Rss, Lrs, Rrs, Ltfs, Rtfs, Ltrs, Rtrs, Ch

12.1 Center Height

121CH

L, C, R, Lss, Rss, Lrs, Rrs, LFE, Ltfs, Rtfs, Ltrs, Rtrs, Ch

6.0 Center Height

60CH

L, C, R, Ls, Rs, Ch

6.1 Center Height

61CH

L, C, R, LFE, Ls, Rs, Ch

22.2 UHDTV

222

L, C, R, LFE1, LFE2, Lss, Rss, Lrs, Rrs, Cr, Lc, Rc, Lh, Rh, Ch, Lssh, Rssh, Crh, Lrsh, Rrsh, Ts, Lb, Rb, Cb

Audio Description Studio Signal

ADSS

VIN, ADSSdc as specified in BBC R&D White Paper WHP 198

IAB

IAB

Immersive Audio Bitstream

Hearing Accessibility

HA

HI

Visual Accessibility

VA

VIN

8.1.4. Metadata Signaling

Partition Kind

The following table defines the tags used for partition kind signaling.

Name Symbol Definition

Full Length

FL

The entire program with no partitioning.

Reel

REEL

A segment of the program that traditionally corresponded to a reel of film but today can be any segment of the program that is determined to be convenient for post-production purposes.

Act

ACT

A segment of the program corresponding to a defined part of the story.

Part

PART

A presentation segment of the program, such as of a program presented over more than one continuous showing, for example, a mini-series.

Content

The following table defines the tags used for content signaling.

Name Symbol Definition

Primary

PRM

Primary program content that combines dialog, music, effects and narration (if applicable).

Secondary Audio Program

SAP

Program content that combines a Primary (PRM) with a Voice Over (VO).

Hearing Impaired

HI

Audio content designed for hearing-impaired audiences, generally consisting of primary program content with its dialog content mixed considerably louder than its music and effects content, optionally employing audio compression or limiting.

Descriptive Video

DV

Audio content designed for visually-impaired audiences that combines primary program content with narration content describing selected action in the picture.

Dialog

DX

The composite of all spoken original language dialog content or the composite of all dubbed spoken dialog content for a given language translation.

Music

MX

The composite of all music content of the primary program, in its final balances, in a single Soundfield Group.

Effects

FX

The composite of all sound effects content, including backgrounds, Foley, hard effects and any other effects.

Filled Effects

FFX

The composite of all sound effects content, including backgrounds, Foley, hard effects and any other effects.

Music and Effects

ME

The composite of all music and effects content of the primary program, without dialog content but with additional material added to fill production effects in areas where dialog was removed.

Optional Music and Effects

OP

Optional music and effects content that can either be dubbed or used as is for localization.

Music and Effects with Optional

MESP

A Group of Soundfield Groups containing an ME Soundfield Group and one or more OP Soundfield Groups.

DME

DME

Dialog, Music, and Effects. A Group of Soundfield Groups containing an original language DX Soundfield Group, an MX Soundfield Group and an FX Soundfield Group.

NDME

NDME

Narration, Dialog, Music, and Effects

Program Narration

PNAR

Narration content inherent to the primary program, for example, in a documentary.

Optional Narration

ONAR

Optional Narration content that is not inherent to the primary program but relates to it.

Voice Over

VO

Localized dialog intended to be heard over the primary program, i.e. as an alternative to replacing the original dialog with translated dialog.

Visually Impaired

VI

Narration content designed for visually-impaired audiences that describes selected action in the picture.

Recorded Commentary

CM

Commentary on selected material in the picture that has been previously recorded to go along with the primary program.

Live Commentary

LCM

Live Commentary content, such as would be created by an event announcer, for example, a sports announcer.

Silence

MOS

Intentional silence, typically content with all digital zero values.

Custom

x-<private>

Any Audio Channel, Soundfield Group or Group of Soundfield Groups not covered by another category. The symbol begins with “x-” and is followed by a 1-4 character private tag that uniquely describes the content.

Content Subtype

The following table defines the tags used for content subtype signaling.

Name Symbol Definition

Director

DIR

Recorded Commentary with the Director as the primary speaker.

Technical

TECH

Recorded Commentary with a technical person as the primary speaker, such as one that designed the visual effects.

Writer

WRT

Recorded Commentary with the screenplay writer as the primary speaker.

Cast

CAST

Recorded Commentary with one or more of the cast members.

Announcer

ANN

Live Commentary with a person announcing the event.

Commentator

CTR

Live Commentary with a person or people who are commenting on an event or other piece of content.

Other

OTHER

Placeholder for other types of Live Commentary.

Use Class

The following table defines the tags used for use class signaling.

Name Symbol Definition

Finished Composite

FCMP

The associated Content is a composite, complete work and need not be mixed prior to presentation.

Intermediary Composite

ICMP

The associated Content is a composite but requires mixing with other elements prior to presentation.

Simplified

SMPL

The associated Content contains a simplified mix, typically mono, stereo or Lt-Rt, intended for special uses.

Singular

SING

The element contains a single content type (such as a narration voice) that can be output on its own or be mixed with other Soundfield Groups.

Spoken Language Attribute

The following table defines the tags used for spoken language attribute signaling.

Name Symbol Definition

Original

ORIGINAL

Indicates that the content contains the original spoken dialog.

Dubbed

DUBBED

Indicates that the content contains dubbed dialog which replaces some or all of the original spoken dialog with another language.

APPENDICES

1. Input File Formats Support

1.1. Compositions

Format Name File Extension(s)

Advanced Authoring Format

.aaf

D-Cinema Composition Playlist

.xml

EDL CMX 3600

.edl

Final Cut Pro

.xml

Final Cut Pro X

.fcpxml

IMF Composition Playlist

.xml

1.2. Camera Files

Camera Models Format Names

Apple

iPhone

h.264 QT MOV

ARRI

Alexa (B6W, LF, SXR, SXT, XR, XT, 65)
Alexa Mini LF
Amira

ARRIRAW (incl. 4:3), ARRIRAW MXF
ProRes MXF, ProRes QT MOV
ARRIRAW HDE (Codex)

BlackMagic

Cinema, Pocket Cinema, URSA

Blackmagic RAW
Cinema DNG
ProRes QT MOV

Canon

EOS 1D / 5D / 7D
C100 / C200 / C300
C500 / C700 / C700FF

h.264 QT MOV
h.264 MXF
Canon XF-AVC
Canon RAW CRM + RMF

GoPro

Any

h.264 MP4

Nikon

DSLR cameras

NEF DSLR RAW

Panasonic

Varicam 35
Eva-1

Panasonic VRW RAW
Panasonic P2 MXF
AVC-ULTRA
ProRes QT MOV
h.264 QT MOV

Phantom

4K Flex
HD Gold
Miro

.cine RAW

RED

Dragon
Epic, Epic Monochrome
Helium
Komodo 6K
Monstro
Scarlet
Weapon

REDCODE RAW (R3D)

SONY

F65, F55, F5
NEX-FS700
Venice/X-OCN
XDCam EX
DSLR cameras

SonyRAW
Sony Simple Studio Profile (SStP) MP4
Sony XDCam EX MXF
Sony XDCam EX MP4
Sony XAVC
ARW DSLR RAW

1.3. Image Sequences

Format Name File Extension(s) Comment

Alias Pix

.pix, .als

Amiga ILBM

.ilbm, .lbm, .iff

ARRI RAW

.ari

ARRI RAW HDE

.arx

Artisan

.art

(Media Logic Inc.)

Aurora

.sim, .im

BMP

.bmp

Canon RAW

.rmf

Cineon

.cin

Chyron

.chr

DNG

.dng, .krw

DPX

.dpx

See DPX Support

DEEP

.dep, .deep, .iff

(TV Paint)

ERIMovie

.eri

(Elastic Reality Inc.)

FilmLight FL32

.fl32

Filmstrip

.flm

JPEG

.jpg, .jpeg

JPEG2000

.j2k, .j2c

(J2K)

JPEG XS

.jxs

JPEG High Throughput

.jph

Maya IFF

.iff

MentalRay

.bit, .ct, .cth, .ct16, .ctfp, .map, .mt, .nt, .shmap, .st, .tt, .zt

OpenEXR

.exr

Panasonic VRAW

.vrw

PCX

.pcx

Photoshop PSD

.psd

Import composite image only

Pixar

.pxr

PNG

.png

Radiance

.hdr, .pic, .rgbe

Rendition

.6rn

RGB8 Impulse

.rgb8, .iff

RGBN Impulse

.rgbn, .iff

SGI

.sgi, .rgb

[Silicon Graphics]

Softimage

.pic

SunRaster

.sun, .ras

Targa

.tga, .targa

TIFF

.tif, .tiff

Wavefront

.rla, .rlb, .rpf

Weisscam RAW

.fhgWx (A, B, C, D, E, F, G)

YUV RAW

.yuv, .yuv10

YUVN MacroSystem

.yuvn, .iff

1.3.1. DPX support

8 bit UYVY

10 bit YUV 4:2:2 b.e. V2

12 bit RGB b.e.

8 bit YUVA 4:2:2:4

10 bit YUV 4:2:2 Cineon b.e.

12 bit RGB b.e. V2

8 bit YUV 4:2:2 b.e. V2

10 bit YUV 4:4:4 b.e. V2

12 bit RGB l.e. V2

8 bit YUV 4:2:2 l.e. V2

10 bit RGB Cineon b.e.

8 bit YUV 4:4:4 b.e. V2

10 bit RGBA

16 bit RGB Cineon b.e.

8 bit RGB b.e. V2

8 bit RGB l.e. V2

DPX Monochrome

8 bit RGBA

DPX Alpha

b.e. = big endian / l.e. = little endian

1.4. Video

Format Name File Extension(s)

Advanced Systems Format (Windows Media Video)

.asf, .wma, .wmv

AV1

.av1

AVI

.avi

Avid MXF

.mxf

Canon RAW light

.crm

Blackmagic RAW

.braw

DV

.dv

FlashVideo

.flv

GXF

.gxf

H.262 / MPEG-2

.m2v, .h262, .262

H.264 Advanced Video Coding

.avc, .h264, .264, .m4v

H.265 High Efficiency Video Coding

.hevc, .h265, .265

InteractiveFX Archive Movie

.arc

JPEG 2000

.j2kves

JPEG XS

.jxs

Matroska

.mkv, .mka

Motion JPEG 2000

.mj2, .mpj2

MPEG-2

.mpg, .mpeg, .mp2

MPEG-4 ISO BMFF

.mp4, .3gp, .3g2

MPEG-4 Visual

.m4v

MPEG-TS (Transport Stream)

.ts, .mts, .m2ts

MPEG-TS BDAV (Blu-ray Disc Audio-Video)

.m2ts

MXF OP-Atom

.mxf

MXF AS-02

.mxf

MXF AS-10

.mxf

MXF AS-11 DPP

.mxf

MXF AS-11 D10

.mxf

MXF AS-11 OP1a

.mxf

MXF ARD/ZDF/HDF

.mxf

MXF D10

.mxf

MXF RDD9

.mxf

MXF RDD32

.mxf

Motion JPEG2000

.mj2

Phantom RAW

.cine

QuickTime

.mov, .qt

R3D

.R3D

VOB

.vob

WebM

.webm

YUV4MPEG2

.y4m

1.5. Audio

Format Name File Extension(s)

AAC

.aac

AC-3

.ac3

Audio Interchange File Format (AIFF)

.aif, .aiff

Digital Theater Systems (DTS-X)

.dts

Dolby Atmos

.atmos

Free Lossless Audio Codec (FLAC)

.flac

MXF AS-02 Audio

.mxf

Opus

.opus, .ogg

Vorbis

.vorbis, .ogg

Waveform Audio File Format (WAVE)

.wav, .wave

ITU Broadcast Wave

.wav, .wave, .bwf, .bw64

EBU Broadcast Wave

.wav, .wave, .rf64

1.6. Elementary Streams

Format Name File Extension(s)

H.262 / MPEG-2

.h262

H.264 Advanced Video Coding

.avc, .h264, .264

H.265 (HEVC)

.h265

1.7. Subtitles & Captions

Format Name File Extension(s) Comments

Apple iTunes Timed Text

.itt

ARIB Timed Text

.xml, .ttml

Cheetah Closed Captions

.cap

CineCanvas

.xml

Common File Format Timed Text

.xml, .ttml

Digital Cinema Subtitle

.xml

DFXP

.xml, .dfxp

(Distribution Format Exchange Profile)

EBU STL

.stl

EBU Timed Text (EBU-TT-D)

.xml, .ttml

EEG 708 Captions

.xml

European Subtitle Exchange Format

.xml

Internet Media Subtitles and Captions (IMSC)

.xml, .imsc, .ttml

Support for v1.0 and v1.0.1. Animation not supported

Scenarist Closed Captions

.scc

Screen Electronics PAC

.pac

Sony BDN

.xml, .png, .tif

Spruce STL

.stl

SubRip

.srt

SubViewer

.sub

TTML

.xml, .ttml

[Timed Text Markup Language]

Videotron Lambda

.cap

SMPTETT

.xml

WebVTT

.vtt

1.8. IMF Applications

The following IMF packages can be imported:

  • Application 2, 2e (Studio Profile)

  • Application 3 (Sstp)

  • Application 4 (Cinema Mezzanine)

  • Application 5 (ACES)

  • IMF ProRes RDD45

  • RDD 59-1 IMF Application DPP (ProRes)

  • RDD 59-2 IMF Application DPP (JPEG2000)

Supported JPEG2000 profiles:

  • Broadcast profiles, up to BPC L7

  • IMF profiles, up to 16 bit

1.9. Metadata Files

Format Name File Extension(s) Comments

BWF ADM

.wav

BWF Audio Definition Model

Dolby Vision Metadata File

.xml

HDR10+ Metadata File

.json

MXF AS-02 ADM

.mxf

Audio Definition Model

MXF AS-02 S-ADM

.mxf

Serial ADM

MXF AS-02 IAB

.mxf

MXF AS-02 ISXD

.mxf

Dolby Vision

2. Output File Formats Support

2.1. Compositions

Format Name File Extension(s)

D-Cinema Composition Playlist

.xml

EDL CMX 3600

.edl

Final Cut Pro

.xml

Final Cut Pro X

.fcpxml

IMF Composition Playlist

.xml

Avid Log Exchange

.ale

Advanced Authoring Format

.aaf

2.2. Image Sequences

Format Name File Extension(s) Comment

Cineon

.cin

DPX

.dpx

Targa

.tga

TIFF

.tif, .tiff

ARRI

.ari

SGI

.sgi, .rgb

JPEG

.jpg, .jpeg

JPEG2000 (J2K)

.j2k, .j2c

JPEG XS

.jxs

JPEG High Throughput

.jph

BMP

.bmp

PNG

.png

Photoshop PSD

.psd

Composite uncompressed PSD only.

OpenEXR

.exr

ERIMovie

.eri

Chyron

.chr

Media Logic Artisan

.art

Rendition

.6rn

2.3. Video

Format Name File Extension(s)

Advanced Systems Format (Windows Media Video)

.wmv

AS-02

.mxf

AS-10

.mxf

AS-11 DPP

.mxf

AS-11 D10

.mxf

ARD/ZDF/HDF

.mxf

AVI

.avi

Avid MXF

.mxf

DV

.dv

GXF

.gxf

D10

.mxf

RDD9

.mxf

Matroska

.mkv

MPEG-2

.mpg, .mpeg

MPEG-4

.mp4

MPEG-TS

.mts

MPEG-TS BDAV

.m2ts

MXF OP-Atom

.mxf

MXF OP-1a

.mxf

QuickTime

.mov

WebM

.webm

2.4. Audio

Format Name File Extension(s)

Waveform Audio File Format (WAVE)

.wav

2.5. Elementary Streams

Format Name File Extension(s)

H.265 (HEVC)

.h265

2.6. Subtitles & Captions

Format Name File Extension(s) Comments

CineCanvas

.xml

Digital Cinema XML

.xml

EBU STL

.stl

Internet Media Subtitles and Captions (IMSC)

.xml, .ttml

Support for v1.0 and v1.0.1. Export of image profiles not supported. Animation not supported.

2.7. IMF Applications

The following IMF packages can be exported:

  • Application 2, 2e (Studio Profile)

  • Application 4 (Cinema Mezzanine)

  • Application 5 (ACES)

  • IMF ProRes RDD45

  • RDD 59-1 IMF Application DPP (ProRes)

  • RDD 59-2 IMF Application DPP (JPEG2000)

Supported JPEG2000 profiles:

  • Broadcast profiles, up to BPC L7

  • IMF profiles, up to 16 bit