TORNADO

Basic Integration Guide

1. Deploying TORNADO

1.1. Installing TORNADO

1.2. Starting TORNADO manually

2. Understanding TORNADO Jobs

2.1. Job Structure

2.1.1. Metadata Extraction Job Structure

2.1.2. Masterinfo Job Structure

2.1.3. Validation Job Structure

2.1.4. Transcoding Job Structure

3. Use Case: DCP to WebM proxy using TORNADO

3.1. DCP to WebM proxy workflow overview

3.2. Sending and monitoring a JOB

3.2.1. Sending and monitoring a Metadata Extraction Job

3.2.2. Sending and monitoring a Masterinfo Job

3.2.3. Sending and monitoring a Validation Report Job

3.2.4. Sending and monitoring a Transcoding Job


1. Deploying TORNADO

The first step in any workflow is to decide on and set up the proper topology for your TORNADO installation. To do so, you must first install TORNADO on one or more dedicated machines.

1.1. Installing TORNADO

TORNADO is distributed with its own dedicated installer. Follow the installer's steps to install it on your system.

Note that if you choose the option Install TORNADO as a system service at the Select Additional Tasks stage, TORNADO will be deployed in a manager/worker configuration with one webserver manager node piloting 4 worker nodes (more details on manager/worker configurations at...). The processes will be started as system services and can be managed via the Windows Services app.

1.2. Starting TORNADO manually

TORNADO can be piloted via the STORM API (see the STORM API chapter in the TORNADO Integration Guide manual for more details). This chapter will guide you through starting up a webserver manager node with 2 worker nodes. If you have already installed TORNADO as a system service, you can skip these steps.

Once started, a webserver node exposes a REST API through which commands are sent to TORNADO.

  1. Navigate to your TORNADO installation folder.
  2. From your CLI, start stormserver.exe. If you don't provide any command line options, the process will print a Help message.
  3. Start up a webserver manager node with 2 worker subprocesses:

stormserver websrv -workers 2

The command will start a webserver on port 8080. If the command succeeds, it will print the following message:

STORM v2021.11.23.0
(C) 2009-2021 Marquise Technologies
Release Date Nov 23 2021
Architecture 64 bits

STORM Log File
Version:            2021.11.23.0 (released on Nov 23 2021)
Started:            2021-11-23 @ 18:15:23

starting up server...
done!
listening on port 8080
waiting for requests...
starting workers...
2 workers started!

  4. Use CURL, Postman or any other HTTP-capable program or language to interrogate the webserver (see the STORM API Reference chapter in the TORNADO Integration Guide).

In a manager/worker configuration, all requests are sent to the manager node, which pilots the worker nodes by distributing jobs to them.

2. Understanding TORNADO Jobs

A task performed by TORNADO is called a job. You can think of jobs as simple objects that contain the details or properties of the task you want performed by TORNADO.

2.1. Job Structure

In the case of the REST API, a job is represented by a JSON Object sent to the /jobs API endpoint via a POST HTTP request.
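
As an illustration, here is a minimal sketch of submitting a job from Python using only the standard library. It assumes the /jobs endpoint and the "uuid" response field shown in the worked examples of chapter 3:

import json
import urllib.request

# A job is a plain JSON object; the exact fields depend on the context ID.
job = {
    "context": "mediainfo",
    "input": "<path_to_input_file>",
    "output": "<path_to_output_xml_metadata_file>",
    "format": "xml"
}

# POST the job to the /jobs endpoint of the webserver (manager) node.
request = urllib.request.Request(
    "http://localhost:8080/jobs",
    data=json.dumps(job).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST"
)
with urllib.request.urlopen(request) as response:
    created = json.load(response)

# TORNADO answers with the Job object augmented with a "uuid" field.
print(created["uuid"])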

2.1.1. Metadata Extraction Job Structure

The object prototype for a metadata (mediainfo context ID) extraction job has the following structure:

{
    "context" : "mediainfo",
    "input" : "<path_to_DCP_package_folder>/<package_file>",
    "output" : "<path_to_output_xml_metadata_file>",
    "format" : "xml"
}

In our specific DCP to WebM proxy workflow, this object template will be used to create metadata extraction jobs for a file that is part of the DCP package. It uses the mediainfo context ID and provides the path to the input file (for which the metadata must be extracted), the output path (where the metadata should be saved) and lastly the format of the metadata file.

2.1.2. Masterinfo Job Structure

The object prototype for a masterinfo (masterinfo context ID) extraction job has the following structure:

{
    "context" : "masterinfo",
    "input" : "<path_to_DCP_package_folder>",
    "output" : "<path_to_output_xml_metadata_file>",
    "format" : "xml"
}

In our specific DCP to WebM proxy workflow, this object template will be used to create masterinfo jobs for a DCP package. It uses the masterinfo context ID and provides the path to the input DCP folder (for which the masterinfo report must be extracted), the output path (where the report should be saved) and lastly the format of the report file.

NOTE: While the mediainfo job is used to extract metadata for a given DCP package file, the purpose of the masterinfo job is to generate a report of the DCP package. A DCP package consists of a collection of assets that can be referenced and grouped into various compositions. While each asset can have specific metadata accessible via the mediainfo context, the compositions are specific to the package and their properties are accessible via the mastering report.

2.1.3. Validation Job Structure

The object prototype for a DCP validation report (validate context ID) job has the following structure:

{
    "context" : "validate",
    "input" : "<path_to_DCP_package_folder>",
    "output" : "<path_to_output_report_file>",
    "master" : "DCP",
    "format" : "xml" | "pdf"
}

In our specific DCP to WebM proxy workflow, this object template will be used to create validation report jobs. It uses the validate context ID and provides the path to the input DCP folder (for which the validation report must be generated), the output path (where the report file should be saved), the format of the master and lastly the format of the report file (XML or PDF).

2.1.4. Transcoding Job Structure

The object prototype for a DCP to WebM proxy transcoding (xcode context ID) job has the following structure:

{
    "context" : "xcode",
    "input" : "<path_to_DCP_package_folder>",
    "output" : "<path_to_proxy_output_file>",
    "preset" : "<path_to_preset_xml_file>",
    "cms" : {
        "system" : "<system_tag>",
        "version" : "<version_tag>",
        "workflow" : "<workflow_tag>",
        "primaries" : "<primaries_tag>",
        "eotf" : "<eotf_tag>",
        "cat" : "bradford"
    }
}

In our specific DCP to WebM proxy workflow, this object template will be used to create transcoding jobs. It uses the xcode context ID and provides the path to the input DCP folder (from which the WebM proxy must be generated), the output path (where the proxy file should be saved), the XML preset file containing the transcoding parameters for the output proxy file, and lastly the cms object describing the color management settings (color system, primaries, EOTF and chromatic adaptation transform) to be applied.


3. Use Case: DCP to WebM proxy using TORNADO

This chapter will guide you through extracting metadata, validating, then transcoding a DCP package to a WebM proxy using TORNADO's REST API and transcoding presets.

3.1. DCP to WebM proxy workflow overview

Figure 1 shows an overview of a DCP to Proxy workflow using TORNADO’s REST API.

Figure 1

An external application first sends a metadata extraction (mediainfo context ID) and a validation report (validate context ID) job to the TORNADO manager node.

The manager node distributes these jobs for processing to its registered workers. When the workers finish their metadata extraction and validation jobs, the results, in this case files, are saved on disk.

The external application parses, then saves, the required metadata from the metadata XML file to an external DB.

The same can be done for the validation report. In this case the external application or a human operator can read the validation result and make a decision based on its status. If the validation PASSED, the DCP to Proxy transcoding job can be launched. If not, a different decision can be made.

If the validation stage is not required, the validation job can be omitted or its result ignored. A sketch of this decision point follows.
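
To illustrate the decision point, here is a sketch in Python. The wait_for_job and submit_job helpers stand for the submission and polling sketches shown in 2.1 and 3.2, and report_passed is a hypothetical parser for the validation report, whose schema is not covered here:

from typing import Callable, Optional

def gate_transcode_on_validation(
    validation_uuid: str,
    wait_for_job: Callable[[str], dict],    # e.g. the polling sketch in 3.2
    submit_job: Callable[[dict], dict],     # e.g. the submission sketch in 2.1
    report_passed: Callable[[str], bool],   # hypothetical validation report parser
    transcoding_job: dict
) -> Optional[str]:
    """Launch the transcoding job only if the validation job passed."""
    validation = wait_for_job(validation_uuid)
    if validation["status"] == "finished" and report_passed(validation["output"]):
        # Validation PASSED: launch the DCP to WebM proxy transcoding job.
        return submit_job(transcoding_job)["uuid"]
    # Validation errored or did not pass: defer to a human operator.
    return None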

3.2. Sending and monitoring a JOB

Figure 2 shows the process of sending a JOB to TORNADO's manager node, retrieving the returned UUID and then polling the job's status using that UUID.

Figure 2

Steps 5 and 7 must be repeated in order to get the updated JOB object.
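
Such a polling loop is straightforward to script. Here is a minimal sketch in Python (standard library only), assuming the /jobs/<job_UUID> endpoint and the uuid, status and progress fields shown in the responses below:

import json
import time
import urllib.request

def wait_for_job(uuid, base_url="http://localhost:8080", interval=2.0):
    """Poll /jobs/<uuid> until the job reaches a terminal status."""
    while True:
        with urllib.request.urlopen(f"{base_url}/jobs/{uuid}") as response:
            job = json.load(response)["job"]
        print(f"status={job['status']} progress={job.get('progress', 0)}")
        if job["status"] in ("finished", "error", "aborted"):
            return job
        time.sleep(interval)  # avoid flooding the manager node with requests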

3.2.1. Sending and monitoring a Metadata Extraction Job

The following complete example demonstrates how to create and send a Metadata Extraction Job (mediainfo context ID) for the video asset from a DCP package to TORNADO using the REST API.

This example uses CURL to send the HTTP requests to TORNADO and assumes that you have already followed the installation and deployment steps. In this and all following examples TORNADO runs at http://localhost:8080.

  1. Create the Job request JSON Object with the following data (see 2.1.1. Metadata Extraction Job Structure) and save it to a metadata_extraction.json file.

{
    "context" : "mediainfo",
    "input" : "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV/264671-2914640_SWLastJedi_TLR-C-3D_S_FR-XX_01.mxf",
    "output" : "E:/FILE_IO_TST/metadata/metadata_SWLastJedi.xml",
    "format" : "xml"
}

  2. Use CURL to send the metadata_extraction.json file as a POST request to the /jobs REST API endpoint (see Add a new job in the TORNADO Integration Guide). This request will create a new task for TORNADO with the Job parameters.

$ curl -X POST -T "E:/FILE_IO_TST/CURL/metadata/metadata_extraction.json" "http://localhost:8080/jobs"

  3. TORNADO will respond with the Job object, augmented with a unique identifier field as well as the status and the request timestamp. Get and store the value of the "uuid" field from the received response. It will be used to further manipulate the Job.

{
    "uuid": "ec5e72b1-150d-4822-a43f-5c27b2668142",
    "priority": 0,
    "status": "queued",
    "requesttimestamp": "2021-11-24T11:48:19+00:00",
    "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV/264671-2914640_SWLastJedi_TLR-C-3D_S_FR-XX_01.mxf",
    "output": "E:/FILE_IO_TST/metadata/metadata_SWLastJedi.xml",
    "format": "xml",
    "context": "mediainfo"
}

  4. Use CURL to send a GET request to the /jobs/<job_UUID> REST API endpoint (see Get job description in the TORNADO Integration Guide). This request will retrieve the updated job after queuing.

$ curl -X GET "http://localhost:8080/jobs/ec5e72b1-150d-4822-a43f-5c27b2668142"

  5. The response will contain the updated Job object along with the status and progress fields. Use the values of these fields to check its status (finished, error, aborted) and its progress (from 0 to 100). For long running jobs you can create a polling mechanism that repeats steps 4 and 5 in order to display the progress.

{
    "job": {
        "uuid": "ec5e72b1-150d-4822-a43f-5c27b2668142",
        "priority": 0,
        "status": "finished",
        "progress": 100,
        "step": 1,
        "stepcount": 1,
        "requesttimestamp": "2021-11-24T11:42:40+00:00",
        "begintimestamp": "2021-11-24T11:42:40+00:00",
        "endtimestamp": "2021-11-24T11:42:40+00:00",
        "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV/264671-2914640_SWLastJedi_TLR-C-3D_S_FR-XX_01.mxf",
        "output": "E:/FILE_IO_TST/metadata/metadata_SWLastJedi.xml",
        "format": "xml",
        "context": "mediainfo"
    },
    "status": "ok"
}

3.2.2. Sending and monitoring a Masterinfo Job

The following complete example demonstrates how to create and send a Masterinfo Job (masterinfo context ID) for the DCP package to TORNADO using the REST API.

This example uses CURL to send the HTTP requests to TORNADO and assumes that you have already followed the installation and deployment steps. In this and all following examples TORNADO runs at http://localhost:8080.

  1. Create the Job request JSON Object with the following data (see 2.1.2. Masterinfo Job Structure) and save it to a masterinfo_report.json file.

{
    "context" : "masterinfo",
    "master" : "DCP",
    "input" : "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output" : "E:/FILE_IO_TST/metadata/masterinfo_SWLastJedi.xml",
    "format" : "xml"
}

  2. Use CURL to send the masterinfo_report.json file as a POST request to the /jobs REST API endpoint (see Add a new job in the TORNADO Integration Guide). This request will create a new task for TORNADO with the Job parameters.

$ curl -X POST -T "E:/FILE_IO_TST/CURL/metadata/masterinfo_report.json" "http://localhost:8080/jobs"

  3. TORNADO will respond with the Job object, augmented with a unique identifier field as well as the status and the request timestamp. Get and store the value of the "uuid" field from the received response. It will be used to further manipulate the Job.

{
    "uuid": "ec5e72b1-150d-4822-a43f-5c27b2668142",
    "priority": 0,
    "status": "queued",
    "requesttimestamp": "2021-11-24T11:48:19+00:00",
    "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output": "E:/FILE_IO_TST/metadata/masterinfo_SWLastJedi.xml",
    "master": "DCP",
    "format": "xml",
    "context": "masterinfo"
}

  4. Use CURL to send a GET request to the /jobs/<job_UUID> REST API endpoint (see Get job description in the TORNADO Integration Guide). This request will retrieve the updated job after queuing.

$ curl -X GET "http://localhost:8080/jobs/ec5e72b1-150d-4822-a43f-5c27b2668142"

  5. The response will contain the updated Job object along with the status and progress fields. Use the values of these fields to check its status (finished, error, aborted) and its progress (from 0 to 100). For long running jobs you can create a polling mechanism that repeats steps 4 and 5 in order to display the progress.

{
    "job": {
        "uuid": "ec5e72b1-150d-4822-a43f-5c27b2668142",
        "priority": 0,
        "status": "finished",
        "progress": 100,
        "step": 1,
        "stepcount": 1,
        "requesttimestamp": "2021-11-24T11:42:40+00:00",
        "begintimestamp": "2021-11-24T11:42:40+00:00",
        "endtimestamp": "2021-11-24T11:42:40+00:00",
        "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
        "output": "E:/FILE_IO_TST/metadata/masterinfo_SWLastJedi.xml",
        "master": "DCP",
        "format": "xml",
        "context": "masterinfo"
    },
    "status": "ok"
}

3.2.3. Sending and monitoring a Validation Report Job

The following complete example demonstrates how to create and send a Validation Report Job (validate context ID) for a DCP package to TORNADO using the REST API.

This example uses CURL to send the HTTP requests to TORNADO and assumes that you have already followed the installation and deployment steps. In this and all following examples TORNADO runs at http://localhost:8080.

  1. Create the Job request JSON Object with the following data (see 2.1.3. Validation Job Structure) and save it to a validation_report.json file.

{
    "context" : "validate",
    "input" : "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output" : "E:/FILE_IO_TST/validation/validation_SWLastJedi.xml",
    "master" : "DCP",
    "format" : "xml"
}

  2. Use CURL to send the validation_report.json file as a POST request to the /jobs REST API endpoint (see Add a new job in the TORNADO Integration Guide). This request will create a new task for TORNADO with the Job parameters.

$ curl -X POST -T "E:/FILE_IO_TST/CURL/validation/validation_report.json" "http://localhost:8080/jobs"

  3. TORNADO will respond with the Job object, augmented with a unique identifier field as well as the status and the request timestamp. Get and store the value of the "uuid" field from the received response. It will be used to further manipulate the Job.

{
    "uuid": "64e05405-1f7a-481e-9b3d-81cdc6ace7f3",
    "priority": 0,
    "status": "queued",
    "requesttimestamp": "2021-11-24T11:49:31+00:00",
    "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output": "E:/FILE_IO_TST/validation/validation_SWLastJedi.xml",
    "format": "xml",
    "context": "validate"
}

  4. Use CURL to send a GET request to the /jobs/<job_UUID> REST API endpoint (see Get job description in the TORNADO Integration Guide). This request will retrieve the updated job after queuing.

$ curl -X GET "http://localhost:8080/jobs/64e05405-1f7a-481e-9b3d-81cdc6ace7f3"

  5. The response will contain the updated Job object along with the status and progress fields. Use the values of these fields to check its status (finished, error, aborted) and its progress (from 0 to 100). For long running jobs you can create a polling mechanism that repeats steps 4 and 5 in order to display the progress.

{
    "job": {
        "uuid": "64e05405-1f7a-481e-9b3d-81cdc6ace7f3",
        "priority": 0,
        "status": "finished",
        "progress": 100,
        "step": 54,
        "stepcount": 54,
        "requesttimestamp": "2021-11-24T11:54:11+00:00",
        "begintimestamp": "2021-11-24T11:54:11+00:00",
        "endtimestamp": "2021-11-24T11:54:52+00:00",
        "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
        "output": "E:/FILE_IO_TST/validation/validation_SWLastJedi.xml",
        "format": "xml",
        "context": "validate"
    },
    "status": "ok"
}

3.2.4. Sending and monitoring a Transcoding Job

The following complete example demonstrates how to create and send a Transcoding Job (xcode context ID) that generates a WebM proxy for a DCP package to TORNADO using the REST API.

This example uses CURL to send the HTTP requests to TORNADO and assumes that you have already followed the installation and deployment steps. In this and all following examples TORNADO runs at http://localhost:8080.

  1. Create the Job request JSON Object with the following data (see 2.1.4. Transcoding Job Structure) and save it to a transcoding_webm.json file.

{
    "context" : "xcode",
    "input" : "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output" : "E:/FILE_IO_TST/proxy",
    "preset" : "E:/FILE_IO_TST/presets/preset_proxy_webm_SWLastJedi.xml",
    "cms" : {
        "system" : "MTCMS",
        "version" : "none",
        "workflow" : "custom",
        "primaries" : "ITU.BT.709",
        "eotf" : "ITU-R.BT.709",
        "cat" : "bradford"
    }
}

  2. Use CURL to send the transcoding_webm.json file as a POST request to the /jobs REST API endpoint (see Add a new job in the TORNADO Integration Guide). This request will create a new task for TORNADO with the Job parameters.

$ curl -X POST -T "E:/FILE_IO_TST/CURL/validation/transcoding_webm.json" "http://localhost:8080/jobs"

  3. TORNADO will respond with the Job object, augmented with a unique identifier field as well as the status and the request timestamp. Get and store the value of the "uuid" field from the received response. It will be used to further manipulate the Job.

{
    "uuid": "4668b4db-f60d-4114-acaa-8f57fc81c667",
    "priority": 0,
    "status": "queued",
    "requesttimestamp": "2021-11-24T14:39:43+00:00",
    "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
    "output": "E:/FILE_IO_TST/proxy",
    "preset": "E:/FILE_IO_TST/presets/preset_proxy_webm_SWLastJedi.xml",
    "cms": {
        "system": "MTCMS",
        "version": "none",
        "workflow": "custom",
        "primaries": "ITU.BT.709",
        "eotf": "ITU-R.BT.709",
        "cat": "bradford"
    },
    "context": "xcode"
}

  4. Use CURL to send a GET request to the /jobs/<job_UUID> REST API endpoint (see Get job description in the TORNADO Integration Guide). This request will retrieve the updated job after queuing.

$ curl -X GET "http://localhost:8080/jobs/4668b4db-f60d-4114-acaa-8f57fc81c667"

  5. The response will contain the updated Job object along with the status and progress fields. Use the values of these fields to check its status (finished, error, aborted) and its progress (from 0 to 100). For long running jobs you can create a polling mechanism that repeats steps 4 and 5 in order to display the progress.

{
    "job": {
        "uuid": "4668b4db-f60d-4114-acaa-8f57fc81c667",
        "priority": 0,
        "status": "finished",
        "progress": 100,
        "step": 4664,
        "stepcount": 4664,
        "requesttimestamp": "2021-11-24T14:55:11+00:00",
        "begintimestamp": "2021-11-24T14:39:46+00:00",
        "endtimestamp": "2021-11-24T14:54:52+00:00",
        "input": "D:/TEST_FOOTAGE/DCP/SWLastJedi_TLR-C-3D_S_FR-XX_CH_51_2K_DI_20171018_DGL_IOP-3D_OV",
        "output": "E:/FILE_IO_TST/proxy",
        "master": "WebM",
        "preset": "E:/FILE_IO_TST/presets/preset_proxy_webm_SWLastJedi.xml",
        "cms": {
            "system": "MTCMS",
            "version": "none",
            "workflow": "custom",
            "primaries": "ITU.BT.709",
            "eotf": "ITU-R.BT.709",
            "cat": "bradford"
        },
        "context": "xcode"
    },
    "status": "ok"
}