diff --git a/docs/docs/TiesDb-Guide.md b/docs/docs/TiesDb-Guide.md
index d93dd523b050..ccc6fe138d4e 100644
--- a/docs/docs/TiesDb-Guide.md
+++ b/docs/docs/TiesDb-Guide.md
@@ -116,10 +116,10 @@ different S3 bucket, a new output object will be created. Otherwise, the previou
object will be used.
It is possible for there to be multiple matching supplementals in TiesDb. In that case,
-Workflow Manager will first pick the supplementals with the best job status. The job statuses
-from best to worst are `COMPLETE`, `COMPLETE_WITH_WARNINGS`, and `COMPLETE_WITH_ERRORS`. If no jobs
-with those statuses exist, all other statuses are considered equally bad. If there are multiple
-supplementals with the same status, the most recently created supplemental will be used.
+Workflow Manager will first pick the supplementals with the best job status, preferring a
+job status of `COMPLETE` over `COMPLETE_WITH_WARNINGS`. If no jobs with those statuses exist,
+all other statuses are considered equally bad. If there are multiple supplementals with the same
+status, the most recently created supplemental will be used.
In order to determine if a previous job was similar enough to a current job, a hash of the
important parts of the jobs is computed. The parts of the job that are included in the hash are:
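Reviewer's note on the selection rule changed in the hunk above: the behavior can be summarized as "prefer `COMPLETE`, then `COMPLETE_WITH_WARNINGS`, treat everything else as equally bad, and break ties by most recent creation date." Below is a minimal Python sketch of that rule. It is not Workflow Manager code; the function and data shapes are hypothetical, purely to illustrate the ordering.

```python
# Hypothetical sketch (not Workflow Manager's actual code) of the selection
# rule described above: prefer COMPLETE, then COMPLETE_WITH_WARNINGS, treat
# every other status as equally bad, and break ties by creation time.
from datetime import datetime

# Lower rank is better; statuses not listed all share the worst rank.
_STATUS_RANK = {"COMPLETE": 0, "COMPLETE_WITH_WARNINGS": 1}

def pick_supplemental(supplementals):
    """Pick the best supplemental from (job_status, created) pairs."""
    def key(supp):
        status, created = supp
        # Rank by status first; among equal ranks, prefer the most
        # recently created supplemental (larger timestamp -> smaller key).
        return (_STATUS_RANK.get(status, 2), -created.timestamp())
    return min(supplementals, key=key)
```

With this sketch, a `COMPLETE` supplemental beats a newer `COMPLETE_WITH_WARNINGS` one, and two supplementals with unlisted statuses are decided purely by recency, matching the prose above.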
diff --git a/docs/docs/html/REST-API.html b/docs/docs/html/REST-API.html
index 7e6e1a783b2f..774e546e6cab 100644
--- a/docs/docs/html/REST-API.html
+++ b/docs/docs/html/REST-API.html
@@ -1041,8 +1041,8 @@
SingleJobInfo (Response only)
A string that will be one of the following values: UNKNOWN,
INITIALIZED (the job has been initialized but not started),
- IN_PROGRESS, IN_PROGRESS_ERRORS, IN_PROGRESS_WARNINGS
- COMPLETE, COMPLETE_WITH_ERRORS, COMPLETE_WITH_WARNINGS,
+ IN_PROGRESS, IN_PROGRESS_ERRORS, IN_PROGRESS_WARNINGS,
+ COMPLETE, COMPLETE_WITH_WARNINGS,
CANCELLING, CANCELLED, CANCELLED_BY_SHUTDOWN, ERROR
|
@@ -1070,7 +1070,7 @@ SingleJobInfo (Response only)
terminal |
boolean |
- True if jobStatus is COMPLETE, COMPLETE_WITH_ERRORS,
+ True if jobStatus is COMPLETE,
COMPLETE_WITH_WARNINGS, CANCELLED, CANCELLED_BY_SHUTDOWN, or ERROR;
otherwise false
|
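Reviewer's note on the `terminal` hunk above: with `COMPLETE_WITH_ERRORS` removed as a valid status, the terminal set shrinks accordingly. A small sketch of the resulting predicate follows; the helper name and structure are my own, not part of the REST API.

```python
# Hypothetical helper (names are mine, not from the REST API) mirroring the
# updated "terminal" definition above: COMPLETE_WITH_ERRORS is no longer a
# valid jobStatus, so it is no longer listed among the terminal statuses.
TERMINAL_STATUSES = frozenset({
    "COMPLETE",
    "COMPLETE_WITH_WARNINGS",
    "CANCELLED",
    "CANCELLED_BY_SHUTDOWN",
    "ERROR",
})

def is_terminal(job_status: str) -> bool:
    """Return True when the job has reached a final state."""
    return job_status in TERMINAL_STATUSES
```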
diff --git a/docs/site/TiesDb-Guide/index.html b/docs/site/TiesDb-Guide/index.html
index 9a184b1954c7..a793aaab4bfc 100644
--- a/docs/site/TiesDb-Guide/index.html
+++ b/docs/site/TiesDb-Guide/index.html
@@ -395,10 +395,10 @@ Before Job Check
different S3 bucket, a new output object will be created. Otherwise, the previous job's output
object will be used.
It is possible for there to be multiple matching supplementals in TiesDb. In that case,
-Workflow Manager will first pick the supplementals with the best job status. The job statuses
-from best to worst are COMPLETE, COMPLETE_WITH_WARNINGS, and COMPLETE_WITH_ERRORS. If no jobs
-with those statuses exist, all other statuses are considered equally bad. If there are multiple
-supplementals with the same status, the most recently created supplemental will be used.
+Workflow Manager will first pick the supplementals with the best job status, preferring a
+job status of COMPLETE over COMPLETE_WITH_WARNINGS. If no jobs with those statuses exist,
+all other statuses are considered equally bad. If there are multiple supplementals with the same
+status, the most recently created supplemental will be used.
In order to determine if a previous job was similar enough to a current job, a hash of the
important parts of the jobs is computed. The parts of the job that are included in the hash are:
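Reviewer's note on the `jobConfigHash` matching mentioned in the hunk above: the guide says the hash covers the pipeline's algorithm names and order, non-ignorable job properties, and the SHA-256 hashes of all input media, among other fields. Below is a hedged sketch of how such a digest could be computed; it illustrates only the idea of a canonical-form hash and deliberately omits fields the real hash includes (frame and time ranges, output object version, the changed counters).

```python
# Hypothetical sketch of a jobConfigHash-style digest. The real Workflow
# Manager hash covers more fields than shown here; this only illustrates
# hashing the job parts that affect output, in a canonical order.
import hashlib
import json

def job_config_hash(algorithms, job_properties, media_sha256s):
    """Digest a job's output-affecting configuration into one SHA-256 hex."""
    canonical = json.dumps(
        {
            "algorithms": list(algorithms),  # pipeline order matters
            "properties": dict(sorted(job_properties.items())),
            "media": sorted(media_sha256s),  # media set, order-independent
        },
        sort_keys=True,
        separators=(",", ":"),
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

The point of the canonical form is that two jobs with the same algorithms, non-ignorable properties, and media produce byte-identical JSON and therefore the same digest, while any output-affecting change yields a different one.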
diff --git a/docs/site/html/REST-API.html b/docs/site/html/REST-API.html
index 7e6e1a783b2f..774e546e6cab 100644
--- a/docs/site/html/REST-API.html
+++ b/docs/site/html/REST-API.html
@@ -1041,8 +1041,8 @@ SingleJobInfo (Response only)
A string that will be one of the following values: UNKNOWN,
INITIALIZED (the job has been initialized but not started),
- IN_PROGRESS, IN_PROGRESS_ERRORS, IN_PROGRESS_WARNINGS
- COMPLETE, COMPLETE_WITH_ERRORS, COMPLETE_WITH_WARNINGS,
+ IN_PROGRESS, IN_PROGRESS_ERRORS, IN_PROGRESS_WARNINGS,
+ COMPLETE, COMPLETE_WITH_WARNINGS,
CANCELLING, CANCELLED, CANCELLED_BY_SHUTDOWN, ERROR
|
@@ -1070,7 +1070,7 @@ SingleJobInfo (Response only)
terminal |
boolean |
- True if jobStatus is COMPLETE, COMPLETE_WITH_ERRORS,
+ True if jobStatus is COMPLETE,
COMPLETE_WITH_WARNINGS, CANCELLED, CANCELLED_BY_SHUTDOWN, or ERROR;
otherwise false
|
diff --git a/docs/site/index.html b/docs/site/index.html
index 20a3e606be45..3b4e6b613738 100644
--- a/docs/site/index.html
+++ b/docs/site/index.html
@@ -408,5 +408,5 @@ Overview
diff --git a/docs/site/search/search_index.json b/docs/site/search/search_index.json
index 4286e287abd9..a72bd8cb0d4d 100644
--- a/docs/site/search/search_index.json
+++ b/docs/site/search/search_index.json
@@ -502,7 +502,7 @@
},
{
"location": "/TiesDb-Guide/index.html",
- "text": "NOTICE:\n This software (or technical data) was produced for the U.S. Government under contract,\nand is subject to the Rights in Data-General Clause 52.227-14, Alt. IV (DEC 2007). Copyright 2024\nThe MITRE Corporation. All Rights Reserved.\n\n\nTiesDb Overview\n\n\nRefer to \nhttps://github.com/Noblis/ties-lib\n for more information on the Triage Import Export\nSchema (TIES). For each piece of media, we create one or more TIES\n\"supplementalDescription (Data Object)\" entries in the database, one for each\nanalytic (algorithm) run on the media. In general, a \"supplementalDescription\" is a kind of TIES\n\"assertion\", which is used to represent metadata about the media object. In our case it\nrepresents the detection and track information in the OpenMPF JSON output object. Workflow Manager\ncan be configured to check TiesDb for a\n\nsupplemental it previously created\n and\n\nuse the results from that previous job to avoid re-running the job.\n\nWorkflow Manager can also be configured to \ncopy results to a different S3 bucket\n\nwhen a matching job is found.\n\n\nConfiguration\n\n\n\n\nTIES_DB_URL\n job property or \nties.db.url\n system property\n\n\nWhen provided, information about completed jobs will be sent to the specified TiesDb server.\n\n\nFor example: \nhttps://tiesdb.example.com\n\n\n\n\n\n\nSKIP_TIES_DB_CHECK\n job property or \nties.db.skip.check\n system property\n\n\nWhen true, TiesDb won't be checked for a compatible job before processing media.\n\n\n\n\n\n\nTIES_DB_S3_COPY_ENABLED\n job property or \nties.db.s3.copy.enabled\n system property\n\n\nWhen true and a job is skipped because a compatible job is found in TiesDb, the results\n from the previous job will be copied to a different S3 bucket. 
Copying results will always\n result in a new JSON output object, even if using the same S3 location as the previous job.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_ACCESS_KEY\n job property or \nties.db.copy.src.s3.access.key\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 access key\n that will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_ACCESS_KEY\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_SECRET_KEY\n job property or \nties.db.copy.src.s3.secret.key\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 secret key\n that will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_SECRET_KEY\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_SESSION_TOKEN\n job property or \nties.db.copy.src.s3.session.token\n system\n property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 session\n token that will be used when getting the results from S3. If not provided, defaults to the\n value of \nS3_SESSION_TOKEN\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_REGION\n job property or \nties.db.copy.src.s3.region\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 region that\n will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_REGION\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_USE_VIRTUAL_HOST\n job property or \nties.db.copy.src.s3.use.virtual.host\n\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this enables virtual host\n style bucket URIs. If not provided, defaults to the value of \nS3_USE_VIRTUAL_HOST\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_HOST\n job property or \nties.db.copy.src.s3.host\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 host that\n will be used when getting the results from S3. 
If not provided, defaults to the value of\n \nS3_HOST\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_UPLOAD_OBJECT_KEY_PREFIX\n job property or\n \nties.db.copy.src.s3.upload.object.key.prefix\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 object key\n prefix that will be used when getting the results from S3. If not provided, defaults to the\n value of \nS3_UPLOAD_OBJECT_KEY_PREFIX\n.\n\n\n\n\n\n\ndata.ties.db.check.ignorable.properties.file\n system property\n\n\nPath to the\n \nJSON file containing the list of properties that should not be considered\n\n when checking for a compatible job in TiesDb.\n\n\n\n\n\n\nLINKED_MEDIA_HASH\n media property\n\n\nWhen this property is set, interactions with TiesDb will use the value of \nLINKED_MEDIA_HASH\n\n instead of the media's actual SHA-256 hash.\n\n\n\n\n\n\n\n\nAfter Job Supplemental Creation\n\n\nWhen a URL is provided for the \nTIES_DB_URL\n job property or \nties.db.url\n system property,\nWorkflow Manager will post a supplemental to TiesDb at the end of the job. The full URL that\nWorkflow Manager will post to is created by taking the provided URL and appending\n\n/api/db/supplementals?sha256Hash=\n to it. 
If, for example, the provided TiesDb URL\nis \nhttps://tiesdb.example.com/path\n and the SHA-256 hash of the media is\n\nd1bc8d3ba4afc7e109612cb73acbdd\n, Workflow Manager will post to \n\n\nhttps://tiesdb.example.com/path/api/db/supplementals?sha256Hash=d1bc8d3ba4afc7e109612cb73acbdd\n\n\nThis is an example of what Workflow Manager will post to TiesDb:\n\n\n{\n \"assertionId\": \"87298cc2f31fba73181ea2a9e6ef10dce21ed95e98bdac9c4e1504ea16f486e4\",\n \"dataObject\": {\n \"algorithm\": \"MOG\",\n \"pipeline\": \"MOG MOTION PIPELINE\",\n \"jobId\": \"mpf.example.com-13\",\n \"jobStatus\": \"COMPLETE\",\n \"outputType\": \"MOTION\",\n \"outputUri\": \"https://s3.example.com/2c/f2/2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"processDate\": \"2021-10-08T15:24:04.168448Z\",\n \"sha256OutputHash\": \"2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"systemHostname\": \"mpf.example.com\",\n \"systemVersion\": \"6.0.2\",\n \"trackCount\": 100,\n \"jobConfigHash\": \"d52ad13e6e2db75e780b858e92b89df18c021674c24fd6c84dd151dcd28f5c56\"\n },\n \"informationType\": \"OpenMPF_MOTION\",\n \"securityTag\": \"UNCLASSIFIED\",\n \"system\": \"OpenMPF\"\n}\n\n\n\nBefore Job Check\n\n\nWorkflow Manager can be configured to check TiesDb for a supplemental produced as a result of\na previously run OpenMPF job. In order for Workflow Manager to do this check, the \nTIES_DB_URL\n\njob property or the \nties.db.url\n system property must be set to the TiesDb server's URL and\nthe \nSKIP_TIES_DB_CHECK\n job property or the \nties.db.skip.check\n system property must be set to\nfalse. If Workflow Manager finds a supplemental with a \njobConfigHash\n that matches the job that\nWorkflow Manager is currently running, Workflow Manager will not process the media in the current\njob. Workflow Manager will use the output of the previous job to determine the output for the\ncurrent job. 
When Workflow Manager is configured to copy the results of the previous job to a\ndifferent S3 bucket, a new output object will be created. Otherwise, the previous job's output\nobject will be used.\n\n\nIt is possible for there to be multiple matching supplementals in TiesDb. In that case,\nWorkflow Manager will first pick the supplementals with the best job status. The job statuses\nfrom best to worst are \nCOMPLETE\n, \nCOMPLETE_WITH_WARNINGS\n, and \nCOMPLETE_WITH_ERRORS\n. If no jobs\nwith those statuses exist, all other statuses are considered equally bad. If there are multiple\nsupplementals with the same status, the most recently created supplemental will be used.\n\n\nIn order to determine if a previous job was similar enough to a current job, a hash of the\nimportant parts of the jobs is computed. The parts of the job that are included in the hash are:\n\n\n\n\nNames of the algorithms used in the pipeline and their order\n\n\nNon-\nignorable job properties\n\n\noutput.changed.counter\n system property\n\n\nThis is an integer that is incremented when there is a change to the Workflow Manager after\n which previous TiesDb records should not be used. Because this number is part of the job\n configuration hash, changing this number invalidates all previous TiesDb records.\n\n\n\n\n\n\nComponent descriptor's \noutputChangedCounter\n property.\n\n\nThis works the same way as \noutput.changed.counter\n, except that it only invalidates TiesDb\n records for jobs that used the component.\n\n\n\n\n\n\nFrame ranges and time ranges\n\n\nJSON output object major and minor version\n\n\nSHA-256 hashes of all input media\n\n\nAs a result of this, in order to find a matching job, both jobs must have been run on all\n of the same media. 
To improve the chances that a matching job is found in TiesDb, a user\n can choose to only submit jobs for a single piece of media.\n\n\n\n\n\n\n\n\nIgnorable Properties\n\n\nThere are certain job properties, that when changed, do not change the output. There are also job\nproperties that only affect certain types of media. To make it more likely that a matching job will\nbe found in TiesDb, Workflow Manager can be configured to ignore the previously mentioned job\nproperties when computing the job configuration hash.\n\n\nThe properties that should be ignored are specified in a JSON file. The\n\ndata.ties.db.check.ignorable.properties.file\n system property contains the path to the JSON file.\nThe JSON file must contain a list of objects with two properties: \nignorableForMediaTypes\n and\n\nnames\n. \nignorableForMediaTypes\n is a list of strings specifying which media types are able\nto ignore the properties listed in the \nnames\n list.\n\n\nIn the example below, the \nVERBOSE\n job property is never included in the job hash because all\nmedia types are present in the \nignorableForMediaTypes\n list. \nARTIFACT_EXTRACTION_POLICY\n\nis ignored when the media is audio or unknown. \nFRAME_INTERVAL\n appears in both the second\nand third object, so it is ignorable when the media is audio, unknown, or image.\n\n\n[\n {\n \"ignorableForMediaTypes\": [\"VIDEO\", \"IMAGE\", \"AUDIO\", \"UNKNOWN\"],\n \"names\": [\"VERBOSE\"]\n },\n {\n \"ignorableForMediaTypes\": [\"AUDIO\", \"UNKNOWN\"],\n \"names\": [\"ARTIFACT_EXTRACTION_POLICY\", \"FRAME_INTERVAL\"]\n },\n {\n \"ignorableForMediaTypes\": [\"IMAGE\"],\n \"names\": [\"FRAME_INTERVAL\"]\n }\n]\n\n\n\nAvoid Downloading Media\n\n\nThe SHA-256 hash of the job's media is also included when computing the job configuration hash.\nIf the job request contains the media's hash and MIME type, Workflow Manager can avoid downloading\nthe media if a match is found in TiesDB. 
If the media's hash and MIME type are not included in the\njob request Workflow Manager will use the normal media inspection process to get that information.\nIf the media is not a local path, this will require Workflow Manager to download the media.\n\n\nBelow is an example of a job creation request that includes the media's hash and MIME type:\n\n\n{\n \"algorithmProperties\": {},\n \"buildOutput\": true,\n \"jobProperties\": {\n \"S3_ACCESS_KEY\": \"xxxxxx\",\n \"S3_SECRET_KEY\": \"xxxxxx\",\n \"TIES_DB_URL\": \"https://tiesdb.example.com\",\n \"SKIP_TIES_DB_CHECK\": \"false\"\n },\n \"media\": [\n {\n \"mediaUri\": \"https://s3.example.com/bucket/my-video.mp4\",\n \"metadata\": {\n \"MEDIA_HASH\": \"2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"MIME_TYPE\": \"video/mp4\"\n }\n }\n ],\n \"pipelineName\": \"OCV FACE DETECTION PIPELINE\",\n \"priority\": 4\n}\n\n\n\nS3 Copy\n\n\nWhen the \nTIES_DB_S3_COPY_ENABLED\n job property or \nties.db.s3.copy.enabled\n system property is\ntrue and a matching job is found in TiesDb, Workflow Manager will copy the artifacts, markup,\nand derivative media to the bucket specified in the current job's \nS3_RESULTS_BUCKET\n job property\nor \ns3.results.bucket\n system property. Since the job's artifacts, markup, and derivative media\nare in a new location, the output object must be updated before it is uploaded to the new S3 bucket.\nIn the updated output object, the \ntiesDbSourceJobId\n property will be set to the previous job's ID\nand \ntiesDbSourceMediaPath\n will be set to the path of the previous job's media. When the S3 copy\nis enabled and the results bucket is the same as the previous job, a new output object is created,\nbut copies of the artifacts, markup, and derivative media are not created. If the S3 copy is\ndisabled, \ntiesDbSourceJobId\n and \ntiesDbSourceMediaPath\n are not added because the original job's\noutput object is used without changes. 
If the copy fails, a link to the old JSON output object will\nbe provided.\n\n\nWhen performing the S3 copy, the \nS3 job properties\n like\n\nS3_ACCESS_KEY\n and \nS3_SECRET_KEY\n use the values from the current job and apply to the\ndestination of the copy. If the values for the S3 properties should be different for the source of\nthe copy, the properties prefixed with \nTIES_DB_COPY_SRC_\n can be set. If for a given property the\n\nTIES_DB_COPY_SRC_\n prefixed version is not set, the non-prefixed version will be used.\n\n\nFor example, if a job is received with the following properties set:\n\n\n\n\nS3_SECRET_KEY\n=\nnew-secret-key\n\n\nS3_ACCESS_KEY\n=\naccess-key\n\n\nTIES_DB_COPY_SRC_S3_SECRET_KEY\n=\nold-secret-key\n\n\n\n\nthen, when accessing the previous job's results \naccess-key\n will be used for the access key and\n\nold-secret-key\n will be used for the secret key. When uploading the results to the new bucket,\n\naccess-key\n will be used for the access key and \nnew-secret-key\n will be used for the secret key.",
+ "text": "NOTICE:\n This software (or technical data) was produced for the U.S. Government under contract,\nand is subject to the Rights in Data-General Clause 52.227-14, Alt. IV (DEC 2007). Copyright 2024\nThe MITRE Corporation. All Rights Reserved.\n\n\nTiesDb Overview\n\n\nRefer to \nhttps://github.com/Noblis/ties-lib\n for more information on the Triage Import Export\nSchema (TIES). For each piece of media, we create one or more TIES\n\"supplementalDescription (Data Object)\" entries in the database, one for each\nanalytic (algorithm) run on the media. In general, a \"supplementalDescription\" is a kind of TIES\n\"assertion\", which is used to represent metadata about the media object. In our case it\nrepresents the detection and track information in the OpenMPF JSON output object. Workflow Manager\ncan be configured to check TiesDb for a\n\nsupplemental it previously created\n and\n\nuse the results from that previous job to avoid re-running the job.\n\nWorkflow Manager can also be configured to \ncopy results to a different S3 bucket\n\nwhen a matching job is found.\n\n\nConfiguration\n\n\n\n\nTIES_DB_URL\n job property or \nties.db.url\n system property\n\n\nWhen provided, information about completed jobs will be sent to the specified TiesDb server.\n\n\nFor example: \nhttps://tiesdb.example.com\n\n\n\n\n\n\nSKIP_TIES_DB_CHECK\n job property or \nties.db.skip.check\n system property\n\n\nWhen true, TiesDb won't be checked for a compatible job before processing media.\n\n\n\n\n\n\nTIES_DB_S3_COPY_ENABLED\n job property or \nties.db.s3.copy.enabled\n system property\n\n\nWhen true and a job is skipped because a compatible job is found in TiesDb, the results\n from the previous job will be copied to a different S3 bucket. 
Copying results will always\n result in a new JSON output object, even if using the same S3 location as the previous job.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_ACCESS_KEY\n job property or \nties.db.copy.src.s3.access.key\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 access key\n that will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_ACCESS_KEY\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_SECRET_KEY\n job property or \nties.db.copy.src.s3.secret.key\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 secret key\n that will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_SECRET_KEY\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_SESSION_TOKEN\n job property or \nties.db.copy.src.s3.session.token\n system\n property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 session\n token that will be used when getting the results from S3. If not provided, defaults to the\n value of \nS3_SESSION_TOKEN\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_REGION\n job property or \nties.db.copy.src.s3.region\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 region that\n will be used when getting the results from S3. If not provided, defaults to the value of\n \nS3_REGION\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_USE_VIRTUAL_HOST\n job property or \nties.db.copy.src.s3.use.virtual.host\n\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this enables virtual host\n style bucket URIs. If not provided, defaults to the value of \nS3_USE_VIRTUAL_HOST\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_HOST\n job property or \nties.db.copy.src.s3.host\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 host that\n will be used when getting the results from S3. 
If not provided, defaults to the value of\n \nS3_HOST\n.\n\n\n\n\n\n\nTIES_DB_COPY_SRC_S3_UPLOAD_OBJECT_KEY_PREFIX\n job property or\n \nties.db.copy.src.s3.upload.object.key.prefix\n system property\n\n\nIf a job is skipped because a compatible job was found in TiesDb, this is the S3 object key\n prefix that will be used when getting the results from S3. If not provided, defaults to the\n value of \nS3_UPLOAD_OBJECT_KEY_PREFIX\n.\n\n\n\n\n\n\ndata.ties.db.check.ignorable.properties.file\n system property\n\n\nPath to the\n \nJSON file containing the list of properties that should not be considered\n\n when checking for a compatible job in TiesDb.\n\n\n\n\n\n\nLINKED_MEDIA_HASH\n media property\n\n\nWhen this property is set, interactions with TiesDb will use the value of \nLINKED_MEDIA_HASH\n\n instead of the media's actual SHA-256 hash.\n\n\n\n\n\n\n\n\nAfter Job Supplemental Creation\n\n\nWhen a URL is provided for the \nTIES_DB_URL\n job property or \nties.db.url\n system property,\nWorkflow Manager will post a supplemental to TiesDb at the end of the job. The full URL that\nWorkflow Manager will post to is created by taking the provided URL and appending\n\n/api/db/supplementals?sha256Hash=\n to it. 
If, for example, the provided TiesDb URL\nis \nhttps://tiesdb.example.com/path\n and the SHA-256 hash of the media is\n\nd1bc8d3ba4afc7e109612cb73acbdd\n, Workflow Manager will post to \n\n\nhttps://tiesdb.example.com/path/api/db/supplementals?sha256Hash=d1bc8d3ba4afc7e109612cb73acbdd\n\n\nThis is an example of what Workflow Manager will post to TiesDb:\n\n\n{\n \"assertionId\": \"87298cc2f31fba73181ea2a9e6ef10dce21ed95e98bdac9c4e1504ea16f486e4\",\n \"dataObject\": {\n \"algorithm\": \"MOG\",\n \"pipeline\": \"MOG MOTION PIPELINE\",\n \"jobId\": \"mpf.example.com-13\",\n \"jobStatus\": \"COMPLETE\",\n \"outputType\": \"MOTION\",\n \"outputUri\": \"https://s3.example.com/2c/f2/2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"processDate\": \"2021-10-08T15:24:04.168448Z\",\n \"sha256OutputHash\": \"2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"systemHostname\": \"mpf.example.com\",\n \"systemVersion\": \"6.0.2\",\n \"trackCount\": 100,\n \"jobConfigHash\": \"d52ad13e6e2db75e780b858e92b89df18c021674c24fd6c84dd151dcd28f5c56\"\n },\n \"informationType\": \"OpenMPF_MOTION\",\n \"securityTag\": \"UNCLASSIFIED\",\n \"system\": \"OpenMPF\"\n}\n\n\n\nBefore Job Check\n\n\nWorkflow Manager can be configured to check TiesDb for a supplemental produced as a result of\na previously run OpenMPF job. In order for Workflow Manager to do this check, the \nTIES_DB_URL\n\njob property or the \nties.db.url\n system property must be set to the TiesDb server's URL and\nthe \nSKIP_TIES_DB_CHECK\n job property or the \nties.db.skip.check\n system property must be set to\nfalse. If Workflow Manager finds a supplemental with a \njobConfigHash\n that matches the job that\nWorkflow Manager is currently running, Workflow Manager will not process the media in the current\njob. Workflow Manager will use the output of the previous job to determine the output for the\ncurrent job. 
When Workflow Manager is configured to copy the results of the previous job to a\ndifferent S3 bucket, a new output object will be created. Otherwise, the previous job's output\nobject will be used.\n\n\nIt is possible for there to be multiple matching supplementals in TiesDb. In that case,\nWorkflow Manager will first pick the supplementals with the best job status. It first looks for a\njob status of \nCOMPLETE\n, and then \nCOMPLETE_WITH_WARNINGS\n. If no jobs with those statuses exist,\nall other statuses are considered equally bad. If there are multiple supplementals with the same\nstatus, the most recently created supplemental will be used.\n\n\nIn order to determine if a previous job was similar enough to a current job, a hash of the\nimportant parts of the jobs is computed. The parts of the job that are included in the hash are:\n\n\n\n\nNames of the algorithms used in the pipeline and their order\n\n\nNon-\nignorable job properties\n\n\noutput.changed.counter\n system property\n\n\nThis is an integer that is incremented when there is a change to the Workflow Manager after\n which previous TiesDb records should not be used. Because this number is part of the job\n configuration hash, changing this number invalidates all previous TiesDb records.\n\n\n\n\n\n\nComponent descriptor's \noutputChangedCounter\n property.\n\n\nThis works the same way as \noutput.changed.counter\n, except that it only invalidates TiesDb\n records for jobs that used the component.\n\n\n\n\n\n\nFrame ranges and time ranges\n\n\nJSON output object major and minor version\n\n\nSHA-256 hashes of all input media\n\n\nAs a result of this, in order to find a matching job, both jobs must have been run on all\n of the same media. To improve the chances that a matching job is found in TiesDb, a user\n can choose to only submit jobs for a single piece of media.\n\n\n\n\n\n\n\n\nIgnorable Properties\n\n\nThere are certain job properties, that when changed, do not change the output. 
There are also job\nproperties that only affect certain types of media. To make it more likely that a matching job will\nbe found in TiesDb, Workflow Manager can be configured to ignore the previously mentioned job\nproperties when computing the job configuration hash.\n\n\nThe properties that should be ignored are specified in a JSON file. The\n\ndata.ties.db.check.ignorable.properties.file\n system property contains the path to the JSON file.\nThe JSON file must contain a list of objects with two properties: \nignorableForMediaTypes\n and\n\nnames\n. \nignorableForMediaTypes\n is a list of strings specifying which media types are able\nto ignore the properties listed in the \nnames\n list.\n\n\nIn the example below, the \nVERBOSE\n job property is never included in the job hash because all\nmedia types are present in the \nignorableForMediaTypes\n list. \nARTIFACT_EXTRACTION_POLICY\n\nis ignored when the media is audio or unknown. \nFRAME_INTERVAL\n appears in both the second\nand third object, so it is ignorable when the media is audio, unknown, or image.\n\n\n[\n {\n \"ignorableForMediaTypes\": [\"VIDEO\", \"IMAGE\", \"AUDIO\", \"UNKNOWN\"],\n \"names\": [\"VERBOSE\"]\n },\n {\n \"ignorableForMediaTypes\": [\"AUDIO\", \"UNKNOWN\"],\n \"names\": [\"ARTIFACT_EXTRACTION_POLICY\", \"FRAME_INTERVAL\"]\n },\n {\n \"ignorableForMediaTypes\": [\"IMAGE\"],\n \"names\": [\"FRAME_INTERVAL\"]\n }\n]\n\n\n\nAvoid Downloading Media\n\n\nThe SHA-256 hash of the job's media is also included when computing the job configuration hash.\nIf the job request contains the media's hash and MIME type, Workflow Manager can avoid downloading\nthe media if a match is found in TiesDB. 
If the media's hash and MIME type are not included in the\njob request Workflow Manager will use the normal media inspection process to get that information.\nIf the media is not a local path, this will require Workflow Manager to download the media.\n\n\nBelow is an example of a job creation request that includes the media's hash and MIME type:\n\n\n{\n \"algorithmProperties\": {},\n \"buildOutput\": true,\n \"jobProperties\": {\n \"S3_ACCESS_KEY\": \"xxxxxx\",\n \"S3_SECRET_KEY\": \"xxxxxx\",\n \"TIES_DB_URL\": \"https://tiesdb.example.com\",\n \"SKIP_TIES_DB_CHECK\": \"false\"\n },\n \"media\": [\n {\n \"mediaUri\": \"https://s3.example.com/bucket/my-video.mp4\",\n \"metadata\": {\n \"MEDIA_HASH\": \"2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\",\n \"MIME_TYPE\": \"video/mp4\"\n }\n }\n ],\n \"pipelineName\": \"OCV FACE DETECTION PIPELINE\",\n \"priority\": 4\n}\n\n\n\nS3 Copy\n\n\nWhen the \nTIES_DB_S3_COPY_ENABLED\n job property or \nties.db.s3.copy.enabled\n system property is\ntrue and a matching job is found in TiesDb, Workflow Manager will copy the artifacts, markup,\nand derivative media to the bucket specified in the current job's \nS3_RESULTS_BUCKET\n job property\nor \ns3.results.bucket\n system property. Since the job's artifacts, markup, and derivative media\nare in a new location, the output object must be updated before it is uploaded to the new S3 bucket.\nIn the updated output object, the \ntiesDbSourceJobId\n property will be set to the previous job's ID\nand \ntiesDbSourceMediaPath\n will be set to the path of the previous job's media. When the S3 copy\nis enabled and the results bucket is the same as the previous job, a new output object is created,\nbut copies of the artifacts, markup, and derivative media are not created. If the S3 copy is\ndisabled, \ntiesDbSourceJobId\n and \ntiesDbSourceMediaPath\n are not added because the original job's\noutput object is used without changes. 
If the copy fails, a link to the old JSON output object will\nbe provided.\n\n\nWhen performing the S3 copy, the \nS3 job properties\n like\n\nS3_ACCESS_KEY\n and \nS3_SECRET_KEY\n use the values from the current job and apply to the\ndestination of the copy. If the values for the S3 properties should be different for the source of\nthe copy, the properties prefixed with \nTIES_DB_COPY_SRC_\n can be set. If for a given property the\n\nTIES_DB_COPY_SRC_\n prefixed version is not set, the non-prefixed version will be used.\n\n\nFor example, if a job is received with the following properties set:\n\n\n\n\nS3_SECRET_KEY\n=\nnew-secret-key\n\n\nS3_ACCESS_KEY\n=\naccess-key\n\n\nTIES_DB_COPY_SRC_S3_SECRET_KEY\n=\nold-secret-key\n\n\n\n\nthen, when accessing the previous job's results \naccess-key\n will be used for the access key and\n\nold-secret-key\n will be used for the secret key. When uploading the results to the new bucket,\n\naccess-key\n will be used for the access key and \nnew-secret-key\n will be used for the secret key.",
"title": "TiesDb Guide"
},
{
@@ -522,7 +522,7 @@
},
{
"location": "/TiesDb-Guide/index.html#before-job-check",
- "text": "Workflow Manager can be configured to check TiesDb for a supplemental produced as a result of\na previously run OpenMPF job. In order for Workflow Manager to do this check, the TIES_DB_URL \njob property or the ties.db.url system property must be set to the TiesDb server's URL and\nthe SKIP_TIES_DB_CHECK job property or the ties.db.skip.check system property must be set to\nfalse. If Workflow Manager finds a supplemental with a jobConfigHash that matches the job that\nWorkflow Manager is currently running, Workflow Manager will not process the media in the current\njob. Workflow Manager will use the output of the previous job to determine the output for the\ncurrent job. When Workflow Manager is configured to copy the results of the previous job to a\ndifferent S3 bucket, a new output object will be created. Otherwise, the previous job's output\nobject will be used. It is possible for there to be multiple matching supplementals in TiesDb. In that case,\nWorkflow Manager will first pick the supplementals with the best job status. The job statuses\nfrom best to worst are COMPLETE , COMPLETE_WITH_WARNINGS , and COMPLETE_WITH_ERRORS . If no jobs\nwith those statuses exist, all other statuses are considered equally bad. If there are multiple\nsupplementals with the same status, the most recently created supplemental will be used. In order to determine if a previous job was similar enough to a current job, a hash of the\nimportant parts of the jobs is computed. The parts of the job that are included in the hash are: Names of the algorithms used in the pipeline and their order Non- ignorable job properties output.changed.counter system property This is an integer that is incremented when there is a change to the Workflow Manager after\n which previous TiesDb records should not be used. Because this number is part of the job\n configuration hash, changing this number invalidates all previous TiesDb records. Component descriptor's outputChangedCounter property. This works the same way as output.changed.counter , except that it only invalidates TiesDb\n records for jobs that used the component. Frame ranges and time ranges JSON output object major and minor version SHA-256 hashes of all input media As a result of this, in order to find a matching job, both jobs must have been run on all\n of the same media. To improve the chances that a matching job is found in TiesDb, a user\n can choose to only submit jobs for a single piece of media.",
+ "text": "Workflow Manager can be configured to check TiesDb for a supplemental produced as a result of\na previously run OpenMPF job. In order for Workflow Manager to do this check, the TIES_DB_URL \njob property or the ties.db.url system property must be set to the TiesDb server's URL and\nthe SKIP_TIES_DB_CHECK job property or the ties.db.skip.check system property must be set to\nfalse. If Workflow Manager finds a supplemental with a jobConfigHash that matches the job that\nWorkflow Manager is currently running, Workflow Manager will not process the media in the current\njob. Workflow Manager will use the output of the previous job to determine the output for the\ncurrent job. When Workflow Manager is configured to copy the results of the previous job to a\ndifferent S3 bucket, a new output object will be created. Otherwise, the previous job's output\nobject will be used. It is possible for there to be multiple matching supplementals in TiesDb. In that case,\nWorkflow Manager will first pick the supplementals with the best job status. It first looks for a\njob status of COMPLETE , and then COMPLETE_WITH_WARNINGS . If no jobs with those statuses exist,\nall other statuses are considered equally bad. If there are multiple supplementals with the same\nstatus, the most recently created supplemental will be used. In order to determine if a previous job was similar enough to a current job, a hash of the\nimportant parts of the jobs is computed. The parts of the job that are included in the hash are: Names of the algorithms used in the pipeline and their order Non- ignorable job properties output.changed.counter system property This is an integer that is incremented when there is a change to the Workflow Manager after\n which previous TiesDb records should not be used. Because this number is part of the job\n configuration hash, changing this number invalidates all previous TiesDb records. Component descriptor's outputChangedCounter property. This works the same way as output.changed.counter , except that it only invalidates TiesDb\n records for jobs that used the component. Frame ranges and time ranges JSON output object major and minor version SHA-256 hashes of all input media As a result of this, in order to find a matching job, both jobs must have been run on all\n of the same media. To improve the chances that a matching job is found in TiesDb, a user\n can choose to only submit jobs for a single piece of media.",
"title": "Before Job Check"
},
{
diff --git a/docs/site/sitemap.xml b/docs/site/sitemap.xml
index 080567a2740b..554a7d5d7064 100644
--- a/docs/site/sitemap.xml
+++ b/docs/site/sitemap.xml
@@ -2,162 +2,162 @@
/index.html
- 2025-03-18
+ 2025-06-02
daily
/Release-Notes/index.html
- 2025-03-18
+ 2025-06-02
daily
/License-And-Distribution/index.html
- 2025-03-18
+ 2025-06-02
daily
/Acknowledgements/index.html
- 2025-03-18
+ 2025-06-02
daily
/Install-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Admin-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/User-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/OpenID-Connect-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Media-Segmentation-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Feed-Forward-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Derivative-Media-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Object-Storage-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Markup-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/TiesDb-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Trigger-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Roll-Up-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Health-Check-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Artifact-Extraction-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Quality-Selection-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Media-Selectors-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/REST-API/index.html
- 2025-03-18
+ 2025-06-02
daily
/Component-API-Overview/index.html
- 2025-03-18
+ 2025-06-02
daily
/Component-Descriptor-Reference/index.html
- 2025-03-18
+ 2025-06-02
daily
/CPP-Batch-Component-API/index.html
- 2025-03-18
+ 2025-06-02
daily
/Python-Batch-Component-API/index.html
- 2025-03-18
+ 2025-06-02
daily
/Java-Batch-Component-API/index.html
- 2025-03-18
+ 2025-06-02
daily
/GPU-Support-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Contributor-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Development-Environment-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Node-Guide/index.html
- 2025-03-18
+ 2025-06-02
daily
/Workflow-Manager-Architecture/index.html
- 2025-03-18
+ 2025-06-02
daily
/CPP-Streaming-Component-API/index.html
- 2025-03-18
+ 2025-06-02
daily
\ No newline at end of file