Description
Right now submitting a task expects a payload like:
{
  "type": "index",
  "spec": {
    "dataSchema": { ... },
    "tuningConfig": { ... },
    "ioConfig": { ... }
  }
}
(see https://druid.apache.org/docs/latest/tutorials/tutorial-batch.html#loading-data-with-a-spec-via-console)
While a supervisor has a payload of
{
  "type": "kafka",
  "dataSchema": { ... },
  "tuningConfig": { ... },
  "ioConfig": { ... }
}
(see https://druid.apache.org/docs/latest/tutorials/tutorial-kafka.html)
This is a hassle for the console to handle and also confusing to users (https://stackoverflow.com/questions/58097806/cant-create-druid-ingestion-task-through-api/58131519#58131519).
I suggest normalizing the task API so that it can also accept
{
  "type": "index",
  "dataSchema": { ... },
  "tuningConfig": { ... },
  "ioConfig": { ... }
}
as the payload. This change should be backwards compatible: implementation-wise, the flat payload could simply be rewritten to the spec form internally.
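The internal rewrite could be sketched roughly as follows (a hypothetical helper, not Druid's actual implementation; the field names are taken from the payloads above):

```python
# Fields that belong inside the "spec" object of an index task.
SPEC_FIELDS = {"dataSchema", "tuningConfig", "ioConfig"}

def normalize_task_payload(payload: dict) -> dict:
    """Rewrite a flat task payload into the spec-wrapped form.

    Payloads that already contain a "spec" key are passed through
    unchanged, which keeps the change backwards compatible.
    """
    if "spec" in payload:
        return payload
    # Move the spec-level fields under a "spec" key, keeping the
    # remaining top-level fields (e.g. "type") where they are.
    spec = {k: v for k, v in payload.items() if k in SPEC_FIELDS}
    rest = {k: v for k, v in payload.items() if k not in SPEC_FIELDS}
    return {**rest, "spec": spec}
```

With this in place, both the flat supervisor-style payload and the existing spec-wrapped payload would normalize to the same internal representation.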