22 changes: 17 additions & 5 deletions integration-tests/README.md
@@ -109,7 +109,7 @@ docker-compose -f docker-compose.druid-hadoop.yml up

1. Build druid-cluster, druid-hadoop docker images. From root module run maven command:
```
mvn clean install -pl integration-tests -P integration-tests -Ddocker.run.skip=true -Dmaven.test.skip=true
mvn clean install -pl integration-tests -P integration-tests -Ddocker.run.skip=true -Dmaven.test.skip=true -Ddocker.build.hadoop=true
```

2. Run druid cluster by docker-compose:
@@ -147,6 +147,8 @@ You need to build druid containers only once, after you can skip docker build step
up the docker containers (Druid, Kafka, Hadoop, MySQL, ZooKeeper, etc.). Please make sure that you actually do have
these containers already running if using this flag. Additionally, please make sure that the running containers
are in the same state that the setup script (run_cluster.sh) would have brought them up in.
- -Ddocker.build.hadoop=true to build the Hadoop image, either when running integration tests or when building the integration test docker images without running the tests.
- -Dstart.hadoop.docker=true to start the Hadoop container when you need to run integration tests that use the local Hadoop docker container.

### Debugging Druid while running tests

@@ -283,7 +285,7 @@ of the integration test run discussed above. This is because druid
test clusters might not, in general, have access to Hadoop.
This also applies to integration tests that use Hadoop HDFS as an inputSource or as deep storage.
To run integration tests that use Hadoop, you will have to run a Hadoop cluster. This can be done in two ways:
1) Run Druid Docker test clusters with Hadoop container by passing -Dstart.hadoop.docker=true to the mvn command.
1) Run Druid Docker test clusters with a Hadoop container by passing -Dstart.hadoop.docker=true to the mvn command. If you have not already built the Hadoop image, you will also need to add -Ddocker.build.hadoop=true to the mvn command.
2) Run your own Druid + Hadoop cluster and specify Hadoop configs in the configuration file (CONFIG_FILE).

Currently, hdfs-deep-storage and other <cloud>-deep-storage integration test groups can only be run with
@@ -302,12 +304,22 @@ If using the Docker-based Hadoop container, the steps above are automatically done

When running the Hadoop tests, you must set `-Dextra.datasource.name.suffix=''`, due to https://github.com/apache/druid/issues/9788.

Run the test using mvn (using the bundled Docker-based Hadoop cluster):
Option 1: Run the test using mvn (using the bundled Docker-based Hadoop cluster and building docker images at runtime):
```
mvn verify -P integration-tests -Dit.test=ITHadoopIndexTest -Dstart.hadoop.docker=true -Doverride.config.path=docker/environment-configs/override-examples/hdfs -Dextra.datasource.name.suffix=''
mvn verify -P integration-tests -Dit.test=ITHadoopIndexTest -Dstart.hadoop.docker=true -Ddocker.build.hadoop=true -Doverride.config.path=docker/environment-configs/override-examples/hdfs -Dextra.datasource.name.suffix=''
```

Run the test using mvn (using config file for existing Hadoop cluster):
Option 2: Run the test using mvn (using the bundled Docker-based Hadoop cluster and not building images at runtime):
```
mvn verify -P integration-tests -Dit.test=ITHadoopIndexTest -Dstart.hadoop.docker=true -Ddocker.build.skip=true -Doverride.config.path=docker/environment-configs/override-examples/hdfs -Dextra.datasource.name.suffix=''
```

Option 3: Run the test using mvn (using the bundled Docker-based Hadoop cluster, when you have already started all containers):
```
mvn verify -P integration-tests -Dit.test=ITHadoopIndexTest -Ddocker.run.skip=true -Ddocker.build.skip=true -Doverride.config.path=docker/environment-configs/override-examples/hdfs -Dextra.datasource.name.suffix=''
```

Option 4: Run the test using mvn (using config file for existing Hadoop cluster):
```
mvn verify -P int-tests-config-file -Dit.test=ITHadoopIndexTest -Dextra.datasource.name.suffix=''
```
2 changes: 2 additions & 0 deletions integration-tests/pom.xml
Original file line number Diff line number Diff line change
@@ -370,6 +370,7 @@
<start.hadoop.docker>false</start.hadoop.docker>
<docker.run.skip>false</docker.run.skip>
<docker.build.skip>false</docker.build.skip>
<docker.build.hadoop>false</docker.build.hadoop>
<it.indexer>middleManager</it.indexer>
<override.config.path />
<resource.file.dir.path />
@@ -391,6 +392,7 @@
<phase>pre-integration-test</phase>
<configuration>
<environmentVariables>
<DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER>${docker.build.hadoop}</DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER>
<DRUID_INTEGRATION_TEST_START_HADOOP_DOCKER>${start.hadoop.docker}</DRUID_INTEGRATION_TEST_START_HADOOP_DOCKER>
<DRUID_INTEGRATION_TEST_JVM_RUNTIME>${jvm.runtime}</DRUID_INTEGRATION_TEST_JVM_RUNTIME>
<DRUID_INTEGRATION_TEST_GROUP>${groups}</DRUID_INTEGRATION_TEST_GROUP>
2 changes: 1 addition & 1 deletion integration-tests/script/docker_build_containers.sh
Original file line number Diff line number Diff line change
@@ -42,7 +42,7 @@ else
fi

# Build Hadoop docker if needed
if [ -n "$DRUID_INTEGRATION_TEST_START_HADOOP_DOCKER" ] && [ "$DRUID_INTEGRATION_TEST_START_HADOOP_DOCKER" == true ]
if [ -n "$DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER" ] && [ "$DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER" == true ]
then
docker build -t druid-it/hadoop:2.8.5 $HADOOP_DOCKER_DIR
fi
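
The guard above builds the Hadoop image only when the environment variable is set and equal to the literal string `true`; the `DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER` variable is populated from the `docker.build.hadoop` Maven property shown in the pom.xml change. A minimal sketch of that check in isolation (the `should_build_hadoop` function name is illustrative, not part of the script; the portable `=` comparison is used here in place of the script's bash-specific `==`):

```shell
# Mirrors the build guard in docker_build_containers.sh: the Hadoop image
# is built only when the flag variable is non-empty and equals "true".
should_build_hadoop() {
  [ -n "$DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER" ] \
    && [ "$DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER" = true ]
}

DRUID_INTEGRATION_TEST_BUILD_HADOOP_DOCKER=true
if should_build_hadoop; then
  # The real script runs: docker build -t druid-it/hadoop:2.8.5 $HADOOP_DOCKER_DIR
  echo "would build druid-it/hadoop:2.8.5"
fi
```

Note that passing `-Ddocker.build.hadoop=true` on the mvn command line flips this guard on, while the pom.xml default of `false` leaves the Hadoop image build skipped.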
@@ -18,6 +18,7 @@
"type": "STRING",
"size": 0,
"hasMultipleValues": false,
"hasNulls": false,
"minValue": "location_1",
"maxValue": "location_5",
"cardinality": 5,
@@ -27,6 +28,7 @@
"type": "thetaSketch",
"size": 0,
"hasMultipleValues": false,
"hasNulls": true,
"minValue": null,
"maxValue": null,
"cardinality": null,
@@ -36,6 +38,7 @@
"type": "thetaSketch",
"size": 0,
"hasMultipleValues": false,
"hasNulls": true,
"minValue": null,
"maxValue": null,
"cardinality": null,
@@ -45,6 +48,7 @@
"type": "LONG",
"size": 0,
"hasMultipleValues": false,
"hasNulls": false,
"minValue": null,
"maxValue": null,
"cardinality": null,
@@ -54,6 +58,7 @@
"type": "STRING",
"size": 0,
"hasMultipleValues": false,
"hasNulls": false,
"minValue": "product_1",
"maxValue": "product_9",
"cardinality": 15,