Merged
Commits
53 commits
b0a4ced
Snapshot
paul-rogers Jan 14, 2023
f0f3c67
Snapshot
paul-rogers Jan 15, 2023
d05846a
Basic functionality works (at least for the Driver)
paul-rogers Jan 17, 2023
05aac53
Fix protobuf enums
paul-rogers Jan 17, 2023
0b217af
Snapshot
paul-rogers Jan 23, 2023
24cde0c
Shaded jar
paul-rogers Jan 24, 2023
d404f21
Snaphot: one attempt to resolve dependencies
paul-rogers Jan 25, 2023
7efcdc8
Snapshot: went down wrong path on class loader issues
paul-rogers Jan 26, 2023
a522e28
Extension tested in Broker server
paul-rogers Jan 26, 2023
5d23a49
Fix running tests from the command line
paul-rogers Feb 2, 2023
131fdac
Hide gRPC shaded jar module from IDEs
paul-rogers Feb 2, 2023
3df151a
grpc protobufwriter
andrisnoko Feb 6, 2023
edd97a9
Cache Method object, serialize complex types to bytes
andrisnoko Feb 7, 2023
6ed242b
Merge pull request #3 from andrisnoko/grpc-protobufwriter
paul-rogers Feb 7, 2023
74a8057
add empty response and GrpcResponseHandler tests
andrisnoko Feb 14, 2023
2e08d27
Merge branch 'master' into grpc-query
paul-rogers Feb 23, 2023
159caf4
Merge pull request #4 from andrisnoko/add-tests
paul-rogers Feb 23, 2023
6cf4a27
IT shell for gRPC tests
paul-rogers Feb 24, 2023
5804cf7
Support for multiple extension directories
paul-rogers Feb 25, 2023
c0683ad
Build the grpc-query distribution tar.gz file
paul-rogers Feb 25, 2023
bbdbd11
More steps toward rRPC unit tests
paul-rogers Feb 25, 2023
310876c
Merge branch 'grpc-query' of github.com:paul-rogers/druid into grpc-q…
paul-rogers Feb 25, 2023
e7fbc0d
Basic grpc IT works
paul-rogers Mar 1, 2023
b70c742
ITs pass
paul-rogers Mar 2, 2023
b32fae9
ITs pass from the command line
paul-rogers Mar 2, 2023
81f2d42
Remove gRPC
paul-rogers Mar 2, 2023
2df1b2f
Merge branch 'master' into 23302-it-extn
paul-rogers Mar 2, 2023
ce19c24
Cleanup
paul-rogers Mar 2, 2023
194cc73
Cleanup
paul-rogers Mar 2, 2023
ca3ae0c
Address review comments
paul-rogers Mar 8, 2023
b2b41aa
Merge branch 'master' into 23302-it-extn
paul-rogers Mar 8, 2023
89d4f74
Build fix
paul-rogers Mar 8, 2023
78ed57d
Doc fix
paul-rogers Mar 8, 2023
b00a3f5
Fixes
paul-rogers Mar 10, 2023
80a5d4a
Address comments
paul-rogers Mar 10, 2023
02fc38d
Build fix
paul-rogers Mar 14, 2023
b049967
Refine it.sh
paul-rogers Mar 16, 2023
b72ed28
Merge branch 'master' into 23302-it-extn
paul-rogers Mar 16, 2023
abb0782
Build fix
paul-rogers Mar 16, 2023
6e42006
Remove docker prune for github runs
paul-rogers Mar 17, 2023
9f3b18d
Remove debug code
paul-rogers Mar 17, 2023
7dc0e05
Fix typo
paul-rogers Mar 17, 2023
8c4e6f2
Merge branch 'master' into 23302-it-extn
paul-rogers Mar 22, 2023
fdfe55a
Merge branch 'master' into 23302-it-extn
paul-rogers Apr 3, 2023
2763d19
Revert extension path to maybe fix Kafka test
paul-rogers Apr 3, 2023
69a74d2
Merge branch 'master' into 23302-it-extn
tejaswini-imply Apr 24, 2023
3e54a9a
nit
tejaswini-imply Apr 25, 2023
21c8b24
nit: permission error
tejaswini-imply Apr 25, 2023
1e856ce
nit: permission error
tejaswini-imply Apr 25, 2023
b143fb2
output more debug logs for revised ITs
tejaswini-imply Apr 25, 2023
5ac4005
run init scripts before starting metadata container
tejaswini-imply Apr 26, 2023
20c970b
debug
tejaswini-imply May 11, 2023
451dfc9
don't persist db as it's not needed for tests
tejaswini-imply May 12, 2023
59 changes: 26 additions & 33 deletions integration-tests-ex/cases/cluster.sh
@@ -27,15 +27,17 @@ set -e
# Enable for debugging
#set -x

export MODULE_DIR=$(cd $(dirname $0) && pwd)
export BASE_MODULE_DIR=$(cd $(dirname $0) && pwd)

# The location of the tests, which may be different than
# the location of this file.
export MODULE_DIR=${IT_MODULE_DIR:-$BASE_MODULE_DIR}

function usage {
cat <<EOF
Usage: $0 cmd [category]
-h, help
Display this message
prepare category
Generate the docker-compose.yaml file for the category for debugging.
up category
Start the cluster
down category
@@ -45,7 +47,7 @@ Usage: $0 cmd [category]
compose-cmd category
Pass the command to Docker compose. Cluster should already be up.
gen category
Generate docker-compose.yaml files (only.) Done automatically as
Generate docker-compose.yaml file (only.) Done automatically as
part of up. Use only for debugging.
EOF
}
@@ -60,7 +62,7 @@ CMD=$1
shift

function check_env_file {
export ENV_FILE=$MODULE_DIR/../image/target/env.sh
export ENV_FILE=$BASE_MODULE_DIR/../image/target/env.sh
if [ ! -f $ENV_FILE ]; then
echo "Please build the Docker test image before testing" 1>&2
exit 1
@@ -127,33 +129,33 @@ function show_status {
function build_shared_dir {
mkdir -p $SHARED_DIR
# Must start with an empty DB to keep MySQL happy
rm -rf $SHARED_DIR/db
sudo rm -rf $SHARED_DIR/db
mkdir -p $SHARED_DIR/logs
mkdir -p $SHARED_DIR/tasklogs
mkdir -p $SHARED_DIR/db
mkdir -p $SHARED_DIR/kafka
mkdir -p $SHARED_DIR/resources
cp $MODULE_DIR/assets/log4j2.xml $SHARED_DIR/resources
cp $BASE_MODULE_DIR/assets/log4j2.xml $SHARED_DIR/resources
# Permissions in some build setups are screwed up. See above. The user
# which runs Docker does not have permission to write into the /shared
# directory. Force ownership to allow writing.
chmod -R a+rwx $SHARED_DIR
sudo chmod -R a+rwx $SHARED_DIR
}

# Either generate the docker-compose file, or use "static" versions.
function docker_file {

# If a template exists, generate the docker-compose.yaml file. Copy over the Common
# folder.
TEMPLATE_DIR=$MODULE_DIR/templates
TEMPLATE_SCRIPT=${DRUID_INTEGRATION_TEST_GROUP}.py
if [ -f "$TEMPLATE_DIR/$TEMPLATE_SCRIPT" ]; then
# If a template exists, generate the docker-compose.yaml file.
# Copy over the Common folder.
TEMPLATE_SCRIPT=docker-compose.py
if [ -f "$CLUSTER_DIR/$TEMPLATE_SCRIPT" ]; then
export PYTHONPATH=$BASE_MODULE_DIR/cluster
export COMPOSE_DIR=$TARGET_DIR/cluster/$DRUID_INTEGRATION_TEST_GROUP
mkdir -p $COMPOSE_DIR
pushd $TEMPLATE_DIR > /dev/null
pushd $CLUSTER_DIR > /dev/null
python3 $TEMPLATE_SCRIPT
popd > /dev/null
cp -r $MODULE_DIR/cluster/Common $TARGET_DIR/cluster
cp -r $BASE_MODULE_DIR/cluster/Common $TARGET_DIR/cluster
else
# Else, use the existing non-template file in place.
if [ ! -d $CLUSTER_DIR ]; then
@@ -205,6 +207,13 @@ function verify_docker_file {
fi
}

function run_setup {
SETUP_SCRIPT="$CLUSTER_DIR/setup.sh"
if [ -f "$SETUP_SCRIPT" ]; then
source "$SETUP_SCRIPT"
fi
}

# Determine if docker-compose is available. If not, assume Docker supports
# the compose subcommand
set +e
@@ -219,17 +228,6 @@ set -e
# Print environment for debugging
#env

# Determine if docker-compose is available. If not, assume Docker supports
# the compose subcommand
set +e
if which docker-compose > /dev/null
then
DOCKER_COMPOSE='docker-compose'
else
DOCKER_COMPOSE='docker compose'
fi
set -e

case $CMD in
"-h" )
usage
@@ -238,24 +236,19 @@ case $CMD in
usage
$DOCKER_COMPOSE help
;;
"prepare" )
check_env_file
category $*
build_shared_dir
docker_file
;;
"gen" )
category $*
build_shared_dir
docker_file
echo "Generated file is in $COMPOSE_DIR"
echo "Generated file is $COMPOSE_DIR/docker-compose.yaml"
;;
"up" )
check_env_file
category $*
echo "Starting cluster $DRUID_INTEGRATION_TEST_GROUP"
build_shared_dir
docker_file
run_setup
cd $COMPOSE_DIR
$DOCKER_COMPOSE $DOCKER_ARGS up -d
# Enable the following for debugging
19 changes: 19 additions & 0 deletions integration-tests-ex/cases/cluster/AzureDeepStorage/verify.sh
@@ -0,0 +1,19 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#--------------------------------------------------------------------

require_env_var AZURE_ACCOUNT
require_env_var AZURE_KEY
require_env_var AZURE_CONTAINER
3 changes: 2 additions & 1 deletion integration-tests-ex/cases/cluster/Common/dependencies.yaml
@@ -71,6 +71,7 @@ services:
# platform: linux/x86_64
image: mysql:$MYSQL_IMAGE_VERSION
container_name: metadata
restart: always
command:
- --character-set-server=utf8mb4
networks:
@@ -79,7 +80,7 @@
ports:
- 3306:3306
volumes:
- ${SHARED_DIR}/db:/var/lib/mysql
- ${SHARED_DIR}/db/init.sql:/docker-entrypoint-initdb.d/init.sql
environment:
MYSQL_ROOT_PASSWORD: driud
MYSQL_DATABASE: druid
23 changes: 23 additions & 0 deletions integration-tests-ex/cases/cluster/GcsDeepStorage/verify.sh
@@ -0,0 +1,23 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#--------------------------------------------------------------------

require_env_var GOOGLE_BUCKET
require_env_var GOOGLE_PREFIX
require_env_var GOOGLE_APPLICATION_CREDENTIALS
if [ ! -f "$GOOGLE_APPLICATION_CREDENTIALS" ]; then
echo "Required file GOOGLE_APPLICATION_CREDENTIALS=$GOOGLE_APPLICATION_CREDENTIALS is missing" 1>&2
exit 1
fi
21 changes: 21 additions & 0 deletions integration-tests-ex/cases/cluster/S3DeepStorage/verify.sh
@@ -0,0 +1,21 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#--------------------------------------------------------------------

require_env_var DRUID_CLOUD_BUCKET
require_env_var DRUID_CLOUD_PATH
require_env_var AWS_REGION
require_env_var AWS_ACCESS_KEY_ID
require_env_var AWS_SECRET_ACCESS_KEY
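
The `require_env_var` helper these `verify.sh` scripts call is defined elsewhere in the test harness; its real definition is not shown in this diff. A minimal sketch of what such a helper might look like (the function body here is an assumption, not the actual implementation):

```shell
# Hypothetical sketch of a require_env_var helper, assuming bash.
# The real helper lives in the "master" scripts that source verify.sh.
function require_env_var {
  local name="$1"
  # ${!name} is bash indirect expansion: the value of the named variable.
  if [ -z "${!name}" ]; then
    echo "Required environment variable $name is not set" 1>&2
    return 1
  fi
}

# Example: succeeds only when the variable is set and non-empty.
MY_TEST_VAR=hello
require_env_var MY_TEST_VAR && echo "MY_TEST_VAR ok"
```

Because the sourcing scripts run with `set -e`, a failed check aborts cluster startup.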
@@ -23,7 +23,7 @@
PyYaml does the grunt work of converting the data structure to the YAML file.
'''

import yaml, os, os.path
import yaml, os
from pathlib import Path

# Constants used frequently in the template.
@@ -49,15 +49,16 @@ def generate(template_path, template):
'''

# Compute the cluster (test category) name from the template path which
# we assume to be module/<something>/<template>/<something>.py
# we assume to be <module>/cluster/<cluster>/docker-compose.py
template_path = Path(template_path)
cluster = template_path.stem
cluster = template_path.parent.name

# Move up to the module (that is, the cases folder) relative to the template file.
module_dir = Path(__file__).parent.parent
# Move up to the module relative to the template file.
module_dir = template_path.parent.parent.parent

# The target location for the output file is <module>/target/cluster/<cluster>/docker-compose.yaml
target_dir = module_dir.joinpath("target")
os.makedirs(target_dir, exist_ok=True)
target_file = target_dir.joinpath('cluster', cluster, 'docker-compose.yaml')

# Defer back to the template class to create the output into the docker-compose.yaml file.
@@ -205,7 +206,7 @@ def add_env(self, service, var, value):
def add_property(self, service, prop, value):
'''
Sets a property for a service. The property is of the same form as the
.properties file: druid.some.property.
runtime.properties file: druid.some.property.
This method converts the property to the env var form so you don't have to.
'''
var = prop.replace('.', '_')
@@ -230,7 +231,7 @@ def add_port(self, service, local, container):
Add a port mapping to the service
'''
ports = service.setdefault('ports', [])
ports.append(local + ':' + container)
ports.append(str(local) + ':' + str(container))

def define_external_service(self, name) -> dict:
'''
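
The revised path math in the generator can be sketched as follows. The concrete path below is a hypothetical example; the sketch assumes the template lives at `<module>/cluster/<cluster>/docker-compose.py`, which is the layout the surrounding `cluster.sh` changes imply:

```python
from pathlib import Path

# Hypothetical template location under the new layout.
template_path = Path("/repo/integration-tests-ex/cases/cluster/BatchIndex/docker-compose.py")

# The cluster (test category) name is the template's parent directory.
cluster = template_path.parent.name

# Move up three levels to reach the module (the cases folder).
module_dir = template_path.parent.parent.parent

# Output lands at <module>/target/cluster/<cluster>/docker-compose.yaml.
target_file = module_dir / "target" / "cluster" / cluster / "docker-compose.yaml"

print(cluster)      # BatchIndex
print(target_file)  # /repo/integration-tests-ex/cases/target/cluster/BatchIndex/docker-compose.yaml
```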
30 changes: 29 additions & 1 deletion integration-tests-ex/docs/compose.md
@@ -37,10 +37,38 @@ See also:

## File Structure

Docker Compose files live in the `druid-it-cases` module (`test-cases` folder)
Docker Compose files live in the `druid-it-cases` module (`cases` folder)
in the `cluster` directory. There is a separate subdirectory for each cluster type
(subset of test categories), plus a `Common` folder for shared files.

### Cluster Directory

Each test category uses an associated cluster. In some cases, multiple tests use
the same cluster definition. Each cluster is defined by a directory in
`$MODULE/cluster/$CLUSTER_NAME`. The directory contains a variety of files, most
of which are optional:

* `docker-compose.yaml` - Docker composes file, if created explicitly.
* `docker-compose.py` - Docker compose "template" if generated. The Python template
format is preferred. (One of the `docker-compose.*` files is required)
* `verify.sh` - Verify the environment for the cluster. Cloud tests require that a
number of environment variables be set to pass keys and other setup to tests.
(Optional)
* `setup.sh` - Additional cluster setup, such as populating the "shared" directory
with test-specific items. (Optional)

The `verify.sh` and `setup.sh` scripts are sourced into one of the "master"
scripts and can thus make use of environment variables already set:

* `BASE_MODULE_DIR` points to `integration-tests-ex/cases` where the "base" set
of scripts and cluster definitions reside.
* `MODULE_DIR` points to the Maven module folder that contains the test.
* `CATEGORY` gives the name of the test category.
* `DRUID_INTEGRATION_TEST_GROUP` is the cluster name. Often the same as `CATEGORY`,
but not always.

The `set -e` option is in effect so that any error fails the test.
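
As a minimal sketch, a per-cluster `setup.sh` might stage a test-specific item into the shared directory. The extension name and the fallback path here are hypothetical; when sourced by `cluster.sh`, the shared-directory variable is already set:

```shell
# Minimal sketch of a per-cluster setup.sh (hypothetical extension name).
# Default SHARED_DIR so the sketch also runs standalone; the sourcing
# script normally provides it.
SHARED_DIR=${SHARED_DIR:-/tmp/druid-it-shared}

# Stage an extension where the container will look for it at startup.
mkdir -p "$SHARED_DIR/extensions/my-extension"
echo "staged extension dir under $SHARED_DIR/extensions"
```

Because the script is sourced, it can also export variables for later steps of cluster startup.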

## Shared Directory

Each test has a "shared" directory that is mounted into each container to hold things
31 changes: 31 additions & 0 deletions integration-tests-ex/docs/docker.md
@@ -211,6 +211,37 @@ when it starts. If you start, then restart the MySQL container, you *must*
remove the `db` directory before restart or MySQL will fail due to existing
files.

### Per-test Extensions

The image build includes a standard set of extensions. Tests for contrib or custom
extensions may need to add more. This is most easily done not by altering the
image, but by adding the extensions at cluster startup. If the shared directory has
an `extensions` subdirectory, that directory is added to the extension search
path on container startup. To add an extension `my-extension`, your shared directory
should look like this:

```text
shared
+- ...
+- extensions
+- my-extension
+- my-extension-<version>.jar
+- ...
```

The `extensions` directory should be created within the per-cluster `setup.sh` script,
which runs when starting your test cluster.

Be sure to also include the extension in the load list in your `docker-compose.py` template.
To load the extension on all nodes:

```python
def extend_druid_service(self, service):
self.add_env(service, 'druid_test_loadList', 'my-extension')
```

Note that the above requires Druid and IT features added in early March, 2023.

### Third-Party Logs

The three third-party containers are configured to log to the `/shared`