Merged
32 commits
8730a84
Test running unit tests in multiple jobs in parallel.
Kami Mar 26, 2021
78baa3b
Fix test case and make sure we don't rely on state created by another
Kami Mar 27, 2021
6287ddd
Add additional logging.
Kami Mar 27, 2021
bffff5e
Add additional parallel job.
Kami Mar 27, 2021
6de8b11
Only try running redis container for integration tests where it's
Kami Mar 27, 2021
6b3c1e4
Move redis to test requirements.
Kami Mar 27, 2021
8f74e7a
Try listening only on ipv4.
Kami Mar 27, 2021
35f2e2f
For now, don't run micro benchmarks on a nightly basis since it's very
Kami Mar 27, 2021
1dfce18
Use docker rm.
Kami Mar 27, 2021
cbbe3fb
Remove workarounds we don't need anymore.
Kami Mar 27, 2021
aa10c02
Remove testing change.
Kami Mar 27, 2021
2248e71
Add two more chunks and a comment.
Kami Mar 27, 2021
31d966b
Move pylint check from ci-checks to ci-compile to spread the load more
Kami Mar 27, 2021
84cea73
Try simplifying and unifying the requirements.
Kami Mar 27, 2021
481dab7
Don't skip duplicated builds for master branch.
Kami Mar 27, 2021
9e9bcc3
Test approach to cache apt deps to speed up the build.
Kami Mar 27, 2021
c4a2538
Move functionality into a script file.
Kami Mar 27, 2021
ca7c7f2
Revert apt-cache change which doesn't seem to be working and it's
Kami Mar 27, 2021
bffe4c5
Add new make target for running microbenchmarks as part of a weekly
Kami Mar 27, 2021
ef4607f
Test another change.
Kami Mar 27, 2021
71a2265
Remove testing changes.
Kami Mar 27, 2021
d755ba1
Test a change.
Kami Mar 28, 2021
f0e0e6a
Change cache key.
Kami Mar 28, 2021
581a61e
Don't run apt-get update on a populated cache.
Kami Mar 28, 2021
c97eb14
Remove testing changes.
Kami Mar 28, 2021
32798c1
Remove file we don't need anymore.
Kami Mar 28, 2021
c6e0569
Remove testing change.
Kami Mar 28, 2021
5bf8f73
Remove change which is not needed anymore.
Kami Mar 28, 2021
23094ff
Decrease number of jobs which run in parallel.
Kami Mar 31, 2021
76d0422
Merge branch 'master' into nose_parallel_experiment
Kami Mar 31, 2021
ef852fd
Merge branch 'master' of github.com:StackStorm/st2 into nose_parallel…
Kami Apr 1, 2021
bc440bc
Merge branch 'nose_parallel_experiment' of github.com:StackStorm/st2 …
Kami Apr 1, 2021
117 changes: 84 additions & 33 deletions .github/workflows/ci.yaml
@@ -24,6 +24,8 @@ on:
# ones and only run commands which are needed for some steps for those steps and
# not for all
jobs:
  # Special job which automatically cancels old runs for the same branch and skips runs for a
  # file set which has already passed, etc.
pre_job:
name: Skip Duplicate Jobs Pre Job
runs-on: ubuntu-latest
@@ -38,32 +40,68 @@ jobs:

ci:
needs: pre_job
if: ${{ needs.pre_job.outputs.should_skip != 'true' }}
# NOTE: We always want to run job on master since we run some additional checks there (code
# coverage, etc)
if: ${{ needs.pre_job.outputs.should_skip != 'true' || github.ref == 'refs/heads/master' }}
name: '${{ matrix.name }} - python (${{ matrix.python-version }})'
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
      # NOTE: To speed up the CI run, we split unit and integration tests into multiple jobs where
      # each job runs a subset of the tests.
include:
- name: 'Lint Checks'
- name: 'Lint Checks (black, flake8, etc.)'
task: 'ci-checks'
nosetests_node_total: 1
nosetests_node_index: 0
python-version: '3.6'
- name: 'Compile'
- name: 'Compile (pip deps, pylint, etc.)'
task: 'ci-compile'
nosetests_node_total: 1
nosetests_node_index: 0
python-version: '3.6'
- name: 'Pack Tests'
task: 'ci-packs-tests'
nosetests_node_total: 1
nosetests_node_index: 0
python-version: '3.6'
- name: 'Unit Tests'
- name: 'Unit Tests (chunk 1)'
task: 'ci-unit'
nosetests_node_total: 3
nosetests_node_index: 0
python-version: '3.6'
- name: 'Integration Tests'
- name: 'Unit Tests (chunk 2)'
task: 'ci-unit'
nosetests_node_total: 3
nosetests_node_index: 1
python-version: '3.6'
- name: 'Unit Tests (chunk 3)'
task: 'ci-unit'
nosetests_node_total: 3
nosetests_node_index: 2
python-version: '3.6'
- name: 'Integration Tests (chunk 1)'
task: 'ci-integration'
nosetests_node_total: 3
nosetests_node_index: 0
python-version: '3.6'
- name: 'Integration Tests (chunk 2)'
task: 'ci-integration'
nosetests_node_total: 3
nosetests_node_index: 1
python-version: '3.6'
- name: 'Integration Tests (chunk 3)'
task: 'ci-integration'
nosetests_node_total: 3
nosetests_node_index: 2
python-version: '3.6'
        # This job is slow so we only run it on a daily basis
# - name: 'Micro Benchmarks'
# task: 'micro-benchmarks'
# python-version: '3.6'
# nosetests_node_total: 1
        #  nosetests_node_index: 0
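The chunking above can be sketched as a strided split: every job builds the same sorted test list and keeps only the slice assigned to its NODE_INDEX. This is purely illustrative — the real selection happens inside the nose test runner, and the test names below are made up.

```python
# Hypothetical sketch of how NODE_TOTAL / NODE_INDEX partition a test suite.
import os


def select_chunk(tests, node_total, node_index):
    # Sort first so all jobs agree on the ordering, then stride through it,
    # keeping every node_total-th test starting at node_index.
    return sorted(tests)[node_index::node_total]


if __name__ == "__main__":
    node_total = int(os.environ.get("NODE_TOTAL", "3"))
    node_index = int(os.environ.get("NODE_INDEX", "0"))
    tests = ["test_a", "test_b", "test_c", "test_d", "test_e"]
    print(select_chunk(tests, node_total, node_index))  # → ['test_a', 'test_d']
```

With NODE_TOTAL=3 the three slices partition the suite with no overlap, which is why each chunked matrix entry above can run independently.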
services:
mongo:
image: mongo:4.0
@@ -102,28 +140,31 @@ jobs:
#

# Used for the coordination backend for integration tests
# TODO: Only start this for integration tests via job step
# https://github.bokerqi.topmunity/t/conditional-services-in-a-job/135301/3
redis:
# Docker Hub image
image: redis
# Set health checks to wait until redis has started
options: >-
--name "redis"
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 6379:6379/tcp
# NOTE: To speed things up, we only start redis for integration tests
# where it's needed
# redis:
# # Docker Hub image
# image: redis
# # Set health checks to wait until redis has started
# options: >-
# --name "redis"
# --health-cmd "redis-cli ping"
# --health-interval 10s
# --health-timeout 5s
# --health-retries 5
# ports:
# - 6379:6379/tcp

env:
TASK: '${{ matrix.task }}'

NODE_TOTAL: '${{ matrix.nosetests_node_total }}'
NODE_INDEX: '${{ matrix.nosetests_node_index }}'

      # We need to explicitly specify the terminal width, otherwise some CLI tests fail in
      # container environments where a small terminal size is used.
COLUMNS: '120'
PYLINT_CONCURRENCY: '2'
PYLINT_CONCURRENCY: '4'

# CI st2.conf (with ST2_CI_USER user instead of stanley)
ST2_CONF: 'conf/st2.ci.conf'
@@ -174,7 +215,8 @@ jobs:
echo "::set-output name=year::$(/bin/date -u "+%Y")"
echo "::set-output name=month::$(/bin/date -u "+%m")"
echo "::set-output name=week::$(/bin/date -u "+%U")"
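The three outputs above exist so later steps can build date-based cache keys: embedding the year and the `%U` week number yields a key that rotates weekly, so stale caches expire on their own. A minimal sketch (the `Linux-apt-v5` prefix mirrors the APT cache key used below; the exact string is illustrative):

```shell
#!/usr/bin/env bash
# Reproduce the workflow's weekly-rotating cache key outside of GitHub Actions.
year="$(/bin/date -u "+%Y")"
week="$(/bin/date -u "+%U")"  # zero-padded week of the year, 00-53
key="Linux-apt-v5-${year}-${week}"
echo "${key}"
```

Because the week number changes, restore-keys fall back to the most recent prior week's cache instead of starting cold.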
- uses: actions/cache@v2
- name: Cache Python Dependencies
uses: actions/cache@v2
with:
path: |
~/.cache/pip
@@ -186,24 +228,24 @@ jobs:
key: ${{ runner.os }}-python-${{ matrix.python-version }}-${{ hashFiles('requirements.txt', 'test-requirements.txt') }}
restore-keys: |
${{ runner.os }}-python-${{ matrix.python-version }}-
- uses: actions/cache@v2
- name: Cache APT Dependencies
id: cache-apt-deps
uses: actions/cache@v2
with:
path: |
/var/cache/apt/archives/*.deb
/var/cache/apt/archives/partial/*.deb
/var/cache/apt/*.bin
key: ${{ runner.os }}-apt-${{ steps.date.outputs.year }}-${{ steps.date.outputs.week }}
~/apt_cache
# TODO: Also incorporate package names into cache key
key: ${{ runner.os }}-apt-v5-${{ steps.date.outputs.year }}-${{ steps.date.outputs.week }}
restore-keys: |
${{ runner.os }}-apt-${{ steps.date.outputs.year }}-
${{ runner.os }}-apt-
- name: Install apt depedencies
${{ runner.os }}-apt-v5-${{ steps.date.outputs.year }}-
${{ runner.os }}-apt-v5-
- name: Install APT Dependencies
env:
CACHE_HIT: ${{steps.cache-apt-deps.outputs.cache-hit}}
run: |
# install dev dependencies for Python YAML and LDAP packages
# https://github.com/StackStorm/st2-auth-ldap
sudo apt-get -y update
sudo apt-get -f -y install libldap2-dev libsasl2-dev libssl-dev libyaml-dev ldap-utils
# shellcheck is already available in docker runner image we use
# sudo apt-get -y install shellcheck
./scripts/github/install-apt-packages-use-cache.sh
- name: Install virtualenv
run: |
set -x
@@ -233,6 +275,12 @@ jobs:
cp conf/st2.dev.conf "${ST2_CONF}" ; sed -i -e "s/stanley/${ST2_CI_USER}/" "${ST2_CONF}"
./scripts/ci/add-itest-user-key.sh
sudo .circle/add-itest-user.sh
- name: Run Redis Service Container
if: "${{ env.TASK == 'ci-integration' }}"
timeout-minutes: 2
run: |
docker run --rm --detach -p 127.0.0.1:6379:6379/tcp --name redis redis:latest
until [ "$(docker inspect -f {{.State.Running}} redis)" == "true" ]; do sleep 0.1; done
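The `until` loop above polls `docker inspect` every 100 ms until the container reports `Running`, with the step-level `timeout-minutes: 2` bounding how long it can spin. The same pattern can be wrapped with an explicit deadline (a sketch — `wait_for` is a hypothetical helper, not part of the workflow):

```shell
#!/usr/bin/env bash
# Poll a command until it succeeds or a deadline passes, like the redis
# readiness loop above. Usage: wait_for '<command>' <timeout-seconds>
wait_for() {
  local deadline=$(( $(date +%s) + $2 ))
  until eval "$1"; do
    if [ "$(date +%s)" -ge "${deadline}" ]; then
      echo "timed out waiting for: $1" >&2
      return 1
    fi
    sleep 0.1
  done
}

# For the workflow step this would be:
#   wait_for '[ "$(docker inspect -f {{.State.Running}} redis)" = "true" ]' 120
wait_for 'true' 5 && echo "ready"  # prints "ready"
```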
- name: Permissions Workaround
if: "${{ env.TASK == 'ci-packs-tests' || env.TASK == 'ci-integration' }}"
run: |
@@ -270,6 +318,9 @@ jobs:
if: "${{ success() && ((env.TASK == 'ci-unit') || (env.TASK == 'ci-integration')) && (env.ENABLE_COVERAGE == 'yes') }}"
run: |
./scripts/ci/submit-codecov-coverage.sh
- name: Stop Redis Service Container
if: "${{ always() && env.TASK == 'ci-integration' }}"
run: docker rm --force redis || true
slack-notification:
name: Slack notification for failed master builds
if: always()
42 changes: 21 additions & 21 deletions Makefile
@@ -155,6 +155,11 @@ play:
@echo
@echo INCLUDE_TESTS_IN_COVERAGE=$(INCLUDE_TESTS_IN_COVERAGE)
@echo
@echo NODE_TOTAL=$(NODE_TOTAL)
@echo
@echo
@echo NODE_INDEX=$(NODE_INDEX)
@echo

.PHONY: check
check: check-requirements check-sdist-requirements flake8 checklogs
@@ -275,7 +280,13 @@ check-python-packages-nightly:
done

.PHONY: ci-checks-nightly
ci-checks-nightly: check-python-packages-nightly micro-benchmarks
# TODO: Only run micro-benchmarks once a week since they are extremely slow on CI
ci-checks-nightly: check-python-packages-nightly
#ci-checks-nightly: check-python-packages-nightly micro-benchmarks

# CI checks which are very slow and only run on a weekly basis
.PHONY: ci-checks-weekly
ci-checks-weekly: micro-benchmarks

.PHONY: checklogs
checklogs:
@@ -633,24 +644,14 @@ requirements: virtualenv .requirements .sdist-requirements install-runners insta
@echo "==================== requirements ===================="
@echo
# Show pip installed packages before we start
echo ""
$(VIRTUALENV_DIR)/bin/pip list
echo ""

# Note: Use the version of virtualenv pinned in fixed-requirements.txt so we
# only have to update it in one place when we change the version
$(VIRTUALENV_DIR)/bin/pip install --upgrade $(shell grep "^virtualenv" fixed-requirements.txt)

$(VIRTUALENV_DIR)/bin/pip install --upgrade "setuptools==$(SETUPTOOLS_VERSION)" # workaround for pbr issue
$(VIRTUALENV_DIR)/bin/pip install --upgrade "pbr==5.4.3" # workaround for pbr issue

# Fix for Travis CI race
$(VIRTUALENV_DIR)/bin/pip install "six==1.12.0"

# Fix for Travis CI caching issue
if [[ "$(TRAVIS_EVENT_TYPE)" != "" ]]; then\
$(VIRTUALENV_DIR)/bin/pip uninstall -y "pytz" || echo "not installed"; \
$(VIRTUALENV_DIR)/bin/pip uninstall -y "python-dateutil" || echo "not installed"; \
$(VIRTUALENV_DIR)/bin/pip uninstall -y "orquesta" || echo "not installed"; \
fi

# Install requirements
for req in $(REQUIREMENTS); do \
@@ -662,12 +663,7 @@ requirements: virtualenv .requirements .sdist-requirements install-runners insta
# NOTE: We pass --no-deps to the script so we don't install all the
# package dependencies which are already installed as part of "requirements"
# make targets. This speeds up the build
(cd st2common; ${ROOT_DIR}/$(VIRTUALENV_DIR)/bin/python setup.py develop --no-deps)

# Note: We install prance here and not as part of any component
# requirements.txt because it has a conflict with our dependency (requires
# new version of requests) which we cant resolve at this moment
$(VIRTUALENV_DIR)/bin/pip install "prance==0.15.0"
(cd ${ROOT_DIR}/st2common; ${ROOT_DIR}/$(VIRTUALENV_DIR)/bin/python setup.py develop --no-deps)

# Install st2common to register metrics drivers
# NOTE: We pass --no-deps to the script so we don't install all the
@@ -685,7 +681,9 @@ requirements: virtualenv .requirements .sdist-requirements install-runners insta
git submodule update --init --recursive --remote

# Show currently install requirements
echo ""
$(VIRTUALENV_DIR)/bin/pip list
echo ""

.PHONY: check-dependency-conflicts
check-dependency-conflicts:
@@ -1084,8 +1082,10 @@ debs:
.PHONY: ci
ci: ci-checks ci-unit ci-integration ci-packs-tests

# NOTE: pylint is moved to ci-compile so we spread the load more evenly across
# the various jobs and the whole workflow completes faster
.PHONY: ci-checks
ci-checks: .generated-files-check .shellcheck .black-check .pre-commit-checks .pylint .flake8 check-requirements check-sdist-requirements .st2client-dependencies-check .st2common-circular-dependencies-check circle-lint-api-spec .rst-check .st2client-install-check check-python-packages
ci-checks: .generated-files-check .shellcheck .black-check .pre-commit-checks .flake8 check-requirements check-sdist-requirements .st2client-dependencies-check .st2common-circular-dependencies-check circle-lint-api-spec .rst-check .st2client-install-check check-python-packages

.PHONY: .rst-check
.rst-check:
@@ -1146,4 +1146,4 @@ ci-orquesta: .ci-prepare-integration .orquesta-itests-coverage-html
ci-packs-tests: .packs-tests

.PHONY: ci-compile
ci-compile: check-dependency-conflicts compilepy3
ci-compile: check-dependency-conflicts compilepy3 .pylint
37 changes: 37 additions & 0 deletions scripts/github/install-apt-packages-use-cache.sh
@@ -0,0 +1,37 @@
#!/usr/bin/env bash

# Special script which installs apt packages, caches the downloaded files into a directory,
# and re-uses that cache on subsequent runs when it is available

# Packages which will be installed and cached
# NOTE: shellcheck is already available in docker runner image we use
APT_PACKAGES="libldap2-dev libsasl2-dev libssl-dev libyaml-dev ldap-utils"

# Directory where installed package files will be copied - should match directory specified for
# cache target in github actions workflow
CACHE_DIRECTORY="${HOME}/apt_cache"

export APT_DIR="${CACHE_DIRECTORY}"
export APT_STATE_LISTS="${APT_DIR}/lists"
export APT_CACHE_ARCHIVES="${APT_DIR}/archives"

# shellcheck disable=SC2059
printf "dir::state::lists ${APT_STATE_LISTS};\ndir::cache::archives ${APT_CACHE_ARCHIVES};\n" | sudo tee /etc/apt/apt.conf

mkdir -p "${APT_STATE_LISTS}/partial"
mkdir -p "${APT_CACHE_ARCHIVES}/partial"

# NOTE: apt-get update is only needed if there is no cache. If there is an existing cache, we
# skip it to speed things up
if [[ "$CACHE_HIT" == 'false' ]]; then
sudo apt-get -y update
fi

# shellcheck disable=SC2086
sudo apt-get -f -y install ${APT_PACKAGES}

ls -la "${APT_STATE_LISTS}"
ls -la "${APT_CACHE_ARCHIVES}"

# Workaround for caching issue (ensure runner can read the downloaded packages)
sudo chown -R runner:runner "${CACHE_DIRECTORY}"
3 changes: 0 additions & 3 deletions scripts/github/prepare-integration.sh
@@ -26,9 +26,6 @@ echo ""
cat conf/st2.ci.conf || true
echo ""

# Needed by the coordination backend
pip install "redis==3.5.3"

# install st2 client
python ./st2client/setup.py develop
st2 --version
19 changes: 17 additions & 2 deletions st2common/tests/integration/test_register_content_script.py
@@ -171,17 +171,32 @@ def test_register_setup_virtualenvs(self):
self.assertEqual(exit_code, 0)

def test_register_recreate_virtualenvs(self):
# Single pack
# 1. Register the pack and ensure it exists and doesn't rely on state from previous
# test methods
pack_dir = os.path.join(get_fixtures_packs_base_path(), "dummy_pack_1")

cmd = BASE_CMD_ARGS + [
"--register-pack=%s" % (pack_dir),
"--register-recreate-virtualenvs",
"--register-setup-virtualenvs",
"--register-no-fail-on-failure",
]
exit_code, stdout, stderr = run_command(cmd=cmd)

self.assertIn('Setting up virtualenv for pack "dummy_pack_1"', stderr)
self.assertIn("Setup virtualenv for 1 pack(s)", stderr)
self.assertEqual(exit_code, 0)

# 2. Run it again with --register-recreate-virtualenvs flag
pack_dir = os.path.join(get_fixtures_packs_base_path(), "dummy_pack_1")

cmd = BASE_CMD_ARGS + [
"--register-pack=%s" % (pack_dir),
"--register-recreate-virtualenvs",
"--register-no-fail-on-failure",
]
exit_code, stdout, stderr = run_command(cmd=cmd)

self.assertIn('Setting up virtualenv for pack "dummy_pack_1"', stderr)
self.assertIn("Virtualenv successfully removed.", stderr)
self.assertIn("Setup virtualenv for 1 pack(s)", stderr)
self.assertEqual(exit_code, 0)