Test suite for verifying the interoperability of streams components such as Kafka and Flink, managed by operators on Kubernetes.
Several requirements have to be installed to properly build the project and run the tests:
- Java 17+
- Helm 3+
- OperatorSDK
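A quick shell sketch for checking that the prerequisites are available on the PATH (the binary names below are the usual upstream defaults and may differ on your system):

```shell
# Collect any missing prerequisite tools instead of aborting on the first one.
missing=""
for tool in java helm operator-sdk; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done

if [ -n "$missing" ]; then
  echo "missing prerequisites:$missing"
else
  echo "all prerequisites found"
fi
```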
Test scenarios are documented in the test code by test-metadata-generator, and the generated docs are stored in the docs folder.
Run Maven with the get-operator-files profile to download all operator install files used by the test suite.
$ ./mvnw install -P get-operator-files
All operator install files are downloaded into the operator-install-files folder.
Use the test suite configuration described here.
If you want to use your own installation files, complete the following steps:
- Install the upstream files to create the proper structure
$ ./mvnw install -P get-operator-files
- Replace the install files in the operator-install-files folder
Use the test suite configuration described here.
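Put together, the replacement steps above can be sketched as follows (the `my-operator-files` directory is a hypothetical example standing in for your own installation files):

```shell
# 1. Create the expected directory layout from the upstream files.
./mvnw install -P get-operator-files

# 2. Overwrite the downloaded files with your own installation files.
#    'my-operator-files' is a hypothetical example directory; match its
#    layout to the structure created in step 1.
cp -r my-operator-files/* operator-install-files/
```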
Run all tests:
$ ./mvnw verify -P test
Run tests with a specific tag:
$ ./mvnw verify -P test -Dgroups=flink-sql-example
Run a specific test class or test method:
$ ./mvnw verify -P test -Dit.tests=io.streams.e2e.flink.sql.SqlExampleST
$ ./mvnw verify -P test -Dit.tests=io.streams.e2e.flink.sql.SqlExampleST#testRecommendationApp
- To configure the SQL runner image, set the env var SQL_RUNNER_IMAGE
- To use a custom Flink operator bundle image, set the env var FLINK_OPERATOR_BUNDLE_IMAGE
- To use a custom Strimzi operator bundle image, set the env var STRIMZI_OPERATOR_BUNDLE_IMAGE
- To use Red Hat catalog operators, set the env var INSTALL_STRIMZI_FROM_RH_CATALOG, INSTALL_APICURIO_FROM_RH_CATALOG, or INSTALL_CERT_MANAGER_FROM_RH_CATALOG
- Modify variables in the config.yaml file in the root folder of the repository
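For example, a run combining these options might look like the following (the image references are hypothetical placeholders, not real images):

```shell
# Hypothetical image references; substitute your own builds.
export SQL_RUNNER_IMAGE=quay.io/example/sql-runner:latest
export FLINK_OPERATOR_BUNDLE_IMAGE=quay.io/example/flink-operator-bundle:latest
export INSTALL_STRIMZI_FROM_RH_CATALOG=true

# Then run the tests as usual:
#   ./mvnw verify -P test -Dgroups=flink-sql-example
```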
Please see the documentation.
When a PR is opened, you can use Packit to run your tests on top of a kind cluster. To run Packit CI, just add a comment with the following text:
# run sql example test
/packit test --labels flink-sql-example
# run all flink tests
/packit test --labels flink-all
# run smoke tests
/packit test --labels smoke