# Contributing
Thank you for your interest in contributing tests to the ODPi!
To start off, please ensure you have completed an Individual CLA. If you are contributing on behalf of an organization, you will also need to have the organization complete a Corporate CLA. See the links below for the process:
The ODPi testing toolset is based on a fork of Apache BigTop 1.0.0, and leverages the existing BigTop smoke tests along with additional tests that verify ODPi compliance.
Here's a quick outline of the existing examples of integration tests; these examples can be used as a starting point for writing spec tests. The following explains how smoke tests are developed right now. All smoke-test-related activities should be carried out from the odpi-master branch of the odpi/bigtop repo. The same repo/branch should be used if you're standing up a cluster with the Bigtop provisioner, e.g.:

```shell
./gradlew -Pnum_instances=3 -Prun_smoke_tests=true docker-provisioner
```
Smoke tests are run as Gradle integration tests. All existing smoke tests (as will the future spec tests) are located under the `bigtop-tests/smoke-tests` directory, in subdirectories named after particular components. The source code of a test suite might be co-located with its build directory, or borrowed from another project: the HDFS tests' source code, for instance, actually lives in a Maven project, yet is executed as smokes. Source code directories are controlled by each component's `build.gradle` file via the `sourceSets` element.
Tests can be written in Java or Groovy. Tests can directly use the JUnit APIs and will be correctly executed by the test runner. Optionally, tests can use the facilities provided by iTest, the Bigtop integration test framework. Among other things, iTest provides the very convenient `org.apache.bigtop.itest.shell.Shell` API, which adds Bash scripting to the set of available languages. The `Shell` class can also change the effective user owning the exec'ed shell process; that requires passwordless sudo to be configured.
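As a quick illustration, a test might drive the `Shell` API roughly as follows. This is a sketch only: the constructor arguments and the `out`/`ret` fields shown here follow common Bigtop usage, so double-check them against the iTest sources before relying on them.

```groovy
import org.apache.bigtop.itest.shell.Shell

// run a pipeline as the current user and inspect the captured streams
Shell sh = new Shell("/bin/bash -s")
sh.exec("hdfs version | head -1")
println sh.ret    // exit code of the pipeline
println sh.out    // captured stdout, line by line

// run as another user; requires passwordless sudo for that account
Shell hdfsShell = new Shell("/bin/bash -s", "hdfs")
hdfsShell.exec("hdfs dfs -ls /")
```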
Tests can be executed from the top-level directory with the following command (in this case, for the HDFS smokes):

```shell
./gradlew bigtop-tests:smoke-tests:hdfs:test -Psmoke.tests
```

An optional `--info` flag can be added for higher verbosity.
If the tests' log level needs to be changed, it can be done in the `bigtop-tests/smoke-tests/logger-test-config/src/main/resources/log4j.properties` file.
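For example, raising everything to DEBUG in that file uses the standard log4j 1.x properties syntax. The appender name and pattern below are illustrative; keep whatever the shipped file actually defines:

```properties
# root logger: level and appender
log4j.rootLogger=DEBUG, console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} - %m%n
```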
This section provides guidance and examples for developers of new reference-implementation cluster smoke tests. Either existing test suites can be expanded, or new ones added.
ODPi-specific branches need to stay on top of the corresponding upstream branches, where that seems reasonable or required. On the other hand, changes will also need to be made to these ODPi-specific branches. The following workflow outlines the recommended way to sync up an ODPi-specific branch (i.e. odpi-master) with its upstream counterpart (master):
- Sync up `master` of the ODPi-specific repo with its upstream origin. This can be done directly, without involving a GitHub PR, assuming you have two remotes: `origin` for the ODPi-specific GitHub repo, and `public` for its upstream:

```shell
git fetch origin
git fetch public
git checkout -b master public/master
git push origin HEAD:master
```
- Fork, and create your own clone of the ODPi-specific repo (remote `yours`). Then, in it:

```shell
git checkout -b hotfix-branch odpi-master
# make and commit your changes
git push yours HEAD:hotfix-branch
```

Finally, create a PR to the *odpi-master* branch from the new *hotfix-branch*, as usual with the GitHub flow.
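The sync-up steps above can be dry-run end to end with throwaway local repositories standing in for the real GitHub remotes: `public` plays the upstream repo and `origin` the ODPi-specific one. The paths and the seed commit are made up for the demo.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q --bare public.git        # stand-in for the upstream repo
git init -q --bare origin.git        # stand-in for the ODPi-specific repo

# seed the upstream master with one commit
git clone -q public.git seed
cd seed
git config user.email you@example.com
git config user.name you
git commit -q --allow-empty -m "upstream work"
git push -q origin HEAD:master
cd ..

# working clone with the two remotes, as in the text
git clone -q origin.git work 2>/dev/null
cd work
git config user.email you@example.com
git config user.name you
git remote add public ../public.git
git fetch -q origin
git fetch -q public
git checkout -q -B master public/master   # -B: the fresh clone has no master yet
git push -q origin HEAD:master            # origin/master now matches upstream
git log --oneline -1
```

After this runs, `master` in the ODPi-specific stand-in is a fast-forward copy of the upstream `master`, which is exactly what the real workflow achieves.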
### Adding new test cases into an existing smoke test suite
Let's explore how additional test cases might be added to a test suite, taking a closer look at `bigtop-tests/smoke-tests/hdfs` as an example. At the time of writing, the source code of this suite fully resides in the Bigtop test-artifacts component. Such an approach makes it possible to provide installable Maven artifacts, runnable at a later time by other Maven integration projects. However, the same source code is reusable for the purpose of cluster smoking, and this can be achieved without the intermediate creation of Maven jars, i.e. directly from the source code.
The first order of business is to add new test files so that they get discovered by the compiler. Gradle, the build system we are using to manage most aspects of the project, has default source sets that don't have to be explicitly declared: `src/main/groovy`, `src/main/java`, `src/test/groovy`, and `src/test/java`. As long as new source code is put under one of those, you don't need to change anything else. In case you want to put new code into an alternative location, you'll have to update the `sourceSets` element in `bigtop-tests/smoke-tests/hdfs/build.gradle`. In our case, you'll still need to explicitly add the `src/test/groovy` source set, because the one in `build.gradle` overrides the defaults.
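For example, keeping the default test tree while adding a custom location might look like the following. This is a sketch, not the verbatim suite file, and the `alt/test/groovy` directory is a made-up placeholder; consult the actual `bigtop-tests/smoke-tests/hdfs/build.gradle` for the real contents.

```groovy
sourceSets {
  test {
    groovy {
      // re-declare the default location, because this block overrides it,
      // and add the alternative location alongside
      srcDirs = ['src/test/groovy', 'alt/test/groovy']
    }
  }
}
```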
Now, let's add a new test to check the version of the running HDFS cluster. It isn't exactly an HDFS-specific test, but it will do for all practical and demo purposes. Let's put the following code into `src/test/groovy/TestHadoopVersion.groovy`:
```groovy
package org.apache.bigtop.itest.hadoop.hdfs

import org.junit.Test
import org.apache.bigtop.itest.shell.Shell

public class TestHadoopVersion {
  @Test
  public void cliVersion() {
    Shell sh = new Shell()
    // Just printing out the first line, containing the Hadoop version
    println sh.exec("hdfs version").out[0]
  }

  @Test
  public void apiVersion() {
    // TODO this is left as an exercise to the reader
  }
}
```
Add the new test's name to the `def tests_to_include()` list in the suite's `build.gradle` file. In a later version of the smoke tests this behavior will be fixed so that all tests are discovered automatically (BIGTOP-2248).
Now you can run the tests as described above, and you should see the new test case added to the report.
If you need to add new resources to the test, it can be done in the same way, although the destination directory will be either `src/main/resources` or a custom one.
### Adding a new smoke test component

This case is similar to, albeit a bit more laborious than, the previous one. A new test suite requires:
- creating a new directory under `bigtop-tests/smoke-tests`
- adding a simple `build.gradle` similar to the existing one in the hdfs, hcfs, or other suites
- finally, just as was explained above, writing the new tests

On the next build execution, the new test suite will be automatically discovered and added to the projects of the build, along with all the standard tasks and properties.
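A `build.gradle` for such a new suite might be as small as the following. Everything here is hypothetical: the component name, the test file, and the directory layout are made-up placeholders, so model yours on the hdfs suite's actual file rather than on this sketch.

```groovy
// bigtop-tests/smoke-tests/mycomponent/build.gradle (hypothetical)

// list of test files to run; see the note about BIGTOP-2248 above
def tests_to_include() {
  return ['TestMyComponent.groovy']
}

// only needed when sources live outside Gradle's default locations;
// remember that declaring sourceSets overrides the defaults
sourceSets {
  test {
    groovy {
      srcDirs = ['src/test/groovy']
    }
  }
}
```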
Examples of how to write spec tests can be found under https://github.com/odpi/bigtop/tree/spec-tests/bigtop-tests/spec-test. Adding new tests is simple, and is done by adding new test descriptions to the `runtime/src/test/resources/testRuntimeSpecConf.groovy` file. Note that the spec-tests are still under development and not yet available from the main odpi-master branch.

To run the spec tests, you can follow the already familiar pattern:

```shell
./gradlew bigtop-tests:spec-tests:runtime:test -Pspec.tests
```
### Adding new spec tests

Specification tests have two parts:
- a generic test runner, located in `bigtop-tests/spec-tests/runtime/src/test/groovy/org/odpi/specs/runtime/TestSpecsRuntime.groovy` and suitable for running the different types of tests specified by
- the descriptive DSL in `bigtop-tests/spec-tests/runtime/src/test/resources/testRuntimeSpecConf.groovy`
The DSL is flexible and easily extendable. A few types of tests are already supported by the runner, and more can be added as needed. If a new test case of an existing type needs to be added, it is a simple matter of writing the test declaration in the DSL. For new types of tests, both the DSL and the runner code have to be altered. There's a sufficient number of existing spec tests for a technically savvy reader to quickly learn the ropes and start developing new ones. As a general rule of thumb, I would recommend writing as much of the test logic as possible in the DSL file, and keeping the runner code generic, tiny, and simple.
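Concretely, a declaration in `testRuntimeSpecConf.groovy` might look roughly like the following. This is a hypothetical sketch: the entry name, the `type` value, and the argument keys are assumptions made for illustration, not the verbatim DSL of the project, so consult the existing entries in that file for the real shape.

```groovy
specs {
  tests {
    'HADOOP_EJH1' {
      name = 'HADOOP_EJH1'          // hypothetical test id
      type = 'envdir'               // a test type the runner knows about
      arguments {
        envcmd = 'hadoop envvars'   // command whose output is inspected
        variable = 'JAVA_HOME'      // environment variable that must be set
      }
    }
  }
}
```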
You can join the community conversation by subscribing to the mailing list below.
https://lists.odpi.org/mailman3/lists/odpi-technical.lists.odpi.org/
This work is licensed under a Creative Commons Attribution 4.0 International License