Docker Compose File parser for Runnable API Client
This module provides the following functions:
    const octobear = require('@runnable/octobear')

    octobear.parse({
      dockerComposeFileString: String, // String for `docker-compose.yml`
      dockerComposeFilePath: String, // Path to the Compose file
      repositoryName: String, // Name of the repository. Used for naming the instances (doesn't have to correlate 1-to-1)
      ownerUsername: String, // User's GitHub username. Used for pre-instance-creation hostname generation
      userContentDomain: String // Runnable API user content domain. Used for pre-instance-creation hostname generation
    })
    .then((response) => {
      // `response` matches the schema below
    })
The response corresponds to the following schema:
    {
      results: [{
        metadata: {
          name: String // Name specified for the service in `docker-compose.yml`
        },
        extends: {
          service: String, // Name of the service to extend
          file: String // Path to the Compose file where the original service is defined
        },
        code: { // Optional
          repo: String, // Repo full name
          commitish: String // Optional. Commit or branch
        },
        build: {
          dockerFilePath: String, // Optional. Path to the Dockerfile used to build the instance
          dockerBuildContext: String // Optional. Path to the Docker build context
        },
        files: { // Optional
          '/Dockerfile': {
            body: String // Body of the Dockerfile to be used. Only specified if there is no `dockerFilePath`
          }
        },
        instance: {
          name: String, // Instance name. Different from the name specified in `docker-compose.yml`
          containerStartCommand: String, // Optional. Command provided to start the instance
          ports: Array<Number>, // Array of numbers for ports
          env: Array<String> // Array of strings for env variables. Includes hostname substitution
        }
      }],
      envFiles: Array<String>, // Array of all ENV files that should be loaded
      mains: {
        builds: { // All main services that are built
          [serviceName]: Object // The corresponding entry in `results` for this service
        },
        externals: {
          [serviceName]: Object // The corresponding entry in `results` for this service
        }
      }
    }
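For illustration, here is a minimal sketch of calling `parse`. The file path and option values (repository name, username, domain) are made-up examples, not values the library requires:

    const fs = require('fs')
    const octobear = require('@runnable/octobear')

    // Load the Compose file from disk; path and options below are example values only
    const dockerComposeFileString = fs.readFileSync('./docker-compose.yml', 'utf8')

    octobear.parse({
      dockerComposeFileString,
      dockerComposeFilePath: './docker-compose.yml',
      repositoryName: 'compose-test-repo', // example value
      ownerUsername: 'my-github-user',     // example value
      userContentDomain: 'runnableapp.com' // example value
    })
    .then(({ results, envFiles, mains }) => {
      // Per the schema above, each entry in `results` describes one service
      results.forEach((service) => {
        console.log(service.metadata.name, service.instance.name, service.instance.ports)
      })
      // `mains.builds` and `mains.externals` are keyed by service name
      console.log('main services:', Object.keys(mains.builds), Object.keys(mains.externals))
    })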
To handle Compose files that extend services from other files, first resolve the referenced file paths with `findExtendedFiles`, then fetch those files and parse everything together with `parseAndMergeMultiple`:

    const octobear = require('@runnable/octobear')
    const composeFileAsString = '...'

    octobear.findExtendedFiles(composeFileAsString)
      .then((filePaths) => {
        const composeFiles = fetchAllComposeFiles(filePaths) // User-supplied helper: load the referenced Compose files
        return octobear.parseAndMergeMultiple({ ... }, composeFiles) // `{ ... }` = parse options (elided here)
          .then(({ results: services, envFiles }) => {
            const envFileContents = getAllTheseFilesAsHashTable(envFiles) // An object with file names as keys and file contents as values
            return populateENVsFromFiles(services, envFileContents)
          })
      })
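The helpers in the snippet above, `fetchAllComposeFiles` and `getAllTheseFilesAsHashTable`, are not part of octobear; you provide them. Here is a minimal sketch assuming the referenced files live on the local filesystem (the exact entry shape expected for each Compose file is an assumption, so adapt it to your caller):

    const fs = require('fs')
    const path = require('path')

    // Load each Compose file returned by `findExtendedFiles`. Pairing the path
    // with the raw file string is an assumed shape; adjust it to whatever your
    // code actually passes to `parseAndMergeMultiple`.
    const fetchAllComposeFiles = (filePaths, baseDir = process.cwd()) =>
      filePaths.map((filePath) => ({
        dockerComposeFilePath: filePath,
        dockerComposeFileString: fs.readFileSync(path.resolve(baseDir, filePath), 'utf8')
      }))

    // Turn the `envFiles` list returned by the parser into an object with
    // file names as keys and file contents as values.
    const getAllTheseFilesAsHashTable = (envFilePaths, baseDir = process.cwd()) => {
      const table = {}
      envFilePaths.forEach((filePath) => {
        table[filePath] = fs.readFileSync(path.resolve(baseDir, filePath), 'utf8')
      })
      return table
    }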
In order to run tests locally you also need to pull the submodules. The easiest way to do that is to clone the repo with the `--recursive` flag:

    git clone --recursive git@github.com:Runnable/octobear.git

To update them, use:

    git submodule update --init --recursive
Also, in order to run tests locally you'll need to populate the environment variables in `configs/.env`. We suggest adding them to `configs/.env.test`.
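If you want to double-check that your values are picked up, one option is to load the file manually with the `dotenv` package. This is only a sketch for inspecting the file; whether the test harness itself uses `dotenv` is an assumption:

    // Quick sanity check: load configs/.env.test and print which keys are defined.
    // Assumes the `dotenv` package is available; the test suite may load these
    // files differently.
    const dotenv = require('dotenv')

    const { parsed, error } = dotenv.config({ path: 'configs/.env.test' })
    if (error) throw error
    console.log('Loaded env keys:', Object.keys(parsed))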
There are three types of tests:
- Unit: Used to test individual functions
- Functional: Used to test the complete flow of a function. This should not use any external services.
- Integration: Used to test results of parsing against the Runnable API
To create a new test repository and register it as a submodule:

- Go into `test/repos/${NAME}`
- Run `git init`
- Run `git add -A`
- Run `git commit -m ""`
- Create the repo in GitHub
- Push the changes to GitHub
- Run `rm -rf test/repos/${NAME}` (it's in GitHub, don't worry)
- Run `git submodule add git@github.com:RunnableTest/${NAME}.git test/repos/${NAME}`
- Run `git status` and make sure the repo was added to `.gitmodules`
- Add, commit, and push the octobear repo