To use the queue client to send a message, call the `sendMessage` function (example in JavaScript):

```javascript
const queueClient = require('queue-client');
queueClient.sendMessage(
  'walletCreated',
  {
    walletId: '1234',
    createdAt: '2021-01-01',
  }
);
```
## The receive message API
To use the queue client to receive a message, call the `receiveMessage` function (example in JavaScript):

```javascript
const queueClient = require('queue-client');
queueClient.receiveMessage(
  'walletCreated',
  (message, acknowledge) => {
    console.log(message);
    acknowledge({
      acknowledged: true,
    });
  }
);
```
Notes:

- The user of this client decides whether or not to acknowledge the message by calling the `acknowledge` callback.
- The message includes the `ack` field, which is the jsonb data stored in the database, so the user of this client can choose to modify the existing `ack` or simply overwrite it.
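To illustrate the "modify the existing `ack`" option, here is a hedged sketch of a helper that merges a new acknowledgment into the current `ack` object. The helper name `mergeAck` and the client key `acknowledgedByClientA` are hypothetical; the shape of the object follows the `ack` example shown later in this document.

```javascript
// Hypothetical helper: merge a new acknowledgment into the existing ack
// object instead of overwriting it. `existingAck` is assumed to be the
// current jsonb value carried on message.ack.
function mergeAck(existingAck, clientName) {
  return {
    ...existingAck, // keep acknowledgments recorded by other clients
    [clientName]: { ack: true, timestamp: new Date().toISOString() },
  };
}

// Usage inside the receiveMessage callback (sketch):
// queueClient.receiveMessage('walletCreated', (message, acknowledge) => {
//   acknowledge(mergeAck(message.ack, 'acknowledgedByClientA'));
// });
```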
The queue service uses the database schema `queue`.

Tables:

- `message`

  Columns:

  - `id`: UUID
  - `channel`: text
  - `data`: jsonb
  - `created_at`: timestamptz
  - `updated_at`: timestamptz
  - `ack`: jsonb
Note: the acknowledgment time can be recorded in the `ack` column, like this:

```json
{
  "acknowledgedByClientA": { "ack": true, "timestamp": "2023-03-15 18:37:06.880169-11" },
  "acknowledgedByClientB": { "ack": true, "timestamp": "2023-03-15 18:37:06.880169-11" }
}
```
This repository was created from Greenstand's template for microservice projects. This means it comes with many development tools that we use for development and deployment. As a contributor to this repository, you should learn and use these tools. They are outlined below.
- db-migrate is used to create database tables and other database objects
- husky checks commit messages to make sure they follow the [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) format
Open a terminal and navigate to a folder in which to install this project:

```shell
git clone https://github.com/Greenstand/treetracker-repository-name.git
```

Install all necessary dependencies:

```shell
npm install
```

Generate the initial code:

```shell
cd scripts
sh generate-resource.sh
```

The above commands generate an initial set of endpoints for the resource using our defined code structure.
- Ask engineering leads for a doctl dev token
- Install the doctl command line tool
  - MacOS: `brew install doctl`
- Run `./scripts/setup-dev-database-passwords.sh`
- Copy `.env.*.example` to the API root folder
- Copy `database/database.json.example` to the API database folder
- Copy the scripts folder to the API root folder
- Set `scripts/vars.sh` to have the correct schema name
This repository uses db-migrate to manage database migrations for its schema.

```shell
cd database/
db-migrate --env dev up
```
If you have not installed db-migrate globally, you can run:
```shell
cd database/
../node_modules/db-migrate/bin/db-migrate --env dev up
```
Documentation for db-migrate: https://db-migrate.readthedocs.io/en/latest/
Our microservices use a multi-layer structure to build the whole system, similar to the MVC pattern:
- HTTP Domain
- All error and success codes
- API payload validation
- Calls a service or a model function
- Orchestration between services and the domain model
- Database session
- External APIs
- Cloud Services (such as RabbitMQ or S3)
- Domain logic
- Accesses repositories
- Accesses the database, performs CRUD operations
- One repository for each database table in the RDBMS
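To make the flow between these layers concrete, here is a heavily simplified sketch of a request passing from the HTTP layer down to a repository. All names (`messageHandler`, `messageService`, `createMessageModel`, `messageRepository`) and the in-memory "database session" are hypothetical, chosen only to illustrate the layering; they are not the actual code in this repository.

```javascript
// Repository layer: one repository per table, raw CRUD only.
const messageRepository = {
  async create(message, db) {
    db.rows.push(message); // db stands in for a real database session
    return message;
  },
};

// Model layer: domain logic; accesses repositories.
async function createMessageModel(channel, data, db) {
  if (!channel) throw new Error('channel is required');
  return messageRepository.create({ channel, data }, db);
}

// Service layer: orchestration and database session handling.
async function messageService(channel, data) {
  const db = { rows: [] }; // stand-in for a real session/transaction
  return createMessageModel(channel, data, db);
}

// HTTP layer: payload validation, error and success codes.
async function messageHandler(req) {
  if (!req.body || !req.body.channel) {
    return { status: 422, body: { message: 'channel is required' } };
  }
  const message = await messageService(req.body.channel, req.body.data);
  return { status: 200, body: message };
}
```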
To run the unit tests:

```shell
npm run test-unit
```
All the integration tests are located under the `__tests__` folder.

To run the integration tests:

```shell
npm run test-integration
```

In order to efficiently run our integration tests, we rely on automated database seeding/clearing functions to mock database entries. To test these functions, run:

```shell
npm run test-seedDB
```
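For orientation, seeding/clearing helpers of this kind often look something like the following sketch. The function names `seed`/`clear`, the use of knex, and the seeded row are all assumptions for illustration (the row mirrors the `sendMessage` example earlier); the actual helpers in this repository may differ.

```javascript
// Hypothetical seed/clear helpers for integration tests.
// `knex` is assumed to be a configured knex instance; 'queue.message'
// follows the schema/table documented earlier in this README.
async function seed(knex) {
  await knex('queue.message').insert({
    channel: 'walletCreated',
    data: { walletId: '1234', createdAt: '2021-01-01' },
  });
}

async function clear(knex) {
  await knex('queue.message').del();
}
```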
There is a `test-watch` command in `package.json`:

```shell
npm run test-watch
```

With this command, the tests re-run whenever the code changes. And because of the `bail` argument, the run stops at the first failing test, which can be convenient during development.
NOTE: There is another command, `test-watch-debug`. It is the same as `test-watch`, except that it sets the log level to `debug`.
You can also use Postman to test the API manually.

To run a local server with some seed data, run:

```shell
npm run server-test
```

This command runs an API server locally and seeds some basic data into the database (the same data we use in the integration tests).