The Event Management Agent is a tool for architects and developers working with event-driven architectures (EDAs) to discover event streams flowing through event brokers as well as the related broker configuration information. It can also discover schemas from registries. The Event Management Agent can be used in two different ways:
- As the Event Management Agent component of Solace PubSub+ Cloud Event Portal to:
- discover runtime event data from event brokers and schema registries
- populate Event Portal with the discovered runtime data to enable the management and reuse of EDA assets
- audit the runtime data and flag discrepancies between the runtime and the design-time intent, for event data governance purposes and to ensure that the runtime and design-time configurations stay in sync
- As a standalone tool that discovers runtime event data from event or message brokers and schema registries.
Our plan is to open source the Event Management Agent to enable architects and developers to contribute to it as well as to build new plugins so that:
- the agent can discover data from different broker types
- existing plugins can be extended to discover additional data
- EDA data can be discovered from other systems, e.g. schemas from schema registries
- Users can discover event flow data from Solace PubSub+, Apache Kafka, and Confluent Kafka brokers:
- Users can discover Solace PubSub+ queues and subscriptions
- Users can discover Apache and Confluent Kafka topics and consumer groups
- Users can discover Confluent Schema Registry schemas
- Users get discovered data in the form of JSON files separated by entity types
- The Event Management Agent architecture is currently in the form of Java packages
- The Event Management Agent has an open source plugin framework
- Support for additional Solace PubSub+ and Apache Kafka event broker authentication types (such as Kerberos) in the form of plugins
- Collection of topics from events flowing through Solace PubSub+ brokers
- Export discovered data as AsyncAPI specifications
- Addition of the infrastructure needed for the Event Management Agent to be a true open source project
- Discovery of Apache Kafka connectors
- Introduction of a UI for the Event Management Agent
- Support for additional broker types
- Event Management Agent executables
In this mode, the Event Management Agent connects to the Solace PubSub+ Event Portal. Scans are initiated from Event Portal and the scan results are uploaded automatically.
The Event Management Agent was tested to run with
- 1 CPU
- 1 GB RAM
- Docker
The Event Management Agent must be installed in a location that has access to your event brokers and schema registries.
Pull the Event Management Agent Docker image from Docker Hub:
docker pull solace/event-management-agent
- Log in to the Solace Cloud Console: https://console.solace.cloud/login/
- Follow the steps for generating the connection file described in the Solace documentation: https://docs.solace.com/Cloud/Event-Portal/event-portal-collect-runtime-data.htm#creating_connection_file
- For security reasons, passwords should be configured as environment variables. When generating the connection file, provide the environment variable names that will contain the passwords instead of the passwords themselves.
- Download the connection file to the machine where the Event Management Agent will run
- Create the environment variables you named in the connection file and set them to the actual passwords
Specify:
- the location of the connection file (e.g. /path/to/file/AcmeRetail.yml)
- the environment variables containing the authentication details of your event brokers and schema registries (e.g. KAFKA_PASSWORD)
- the name of the Event Management Agent container (e.g. event-management-agent)
- the Event Management Agent Docker image (e.g. solace/event-management-agent:latest)
docker run -d -p 8180:8180 -v /path/to/file/AcmeRetail.yml:/config/ema.yml --env KAFKA_PASSWORD=myKafkaPassword --name event-management-agent solace/event-management-agent:latest
NOTE: Depending on your OS, the container may fail to start if your connection filename contains spaces or special characters. If you see such failures, enclose the filename in quotes or ensure it does not contain spaces or special characters.
The Event Management Agent takes a couple of minutes to start. Its logs are available via Docker:
docker logs -f event-management-agent
The Event Management Agent is now connected to the Solace Cloud Console. Follow the steps in the documentation to run a Discovery scan: https://docs.solace.com/Cloud/Event-Portal/event-portal-collect-runtime-data.htm#collecting_runtime_data
The Event Management Agent must have access to your event brokers and schema registries.
- pull the Event Management Agent Docker image
docker pull solace/event-management-agent
- clone the Event Management Agent GitHub repository
git clone https://github.com/SolaceProducts/event-management-agent.git
cd event-management-agent
- duplicate the connection file found at:
service/application/src/main/resources/application.yml
- uncomment the sections relevant to your event brokers and schema registries
- provide the authentication details required to connect to your event brokers and schema registries (see the example snippet below)
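For example, an uncommented Kafka plugin section might look like the following sketch. The resource id mykafkacluster and the bootstrap server address are placeholders; the structure mirrors the plugins section shown in the Kubernetes ConfigMap example later in this document:

plugins:
  resources:
    - id: mykafkacluster                     # resource id used later in scan REST calls
      type: kafka
      name: my-kafka-cluster
      connections:
        - name: kafkaNoAuthConnection
          url: ${KAFKA_BOOTSTRAP_SERVERS:localhost:9092}
          authentication:
            - protocol: PLAINTEXT
              credentials:
                - type: noAuth
                  source: ENVIRONMENT_VARIABLE
                  operations:
                    - name: ALL
                  properties: []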
Specify:
- the location of the connection file (e.g. /path/to/file/AcmeRetail.yml)
- the name of the Event Management Agent container (e.g. event-management-agent)
- the Event Management Agent Docker image (e.g. solace/event-management-agent:latest)
docker run -d -p 8180:8180 -v /path/to/file/AcmeRetail.yml:/config/ema.yml --name event-management-agent solace/event-management-agent:latest
NOTE: Depending on your OS, the container may fail to start if your connection filename contains spaces or special characters. If you see such failures, enclose the filename in quotes or ensure it does not contain spaces or special characters.
The Event Management Agent takes a couple of minutes to start. Its logs are available via Docker:
docker logs -f event-management-agent
The Event Management Agent includes REST APIs that allow users to interact with the agent. See REST Documentation for additional information.
To run a scan:
- provide the id of the event broker you specified in the connection file, e.g. mykafkacluster
- provide the scan type, e.g. KAFKA_ALL
- provide the destination: FILE_WRITER
curl -X 'POST' \
  'http://localhost:8180/api/v2/ema/resources/mykafkacluster/scan' \
  -H 'accept: */*' \
  -H 'Content-Type: application/json' \
  -d '{
    "scanTypes": [
      "KAFKA_ALL"
    ],
    "destinations": [
      "FILE_WRITER"
    ]
  }'
The agent will return a scan id.
The Event Management Agent can export Discovery scans as zip files containing the entity type JSON files.
- provide the scan id, e.g. ih9z9lwwv5r
- provide the name of the zip file, e.g. ih9z9lwwv5r.zip
curl -X 'GET' \
  'http://localhost:8180/api/v2/ema/resources/export/ih9z9lwwv5r/zip' \
  -H 'accept: */*' \
  --output ih9z9lwwv5r.zip
You can manually upload Discovery scan zip files to Event Portal. To do so:
- Run and export a Discovery scan with a first Event Management Agent running in standalone mode (see Running the Event Management Agent in standalone mode)
- Set up a second Event Management Agent in connected mode (see Running the Event Management Agent in connected mode)
- Upload your Discovery zip file with the following curl command, providing the name of the zip file, e.g. scan.zip:
curl -X 'POST' \
  'http://localhost:8180/api/v2/ema/resources/import' \
  -H 'accept: */*' \
  -H 'Content-Type: multipart/form-data' \
  -F 'file=@scan.zip;type=application/zip'
- Java 17 (JDK 17.0.8_7+)
- Maven
git clone git@github.com:SolaceProducts/event-management-agent.git
cd event-management-agent/service
mvn clean install
The generated Event Management Agent jar file is found in:
application/target/event-management-agent-1.0.0-SNAPSHOT.jar
You can run the Event Management Agent as a process in both connected and standalone modes.
Specify:
- the Event Management Agent jar file (e.g. event-management-agent-1.0.0-SNAPSHOT.jar)
- the location of the connection file (e.g. /path/to/file/AcmeRetail.yml) - See Generating the Event Management Agent connection file to generate a connection file
java -jar application/target/event-management-agent-1.0.0-SNAPSHOT.jar --spring.config.location=/path/to/file/AcmeRetail.yml
The Event Management Agent is ready.
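The following example manifests show one way to deploy the Event Management Agent to Kubernetes: a ConfigMap that holds the agent configuration (mounted as ema.yml), a Secret that holds the broker passwords, and a Deployment that mounts both. The ema namespace and the ema-example1 names are examples; replace the TOFILL values with your own details.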
apiVersion: v1
kind: ConfigMap
metadata:
  name: ema-example1
  namespace: ema
data:
  ema-properties: |
    springdoc:
      packages-to-scan: com.solace.maas.ep.event.management.agent.scanManager.rest
      api-docs:
        path: /docs/event-management-agent
      swagger-ui:
        path: /event-management-agent/swagger-ui.html
    server:
      port: 8180
    spring:
      h2:
        console:
          path: /h2
          enabled: true
          settings:
            web-allow-others: true
      datasource:
        password: password
        driver-class-name: org.h2.Driver
        username: sa
        url: jdbc:h2:file:./data/cache;DB_CLOSE_ON_EXIT=FALSE
      jpa:
        hibernate:
          ddl-auto: create-drop
        database-platform: org.hibernate.dialect.H2Dialect
        defer-datasource-initialization: true
      servlet:
        multipart:
          max-request-size: ${MAX_REQUEST_SIZE:5MB}
          max-file-size: ${MAX_FILE_SIZE:5MB}
      main:
        allow-bean-definition-overriding: true
    camel:
      springboot:
        use-mdc-logging: true
    kafka:
      client:
        config:
          reconnections:
            max-backoff:
              unit: milliseconds
              value: 1000
            backoff:
              unit: milliseconds
              value: 50
          connections:
            max-idle:
              unit: milliseconds
              value: 10000
            request-timeout:
              unit: milliseconds
              value: 5000
            timeout:
              unit: milliseconds
              value: 60000
    eventPortal:
      gateway:
        id: TOFILL
        name: TOFILL
        messaging:
          standalone: false
          rtoSession: false
          enableHeartbeats: true
          testHeartbeats: true
          connections:
            - name: eventPortalGateway
              authenticationType: ${EP_GATEWAY_AUTH:basicAuthentication}
              msgVpn: ${EP_GATEWAY_MSGVPN:perf-evmr}
              url: ${EP_GATEWAY_URL:TOFILL}
              users:
                - name: TOFILL
                  password: TOFILL
                  username: ${EP_GATEWAY_USERNAME:TOFILL}
                  clientName: TOFILL
      organizationId: ${EP_ORGANIZATION_ID:TOFILL}
      runtimeAgentId: ${EP_RUNTIME_AGENT_ID:TOFILL}
      topicPrefix: ${EP_TOPIC_PREFIX:sc/ep/runtime}
    plugins:
      resources:
        - id: TOFILL
          type: kafka
          name: TOFILL
          connections:
            - name: kafkaNoAuthConnection
              url: ${KAFKA_BOOTSTRAP_SERVERS:TOFILL}
              authentication:
                - protocol: PLAINTEXT
                  credentials:
                    - type: noAuth
                      source: ENVIRONMENT_VARIABLE
                      operations:
                        - name: ALL
                      properties: []
        - id: TOFILL
          type: CONFLUENT_SCHEMA_REGISTRY
          name: TOFILL
          relatedServices:
            - TOFILL
          connections:
            - name: TOFILL
              url: ${CONFLUENT_SCHEMA_REGISTRY_URL:http://TOFILL}
apiVersion: v1
kind: Secret
metadata:
  name: broker-passwords
  namespace: ema
type: Opaque
data:
  MY_KAFKA_PASSWORD: bXlTdXBlclNlY3JldFBhc3N3b3JkCg==
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    test: deployment
  name: ema-example1
  namespace: ema
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ema
  template:
    metadata:
      labels:
        app: ema
    spec:
      containers:
        - image: docker.io/solace/event-management-agent:1.1.1
          imagePullPolicy: IfNotPresent
          name: event-management-agent
          volumeMounts:
            - mountPath: /config
              name: ema-example1
          env:
            - name: MY_KAFKA_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: broker-passwords
                  key: MY_KAFKA_PASSWORD
          ports:
            - containerPort: 8180
      restartPolicy: Always
      volumes:
        - configMap:
            defaultMode: 420
            items:
              - key: ema-properties
                mode: 420
                path: ema.yml
            name: ema-example1
            optional: true
          name: ema-example1
If you want to run the Event Management Agent in standalone (offline) mode, you must create a Kubernetes Service to make the pod reachable for REST commands from outside the cluster, as sketched below.
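A minimal sketch of such a Service, assuming the app: ema pod label and container port 8180 from the Deployment above; the nodePort value 30180 is an arbitrary example, and a LoadBalancer or Ingress works just as well:

apiVersion: v1
kind: Service
metadata:
  name: ema-example1
  namespace: ema
spec:
  type: NodePort            # exposes the REST API outside the cluster on each node
  selector:
    app: ema                # matches the pod label used by the Deployment above
  ports:
    - name: rest
      port: 8180            # service port inside the cluster
      targetPort: 8180      # containerPort of the Event Management Agent
      nodePort: 30180       # example node port; must be within the cluster's NodePort range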
The Event Management Agent comes with the following event or message broker plugins included:
- Apache and Confluent Kafka
- Solace PubSub+
- Confluent Schema Registry
- MSK
The default application.yml provides various plugin examples. For Kafka, the properties section under credentials is a passthrough: any property defined in the Kafka ConsumerConfig or SslConfigs classes can be listed there and is passed directly to the Kafka client, as in the sketch below.
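For illustration, a hypothetical credentials entry that passes two Kafka client settings straight through. This is a sketch only: ssl.truststore.location and session.timeout.ms are standard Kafka SslConfigs/ConsumerConfig keys, while the name/value list layout is an assumption to be checked against the plugin examples in the default application.yml:

credentials:
  - type: noAuth                           # credential type as in the default examples
    source: ENVIRONMENT_VARIABLE
    operations:
      - name: ALL
    properties:
      - name: ssl.truststore.location      # SslConfigs key, passed through to the Kafka client
        value: /path/to/truststore.jks     # hypothetical path
      - name: session.timeout.ms           # ConsumerConfig key, passed through to the Kafka client
        value: 45000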
If using AWS IAM, the AWS Access Key Id and AWS Secret Access Key need to be present. This can be done via a credentials file or environment variables as shown below:
Create a ~/.aws/credentials file with the following contents:
[default]
aws_access_key_id = <aws_access_key>
aws_secret_access_key = <aws_secret_key>
Alternatively, you can set these as environment variables (which also override the credentials file if both are present):
export AWS_ACCESS_KEY_ID=<aws_access_key>
export AWS_SECRET_ACCESS_KEY=<aws_secret_key>
See motivations
There are several interesting scenarios to test the Event Management Agent. These scenarios can be divided into two main categories according to the deployment mode.
- Testing the Event Management Agent as a standalone service (standalone deployment).
- Testing the end-to-end flow in Solace PubSub+ Console connected mode (from the frontend to the Event Portal, then to the Event Management Agent).
The most important test in standalone mode is to ensure that the Event Management Agent runs and collects data properly. To that end, the test includes the steps below:
- Update the plugins section of the application.yml with the details of the resource you want to scan.
- Start the Event Management Agent either from the IDE or by running the JAR file.
- Examine the on-console logs for a log from the ResourceConfig class indicating that the resource(s) have been created. Note: The logs may differ according to the resource(s) specified in the application.yml file.
  c.s.m.e.r.a.c.ResourceConfig : Created [kafka] resource with id:[sakdjf] and name: [some-name1]
  c.s.m.e.r.a.c.ResourceConfig : Created [solace] resource with id:[hdfgkdjf] and name: [some-name2]
- View the local Swagger documentation to learn about the available REST endpoints for the Event Management Agent. To access the Swagger documentation, use the link http://localhost:8180/event-management-agent/swagger-ui/index.html (Note: The Event Management Agent is under continuous development, so check the Swagger documentation to make sure that you are using the most recent endpoint schema.)
- Initiate a scan against a resource by sending a POST request to the endpoint that triggers the data collection: /api/v2/ema/resources/{resourceId}/scan. The request can be sent using either Postman or a curl command.
- Ensure that the destinations field in the request body contains FILE_WRITER, i.e., "destinations":["FILE_WRITER"], then send the request.
- Confirm that you receive a scan id, e.g., 3a41a0f5-cd85-455c-a863-9636f69dc7b2
- Examine the Event Management Agent console logs to make sure that the individual scan types are complete, e.g., Route subscriptionConfiguration completed for scanId 3a41a0f5-cd85-455c-a863-9636f69dc7b2
- Examine the collected data by browsing to the data_collection directory. This directory is organized as {schedule_id}/{scan_id}/{scan_type.json}
- Verify that the collected data contains a separate JSON file for each scan type.
- Verify the contents of each JSON file.
- Check the logs by browsing to data_collection/logs/{scan_id}.log and general-logs.log to make sure no exceptions or errors occurred.
- Finally, if you have added EVENT_PORTAL as a destination, check the Event Portal tables to confirm they contain the scanned data.
Contributions are encouraged! If you have ideas to improve an existing plugin, create a new plugin, or improve/extend the agent framework then please contribute!
Event Management Agent is available under the Apache License V2.
@gregmeldrum @slunelsolace @AHabes @MichaelDavisSolace @helrac @moodiRealist