Spark Messaging is a message queue system that allows Vault Java SDK developers to send and receive messages from a durable queue. It enables loosely coupled, asynchronous integration within a Vault, between different Vaults, or between a Vault and an external system. This page describes example use cases of integration between a Vault and an external third-party system and provides sample code that you can use with your Vault to step through the code in the Vault Java SDK Debugger.
The vsdk-spark-external-aws-sample project contains an example of an External Connection integration using Spark Messaging that sends messages from a Vault to an external Amazon Web Services (AWS) SQS queue for processing. The project demonstrates:
- How to create a sample third-party AWS application:
- Amazon Simple Queue Service (SQS): Stores Spark messages in AWS for processing.
- Amazon API Gateway: Point of entry from Vault to AWS.
- AWS Lambda: Verifies, enqueues, and processes Spark messages.
- How to configure the source Vault:
- External Connection: Sends a simple message to AWS for processing.
Users request loans for items by creating Loan Approval records in Vault. They must enter basic information about the loan request such as the name of the person requiring the loan, the item, the amount, and the loan duration. Once saved, a request is sent to an external finance system within AWS for approval or rejection based on the specified loan criteria. The loan approval record within Vault is then updated with an approval decision and, if approved, with details of the loan.
Users can make subsequent loan approval requests for the same item via a Loan re-quote record action.
- An AWS account. If you don't have one, sign up for an account at AWS Free Tier.
- A source Vault. You must have the Vault Owner security profile and the appropriate permissions to complete this project.
There are four main setup steps:
- Create AWS Application Services in the external system that will process the messages from Vault.
- Create a Connection record in the Vault that links it to AWS.
- Import Vault Packages to create necessary components and Spark Message Queues.
- Create a Queue object in the Vault for the Spark Messages.
The project contains a Vault Package File (VPK) in the "deploy-vpk" directory with the necessary objects, queues, and Vault Java SDK code.
- In the Amazon IAM console, click Roles in the menu on the left hand side of the page.
- Click Create role.
- Select AWS service as the trusted entity type and then select Lambda as the use case.
- Click Next: Permissions.
- Add the following permissions policies by searching for the policy name and clicking the check mark next to it when it appears (if you have multiple results, select the policy with a type of "AWS managed"):
- AWSLambdaBasicExecutionRole
- AWSLambdaSQSQueueExecutionRole
- AmazonSQSFullAccess
- AmazonS3FullAccess
- Click Next.
- For Role Name, enter SparkSampleAwsRole.
- Click Create role.
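Behind these console steps, the role is defined by a trust policy that lets the Lambda service assume it, plus the four managed permissions policies. A sketch of the trust document the console generates for the choices above (shown in Python for inspection; the JSON itself is what IAM stores):

```python
import json

# Trust policy the IAM console generates when you choose "AWS service"
# as the trusted entity type and "Lambda" as the use case.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```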
- In the Amazon SQS console, click Create New Queue.
- Configure the new Queue:
- For Queue Type, choose Standard.
- For Queue Name, enter vsdk-spark-sample-external-queue.
- Select Create Queue.
- With the vsdk-spark-sample-external-queue SQS queue selected, select the Access policy tab.
- Select Edit within the Access policy (Permissions) section.
- Under Access policy, launch the Policy generator.
- Within the Policy Generator, do the following:
- For Type of Policy, select SQS Queue Policy.
- For Effect, select Allow.
- For Principal, enter * (this grants access to everybody).
- For Actions, choose the following:
- DeleteMessage
- GetQueueUrl
- ReceiveMessage
- SendMessage
- Enter your specific ARN, which follows the format: arn:aws:sqs:${Region}:${Account}:${QueueName}
- Click Add Statement
- Click Generate Policy
- Copy the generated policy, navigate back to your SQS Queue, paste the policy into the Access Policy field, and click Save.
- With your newly created SQS queue selected, copy the URL displayed in the Details section, as this will be needed later in configuring AWS in the section Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage.
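The Policy generator output for the choices above looks roughly like the following sketch; the region and account ID are placeholders for your own values.

```python
import json

# Placeholder values -- substitute your own region and account ID.
region, account, queue_name = "us-east-1", "123456789012", "vsdk-spark-sample-external-queue"
queue_arn = f"arn:aws:sqs:{region}:{account}:{queue_name}"

# Approximation of what the Policy generator emits for the choices above:
# Effect Allow, Principal * (everybody), and the four selected SQS actions.
access_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "sqs:DeleteMessage",
                "sqs:GetQueueUrl",
                "sqs:ReceiveMessage",
                "sqs:SendMessage",
            ],
            "Resource": queue_arn,
        }
    ],
}

print(json.dumps(access_policy, indent=2))
```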
- In the Amazon S3 console, click Create Bucket.
- Configure the S3 bucket:
- For Bucket name, enter vsdk-spark-external-aws-s3-bucket.
- For Region, choose your Region.
- Leave the other configuration settings at their defaults and click Create Bucket.
- Within the S3 bucket, create a folder named PublicKeys.
- Note down the bucket name as it will be needed for the next step.
- Clone or download the sample Maven project, vSDK Spark AWS External Sample, from GitHub and save it to your local machine.
- In the AWS Lambda console, click Create function.
- Set the following values:
- Select Author from scratch.
- For Name, enter vsdkSparkSampleValidateAndEnqueueMessage.
- For Runtime, select Java 8.
- Under the Change default execution role drop-down, select Use an existing role.
- For Existing role, select SparkSampleAwsRole which we created previously.
- Click Create function.
- In the Runtime settings section, click Edit and set the Handler to com.veeva.vault.LambdaHandler.
- Within the Code source section, select Upload from .zip or .jar file.
- Click Upload.
- From the vSDK Spark AWS External Sample project downloaded in step 1, select the \aws-lambda-samples\vsdkSparkSampleValidateAndEnqueueMessage.jar file.
- Click Save.
- Navigate to the Environment variables section of the Configuration tab and add the following:
- Key: VAULT_SAMPLE_SQS_QUEUE_URL; Value: URL for the queue recorded in the Create SQS Queue section.
- Key: VAULT_HOSTNAME; Value: {YourVault}, e.g. https://your-vault.veeva.com.
- Key: VAULT_USER; Value: Your Vault Integration service account's username.
- Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
- Key: BUCKET_NAME; Value: The name of the S3 bucket you created in the last section.
- In the General configuration section of the Configuration tab, set the Timeout to 0 mins and 20 sec.
- Click Save.
- Click Functions in the breadcrumb trail at the top of the page to return to the main functions page.
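The shipped function is a prebuilt Java jar, but conceptually it reads its configuration from the environment variables set above. A minimal Python sketch of that lookup, with placeholder values (the key names match the list above; everything else is illustrative):

```python
import os

def load_config():
    """Read the Lambda's configuration from the environment variables
    set in the Configuration tab (names match the keys listed above)."""
    return {
        "queue_url": os.environ["VAULT_SAMPLE_SQS_QUEUE_URL"],
        "vault_hostname": os.environ["VAULT_HOSTNAME"],
        "vault_user": os.environ["VAULT_USER"],
        "vault_password": os.environ["VAULT_PASSWORD"],
        "bucket_name": os.environ["BUCKET_NAME"],
    }

# Example with placeholder values:
os.environ.update({
    "VAULT_SAMPLE_SQS_QUEUE_URL": "https://sqs.us-east-1.amazonaws.com/123456789012/vsdk-spark-sample-external-queue",
    "VAULT_HOSTNAME": "https://your-vault.veeva.com",
    "VAULT_USER": "integration.user@your-vault.com",
    "VAULT_PASSWORD": "********",
    "BUCKET_NAME": "vsdk-spark-external-aws-s3-bucket",
})
print(load_config()["queue_url"])
```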
- While still in the AWS Lambda console, click Create function.
- Set the following values:
- Select Author from scratch.
- For Name, enter vsdkSparkSampleProcessMessage.
- For Runtime, select Python 3.11.
- Under the Change default execution role drop-down, select Use an existing role.
- For Existing role, select SparkSampleAwsRole which we created previously.
- Click Create function.
- Within the Code source section, select Upload from .zip file.
- Click Upload.
- From the vSDK Spark AWS External Sample project downloaded in step 1, select the \aws-lambda-samples\vsdkSparkSampleProcessMessage.zip file.
- Click Save.
- Navigate to the Environment variables section of the Configuration tab and add the following:
- Key: CLIENT_ID; Value: vsdk-sparksample-aws-process-message.
- Key: VAULT_API_BURST_LIMIT_CUTOFF; Value: 200.
- Key: VAULT_VQL_PAGE_LIMIT; Value: 500.
- Key: VAULT_REST_API_BASE_URL; Value: https://{YourVault}/api/v22.2/.
- Key: VAULT_USER; Value: Your Vault Integration service account's username.
- Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
- In the General configuration section of the Configuration tab, set the Timeout to 0 mins and 30 sec.
- Click Save.
- In the Triggers section of the Configuration tab, select Add trigger.
- In the Trigger configuration, select the SQS Queue vsdk-spark-sample-external-queue.
- Make sure Activate trigger is checked and click Add.
Note: Once this trigger is enabled, background processing will run in your AWS environment and count towards your billable resources. To stop this at any time, disable the trigger and click Save.
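The shipped zip contains the real implementation, but the skeleton below shows the shape of the SQS event a Python 3.11 handler receives once the trigger fires; each record body carries the Spark message forwarded through vsdk-spark-sample-external-queue. Everything beyond the event shape is illustrative.

```python
import json

def lambda_handler(event, context):
    """Skeleton of an SQS-triggered handler: each record's body carries
    a Spark message forwarded from vsdk-spark-sample-external-queue."""
    processed = []
    for record in event.get("Records", []):
        message = json.loads(record["body"])
        # The real function evaluates the loan request here and calls back
        # into Vault via the REST API to update the Loan Approval record.
        processed.append(message)
    return {"batchItemFailures": [], "processed": len(processed)}

# Example invocation with a fake SQS event:
fake_event = {"Records": [{"body": json.dumps({"loan_amount": 1000})}]}
print(lambda_handler(fake_event, None))  # → {'batchItemFailures': [], 'processed': 1}
```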
- Navigate to the API Gateway console, and click Create API.
- Select the HTTP API option and click Build.
- Create and configure integrations:
- Click Add integration, choose Lambda from the drop-down, then choose the vsdkSparkSampleValidateAndEnqueueMessage lambda from the Lambda Function search box.
- For API name, enter vSdkSparkSampleAPIGateway.
- Click Next, then select the POST option in the Method drop-down, using /message as the Resource Path.
- Click Next, then Next, and then Create.
- Record the Invoke URL displayed in the Stages section, as this will be needed later in configuring Vault during the Create Connection step.
The creation of API Gateway vSdkSparkSampleAPIGateway is now complete.
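Vault will deliver Spark messages to this gateway as an HTTP POST on the /message route. The sketch below builds (but does not send) such a request; the invoke URL and body shape are placeholders, not the exact Spark message format.

```python
import json
import urllib.request

invoke_url = "https://abc123.execute-api.us-east-1.amazonaws.com"  # placeholder Invoke URL

# Spark messages arrive as an HTTP POST on the /message route configured above.
payload = json.dumps({"message": {"attributes": {}, "items": []}}).encode()
request = urllib.request.Request(
    url=f"{invoke_url}/message",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(request.full_url)  # → https://abc123.execute-api.us-east-1.amazonaws.com/message
```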
The Connection object is used to create records that define connections between different Vaults or between a Vault and an external system.
In this use case, we will create an External connection record to link a source Vault to an external system.
- Log in and navigate to Business Admin > Connections and click Create.
- Choose the External connection type. Select Continue.
- Enter AWS Queue Sample API Gateway in the Name field.
- Enter vsdk_aws_queue_sample_api_gateway in the API Name field.
- Enter your API Gateway Invoke URL in the URL field. This is the value you recorded when creating the AWS API Gateway earlier and will look something like https://{some-alpha-numeric-string}.execute-api.{region}.amazonaws.com/. Important: make sure /message is not included at the end of the URL.
- Enter your Vault Integration service account user in the Authorized Connection User field.
- Select Save.
The connection is now ready for use.
You must deploy the VPKs to your Vault prior to debugging these use cases.
- Retrieve the vSDK Spark AWS External Sample Maven project that was previously downloaded from GitHub in the section Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage.
- Run through the Getting Started guide to set up your deployment environment, if needed.
- In Vault, log in and navigate to Admin > Deployment > Inbound Packages and click Import.
- Deploy Vault components and code: select the \deploy-vpk\vault\vsdk-spark-external-aws-sample-components.vpk file.
- From the Actions menu, select Review & Deploy. Vault displays a list of all components in the package.
- Review the prompts to deploy the package. You will receive an email when the deployment is complete.
Once the package has been deployed, you should review the configuration and understand the normal behavior so you can observe the effects of the sample trigger code.
The package includes the following components.
- Object: Loan approval (vsdk_loan_approval__c)
- Fields: Loan Period (Months) (vsdk_loan_period_months1__c) and Approval Status (vsdk_approval_status1__c)
- Trigger: com.veeva.vault.custom.triggers.vSdkSparkExternalAwsSampleTrigger - AFTER INSERT (vSdkAwsQueueSampleTrigger.java)
- Record action: Loan re-quote (vSdkSparkExternalAwsSampleAction.java)
Within Vault, you must configure a Spark Queue to utilize Spark messaging functionality. This queue will handle messages that are produced via the Vault Java SDK Message and QueueService interfaces.
- Log in and navigate to Admin > Connections > Spark Queues and click Create.
- Set the following values:
- For Label, enter vSDK AWS Queue Sample.
- For Queue Type, select Outbound.
- Click Save.
- On the new Queue, scroll down to the Queue Connections section and click Create.
- Select the vsdk_aws_queue_sample_api_gateway and click Save.
- Run the project provides details of how to run the project.
- Code logic provides a detailed understanding of how the sample components work.