
Update Blob Storage Forwarder to use HTTP #727

Merged

Conversation

@mattsp1290 (Member) commented Dec 29, 2023

What does this PR do?

Transitions the Azure Blob Storage Forwarder from TCP to HTTP. In addition, this PR enables running the mocha tests through GitHub Actions.

Motivation

TCP is difficult to troubleshoot, and our logs team here at Datadog has recommended transitioning over to HTTP. In addition, this brings our EventHub and Blob Storage forwarders in line in terms of code and functionality.

Testing Guidelines

Tests have been added and enabled in CI. This has also been set up in a test org at Datadog, confirming that the blob forwarder can still submit logs.

Types of changes

  • Bug fix
  • New feature
  • Breaking change
  • Misc (docs, refactoring, dependency upgrade, etc.)

Check all that apply

  • This PR's description is comprehensive
  • This PR contains breaking changes that are documented in the description
  • This PR introduces new APIs or parameters that are documented and unlikely to change in the foreseeable future
  • This PR impacts documentation, and it has been updated (or a ticket has been logged)
  • This PR's changes are covered by the automated tests
  • This PR collects user input/sensitive content into Datadog
  • This PR passes the integration tests (ask a Datadog member to run the tests)
  • This PR passes the unit tests
  • This PR passes the installation tests (ask a Datadog member to run the tests)

@parsons90 (Contributor) left a comment


One nit

On azure/blobs_logs_monitoring/index.js (thread resolved)
working-directory: azure
- run: npm run test
working-directory: azure


Why did this PR require a new workflow for running tests? Wouldn't the existing workflows run the tests?

@mattsp1290 (Member, Author) replied:

During the PR process, I found there was no existing workflow that ran the tests.

- var tls = require('tls');
-
- const VERSION = '0.2.0';
+ const VERSION = '1.0.0';


Is the HTTP version requirement coming from Azure? Any reason why we do not use HTTP/2?

@mattsp1290 (Member, Author) replied:

This is the version of the blob forwarder itself. The move from TLS to HTTP is an internal requirement.

const DD_URL = process.env.DD_URL || 'functions-intake.logs.' + DD_SITE;
const DD_PORT = process.env.DD_PORT || DD_SITE === 'datadoghq.eu' ? 443 : 10516;
const DD_HTTP_URL = process.env.DD_URL || 'http-intake.logs.' + DD_SITE;
const DD_HTTP_PORT = process.env.DD_PORT || 443;


Why do we no longer use different ports for EU vs. other sites?

@mattsp1290 (Member, Author) replied:

It was relevant for TCP, and TCP is only supported in us1 and eu1.

@kanishktripathi (Contributor) left a comment

One question

this.scrubber = new Scrubber(this.context, SCRUBBER_RULE_CONFIGS);
this.batcher = new Batcher(
this.context,
256 * 1000,
@kanishktripathi (Contributor) commented Jan 4, 2024

How were these limits decided? Is it based on the doc here? The single-record limit and the max entries in a payload can be higher.

@mattsp1290 (Member, Author) replied:

Uncertain; they were added in a 2021 PR for the event hub forwarder:
#456
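The batching behavior being discussed above can be sketched as follows. This is illustrative only: the real `Batcher` constructor takes the context and other arguments, and the class below is a simplified stand-in for the size-limit logic.

```javascript
// Simplified sketch of payload batching with a byte-size cap per batch,
// in the spirit of the 256 * 1000 limit quoted in the diff. This is a
// stand-in, not the forwarder's actual Batcher class.
class Batcher {
    constructor(maxBatchSizeBytes, maxRecordSizeBytes) {
        this.maxBatchSizeBytes = maxBatchSizeBytes;
        this.maxRecordSizeBytes = maxRecordSizeBytes;
    }

    // Split records into batches whose serialized size stays under the cap.
    batch(records) {
        const batches = [];
        let current = [];
        let currentSize = 0;
        for (const record of records) {
            const size = Buffer.byteLength(JSON.stringify(record));
            if (size > this.maxRecordSizeBytes) {
                continue; // drop records too large to ever fit
            }
            if (currentSize + size > this.maxBatchSizeBytes && current.length > 0) {
                batches.push(current);
                current = [];
                currentSize = 0;
            }
            current.push(record);
            currentSize += size;
        }
        if (current.length > 0) {
            batches.push(current);
        }
        return batches;
    }
}
```

Raising the per-record and per-batch limits, as suggested in the comment, would only mean changing the two constructor arguments here.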

@kanishktripathi (Contributor) commented:

It's not a blocker for now, but we should look into gzip-compressing the payloads.

@mattsp1290 (Member, Author) replied:

> It's not a blocker for now, but we should look into gzip-compressing the payloads.

Agreed. We should have a task to knock this out for both forwarders now that they share the same architecture.

@mattsp1290 mattsp1290 merged commit 3a23cd9 into master Jan 5, 2024
13 checks passed
@mattsp1290 mattsp1290 deleted the matt.spurlin/update-blob-storage-forwarder-to-use-http branch January 5, 2024 22:21
4 participants