feat!: add metrics hook #448

Merged · 1 commit · Jul 12, 2023
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,5 +1,5 @@
{
"libs/hooks/open-telemetry": "6.0.2",
"libs/hooks/open-telemetry": "0.1.0",
"libs/providers/go-feature-flag": "0.5.12",
"libs/providers/flagd": "0.7.7",
"libs/providers/flagd-web": "0.3.4",
39 changes: 30 additions & 9 deletions libs/hooks/open-telemetry/README.md
@@ -2,12 +2,14 @@

# OpenTelemetry Hook

The OpenTelemetry hook for OpenFeature provides a [spec compliant][otel-spec] way to automatically add a feature flag evaluation to a span as a span event. Since feature flags are dynamic and affect runtime behavior, it’s important to collect relevant feature flag telemetry signals. This can be used to determine the impact a feature has on a request, enabling enhanced observability use cases, such as A/B testing or progressive feature releases.
The OpenTelemetry hooks for OpenFeature provide a [spec compliant][otel-spec] way to automatically add feature flag evaluation information to traces and metrics.
Since feature flags are dynamic and affect runtime behavior, it’s important to collect relevant feature flag telemetry signals.
These can be used to determine the impact a feature has on application behavior, enabling enhanced observability use cases, such as A/B testing or progressive feature releases.

## Installation

```
$ npm install @openfeature/open-telemetry-hook
$ npm install @openfeature/open-telemetry-hooks
```

### Peer dependencies
@@ -18,33 +20,52 @@ Confirm that the following peer dependencies are installed.
$ npm install @openfeature/js-sdk @opentelemetry/api
```

## Hooks

### TracingHook

This hook adds a [span event](https://opentelemetry.io/docs/concepts/signals/traces/#span-events) for each feature flag evaluation.
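
For illustration, the snippet below is a minimal sketch of how a span event can be attached to the currently active span with the OpenTelemetry API; it is not the hook's verbatim source, and the flag key, provider name, and variant are placeholder values (the attribute names follow the OpenTelemetry feature-flag semantic conventions).

```typescript
import { trace } from '@opentelemetry/api';

// Attach a "feature_flag" span event to the active span, if one exists.
// The values below are placeholders for the evaluated flag's details.
trace.getActiveSpan()?.addEvent('feature_flag', {
  'feature_flag.key': 'my-flag',
  'feature_flag.provider_name': 'my-provider',
  'feature_flag.variant': 'enabled',
});
```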

### MetricsHook

This hook performs metric collection by tapping into various hook stages. Below are the metrics extracted by this hook; a minimal sketch of how they can be recorded follows the list:

- `feature_flag.evaluation_requests_total`
- `feature_flag.evaluation_success_total`
- `feature_flag.evaluation_error_total`
- `feature_flag.evaluation_active_count`
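
The snippet below is a minimal sketch of how such instruments can be created and updated with the `@opentelemetry/api` metrics API; it is illustrative rather than the hook's exact implementation, and the meter name and attribute keys are placeholders.

```typescript
import { metrics, Attributes } from '@opentelemetry/api';

// Placeholder meter name; the hook obtains its own meter from the global MeterProvider.
const meter = metrics.getMeter('@openfeature/open-telemetry-hooks');

// A monotonic counter for total requests and an up/down counter for in-flight evaluations.
const requestsTotal = meter.createCounter('feature_flag.evaluation_requests_total');
const activeCount = meter.createUpDownCounter('feature_flag.evaluation_active_count');

// In the `before` stage: record the request and mark the evaluation as active.
const attributes: Attributes = { key: 'my-flag', provider: 'my-provider' };
requestsTotal.add(1, attributes);
activeCount.add(1, attributes);

// In the `finally` stage: the evaluation has finished, so the active count goes back down.
activeCount.add(-1, attributes);
```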

## Usage

OpenFeature provides various ways to register hooks. The location that a hook is registered affects when the hook is run. It's recommended to register the `OpenTelemetryHook` globally in most situations but it's possible to only enable the hook on specific clients. You should **never** register the `OpenTelemetryHook` globally and on a client.
OpenFeature provides various ways to register hooks. The location that a hook is registered affects when the hook is run.
It's recommended to register both the `TracingHook` and `MetricsHook` globally in most situations, but it's possible to only enable the hooks on specific clients.
You should **never** register these hooks both globally and on a client.

More information on hooks can be found in the [OpenFeature documentation][hook-concept].

### Register Globally

The `OpenTelemetryHook` can be set on the OpenFeature singleton. This will ensure that every flag evaluation will always create a span event, if an active span is available.
The `TracingHook` and `MetricsHook` can both be set on the OpenFeature singleton.
This ensures that every flag evaluation generates the applicable telemetry signals.

```typescript
import { OpenFeature } from '@openfeature/js-sdk';
import { OpenTelemetryHook } from '@openfeature/open-telemetry-hook';
import { TracingHook } from '@openfeature/open-telemetry-hooks';

OpenFeature.addHooks(new OpenTelemetryHook());
OpenFeature.addHooks(new TracingHook());
```
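
Both hooks can also be registered together, for example:

```typescript
import { OpenFeature } from '@openfeature/js-sdk';
import { MetricsHook, TracingHook } from '@openfeature/open-telemetry-hooks';

OpenFeature.addHooks(new TracingHook());
OpenFeature.addHooks(new MetricsHook());
```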

### Register Per Client

The `OpenTelemetryHook` can be set on an individual client. This should only be done if it wasn't set globally and other clients shouldn't use this hook. Setting the hook on the client will ensure that every flag evaluation performed by this client will always create a span event, if an active span is available.
The `TracingHook` and `MetricsHook` can both be set on an individual client. This should only be done if the hooks weren't registered globally and other clients shouldn't use them.
Setting the hooks on the client ensures that every flag evaluation performed by this client generates the applicable telemetry signals.

```typescript
import { OpenFeature } from '@openfeature/js-sdk';
import { OpenTelemetryHook } from '@openfeature/open-telemetry-hook';
import { MetricsHook } from '@openfeature/open-telemetry-hooks';

const client = OpenFeature.getClient('my-app');
client.addHooks(new OpenTelemetryHook());
client.addHooks(new MetricsHook());
```

## Development
6 changes: 3 additions & 3 deletions libs/hooks/open-telemetry/package.json
@@ -1,6 +1,6 @@
{
"name": "@openfeature/open-telemetry-hook",
"version": "6.0.2",
"name": "@openfeature/open-telemetry-hooks",
"version": "0.1.0",
"repository": {
"type": "git",
"url": "https://github.com/open-feature/js-sdk-contrib.git",
@@ -15,7 +15,7 @@
},
"peerDependencies": {
"@openfeature/js-sdk": "^1.0.0",
"@opentelemetry/api": "^1.2.0"
"@opentelemetry/api": ">=1.3.0"
},
"license": "Apache-2.0"
}
3 changes: 2 additions & 1 deletion libs/hooks/open-telemetry/src/index.ts
@@ -1 +1,2 @@
export * from './lib/open-telemetry-hook';
export * from './lib/traces';
export * from './lib/metrics';
5 changes: 5 additions & 0 deletions libs/hooks/open-telemetry/src/lib/constants.ts
@@ -0,0 +1,5 @@
export const FEATURE_FLAG = 'feature_flag';
export const ACTIVE_COUNT_NAME = `${FEATURE_FLAG}.evaluation_active_count`;
export const REQUESTS_TOTAL_NAME = `${FEATURE_FLAG}.evaluation_requests_total`;
export const SUCCESS_TOTAL_NAME = `${FEATURE_FLAG}.evaluation_success_total`;
export const ERROR_TOTAL_NAME = `${FEATURE_FLAG}.evaluation_error_total`;
1 change: 1 addition & 0 deletions libs/hooks/open-telemetry/src/lib/metrics/index.ts
@@ -0,0 +1 @@
export * from './metrics-hook';
216 changes: 216 additions & 0 deletions libs/hooks/open-telemetry/src/lib/metrics/metrics-hook.spec.ts
@@ -0,0 +1,216 @@
import { BeforeHookContext, EvaluationDetails, HookContext, StandardResolutionReasons } from '@openfeature/js-sdk';
import opentelemetry from '@opentelemetry/api';
import {
DataPoint,
MeterProvider,
MetricReader,
ScopeMetrics,
} from '@opentelemetry/sdk-metrics';
import { ACTIVE_COUNT_NAME, ERROR_TOTAL_NAME, REQUESTS_TOTAL_NAME, SUCCESS_TOTAL_NAME } from '../constants';
import { MetricsHook } from './metrics-hook';

// no-op "in-memory" reader
class InMemoryMetricReader extends MetricReader {
protected onShutdown(): Promise<void> {
return Promise.resolve();
}
protected onForceFlush(): Promise<void> {
return Promise.resolve();
}
}

describe(MetricsHook.name, () => {
let reader: MetricReader;

beforeAll(() => {
reader = new InMemoryMetricReader();
const provider = new MeterProvider();

provider.addMetricReader(reader);

// Set this MeterProvider to be global to the app being instrumented.
const successful = opentelemetry.metrics.setGlobalMeterProvider(provider);
expect(successful).toBeTruthy();
});

describe(MetricsHook.prototype.before, () => {
it('should increment evaluation_active_count and evaluation_requests_total and set attrs', async () => {
const FLAG_KEY = 'before-test-key';
const PROVIDER_NAME = 'before-provider-name';
const hook = new MetricsHook();
const mockHookContext: BeforeHookContext = {
flagKey: FLAG_KEY,
providerMetadata: {
name: PROVIDER_NAME,
},
} as BeforeHookContext;

hook.before(mockHookContext);
const result = await reader.collect();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
ACTIVE_COUNT_NAME,
0,
(point) =>
point.value === 1 && point.attributes.key === FLAG_KEY && point.attributes.provider === PROVIDER_NAME
)
).toBeTruthy();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
REQUESTS_TOTAL_NAME,
0,
(point) =>
point.value === 1 && point.attributes.key === FLAG_KEY && point.attributes.provider === PROVIDER_NAME
)
).toBeTruthy();
});
});

describe(MetricsHook.prototype.after, () => {
describe('variant set', () => {
it('should increment evaluation_success_total and set attrs with variant = variant', async () => {
const FLAG_KEY = 'after-test-key';
const PROVIDER_NAME = 'after-provider-name';
const VARIANT = 'one';
const VALUE = 1;
const hook = new MetricsHook();
const mockHookContext: HookContext = {
flagKey: FLAG_KEY,
providerMetadata: {
name: PROVIDER_NAME,
},
} as HookContext;
const evaluationDetails: EvaluationDetails<number> = {
variant: VARIANT,
value: VALUE,
reason: StandardResolutionReasons.STATIC,
} as EvaluationDetails<number>;

hook.after(mockHookContext, evaluationDetails);
const result = await reader.collect();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
SUCCESS_TOTAL_NAME,
0,
(point) =>
point.value === 1 &&
point.attributes.key === FLAG_KEY &&
point.attributes.provider === PROVIDER_NAME &&
point.attributes.variant === VARIANT &&
point.attributes.reason === StandardResolutionReasons.STATIC
)
).toBeTruthy();
});

it('should increment evaluation_success_total and set attrs with variant = value', async () => {
const FLAG_KEY = 'after-test-key';
const PROVIDER_NAME = 'after-provider-name';
const VALUE = 1;
const hook = new MetricsHook();
const mockHookContext: HookContext = {
flagKey: FLAG_KEY,
providerMetadata: {
name: PROVIDER_NAME,
},
} as HookContext;
const evaluationDetails: EvaluationDetails<number> = {
value: VALUE,
reason: StandardResolutionReasons.STATIC,
} as EvaluationDetails<number>;

hook.after(mockHookContext, evaluationDetails);
const result = await reader.collect();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
SUCCESS_TOTAL_NAME,
1,
(point) =>
point.value === 1 &&
point.attributes.key === FLAG_KEY &&
point.attributes.provider === PROVIDER_NAME &&
point.attributes.variant === VALUE.toString() &&
point.attributes.reason === StandardResolutionReasons.STATIC
)
).toBeTruthy();
});
});
});

describe(MetricsHook.prototype.finally, () => {
it('should decrement evaluation_active_count and set attrs', async () => {
const FLAG_KEY = 'finally-test-key';
const PROVIDER_NAME = 'finally-provider-name';
const hook = new MetricsHook();
const mockHookContext: HookContext = {
flagKey: FLAG_KEY,
providerMetadata: {
name: PROVIDER_NAME,
},
} as HookContext;

hook.finally(mockHookContext);
const result = await reader.collect();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
ACTIVE_COUNT_NAME,
1,
(point) =>
point.value === -1 && point.attributes.key === FLAG_KEY && point.attributes.provider === PROVIDER_NAME
)
).toBeTruthy();
});
});

describe(MetricsHook.prototype.error, () => {
it('should increment evaluation_error_total and set attrs', async () => {
const FLAG_KEY = 'error-test-key';
const PROVIDER_NAME = 'error-provider-name';
const ERROR_MESSAGE = 'error message';
const error = new Error(ERROR_MESSAGE);
const hook = new MetricsHook();
const mockHookContext: HookContext = {
flagKey: FLAG_KEY,
providerMetadata: {
name: PROVIDER_NAME,
},
} as HookContext;

hook.error(mockHookContext, error);
const result = await reader.collect();
expect(
hasDataPointMatching(
result.resourceMetrics.scopeMetrics,
ERROR_TOTAL_NAME,
0,
(point) =>
point.value === 1 && point.attributes.key === FLAG_KEY && point.attributes.provider === PROVIDER_NAME
)
).toBeTruthy();
});
});
});

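// Test helper: searches the collected scope metrics for a metric with the given name,
// then checks the data point at the given index against the supplied matcher.
// Throws if no matching data point is found, so assertion failures are explicit.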
const hasDataPointMatching = (
scopeMetrics: ScopeMetrics[],
metricName: string,
dataPointIndex: number,
dataPointMatcher: (dataPoint: DataPoint<number>) => boolean
) => {
const found = scopeMetrics.find((sm) =>
sm.metrics.find((m) => {
const point = m.dataPoints[dataPointIndex] as DataPoint<number>;
if (point) {
return m.descriptor.name === metricName && dataPointMatcher(point);
}
})
);
if (!found) {
throw Error('Unable to find matching datapoint');
}
return found;
};