feat: Introducing Logging Class (Disabled Usage of Logger) #608

Merged · 38 commits · Apr 4, 2024
Changes from 17 commits
Commits (38)
e49e4ae
feat: prototyped new logs schema
Kevin101Zhang Mar 19, 2024
cc7a10a
prototype indexer-logger class
Kevin101Zhang Mar 25, 2024
a82aa7b
chore: removed test code
Kevin101Zhang Mar 25, 2024
4f33dd4
rebase
Kevin101Zhang Mar 26, 2024
dc80f7e
moved ddl, mocked batch func, updated createLogs calls after provisio…
Kevin101Zhang Mar 26, 2024
56dea50
add batch insert
Kevin101Zhang Mar 26, 2024
e1fb2e9
removed local variable for database connections
Kevin101Zhang Mar 26, 2024
7648295
added tracer
Kevin101Zhang Mar 26, 2024
ad1f5ab
added pg-format, add constructor, removed unnecessary code, timestam…
Kevin101Zhang Mar 29, 2024
58623aa
Merge branch 'main' of https://github.com/near/queryapi into 299-impl…
Kevin101Zhang Mar 31, 2024
039789d
fix: readded indexer snap file
Kevin101Zhang Mar 31, 2024
81824ce
fix: modified logger-test
Kevin101Zhang Mar 31, 2024
57b6bda
fix: removed cron from logs-table
Kevin101Zhang Mar 31, 2024
8afef13
fix: removed log_ prefix
Kevin101Zhang Mar 31, 2024
cc37561
combined calls in indexer-logs, additional minor fixes
Kevin101Zhang Apr 1, 2024
f3c4b37
chore:lint indexer.ts
Kevin101Zhang Apr 1, 2024
7bb8e4e
fix: refactored indexer-logger, ran eslint, adjusted indexer.ts
Kevin101Zhang Apr 2, 2024
4083dda
fix: added test to skip log when logLevel<, renamed trace name
Kevin101Zhang Apr 2, 2024
5265128
chore: added newline at eof in docker
Kevin101Zhang Apr 2, 2024
eea34ce
fix: renamed runSql to executeSqlOnSchema
Kevin101Zhang Apr 2, 2024
9d13c85
removed duplicate postgres.dockerfile
Kevin101Zhang Apr 2, 2024
cf8803f
feat: LogLevel is now owned by IndexerLogger
Kevin101Zhang Apr 3, 2024
04a2eb3
added single typed arguments, schema blockheight is now optional
Kevin101Zhang Apr 3, 2024
f0f46f9
fix: corrected pathing for test
Kevin101Zhang Apr 3, 2024
470eab3
Merge branch 'main' into 299-implement-the-new-logs-schema
Kevin101Zhang Apr 3, 2024
849dec6
fix: docker-compose reverted to main
Kevin101Zhang Apr 3, 2024
97390b0
updated indexer test params, instantiated indexerlogger
Kevin101Zhang Apr 3, 2024
c3b5f49
chore:lint
Kevin101Zhang Apr 3, 2024
f609269
fix: added indexerLogger to latest param to integration test
Kevin101Zhang Apr 3, 2024
296401c
updated mutation name from writeLogOld to writeLog
Kevin101Zhang Apr 4, 2024
7254ad8
temp: callWriteLog for streamhandler
Kevin101Zhang Apr 4, 2024
eff6eda
fix: readded back funccall in streamhandler
Kevin101Zhang Apr 4, 2024
6aa96f1
commented out streamhandler/provisioner/test
Kevin101Zhang Apr 4, 2024
d49cb12
reverted back indexer/test
Kevin101Zhang Apr 4, 2024
a5e1af9
reverted integration test
Kevin101Zhang Apr 4, 2024
2190587
repositioned runningMessage in indexer
Kevin101Zhang Apr 4, 2024
6574db8
added new line snapfile
Kevin101Zhang Apr 4, 2024
b816f9f
renamed provisioner methods
Kevin101Zhang Apr 4, 2024
4 changes: 4 additions & 0 deletions postgres.Dockerfile
@@ -0,0 +1,4 @@
FROM postgres:14
RUN apt-get update && apt-get install -y postgresql-14-cron
EXPOSE 5432
CMD ["postgres", "-c", "shared_preload_libraries=pg_cron"]
4 changes: 3 additions & 1 deletion docker-compose.yml
@@ -78,6 +78,8 @@ services:
- "6379:6379"

postgres:
build:
dockerfile: ./postgres.Dockerfile
image: postgres:14
restart: always
volumes:
@@ -160,4 +162,4 @@ services:
volumes:
postgres:
redis:
grafana:
grafana:
2 changes: 1 addition & 1 deletion runner/src/hasura-client/hasura-client.test.ts
@@ -72,7 +72,7 @@ describe('HasuraClient', () => {
});
const client = new HasuraClient({ fetch: mockFetch as unknown as typeof fetch }, config);

await client.runMigrations('dbName', 'schemaName', 'CREATE TABLE blocks (height numeric)');
await client.runSql('dbName', 'schemaName', 'CREATE TABLE blocks (height numeric)');

expect(mockFetch.mock.calls).toMatchSnapshot();
});
5 changes: 2 additions & 3 deletions runner/src/hasura-client/hasura-client.ts
@@ -155,11 +155,11 @@ export default class HasuraClient {
});
}

async runMigrations (source: string, schemaName: string, migration: string): Promise<any> {
async runSql (source: string, schemaName: string, sqlScript: string): Promise<any> {
return await this.executeSql(
`
set schema '${schemaName}';
${migration}
${sqlScript}
`,
{ source, readOnly: false }
);
@@ -172,7 +172,6 @@
source,
}
);

return tablesInSource
.filter(({ schema }: { schema: string }) => schema === schemaName)
.map(({ name }: { name: string }) => name);
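For context: the rename from runMigrations to runSql reflects that the helper now runs arbitrary SQL scripts against a schema (such as the new logs DDL), not just user migrations. A hedged sketch of how it might be invoked to create the __logs table; only the runSql signature and the column names come from this diff, the column types and constraints below are assumptions and the real DDL is defined elsewhere in this PR:

// Hypothetical call site; the DDL is an illustrative assumption.
const logsTableDDL = `
  CREATE TABLE IF NOT EXISTS __logs (
    block_height NUMERIC(20),
    date DATE NOT NULL,
    timestamp TIMESTAMP NOT NULL,
    type TEXT NOT NULL,
    level TEXT NOT NULL,
    message TEXT NOT NULL
  );
`;
await hasuraClient.runSql('dbName', 'schemaName', logsTableDDL);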
116 changes: 116 additions & 0 deletions runner/src/indexer-logger/indexer-logger.test.ts
@@ -0,0 +1,116 @@
import pgFormat from 'pg-format';
import IndexerLogger from './indexer-logger';
import type PgClient from '../pg-client';
import { LogLevel } from '../stream-handler/stream-handler';
import { LogType, type LogEntry } from './indexer-logger';

describe('IndexerLogger', () => {
let pgClient: PgClient;
let query: jest.Mock;

beforeEach(() => {
query = jest.fn().mockReturnValue({ rows: [] });
pgClient = {
query,
format: pgFormat
} as unknown as PgClient;
});

const mockDatabaseConnectionParameters = {
username: 'test_user',
password: 'test_password',
host: 'test_host',
port: 5432,
database: 'test_database'
};
const functionName = 'testFunction';

describe('writeLog', () => {
it('should insert a single log entry into the database', async () => {
const indexerLogger = new IndexerLogger(functionName, 5, mockDatabaseConnectionParameters, pgClient);
const logEntry: LogEntry = {
blockHeight: 123,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message'
};

await indexerLogger.writeLogs(logEntry);

const expectedQueryStructure = `INSERT INTO "${functionName}".__logs (block_height, date, timestamp, type, level, message) VALUES`;
expect(query.mock.calls[0][0]).toContain(expectedQueryStructure);
});

it('should handle errors when inserting a single log entry', async () => {
query.mockRejectedValueOnce(new Error('Failed to insert log'));

const indexerLogger = new IndexerLogger(functionName, 5, mockDatabaseConnectionParameters, pgClient);
const logEntry: LogEntry = {
blockHeight: 123,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message'
};

await expect(indexerLogger.writeLogs(logEntry)).rejects.toThrow('Failed to insert log');
});

it('should insert a batch of log entries into the database', async () => {
const indexerLogger = new IndexerLogger(functionName, 5, mockDatabaseConnectionParameters, pgClient);
const logEntries: LogEntry[] = [
{
blockHeight: 123,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message 1'
},
{
blockHeight: 124,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message 2'
}
];

await indexerLogger.writeLogs(logEntries);

const expectedQuery = `INSERT INTO "${functionName}".__logs (block_height, date, timestamp, type, level, message) VALUES`;
expect(query.mock.calls[0][0]).toContain(expectedQuery);
});

it('should handle errors when inserting a batch of log entries', async () => {
query.mockRejectedValueOnce(new Error('Failed to insert batch of logs'));

const indexerLogger = new IndexerLogger(functionName, 5, mockDatabaseConnectionParameters, pgClient);
const logEntries: LogEntry[] = [
{
blockHeight: 123,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message 1'
},
{
blockHeight: 124,
logTimestamp: new Date(),
logType: LogType.SYSTEM,
logLevel: LogLevel.INFO,
message: 'Test log message 2'
}
];

await expect(indexerLogger.writeLogs(logEntries)).rejects.toThrow('Failed to insert batch of logs');
});

it('should handle empty log entry', async () => {
const indexerLogger = new IndexerLogger(functionName, 5, mockDatabaseConnectionParameters, pgClient);
const logEntries: LogEntry[] = [];
await indexerLogger.writeLogs(logEntries);
expect(query).not.toHaveBeenCalled();
});
});
});
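Commit 4083dda later adds coverage for skipping entries below the configured level; the version of the test file shown here predates it. A hedged sketch of what such a test could look like, using a threshold higher than any LogLevel value so the assertion holds regardless of the enum's numeric values:

it('should skip log entries below the configured logging level', async () => {
  // Hypothetical test: a threshold above every LogLevel value forces the entry to be filtered out.
  const indexerLogger = new IndexerLogger(functionName, Number.MAX_SAFE_INTEGER, mockDatabaseConnectionParameters, pgClient);
  const logEntry: LogEntry = {
    blockHeight: 123,
    logTimestamp: new Date(),
    logType: LogType.SYSTEM,
    logLevel: LogLevel.INFO,
    message: 'Should be skipped'
  };

  await indexerLogger.writeLogs(logEntry);

  expect(query).not.toHaveBeenCalled();
});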
77 changes: 77 additions & 0 deletions runner/src/indexer-logger/indexer-logger.ts
@@ -0,0 +1,77 @@
import format from 'pg-format';
import { wrapError } from '../utility';
import PgClient from '../pg-client';
import { type DatabaseConnectionParameters } from '../provisioner/provisioner';
import { LogLevel } from '../stream-handler/stream-handler';
import { trace } from '@opentelemetry/api';

export interface LogEntry {
blockHeight: number
logTimestamp: Date
logType: string
logLevel: LogLevel
message: string
}

export enum LogType {
SYSTEM = 'system',
USER = 'user',
}
export default class IndexerLogger {
tracer = trace.getTracer('queryapi-runner-indexer-logger');

private readonly pgClient: PgClient;
private readonly schemaName: string;
private readonly logInsertQueryTemplate: string = 'INSERT INTO %I.__logs (block_height, date, timestamp, type, level, message) VALUES %L';
private readonly loggingLevel: number;

constructor (
functionName: string,
loggingLevel: number,
databaseConnectionParameters: DatabaseConnectionParameters,
pgClientInstance: PgClient | undefined = undefined
) {
const pgClient = pgClientInstance ?? new PgClient({
user: databaseConnectionParameters.username,
password: databaseConnectionParameters.password,
host: process.env.PGHOST,
Collaborator review comment on this line:

Suggested change
host: process.env.PGHOST,
host: databaseConnectionParameters.host

With the talk of multiple database instances there is potential that this could be wrong; it's unlikely, but it will avoid headache in the future :)

port: Number(databaseConnectionParameters.port),
database: databaseConnectionParameters.database,
});

this.pgClient = pgClient;
this.schemaName = functionName.replace(/[^a-zA-Z0-9]/g, '_');
this.loggingLevel = loggingLevel;
}

private shouldLog (logLevel: LogLevel): boolean {
return logLevel >= this.loggingLevel;
}

async writeLogs (
logEntries: LogEntry | LogEntry[],
): Promise<void> {
const entriesArray = (Array.isArray(logEntries) ? logEntries : [logEntries]).filter(entry => this.shouldLog(entry.logLevel));
if (entriesArray.length === 0) return;

const spanMessage = `call writeLog function of IndexerLogger for ${entriesArray.length === 1 ? 'single entry' : `batch of ${entriesArray.length}`}`;
const writeLogSpan = this.tracer.startSpan(spanMessage);

await wrapError(async () => {
const values = entriesArray.map(entry => [
entry.blockHeight,
entry.logTimestamp,
entry.logTimestamp,
entry.logType,
LogLevel[entry.logLevel],
entry.message
]);

const query = format(this.logInsertQueryTemplate, this.schemaName, values);
await this.pgClient.query(query);
}, `Failed to insert ${entriesArray.length > 1 ? 'logs' : 'log'} into the ${this.schemaName}.__logs table`)
.finally(() => {
writeLogSpan.end();
});
}
}
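To show how the pieces fit together, a minimal usage sketch; the call site and the account/function name are hypothetical (in the PR the logger appears to be instantiated in the stream handler/indexer path and passed down):

// Hypothetical call site: construct a logger for one indexer function and write a single entry.
const logger = new IndexerLogger(
  'morgs.near/test_indexer',      // non-alphanumeric characters become '_' in the schema name
  LogLevel.INFO,                  // entries below this level are dropped by shouldLog()
  databaseConnectionParameters    // per-function credentials, assumed to come from the provisioner
);

// writeLogs() accepts a single LogEntry or an array; batches go into one pg-format INSERT.
await logger.writeLogs({
  blockHeight: 115185108,
  logTimestamp: new Date(),
  logType: LogType.SYSTEM,
  logLevel: LogLevel.INFO,
  message: 'Running function morgs.near/test_indexer'
});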
2 changes: 1 addition & 1 deletion runner/src/indexer/__snapshots__/indexer.test.ts.snap
@@ -534,4 +534,4 @@ exports[`Indexer unit tests Indexer.runFunctions() supplies the required role to
},
],
]
`;
`;