From d5e8d074ff3374a8b09a36bb8a9adc443c085b34 Mon Sep 17 00:00:00 2001 From: anastasiiasergienko Date: Wed, 18 Nov 2020 11:58:54 +0100 Subject: [PATCH] * #402: Updated the documentation about implementing a new dialect. --- doc/changes/changelog.md | 1 + doc/changes/changes_4.0.5.md | 7 +++ .../developing_a_dialect.md | 40 ++++++------- ...ng_additional_dialect_specific_behavior.md | 16 +++-- ...lementing_mandatory_sql_dialect_classes.md | 58 ++++++++++++------- .../integration_testing_with_containers.md | 4 +- 6 files changed, 80 insertions(+), 46 deletions(-) create mode 100644 doc/changes/changes_4.0.5.md diff --git a/doc/changes/changelog.md b/doc/changes/changelog.md index 1a03e8da5..7196cc922 100644 --- a/doc/changes/changelog.md +++ b/doc/changes/changelog.md @@ -1,5 +1,6 @@ # Changes +* [4.0.5](changes_4.0.5.md) * [4.0.4](changes_4.0.4.md) * [4.0.3](changes_4.0.3.md) * [4.0.2](changes_4.0.2.md) diff --git a/doc/changes/changes_4.0.5.md b/doc/changes/changes_4.0.5.md new file mode 100644 index 000000000..1e4b57bf8 --- /dev/null +++ b/doc/changes/changes_4.0.5.md @@ -0,0 +1,7 @@ +# Exasol Virtual Schemas 4.0.5, released 2020-??-?? + +Code name: + +## Documentation + +* #402: Updated the documentation about implementing a new dialect. \ No newline at end of file diff --git a/doc/development/developing-sql-dialect/developing_a_dialect.md b/doc/development/developing-sql-dialect/developing_a_dialect.md index 99a823487..7431667cf 100644 --- a/doc/development/developing-sql-dialect/developing_a_dialect.md +++ b/doc/development/developing-sql-dialect/developing_a_dialect.md @@ -25,25 +25,25 @@ As an example, PostgreSQL handles some of the data types subtly different from E Below you can see a layer model of the Virtual Schemas when implemented with the JDBC adapter. The layers in the middle — i.e. everything that deals with translating between the source and Exasol — are provided in this repository. - .-----------------------------------------. - | Exasol | Exasol | - | core |----------------------------| - | |//// Virtual Schema API ////| - |------------|----------------------------| - | | JDBC Adapter | Common JDBC functions - | In this |----------------------------| - | repository |///// SQL Dialect API //////| - | |----------------------------| - | | SQL Dialect Adapter | Even out specifics of the source database - |------------|----------------------------| - | |///////// JDBC API /////////| - | |----------------------------| - | | PostgresSQL JDBC Driver | JDBC compliant access to payload and metadata - | External |----------------------------| - | |// PostgresSQL Native API //| - | |----------------------------| - | | PostgreSQL | External data source - '-----------------------------------------' + .-------------------------------------------------. 
+ | Exasol | Exasol | + | core |----------------------------| + | |//// Virtual Schema API ////| + |--------------------|----------------------------| + | In vs-common-jdbc | JDBC Adapter | Common JDBC functions + | repository |----------------------------| + | |///// SQL Dialect API //////| + | |----------------------------| + | In this repository | SQL Dialect Adapter | Even out specifics of the source database + |--------------------|----------------------------| + | |///////// JDBC API /////////| + | |----------------------------| + | | PostgresSQL JDBC Driver | JDBC compliant access to payload and metadata + | External |----------------------------| + | |// PostgresSQL Native API //| + | |----------------------------| + | | PostgreSQL | External data source + '-------------------------------------------------' For more information about the structure of the Virtual Schemas check the UML diagrams provided in the directory [model/diagrams](../../../model/diagrams). You either need [PlantUML](http://plantuml.com/) to render them or an editor that has PlamtUML preview built in. @@ -89,7 +89,7 @@ The Java package structure of the `virtualschema-jdbc-adapter` reflects the sepa | | | '-- ... | - '-- jdbc Base implementation for getting metadata from JDBC + '-- jdbc Base implementation for getting metadata from JDBC (based in virtual-schema-common-jdbc repository) ### Interfaces diff --git a/doc/development/developing-sql-dialect/implementing_additional_dialect_specific_behavior.md b/doc/development/developing-sql-dialect/implementing_additional_dialect_specific_behavior.md index e66810ca7..6755845bb 100644 --- a/doc/development/developing-sql-dialect/implementing_additional_dialect_specific_behavior.md +++ b/doc/development/developing-sql-dialect/implementing_additional_dialect_specific_behavior.md @@ -178,7 +178,7 @@ There are differences in how precise the remote data source can encode integer, The best way to find out how good the default mapping works for your source — run a manual [integration test](integration_testing.md) with [remote logging](../remote_logging.md) accessing a table with all data types available in the source. If you assume that you don't need to change data type conversion — go to the next checkpoint: [Implementing Query Rewriting](#implementing-query-rewriting) -Let's look at a HIVE dialect example. We only want to change mapping for one data type: DECIMAL. +Let's look at a HIVE dialect example. We only want to change mapping for two data type: DECIMAL and BINARY. 1. **Create `ColumnMetadataReader.java`** class that extends `BaseColumnMetadataReader.java`. @@ -198,8 +198,11 @@ Let's look at a HIVE dialect example. 
We only want to change mapping for one dat @Override public DataType mapJdbcType(final JdbcTypeDescription jdbcTypeDescription) { - if (jdbcTypeDescription.getJdbcType() == Types.DECIMAL) { + final int jdbcType = jdbcTypeDescription.getJdbcType(); + if (jdbcType == Types.DECIMAL) { return mapDecimal(jdbcTypeDescription); + } else if (jdbcType == Types.BINARY) { + return DataType.createMaximumSizeVarChar(DataType.ExaCharset.UTF8); } else { return super.mapJdbcType(jdbcTypeDescription); } @@ -323,7 +326,12 @@ For each of them a base implementation exists which works fine with a number of ```java @Override protected RemoteMetadataReader createRemoteMetadataReader() { - return new AthenaMetadataReader(this.connection, this.properties); + try { + return new AthenaMetadataReader(this.connectionFactory.getConnection(), this.properties); + } catch (final SQLException exception) { + throw new RemoteMetadataReaderException( + "Unable to create Athena remote metadata reader. Caused by: " + exception.getMessage(), exception); + } } ``` @@ -346,4 +354,4 @@ For each of them a base implementation exists which works fine with a number of return new BaseTableMetadataReader(this.connection, this.columnMetadataReader, this.properties, this.identifierConverter); } - ``` + ``` \ No newline at end of file diff --git a/doc/development/developing-sql-dialect/implementing_mandatory_sql_dialect_classes.md b/doc/development/developing-sql-dialect/implementing_mandatory_sql_dialect_classes.md index 5c70c6104..f0322d863 100644 --- a/doc/development/developing-sql-dialect/implementing_mandatory_sql_dialect_classes.md +++ b/doc/development/developing-sql-dialect/implementing_mandatory_sql_dialect_classes.md @@ -274,18 +274,16 @@ And we also need two corresponding test classes: Another thing we need to implement in the dialect class is quoting of string literals. - Athena expects string literals to be wrapped in single quotes and single qoutes inside the literal to be escaped by duplicating each. + Athena expects string literals to be wrapped in single quotes and single quotes inside the literal to be escaped by duplicating each. 1. **Create the `testGetLiteralString()` test** method: ```java - @ValueSource(strings = { "ab:\'ab\'", "a'b:'a''b'", "a''b:'a''''b'", "'ab':'''ab'''" }) + @ValueSource(strings = { "ab:'ab'", "a'b:'a''b'", "a''b:'a''''b'", "'ab':'''ab'''" }) @ParameterizedTest void testGetLiteralString(final String definition) { - final int colonPosition = definition.indexOf(':'); - final String original = definition.substring(0, colonPosition); - final String literal = definition.substring(colonPosition + 1); - assertThat(this.dialect.getStringLiteral(original), equalTo(literal)); + assertThat(this.dialect.getStringLiteral(definition.substring(0, definition.indexOf(':'))), + equalTo(definition.substring(definition.indexOf(':') + 1))); } ``` @@ -297,22 +295,29 @@ And we also need two corresponding test classes: ```java @Override public String getStringLiteral(final String value) { - final StringBuilder builder = new StringBuilder("'"); - builder.append(value.replaceAll("'", "''")); - builder.append("'"); - return builder.toString(); + if (value == null) { + return "NULL"; + } else { + return "'" + value.replace("'", "''") + "'"; + } } ``` ### Implement the Applying of Quotes 1. The next method to **implement: `applyQuote()`**. It applies quotes to table and schema names. 
-    In case of Aurora it's a little bit complicated, so let's see a more generic example:
+    In the case of Aurora there are two ways to quote an identifier: with double quotes or with backticks.
+    If an identifier starts with an underscore, we use backticks. Otherwise, we use double quotes.
 
     ```java
-    @Test
-    void testApplyQuote() {
-        assertThat(this.dialect.applyQuote("tableName"), Matchers.equalTo("\"tableName\""));
+    @CsvSource({ "tableName, \"tableName\"", //
+            "table123, \"table123\"", //
+            "_table, `_table`", //
+            "123table, \"123table\"", //
+            "table_name, \"table_name\"" })
+    @ParameterizedTest
+    void testApplyQuote(final String unquoted, final String quoted) {
+        assertThat(this.dialect.applyQuote(unquoted), equalTo(quoted));
     }
     ```
     And implementation:
@@ -320,8 +325,20 @@
     ```java
     @Override
     public String applyQuote(final String identifier) {
-        return "\"" + identifier + "\"";
-    }
+        if (identifier.startsWith("_")) {
+            return quoteWithBackticks(identifier);
+        } else {
+            return quoteWithDoubleQuotes(identifier);
+        }
+    }
+
+    private String quoteWithBackticks(final String identifier) {
+        return "`" + identifier + "`";
+    }
+
+    private String quoteWithDoubleQuotes(final String identifier) {
+        return "\"" + identifier + "\"";
+    }
     ```
 
 1. You have **two unimplemented methods** left: `createQueryRewriter()` and `createRemoteMetadataReader()`.
 
     ```java
     @Override
-    protected RemoteMetadataReader createRemoteMetadataReader() {
+    protected RemoteMetadataReader createRemoteMetadataReader() {
         try {
             return new AthenaMetadataReader(this.connectionFactory.getConnection(), this.properties);
         } catch (final SQLException exception) {
-            throw new RemoteMetadataReaderException("Unable to create Athena remote metadata reader.", exception);
+            throw new RemoteMetadataReaderException(
+                    "Unable to create Athena remote metadata reader. Caused by: " + exception.getMessage(), exception);
         }
     }
 
     @Override
     protected QueryRewriter createQueryRewriter() {
-        return new BaseQueryRewriter(this, this.remoteMetadataReader, this.connectionFactory);
+        return new BaseQueryRewriter(this, createRemoteMetadataReader(), this.connectionFactory);
     }
     ```
 
@@ -394,4 +412,4 @@ It looks up the fully qualified class name of the dialect factories in the file
 It is more resource-efficient and secure to load only the one single dialect that we actually need. The factories are very lightweight, dialects not so much.
 
-4. **Add the fully qualified factory name** `com.exasol.adapter.dialects.athena.AthenaSqlDialectFactory` to the **file `/src/main/resources/META-INF/services/com.exasol.adapter.dialects.SqlDialectFactory`** so that the class loader can find your new dialect factory.
+4. **Add the fully qualified factory name** `com.exasol.adapter.dialects.athena.AthenaSqlDialectFactory` to the **file `/src/main/resources/META-INF/services/com.exasol.adapter.dialects.SqlDialectFactory`** so that the class loader can find your new dialect factory.
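To make the registration step above more tangible, here is a minimal sketch of how a class loader can find a factory listed in that service file. It is illustrative only and not part of the adapter sources: the helper class is hypothetical, and the `getSqlDialectName()` accessor is an assumption about the `SqlDialectFactory` interface, so check `virtual-schema-common-jdbc` for the real lookup code.

```java
import java.util.ServiceLoader;
import java.util.stream.StreamSupport;

import com.exasol.adapter.dialects.SqlDialectFactory;

// Hypothetical helper, for illustration only: shows how a factory registered in
// META-INF/services/com.exasol.adapter.dialects.SqlDialectFactory can be discovered
// through Java's standard ServiceLoader mechanism.
public final class SqlDialectFactoryLookupExample {
    private SqlDialectFactoryLookupExample() {
        // utility class, not meant to be instantiated
    }

    public static SqlDialectFactory findFactory(final String dialectName) {
        // ServiceLoader reads the fully qualified class names from the service file
        // and instantiates the registered factories while we iterate over them.
        final ServiceLoader<SqlDialectFactory> loader = ServiceLoader.load(SqlDialectFactory.class);
        return StreamSupport.stream(loader.spliterator(), false)
                .filter(factory -> factory.getSqlDialectName().equalsIgnoreCase(dialectName))
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException(
                        "Did not find an SQL dialect factory for dialect \"" + dialectName + "\"."));
    }
}
```

This also illustrates why the factories are kept lightweight: every registered factory may be instantiated during the lookup, while the comparatively heavy dialect object is only created from the factory that matches the requested dialect name.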
\ No newline at end of file diff --git a/doc/development/developing-sql-dialect/integration_testing_with_containers.md b/doc/development/developing-sql-dialect/integration_testing_with_containers.md index da077998f..ced9a01b6 100644 --- a/doc/development/developing-sql-dialect/integration_testing_with_containers.md +++ b/doc/development/developing-sql-dialect/integration_testing_with_containers.md @@ -66,9 +66,9 @@ Another way to run integration tests: List of enabled integration tests: -* ExasolSqlDialectIT +* ExasolSqlDialectIT (in exasol-virtual-schema repository) * PostgreSQLSqlDialectIT - +* SqlServerSqlDialectIT ## Executing Disabled Integration Tests