Bugfix/99 oracle timestamp #116

Merged: 20 commits, Mar 20, 2019

Commits
9be59e6  #99 fixed problem in creation of column description for IMPORT FROM JDBC (snehlsen, Mar 13, 2019)
2af2544  #99 fixed bug in oracle dialect (IMPORT FROM JDBC) and timestamp types (snehlsen, Mar 13, 2019)
9c71d43  #99 do not cast timestamps when the native ORA import is used (snehlsen, Mar 13, 2019)
fe70029  #99 enabled possibility to run oracle integration test locally (snehlsen, Mar 13, 2019)
280dd45  #99 fixed connections in oracle integrationtest (snehlsen, Mar 14, 2019)
22c6039  #99 fixed Oracle integration test (snehlsen, Mar 15, 2019)
35098ee  #99 automated deployment of oracle instantclient (snehlsen, Mar 15, 2019)
784c1b6  #99 removed oracle drivers (snehlsen, Mar 15, 2019)
a30d438  #99 added option to include additional local drivers (snehlsen, Mar 15, 2019)
2a19f13  fixed problem in instantclient deployment (snehlsen, Mar 15, 2019)
f2a89c8  #99 added test for SELECT * from timestamp columns to oracle integrat… (snehlsen, Mar 15, 2019)
f6f546f  Merge branch 'master' into bugfix/99_oracle_timestamp (snehlsen, Mar 15, 2019)
121983d  #99 fixed timestamp testcase (snehlsen, Mar 15, 2019)
e8fa47a  added final (snehlsen, Mar 15, 2019)
ffee125  Merge branch 'bugfix/99_oracle_timestamp' of https://github.com/exaso… (snehlsen, Mar 15, 2019)
486c20a  increased version to 1.8.1 (snehlsen, Mar 15, 2019)
c6ffe66  merged master (snehlsen, Mar 19, 2019)
8796c43  fixed sonar findings (snehlsen, Mar 19, 2019)
3f972f1  added code for ORA IMPORT generation, fixed Oracle integration test (snehlsen, Mar 19, 2019)
d152f74  added test for building JdbcTypeDescription from ResultSetMetadata (snehlsen, Mar 19, 2019)
8 changes: 4 additions & 4 deletions jdbc-adapter/doc/deploying_the_virtual_schema_adapter.md
@@ -23,7 +23,7 @@ cd virtual-schemas/jdbc-adapter/
mvn clean -DskipTests package
```

-The resulting fat JAR is stored in `virtualschema-jdbc-adapter-dist/target/virtualschema-jdbc-adapter-dist-1.7.2.jar`.
+The resulting fat JAR is stored in `virtualschema-jdbc-adapter-dist/target/virtualschema-jdbc-adapter-dist-1.8.1.jar`.

## Uploading the Adapter JAR Archive

@@ -42,8 +42,8 @@ Following steps are required to upload a file to a bucket:
1. Now upload the file into this bucket, e.g. using curl (adapt the hostname, BucketFS port, bucket name and bucket write password).

```bash
-curl -X PUT -T virtualschema-jdbc-adapter-dist/target/virtualschema-jdbc-adapter-dist-1.7.2.jar \
-  http://w:[email protected]:2580/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar
+curl -X PUT -T virtualschema-jdbc-adapter-dist/target/virtualschema-jdbc-adapter-dist-1.8.1.jar \
+  http://w:[email protected]:2580/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar
```

See chapter 3.6.4. "The synchronous cluster file system BucketFS" in the EXASolution User Manual for more details about BucketFS.
@@ -75,7 +75,7 @@ CREATE JAVA ADAPTER SCRIPT adapter.jdbc_adapter AS

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here (e.g. Hive JDBC driver files)
%jar /buckets/your-bucket-fs/your-bucket/name-of-data-source-jdbc-driver.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/developing_an_sql_dialect.md
@@ -292,7 +292,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.jdbc_adapter

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here (e.g. MySQL or Hive)

2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/db2.md
@@ -46,7 +46,7 @@ CREATE or replace JAVA ADAPTER SCRIPT adapter.jdbc_adapter AS

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// DB2 Driver files
%jar /buckets/bucketfs1/bucket1/db2jcc4.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/exasol.md
@@ -17,7 +17,7 @@ After uploading the adapter jar, the adapter script can be created as follows:
CREATE SCHEMA adapter;
CREATE JAVA ADAPTER SCRIPT adapter.jdbc_adapter AS
%scriptclass com.exasol.adapter.jdbc.JdbcAdapter;
-%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.8.1.jar;
/
```

2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/hive.md
@@ -23,7 +23,7 @@ CREATE SCHEMA adapter;
CREATE JAVA ADAPTER SCRIPT jdbc_adapter AS
%scriptclass com.exasol.adapter.jdbc.JdbcAdapter;

-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

%jar /buckets/bucketfs1/bucket1/hive_metastore.jar;
%jar /buckets/bucketfs1/bucket1/hive_service.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/impala.md
@@ -22,7 +22,7 @@ CREATE SCHEMA adapter;
CREATE JAVA ADAPTER SCRIPT jdbc_adapter AS
%scriptclass com.exasol.adapter.jdbc.JdbcAdapter;

-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

%jar /buckets/bucketfs1/bucket1/hive_metastore.jar;
%jar /buckets/bucketfs1/bucket1/hive_service.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/oracle.md
@@ -28,7 +28,7 @@ CREATE JAVA ADAPTER SCRIPT adapter.jdbc_oracle AS

// You need to replace `your-bucket-fs` and `your-bucket` to match the actual location
// of the adapter jar.
-%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/your-bucket-fs/your-bucket/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// Add the oracle jdbc driver to the classpath
%jar /buckets/bucketfs1/bucket1/ojdbc7-12.1.0.2.jar
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/postgresql.md
@@ -15,7 +15,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.jdbc_adapter

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here (e.g. MySQL or Hive)
%jar /buckets/bucketfs1/bucket1/postgresql-42.0.0.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/redshift.md
@@ -21,7 +21,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.jdbc_adapter

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here (e.g. MySQL or Hive)

2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/sql_server.md
@@ -17,7 +17,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.sql_server_jdbc_adapter

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here
%jar /buckets/bucketfs1/bucket1/jtds.jar;
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/sybase.md
@@ -18,7 +18,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.jdbc_adapter
AS

%scriptclass com.exasol.adapter.jdbc.JdbcAdapter;
-%jar /buckets/bucketfs1/virtualschema/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/virtualschema/virtualschema-jdbc-adapter-dist-1.8.1.jar;
%jar /buckets/bucketfs1/virtualschema/jtds-1.3.1.jar;
/
```
2 changes: 1 addition & 1 deletion jdbc-adapter/doc/sql_dialects/teradata.md
@@ -22,7 +22,7 @@ CREATE OR REPLACE JAVA ADAPTER SCRIPT adapter.jdbc_adapter

// This will add the adapter jar to the classpath so that it can be used inside the adapter script
// Replace the names of the bucketfs and the bucket with the ones you used.
-%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar;
+%jar /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar;

// You have to add all files of the data source jdbc driver here (e.g. MySQL or Hive)
%jar /buckets/bucketfs1/bucket1/terajdbc4.jar;
@@ -5,7 +5,7 @@ general:
debugAddress: '192.168.0.12:3000' # Address which will be defined as DEBUG_ADDRESS in the virtual schemas
bucketFsUrl: http://exasol-host:2580/bucket1
bucketFsPassword: bucket1
-jdbcAdapterPath: /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar
+jdbcAdapterPath: /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar

exasol:
runIntegrationTests: true
@@ -5,7 +5,7 @@ general:
debugAddress: '192.168.0.12:3000' # Address which will be defined as DEBUG_ADDRESS in the virtual schemas
bucketFsUrl: http://exasol-host:2580/bucket1
bucketFsPassword: bucket1
-jdbcAdapterPath: /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.7.2.jar
+jdbcAdapterPath: /buckets/bucketfs1/bucket1/virtualschema-jdbc-adapter-dist-1.8.1.jar

exasol:
runIntegrationTests: true
21 changes: 16 additions & 5 deletions jdbc-adapter/integration-test-data/integration-test-travis.yaml
@@ -1,11 +1,12 @@
# Configuration file for integration tests run by `run_integration_tests.sh`

general:
-  debug: false
-  debugAddress: ''
-  bucketFsUrl: http://127.0.0.1:6594/default
-  bucketFsPassword: write
-  jdbcAdapterPath: /buckets/bfsdefault/default/virtualschema-jdbc-adapter-dist-1.7.2.jar
+  debug: false
+  debugAddress: ''
+  bucketFsUrl: http://127.0.0.1:6594/default
+  bucketFsPassword: write
+  jdbcAdapterPath: /buckets/bfsdefault/default/virtualschema-jdbc-adapter-dist-1.8.1.jar
+  additionalJDBCDriverDir: /var/tmp/vstest/drivers/

exasol:
runIntegrationTests: true
@@ -24,3 +25,13 @@ postgresql:
  dockerPortMapping: 45432:5432
  dockerName: testpg
  dockerConnectionString: jdbc:postgresql://DBHOST:5432/postgres
+
+oracle:
+  runIntegrationTests: false
+  jdbcDriverPath: /buckets/bfsdefault/default/drivers/jdbc/ORACLE/ojdbc7.jar;
+  connectionString: jdbc:oracle:thin:@localhost:1521/XE
+  user: system
+  password: myorapwd
+  dockerName: myora
+  dockerConnectionString: jdbc:oracle:thin:@DBHOST:1521/XE
+  instantclientDir: /var/tmp/vstest/instantclient/
4 changes: 4 additions & 0 deletions jdbc-adapter/integration-test-data/oracle-testdata.sql
@@ -91,3 +91,7 @@ INSERT INTO LOADER.TYPE_TEST (c3, c5, c7, c_binfloat, c17) VALUES (
-- c_float126
-- c_long
);

+create table ts_t(a timestamp, b timestamp with local time zone, c timestamp with time zone);
+insert into ts_t values (timestamp '2018-01-01 11:00:00', timestamp '2018-01-01 11:00:00 +01:00', timestamp '2018-01-01 11:00:00 +01:00');
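
The three columns cover Oracle's plain, local-time-zone, and time-zone timestamp variants. A check of the kind the new integration test runs against a virtual schema built on this table might look as follows; the virtual schema name is an assumed placeholder:

```sql
-- Hypothetical verification query; VS_ORACLE is an assumed virtual schema name.
-- With the native ORA import the timestamp columns are transferred without the
-- extra cast that the JDBC import path applies.
SELECT a, b, c FROM VS_ORACLE.TS_T;
```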

24 changes: 22 additions & 2 deletions jdbc-adapter/integration-test-data/run_integration_tests.sh
@@ -41,12 +41,31 @@ deploy_jdbc_drivers() {
    bucket_fs_url=$(awk '/bucketFsUrl/{print $NF}' $config)
    bfs_url_no_http=$(echo $bucket_fs_url | awk -F/ '{for(i=3;i<=NF;++i)printf "%s/",$i}')
    bucket_fs_pwd=$(awk '/bucketFsPassword/{print $NF}' $config)
-   bucket_fs_upload_url=http://w:$bucket_fs_pwd@$bfs_url_no_http/drivers/jdbc/
+   bucket_fs_upload_url=http://w:$bucket_fs_pwd@$bfs_url_no_http/drivers/
+   #upload drivers that are part of the repository
    for d in $jdbc_driver_dir/*
    do
        db_driver=$(basename $d)
-       find $jdbc_driver_dir/$db_driver -type f -exec curl -X PUT -T {} $bucket_fs_upload_url/$db_driver/ \;
+       find $jdbc_driver_dir/$db_driver -type f -exec curl -X PUT -T {} $bucket_fs_upload_url/jdbc/$db_driver/ \;
    done
+   #upload additional (local) drivers
+   additional_jdbc_driver_dir=$(awk '/additionalJDBCDriverDir/{print $NF}' $config)
+   if [ -d "$additional_jdbc_driver_dir" ]; then
+       for d in $additional_jdbc_driver_dir/*
+       do
+           db_driver=$(basename $d)
+           find $additional_jdbc_driver_dir/$db_driver -type f -exec curl -X PUT -T {} $bucket_fs_upload_url/jdbc/$db_driver/ \;
+       done
+   fi
+   #deploy oracle instantclient
+   instantclient_dir=$(awk '/instantclientDir/{print $NF}' $config)
+   instantclient_path=$instantclient_dir/instantclient-basic-linux.x64-12.1.0.2.0.zip
+   if [ -f $instantclient_path ]; then
+       curl -X PUT -T $instantclient_path $bucket_fs_upload_url/oracle/
+   fi
+   #workaround for https://github.com/exasol/docker-db/issues/26
+   docker exec -d exasoldb mkdir -p /exa/data/bucketfs/default/drivers
+   docker exec -d exasoldb ln -s /exa/data/bucketfs/bfsdefault/.dest/default/drivers/jdbc /exa/data/bucketfs/default/drivers/jdbc
}

replace_hosts_with_ips_in_config() {
@@ -55,6 +74,7 @@

start_remote_dbs() {
    $docker_helper --run $config
+   sleep 10
}

cleanup_remote_dbs() {
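
Running the Oracle tests locally requires the driver and Instant Client archives to be present before the script uploads them; a sketch of the expected local layout, assuming the paths configured in integration-test-travis.yaml above:

```bash
# Assumed local layout; the paths follow additionalJDBCDriverDir and
# instantclientDir in integration-test-travis.yaml, and the archive name
# follows the instantclient_path built in the script above.
mkdir -p /var/tmp/vstest/drivers/ORACLE /var/tmp/vstest/instantclient
cp ojdbc7.jar /var/tmp/vstest/drivers/ORACLE/
cp instantclient-basic-linux.x64-12.1.0.2.0.zip /var/tmp/vstest/instantclient/
# After deploy_jdbc_drivers() runs, the driver is reachable in BucketFS under
# /buckets/bfsdefault/default/drivers/jdbc/ORACLE/ojdbc7.jar, which matches
# the jdbcDriverPath in the oracle section of the config.
```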
45 changes: 28 additions & 17 deletions jdbc-adapter/integration-test-data/socker.py
@@ -7,21 +7,31 @@

def docker_run(config):
    for db, properties in config.items():
-        if 'dockerImage' in properties:
-            cmd = "docker run -d -p {port_map} --name {name} {image}:{version}".format(
-                port_map = properties['dockerPortMapping'],
-                name = properties['dockerName'],
-                image = properties['dockerImage'],
-                version = properties['dockerImageVersion'])
-            print(cmd)
-            run(cmd)
+        if properties.get('runIntegrationTests', False):
+            if 'dockerImage' in properties:
+                cmd = "docker run -d -p {port_map} --name {name} {image}:{version}".format(
+                    port_map = properties['dockerPortMapping'],
+                    name = properties['dockerName'],
+                    image = properties['dockerImage'],
+                    version = properties['dockerImageVersion'])
+                print(cmd)
+                run(cmd)
+            elif 'dockerName' in properties:
+                cmd = "docker start {name}".format(name = properties['dockerName'])
+                print(cmd)
+                run(cmd)

def docker_rm(config):
    for db, properties in config.items():
-        if 'dockerImage' in properties:
-            cmd = "docker rm -f {name}".format(name = properties['dockerName'])
-            print(cmd)
-            run(cmd)
+        if properties.get('runIntegrationTests', False):
+            if 'dockerImage' in properties:
+                cmd = "docker rm -f {name}".format(name = properties['dockerName'])
+                print(cmd)
+                run(cmd)
+            elif 'dockerName' in properties:
+                cmd = "docker stop {name}".format(name = properties['dockerName'])
+                print(cmd)
+                run(cmd)

def run(cmd):
    try:
@@ -45,11 +55,12 @@ def run(cmd):

def replace_hosts_in(config):
    for db, properties in config.items():
-        if 'dockerImage' in properties:
-            container_ip = get_ip_for(properties['dockerName'])
-            conn_string_with_ip = properties['dockerConnectionString'].replace(
-                'DBHOST',container_ip)
-            properties['dockerConnectionString'] = conn_string_with_ip
+        if properties.get('runIntegrationTests', False):
+            if 'dockerName' in properties:
+                container_ip = get_ip_for(properties['dockerName'])
+                conn_string_with_ip = properties['dockerConnectionString'].replace(
+                    'DBHOST',container_ip)
+                properties['dockerConnectionString'] = conn_string_with_ip
    return yaml.dump(config, default_flow_style=False)

def get_ip_for(docker_name):
2 changes: 1 addition & 1 deletion jdbc-adapter/local/integration-test-config.yaml
@@ -5,7 +5,7 @@ general:
debugAddress: '10.44.1.228:3000' # Address which will be defined as DEBUG_ADDRESS in the virtual schemas
bucketFsUrl: http://localhost:2580/jars
bucketFsPassword: public
-jdbcAdapterPath: /buckets/bfsdefault/jars/virtualschema-jdbc-adapter-dist-1.7.2.jar
+jdbcAdapterPath: /buckets/bfsdefault/jars/virtualschema-jdbc-adapter-dist-1.8.1.jar

exasol:
runIntegrationTests: true
2 changes: 1 addition & 1 deletion jdbc-adapter/pom.xml
@@ -10,7 +10,7 @@
<module>virtualschema-jdbc-adapter-dist</module>
</modules>
<properties>
-<product.version>1.7.2</product.version>
+<product.version>1.8.1</product.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
@@ -82,6 +82,10 @@ public SqlGenerationVisitor(final SqlDialect dialect, final SqlGenerationContext
        checkDialectAliases();
    }

+    protected SqlDialect getDialect() {
+        return dialect;
+    }
+
    protected void checkDialectAliases() {
        // Check if dialect provided invalid aliases, which would never be applied.
        for (final ScalarFunction function : this.dialect.getScalarFunctionAliases().keySet()) {
@@ -2,6 +2,7 @@

import com.exasol.adapter.capabilities.Capabilities;
import com.exasol.adapter.dialects.*;
+import com.exasol.adapter.jdbc.ConnectionInformation;
import com.exasol.adapter.metadata.DataType;
import com.exasol.adapter.sql.AggregateFunction;
import com.exasol.adapter.sql.ScalarFunction;
@@ -203,4 +204,21 @@ public String getStringLiteral(final String value) {
        return "'" + value.replace("'", "''") + "'";
    }

+    @Override
+    public String generatePushdownSql(final ConnectionInformation connectionInformation, final String columnDescription, final String pushdownSql) {
+        final ImportType importType = getContext().getImportType();
+        if (importType == ImportType.JDBC) {
+            return super.generatePushdownSql(connectionInformation, columnDescription, pushdownSql);
+        } else {
+            if ((importType != ImportType.ORA)) {
+                throw new AssertionError("OracleSqlDialect has wrong ImportType");
+            }
+            final StringBuilder oracleImportQuery = new StringBuilder();
+            oracleImportQuery.append("IMPORT FROM ORA AT ").append(connectionInformation.getOraConnectionName()).append(" ");
+            oracleImportQuery.append(connectionInformation.getCredentials());
+            oracleImportQuery.append(" STATEMENT '").append(pushdownSql.replace("'", "''")).append("'");
+            return oracleImportQuery.toString();
+        }
+    }
+
}
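
For reference, the ORA branch above assembles a statement of the following shape; the connection name and credential clause are illustrative placeholders, since both come from the ConnectionInformation at runtime, and single quotes inside the pushdown SQL are doubled:

```sql
-- Sketch of the generated statement; ORA_CONNECTION and the USER/IDENTIFIED BY
-- clause are assumed example values.
IMPORT FROM ORA AT ORA_CONNECTION USER 'app_user' IDENTIFIED BY 'secret'
STATEMENT 'SELECT "A", "B", "C" FROM "LOADER"."TS_T"'
```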
@@ -1,10 +1,7 @@
package com.exasol.adapter.dialects.impl;

import com.exasol.adapter.AdapterException;
-import com.exasol.adapter.dialects.SqlDialect;
-import com.exasol.adapter.dialects.SqlGenerationContext;
-import com.exasol.adapter.dialects.SqlGenerationHelper;
-import com.exasol.adapter.dialects.SqlGenerationVisitor;
+import com.exasol.adapter.dialects.*;
import com.exasol.adapter.jdbc.ColumnAdapterNotes;
import com.exasol.adapter.metadata.ColumnMetadata;
import com.exasol.adapter.metadata.DataType;
@@ -511,8 +508,9 @@ private String getColumnProjectionString(SqlColumn column, String projString) throws AdapterException {
        if (!isDirectlyInSelectList) {
            return projString;
        }
-       String typeName = ColumnAdapterNotes.deserialize(column.getMetadata().getAdapterNotes(), column.getMetadata().getName()).getTypeName();
-       if (typeName.startsWith("TIMESTAMP") ||
+       final AbstractSqlDialect dialect = (AbstractSqlDialect) getDialect();
+       final String typeName = ColumnAdapterNotes.deserialize(column.getMetadata().getAdapterNotes(), column.getMetadata().getName()).getTypeName();
+       if ((typeName.startsWith("TIMESTAMP") && dialect.getContext().getImportType() == ImportType.JDBC) ||
            typeName.startsWith("INTERVAL") ||
            typeName.equals("BINARY_FLOAT") ||
            typeName.equals("BINARY_DOUBLE") ||
@@ -540,7 +538,11 @@ private boolean nodeRequiresCast(SqlNode node) throws AdapterException {
            if (typeName.equals("NUMBER") && column.getMetadata().getType().getExaDataType() == DataType.ExaDataType.VARCHAR) {
                return true;
            } else {
-               return TYPE_NAMES_REQUIRING_CAST.contains(typeName);
+               for (final String typeRequiringCast : TYPE_NAMES_REQUIRING_CAST) {
+                   if (typeName.startsWith(typeRequiringCast)) {
+                       return true;
+                   }
+               }
            }
        }
        return false;
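
The switch from an exact contains() lookup to a startsWith() loop presumably accounts for Oracle reporting precision-qualified type names such as `TIMESTAMP(6) WITH TIME ZONE` rather than plain `TIMESTAMP`. A minimal, self-contained sketch of the new matching behavior; the list below is an assumed subset, not the dialect's actual constant:

```java
import java.util.Arrays;
import java.util.List;

public class TypeNamePrefixCheck {
    // Assumed subset of TYPE_NAMES_REQUIRING_CAST for illustration only.
    private static final List<String> TYPE_NAMES_REQUIRING_CAST =
            Arrays.asList("TIMESTAMP", "INTERVAL", "BINARY_FLOAT", "BINARY_DOUBLE");

    static boolean requiresCast(final String typeName) {
        // "TIMESTAMP(6) WITH TIME ZONE".startsWith("TIMESTAMP") is true,
        // which an equality-based contains() check would miss.
        return TYPE_NAMES_REQUIRING_CAST.stream().anyMatch(typeName::startsWith);
    }

    public static void main(final String[] args) {
        System.out.println(requiresCast("TIMESTAMP(6) WITH TIME ZONE")); // true
        System.out.println(requiresCast("NUMBER"));                      // false
    }
}
```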