Describe the bug
When running WhiteRabbit against a Snowflake instance with multiple databases and/or schemas, tables are not always identified by the correct database and schema, leading to crashes or errors reporting non-existent tables or columns.
To Reproduce
Steps to reproduce the behavior:
Reproducing this requires a Snowflake instance with multiple databases and/or schemas, and duplicate column names across tables. Bug fixes will need to be field-tested against instances that have these features.
Expected behavior
Snowflake tables and columns should always be identified by the correct database and schema.
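The expected behavior amounts to always using fully qualified, three-part names when referring to Snowflake tables. A minimal sketch of what that looks like (the `qualify` helper and class name are hypothetical, not part of the WhiteRabbit codebase; the quoting rule follows Snowflake's convention that double-quoted identifiers are case-sensitive):

```java
// Hypothetical sketch: build a fully qualified Snowflake table reference
// so that tables with the same name in different databases/schemas
// can never be confused with each other.
public class SnowflakeQualifier {
    static String qualify(String database, String schema, String table) {
        return quote(database) + "." + quote(schema) + "." + quote(table);
    }

    static String quote(String identifier) {
        // Double-quote the identifier and escape embedded double quotes.
        return "\"" + identifier.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        System.out.println(qualify("MY_DB", "MY_SCHEMA", "PERSON"));
        // → "MY_DB"."MY_SCHEMA"."PERSON"
    }
}
```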
Desktop (please complete the following information):
OS: any
Version: v1.0.0-RC1
An additional fix was applied to prevent an exception reported by a tester: "java.lang.ClassCastException: class org.ohdsi.rabbitInAHat.dataModel.Table cannot be cast to class java.lang.Comparable (org.ohdsi.rabbitInAHat.dataModel.Table is in unnamed module of loader 'app'; java.lang.Comparable is in module java.base of loader 'bootstrap')".
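That ClassCastException is what the JDK throws when a collection of objects that do not implement Comparable is sorted by natural ordering. A sketch of the usual fix, sorting with an explicit Comparator instead (this Table class is a stand-in for illustration, not the actual org.ohdsi.rabbitInAHat.dataModel.Table):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch: sort domain objects with an explicit Comparator rather than
// relying on natural ordering, which requires Comparable.
public class TableSortSketch {
    static class Table {  // stand-in class; does NOT implement Comparable
        final String name;
        Table(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        List<Table> tables = new ArrayList<>();
        tables.add(new Table("visit_occurrence"));
        tables.add(new Table("person"));
        // tables.sort(null) would throw ClassCastException here;
        // a key-extracting Comparator avoids that entirely.
        tables.sort(Comparator.comparing(t -> t.name));
        System.out.println(tables.get(0).name); // → person
    }
}
```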
* Create release 1.0.0
* Enforce consistent ordering of the tables in the scan report (solves issue #236)
* Snowflake: always use database and schema when accessing table (meta)data. Fixes issue #409
* Update Snowflake JDBC version and activate+fix Snowflake integration tests
* Upgrade dependencies and Testcontainers version, and fix the MSSqlServer integration test
* Only run Snowflake integration tests when a Snowflake access configuration is available
* Switch to SQL for obtaining field metadata for Snowflake (default, JDBC can still be used through a system property or env.var)
* Fix for #411 (can't process custom models with UTF8 BOM in csv file)
* Better method naming and clearer logging for SnowflakeHandler
* Add UTF-8 BOM handling when reading CSV files
* Change to ojdbc8 version 19.23.0.0 (for Oracle). Different (sub)repo, more recently published, solves issue #415
* Avoid testing results for the integration test with an externally loaded BigQuery JDBC jar: simplifies setup
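The UTF-8 BOM fix (#411) concerns the byte order mark (U+FEFF) that some editors prepend to UTF-8 files, which otherwise ends up glued to the first CSV header name. A hedged sketch of the general technique, not the actual WhiteRabbit code (the `open` helper and class name are invented for illustration):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.PushbackReader;
import java.nio.charset.StandardCharsets;

// Sketch: skip a leading UTF-8 BOM (decoded as U+FEFF) when reading a CSV,
// pushing the first character back if it is not a BOM.
public class BomAwareReader {
    static BufferedReader open(InputStream in) throws IOException {
        PushbackReader reader = new PushbackReader(
                new InputStreamReader(in, StandardCharsets.UTF_8), 1);
        int first = reader.read();
        if (first != 0xFEFF && first != -1) {
            reader.unread(first); // no BOM: put the character back
        }
        return new BufferedReader(reader);
    }

    public static void main(String[] args) throws IOException {
        // A CSV whose bytes start with the UTF-8 BOM (EF BB BF).
        byte[] csv = "\uFEFFid,name\n1,foo\n".getBytes(StandardCharsets.UTF_8);
        try (BufferedReader r = open(new ByteArrayInputStream(csv))) {
            System.out.println(r.readLine()); // → id,name
        }
    }
}
```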