
Improve documentation to clarify purpose of the driver #365

Merged
amotl merged 6 commits from amo/improve-docs into master on Mar 24, 2023

Conversation

amotl (Member) commented Mar 22, 2023

Following up on GH-364, this patch aims to:

  • Clarify the purpose of this "legacy" driver (f615472)

    • Describe why it is still needed in a few cases (see the sketch after this list).
    • Add a recommendation to use the official driver in other cases.

  • Make "implementation notes" more prominent (b64ff63)

    In order to better educate users about "what's inside", this refactors a section of the documentation from the appendix to the front page.

  • Add an example about Apache Kafka, Apache Flink, and CrateDB (6a6912e)
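
For illustration, a minimal sketch of how the driver is selected purely by the JDBC URL prefix. This is a hedged example, not part of the patch: host, credentials, and the queried table are illustrative, and it assumes a locally running CrateDB instance with both drivers on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DriverSelectionSketch {

    public static void main(String[] args) throws Exception {
        // Official PostgreSQL JDBC driver, selected by the "jdbc:postgresql://"
        // URL prefix. Recommended for most CrateDB use cases.
        query("jdbc:postgresql://localhost:5432/doc");

        // Legacy CrateDB JDBC driver, selected by the "jdbc:crate://" URL prefix.
        // Still needed where the "postgresql://" prefix would make a framework
        // assume a genuine PostgreSQL server on the other end.
        query("jdbc:crate://localhost:5432/");
    }

    private static void query(String url) throws Exception {
        // Connection details are illustrative; "crate" is CrateDB's default user.
        try (Connection conn = DriverManager.getConnection(url, "crate", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT name FROM sys.cluster")) {
            while (rs.next()) {
                System.out.println(url + " -> " + rs.getString("name"));
            }
        }
    }
}
```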

amotl requested review from matriv and proddata on March 22, 2023 16:33
docs/index.rst Outdated
Overview
========

- This JDBC driver is needed in certain scenarios like the one outlined at
Member

should we explain/give examples here?
I checked the Flink example repo, but it is also not clear there why we should not use the PG JDBC driver.

Member Author

Yes, I've added the corresponding example with 6a6912e. But, true, I would be happy to add a more crystal-clear sentence if @matriv could provide a corresponding statement.

Contributor

I don't know if we can do that, since we don't have that PR merged in the flink-jdbc-connector yet.
Maybe just open an issue to address this later, once that PR is merged and the cratedb-flink-jobs example is in place.

amotl (Member Author) commented Mar 22, 2023

Well, the example still works on the older versions of Apache Flink where the adjustments to the flink-jdbc-connector are not needed, and it points in the right direction regarding why we need the "legacy" JDBC driver. This detail does not change, as we will still need this driver in the future.

When apache/flink-connector-jdbc#29 is merged, we can add an additional reference here, right?

amotl (Member Author) commented Mar 23, 2023

It also is not clear there, why we should not use the PG JDBC driver.
I will be happy to add a more crystal-clear sentence about the "why".

Coming from [1] and [2], I think a good explanation could be:

When using the postgresql:// JDBC driver prefix, Apache Flink will implicitly select the corresponding PostgreSQL dialect implementation. However, this will employ a few behaviors that strictly expect a PostgreSQL server on the other end, so that some operations will croak on databases which only offer wire-compatibility with PostgreSQL.

Footnotes

  1. https://github.com/crate/cratedb-flink-jobs/issues/3

  2. https://pypi.org/project/sqlalchemy-postgresql-relaxed/
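
To make the Flink scenario more concrete, here is a minimal, hedged sketch of wiring a DataStream into CrateDB through the legacy driver with flink-connector-jdbc. The target table, credentials, and host are illustrative, and the exact connector API may differ between Flink versions.

```java
import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CrateDbFlinkJdbcSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("alice", "bob")
           .addSink(JdbcSink.sink(
               // Hypothetical target table.
               "INSERT INTO doc.users (name) VALUES (?)",
               (statement, name) -> statement.setString(1, name),
               new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                   // The "jdbc:crate://" prefix selects the legacy CrateDB driver,
                   // so nothing derives a strict PostgreSQL dialect from the URL.
                   .withUrl("jdbc:crate://localhost:5432/")
                   .withDriverName("io.crate.client.jdbc.CrateDriver")
                   .withUsername("crate")
                   .build()));

        env.execute("CrateDB JDBC sink sketch");
    }
}
```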

Member Author

I've just added 159010b, in order to improve this detail further. What do you think about it?


Another example is the one from crate/crate#11608 (comment) but it may not be something to mention

Member Author

I've added it to 1684ea5, thanks!

LICENSE Outdated

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following


Can we remove this?

amotl (Member Author) commented Mar 24, 2023

It got removed already?

docs/index.rst Outdated
In turn, this will employ a few behaviours that strictly expect a PostgreSQL
server on the other end, so that some operations will croak on databases
offering wire-compatibility with PostgreSQL, but do not provide the
corresponding features like the `hstore`_ or `jsonb`_ extensions.


"certain features" instead of "corresponding features" perhaps?

Member Author

Addressed with 1684ea5, thanks!
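
To illustrate the hstore/jsonb point from the snippet above, a hedged sketch of the kind of statement that works against a genuine PostgreSQL server but is expected to fail against a database that only offers wire-compatibility; connection details are illustrative.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class JsonbProbe {

    public static void main(String[] args) {
        // CrateDB also listens on the PostgreSQL wire protocol (port 5432 by
        // default), so the official driver connects just fine.
        String url = "jdbc:postgresql://localhost:5432/doc";
        try (Connection conn = DriverManager.getConnection(url, "crate", "");
             Statement stmt = conn.createStatement()) {
            // Succeeds on a real PostgreSQL server; expected to fail on CrateDB,
            // which has no jsonb type.
            stmt.executeQuery("SELECT '{\"a\": 1}'::jsonb");
            System.out.println("jsonb is available");
        } catch (SQLException e) {
            System.out.println("jsonb not supported here: " + e.getMessage());
        }
    }
}
```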


amotl added 6 commits March 24, 2023 15:53
- Describe why it is still needed in a few cases.
- Also add the recommendation to use the official driver in other cases.
In order to better educate users about "what's inside", this patch
refactors a section of the documentation from the appendix to the front
page.
amotl force-pushed the amo/improve-docs branch from 1684ea5 to 274bf0a on March 24, 2023 14:53
amotl merged commit eda489f into master on Mar 24, 2023
amotl deleted the amo/improve-docs branch on March 24, 2023 14:56
will implicitly select the corresponding dialect implementation for PostgreSQL.

In turn, this will employ a few behaviours that strictly expect a PostgreSQL
server on the other end, so that some operations will croak on databases
Contributor

minor, I'd prefer: croak -> fail

Member Author

Addressed with GH-369. Thanks!

matriv (Contributor) left a comment

Thank you @amotl !
