Request for comments: SQL to Go code generator that leverages pgx #915
Regarding usage of pgx APIs: I didn't see any red flags. The only suggestion I have is maybe to replace Query with QueryFunc.

Regarding a trace API: sounds interesting. There might be a bit of overlap with the existing logging system; not sure whether that would be a problem or not. I'd be able to review PRs or design sketches, but I don't currently have availability to work on it myself.
Interesting, how would QueryFunc be faster? I figured the indirection through the function call would cause QueryFunc to always be slower than Query.
Perfect, I'll get codegen off the ground first, but I'll loop you in once I start taking a look at tracing. The sqlc maintainer pointed me at some prior art, https://github.com/ngrok/sqlmw, which uses the default sql driver. Also, I've renamed the project to pggen (https://github.com/jschaf/pggen) to avoid confusion with the existing sqlc library.
It costs an extra function call, but it avoids the allocation for the variadic argument to Scan (technically you can avoid that allocation manually, but it's not typically done). There are also some possible future optimizations: Scan may be called with different variables for each row, even entirely different types, which means Scan has to do type checking on every row. With QueryFunc it might be possible to do that check once for the whole result, which would save several cycles per value read. Though to be honest, for the vast majority of real-world use the performance likely won't matter one way or the other. The big win for me is making the API harder to misuse.
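For concreteness, here is a rough sketch of the two calling styles, assuming pgx v4's Query/QueryFunc; the widgets table and column names are hypothetical:

```go
package example

import (
	"context"

	"github.com/jackc/pgx/v4"
)

// queryStyle uses Query + Scan. The arguments to Scan are a variadic
// []interface{}, so each row pays for that allocation, and Scan re-checks the
// destination types on every row.
func queryStyle(ctx context.Context, conn *pgx.Conn) ([]string, error) {
	rows, err := conn.Query(ctx, "select name from widgets where weight > $1", 10)
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	var names []string
	for rows.Next() {
		var name string
		if err := rows.Scan(&name); err != nil {
			return nil, err
		}
		names = append(names, name)
	}
	return names, rows.Err()
}

// queryFuncStyle uses QueryFunc. The scan destinations are passed once as a
// slice, the callback runs per row, and pgx closes the rows and surfaces any
// error for you, which is the "harder to misuse" part.
func queryFuncStyle(ctx context.Context, conn *pgx.Conn) ([]string, error) {
	var name string
	var names []string
	_, err := conn.QueryFunc(ctx,
		"select name from widgets where weight > $1",
		[]interface{}{10},
		[]interface{}{&name},
		func(pgx.QueryFuncRow) error {
			names = append(names, name)
			return nil
		},
	)
	return names, err
}
```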
Ah, I see, thanks for explaining. Since I know the types ahead of time, it seems I could generate more optimal code. I know the code-generation use case is not the norm, but it'd be neat if I could leverage a lower-level, "unsafe" interface.
Well, you can use a lower-level interface. See the benchmarks for pgx for more info -- in particular these three: scan simple, explicit decoding, and raw prepared.
Scan simple is the normal way pgx is used. Explicit decoding avoids Scan and manually decodes each value. Raw prepared measures simply sending the query and receiving the result bytes while doing no decoding; that should be the theoretical limit of performance. As you can see, even the slowest one is within 19% of that limit. In almost all cases, network time and PG server time vastly overwhelm pgx time.
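As an illustration of what the rawest application-level path looks like, here is a minimal sketch (assuming pgx v4 and the same hypothetical widgets table) that skips Scan entirely and reads the raw result bytes via Rows.RawValues; decoding those bytes is left to the caller:

```go
package example

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v4"
)

// rawWidgetRows skips Scan entirely: RawValues returns one []byte per selected
// column in the PostgreSQL wire format (text or binary, depending on how the
// query was sent), and decoding those bytes is left to the caller.
func rawWidgetRows(ctx context.Context, conn *pgx.Conn) error {
	rows, err := conn.Query(ctx, "select id, name from widgets")
	if err != nil {
		return err
	}
	defer rows.Close()
	for rows.Next() {
		raw := rows.RawValues() // only valid until the next call to Next
		fmt.Printf("%d columns, id bytes: %x\n", len(raw), raw[0])
	}
	return rows.Err()
}
```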
Hi! I'm working on sqld, a code generator that takes SQL queries and outputs type-safe Go code using pgx. I'm a big fan of pgx, and if you have some time I'd love to get your thoughts on how sqld uses pgx, especially around the choice of query APIs and a possible trace API:
RFC: sqld: Go code generation for Postgres (Google doc)
To summarize, sqld takes a query like:
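(The snippet below is an illustrative sketch with a hypothetical author table and an sqlc-style name annotation; the exact example is in the RFC linked above.)

```sql
-- name: FindAuthors :many
SELECT author_id, first_name, last_name
FROM author
WHERE last_name = $1;
```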
And generates:
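(Again, a sketch of the general shape of the output rather than sqld's actual generated code; names like Querier and FindAuthorsRow are illustrative.)

```go
package authors

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v4"
)

const findAuthorsSQL = `SELECT author_id, first_name, last_name
FROM author
WHERE last_name = $1;`

// FindAuthorsRow mirrors the columns selected by the query.
type FindAuthorsRow struct {
	AuthorID  int32
	FirstName string
	LastName  string
}

// Querier wraps a pgx connection; generated query methods hang off it.
type Querier struct {
	conn *pgx.Conn
}

// FindAuthors runs the query with the given parameter and scans every row
// into a typed struct.
func (q *Querier) FindAuthors(ctx context.Context, lastName string) ([]FindAuthorsRow, error) {
	rows, err := q.conn.Query(ctx, findAuthorsSQL, lastName)
	if err != nil {
		return nil, fmt.Errorf("query FindAuthors: %w", err)
	}
	defer rows.Close()
	var items []FindAuthorsRow
	for rows.Next() {
		var item FindAuthorsRow
		if err := rows.Scan(&item.AuthorID, &item.FirstName, &item.LastName); err != nil {
			return nil, fmt.Errorf("scan FindAuthors row: %w", err)
		}
		items = append(items, item)
	}
	return items, rows.Err()
}
```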
sqld is heavily inspired by sqlc. The main differences are that sqld gets type information directly from Postgres, and sqld uses pgx instead of the standard library sql types.