V1.0.0 #64

Open
wants to merge 33 commits into base: master

Changes from all commits (33 commits)
bd981dc
feat(ecto-adapter): Major refactor
PM-Pepsico Nov 2, 2022
8e60117
feat(test): sqlite schema testing
PM-Pepsico Oct 19, 2022
dd13098
feat(snowflex): Use field source for database migration
dalhorinek Nov 16, 2022
0c0f408
fix(parse_result): Return success true for updates
PM-Pepsico Nov 29, 2022
f5d7ebc
fix(connection): Simplify passing options
PM-Pepsico Nov 29, 2022
421d64d
feat(stream): Add error if params passed
PM-Pepsico Nov 29, 2022
25984fc
fix(encode): Make all strings wvchar
PM-Pepsico Dec 9, 2022
61fe9cb
feat(snowflex): Add option to create migration for dynamic table
dalhorinek Dec 8, 2022
ffcc04e
reorder type encoding
mphfish Dec 13, 2022
0b93424
fix(connection): Rollback on disconnect
PM-Pepsico Jan 12, 2023
fc90ad8
fix(connection): Fix check for auto_commit mode
PM-Pepsico Jan 12, 2023
82bbad5
feat(ecto-adapter): implement `query_many/4` (#65)
dustinfarris Jan 24, 2023
7bd077a
build(deps): make sqlite3 opt-in (#66)
dustinfarris Feb 1, 2023
8a6c6ed
refactor: only define test repo if sqlite3 is available (#67)
dustinfarris Feb 8, 2023
b2a68e1
build(deps): remove unused dependencies
dustinfarris Feb 15, 2023
02d2ecb
Merge pull request #68 from pepsico-ecommerce/unused-deps
Ch4s3 Mar 30, 2023
d902d77
Don't store connection details in connection
kellyfelkins Apr 4, 2023
4617e47
Merge pull request #71 from pepsico-ecommerce/DAPPS-2084-remove-sensi…
kellyfelkins Apr 5, 2023
c87a686
fix(connection): Keep non-sensitive conn_opts in state
PM-Pepsico Apr 6, 2023
84e597c
Update limit function to not match for QueryExpr
dalhorinek May 9, 2023
83a1330
fix(snowflex-sqlite-test): typo in test name
BruceBC Oct 23, 2023
0064433
feat(ecto_adapter): cast maybe dates to date
BruceBC Oct 23, 2023
621f9cb
Merge pull request #78 from pepsico-ecommerce/maybe-date-type
BruceBC Oct 23, 2023
874cc4d
chore(changelog): add 0.5.3 update
BruceBC Oct 23, 2023
c371d58
chore(ecto_adatper): Update decoding behavior in adapters for Ecto 3.11
caleb-acosta Dec 5, 2023
6504bb4
Merge pull request #79 from pepsico-ecommerce/nil-data-type
caleb-acosta Dec 6, 2023
3eeebbb
Fix Logger.warn deprecation warnings.
kellyfelkins Mar 6, 2024
b6eb28c
Merge pull request #80 from pepsico-ecommerce/v1.0.0-fix-logger-depre…
kellyfelkins Mar 7, 2024
5699eea
chore: fix single-quoted warning
BruceBC Aug 14, 2024
b494802
chore: use Ecto.Type.type to determine db type
BruceBC Aug 14, 2024
77a50be
Merge pull request #81 from pepsico-ecommerce/bc/dependency-fixes
BruceBC Aug 16, 2024
70cf036
Revert "Merge pull request #81 from pepsico-ecommerce/bc/dependency-f…
BruceBC Aug 19, 2024
808c600
Merge pull request #82 from pepsico-ecommerce/bc/rollback-v1.0.0-chan…
BruceBC Aug 19, 2024
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,12 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.5.3] - 2023-10-23

### Added

- Handle casting maybe date types [#78](https://github.com/pepsico-ecommerce/snowflex/pull/78)

## [0.5.2] - 2022-11-18

### Added
114 changes: 60 additions & 54 deletions README.md
@@ -1,7 +1,7 @@
**THIS IS A WORK IN PROGRESS. USE AT YOUR OWN RISK.**

# Snowflex ❄💪

**THIS IS A WORK IN PROGRESS. USE AT YOUR OWN RISK.**

[![Module Version](https://img.shields.io/hexpm/v/snowflex.svg)](https://hex.pm/packages/snowflex)
[![Hex Docs](https://img.shields.io/badge/hex-docs-lightgreen.svg)](https://hexdocs.pm/snowflex/)
[![Total Download](https://img.shields.io/hexpm/dt/snowflex.svg)](https://hex.pm/packages/snowflex)
@@ -19,19 +19,6 @@ config :snowflex,
driver: "/path/to/my/ODBC/driver" # defaults to "/usr/lib/snowflake/odbc/lib/libSnowflake.so"
```

Connection pools are not automatically started for you. You will need to define and establish each connection pool in your application module. Configuration values related to connection timeouts and the mapping of `:null` query values can be set here.

First, create a module to hold your connection information:

```elixir
defmodule MyApp.SnowflakeConnection do
use Snowflex.Connection,
otp_app: :my_app,
timeout: :timer.minutes(20),
map_nulls_to_nil?: true
end
```

Define your configuration:

```elixir
@@ -58,10 +45,6 @@ config :my_app, MyApp.SnowflakeConnection,
]
```

The odbc driver will, by default, return `:null` for empty values returned from snowflake queries.
This will be converted to `nil` by default by Snowflex. A configuration value `map_nulls_to_nil?`
can be set to `false` if you do not desire this behavior.
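
As a hedged illustration of this behavior in the pre-1.0 API (the pool name and query below are placeholders, and passing the option per call is inferred from the `Snowflex.sql_query/3` signature shown later in this diff):

```elixir
# Placeholder pool name and query; :timeout and :map_nulls_to_nil? are the
# query_opts accepted by Snowflex.sql_query/3 in the pre-1.0 API.
Snowflex.sql_query(
  :snowflake_pool,
  "SELECT id, name FROM my_table",
  timeout: :timer.seconds(30),
  map_nulls_to_nil?: false
)
```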

Then, in your application module, you would start your connection:

```elixir
@@ -86,11 +69,11 @@ end
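
A minimal sketch of starting the connection under a supervision tree, assuming the `MyApp.SnowflakeConnection` module defined above (the `MyApp.Application` module and child-spec shape are assumptions, not taken from the collapsed hunk):

```elixir
# Sketch only: the exact child spec generated by Snowflex.Connection may differ.
defmodule MyApp.Application do
  use Application

  def start(_type, _args) do
    children = [
      MyApp.SnowflakeConnection
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: MyApp.Supervisor)
  end
end
```
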
If you are planning to connect to the Snowflake warehouse, your local Erlang instance
must have ODBC enabled. The erlang installed by Homebrew does NOT have ODBC support. The `asdf`
version of erlang does have ODBC support. You will also need the Snowflake ODBC driver installed
on your machine. You can download this from https://sfc-repo.snowflakecomputing.com/odbc/index.html.
on your machine. You can download this from <https://sfc-repo.snowflakecomputing.com/odbc/index.html>.

### Apple Silicon

Snowflake has a native `macaarch64 driver` available from https://sfc-repo.snowflakecomputing.com/odbc/macaarch64/index.html. However Erlang is unable to find the `unixodbc` files by default after Homebrew [changed their installation directory](https://github.com/Homebrew/brew/issues/9177) from `/usr/local` to `/opt/homebrew`.
Snowflake has a native `macaarch64 driver` available from <https://sfc-repo.snowflakecomputing.com/odbc/macaarch64/index.html>. However Erlang is unable to find the `unixodbc` files by default after Homebrew [changed their installation directory](https://github.com/Homebrew/brew/issues/9177) from `/usr/local` to `/opt/homebrew`.

We can build Erlang with `asdf` and ensure the correct files are included so that `odbc.app` is available when running Elixir.

@@ -108,7 +91,7 @@ rm ~/.asdf/plugins/erlang/kerl-home/otp_installations

We can now get the necessary ODBC and OpenSSL files from Brew, set their correct locations in the environment, and build Erlang and Elixir with `asdf` like so:

``` sh
```sh
brew install unixodbc
brew install [email protected]
export KERL_CONFIGURE_OPTIONS="--with-odbc=$(brew --prefix unixodbc) --with-ssl=$(brew --prefix [email protected])"
@@ -122,13 +105,14 @@ unset LDFLAGS
```

You will then need to add the following to `/opt/snowflake/snowflakeodbc/lib/simba.snowflake.ini`
```

```ini
ODBCInstLib=/opt/homebrew/Cellar/unixodbc/2.3.11/lib/libodbcinst.dylib
```

And finally, ensure that your Elixir config has the correct driver location:

``` elixir
```elixir
config :snowflex, driver: "/opt/snowflake/snowflakeodbc/lib/libSnowflake.dylib"
```

@@ -139,62 +123,84 @@ The package can be installed by adding `:snowflex` to your list of dependencies
```elixir
def deps do
[
{:snowflex, "~> 0.5.1"}
{:snowflex, "~> 1.0.0"}
]
end
```

## DBConnection Support

[DBConnection](https://github.com/elixir-ecto/db_connection) support is currently in an experimental phase. Setting it up is very similar to the current implementation, with the exception of the configuration options, and obtaining the same results will require an extra step:
## Usage

### Configuration:
An Ecto adapter is provided to allow usage similar to any other SQL-backed adapter.

Setting a Module to hold the connection is very similar, but instead you'll use `Snowflex.DBConnection`:

Example:
Simply declare an Ecto Repo using the Snowflex adapter, define Ecto Schema modules, and use standard Ecto functions.

```elixir
defmodule MyApp.SnowflakeConnection do
use Snowflex.DBConnection,
defmodule MyApp.Repo do
use Ecto.Repo,
otp_app: :my_app,
timeout: :timer.minutes(5)
adapter: Snowflex.EctoAdapter
end
```

```elixir
config :my_app, MyApp.SnowflakeConnection,
pool_size: 5, # the connection pool size
worker: MyApp.CustomWorker, # defaults to Snowflex.DBConnection.Server
connection: [
role: "PROD",
warehouse: System.get_env("SNOWFLAKE_POS_WH"),
uid: System.get_env("SNOWFLAKE_POS_UID"),
pwd: System.get_env("SNOWFLAKE_POS_PWD")
]
defmodule MyApp.Schema do
use Ecto.Schema

schema "schema" do
field(:x, :integer)
field(:y, :integer)
field(:z, :integer)
end
end
```

### Usage:
```elixir
MyApp.Repo.all(MyApp.Schema)
```
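
A couple of further hedged examples of standard Ecto usage against the repo (the field names come from the illustrative schema above):

```elixir
import Ecto.Query

# Filtering and selecting work as with any other Ecto adapter.
MyApp.Repo.all(from(s in MyApp.Schema, where: s.x > 10, select: {s.y, s.z}))

# Aggregates go through the same adapter as well.
MyApp.Repo.aggregate(MyApp.Schema, :count, :x)
```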

After setup, you can use your connection to query:
## Testing

```elixir
alias Snowflex.DBConnection.Result
Testing Ecto Schemas without connecting to a live Snowflake database is made possible by swapping out the Ecto adapter. A test repo that uses SQLite is provided.

The largest difficulty with using another adapter is that there are no migrations to get the test repo into a usable state for testing. This is solved by the `generate_migrations/2` macro in the `MigrationGenerator` module. The test repo must also be created before and dropped after each run of the test suite so that the generated migrations run from a blank state.

{:ok, %Result{} = result} = MyApp.SnowflakeConnection.execute("my query")
{:ok, %Result{} = result} = MyApp.SnowflakeConnection.execute("my query", ["my params"])
Install `ecto_sqlite3` in `mix.exs`:

```elixir
{:ecto_sqlite3, "~> 0.8", only: [:test]},
```

As you can see, we now receive an `{:ok, result}` tuple; to get results as expected with the current implementation, we need to call `process_result/1`:
And update the `test/test_helper.exs` file as follows:

```elixir
alias Snowflex.DBConnection.Result
require Snowflex.MigrationGenerator

{:ok, %Result{} = result} = MyApp.SnowflakeConnection.execute("my query")
opts = [strategy: :one_for_one, name: Snowflex.Supervisor]
Supervisor.start_link([Snowflex.SQLiteTestRepo], opts)

[%{"col" => 1}, %{"col" => 2}] = SnowflakeDBConnection.process_result(result)
Snowflex.SQLiteTestRepo.__adapter__().storage_up(Snowflex.SQLiteTestRepo.config())

Snowflex.MigrationGenerator.generate_migrations(Snowflex.SQLiteTestRepo, [
TestSchema,
TestSchema2,
TestSchema3
])

ExUnit.start()

ExUnit.after_suite(fn _ ->
Snowflex.SQLiteTestRepo.__adapter__().storage_down(Snowflex.SQLiteTestRepo.config())
end)
```

Refer to `test/snowflex_sqlite_test.exs` for usage.
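
As a rough, hypothetical sketch of such a test (the module name is an assumption, not the contents of the actual test file; `TestSchema` is one of the schemas passed to `generate_migrations/2` above):

```elixir
defmodule MyApp.SnowflexSQLiteTest do
  use ExUnit.Case

  alias Snowflex.SQLiteTestRepo

  setup do
    # The test config uses Ecto.Adapters.SQL.Sandbox, so check out a connection per test.
    :ok = Ecto.Adapters.SQL.Sandbox.checkout(SQLiteTestRepo)
  end

  test "inserts and reads back a row through the SQLite test repo" do
    {:ok, _row} = SQLiteTestRepo.insert(%TestSchema{x: 1, y: 2, z: 3})

    assert [%TestSchema{x: 1, y: 2, z: 3}] = SQLiteTestRepo.all(TestSchema)
  end
end
```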

## Copyright and License

Copyright (c) 2020 PepsiCo, Inc.
12 changes: 12 additions & 0 deletions config/test.exs
@@ -15,3 +15,15 @@ config :snowflex, Snowflex.DBConnectionTest.SnowflakeDBConnection,
role: "DEV",
warehouse: "CUSTOMER_DEV_WH"
]

config :logger, level: :warn

config :snowflex, ecto_repos: [Snowflex.SQLiteTestRepo]

config :snowflex, repo: Snowflex.SQLiteTestRepo

config :snowflex, Snowflex.SQLiteTestRepo,
database: "test/snowflex_test_repo.sql",
journal_mode: :delete,
temp_store: :memory,
pool: Ecto.Adapters.SQL.Sandbox
145 changes: 39 additions & 106 deletions lib/snowflex.ex
@@ -1,75 +1,53 @@
defmodule Snowflex do
@moduledoc """
The client interface for connecting to the Snowflake data warehouse.

The main entry point to this module is `Snowflex.sql_query`. This function takes a string containing
a SQL query and returns a list of maps (one per row). NOTE: due to the way Erlang ODBC works, all values come back
as strings. You will need to cast values appropriately.
"""
alias Ecto.Changeset
alias Snowflex.Worker

# Shamelessly copied from http://erlang.org/doc/man/odbc.html#common-data-types-
@type precision :: integer()
@type scale :: integer()
@type size :: integer()
@type odbc_data_type ::
:sql_integer
| :sql_smallint
| :sql_tinyint
| {:sql_decimal, precision(), scale()}
| {:sql_numeric, precision(), scale()}
| {:sql_char, size()}
| {:sql_wchar, size()}
| {:sql_varchar, size()}
| {:sql_wvarchar, size()}
| {:sql_float, precision()}
| {:sql_wlongvarchar, size()}
| {:sql_float, precision()}
| :sql_real
| :sql_double
| :sql_bit
| atom()
@type value :: nil | term()

@type query_param :: {odbc_data_type(), [value()]}
@type sql_data :: list(%{optional(String.t()) => String.t()})
@type query_opts :: [timeout: timeout(), map_nulls_to_nil?: boolean()]
defmacrop is_iodata(data) do
quote do
is_list(unquote(data)) or is_binary(unquote(data))
end
end

@spec sql_query(atom(), String.t(), query_opts()) ::
sql_data() | {:error, term()} | {:updated, integer()}
def sql_query(pool_name, query, opts) do
timeout = Keyword.get(opts, :timeout)
def child_spec(options) do
DBConnection.child_spec(Snowflex.Connection, options)
end

case :poolboy.transaction(
pool_name,
&Worker.sql_query(&1, query, timeout),
timeout
) do
{:ok, results} -> process_results(results, opts)
err -> err
end
def prepare_execute(conn, name, statement, params \\ [], opts \\ [])
when is_iodata(name) and is_iodata(statement) do
query = %Snowflex.Query{name: name, statement: statement}
DBConnection.prepare_execute(conn, query, params, opts)
end

@spec param_query(atom(), String.t(), list(query_param()), query_opts()) ::
sql_data() | {:error, term()} | {:updated, integer()}
def param_query(pool_name, query, params, opts) do
timeout = Keyword.get(opts, :timeout)
def query(conn, statement, params \\ [], options \\ []) when is_iodata(statement) do
name = options[:cache_statement]
query_type = options[:query_type] || :binary

case :poolboy.transaction(
pool_name,
&Worker.param_query(&1, query, params, timeout),
timeout
) do
{:ok, results} -> process_results(results, opts)
err -> err
cond do
name != nil ->
statement = IO.iodata_to_binary(statement)
query = %Snowflex.Query{name: name, statement: statement, cache: :statement}
do_query(conn, query, params, options)

query_type in [:binary, :binary_then_text] ->
query = %Snowflex.Query{name: "", statement: statement}
do_query(conn, query, params, options)
end
end

def cast_results(data, schema) do
Enum.map(data, &cast_row(&1, schema))
# @spec execute(conn(), Snowflex.Query.t(), list(), [option()]) ::
# {:ok, MyXQL.Query.t(), MyXQL.Result.t()} | {:error, Exception.t()}
def execute(conn, %Snowflex.Query{} = query, params \\ [], opts \\ []) do
DBConnection.execute(conn, query, params, opts)
end

defp do_query(conn, %Snowflex.Query{} = query, params, options) do
conn
|> DBConnection.prepare_execute(query, params, options)
|> query_result()
end

defp query_result({:ok, _query, result}), do: {:ok, result}
defp query_result({:error, _} = error), do: error

def int_param(val), do: {:sql_integer, val}
def string_param(val, length \\ 250), do: {{:sql_varchar, length}, val}

@@ -83,55 +61,10 @@ defmodule Snowflex do
end
end

# Helpers

defp process_results(data, opts) when is_list(data) do
Enum.map(data, &process_results(&1, opts))
end

defp process_results({:selected, headers, rows}, opts) do
map_nulls_to_nil? = Keyword.get(opts, :map_nulls_to_nil?)

bin_headers =
headers
|> Enum.map(fn header -> header |> to_string() |> String.downcase() end)
|> Enum.with_index()

Enum.map(rows, fn row ->
Enum.reduce(bin_headers, %{}, fn {col, index}, map ->
data =
row
|> elem(index)
|> handle_encoding()
|> to_string_if_charlist()
|> map_null_to_nil(map_nulls_to_nil?)

Map.put(map, col, data)
end)
end)
end

defp process_results(results), do: results

defp process_results({:updated, _} = results, _opts), do: results

defp to_string_if_charlist(data) when is_list(data), do: to_string(data)
defp to_string_if_charlist(data), do: data

defp map_null_to_nil(:null, true), do: nil
defp map_null_to_nil(data, _), do: data

defp handle_encoding(data) when is_list(data) do
raw = :erlang.list_to_binary(data)

case :unicode.characters_to_binary(raw) do
utf8 when is_binary(utf8) -> utf8
_ -> :unicode.characters_to_binary(raw, :latin1)
end
def cast_results(data, schema) do
Enum.map(data, &cast_row(&1, schema))
end

defp handle_encoding(data), do: data

defp cast_row(row, schema) do
schema
|> struct()