fix: Also build rdkafka with ssl if ssl feature is enabled #881

Merged: untitaker merged 6 commits into master from fix/rdkafka-ssl on Dec 15, 2020

Conversation

untitaker (Member) commented Dec 9, 2020

Fix #879

librdkafka can only link against OpenSSL for TLS support, while Relay's HTTP stack uses the native-tls abstraction, which does not require OpenSSL outside of Linux (see Relay's release page; we also offer a standalone .exe outside of Docker). This poses a problem for our local dev workflow: all our workstations are OS X, have no OpenSSL, but we still want to compile Relay in processing mode. The solution is a separate feature flag, kafka-ssl.
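
For illustration, the split described above might look like this on the command line (a sketch only; the ssl, kafka-ssl and processing feature names come from this PR, and the exact invocations are not taken from the repository's Makefile):

    # macOS dev workstation without OpenSSL: processing mode, HTTP TLS via native-tls
    cargo build --features ssl,processing

    # Linux/Docker build: additionally link librdkafka against OpenSSL
    cargo build --features ssl,kafka-ssl,processing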

untitaker requested a review from a team on December 9, 2020, 14:49
ajacques commented Dec 9, 2020

I tried to build this version, but got this:

 ---> Running in a90130a73f3b
cd relay && cargo build --release --locked --features ssl,processing --target=x86_64-unknown-linux-gnu
    Updating crates.io index
    Updating git repository `https://github.com/getsentry/sentry-rust`
    Updating git repository `https://github.com/getsentry/rust-json-forensics`
    Updating git repository `https://github.com/luser/rust-minidump`
error: the lock file /work/Cargo.lock needs to be updated but --locked was passed to prevent this
If you want to try to generate the lock file without accessing the network, use the --offline flag.
make: *** [Makefile:27: build-linux-release] Error 101

Removing --locked to test it, then gives me:

error: failed to run custom build command for `rdkafka-sys v2.1.0+1.5.0`

Caused by:
  process didn't exit successfully: `/work/target/release/build/rdkafka-sys-6dd6a1c0ff593a0d/build-script-build` (exit code: 101)
  --- stdout
  Configuring and compiling librdkafka
  running: "cmake" "/usr/local/cargo/registry/src/github.aaakk.us.kg-1ecc6299db9ec823/rdkafka-sys-2.1.0+1.5.0/librdkafka" "-DRDKAFKA_BUILD_STATIC=1" "-DRDKAFKA_BUILD_TESTS=0" "-DRDKAFKA_BUILD_EXAMPLES=0" "-DCMAKE_INSTALL_LIBDIR=lib" "-DWITH_ZLIB=1" "-DWITH_SSL=1" "-DWITH_SASL=0" "-DWITH_ZSTD=0" "-DENABLE_LZ4_EXT=0" "-DCMAKE_INSTALL_PREFIX=/work/target/x86_64-unknown-linux-gnu/release/build/rdkafka-sys-2e1dfde9b3fdaf16/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_C_COMPILER=/usr/bin/cc" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_COMPILER=/usr/bin/c++" "-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_COMPILER=/usr/bin/cc" "-DCMAKE_BUILD_TYPE=RelWithDebInfo"
  -- The C compiler identification is GNU 8.3.0
  -- The CXX compiler identification is GNU 8.3.0
  -- Check for working C compiler: /usr/bin/cc
  -- Check for working C compiler: /usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /usr/bin/c++
  -- Check for working CXX compiler: /usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Looking for pow in m
  -- Looking for pow in m - found
  -- Could NOT find ZSTD (missing: ZSTD_LIBRARY ZSTD_INCLUDE_DIR) 
  -- Found ZLIB: /work/target/x86_64-unknown-linux-gnu/release/build/libz-sys-ba10cb51e4ee4672/out/build/libz.a (found version "1.2.11") 
  -- Configuring incomplete, errors occurred!
  See also "/work/target/x86_64-unknown-linux-gnu/release/build/rdkafka-sys-2e1dfde9b3fdaf16/out/build/CMakeFiles/CMakeOutput.log".

It's interesting how SSL works on the HTTP side, but not the Kafka side.

untitaker (Member, Author) commented:
@ajacques I think I figured it out, can you try again?

ajacques commented Dec 9, 2020

Not sure what I'm doing wrong, but it's still failing with the same error:

[ec2-user@ip-10-0-28-70 relay]$ git status
On branch fix/rdkafka-ssl
Your branch is up to date with 'origin/fix/rdkafka-ssl'.

nothing to commit, working tree clean
[ec2-user@ip-10-0-28-70 relay]$ git rev-parse HEAD
05729ab5df4d8f11781a8c7dd713b8ed8cdae5c5
error: failed to run custom build command for `rdkafka-sys v2.1.0+1.5.0`

Caused by:
  process didn't exit successfully: `/work/target/release/build/rdkafka-sys-6dd6a1c0ff593a0d/build-script-build` (exit code: 101)
  --- stdout
  Configuring and compiling librdkafka
  running: "cmake" "/usr/local/cargo/registry/src/github.aaakk.us.kg-1ecc6299db9ec823/rdkafka-sys-2.1.0+1.5.0/librdkafka" "-DRDKAFKA_BUILD_STATIC=1" "-DRDKAFKA_BUILD_TESTS=0" "-DRDKAFKA_BUILD_EXAMPLES=0" "-DCMAKE_INSTALL_LIBDIR=lib" "-DWITH_ZLIB=1" "-DWITH_SSL=1" "-DWITH_SASL=0" "-DWITH_ZSTD=0" "-DENABLE_LZ4_EXT=0" "-DCMAKE_INSTALL_PREFIX=/work/target/x86_64-unknown-linux-gnu/release/build/rdkafka-sys-2e1dfde9b3fdaf16/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_C_COMPILER=/usr/bin/cc" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_COMPILER=/usr/bin/c++" "-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_COMPILER=/usr/bin/cc" "-DCMAKE_BUILD_TYPE=RelWithDebInfo"
  -- The C compiler identification is GNU 8.3.0
  -- The CXX compiler identification is GNU 8.3.0
  -- Check for working C compiler: /usr/bin/cc
  -- Check for working C compiler: /usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /usr/bin/c++
  -- Check for working CXX compiler: /usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Looking for pow in m
  -- Looking for pow in m - found
  -- Could NOT find ZSTD (missing: ZSTD_LIBRARY ZSTD_INCLUDE_DIR) 
  -- Found ZLIB: /work/target/x86_64-unknown-linux-gnu/release/build/libz-sys-ba10cb51e4ee4672/out/build/libz.a (found version "1.2.11") 
  -- Configuring incomplete, errors occurred!
  See also "/work/target/x86_64-unknown-linux-gnu/release/build/rdkafka-sys-2e1dfde9b3fdaf16/out/build/CMakeFiles/CMakeOutput.log".

  --- stderr
  Building and linking librdkafka statically
  CMake Error at /usr/share/cmake-3.13/Modules/FindPackageHandleStandardArgs.cmake:137 (message):
    Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
    system variable OPENSSL_ROOT_DIR (missing: OPENSSL_CRYPTO_LIBRARY
    OPENSSL_INCLUDE_DIR)
  Call Stack (most recent call first):
    /usr/share/cmake-3.13/Modules/FindPackageHandleStandardArgs.cmake:378 (_FPHSA_FAILURE_MESSAGE)
    /usr/share/cmake-3.13/Modules/FindOpenSSL.cmake:412 (find_package_handle_standard_args)
    src/CMakeLists.txt:226 (find_package)


  thread 'main' panicked at '
  command did not execute successfully, got: exit code: 1

  build script failed, must exit now', /usr/local/cargo/registry/src/github.aaakk.us.kg-1ecc6299db9ec823/cmake-0.1.44/src/lib.rs:885:5
  note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
warning: build failed, waiting for other jobs to finish...
error: build failed
make: *** [Makefile:27: build-linux-release] Error 101

untitaker (Member, Author) commented:
No, sorry, you're right; this is a problem inside the Docker image. I am not sure what's going on, though: why does rust-rdkafka not link properly against OpenSSL while all the other stuff clearly can? It even uses the same crate (openssl-sys) to find the OpenSSL installation.
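
(For reference, a sketch of how the CMake-based librdkafka build can be pointed at an OpenSSL installation. The path below is hypothetical; OPENSSL_DIR is the variable openssl-sys consults, and OPENSSL_ROOT_DIR is the one CMake's FindOpenSSL module asks for in the error above.)

    # hypothetical prefix of the OpenSSL built earlier in the Docker image
    export OPENSSL_DIR=/usr/local/build/openssl
    export OPENSSL_ROOT_DIR="$OPENSSL_DIR"   # consumed by FindOpenSSL.cmake during the rdkafka-sys build
    make build-linux-release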

untitaker (Member, Author) commented:
@ajacques can you try again? This time it actually builds for me.

ajacques commented:
We have success! I was able to connect to my broker over SSL using this latest commit.

@@ -63,7 +63,9 @@ RUN echo "Building OpenSSL" \
FROM getsentry/sentry-cli:1 AS sentry-cli
FROM relay-deps AS relay-builder

ARG RELAY_FEATURES=ssl,processing
# ssl and processing are required for basic functionality in onprem
A reviewer (Member) commented:
This is not exactly true; our onprem setup works well without kafka-ssl. I would even go as far as to not build Relay with kafka-ssl, as I see this as a severe edge case; however, if you are confident that there are no side effects, we can leave it on.

untitaker (Member, Author) replied:
Yeah, by the comment I mean that processing and ssl are required; kafka-ssl is no extra cost to add.
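
(As an illustration, and assuming the Dockerfile keeps exposing RELAY_FEATURES as a build arg as in the diff above, a downstream image could opt in or out of kafka-ssl at build time; the image tag here is hypothetical.)

    docker build --build-arg RELAY_FEATURES=ssl,kafka-ssl,processing -t relay-local .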

untitaker merged commit 5f83948 into master on Dec 15, 2020
untitaker deleted the fix/rdkafka-ssl branch on December 15, 2020 at 10:07
jan-auer added a commit that referenced this pull request Dec 18, 2020
* master:
  feat: Enable use of multiple value types per field (#882)
  fix(user-report): Make all fields but event-id optional (#886)
  release: 20.12.1
  release: 20.12.0
  ci(release): Move to getsentry/publish for releases (#885)
  meta: Fix CODEOWNERS (#884)
  fix: Also build rdkafka with ssl if ssl feature is enabled (#881)
untitaker added a commit that referenced this pull request Dec 21, 2020
untitaker added a commit that referenced this pull request Dec 21, 2020
…889)

Co-authored-by: Jan Michael Auer <[email protected]>

This PR reverts #881, as the feature-flagging setup broke our development workflow.

Reverts #888
Reverts #881

We should find a different solution that does not prevent us from using --all-features.

#skip-changelog
Successfully merging this pull request may close these issues.

Can't connect to Kafka broker using SSL (#879)