From b9f5b65e095b3a905203c73643b92cc9e0f954cc Mon Sep 17 00:00:00 2001
From: Qi
Date: Sun, 24 Apr 2022 15:55:43 +0800
Subject: [PATCH] Squashed commit of the following:
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

commit 86de7040bd3320a225fd8712591fc80bd2754ace
Author: Murillo <103451714+gruceo@users.noreply.github.com>
Date:   Wed Apr 20 20:18:14 2022 -0300

    fix(cp) proper error handling for export_deflated_reconfigure_payload

commit 612648cab67aad86595dff1855218ff598faf761
Author: Murillo <103451714+gruceo@users.noreply.github.com>
Date:   Wed Apr 13 11:28:29 2022 -0300

    fix(wrpc) do a pcall for all export_deflated_reconfigure_payload calls

    We are already wrapping some calls to `export_deflated_reconfigure_payload()`
    inside a pcall in the `wrpc_control_plane.lua` file. This change wraps the
    remaining calls to `export_deflated_reconfigure_payload()` in this file in a
    pcall as well, so the CP no longer crashes when errors occur, for example
    during module initialization.

commit 3c89fa1c1e0d2504abb4f09405488026b40b054e
Author: Murillo <103451714+gruceo@users.noreply.github.com>
Date:   Mon Apr 11 16:05:09 2022 -0300

    fix(cp) do a pcall for all calls to export_deflated_reconfigure_payload

    We are already wrapping some calls to `export_deflated_reconfigure_payload()`
    inside a pcall in the `control_plane.lua` file. This change wraps the
    remaining calls to `export_deflated_reconfigure_payload()` in this file in a
    pcall as well, so the CP no longer crashes when errors occur, for example
    during module initialization.

commit 6f20f2f7e643cadac138d171b636c45e62898dc2
Author: Enrique García Cota
Date:   Fri Apr 22 15:18:24 2022 +0200

    tests(hybrid) mark test as flaky (#8713)

commit fb8aa2d8a3da2c4c0aaf4930aa4c82419749ce93
Author: Suika <100666470+Suika-Kong@users.noreply.github.com>
Date:   Fri Apr 22 01:24:15 2022 +0800

    fix(pdk) ignore user-set Transfer-Encoding (#8698)

commit 31ca6ea2b37147b8fd29cb3bd71c2ae1630bdf07
Author: Colin Hutchinson
Date:   Thu Apr 21 11:33:23 2022 +0000

    chore(release): cleanup the Jenkins release logic (#8706)

commit 39dd72834234963843d0bb530d013ae020c7f85d
Author: Aapo Talvensaari
Date:   Thu Apr 21 13:32:50 2022 +0300

    feat(clustering) atomic export of declarative config with Postgres

    This minimizes the possibility of inconsistencies in the exported config,
    especially under high Admin API update traffic.

commit 579537b494af136184aca3952a799291585b8b47
Author: Colin Hutchinson
Date:   Wed Apr 20 18:00:04 2022 +0000

    Revert "feat(dao) use `cache_key` for target uniqueness detection" (#8705)

    This reverts commit 9eba2a1e4b78711def54c3ea5096634d7760ba06.
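The pcall wrapping described in the fix(cp)/fix(wrpc) commits above comes down to the small helper shown below, lifted from the `kong/clustering/control_plane.lua` and `kong/clustering/wrpc_control_plane.lua` hunks further down in this patch; the caller is shown as a usage sketch.

```lua
-- Helper as added in both control plane files below: pcall() turns an error
-- raised inside export_deflated_reconfigure_payload() (e.g. while modules
-- are still initializing) into a regular return value instead of a crash.
local function handle_export_deflated_reconfigure_payload(self)
  local ok, p_err, err = pcall(self.export_deflated_reconfigure_payload, self)
  return ok, p_err or err
end

-- Usage sketch, mirroring the callers in the hunks below:
--   local ok, err = handle_export_deflated_reconfigure_payload(self)
--   if not ok then
--     ngx_log(ngx_ERR, _log_prefix, "unable to export initial config from database: ", err)
--   end
```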
commit a05cc4cb6401156714ece7e7a21d2984f4db933e
Author: Vinicius Mignot
Date:   Tue Apr 19 16:42:12 2022 -0300

    docs(CHANGELOG) added fix entry

commit 9a65902689a92921adcb14a05a76f4b281da457d
Author: Vinicius Mignot
Date:   Tue Apr 19 15:41:43 2022 -0300

    fix(balancer) do not reschedule resolve timer when reloading

commit f6aae6fac0ed079b251f6580c209ed46e8510b59
Author: Aapo Talvensaari
Date:   Tue Apr 19 17:56:27 2022 +0300

    chore(deps) bump luarocks 3.8.0 to 3.9.0 (#8700)

    * `builtin` build mode now always respects CC, CFLAGS and LDFLAGS
    * Check that lua.h version matches the desired Lua version
    * Check that the version of the Lua C library matches the desired Lua version
    * Fixed deployment of non-wrapped binaries
    * Fixed crash when `--lua-version` option is malformed
    * Fixed help message for `--pin` option
    * Unix: use native methods and don't always rely on $USER to determine user
    * Windows: use native CLI tooling more
    * macOS: support .tbd extension when checking for libraries
    * macOS: add XCode SDK path to search paths
    * macOS: add best-effort heuristic for library search using Homebrew paths
    * macOS: avoid quoting issues with LIBFLAG
    * macOS: deployment target is now 11.0 on macOS 11+
    * added DragonFly BSD support
    * LuaRocks test suite now runs on Lua 5.4 and LuaJIT
    * Internal dependencies of standalone LuaRocks executable were bumped

commit eb9a8ba7eced719734ed4d95b7f0549b6d753c30
Author: Aapo Talvensaari
Date:   Mon Apr 11 16:35:08 2022 +0300

    perf(conf) localize variables needed for configuration parsing

    Just localizes some variables for faster configuration parsing and
    tidier code.

commit 951b93f6fdef43bcf7d0d829a4432135185e448e
Author: Aapo Talvensaari
Date:   Mon Apr 11 15:57:22 2022 +0300

    fix(conf) properly support vault configurations with process secrets

    Default vault configurations can be configured with Kong configuration,
    for example using environment variables:
    - `KONG_VAULT_ENV_PREFIX=vault_`
    - `KONG_VAULT_HCV_TOKEN=xxx`

    Previously these settings were not honoured when Kong configuration
    references were dereferenced. This fixes that issue.

commit 3d583c8aa2a60126e8b841aedac5ab0a4dd6540b
Author: Aapo Talvensaari
Date:   Mon Apr 11 12:49:35 2022 +0300

    refactor(pdk) vault pdk to be more like rest of the pdk modules

    Refactor the Vault PDK to follow the other Kong PDK modules. This means
    that functions are created inside the `.new` function. This has the
    benefit of being able to access the up-value `self`, which means that no
    direct references to the global `kong` are needed. In general, it makes
    testing and mocking easier too. I need this so I can pass some initial
    configuration very early on, when Kong resolves process secrets for Kong
    configuration references.

commit 51565965fd268adf2e9de3f4345724a27ea5c31e
Author: Aapo Talvensaari
Date:   Fri Apr 8 16:33:33 2022 +0300

    feat(vaults) store dao references in $refs (needed for rotation)

    When references are used in dao fields with `referenceable=true`, Kong
    replaces the references with their values when the data is read
    (excluding the Admin API and control planes). When Kong replaces the
    reference, it is basically lost, and thus automatic secret rotation
    cannot be implemented. This commit stores the references on returned
    entities in a `"$refs"` property:

    ```
    local certificate = kong.db.certificates:select(...)
    -- the possible reference can be found here:
    print(certificate["$refs"].key)
    ```

    There will be helper functions, so the `"$refs"` property is not
    intended for end users.
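The `.new`-style refactor described in the vault PDK commit above changes the module shape roughly as sketched below. This is a simplified illustration only; the real module is in the `kong/pdk/vault.lua` hunk further down, and its `is_reference` check is more thorough. The point is that defining functions inside `new(self)` lets them use `self` instead of the global `kong` table.

```lua
-- Sketch only: functions are created inside new(self), so they capture
-- `self` (e.g. self.configuration) as an up-value and never need the
-- global `kong` table. This is what allows conf_loader to call
-- require("kong.pdk.vault").new({ configuration = vault_conf }) very early.
local function new(self)
  local _VAULT = {}

  -- simplified placeholder check; the real implementation inspects the
  -- "{vault://...}" shape byte by byte (see the kong/pdk/vault.lua hunk below)
  function _VAULT.is_reference(reference)
    return type(reference) == "string"
       and reference:sub(1, 8) == "{vault:/"
       and reference:sub(-1) == "}"
  end

  return _VAULT
end

return {
  new = new,
}
```

The conf_loader hunk below uses exactly this entry point: it collects the `vault_*` settings into a table and passes them via `.new({ configuration = vault_conf })`, which is how the `KONG_VAULT_*` settings from the fix(conf) commit above become visible while references are dereferenced.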
commit ac69743935f0c99f4846516b9154045456caaa4d
Author: Aapo Talvensaari
Date:   Fri Apr 8 16:00:49 2022 +0300

    fix(vaults) do not leak resolved vault references to .kong_env file

    When Kong prepares a `prefix` directory, it also stores the current
    Kong-related environment in a file called `.kong_env`. As Kong resolves
    the Vault references when it starts, the resolved values were leaked to
    the `.kong_env` file. This was partly because for `vaults-beta` we didn't
    yet implement secret rotation, and we decided to also not keep the
    references around once they were resolved. Now that we have added the
    `"$refs"` property to `kong.configuration`, we can replace the
    configuration values with the references before we write the `.kong_env`
    file. This commit fixes that.

commit 7f13cbc91341c6a62f27c1c2e7baff7a73198b63
Author: Aapo Talvensaari
Date:   Fri Apr 8 15:53:29 2022 +0300

    feat(vaults) store configuration references in $refs (needed for rotation and .kong_env cleanup)

    Kong vault references like `{vault://env/my-env-var}`, when used in Kong
    configuration, are replaced with the actual secrets. This makes it hard
    to implement secret rotation, as the reference is lost when it is
    replaced. This commit stores the original references on the side:

    ```lua
    kong.configuration["$refs"][<config-option>] = <reference>
    ```

commit bffa4af4df59c13b76236599ff59ab32f36210e8
Author: Mayo
Date:   Tue Apr 19 17:57:40 2022 +0800

    chore(ci) changelog label

    Any PR that includes a changelog change will add a “core/docs” label,
    which is unnecessary. This PR adds an extra 'changelog' label to detect
    changelog file changes.

commit 9eba2a1e4b78711def54c3ea5096634d7760ba06
Author: yankun-li-kong <77371186+yankun-li-kong@users.noreply.github.com>
Date:   Tue Apr 19 19:27:23 2022 +0900

    feat(dao) use `cache_key` for target uniqueness detection

    Add a new `cache_key(upstream, target)` in the targets table for atomic
    uniqueness detection. Delete the now-unneeded target uniqueness detection
    functions. The Targets API returns `409` when creating/updating duplicate
    targets. Add migration functions to add the `cache_key` column, delete
    duplicate targets and add `cache_key` for existing targets.

    Co-authored-by: Mayo

commit d7a8e66634685c7eb2d1e1e259a4ff975ecf3612
Author: Mayo
Date:   Tue Apr 19 17:36:33 2022 +0800

    fix(ldap-auth) free internal pointer after convert to lua string (#8696)

commit d4bdae5562c731c9a65c86eb92362d42297e02f4
Author: Mayo
Date:   Tue Apr 19 12:08:09 2022 +0800

    refactor(ldap-auth) openssl ffi based asn1 parser/decoder (#8663)

    Replace the asn1 parser/decoder with OpenSSL FFI based functions.
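A minimal, self-contained sketch of the `.kong_env` redaction idea from the first commit above; it mirrors the approach of the `kong/cmd/utils/prefix_handler.lua` hunk further down, but the function and variable names here are illustrative only.

```lua
-- Illustrative only: before the configuration is serialized into the
-- .kong_env file, any value that has an entry under kong_config["$refs"]
-- is swapped back to its original "{vault://...}" reference, so resolved
-- secrets are never written to disk.
local function serialize_without_secrets(kong_config)
  local lines = {}
  local refs = kong_config["$refs"]

  for k, v in pairs(kong_config) do
    if k ~= "$refs" then
      if refs and refs[k] then
        v = refs[k]  -- write the reference, not the resolved secret
      end
      lines[#lines + 1] = k .. " = " .. tostring(v)
    end
  end

  table.sort(lines)
  return table.concat(lines, "\n")
end

-- usage sketch:
print(serialize_without_secrets({
  pg_user     = "kong",
  pg_password = "resolved-secret-value",
  ["$refs"]   = { pg_password = "{vault://env/pg-password}" },
}))
-- pg_password = {vault://env/pg-password}
-- pg_user = kong
```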
commit 79f362de14a461337fb18c0d28ad7392488882c5 Author: Wheeler Law Date: Mon Apr 18 04:51:32 2022 -0500 chore(CODEOWNERS) add `CODEOWNERS` file to the repo --- .github/labeler.yml | 5 +- .requirements | 2 +- CHANGELOG.md | 15 +- CODEOWNERS | 1 + Jenkinsfile | 145 +--- kong/clustering/control_plane.lua | 12 +- kong/clustering/wrpc_control_plane.lua | 15 +- kong/cmd/utils/prefix_handler.lua | 6 + kong/cmd/vault.lua | 3 +- kong/conf_loader/init.lua | 203 +++--- kong/conf_loader/listeners.lua | 7 +- kong/db/declarative/init.lua | 37 ++ kong/db/schema/init.lua | 21 +- kong/pdk/response.lua | 22 +- kong/pdk/vault.lua | 594 ++++++++--------- kong/plugins/ldap-auth/asn1.lua | 617 ++++++++---------- kong/plugins/ldap-auth/ldap.lua | 102 ++- kong/runloop/balancer/targets.lua | 6 +- kong/vaults/env/init.lua | 5 +- spec/01-unit/03-conf_loader_spec.lua | 30 + spec/01-unit/04-prefix_handler_spec.lua | 25 + spec/02-integration/02-cmd/14-vault_spec.lua | 33 +- .../09-hybrid_mode/03-pki_spec.lua | 2 +- .../13-vaults/01-vault_spec.lua | 25 +- t/01-pdk/08-response/05-set_header.t | 29 + t/01-pdk/08-response/08-set_headers.t | 36 +- t/01-pdk/08-response/11-exit.t | 31 +- 27 files changed, 1074 insertions(+), 955 deletions(-) create mode 100644 CODEOWNERS diff --git a/.github/labeler.yml b/.github/labeler.yml index 760914eb6a4..f8539e47407 100644 --- a/.github/labeler.yml +++ b/.github/labeler.yml @@ -20,8 +20,11 @@ core/db/migrations: core/db: - any: ['kong/db/**/*', '!kong/db/migrations/**/*'] +changelog: +- CHANGELOG.md + core/docs: -- '**/*.md' +- any: ['**/*.md', '!CHANGELOG.md'] - 'kong/autodoc/**/*' core/language/go: diff --git a/.requirements b/.requirements index 4c4bb73009d..13537b16fdc 100644 --- a/.requirements +++ b/.requirements @@ -3,7 +3,7 @@ KONG_CONFLICTS=kong-enterprise-edition KONG_LICENSE="ASL 2.0" RESTY_VERSION=1.19.9.1 -RESTY_LUAROCKS_VERSION=3.8.0 +RESTY_LUAROCKS_VERSION=3.9.0 RESTY_OPENSSL_VERSION=1.1.1n RESTY_PCRE_VERSION=8.45 RESTY_LMDB_VERSION=master diff --git a/CHANGELOG.md b/CHANGELOG.md index 43700d3a060..89ddc37e6b6 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -64,6 +64,13 @@ ## Unreleased +### Fixes + +#### PDK + +- `pdk.response.set_header()`, `pdk.response.set_headers()`, `pdk.response.exit()` now ignore and emit warnings for manually set `Transfer-Encoding` headers. + [#8698](https://github.com/Kong/kong/pull/8698) + ### Breaking Changes #### Admin API @@ -72,6 +79,8 @@ method now. [#8596](https://github.com/Kong/kong/pull/8596). If you have scripts that depend on it being `POST`, these scripts will need to be updated when updating to Kong 3.0. +- Insert and update operations on duplicated target entities returns 409. + [#8179](https://github.com/Kong/kong/pull/8179) #### PDK @@ -113,7 +122,9 @@ - Bumped inspect from 3.1.2 to 3.1.3 [#8589](https://github.com/Kong/kong/pull/8589) - Bumped resty.acme from 0.7.2 to 0.8.0 - [#8680](https://github.com/Kong/kong/pull/8680 + [#8680](https://github.com/Kong/kong/pull/8680) +- Bumped luarocks from 3.8.0 to 3.9.0 + [#8700](https://github.com/Kong/kong/pull/8700) ### Additions @@ -144,6 +155,8 @@ - Fix issue where the Go plugin server instance would not be updated after a restart (e.g., upon a plugin server crash). [#8547](https://github.com/Kong/kong/pull/8547) +- Fixed an issue on trying to reschedule the DNS resolving timer when Kong was + being reloaded. 
[#8702](https://github.com/Kong/kong/pull/8702) #### Plugins diff --git a/CODEOWNERS b/CODEOWNERS new file mode 100644 index 00000000000..c9f58c8eeb0 --- /dev/null +++ b/CODEOWNERS @@ -0,0 +1 @@ +* @Kong/gateway diff --git a/Jenkinsfile b/Jenkinsfile index 289f71afaf0..f94f37fe389 100644 --- a/Jenkinsfile +++ b/Jenkinsfile @@ -57,7 +57,7 @@ pipeline { } } parallel { - stage('AmazonLinux') { + stage('RPM') { agent { node { label 'bionic' @@ -66,68 +66,8 @@ pipeline { environment { KONG_SOURCE_LOCATION = "${env.WORKSPACE}" KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" - AWS_ACCESS_KEY = credentials('AWS_ACCESS_KEY') - AWS_SECRET_ACCESS_KEY = credentials('AWS_SECRET_ACCESS_KEY') - } - steps { - sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' - sh 'make setup-kong-build-tools' - sh 'PACKAGE_TYPE=rpm RESTY_IMAGE_BASE=amazonlinux RESTY_IMAGE_TAG=2 make release' - } - } - stage('src & Alpine') { - agent { - node { - label 'bionic' - } - } - environment { - KONG_SOURCE_LOCATION = "${env.WORKSPACE}" - KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" - AWS_ACCESS_KEY = credentials('AWS_ACCESS_KEY') - AWS_SECRET_ACCESS_KEY = credentials('AWS_SECRET_ACCESS_KEY') - } - steps { - sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' - sh 'make setup-kong-build-tools' - sh 'PACKAGE_TYPE=src RESTY_IMAGE_BASE=src make release' - sh 'PACKAGE_TYPE=apk RESTY_IMAGE_BASE=alpine RESTY_IMAGE_TAG=3.14 CACHE=false DOCKER_MACHINE_ARM64_NAME="kong-"`cat /proc/sys/kernel/random/uuid` make release' - - } - } - stage('RedHat') { - agent { - node { - label 'bionic' - } - } - environment { - PACKAGE_TYPE = 'rpm' - RESTY_IMAGE_BASE = 'rhel' - KONG_SOURCE_LOCATION = "${env.WORKSPACE}" - KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" - PRIVATE_KEY_FILE = credentials('kong.private.gpg-key.asc') - PRIVATE_KEY_PASSPHRASE = credentials('kong.private.gpg-key.asc.password') - } - steps { - sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' - sh 'make setup-kong-build-tools' - sh 'cp $PRIVATE_KEY_FILE ../kong-build-tools/kong.private.gpg-key.asc' - sh 'RESTY_IMAGE_TAG=7 make release' - sh 'RESTY_IMAGE_TAG=8 make release' - } - } - stage('CentOS') { - agent { - node { - label 'bionic' - } - } - environment { - PACKAGE_TYPE = 'rpm' - RESTY_IMAGE_BASE = 'centos' - KONG_SOURCE_LOCATION = "${env.WORKSPACE}" - KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" + GITHUB_SSH_KEY = credentials('github_bot_ssh_key') + PACKAGE_TYPE = "rpm" PRIVATE_KEY_FILE = credentials('kong.private.gpg-key.asc') PRIVATE_KEY_PASSPHRASE = credentials('kong.private.gpg-key.asc.password') } @@ -135,99 +75,60 @@ pipeline { sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' sh 'make setup-kong-build-tools' sh 'cp $PRIVATE_KEY_FILE ../kong-build-tools/kong.private.gpg-key.asc' - sh 'RESTY_IMAGE_TAG=7 make release' - sh 'RESTY_IMAGE_TAG=8 make release' + sh 'make RESTY_IMAGE_BASE=amazonlinux RESTY_IMAGE_TAG=2 release' + sh 'make RESTY_IMAGE_BASE=centos RESTY_IMAGE_TAG=7 release' + sh 'make RESTY_IMAGE_BASE=centos RESTY_IMAGE_TAG=8 release' + sh 'make RESTY_IMAGE_BASE=rhel RESTY_IMAGE_TAG=7 release' + sh 'make RESTY_IMAGE_BASE=rhel RESTY_IMAGE_TAG=8 release' } } - stage('Debian OldStable') { + stage('DEB') { agent { node { label 'bionic' } } environment { - PACKAGE_TYPE = 'deb' - RESTY_IMAGE_BASE = 'debian' - 
KONG_SOURCE_LOCATION = "${env.WORKSPACE}" - KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" - } - steps { - sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' - sh 'make setup-kong-build-tools' - sh 'RESTY_IMAGE_TAG=stretch make release' - } - } - stage('Debian Stable & Testing') { - agent { - node { - label 'bionic' - } - } - environment { - PACKAGE_TYPE = 'deb' - RESTY_IMAGE_BASE = 'debian' - KONG_SOURCE_LOCATION = "${env.WORKSPACE}" - KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" - } - steps { - sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' - sh 'make setup-kong-build-tools' - sh 'RESTY_IMAGE_TAG=buster make release' - sh 'RESTY_IMAGE_TAG=bullseye make release' - } - } - stage('Ubuntu') { - agent { - node { - label 'bionic' - } - } - environment { - PACKAGE_TYPE = 'deb' - RESTY_IMAGE_BASE = 'ubuntu' - RESTY_IMAGE_TAG = 'bionic' KONG_SOURCE_LOCATION = "${env.WORKSPACE}" KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" + GITHUB_SSH_KEY = credentials('github_bot_ssh_key') + PACKAGE_TYPE = "deb" } steps { sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' sh 'make setup-kong-build-tools' - sh 'RESTY_IMAGE_TAG=bionic make release' - sh 'RESTY_IMAGE_TAG=focal make release' + sh 'make RESTY_IMAGE_BASE=debian RESTY_IMAGE_TAG=9 release' + sh 'make RESTY_IMAGE_BASE=debian RESTY_IMAGE_TAG=10 release' + sh 'make RESTY_IMAGE_BASE=debian RESTY_IMAGE_TAG=11 release' + sh 'make RESTY_IMAGE_BASE=ubuntu RESTY_IMAGE_TAG=16.04 release' + sh 'make RESTY_IMAGE_BASE=ubuntu RESTY_IMAGE_TAG=18.04 release' + sh 'make RESTY_IMAGE_BASE=ubuntu RESTY_IMAGE_TAG=20.04 release' } } - stage('Ubuntu Xenial') { + stage('SRC & Alpine') { agent { node { label 'bionic' } } environment { - PACKAGE_TYPE = 'deb' - RESTY_IMAGE_BASE = 'ubuntu' - RESTY_IMAGE_TAG = 'xenial' - CACHE = 'false' - UPDATE_CACHE = 'true' - USER = 'travis' KONG_SOURCE_LOCATION = "${env.WORKSPACE}" KONG_BUILD_TOOLS_LOCATION = "${env.WORKSPACE}/../kong-build-tools" + GITHUB_SSH_KEY = credentials('github_bot_ssh_key') + PACKAGE_TYPE = "rpm" AWS_ACCESS_KEY = credentials('AWS_ACCESS_KEY') AWS_SECRET_ACCESS_KEY = credentials('AWS_SECRET_ACCESS_KEY') } steps { sh 'echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin || true' sh 'make setup-kong-build-tools' - sh 'DOCKER_MACHINE_ARM64_NAME="jenkins-kong-"`cat /proc/sys/kernel/random/uuid` make release' - } - post { - cleanup { - dir('../kong-build-tools'){ sh 'make cleanup-build' } - } + sh 'make RESTY_IMAGE_BASE=src RESTY_IMAGE_TAG=src PACKAGE_TYPE=src release' + sh 'make RESTY_IMAGE_BASE=alpine RESTY_IMAGE_TAG=3.10 PACKAGE_TYPE=apk DOCKER_MACHINE_ARM64_NAME="kong-"`cat /proc/sys/kernel/random/uuid` release' } } } } - stage('Post Packaging Steps') { + stage('Post Release Steps') { when { beforeAgent true allOf { diff --git a/kong/clustering/control_plane.lua b/kong/clustering/control_plane.lua index b31c0c2faa8..81c023e239a 100644 --- a/kong/clustering/control_plane.lua +++ b/kong/clustering/control_plane.lua @@ -58,6 +58,12 @@ local REMOVED_FIELDS = require("kong.clustering.compat.removed_fields") local _log_prefix = "[clustering] " +local function handle_export_deflated_reconfigure_payload(self) + local ok, p_err, err = pcall(self.export_deflated_reconfigure_payload, self) + return ok, p_err or err +end + + local function plugins_list_to_map(plugins_list) local versions = {} for _, plugin in 
ipairs(plugins_list) do @@ -481,7 +487,7 @@ function _M:handle_cp_websocket() self.clients[wb] = queue if not self.deflated_reconfigure_payload then - _, err = self:export_deflated_reconfigure_payload() + _, err = handle_export_deflated_reconfigure_payload(self) end if self.deflated_reconfigure_payload then @@ -654,8 +660,8 @@ local function push_config_loop(premature, self, push_config_semaphore, delay) return end - local _, err = self:export_deflated_reconfigure_payload() - if err then + local ok, err = handle_export_deflated_reconfigure_payload(self) + if not ok then ngx_log(ngx_ERR, _log_prefix, "unable to export initial config from database: ", err) end diff --git a/kong/clustering/wrpc_control_plane.lua b/kong/clustering/wrpc_control_plane.lua index 9ef85eb2774..4685c24d6a2 100644 --- a/kong/clustering/wrpc_control_plane.lua +++ b/kong/clustering/wrpc_control_plane.lua @@ -50,6 +50,13 @@ local _log_prefix = "[wrpc-clustering] " local wrpc_config_service + +local function handle_export_deflated_reconfigure_payload(self) + local ok, p_err, err = pcall(self.export_deflated_reconfigure_payload, self) + return ok, p_err or err +end + + local function get_config_service(self) if not wrpc_config_service then wrpc_config_service = wrpc.new_service() @@ -171,8 +178,8 @@ end function _M:push_config_one_client(client) if not self.config_call_rpc or not self.config_call_args then - local payload, err = self:export_deflated_reconfigure_payload() - if not payload then + local ok, err = handle_export_deflated_reconfigure_payload(self) + if not ok then ngx_log(ngx_ERR, _log_prefix, "unable to export config from database: ", err) return end @@ -558,8 +565,8 @@ local function push_config_loop(premature, self, push_config_semaphore, delay) end do - local _, err = self:export_deflated_reconfigure_payload() - if err then + local ok, err = handle_export_deflated_reconfigure_payload(self) + if not ok then ngx_log(ngx_ERR, _log_prefix, "unable to export initial config from database: ", err) end end diff --git a/kong/cmd/utils/prefix_handler.lua b/kong/cmd/utils/prefix_handler.lua index b26e208b428..78e5d9bd802 100644 --- a/kong/cmd/utils/prefix_handler.lua +++ b/kong/cmd/utils/prefix_handler.lua @@ -485,7 +485,13 @@ local function prepare_prefix(kong_config, nginx_custom_template_path, skip_writ "", } + local refs = kong_config["$refs"] + for k, v in pairs(kong_config) do + if refs and refs[k] then + v = refs[k] + end + if type(v) == "table" then if (getmetatable(v) or {}).__tostring then -- the 'tostring' meta-method knows how to serialize diff --git a/kong/cmd/vault.lua b/kong/cmd/vault.lua index b6fa853bef9..ccec61467cc 100644 --- a/kong/cmd/vault.lua +++ b/kong/cmd/vault.lua @@ -42,7 +42,6 @@ end local function get(args) - local vault = require "kong.pdk.vault".new() if args.command == "get" then local reference = args[1] if not reference then @@ -51,6 +50,8 @@ local function get(args) init_db(args) + local vault = kong.vault + if not vault.is_reference(reference) then -- assuming short form: /[/] reference = fmt("{vault://%s}", reference) diff --git a/kong/conf_loader/init.lua b/kong/conf_loader/init.lua index e42139ba369..2380a55620f 100644 --- a/kong/conf_loader/init.lua +++ b/kong/conf_loader/init.lua @@ -1,3 +1,6 @@ +local require = require + + local kong_default_conf = require "kong.templates.kong_defaults" local openssl_pkey = require "resty.openssl.pkey" local pl_stringio = require "pl.stringio" @@ -16,9 +19,34 @@ local ffi = require "ffi" local fmt = string.format +local sub = string.sub +local 
type = type +local sort = table.sort +local find = string.find +local gsub = string.gsub +local strip = pl_stringx.strip +local floor = math.floor +local lower = string.lower +local upper = string.upper +local match = string.match +local pairs = pairs +local assert = assert +local unpack = unpack +local ipairs = ipairs +local insert = table.insert +local remove = table.remove local concat = table.concat +local getenv = os.getenv +local exists = pl_path.exists +local abspath = pl_path.abspath +local tostring = tostring +local tonumber = tonumber +local setmetatable = setmetatable + + local C = ffi.C + ffi.cdef([[ struct group *getgrnam(const char *name); struct passwd *getpwnam(const char *name); @@ -100,13 +128,13 @@ local HEADERS = constants.HEADERS local HEADER_KEY_TO_NAME = { ["server_tokens"] = "server_tokens", ["latency_tokens"] = "latency_tokens", - [string.lower(HEADERS.VIA)] = HEADERS.VIA, - [string.lower(HEADERS.SERVER)] = HEADERS.SERVER, - [string.lower(HEADERS.PROXY_LATENCY)] = HEADERS.PROXY_LATENCY, - [string.lower(HEADERS.RESPONSE_LATENCY)] = HEADERS.RESPONSE_LATENCY, - [string.lower(HEADERS.ADMIN_LATENCY)] = HEADERS.ADMIN_LATENCY, - [string.lower(HEADERS.UPSTREAM_LATENCY)] = HEADERS.UPSTREAM_LATENCY, - [string.lower(HEADERS.UPSTREAM_STATUS)] = HEADERS.UPSTREAM_STATUS, + [lower(HEADERS.VIA)] = HEADERS.VIA, + [lower(HEADERS.SERVER)] = HEADERS.SERVER, + [lower(HEADERS.PROXY_LATENCY)] = HEADERS.PROXY_LATENCY, + [lower(HEADERS.RESPONSE_LATENCY)] = HEADERS.RESPONSE_LATENCY, + [lower(HEADERS.ADMIN_LATENCY)] = HEADERS.ADMIN_LATENCY, + [lower(HEADERS.UPSTREAM_LATENCY)] = HEADERS.UPSTREAM_LATENCY, + [lower(HEADERS.UPSTREAM_STATUS)] = HEADERS.UPSTREAM_STATUS, } @@ -700,11 +728,11 @@ local function infer_value(value, typ, opts) if not opts.from_kong_env then -- remove trailing comment, if any -- and remove escape chars from octothorpes - value = string.gsub(value, "[^\\]#.-$", "") - value = string.gsub(value, "\\#", "#") + value = gsub(value, "[^\\]#.-$", "") + value = gsub(value, "\\#", "#") end - value = pl_stringx.strip(value) + value = strip(value) end -- transform {boolean} values ("on"/"off" aliasing to true/false) @@ -730,7 +758,7 @@ local function infer_value(value, typ, opts) value = setmetatable(pl_stringx.split(value, ","), nil) -- remove List mt for i = 1, #value do - value[i] = pl_stringx.strip(value[i]) + value[i] = strip(value[i]) end end @@ -777,14 +805,14 @@ local function check_and_infer(conf, opts) local MAX_PORT = 65535 for _, port_map in ipairs(conf.port_maps) do - local colpos = string.find(port_map, ":", nil, true) + local colpos = find(port_map, ":", nil, true) if not colpos then errors[#errors + 1] = "invalid port mapping (`port_maps`): " .. port_map else - local host_port_str = string.sub(port_map, 1, colpos - 1) + local host_port_str = sub(port_map, 1, colpos - 1) local host_port_num = tonumber(host_port_str, 10) - local kong_port_str = string.sub(port_map, colpos + 1) + local kong_port_str = sub(port_map, colpos + 1) local kong_port_num = tonumber(kong_port_str, 10) if (host_port_num and host_port_num >= MIN_PORT and host_port_num <= MAX_PORT) @@ -800,8 +828,13 @@ local function check_and_infer(conf, opts) end if conf.database == "cassandra" then - log.deprecation("Support for Cassandra is deprecated. Please refer to https://konghq.com/blog/cassandra-support-deprecated", {after = "2.7", removal = "4.0"}) - if string.find(conf.cassandra_lb_policy, "DCAware", nil, true) + log.deprecation("Support for Cassandra is deprecated. Please refer to " .. 
+ "https://konghq.com/blog/cassandra-support-deprecated", { + after = "2.7", + removal = "4.0" + }) + + if find(conf.cassandra_lb_policy, "DCAware", nil, true) and not conf.cassandra_local_datacenter then errors[#errors + 1] = "must specify 'cassandra_local_datacenter' when " .. @@ -837,9 +870,9 @@ local function check_and_infer(conf, opts) for _, prefix in ipairs({ "proxy_", "admin_", "status_" }) do local listen = conf[prefix .. "listen"] - local ssl_enabled = (concat(listen, ",") .. " "):find("%sssl[%s,]") ~= nil + local ssl_enabled = find(concat(listen, ",") .. " ", "%sssl[%s,]") ~= nil if not ssl_enabled and prefix == "proxy_" then - ssl_enabled = (concat(conf.stream_listen, ",") .. " "):find("%sssl[%s,]") ~= nil + ssl_enabled = find(concat(conf.stream_listen, ",") .. " ", "%sssl[%s,]") ~= nil end if prefix == "proxy_" then @@ -865,7 +898,7 @@ local function check_and_infer(conf, opts) if ssl_cert then for _, cert in ipairs(ssl_cert) do - if not pl_path.exists(cert) then + if not exists(cert) then errors[#errors + 1] = prefix .. "ssl_cert: no such file at " .. cert end end @@ -873,7 +906,7 @@ local function check_and_infer(conf, opts) if ssl_cert_key then for _, cert_key in ipairs(ssl_cert_key) do - if not pl_path.exists(cert_key) then + if not exists(cert_key) then errors[#errors + 1] = prefix .. "ssl_cert_key: no such file at " .. cert_key end end @@ -889,12 +922,12 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "client_ssl_cert must be specified" end - if conf.client_ssl_cert and not pl_path.exists(conf.client_ssl_cert) then + if conf.client_ssl_cert and not exists(conf.client_ssl_cert) then errors[#errors + 1] = "client_ssl_cert: no such file at " .. conf.client_ssl_cert end - if conf.client_ssl_cert_key and not pl_path.exists(conf.client_ssl_cert_key) then + if conf.client_ssl_cert_key and not exists(conf.client_ssl_cert_key) then errors[#errors + 1] = "client_ssl_cert_key: no such file at " .. conf.client_ssl_cert_key end @@ -910,19 +943,17 @@ local function check_and_infer(conf, opts) path = system_path else - - log.info("lua_ssl_trusted_certificate: unable to locate system bundle: " .. - err .. - ". Please set lua_ssl_trusted_certificate to a path with certificates" .. - " in order to remove this message") + log.info("lua_ssl_trusted_certificate: unable to locate system bundle: " .. err .. + ". Please set lua_ssl_trusted_certificate to a path with certificates " .. + "in order to remove this message") end end if path ~= "system" then - if not pl_path.exists(path) then - errors[#errors + 1] = "lua_ssl_trusted_certificate: no such file at " .. - path + if not exists(path) then + errors[#errors + 1] = "lua_ssl_trusted_certificate: no such file at " .. path end + new_paths[#new_paths + 1] = path end end @@ -954,14 +985,14 @@ local function check_and_infer(conf, opts) end if conf.ssl_dhparam then - if not is_predefined_dhgroup(conf.ssl_dhparam) and not pl_path.exists(conf.ssl_dhparam) then + if not is_predefined_dhgroup(conf.ssl_dhparam) and not exists(conf.ssl_dhparam) then errors[#errors + 1] = "ssl_dhparam: no such file at " .. conf.ssl_dhparam end else for _, key in ipairs({ "nginx_http_ssl_dhparam", "nginx_stream_ssl_dhparam" }) do local file = conf[key] - if file and not is_predefined_dhgroup(file) and not pl_path.exists(file) then + if file and not is_predefined_dhgroup(file) and not exists(file) then errors[#errors + 1] = key .. ": no such file at " .. 
file end end @@ -969,7 +1000,7 @@ local function check_and_infer(conf, opts) if conf.headers then for _, token in ipairs(conf.headers) do - if token ~= "off" and not HEADER_KEY_TO_NAME[string.lower(token)] then + if token ~= "off" and not HEADER_KEY_TO_NAME[lower(token)] then errors[#errors + 1] = fmt("headers: invalid entry '%s'", tostring(token)) end @@ -999,11 +1030,11 @@ local function check_and_infer(conf, opts) SRV = true, AAAA = true } for _, name in ipairs(conf.dns_order) do - if not allowed[name:upper()] then + if not allowed[upper(name)] then errors[#errors + 1] = fmt("dns_order: invalid entry '%s'", tostring(name)) end - if name:upper() == "AAAA" then + if upper(name) == "AAAA" then log.warn("the 'dns_order' configuration property specifies the " .. "experimental IPv6 entry 'AAAA'") @@ -1028,7 +1059,7 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "pg_max_concurrent_queries must be greater than 0" end - if conf.pg_max_concurrent_queries ~= math.floor(conf.pg_max_concurrent_queries) then + if conf.pg_max_concurrent_queries ~= floor(conf.pg_max_concurrent_queries) then errors[#errors + 1] = "pg_max_concurrent_queries must be an integer greater than 0" end @@ -1036,7 +1067,7 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "pg_semaphore_timeout must be greater than 0" end - if conf.pg_semaphore_timeout ~= math.floor(conf.pg_semaphore_timeout) then + if conf.pg_semaphore_timeout ~= floor(conf.pg_semaphore_timeout) then errors[#errors + 1] = "pg_semaphore_timeout must be an integer greater than 0" end @@ -1045,7 +1076,7 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "pg_ro_max_concurrent_queries must be greater than 0" end - if conf.pg_ro_max_concurrent_queries ~= math.floor(conf.pg_ro_max_concurrent_queries) then + if conf.pg_ro_max_concurrent_queries ~= floor(conf.pg_ro_max_concurrent_queries) then errors[#errors + 1] = "pg_ro_max_concurrent_queries must be an integer greater than 0" end end @@ -1055,7 +1086,7 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "pg_ro_semaphore_timeout must be greater than 0" end - if conf.pg_ro_semaphore_timeout ~= math.floor(conf.pg_ro_semaphore_timeout) then + if conf.pg_ro_semaphore_timeout ~= floor(conf.pg_ro_semaphore_timeout) then errors[#errors + 1] = "pg_ro_semaphore_timeout must be an integer greater than 0" end end @@ -1065,7 +1096,7 @@ local function check_and_infer(conf, opts) end if conf.role == "control_plane" then - if #conf.admin_listen < 1 or pl_stringx.strip(conf.admin_listen[1]) == "off" then + if #conf.admin_listen < 1 or strip(conf.admin_listen[1]) == "off" then errors[#errors + 1] = "admin_listen must be specified when role = \"control_plane\"" end @@ -1073,7 +1104,7 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "cluster_ca_cert must be specified when cluster_mtls = \"pki\"" end - if #conf.cluster_listen < 1 or pl_stringx.strip(conf.cluster_listen[1]) == "off" then + if #conf.cluster_listen < 1 or strip(conf.cluster_listen[1]) == "off" then errors[#errors + 1] = "cluster_listen must be specified when role = \"control_plane\"" end @@ -1082,7 +1113,7 @@ local function check_and_infer(conf, opts) end elseif conf.role == "data_plane" then - if #conf.proxy_listen < 1 or pl_stringx.strip(conf.proxy_listen[1]) == "off" then + if #conf.proxy_listen < 1 or strip(conf.proxy_listen[1]) == "off" then errors[#errors + 1] = "proxy_listen must be specified when role = \"data_plane\"" end @@ -1096,10 +1127,10 @@ local function 
check_and_infer(conf, opts) end if conf.cluster_mtls == "shared" then - table.insert(conf.lua_ssl_trusted_certificate, conf.cluster_cert) + insert(conf.lua_ssl_trusted_certificate, conf.cluster_cert) elseif conf.cluster_mtls == "pki" then - table.insert(conf.lua_ssl_trusted_certificate, conf.cluster_ca_cert) + insert(conf.lua_ssl_trusted_certificate, conf.cluster_ca_cert) end end @@ -1116,12 +1147,12 @@ local function check_and_infer(conf, opts) errors[#errors + 1] = "cluster certificate and key must be provided to use Hybrid mode" else - if not pl_path.exists(conf.cluster_cert) then + if not exists(conf.cluster_cert) then errors[#errors + 1] = "cluster_cert: no such file at " .. conf.cluster_cert end - if not pl_path.exists(conf.cluster_cert_key) then + if not exists(conf.cluster_cert_key) then errors[#errors + 1] = "cluster_cert_key: no such file at " .. conf.cluster_cert_key end @@ -1167,8 +1198,8 @@ local function overrides(k, default_v, opts, file_conf, arg_conf) if not opts.from_kong_env then -- environment variables have higher priority - local env_name = "KONG_" .. string.upper(k) - local env = os.getenv(env_name) + local env_name = "KONG_" .. upper(k) + local env = getenv(env_name) if env ~= nil then local to_print = env @@ -1193,7 +1224,7 @@ local function overrides(k, default_v, opts, file_conf, arg_conf) -- Escape "#" in env vars or overrides to avoid them being mangled by -- comments stripping logic. repeat - local s, n = string.gsub(value, [[([^\])#]], [[%1\#]]) + local s, n = gsub(value, [[([^\])#]], [[%1\#]]) value = s until n == 0 end @@ -1208,10 +1239,10 @@ local function parse_nginx_directives(dyn_namespace, conf, injected_in_namespace for k, v in pairs(conf) do if type(k) == "string" and not injected_in_namespace[k] then - local directive = string.match(k, dyn_namespace.prefix .. "(.+)") + local directive = match(k, dyn_namespace.prefix .. "(.+)") if directive then if v ~= "NONE" and not dyn_namespace.ignore[directive] then - table.insert(directives, { name = directive, value = v }) + insert(directives, { name = directive, value = v }) end injected_in_namespace[k] = true @@ -1344,7 +1375,7 @@ local function load(path, custom_conf, opts) --------------------- local from_file_conf = {} - if path and not pl_path.exists(path) then + if path and not exists(path) then -- file conf has been specified and must exist return nil, "no file at: " .. path end @@ -1353,7 +1384,7 @@ local function load(path, custom_conf, opts) -- try to look for a conf in default locations, but no big -- deal if none is found: we will use our defaults. for _, default_path in ipairs(DEFAULT_PATHS) do - if pl_path.exists(default_path) then + if exists(default_path) then path = default_path break end @@ -1410,7 +1441,7 @@ local function load(path, custom_conf, opts) t = t or {} for k, v in pairs(t) do - local directive = string.match(k, "^(" .. dyn_prefix .. ".+)") + local directive = match(k, "^(" .. dyn_prefix .. 
".+)") if directive then dynamic_keys[directive] = true @@ -1433,7 +1464,7 @@ local function load(path, custom_conf, opts) end for k, v in pairs(env_vars) do - local kong_var = string.match(string.lower(k), "^kong_(.+)") + local kong_var = match(lower(k), "^kong_(.+)") if kong_var then -- the value will be read in `overrides()` kong_env_vars[kong_var] = true @@ -1481,6 +1512,7 @@ local function load(path, custom_conf, opts) --------------------------------- local loaded_vaults + local refs do -- validation local vaults_array = infer_value(conf.vaults, CONF_INFERENCES["vaults"].typ, opts) @@ -1490,7 +1522,7 @@ local function load(path, custom_conf, opts) if #vaults_array > 0 and vaults_array[1] ~= "off" then for i = 1, #vaults_array do - local vault_name = pl_stringx.strip(vaults_array[i]) + local vault_name = strip(vaults_array[i]) if vault_name ~= "off" then if vault_name == "bundled" then vaults = tablex.merge(constants.BUNDLED_VAULTS, vaults, true) @@ -1504,9 +1536,23 @@ local function load(path, custom_conf, opts) loaded_vaults = setmetatable(vaults, _nop_tostring_mt) - local vault = require "kong.pdk.vault".new() + local vault_conf = { loaded_vaults = loaded_vaults } + for k, v in pairs(conf) do + if sub(k, 1, 6) == "vault_" then + vault_conf[k] = v + end + end + + local vault = require("kong.pdk.vault").new({ configuration = vault_conf }) + for k, v in pairs(conf) do if vault.is_reference(v) then + if refs then + refs[k] = v + else + refs = setmetatable({ [k] = v }, _nop_tostring_mt) + end + local deref, deref_err = vault.get(v) if deref == nil or deref_err then return nil, fmt("failed to dereference '%s': %s for config option '%s'", v, deref_err, k) @@ -1531,14 +1577,16 @@ local function load(path, custom_conf, opts) end conf = tablex.merge(conf, defaults) -- intersection (remove extraneous properties) + conf.loaded_vaults = loaded_vaults + conf["$refs"] = refs local default_nginx_main_user = false local default_nginx_user = false do -- nginx 'user' directive - local user = utils.strip(conf.nginx_main_user):gsub("%s+", " ") + local user = gsub(strip(conf.nginx_main_user), "%s+", " ") if user == "nobody" or user == "nobody nobody" then conf.nginx_main_user = nil @@ -1546,7 +1594,7 @@ local function load(path, custom_conf, opts) default_nginx_main_user = true end - local user = utils.strip(conf.nginx_user):gsub("%s+", " ") + local user = gsub(strip(conf.nginx_user), "%s+", " ") if user == "nobody" or user == "nobody nobody" then conf.nginx_user = nil @@ -1598,7 +1646,7 @@ local function load(path, custom_conf, opts) conf_arr[#conf_arr+1] = k .. " = " .. 
pl_pretty.write(to_print, "") end - table.sort(conf_arr) + sort(conf_arr) for i = 1, #conf_arr do log.debug(conf_arr[i]) @@ -1615,8 +1663,7 @@ local function load(path, custom_conf, opts) if #conf.plugins > 0 and conf.plugins[1] ~= "off" then for i = 1, #conf.plugins do - local plugin_name = pl_stringx.strip(conf.plugins[i]) - + local plugin_name = strip(conf.plugins[i]) if plugin_name ~= "off" then if plugin_name == "bundled" then plugins = tablex.merge(constants.BUNDLED_PLUGINS, plugins, true) @@ -1640,7 +1687,7 @@ local function load(path, custom_conf, opts) for _, directive in pairs(http_directives) do if directive.name == "lua_shared_dict" - and string.find(directive.value, "prometheus_metrics", nil, true) + and find(directive.value, "prometheus_metrics", nil, true) then found = true break @@ -1648,7 +1695,7 @@ local function load(path, custom_conf, opts) end if not found then - table.insert(http_directives, { + insert(http_directives, { name = "lua_shared_dict", value = "prometheus_metrics 5m", }) @@ -1659,7 +1706,7 @@ local function load(path, custom_conf, opts) for _, directive in pairs(stream_directives) do if directive.name == "lua_shared_dict" - and string.find(directive.value, "stream_prometheus_metrics", nil, true) + and find(directive.value, "stream_prometheus_metrics", nil, true) then found = true break @@ -1667,7 +1714,7 @@ local function load(path, custom_conf, opts) end if not found then - table.insert(stream_directives, { + insert(stream_directives, { name = "lua_shared_dict", value = "stream_prometheus_metrics 5m", }) @@ -1676,7 +1723,7 @@ local function load(path, custom_conf, opts) for _, dyn_namespace in ipairs(DYNAMIC_KEY_NAMESPACES) do if dyn_namespace.injected_conf_name then - table.sort(conf[dyn_namespace.injected_conf_name], function(a, b) + sort(conf[dyn_namespace.injected_conf_name], function(a, b) return a.name < b.name end) end @@ -1704,7 +1751,7 @@ local function load(path, custom_conf, opts) if #conf.headers > 0 and conf.headers[1] ~= "off" then for _, token in ipairs(conf.headers) do if token ~= "off" then - enabled_headers[HEADER_KEY_TO_NAME[string.lower(token)]] = true + enabled_headers[HEADER_KEY_TO_NAME[lower(token)]] = true end end end @@ -1725,7 +1772,7 @@ local function load(path, custom_conf, opts) end -- load absolute paths - conf.prefix = pl_path.abspath(conf.prefix) + conf.prefix = abspath(conf.prefix) for _, prefix in ipairs({ "ssl", "admin_ssl", "status_ssl", "client_ssl", "cluster" }) do local ssl_cert = conf[prefix .. "_cert"] @@ -1734,26 +1781,26 @@ local function load(path, custom_conf, opts) if ssl_cert and ssl_cert_key then if type(ssl_cert) == "table" then for i, cert in ipairs(ssl_cert) do - ssl_cert[i] = pl_path.abspath(cert) + ssl_cert[i] = abspath(cert) end else - conf[prefix .. "_cert"] = pl_path.abspath(ssl_cert) + conf[prefix .. "_cert"] = abspath(ssl_cert) end if type(ssl_cert) == "table" then for i, key in ipairs(ssl_cert_key) do - ssl_cert_key[i] = pl_path.abspath(key) + ssl_cert_key[i] = abspath(key) end else - conf[prefix .. "_cert_key"] = pl_path.abspath(ssl_cert_key) + conf[prefix .. 
"_cert_key"] = abspath(ssl_cert_key) end end end if conf.cluster_ca_cert then - conf.cluster_ca_cert = pl_path.abspath(conf.cluster_ca_cert) + conf.cluster_ca_cert = abspath(conf.cluster_ca_cert) end local ssl_enabled = conf.proxy_ssl_enabled or @@ -1766,14 +1813,14 @@ local function load(path, custom_conf, opts) if directive.name == "ssl_dhparam" then if is_predefined_dhgroup(directive.value) then if ssl_enabled then - directive.value = pl_path.abspath(pl_path.join(conf.prefix, "ssl", directive.value .. ".pem")) + directive.value = abspath(pl_path.join(conf.prefix, "ssl", directive.value .. ".pem")) else - table.remove(conf[name], i) + remove(conf[name], i) end else - directive.value = pl_path.abspath(directive.value) + directive.value = abspath(directive.value) end break @@ -1787,7 +1834,7 @@ local function load(path, custom_conf, opts) tablex.map(pl_path.abspath, conf.lua_ssl_trusted_certificate) conf.lua_ssl_trusted_certificate_combined = - pl_path.abspath(pl_path.join(conf.prefix, ".ca_combined")) + abspath(pl_path.join(conf.prefix, ".ca_combined")) end -- attach prefix files paths diff --git a/kong/conf_loader/listeners.lua b/kong/conf_loader/listeners.lua index a18f1f2eb24..943ae15c88e 100644 --- a/kong/conf_loader/listeners.lua +++ b/kong/conf_loader/listeners.lua @@ -2,7 +2,12 @@ local pl_stringx = require "pl.stringx" local utils = require "kong.tools.utils" +local type = type +local insert = table.insert +local assert = assert +local ipairs = ipairs local concat = table.concat +local setmetatable = setmetatable local listeners = {} @@ -121,7 +126,7 @@ local function parse_listeners(values, flags) listener.listener = ip.host .. ":" .. ip.port .. (#cleaned_flags == 0 and "" or " " .. cleaned_flags) - table.insert(list, listener) + insert(list, listener) end return list diff --git a/kong/db/declarative/init.lua b/kong/db/declarative/init.lua index 5c06604959f..a070e9e34a4 100644 --- a/kong/db/declarative/init.lua +++ b/kong/db/declarative/init.lua @@ -398,6 +398,33 @@ function declarative.load_into_db(entities, meta) end +local function begin_transaction(db) + if db.strategy == "postgres" then + local ok, err = db.connector:connect("read") + if not ok then + return nil, err + end + + ok, err = db.connector:query("BEGIN TRANSACTION ISOLATION LEVEL REPEATABLE READ READ ONLY;", "read") + if not ok then + return nil, err + end + end + + return true +end + + +local function end_transaction(db) + if db.strategy == "postgres" then + -- just finish up the read-only transaction, + -- either COMMIT or ROLLBACK is fine. 
+ db.connector:query("ROLLBACK;", "read") + db.connector:setkeepalive() + end +end + + local function export_from_db(emitter, skip_ws, skip_disabled_entities, expand_foreigns) local schemas = {} @@ -408,11 +435,18 @@ local function export_from_db(emitter, skip_ws, skip_disabled_entities, expand_f insert(schemas, dao.schema) end end + local sorted_schemas, err = schema_topological_sort(schemas) if not sorted_schemas then return nil, err end + local ok + ok, err = begin_transaction(db) + if not ok then + return nil, err + end + emitter:emit_toplevel({ _format_version = "2.1", _transform = false, @@ -439,6 +473,7 @@ local function export_from_db(emitter, skip_ws, skip_disabled_entities, expand_f end for row, err in db[name]:each(page_size, GLOBAL_QUERY_OPTS) do if not row then + end_transaction(db) kong.log.err(err) return nil, err end @@ -475,6 +510,8 @@ local function export_from_db(emitter, skip_ws, skip_disabled_entities, expand_f ::continue:: end + end_transaction(db) + return emitter:done() end diff --git a/kong/db/schema/init.lua b/kong/db/schema/init.lua index bfca395e4ae..05a0bf26da5 100644 --- a/kong/db/schema/init.lua +++ b/kong/db/schema/init.lua @@ -2,11 +2,9 @@ local tablex = require "pl.tablex" local pretty = require "pl.pretty" local utils = require "kong.tools.utils" local cjson = require "cjson" -local vault = require "kong.pdk.vault".new() +local is_reference = require "kong.pdk.vault".new().is_reference -local is_reference = vault.is_reference -local dereference = vault.get local setmetatable = setmetatable local getmetatable = getmetatable local re_match = ngx.re.match @@ -1682,6 +1680,8 @@ function Schema:process_auto_fields(data, context, nulls, opts) end end + local refs + for key, field in self:each_field(data) do local ftype = field.type local value = data[key] @@ -1736,9 +1736,16 @@ function Schema:process_auto_fields(data, context, nulls, opts) if resolve_references then if ftype == "string" and field.referenceable then if is_reference(value) then - local deref, err = dereference(value) + if refs then + refs[key] = value + else + refs = { [key] = value } + end + + local deref, err = kong.vault.get(value) if deref then value = deref + else if err then kong.log.warn("unable to resolve reference ", value, " (", err, ")") @@ -1755,9 +1762,11 @@ function Schema:process_auto_fields(data, context, nulls, opts) if subfield.type == "string" and subfield.referenceable then local count = #value if count > 0 then + refs[key] = new_tab(count, 0) for i = 1, count do if is_reference(value[i]) then - local deref, err = dereference(value[i]) + refs[key][i] = value[i] + local deref, err = kong.vault.get(value[i]) if deref then value[i] = deref else @@ -1809,6 +1818,8 @@ function Schema:process_auto_fields(data, context, nulls, opts) end end + data["$refs"] = refs + return data end diff --git a/kong/pdk/response.lua b/kong/pdk/response.lua index 4677b64c31f..a1fc9e8bded 100644 --- a/kong/pdk/response.lua +++ b/kong/pdk/response.lua @@ -392,6 +392,7 @@ local function new(self, major_version) -- -- Be aware that changing this setting might break any plugins that -- rely on the automatic underscore conversion. + -- You cannot set Transfer-Encoding header with this function. It will be ignored. 
-- -- @function kong.response.set_header -- @phases rewrite, access, header_filter, response, admin_api @@ -408,6 +409,11 @@ local function new(self, major_version) end validate_header(name, value) + local lower_name = lower(name) + if lower_name == "transfer-encoding" or lower_name == "transfer_encoding" then + self.log.warn("manually setting Transfer-Encoding. Ignored.") + return + end ngx.header[name] = normalize_header(value) end @@ -487,6 +493,8 @@ local function new(self, major_version) -- This function overrides any existing header bearing the same name as those -- specified in the `headers` argument. Other headers remain unchanged. -- + -- You cannot set Transfer-Encoding header with this function. It will be ignored. + -- -- @function kong.response.set_headers -- @phases rewrite, access, header_filter, response, admin_api -- @tparam table headers @@ -514,7 +522,12 @@ local function new(self, major_version) validate_headers(headers) for name, value in pairs(headers) do - ngx.header[name] = normalize_multi_header(value) + local lower_name = lower(name) + if lower_name == "transfer-encoding" or lower_name == "transfer_encoding" then + self.log.warn("manually setting Transfer-Encoding. Ignored.") + else + ngx.header[name] = normalize_multi_header(value) + end end end @@ -645,8 +658,13 @@ local function new(self, major_version) if headers ~= nil then for name, value in pairs(headers) do ngx.header[name] = normalize_multi_header(value) + local lower_name = lower(name) + if lower_name == "transfer-encoding" or lower_name == "transfer_encoding" then + self.log.warn("manually setting Transfer-Encoding. Ignored.") + else + ngx.header[name] = normalize_multi_header(value) + end if not has_content_type or not has_content_length then - local lower_name = lower(name) if lower_name == "content-type" or lower_name == "content_type" then diff --git a/kong/pdk/vault.lua b/kong/pdk/vault.lua index 2a43ae02de3..7d99c3351cf 100644 --- a/kong/pdk/vault.lua +++ b/kong/pdk/vault.lua @@ -11,11 +11,11 @@ local require = require local constants = require "kong.constants" local arguments = require "kong.api.arguments" +local lrucache = require "resty.lrucache" local cjson = require("cjson.safe").new() local clone = require "table.clone" - local ngx = ngx local fmt = string.format local sub = string.sub @@ -36,387 +36,373 @@ local parse_path = require "socket.url".parse_path local decode_json = cjson.decode -local BRACE_START = byte("{") -local BRACE_END = byte("}") -local COLON = byte(":") -local SLASH = byte("/") - +local function new(self) + local _VAULT = {} -local BUNDLED_VAULTS = constants.BUNDLED_VAULTS -local VAULT_NAMES + local LRU = lrucache.new(1000) + local BUNDLED_VAULTS = constants.BUNDLED_VAULTS + local VAULT_NAMES = BUNDLED_VAULTS and clone(BUNDLED_VAULTS) or {} + local BRACE_START = byte("{") + local BRACE_END = byte("}") + local COLON = byte(":") + local SLASH = byte("/") -local LRU = require("resty.lrucache").new(1000) + local vaults = self and self.configuration and self.configuration.loaded_vaults + if vaults then + for name in pairs(vaults) do + VAULT_NAMES[name] = true + end + end + local function build_cache_key(name, resource, version) + return version and fmt("reference:%s:%s:%s", name, resource, version) + or fmt("reference:%s:%s", name, resource) + end -local function build_cache_key(name, resource, version) - return version and fmt("reference:%s:%s:%s", name, resource, version) - or fmt("reference:%s:%s", name, resource) -end + local function validate_value(value, err, vault, 
resource, key, reference) + if type(value) ~= "string" then + if err then + return nil, fmt("unable to load value (%s) from vault (%s): %s [%s]", resource, vault, err, reference) + end + if value == nil then + return nil, fmt("unable to load value (%s) from vault (%s): not found [%s]", resource, vault, reference) + end -local function validate_value(value, err, vault, resource, key, reference) - if type(value) ~= "string" then - if err then - return nil, fmt("unable to load value (%s) from vault (%s): %s [%s]", resource, vault, err, reference) + return nil, fmt("unable to load value (%s) from vault (%s): invalid type (%s), string expected [%s]", + resource, vault, type(value), reference) end - if value == nil then - return nil, fmt("unable to load value (%s) from vault (%s): not found [%s]", resource, vault, reference) + if not key then + return value end - return nil, fmt("unable to load value (%s) from vault (%s): invalid type (%s), string expected [%s]", - resource, vault, type(value), reference) - end - - if not key then - return value - end + local json + json, err = decode_json(value) + if type(json) ~= "table" then + if err then + return nil, fmt("unable to json decode value (%s) received from vault (%s): %s [%s]", + resource, vault, err, reference) + end - local json - json, err = decode_json(value) - if type(json) ~= "table" then - if err then - return nil, fmt("unable to json decode value (%s) received from vault (%s): %s [%s]", - resource, vault, err, reference) + return nil, fmt("unable to json decode value (%s) received from vault (%s): invalid type (%s), table expected [%s]", + resource, vault, type(json), reference) end - return nil, fmt("unable to json decode value (%s) received from vault (%s): invalid type (%s), table expected [%s]", - resource, vault, type(json), reference) - end + value = json[key] + if type(value) ~= "string" then + if value == nil then + return nil, fmt("vault (%s) did not return value for resource '%s' with a key of '%s' [%s]", + vault, resource, key, reference) + end - value = json[key] - if type(value) ~= "string" then - if value == nil then - return nil, fmt("vault (%s) did not return value for resource '%s' with a key of '%s' [%s]", - vault, resource, key, reference) + return nil, fmt("invalid value received from vault (%s) for resource '%s' with a key of '%s': invalid type (%s), string expected [%s]", + vault, resource, key, type(value), reference) end - return nil, fmt("invalid value received from vault (%s) for resource '%s' with a key of '%s': invalid type (%s), string expected [%s]", - vault, resource, key, type(value), reference) + return value end - return value -end + local function process_secret(reference, opts) + local name = opts.name + local vaults = self and (self.db and self.db.vaults_beta) + local strategy + local field + if vaults and vaults.strategies then + strategy = vaults.strategies[name] + if not strategy then + return nil, fmt("could not find vault (%s) [%s]", name, reference) + end + local schema = vaults.schema.subschemas[name] + if not schema then + return nil, fmt("could not find vault schema (%s): %s [%s]", name, strategy, reference) + end -local function process_secret(reference, opts) - local name = opts.name - local kong = kong - local vaults = kong and (kong.db and kong.db.vaults_beta) - local strategy - local field - if vaults and vaults.strategies then - strategy = vaults.strategies[name] - if not strategy then - return nil, fmt("could not find vault (%s) [%s]", name, reference) - end + field = 
schema.fields.config - local schema = vaults.schema.subschemas[name] - if not schema then - return nil, fmt("could not find vault schema (%s): %s [%s]", name, strategy, reference) - end + else + local ok + ok, strategy = pcall(require, fmt("kong.vaults.%s", name)) + if not ok then + return nil, fmt("could not find vault (%s): %s [%s]", name, strategy, reference) + end - field = schema.fields.config + local def + ok, def = pcall(require, fmt("kong.vaults.%s.schema", name)) + if not ok then + return nil, fmt("could not find vault schema (%s): %s [%s]", name, def, reference) + end - else - local ok - ok, strategy = pcall(require, fmt("kong.vaults.%s", name)) - if not ok then - return nil, fmt("could not find vault (%s): %s [%s]", name, strategy, reference) - end + local schema = require("kong.db.schema").new(require("kong.db.schema.entities.vaults_beta")) - local def - ok, def = pcall(require, fmt("kong.vaults.%s.schema", name)) - if not ok then - return nil, fmt("could not find vault schema (%s): %s [%s]", name, def, reference) - end + local err + ok, err = schema:new_subschema(name, def) + if not ok then + return nil, fmt("could not load vault sub-schema (%s): %s [%s]", name, err, reference) + end - local schema = require("kong.db.schema").new(require("kong.db.schema.entities.vaults_beta")) + schema = schema.subschemas[name] + if not schema then + return nil, fmt("could not find vault sub-schema (%s) [%s]", name, reference) + end - local err - ok, err = schema:new_subschema(name, def) - if not ok then - return nil, fmt("could not load vault sub-schema (%s): %s [%s]", name, err, reference) + field = schema.fields.config end - schema = schema.subschemas[name] - if not schema then - return nil, fmt("could not find vault sub-schema (%s) [%s]", name, reference) + if strategy.init then + strategy.init() end - field = schema.fields.config - end - - if strategy.init then - strategy.init() - end - - local resource = opts.resource - local key = opts.key - local config = opts.config or {} - if kong and kong.configuration then - local configuration = kong.configuration - local fields = field.fields - local env_name = gsub(name, "-", "_") - for i = 1, #fields do - local k, f = next(fields[i]) - if config[k] == nil then - local n = lower(fmt("vault_%s_%s", env_name, k)) - local v = configuration[n] - if v ~= nil then - config[k] = v - elseif f.required and f.default ~= nil then - config[k] = f.default + local resource = opts.resource + local key = opts.key + local config = opts.config or {} + if self and self.configuration then + local configuration = self.configuration + local fields = field.fields + local env_name = gsub(name, "-", "_") + for i = 1, #fields do + local k, f = next(fields[i]) + if config[k] == nil then + local n = lower(fmt("vault_%s_%s", env_name, k)) + local v = configuration[n] + if v ~= nil then + config[k] = v + elseif f.required and f.default ~= nil then + config[k] = f.default + end end end end - end - config = arguments.infer_value(config, field) + config = arguments.infer_value(config, field) - local value, err = strategy.get(config, resource, opts.version) - return validate_value(value, err, name, resource, key, reference) -end - - -local function config_secret(reference, opts) - local name = opts.name - local kong = kong - local vaults = kong.db.vaults_beta - local cache = kong.core_cache - local vault - local err - if cache then - local cache_key = vaults:cache_key(name) - vault, err = cache:get(cache_key, nil, vaults.select_by_prefix, vaults, name) - - else - vault, err = 
vaults:select_by_prefix(name) - end - - if not vault then - if err then - return nil, fmt("vault not found (%s): %s [%s]", name, err, reference) - end - - return nil, fmt("vault not found (%s) [%s]", name, reference) + local value, err = strategy.get(config, resource, opts.version) + return validate_value(value, err, name, resource, key, reference) end - local vname = vault.name - local strategy = vaults.strategies[vname] - if not strategy then - return nil, fmt("vault not installed (%s) [%s]", vname, reference) - end + local function config_secret(reference, opts) + local name = opts.name + local vaults = self.db.vaults_beta + local cache = self.core_cache + local vault + local err + if cache then + local cache_key = vaults:cache_key(name) + vault, err = cache:get(cache_key, nil, vaults.select_by_prefix, vaults, name) - local schema = vaults.schema.subschemas[vname] - if not schema then - return nil, fmt("could not find vault sub-schema (%s) [%s]", vname, reference) - end + else + vault, err = vaults:select_by_prefix(name) + end - local config = opts.config - if config then - config = arguments.infer_value(config, schema.fields.config) - for k, v in pairs(vault.config) do - if v ~= nil and config[k] == nil then - config[k] = v + if not vault then + if err then + return nil, fmt("vault not found (%s): %s [%s]", name, err, reference) end + + return nil, fmt("vault not found (%s) [%s]", name, reference) end - else - config = vault.config - end + local vname = vault.name - local resource = opts.resource - local key = opts.key - local version = opts.version + local strategy = vaults.strategies[vname] + if not strategy then + return nil, fmt("vault not installed (%s) [%s]", vname, reference) + end - local cache_key = build_cache_key(name, resource, version) - local value - if cache then - value, err = cache:get(cache_key, nil, strategy.get, config, resource, version) - else - value, err = strategy.get(config, resource, version) - end + local schema = vaults.schema.subschemas[vname] + if not schema then + return nil, fmt("could not find vault sub-schema (%s) [%s]", vname, reference) + end - return validate_value(value, err, name, resource, key, reference) -end + local config = opts.config + if config then + config = arguments.infer_value(config, schema.fields.config) + for k, v in pairs(vault.config) do + if v ~= nil and config[k] == nil then + config[k] = v + end + end + else + config = vault.config + end ---- --- Checks if the passed in reference looks like a reference. --- Valid references start with '{vault://' and end with '}'. --- --- If you need more thorough validation, --- use `kong.vault.parse_reference`. 
--- --- @function kong.vault.is_reference --- @tparam string reference reference to check --- @treturn boolean `true` is the passed in reference looks like a reference, otherwise `false` --- --- @usage --- kong.vault.is_reference("{vault://env/key}") -- true --- kong.vault.is_reference("not a reference") -- false -local function is_reference(reference) - return type(reference) == "string" - and byte(reference, 1) == BRACE_START - and byte(reference, -1) == BRACE_END - and byte(reference, 7) == COLON - and byte(reference, 8) == SLASH - and byte(reference, 9) == SLASH - and sub(reference, 2, 6) == "vault" -end + local resource = opts.resource + local key = opts.key + local version = opts.version + local cache_key = build_cache_key(name, resource, version) + local value + if cache then + value, err = cache:get(cache_key, nil, strategy.get, config, resource, version) + else + value, err = strategy.get(config, resource, version) + end ---- --- Parses and decodes the passed in reference and returns a table --- containing its components. --- --- Given a following resource: --- ```lua --- "{vault://env/cert/key?prefix=SSL_#1}" --- ``` --- --- This function will return following table: --- --- ```lua --- { --- name = "env", -- name of the Vault entity or Vault strategy --- resource = "cert", -- resource where secret is stored --- key = "key", -- key to lookup if the resource is secret object --- config = { -- if there are any config options specified --- prefix = "SSL_" --- }, --- version = 1 -- if the version is specified --- } --- ``` --- --- @function kong.vault.parse_reference --- @tparam string reference reference to parse --- @treturn table|nil a table containing each component of the reference, or `nil` on error --- @treturn string|nil error message on failure, otherwise `nil` --- --- @usage --- local ref, err = kong.vault.parse_reference("{vault://env/cert/key?prefix=SSL_#1}") -- table -local function parse_reference(reference) - if not is_reference(reference) then - return nil, fmt("not a reference [%s]", tostring(reference)) + return validate_value(value, err, name, resource, key, reference) end - local url, err = parse_url(sub(reference, 2, -2)) - if not url then - return nil, fmt("reference is not url (%s) [%s]", err, reference) + --- + -- Checks if the passed in reference looks like a reference. + -- Valid references start with '{vault://' and end with '}'. + -- + -- If you need more thorough validation, + -- use `kong.vault.parse_reference`. + -- + -- @function kong.vault.is_reference + -- @tparam string reference reference to check + -- @treturn boolean `true` is the passed in reference looks like a reference, otherwise `false` + -- + -- @usage + -- kong.vault.is_reference("{vault://env/key}") -- true + -- kong.vault.is_reference("not a reference") -- false + function _VAULT.is_reference(reference) + return type(reference) == "string" + and byte(reference, 1) == BRACE_START + and byte(reference, -1) == BRACE_END + and byte(reference, 7) == COLON + and byte(reference, 8) == SLASH + and byte(reference, 9) == SLASH + and sub(reference, 2, 6) == "vault" end - local name = url.host - if not name then - return nil, fmt("reference url is missing host [%s]", reference) - end - local path = url.path - if not path then - return nil, fmt("reference url is missing path [%s]", reference) - end + --- + -- Parses and decodes the passed in reference and returns a table + -- containing its components. 
+ -- + -- Given a following resource: + -- ```lua + -- "{vault://env/cert/key?prefix=SSL_#1}" + -- ``` + -- + -- This function will return following table: + -- + -- ```lua + -- { + -- name = "env", -- name of the Vault entity or Vault strategy + -- resource = "cert", -- resource where secret is stored + -- key = "key", -- key to lookup if the resource is secret object + -- config = { -- if there are any config options specified + -- prefix = "SSL_" + -- }, + -- version = 1 -- if the version is specified + -- } + -- ``` + -- + -- @function kong.vault.parse_reference + -- @tparam string reference reference to parse + -- @treturn table|nil a table containing each component of the reference, or `nil` on error + -- @treturn string|nil error message on failure, otherwise `nil` + -- + -- @usage + -- local ref, err = kong.vault.parse_reference("{vault://env/cert/key?prefix=SSL_#1}") -- table + function _VAULT.parse_reference(reference) + if not _VAULT.is_reference(reference) then + return nil, fmt("not a reference [%s]", tostring(reference)) + end - local resource = sub(path, 2) - if resource == "" then - return nil, fmt("reference url has empty path [%s]", reference) - end + local url, err = parse_url(sub(reference, 2, -2)) + if not url then + return nil, fmt("reference is not url (%s) [%s]", err, reference) + end - local version = url.fragment - if version then - version = tonumber(version, 10) - if not version then - return nil, fmt("reference url has invalid version [%s]", reference) + local name = url.host + if not name then + return nil, fmt("reference url is missing host [%s]", reference) end - end - local key - local parts = parse_path(resource) - local count = #parts - if count == 1 then - resource = unescape_uri(parts[1]) + local path = url.path + if not path then + return nil, fmt("reference url is missing path [%s]", reference) + end - else - resource = unescape_uri(concat(parts, "/", 1, count - 1)) - if parts[count] ~= "" then - key = unescape_uri(parts[count]) + local resource = sub(path, 2) + if resource == "" then + return nil, fmt("reference url has empty path [%s]", reference) end - end - if resource == "" then - return nil, fmt("reference url has invalid path [%s]", reference) - end + local version = url.fragment + if version then + version = tonumber(version, 10) + if not version then + return nil, fmt("reference url has invalid version [%s]", reference) + end + end - local config - local query = url.query - if query and query ~= "" then - config = decode_args(query) - end + local key + local parts = parse_path(resource) + local count = #parts + if count == 1 then + resource = unescape_uri(parts[1]) - return { - name = url.host, - resource = resource, - key = key, - config = config, - version = version, - } -end + else + resource = unescape_uri(concat(parts, "/", 1, count - 1)) + if parts[count] ~= "" then + key = unescape_uri(parts[count]) + end + end + if resource == "" then + return nil, fmt("reference url has invalid path [%s]", reference) + end ---- --- Resolves the passed in reference and returns the value of it. 
--- --- @function kong.vault.get --- @tparam string reference reference to resolve --- @treturn string|nil resolved value of the reference --- @treturn string|nil error message on failure, otherwise `nil` --- --- @usage --- local value, err = kong.vault.get("{vault://env/cert/key}") -local function get(reference) - local opts, err = parse_reference(reference) - if err then - return nil, err - end + local config + local query = url.query + if query and query ~= "" then + config = decode_args(query) + end - local value = LRU:get(reference) - if value then - return value + return { + name = url.host, + resource = resource, + key = key, + config = config, + version = version, + } end - if kong and kong.db and VAULT_NAMES[opts.name] == nil then - value, err = config_secret(reference, opts) - else - value, err = process_secret(reference, opts) - end - if not value then - return nil, err - end + --- + -- Resolves the passed in reference and returns the value of it. + -- + -- @function kong.vault.get + -- @tparam string reference reference to resolve + -- @treturn string|nil resolved value of the reference + -- @treturn string|nil error message on failure, otherwise `nil` + -- + -- @usage + -- local value, err = kong.vault.get("{vault://env/cert/key}") + function _VAULT.get(reference) + local opts, err = _VAULT.parse_reference(reference) + if err then + return nil, err + end - LRU:set(reference, value) + local value = LRU:get(reference) + if value then + return value + end - return value -end + if self and self.db and VAULT_NAMES[opts.name] == nil then + value, err = config_secret(reference, opts) + else + value, err = process_secret(reference, opts) + end + if not value then + return nil, err + end -local function new(self) - VAULT_NAMES = BUNDLED_VAULTS and clone(BUNDLED_VAULTS) or {} + LRU:set(reference, value) - local vaults = self and self.configuration and self.configuration.loaded_vaults - if vaults then - for name in pairs(vaults) do - VAULT_NAMES[name] = true - end + return value end - return { - is_reference = is_reference, - parse_reference = parse_reference, - get = get, - } + return _VAULT end diff --git a/kong/plugins/ldap-auth/asn1.lua b/kong/plugins/ldap-auth/asn1.lua index 23ff6aad48f..05c8ecdb2d8 100644 --- a/kong/plugins/ldap-auth/asn1.lua +++ b/kong/plugins/ldap-auth/asn1.lua @@ -1,408 +1,325 @@ -local lpack = require "lua_pack" -local bpack = lpack.pack -local bunpack = lpack.unpack - - -local setmetatable = setmetatable -local tonumber = tonumber -local reverse = string.reverse -local ipairs = ipairs -local concat = table.concat -local insert = table.insert -local pairs = pairs -local math = math -local type = type -local char = string.char -local bit = bit +local ffi = require "ffi" +local C = ffi.C +local ffi_new = ffi.new +local ffi_string = ffi.string +local ffi_cast = ffi.cast +local band = bit.band +local base = require "resty.core.base" +local new_tab = base.new_tab + +local cucharpp = ffi_new("const unsigned char*[1]") +local ucharpp = ffi_new("unsigned char*[1]") +local charpp = ffi_new("char*[1]") + + +ffi.cdef [[ + typedef struct asn1_string_st ASN1_OCTET_STRING; + typedef struct asn1_string_st ASN1_INTEGER; + typedef struct asn1_string_st ASN1_ENUMERATED; + typedef struct asn1_string_st ASN1_STRING; + + ASN1_OCTET_STRING *ASN1_OCTET_STRING_new(); + ASN1_INTEGER *ASN1_INTEGER_new(); + ASN1_ENUMERATED *ASN1_ENUMERATED_new(); + + void ASN1_INTEGER_free(ASN1_INTEGER *a); + void ASN1_STRING_free(ASN1_STRING *a); + + long ASN1_INTEGER_get(const ASN1_INTEGER *a); + long 
ASN1_ENUMERATED_get(const ASN1_ENUMERATED *a); + + int ASN1_INTEGER_set(ASN1_INTEGER *a, long v); + int ASN1_ENUMERATED_set(ASN1_ENUMERATED *a, long v); + int ASN1_STRING_set(ASN1_STRING *str, const void *data, int len); + + const unsigned char *ASN1_STRING_get0_data(const ASN1_STRING *x); + // openssl 1.1.0 + unsigned char *ASN1_STRING_data(ASN1_STRING *x); + + ASN1_OCTET_STRING *d2i_ASN1_OCTET_STRING(ASN1_OCTET_STRING **a, const unsigned char **ppin, long length); + ASN1_INTEGER *d2i_ASN1_INTEGER(ASN1_INTEGER **a, const unsigned char **ppin, long length); + ASN1_ENUMERATED *d2i_ASN1_ENUMERATED(ASN1_ENUMERATED **a, const unsigned char **ppin, long length); + + int i2d_ASN1_OCTET_STRING(const ASN1_OCTET_STRING *a, unsigned char **pp); + int i2d_ASN1_INTEGER(const ASN1_INTEGER *a, unsigned char **pp); + int i2d_ASN1_ENUMERATED(const ASN1_ENUMERATED *a, unsigned char **pp); + + int ASN1_get_object(const unsigned char **pp, long *plength, int *ptag, + int *pclass, long omax); + int ASN1_object_size(int constructed, int length, int tag); + + void ASN1_put_object(unsigned char **pp, int constructed, int length, + int tag, int xclass); +]] + + +local ASN1_STRING_get0_data +if not pcall(function() return C.ASN1_STRING_get0_data end) then + ASN1_STRING_get0_data = C.ASN1_STRING_data +else + ASN1_STRING_get0_data = C.ASN1_STRING_get0_data +end -local _M = {} +local _M = new_tab(0, 7) -_M.BERCLASS = { - Universal = 0, - Application = 64, - ContextSpecific = 128, - Private = 192 +local CLASS = { + UNIVERSAL = 0x00, + APPLICATION = 0x40, + CONTEXT_SPECIFIC = 0x80, + PRIVATE = 0xc0 } +_M.CLASS = CLASS + + +local TAG = { + -- ASN.1 tag values + EOC = 0, + BOOLEAN = 1, + INTEGER = 2, + OCTET_STRING = 4, + NULL = 5, + ENUMERATED = 10, + SEQUENCE = 16, +} +_M.TAG = TAG -_M.ASN1Decoder = { - new = function(self,o) - o = o or {} - setmetatable(o, self) - self.__index = self - o:registerBaseDecoders() - return o - end, - - decode = function(self, encStr, pos) - local etype, elen - local newpos = pos - - newpos, etype = bunpack(encStr, "X1", newpos) - newpos, elen = self.decodeLength(encStr, newpos) - - if self.decoder[etype] then - return self.decoder[etype](self, encStr, elen, newpos) - else - return newpos, nil - end - end, - - setStopOnError = function(self, val) - self.stoponerror = val - end, - - registerBaseDecoders = function(self) - self.decoder = {} - - self.decoder["0A"] = function(self, encStr, elen, pos) - return self.decodeInt(encStr, elen, pos) - end - - self.decoder["8A"] = function(self, encStr, elen, pos) - return bunpack(encStr, "A" .. elen, pos) - end - - self.decoder["31"] = function(self, encStr, elen, pos) - return pos, nil - end - - -- Boolean - self.decoder["01"] = function(self, encStr, elen, pos) - local val = bunpack(encStr, "X", pos) - if val ~= "FF" then - return pos, true - else - return pos, false - end - end - - -- Integer - self.decoder["02"] = function(self, encStr, elen, pos) - return self.decodeInt(encStr, elen, pos) - end - - -- Octet String - self.decoder["04"] = function(self, encStr, elen, pos) - return bunpack(encStr, "A" .. 
elen, pos) - end - - -- Null - self.decoder["05"] = function(self, encStr, elen, pos) - return pos, false - end - - -- Object Identifier - self.decoder["06"] = function(self, encStr, elen, pos) - return self:decodeOID(encStr, elen, pos) - end - - -- Context specific tags - self.decoder["30"] = function(self, encStr, elen, pos) - return self:decodeSeq(encStr, elen, pos) - end - end, - - registerTagDecoders = function(self, tagDecoders) - self:registerBaseDecoders() +local asn1_get_object +do + local lenp = ffi_new("long[1]") + local tagp = ffi_new("int[1]") + local classp = ffi_new("int[1]") + local strpp = ffi_new("const unsigned char*[1]") - for k, v in pairs(tagDecoders) do - self.decoder[k] = v + function asn1_get_object(der, start, stop) + start = start or 0 + stop = stop or #der + if stop <= start or stop > #der then + return nil, "invalid offset" end - end, - decodeLength = function(encStr, pos) - local elen - - pos, elen = bunpack(encStr, "C", pos) - if elen > 128 then - elen = elen - 128 - local elenCalc = 0 - local elenNext - - for i = 1, elen do - elenCalc = elenCalc * 256 - pos, elenNext = bunpack(encStr, "C", pos) - elenCalc = elenCalc + elenNext - end + local s_der = ffi_cast("const unsigned char *", der) + strpp[0] = s_der + start - elen = elenCalc + local ret = C.ASN1_get_object(strpp, lenp, tagp, classp, stop - start) + if band(ret, 0x80) == 0x80 then + return nil, "der with error encoding: " .. ret end - return pos, elen - end, - - decodeSeq = function(self, encStr, len, pos) - local seq = {} - local sPos = 1 - local sStr - - pos, sStr = bunpack(encStr, "A" .. len, pos) - - while (sPos < len) do - local newSeq - - sPos, newSeq = self:decode(sStr, sPos) - if not newSeq and self.stoponerror then - break - end - - insert(seq, newSeq) + local cons = false + if band(ret, 0x20) == 0x20 then + cons = true end - return pos, seq - end, - - decode_oid_component = function(encStr, pos) - local octet - local n = 0 - - repeat - pos, octet = bunpack(encStr, "b", pos) - n = n * 128 + bit.band(0x7F, octet) - until octet < 128 - - return pos, n - end, - - decodeOID = function(self, encStr, len, pos) - local last - local oid = {} - local octet - - last = pos + len - 1 - if pos <= last then - oid._snmp = "06" - pos, octet = bunpack(encStr, "C", pos) - oid[2] = math.fmod(octet, 40) - octet = octet - oid[2] - oid[1] = octet/40 - end - - while pos <= last do - local c - pos, c = self.decode_oid_component(encStr, pos) - oid[#oid + 1] = c - end - - return pos, oid - end, - - decodeInt = function(encStr, len, pos) - local hexStr - - pos, hexStr = bunpack(encStr, "X" .. 
len, pos) + local obj = { + tag = tagp[0], + class = classp[0], + len = tonumber(lenp[0]), + offset = strpp[0] - s_der, + hl = strpp[0] - s_der - start, -- header length + cons = cons, + } - local value = tonumber(hexStr, 16) - if value >= math.pow(256, len)/2 then - value = value - math.pow(256, len) - end - - return pos, value + return obj end -} - -_M.ASN1Encoder = { - new = function(self) - local o = {} - setmetatable(o, self) - self.__index = self - o:registerBaseEncoders() - return o - end, - - encodeSeq = function(self, seqData) - return bpack("XAA" , "30", self.encodeLength(#seqData), seqData) - end, +end +_M.get_object = asn1_get_object - encode = function(self, val) - local vtype = type(val) - if self.encoder[vtype] then - return self.encoder[vtype](self,val) - end - end, +local function asn1_put_object(tag, class, constructed, data, len) + len = type(data) == "string" and #data or len or 0 + if len <= 0 then + return nil, "invalid object length" + end - registerTagEncoders = function(self, tagEncoders) - self:registerBaseEncoders() + local outbuf = ffi_new("unsigned char[?]", len) + ucharpp[0] = outbuf - for k, v in pairs(tagEncoders) do - self.encoder[k] = v - end - end, + C.ASN1_put_object(ucharpp, constructed, len, tag, class) + if not data then + return ffi_string(outbuf) + end + return ffi_string(outbuf) .. data +end - registerBaseEncoders = function(self) - self.encoder = {} +_M.put_object = asn1_put_object - self.encoder["table"] = function(self, val) - if val._ldap == "0A" then - local ival = self.encodeInt(val[1]) - local len = self.encodeLength(#ival) - return bpack("XAA", "0A", len, ival) - end +local encode +do + local encoder = new_tab(0, 3) - if val._ldaptype then - local len + -- Integer + encoder[TAG.INTEGER] = function(val) + local typ = C.ASN1_INTEGER_new() + C.ASN1_INTEGER_set(typ, val) + charpp[0] = nil + local ret = C.i2d_ASN1_INTEGER(typ, charpp) + C.ASN1_INTEGER_free(typ) + return ffi_string(charpp[0], ret) + end - if val[1] == nil or #val[1] == 0 then - return bpack("XC", val._ldaptype, 0) - end + -- Octet String + encoder[TAG.OCTET_STRING] = function(val) + local typ = C.ASN1_OCTET_STRING_new() + C.ASN1_STRING_set(typ, val, #val) + charpp[0] = nil + local ret = C.i2d_ASN1_OCTET_STRING(typ, charpp) + C.ASN1_STRING_free(typ) + return ffi_string(charpp[0], ret) + end - len = self.encodeLength(#val[1]) - return bpack("XAA", val._ldaptype, len, val[1]) - end + encoder[TAG.ENUMERATED] = function(val) + local typ = C.ASN1_ENUMERATED_new() + C.ASN1_ENUMERATED_set(typ, val) + charpp[0] = nil + local ret = C.i2d_ASN1_ENUMERATED(typ, charpp) + C.ASN1_INTEGER_free(typ) + return ffi_string(charpp[0], ret) + end - local encVal = "" - for _, v in ipairs(val) do - encVal = encVal .. self.encode(v) -- todo: buffer? 
- end + encoder[TAG.SEQUENCE] = function(val) + return asn1_put_object(TAG.SEQUENCE, CLASS.UNIVERSAL, 1, val) + end - local tableType = "\x30" - if val["_snmp"] then - tableType = bpack("X", val["_snmp"]) + function encode(val, tag) + if tag == nil then + local typ = type(val) + if typ == "string" then + tag = TAG.OCTET_STRING + elseif typ == "number" then + tag = TAG.INTEGER end - - return bpack("AAA", tableType, self.encodeLength(#encVal), encVal) end - -- Boolean encoder - self.encoder["boolean"] = function(self, val) - if val then - return bpack("X", "01 01 FF") - else - return bpack("X", "01 01 00") - end + if encoder[tag] then + return encoder[tag](val) end + end +end +_M.encode = encode - -- Integer encoder - self.encoder["number"] = function(self, val) - local ival = self.encodeInt(val) - local len = self.encodeLength(#ival) - return bpack("XAA", "02", len, ival) - end +local decode +do + local decoder = new_tab(0, 3) - -- Octet String encoder - self.encoder["string"] = function(self, val) - local len = self.encodeLength(#val) - return bpack("XAA", "04", len, val) + decoder[TAG.OCTET_STRING] = function(der, offset, len) + assert(offset < #der) + cucharpp[0] = ffi_cast("const unsigned char *", der) + offset + local typ = C.d2i_ASN1_OCTET_STRING(nil, cucharpp, len) + if typ == nil then + return nil end + local ret = ffi_string(ASN1_STRING_get0_data(typ)) + C.ASN1_STRING_free(typ) + return ret + end - -- Null encoder - self.encoder["nil"] = function(self, val) - return bpack("X", "05 00") + decoder[TAG.INTEGER] = function(der, offset, len) + assert(offset < #der) + cucharpp[0] = ffi_cast("const unsigned char *", der) + offset + local typ = C.d2i_ASN1_INTEGER(nil, cucharpp, len) + if typ == nil then + return nil end - end, - - encode_oid_component = function(n) - local parts = {} + local ret = C.ASN1_INTEGER_get(typ) + C.ASN1_INTEGER_free(typ) + return tonumber(ret) + end - parts[1] = char(n % 128) - while n >= 128 do - n = bit.rshift(n, 7) - parts[#parts + 1] = char(n % 128 + 0x80) + decoder[TAG.ENUMERATED] = function(der, offset, len) + assert(offset < #der) + cucharpp[0] = ffi_cast("const unsigned char *", der) + offset + local typ = C.d2i_ASN1_ENUMERATED(nil, cucharpp, len) + if typ == nil then + return nil end + local ret = C.ASN1_ENUMERATED_get(typ) + C.ASN1_INTEGER_free(typ) + return tonumber(ret) + end - return reverse(concat(parts)) - end, - - encodeInt = function(val) - local lsb = 0 - - if val > 0 then - local valStr = "" - - while (val > 0) do - lsb = math.fmod(val, 256) - valStr = valStr .. bpack("C", lsb) - val = math.floor(val/256) - end - - if lsb > 127 then - valStr = valStr .. "\0" - end - - return reverse(valStr) - - elseif val < 0 then - local i = 1 - local tcval = val + 256 - - while tcval <= 127 do - tcval = tcval + (math.pow(256, i) * 255) - i = i+1 - end - - local valStr = "" - - while (tcval > 0) do - lsb = math.fmod(tcval, 256) - valStr = valStr .. bpack("C", lsb) - tcval = math.floor(tcval/256) - end - - return reverse(valStr) - - else -- val == 0 - return bpack("x") + -- offset starts from 0 + function decode(der, offset) + offset = offset or 0 + local obj, err = asn1_get_object(der, offset) + if not obj then + return nil, nil, err end - end, - encodeLength = function(len) - if len < 128 then - return char(len) - - else - local parts = {} - - while len > 0 do - parts[#parts + 1] = char(len % 256) - len = bit.rshift(len, 8) - end - - return char(#parts + 0x80) .. 
reverse(concat(parts)) + local ret + if decoder[obj.tag] then + ret = decoder[obj.tag](der, offset, obj.hl + obj.len) end + return obj.offset + obj.len, ret end -} - -function _M.BERtoInt(class, constructed, number) - local asn1_type = class + number - - if constructed == true then - asn1_type = asn1_type + 32 - end - - return asn1_type end +_M.decode = decode + + +--[[ +Encoded LDAP Result: https://ldap.com/ldapv3-wire-protocol-reference-ldap-result/ + +30 0c -- Begin the LDAPMessage sequence + 02 01 03 -- The message ID (integer value 3) + 69 07 -- Begin the add response protocol op + 0a 01 00 -- success result code (enumerated value 0) + 04 00 -- No matched DN (0-byte octet string) + 04 00 -- No diagnostic message (0-byte octet string) +--]] +local function parse_ldap_result(der) + local offset, err, _ + -- message ID (integer) + local id + offset, id, err = decode(der) + if err then + return nil, err + end + -- response protocol op + local obj + obj, err = asn1_get_object(der, offset) + if err then + return nil, err + end + local op = obj.tag -function _M.intToBER(i) - local ber = {} - - if bit.band(i, _M.BERCLASS.Application) == _M.BERCLASS.Application then - ber.class = _M.BERCLASS.Application - elseif bit.band(i, _M.BERCLASS.ContextSpecific) == _M.BERCLASS.ContextSpecific then - ber.class = _M.BERCLASS.ContextSpecific - elseif bit.band(i, _M.BERCLASS.Private) == _M.BERCLASS.Private then - ber.class = _M.BERCLASS.Private - else - ber.class = _M.BERCLASS.Universal + -- success result code + local code + offset, code, err = decode(der, obj.offset) + if err then + return nil, err end - if bit.band(i, 32) == 32 then - ber.constructed = true - ber.number = i - ber.class - 32 + -- matched DN (octet string) + local matched_dn + offset, matched_dn, err = decode(der, offset) + if err then + return nil, err + end - else - ber.primitive = true - ber.number = i - ber.class + -- diagnostic message (octet string) + local diagnostic_msg + _, diagnostic_msg, err = decode(der, offset) + if err then + return nil, err end - return ber + local res = { + message_id = id, + protocol_op = op, + result_code = code, + matched_dn = matched_dn, + diagnostic_msg = diagnostic_msg, + } + + return res end +_M.parse_ldap_result = parse_ldap_result + return _M diff --git a/kong/plugins/ldap-auth/ldap.lua b/kong/plugins/ldap-auth/ldap.lua index 827edd7c067..ac86afcf9d5 100644 --- a/kong/plugins/ldap-auth/ldap.lua +++ b/kong/plugins/ldap-auth/ldap.lua @@ -1,6 +1,9 @@ local asn1 = require "kong.plugins.ldap-auth.asn1" local bunpack = require "lua_pack".unpack local fmt = string.format +local asn1_parse_ldap_result = asn1.parse_ldap_result +local asn1_put_object = asn1.put_object +local asn1_encode = asn1.encode local _M = {} @@ -28,12 +31,6 @@ local APPNO = { } -local function encodeLDAPOp(encoder, appno, isConstructed, data) - local asn1_type = asn1.BERtoInt(asn1.BERCLASS.Application, isConstructed, appno) - return encoder:encode({ _ldaptype = fmt("%X", asn1_type), data }) -end - - local function calculate_payload_length(encStr, pos, socket) local elen @@ -59,23 +56,14 @@ end function _M.bind_request(socket, username, password) - local encoder = asn1.ASN1Encoder:new() - local decoder = asn1.ASN1Decoder:new() - - local ldapAuth = encoder:encode({ _ldaptype = 80, password }) - local bindReq = encoder:encode(3) .. encoder:encode(username) .. ldapAuth - local ldapMsg = encoder:encode(ldapMessageId) .. 
- encodeLDAPOp(encoder, APPNO.BindRequest, true, bindReq) + local ldapAuth = asn1_put_object(0, asn1.CLASS.CONTEXT_SPECIFIC, 0, password) + local bindReq = asn1_encode(3) ..asn1_encode(username) .. ldapAuth + local ldapMsg = asn1_encode(ldapMessageId) .. + asn1_put_object(APPNO.BindRequest, asn1.CLASS.APPLICATION, 1, bindReq) - local packet - local pos - local packet_len - local tmp - local _ + local packet, packet_len, _ - local response = {} - - packet = encoder:encodeSeq(ldapMsg) + packet = asn1_encode(ldapMsg, asn1.TAG.SEQUENCE) ldapMessageId = ldapMessageId + 1 @@ -86,27 +74,23 @@ function _M.bind_request(socket, username, password) _, packet_len = calculate_payload_length(packet, 2, socket) packet = socket:receive(packet_len) - pos, response.messageID = decoder:decode(packet, 1) - pos, tmp = bunpack(packet, "C", pos) - pos = decoder.decodeLength(packet, pos) - response.protocolOp = asn1.intToBER(tmp) - if response.protocolOp.number ~= APPNO.BindResponse then - return false, fmt("Received incorrect Op in packet: %d, expected %d", - response.protocolOp.number, APPNO.BindResponse) + local res, err = asn1_parse_ldap_result(packet) + if err then + return false, "Invalid LDAP message encoding: " .. err end - pos, response.resultCode = decoder:decode(packet, pos) + if res.protocol_op ~= APPNO.BindResponse then + return false, fmt("Received incorrect Op in packet: %d, expected %d", + res.protocol_op, APPNO.BindResponse) + end - if response.resultCode ~= 0 then - local error_msg - pos, response.matchedDN = decoder:decode(packet, pos) - _, response.errorMessage = decoder:decode(packet, pos) - error_msg = ERROR_MSG[response.resultCode] + if res.result_code ~= 0 then + local error_msg = ERROR_MSG[res.result_code] return false, fmt("\n Error: %s\n Details: %s", - error_msg or "Unknown error occurred (code: " .. - response.resultCode .. ")", response.errorMessage or "") + error_msg or "Unknown error occurred (code: " .. + res.result_code .. ")", res.diagnostic_msg or "") else return true @@ -116,14 +100,12 @@ end function _M.unbind_request(socket) local ldapMsg, packet - local encoder = asn1.ASN1Encoder:new() ldapMessageId = ldapMessageId + 1 - ldapMsg = encoder:encode(ldapMessageId) .. - encodeLDAPOp(encoder, APPNO.UnbindRequest, - false, nil) - packet = encoder:encodeSeq(ldapMsg) + ldapMsg = asn1_encode(ldapMessageId) .. + asn1_put_object(APPNO.UnbindRequest, asn1.CLASS.APPLICATION, 0) + packet = asn1_encode(ldapMsg, asn1.TAG.SEQUENCE) socket:send(packet) @@ -132,47 +114,39 @@ end function _M.start_tls(socket) - local ldapMsg, pos, packet, packet_len, tmp, _ - local response = {} - local encoder = asn1.ASN1Encoder:new() - local decoder = asn1.ASN1Decoder:new() + local ldapMsg, packet, packet_len, _ - local method_name = encoder:encode({ _ldaptype = 80, "1.3.6.1.4.1.1466.20037" }) + local method_name = asn1_put_object(0, asn1.CLASS.CONTEXT_SPECIFIC, 0, "1.3.6.1.4.1.1466.20037") ldapMessageId = ldapMessageId + 1 - ldapMsg = encoder:encode(ldapMessageId) .. - encodeLDAPOp(encoder, APPNO.ExtendedRequest, true, method_name) + ldapMsg = asn1_encode(ldapMessageId) .. 
+ asn1_put_object(APPNO.ExtendedRequest, asn1.CLASS.APPLICATION, 1, method_name) - packet = encoder:encodeSeq(ldapMsg) + packet = asn1_encode(ldapMsg, asn1.TAG.SEQUENCE) socket:send(packet) packet = socket:receive(2) _, packet_len = calculate_payload_length(packet, 2, socket) packet = socket:receive(packet_len) - pos, response.messageID = decoder:decode(packet, 1) - pos, tmp = bunpack(packet, "C", pos) - pos = decoder.decodeLength(packet, pos) - response.protocolOp = asn1.intToBER(tmp) - if response.protocolOp.number ~= APPNO.ExtendedResponse then - return false, fmt("Received incorrect Op in packet: %d, expected %d", - response.protocolOp.number, APPNO.ExtendedResponse) + local res, err = asn1_parse_ldap_result(packet) + if err then + return false, "Invalid LDAP message encoding: " .. err end - pos, response.resultCode = decoder:decode(packet, pos) - - if response.resultCode ~= 0 then - local error_msg + if res.protocol_op ~= APPNO.ExtendedResponse then + return false, fmt("Received incorrect Op in packet: %d, expected %d", + res.protocol_op, APPNO.ExtendedResponse) + end - pos, response.matchedDN = decoder:decode(packet, pos) - _, response.errorMessage = decoder:decode(packet, pos) - error_msg = ERROR_MSG[response.resultCode] + if res.result_code ~= 0 then + local error_msg = ERROR_MSG[res.result_code] return false, fmt("\n Error: %s\n Details: %s", error_msg or "Unknown error occurred (code: " .. - response.resultCode .. ")", response.errorMessage or "") + res.result_code .. ")", res.diagnostic_msg or "") else return true diff --git a/kong/runloop/balancer/targets.lua b/kong/runloop/balancer/targets.lua index 9ca251b3991..cc7973dd463 100644 --- a/kong/runloop/balancer/targets.lua +++ b/kong/runloop/balancer/targets.lua @@ -229,7 +229,11 @@ end -- Timer invoked to update DNS records -function resolve_timer_callback() +function resolve_timer_callback(premature) + if premature then + return + end + local now = ngx_now() while (renewal_heap:peekValue() or math.huge) < now do diff --git a/kong/vaults/env/init.lua b/kong/vaults/env/init.lua index 53e93c2559e..30727c4b1ed 100644 --- a/kong/vaults/env/init.lua +++ b/kong/vaults/env/init.lua @@ -37,14 +37,11 @@ end local function get(conf, resource, version) local prefix = conf.prefix - - resource = gsub(resource, "-", "_") - if type(prefix) == "string" then resource = prefix .. resource end - resource = upper(resource) + resource = upper(gsub(resource, "-", "_")) if version == 2 then resource = resource .. 
"_PREVIOUS" diff --git a/spec/01-unit/03-conf_loader_spec.lua b/spec/01-unit/03-conf_loader_spec.lua index ed21f73366d..9832f5e58a2 100644 --- a/spec/01-unit/03-conf_loader_spec.lua +++ b/spec/01-unit/03-conf_loader_spec.lua @@ -1719,4 +1719,34 @@ describe("Configuration loader", function() assert.equal("2m", conf.nginx_http_client_body_buffer_size) end) end) + describe("vault references", function() + it("are collected under $refs property", function() + finally(function() + helpers.unsetenv("PG_DATABASE") + end) + + helpers.setenv("PG_DATABASE", "resolved-kong-database") + + local conf = assert(conf_loader(nil, { + pg_database = "{vault://env/pg-database}" + })) + + assert.equal("resolved-kong-database", conf.pg_database) + assert.equal("{vault://env/pg-database}", conf["$refs"].pg_database) + end) + it("are inferred and collected under $refs property", function() + finally(function() + helpers.unsetenv("PG_PORT") + end) + + helpers.setenv("PG_PORT", "5000") + + local conf = assert(conf_loader(nil, { + pg_port = "{vault://env/pg-port}" + })) + + assert.equal(5000, conf.pg_port) + assert.equal("{vault://env/pg-port}", conf["$refs"].pg_port) + end) + end) end) diff --git a/spec/01-unit/04-prefix_handler_spec.lua b/spec/01-unit/04-prefix_handler_spec.lua index 452e6b3c2dc..9629455c66b 100644 --- a/spec/01-unit/04-prefix_handler_spec.lua +++ b/spec/01-unit/04-prefix_handler_spec.lua @@ -844,6 +844,31 @@ describe("NGINX conf compiler", function() assert.True(in_prefix_kong_conf.loaded_plugins.bar) end) + describe("vault references", function() + it("are kept as references in .kong_env", function() + finally(function() + helpers.unsetenv("PG_DATABASE") + end) + + helpers.setenv("PG_DATABASE", "resolved-kong-database") + + local conf = assert(conf_loader(nil, { + prefix = tmp_config.prefix, + pg_database = "{vault://env/pg-database}", + })) + + assert.equal("resolved-kong-database", conf.pg_database) + assert.equal("{vault://env/pg-database}", conf["$refs"].pg_database) + + assert(prefix_handler.prepare_prefix(conf)) + + local contents = helpers.file.read(tmp_config.kong_env) + + assert.matches("pg_database = {vault://env/pg-database}", contents, nil, true) + assert.not_matches("resolved-kong-database", contents, nil, true) + end) + end) + describe("ssl", function() it("does not create SSL dir if disabled", function() local conf = conf_loader(nil, { diff --git a/spec/02-integration/02-cmd/14-vault_spec.lua b/spec/02-integration/02-cmd/14-vault_spec.lua index c46f6b6989d..dd34b8c2575 100644 --- a/spec/02-integration/02-cmd/14-vault_spec.lua +++ b/spec/02-integration/02-cmd/14-vault_spec.lua @@ -51,7 +51,38 @@ describe("kong vault", function() helpers.setenv("SECRETS_TEST", "testvalue") local ok, stderr, stdout = helpers.kong_exec("vault get env/secrets_test", { vaults = "env" }) assert.equal("", stderr) - assert.matches("testvalue", stdout) + assert.matches("testvalue", stdout, nil, true) + assert.is_true(ok) + + ok, stderr, stdout = helpers.kong_exec("vault get env/secrets-test", { vaults = "env" }) + assert.equal("", stderr) + assert.matches("testvalue", stdout, nil, true) + assert.is_true(ok) + end) + + it("vault get env with config", function() + finally(function() + helpers.unsetenv("KONG_VAULT_ENV_PREFIX") + helpers.unsetenv("SECRETS_TEST") + end) + helpers.setenv("KONG_VAULT_ENV_PREFIX", "SECRETS_") + helpers.setenv("SECRETS_TEST", "testvalue-with-config") + local ok, stderr, stdout = helpers.kong_exec("vault get env/test", { vaults = "env" }) + assert.equal("", stderr) + 
assert.matches("testvalue-with-config", stdout, nil, true) + assert.is_true(ok) + end) + + it("vault get env with config with dash", function() + finally(function() + helpers.unsetenv("KONG_VAULT_ENV_PREFIX") + helpers.unsetenv("SECRETS_AGAIN_TEST") + end) + helpers.setenv("KONG_VAULT_ENV_PREFIX", "SECRETS-AGAIN-") + helpers.setenv("SECRETS_AGAIN_TEST_TOO", "testvalue-with-config-again") + local ok, stderr, stdout = helpers.kong_exec("vault get env/test-too", { vaults = "env" }) + assert.equal("", stderr) + assert.matches("testvalue-with-config-again", stdout, nil, true) assert.is_true(ok) end) end) diff --git a/spec/02-integration/09-hybrid_mode/03-pki_spec.lua b/spec/02-integration/09-hybrid_mode/03-pki_spec.lua index d50f7a3d8b4..d5b8009d5b2 100644 --- a/spec/02-integration/09-hybrid_mode/03-pki_spec.lua +++ b/spec/02-integration/09-hybrid_mode/03-pki_spec.lua @@ -86,7 +86,7 @@ for _, cluster_protocol in ipairs{"json", "wrpc"} do end) end) - describe("sync works", function() + describe("#flaky sync works", function() local route_id it("proxy on DP follows CP config", function() diff --git a/spec/02-integration/13-vaults/01-vault_spec.lua b/spec/02-integration/13-vaults/01-vault_spec.lua index a6f334162d2..0468922edc5 100644 --- a/spec/02-integration/13-vaults/01-vault_spec.lua +++ b/spec/02-integration/13-vaults/01-vault_spec.lua @@ -48,7 +48,7 @@ for _, strategy in helpers.each_strategy() do end) it("create certificates with cert and key as secret", function() - local res, err = client:post("/certificates", { + local res, err = client:post("/certificates", { headers = { ["Content-Type"] = "application/json" }, body = { cert = "{vault://test-vault/cert}", @@ -64,11 +64,16 @@ for _, strategy in helpers.each_strategy() do assert.equal("{vault://test-vault/key}", certificate.key) assert.equal("{vault://unknown/cert}", certificate.cert_alt) assert.equal("{vault://unknown/missing-key}", certificate.key_alt) + assert.is_nil(certificate["$refs"]) certificate, err = db.certificates:select({ id = certificate.id }) assert.is_nil(err) assert.equal(ssl_fixtures.cert, certificate.cert) assert.equal(ssl_fixtures.key, certificate.key) + assert.equal("{vault://test-vault/cert}", certificate["$refs"].cert) + assert.equal("{vault://test-vault/key}", certificate["$refs"].key) + assert.equal("{vault://unknown/cert}", certificate["$refs"].cert_alt) + assert.equal("{vault://unknown/missing-key}", certificate["$refs"].key_alt) assert.is_nil(certificate.cert_alt) assert.is_nil(certificate.key_alt) @@ -81,16 +86,18 @@ for _, strategy in helpers.each_strategy() do assert.equal("{vault://test-vault/key}", certificate.key) assert.equal("{vault://unknown/cert}", certificate.cert_alt) assert.equal("{vault://unknown/missing-key}", certificate.key_alt) + assert.is_nil(certificate["$refs"]) -- verify that certificate attributes are of type reference when querying - local gres = client:get("/certificates/"..certificate.id) - local gbody = assert.res_status(200, gres) - local gcertificate = cjson.decode(gbody) - assert.is_equal("{vault://test-vault/cert}", gcertificate.cert) - assert.is_equal("{vault://test-vault/key}", gcertificate.key) - assert.is_equal("{vault://unknown/cert}", gcertificate.cert_alt) - assert.is_equal("{vault://unknown/missing-key}", gcertificate.key_alt) + res, err = client:get("/certificates/"..certificate.id) + assert.is_nil(err) + body = assert.res_status(200, res) + certificate = cjson.decode(body) + assert.is_equal("{vault://test-vault/cert}", certificate.cert) + 
assert.is_equal("{vault://test-vault/key}", certificate.key) + assert.is_equal("{vault://unknown/cert}", certificate.cert_alt) + assert.is_equal("{vault://unknown/missing-key}", certificate.key_alt) + assert.is_nil(certificate["$refs"]) end) - end) end diff --git a/t/01-pdk/08-response/05-set_header.t b/t/01-pdk/08-response/05-set_header.t index 5948bcbb9ff..57a9257d113 100644 --- a/t/01-pdk/08-response/05-set_header.t +++ b/t/01-pdk/08-response/05-set_header.t @@ -248,3 +248,32 @@ type: string X-Foo: {} --- no_error_log [error] + + + +=== TEST 8: response.set_header() does not set transfer-encoding +--- http_config eval: $t::Util::HttpConfig +--- config + location = /t { + header_filter_by_lua_block { + ngx.header.content_length = nil + local PDK = require "kong.pdk" + local pdk = PDK.new() + + pdk.response.set_header("Transfer-Encoding", "gzip") + ngx.status = 200 + } + + body_filter_by_lua_block { + local new_headers = ngx.resp.get_headers() + + ngx.arg[1] = "Transfer-Encoding: " .. new_headers["Transfer-Encoding"] + ngx.arg[2] = true + } + } +--- request +GET /t +--- response_body chop +Transfer-Encoding: chunked +--- error_log +manually setting Transfer-Encoding. Ignored. diff --git a/t/01-pdk/08-response/08-set_headers.t b/t/01-pdk/08-response/08-set_headers.t index f1881ecd397..749a5d33d3e 100644 --- a/t/01-pdk/08-response/08-set_headers.t +++ b/t/01-pdk/08-response/08-set_headers.t @@ -642,7 +642,7 @@ X-Foo: {zzz} local PDK = require "kong.pdk" local pdk = PDK.new() - local ok, err pdk.response.set_headers({}) + local ok, err = pdk.response.set_headers({}) if not ok then ngx.ctx.err = err end @@ -698,3 +698,37 @@ Content-Type: text/plain ok --- no_error_log [error] + + + +=== TEST 18: response.set_header() does not set transfer-encoding +--- http_config eval: $t::Util::HttpConfig +--- config + location = /t { + header_filter_by_lua_block { + ngx.header.content_length = nil + local PDK = require "kong.pdk" + local pdk = PDK.new() + + pdk.response.set_headers { + ["Transfer-Encoding"] = "gzip", + ["X-test"] = "test", + } + ngx.status = 200 + } + + body_filter_by_lua_block { + local new_headers = ngx.resp.get_headers() + + ngx.arg[1] = "Transfer-Encoding: " .. new_headers["Transfer-Encoding"] .. "\n" + .. "X-test: " .. new_headers["X-test"] + ngx.arg[2] = true + } + } +--- request +GET /t +--- response_body chop +Transfer-Encoding: chunked +X-test: test +--- error_log +manually setting Transfer-Encoding. Ignored. diff --git a/t/01-pdk/08-response/11-exit.t b/t/01-pdk/08-response/11-exit.t index ae2d27eac2c..1615f440d2c 100644 --- a/t/01-pdk/08-response/11-exit.t +++ b/t/01-pdk/08-response/11-exit.t @@ -4,7 +4,7 @@ use Test::Nginx::Socket::Lua; use Test::Nginx::Socket::Lua::Stream; do "./t/Util.pm"; -plan tests => repeat_each() * (blocks() * 4) + 10; +plan tests => repeat_each() * (blocks() * 4) + 11; run_tests(); @@ -1125,3 +1125,32 @@ unable to proxy stream connection, status: 400, err: error message [error] --- error_log finalize stream session: 200 + + + +=== TEST 18: response.exit() does not set transfer-encoding from headers +--- http_config eval: $t::Util::HttpConfig +--- config + location = /t { + access_by_lua_block { + ngx.header.content_length = nil + local PDK = require "kong.pdk" + local pdk = PDK.new() + + pdk.response.exit(200, "test\n", { + ["Transfer-Encoding"] = "gzip", + ["X-test"] = "test", + }) + } + } +--- request +GET /t +--- response_body +test +--- response_headers +Content-Length: 5 +X-test: test +--- error_log +manually setting Transfer-Encoding. Ignored. + +