From c0b3b5b435aab7a83aeecf0f1177d9bfdb06005c Mon Sep 17 00:00:00 2001 From: Justin Mclean Date: Fri, 10 Jan 2025 14:44:52 +1100 Subject: [PATCH 01/10] [#6173] fix Trino license and notice files (#6173) ### What changes were proposed in this pull request? Added LICENSE and NOTICE file for the Trino connector. ### Why are the changes needed? to comply with ASF policy Fix: #6173 ### Does this PR introduce _any_ user-facing change? No. ### How was this patch tested? Tested locally. --- LICENSE.trino | 243 +++++++++++++++++++++++++++++++++++ NOTICE.trino | 24 ++++ build.gradle.kts | 6 +- dev/release/release-build.sh | 2 + 4 files changed, 272 insertions(+), 3 deletions(-) create mode 100644 LICENSE.trino create mode 100644 NOTICE.trino diff --git a/LICENSE.trino b/LICENSE.trino new file mode 100644 index 00000000000..69db2dbdc41 --- /dev/null +++ b/LICENSE.trino @@ -0,0 +1,243 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. 
For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. 
The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. 
+ + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. + + This product bundles various third-party components also under the + Apache Software License 2.0 from: + + Apache Commons Collections + Apache Commons Lang + Error Prone Annotations + Guava InternalFutureFailureAccess and InternalFutures + Guava: Google Core Libraries For Java + Google Guice Core Library + Jackson Annotations + Jackson Core + Jackson Databind + Jackson Datatype Guava + Jackson Datatype JDK8 + Jackson Datatype Joda + Jackson Datatype JSR310 + Jackson Parameter Names + Jakarta Dependency Injection + Airlift + Apache Log4j 1.x Compatibility API + Apache Log4j API + Apache Log4j Core + Apache Log4j Layout For Templated JSON Encoding + Apache Log4j SLF4J Binding + Trino JDBC Driver + + This product bundles various third-party components also under the + MIT license + + Checker Framework + SLF4J API Module + + This product bundles various third-party components also under the + BSD license + + JSR305 + + This product bundles various third-party components also placed in + the public domain. + + AOP Alliance \ No newline at end of file diff --git a/NOTICE.trino b/NOTICE.trino new file mode 100644 index 00000000000..841478225d2 --- /dev/null +++ b/NOTICE.trino @@ -0,0 +1,24 @@ +Apache Gravitino (incubating) +Copyright 2025 The Apache Software Foundation + +This product includes software developed at +The Apache Software Foundation (http://www.apache.org/). + +The initial code for the Gravitino project was donated +to the ASF by Datastrato (https://datastrato.ai/) copyright 2023-2024. + +Apache Commons Collections +Copyright 2001-2025 The Apache Software Foundation + +The Java source file src/main/java/org/apache/commons/collections4/map/ConcurrentReferenceHashMap.java +is from https://github.com/hazelcast/hazelcast and the following notice applies: +Copyright (c) 2008-2020, Hazelcast, Inc. All Rights Reserved. 
+ +Apache Commons Lang +Copyright 2001-2025 The Apache Software Foundation + +Apache log4j +Copyright 2010 The Apache Software Foundation + +Apache Log4j +Copyright 1999-2024 Apache Software Foundation \ No newline at end of file diff --git a/build.gradle.kts b/build.gradle.kts index 4ebd09a9a2e..1fe3c80d9b7 100644 --- a/build.gradle.kts +++ b/build.gradle.kts @@ -676,13 +676,13 @@ tasks { doLast { copy { from(projectDir.dir("licenses")) { into("${rootProject.name}-trino-connector/licenses") } - from(projectDir.file("LICENSE.bin")) { into("${rootProject.name}-trino-connector") } - from(projectDir.file("NOTICE.bin")) { into("${rootProject.name}-trino-connector") } + from(projectDir.file("LICENSE.trino")) { into("${rootProject.name}-trino-connector") } + from(projectDir.file("NOTICE.trino")) { into("${rootProject.name}-trino-connector") } from(projectDir.file("README.md")) { into("${rootProject.name}-trino-connector") } from(projectDir.file("DISCLAIMER_WIP.txt")) { into("${rootProject.name}-trino-connector") } into(outputDir) rename { fileName -> - fileName.replace(".bin", "") + fileName.replace(".trino", "") } } } diff --git a/dev/release/release-build.sh b/dev/release/release-build.sh index c2ff5b6bae9..4654a7881c8 100755 --- a/dev/release/release-build.sh +++ b/dev/release/release-build.sh @@ -205,6 +205,8 @@ if [[ "$1" == "package" ]]; then rm -f gravitino-$GRAVITINO_VERSION-src/NOTICE.bin rm -f gravitino-$GRAVITINO_VERSION-src/LICENSE.rest rm -f gravitino-$GRAVITINO_VERSION-src/NOTICE.rest + rm -f gravitino-$GRAVITINO_VERSION-src/LICENSE.trino + rm -f gravitino-$GRAVITINO_VERSION-src/NOTICE.trino rm -f gravitino-$GRAVITINO_VERSION-src/web/LICENSE.bin rm -f gravitino-$GRAVITINO_VERSION-src/web/NOTICE.bin From 7e96a540b2cea1e39850ae2dc59a5d6aab52d9e4 Mon Sep 17 00:00:00 2001 From: Justin Mclean Date: Fri, 10 Jan 2025 14:49:49 +1100 Subject: [PATCH 02/10] [Minor] Update year in NOTICE files (#6171) ### What changes were proposed in this pull request? Update year as we are going to make a new release. ### Why are the changes needed? ASF/legal policy. Fix: #N/A ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? N/A --- NOTICE | 2 +- NOTICE.rest | 2 +- web/web/NOTICE.bin | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/NOTICE b/NOTICE index a488662afee..c8b1eda0d1e 100644 --- a/NOTICE +++ b/NOTICE @@ -1,5 +1,5 @@ Apache Gravitino (incubating) -Copyright 2024 The Apache Software Foundation +Copyright 2025 The Apache Software Foundation This product includes software developed at The Apache Software Foundation (http://www.apache.org/). diff --git a/NOTICE.rest b/NOTICE.rest index 18e7620cd9b..8a06bae6fbe 100644 --- a/NOTICE.rest +++ b/NOTICE.rest @@ -1,5 +1,5 @@ Apache Gravitino (incubating) -Copyright 2024 The Apache Software Foundation +Copyright 2025 The Apache Software Foundation This product includes software developed at The Apache Software Foundation (http://www.apache.org/). diff --git a/web/web/NOTICE.bin b/web/web/NOTICE.bin index 796e9174f2d..e4d3a7f58b3 100644 --- a/web/web/NOTICE.bin +++ b/web/web/NOTICE.bin @@ -1,5 +1,5 @@ Apache Gravitino (incubating) -Copyright 2024 The Apache Software Foundation +Copyright 2025 The Apache Software Foundation This product includes software developed at The Apache Software Foundation (http://www.apache.org/). 
From 242940be15535c80bc31ea0a8c7f30cf40f7a8fc Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 10 Jan 2025 14:02:51 +0800 Subject: [PATCH 03/10] build(deps): bump nanoid from 3.3.7 to 3.3.8 in /web/web (#6176) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [nanoid](https://github.com/ai/nanoid) from 3.3.7 to 3.3.8.
Changelog (sourced from nanoid's changelog):

3.3.8

- Fixed a way to break Nano ID by passing non-integer size (by @myndzi).

Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- web/web/pnpm-lock.yaml | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/web/web/pnpm-lock.yaml b/web/web/pnpm-lock.yaml index 0f2bbbf4bc5..1b8ef5427cc 100644 --- a/web/web/pnpm-lock.yaml +++ b/web/web/pnpm-lock.yaml @@ -2071,8 +2071,8 @@ packages: react: '*' react-dom: '*' - nanoid@3.3.7: - resolution: {integrity: sha512-eSRppjcPIatRIMC1U6UngP8XFcz8MQWGQdt1MTBQ7NaAmvXDfvNxbvWV3x2y6CdEUciCSsDHDQZbhYaB8QEo2g==} + nanoid@3.3.8: + resolution: {integrity: sha512-WNLf5Sd8oZxOm+TzppcYk8gVOgP+l58xNy58D0nbUnOxOWRWvlcCV4kUF7ltmI6PsrLl/BgKEyS4mqsGChFN0w==} engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1} hasBin: true @@ -5317,7 +5317,7 @@ snapshots: stacktrace-js: 2.0.2 stylis: 4.3.2 - nanoid@3.3.7: {} + nanoid@3.3.8: {} natural-compare@1.4.0: {} @@ -5530,13 +5530,13 @@ snapshots: postcss@8.4.31: dependencies: - nanoid: 3.3.7 + nanoid: 3.3.8 picocolors: 1.0.1 source-map-js: 1.2.0 postcss@8.4.39: dependencies: - nanoid: 3.3.7 + nanoid: 3.3.8 picocolors: 1.0.1 source-map-js: 1.2.0 From 2a1729273972c385be75d8b3c73580b03b2a3a9c Mon Sep 17 00:00:00 2001 From: Lord of Abyss <103809695+Abyss-lord@users.noreply.github.com> Date: Fri, 10 Jan 2025 14:51:37 +0800 Subject: [PATCH 04/10] [#6146] improve(CLI): Refactor topic commands in Gavitino CLI (#6174) ### What changes were proposed in this pull request? Refactor topic commands in Gavitino CLI ### Why are the changes needed? Fix: #6146 ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? local test. --- .../gravitino/cli/GravitinoCommandLine.java | 91 +------- .../gravitino/cli/TopicCommandHandler.java | 196 ++++++++++++++++++ 2 files changed, 197 insertions(+), 90 deletions(-) create mode 100644 clients/cli/src/main/java/org/apache/gravitino/cli/TopicCommandHandler.java diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java index 49c8b8e7c54..fbd4d5b0c83 100644 --- a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java @@ -139,7 +139,7 @@ private void executeCommand() { } else if (entity.equals(CommandEntities.METALAKE)) { handleMetalakeCommand(); } else if (entity.equals(CommandEntities.TOPIC)) { - handleTopicCommand(); + new TopicCommandHandler(this, line, command, ignore).handle(); } else if (entity.equals(CommandEntities.FILESET)) { handleFilesetCommand(); } else if (entity.equals(CommandEntities.USER)) { @@ -798,95 +798,6 @@ private void handleOwnerCommand() { } } - /** - * Handles the command execution for topics based on command type and the command line options. 
- */ - private void handleTopicCommand() { - String url = getUrl(); - String auth = getAuth(); - String userName = line.getOptionValue(GravitinoOptions.LOGIN); - FullName name = new FullName(line); - String metalake = name.getMetalakeName(); - String catalog = name.getCatalogName(); - String schema = name.getSchemaName(); - - Command.setAuthenticationMode(auth, userName); - - List missingEntities = Lists.newArrayList(); - if (catalog == null) missingEntities.add(CommandEntities.CATALOG); - if (schema == null) missingEntities.add(CommandEntities.SCHEMA); - - if (CommandActions.LIST.equals(command)) { - checkEntities(missingEntities); - newListTopics(url, ignore, metalake, catalog, schema).validate().handle(); - return; - } - - String topic = name.getTopicName(); - if (topic == null) missingEntities.add(CommandEntities.TOPIC); - checkEntities(missingEntities); - - switch (command) { - case CommandActions.DETAILS: - newTopicDetails(url, ignore, metalake, catalog, schema, topic).validate().handle(); - break; - - case CommandActions.CREATE: - { - String comment = line.getOptionValue(GravitinoOptions.COMMENT); - newCreateTopic(url, ignore, metalake, catalog, schema, topic, comment) - .validate() - .handle(); - break; - } - - case CommandActions.DELETE: - { - boolean force = line.hasOption(GravitinoOptions.FORCE); - newDeleteTopic(url, ignore, force, metalake, catalog, schema, topic).validate().handle(); - break; - } - - case CommandActions.UPDATE: - { - if (line.hasOption(GravitinoOptions.COMMENT)) { - String comment = line.getOptionValue(GravitinoOptions.COMMENT); - newUpdateTopicComment(url, ignore, metalake, catalog, schema, topic, comment) - .validate() - .handle(); - } - break; - } - - case CommandActions.SET: - { - String property = line.getOptionValue(GravitinoOptions.PROPERTY); - String value = line.getOptionValue(GravitinoOptions.VALUE); - newSetTopicProperty(url, ignore, metalake, catalog, schema, topic, property, value) - .validate() - .handle(); - break; - } - - case CommandActions.REMOVE: - { - String property = line.getOptionValue(GravitinoOptions.PROPERTY); - newRemoveTopicProperty(url, ignore, metalake, catalog, schema, topic, property) - .validate() - .handle(); - break; - } - - case CommandActions.PROPERTIES: - newListTopicProperties(url, ignore, metalake, catalog, schema, topic).validate().handle(); - break; - - default: - System.err.println(ErrorMessages.UNSUPPORTED_ACTION); - break; - } - } - /** * Handles the command execution for filesets based on command type and the command line options. */ diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/TopicCommandHandler.java b/clients/cli/src/main/java/org/apache/gravitino/cli/TopicCommandHandler.java new file mode 100644 index 00000000000..7c2a75db91b --- /dev/null +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/TopicCommandHandler.java @@ -0,0 +1,196 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.apache.gravitino.cli; + +import com.google.common.collect.Lists; +import java.util.List; +import org.apache.commons.cli.CommandLine; +import org.apache.gravitino.cli.commands.Command; + +/** Handles the command execution for Topics based on command type and the command line options. */ +public class TopicCommandHandler extends CommandHandler { + private final GravitinoCommandLine gravitinoCommandLine; + private final CommandLine line; + private final String command; + private final boolean ignore; + private final String url; + private final FullName name; + private final String metalake; + private final String catalog; + private final String schema; + private String topic; + + /** + * Constructs a {@link TopicCommandHandler} instance. + * + * @param gravitinoCommandLine The Gravitino command line instance. + * @param line The command line arguments. + * @param command The command to execute. + * @param ignore Ignore server version mismatch. + */ + public TopicCommandHandler( + GravitinoCommandLine gravitinoCommandLine, CommandLine line, String command, boolean ignore) { + this.gravitinoCommandLine = gravitinoCommandLine; + this.line = line; + this.command = command; + this.ignore = ignore; + + this.url = getUrl(line); + this.name = new FullName(line); + this.metalake = name.getMetalakeName(); + this.catalog = name.getCatalogName(); + this.schema = name.getSchemaName(); + } + + /** Handles the command execution logic based on the provided command. */ + @Override + protected void handle() { + String userName = line.getOptionValue(GravitinoOptions.LOGIN); + Command.setAuthenticationMode(getAuth(line), userName); + + List missingEntities = Lists.newArrayList(); + if (catalog == null) missingEntities.add(CommandEntities.CATALOG); + if (schema == null) missingEntities.add(CommandEntities.SCHEMA); + + if (CommandActions.LIST.equals(command)) { + checkEntities(missingEntities); + handleListCommand(); + return; + } + + topic = name.getTopicName(); + if (topic == null) missingEntities.add(CommandEntities.TOPIC); + checkEntities(missingEntities); + + if (!executeCommand()) { + System.err.println(ErrorMessages.UNSUPPORTED_COMMAND); + Main.exit(-1); + } + } + + /** + * Executes the specific command based on the command type. + * + * @return true if the command is supported, false otherwise + */ + private boolean executeCommand() { + switch (command) { + case CommandActions.DETAILS: + handleDetailsCommand(); + return true; + + case CommandActions.CREATE: + handleCreateCommand(); + return true; + + case CommandActions.DELETE: + handleDeleteCommand(); + return true; + + case CommandActions.UPDATE: + handleUpdateCommand(); + return true; + + case CommandActions.SET: + handleSetCommand(); + return true; + + case CommandActions.REMOVE: + handleRemoveCommand(); + return true; + + case CommandActions.PROPERTIES: + handlePropertiesCommand(); + return true; + + default: + return false; + } + } + + /** Handles the "DETAILS" command. 
*/ + private void handleDetailsCommand() { + gravitinoCommandLine + .newTopicDetails(url, ignore, metalake, catalog, schema, topic) + .validate() + .handle(); + } + + /** Handles the "CREATE" command. */ + private void handleCreateCommand() { + String comment = line.getOptionValue(GravitinoOptions.COMMENT); + gravitinoCommandLine + .newCreateTopic(url, ignore, metalake, catalog, schema, topic, comment) + .validate() + .handle(); + } + + /** Handles the "DELETE" command. */ + private void handleDeleteCommand() { + boolean force = line.hasOption(GravitinoOptions.FORCE); + gravitinoCommandLine + .newDeleteTopic(url, ignore, force, metalake, catalog, schema, topic) + .validate() + .handle(); + } + + /** Handles the "UPDATE" command. */ + private void handleUpdateCommand() { + if (line.hasOption(GravitinoOptions.COMMENT)) { + String comment = line.getOptionValue(GravitinoOptions.COMMENT); + gravitinoCommandLine + .newUpdateTopicComment(url, ignore, metalake, catalog, schema, topic, comment) + .validate() + .handle(); + } + } + + /** Handles the "SET" command. */ + private void handleSetCommand() { + String property = line.getOptionValue(GravitinoOptions.PROPERTY); + String value = line.getOptionValue(GravitinoOptions.VALUE); + gravitinoCommandLine + .newSetTopicProperty(url, ignore, metalake, catalog, schema, topic, property, value) + .validate() + .handle(); + } + + /** Handles the "REMOVE" command. */ + private void handleRemoveCommand() { + String property = line.getOptionValue(GravitinoOptions.PROPERTY); + gravitinoCommandLine + .newRemoveTopicProperty(url, ignore, metalake, catalog, schema, topic, property) + .validate() + .handle(); + } + + /** Handles the "PROPERTIES" command. */ + private void handlePropertiesCommand() { + gravitinoCommandLine + .newListTopicProperties(url, ignore, metalake, catalog, schema, topic) + .validate() + .handle(); + } + + /** Handles the "LIST" command. */ + private void handleListCommand() { + gravitinoCommandLine.newListTopics(url, ignore, metalake, catalog, schema).validate().handle(); + } +} From 20bfeba028848b3d28d1a7d606763815bbb3c4b9 Mon Sep 17 00:00:00 2001 From: Lord of Abyss <103809695+Abyss-lord@users.noreply.github.com> Date: Fri, 10 Jan 2025 14:55:13 +0800 Subject: [PATCH 05/10] [#6151] improve(CLI): Refactor group commands in Gavitino CLI (#6175) ### What changes were proposed in this pull request? Refactor group commands in Gavitino CLI. ### Why are the changes needed? Fix: #6151 ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? local test. 
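
For context, this refactor and the topic-command refactor in the previous commit follow the same dispatch shape: `GravitinoCommandLine.executeCommand()` constructs an entity-specific handler (`GroupCommandHandler`, `TopicCommandHandler`), which sets authentication, validates the required entities, and routes each `CommandActions` value to a private `handleXxxCommand()` method, falling back to `ErrorMessages.UNSUPPORTED_COMMAND` otherwise. Below is a minimal, self-contained sketch of that pattern only; the class and option names are simplified stand-ins for illustration, not the actual Gravitino CLI classes.

```java
// Minimal sketch of the handler dispatch pattern introduced by these CLI refactors.
// All names are simplified stand-ins for CommandHandler/GroupCommandHandler etc.
abstract class EntityHandler {
  /** Entry point invoked by the command-line dispatcher. */
  abstract void handle();
}

class GroupHandlerSketch extends EntityHandler {
  private final String command;
  private final String group;

  GroupHandlerSketch(String command, String group) {
    this.command = command;
    this.group = group;
  }

  @Override
  void handle() {
    if (!execute()) {
      // Mirrors the ErrorMessages.UNSUPPORTED_COMMAND / Main.exit(-1) path in the patch.
      System.err.println("Unsupported command: " + command);
    }
  }

  /** Returns true if the action is supported, like executeCommand() in the real handlers. */
  private boolean execute() {
    switch (command) {
      case "details":
        System.out.println("Showing details for group " + group);
        return true;
      case "create":
        System.out.println("Creating group " + group);
        return true;
      case "delete":
        System.out.println("Deleting group " + group);
        return true;
      default:
        return false;
    }
  }
}

class HandlerPatternDemo {
  public static void main(String[] args) {
    // The real dispatcher passes parsed CLI options; literals are used here for illustration.
    new GroupHandlerSketch("details", "engineers").handle();
    new GroupHandlerSketch("rename", "engineers").handle(); // falls through to the unsupported path
  }
}
```

The net effect, visible in the diffs below, is that `GravitinoCommandLine` keeps only a one-line delegation per entity while each handler owns its own validation and routing.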
--- .../gravitino/cli/GravitinoCommandLine.java | 63 +------ .../gravitino/cli/GroupCommandHandler.java | 159 ++++++++++++++++++ 2 files changed, 160 insertions(+), 62 deletions(-) create mode 100644 clients/cli/src/main/java/org/apache/gravitino/cli/GroupCommandHandler.java diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java index fbd4d5b0c83..2af8a2973a3 100644 --- a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java @@ -145,7 +145,7 @@ private void executeCommand() { } else if (entity.equals(CommandEntities.USER)) { handleUserCommand(); } else if (entity.equals(CommandEntities.GROUP)) { - handleGroupCommand(); + new GroupCommandHandler(this, line, command, ignore).handle(); } else if (entity.equals(CommandEntities.TAG)) { handleTagCommand(); } else if (entity.equals(CommandEntities.ROLE)) { @@ -374,67 +374,6 @@ protected void handleUserCommand() { } } - /** Handles the command execution for Group based on command type and the command line options. */ - protected void handleGroupCommand() { - String url = getUrl(); - String auth = getAuth(); - String userName = line.getOptionValue(GravitinoOptions.LOGIN); - FullName name = new FullName(line); - String metalake = name.getMetalakeName(); - String group = line.getOptionValue(GravitinoOptions.GROUP); - - Command.setAuthenticationMode(auth, userName); - - if (group == null && !CommandActions.LIST.equals(command)) { - System.err.println(ErrorMessages.MISSING_GROUP); - Main.exit(-1); - } - - switch (command) { - case CommandActions.DETAILS: - if (line.hasOption(GravitinoOptions.AUDIT)) { - newGroupAudit(url, ignore, metalake, group).validate().handle(); - } else { - newGroupDetails(url, ignore, metalake, group).validate().handle(); - } - break; - - case CommandActions.LIST: - newListGroups(url, ignore, metalake).validate().handle(); - break; - - case CommandActions.CREATE: - newCreateGroup(url, ignore, metalake, group).validate().handle(); - break; - - case CommandActions.DELETE: - boolean force = line.hasOption(GravitinoOptions.FORCE); - newDeleteGroup(url, ignore, force, metalake, group).validate().handle(); - break; - - case CommandActions.REVOKE: - String[] revokeRoles = line.getOptionValues(GravitinoOptions.ROLE); - for (String role : revokeRoles) { - newRemoveRoleFromGroup(url, ignore, metalake, group, role).validate().handle(); - } - System.out.printf("Remove roles %s from group %s%n", COMMA_JOINER.join(revokeRoles), group); - break; - - case CommandActions.GRANT: - String[] grantRoles = line.getOptionValues(GravitinoOptions.ROLE); - for (String role : grantRoles) { - newAddRoleToGroup(url, ignore, metalake, group, role).validate().handle(); - } - System.out.printf("Grant roles %s to group %s%n", COMMA_JOINER.join(grantRoles), group); - break; - - default: - System.err.println(ErrorMessages.UNSUPPORTED_ACTION); - Main.exit(-1); - break; - } - } - /** Handles the command execution for Tags based on command type and the command line options. 
*/ protected void handleTagCommand() { String url = getUrl(); diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/GroupCommandHandler.java b/clients/cli/src/main/java/org/apache/gravitino/cli/GroupCommandHandler.java new file mode 100644 index 00000000000..e336003e6b5 --- /dev/null +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/GroupCommandHandler.java @@ -0,0 +1,159 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.apache.gravitino.cli; + +import org.apache.commons.cli.CommandLine; +import org.apache.gravitino.cli.commands.Command; + +/** Handles the command execution for Groups based on command type and the command line options. */ +public class GroupCommandHandler extends CommandHandler { + private final GravitinoCommandLine gravitinoCommandLine; + private final CommandLine line; + private final String command; + private final boolean ignore; + private final String url; + private final FullName name; + private final String metalake; + private String group; + + /** + * Constructs a {@link GroupCommandHandler} instance. + * + * @param gravitinoCommandLine The Gravitino command line instance. + * @param line The command line arguments. + * @param command The command to execute. + * @param ignore Ignore server version mismatch. + */ + public GroupCommandHandler( + GravitinoCommandLine gravitinoCommandLine, CommandLine line, String command, boolean ignore) { + this.gravitinoCommandLine = gravitinoCommandLine; + this.line = line; + this.command = command; + this.ignore = ignore; + + this.url = getUrl(line); + this.name = new FullName(line); + this.metalake = name.getMetalakeName(); + } + + /** Handles the command execution logic based on the provided command. */ + @Override + protected void handle() { + String userName = line.getOptionValue(GravitinoOptions.LOGIN); + Command.setAuthenticationMode(getAuth(line), userName); + + if (CommandActions.LIST.equals(command)) { + handleListCommand(); + return; + } + + group = line.getOptionValue(GravitinoOptions.GROUP); + if (group == null) { + System.err.println(ErrorMessages.MISSING_GROUP); + Main.exit(-1); + } + + if (!executeCommand()) { + System.err.println(ErrorMessages.UNSUPPORTED_COMMAND); + Main.exit(-1); + } + } + + /** + * Executes the specific command based on the command type. 
+ * + * @return true if the command is supported, false otherwise + */ + private boolean executeCommand() { + switch (command) { + case CommandActions.DETAILS: + handleDetailsCommand(); + return true; + + case CommandActions.CREATE: + handleCreateCommand(); + return true; + + case CommandActions.DELETE: + handleDeleteCommand(); + return true; + + case CommandActions.REVOKE: + handleRevokeCommand(); + return true; + + case CommandActions.GRANT: + handleGrantCommand(); + return true; + + default: + return false; + } + } + + /** Handles the "DETAILS" command. */ + private void handleDetailsCommand() { + if (line.hasOption(GravitinoOptions.AUDIT)) { + gravitinoCommandLine.newGroupAudit(url, ignore, metalake, group).validate().handle(); + } else { + gravitinoCommandLine.newGroupDetails(url, ignore, metalake, group).validate().handle(); + } + } + + /** Handles the "CREATE" command. */ + private void handleCreateCommand() { + gravitinoCommandLine.newCreateGroup(url, ignore, metalake, group).validate().handle(); + } + + /** Handles the "DELETE" command. */ + private void handleDeleteCommand() { + boolean force = line.hasOption(GravitinoOptions.FORCE); + gravitinoCommandLine.newDeleteGroup(url, ignore, force, metalake, group).validate().handle(); + } + + /** Handles the "REVOKE" command. */ + private void handleRevokeCommand() { + String[] revokeRoles = line.getOptionValues(GravitinoOptions.ROLE); + for (String role : revokeRoles) { + gravitinoCommandLine + .newRemoveRoleFromGroup(url, ignore, metalake, group, role) + .validate() + .handle(); + } + System.out.printf("Remove roles %s from group %s%n", COMMA_JOINER.join(revokeRoles), group); + } + + /** Handles the "GRANT" command. */ + private void handleGrantCommand() { + String[] grantRoles = line.getOptionValues(GravitinoOptions.ROLE); + for (String role : grantRoles) { + gravitinoCommandLine + .newAddRoleToGroup(url, ignore, metalake, group, role) + .validate() + .handle(); + } + System.out.printf("Grant roles %s to group %s%n", COMMA_JOINER.join(grantRoles), group); + } + + /** Handles the "LIST" command. */ + private void handleListCommand() { + gravitinoCommandLine.newListGroups(url, ignore, metalake).validate().handle(); + } +} From 74ef6601e465498877d273b1b9b58d0cef7aac6f Mon Sep 17 00:00:00 2001 From: FANNG Date: Fri, 10 Jan 2025 15:12:42 +0800 Subject: [PATCH 06/10] [#6165] feat(core): Use Gravitino cloud jar without hadoop packages for Iceberg REST server (#6168) ### What changes were proposed in this pull request? 1. use Gravitino cloud jar without hadoop packages for Iceberg REST server credential vending in test and document 2. For OSS, use Gravitino Aliyun bundle jar in test and docker image because Iceberg doesn't provide Iceberg Aliyun bundle jar ### Why are the changes needed? Fix: #6165 ### Does this PR introduce _any_ user-facing change? no ### How was this patch tested? 
test S3 OSS GCS ADLS Iceberg REST test in local --- .../iceberg-rest-server-dependency.sh | 18 +++++++--------- docs/iceberg-rest-service.md | 8 ++++++- docs/security/credential-vending.md | 21 ++++++++++++++----- iceberg/iceberg-rest-server/build.gradle.kts | 3 ++- .../test/IcebergRESTADLSTokenIT.java | 2 +- .../test/IcebergRESTAzureAccountKeyIT.java | 2 +- .../integration/test/IcebergRESTGCSIT.java | 2 +- .../integration/test/IcebergRESTOSSIT.java | 2 ++ .../test/IcebergRESTOSSSecretIT.java | 2 ++ ...ESTS3IT.java => IcebergRESTS3TokenIT.java} | 4 ++-- 10 files changed, 41 insertions(+), 23 deletions(-) rename iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/{IcebergRESTS3IT.java => IcebergRESTS3TokenIT.java} (98%) diff --git a/dev/docker/iceberg-rest-server/iceberg-rest-server-dependency.sh b/dev/docker/iceberg-rest-server/iceberg-rest-server-dependency.sh index 852b55b0206..aced0224f48 100755 --- a/dev/docker/iceberg-rest-server/iceberg-rest-server-dependency.sh +++ b/dev/docker/iceberg-rest-server/iceberg-rest-server-dependency.sh @@ -35,17 +35,18 @@ tar xfz gravitino-iceberg-rest-server-*.tar.gz cp -r gravitino-iceberg-rest-server*-bin ${iceberg_rest_server_dir}/packages/gravitino-iceberg-rest-server cd ${gravitino_home} -./gradlew :bundles:gcp-bundle:jar -./gradlew :bundles:aws-bundle:jar -./gradlew :bundles:azure-bundle:jar +./gradlew :bundles:gcp:jar +./gradlew :bundles:aws:jar +./gradlew :bundles:azure:jar +## Iceberg doesn't provide Iceberg Aliyun bundle jar, so use Gravitino aliyun bundle to provide OSS packages. ./gradlew :bundles:aliyun-bundle:jar # prepare bundle jar cd ${iceberg_rest_server_dir} mkdir -p bundles -cp ${gravitino_home}/bundles/gcp-bundle/build/libs/gravitino-gcp-bundle-*.jar bundles/ -cp ${gravitino_home}/bundles/aws-bundle/build/libs/gravitino-aws-bundle-*.jar bundles/ -cp ${gravitino_home}/bundles/azure-bundle/build/libs/gravitino-azure-bundle-*.jar bundles/ +cp ${gravitino_home}/bundles/gcp/build/libs/gravitino-gcp-*.jar bundles/ +cp ${gravitino_home}/bundles/aws/build/libs/gravitino-aws-*.jar bundles/ +cp ${gravitino_home}/bundles/azure/build/libs/gravitino-azure-*.jar bundles/ cp ${gravitino_home}/bundles/aliyun-bundle/build/libs/gravitino-aliyun-bundle-*.jar bundles/ iceberg_gcp_bundle="iceberg-gcp-bundle-1.5.2.jar" @@ -63,11 +64,6 @@ if [ ! -f "bundles/${iceberg_azure_bundle}" ]; then curl -L -s -o bundles/${iceberg_azure_bundle} https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-azure-bundle/1.5.2/${iceberg_azure_bundle} fi -iceberg_aliyun_bundle="iceberg-aliyun-bundle-1.5.2.jar" -if [ ! -f "bundles/${iceberg_aliyun_bundle}" ]; then - curl -L -s -o bundles/${iceberg_aliyun_bundle} https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-aliyun-bundle/1.5.2/${iceberg_aliyun_bundle} -fi - # download jdbc driver curl -L -s -o bundles/sqlite-jdbc-3.42.0.0.jar https://repo1.maven.org/maven2/org/xerial/sqlite-jdbc/3.42.0.0/sqlite-jdbc-3.42.0.0.jar diff --git a/docs/iceberg-rest-service.md b/docs/iceberg-rest-service.md index d42fc98b4dd..a4846d0e0dd 100644 --- a/docs/iceberg-rest-service.md +++ b/docs/iceberg-rest-service.md @@ -134,8 +134,14 @@ For other Iceberg OSS properties not managed by Gravitino like `client.security- Please refer to [OSS credentials](./security/credential-vending.md#oss-credentials) for credential related configurations. +Additionally, Iceberg doesn't provide Iceberg Aliyun bundle jar which contains OSS packages, there are two alternatives to use OSS packages: +1. 
Use [Gravitino Aliyun bundle jar with hadoop packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aliyun-bundle). +2. Use [Aliyun JAVA SDK](https://gosspublic.alicdn.com/sdks/java/aliyun_java_sdk_3.10.2.zip) and extract `aliyun-sdk-oss-3.10.2.jar`, `hamcrest-core-1.1.jar`, `jdom2-2.0.6.jar` jars. + +Please place the above jars in the classpath of Iceberg REST server, please refer to [server management](#server-management) for classpath details. + :::info -Please set the `gravitino.iceberg-rest.warehouse` parameter to `oss://{bucket_name}/${prefix_name}`. Additionally, download the [Aliyun OSS SDK](https://gosspublic.alicdn.com/sdks/java/aliyun_java_sdk_3.10.2.zip) and copy `aliyun-sdk-oss-3.10.2.jar`, `hamcrest-core-1.1.jar`, `jdom2-2.0.6.jar` in the classpath of Iceberg REST server, `iceberg-rest-server/libs` for the auxiliary server, `libs` for the standalone server. +Please set the `gravitino.iceberg-rest.warehouse` parameter to `oss://{bucket_name}/${prefix_name}`. ::: #### GCS diff --git a/docs/security/credential-vending.md b/docs/security/credential-vending.md index 92370f4315d..b5391ac3152 100644 --- a/docs/security/credential-vending.md +++ b/docs/security/credential-vending.md @@ -132,12 +132,23 @@ Gravitino supports custom credentials, you can implement the `org.apache.graviti Besides setting credentials related configuration, please download Gravitino cloud bundle jar and place it in the classpath of Iceberg REST server or Hadoop catalog. -Gravitino cloud bundle jar: +For Hadoop catalog, please use Gravitino cloud bundle jar with Hadoop and cloud packages: -- [Gravitino AWS bundle jar](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aws-bundle) -- [Gravitino Aliyun bundle jar](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aliyun-bundle) -- [Gravitino GCP bundle jar](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-gcp-bundle) -- [Gravitino Azure bundle jar](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-azure-bundle) +- [Gravitino AWS bundle jar with Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aws-bundle) +- [Gravitino Aliyun bundle jar with Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aliyun-bundle) +- [Gravitino GCP bundle jar with Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-gcp-bundle) +- [Gravitino Azure bundle jar with Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-azure-bundle) + +For Iceberg REST catalog server, please use Gravitino cloud bundle jar without Hadoop and cloud packages: + +- [Gravitino AWS bundle jar without Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aws) +- [Gravitino Aliyun bundle jar without Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aliyun) +- [Gravitino GCP bundle jar without Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-gcp) +- [Gravitino Azure bundle jar without Hadoop and cloud packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-azure) + +:::note +For OSS, Iceberg doesn't provide Iceberg Aliyun bundle jar which contains OSS packages, you could provide the OSS jar by yourself or use [Gravitino Aliyun bundle jar with Hadoop and cloud 
packages](https://mvnrepository.com/artifact/org.apache.gravitino/gravitino-aliyun-bundle), please refer to [OSS configuration](../iceberg-rest-service.md#oss-configuration) for more details. +::: The classpath of the server: diff --git a/iceberg/iceberg-rest-server/build.gradle.kts b/iceberg/iceberg-rest-server/build.gradle.kts index fe35c4e7789..925ad900762 100644 --- a/iceberg/iceberg-rest-server/build.gradle.kts +++ b/iceberg/iceberg-rest-server/build.gradle.kts @@ -62,7 +62,8 @@ dependencies { annotationProcessor(libs.lombok) compileOnly(libs.lombok) - testImplementation(project(":bundles:aliyun")) + // Iceberg doesn't provide Aliyun bundle jar, use Gravitino Aliyun bundle to provide OSS packages + testImplementation(project(":bundles:aliyun-bundle")) testImplementation(project(":bundles:aws")) testImplementation(project(":bundles:gcp", configuration = "shadow")) testImplementation(project(":bundles:azure", configuration = "shadow")) diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTADLSTokenIT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTADLSTokenIT.java index 52ccb876df9..bf718d601d7 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTADLSTokenIT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTADLSTokenIT.java @@ -130,7 +130,7 @@ private void downloadIcebergAzureBundleJar() throws IOException { private void copyAzureBundleJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); - BaseIT.copyBundleJarsToDirectory("azure-bundle", targetDir); + BaseIT.copyBundleJarsToDirectory("azure", targetDir); } @Test diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTAzureAccountKeyIT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTAzureAccountKeyIT.java index f999f84f58d..4f3c608fe3b 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTAzureAccountKeyIT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTAzureAccountKeyIT.java @@ -113,6 +113,6 @@ private void downloadIcebergAzureBundleJar() throws IOException { private void copyAzureBundleJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); - BaseIT.copyBundleJarsToDirectory("azure-bundle", targetDir); + BaseIT.copyBundleJarsToDirectory("azure", targetDir); } } diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTGCSIT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTGCSIT.java index 3396b60e1fd..74bf55edc09 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTGCSIT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTGCSIT.java @@ -88,7 +88,7 @@ private Map getGCSConfig() { private void copyGCSBundleJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); - 
BaseIT.copyBundleJarsToDirectory("gcp-bundle", targetDir); + BaseIT.copyBundleJarsToDirectory("gcp", targetDir); } private void downloadIcebergBundleJar() throws IOException { diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSIT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSIT.java index 4c4b4a953bc..8e72ce33f1b 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSIT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSIT.java @@ -126,6 +126,8 @@ private void downloadIcebergForAliyunJar() throws IOException { private void copyAliyunOSSJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); + // Iceberg doesn't provide Iceberg Aliyun bundle jar, so use Gravitino aliyun bundle to provide + // OSS packages. BaseIT.copyBundleJarsToDirectory("aliyun-bundle", targetDir); } } diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSSecretIT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSSecretIT.java index 0be69cbe3d7..3b198c9d29e 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSSecretIT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTOSSSecretIT.java @@ -111,6 +111,8 @@ private void downloadIcebergForAliyunJar() throws IOException { private void copyAliyunOSSJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); + // Iceberg doesn't provide Iceberg Aliyun bundle jar, so use Gravitino aliyun bundle to provide + // OSS packages. 
BaseIT.copyBundleJarsToDirectory("aliyun-bundle", targetDir); } } diff --git a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3IT.java b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3TokenIT.java similarity index 98% rename from iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3IT.java rename to iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3TokenIT.java index e906018f525..ef1551f91b4 100644 --- a/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3IT.java +++ b/iceberg/iceberg-rest-server/src/test/java/org/apache/gravitino/iceberg/integration/test/IcebergRESTS3TokenIT.java @@ -40,7 +40,7 @@ @SuppressWarnings("FormatStringAnnotation") @EnabledIfEnvironmentVariable(named = "GRAVITINO_TEST_CLOUD_IT", matches = "true") -public class IcebergRESTS3IT extends IcebergRESTJdbcCatalogIT { +public class IcebergRESTS3TokenIT extends IcebergRESTJdbcCatalogIT { private String s3Warehouse; private String accessKey; @@ -124,7 +124,7 @@ private void downloadIcebergAwsBundleJar() throws IOException { private void copyS3BundleJar() { String gravitinoHome = System.getenv("GRAVITINO_HOME"); String targetDir = String.format("%s/iceberg-rest-server/libs/", gravitinoHome); - BaseIT.copyBundleJarsToDirectory("aws-bundle", targetDir); + BaseIT.copyBundleJarsToDirectory("aws", targetDir); } /** From d2e261ac65b96470ba712f5a40c187dd9206cc9d Mon Sep 17 00:00:00 2001 From: FANNG Date: Fri, 10 Jan 2025 15:20:54 +0800 Subject: [PATCH 07/10] [#6054] feat(core): add more GCS permission to support fileset operations (#6141) ### What changes were proposed in this pull request? 1. for resource path like `a/b`, add "a", "a/", "a/b" read permission for GCS connector 2. replace `storage.legacyBucketReader` with `storage.insightsCollectorService`, because `storage.legacyBucketReader` provides extra list permission for the bucket. ### Why are the changes needed? Fix: #6054 ### Does this PR introduce _any_ user-facing change? no ### How was this patch tested? 
Iceberg GCS IT and fileset GCS credential IT --- .../gcs/credential/GCSTokenProvider.java | 72 +++++++++++++++---- .../gcs/credential/TestGCSTokenProvider.java | 64 +++++++++++++++++ 2 files changed, 124 insertions(+), 12 deletions(-) create mode 100644 bundles/gcp/src/test/java/org/apache/gravitino/gcs/credential/TestGCSTokenProvider.java diff --git a/bundles/gcp/src/main/java/org/apache/gravitino/gcs/credential/GCSTokenProvider.java b/bundles/gcp/src/main/java/org/apache/gravitino/gcs/credential/GCSTokenProvider.java index f499b8c3e85..0c1c2ab8af7 100644 --- a/bundles/gcp/src/main/java/org/apache/gravitino/gcs/credential/GCSTokenProvider.java +++ b/bundles/gcp/src/main/java/org/apache/gravitino/gcs/credential/GCSTokenProvider.java @@ -24,6 +24,8 @@ import com.google.auth.oauth2.CredentialAccessBoundary.AccessBoundaryRule; import com.google.auth.oauth2.DownscopedCredentials; import com.google.auth.oauth2.GoogleCredentials; +import com.google.common.annotations.VisibleForTesting; +import com.google.common.base.Preconditions; import java.io.File; import java.io.FileInputStream; import java.io.IOException; @@ -99,6 +101,57 @@ private AccessToken getToken(Set readLocations, Set writeLocatio return downscopedCredentials.refreshAccessToken(); } + private List getReadExpressions(String bucketName, String resourcePath) { + List readExpressions = new ArrayList<>(); + readExpressions.add( + String.format( + "resource.name.startsWith('projects/_/buckets/%s/objects/%s')", + bucketName, resourcePath)); + getAllResources(resourcePath) + .forEach( + parentResourcePath -> + readExpressions.add( + String.format( + "resource.name == 'projects/_/buckets/%s/objects/%s'", + bucketName, parentResourcePath))); + return readExpressions; + } + + @VisibleForTesting + // "a/b/c" will get ["a", "a/", "a/b", "a/b/", "a/b/c"] + static List getAllResources(String resourcePath) { + if (resourcePath.endsWith("/")) { + resourcePath = resourcePath.substring(0, resourcePath.length() - 1); + } + if (resourcePath.isEmpty()) { + return Arrays.asList(""); + } + Preconditions.checkArgument( + !resourcePath.startsWith("/"), resourcePath + " should not start with /"); + List parts = Arrays.asList(resourcePath.split("/")); + List results = new ArrayList<>(); + String parent = ""; + for (int i = 0; i < parts.size() - 1; i++) { + results.add(parts.get(i)); + parent += parts.get(i) + "/"; + results.add(parent); + } + results.add(parent + parts.get(parts.size() - 1)); + return results; + } + + @VisibleForTesting + // Remove the first '/', and append `/` if the path does not end with '/'. 
+ static String normalizeUriPath(String resourcePath) { + if (resourcePath.startsWith("/")) { + resourcePath = resourcePath.substring(1); + } + if (resourcePath.endsWith("/")) { + return resourcePath; + } + return resourcePath + "/"; + } + private CredentialAccessBoundary getAccessBoundary( Set readLocations, Set writeLocations) { // bucketName -> read resource expressions @@ -116,14 +169,11 @@ private CredentialAccessBoundary getAccessBoundary( URI uri = URI.create(location); String bucketName = getBucketName(uri); readBuckets.add(bucketName); - String resourcePath = uri.getPath().substring(1); + String resourcePath = normalizeUriPath(uri.getPath()); List resourceExpressions = readExpressions.computeIfAbsent(bucketName, key -> new ArrayList<>()); // add read privilege - resourceExpressions.add( - String.format( - "resource.name.startsWith('projects/_/buckets/%s/objects/%s')", - bucketName, resourcePath)); + resourceExpressions.addAll(getReadExpressions(bucketName, resourcePath)); // add list privilege resourceExpressions.add( String.format( @@ -146,21 +196,19 @@ private CredentialAccessBoundary getAccessBoundary( CredentialAccessBoundary.newBuilder(); readBuckets.forEach( bucket -> { - // Hadoop GCS connector needs to get bucket info + // Hadoop GCS connector needs storage.buckets.get permission, the reason why not use + // inRole:roles/storage.legacyBucketReader is it provides extra list permission. AccessBoundaryRule bucketInfoRule = AccessBoundaryRule.newBuilder() .setAvailableResource(toGCSBucketResource(bucket)) - .setAvailablePermissions(Arrays.asList("inRole:roles/storage.legacyBucketReader")) + .setAvailablePermissions( + Arrays.asList("inRole:roles/storage.insightsCollectorService")) .build(); credentialAccessBoundaryBuilder.addRule(bucketInfoRule); List readConditions = readExpressions.get(bucket); AccessBoundaryRule rule = getAccessBoundaryRule( - bucket, - readConditions, - Arrays.asList( - "inRole:roles/storage.legacyObjectReader", - "inRole:roles/storage.objectViewer")); + bucket, readConditions, Arrays.asList("inRole:roles/storage.objectViewer")); if (rule == null) { return; } diff --git a/bundles/gcp/src/test/java/org/apache/gravitino/gcs/credential/TestGCSTokenProvider.java b/bundles/gcp/src/test/java/org/apache/gravitino/gcs/credential/TestGCSTokenProvider.java new file mode 100644 index 00000000000..66326fc2ba1 --- /dev/null +++ b/bundles/gcp/src/test/java/org/apache/gravitino/gcs/credential/TestGCSTokenProvider.java @@ -0,0 +1,64 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.apache.gravitino.gcs.credential; + +import com.google.common.collect.ImmutableMap; +import java.util.Arrays; +import java.util.List; +import java.util.Map; +import org.junit.jupiter.api.Assertions; +import org.junit.jupiter.api.Test; + +public class TestGCSTokenProvider { + + @Test + void testGetAllResources() { + Map> checkResults = + ImmutableMap.of( + "a/b", Arrays.asList("a", "a/", "a/b"), + "a/b/", Arrays.asList("a", "a/", "a/b"), + "a", Arrays.asList("a"), + "a/", Arrays.asList("a"), + "", Arrays.asList(""), + "/", Arrays.asList("")); + + checkResults.forEach( + (key, value) -> { + List parentResources = GCSTokenProvider.getAllResources(key); + Assertions.assertArrayEquals(value.toArray(), parentResources.toArray()); + }); + } + + @Test + void testNormalizePath() { + Map checkResults = + ImmutableMap.of( + "/a/b/", "a/b/", + "/a/b", "a/b/", + "a/b", "a/b/", + "a/b/", "a/b/"); + + checkResults.forEach( + (k, v) -> { + String normalizedPath = GCSTokenProvider.normalizeUriPath(k); + Assertions.assertEquals(v, normalizedPath); + }); + } +} From aa4fc6084371e21b6403f2ea30cdc649c26fb160 Mon Sep 17 00:00:00 2001 From: roryqi Date: Fri, 10 Jan 2025 15:51:10 +0800 Subject: [PATCH 08/10] [#6110] doc(authz): Add document for chain authorization plugin (#6115) ### What changes were proposed in this pull request? Add document for chain authorization plugin ### Why are the changes needed? Fix: #6110 ### Does this PR introduce _any_ user-facing change? Just document. ### How was this patch tested? No need. --------- Co-authored-by: Xun Co-authored-by: Qiming Teng --- docs/security/authorization-pushdown.md | 53 ++++++++++++++++++++++++- 1 file changed, 51 insertions(+), 2 deletions(-) diff --git a/docs/security/authorization-pushdown.md b/docs/security/authorization-pushdown.md index fe42a0955f4..9c8e9721939 100644 --- a/docs/security/authorization-pushdown.md +++ b/docs/security/authorization-pushdown.md @@ -21,12 +21,16 @@ In order to use the Ranger Hadoop SQL Plugin, you need to configure the followin |-------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------|---------------|----------|------------------| | `authorization-provider` | Providers to use to implement authorization plugin such as `ranger`. | (none) | No | 0.6.0-incubating | | `authorization.ranger.admin.url` | The Apache Ranger web URIs. | (none) | No | 0.6.0-incubating | +| `authorization.ranger.service.type` | The Apache Ranger service type, Currently only supports `HadoopSQL` or `HDFS` | (none) | No | 0.8.0-incubating | | `authorization.ranger.auth.type` | The Apache Ranger authentication type `simple` or `kerberos`. | `simple` | No | 0.6.0-incubating | | `authorization.ranger.username` | The Apache Ranger admin web login username (auth type=simple), or kerberos principal(auth type=kerberos), Need have Ranger administrator permission. | (none) | No | 0.6.0-incubating | | `authorization.ranger.password` | The Apache Ranger admin web login user password (auth type=simple), or path of the keytab file(auth type=kerberos) | (none) | No | 0.6.0-incubating | -| `authorization.ranger.service.type` | The Apache Ranger service type. | (none) | No | 0.8.0-incubating | | `authorization.ranger.service.name` | The Apache Ranger service name. 
| (none) | No | 0.6.0-incubating |
 
+:::caution
+The Gravitino Ranger authorization plugin only supports the Apache Ranger HadoopSQL Plugin and the Apache Ranger HDFS Plugin.
+:::
+
 Once you have used the correct configuration, you can perform authorization operations by calling Gravitino [authorization RESTful API](https://gravitino.apache.org/docs/latest/api/rest/grant-roles-to-a-user).
 
 Gravitino will initially create three roles in Apache Ranger:
@@ -55,4 +59,49 @@ authorization.ranger.service.name=hiveRepo
 Gravitino 0.8.0 only supports the authorization Apache Ranger Hive service , Apache Iceberg service and Apache Paimon Service. Spark can use Kyuubi authorization plugin to access Gravitino's catalog.
 But the plugin can't support to update or delete data for Paimon catalog.
 More data source authorization is under development.
-:::
\ No newline at end of file
+:::
+
+### Chain authorization plugin
+
+Gravitino supports chaining multiple authorization plugins to secure one catalog.
+The authorization plugin chain is defined in the `authorization.chain.plugins` property, with the plugin names separated by commas.
+When a user performs an authorization operation on data within a catalog, the authorization rules are pushed down through every plugin defined in the chain.
+
+In order to use the chained authorization plugin, you need to configure the following properties:
+
+| Property Name                                             | Description                                                                             | Default Value | Required                    | Since Version    |
+|-----------------------------------------------------------|-----------------------------------------------------------------------------------------|---------------|-----------------------------|------------------|
+| `authorization-provider`                                  | Providers to use to implement authorization plugin such as `chain`                     | (none)        | No                          | 0.8.0-incubating |
+| `authorization.chain.plugins`                             | The comma-separated list of plugin names, like `${plugin-name1},${plugin-name2},...`   | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+| `authorization.chain.${plugin-name}.ranger.admin.url`     | The Apache Ranger web URI for the plugin `${plugin-name}`                               | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+| `authorization.chain.${plugin-name}.ranger.service.type`  | The Apache Ranger service type for the plugin `${plugin-name}`                          | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+| `authorization.chain.${plugin-name}.ranger.service.name`  | The Apache Ranger service name for the plugin `${plugin-name}`                          | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+| `authorization.chain.${plugin-name}.ranger.username`      | The Apache Ranger admin username for the plugin `${plugin-name}`                        | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+| `authorization.chain.${plugin-name}.ranger.password`      | The Apache Ranger admin password, or keytab file path, for the plugin `${plugin-name}`  | (none)        | Yes if you use chain plugin | 0.8.0-incubating |
+
+:::caution
+The Gravitino chain authorization plugin only supports the Apache Ranger HadoopSQL Plugin and the Apache Ranger HDFS Plugin.
+The properties of every chained authorization plugin should use `authorization.chain.${plugin-name}` as the prefix.
+:::
+
+#### Example of using the chain authorization plugin
+
+Suppose you have an Apache Hive service in your datacenter and have created a `hiveRepo` in Apache Ranger to manage its permissions.
+The Hive service stores its data in HDFS, and you have created a `hdfsRepo` in Apache Ranger to manage the HDFS permissions.
+ +```properties +authorization-provider=chain +authorization.chain.plugins=hive,hdfs +authorization.chain.hive.ranger.admin.url=http://ranger-service:6080 +authorization.chain.hive.ranger.service.type=HadoopSQL +authorization.chain.hive.ranger.service.name=hiveRepo +authorization.chain.hive.ranger.auth.type=simple +authorization.chain.hive.ranger.username=Jack +authorization.chain.hive.ranger.password=PWD123 +authorization.chain.hdfs.ranger.admin.url=http://ranger-service:6080 +authorization.chain.hdfs.ranger.service.type=HDFS +authorization.chain.hdfs.ranger.service.name=hdfsRepo +authorization.chain.hdfs.ranger.auth.type=simple +authorization.chain.hdfs.ranger.username=Jack +authorization.chain.hdfs.ranger.password=PWD123 +``` \ No newline at end of file From bba915767ff68eb0b9a2db18194c920579fd0eed Mon Sep 17 00:00:00 2001 From: luoshipeng <806855059@qq.com> Date: Fri, 10 Jan 2025 16:35:14 +0800 Subject: [PATCH 09/10] [#6144] improve(CLI): Refactor schema commands in Gravitino CLI (#6178) ### What changes were proposed in this pull request? Refactor schema commands in Gravitino CLI. ### Why are the changes needed? Fix: #6144 ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? Local test --- .../apache/gravitino/cli/CommandHandler.java | 12 +- .../gravitino/cli/GravitinoCommandLine.java | 75 +------ .../gravitino/cli/SchemaCommandHandler.java | 185 ++++++++++++++++++ 3 files changed, 192 insertions(+), 80 deletions(-) create mode 100644 clients/cli/src/main/java/org/apache/gravitino/cli/SchemaCommandHandler.java diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/CommandHandler.java b/clients/cli/src/main/java/org/apache/gravitino/cli/CommandHandler.java index 2b058b07af1..2af2487cc89 100644 --- a/clients/cli/src/main/java/org/apache/gravitino/cli/CommandHandler.java +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/CommandHandler.java @@ -34,11 +34,11 @@ public abstract class CommandHandler { private boolean authSet = false; /** - * Retrieves the Gravitinno URL from the command line options or the GRAVITINO_URL environment - * variable or the Gravitio config file. + * Retrieves the Gravitino URL from the command line options or the GRAVITINO_URL environment + * variable or the Gravitino config file. * * @param line The command line instance. - * @return The Gravitinno URL, or null if not found. + * @return The Gravitino URL, or null if not found. */ public String getUrl(CommandLine line) { GravitinoConfig config = new GravitinoConfig(null); @@ -73,11 +73,11 @@ public String getUrl(CommandLine line) { } /** - * Retrieves the Gravitinno authentication from the command line options or the GRAVITINO_AUTH - * environment variable or the Gravitio config file. + * Retrieves the Gravitino authentication from the command line options or the GRAVITINO_AUTH + * environment variable or the Gravitino config file. * * @param line The command line instance. - * @return The Gravitinno authentication, or null if not found. + * @return The Gravitino authentication, or null if not found. 
*/ public String getAuth(CommandLine line) { // If specified on the command line use that diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java index 2af8a2973a3..74fadcb54b4 100644 --- a/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/GravitinoCommandLine.java @@ -133,7 +133,7 @@ private void executeCommand() { } else if (entity.equals(CommandEntities.TABLE)) { new TableCommandHandler(this, line, command, ignore).handle(); } else if (entity.equals(CommandEntities.SCHEMA)) { - handleSchemaCommand(); + new SchemaCommandHandler(this, line, command, ignore).handle(); } else if (entity.equals(CommandEntities.CATALOG)) { new CatalogCommandHandler(this, line, command, ignore).handle(); } else if (entity.equals(CommandEntities.METALAKE)) { @@ -240,79 +240,6 @@ private void handleMetalakeCommand() { } } - /** - * Handles the command execution for Schemas based on command type and the command line options. - */ - private void handleSchemaCommand() { - String url = getUrl(); - String auth = getAuth(); - String userName = line.getOptionValue(GravitinoOptions.LOGIN); - FullName name = new FullName(line); - String metalake = name.getMetalakeName(); - String catalog = name.getCatalogName(); - - Command.setAuthenticationMode(auth, userName); - - List missingEntities = Lists.newArrayList(); - if (metalake == null) missingEntities.add(CommandEntities.METALAKE); - if (catalog == null) missingEntities.add(CommandEntities.CATALOG); - - // Handle the CommandActions.LIST action separately as it doesn't use `schema` - if (CommandActions.LIST.equals(command)) { - checkEntities(missingEntities); - newListSchema(url, ignore, metalake, catalog).validate().handle(); - return; - } - - String schema = name.getSchemaName(); - if (schema == null) missingEntities.add(CommandEntities.SCHEMA); - checkEntities(missingEntities); - - switch (command) { - case CommandActions.DETAILS: - if (line.hasOption(GravitinoOptions.AUDIT)) { - newSchemaAudit(url, ignore, metalake, catalog, schema).validate().handle(); - } else { - newSchemaDetails(url, ignore, metalake, catalog, schema).validate().handle(); - } - break; - - case CommandActions.CREATE: - String comment = line.getOptionValue(GravitinoOptions.COMMENT); - newCreateSchema(url, ignore, metalake, catalog, schema, comment).validate().handle(); - break; - - case CommandActions.DELETE: - boolean force = line.hasOption(GravitinoOptions.FORCE); - newDeleteSchema(url, ignore, force, metalake, catalog, schema).validate().handle(); - break; - - case CommandActions.SET: - String property = line.getOptionValue(GravitinoOptions.PROPERTY); - String value = line.getOptionValue(GravitinoOptions.VALUE); - newSetSchemaProperty(url, ignore, metalake, catalog, schema, property, value) - .validate() - .handle(); - break; - - case CommandActions.REMOVE: - property = line.getOptionValue(GravitinoOptions.PROPERTY); - newRemoveSchemaProperty(url, ignore, metalake, catalog, schema, property) - .validate() - .handle(); - break; - - case CommandActions.PROPERTIES: - newListSchemaProperties(url, ignore, metalake, catalog, schema).validate().handle(); - break; - - default: - System.err.println(ErrorMessages.UNSUPPORTED_COMMAND); - Main.exit(-1); - break; - } - } - /** Handles the command execution for Users based on command type and the command line options. 
*/ protected void handleUserCommand() { String url = getUrl(); diff --git a/clients/cli/src/main/java/org/apache/gravitino/cli/SchemaCommandHandler.java b/clients/cli/src/main/java/org/apache/gravitino/cli/SchemaCommandHandler.java new file mode 100644 index 00000000000..4a0cf919fb8 --- /dev/null +++ b/clients/cli/src/main/java/org/apache/gravitino/cli/SchemaCommandHandler.java @@ -0,0 +1,185 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.apache.gravitino.cli; + +import com.google.common.collect.Lists; +import java.util.List; +import org.apache.commons.cli.CommandLine; +import org.apache.gravitino.cli.commands.Command; + +public class SchemaCommandHandler extends CommandHandler { + + private final GravitinoCommandLine gravitinoCommandLine; + private final CommandLine line; + private final String command; + private final boolean ignore; + private final String url; + private final FullName name; + private final String metalake; + private final String catalog; + private String schema; + + /** + * Constructs a {@link SchemaCommandHandler} instance. + * + * @param gravitinoCommandLine The Gravitino command line instance. + * @param line The command line arguments. + * @param command The command to execute. + * @param ignore Ignore server version mismatch. + */ + public SchemaCommandHandler( + GravitinoCommandLine gravitinoCommandLine, CommandLine line, String command, boolean ignore) { + this.gravitinoCommandLine = gravitinoCommandLine; + this.line = line; + this.command = command; + this.ignore = ignore; + + this.url = getUrl(line); + this.name = new FullName(line); + this.metalake = name.getMetalakeName(); + this.catalog = name.getCatalogName(); + } + + @Override + protected void handle() { + String userName = line.getOptionValue(GravitinoOptions.LOGIN); + Command.setAuthenticationMode(getAuth(line), userName); + + List missingEntities = Lists.newArrayList(); + if (metalake == null) missingEntities.add(CommandEntities.METALAKE); + if (catalog == null) missingEntities.add(CommandEntities.CATALOG); + + if (CommandActions.LIST.equals(command)) { + checkEntities(missingEntities); + handleListCommand(); + return; + } + + this.schema = name.getSchemaName(); + if (schema == null) missingEntities.add(CommandEntities.SCHEMA); + checkEntities(missingEntities); + + if (!executeCommand()) { + System.err.println(ErrorMessages.UNSUPPORTED_COMMAND); + Main.exit(-1); + } + } + + /** + * Executes the specific command based on the command type. 
+ * + * @return true if the command is supported, false otherwise + */ + private boolean executeCommand() { + switch (command) { + case CommandActions.DETAILS: + handleDetailsCommand(); + return true; + + case CommandActions.CREATE: + handleCreateCommand(); + return true; + + case CommandActions.DELETE: + handleDeleteCommand(); + return true; + + case CommandActions.SET: + handleSetCommand(); + return true; + + case CommandActions.REMOVE: + handleRemoveCommand(); + return true; + + case CommandActions.PROPERTIES: + handlePropertiesCommand(); + return true; + + default: + return false; + } + } + + /** Handles the "LIST" command. */ + private void handleListCommand() { + gravitinoCommandLine.newListSchema(url, ignore, metalake, catalog).validate().handle(); + } + + /** Handles the "DETAILS" command. */ + private void handleDetailsCommand() { + if (line.hasOption(GravitinoOptions.AUDIT)) { + gravitinoCommandLine + .newSchemaAudit(url, ignore, metalake, catalog, schema) + .validate() + .handle(); + } else { + gravitinoCommandLine + .newSchemaDetails(url, ignore, metalake, catalog, schema) + .validate() + .handle(); + } + } + + /** Handles the "CREATE" command. */ + private void handleCreateCommand() { + String comment = line.getOptionValue(GravitinoOptions.COMMENT); + gravitinoCommandLine + .newCreateSchema(url, ignore, metalake, catalog, schema, comment) + .validate() + .handle(); + } + + /** Handles the "DELETE" command. */ + private void handleDeleteCommand() { + boolean force = line.hasOption(GravitinoOptions.FORCE); + gravitinoCommandLine + .newDeleteSchema(url, ignore, force, metalake, catalog, schema) + .validate() + .handle(); + } + + /** Handles the "SET" command. */ + private void handleSetCommand() { + String property = line.getOptionValue(GravitinoOptions.PROPERTY); + String value = line.getOptionValue(GravitinoOptions.VALUE); + gravitinoCommandLine + .newSetSchemaProperty(url, ignore, metalake, catalog, schema, property, value) + .validate() + .handle(); + } + + /** Handles the "REMOVE" command. */ + private void handleRemoveCommand() { + String property = line.getOptionValue(GravitinoOptions.PROPERTY); + gravitinoCommandLine + .newRemoveSchemaProperty(url, ignore, metalake, catalog, schema, property) + .validate() + .handle(); + } + + /** Handles the "PROPERTIES" command. */ + private void handlePropertiesCommand() { + gravitinoCommandLine + .newListSchemaProperties(url, ignore, metalake, catalog, schema) + .validate() + .handle(); + } +} From 14ec3833cb91858cfc43a4b275b3cb578193e8b5 Mon Sep 17 00:00:00 2001 From: Jerry Shao Date: Fri, 10 Jan 2025 22:03:34 +0800 Subject: [PATCH 10/10] [#6184]improve(core): Remove the protobuf dependency (#6185) ### What changes were proposed in this pull request? Remove the unused protobuf dependency. ### Why are the changes needed? Since we already removed the KV storage support, so protobuf dependency is not required any more. Fix: #6184 ### Does this PR introduce _any_ user-facing change? No. ### How was this patch tested? Existing tests. 
--- NOTICE.bin | 8 -------- NOTICE.rest | 14 +++----------- clients/client-java/build.gradle.kts | 4 ---- common/build.gradle.kts | 1 - core/build.gradle.kts | 4 ---- gradle/libs.versions.toml | 4 ---- 6 files changed, 3 insertions(+), 32 deletions(-) diff --git a/NOTICE.bin b/NOTICE.bin index 79645c85bf0..eef511ff7a7 100644 --- a/NOTICE.bin +++ b/NOTICE.bin @@ -184,14 +184,6 @@ zlib in pure Java, which can be obtained at: * HOMEPAGE: * http://www.jcraft.com/jzlib/ -This product optionally depends on 'Protocol Buffers', Google's data -interchange format, which can be obtained at: - - * LICENSE: - * license/LICENSE.protobuf.txt (New BSD License) - * HOMEPAGE: - * http://code.google.com/p/protobuf/ - This product optionally depends on 'SLF4J', a simple logging facade for Java, which can be obtained at: diff --git a/NOTICE.rest b/NOTICE.rest index 8a06bae6fbe..551b4d3eb3b 100644 --- a/NOTICE.rest +++ b/NOTICE.rest @@ -253,7 +253,7 @@ Dropwizard Hadoop Metrics Copyright 2016 Josh Elser AWS EventStream for Java -Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. +Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Apache Gravitino (incubating) Copyright 2024 The Apache Software Foundation @@ -488,7 +488,7 @@ The Apache Software Foundation (https://www.apache.org/). This product includes software developed by Joda.org (https://www.joda.org/). -Kerby-kerb Admin +Kerby-kerb Admin Copyright 2014-2022 The Apache Software Foundation Kerby-kerb core @@ -524,7 +524,7 @@ Copyright 2014-2022 The Apache Software Foundation Kerby PKIX Project Copyright 2014-2022 The Apache Software Foundation -Kerby Util +Kerby Util Copyright 2014-2022 The Apache Software Foundation Kerby XDR Project @@ -605,14 +605,6 @@ zlib in pure Java, which can be obtained at: * HOMEPAGE: * http://www.jcraft.com/jzlib/ -This product optionally depends on 'Protocol Buffers', Google's data -interchange format, which can be obtained at: - - * LICENSE: - * license/LICENSE.protobuf.txt (New BSD License) - * HOMEPAGE: - * http://code.google.com/p/protobuf/ - This product optionally depends on 'SLF4J', a simple logging facade for Java, which can be obtained at: diff --git a/clients/client-java/build.gradle.kts b/clients/client-java/build.gradle.kts index d928c5ce006..d7518569c94 100644 --- a/clients/client-java/build.gradle.kts +++ b/clients/client-java/build.gradle.kts @@ -25,10 +25,6 @@ plugins { dependencies { implementation(project(":api")) implementation(project(":common")) - implementation(libs.protobuf.java.util) { - exclude("com.google.guava", "guava") - .because("Brings in Guava for Android, which we don't want (and breaks multimaps).") - } implementation(libs.jackson.databind) implementation(libs.jackson.annotations) implementation(libs.jackson.datatype.jdk8) diff --git a/common/build.gradle.kts b/common/build.gradle.kts index 91e2d137f25..1acf0e1b4c8 100644 --- a/common/build.gradle.kts +++ b/common/build.gradle.kts @@ -36,7 +36,6 @@ dependencies { implementation(libs.jackson.datatype.jdk8) implementation(libs.jackson.datatype.jsr310) implementation(libs.jackson.databind) - implementation(libs.protobuf.java) annotationProcessor(libs.lombok) compileOnly(libs.lombok) diff --git a/core/build.gradle.kts b/core/build.gradle.kts index 3ca446a51c1..ef23950b07c 100644 --- a/core/build.gradle.kts +++ b/core/build.gradle.kts @@ -36,10 +36,6 @@ dependencies { implementation(libs.guava) implementation(libs.h2db) implementation(libs.mybatis) - implementation(libs.protobuf.java.util) { - 
exclude("com.google.guava", "guava") - .because("Brings in Guava for Android, which we don't want (and breaks multimaps).") - } annotationProcessor(libs.lombok) diff --git a/gradle/libs.versions.toml b/gradle/libs.versions.toml index 52bccd9b480..3391daf30be 100644 --- a/gradle/libs.versions.toml +++ b/gradle/libs.versions.toml @@ -93,7 +93,6 @@ cglib = "2.2" ranger = "2.4.0" javax-jaxb-api = "2.3.1" javax-ws-rs-api = "2.1.1" -protobuf-plugin = "0.9.2" spotless-plugin = '6.11.0' gradle-extensions-plugin = '1.74' publish-plugin = '1.2.0' @@ -129,8 +128,6 @@ azure-identity = { group = "com.azure", name = "azure-identity", version.ref = " azure-storage-file-datalake = { group = "com.azure", name = "azure-storage-file-datalake", version.ref = "azure-storage-file-datalake"} reactor-netty-http = {group = "io.projectreactor.netty", name = "reactor-netty-http", version.ref = "reactor-netty-http"} reactor-netty-core = {group = "io.projectreactor.netty", name = "reactor-netty-core", version.ref = "reactor-netty-core"} -protobuf-java = { group = "com.google.protobuf", name = "protobuf-java", version.ref = "protoc" } -protobuf-java-util = { group = "com.google.protobuf", name = "protobuf-java-util", version.ref = "protoc" } jackson-databind = { group = "com.fasterxml.jackson.core", name = "jackson-databind", version.ref = "jackson" } jackson-annotations = { group = "com.fasterxml.jackson.core", name = "jackson-annotations", version.ref = "jackson" } jackson-datatype-jdk8 = { group = "com.fasterxml.jackson.datatype", name = "jackson-datatype-jdk8", version.ref = "jackson" } @@ -293,7 +290,6 @@ prometheus = ["prometheus-servlet", "prometheus-dropwizard", "prometheus-client" kerby = ["kerby-core", "kerby-simplekdc"] [plugins] -protobuf = { id = "com.google.protobuf", version.ref = "protobuf-plugin" } spotless = { id = "com.diffplug.spotless", version.ref = "spotless-plugin" } gradle-extensions = { id = "com.github.vlsi.gradle-extensions", version.ref = "gradle-extensions-plugin" } publish = { id = "io.github.gradle-nexus.publish-plugin", version.ref = "publish-plugin" }
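
As an aside to patch 07 above (the GCS credential change), the following self-contained sketch illustrates the resource prefix expansion that the new `getAllResources`/`getReadExpressions` logic is documented to perform. The class and method names below are invented for illustration and are not part of Gravitino; the sketch simply mirrors the expansion described in the patch comment (`"a/b/c"` expands to `"a"`, `"a/"`, `"a/b"`, `"a/b/"`, `"a/b/c"`) and the expectations exercised by `TestGCSTokenProvider`.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only; not Gravitino code. Shows the prefix-expansion idea used to grant
// read access on every parent path of a fileset location when building downscoped GCS credentials.
public class GcsPrefixExpansionSketch {

  // For "a/b/c", return ["a", "a/", "a/b", "a/b/", "a/b/c"]: every parent object name,
  // with and without a trailing slash, plus the resource itself.
  static List<String> expandResourcePrefixes(String resourcePath) {
    if (resourcePath.endsWith("/")) {
      resourcePath = resourcePath.substring(0, resourcePath.length() - 1);
    }
    if (resourcePath.isEmpty()) {
      return Arrays.asList("");
    }
    String[] parts = resourcePath.split("/");
    List<String> results = new ArrayList<>();
    StringBuilder parent = new StringBuilder();
    for (int i = 0; i < parts.length - 1; i++) {
      results.add(parent + parts[i]);                // e.g. "a", then "a/b"
      parent.append(parts[i]).append("/");
      results.add(parent.toString());                // e.g. "a/", then "a/b/"
    }
    results.add(parent + parts[parts.length - 1]);   // the resource itself, e.g. "a/b/c"
    return results;
  }

  // Turn the expanded names into CEL-style expressions of the shape used by GCS downscoped
  // credentials: a startsWith() rule for the resource subtree plus equality rules for each parent.
  static List<String> buildReadExpressions(String bucket, String resourcePath) {
    List<String> expressions = new ArrayList<>();
    expressions.add(
        String.format(
            "resource.name.startsWith('projects/_/buckets/%s/objects/%s')", bucket, resourcePath));
    for (String parent : expandResourcePrefixes(resourcePath)) {
      expressions.add(
          String.format("resource.name == 'projects/_/buckets/%s/objects/%s'", bucket, parent));
    }
    return expressions;
  }

  public static void main(String[] args) {
    System.out.println(expandResourcePrefixes("a/b/c")); // [a, a/, a/b, a/b/, a/b/c]
    buildReadExpressions("demo-bucket", "a/b/").forEach(System.out::println);
  }
}
```

Running `main` prints the expanded names and the corresponding read expressions for a hypothetical `demo-bucket`, which is the same shape of output the patch feeds into the credential access boundary rules.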