[FLINK-36761][docs] Add document about how to use SQL Gateway and SQL Client to deploy script
fsk119 committed Dec 11, 2024
1 parent 930b2b5 commit 3e09814
Showing 10 changed files with 1,540 additions and 18 deletions.
28 changes: 28 additions & 0 deletions docs/content.zh/docs/dev/table/sql-gateway/overview.md
@@ -116,6 +116,34 @@ $ curl --request GET http://localhost:8083/v1/sessions/${sessionHandle}/operatio

When the `nextResultUri` in the result is not null, it is used to fetch the next batch of results.

### Deploy Script

SQL Gateway supports deploying a script in [Application Mode]({{< ref "docs/deployment/overview" >}}). In application mode, the JobMaster is responsible for compiling the script.
If you want to use custom resources in the script, e.g. a Kafka source, use the [ADD JAR]({{< ref "docs/dev/table/sql/jar">}}) command to add the required dependencies.

Here is an example of deploying a script to a Flink native Kubernetes cluster with cluster id `CLUSTER_ID`.

```bash
$ curl --request POST http://localhost:8083/sessions/${SESSION_HANDLE}/scripts \
--header 'Content-Type: application/json' \
--data-raw '{
"script": "CREATE TEMPORARY TABLE sink(a INT) WITH ( '\''connector'\'' = '\''blackhole'\''); INSERT INTO sink VALUES (1), (2), (3);",
"executionConfig": {
"execution.target": "kubernetes-application",
"kubernetes.cluster-id": "'${CLUSTER_ID}'",
"kubernetes.container.image.ref": "'${FLINK_IMAGE_NAME}'",
"jobmanager.memory.process.size": "1000m",
"taskmanager.memory.process.size": "1000m",
"kubernetes.jobmanager.cpu": 0.5,
"kubernetes.taskmanager.cpu": 0.5,
"kubernetes.rest-service.exposed.type": "NodePort"
}
}'
```
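The `${SESSION_HANDLE}` used above is obtained by first opening a session. A minimal sketch follows; the actual request is left as a comment, and a hard-coded sample response (the handle value is made up) is parsed instead so the snippet runs without a live gateway:

```bash
# Opening a session against a running SQL Gateway would look like:
#   SESSION_RESPONSE=$(curl -s --request POST http://localhost:8083/v1/sessions)
# The response is a small JSON object; a sample is hard-coded here so the
# parsing step can be shown without a live gateway:
SESSION_RESPONSE='{"sessionHandle": "b8a3b934-d35b-4e12-92ef-1bd3a4d7a1a2"}'

# Extract the handle for use in later requests such as the /scripts call above:
SESSION_HANDLE=$(echo "$SESSION_RESPONSE" | python3 -c 'import json, sys; print(json.load(sys.stdin)["sessionHandle"])')
echo "$SESSION_HANDLE"
```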

<span class="label label-info">Note</span> If you want to run a script with PyFlink, use an image with PyFlink installed. Refer to
[Enabling PyFlink in docker]({{< ref "docs/deployment/resource-providers/standalone/docker" >}}#enabling-python) for more details.

```bash
$ curl --request GET ${nextResultUri}
```
17 changes: 12 additions & 5 deletions docs/content.zh/docs/dev/table/sql-gateway/rest.md
@@ -93,11 +93,13 @@ REST API

The available OpenAPI specification is as follows. The default version is v3.

| Version | Description |
| ----------- |-------------------------------|
| [Open API v1 specification]({{< ref_static "generated/rest_v1_sql_gateway.yml" >}}) | Allows users to submit SQL statements to the gateway and execute them. |
| [Open API v2 specification]({{< ref_static "generated/rest_v2_sql_gateway.yml" >}}) | Supports SQL Client connecting to the gateway. |
| [Open API v3 specification]({{< ref_static "generated/rest_v3_sql_gateway.yml" >}}) | Supports Materialized Table refresh operations. |
| [Open API v4 specification]({{< ref_static "generated/rest_v4_sql_gateway.yml" >}}) | Supports deploying SQL scripts in application mode. |


{{< hint warning >}}
The OpenAPI specification is still experimental.
@@ -106,6 +108,11 @@
#### API reference

{{< tabs "f00ed142-b05f-44f0-bafc-799080c1d40d" >}}
{{< tab "v4" >}}

{{< generated/rest_v4_sql_gateway >}}

{{< /tab >}}
{{< tab "v3" >}}

{{< generated/rest_v3_sql_gateway >}}
20 changes: 19 additions & 1 deletion docs/content.zh/docs/dev/table/sqlClient.md
@@ -610,7 +610,7 @@ SQL Client will print a success message if the statement is executed successfully.
By default, the error message only contains the error cause. To print the full exception stack for debugging, set
`sql-client.verbose` to true via the command `SET 'sql-client.verbose' = 'true';`.
### Execute SQL Files in a Session Cluster
SQL Client supports executing a SQL script file with the `-f` option. It executes
the statements in the script file one by one and prints an execution message for each executed statement.
@@ -663,6 +663,24 @@ This configuration:
<span class="label label-danger">Attention</span> Compared to the interactive mode, SQL Client stops execution and exits when an error occurs.
### Deploy SQL Files in an Application Cluster
SQL Client also supports deploying a SQL script file to an application cluster with the `-f` option when the deployment target is specified in `config.yaml` or via startup options.
Here is an example of deploying a script file to an application cluster.
```bash
./bin/sql-client.sh -f oss://path/to/script.sql \
-Dexecution.target=kubernetes-application \
-Dkubernetes.cluster-id=${CLUSTER_ID} \
-Dkubernetes.container.image.ref=${FLINK_IMAGE_NAME}
```
After execution, SQL Client prints the cluster id on the terminal. The script can contain any statement supported by Flink, but an application cluster runs only a single job; please refer to
[Application Mode]({{< ref "docs/deployment/overview#application-mode" >}}) for the limitations.
<span class="label label-danger">Attention</span> When deploying a script to the cluster, SQL Client only supports running with the `--jars` startup option; other options, e.g. `--init`,
are not supported.
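As a sketch of combining the two options, the following composes a deploy command that also ships a UDF jar; all paths and values are placeholders for your environment, and the command is only printed here rather than executed:

```bash
# Sketch: deploy a script together with an extra jar via --jars.
# All paths and values below are placeholders, not defaults.
CLUSTER_ID=my-sql-application
FLINK_IMAGE_NAME=flink-with-udf:latest

CMD="./bin/sql-client.sh -f /path/to/script.sql \
  --jars /path/to/udf.jar \
  -Dexecution.target=kubernetes-application \
  -Dkubernetes.cluster-id=${CLUSTER_ID} \
  -Dkubernetes.container.image.ref=${FLINK_IMAGE_NAME}"

# Print the fully composed command instead of running it here:
echo "$CMD"
```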
### Execute a set of SQL statements
SQL Client executes each INSERT INTO statement as a single Flink job. However, this is sometimes not
28 changes: 28 additions & 0 deletions docs/content/docs/dev/table/sql-gateway/overview.md
@@ -122,6 +122,34 @@ The `nextResultUri` in the results is used to fetch the next batch of results if it
$ curl --request GET ${nextResultUri}
```

### Deploy Script

SQL Gateway supports deploying a script in [Application Mode]({{< ref "docs/deployment/overview" >}}). In application mode, the JobMaster is responsible for compiling the script.
If you want to use custom resources in the script, e.g. a Kafka source, use the [ADD JAR]({{< ref "docs/dev/table/sql/jar">}}) command to add the required dependencies.

Here is an example of deploying a script to a Flink native Kubernetes cluster with cluster id `CLUSTER_ID`.

```bash
$ curl --request POST http://localhost:8083/sessions/${SESSION_HANDLE}/scripts \
--header 'Content-Type: application/json' \
--data-raw '{
"script": "CREATE TEMPORARY TABLE sink(a INT) WITH ( '\''connector'\'' = '\''blackhole'\''); INSERT INTO sink VALUES (1), (2), (3);",
"executionConfig": {
"execution.target": "kubernetes-application",
"kubernetes.cluster-id": "'${CLUSTER_ID}'",
"kubernetes.container.image.ref": "'${FLINK_IMAGE_NAME}'",
"jobmanager.memory.process.size": "1000m",
"taskmanager.memory.process.size": "1000m",
"kubernetes.jobmanager.cpu": 0.5,
"kubernetes.taskmanager.cpu": 0.5,
"kubernetes.rest-service.exposed.type": "NodePort"
}
}'
```
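Following the note about custom resources, a script can begin with `ADD JAR` before the statements that need it. Below is a sketch of such a request payload; the jar path and cluster id are hypothetical placeholders, the payload is only validated locally, and the actual submission is left as a comment:

```bash
# Deploy payload whose script first registers a connector jar via ADD JAR.
# The jar path and cluster id below are placeholders, not defaults.
PAYLOAD=$(cat <<'EOF'
{
  "script": "ADD JAR '/path/to/flink-sql-connector-kafka.jar'; CREATE TEMPORARY TABLE sink(a INT) WITH ('connector' = 'blackhole'); INSERT INTO sink VALUES (1);",
  "executionConfig": {
    "execution.target": "kubernetes-application",
    "kubernetes.cluster-id": "my-cluster"
  }
}
EOF
)

# Check locally that the payload is well-formed JSON before submitting it:
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"

# Submission (requires a running SQL Gateway and an open session):
# curl --request POST http://localhost:8083/sessions/${SESSION_HANDLE}/scripts \
#   --header 'Content-Type: application/json' --data-raw "$PAYLOAD"
```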

<span class="label label-info">Note</span> If you want to run a script with PyFlink, use an image with PyFlink installed. Refer to
[Enabling PyFlink in docker]({{< ref "docs/deployment/resource-providers/standalone/docker" >}}#enabling-python) for more details.

Configuration
----------------

14 changes: 10 additions & 4 deletions docs/content/docs/dev/table/sql-gateway/rest.md
@@ -96,11 +96,12 @@ REST API

The available OpenAPI specification is as follows. The default version is v3.

| Version | Description |
|-------------------------------------------------------------------------------------|--------------------------------------------------------------|
| [Open API v1 specification]({{< ref_static "generated/rest_v1_sql_gateway.yml" >}}) | Allows users to submit statements to the gateway and execute them. |
| [Open API v2 specification]({{< ref_static "generated/rest_v2_sql_gateway.yml" >}}) | Supports SQL Client connecting to the gateway.               |
| [Open API v3 specification]({{< ref_static "generated/rest_v3_sql_gateway.yml" >}}) | Supports Materialized Table refresh operations.              |
| [Open API v4 specification]({{< ref_static "generated/rest_v4_sql_gateway.yml" >}}) | Supports deploying scripts in application mode.              |

{{< hint warning >}}
The OpenAPI specification is still experimental.
@@ -109,6 +110,11 @@
#### API reference

{{< tabs "f00ed142-b05f-44f0-bafc-799080c1d40d" >}}
{{< tab "v4" >}}

{{< generated/rest_v4_sql_gateway >}}

{{< /tab >}}
{{< tab "v3" >}}

{{< generated/rest_v3_sql_gateway >}}
20 changes: 19 additions & 1 deletion docs/content/docs/dev/table/sqlClient.md
@@ -548,7 +548,7 @@ SQL Client will print a success message if the statement is executed successfully.
By default, the error message only contains the error cause. To print the full exception stack for debugging, set
`sql-client.verbose` to true via the command `SET 'sql-client.verbose' = 'true';`.
### Execute SQL Files in a Session Cluster
SQL Client supports executing a SQL script file with the `-f` option. It executes
the statements in the script file one by one and prints an execution message for each executed statement.
@@ -601,6 +601,24 @@ This configuration:
<span class="label label-danger">Attention</span> Compared to the interactive mode, SQL Client stops execution and exits when an error occurs.
### Deploy SQL Files in an Application Cluster
SQL Client also supports deploying a SQL script file to an application cluster with the `-f` option when the deployment target is specified in `config.yaml` or via startup options.
Here is an example of deploying a script file to an application cluster.
```bash
./bin/sql-client.sh -f oss://path/to/script.sql \
-Dexecution.target=kubernetes-application \
-Dkubernetes.cluster-id=${CLUSTER_ID} \
-Dkubernetes.container.image.ref=${FLINK_IMAGE_NAME}
```
After execution, SQL Client prints the cluster id on the terminal. The script can contain any statement supported by Flink, but an application cluster runs only a single job; please refer to
[Application Mode]({{< ref "docs/deployment/overview#application-mode" >}}) for the limitations.
<span class="label label-danger">Attention</span> When deploying a script to the cluster, SQL Client only supports running with the `--jars` startup option; other options, e.g. `--init`,
are not supported.
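As a sketch, the deployed file is just a plain sequence of SQL statements whose final INSERT becomes the application cluster's single job. The example below writes one to a temporary file and leaves the deploy command itself as a comment, since it needs a configured Kubernetes environment:

```bash
# Write a minimal deployable script: setup statements followed by the one
# INSERT that becomes the application cluster's single job.
SCRIPT_FILE=$(mktemp)
cat > "$SCRIPT_FILE" <<'EOF'
CREATE TEMPORARY TABLE sink (a INT) WITH ('connector' = 'blackhole');
INSERT INTO sink VALUES (1), (2), (3);
EOF

# Deploying it (requires a configured Kubernetes environment):
# ./bin/sql-client.sh -f "$SCRIPT_FILE" \
#   -Dexecution.target=kubernetes-application \
#   -Dkubernetes.cluster-id=${CLUSTER_ID} \
#   -Dkubernetes.container.image.ref=${FLINK_IMAGE_NAME}
echo "wrote $SCRIPT_FILE"
```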
### Execute a set of SQL statements
SQL Client executes each INSERT INTO statement as a single Flink job. However, this is sometimes not