[Feature-16699][Task Plugin] Flink submit option parameter supports placeholder substitution #16703

Open. Wants to merge 7 commits into base: dev.

Changes from 5 commits
4 changes: 2 additions & 2 deletions docs/docs/en/guide/task/flink.md
@@ -20,7 +20,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:

- Please refer to [DolphinScheduler Task Parameters Appendix](appendix.md) `Default Task Parameters` section for default parameters.

| **Parameter** | **Description** |
|-------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Program type | Support Java, Scala, Python and SQL four languages. |
| Class of main function | The **full path** of Main Class, the entry point of the Flink program. |
@@ -37,7 +37,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:
| Parallelism | Used to set the degree of parallelism for executing Flink tasks. |
| Yarn queue | Used to set the yarn queue, use `default` queue by default. |
| Main program parameters | Set the input parameters for the Flink program and support the substitution of custom parameter variables. |
| Optional parameters | Set the flink command options, such as `-D`, `-C`, `-yt`. |
| Optional parameters | Set the flink command options, such as `-D`, `-C`, `-yt`, and support the substitution of custom parameter variables. |
Member:

Please add some examples of how to use it.

Author:

OK, I have added examples.

| Custom parameter | It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script. |
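
For example (mirroring the unit tests in this PR): with a custom parameter `job-name` whose value is `demo-app-name`, an Optional parameters value of `-Dyarn.application.name=${job-name}` is resolved to `-Dyarn.application.name=demo-app-name` before the `flink` command is assembled.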

## Task Example
4 changes: 2 additions & 2 deletions docs/docs/zh/guide/task/flink.md
@@ -20,7 +20,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:

- Please refer to the [DolphinScheduler Task Parameters Appendix](appendix.md) `Default Task Parameters` section for default parameters.

| **Task Parameters** | **Description** |
|-----------------|------------------|
| Program type | Supports four languages: Java, Scala, Python and SQL |
| Class of main function | The **full path** of the Main Class, the entry point of the Flink program |
@@ -37,7 +37,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:
| Parallelism | Used to set the degree of parallelism for executing Flink tasks |
| Yarn queue | Used to set the Yarn queue; uses the `default` queue by default |
| Main program parameters | Set the input parameters of the Flink program; supports substitution of custom parameter variables |
| Optional parameters | Set the Flink command options, such as `-D`, `-C`, `-yt` |
| Optional parameters | Set the Flink command options, such as `-D`, `-C`, `-yt`; supports substitution of custom parameter variables |
| Custom parameter | A local user-defined parameter for Flink; replaces `${variable}` content in the script |

## Task Example
FlinkArgsUtils.java
@@ -112,7 +112,8 @@ private static List<String> buildRunCommandLineForSql(TaskExecutionContext taskE

String others = flinkParameters.getOthers();
if (StringUtils.isNotEmpty(others)) {
args.add(others);
Map<String, Property> paramsMap = taskExecutionContext.getPrepareParamsMap();
args.add(ParameterUtils.convertParameterPlaceholders(others, ParameterUtils.convert(paramsMap)));
}
return args;
}
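
Conceptually, the substitution replaces every `${name}` token in the option string with the value of the matching prepared parameter; `ParameterUtils.convert(paramsMap)` presumably flattens the prepared `Property` map into the name/value map consumed by `convertParameterPlaceholders`, and the same substitution is applied to `others` in `buildRunCommandLineForOthers` below. The following is a minimal standalone sketch of that behavior (the class name and regex are illustrative, not DolphinScheduler's actual `ParameterUtils` code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Simplified stand-in for the placeholder substitution: every ${name} token in
 * the option string is replaced with the value of the matching prepared
 * parameter; tokens with no matching parameter are left untouched.
 */
public class PlaceholderSubstitutionSketch {

    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)}");

    static String substitute(String others, Map<String, String> params) {
        Matcher matcher = PLACEHOLDER.matcher(others);
        StringBuffer resolved = new StringBuffer();
        while (matcher.find()) {
            // Fall back to the original ${...} token when the parameter is undefined.
            String value = params.getOrDefault(matcher.group(1), matcher.group(0));
            matcher.appendReplacement(resolved, Matcher.quoteReplacement(value));
        }
        matcher.appendTail(resolved);
        return resolved.toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new HashMap<>();
        params.put("job-name", "demo-app-name");
        // Prints: -Dyarn.application.name=demo-app-name
        System.out.println(substitute("-Dyarn.application.name=${job-name}", params));
    }
}
```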
@@ -271,7 +272,8 @@ private static List<String> buildRunCommandLineForOthers(TaskExecutionContext ta

// -s -yqu -yat -yD -D
if (StringUtils.isNotEmpty(others)) {
args.add(others);
Map<String, Property> paramsMap = taskExecutionContext.getPrepareParamsMap();
args.add(ParameterUtils.convertParameterPlaceholders(others, ParameterUtils.convert(paramsMap)));
}

// determine yarn queue
FlinkArgsUtilsTest.java
@@ -18,10 +18,14 @@
package org.apache.dolphinscheduler.plugin.task.flink;

import org.apache.dolphinscheduler.plugin.task.api.TaskExecutionContext;
import org.apache.dolphinscheduler.plugin.task.api.enums.DataType;
import org.apache.dolphinscheduler.plugin.task.api.model.Property;
import org.apache.dolphinscheduler.plugin.task.api.model.ResourceInfo;
import org.apache.dolphinscheduler.plugin.task.api.resource.ResourceContext;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
@@ -45,6 +49,7 @@ private FlinkParameters buildTestFlinkParametersWithDeployMode(FlinkDeployMode f
flinkParameters.setAppName("demo-app-name");
flinkParameters.setJobManagerMemory("1024m");
flinkParameters.setTaskManagerMemory("1024m");
flinkParameters.setOthers("-Dyarn.application.name=${job-name}");

return flinkParameters;
}
@@ -60,6 +65,12 @@ private TaskExecutionContext buildTestTaskExecutionContext() {
ResourceContext resourceContext = new ResourceContext();
resourceContext.addResourceItem(resourceItem);
taskExecutionContext.setResourceContext(resourceContext);

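// Prepared parameter that the ${job-name} placeholder in the submit options resolves to.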
Map<String, Property> parameters = new HashMap<>();
parameters.put("job-name",
Property.builder().type(DataType.VARCHAR).prop("job-name").value("demo-app-name").build());
taskExecutionContext.setPrepareParamsMap(parameters);

return taskExecutionContext;
}

@@ -69,7 +80,7 @@ public void testRunJarInApplicationMode() throws Exception {
List<String> commandLine = FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/flink run-application -t yarn-application -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -c org.example.Main /opt/job.jar",
"${FLINK_HOME}/bin/flink run-application -t yarn-application -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -Dyarn.application.name=demo-app-name -c org.example.Main /opt/job.jar",
joinStringListWithSpace(commandLine));
}

@@ -81,23 +92,23 @@ public void testRunJarInClusterMode() {
FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/flink run -m yarn-cluster -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -c org.example.Main /opt/job.jar",
"${FLINK_HOME}/bin/flink run -m yarn-cluster -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -Dyarn.application.name=demo-app-name -c org.example.Main /opt/job.jar",
joinStringListWithSpace(commandLine1));

flinkParameters.setFlinkVersion("<1.10");
List<String> commandLine2 =
FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/flink run -m yarn-cluster -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -c org.example.Main /opt/job.jar",
"${FLINK_HOME}/bin/flink run -m yarn-cluster -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -Dyarn.application.name=demo-app-name -c org.example.Main /opt/job.jar",
joinStringListWithSpace(commandLine2));

flinkParameters.setFlinkVersion(">=1.12");
List<String> commandLine3 =
FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/flink run -t yarn-per-job -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -c org.example.Main /opt/job.jar",
"${FLINK_HOME}/bin/flink run -t yarn-per-job -ys 4 -ynm demo-app-name -yjm 1024m -ytm 1024m -p 4 -sae -Dyarn.application.name=demo-app-name -c org.example.Main /opt/job.jar",
joinStringListWithSpace(commandLine3));
}

@@ -107,7 +118,7 @@ public void testRunJarInLocalMode() {
List<String> commandLine = FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/flink run -p 4 -sae -c org.example.Main /opt/job.jar",
"${FLINK_HOME}/bin/flink run -p 4 -sae -Dyarn.application.name=demo-app-name -c org.example.Main /opt/job.jar",
joinStringListWithSpace(commandLine));
}

@@ -118,7 +129,7 @@ public void testRunSql() {
List<String> commandLine = FlinkArgsUtils.buildRunCommandLine(buildTestTaskExecutionContext(), flinkParameters);

Assertions.assertEquals(
"${FLINK_HOME}/bin/sql-client.sh -i /tmp/execution/app-id_init.sql -f /tmp/execution/app-id_node.sql",
"${FLINK_HOME}/bin/sql-client.sh -i /tmp/execution/app-id_init.sql -f /tmp/execution/app-id_node.sql -Dyarn.application.name=demo-app-name",
joinStringListWithSpace(commandLine));
}
