Commit

Merge branch 'dev' into fix-workflow-instance-delete
fuchanghai authored Jan 17, 2024
2 parents 4a780f3 + e4de06b commit f1b0493
Showing 5 changed files with 14 additions and 4 deletions.
2 changes: 1 addition & 1 deletion docs/docs/en/guide/task/flink.md
@@ -37,7 +37,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:
| Parallelism | Used to set the degree of parallelism for executing Flink tasks. |
| Yarn queue | Used to set the yarn queue, use `default` queue by default. |
| Main program parameters | Set the input parameters for the Flink program and support the substitution of custom parameter variables. |
-| Optional parameters | Support `--jar`, `--files`,` --archives`, `--conf` format. |
+| Optional parameters | Set the Flink command options, such as `-D`, `-C`, `-yt` (see the sketch after this table). |
| Custom parameter | It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script. |
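To make the "Optional parameters" row concrete, here is a minimal, hypothetical Java sketch of how such an options string could be tokenized onto a `flink run` command line. None of these names come from the DolphinScheduler codebase, and the whitespace split is a simplification; option values containing quoted spaces would need a proper shell-style tokenizer.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch only: class and method names are illustrative.
public class FlinkOptionsSketch {

    // Splice the "Optional parameters" and "Main program parameters" fields into a flink run command.
    static List<String> buildCommand(String optionalParams, String mainJar, String mainArgs) {
        List<String> cmd = new ArrayList<>(Arrays.asList("flink", "run"));
        if (optionalParams != null && !optionalParams.isBlank()) {
            cmd.addAll(Arrays.asList(optionalParams.trim().split("\\s+")));
        }
        cmd.add(mainJar);
        if (mainArgs != null && !mainArgs.isBlank()) {
            cmd.addAll(Arrays.asList(mainArgs.trim().split("\\s+")));
        }
        return cmd;
    }

    public static void main(String[] args) {
        // Options of the kind named in the table above.
        System.out.println(buildCommand(
                "-D taskmanager.memory.process.size=2g -yt /opt/flink/plugins",
                "wordcount.jar",
                "--input /tmp/in.txt --output /tmp/out.txt"));
    }
}
```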

## Task Example
2 changes: 1 addition & 1 deletion docs/docs/en/guide/task/spark.md
@@ -35,7 +35,7 @@ Spark task type for executing Spark application. When executing the Spark task,
| Executor memory size | Set the size of Executor memories, which can be set according to the actual production environment. |
| Yarn queue | Set the yarn queue, use `default` queue by default. |
| Main program parameters | Set the input parameters of the Spark program and support the substitution of custom parameter variables. |
-| Optional parameters | Support `--jars`, `--files`,` --archives`, `--conf` format. |
+| Optional parameters | Set the Spark command options, such as `--jars`, `--files`, `--archives`, `--conf` (see the sketch after this table). |
| Resource | Appoint resource files in the `Resource` if parameters refer to them. |
| Custom parameter | It is a local user-defined parameter for Spark, and will replace the content with `${variable}` in the script. |
| Predecessor task | Selecting a predecessor task for the current task will set the selected predecessor task as upstream of the current task. |
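For the Spark table, a hypothetical Java sketch of how these fields could end up on a `spark-submit` invocation; it assumes `spark-submit` is on the PATH, and all paths, jars, and argument values are made up for illustration.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch only: not the DolphinScheduler implementation.
public class SparkSubmitSketch {

    public static void main(String[] args) throws IOException, InterruptedException {
        List<String> cmd = new ArrayList<>(List.of("spark-submit",
                "--master", "yarn",
                "--queue", "default"));                        // the "Yarn queue" field
        // The "Optional parameters" field: extra jars and runtime conf.
        cmd.addAll(List.of("--jars", "/opt/libs/mysql-connector.jar",
                "--conf", "spark.executor.memoryOverhead=512m"));
        cmd.add("/opt/jobs/etl.jar");                          // the main program resource
        cmd.addAll(List.of("2024-01-17", "full"));             // "Main program parameters"
        Process process = new ProcessBuilder(cmd).inheritIO().start();
        System.exit(process.waitFor());
    }
}
```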
2 changes: 1 addition & 1 deletion docs/docs/zh/guide/task/flink.md
@@ -37,7 +37,7 @@ Flink task type, used to execute Flink programs. For Flink nodes:
| Parallelism | Used to set the degree of parallelism for executing Flink tasks |
| Yarn queue | Used to set the Yarn queue; the default queue is used by default |
| Main program parameters | Set the input parameters of the Flink program; supports substitution of custom parameter variables |
-| Optional parameters | Supports the `--jar`, `--files`, `--archives`, `--conf` format |
+| Optional parameters | Set the Flink command options, such as `-D`, `-C`, `-yt` |
| Custom parameters | Local user-defined parameters for Flink, which will replace the content written as ${variable} in the script |

## Task Example
2 changes: 1 addition & 1 deletion docs/docs/zh/guide/task/spark.md
@@ -34,7 +34,7 @@ Spark task type, used to execute Spark applications. For Spark nodes, the worker supports
- Executor memory size: used to set the Executor memory size, which can be set according to the actual production environment.
- Yarn queue: used to set the Yarn queue; the default queue is used by default.
- Main program parameters: set the input parameters of the Spark program; supports substitution of custom parameter variables.
-- Optional parameters: supports the `--jars`, `--files`, `--archives`, `--conf` format
+- Optional parameters: set the Spark command options, such as `--jars`, `--files`, `--archives`, `--conf`
- Resource: if other parameters reference resource files, the files must be selected in Resource.
- Custom parameters: local user-defined parameters for Spark, which will replace the content written as ${variable} in the script.

Expand Up @@ -490,6 +490,16 @@ public void releaseTaskGroup(TaskInstance taskInstance) {
} else {
ProcessInstance processInstance =
processService.findProcessInstanceById(nextTaskInstance.getProcessInstanceId());
if (processInstance == null) {
log.error("WorkflowInstance is null cannot wakeup, processInstanceId:{}",
nextTaskInstance.getProcessInstanceId());
return;
}
if (processInstance.getHost() == null || Constants.NULL.equals(processInstance.getHost())) {
log.warn("The next WorkflowInstance: {} host is null no need to wakeup, maybe it is in failover",
processInstance);
return;
}
ILogicTaskInstanceOperator taskInstanceOperator = SingletonJdkDynamicRpcClientProxyFactory
.getProxyClient(processInstance.getHost(), ILogicTaskInstanceOperator.class);
taskInstanceOperator.wakeupTaskInstance(
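The substance of the fix: before sending the wakeup RPC for the next task instance, the master now verifies that the workflow instance still exists (it may have been deleted, per the branch name `fix-workflow-instance-delete`) and that it has an owning host. A minimal, self-contained restatement of that guard follows; the types and the "NULL" host sentinel are hypothetical stand-ins, not DolphinScheduler APIs.

```java
// Hypothetical restatement of the two guards above, with stand-in types.
public final class WakeupGuard {

    static final String NULL_HOST = "NULL"; // stand-in for Constants.NULL

    record WorkflowInstanceView(String host) {}

    // Returns the host to send the wakeup RPC to, or null when the wakeup must be skipped.
    static String resolveWakeupHost(WorkflowInstanceView instance) {
        if (instance == null) {
            return null; // workflow instance was deleted: nothing left to wake up
        }
        if (instance.host() == null || NULL_HOST.equals(instance.host())) {
            return null; // no owning master yet (e.g. failover in progress): no RPC endpoint
        }
        return instance.host();
    }

    public static void main(String[] args) {
        System.out.println(resolveWakeupHost(null));                                  // null
        System.out.println(resolveWakeupHost(new WorkflowInstanceView("NULL")));      // null
        System.out.println(resolveWakeupHost(new WorkflowInstanceView("10.0.0.5:5678"))); // host
    }
}
```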
