
Change the spark version to 3.1.3 (#743)
Signed-off-by: Chen Jing <[email protected]>
JingChen23 authored Sep 1, 2022
1 parent a7d8839 commit f5af31a
Showing 6 changed files with 8 additions and 8 deletions.
@@ -62,7 +62,7 @@ services:
- 9380:9380
- 9360:9360
volumes:
- - ./confs/spark/spark-defaults.conf:/data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults.conf
+ - ./confs/spark/spark-defaults.conf:/data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults.conf
- ./confs/fate_flow/conf/service_conf.yaml:/data/projects/fate/conf/service_conf.yaml
- ./confs/fate_flow/conf/pulsar_route_table.yaml:/data/projects/fate/conf/pulsar_route_table.yaml
- ./confs/fate_flow/conf/rabbitmq_route_table.yaml:/data/projects/fate/conf/rabbitmq_route_table.yaml
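For orientation, the spark-defaults.conf mounted at this path typically carries the connection and resource properties for the Spark client. A minimal sketch, assuming a standalone master named spark-master and illustrative memory settings (none of these values come from this commit; only the fateflow host name is implied by the sed rewrite later in this diff):

    spark.master           spark://spark-master:7077
    spark.driver.host      fateflow
    spark.driver.memory    4g
    spark.executor.memory  4g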
2 changes: 1 addition & 1 deletion docker-deploy/training_template/docker-compose-spark.yml
@@ -61,7 +61,7 @@ services:
- 9380:9380
- 9360:9360
volumes:
- - ./confs/spark/spark-defaults.conf:/data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults.conf
+ - ./confs/spark/spark-defaults.conf:/data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults.conf
- ./confs/fate_flow/conf/service_conf.yaml:/data/projects/fate/conf/service_conf.yaml
- ./confs/fate_flow/conf/pulsar_route_table.yaml:/data/projects/fate/conf/pulsar_route_table.yaml
- ./confs/fate_flow/conf/rabbitmq_route_table.yaml:/data/projects/fate/conf/rabbitmq_route_table.yaml
@@ -76,7 +76,7 @@ fate_on_eggroll:
fate_on_spark:
spark:
# default use SPARK_HOME environment variable
- home: /data/projects/spark-3.1.2-bin-hadoop3.2/
+ home: /data/projects/spark-3.1.3-bin-hadoop3.2/
cores_per_node: 20
nodes: 2
linkis_spark:
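After pointing `home` at the new directory, a quick sanity check that a 3.1.3 distribution is actually installed there (assuming the standard layout of the official binary release):

    /data/projects/spark-3.1.3-bin-hadoop3.2/bin/spark-submit --version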
2 changes: 1 addition & 1 deletion docs/FATE_On_Spark.md
@@ -11,7 +11,7 @@ As the above figure show, the EggRoll provide both computing and storage resourc

Since FATE v1.5.0, a user can select Spark as the underlying computing engine; however, Spark itself is an in-memory computing engine without data persistence, so HDFS also needs to be deployed to provide it. For example, a user needs to upload their data to HDFS through FATE before running any training job, and the output data of each component will also be stored in HDFS.

- **Currently the verified Spark version is [3.1.2](https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz) and the Hadoop version is [3.2.1](https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz)**
+ **Currently the verified Spark version is [3.1.3](https://archive.apache.org/dist/spark/spark-3.1.3/spark-3.1.3-bin-hadoop3.2.tgz) and the Hadoop version is [3.2.1](https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz)**

The following picture shows the architecture of FATE on Spark:
<div align="center">
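For a deployment that follows the paths used in this commit, the verified releases can be fetched and unpacked under /data/projects/; a sketch assuming wget is available and /data/projects/ is writable:

    wget https://archive.apache.org/dist/spark/spark-3.1.3/spark-3.1.3-bin-hadoop3.2.tgz
    wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz
    tar -xzf spark-3.1.3-bin-hadoop3.2.tgz -C /data/projects/
    tar -xzf hadoop-3.2.1.tar.gz -C /data/projects/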
2 changes: 1 addition & 1 deletion helm-charts/FATE/templates/core/fateflow/configmap.yaml
@@ -128,7 +128,7 @@ data:
fate_on_spark:
spark:
# default use SPARK_HOME environment variable
- home: /data/projects/spark-3.1.2-bin-hadoop3.2/
+ home: /data/projects/spark-3.1.3-bin-hadoop3.2/
cores_per_node: {{ .Values.modules.python.spark.cores_per_node }}
nodes: {{ .Values.modules.python.spark.nodes }}
linkis_spark:
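The two templated fields above resolve from the chart's values file. A sketch of the matching entries, using the defaults that appear in service_conf.yaml earlier in this commit (actual deployments may differ):

    modules:
      python:
        spark:
          cores_per_node: 20
          nodes: 2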
6 changes: 3 additions & 3 deletions helm-charts/FATE/templates/core/python-spark.yaml
@@ -125,8 +125,8 @@ spec:
# fix fateflow conf must use IP
sed -i "s/host: fateflow/host: ${POD_IP}/g" /data/projects/fate/conf/service_conf.yaml
- cp /data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults-template.conf /data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults.conf
- sed -i "s/fateflow/${POD_IP}/g" /data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults.conf
+ cp /data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults-template.conf /data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults.conf
+ sed -i "s/fateflow/${POD_IP}/g" /data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults.conf
sleep 5 && python fateflow/python/fate_flow/fate_flow_server.py
livenessProbe:
@@ -161,7 +161,7 @@ spec:
subPath: logs
- mountPath: /data/projects/fate/conf-tmp/
name: python-confs
- - mountPath: /data/projects/spark-3.1.2-bin-hadoop3.2/conf/spark-defaults-template.conf
+ - mountPath: /data/projects/spark-3.1.3-bin-hadoop3.2/conf/spark-defaults-template.conf
name: python-confs
subPath: spark-defaults.conf
{{- if eq .Values.federation "RabbitMQ" }}
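The cp/sed pair in the container command above materializes spark-defaults.conf from the mounted template, then swaps the literal host fateflow for the pod's own IP before FATE Flow starts. Assuming the template contains a line such as the following (hypothetical; the template's contents are not part of this diff), the rewrite would look like:

    # in spark-defaults-template.conf (assumed):
    spark.driver.host    fateflow
    # in spark-defaults.conf after sed, with POD_IP=10.42.0.17 (example value):
    spark.driver.host    10.42.0.17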

