[Bug] [task] can't get env when running task #15352
Comments
I have the same issue with version 3.2. My environment configuration contains:

```shell
export SPARK_HOME=${SPARK_HOME:-/data/spark3}
export PYTHON_LAUNCHER=${PYTHON_LAUNCHER:-/usr/local/python3/bin/python3}
export DATAX_LAUNCHER=${DATAX_LAUNCHER:-/usr/local/datax/bin/datax.py}
export PATH=$HADOOP_HOME/bin:$SPARK_HOME/bin:$PYTHON_LAUNCHER:$JAVA_HOME/bin:$HIVE_HOME/bin:$FLINK_HOME/bin:$DATAX_LAUNCHER:$PATH
```

When I echo these variables in a shell task:

```shell
echo " ------shell task 1 ------"
echo "The python launcher program is ${PYTHON_LAUNCHER} "
echo "The spark home is ${SPARK_HOME}"
echo "The datax launcher program is ${DATAX_LAUNCHER}"
```

As shown in the log below, they are all null.
You can add the environment variable parameters through the environment management option in the management interface.
You can also set these environment variables in /etc/profile to resolve it.
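A minimal sketch of that workaround, assuming the same variables as the comment above (the paths are placeholders, not taken from the original report); run it on every worker host and restart the worker so tasks inherit the new environment:

```shell
# Sketch of the /etc/profile workaround; paths below are placeholders,
# adjust them to your own installation.
cat >> /etc/profile <<'EOF'
export SPARK_HOME=/data/spark3
export PYTHON_LAUNCHER=/usr/local/python3/bin/python3
export DATAX_LAUNCHER=/usr/local/datax/bin/datax.py
export PATH=$SPARK_HOME/bin:$PATH
EOF

# Reload the profile in the current shell, then restart the worker so
# newly launched tasks see the variables.
source /etc/profile
```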
Hi all, you need to create the new exports

```shell
export PYTHON_LAUNCHER=/Users/zhongjiajie/.pyenv/shims/python3
export DATAX_LAUNCHER=/Users/zhongjiajie/Documents/dist/datax/bin/datax.py
```

and then use the env in the task. I tested the latest code and it works, so I would like to close this issue; you can reopen it if you have further issues.
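For completeness, a minimal sketch of using those variables from a shell task, assuming they are exported by the environment the worker sources before running the task (e.g. dolphinscheduler_env.sh or an environment created in environment management); the task body itself is illustrative:

```shell
# Illustrative shell task body; assumes PYTHON_LAUNCHER and DATAX_LAUNCHER
# are exported by the environment the worker sources before running the task.
echo "python launcher: ${PYTHON_LAUNCHER:-<not set>}"
echo "datax launcher:  ${DATAX_LAUNCHER:-<not set>}"

# Fail fast if the variables are missing instead of silently using empty paths.
: "${PYTHON_LAUNCHER:?PYTHON_LAUNCHER is not set}"
: "${DATAX_LAUNCHER:?DATAX_LAUNCHER is not set}"

"${PYTHON_LAUNCHER}" --version
```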
Search before asking
What happened
Can't get environment variables when running tasks, such as SPARKSQL, PYTHON, or DATAX, and possibly more.
I set the env info in ./bin/env/dolphinscheduler_env.sh
and also set the user env info:
but I get an error when running a Python task.
Debugging shows that the system env is not set:
and the user env is also not set:
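A quick way to confirm this behaviour (illustrative only; the variable names follow the defaults in bin/env/dolphinscheduler_env.sh and the comments above) is a shell task that prints the environment the worker actually passes to the task:

```shell
# Illustrative diagnostic shell task: print the variables that should have
# been injected from dolphinscheduler_env.sh, then dump the full environment.
echo "SPARK_HOME=${SPARK_HOME:-<not set>}"
echo "PYTHON_LAUNCHER=${PYTHON_LAUNCHER:-<not set>}"
echo "DATAX_LAUNCHER=${DATAX_LAUNCHER:-<not set>}"
env | sort
```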
What you expected to happen
No error; the task should pick up the environment variables configured in dolphinscheduler_env.sh.
How to reproduce
It happens every time a task that relies on these environment variables is run.
Anything else
No response
Version
3.2.x
Are you willing to submit PR?
Code of Conduct