
[Bug] [task] can't get env when running task #15352

Closed · 2 of 3 tasks
wangbowen1024 opened this issue Dec 22, 2023 · 6 comments

Labels: need to verify · priority:high · Waiting for user feedback (Waiting for feedback from issue/PR author)

Comments

@wangbowen1024
Contributor

Search before asking

  • I had searched in the issues and found no similar issues.

What happened

Tasks such as SPARKSQL, PYTHON, and DATAX (possibly more) cannot read environment variables when they run.

I set the environment variables in ./bin/env/dolphinscheduler_env.sh:
(screenshot)
and also set the user environment variables:
(screenshot)
but the Python task fails when it runs.
Debugging shows that the system environment variables are not set:
(screenshots)
and the user environment variables are not set either:
(screenshots)

What you expected to happen

The task should run without errors.

How to reproduce

It happens every time.

Anything else

No response

Version

3.2.x

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@wangbowen1024 wangbowen1024 added bug Something isn't working Waiting for reply Waiting for reply labels Dec 22, 2023
@ruanwenjun ruanwenjun removed the Waiting for reply Waiting for reply label Dec 22, 2023
@ruanwenjun
Member

ruanwenjun commented Dec 22, 2023

Try to import the env in your task.
(screenshot)

@ruanwenjun ruanwenjun added need to verify Waiting for user feedback Waiting for feedback from issue/PR author and removed bug Something isn't working labels Dec 22, 2023
@wangbowen1024
Contributor Author

Try to import the env in your task. (screenshot)

I tried, but it failed. For now I write these environment variables to /etc/profile as a workaround.
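The /etc/profile workaround described above can be sketched as follows. The exports and paths are illustrative assumptions, not taken from this issue, and a temp file stands in for /etc/profile so the sketch is safe to run; on a real worker you would write to /etc/profile itself (as root) and restart the worker.

```shell
# Workaround sketch (assumed paths): append the exports to /etc/profile
# so every login shell -- and therefore every task the worker spawns via
# a login shell -- inherits them.
# PROFILE points at a temp stand-in here; on a real worker it would be
# PROFILE=/etc/profile, written as root.
PROFILE="$(mktemp)"
cat >>"$PROFILE" <<'EOF'
export SPARK_HOME=/opt/spark
export PYTHON_LAUNCHER=/usr/bin/python3
export PATH=$SPARK_HOME/bin:$PATH
EOF
# Source it the way a login shell would.
. "$PROFILE"
echo "SPARK_HOME=$SPARK_HOME"
echo "PYTHON_LAUNCHER=$PYTHON_LAUNCHER"
```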

@cnfree0355
I have the same issue with version 3.2. Here is my environment setup in dolphinscheduler_env.sh:

  export SPARK_HOME=${SPARK_HOME:-/data/spark3}
  export PYTHON_LAUNCHER=${PYTHON_LAUNCHER:-/usr/local/python3/bin/python3}
  export DATAX_LAUNCHER=${DATAX_LAUNCHER:-/usr/local/datax/bin/datax.py}
  export PATH=$HADOOP_HOME/bin:$SPARK_HOME/bin:$PYTHON_LAUNCHER:$JAVA_HOME/bin:$HIVE_HOME/bin:$FLINK_HOME/bin:$DATAX_LAUNCHER:$PATH

When I print these variables from a task:

  echo " ------shell task 1 ------"
  echo "The python launcher program is ${PYTHON_LAUNCHER}"
  echo "The spark home is ${SPARK_HOME}"
  echo "The datax launcher program is ${DATAX_LAUNCHER}"

the log below shows they are all empty:

   ------shell task 1 ------
  The python launcher program is
  The spark home is
  The datax launcher program is
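The empty output is consistent with how `${VAR:-default}` expansion works: the default only applies inside the shell that actually sources dolphinscheduler_env.sh, so if the worker never sources that file before launching the task, the task's shell sees nothing. A minimal sketch of the expansion semantics (the custom path is a made-up example):

```shell
# ${VAR:-default}: the default applies only when VAR is unset or empty,
# and only in the shell performing the expansion.
unset SPARK_HOME
export SPARK_HOME=${SPARK_HOME:-/data/spark3}
echo "$SPARK_HOME"    # prints /data/spark3 (VAR was unset, default used)

SPARK_HOME=/opt/custom-spark   # hypothetical pre-existing value
export SPARK_HOME=${SPARK_HOME:-/data/spark3}
echo "$SPARK_HOME"    # prints /opt/custom-spark (existing value kept)
```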

@cnfree0355

(quoting the previous comment)

I also added the environment variables via the Environment Management option in the management UI, then associated them with the workflow and project runs, but it still doesn't work.

@wangbowen1024
Contributor Author

(quoting the previous comment)

You can set these env vars in /etc/profile to resolve it.

@zhongjiajie zhongjiajie added this to the 3.2.1 milestone Jan 30, 2024
@zhongjiajie
Member

Hi all, you need to create a new env in the Security Center containing both PYTHON_LAUNCHER and DATAX_LAUNCHER:

export PYTHON_LAUNCHER=/Users/zhongjiajie/.pyenv/shims/python3
export DATAX_LAUNCHER=/Users/zhongjiajie/Documents/dist/datax/bin/datax.py

and then use that env in the task. I tested the latest code and it works. I would like to close this issue; you can reopen it if you have further problems.
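As a sketch, the environment defined in the Security Center amounts to a small export script that runs before the task body, so a shell task associated with it sees both variables. The paths are the examples above; the echo check is an assumption about how one might verify the setup, not an official test.

```shell
# Environment script as defined in the Security Center
# (paths are the examples from the comment above).
export PYTHON_LAUNCHER=/Users/zhongjiajie/.pyenv/shims/python3
export DATAX_LAUNCHER=/Users/zhongjiajie/Documents/dist/datax/bin/datax.py

# A shell task associated with this environment can then read both:
echo "The python launcher program is ${PYTHON_LAUNCHER}"
echo "The datax launcher program is ${DATAX_LAUNCHER}"
```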
