This repository has been archived by the owner on Nov 23, 2017. It is now read-only.

"--spark-version" does not work #112

Open
wangshusen opened this issue Aug 10, 2017 · 1 comment
wangshusen commented Aug 10, 2017

If I don't specify "--spark-version", the "/root/spark/" directory contains nothing but "/conf/".

When I specify "--spark-version=2.0.0", "--spark-version=2.0.2", "--spark-version=2.1.0", or "--spark-version=2.2.0", I get an error like

Don't know about Spark version: 2.0.0

and spark-ec2 does not launch EC2 clusters.

But when I specify "--spark-version=1.4.1", it works fine.

I need to use a Spark version of 2.0 or later. Please help!
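The error message suggests the launch script validates the requested version against a hardcoded whitelist, and the master branch's list stops before 2.x. A minimal sketch of that kind of check (the function name and the exact version list here are assumptions for illustration, not the script's actual code):

```shell
#!/bin/sh
# Hypothetical sketch of a version whitelist like the one that prints
# "Don't know about Spark version: ...". Versions listed are examples only.
validate_spark_version() {
  case "$1" in
    1.3.0|1.3.1|1.4.0|1.4.1|1.5.0|1.5.1|1.5.2|1.6.0)
      echo "OK: $1" ;;
    *)
      echo "Don't know about Spark version: $1"
      return 1 ;;
  esac
}

validate_spark_version 1.4.1          # accepted
validate_spark_version 2.0.0 || true  # rejected: not in the whitelist
```

If the check works this way, any 2.x version would be rejected until the whitelist itself is updated, which matches the behavior reported above.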

wangshusen (Author) commented

The issue is solved by cloning the branch-2.0 branch:

git clone -b branch-2.0 https://github.com/amplab/spark-ec2.git
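With that checkout, a launch invocation might look like the following sketch. The key pair, identity file, region, and cluster name are placeholders you must fill in; --key-pair, --identity-file, --region, --slaves, and --spark-version are standard spark-ec2 flags:

```shell
# Placeholders in <...> must be replaced with your own AWS values.
cd spark-ec2
./spark-ec2 \
  --key-pair=<your-key-pair> \
  --identity-file=<path/to/key.pem> \
  --region=us-east-1 \
  --slaves=2 \
  --spark-version=2.0.2 \
  launch <cluster-name>
```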
