With the following diff applied against 8a51ef5, I was able to get Spark installed on the debian/jessie64 Vagrant box:
diff --git a/examples/Vagrantfile b/examples/Vagrantfile
index c309b84..d42cf9a 100644
--- a/examples/Vagrantfile
+++ b/examples/Vagrantfile
@@ -15,7 +15,9 @@ if [ "up", "provision" ].include?(ARGV.first) &&
 end

 Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
-  config.vm.box = "ubuntu/trusty64"
+  config.vm.box = "debian/jessie64"
+
+  config.vm.synced_folder ".", "/vagrant", disabled: true

   config.vm.network "forwarded_port", guest: 4040, host: 4040
diff --git a/examples/site.yml b/examples/site.yml
index 0016d20..a8340b7 100644
--- a/examples/site.yml
+++ b/examples/site.yml
@@ -2,7 +2,8 @@
 - hosts: all
   vars:
-    java_version: "7u51-2.4.*"
+    spark_version: "1.6.0-bin-hadoop2.6"
+    java_version: "7u95-2.6.*"
     spark_env_extras:
       TEST_B: "b"
       TEST_A: "a"
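For anyone reproducing this, a patch like the above can be dry-run-checked with `git apply --check` before re-provisioning the box. A minimal sketch, using a throwaway repo and a stand-in one-hunk patch (the real diff and file contents differ; this only illustrates the workflow):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
mkdir examples
# Stand-in for the original examples/Vagrantfile (not the real file):
cat > examples/Vagrantfile <<'EOF'
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "ubuntu/trusty64"
EOF
git add -A
git -c user.name=t -c user.email=t@example.com commit -qm base
# Stand-in patch, shaped like the diff above:
cat > box.patch <<'EOF'
--- a/examples/Vagrantfile
+++ b/examples/Vagrantfile
@@ -1,2 +1,2 @@
 Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
-  config.vm.box = "ubuntu/trusty64"
+  config.vm.box = "debian/jessie64"
EOF
git apply --check box.patch && echo "patch applies cleanly"
git apply box.patch
grep -q 'debian/jessie64' examples/Vagrantfile && echo "box switched"
```

After applying the real diff against 8a51ef5, `vagrant destroy -f && vagrant up --provision` from `examples/` rebuilds the box from scratch.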
Output of running spark-shell:
vagrant@debian-jessie:~$ mkdir /tmp/spark-events && spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/
Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_95)
Type in expressions to have them evaluated.
Type :help for more information.
16/03/24 19:23:15 WARN Utils: Your hostname, debian-jessie resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
16/03/24 19:23:15 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Spark context available as sc.
16/03/24 19:23:20 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark/lib/datanucleus-api-jdo-3.2.6.jar."
16/03/24 19:23:20 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/03/24 19:23:20 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark/lib/datanucleus-rdbms-3.2.9.jar."
16/03/24 19:23:20 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/24 19:23:21 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/24 19:23:26 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/03/24 19:23:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/03/24 19:23:30 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark/lib/datanucleus-api-jdo-3.2.6.jar."
16/03/24 19:23:30 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/03/24 19:23:30 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/spark-1.6.0-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/spark/lib/datanucleus-rdbms-3.2.9.jar."
16/03/24 19:23:30 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/24 19:23:31 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/24 19:23:37 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/03/24 19:23:37 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.
scala>
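The `Your hostname ... resolves to a loopback address` warning is harmless here, but it can be silenced by binding Spark to the box's eth0 address explicitly, as the log itself suggests. A hedged sketch (10.0.2.15 is the address the warning reports, VirtualBox's usual NAT address; verify with `ip addr show eth0` on your box):

```shell
# Silence the loopback warning for the current shell session:
export SPARK_LOCAL_IP=10.0.2.15
# To persist it, conf/spark-env.sh is the conventional place; the path below
# assumes the install layout shown in the log and may differ on your box:
# echo 'export SPARK_LOCAL_IP=10.0.2.15' | sudo tee -a /usr/lib/spark/conf/spark-env.sh
```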