
./tpch-setup.sh execution failed #37

Open
rc452860 opened this issue Oct 26, 2021 · 4 comments
Comments

@rc452860

I am using tpch-setup.sh on CDH 6.3.2.
The environment variables are below:

XDG_SESSION_ID=104
SPARK_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../spark
HOSTNAME=node01.cdh6.citms.cn
TERM=xterm
SHELL=/bin/bash
HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
HISTSIZE=1000
SSH_CLIENT=192.168.0.99 57577 22
HADOOP_PREFIX=/opt/cloudera/parcels/CDH/lib/hadoop
SSH_TTY=/dev/pts/0
USER=hdfs
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:
HBASE_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../hbase
MAIL=/var/spool/mail/root
PATH=.:/usr/local/mysql/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin
HADOOP_HDFS_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../hadoop-hdfs
HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../hive
JAVA=/usr/java/jdk1.8.0_271-amd64/bin/java
kylin_hadoop_conf_dir=/usr/local/apache-kylin-4.0.0-bin-spark2/hadoop_conf
PWD=/root/hive-testbench
HADOOP_YARN_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../hadoop-yarn
JAVA_HOME=/usr/java/jdk1.8.0_271-amd64
HADOOP_CONF_DIR=/etc/hadoop/conf
LANG=zh_CN.UTF-8
HISTCONTROL=ignoredups
SHLVL=5
HOME=/root
HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/../hadoop-mapreduce
KYLIN_HOME=/usr/local/apache-kylin-4.0.0-bin-spark2
LOGNAME=hdfs
SSH_CONNECTION=192.168.0.99 57577 192.168.10.146 22
LESSOPEN=||/usr/bin/lesspipe.sh %s
MYSQL_HOME=/usr/local/mysql
XDG_RUNTIME_DIR=/run/user/0
_=/usr/bin/env
OLDPWD=/root/hive-testbench/tpch-gen

The output:

[root@node01 hive-testbench]# ./tpch-setup.sh 2
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
ls: `/tmp/tpch-generate/2/lineitem': No such file or directory
Generating data at scale factor 2.
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
WARNING: Use "yarn jar" to launch YARN applications.
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
   at java.lang.ClassLoader.defineClass1(Native Method)
   at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
   at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
   at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
   at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
   at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Class.java:348)
   at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
   at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
   at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
   at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3151)
   at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3196)
   at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3235)
   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
   at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3286)
   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3254)
   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:478)
   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
   at org.notmysock.tpch.GenTable.genInput(GenTable.java:171)
   at org.notmysock.tpch.GenTable.run(GenTable.java:98)
   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
   at org.notmysock.tpch.GenTable.main(GenTable.java:54)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:498)
   at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
   at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
ls: `/tmp/tpch-generate/2/lineitem': No such file or directory
Data generation failed, exiting.
@rc452860
Author

rc452860 commented Oct 26, 2021

I think this issue is about the Hadoop version, but I don't know how to fix it.
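The stack trace is consistent with that guess: org.apache.hadoop.hdfs.web.HftpFileSystem was removed upstream in Hadoop 3.x, so finding it inside the generator jar would suggest the jar was built against Hadoop 2.x libraries and is clashing with the CDH 6 (Hadoop 3) classpath. A rough diagnostic sketch, where the jar path is an assumption:

```shell
# Diagnostic sketch (jar path is hypothetical): compare the cluster's Hadoop
# version with what the generator jar bundles. HftpFileSystem was removed in
# Hadoop 3.x, so its presence in the jar points at a Hadoop 2.x build.
JAR=tpch-gen/target/tpch-gen-1.0.jar   # assumed location of the built jar
if command -v hadoop >/dev/null && command -v unzip >/dev/null && [ -f "$JAR" ]; then
  hadoop version | head -n 1
  # Count occurrences of the stale class inside the jar (0 means clean)
  unzip -l "$JAR" | grep -c HftpFileSystem || true
else
  echo "hadoop or generator jar not found; run this on the cluster node"
fi
```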

@richtertang

Hello!

You can copy all the files from tpcds-gen/*/lib over to tpch-gen. Of course, you must not overwrite lib/dsdgen.jar.

@jashpindergill

Can somebody please help with this error message? I have tried changing pom.xml to the currently used Hadoop version and exporting the environment variables, but it still doesn't help.

@Mukvin

Mukvin commented Aug 29, 2022

Hi @rc452860,
I also hit this error; I am running this repo on Apache Hadoop 3.2.1.

Here is my solution:

  1. Remove the old hive-testbench directory.
  2. Check tpch-gen/pom.xml and change

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.4.0</version>
      <scope>compile</scope>
    </dependency>

to

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>3.2.1</version>
      <scope>compile</scope>
    </dependency>

(3.2.1 is the Hadoop version of my cluster.)
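The pom.xml edit above can also be done with a sed one-liner. The sketch below runs against a minimal stand-in file so it is self-contained; in the real repo the target is tpch-gen/pom.xml, and it assumes hadoop-client is the only dependency pinned at 2.4.0:

```shell
# Create a minimal stand-in for the dependency block in tpch-gen/pom.xml
cat > /tmp/pom-sample.xml <<'EOF'
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version>
  <scope>compile</scope>
</dependency>
EOF

# Bump the bundled hadoop-client version to match the cluster (3.2.1 here)
sed -i 's|<version>2.4.0</version>|<version>3.2.1</version>|' /tmp/pom-sample.xml
grep '<version>' /tmp/pom-sample.xml
```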

  3. Replace the obsolete Hive parameter in settings/*.sql:

    set hive.optimize.sort.dynamic.partition.threshold=0;

with

    set hive.optimize.sort.dynamic.partition=true;

  4. Execute the commands:
git clone https://github.com/hortonworks/hive-testbench.git
cd hive-testbench/
./tpch-build.sh
./tpch-setup.sh 2

everything is ok.
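Step 3 above can likewise be scripted across all the settings files. This sketch uses a stand-in file so it is self-contained; in the repo the targets would be the settings/*.sql files, and it assumes the parameter sits on its own line:

```shell
# Stand-in for one of the settings/*.sql files
cat > /tmp/sample-settings.sql <<'EOF'
set hive.optimize.sort.dynamic.partition.threshold=0;
EOF

# Swap the newer threshold parameter for the older boolean form
sed -i 's|^set hive.optimize.sort.dynamic.partition.threshold=0;|set hive.optimize.sort.dynamic.partition=true;|' /tmp/sample-settings.sql
cat /tmp/sample-settings.sql
```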

(screenshot attached)
