
...

These packages are: flume, mahout, pig, sqoop, zookeeper. If all components were already built in an earlier step, skip this step.

Code Block
languagebash
docker run -v `pwd`:/ws bigtop/slaves:erp17.08-deb-8-aarch64 bash -l -c 'cd /ws ; ./gradlew flume-deb mahout-deb pig-deb sqoop-deb zookeeper-deb'
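Once the build finishes, the resulting packages can be sanity-checked before moving on. A minimal sketch, assuming Bigtop's default layout where each component's .deb files land under `output/<component>/`:

```shell
# Sketch: check that each component produced at least one .deb
# (output/<component>/ is Bigtop's default build output layout).
for c in flume mahout pig sqoop zookeeper; do
  if ls output/"$c"/*.deb >/dev/null 2>&1; then
    echo "$c: packages built"
  else
    echo "$c: no packages found"
  fi
done
```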

...

Code Block
languagebash
$ vim provisioner/docker/config_debian-8-aarch64.yaml

Make the changes below:

components: [hdfs, yarn, mapreduce, hive, spark]
enable_local_repo: true
smoke_test_components: [spark]
jdk: "openjdk-8-jdk"

Save and quit.
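To confirm the edits took effect, grep the config for the changed keys. The sketch below writes the snippet to a temp file so it is self-contained; run the same grep against the real provisioner/docker/config_debian-8-aarch64.yaml.

```shell
# Self-contained sketch: write the expected snippet to a temp file,
# then count the key settings with grep (should report 2 matches).
cat > /tmp/config_check.yaml <<'EOF'
components: [hdfs, yarn, mapreduce, hive, spark]
enable_local_repo: true
smoke_test_components: [spark]
jdk: "openjdk-8-jdk"
EOF
grep -c -E 'enable_local_repo: true|smoke_test_components: \[spark\]' /tmp/config_check.yaml
```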

$ vim provisioner/utils/smoke-tests.sh

Add the lines below:

export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib/native"
export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export YARN_HOME=/usr/lib/hadoop-yarn
export HADOOP_YARN_HOME=/usr/lib/hadoop-yarn/
export HADOOP_USER_NAME=hdfs
export CLASSPATH=$CLASSPATH:.
export CLASSPATH=$CLASSPATH:$HADOOP_HOME/hadoop-common-2.7.2.jar:$HADOOP_HOME/client/hadoop-hdfs-2.7.2.jar:$HADOOP_HOME/hadoop-auth-2.7.2.jar:/usr/lib/hadoop-mapreduce/*:/usr/lib/hive/lib/*:/usr/lib/hadoop/lib/*:
export JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
export PATH=/usr/lib/hadoop/libexec:/etc/hadoop/conf:$HADOOP_HOME/bin/:$PATH
export SPARK_HOME=/usr/lib/spark
export PATH=$HADOOP_HOME/bin:$PATH
export SPARK_DIST_CLASSPATH=$HADOOP_HOME/bin/hadoop:$CLASSPATH:/usr/lib/hadoop/lib/*:/usr/lib/hadoop-mapreduce/*:.
export CLASSPATH=$CLASSPATH:/usr/lib/hadoop/lib/*:.

export SPARK_HOME=/usr/lib/spark
export SPARK_MASTER_IP=<IP Address of your host>
export SPARK_MASTER_PORT=7077

Save and quit.
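The JAVA_HOME export above resolves the java symlink and then strips the trailing "bin/java" with sed. A quick sketch of that derivation, using a sample resolved path (the path on your host will differ):

```shell
# Sample resolved symlink target, as `readlink -f /usr/bin/java` might return it.
resolved="/usr/lib/jvm/java-8-openjdk-arm64/jre/bin/java"
# Strip the trailing "bin/java" to get the JDK directory.
echo "$resolved" | sed "s:bin/java::"
# -> /usr/lib/jvm/java-8-openjdk-arm64/jre/
```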

Run the smoke test commands:

$ ./docker-hadoop.sh -C config_debian-8-aarch64.yaml -c 3 -s -d

If the smoke test still fails, it might be because you haven't set "SPARK_MASTER_IP" properly. Alternatively, remove the $masterMode and use local[*] instead, as below:

$ vim bigtop-tests/smoke-tests/spark/TestSpark.groovy

Make the change below:

final String SPARK_SHELL = SPARK_HOME + "/bin/spark-shell --master local[*]"

Save and quit, then run the Spark smoke tests again.
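The same edit can be scripted with sed instead of editing by hand. The sketch below works on a temp copy so it is self-contained; the exact original line in TestSpark.groovy may differ slightly, so check the pattern before pointing sed at bigtop-tests/smoke-tests/spark/TestSpark.groovy.

```shell
# Temp copy standing in for TestSpark.groovy (the real line may vary).
cat > /tmp/TestSpark.sample <<'EOF'
final String SPARK_SHELL = SPARK_HOME + "/bin/spark-shell --master $masterMode"
EOF
# Swap $masterMode for local[*]; \$ matches the literal dollar sign.
sed -i 's/--master \$masterMode/--master local[*]/' /tmp/TestSpark.sample
cat /tmp/TestSpark.sample
```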

...

TestSparkExample
TestSparkPythonExample
ShellTest
HDFSTest
JobTest
  • Hive

TestHiveSmoke


Component versions

  • Alluxio v1.0.1
  • Apache Ambari v2.5.0
  • Apache Apex v3.5.0
  • Groovy v2.4.10
  • Apache Commons - jsvc v1.0.15
  • Apache Tomcat v6.0.45
  • bigtop_utils v1.2.0
  • Apache Crunch v0.14.0
  • Pig UDF DataFu v1.3.0
  • Apache Flink v1.1.3
  • Apache Flume v1.7.0
  • Apache Giraph v1.1.0
  • Greenplum GPDB v5.0.0-alpha.0
  • Apache Hadoop v2.7.3
  • Apache Hama v0.7.0
  • Apache HBase v1.1.3
  • Apache Hive v1.2.1
  • Hue v3.11.0
  • Apache Ignite v1.9.0
  • Apache Kafka v0.10.1.1
  • Kite v1.1.0
  • Apache Mahout v0.12.2
  • Apache Oozie v4.3.0
  • Apache Phoenix v4.9.0-HBase-1.1
  • Apache Pig v0.15.0
  • Quantcast QFS v1.1.4
  • Apache Solr v4.10.4
  • Apache Spark 1.1 v1.6.2
  • Apache Spark 2.0 v2.1.1
  • Apache Sqoop v1 v1.4.6
  • Apache Sqoop v2 v1.99.4
  • Apache Tajo v0.11.1
  • Apache Tez v0.6.2
  • YCSB v0.4.0
  • Apache Zeppelin v0.7.0
  • Apache ZooKeeper v3.4.6

...