
Hive Integration

Hello,

I have installed Hadoop and Hive:
[[email protected] 2.3.2.0-2950]$ hadoop version
Hadoop 2.7.1.2.3.2.0-2950
Subversion [email protected]:hortonworks/hadoop.git -r 5cc60e0003e33aa98205f18bccaeaf36cb193c1c
Compiled by jenkins on 2015-09-30T18:08Z
Compiled with protoc 2.5.0
From source with checksum 69a3bf8c667267c2c252a54fbbf23d
This command was run using /usr/hdp/2.3.2.0-2950/hadoop/lib/hadoop-common-2.7.1.2.3.2.0-2950.jar

[[email protected] 2.3.2.0-2950]$ hive --version
WARNING: Use "yarn jar" to launch YARN applications.
Hive 1.2.1.2.3.2.0-2950
Subversion git://c66-slave-20176e25-6/grid/0/jenkins/workspace/HDP-2.3-maint-centos6/bigtop/build/hive/rpm/BUILD/hive-1.2.1.2.3.2.0 -r c67988138ca472655a6978f50c7423525b71dc27
Compiled by jenkins on Wed Sep 30 19:07:31 UTC 2015
From source with checksum 6c57f9c021f6f833a9ef075f629c8b03

When I try to run the following command to copy the jars, some of the packages are reported missing.
Should I install all the jars? Please help!

[[email protected] 2.3.2.0-2950]$ sudo /opt/vertica/packages/hcat/tools/hcatUtil --copyJars \
--hadoopHiveHome="$HADOOP_HOME/lib;$HIVE_HOME/lib;/usr/hdp/2.3.2.0-2950/hive-hcatalog/share/hcatalog" \
--hadoopHiveConfPath="$HADOOP_CONF_DIR;$HIVE_CONF_DIR;$WEBHCAT_CONF_DIR" \
--hcatLibPath="/usr/hdp/2.3.2.0-2950"

Looking at [/lib] for Jars
Looking at [/usr/hdp/2.3.2.0-2950/hive/lib] for Jars
Looking at [/usr/hdp/2.3.2.0-2950/hive-hcatalog/share/hcatalog] for Jars

ERROR:: Following required Jars were not found:
[NOTE: Some jars not found below may only be needed for some versions of Hadoop. If your HP Vertica HCatalog connector works correctly without these missing jars, they are not actually required for your Hadoop distribution]

hadoop-hdfs-.jar
hadoop-mapreduce-client-core-.jar
slf4j-log4j12.jar
hadoop-common-.jar
hadoop-archives-.jar
hadoop-mapreduce-client-common-.jar
commons-configuration-.jar
protobuf-java-2..jar
hive-webhcat-java-client-.jar
hive-shims-common-secure-.jar
slf4j-api.jar
hadoop-annotations-.jar
hadoop-yarn-api-.jar
hadoop-auth-.jar
webhcat-.jar

WARNING:: Following optional Serde and Codec Jars were not found:

hadoop-lzo.jar

WARNING:: Following HDP required Jars were not found:

commons-lang3-.jar
kryo-.jar

WARNING:: Following MapR required Jars were not found:

hadoop-yarn-common-.jar
maprfs-.jar

WARNING: Following native libraries are not found:

libgplcompression.so
libsnappy*.so
libhadoop*.so

Summary

Found 21 out of 36 required JARs
Found 4 out of 5 optional Serde and Codec JARs
Found 0 out of 2 HDP required JARs
Found 1 out of 3 MapR required JARs
Found 0 out of 3 optional native libraries, 0 files in total
Start copying JARs and native libraries to hcatLibPath [/usr/hdp/2.3.2.0-2950]..........................
Done
Looking at [] for configuration files
WARNING: hadoopHiveConfPath [] does not exist
Looking at [] for configuration files
WARNING: hadoopHiveConfPath [] does not exist
Looking at [] for configuration files
WARNING: hadoopHiveConfPath [] does not exist
ERROR: Could not find required Hive configuration file [hive-site.xml]
WARNING: Could not find Hadoop configuration file [core-site.xml]. If you are using Kerberos authentication, please make sure this file is copied
WARNING: Could not find Yarn configuration file [yarn-site.xml]. If you are using Kerberos authentication, please make sure this file is copied
WARNING: Could not find WebHCat configration file [webhcat-site.xml]. You must specify WEBSERVICE_HOSTNAME and WEBSERVICE_PORT when CREATE HCATALOG SCHEMA
WARNING: Could not find HDFS configration file [hdfs-site.xml]. You must specify WEBHDFS_PORT when CREATE HCATALOG SCHEMA
Start copying configuration files to hcatLibPath [/usr/hdp/2.3.2.0-2950]
Done
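For what it's worth, the log itself hints at the cause: the jar search scanned `[/lib]` and the config search scanned empty paths (`hadoopHiveConfPath []`), which suggests `$HADOOP_HOME`, `$HADOOP_CONF_DIR`, `$HIVE_CONF_DIR`, and `$WEBHCAT_CONF_DIR` were empty when the command ran — `sudo` normally strips the caller's environment. A minimal sketch to check and work around this; the `/etc/...` and `/usr/hdp/...` locations below are typical HDP 2.3.x defaults and are assumptions, not taken from the output above:

```shell
#!/bin/bash
# Report which of the expected variables are empty in the current shell.
# hcatUtil substitutes them literally, so an empty HADOOP_HOME becomes
# the bogus search path "/lib" and an empty conf var becomes "".
for v in HADOOP_HOME HIVE_HOME HADOOP_CONF_DIR HIVE_CONF_DIR WEBHCAT_CONF_DIR; do
    [ -z "${!v}" ] && echo "$v is empty"
done

# Typical HDP 2.3.x locations (assumptions -- adjust to your install):
export HADOOP_HOME=/usr/hdp/2.3.2.0-2950/hadoop
export HIVE_HOME=/usr/hdp/2.3.2.0-2950/hive
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HIVE_CONF_DIR=/etc/hive/conf
export WEBHCAT_CONF_DIR=/etc/hive-webhcat/conf

# Re-run with sudo -E so the exported variables survive sudo:
# sudo -E /opt/vertica/packages/hcat/tools/hcatUtil --copyJars ...
```

With the variables populated (and preserved via `sudo -E`), the tool should search the real Hadoop/Hive lib directories and find the configuration files it currently reports as missing.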
