Not able to install HCatalog Connector

I copied all the Hadoop and Hive jar files into /opt/vertica/packages/hcat/lib.
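For reference, Vertica's hcatUtil also has a --copyJars mode that can populate this directory instead of a manual copy. A sketch only: the tool location and the HDP-style source paths below are assumptions, so adjust them to your distribution.

```shell
# Sketch: let hcatUtil copy the required jars. The tool path and the
# hadoopHiveHome paths are assumptions for a typical HDP-style layout.
HCATUTIL=/opt/vertica/packages/hcat/tools/hcatUtil
HCAT_LIB=/opt/vertica/packages/hcat/lib
HADOOP_HIVE_HOME="/usr/hdp/current/hadoop-client;/usr/hdp/current/hive-client"

if [ -x "$HCATUTIL" ]; then
    "$HCATUTIL" --copyJars \
        --hadoopHiveHome="$HADOOP_HIVE_HOME" \
        --hcatLibPath="$HCAT_LIB"
else
    echo "hcatUtil not found at $HCATUTIL"
fi
```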

Now when I run

./hcatUtil --verifyJars --hcatLibPath=/opt/vertica/packages/hcat

I see:

Summary
-------
Found 27 out of 27 required JARs
Found 5 out of 5 optional Hive SerDe related JARs
Found 2 out of 2 optional native libraries, 5 files in total
Looking at [/opt/vertica/packages/hcat] for configuration files
Found configuration file [cli.xml]
Found configuration file [data-manipulation-statements.xml]
Found configuration file [joins.xml]
Found configuration file [reflect.xml]
Found configuration file [index.xml]
Found configuration file [hdfs-site.xml]
Found configuration file [yarn-site.xml]
Found configuration file [ssl-client.xml]
Found configuration file [core-site.xml]
Found configuration file [project.xml]
Found configuration file [mapred-site.xml]
Found configuration file [var_substitution.xml]
Found configuration file [hive-site.xml]
Found configuration file [working_with_bucketed_tables.xml]

But when I run:

vsql
\cd /opt/vertica/packages/hcat/ddl
\i install.sql

I get this error:

vsql:install.sql:13: NOTICE 6564: Found Hadoop configuration files in dependency paths. If any of the configuration files is changed, please re-install HCatalog connector library with modified configuration files
CREATE LIBRARY
vsql:install.sql:16: ROLLBACK 3399: Failure in UDx RPC call InvokeSetExecContext(): Error in User Defined Object [VHCatSource], error code: 0
Couldn't instantiate class com.vertica.hcatalogudl.HCatalogSplitsNoOpSourceFactory
vsql:install.sql:17: ROLLBACK 2059: Source with specified name and parameters does not exist: VHCatSource
vsql:install.sql:18: ROLLBACK 3399: Failure in UDx RPC call InvokeSetExecContext(): Error in User Defined Object [VHCatParser], error code: 0
Couldn't instantiate class com.vertica.hcatalogudl.HCatalogSplitsParserFactory
vsql:install.sql:19: ROLLBACK 2059: Parser with specified name and parameters does not exist: VHCatParser
vsql:install.sql:20: ROLLBACK 3472: Function with same name and number of parameters already exists: get_webhcat_host
vertica=>
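"Couldn't instantiate class" typically means the UDx JVM could not load the factory class or one of its dependencies. One sanity check is to confirm the connector jar actually contains the class the error names; the jar file name below is an assumption, so list /opt/vertica/packages/hcat/lib to find the real one.

```shell
# Diagnostic sketch; the jar name is an assumption -- check the actual
# connector jar under /opt/vertica/packages/hcat/lib.
JAR=/opt/vertica/packages/hcat/lib/hcatalogudl.jar
CLASS=com/vertica/hcatalogudl/HCatalogSplitsNoOpSourceFactory.class

if [ -f "$JAR" ] && unzip -l "$JAR" | grep -q "$CLASS"; then
    echo "factory class found in $JAR"
else
    echo "factory class not found; check the jar name and path"
fi
```

If the class is present, the failure is more likely a missing dependency jar or a file that is unreadable on one of the nodes, so file permissions and the UDx logs are worth checking too.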


I googled and found this thread:

https://community.dev.hpe.com/t5/Vertica-Forum/hcatalog-connector/td-p/222733

So I found the /etc/hadoop/conf directory on one of the Hadoop machines and copied the contents of that directory to /etc/hadoop-hdfs-conf on the Vertica machine.


I then copied this directory to the same location on every node of the Vertica cluster.
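The copy-to-all-nodes step can be scripted; the host names below are hypothetical, and the echo makes it a dry run.

```shell
# Dry-run sketch of distributing the config directory to the other nodes.
# NODES is a hypothetical host list -- drop the echo to actually copy.
CONF_DIR=/etc/hadoop-hdfs-conf
NODES="vertica-node2 vertica-node3"

for node in $NODES; do
    # The target path must be identical on every node, since the
    # classpath suffix is set once for the whole database.
    echo scp -r "$CONF_DIR" "$node:$(dirname "$CONF_DIR")"
done
```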


But after running:

\i /opt/vertica/packages/hcat/ddl/uninstall.sql
ALTER DATABASE ddcanalytics SET JavaClassPathSuffixForUDx='/etc/hadoop-hdfs-conf';
\i /opt/vertica/packages/hcat/ddl/install.sql
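Before re-running install.sql, it may be worth confirming that the parameter actually took effect on every node. A sketch, assuming vsql is on PATH and using Vertica's configuration_parameters system view:

```shell
# Sketch: check that the UDx classpath suffix is set cluster-wide.
# Assumes vsql is on PATH and connection defaults are configured.
QUERY="SELECT node_name, current_value
         FROM configuration_parameters
        WHERE parameter_name = 'JavaClassPathSuffixForUDx';"

if command -v vsql >/dev/null 2>&1; then
    vsql -c "$QUERY"
else
    echo "vsql not on PATH; run the query from an existing session"
fi
```

If the value is empty or missing on any node, the new classpath never reached that node's UDx JVM, which could explain the same instantiation error recurring.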


I get the very same error.
