Facing a problem with Vertica's HCatalog integration

I am getting the following error message when I try to run a query against the HCatalog schema:

select * from hcat.emp;

2015-03-23 15:40:59.247 [Java-49518] 0x0a com.vertica.sdk.UdfException: Error message is [ Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected ] HINT hive metastore service is thrift://172.16.1.203:9083 (check UDxLogs/UDxFencedProcessesJava.log in the catalog directory for more information)
        at com.vertica.hcatalogudl.HCatalogSplitsNoOpSourceFactory.plan(HCatalogSplitsNoOpSourceFactory.java:130)
        at com.vertica.udxfence.UDxExecContext.planUDSource(UDxExecContext.java:981)
        at com.vertica.udxfence.UDxExecContext.planCurrentUDLType(UDxExecContext.java:959)
        at com.vertica.udxfence.UDxExecContext.planUDL(UDxExecContext.java:916)
        at com.vertica.udxfence.UDxExecContext.run(UDxExecContext.java:247)
        at java.lang.Thread.run(Thread.java:745)

2015-03-23 15:40:59.253 [Java-49518] 0x0a com.vertica.sdk.UdfException: Socket Socket[addr=/127.0.0.1,port=58659,localport=56694] has been closed abnormally
        at com.vertica.udxfence.UDxRPCMessage.recvall(UDxRPCMessage.java:111)
        at com.vertica.udxfence.UDxRPCMessage.recv(UDxRPCMessage.java:88)
        at com.vertica.udxfence.UDxExecContext.run(UDxExecContext.java:168)
        at java.lang.Thread.run(Thread.java:745)

Note: I am able to query table names and column names, but the issue occurs when I query the data itself.

Comments

  • Hello SChandra,

    Could you try the following to see if it helps? Try one setting at a time so we know which one works:

    1. Increase the HCat connection timeout:

        alter database <mydb> set HCatConnectionTimeout = 900;

    2. Try to increase the timeouts in Hive (you might need a Hive admin's help):

        hive.server.read.socket.timeout=1000
        hive.server.write.socket.timeout=1000

    3. Try to query a smaller table. Does that work?

    4. Try to query a table that is backed by an uncompressed text file in HDFS.

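    For item 2, the two Hive timeouts would normally be set in hive-site.xml on the Hive server side. This is only a sketch; the file's exact location varies by distribution, and your Hive admin may prefer to set these through the cluster manager instead:

```xml
<!-- hive-site.xml fragment (file location is an assumption; check with your Hive admin) -->
<property>
  <name>hive.server.read.socket.timeout</name>
  <value>1000</value>
</property>
<property>
  <name>hive.server.write.socket.timeout</name>
  <value>1000</value>
</property>
```

    The Hive service typically needs a restart to pick up the change.
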
    Regards,

    Han

  •  Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected

    This suggests that incompatible versions of Hadoop components are being used. This particular error is the classic symptom of code compiled against Hadoop 1.x, where org.apache.hadoop.mapreduce.JobContext is a class, running against Hadoop 2.x, where it is an interface (or vice versa).
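
    One quick way to see which flavor of the API a given set of jars provides (a sketch: run it with the jars that were copied for Vertica on the classpath and pass org.apache.hadoop.mapreduce.JobContext as the argument; with no argument it merely demonstrates the check on a JDK class):

```java
// Prints whether a named class is an interface or a concrete class on the
// current classpath. JobContext is a class in Hadoop 1.x and an interface
// in Hadoop 2.x, so this tells you which API generation the jars belong to.
public class CheckClassKind {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "java.lang.Runnable";
        Class<?> c = Class.forName(name);
        System.out.println(name + " is "
                + (c.isInterface() ? "an interface" : "a class"));
        // with no argument, prints: java.lang.Runnable is an interface
    }
}
```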

      --Sharon

  • I have Cloudera 5.2.x and copied the jars using hcatUtil.

  • Hello SChandra,

    I somewhat skipped over the first error because you were able to read the metadata. The class error might mean Vertica wasn't able to find an appropriate SerDe to parse the data; that is why I asked you to test with a plain text file, to see whether Vertica can query the data at all.

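    A minimal way to run that text-file test (the table name and file path here are hypothetical; note that Hive 0.13, which ships with CDH 5.2, does not support INSERT ... VALUES, hence the LOAD DATA):

```sql
-- In Hive: a small, uncompressed TEXTFILE table with a row of data.
CREATE TABLE emp_text (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
LOAD DATA LOCAL INPATH '/tmp/emp_text.csv' INTO TABLE emp_text;

-- Then, from Vertica, through the same HCatalog schema:
-- select * from hcat.emp_text;
```

    If this query succeeds while the original one fails, the problem is likely in the SerDe or compression handling for the original table's storage format.
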
    Regards,

    Han
