Unable to write to HDFS
Prakhar84
Vertica Customer ✭
Hi, I am getting an error when trying to write to HDFS. The error is something like:
java.lang.Exception: S2V: FATAL ERROR for job S2V_job5888402616862836297. Job status information is available in the Vertica table xxx.S2V_JOB_STATUS_USER_xxxx. Unable to create/insert into target table xxx.test_vertica with SaveMode: Append. ERROR MESSAGE: ERROR: java.sql.SQLException: [Vertica]VJDBC ERROR: Failed to glob [hdfs://xxxxxx01/a/tmp/vertica/S2V_job5888402616862836297/*.orc] because of error: Could not connect to [hdfs://xxxxxx01]
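For context, the kind of Spark-to-Vertica (S2V) write that produces a staging directory like hdfs://.../tmp/vertica/S2V_job*/ looks roughly like the sketch below. This is a minimal illustration, not the poster's actual job: the host names, credentials, schema, table, and HDFS URLs are all placeholders, and the option names follow the Vertica Spark connector examples in the 9.2 documentation.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Minimal sketch of an S2V save. All values below are placeholders.
val spark = SparkSession.builder().appName("S2VWriteExample").getOrCreate()
val df = spark.read.parquet("/data/source") // any DataFrame to save

val opts = Map(
  "host"         -> "vertica-host",
  "db"           -> "mydb",
  "user"         -> "dbuser",
  "password"     -> "dbpassword",
  "dbschema"     -> "xxx",
  "table"        -> "test_vertica",
  // The connector stages ORC files here, then Vertica COPYs them in.
  // The "Failed to glob ... Could not connect" error means Vertica could
  // not reach this URL, so check the host, port, and Kerberos auth.
  "hdfs_url"     -> "hdfs://namenode:8020/a/tmp/vertica",
  "web_hdfs_url" -> "webhdfs://namenode:50070/a/tmp/vertica"
)

df.write
  .format("com.vertica.spark.datasource.DefaultSource")
  .options(opts)
  .mode(SaveMode.Append)
  .save()
```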
Answers
Try the instructions listed here, if you haven't already: https://forum.vertica.com/discussion/comment/242650/#Comment_242650
Let us know how it goes!
Just to add: our Vertica cluster is Kerberized, and HDFS (Cloudera) is Kerberized as well. Please let us know if any additional settings are needed for this setup.
Please review the section "Configuring Users and the Keytab File" at the link below for the additional Kerberos configuration:
https://www.vertica.com/docs/9.2.x/HTML/Content/Authoring/HadoopIntegrationGuide/Kerberos/KerberosTGT.htm
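On the Spark side, the client also needs a valid Kerberos ticket before it touches HDFS. Below is a minimal sketch using Hadoop's UserGroupInformation API; the principal and keytab path are assumptions for illustration, not values from your environment.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Log the HDFS client in from a keytab before reading/writing.
// The principal and keytab path below are placeholders.
val conf = new Configuration()
conf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(conf)
UserGroupInformation.loginUserFromKeytab(
  "sparkuser@EXAMPLE.COM",
  "/etc/security/keytabs/sparkuser.keytab")
```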
Can you please check and share the complete DataNode log?
It should show the actual cause of the issue, for example a "too many open files" error or something similar in your DataNode logs.
The HDFS client does not currently support LzoCodec, and the core-site.xml file you are using includes it.
It should work after you remove "com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec" from the "io.compression.codecs" property in the core-site.xml file referenced in your PutHDFS processor.
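For illustration, the cleaned-up property would look something like this; the remaining codec list is only an example, so keep whichever non-LZO codecs your cluster actually uses:

```xml
<!-- core-site.xml: io.compression.codecs with the LZO entries removed -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```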
You might also read "How Hadoop HDFS Data Read and Write" to understand the HDFS read/write path; it may help you figure out the problem.