
vertica hive integration: failure to login: No LoginModules configured for hadoop_simple

Hi,

I'm having a problem with the Vertica 7.2.3 HCatalog connector (on MapR 5.1). I have everything set up and can see the Hive table and its definition, but I can't select any data:

 

dbadmin=> select * from hcat.pokes;
ERROR 3399:  Failure in UDx RPC call InvokePlanUDL(): Error in User Defined Object [VHCatSource], error code: 0
Error message is [ java.io.IOException: failure to login: No LoginModules configured for hadoop_simple ] HINT hive metastore service is thrift://n003:9083 (check UDxLogs/UDxFencedProcessesJava.log in the catalog directory for more information)
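From what I understand, hadoop_simple is the JAAS login context that MapR's patched Hadoop classes look up when authentication is set to simple. On a stock MapR client install it is defined in /opt/mapr/conf/mapr.login.conf; the entry (module names below are from a standard MapR 5.x install and may differ on your cluster) looks roughly like:

```
hadoop_simple {
  org.apache.hadoop.security.login.GenericOSLoginModule required;
  org.apache.hadoop.security.login.HadoopLoginModule required;
};
```

My guess is that Vertica's fenced Java process isn't picking this file up, since a JVM only reads the JAAS configuration named by the java.security.auth.login.config system property.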

 

dbadmin=> \x
Expanded display is on.
dbadmin=> SELECT * FROM v_catalog.hcatalog_schemata;
-[ RECORD 1 ]----------------+-----------------------------
schema_id                    | 45035996273773886
schema_name                  | hcat
schema_owner_id              | 45035996273704962
schema_owner                 | dbadmin
create_time                  | 2016-07-21 23:59:18.56546+10
hostname                     | n003
port                         | 9083
webservice_hostname          | n003
webservice_port              | 50111
webhdfs_port                 | 50075
hcatalog_schema_name         | default
hcatalog_user_name           | hive
hcatalog_connection_timeout  | -1
hcatalog_slow_transfer_limit | -1
hcatalog_slow_transfer_time  | -1
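If it helps, the schema was created with a statement along these lines (reconstructed from the settings shown above; the exact original statement wasn't saved, parameter names per the Vertica 7.2 CREATE HCATALOG SCHEMA syntax):

```
CREATE HCATALOG SCHEMA hcat
    WITH HOSTNAME='n003' PORT=9083
         HCATALOG_SCHEMA='default'
         HCATALOG_USER='hive';
```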

 

dbadmin=> SELECT * FROM v_catalog.hcatalog_table_list;
-[ RECORD 1 ]------+------------------
table_schema_id    | 45035996273773886
table_schema       | hcat
hcatalog_schema    | default
table_name         | pokes
hcatalog_user_name | hive

 

dbadmin=> SELECT * FROM HCATALOG_TABLES WHERE table_name = 'pokes';
-[ RECORD 1 ]---------+-----------------------------------------------------------
table_schema_id       | 45035996273773886
table_schema          | hcat
hcatalog_schema       | default
table_name            | pokes
hcatalog_user_name    | hive
min_file_size_bytes   | 0
total_number_files    | 0
location              | maprfs:/user/hive/warehouse/pokes
last_update_time      | 2016-07-18 12:37:07.936+10
output_format         | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
last_access_time      | 2016-07-18 12:37:07+10
max_file_size_bytes   | 0
is_partitioned        | f
partition_expression  |
table_owner           | mapr
input_format          | org.apache.hadoop.mapred.TextInputFormat
total_file_size_bytes | 0
hcatalog_group        |
permission            |

 

dbadmin=> SELECT * FROM HCATALOG_COLUMNS WHERE table_name = 'pokes';

-[ RECORD 1 ]------------+---------------
table_schema             | hcat
hcatalog_schema          | default
table_name               | pokes
is_partition_column      | f
column_name              | foo
hcatalog_data_type       | int
data_type                | int
data_type_id             | 6
data_type_length         | 8
character_maximum_length |
numeric_precision        |
numeric_scale            |
datetime_precision       |
interval_precision       |
ordinal_position         | 1
-[ RECORD 2 ]------------+---------------
table_schema             | hcat
hcatalog_schema          | default
table_name               | pokes
is_partition_column      | f
column_name              | bar
hcatalog_data_type       | string
data_type                | varchar(65000)
data_type_id             | 9
data_type_length         | 65000
character_maximum_length | 65000
numeric_precision        |
numeric_scale            |
datetime_precision       |
interval_precision       |
ordinal_position         | 2

 

[[email protected] ~]$ rpm -qa|grep hive
mapr-hive-1.2.201606020917-1.noarch
mapr-hivewebhcat-1.2.201606020917-1.noarch
mapr-hivemetastore-1.2.201606020917-1.noarch
mapr-hiveserver2-1.2.201606020917-1.noarch
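Since the error says no LoginModules are configured at all, one thing worth verifying on each Vertica node is that the hadoop_simple entry is actually present in the MapR JAAS file (path assumes the default MapR client location):

```
grep -A 3 hadoop_simple /opt/mapr/conf/mapr.login.conf
```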
