Since Vertica has a 1600 column limit and also a column name length limit, I'm trying to use Vertica Flex.
Is it possible to load into Vertica Flex column names that are 300~500 characters long? And with a total column count of around 3000?
I believe that Flex supports long key names and that there is no limit on the number of keys. However, the provided fjsonparser has a limit of 250 bytes (ASCII characters) for key names.
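For reference, a minimal sketch of a Flex load (the table name and file path here are hypothetical):

```sql
-- Create a schemaless flex table; all keys land in the __raw__ map column.
CREATE FLEX TABLE demo_flex();

-- Load JSON with the built-in flex parser. Keys exceeding the parser's
-- key-name limit may be silently dropped or truncated.
COPY demo_flex FROM '/tmp/data.json' PARSER fjsonparser();
```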
Thank you Derrick,
Unfortunately, for TSV files there seems to be some limit on the column name. Some of my files with merely a few hundred columns are ingested properly, but the large one with 200~500 character column names fails silently. No error message at all.
When I exclude the headers, the data seems to load correctly, so it looks like it's not a number-of-columns problem, but that the column names are too large.
I'm trying to see if increasing the size of the __raw__ column might help, but would you have any suggestions by any chance?
If the data is being successfully loaded, take a look at what public.MapToString(__raw__) looks like. Does it have the full column names, or are they truncated / omitted? If they are there, then you know that it's not the load process that is failing.
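A quick way to check, assuming a flex table named demo_flex (substitute your own table name):

```sql
-- Render the __raw__ map as a JSON-like string so you can eyeball
-- whether the long key names made it through the load intact.
SELECT public.MapToString(__raw__) FROM demo_flex LIMIT 1;
```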
Be sure to look at rejections from your load. Documentation for the COPY statement explains the rejections mechanism.
The next thing to try is to see if you can successfully use public.MapLookup(__raw__, 'columnName') for long names.
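Something along these lines, with a placeholder table and key name:

```sql
-- Look up one key in the map; returns NULL if the key is absent,
-- which would suggest the long name was dropped during the load.
SELECT public.MapLookup(__raw__, 'a_very_long_column_name')
FROM demo_flex
LIMIT 5;
```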
Finally, ensure your column names have no special characters or - if they do - use identifier quoting. That's different from string literal quoting (it's double quotes, not single quotes).
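To illustrate the distinction (names here are made up):

```sql
-- Identifiers with special characters go in double quotes;
-- string literal values go in single quotes.
SELECT "My Weird-Column"
FROM some_table
WHERE "My Weird-Column" = 'some value';
```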
Thanks Derrick, but unfortunately no data is being loaded at all. I get Rows 0. I even reduced my TSV so that all it has is the headers and 3 rows, and it still fails. And the column names are very similar to those in the other files I'm loading with no problem.
I did read in the doc:
Note: Specifying rejected data and exceptions files is not currently supported while loading flex tables.
So not sure if I can get the rejected data, but I'm currently trying!
Success! After increasing the __raw__ size, everything seems to load just fine.
alter table ad.food alter column __raw__ set data type long varbinary(1200000);
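As an optional follow-up sketch, after resizing you can ask Vertica to recompute the keys it found, so the long column names show up as queryable view columns (table name taken from the ALTER statement above):

```sql
-- Scan __raw__, rebuild the key inventory, and generate a view
-- exposing each discovered key as a column.
SELECT COMPUTE_FLEXTABLE_KEYS_AND_BUILD_VIEW('ad.food');
```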