Error importing an Avro file into a Flex table

robann Vertica Customer

I'm experimenting with importing Avro data into Flex tables. I have a fairly complex Avro file which I've tried to import both with VerticaCopyStream from Java code (simplified sketch below) and with a COPY command using a file. Both methods give a VIAssert error (this one is from the Java code):

com.vertica.support.exceptions.NonTransientException: [Vertica]VJDBC ERROR: Error calling process() in User Function UDParser at [/data/jenkins/workspace/RE-ReleaseBuilds/RE-Jackhammer/server/udx/supported/UDxHelpers/AvroParser.cpp:199], error code: 0, message: User code caused Vertica to throw exception "VIAssert(numeric_len == 8) failed"

I've successfully imported a (much simpler) Avro file before. Any pointers as to what could be wrong? The error message isn't very helpful, and error code 5861 is very generic.
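For reference, the Java side is essentially the standard VerticaCopyStream streaming pattern; here is a simplified sketch (connection details, table name, and file path are placeholders):

import com.vertica.jdbc.VerticaConnection;
import com.vertica.jdbc.VerticaCopyStream;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;

public class AvroFlexLoad {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details
        Connection conn = DriverManager.getConnection(
                "jdbc:vertica://localhost:5433/testdb", "dbadmin", "");
        conn.setAutoCommit(false);

        // Same Avro parser as the COPY-from-file path
        String copy = "COPY Schemaname.test FROM STDIN PARSER public.FAvroParser()";
        VerticaCopyStream stream = new VerticaCopyStream((VerticaConnection) conn, copy);

        stream.start();
        try (FileInputStream avro = new FileInputStream("c:/temp/fails.avro")) {
            stream.addStream(avro);
            stream.execute();   // the VIAssert error surfaces during execute/finish
        }
        long rows = stream.finish();
        conn.commit();
        System.out.println("Rows loaded: " + rows);
        conn.close();
    }
}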

Thanks,
Robert

Answers

  • Bryan_H Vertica Employee Administrator

    This is actually a server-side error in the Avro parser. I've seen this happen when the Avro file contains decimals with precision greater than 38. Is that the case here?
    If you can open a support case, please do so and include the schema and sample data if possible.

  • robann Vertica Customer

    So this error does relate to decimals, but at a much lower precision than that. After some binary searching, the following works:

    { "name": "price", "type": { "type": "bytes", "logicalType": "decimal", "precision": 18, "scale": 9 } }

    but this doesn't:

    { "name": "price", "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 10 } }

    I assume this is unexpected behaviour/a bug? I can provide code and sample files - how do I raise a support case?

    Thanks,
    Robert

  • SergeB Employee

    What version of Vertica are you testing with? Are you using a generic flex table or a hybrid one (with some "real" columns) ?

    The error seems to be raised when mapping the parsed value (price) to its target column.

    Could you run a test with a real table with just one column, price, defined with the expected precision and scale?

  • robann Vertica Customer

    We've actually raised a support request for this, but in case anyone is interested: it's Vertica Analytic Database v11.1.1, and it's a newly created flex table with no other columns defined.

    Basically, if you generate binary Avro data using this schema:

    [
      {
        "namespace": "com.test",
        "type": "record",
        "name": "Test",
        "doc": "Record documentation",
        "fields": [
          { "name": "name", "type": "string", "doc": "A simple doc" },
          { "name": "price",
            "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 10 } }
        ]
      }
    ]

    you can see the error in the console if you import like this:

    create flex table Schemaname.test();
    COPY Schemaname.test FROM LOCAL 'c:/temp/fails.avro' PARSER public.FAvroParser() REJECTED DATA AS TABLE "Schemaname.test_rejects";

    However, if you use this precision and scale instead, it works:

    "precision": 18, "scale": 9

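    In case it helps anyone reproduce this, the file is generated with the Avro Java library along these lines (schema trimmed to the two fields, field values are just examples); the decimal is written as the big-endian two's-complement bytes of the unscaled value:

    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    import java.io.File;
    import java.math.BigDecimal;
    import java.nio.ByteBuffer;

    public class WriteFailingAvro {
        // Trimmed version of the schema above (doc fields omitted)
        private static final String SCHEMA_JSON =
            "{ \"namespace\": \"com.test\", \"type\": \"record\", \"name\": \"Test\", \"fields\": [" +
            "  { \"name\": \"name\", \"type\": \"string\" }," +
            "  { \"name\": \"price\", \"type\": { \"type\": \"bytes\"," +
            "    \"logicalType\": \"decimal\", \"precision\": 20, \"scale\": 10 } } ] }";

        public static void main(String[] args) throws Exception {
            Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

            GenericRecord rec = new GenericData.Record(schema);
            rec.put("name", "example");
            // Avro decimal-over-bytes: two's-complement unscaled value at the declared scale
            BigDecimal price = new BigDecimal("1234567890.0123456789"); // 20 digits, scale 10
            rec.put("price", ByteBuffer.wrap(price.unscaledValue().toByteArray()));

            try (DataFileWriter<GenericRecord> writer =
                     new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
                writer.create(schema, new File("fails.avro"));
                writer.append(rec);
            }
        }
    }

    Presumably the 18/9 variant loads because its unscaled value still fits in 8 bytes, which would line up with the VIAssert(numeric_len == 8) check in the error, but that's just a guess on our side.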