
VERTICA COPY QUERY Via AWS S3 bucket

I’m running CREATE TABLE and COPY statements to load a GZIP-compressed CSV file from S3 into sas.temp_catalogue_186709_3936d63aec66f2046a3dcc9ae26cba91_2fe79b66_be80_4a85_94a9_35ac6378a7b0. The session is configured with the AWSAuth and AWSRegion parameters:

-- Session-level S3 credentials and region
ALTER SESSION SET AWSAuth='XXXX:XXXXX';
ALTER SESSION SET AWSRegion='us-east-1';

-- Load the GZIP-compressed CSV from S3
COPY sas.temp_catalogue_... (columns...)
FROM 's3://.../Shopify_import_data_186709_a3bc2ac2c1704de0_3365.csv.gz'
GZIP
DELIMITER ','
ENCLOSED BY '"'
NULL AS 'NULL';
The problem is that the description field in the CSV contains double quotes inside the text. Even with ENCLOSED BY '"', the default COPY parser treats the embedded quote as the end of the field, so the affected rows are rejected during the load.

I’m looking for a way to properly escape or handle embedded double quotes in fields like description during the COPY operation to avoid data loss.
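
For illustration, a hypothetical row of this shape (the values are made up) shows the failure mode. Per RFC 4180, an embedded quote is escaped by doubling it, but Vertica's default delimited parser does not interpret the doubled quote, so it sees the field as ending early and the rest of the line no longer matches the column list:

186709,"Soft ""vintage"" cotton tee",19.99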


Best Answer

  • Answer ✓

Got the answer to this: I had missed using fcsvparser in the COPY statement above. Switching to fcsvparser resolved the issue.
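
    For anyone hitting the same problem, a minimal sketch of the fix under the same assumptions as the original post (table name, column list, and S3 path abbreviated as above): fcsvparser is a Flex parser that Vertica can also use to load regular tables, and by default it follows RFC 4180, so a doubled quote inside an enclosed field is read as a literal quote.

    -- Same credentials/region setup as before, then:
    -- fcsvparser defaults to RFC 4180: comma delimiter, double-quote
    -- enclosure, and "" for an embedded quote inside a field
    COPY sas.temp_catalogue_... (columns...)
    FROM 's3://.../Shopify_import_data_186709_a3bc2ac2c1704de0_3365.csv.gz' GZIP
    PARSER fcsvparser();

    Whether the NULL AS 'NULL' option from the original statement is still needed (or allowed) alongside the parser is worth verifying against the Vertica documentation for your version.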
