ODBC drivers bulk-load API support

When will Vertica's drivers support the ODBC bulk-load API?

Comments

  • One week has passed and not a single response. Is anyone from the Vertica dev team reading these posts at all?
  • We're sorry that you've experienced a delay in response! Our community is still developing and we're working through our response process. I'll be sure to bring this to the team's attention immediately.
  • Again, sorry for the delay in our response. I want to make sure I understand what you're asking for... Vertica does indeed support bulk load in ODBC through COPY: https://my.vertica.com/docs/6.1.x/HTML/index.htm#12736.htm Is this what you're looking for? If not, please let me know and I'll find out more.
  • Hi Beth, The bulk support you are referring to is non-standard functionality implemented by Vertica. This non-standard extension requires the input file to be present on the server system in order to load it into the Vertica database. As you probably understand, this will not work if a user tries to load data from a remote location. I'm referring to the standard bulk-load support included in the ODBC standard, which is based on the SQLBulkOperations function. The standard API allows loading of data from remote locations and doesn't require the input file to be present on the server system.
  • Thanks for clarifying--I'll see what I can find out for you and will respond as soon as I can.
  • I did find information on how Vertica handles bulk load from remote locations through ODBC. You'll find it here: https://my.vertica.com/docs/6.1.x/HTML/index.htm#17415.htm Is this what you're looking to do?
  • Beth, Again, this is a non-standard extension by Vertica; it is not part of the ODBC standard. The ODBC standard's bulk-load API is based on the SQLBulkOperations function.
  • Got it--I'm going to pass your question along to others. We will respond again soon.
  • Hi CozyRock, Our ODBC driver does not support the SQLBulkOperations API. The driver only supports forward-only cursors, and does not have the bookmark cursors typically used by the SQLBulkOperations API. To do bulk loading you have a couple of options. You can use COPY and COPY LOCAL as Beth suggested, but as you said these are non-standard. The other option is to bind parameter arrays to an INSERT statement and run it with the SQLExecute API. Internally, the driver will convert and execute the INSERT statement as a semantically equivalent COPY statement. This allows applications to use standard INSERTs while still getting the bulk-load performance that COPY allows. There is an example of doing this here: https://my.vertica.com/docs/6.1.x/HTML/index.htm#15086.htm Tom
  • Hi Tom, Thank you for your assistance! The second option you provided a link to worked like magic. We just ran tests and the load speed is 400x faster compared to standard row-by-row insertion: 100,000 records now load in only 5 seconds, compared to 35 minutes for standard inserts. I suspect you might not be the right person to talk to, but I will share why this matters. The company I work for, COZYROC, provides third-party components for SQL Server Integration Services (SSIS); we are the leading third-party provider. We have wanted to include bulk-load support for Vertica for some time, and I'm glad we have finally gotten this to work. A new parameter is included in the ODBC Destination component, which will appear in the next service release of the COZYROC SSIS+ library. Perhaps it would be worth sharing with your existing and prospective customers that there is now support for Vertica in SSIS. Thank you again for your help!
  • Is the same possible in C# code using an ODBC connection? If yes, please send me an example.
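
Tom's suggestion above (binding parameter arrays to a plain INSERT so the driver rewrites it as a COPY) can be sketched in C against the ODBC API roughly as follows. This is a minimal sketch, not Vertica's documented example: the DSN `VerticaDSN`, the credentials, and the table `t (id INT, name VARCHAR(50))` are all hypothetical, and error checking is omitted for brevity.

```c
/* Sketch: column-wise parameter-array binding for bulk INSERT over ODBC.
   Assumes a hypothetical DSN "VerticaDSN" and table t (id INT, name VARCHAR(50)). */
#include <sql.h>
#include <sqlext.h>
#include <stdio.h>

#define BATCH 1000      /* rows sent per SQLExecute call */
#define NAME_LEN 50

int main(void)
{
    SQLHENV  env;  SQLHDBC dbc;  SQLHSTMT stmt;
    SQLINTEGER ids[BATCH];
    SQLCHAR    names[BATCH][NAME_LEN + 1];
    SQLLEN     idInd[BATCH], nameInd[BATCH];
    SQLULEN    rowsProcessed = 0;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
    SQLConnect(dbc, (SQLCHAR *)"VerticaDSN", SQL_NTS,
               (SQLCHAR *)"user", SQL_NTS, (SQLCHAR *)"pass", SQL_NTS);
    SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);

    /* Fill the parameter arrays with sample data. */
    for (int i = 0; i < BATCH; i++) {
        ids[i] = i;
        snprintf((char *)names[i], sizeof names[i], "row-%d", i);
        idInd[i]   = 0;        /* fixed-size C type */
        nameInd[i] = SQL_NTS;  /* null-terminated string */
    }

    /* Tell the driver the bindings are arrays of BATCH rows each. */
    SQLSetStmtAttr(stmt, SQL_ATTR_PARAM_BIND_TYPE,
                   SQL_PARAM_BIND_BY_COLUMN, 0);
    SQLSetStmtAttr(stmt, SQL_ATTR_PARAMSET_SIZE,
                   (SQLPOINTER)(SQLULEN)BATCH, 0);
    SQLSetStmtAttr(stmt, SQL_ATTR_PARAMS_PROCESSED_PTR, &rowsProcessed, 0);

    SQLBindParameter(stmt, 1, SQL_PARAM_INPUT, SQL_C_SLONG, SQL_INTEGER,
                     0, 0, ids, 0, idInd);
    SQLBindParameter(stmt, 2, SQL_PARAM_INPUT, SQL_C_CHAR, SQL_VARCHAR,
                     NAME_LEN, 0, names, NAME_LEN + 1, nameInd);

    /* One standard INSERT; a single SQLExecute ships all BATCH rows,
       which Vertica's driver converts to an equivalent COPY internally. */
    SQLPrepare(stmt, (SQLCHAR *)"INSERT INTO t (id, name) VALUES (?, ?)",
               SQL_NTS);
    SQLExecute(stmt);
    printf("rows processed: %lu\n", (unsigned long)rowsProcessed);

    SQLFreeHandle(SQL_HANDLE_STMT, stmt);
    SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}
```

The same pattern should apply from C# through `System.Data.Odbc` only indirectly (that API does not expose parameter-array binding directly), which may be why the last commenter is asking; the C-level attributes above are the mechanism the driver actually supports.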
