Got an exception when using databaseMetaData.getTables
When I use the JDBC interface to get table info from Vertica:
rs = databaseMetaData.getTables(catalog, schema, "%", tableTypes);
The parameters are ("", "", "%", ["TABLE"]).
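For context, a minimal sketch of the call (the connection URL, database name, and credentials below are placeholders, not my real environment):

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class ListTables {
    public static void main(String[] args) throws Exception {
        // Placeholder host, port, database, user, and password.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:vertica://localhost:5433/test_db", "dbadmin", "password")) {
            DatabaseMetaData md = conn.getMetaData();
            // Empty catalog/schema patterns plus "%" match every table in the database,
            // so the driver runs one large metadata query on the server side.
            try (ResultSet rs = md.getTables("", "", "%", new String[] {"TABLE"})) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}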
I get an exception like this:
Caused by: java.sql.SQLNonTransientException: [Vertica]VJDBC ERROR: Request size too big
[Vertica][VJDBC]Detail: Total allocated memory (MiB) = 1024.007 is about to exceed limit imposed by MaxParsedQuerySizeMB (MiB) = 1024.000
at com.vertica.util.ServerErrorData.buildException(Unknown Source)
at com.vertica.dataengine.VResultSet.fetchChunk(Unknown Source)
at com.vertica.dataengine.VResultSet.initialize(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.readExecuteResponse(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.handleExecuteResponse(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.execute(Unknown Source)
at com.vertica.dataengine.SimpleQueryExecutor.execute(Unknown Source)
at com.vertica.dataengine.SimpleQueryExecutor.execute(Unknown Source)
at com.vertica.dataengine.VMetadataSource.runMetadataQuery(Unknown Source)
at com.vertica.dataengine.VMetadataSource.<init>(Unknown Source)
at com.vertica.dataengine.VMetadataSource.<init>(Unknown Source)
at com.vertica.core.VConnection.getMetadataSource(Unknown Source)
at com.vertica.dataengine.VDataEngine.makeNewMetadataSource(Unknown Source)
at com.vertica.dsi.dataengine.impl.DSIDataEngine.makeNewMetadataResult(Unknown Source)
at com.vertica.dataengine.VDataEngine.makeNewMetadataResult(Unknown Source)
at com.vertica.jdbc.jdbc4.S4DatabaseMetaData.createMetaDataResult(Unknown Source)
at com.vertica.jdbc.common.SDatabaseMetaData.getTables(Unknown Source)
The Vertica version is 9.2, upgraded recently.
Thanks.
Answers
Hi,
From your output:
[Vertica][VJDBC]Detail: Total allocated memory (MiB) = 1024.007 is about to exceed limit imposed by MaxParsedQuerySizeMB (MiB) = 1024.000
It looks like the request exceeded the "Maximum amount of memory allowed for parsing a single request" limit set by the MaxParsedQuerySizeMB parameter, but only by a little.
Try increasing the MaxParsedQuerySizeMB parameter for your DB.
Example (my DB is named test_db):
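A sketch of what that could look like, assuming the ALTER DATABASE form of setting configuration parameters and a new limit of 2048 MiB (both are assumptions; check the documentation for your Vertica release), issued here through JDBC to match the question:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class RaiseParsedQueryLimit {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; changing configuration parameters requires superuser privileges.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:vertica://localhost:5433/test_db", "dbadmin", "password");
             Statement stmt = conn.createStatement()) {
            // Raise the per-request parse memory limit from the default 1024 MiB to 2048 MiB.
            stmt.execute("ALTER DATABASE test_db SET MaxParsedQuerySizeMB = 2048");
            // On older releases the equivalent meta-function form may be needed:
            //   SELECT SET_CONFIG_PARAMETER('MaxParsedQuerySizeMB', 2048);
        }
    }
}

The same ALTER DATABASE statement can also be run directly from vsql.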
Hello!
I have the same problem.
Every hour we save new records from query_requests and dc_projection_used to a regular table, and then we make extracts from that table for technical analysis.
Everything worked fine for over a year (9.0, 9.1), but after upgrading to 9.2 a problem appeared.
If the incremental extract contains more than 200k-400k records, we run into the MaxParsedQuerySizeMB limit of 1 GB.
The "request" column has type varchar(64000).
I tried increasing the value, but with a large data interval (29M records) even 16 GB stops helping.