Spark Connector Read
Since we are limited in how many connections we can make as a client, the reader fails with this error: java.sql.SQLNonTransientConnectionException: [Vertica]VJDBC FATAL: New session rejected because connection limit of 16 on database already met for M21176
at com.vertica.util.ServerErrorData.buildException(Unknown Source)
We don't know how many connections the reader really needs: sometimes it is only 1, sometimes it needs many. How can we avoid this exception? (Lifting the limit of 16 is not an option for us.)
Comments
You control the limit on the number of connections via the MAXCONNECTIONS property of individual users or via the configuration parameter MaxClientSessions on the database or individual nodes.
See:
https://www.vertica.com/docs/9.2.x/HTML/Content/Authoring/AdministratorsGuide/ManagingClientConnections/LimitingTheNumberOfClientConnections.htm
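As a sketch of what those two controls look like, assuming an illustrative user name and limits (the identifiers MAXCONNECTIONS, MaxClientSessions, and SET_CONFIG_PARAMETER come from the Vertica documentation linked above; the user name, values, and session query are placeholders):

```sql
-- Per-user connection limit (placeholder user name and value):
ALTER USER spark_reader MAXCONNECTIONS 8;

-- Database-wide limit via the MaxClientSessions configuration parameter:
SELECT SET_CONFIG_PARAMETER('MaxClientSessions', 32);

-- Inspect how many sessions each user currently holds:
SELECT user_name, COUNT(*) AS session_count
FROM sessions
GROUP BY user_name;
```

Raising MaxClientSessions affects the whole database, while MAXCONNECTIONS lets you budget connections per account, which may be the more targeted fix here.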
Hi PaulWu, we need more information to assist you. Can you specify how you are using the Spark connector when you get this error? Is it when moving data from Spark into Vertica, or from Vertica to Spark? What is the workload? How many nodes does your Vertica cluster have? In the V2S case, is your Vertica source table segmented or unsegmented, and how many segments does it have? How many Spark partitions ("numPartitions", default is 16) are you using?
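One detail in the question above is worth spelling out for the V2S (Vertica-to-Spark) direction: each read partition opens its own JDBC session, so with the default numPartitions of 16 the reader can open up to 16 parallel sessions, exactly the limit in the error. A minimal sketch of a read with a lower partition count follows; the DefaultSource format name, option names, and all connection values are assumptions based on the classic Vertica Spark connector, not taken from the thread:

```scala
import org.apache.spark.sql.SparkSession

object VerticaReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("vertica-read-sketch")
      .getOrCreate()

    // Each read partition holds its own JDBC session to Vertica,
    // so numPartitions bounds how many connections the reader opens.
    // Keeping it below the 16-session limit avoids the rejection.
    val df = spark.read
      .format("com.vertica.spark.datasource.DefaultSource") // assumed connector class
      .option("host", "vertica-host") // placeholder
      .option("db", "mydb")           // placeholder
      .option("user", "dbuser")       // placeholder
      .option("password", "secret")   // placeholder
      .option("table", "my_table")    // placeholder
      .option("numPartitions", "4")   // read with 4 sessions instead of 16
      .load()

    println(df.count())
  }
}
```

The trade-off is throughput: fewer partitions means less parallelism in the read, so the right value depends on the workload details asked about above.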