
java.sql.SQLException: No suitable driver found for jdbc:vertica://172.16.251.171:5433/analyticsdb?u

I want to connect to a Vertica database from a Spark driver program using `JdbcRDD`, but I am facing an issue.
Please help me fix this issue!

Description:

I am using Vertica v6.1.1-0, the Vertica JDBC driver vertica-jdk5-6.1.1-0.jar, and Spark 1.1.0. My driver program is written in Scala, and I run it in spark-shell, which provides the SparkContext object as `sc`. While running this program, I get a 'java.sql.SQLException: No suitable driver found for jdbc:vertica' exception.
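For context, this is roughly how spark-shell is launched with the Vertica JDBC jar on the classpath (the jar path below is illustrative, not my actual path):

```shell
# Illustrative path; adjust to wherever vertica-jdk5-6.1.1-0.jar lives.
# --jars ships the jar to the driver and the executors.
./bin/spark-shell --jars /path/to/vertica-jdk5-6.1.1-0.jar
```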



Scala program:

    import org.apache.spark.rdd.JdbcRDD
    import java.sql.{Connection, DriverManager, ResultSet}

    classOf[com.vertica.jdbc.Driver]
    println("com.vertica.jdbc.Driver loaded successfully ")

    val conn_str = "jdbc:vertica://172.16.251.171:5433/analyticsdb?user=dbadmin&password=analytics"
//    val conn = DriverManager.getConnection(conn_str)

    println("Connection with vertica established ")
    val myRDD = new JdbcRDD( sc, () =>
        DriverManager.getConnection(conn_str),
        "select ROLE_NAME from ROLE_TB WHERE ROLE_ID >= ? AND ROLE_ID <= ?",
        0, 10, 2, r => r.getString("ROLE_NAME"))
   
    myRDD.count()


Program Output:

scala> :load Vertica.scala
Loading Vertica.scala...
import org.apache.spark.rdd.JdbcRDD
import java.sql.{Connection, DriverManager, ResultSet}
res0: Class[com.vertica.jdbc.Driver] = class com.vertica.jdbc.Driver
com.vertica.jdbc.Driver loaded successfully
conn_str: String = jdbc:vertica://172.16.251.171:5433/analyticsdb?user=dbadmin&password=analytics
Connection with vertica established
myRDD: org.apache.spark.rdd.JdbcRDD[String] = JdbcRDD[0] at JdbcRDD at <console>:16
23/12/2014 07:02:25 [ERROR] org.apache.spark.Logging$class: Task 0 in stage 0.0 failed 4 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, verticacdn3): java.sql.SQLException: No suitable driver found for jdbc:vertica://172.16.251.171:5433/analyticsdb?user=dbadmin&password=analytics
        java.sql.DriverManager.getConnection(DriverManager.java:596)
        java.sql.DriverManager.getConnection(DriverManager.java:233)
        $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:17)
        $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:16)
        org.apache.spark.rdd.JdbcRDD$$anon$1.<init>(JdbcRDD.scala:73)
        org.apache.spark.rdd.JdbcRDD.compute(JdbcRDD.scala:70)
        org.apache.spark.rdd.JdbcRDD.compute(JdbcRDD.scala:50)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)
Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1185)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1174)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1173)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1173)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:688)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1391)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
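Reading the stack trace, the exception is thrown inside `DriverManager.getConnection` on the executors, so my guess is that `classOf[com.vertica.jdbc.Driver]` only registers the driver in the driver JVM. A sketch of a workaround I am considering (untested; assumes the jar is also shipped to the executors, e.g. via `--jars`) is to force-load the driver class inside the connection closure so registration happens in each executor JVM:

```scala
import java.sql.DriverManager
import org.apache.spark.rdd.JdbcRDD

val conn_str = "jdbc:vertica://172.16.251.171:5433/analyticsdb?user=dbadmin&password=analytics"

val myRDD = new JdbcRDD(sc, () => {
    // Register the Vertica driver in the executor JVM before connecting;
    // the closure runs on each executor, not just on the driver.
    Class.forName("com.vertica.jdbc.Driver")
    DriverManager.getConnection(conn_str)
  },
  // JdbcRDD requires exactly two '?' placeholders for the partition bounds.
  "select ROLE_NAME from ROLE_TB WHERE ROLE_ID >= ? AND ROLE_ID <= ?",
  0, 10, 2,
  r => r.getString("ROLE_NAME"))
```

Would this be the right way to make the driver visible to the executors, or is there a preferred classpath setting?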
