I am using spark-sql 2.4.1 and trying to connect to an Oracle DB from my Spark program.

Spark program:
val o_url = "jdbc:oracle:thin: etc ..." // it is correct and working
val query = "( SELECT 1 FROM DUAL ) T"

val dfReader = spark.read.format("jdbc")
  .option("url", o_url)
  .option("driver", "oracle.jdbc.OracleDriver")
  .option("user", "ABC")
  .option("password", "ZYX")
  .option("fetchsize", "10000")

val ss = dfReader
  .option("inferSchema", true)
  .option("schema", "LDF")
  .option("dbtable", query)
  .load()

println(" Table Count : " + ss.count())
spark.close()
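The reader setup above boils down to a map of JDBC options. A minimal sketch of that map, with the hypothetical helper name jdbcOptions and the question's dummy credentials; note that the "dbtable" value must be either a table name or a parenthesized subquery with an alias, which is why the question wraps the SELECT in "( ... ) T":

```scala
// Sketch: assemble the JDBC options Spark's DataFrameReader receives.
// Values other than the driver class are the question's placeholders.
def jdbcOptions(url: String, user: String, password: String, query: String): Map[String, String] =
  Map(
    "url"       -> url,
    "driver"    -> "oracle.jdbc.OracleDriver",
    "user"      -> user,
    "password"  -> password,
    "fetchsize" -> "10000",
    "dbtable"   -> s"( $query ) T" // subquery form requires an alias
  )
```

These options would then be applied with `.options(jdbcOptions(...))` instead of chained `.option(...)` calls; the result is the same.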
Of course, I added the "ojdbc14.jar" jar to the classpath.

I get the following error:

Error
java.sql.SQLException: Io exception: NL Exception was generated
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:146)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:255)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:387)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:414)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:165)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:35)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:801)
at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:115)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:52)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
What is wrong here, and how do I fix it?

Answer:

The line

val o_url = "jdbc:oracle:thin: etc ..." // correct and working

was earlier

val o_url = "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)...;"

I fixed it by removing the stray ";" inside the quotes, as follows:

val o_url = "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)..."
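Junk characters inside the quoted TNS descriptor, like the stray ";" here, are a typical cause of the driver's "NL Exception was generated" error. A hedged sketch of a sanity check one could run on the URL before handing it to Spark (the function name sanitizeJdbcUrl and the host/port in the test are hypothetical):

```scala
// Sketch: strip a trailing ';' that slipped inside the URL string and
// verify the TNS descriptor's parentheses are balanced. Either defect
// can make the Oracle thin driver fail to parse the connect descriptor.
def sanitizeJdbcUrl(raw: String): String = {
  val trimmed = raw.trim.stripSuffix(";") // drop a stray trailing semicolon
  val depth = trimmed.foldLeft(0) {
    case (d, '(') => d + 1
    case (d, ')') => d - 1
    case (d, _)   => d
  }
  require(depth == 0, s"Unbalanced parentheses in JDBC URL: $trimmed")
  trimmed
}
```

This does not replace fixing the URL at the source, but it turns a cryptic driver-side failure into an immediate, readable error on the Spark side.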