I am new to Spark and am trying to write a DataFrame to a DB2 table. The error I get is:

```
Exception in thread "main" java.lang.IllegalArgumentException: Can't get JDBC type for struct<date:int,day:int,hours:int,minutes:int,month:int,seconds:int,time:bigint,timeZoneOffset:int,year:int>
```

My database schema is:

```
localId        <-- Integer type
effectiveDate  <-- Timestamp
activityDate   <-- Timestamp
inDate         <-- Timestamp
outDate        <-- Timestamp
```

I created a POJO class for my DB table, as follows:

```java
public class StowageTable {
    private long localId;
    private Date effectiveDate;
    private Date activityDate;
    private Date inDate;
    private Date outDate;
    // setters and getters
}
```

Then I basically read a CSV that has the same schema as the DB table, like this:

```java
JavaRDD<String> dataFromCSV = javaSparkContext.textFile(fileURL);

// Then I create a JavaRDD of the POJO type
JavaRDD<StowageTable> dataToPOJO = dataFromCSV.map((Function<String, StowageTable>) line -> {
    String[] fields = line.split(",");
    StowageTable st = createNewStowageTable(fields);
    return st;
});

// Converting the RDD to a DataFrame
DataFrame stowageTableDF = sqlContext.createDataFrame(dataToPOJO, StowageTable.class);

// Call the JDBC persister
persistToTable(stowageTableDF);
```

My persistToTable(DataFrame df) method looks like this:

```java
private void persistToTable(DataFrame df) {
    Class.forName(""); // driver here
    // skipping a few lines for brevity
    df.write().mode(SaveMode.Append).jdbc(url, table, connectionProperties);
}
```

I found some related questions here: "Spark DataFrame write to JDBC - Can't get JDBC type for array<array<int>>" and "java.lang.IllegalArgumentException: Can't get JDBC type for array<string>", but could not find anything that addresses the date/time data-type issue. Please suggest a solution. I am on Spark 1.6.3.
1 Answer

www replied:
由于我還找不到任何答案,并且在此期間為自己找到了解決方案,因此這是基本思想。如果數(shù)據(jù)庫的數(shù)據(jù)類型為時間戳,那么您必須在對象的 POJO 中使用時間戳,然后將該時間戳轉(zhuǎn)換為 spark 的結(jié)構(gòu)類型。