Scala & Spark in Practice

Error: `java.lang.Long is not a valid external type for schema of string` and `java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: java.lang.String is not a valid external type for schema of bigint`

Original code:

```scala
val rddStatsEcSubsDay4G = sc.textFile(path + "test").map(_.split(","))
  .map(r => Row(r(0), r(1), DateTimeTool.toTimestamp(r(2), pattern)))
```
```scala
sqlContext.createDataFrame(rddStatsEcSubsDay4G, Schema.TEST)
```

Cause: the types produced by the map do not match the schema. Schema.TEST declares the field as Long, but the map leaves it as a String; changing r(1) to r(1).toLong fixes it.

Error: `java.lang.IllegalArgumentException: Invalid format: "2016-11-22 15:42:42" is malformed at "-11-22 15:42:42"`
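This "malformed at" message (the wording suggests Joda-Time underneath the post's DateTimeTool helper) means the pattern matched only a prefix of the input and parsing failed at the first mismatched separator. A minimal reproduction in plain Java using java.time instead of Joda, with illustrative values:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class PatternMismatch {
    public static void main(String[] args) {
        String raw = "2016-11-22 15:42:42";

        // Slash pattern against dash data: only "yyyy" (2016) matches,
        // then parsing fails at the first '-', mirroring the error above.
        DateTimeFormatter wrong = DateTimeFormatter.ofPattern("yyyy/MM/dd HH:mm:ss");
        try {
            LocalDateTime.parse(raw, wrong);
        } catch (DateTimeParseException e) {
            System.out.println("wrong pattern: " + e.getMessage());
        }

        // A pattern matching the data's actual separators parses cleanly.
        DateTimeFormatter right = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        System.out.println("right pattern: " + LocalDateTime.parse(raw, right));
    }
}
```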
Cause: the data arrives as yyyy-MM-dd HH:mm:ss, but the pattern passed to DateTimeTool.toDate(r(8), pattern) in the map was "yyyy/MM/dd HH:mm:ss". As the message indicates, only the year matched; everything after it failed to parse.

Error: `java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: java.sql.Date is not a valid external type for schema of timestamp`

This happens because TestBase maps the fields to Date:

```scala
val rddGroupInfo = sc.textFile(path + "cm_cu_groupinfo")
  .map(_.replaceAll("null", "0"))
  .map(_.split(","))
  .map(r => Row(r(0), r(1), r(2), r(3), r(4), r(5), r(6),
    DateTimeTool.toDate(r(7), pattern), DateTimeTool.toDate(r(8), pattern)))
```

Changing the toDate calls (bolded in the original post) to toTimestamp fixes it; once again the type produced in the map did not match the schema type.

Error: `java.lang.ArrayIndexOutOfBoundsException: 22`

Cause: columns 23 and 24 of the input were entirely null; because the missing values were never filled in, the split row had fewer fields than expected.
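The disappearing columns come from split's default behavior: trailing empty strings are dropped, so a row whose last columns are empty yields a shorter array, and indexing the missing positions throws ArrayIndexOutOfBoundsException. Besides padding the nulls as the post does, Scala's String.split (which delegates to Java's) accepts a negative limit that keeps trailing empties; a sketch in plain Java:

```java
public class SplitTrailing {
    public static void main(String[] args) {
        // A CSV row whose last two columns are empty.
        String line = "a,b,c,,";

        // Default split drops trailing empty strings: 3 fields survive.
        String[] lossy = line.split(",");
        System.out.println(lossy.length);   // 3

        // A negative limit keeps every field, empty or not: 5 fields.
        String[] full = line.split(",", -1);
        System.out.println(full.length);    // 5
    }
}
```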

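Both encoder errors above reduce to one rule: the object placed in each Row slot must be the external JVM type Spark expects for that schema column, e.g. java.lang.Long for bigint and java.sql.Timestamp for timestamp (java.sql.Date only fits a DateType column and carries no time of day). The corresponding plain-Java conversions, with illustrative values:

```java
import java.sql.Date;
import java.sql.Timestamp;

public class ExternalTypes {
    public static void main(String[] args) {
        // bigint column: the Row must carry a Long, not the raw String
        // (the r(1).toLong fix from the first error).
        String rawId = "42";
        Long id = Long.parseLong(rawId);

        // timestamp column: java.sql.Timestamp keeps the time of day;
        // java.sql.Date truncates to the day and only fits a date column.
        Timestamp ts = Timestamp.valueOf("2016-11-22 15:42:42");
        Date d = Date.valueOf("2016-11-22");

        System.out.println(id + " | " + ts + " | " + d);
    }
}
```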