Spark SQL with MongoDB: Reading and Writing

Personally tested and working.
Spark version 2.2.1
Environment: Huawei Cloud MapReduce cluster, Huawei Cloud Document Database Service (MongoDB-compatible)

Required Maven dependencies (the connector version should match the Spark build; 2.2.1 matches the Spark 2.2.1 used here and the jar loaded in spark-shell below):
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.1</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>bson</artifactId>
    <version>3.7.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver-core</artifactId>
    <version>3.7.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver</artifactId>
    <version>3.7.0</version>
</dependency>


Seed the MongoDB database MyDB with a user collection:
use MyDB;
db.user.save({ID:"1",Name:"A",Gender:"F",Birthday:"1996-09-12"});
db.user.save({ID:"2",Name:"B",Gender:"M",Birthday:"1995-12-23"});
db.user.save({ID:"3",Name:"C",Gender:"M",Birthday:"1996-10-29"});
db.user.save({ID:"4",Name:"D",Gender:"M",Birthday:"1995-02-25"});
db.user.save({ID:"5",Name:"E",Gender:"F",Birthday:"1997-06-06"});

Launch spark-shell with the connector and driver jars (staged on HDFS here):
spark-shell \
--jars hdfs:///tmp/test/jars/bson-3.7.0.jar,hdfs:///tmp/test/jars/mongodb-driver-3.7.0.jar,hdfs:///tmp/test/jars/mongodb-driver-core-3.7.0.jar,hdfs:///tmp/test/jars/mongo-spark-connector_2.11-2.2.1.jar \
--master yarn
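Alternatively, if the cluster has internet access, the connector and its transitive driver dependencies can be pulled from Maven Central with --packages instead of staging jars on HDFS (a sketch; not tested on the Huawei Cloud cluster):

```shell
# --packages resolves mongo-spark-connector and its mongo driver
# dependencies from Maven Central at launch time
spark-shell \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.1 \
  --master yarn
```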

// Read: load the user collection from MyDB into a DataFrame
val readDatabaseName = "MyDB"
val readCollectionName = "user"
// Authenticate as rwuser against the admin database
val connectionString = Option("mongodb://rwuser:3363018!tiaN@192.168.1.16:8635/admin")
val readConfig = com.mongodb.spark.config.ReadConfig(readDatabaseName, readCollectionName, connectionString)
val df = com.mongodb.spark.MongoSpark.load(spark, readConfig)
df.show()

// Write: save the DataFrame to the user collection of a new database newMyDB
val writeDatabaseName = "newMyDB"
val writeCollectionName = "user"
val writeConfig = com.mongodb.spark.config.WriteConfig(writeDatabaseName, writeCollectionName, connectionString)
com.mongodb.spark.MongoSpark.save(df.write, writeConfig)
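Once the DataFrame is loaded, Spark SQL can query the MongoDB data directly by registering a temporary view (a sketch; the view name and the WHERE clause are illustrative, not from the original post):

```scala
// Register the MongoDB-backed DataFrame as a temp view for Spark SQL
df.createOrReplaceTempView("user")

// Example query: female users born on or after 1996-01-01
val result = spark.sql(
  "SELECT Name, Birthday FROM user WHERE Gender = 'F' AND Birthday >= '1996-01-01'")
result.show()
```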


    Original author: 王社英
    Original link: https://www.jianshu.com/p/2715cef7fd74
    This article is reposted from the web to share knowledge; in case of infringement, please contact the blogger for removal.