[mongodb-user] Re: Is there a mongo-hadoop-spark built on Scala 2.10?
Re: Is there a mongo-hadoop-spark built on Scala 2.10?
Wan Bachtiar
Mon, 2 May 2016 17:40:24 -0700 (PDT)
> Maybe I understood something wrong, but are all the versions of the
> mongo-hadoop-spark dependent on spark-core_2.11?
If you are referring to the mongo-hadoop connector for Spark, then, as
mentioned in the installation section of its documentation, there are only
two dependencies for running the MongoDB Hadoop Connector with Spark:
- The ‘spark’ jar, called mongo-hadoop-spark.jar
- The MongoDB Java Driver ‘uber’ jar, called mongo-java-driver.jar
Assuming you are using the latest Apache Spark, currently v1.6.1, which is
built against Scala v2.10 by default, you could add the .jar files above to
the class path. For example, using the spark shell (Scala v2.10) you could
specify:
./spark-1.6.1-bin-hadoop2.6/bin/spark-shell --jars mongo-java-driver-3.2.2.jar,mongo-hadoop-spark-1.5.2.jar
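Once those jars are on the classpath, a minimal sketch of reading a collection
from the spark shell via the connector's MongoInputFormat might look like the
following. The connection URI, database, and collection names are placeholders
you would replace with your own; `sc` is the SparkContext the shell provides:

```scala
import org.apache.hadoop.conf.Configuration
import com.mongodb.hadoop.MongoInputFormat
import org.bson.BSONObject

// Point the connector at a collection (placeholder host/db/collection).
val mongoConfig = new Configuration()
mongoConfig.set("mongo.input.uri",
  "mongodb://localhost:27017/test.collection")

// Each record is an (ObjectId, document) pair.
val documents = sc.newAPIHadoopRDD(
  mongoConfig,
  classOf[MongoInputFormat],
  classOf[Object],      // key: the document's _id
  classOf[BSONObject])  // value: the BSON document

println(documents.count())
```

Note that this goes through the Hadoop input-format API, so it only needs
Scala v2.10 (whatever the Spark build uses), not the Scala driver.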
However, if you are referring to the MongoDB Scala Driver, it currently only
offers compatibility with Scala v2.11, although the Scala Driver is not
required for setting up the MongoDB Hadoop Connector with Spark.
You received this message because you are subscribed to the Google Groups "mongodb-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to mongodb-user+unsubscribe@xxxxxxxxxxxxxxxx.
To post to this group, send email to mongodb-user@xxxxxxxxxxxxxxxx.