Various configuration options are available for the MongoDB Spark
Connector. To learn more about the options you can set, see
:ref:`spark-write-conf` and :ref:`spark-read-conf`.

Specify Configuration
---------------------

.. _spark-conf:

Using ``SparkConf``
~~~~~~~~~~~~~~~~~~~

You can specify configuration options with ``SparkConf`` using any of
the following approaches:

.. tabs-selector:: drivers

.. tabs-drivers::

   tabs:
     - id: java-sync
       content: |

         - The ``SparkConf`` constructor in your application. To learn more, see the `Java SparkConf documentation <https://spark.apache.org/docs/latest/api/java/index.html?org/apache/spark/SparkConf.html>`__.

     - id: python
       content: |

         - The ``SparkConf`` constructor in your application. To learn more, see the `Python SparkConf documentation <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.SparkConf.html>`__.

     - id: scala
       content: |

         - The ``SparkConf`` constructor in your application. To learn more, see the `Scala SparkConf documentation <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/SparkConf.html>`__.

- The ``--conf`` flag at runtime. To learn more, see
  `Dynamically Loading Spark Properties <https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties>`__ in
  the Spark documentation.

- The ``$SPARK_HOME/conf/spark-default.conf`` file.

The MongoDB Spark Connector will use the settings in ``SparkConf`` as
defaults.

.. important::

   When setting configurations with ``SparkConf``, you must prefix the
   configuration options. Refer to :ref:`spark-write-conf` and
   :ref:`spark-read-conf` for the specific prefixes.
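
For example, the following sketch sets prefixed connector options as
``SparkConf`` defaults from PySpark. The ``spark.mongodb.read.connection.uri``
and ``spark.mongodb.write.connection.uri`` keys shown here are assumptions for
illustration, and the URI is a placeholder; confirm the exact prefixes and
option names in :ref:`spark-read-conf` and :ref:`spark-write-conf`.

.. code-block:: python

   from pyspark.sql import SparkSession

   # Build a SparkSession whose SparkConf carries prefixed connector options.
   # These settings act as defaults; options supplied at read or write time
   # override them. The URI, database, and collection are placeholders.
   spark = (
       SparkSession.builder
       .appName("mongodb-config-example")
       .config("spark.mongodb.read.connection.uri",
               "mongodb://127.0.0.1/test.myCollection")
       .config("spark.mongodb.write.connection.uri",
               "mongodb://127.0.0.1/test.myCollection")
       .getOrCreate()
   )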

.. _options-map:

Using an Options Map
~~~~~~~~~~~~~~~~~~~~

You can also specify configuration options with the ``option()`` method of
``DataFrameReader`` and ``DataFrameWriter``:

.. tabs-drivers::

   tabs:
     - id: java-sync
       content: |

         Refer to the Java Spark documentation for the ``.option()``
         method.

     - id: python
       content: |

         To learn more about specifying options with
         `DataFrameReader <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.DataFrameReader.option.html#pyspark.sql.DataFrameReader.option>`__ and ``DataFrameWriter``,
         refer to the Python Spark documentation for the ``.option()``
         method.

     - id: scala
       content: |

         To learn more about specifying options with
         `DataFrameReader <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/DataFrameReader.html#option(key:String,value:Double):org.apache.spark.sql.DataFrameReader>`__ and ``DataFrameWriter``,
         refer to the Scala Spark documentation for the ``.option()``
         method.
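
As an illustration of this approach, here is a minimal PySpark sketch that
passes options per operation. The ``mongodb`` format name and the
``connection.uri``, ``database``, and ``collection`` option keys are
assumptions for this example; the URI and namespace values are placeholders,
and the authoritative option names are listed in :ref:`spark-read-conf` and
:ref:`spark-write-conf`.

.. code-block:: python

   from pyspark.sql import SparkSession

   spark = SparkSession.builder.appName("mongodb-options-example").getOrCreate()

   # Options supplied through .option() apply only to this read and take
   # precedence over any SparkConf defaults.
   df = (
       spark.read.format("mongodb")
       .option("connection.uri", "mongodb://127.0.0.1")  # placeholder URI
       .option("database", "people")                     # placeholder database
       .option("collection", "contacts")                 # placeholder collection
       .load()
   )

   # The same pattern applies to writes through DataFrameWriter.
   (
       df.write.format("mongodb")
       .mode("append")
       .option("database", "people")
       .option("collection", "contacts")
       .save()
   )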