Various configuration options are available for the MongoDB Spark
Connector.

Specify Configuration
---------------------

Via ``SparkConf``
~~~~~~~~~~~~~~~~~

You can specify these options via ``SparkConf`` using the ``--conf``
setting or the ``$SPARK_HOME/conf/spark-defaults.conf`` file, and the
MongoDB Spark Connector will use the settings in ``SparkConf`` as the
defaults.

.. important::

   When setting configurations via ``SparkConf``, you must prefix the
   configuration options. Refer to the configuration sections for the
   specific prefixes.
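
For illustration only, the following sketch sets two prefixed options on a
``SparkConf`` before creating the session; the ``spark.mongodb.input.uri``
and ``spark.mongodb.output.uri`` keys and the example URI are assumptions,
so check the read and write configuration sections for the exact option
names. The same keys can also be passed with ``--conf`` at launch time or
placed in ``$SPARK_HOME/conf/spark-defaults.conf``.

.. code-block:: scala

   import org.apache.spark.SparkConf
   import org.apache.spark.sql.SparkSession

   // Illustrative keys and URI: see the read/write configuration
   // sections for the exact prefixed option names.
   val conf = new SparkConf()
     .setAppName("mongo-spark-config-example")
     .set("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
     .set("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")

   val spark = SparkSession.builder().config(conf).getOrCreate()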

Via ``ReadConfig`` and ``WriteConfig``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Various methods in the MongoDB Connector API accept an optional
``ReadConfig`` or ``WriteConfig`` object. Options specified this way
override any corresponding settings in ``SparkConf``.
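
As a rough sketch (assuming the Scala API's ``com.mongodb.spark.config``
package and the ``MongoSpark`` helper, with illustrative option keys), a
custom ``ReadConfig`` or ``WriteConfig`` can be built from an options map
plus the ``SparkConf`` defaults and passed to a load or save call:

.. code-block:: scala

   import com.mongodb.spark.MongoSpark
   import com.mongodb.spark.config.{ReadConfig, WriteConfig}
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().getOrCreate()

   // Override the collection and read preference for this read only;
   // unspecified settings fall back to the SparkConf defaults.
   val readConfig = ReadConfig(
     Map("collection" -> "spark", "readPreference.name" -> "secondaryPreferred"),
     Some(ReadConfig(spark.sparkContext)))
   val df = MongoSpark.load(spark, readConfig)

   // A WriteConfig overrides write-side settings in the same way.
   val writeConfig = WriteConfig(
     Map("collection" -> "spark", "writeConcern.w" -> "majority"),
     Some(WriteConfig(spark.sparkContext)))
   MongoSpark.save(df, writeConfig)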

In the Spark API, some methods (e.g. ``DataFrameReader`` and
``DataFrameWriter``) accept options in the form of a ``Map[String,
String]``.
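
In practice that might look like the following sketch, where the option
keys are illustrative and the connection defaults are assumed to already
be set in ``SparkConf``:

.. code-block:: scala

   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().getOrCreate()

   // Per-read options supplied as a Map[String, String].
   val df = spark.read
     .format("com.mongodb.spark.sql.DefaultSource")
     .options(Map("collection" -> "spark", "readPreference.name" -> "secondaryPreferred"))
     .load()

   // A single option can also be set with .option(key, value) on a write.
   df.write
     .format("com.mongodb.spark.sql.DefaultSource")
     .option("collection", "sparkCopy")
     .mode("append")
     .save()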

You can convert custom ``ReadConfig`` or ``WriteConfig`` settings into
a ``Map`` via the ``asOptions()`` method.
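
For example (a sketch that reuses the illustrative ``ReadConfig`` from
above), the converted map can be handed straight to a ``DataFrameReader``:

.. code-block:: scala

   import com.mongodb.spark.config.ReadConfig
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().getOrCreate()

   val readConfig = ReadConfig(
     Map("collection" -> "spark", "readPreference.name" -> "secondaryPreferred"),
     Some(ReadConfig(spark.sparkContext)))

   // asOptions flattens the ReadConfig back into a Map[String, String]
   // that the generic DataFrameReader API accepts.
   val df = spark.read
     .format("com.mongodb.spark.sql.DefaultSource")
     .options(readConfig.asOptions)
     .load()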

To learn more about specifying options on a read or write, refer to the
Spark documentation for the ``.option()`` method on ``DataFrameReader``
and ``DataFrameWriter``.