source/structured-streaming.txt
29 additions, 27 deletions
@@ -211,14 +211,14 @@ more about continuous processing, see the `Spark documentation <https://spark.ap
     .load()
 )

-query = (streamingDataFrame
+dataStreamWriter = (streamingDataFrame
     .writeStream
     .trigger(continuous="1 second")
     .format("memory")
     .outputMode("append")
 )

-query.start()
+query = dataStreamWriter.start()

 .. note::
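The point of the rename in this hunk is visible in the API's types: the fluent ``writeStream`` chain only configures a ``DataStreamWriter``; nothing runs, and no ``StreamingQuery`` exists, until ``start()`` is called. The following pure-Python mock (class names mirror Spark's, but this is an illustrative sketch, not the real pyspark API) shows that two-phase pattern:

```python
# Illustrative mock of Spark's two-phase streaming-write API (NOT pyspark):
# a DataStreamWriter only accumulates configuration; start() is what
# produces a running StreamingQuery handle.

class StreamingQuery:
    """Handle for a started stream; supports stop()."""
    def __init__(self, config):
        self.config = config
        self.isActive = True

    def stop(self):
        self.isActive = False


class DataStreamWriter:
    """Accumulates configuration; nothing runs until start()."""
    def __init__(self):
        self._conf = {}

    def trigger(self, **kwargs):
        self._conf["trigger"] = kwargs
        return self  # fluent: each setter returns the writer itself

    def format(self, fmt):
        self._conf["format"] = fmt
        return self

    def outputMode(self, mode):
        self._conf["outputMode"] = mode
        return self

    def start(self):
        # only here does a query come into existence
        return StreamingQuery(dict(self._conf))


dataStreamWriter = (DataStreamWriter()
    .trigger(continuous="1 second")
    .format("memory")
    .outputMode("append"))

query = dataStreamWriter.start()
```

Naming the intermediate value `dataStreamWriter` and reserving `query` for the result of `start()` keeps each variable's name aligned with what it actually holds, which is exactly the change this diff makes.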
@@ -279,12 +279,12 @@ more about continuous processing, see the `Spark documentation <https://spark.ap
     .format("mongodb")
     .load()

-val query = streamingDataFrame.writeStream
+val dataStreamWriter = streamingDataFrame.writeStream
     .trigger(Trigger.Continuous("1 second"))
     .format("memory")
     .outputMode("append")

-query.start()
+val query = dataStreamWriter.start()

 .. note::
@@ -334,7 +334,7 @@ Stream to MongoDB from a CSV File
     .getOrCreate()

 # define a streaming query
-query = (spark
+dataStreamWriter = (spark
     .readStream
     .format("csv")
     .option("header", "true")
@@ -352,7 +352,7 @@ Stream to MongoDB from a CSV File
 )

 # run the query
-query.start()
+query = dataStreamWriter.start()

 - id: scala
   content: |
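The Python tab above streams a CSV file into MongoDB. Spark's file source treats a directory of CSV files as an unbounded table: on each trigger it plans a micro-batch from files it has not yet processed, and ``.option("header", "true")`` makes the first line name the columns. A pure-Python sketch of that idea (hypothetical helper names, not Spark's implementation):

```python
import csv
import io

# Sketch (NOT the Spark API) of how a file-based streaming source behaves:
# track which files have been processed, and plan each micro-batch from
# only the files that are new since the last trigger.

def plan_batch(all_files, processed):
    """Return the not-yet-processed files for the next micro-batch."""
    return sorted(set(all_files) - processed)

def parse_csv(text):
    """Parse CSV text into dict rows, mirroring .option('header', 'true')."""
    return list(csv.DictReader(io.StringIO(text)))

processed = set()

# first trigger: two files are present, both are new
batch1 = plan_batch(["a.csv", "b.csv"], processed)
processed.update(batch1)

# second trigger: one file arrived since then; only it is planned
batch2 = plan_batch(["a.csv", "b.csv", "c.csv"], processed)

rows = parse_csv("name,qty\nwidget,2\n")
```

This is why the example can run forever against a growing directory: each trigger does incremental work, never re-reading input it has already written to MongoDB.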
@@ -381,7 +381,7 @@ Stream to MongoDB from a CSV File
     .getOrCreate()

 // define a streaming query
-val query = spark.readStream
+val dataStreamWriter = spark.readStream
     .format("csv")
     .option("header", "true")
     .schema(<csv-schema>)
@@ -397,10 +397,10 @@ Stream to MongoDB from a CSV File
     .outputMode("append")

 // run the query
-query.start()
+val query = dataStreamWriter.start()

-Stream to a CSV File from MongoDB
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Stream to your Console from MongoDB
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 .. tabs-drivers::
@@ -409,17 +409,19 @@ Stream to a CSV File from MongoDB
 - id: python
   content: |

-    To create a :ref:`read stream <read-structured-stream>` to a
-    ``.csv`` file from MongoDB, first create a `DataStreamReader <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.streaming.DataStreamReader.html>`__
+    To create a :ref:`read stream <read-structured-stream>`
+    output to your console from MongoDB, first create a `DataStreamReader <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.streaming.DataStreamReader.html>`__
     from MongoDB, then use that ``DataStreamReader`` to
     create a `DataStreamWriter <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.streaming.DataStreamWriter.html>`__
-    to a new ``.csv`` file. Finally, use the ``start()`` method
+    to the console. Finally, use the ``start()`` method
     to begin the stream.

     As new data is inserted into MongoDB, MongoDB streams that
-    data out to a ``.csv`` file in the `outputMode <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.streaming.DataStreamWriter.outputMode.html#pyspark.sql.streaming.DataStreamWriter.outputMode>`__
+    data out to your console in the `outputMode <https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.streaming.DataStreamWriter.outputMode.html#pyspark.sql.streaming.DataStreamWriter.outputMode>`__
     you specify.

+    .. include:: /includes/warn-console-stream.txt
+
     .. code-block:: python
        :copyable: true
        :emphasize-lines: 19, 27, 30
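Both console examples write with ``outputMode("append")``: on each trigger, only rows added to the result table since the previous trigger reach the sink, whereas ``complete`` mode re-emits the whole result table every time. A pure-Python sketch of that contrast (not Spark's implementation; a console-like sink is simulated by returning the batch that would be printed):

```python
# Sketch (NOT the Spark API) of streaming output-mode semantics:
# "append" emits only rows new since the last trigger; "complete"
# re-emits the entire result table on every trigger.

def emit(result_table, already_emitted, mode):
    """Return the rows a console-like sink would print this trigger."""
    if mode == "append":
        return result_table[already_emitted:]  # only the new tail
    elif mode == "complete":
        return list(result_table)              # the full table, every time
    raise ValueError(f"unsupported mode: {mode}")

table = [{"_id": 1}, {"_id": 2}]
first = emit(table, 0, "append")    # both rows are new on the first trigger

table.append({"_id": 3})            # a document is inserted into MongoDB
second = emit(table, 2, "append")   # only the inserted row is printed
full = emit(table, 2, "complete")   # complete mode would resend everything
```

Append mode is the natural fit for this page's examples, since each inserted MongoDB document should appear on the console exactly once.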
@@ -438,10 +440,10 @@ Stream to a CSV File from MongoDB
@@ -451,29 +453,30 @@ Stream to a CSV File from MongoDB
     .load()
     # manipulate your streaming data
     .writeStream
-    .format("csv")
-    .option("path", "/output/")
+    .format("console")
     .trigger(continuous="1 second")
     .outputMode("append")
 )

 # run the query
-query.start()
+query = dataStreamWriter.start()

 - id: scala
   content: |

-    To create a :ref:`read stream <read-structured-stream>` to a
-    ``.csv`` file from MongoDB, first create a `DataStreamReader <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamReader.htmll>`__
+    To create a :ref:`read stream <read-structured-stream>`
+    output to your console from MongoDB, first create a `DataStreamReader <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamReader.htmll>`__
     from MongoDB, then use that ``DataStreamReader`` to
     create a `DataStreamWriter <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamWriter.html>`__
-    to a new ``.csv`` file. Finally, use the ``start()`` method
+    to the console. Finally, use the ``start()`` method
     to begin the stream.

     As new data is inserted into MongoDB, MongoDB streams that
-    data out to a ``.csv`` file in the `outputMode <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamWriter.html#outputMode(outputMode:String):org.apache.spark.sql.streaming.DataStreamWriter[T]>`__
+    data out to your console in the `outputMode <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamWriter.html#outputMode(outputMode:String):org.apache.spark.sql.streaming.DataStreamWriter[T]>`__
     you specify.

+    .. include:: /includes/warn-console-stream.txt
+
     .. code-block:: scala
        :copyable: true
        :emphasize-lines: 17, 25, 28
@@ -494,7 +497,7 @@ Stream to a CSV File from MongoDB