Commit 82bee58

DOCSP-19649: Audited Kafka connector names. (#91)
1 parent faf5456


43 files changed: +84 −84 lines changed

source/contribute.txt

Lines changed: 2 additions & 2 deletions
@@ -39,7 +39,7 @@ Gradle checks. You can run the checks with the following command:
 .. note:: Skipped Tests

    You can skip tests in the ``integrationTest`` task related to
-   the following areas unless your code specifically modifies {+connector+} behavior
+   the following areas unless your code specifically modifies connector behavior
    related to these areas:

    - Specific versions of MongoDB
@@ -54,7 +54,7 @@ Gradle checks. You can run the checks with the following command:
 You can run tests related to a specific MongoDB version by deploying a local replica set
 with that version of MongoDB.

-To learn more about the {+connector+} source code, see the :github:`GitHub repository <mongodb/mongo-kafka>`.
+To learn more about the connector source code, see the :github:`GitHub repository <mongodb/mongo-kafka>`.

 To learn more about Gradle, see the official
 `Gradle website <https://docs.gradle.org/>`__.
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-A {+source-connector+} works by opening a single change stream with
+The source connector works by opening a single change stream with
 MongoDB and sending data from that change stream to {+kafka-connect+}. Your source
 connector maintains its change stream for the duration of its runtime, and your
 connector closes its change stream when you stop it.

source/introduction/connect.txt

Lines changed: 2 additions & 2 deletions
@@ -60,7 +60,7 @@ to interact with MongoDB.
 Version {+connector_version+} of the {+connector+} uses version
 {+connector_driver_version+} of the MongoDB Java driver.

-To learn what connection URI options are available in the {+connector+}, see
+To learn what connection URI options are available in the connector, see
 `the MongoDB Java driver Connection guide <{+connector_driver_url_base+}fundamentals/connection/#connection-options>`__.

 Authentication
@@ -86,5 +86,5 @@ The following is an example of a connection URI that authenticates with MongoDB
 To learn what authentication mechanisms are available, see
 `the MongoDB Java driver Authentication Mechanisms guide <{+connector_driver_url_base+}fundamentals/auth/#mechanisms>`__.

-To learn more about authentication in the {+connector+}, see the
+To learn more about authentication in the connector, see the
 :doc:`Security and Authentication guide </security-and-authentication>`.
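
For context, a connection URI like the one this file describes typically combines host, credentials, and authentication options in one setting. The following is an illustrative sketch only; every angle-bracket value is a placeholder, not a value from this commit:

```properties
# Illustrative connection.uri setting for a connector properties file.
# All angle-bracket values are placeholders, not values from this commit.
connection.uri=mongodb://<username>:<password>@<hostname>:27017/?authSource=admin&authMechanism=SCRAM-SHA-256
```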

source/introduction/converters.txt

Lines changed: 7 additions & 7 deletions
@@ -19,7 +19,7 @@ This guide describes how to use **converters** with the {+connector+}.
 Converters are programs that translate between bytes and
 {+kafka-connect+}'s runtime data format.

-Converters pass data between {+kafka-connect+} and Apache Kafka. The {+connector+} passes data
+Converters pass data between {+kafka-connect+} and Apache Kafka. The connector passes data
 between MongoDB and {+kafka-connect+}. The following diagram shows these relationships:

 .. figure:: /includes/figures/converters.png
@@ -34,12 +34,12 @@ To learn more about converters, see the following resources:
 Available Converters
 --------------------

-As the {+connector+} converts your MongoDB data into {+kafka-connect+}'s runtime data
-format, the {+connector+} works with all available converters.
+As the connector converts your MongoDB data into {+kafka-connect+}'s runtime data
+format, the connector works with all available converters.

 .. important:: Use the Same Converter for your Source and Sink Connectors

-   You must use the same converter in your source and sink connectors.
+   You must use the same converter in your {+source-connector+} and {+sink-connector+}.
    For example, if your source connector writes to a topic using Protobuf, your
    sink connector must use Protobuf to read from the topic.

@@ -49,7 +49,7 @@ Converters with Schemas
 ~~~~~~~~~~~~~~~~~~~~~~~

 If you use a schema-based converter such as the converter for Avro, Protobuf, or
-JSON Schema, you should define a schema in your {+connector+} source connector.
+JSON Schema, you should define a schema in your source connector.

 To learn how to specify a schema, see the
 :ref:`<kafka-source-apply-schemas>` guide.
@@ -58,7 +58,7 @@ Connector Configuration
 -----------------------

 This section provides templates for properties files to configure the following
-converters in a {+connector+} pipeline:
+converters in a connector pipeline:

 - :ref:`Avro Converter <avro-converter-sample-properties>`
 - :ref:`Protobuf Converter <protobuf-converter-sample-properties>`
@@ -274,7 +274,7 @@ Click the following tabs to view properties files that work with the String converter:

 .. important:: Received Strings Must be Valid JSON

-   Your {+connector+} sink connector must receive valid JSON strings from your
+   {+kafka+} topic even when using a String converter.
+   Your sink connector must receive valid JSON strings from your
    {+kafka+} topic even when using a String converter.

 To use the preceding properties file, replace the placeholder text in angle
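
The properties templates this file refers to generally pair matching key and value converter classes. A minimal sketch using the standard Kafka Connect JSON converter (the exact templates in the guide may differ):

```properties
# Minimal converter settings sketch: the same converter class must be used
# by the source connector that writes a topic and the sink connector that reads it.
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Disable embedded schemas when the topic carries plain JSON documents.
key.converter.schemas.enable=false
value.converter.schemas.enable=false
```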

source/introduction/data-formats.txt

Lines changed: 6 additions & 6 deletions
@@ -37,7 +37,7 @@ represent the :ref:`sample document <kafka-df-sample-doc>` in JSON like this:

    {"company":"MongoDB"}

-You may encounter the following data formats related to JSON when working with the {+connector+}:
+You may encounter the following data formats related to JSON when working with the connector:

 - :ref:`Raw JSON <kafka-df-raw-json>`
 - :ref:`BSON <kafka-df-bson>`
@@ -130,7 +130,7 @@ Avro
 ----

 Apache Avro is an open-source framework for serializing and transporting
-data described by schemas. Avro defines two data formats relevant to the {+connector+}:
+data described by schemas. Avro defines two data formats relevant to the connector:

 - :ref:`Avro schema <kafka-df-avro-schema>`
 - :ref:`Avro binary encoding <kafka-df-avro-encoding>`
@@ -152,7 +152,7 @@ specification of the following groups of data types:

 .. warning:: Unsupported Avro Types

-   {+connector+} does not support the following Avro types:
+   The connector does not support the following Avro types:

    - ``enum`` types. Use ``string`` instead.
    - ``fixed`` types. Use ``bytes`` instead.
@@ -162,8 +162,8 @@ specification of the following groups of data types:

 .. important:: Sink Connectors and Logical Types

-   {+connector+} sink connectors support all Avro schema primitive and complex types,
-   however {+connector+} sink connectors support only the following logical types:
+   The {+sink-connector+} supports all Avro schema primitive and complex types,
+   however sink connectors support only the following logical types:

    - ``decimal``
    - ``date``
@@ -191,7 +191,7 @@ like this:
    }

 You use Avro schema when you
-:ref:`define a schema for a {+connector+} source connector <source-specify-avro-schema>`.
+:ref:`define a schema for a {+source-connector+} <source-specify-avro-schema>`.

 For a list of all Avro schema types, see the
 `Apache Avro specification <https://avro.apache.org/docs/current/spec.html>`__.
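
As an aside, an Avro schema describing the sample document ``{"company":"MongoDB"}`` from this file could look like the following sketch (the record name is arbitrary, not taken from this commit):

```json
{
  "type": "record",
  "name": "Company",
  "fields": [
    { "name": "company", "type": "string" }
  ]
}
```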

source/introduction/install.txt

Lines changed: 2 additions & 2 deletions
@@ -15,7 +15,7 @@ Install the MongoDB Kafka Connector
 Overview
 --------

-Learn how to install the {+connector+}. The {+connector+} is available for Confluent Platform and
+Learn how to install the {+connector+}. The connector is available for Confluent Platform and
 {+kafka+} deployments. To see installation instructions for your deployment type,
 navigate to one of the following sections:

@@ -84,7 +84,7 @@ Install the Connector on Apache Kafka
 Download a Connector JAR File
 -----------------------------

-You can download the {+connector+} source and JAR files from the following locations:
+You can download the connector source and JAR files from the following locations:

 .. _kafka-connector-installation-reference:


source/issues-and-help.txt

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ Bugs / Feature Requests
 -----------------------

 If you think you've found a bug or want to see a new feature in the
-Kafka Connector, please open a case in our issue management tool, JIRA:
+{+connector+}, please open a case in our issue management tool, JIRA:

 * `Create an account and login <https://jira.mongodb.org>`__.
 * Navigate to `the KAFKA project <https://jira.mongodb.org/browse/KAFKA>`__.

source/migrate-from-kafka-connect-mongodb.txt

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ to the :github:`official MongoDB Kafka connector <mongodb/mongo-kafka>`.

 The following sections list the changes you must make to your Kafka
 Connect sink connector configuration settings and custom classes to transition
-to the MongoDB Kafka connector.
+to the {+sink-connector+}.

 Update Configuration Settings
 -----------------------------

source/monitoring.txt

Lines changed: 2 additions & 2 deletions
@@ -15,8 +15,8 @@ Monitoring
 Overview
 --------

-Learn how to observe the behavior of your MongoDB source or sink
-connector through **monitoring**.
+Learn how to observe the behavior of your {+source-connector+} or
+{+sink-connector+} through **monitoring**.
 Monitoring is the process of getting information about the
 activities a running program performs for use in an application
 or an application performance management library.

source/security-and-authentication/tls-and-x509.txt

Lines changed: 2 additions & 2 deletions
@@ -51,7 +51,7 @@ Store Certificates on the Worker
 --------------------------------

 Store your certificates in a **keystore** and **truststore** to secure
-your certificate credentials for each server you run your {+connector+} worker
+your certificate credentials for each server you run your connector worker
 instance on.

 Keystore
@@ -105,7 +105,7 @@ testing purposes, see
 Add Credentials to the Connector
 --------------------------------

-The {+connector+} worker processes JVM options from your ``KAFKA_OPTS``
+The connector worker processes JVM options from your ``KAFKA_OPTS``
 environment variable. The environment variable contains the path and
 password to your keystore and truststore.

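
For reference, the ``KAFKA_OPTS`` variable mentioned here typically carries the standard JVM SSL system properties. A hedged sketch; all paths and passwords are placeholders, not values from this commit:

```sh
# Sketch only: paths and passwords are placeholders, not values from this commit.
export KAFKA_OPTS="-Djavax.net.ssl.trustStore=<path-to-truststore>/truststore.jks \
  -Djavax.net.ssl.trustStorePassword=<truststore-password> \
  -Djavax.net.ssl.keyStore=<path-to-keystore>/keystore.jks \
  -Djavax.net.ssl.keyStorePassword=<keystore-password>"
```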
