Merged
3 changes: 3 additions & 0 deletions source/api-docs.txt
Original file line number Diff line number Diff line change
@@ -2,6 +2,9 @@
API Documentation
=================

.. meta::
:description: Explore API documentation for the Spark Connector compatible with Scala 2.13 and 2.12.

- `Spark Connector for Scala 2.13 <https://www.javadoc.io/doc/org.mongodb.spark/{+artifact-id-2-13+}/{+current-version+}/index.html>`__
- `Spark Connector for Scala 2.12 <https://www.javadoc.io/doc/org.mongodb.spark/{+artifact-id-2-12+}/{+current-version+}/index.html>`__

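Every file in this diff gets the same treatment: a ``.. meta::`` directive with a ``:description:`` field, which the docs toolchain (Sphinx/docutils) renders into an HTML ``<meta name="description">`` tag for search engines. A minimal stdlib-only sketch of that mapping, assuming a hypothetical ``rst_meta_to_html`` helper rather than the real docutils renderer:

```python
import re
from html import escape

def rst_meta_to_html(rst: str) -> str:
    """Convert ':name: value' fields under a '.. meta::' directive into
    HTML <meta> tags (illustrative sketch, not the actual docutils logic)."""
    tags = []
    in_meta = False
    for line in rst.splitlines():
        if line.strip() == ".. meta::":
            in_meta = True
            continue
        m = re.match(r"\s+:(\w+):\s+(.*)", line)
        if in_meta and m:
            # Each field becomes one <meta> tag; the value is HTML-escaped.
            tags.append(f'<meta name="{m.group(1)}" content="{escape(m.group(2))}">')
        elif line.strip():
            in_meta = False  # a non-field, non-blank line ends the directive
    return "\n".join(tags)

rst = """\
.. meta::
   :description: Explore API documentation for the Spark Connector.
"""
print(rst_meta_to_html(rst))
# → <meta name="description" content="Explore API documentation for the Spark Connector.">
```

The real build also merges these fields with any ``:keywords:`` lines already present, which is why some files in this diff add only the one ``:description:`` line under an existing ``.. meta::`` block.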
3 changes: 3 additions & 0 deletions source/batch-mode.txt
@@ -2,6 +2,9 @@
Batch Mode
==========

.. meta::
:description: Explore how to use the Spark Connector to read and write data to MongoDB in batch mode using Spark's Dataset and DataFrame APIs.

.. contents:: On this page
:local:
:backlinks: none
1 change: 1 addition & 0 deletions source/batch-mode/batch-read-config.txt
@@ -16,6 +16,7 @@ Batch Read Configuration Options

.. meta::
:keywords: partitioner, customize, settings
:description: Configure batch read options for the Spark Connector, including connection URI, database, collection, and partitioner settings for efficient data processing.

.. _spark-batch-input-conf:

3 changes: 3 additions & 0 deletions source/batch-mode/batch-read.txt
@@ -4,6 +4,9 @@
Read from MongoDB in Batch Mode
===============================

.. meta::
:description: Learn how to read data from MongoDB in batch mode using Spark, including configuration settings, schema inference, and applying filters for efficient data retrieval.

.. toctree::
:caption: Batch Read Configuration Options

3 changes: 3 additions & 0 deletions source/batch-mode/batch-write-config.txt
@@ -4,6 +4,9 @@
Batch Write Configuration Options
=================================

.. meta::
:description: Configure batch write operations to MongoDB using various properties like connection URI, database, collection, and write concern options.

.. contents:: On this page
:local:
:backlinks: none
3 changes: 3 additions & 0 deletions source/batch-mode/batch-write.txt
@@ -4,6 +4,9 @@
Write to MongoDB in Batch Mode
==============================

.. meta::
:description: Learn how to write data to MongoDB in batch mode using the Spark Connector, specifying format and configuration settings for Java, Python, and Scala.

.. toctree::
:caption: Batch Write Configuration Options

3 changes: 3 additions & 0 deletions source/configuration.txt
@@ -4,6 +4,9 @@
Configuring Spark
=================

.. meta::
:description: Configure read and write operations in Spark using `SparkConf`, options maps, or system properties for batch and streaming modes.

Check failure on line 8 in source/configuration.txt (GitHub Actions / vale):
[MongoDB.CommaOxford] Use the Oxford comma in ' Configure read and write operations in Spark using `SparkConf`, options maps, or system properties for batch and '.

.. contents:: On this page
:local:
:backlinks: none
3 changes: 3 additions & 0 deletions source/faq.txt
@@ -2,6 +2,9 @@
FAQ
===

.. meta::
:description: Find solutions for achieving data locality, resolving pipeline stage errors, using mTLS for authentication, and sharing a MongoClient instance across threads with the Spark Connector.

How can I achieve data locality?
--------------------------------

1 change: 1 addition & 0 deletions source/getting-started.txt
@@ -14,6 +14,7 @@ Getting Started with the {+connector-short+}

.. meta::
:keywords: quick start, tutorial, code example
:description: Get started with the Spark Connector by setting up dependencies, configuring connections, and integrating with platforms like Amazon EMR, Databricks, Docker, and Kubernetes.

Prerequisites
-------------
3 changes: 3 additions & 0 deletions source/index.txt
@@ -2,6 +2,9 @@
MongoDB Connector for Spark
===========================

.. meta::
:description: Integrate MongoDB with Apache Spark using the MongoDB Connector for Spark, supporting Spark Structured Streaming.

.. toctree::
:titlesonly:

1 change: 1 addition & 0 deletions source/release-notes.txt
@@ -10,6 +10,7 @@ Release Notes

.. meta::
:keywords: update, new feature, deprecation, breaking change
:description: Explore the latest features and changes in the MongoDB Connector for Spark, including updates for Java Sync Driver and support for multiple Spark versions.

.. contents:: On this page
:local:
3 changes: 3 additions & 0 deletions source/streaming-mode.txt
@@ -4,6 +4,9 @@
Streaming Mode
==============

.. meta::
:description: Explore how to use the Spark Connector for reading and writing data in streaming mode with Spark Structured Streaming.

.. contents:: On this page
:local:
:backlinks: none
1 change: 1 addition & 0 deletions source/streaming-mode/streaming-read-config.txt
@@ -16,6 +16,7 @@ Streaming Read Configuration Options

.. meta::
:keywords: change stream, customize
:description: Configure streaming read options for MongoDB in Spark, including connection settings, parsing strategies, and change stream configurations.

.. _spark-streaming-input-conf:

1 change: 1 addition & 0 deletions source/streaming-mode/streaming-read.txt
@@ -21,6 +21,7 @@ Read from MongoDB in Streaming Mode

.. meta::
:keywords: change stream
:description: Learn to read data from MongoDB in streaming mode using the Spark Connector, supporting micro-batch and continuous processing with configuration options.

Overview
--------
3 changes: 3 additions & 0 deletions source/streaming-mode/streaming-write-config.txt
@@ -4,6 +4,9 @@
Streaming Write Configuration Options
=====================================

.. meta::
:description: Configure properties for writing data to MongoDB in streaming mode, including connection URI, database, collection, and checkpoint settings.

.. contents:: On this page
:local:
:backlinks: none
3 changes: 3 additions & 0 deletions source/streaming-mode/streaming-write.txt
@@ -4,6 +4,9 @@
Write to MongoDB in Streaming Mode
==================================

.. meta::
:description: Learn to write data to MongoDB in streaming mode using Spark Connector with configuration settings for Java, Python, and Scala.

.. toctree::
:caption: Streaming Write Configuration Options

1 change: 1 addition & 0 deletions source/tls.txt
@@ -16,6 +16,7 @@ Configure TLS/SSL

.. meta::
:keywords: code example, authenticate
:description: Learn how to configure TLS/SSL for secure communication between the Spark Connector and MongoDB, including setting up JVM trust and key stores.

Overview
--------