2 votes
0 answers
42 views

I have a collection which I need to convert to a time series collection using the MongoDB Kafka connector. The source collection has documents of this shape: { "_id": "6798680c885bb2a8fbe93485", ...
Zeeshan • 639
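For the time-series question above, a minimal sketch of a sink connector config. The connection URI, database, topic, and time-field names are all assumptions; the `timeseries.*` properties exist in the sink connector since 1.6:

```json
{
  "name": "ts-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "connection.uri": "mongodb://localhost:27017",
    "topics": "source.events",
    "database": "mydb",
    "collection": "events_ts",
    "timeseries.timefield": "ts",
    "timeseries.timefield.auto.convert": "true"
  }
}
```

Note that the connector can only create the time-series collection if it does not already exist; an existing regular collection cannot be converted in place.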
0 votes
0 answers
38 views

My collection looks like this (screenshot), and the source configuration looks like this: curl -X POST "http://localhost:8083/connectors" \ -H "Content-Type: application/json" \ ...
Nazar Mazuryk
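For reference, a complete version of such a curl call against the Connect REST API; the connector name, URI, database, and collection are placeholders:

```shell
curl -X POST "http://localhost:8083/connectors" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "mongo-source",
    "config": {
      "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
      "connection.uri": "mongodb://localhost:27017",
      "database": "mydb",
      "collection": "orders"
    }
  }'
```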
1 vote
0 answers
173 views

I've used a timestamp converter together with the SMT "io.debezium.connector.mongodb.transforms.ExtractNewDocumentState", but in the topic messages the Unix timestamp fields are not converted to the format I want; here ...
Narut Promsuparoj
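A hedged sketch for the Debezium timestamp question: TimestampConverter must run after ExtractNewDocumentState (SMTs apply in the listed order), and it expects the field to hold epoch milliseconds; the field name and format here are assumptions:

```json
"transforms": "unwrap,ts",
"transforms.unwrap.type": "io.debezium.connector.mongodb.transforms.ExtractNewDocumentState",
"transforms.ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.ts.field": "created_at",
"transforms.ts.target.type": "string",
"transforms.ts.format": "yyyy-MM-dd HH:mm:ss"
```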
0 votes
1 answer
224 views

I have set up an MSK cluster using public subnets and have turned on public access for it. The cluster is up and running. I am trying to set up the MongoDB Kafka source connector using MSK ...
Subhadip Sahoo
0 votes
1 answer
192 views

I am trying to find a way to prevent truncation of the resume token when an error is thrown by the MongoDB server for the 16 MB size limit. I am reading the resume token to make sure documents over 16 MB are read properly ...
Pranav Chavan
1 vote
1 answer
179 views

Is there a way to dynamically choose the WriteStrategy of a MongoDB sink connector? Use case at hand: read from a topic and write to MongoDB Atlas using the MongoDB sink connector in a way that the ...
Nilay Sundarkar
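On write strategies: as far as the documented options go, `writemodel.strategy` is fixed per connector rather than per record. A sketch of the static configuration (the business-key field name is an assumption):

```properties
writemodel.strategy=com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneBusinessKeyStrategy
document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy
document.id.strategy.partial.value.projection.type=AllowList
document.id.strategy.partial.value.projection.list=orderId
```

Per-record choice would need a custom strategy class that branches on the record contents.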
1 vote
1 answer
672 views

I am using Apache Kafka with the Confluent Connect v7.3.2 platform and a MongoDB connector deployed as a sink connector in order to stream messages with two timestamp fields into a collection. These values ...
Guy_g23 • 395
0 votes
0 answers
367 views

I am trying to integrate DocumentDB (5.0.0) with the Kafka Redshift connector. When doing so I keep getting this issue where it complains that '$changeStream.fullDocumentBeforeChange' is an unknown ...
ebadfd • 419
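One assumption worth checking for that DocumentDB error: source connector 1.9+ can add a `fullDocumentBeforeChange` option to the change stream that DocumentDB does not understand, controlled by the property below. The value shown is a guess at a workaround, not verified against DocumentDB; leaving the property unset or downgrading the connector below 1.9 are the other routes:

```properties
change.stream.full.document.before.change=off
```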
0 votes
1 answer
497 views

I'm trying to connect MongoDB with AWS Managed Kafka using the source connector (downloaded from Confluent Hub), but while creating the AWS connector I'm getting the error message: Code: InvalidInput....
A Rafay
0 votes
0 answers
372 views

I want to build CQRS with a data mesh, and I have already succeeded in creating a MongoSourceConnector and an ElasticsearchSinkConnector, but I'm confused why my MongoSourceConnector configuration didn't publish null ...
Difa Al
0 votes
0 answers
80 views

The legacy MongoDB has _id in the format shown below (screenshot), while the new MongoDB has _id in this format: _id: "{"$oid": "57f2773113b28328a0dabfe4"}". How to make data ...
sai jyothsna pentyala
0 votes
0 answers
653 views

I have a MongoDB Kafka connector that doesn't consume any events anymore. My configuration is the following: { "name": "event-mongodb-sink", "config": { "...
Omegaspard • 2,000
0 votes
1 answer
230 views

The Mongo sink connector failed to start with the error below: "With the configured document ID strategy, all records are required to have keys, which must be either maps or structs." Record Key String Format ...
sai jyothsna pentyala
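On that key-strategy error: the configured `document.id.strategy` (likely ProvidedInKeyStrategy) requires map or struct keys, while the records carry plain string keys. One sketch of a workaround, assuming the key is not needed as the _id, is to let the connector generate one (both class names are real connector classes):

```properties
# generate an ObjectId instead of reading the record key
document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy.BsonOidStrategy
key.converter=org.apache.kafka.connect.storage.StringConverter
```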
0 votes
1 answer
104 views

I have a Mongo sink connector for my Kafka cluster which sinks topic data into a Mongo database. I am looking for properties to exclude one topic in my sink connector, if there are any. Explanation: local....
Abhishek
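There is no documented exclude-list property in the sink connector itself, but Kafka Connect's `topics.regex` (a full-match Java regex) can express the exclusion with a negative lookahead; the prefix and topic name here are assumptions:

```json
"topics.regex": "local\\.(?!audit$).*"
```

This subscribes to every topic under the `local.` prefix except `local.audit`.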
0 votes
1 answer
118 views

Is there a way in the MongoDB Kafka sink connector to create a MongoDB collection per Kafka topic by defining a pattern, a prefix, or some other way?
Seyed Abbas • 1,584
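One relevant behavior, hedged: if the sink's `collection` property is left unset, each record is written to a collection named after its topic, so a per-topic collection layout falls out of the default. A sketch (topic pattern and database are assumptions):

```properties
topics.regex=app\\..*
database=mydb
# no "collection" set: each topic maps to a collection of the same name
```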
1 vote
1 answer
473 views

I am trying to implement a custom write strategy for a sink connector that writes to MongoDB, as per the documentation here: https://www.mongodb.com/docs/kafka-connector/current/sink-connector/...
Esben Folger Thomas
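The wiring side of a custom strategy, as a sketch: the class (name hypothetical) must implement com.mongodb.kafka.connect.sink.writemodel.strategy.WriteModelStrategy, and its jar must sit on the worker's plugin.path alongside the connector so the same classloader can see it:

```properties
writemodel.strategy=com.example.kafka.UpsertByOrderIdStrategy
```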
0 votes
2 answers
698 views

As the title states, I'm using the Debezium Postgres source connector and I would like the MongoDB sink connector to group Kafka topics into different collections and databases (also different DBs to isolate ...
d3vr10 • 1
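Since the sink's `database` is fixed per connector, isolating databases generally means one sink connector per database; the collection side can be controlled by rewriting topic names with the stock RegexRouter SMT. A sketch (server and schema names are assumptions):

```json
"transforms": "route",
"transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
"transforms.route.regex": "pgserver\\.public\\.(.*)",
"transforms.route.replacement": "$1"
```

With no `collection` set, the rewritten topic name becomes the collection name.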
0 votes
1 answer
612 views

I'm running a MongoDB Kafka source connector (official MongoDB connector version 1.7.0) and defining both the pipeline and copy.existing.pipeline properties in order to filter some fields (see below) ...
Gal Shaboodi
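A detail that often bites with these two properties, sketched with an assumed field name: the change-stream `pipeline` sees documents wrapped in change events (fields live under `fullDocument`), while `copy.existing.pipeline` runs against the raw documents, so the two projections must differ:

```json
"pipeline": "[{\"$project\": {\"fullDocument.bigField\": 0}}]",
"copy.existing": "true",
"copy.existing.pipeline": "[{\"$project\": {\"bigField\": 0}}]"
```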
0 votes
2 answers
1k views

I am new to Kafka connectors and have been exploring them for about a week. I have created and updated MongoDB via MongoDB connector curl commands. I am struggling a bit to understand the concept and ...
Viji Lakshmi
0 votes
1 answer
137 views

How can I connect Kafka events to a MongoDB sink? The resources I found on the net use Confluent, which creates a cluster for you; I didn't find how to connect my already existing cluster.
scanpower
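For an existing (non-Confluent-managed) cluster, a minimal sketch: point a Connect worker at your own brokers, then load the sink connector. All hosts, paths, and names below are placeholders:

```properties
# connect-standalone.properties: point the worker at your existing brokers
bootstrap.servers=broker1:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
plugin.path=/opt/connect-plugins

# mongo-sink.properties
name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=events
connection.uri=mongodb://localhost:27017
database=mydb
collection=events
```

Run with `connect-standalone connect-standalone.properties mongo-sink.properties`.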
0 votes
1 answer
793 views

I am new to Kafka Connect. I am trying to sync the change stream from one Mongo collection to another using Kafka connectors, for both insert and update operations. Source config: { "name": "...
Ambuj Mehra
1 vote
0 answers
402 views

I'm trying to ingest JSON data from a Kafka topic into MongoDB using the MongoDB Kafka connector and have the following properties configured: topics=example connector.class=com.mongodb.kafka.connect....
junelane • 161
0 votes
1 answer
2k views

I'm trying to use AWS DocumentDB as a sink for storing data received from Kafka and was wondering if the MongoDB Kafka connector works with DocumentDB, as its documentation mentions that it is ...
junelane • 161
0 votes
2 answers
994 views

We have several collections in Mongo based on n tenants and want the Kafka connector to only watch specific collections. Below is my mongosource.properties file, where I have added the pipeline ...
Harinder Singh
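A sketch of the pipeline approach for a database-level source (database and collection names are placeholders): change events carry their namespace under `ns`, so a `$match` on `ns.coll` restricts which collections are watched:

```properties
database=tenants
pipeline=[{"$match": {"ns.coll": {"$in": ["tenant_a_orders", "tenant_b_orders"]}}}]
```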
0 votes
1 answer
958 views

The application writes data every month to a new collection (for example, journal_2205, journal_2206). Is it possible to configure the connector so that it reads the oplog from the new collection and ...
roman_ • 141