0 votes
0 replies
27 views

I am reading data from a Kafka topic using Kafka Streams, more specifically a GlobalKTable, to populate my store. In the case of corrupt data that cannot be parsed successfully, I wish to throw a ...
Andreas10001
0 votes
0 answers
73 views

I’m running a Kafka Streams application (tested with versions 3.9.1 and 4.1.0) that uses a GlobalKTable backed by RocksDB. There are multiple instances of the application, each with 4 stream threads. ...
Sona Torosyan
0 votes
1 answer
32 views

Here is a simple topology: stream1.merge(stream2).merge(stream3).to("topic1"); stream2 = builder.stream("topic1"); stream2.process(() -> new Processor<>() { process() { Read ...
user3092576
0 votes
1 answer
48 views

I need to consume two topics, A with 40 partitions and B with 10 partitions, and keep some info in a shared persistent state store with the social security number (SSN, of type string) as its key and a custom ...
losingsleeep
  • 1,899
-2 votes
1 answer
35 views

I have a kafka topic producing products for multiple companies: {"company": "c1", "productIds": ["p1", "p2", "p3"]} {"company": "...
user2890683
0 votes
1 answer
52 views

I have read the explanation written here that one StreamTask is handling all messages from co-partitioned topics: How do co-partitioning ensure that partition from 2 different topics end up assigned ...
Fatemah Soliman
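The co-partitioning guarantee referenced in the entry above rests on the default partitioner being a pure function of the key bytes and the partition count. A minimal plain-Java sketch of that idea, using a simplified hash in place of the murmur2 hash Kafka actually applies to the serialized key bytes:

```java
import java.nio.charset.StandardCharsets;

// Simplified stand-in for Kafka's default partitioner. The real one applies
// murmur2 to the serialized key bytes; the principle is the same:
// partition = hash(keyBytes) mod numPartitions.
public class CoPartitioningSketch {
    static int partitionFor(String key, int numPartitions) {
        byte[] bytes = key.getBytes(StandardCharsets.UTF_8);
        int hash = 0;
        for (byte b : bytes) hash = 31 * hash + b; // illustrative hash, not murmur2
        return (hash & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // Two topics with the SAME partition count: a given key routes to the
        // same partition number in both, so one StreamTask can own that pair
        // of partitions and see every record for the key.
        int topicA = partitionFor("user-42", 8);
        int topicB = partitionFor("user-42", 8);
        System.out.println(topicA == topicB); // true

        // With DIFFERENT partition counts the guarantee breaks, which is why
        // Kafka Streams enforces equal counts for joins.
        int topicC = partitionFor("user-42", 6);
        System.out.println(topicA + " vs " + topicC);
    }
}
```

The partition number here is deterministic per key, so co-partitioning follows from equal partition counts plus identical keying on both topics.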
0 votes
1 answer
46 views

I'm working on a Change Data Capture (CDC) application using Kafka Streams, and I have the following setup: A KStream is converted to a KTable (let's call it kt1). I perform a left join between kt1 ...
Hari Krishnan U
0 votes
0 answers
23 views

I have written a sample Kafka Producer application, without specifying a partition, which sends messages to a Consumer application. My Consumer application is written using Kafka Streams as it is using state ...
Bandita Pradhan
4 votes
0 answers
168 views

I have a project using Kafka Streams to create one-minute price candles for stocks. My topology code is: List<String> inputTopics = new ArrayList<>(); inputTopics.add(tradeTopic); ...
mohammadjavadkh
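The one-minute candle aggregation in the entry above can be sketched in plain Java. `Trade` and `Candle` are hypothetical names, not the asker's code; the bucketing mirrors what a windowed aggregate over `TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1))` would do per key:

```java
import java.util.*;

// Plain-Java sketch of a trade -> one-minute OHLC candle aggregation,
// approximating what a Kafka Streams windowed aggregate computes per key.
public class CandleSketch {
    record Trade(long timestampMs, double price) {}

    static class Candle {
        double open, high, low, close;
        boolean empty = true;
        void add(double p) {
            if (empty) { open = p; high = p; low = p; empty = false; }
            high = Math.max(high, p);
            low = Math.min(low, p);
            close = p; // last price seen in the window
        }
    }

    // Group trades into 60-second buckets keyed by window start time.
    static Map<Long, Candle> candles(List<Trade> trades) {
        Map<Long, Candle> out = new TreeMap<>();
        for (Trade t : trades) {
            long windowStart = (t.timestampMs() / 60_000) * 60_000;
            out.computeIfAbsent(windowStart, k -> new Candle()).add(t.price());
        }
        return out;
    }

    public static void main(String[] args) {
        Map<Long, Candle> m = candles(List.of(
            new Trade(1_000, 10.0), new Trade(30_000, 12.0),
            new Trade(59_000, 11.0), new Trade(61_000, 9.0)));
        Candle first = m.get(0L);
        System.out.println(first.open + " " + first.high + " "
            + first.low + " " + first.close); // 10.0 12.0 10.0 11.0
    }
}
```

In the real topology the tricky parts are elsewhere (grace period, out-of-order trades, and when windows are emitted), but the per-window math is just this fold.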
0 votes
1 answer
35 views

If I have a Kafka input topic with multiple partitions and then in Kafka Streams I use kStream.map to change the key of each record and write that to an output topic, I will face the problem that ...
selbstereg
0 votes
1 answer
29 views

I have a stream of <K,V> messages. When emitting any satisfied sliding window, I want to know the list of records that the window matched. I want to avoid taking on the accumulation job myself within my ...
redgiant
  • 606
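One common answer to the entry above is to make the window aggregate itself be the list, e.g. `aggregate(ArrayList::new, (k, v, list) -> { list.add(v); return list; }, ...)` on the sliding window, so the emitted window value already contains the matched records. A plain-Java sketch of that accumulation (names are illustrative, not from the question):

```java
import java.util.*;

// Plain-Java sketch of letting the window aggregate BE the list of matched
// records, so no separate accumulation step is needed downstream.
public class WindowListSketch {
    record Event(long timestampMs, String value) {}

    // Collect the values whose timestamps fall inside the window
    // [nowMs - windowMs, nowMs], mimicking a sliding window that ends
    // at the latest event.
    static List<String> windowContents(List<Event> events, long nowMs, long windowMs) {
        List<String> matched = new ArrayList<>();
        for (Event e : events) {
            if (e.timestampMs() >= nowMs - windowMs && e.timestampMs() <= nowMs) {
                matched.add(e.value());
            }
        }
        return matched;
    }

    public static void main(String[] args) {
        List<Event> events = List.of(
            new Event(0, "a"), new Event(500, "b"), new Event(1_200, "c"));
        // A 1000ms window ending at t=1200 matches b and c, not a.
        System.out.println(windowContents(events, 1_200, 1_000)); // [b, c]
    }
}
```

In Kafka Streams the window bookkeeping (which windows a record opens or joins) is handled by the runtime; the aggregator only appends to the per-window list.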
1 vote
1 answer
121 views

I am designing a Kafka Streams app and want to know a few details in order to design my failover strategy. I tried reading the Kafka Streams docs and existing Stack Overflow posts, but was unable to find the ...
Aakash Gupta
2 votes
1 answer
132 views

We are using Kafka Streams and Karpenter with a normal Deployment to manage the pods for a service that we have. After Karpenter decides to kill a pod, it brings a new pod up, and we are ...
alext
  • 842
0 votes
1 answer
32 views

I use the following: windowedBy(TimeWindows.of(Duration.ofHours(6))).aggregate(aggregator, aggregator, Materialized.as("my-agg")) The changelog was created with the configs below: cleanup....
Abe
  • 716
0 votes
1 answer
54 views

What are the semantics of a Kafka Streams (3.7.1) KTable-KTable foreign key join, where the extracted foreign key has never matched against the primary key in the right-side KTable? In this example ...
KarlP
  • 5,221
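For the foreign-key join question above, the essential semantics can be sketched in plain Java: for an inner FK join, a left-side record whose extracted foreign key has never matched any right-side primary key simply produces no output (a left join would instead invoke the joiner with a null right value). The names below are hypothetical:

```java
import java.util.*;

// Plain-Java sketch of KTable-KTable foreign-key INNER join semantics:
// if the extracted foreign key has no matching primary key on the right
// side, no join result is produced for that left-side record.
public class FkJoinSketch {
    static Map<String, String> innerFkJoin(Map<String, String> left,    // leftKey -> fk
                                           Map<String, String> right) { // pk -> rightVal
        Map<String, String> out = new HashMap<>();
        for (var e : left.entrySet()) {
            String rightVal = right.get(e.getValue()); // look up the extracted FK
            if (rightVal != null) {                    // never matched -> no output
                out.put(e.getKey(), rightVal);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> result = innerFkJoin(
            Map.of("order1", "cust1", "order2", "cust-missing"),
            Map.of("cust1", "Alice"));
        System.out.println(result); // {order1=Alice}: order2 is simply absent
    }
}
```

The subtlety in the real feature is in updates: if the right side later gains the missing primary key, Kafka Streams emits the join result at that point, which a one-shot map lookup like this cannot show.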
0 votes
1 answer
29 views

I want to use Helidon SE 4.1.6 and produce data to a specific partition of Apache Kafka using a producer. Details: I have gone through the https://helidon.io/docs/latest/se/reactive-messaging#...
MOHAMMAD SHADAB
1 vote
1 answer
59 views

We are using Kafka Streams in our application to process events. For the join operation we use both the repartition and selectKey methods of Kafka Streams, and this causes the creation of an internal ...
ceb
  • 39
1 vote
0 answers
85 views

Context and analysis: Our Spring Boot application relies on information stored in Kafka to answer REST requests. It does so by retrieving information from a global state store via the ...
Cédric Schaller
2 votes
1 answer
44 views

We have a Java application which uses Streams library to process data. The streams application does not join data from multiple topics and processes each message received independently. The ...
Andrey
  • 478
0 votes
0 answers
22 views

I'm reading a Confluent blog about Windowing in Kafka Streams: https://www.confluent.io/blog/windowing-in-kafka-streams/ and I found this under Session Windows: If your inactivity window is too ...
gregof
  • 23
1 vote
1 answer
76 views

I have a Kafka Streams app that takes input from a Kafka topic and aggregates it on three fields of the original value in 5-minute windows. On the output side of this, I need to translate the ...
John Ament
  • 11.8k
0 votes
0 answers
65 views

org.apache.kafka:kafka-streams:jar:3.9.0 is supposed to use org.apache.kafka:kafka-clients:jar:3.9.0, but when I run mvn dependency:tree, I get [INFO] +- org.apache.kafka:kafka-streams:jar:3.9.0:...
M. Bouzaien
1 vote
1 answer
192 views

What's the easiest or best way to create multiple topologies in a Spring Boot Kafka Streams application? Is it possible to use the same default StreamsBuilder bean? Or if I need to create a new ...
losingsleeep
  • 1,899
0 votes
0 answers
71 views

I have a Kafka Streams application running on Spring Boot 2.x, kafka-streams 2.5.1 and spring-cloud-sleuth (log tracing), and I'm using KafkaStreamsTracing to print traceId and spanId. Which is ...
sundararajan s
0 votes
1 answer
31 views

I'm trying to debug a problem in our production Kafka Streams app. The (simplified) topology looks something like this: builder.stream("input").groupByKey().reduce((agg, val) -> "...
Egor
  • 1,660
