
I am creating a streaming analytics application using Spark, Flink & Kafka. Each analytics function will be implemented as a microservice so that it can be reused in different projects later.

I can run my Spark/Flink jobs perfectly from a simple Scala application and submit them to the Spark and Flink clusters respectively. But I need to start/run a job when a REST POST startJob() request is invoked on my web service.

How can I integrate my Spark & Flink data processing functionality into a service-oriented web application?

So far I have tried the Lagom microservice framework, but I ran into many issues; you can check:

  1. Best approach to ingest Streaming Data in Lagom Microservice
  2. java.io.NotSerializableException using Apache Flink with Lagom

I think I am not taking the right approach to a stream-processing microservice application. I am looking for the right direction to expose these analytics over a REST service.

2 Answers


Flink has a REST API you can use to submit and control jobs -- it's used by the Flink Web UI. See the docs here. See also this previous question.
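As a minimal sketch of that approach: a web service can trigger a Flink job by POSTing to the JobManager's REST endpoint `/jars/<jarid>/run` (the jar must first be uploaded via `/jars/upload`, which returns the jar id). The host/port and jar id below are assumptions for illustration; in practice they come from your cluster configuration and the upload response.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object FlinkJobSubmitter {
  // Builds the Flink REST URL for running an already-uploaded jar.
  def runJobUrl(jobManager: String, jarId: String): String =
    s"$jobManager/jars/$jarId/run"

  // Submits the job; returns the JobManager's JSON response,
  // which contains the new job's id. Requires a running cluster.
  def startJob(jobManager: String, jarId: String): String = {
    val request = HttpRequest.newBuilder()
      .uri(URI.create(runJobUrl(jobManager, jarId)))
      .POST(HttpRequest.BodyPublishers.noBody())
      .build()
    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    response.body()
  }
}

// Usage from a REST handler (hypothetical host and jar id):
// FlinkJobSubmitter.startJob("http://localhost:8081", "analytics-job.jar")
```

Your own service's startJob() endpoint can then be a thin wrapper around this call, so the web layer stays decoupled from the Flink cluster itself.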



I think the REST API only provides job-running details. Does Flink provide any API so that, for example, a Spring Boot REST endpoint can connect to Kafka streaming data and return that Kafka data in its response?
