Spring Boot Multi-Module Setup (4) - Stream Application
- The stream module hasn't had a need to use JPA or the `module-core` entities yet.
- While setting up the stream module, I'm going to document how Spring Boot connects to Kafka and how the Consumer and Producer are configured.
- For Kafka, you can use local Kafka, server-deployed Kafka, Docker, Confluent, or any solution that supports Kafka.
- This post is written assuming Kafka is already prepared.
Creating module-stream
- We'll continue working from the `module-stream` we created last time.
- Since Kafka will only be used in the `module-stream` application, we only need to add dependencies to `module-stream`.
- Let's create a Consumer and Producer within the Spring Boot application.
Configuring module-stream
build.gradle.kts Configuration
- Since Kafka will only be used in the `module-stream` application, add the dependency to `module-stream`'s `build.gradle.kts`:

```kotlin
plugins {
}

dependencies {
    implementation("org.springframework.kafka:spring-kafka")
}
```
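The empty `plugins { }` block suggests the Kotlin and Spring Boot plugins come from the root build script set up earlier in this series. For reference, a fuller sketch of the file might look like the following; the `spring-boot-starter-web` line is my assumption (the `@RestController` added later in this post needs Spring MVC), and `spring-kafka` needs no version because Spring Boot's dependency management supplies one.

```kotlin
// module-stream/build.gradle.kts - a sketch, assuming the Kotlin and Spring Boot
// plugins are applied from the root build script as in the earlier posts.
dependencies {
    // Assumed: Spring MVC for the ProducerController written later in this post.
    implementation("org.springframework.boot:spring-boot-starter-web")
    // Version managed by Spring Boot's dependency management.
    implementation("org.springframework.kafka:spring-kafka")
}
```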
Writing the Spring Boot Application Class
- Since this runs as a Spring Boot application, create a class to apply `@SpringBootApplication`.
- Similarly, name the package `com.wool`, and create `ModuleStreamApplication.kt` underneath it with the following source code:

```kotlin
package com.wool

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication

@SpringBootApplication
class ModuleStreamApplication

fun main(args: Array<String>) {
    runApplication<ModuleStreamApplication>(*args)
}
```
Writing application.yml
- Since this is a Spring Boot application, write an `application.yml`.
- Create `application.yml` under the `resources` directory and add the Kafka configuration values.
- For now, since we're only doing produce and consume testing, it may look a bit different from the other module work.

```yaml
spring:
  jackson:
    serialization:
      fail-on-empty-beans: false
  kafka:
    properties:
      session:
        timeout:
          ms: 45000
      sasl:
        mechanism: PLAIN
        jaas:
          config: {}
      security:
        protocol: SASL_SSL
      bootstrap:
        servers: {}
```
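The SASL/PLAIN over `SASL_SSL` settings above match a managed cluster such as Confluent Cloud, with the JAAS config and broker addresses left as `{}` placeholders to fill in. If you're testing against a plain local broker instead, a minimal sketch (assuming the default `localhost:9092` address and no authentication) could be:

```yaml
# A minimal sketch for a local, unauthenticated broker (assumes default port 9092)
spring:
  kafka:
    bootstrap-servers: localhost:9092
```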
Connecting to Kafka
- Let’s create a Consumer Class that subscribes to order-related data
- If you've properly added the configuration values in `application.yml`, Spring will automatically connect to Kafka
Writing OrderConsumer Class
- Create a `consumer` package under the `com.wool` package and create the `OrderConsumer` class:

```kotlin
package com.wool.consumer

import org.springframework.kafka.annotation.KafkaListener
import org.springframework.stereotype.Service

@Service
class OrderConsumer {

    @KafkaListener(topics = ["order"], groupId = "order-consumer")
    fun consume(message: String) {
        println("###########################")
        println(message)
        println("###########################")
    }
}
```

- Using `@KafkaListener`, we created a Consumer that subscribes to the `order` topic with group ID `order-consumer`
- Added the `@Service` annotation to register it as a Bean in Spring Boot
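Since the producer in the next section serializes `OrderProduceDto` to JSON, the listener could also map the payload back into the DTO instead of printing the raw string. A minimal sketch, assuming `jackson-module-kotlin` is on the classpath; the `OrderJsonConsumer` name and separate group ID are hypothetical:

```kotlin
package com.wool.consumer

import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue
import com.wool.controller.dto.OrderProduceDto
import org.springframework.kafka.annotation.KafkaListener
import org.springframework.stereotype.Service

// A sketch, not part of the original post: maps the JSON payload back into the DTO.
// Assumes jackson-module-kotlin is on the classpath so Jackson can construct
// Kotlin data classes.
@Service
class OrderJsonConsumer {

    private val objectMapper = jacksonObjectMapper()

    @KafkaListener(topics = ["order"], groupId = "order-json-consumer")
    fun consume(message: String) {
        val order: OrderProduceDto = objectMapper.readValue(message)
        println("Received ${order.orderItem} order from ${order.orderStoreName}")
    }
}
```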
Writing OrderProducer Class
- Create a `producer` package under the `com.wool` package and create the `OrderProducer` class:

```kotlin
package com.wool.producer

import com.fasterxml.jackson.databind.ObjectMapper
import com.wool.controller.dto.OrderProduceDto
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Service

@Service
class OrderProducer(
    private val kafkaTemplate: KafkaTemplate<String, String>
) {
    val KAFKA_ORDER_TOPIC: String = "order"

    fun sendOrderMessage(message: OrderProduceDto) {
        // Serialize OrderProduceDto to JSON
        val objectMapper = ObjectMapper()
        val jsonMessage = objectMapper.writeValueAsString(message)
        kafkaTemplate.send(KAFKA_ORDER_TOPIC, jsonMessage)
    }
}
```

- Added the `@Service` annotation to register it as a Bean in Spring Boot
- Injected `KafkaTemplate` and created a method that produces messages
- Through `kafkaTemplate.send(KAFKA_ORDER_TOPIC, jsonMessage)`, we can send messages to the `order` topic
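Note that `send()` is asynchronous, so the call returns before the broker acknowledges the record. Below is a sketch of logging the delivery result, assuming Spring Kafka 3.x where `send()` returns a `CompletableFuture<SendResult<K, V>>` (older versions return a `ListenableFuture` instead); the class name and log format are mine:

```kotlin
package com.wool.producer

import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Service

// A sketch, assuming Spring Kafka 3.x: send() returns a CompletableFuture<SendResult>,
// so delivery success or failure can be logged asynchronously.
@Service
class LoggingOrderProducer(
    private val kafkaTemplate: KafkaTemplate<String, String>
) {
    fun send(topic: String, message: String) {
        kafkaTemplate.send(topic, message).whenComplete { result, ex ->
            if (ex != null) {
                println("Failed to send to $topic: ${ex.message}")
            } else {
                val meta = result.recordMetadata
                println("Sent to ${meta.topic()}-${meta.partition()} @ offset ${meta.offset()}")
            }
        }
    }
}
```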
Writing ProducerController
- It would be nice to produce messages periodically whenever the Stream application runs, but since the data comes in after orders are received, let's create a Controller that calls the Producer
- Let's create a Controller and write it to call `OrderProducer` when the API is called
- Create the dto and controller in the `com.wool.controller` package
Writing OrderProduceDto
```kotlin
package com.wool.controller.dto

data class OrderProduceDto(
    val orderStoreName: String,
    val orderStoreAddress: String,
    val orderItem: String,
    val orderPrice: String,
    val customerId: Int,
)
```
- This was created assuming that, at produce time, the user is already logged in from the app or web and can send their own id value
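For reference, here's a quick sketch of the payload this DTO serializes to; the field values are hypothetical sample data:

```kotlin
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.wool.controller.dto.OrderProduceDto

// A sketch with hypothetical values; prints JSON like:
// {"orderStoreName":"Wool Cafe","orderStoreAddress":"Seoul","orderItem":"latte","orderPrice":"4500","customerId":1}
fun main() {
    val dto = OrderProduceDto(
        orderStoreName = "Wool Cafe",
        orderStoreAddress = "Seoul",
        orderItem = "latte",
        orderPrice = "4500",
        customerId = 1,
    )
    println(jacksonObjectMapper().writeValueAsString(dto))
}
```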
Writing ProducerController
```kotlin
package com.wool.controller

import com.wool.controller.dto.OrderProduceDto
import com.wool.producer.OrderProducer
import org.springframework.web.bind.annotation.PostMapping
import org.springframework.web.bind.annotation.RequestBody
import org.springframework.web.bind.annotation.RestController

@RestController
class ProducerController(
    private val orderProducer: OrderProducer
) {

    @PostMapping("/order-produce")
    fun produceOrder(@RequestBody orderDto: OrderProduceDto) {
        orderProducer.sendOrderMessage(orderDto)
    }
}
```
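To exercise the endpoint by hand, any HTTP client will do; below is a minimal sketch using the JDK 11+ `HttpClient`, assuming the application runs on `localhost:8080` (the sample values are hypothetical):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// A sketch: POSTs a sample order (hypothetical values) to the local application.
fun main() {
    val body = """{"orderStoreName":"Wool Cafe","orderStoreAddress":"Seoul","orderItem":"latte","orderPrice":"4500","customerId":1}"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8080/order-produce"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println("HTTP ${response.statusCode()}")
}
```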
Wrapping Up
- When you run the Spring Boot application, the Consumer watches Kafka and remains subscribed to the `order` topic
- When you send a `POST` request to `http://localhost:8080/order-produce`, `OrderProducer` sends a message to the `order` topic, and the Consumer receives and prints the message
- Although this content wasn't directly related to multi-module concerns, in the next post let's follow the flow of `api order request -> order produce -> order consume -> save order -> api query` to take advantage of multi-module benefits
- It's a flow that may not seem strictly necessary, but let's try building it with multi-module in mind