  • The stream module doesn’t need JPA or the module-core entities yet.
  • While setting up the stream module, I’m going to document how SpringBoot connects with Kafka and how Consumer and Producer are configured.
  • For Kafka, you can use local Kafka, server-deployed Kafka, Docker, Confluent, or any solution that supports Kafka.
  • This post is written assuming Kafka is already prepared.

Creating module-stream

  • We’ll continue working from the module-stream we created last time.
  • Since Kafka will only be used in the module-stream application, we only need to add dependencies to module-stream.
  • Let’s create a Consumer and a Producer within the SpringBoot Application.
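
  • As a reminder, module-stream must be included in the root settings.gradle.kts for Gradle to build it. A minimal sketch (the root project name and module list are assumptions based on this series):

    rootProject.name = "multi-module" // assumption: use your actual root project name

    include("module-core")
    include("module-stream")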

Configuring module-stream

build.gradle.kts Configuration

  • Since Kafka will only be used in the module-stream application, add dependencies to module-stream’s build.gradle.kts

    plugins {
    }

    dependencies {
        // No version needed: it is managed by SpringBoot's dependency management (set up in the root build)
        implementation("org.springframework.kafka:spring-kafka")
    }
    

Writing SpringBoot Application Class

  • Since we’ll run this as a SpringBoot application, create a class annotated with @SpringBootApplication.
  • As before, name the package com.wool, and create ModuleStreamApplication.kt underneath it with the following source code.

    package com.wool
    
    import org.springframework.boot.autoconfigure.SpringBootApplication
    import org.springframework.boot.runApplication
    
    @SpringBootApplication
    class ModuleStreamApplication
    
    fun main(args: Array<String>) {
        runApplication<ModuleStreamApplication>(*args)
    }
    

Writing application.yml

  • Since we’re running SpringBoot, we also need an application.yml.
  • Create application.yml under the resources directory and add the Kafka configuration values.
  • For now we’re only testing produce and consume, so the configuration may look a bit different from what the module will eventually use.

    spring:
      jackson:
        serialization:
          fail-on-empty-beans: false
    
      kafka:
        properties:
          session:
            timeout:
              ms: 45000
          sasl:
            mechanism: PLAIN
            jaas:
              config: {}
          security:
            protocol: SASL_SSL
          bootstrap:
            servers: {}
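
  • The {} placeholders above are where your cluster’s SASL JAAS config and broker addresses go; they’re intentionally left blank here.
  • If you’re running an unauthenticated local broker instead (e.g. Kafka in Docker), a minimal sketch of the equivalent configuration would be as follows (localhost:9092 is an assumption, adjust to your setup):

    spring:
      kafka:
        bootstrap-servers: localhost:9092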
    

Connecting to Kafka

  • Let’s create a Consumer Class that subscribes to order-related data
  • If you’ve properly added configuration values in application.yml, Spring will automatically connect to Kafka

Writing OrderConsumer Class

  • Create a consumer package under the com.wool package and create the OrderConsumer class

    package com.wool.consumer
    
    import org.springframework.kafka.annotation.KafkaListener
    import org.springframework.stereotype.Service
    
    
    @Service
    class OrderConsumer {
    
        @KafkaListener(topics = ["order"], groupId = "order-consumer")
        fun consume(message: String) {
            println("###########################")
            println(message)
            println("###########################")
        }
    }
    
  • Using @KafkaListener, we created a Consumer that subscribes to the order topic with group ID order-consumer
  • Added the @Service annotation to register it as a Bean in SpringBoot
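
  • Since the Producer we’ll write next sends JSON, you can also deserialize the message back into a DTO inside the consumer. A minimal sketch, assuming jackson-module-kotlin is on the classpath and using the OrderProduceDto defined later in this post:

    package com.wool.consumer

    import com.fasterxml.jackson.databind.ObjectMapper
    import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
    import com.fasterxml.jackson.module.kotlin.readValue
    import com.wool.controller.dto.OrderProduceDto
    import org.springframework.kafka.annotation.KafkaListener
    import org.springframework.stereotype.Service

    @Service
    class OrderJsonConsumer {

        // jacksonObjectMapper() registers the Kotlin module so data classes deserialize cleanly
        private val objectMapper: ObjectMapper = jacksonObjectMapper()

        // A different groupId from OrderConsumer, so both consumers receive every order message
        @KafkaListener(topics = ["order"], groupId = "order-json-consumer")
        fun consume(message: String) {
            val order: OrderProduceDto = objectMapper.readValue(message)
            println("received order for customer ${order.customerId}: ${order.orderItem}")
        }
    }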

Writing OrderProducer Class

  • Create a producer package under the com.wool package and create the OrderProducer class

    package com.wool.producer
    
    import com.fasterxml.jackson.databind.ObjectMapper
    import com.wool.controller.dto.OrderProduceDto
    import org.springframework.kafka.core.KafkaTemplate
    import org.springframework.stereotype.Service
    
    @Service
    class OrderProducer(
        private val kafkaTemplate: KafkaTemplate<String, String>
    ) {

        val KAFKA_ORDER_TOPIC: String = "order"

        fun sendOrderMessage(message: OrderProduceDto) {
            // JSON serialize OrderProduceDto
            val objectMapper: ObjectMapper = ObjectMapper()
            val jsonMessage = objectMapper.writeValueAsString(message)

            kafkaTemplate.send(KAFKA_ORDER_TOPIC, jsonMessage)
        }
    }
    
  • Added the @Service annotation to register it as a Bean in SpringBoot
  • Injected KafkaTemplate and created a produce method
  • Through kafkaTemplate.send(KAFKA_ORDER_TOPIC, jsonMessage), we can send messages to the order topic
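
  • As a variant, you can inject SpringBoot’s auto-configured ObjectMapper instead of constructing one per call, and attach a completion callback. A sketch, assuming Spring Kafka 3.x, where send() returns a CompletableFuture:

    package com.wool.producer

    import com.fasterxml.jackson.databind.ObjectMapper
    import com.wool.controller.dto.OrderProduceDto
    import org.springframework.kafka.core.KafkaTemplate
    import org.springframework.stereotype.Service

    @Service
    class CallbackOrderProducer(
        private val kafkaTemplate: KafkaTemplate<String, String>,
        private val objectMapper: ObjectMapper // reuse the Bean SpringBoot auto-configures
    ) {

        val KAFKA_ORDER_TOPIC: String = "order"

        fun sendOrderMessage(message: OrderProduceDto) {
            val jsonMessage = objectMapper.writeValueAsString(message)

            // In Spring Kafka 3.x, send() returns CompletableFuture<SendResult<K, V>>
            kafkaTemplate.send(KAFKA_ORDER_TOPIC, jsonMessage).whenComplete { result, ex ->
                if (ex != null) {
                    println("produce failed: ${ex.message}")
                } else {
                    val meta = result.recordMetadata
                    println("produced to ${meta.topic()}-${meta.partition()}@${meta.offset()}")
                }
            }
        }
    }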

Writing ProducerController and OrderProduceDto

  • It would be nice to produce messages periodically while the Stream Application runs, but since order data only arrives when an order is placed, let’s create a Controller that calls the Producer instead
  • Let’s create a Controller and write it to call OrderProducer when the API is called
  • Create dto and controller in the com.wool.controller package

Writing OrderProduceDto

    package com.wool.controller.dto

    data class OrderProduceDto(
        val orderStoreName: String,
        val orderStoreAddress: String,
        val orderItem: String,
        val orderPrice: String,
        val customerId: Int,
    )

  • The DTO assumes that, at produce time, the user is already logged in on the app or web client and can send their customer id along with the order

Writing ProducerController

    package com.wool.controller

    import com.wool.controller.dto.OrderProduceDto
    import com.wool.producer.OrderProducer
    import org.springframework.web.bind.annotation.PostMapping
    import org.springframework.web.bind.annotation.RequestBody
    import org.springframework.web.bind.annotation.RestController

    @RestController
    class ProducerController(
        private val orderProducer: OrderProducer
    ) {

        @PostMapping("/order-produce")
        fun produceOrder(@RequestBody orderDto: OrderProduceDto) {
            orderProducer.sendOrderMessage(orderDto)
        }
    }
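
  • To try the endpoint end to end, any HTTP client works. Below is a minimal Kotlin sketch using JDK 11’s java.net.http, with made-up sample values for the payload:

    import java.net.URI
    import java.net.http.HttpClient
    import java.net.http.HttpRequest
    import java.net.http.HttpResponse

    fun main() {
        // Hypothetical order payload matching OrderProduceDto's fields
        val body = """
            {
              "orderStoreName": "wool-store",
              "orderStoreAddress": "Seoul",
              "orderItem": "coffee",
              "orderPrice": "4500",
              "customerId": 1
            }
        """.trimIndent()

        val request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8080/order-produce"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build()

        // If the Consumer is running, the message should appear in the application log
        val response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString())
        println("status: ${response.statusCode()}")
    }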

Wrapping Up

  • When you run the SpringBoot Application, the Consumer connects to Kafka and stays subscribed to the order topic
  • When you send a POST request to http://localhost:8080/order-produce, OrderProducer sends a message to the order topic, and the Consumer receives and prints it
  • Although this post wasn’t directly about multi-module, in the next post let’s follow the flow of api order request -> order produce -> order consume -> save order -> api query to take advantage of multi-module benefits
  • It’s a flow that may not seem strictly necessary, but let’s try building it with multi-module in mind
