Issue
I am trying to build a simple application that saves a user's information to a database. I send the user information through a producer and want to consume it in a consumer. For that I created an EventModel that holds all the user information, and I pass that model through the producer. The relevant code is below. The problem, as far as I can tell from the logs, is that the producer sends the user information but the consumer is not able to consume it.
Exception in kafka.log
[]2018-01-22T06:41:15,797Z INFO
o.s.k.l.KafkaMessageListenerContainer - partitions revoked:[]
[]2018-01-22T06:41:15,828Z INFO o.s.k.l.KafkaMessageListenerContainer
- partitions assigned:[com.combine.domain.addUser-0]
[]2018-01-22T06:42:30,962Z ERROR o.s.k.listener.LoggingErrorHandler -
Error while processing: ConsumerRecord(topic =
com.combine.domain.addUser, partition = 0,
offset = 1, key = null, value = AddUserEventModel(name=Test-User-
Kafka-2, address=Test-User-Kafka-2, age=26))
org.springframework.kafka.KafkaException: No method found for class com.example.data.combine.eventmodel.AddUserEventModel
at org.springframework.kafka.listener.adapter.DelegatingInvocableHandler.getHandlerForPayload(DelegatingInvocableHandler.java:92)
at org.springframework.kafka.listener.adapter.DelegatingInvocableHandler.getMethodNameFor(DelegatingInvocableHandler.java:146)
at org.springframework.kafka.listener.adapter.HandlerAdapter.getMethodAsString(HandlerAdapter.java:60)
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:131)
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.onMessage(MessagingMessageListenerAdapter.java:101)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:618)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.access$1500(KafkaMessageListenerContainer.java:236)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ListenerInvoker.run(KafkaMessageListenerContainer.java:797)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:745)
[]2018-01-22T08:21:20,074Z INFO o.s.k.l.KafkaMessageListenerContainer-
partitions revoked:[com.combine.domain.addUser-0]
[]2018-01-22T08:21:20,081Z INFO o.s.k.l.KafkaMessageListenerContainer-
partitions assigned:[com.combine.domain.addUser-0]
Exception in app.log
[]2018-01-22T06:42:30,800Z INFO c.e.d.c.controller.MongoController
- # send user by kafka model : AddUserEventModel(name=Test-User-
Kafka-2, address=Test-User-Kafka-2, age=26) with parameter
[]2018-01-22T06:42:30,800Z INFO c.e.d.c.publisher.AddUserPublished
- #sending addUserEventModel
[]2018-01-22T06:42:30,809Z INFO o.a.k.c.producer.ProducerConfig -
ProducerConfig values:
compression.type = none
metric.reporters = []
metadata.max.age.ms = 300000
metadata.fetch.timeout.ms = 60000
reconnect.backoff.ms = 50
sasl.kerberos.ticket.renew.window.factor = 0.8
bootstrap.servers = [localhost:9092]
retry.backoff.ms = 100
sasl.kerberos.kinit.cmd = /usr/bin/kinit
buffer.memory = 33554432
timeout.ms = 30000
key.serializer = class org.springframework.kafka.support.serializer.JsonSerializer
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
ssl.keystore.type = JKS
ssl.trustmanager.algorithm = PKIX
block.on.buffer.full = false
ssl.key.password = null
max.block.ms = 60000
sasl.kerberos.min.time.before.relogin = 60000
connections.max.idle.ms = 540000
ssl.truststore.password = null
max.in.flight.requests.per.connection = 5
metrics.num.samples = 2
client.id =
ssl.endpoint.identification.algorithm = null
ssl.protocol = TLS
request.timeout.ms = 30000
ssl.provider = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
acks = 1
batch.size = 16384
ssl.keystore.location = null
receive.buffer.bytes = 32768
ssl.cipher.suites = null
ssl.truststore.type = JKS
security.protocol = PLAINTEXT
retries = 0
max.request.size = 1048576
value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer
ssl.truststore.location = null
ssl.keystore.password = null
ssl.keymanager.algorithm = SunX509
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
send.buffer.bytes = 131072
linger.ms = 0
[]2018-01-22T06:42:30,832Z INFO o.a.k.common.utils.AppInfoParser -
Kafka version : 0.9.0.1
[]2018-01-22T06:42:30,833Z INFO o.a.k.common.utils.AppInfoParser -
Kafka commitId : 23c69d62a0cabf06
[]2018-01-22T08:21:15,047Z INFO o.a.k.c.c.i.AbstractCoordinator -
Marking the coordinator 2147483647 dead.
[]2018-01-22T08:21:19,485Z ERROR o.a.k.c.c.i.ConsumerCoordinator -
Error UNKNOWN_MEMBER_ID occurred while committing offsets for group
com.combine.domain.addUser
[]2018-01-22T08:21:19,486Z WARN o.a.k.c.c.i.ConsumerCoordinator -
Auto offset commit failed: Commit cannot be completed due to group
rebalance
[]2018-01-22T08:21:19,487Z ERROR o.a.k.c.c.i.ConsumerCoordinator -
Error UNKNOWN_MEMBER_ID occurred while committing offsets for
group com.combine.domain.addUser
[]2018-01-22T08:21:19,487Z WARN o.a.k.c.c.i.ConsumerCoordinator -
Auto offset commit failed:
[]2018-01-22T08:21:20,075Z INFO o.a.k.c.c.i.AbstractCoordinator -
Attempt to join group com.combine.domain.addUser failed due to
unknown member id, resetting and retrying.
pom.xml
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>${spring-kafka-version}</version>
</dependency>
Kafka configuration
@Configuration
public class BaseKafkaConfiguration {

    @Value("${spring.kafka.bootstrap-servers}")
    private String servers;

    @Bean
    public ProducerFactory<String, AddUserEventModel> producerFactory() {
        Map<String, Object> property = new HashMap<>();
        property.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        property.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        property.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(property);
    }

    @Bean
    public KafkaTemplate<String, AddUserEventModel> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
BasicConsumerConfig
@EnableKafka
@Configuration
public class BasicConsumerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String servers;

    public ConsumerFactory<String, AddUserEventModel> kafkaConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
                new JsonDeserializer<>(AddUserEventModel.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, AddUserEventModel> containerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, AddUserEventModel> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(kafkaConsumerFactory());
        return factory;
    }
}
Publisher
@Slf4j
@Component
public class AddUserPublished {

    @Autowired
    private KafkaTemplate<String, AddUserEventModel> kafkaTemplate;

    public void publish(AddUserEventModel addUserEventModel) {
        log.info("#sending addUserEventModel");
        kafkaTemplate.send(DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER, addUserEventModel);
        try {
            TimeUnit.MILLISECONDS.sleep(2000);
        } catch (InterruptedException e) {
            log.error("exception at addUserEventModel while thread sleep", e);
        }
    }
}
Consumer
@Component
@KafkaListener(topics = DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER,
        containerFactory = "containerFactory")
@Slf4j
public class AddUserConsumer {

    @Autowired
    private MongoUserRepository mongoUserRepository;

    @Autowired
    private ObjectMapper objectMapper;

    public void addUserConsumer(AddUserEventModel addUserEventModel) {
        log.info("#AddUserConsumer consuming addUserEventModel : {}", addUserEventModel);
        try {
            MongoUser mongoUser = new MongoUser();
            BeanUtils.copyProperties(addUserEventModel, mongoUser);
            this.mongoUserRepository.save(mongoUser);
            log.info("#Successfully saved consumed object : {}", mongoUser);
        } catch (Exception e) {
            log.error("#AddUserConsumer exception during consume addUserEventModel : {}, with error : {}",
                    addUserEventModel, e);
        }
    }
}
DomainEventNames
public final class DomainEventNames {
    public static final String COM_COMBINE_DOMAIN_ADD_USER = "com.combine.domain.addUser";
}
application.properties
spring.kafka.bootstrap-servers=localhost:9092
topicName
com.combine.domain.addUser
I created the above topic on my local broker.
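For reference, a topic like this is created with the standard Kafka CLI; the script path and flags below assume a local 0.9.x broker with ZooKeeper on the default port, so adjust them to your installation:

```shell
# Create the topic on a local broker (0.9.x uses --zookeeper; newer
# versions use --bootstrap-server localhost:9092 instead)
bin/kafka-topics.sh --create \
  --zookeeper localhost:2181 \
  --replication-factor 1 \
  --partitions 1 \
  --topic com.combine.domain.addUser
```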
Solution
When you use @KafkaListener at the class level, you have to annotate the handler method with @KafkaHandler: https://docs.spring.io/spring-kafka/docs/2.1.1.RELEASE/reference/html/_reference.html#class-level-kafkalistener
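A minimal sketch of the corrected listener, reusing the model, repository, and container factory names from the question:

```java
@Slf4j
@Component
// Class-level listener: the topic applies to every @KafkaHandler method in the class.
@KafkaListener(topics = DomainEventNames.COM_COMBINE_DOMAIN_ADD_USER,
        containerFactory = "containerFactory")
public class AddUserConsumer {

    @Autowired
    private MongoUserRepository mongoUserRepository;

    // @KafkaHandler marks this method as the handler for payloads of this type.
    // Without it, the listener adapter cannot route the record and fails with
    // "No method found for class ...AddUserEventModel".
    @KafkaHandler
    public void addUserConsumer(AddUserEventModel addUserEventModel) {
        log.info("#AddUserConsumer consuming addUserEventModel : {}", addUserEventModel);
        MongoUser mongoUser = new MongoUser();
        BeanUtils.copyProperties(addUserEventModel, mongoUser);
        mongoUserRepository.save(mongoUser);
    }
}
```

Alternatively, if the class has only one handler method, you can move @KafkaListener down onto the method itself and drop @KafkaHandler entirely.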
Answered By - Artem Bilan