I wanted to use a Ruby Kafka client library for producing events, but have come up against a problem that I am not sure how to solve. Any help would be appreciated.
I have tried kafka-rb (the acrosa, mheffner and bpot forks). The problem is that no matter what I send via the library, e.g.
require 'kafka'
host = 'localhost'
port = 9092
producer = Kafka::Producer.new(
  :topic => 'login',
  :host  => host,
  :port  => port
)
producer.send([Kafka::Message.new("aaaaa")])
I get a:
java.nio.BufferUnderflowException
at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:127)
at java.nio.ByteBuffer.get(ByteBuffer.java:675)
at kafka.api.ApiUtils$.readShortString(ApiUtils.scala:22)
at kafka.api.ProducerRequest$.readFrom(ProducerRequest.scala:34)
at kafka.api.RequestKeys$$anonfun$1.apply(RequestKeys.scala:34)
at kafka.api.RequestKeys$$anonfun$1.apply(RequestKeys.scala:34)
at kafka.network.RequestChannel$Request.<init>(RequestChannel.scala:48)
at kafka.network.Processor.read(SocketServer.scala:321)
at kafka.network.Processor.run(SocketServer.scala:231)
at java.lang.Thread.run(Thread.java:680)
on the server. On the same server I can send text through the provided console producer without any issues.
If you have seen this before, I would appreciate the help. I am not very familiar with Scala, so I am not sure what the problem is, but the line where the exception is thrown seems to be reading the clientId from the socket, and as far as I can tell the Ruby client never sends one.
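If I read the broker code right, readShortString expects a 2-byte length followed by that many bytes for the client id. Here is a rough Ruby sketch of what I understand it to be doing (names are mine, just to illustrate the parsing, so it may not match the Scala exactly):

require 'stringio'

# Rough sketch of what I understand ApiUtils.readShortString to do:
# read a 2-byte big-endian length, then that many bytes.
# If the client never wrote this field, the broker interprets whatever
# bytes happen to be there as a length and can run past the end of the
# buffer -- which I assume is where the BufferUnderflowException comes from.
def read_short_string(io)
  len_bytes = io.read(2) or raise 'buffer underflow'
  len = len_bytes.unpack('s>').first   # signed 16-bit, big-endian
  return nil if len < 0                # negative length seems to mean "no string"
  str = io.read(len)
  raise 'buffer underflow' if str.nil? || str.bytesize < len
  str
end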
When I compare the requests from kafka-rb and the provided console producer in tcpdump, the Ruby ones look shorter. It also does not matter whether I use Kafka 0.7 or 0.8; I get exactly the same behaviour.
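For reference, this is roughly what I believe kafka-rb puts on the wire for a produce request (my reading of the gem's encoder, so the field layout may be off). If that is right, it would explain why the Ruby requests look shorter: there is no client id, correlation id or acks/timeout header in them.

# Roughly what I believe kafka-rb writes (an 0.7-style produce request).
# Field names and order are my reading of the gem and may not be exact.
def encode_produce_request(topic, partition, message_set)
  request  = [0].pack('n')                                # request type: 0 = produce
  request += [topic.bytesize].pack('n') + topic           # short-string topic
  request += [partition].pack('N')                        # partition
  request += [message_set.bytesize].pack('N') + message_set
  [request.bytesize].pack('N') + request                  # length-prefixed frame
end

# An 0.8 broker, as far as I can tell, instead expects a header with an
# api key/version, correlation id and a short-string client id before the
# topic data -- none of which appears in the bytes above.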
