Your payload is so small that I suspect an encoding issue. Is your producer
expecting a byte array while you're passing it a string, or vice versa?
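
For example, something along these lines (a rough sketch only: the broker
address is a placeholder, and kafka-python's argument handling has shifted a
bit between releases, so check it against the version you have installed):

    import json
    from kafka import KafkaClient, SimpleProducer

    # placeholder broker address -- point this at your own broker
    client = KafkaClient('localhost:9092')
    producer = SimpleProducer(client)

    payload = {"utcdt": "2015-07-12T03:59:36", "ghznezzhmx": "apple"}

    # send_messages() expects each message as a byte string, so serialize
    # the dict to JSON and encode it before handing it over
    producer.send_messages('topic-test-development',
                           json.dumps(payload).encode('utf-8'))

Worth double-checking what type test_payload actually is at the call site in
your test.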

On Sat, Jul 11, 2015 at 11:08 PM, David Montgomery <
davidmontgom...@gmail.com> wrote:

> I can't send such a simple payload using Python.
>
> topic: topic-test-development
> payload: {"utcdt": "2015-07-12T03:59:36", "ghznezzhmx": "apple"}
>
>
> No handlers could be found for logger "kafka.conn"
> Traceback (most recent call last):
>   File "/home/ubuntu/workspace/feed-tests/tests/druid-adstar.py", line 81,
> in <module>
>     test_send_data_to_realtimenode()
>   File "/home/ubuntu/workspace/feed-tests/tests/druid-adstar.py", line 38,
> in test_send_data_to_realtimenode
>     response = producer.send_messages(test_topic,test_payload)
>   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/simple.py",
> line 54, in send_messages
>     topic, partition, *msg
>   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/base.py",
> line 349, in send_messages
>     return self._send_messages(topic, partition, *msg)
>   File "/usr/local/lib/python2.7/dist-packages/kafka/producer/base.py",
> line 390, in _send_messages
>     fail_on_error=self.sync_fail_on_error
>   File "/usr/local/lib/python2.7/dist-packages/kafka/client.py", line 480,
> in send_produce_request
>     (not fail_on_error or not self._raise_on_response_error(resp))]
>   File "/usr/local/lib/python2.7/dist-packages/kafka/client.py", line 247,
> in _raise_on_response_error
>     raise resp
> kafka.common.FailedPayloadsError
>
> Here is what is in my broker logs:
> [2015-07-12 03:29:58,103] INFO Closing socket connection to /xxx.xxx.xxx.xxx due to invalid request: Request of length 1550939497 is not valid, it is larger than the maximum size of 104857600 bytes. (kafka.network.Processor)
>
> The server has 4 GB of RAM.
>
> I used export KAFKA_HEAP_OPTS="-Xmx256M -Xms128M" in kafka-server-start.sh.
>
> So... why?
>
