
error decoding packet: CRC didn't match #100

Closed
giftig opened this issue Apr 2, 2019 · 1 comment
giftig commented Apr 2, 2019

While reading from a test topic in a local wurstmeister/kafka docker container:

(kafka) Downloads $ kt consume -topic foo -offsets all=oldest: -verbose
sarama client configuration &sarama.Config{Net:struct { MaxOpenRequests int; DialTimeout time.Duration; ReadTimeout time.Duration; WriteTimeout time.Duration; TLS struct { Enable bool; Config *tls.Config }; SASL struct { Enable bool; Handshake bool; User string; Password string }; KeepAlive time.Duration }{MaxOpenRequests:5, DialTimeout:30000000000, ReadTimeout:30000000000, WriteTimeout:30000000000, TLS:struct { Enable bool; Config *tls.Config }{Enable:false, Config:(*tls.Config)(nil)}, SASL:struct { Enable bool; Handshake bool; User string; Password string }{Enable:false, Handshake:true, User:"", Password:""}, KeepAlive:0}, Metadata:struct { Retry struct { Max int; Backoff time.Duration }; RefreshFrequency time.Duration }{Retry:struct { Max int; Backoff time.Duration }{Max:3, Backoff:250000000}, RefreshFrequency:600000000000}, Producer:struct { MaxMessageBytes int; RequiredAcks sarama.RequiredAcks; Timeout time.Duration; Compression sarama.CompressionCodec; Partitioner sarama.PartitionerConstructor; Return struct { Successes bool; Errors bool }; Flush struct { Bytes int; Messages int; Frequency time.Duration; MaxMessages int }; Retry struct { Max int; Backoff time.Duration } }{MaxMessageBytes:1000000, RequiredAcks:1, Timeout:10000000000, Compression:0, Partitioner:(sarama.PartitionerConstructor)(0x64b6b0), Return:struct { Successes bool; Errors bool }{Successes:false, Errors:true}, Flush:struct { Bytes int; Messages int; Frequency time.Duration; MaxMessages int }{Bytes:0, Messages:0, Frequency:0, MaxMessages:0}, Retry:struct { Max int; Backoff time.Duration }{Max:3, Backoff:100000000}}, Consumer:struct { Retry struct { Backoff time.Duration }; Fetch struct { Min int32; Default int32; Max int32 }; MaxWaitTime time.Duration; MaxProcessingTime time.Duration; Return struct { Errors bool }; Offsets struct { CommitInterval time.Duration; Initial int64; Retention time.Duration } }{Retry:struct { Backoff time.Duration }{Backoff:2000000000}, Fetch:struct { Min int32; Default int32; Max int32 }{Min:1, Default:32768, Max:0}, MaxWaitTime:250000000, MaxProcessingTime:100000000, Return:struct { Errors bool }{Errors:false}, Offsets:struct { CommitInterval time.Duration; Initial int64; Retention time.Duration }{CommitInterval:1000000000, Initial:-1, Retention:0}}, ClientID:"kt-consume-giftiger_wunsch", ChannelBufferSize:256, Version:sarama.KafkaVersion{version:[4]uint{0x0, 0xa, 0x0, 0x0}}, MetricRegistry:(*metrics.StandardRegistry)(0xc420084820)}
2019/04/02 22:01:09 Initializing new client
2019/04/02 22:01:09 client/metadata fetching metadata for all topics from broker localhost:9092
2019/04/02 22:01:09 Connected to broker at localhost:9092 (unregistered)
2019/04/02 22:01:09 client/brokers registered new broker #1001 at localhost:9092
2019/04/02 22:01:09 Successfully initialized new client
2019/04/02 22:01:09 Connected to broker at localhost:9092 (registered as #1001)
2019/04/02 22:01:09 consumer/broker/1001 added subscription to foo/0
2019/04/02 22:01:09 consumer/broker/1001 disconnecting due to error processing FetchRequest: kafka: error decoding packet: CRC didn't match
2019/04/02 22:01:09 Closed connection to broker localhost:9092
2019/04/02 22:01:09 kafka: error while consuming foo/0: kafka: error decoding packet: CRC didn't match

This seems to be an issue in the underlying sarama library, but it was fixed there last year: IBM/sarama#1149

I'm using Kafka 2.12-2.1.0 and the latest kt release. I note that your latest release predates that sarama bugfix by quite a while, so this probably just needs a sarama upgrade.
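
For reference, a minimal standalone sarama consumer along the lines below goes through the same fetch path that the log above shows failing. This is only a sketch: the broker address, topic, and the explicit sarama.V2_1_0_0 protocol version are assumptions matching my local setup (the verbose dump above shows kt defaulting to 0.10.0.0), not what kt itself does.

package main

import (
	"fmt"
	"log"

	"github.com/Shopify/sarama" // import path of the sarama releases current at the time
)

func main() {
	cfg := sarama.NewConfig()
	// Assumption for this sketch: tell sarama the broker's protocol version
	// instead of relying on the 0.10.0.0 default shown in the dump above.
	cfg.Version = sarama.V2_1_0_0

	consumer, err := sarama.NewConsumer([]string{"localhost:9092"}, cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()

	// Same topic, partition and starting offset as the kt invocation above.
	pc, err := consumer.ConsumePartition("foo", 0, sarama.OffsetOldest)
	if err != nil {
		log.Fatal(err)
	}
	defer pc.Close()

	for msg := range pc.Messages() {
		fmt.Printf("offset=%d value=%s\n", msg.Offset, msg.Value)
	}
}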


giftig commented Apr 2, 2019

Actually, I notice that d9ec6d updates sarama, so I think the problem is just that there hasn't been a release since. I'll close this and raise a separate issue.
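
In the meantime, building kt from master should pick up that commit. Assuming a GOPATH-style Go setup and that the import path is github.com/fgeller/kt, something like

go get -u github.com/fgeller/kt

is enough to get a binary with the newer sarama vendored in.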
