Hello,

I'm running PySpark on Dataproc and tried to add a new Python package. I
zipped the confluent_kafka package and attached the zip to my spark-submit
command via --py-files.
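
For reference, as far as I understand this is roughly the programmatic
equivalent of what the --py-files flag does; the app name and bucket path
below are just placeholders, not my real setup:

    from pyspark.sql import SparkSession

    # "kafka-job" and the GCS path are illustrative placeholders
    spark = SparkSession.builder.appName("kafka-job").getOrCreate()

    # Roughly what --py-files does: ship the zip to the executors and put it
    # on their PYTHONPATH
    spark.sparkContext.addPyFile("gs://my-bucket/confluent_kafka.zip")

    from confluent_kafka import Producer  # this import is what ends up failing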

For some reason I keep getting the following error:
ModuleNotFoundError: No module named 'confluent_kafka.cimpl'

When running locally with Python 3, I didn't encounter any issues using this
package.
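
For example, even something as simple as this works fine in my local
Python 3 environment:

    import confluent_kafka
    from confluent_kafka import Producer

    # Imports resolve without any cimpl problem on my machine
    print(confluent_kafka.version())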

The version I used is confluent_kafka 2.6.1.

Any suggestions?
