Hi Group

I want to send image data between C++ and Python over websockets. On both
sides I use QtWebSockets and protobuf. My message looks like this:
message Image {
  bytes data = 1;
  int32 width = 2;
  int32 height = 3;
  int32 channels = 4;
}

On the C++ side I use the following code to create the byte stream:
byteArray.resize(proto_image.ByteSize());
ret = proto_image.SerializeToArray(byteArray.data(), byteArray.size());
and then call sendBinaryMessage on the websocket to send the message.

On the Python side I use the following with PyQt5:
def processBinaryMessage(self, message):
  image = projectname_pb2.Image()
  image.ParseFromString(message)

which works fine: if I print image.width I get the correct width. However, I
am not sure how to further process the image data in image.data, for example
how to put it into a np.array or show it with matplotlib. Any hints?
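For context, here is a minimal sketch of the conversion I have in mind, using stand-in values for the fields that would come from the parsed protobuf message (in the real code width, height, channels and data would be image.width, image.height, image.channels and image.data; I am assuming the bytes hold raw 8-bit pixels in height-by-width-by-channels order):

```python
import numpy as np

# Stand-ins for the fields parsed from the protobuf Image message
width, height, channels = 4, 3, 3
data = bytes(range(width * height * channels))  # raw 8-bit pixel bytes

# Interpret the raw bytes as uint8 and reshape to (height, width, channels)
arr = np.frombuffer(data, dtype=np.uint8).reshape(height, width, channels)

print(arr.shape)  # (3, 4, 3)
```

From there something like matplotlib's plt.imshow(arr) should display it, though if the C++ side uses a different pixel type or channel order (e.g. OpenCV's BGR) the dtype and channel order would have to be adjusted to match.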

Thank you for your help
Best

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
