I'm trying to write a unit test for some code I wrote that generates one image from another. In the main app, the source data comes from OpenCV as a buffer of 3-byte-per-pixel elements, and my code generates a CGImage from it. In the unit test, I load a saved copy of one of those source images from a PNG file into a UIImage, get at its buffer, pass that to my code, and then compare the result to a saved copy of the expected output.
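
In rough outline, the test looks something like this (generateImage(from:width:height:bytesPerRow:) and the PNG resource names are just placeholders standing in for my actual code, not its real API):

    import XCTest
    import UIKit

    final class ImageGenerationTests: XCTestCase {

        func testGeneratedImageMatchesSavedOutput() throws {
            let bundle = Bundle(for: type(of: self))

            // Load the saved source PNG and get at its pixel buffer.
            let sourceURL = try XCTUnwrap(bundle.url(forResource: "source", withExtension: "png"))
            let sourceCG = try XCTUnwrap(UIImage(contentsOfFile: sourceURL.path)?.cgImage)
            let sourceBytes = try XCTUnwrap(sourceCG.dataProvider?.data as Data?)

            // Run the code under test on the raw bytes (placeholder call).
            let generated = try XCTUnwrap(generateImage(from: sourceBytes,
                                                        width: sourceCG.width,
                                                        height: sourceCG.height,
                                                        bytesPerRow: sourceCG.bytesPerRow))

            // Load the saved expected output and compare the two backing buffers.
            let expectedURL = try XCTUnwrap(bundle.url(forResource: "expected", withExtension: "png"))
            let expectedCG = try XCTUnwrap(UIImage(contentsOfFile: expectedURL.path)?.cgImage)
            let expectedBytes = try XCTUnwrap(expectedCG.dataProvider?.data as Data?)
            let generatedBytes = try XCTUnwrap(generated.dataProvider?.data as Data?)

            XCTAssertEqual(generatedBytes, expectedBytes)   // this is where the differences show up
        }
    }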
The saved expected output is also a PNG. I load it and get its data buffer using

    let fiData = fi.dataProvider?.data as Data?

and I do a similar thing with the generated CGImage. Then I compare the two buffers, byte by byte. They are similar, but they contain differences (sometimes more than I would expect). Yet if I save both as PNGs and look at them in Preview, they look identical.

My guess is that color correction is happening somewhere. In my code I make a bitmap CGContext, specifying the device RGB color space (should that be generic RGB instead?). I also don't really know what happens to the PNGs I save and load. Is there a way to ensure the bytes in the buffer are compressed and decompressed exactly as written?

-- 
Rick Mann
rm...@latencyzero.com
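
P.S. In case it matters, the bitmap context in my generator is set up more or less like the sketch below. The pixels parameter and the 32-bit noneSkipLast layout are stand-ins rather than my exact code; as far as I can tell, CGBitmapContext wants 32-bit pixels for 8-bit-per-component RGB, so I'm not showing how the 3-byte OpenCV pixels get into that layout.

    import CoreGraphics

    // Simplified stand-in for the context my generator draws into.
    func makeContext(for pixels: UnsafeMutableRawPointer?, width: Int, height: Int) -> CGContext? {
        let colorSpace = CGColorSpaceCreateDeviceRGB()   // the part I'm unsure about
        return CGContext(data: pixels,
                         width: width,
                         height: height,
                         bitsPerComponent: 8,
                         bytesPerRow: width * 4,
                         space: colorSpace,
                         bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue)
    }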