Hello again, and please excuse a novice question.

I need to observe the visible contents of a UIWebView displayed in my iOS 
application, and each time the visible contents change, I need to extract 
them as an image and transmit it to our server. A bit like screen-sharing on 
the Mac, but for a single view rather than the whole screen.

I DO NOT wish to re-transmit when the user rotates his iOS device and the view 
rotates. However, I DO want to re-transmit when the user pinches to zoom in and 
out, or scrolls through the view's contents. Also, I do not want the whole 
contents transmitted, just the visible part of it.
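
For the scroll/zoom part specifically, the closest thing I can imagine (untested, 
so please correct me) is watching the web view's scrollView property, available 
since iOS 5, with key-value observing rather than replacing its delegate, since I 
understand UIWebView uses that delegate internally. A minimal Swift sketch, with 
the caveat that contentOffset and zoomScale are not documented as KVO-compliant, 
so their observability here is an assumption:

import UIKit

final class WebViewChangeWatcher: NSObject {
    private var observations: [NSKeyValueObservation] = []

    // Called whenever the visible region appears to have changed.
    var onVisibleContentChanged: (() -> Void)?

    init(webView: UIWebView) {
        super.init()
        let scrollView = webView.scrollView

        // Scrolling moves contentOffset; pinch-zooming changes zoomScale.
        // ASSUMPTION: neither property is documented as KVO-compliant,
        // so this would need verifying on the OS versions we target.
        observations.append(scrollView.observe(\.contentOffset, options: [.new]) { [weak self] _, _ in
            self?.onVisibleContentChanged?()
        })
        observations.append(scrollView.observe(\.zoomScale, options: [.new]) { [weak self] _, _ in
            self?.onVisibleContentChanged?()
        })
    }
}

Is something along these lines the intended approach, or is there a proper 
notification for this?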

I started looking for the right delegate call, and quite soon got lost. My 
background is Cocoa/macOS, and the iOS view/CALayer system still confuses me.

This UIWebView might have subviews I'm not aware of, and it may be rendering a 
URL that points to some file (a PDF or other document). 

I have already succeeded in extracting the view's contents into an image buffer 
(by rendering its CALayer into a graphics context), and I could solve my problem 
by setting up a timer that samples the view's contents N times a second. However, 
I'd like to avoid transmitting anything when the view doesn't change, and I want 
to avoid comparing image buffers N times a second for frame-differencing. 
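
For reference, the capture I have is roughly equivalent to this Swift sketch; 
because the context is sized to the web view's bounds, only the currently 
visible region (with its scroll offset and zoom already applied) ends up in 
the image:

import UIKit

// Render only what is currently on screen: the web view's layer tree
// already reflects the scroll offset and zoom scale of its scroll view,
// and the bounds-sized context clips away everything else.
func snapshotVisibleContent(of webView: UIWebView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(webView.bounds.size, true, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    webView.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}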

The "right" way as I see it, would be to get notified when the UIWebView 
decides to re-display or redraw its contents after something changes. (either 
content change - redrawing, or some user manipulation zoom, rotate, translate, 
animation).
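
Put differently, what I'm hoping for is something like the sketch below, which 
combines the two earlier sketches: webViewDidFinishLoad for content changes, the 
KVO watcher for scroll/zoom, and a coalescing step so a burst of notifications 
produces one capture per run-loop turn. The transmit closure is a hypothetical 
stand-in for our own upload code:

import UIKit

final class VisibleContentBroadcaster: NSObject, UIWebViewDelegate {
    private let webView: UIWebView
    private let watcher: WebViewChangeWatcher   // from the earlier sketch
    private var capturePending = false

    // Hypothetical hook into our own upload code; not a real API.
    var transmit: ((UIImage) -> Void)?

    init(webView: UIWebView) {
        self.webView = webView
        self.watcher = WebViewChangeWatcher(webView: webView)
        super.init()
        webView.delegate = self   // assumes nothing else needs this delegate
        watcher.onVisibleContentChanged = { [weak self] in self?.scheduleCapture() }
    }

    // Content finished loading (a page, a PDF, ...): capture once.
    func webViewDidFinishLoad(_ webView: UIWebView) {
        scheduleCapture()
    }

    // Coalesce bursts of scroll/zoom notifications into a single capture
    // per run-loop turn, so a continuous pinch doesn't flood the server.
    private func scheduleCapture() {
        guard !capturePending else { return }
        capturePending = true
        DispatchQueue.main.async { [weak self] in
            guard let self = self else { return }
            self.capturePending = false
            if let image = snapshotVisibleContent(of: self.webView) {
                self.transmit?(image)
            }
        }
    }
}

But this only reacts to loads and gestures; it would miss, say, an animated page 
redrawing itself, which is why I'm asking whether there is a redraw-level 
notification I should be using instead.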

Can anyone hint at where to start here? 

Thanks.
Motti Shneor



