On Apr 1, 2011, at 10:25 PM, Rikza Azriyan wrote:

> Yes, I want to make an application like iBooks where the user can navigate 
> the pages by swiping the screen, so that 
>> - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
> will respond to that swipe...
> But in my case, I don't want to make that swipe with my finger...
> I want to make that swipe with my face....
> So I want to implement my face-tracking algorithm on my iPhone 4: detect the 
> face with the front camera and output the face position as a coordinate.
> I'm going to use this coordinate as a realtime dragging event in my PDF 
> reader, as if a finger were swiping on the screen, the way iBooks uses a 
> finger to swipe...

Chase's solution is still the way you want to go, no matter how you gather 
the points. He just phrased it as an example that uses a touch event in one 
case and a "scripted" movement in another. In each case, the code calls a 
common method that handles the actual movement between a pair of points but 
doesn't care how those points were calculated.

For your case, your face-tracking code could calculate points however it 
likes and then pass them to that same common movement method. Conceivably you 
could have a book-reading experience where someone who has use of their hands 
swipes the screen, and someone who doesn't takes advantage of face tracking.
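Since Chase's actual code isn't quoted in this thread, the names below are hypothetical, but the pattern might be sketched roughly like this: both input sources funnel into one common method that only knows about points, never about where they came from.

```objc
// Hypothetical sketch -- method names (movePageWithPoint:,
// faceTrackerDidUpdatePosition:, etc.) are illustrative, not Chase's
// actual code. The key idea: one shared method takes a point, and
// neither input path knows about the other.

// Common movement handler. All it sees is a point in view coordinates.
- (void)movePageWithPoint:(CGPoint)point
{
    // Drag/turn the page based on the point, e.g. update a page-curl
    // animation or a scroll offset. How the point was produced is
    // irrelevant here. (updatePageTurnToPoint: stands in for your
    // existing PDF-reader drawing code.)
    [self updatePageTurnToPoint:point];
}

// Input source 1: a real touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    [self movePageWithPoint:point];
}

// Input source 2: the face tracker, assuming it reports a face
// position via some callback you define on your tracking class.
- (void)faceTrackerDidUpdatePosition:(CGPoint)facePosition
{
    // Map the camera-space face position into view coordinates first;
    // that mapping belongs to your tracking algorithm, not to the
    // movement handler.
    CGPoint point = [self viewPointFromFacePosition:facePosition];
    [self movePageWithPoint:point];
}
```

With this split, adding a third input source later (say, an external game controller) is just one more small method that ends in `[self movePageWithPoint:point]`.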

_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
