Hi,

I'm attempting to recreate something along the lines of iPhoto's mechanism for refining face auto-detection, as shown here:

http://news.cnet.com/i/bto/20090130/nicoleleeconfirm_610x381.png

In my situation, I'd like to have two NSCollectionView instances, one on top and one on the bottom. The entries on top are the ones my batch processor will act on, and the entries on the bottom are the ones my algorithm has decided aren't candidates for the image processing. The idea is that the user can pick entries from the bottom, overriding the automatic detection and thus "moving" them to the top, and vice versa.

The trouble is, to do this I need to take the NSCollectionViews out of their enclosing scroll views and put them both into a single scroll view, then write code that queries how tall each one wants to be and lays the two of them out, one above the other. That would be fine if I could find a way to determine how tall an NSCollectionView wants to be to display its contents, but all of the relevant internals of NSCollectionView appear to be private.
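
If there were a way to get that height, the tiling itself would be straightforward. Roughly what I have in mind (untested; containerView, topCollectionView, bottomCollectionView, and desiredHeightForCollectionView: are my own names, and the container is a flipped NSView subclass serving as the scroll view's documentView):

- (void)tileCollectionViews
{
    CGFloat width        = NSWidth([containerView frame]);
    CGFloat topHeight    = [self desiredHeightForCollectionView:topCollectionView];
    CGFloat bottomHeight = [self desiredHeightForCollectionView:bottomCollectionView];

    // flipped coordinates, so y = 0 is the top of the document view
    [topCollectionView setFrame:NSMakeRect(0.0, 0.0, width, topHeight)];
    [bottomCollectionView setFrame:NSMakeRect(0.0, topHeight, width, bottomHeight)];
    [containerView setFrameSize:NSMakeSize(width, topHeight + bottomHeight)];
}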

There doesn't seem to be any way to ask an NSCollectionView how tall it wants to be, how many rows it's showing, and so on.
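
The closest I can come up with is computing it myself from the content count and the item size, something like the following (again untested, and it assumes maxItemSize has been set to the actual item size rather than left at NSZeroSize):

- (CGFloat)desiredHeightForCollectionView:(NSCollectionView *)cv
{
    NSUInteger count   = [[cv content] count];
    NSSize itemSize    = [cv maxItemSize];           // assumes this matches the prototype's size
    NSUInteger columns = [cv maxNumberOfColumns];

    if (columns == 0) {
        // zero means "unlimited", so see how many items fit across the current width
        columns = MAX((NSUInteger)(NSWidth([cv frame]) / itemSize.width), (NSUInteger)1);
    }

    NSUInteger rows = (count + columns - 1) / columns;   // ceil(count / columns)
    return rows * itemSize.height;
}

But that feels like duplicating layout logic the view already does internally.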

Obviously, in the end I'm willing to just write a custom view that does all of this manually, but NSCollectionView is pretty nice in that I can set up my bindings in IB and I get animation for free...

shamyl zakariya
        - squamous and rugose


