Hey,

Short version of my question: I suspect I am running into rounding errors because I am working with really, really small values. Would it help if I multiplied these values by a scalar (say, 1,000), did the math with them, and then divided them by the scalar afterwards? I remember learning how IEEE floating-point numbers are stored, but I can't remember enough about it to know whether this would have any effect on precision. If not, what is a good way to get better precision? I am already using doubles instead of floats.
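To be concrete, something like this is what I have in mind (just a rough sketch; the names and the arithmetic are placeholders, not my real code):

    static const double kScale = 1000.0;

    /* Scale the tiny inputs up, do the arithmetic in the larger range,
       then scale the result back down at the end. */
    double computeScaled(double tinyA, double tinyB)
    {
        double a = tinyA * kScale;
        double b = tinyB * kScale;
        double sum = a + b;      /* stand-in for the real math */
        return sum / kScale;     /* undo the scaling */
    }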
Long explanation of my question: In my project, users click and drag to adjust a value. I would like to map the left edge of the view to zero and the right edge of the view to one. That part is simple.

However, it feels a little unnatural if the value initially "jumps" to match wherever the mouse first clicks before dragging. For example, if the initial value is 0.3 and the user clicks in the middle of the view, the value jumps to 0.5. To take care of this I construct a polynomial that maps the left edge to 0, the right edge to 1, and the initial click location (the middle, in this example) to the initial value (0.3). This works very well and feels natural. (A rough sketch of what I'm computing is below, after my signature.)

The problem is that when the click is very close to either edge (say, just inside the left edge of the view), the computed values go crazy. I believe this is because of a rounding error. So the same question stands: would multiplying (and later dividing) everything by a scalar help me get greater precision?

Thank you,
Bridger Maxwell
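P.S. Here is a rough sketch (plain C, with made-up names) of the kind of polynomial I'm constructing, in case it makes the edge problem clearer. It passes through (0, 0), (clickX, startValue), and (viewWidth, 1):

    /* Quadratic (Lagrange) interpolation through the three points
       (0, 0), (clickX, startValue), and (viewWidth, 1).
       x is the mouse position in view coordinates. */
    double mappedValue(double x, double clickX, double startValue, double viewWidth)
    {
        /* Equals 1 at clickX, and 0 at both 0 and viewWidth. */
        double midTerm = (x * (x - viewWidth)) / (clickX * (clickX - viewWidth));

        /* Equals 1 at viewWidth, and 0 at both 0 and clickX. */
        double endTerm = (x * (x - clickX)) / (viewWidth * (viewWidth - clickX));

        /* The (0, 0) point contributes nothing, so: */
        return startValue * midTerm + 1.0 * endTerm;
    }

Note that when clickX is very close to 0 or to viewWidth, the denominators above become tiny, which seems to be exactly the situation where my values blow up.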