I'm trying to develop an app to take in dimensions (rowCount and colCount) from an NSForm and set up an NSView subclass to display a matrix of squares accordingly. Here's the action that responds to the NSForm:

- (IBAction)setSize:(id)sender
{
        NSRect aRect;
        //Start with column and row counts from LASize.
        LASize = sender;
        columnCount = [[LASize cellAtIndex:0] intValue];
        rowCount = [[LASize cellAtIndex:1] intValue];
        if (columnCount < 5 || rowCount < 5) return; // wait for something better
        
        aRect.origin.x = 310;
        aRect.origin.y = 15;
        aRect.size.width = columnCount * 5;
        aRect.size.height= rowCount * 5;
        theGrid = [[LogicArtMatrix alloc] initWithCol:columnCount row:rowCount];
        thePicture = [[LogicArtDisplay alloc] initWithMatrix:theGrid
                                                        cols:columnCount
                                                        rows:rowCount
                                                     andRect:aRect];

        return;

}

When I stop in the debugger right after the NSRect local variable is initialized, it reports minuscule values for the rectangle's fields, as if they had been overlaid with integers instead of float values.
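For reference, a minimal check I could add right after the aRect assignments, using only Foundation's NSStringFromRect plus the locals above, to see the values the program actually uses rather than what the debugger's variable view shows:

        // Log the rect as the program sees it, independent of the
        // debugger's display of the struct fields.
        NSLog(@"aRect = %@ (cols = %d, rows = %d)",
              NSStringFromRect(aRect), columnCount, rowCount);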

TIA

Dean Ritchie