Hello everyone,

I'm trying to understand how NSImageRep works, so I read the docs and came up 
with this simple code that seems to... do something. 

I'm confused about what it's actually doing; all I see is a big black block in my 
custom view.

My intention is to make a 2x2 pixel image with different grayscale values, for 
example one pixel black and another white, so I can see whether it actually 
draws the array I'm passing to the NSBitmapImageRep.

I'm guessing that the 2x2 pixel image will scale to fit the entire view, so I'll 
see the pixels as big squares in the view.

Here's the relevant code:

@implementation PixelsView

- (id)initWithFrame:(NSRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // 4 bytes: 2x2 pixels, one 8-bit grayscale sample per pixel
        pixels = (unsigned char *)malloc(4 * sizeof(*pixels));

        aSimpleBitmap = [[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:&pixels
                          pixelsWide:2
                          pixelsHigh:2
                       bitsPerSample:8
                     samplesPerPixel:1
                            hasAlpha:NO
                            isPlanar:NO
                      colorSpaceName:NSDeviceWhiteColorSpace
                         bytesPerRow:2
                        bitsPerPixel:8];

        // Trying to make the first pixel white and the other three black
        pixels[0] = 1;
        pixels[1] = 0;
        pixels[2] = 0;
        pixels[3] = 0;
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect {
    NSRect bounds = [self bounds];
    [NSGraphicsContext saveGraphicsState];
    [aSimpleBitmap drawInRect:bounds];   // scale the 2x2 bitmap to fill the view
    [NSGraphicsContext restoreGraphicsState];
}
@end
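
One thing I wasn't sure about: when the 2x2 image gets scaled up by drawInRect:, 
I assume it might be smoothed/interpolated, so maybe I'd need to turn 
interpolation off to see four crisp squares? I was thinking of adding something 
like this inside drawRect:, between the save and restore calls (just a guess, I 
don't know if imageInterpolation is the right knob here):

    // My assumption: disable smoothing so each of the 4 pixels shows as a sharp square
    [[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationNone];
    [aSimpleBitmap drawInRect:bounds];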

The docs say that NSDeviceWhiteColorSpace has pure white at 1.0, so I'm guessing 
that pixels[0] = 1 sets a white pixel. Am I wrong?
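
In case that guess is wrong: maybe with bitsPerSample:8 the samples are raw 
bytes in the 0-255 range rather than normalized 0.0-1.0 values, so 1 would just 
be a nearly-black gray and white would be 255? If so, I suppose I'd need 
something more like this (not tested, just my speculation):

    // If 8-bit samples run 0-255 instead of 0.0-1.0, maybe this is what I want:
    pixels[0] = 255;  // full white?
    pixels[1] = 0;    // black
    pixels[2] = 128;  // mid gray, to check an intermediate value
    pixels[3] = 0;    // black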

Xcode is not throwing any errors, but the view is not displaying what I expect. 
Since I don't fully understand bitmap images, is there anything obvious that 
I'm missing?



Thanks in advance.
