Re: Points vs pixels in a bash script

2020-06-09 Thread Alex Zavatone via Cocoa-dev
Don’t you need to find out the display’s pixels per inch first? At 72 dpi (or ppi), a pixel equals a point. Can you get that info first?

> On Jun 8, 2020, at 4:43 PM, Gabriel Zachmann via Cocoa-dev wrote:
>
> I know this mailing list might not be the perfect fit for my question,
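A minimal sketch of how that info could be queried, not taken from the original post: CGDisplayScreenSize reports the display’s physical size in millimeters, and the current display mode reports its pixel width, so the ratio gives an approximate ppi. It assumes the display advertises an accurate physical size (via EDID); some external monitors do not.

import CoreGraphics

// Sketch: approximate the physical pixels-per-inch of the main display.
// Assumes the display reports a correct physical size.
let display = CGMainDisplayID()
let sizeMM = CGDisplayScreenSize(display)            // physical size in millimeters
if let mode = CGDisplayCopyDisplayMode(display), sizeMM.width > 0 {
    let widthInches = Double(sizeMM.width) / 25.4    // mm -> inches
    let ppi = Double(mode.pixelWidth) / widthInches
    print("approx. \(ppi) pixels per inch")
}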

Re: Points vs pixels in a bash script

2020-06-09 Thread Gabriel Zachmann via Cocoa-dev
> you're not taking into account the current screen resolution (a.k.a. display mode). The user
> [...]
> necessarily have to build a C utility for it. You can invoke the Swift interpreter to execute

Thanks a lot for your hints, but unfortunately, I don't have the time to learn Swift or
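For reference, a sketch of the kind of invocation the quoted hint points at: a bash here-document piped into the Swift interpreter, so no Swift project or compiled C utility is needed. It assumes AppKit is acceptable and that the script runs inside a GUI login session, since NSScreen.main is nil otherwise.

#!/bin/bash
# Sketch: run a small Swift snippet through the interpreter from bash,
# as the quoted reply suggests, instead of building a C utility.
scale=$(swift - <<'EOF'
import AppKit
// backingScaleFactor is pixels per point: 2.0 on Retina, 1.0 otherwise.
if let screen = NSScreen.main {
    print(screen.backingScaleFactor)
}
EOF
)
echo "pixels per point: $scale"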

Re: Points vs pixels in a bash script

2020-06-09 Thread Gabriel Zachmann via Cocoa-dev
> In particular, you're not taking into account the current screen resolution (a.k.a. display mode).
> The user can select different scaling for a Retina display in System Preferences > Displays.

Good point. I wasn't taking that into consideration. So, what would be a robust way to determine
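One robust approach, sketched here rather than taken from the thread: compare the current display mode's logical width, which is in points, with its pixel width. Their ratio is the effective scale factor even when the user has picked a scaled Retina mode in System Preferences.

import CoreGraphics

// Sketch: derive the effective scale factor from the current display mode.
// CGDisplayMode.width/height are in points; pixelWidth/pixelHeight are in
// device pixels, so the ratio reflects whatever scaling the user selected.
let display = CGMainDisplayID()
if let mode = CGDisplayCopyDisplayMode(display) {
    let scale = Double(mode.pixelWidth) / Double(mode.width)
    print("points: \(mode.width) x \(mode.height)")
    print("pixels: \(mode.pixelWidth) x \(mode.pixelHeight)")
    print("effective scale: \(scale)")
}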

Re: Points vs pixels in a bash script

2020-06-09 Thread Gabriel Zachmann via Cocoa-dev
> I don’t have an answer to your question, but to add some clarity: points are a scale-factor-independent unit of measurement.
> On a Retina display there are 2 pixels per point. On a non-Retina display there is 1 pixel per point.
> Say Apple comes out with a display with a scale

Thanks a
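To make the quoted arithmetic concrete, a small sketch using an example value; converting points to pixels is just a multiplication by the screen's scale factor.

import AppKit

// Sketch: convert a length in points to device pixels.
// On a 2x Retina screen, a 100-point-wide view covers 200 pixels.
if let screen = NSScreen.main {
    let points: CGFloat = 100
    let pixels = points * screen.backingScaleFactor
    print("\(points) pt = \(pixels) px at scale \(screen.backingScaleFactor)")
}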