On 14/01/2014 15:32, Jon Clayden wrote:
> Dear all,
>
> I am trying to find a way to reliably and programmatically establish the
> resolution (i.e. DPI or equivalent) of an on-screen device. It seemed to me
> that
>
>    dev.new(width=1, height=1)
>    dpi <- dev.size("px")
>
> would do the trick, but the result does not seem to be correct, at least on
> OS X 10.9.1 using the "quartz" device.

It is correct: you need to read what it actually claims to do.

   units: the units in which to return the value - inches, cm, or
          pixels (device units).

The device units are nominal pixels: they may or may not correspond to actual screen pixels. PDF (as used by Quartz) has a nominal 72 pixels per inch, irrespective of the display.
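
For illustration, a minimal sketch of that behaviour on the quartz device
(the exact numbers assume a default setup and simply restate the report
above):

   dev.new(width = 2, height = 2)
   dev.size("in")   # approximately c(2, 2): the physical size requested
   dev.size("px")   # c(144, 144): 72 nominal pixels per inch, not screen pixels
   dev.off()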

> Specifically, the window that
> appears is 1 inch square, as expected, but the result from dev.size() is
> c(72,72), which isn't correct. My display is 1440x900, but if I call
>
>    dev.new(width=720/72, height=450/72)
>
> the resulting device fills much more than half the screen. But R gets the
> size right in inches, so it, or something it calls, must presumably know
> the real DPI value.
>
> So, could anyone tell me whether there is a reliable way to determine DPI,
> please?

No.  And even when the OS does report a DPI value, it is often wrong.
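
If an estimate is still wanted despite that caveat, one possibility on
systems running X11 (this assumes the xdpyinfo utility is installed, and
it carries the same warning: the reported value may not match the physical
screen) is to parse what the X server claims:

   ## Ask the X server for its notion of the display resolution
   out <- system("xdpyinfo", intern = TRUE)
   grep("resolution", out, value = TRUE)
   ## e.g. "  resolution:    96x96 dots per inch"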

--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
