Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Antoine Pitrou

Hi Victor,

I made some small fixes to the PEP.

As far as I'm concerned, the PEP is ok and should be approved :-)

Regards

Antoine.


On Mon, 16 Oct 2017 12:42:30 +0200
Victor Stinner wrote:
> Hi,
> 
> While discussions on this PEP are not over on python-ideas, I proposed
> this PEP directly on python-dev since I consider that my PEP already
> summarizes current and past proposed alternatives.
> 
> python-ideas threads:
> 
> * Add time.time_ns(): system clock with nanosecond resolution
> * Why not picoseconds?
> 
> PEP 564 will be online shortly at:
> https://www.python.org/dev/peps/pep-0564/
> 
> Victor
> 
> 
> PEP: 564
> Title: Add new time functions with nanosecond resolution
> Version: $Revision$
> Last-Modified: $Date$
> Author: Victor Stinner 
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 16-October-2017
> Python-Version: 3.7
> 
> 
> Abstract
> ========
> 
> Add five new functions to the ``time`` module: ``time_ns()``,
> ``perf_counter_ns()``, ``monotonic_ns()``, ``clock_gettime_ns()`` and
> ``clock_settime_ns()``. They are similar to the functions without the
> ``_ns`` suffix, but have nanosecond resolution: they use a number of
> nanoseconds as a Python ``int``.
> 
> The best ``time.time_ns()`` resolution measured in Python is 3 times
> better than the ``time.time()`` resolution on Linux and Windows.
> 
> 
> Rationale
> =========
> 
> Float type limited to 104 days
> ------------------------------
> 
> The clock resolution of desktop and laptop computers is getting closer
> to one nanosecond. More and more clocks have a frequency in MHz, up
> to GHz for the CPU TSC clock.
> 
> The Python ``time.time()`` function returns the current time as a
> floating-point number, usually a 64-bit binary floating-point number
> (in the IEEE 754 format).
> 
> The problem is that the float type starts to lose nanoseconds after 104
> days.  Convert from nanoseconds (``int``) to seconds (``float``) and
> then back to nanoseconds (``int``) to check whether the conversions lose
> precision::
> 
> # no precision loss
> >>> x = 2 ** 52 + 1; int(float(x * 1e-9) * 1e9) - x  
> 0
> # precision loss! (1 nanosecond)
> >>> x = 2 ** 53 + 1; int(float(x * 1e-9) * 1e9) - x  
> -1
> >>> print(datetime.timedelta(seconds=2 ** 53 / 1e9))  
> 104 days, 5:59:59.254741
> 
> ``time.time()`` returns seconds elapsed since the UNIX epoch: January
> 1st, 1970. This function has been losing precision since April 1970
> (47 years ago)::
> 
> >>> import datetime
> >>> unix_epoch = datetime.datetime(1970, 1, 1)
> >>> print(unix_epoch + datetime.timedelta(seconds=2**53 / 1e9))  
> 1970-04-15 05:59:59.254741
> 
> 
> Previously rejected PEP
> -----------------------
> 
> Five years ago, PEP 410 proposed a large and complex change to all
> Python functions returning time, to support nanosecond resolution using
> the ``decimal.Decimal`` type.
> 
> The PEP was rejected for several reasons:
> 
> * The idea of adding a new optional parameter to change the result type
>   was rejected. It's an uncommon (and bad?) programming practice in
>   Python.
> 
> * It was not clear if hardware clocks really had a resolution of 1
>   nanosecond, especially at the Python level.
> 
> * The ``decimal.Decimal`` type is uncommon in Python and so requires
>   adapting code to handle it.
> 
> 
> CPython enhancements of the last 5 years
> -----------------------------------------
> 
> Since PEP 410 was rejected:
> 
> * The ``os.stat_result`` structure got 3 new fields for timestamps as
>   nanoseconds (Python ``int``): ``st_atime_ns``, ``st_ctime_ns``
>   and ``st_mtime_ns``.
> 
> * PEP 418 was accepted: Python 3.3 got 3 new clocks:
>   ``time.monotonic()``, ``time.perf_counter()`` and
>   ``time.process_time()``.
> 
> * The CPython private "pytime" C API handling time now uses a new
>   ``_PyTime_t`` type: simple 64-bit signed integer (C ``int64_t``).
>   The ``_PyTime_t`` unit is an implementation detail and not part of the
>   API. The unit is currently ``1 nanosecond``.
> 
> Existing Python APIs using nanoseconds as int
> ----------------------------------------------
> 
> The ``os.stat_result`` structure has 3 fields for timestamps as
> nanoseconds (``int``): ``st_atime_ns``, ``st_ctime_ns`` and
> ``st_mtime_ns``.
> 
> The ``ns`` parameter of the ``os.utime()`` function accepts a
> ``(atime_ns: int, mtime_ns: int)`` tuple: nanoseconds.
> 
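> For example, a minimal sketch (the ``ns`` parameter already exists;
> only ``time.time_ns()`` below is from this PEP)::
>
>     import os
>     import time
>
>     now_ns = time.time_ns()
>     # Set the access and modification times of an existing file,
>     # in integer nanoseconds.
>     os.utime("example.txt", ns=(now_ns, now_ns))
>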
> 
> Changes
> =======
> 
> New functions
> -------------
> 
> This PEP adds five new functions to the ``time`` module:
> 
> * ``time.clock_gettime_ns(clock_id)``
> * ``time.clock_settime_ns(clock_id, time: int)``
> * ``time.perf_counter_ns()``
> * ``time.monotonic_ns()``
> * ``time.time_ns()``
> 
> These functions are similar to the versions without the ``_ns`` suffix,
> but use nanoseconds as a Python ``int``.
> 
> For example, ``time.monotonic_ns() == int(time.monotonic() * 1e9)`` if
> the ``monotonic()`` value is small enough not to lose precision.
> 
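> A minimal usage sketch (assuming Python 3.7 with these functions
> available)::
>
>     import time
>
>     t_float = time.time()    # seconds as a float: may lose precision
>     t_int = time.time_ns()   # nanoseconds as an int: no precision loss
>
>     # Explicit conversions between the two representations:
>     seconds = t_int / 10**9              # int ns -> float s (lossy)
>     nanoseconds = int(t_float * 10**9)   # float s -> int ns (lossy)
>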
> Unchanged functions

Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Wes Turner
On Saturday, October 21, 2017, Nick Coghlan wrote:

> On 22 October 2017 at 09:32, Victor Stinner wrote:
>
>> On 21 Oct 2017 20:31, "francismb" wrote:
>>
>> I understand that one can just multiply/divide the nanoseconds returned,
>> (or it could be a factory) but wouldn't it help for future enhancements
>> to reduce the number of functions (the 'pico' question)?
>>
>>
>> If you ask me to predict the future, I predict that CPU frequency will be
>> stuck below 10 GHz for the next 10 years :-)
>>
>
> There are actually solid physical reasons for that prediction likely being
> true. Aside from the power consumption, heat dissipation, and EM radiation
> issues that arise with higher switching frequencies, you also start running
> into more problems with digital circuit metastability ([1], [2]): the more
> clock edges you have per second, the higher the chances of an asynchronous
> input changing state at a bad time.
>
> So yeah, for nanosecond resolution to not be good enough for programs
> running in Python, we're going to be talking about some genuinely
> fundamental changes in the nature of computing hardware, and it's currently
> unclear if or how established programming languages will make that jump
> (see [3] for a gentle introduction to the current state of practical
> quantum computing). At that point, picoseconds vs nanoseconds is likely to
> be the least of our conceptual modeling challenges :)
>

There are current applications with finer-than-nanosecond precision:

- relativity experiments
- particle experiments

Must they always use their own implementations of time.*,
datetime.__init__, fromordinal, fromtimestamp?!

- https://scholar.google.com/scholar?q=femtosecond
- https://scholar.google.com/scholar?q=attosecond
- GPS now supports nanosecond resolution
- https://en.wikipedia.org/wiki/Quantum_clock#More_accurate_experimental_clocks

> In 2015 JILA evaluated the absolute frequency uncertainty of their
latest strontium-87 optical lattice clock at 2.1 × 10^-18, which
corresponds to a measurable gravitational time dilation for an elevation
change of 2 cm (0.79 in)

What about bus latency (and variance)?

From https://www.nist.gov/publications/optical-two-way-time-and-frequency-transfer-over-free-space :

> Optical two-way time and frequency transfer over free space
> Abstract
> The transfer of high-quality time-frequency signals between remote
locations underpins many applications, including precision navigation and
timing, clock-based geodesy, long-baseline interferometry, coherent radar
arrays, tests of general relativity and fundamental constants, and future
redefinition of the second. However, present microwave-based time-frequency
transfer is inadequate for state-of-the-art optical clocks and oscillators
that have femtosecond-level timing jitter and accuracies below 1 × 10^-17.
Commensurate optically based transfer methods are therefore needed. Here we
demonstrate optical time-frequency transfer over free space via two-way
exchange between coherent frequency combs, each phase-locked to the local
optical oscillator. We achieve 1 fs timing deviation, residual instability
below 1 × 10^-18 at 1,000 s and systematic offsets below 4 × 10^-19, despite
frequent signal fading due to atmospheric turbulence or obstructions across
the 2 km link. This free-space transfer can enable terrestrial links to
support clock-based geodesy. Combined with satellite-based optical
communications, it provides a path towards global-scale geodesy,
high-accuracy time-frequency distribution and satellite-based relativity
experiments.

How much wider must an epoch-relative time struct be for various realistic
time precisions/accuracies?

10^-6    micro   µ
10^-9    nano    n   -- int64
10^-12   pico    p
10^-15   femto   f
10^-18   atto    a
10^-21   zepto   z
10^-24   yocto   y
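
One way to make this concrete (a sketch of the arithmetic; the
365.25-day year is my simplifying assumption):

    # Span of a signed 64-bit tick counter at each resolution.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    for name, ticks_per_second in [
        ("micro", 10**6),
        ("nano", 10**9),
        ("pico", 10**12),
        ("femto", 10**15),
        ("atto", 10**18),
    ]:
        years = 2**63 / ticks_per_second / SECONDS_PER_YEAR
        print(f"{name}: +/- {years:.3g} years around the epoch")

At nanosecond resolution a signed 64-bit integer spans roughly +/- 292
years around the epoch; at picosecond resolution, only about +/- 107
days. Anything finer needs a wider fixed-size integer or Python's
arbitrary-precision int.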

I'm at a loss to recommend a library to prefix these with the epoch; but
future compatibility may be a helpful, realistic objective.

Natural keys with such time resolution are still unfortunately likely to
collide.


>
> Cheers,
> Nick.
>
> [1] https://en.wikipedia.org/wiki/Metastability_in_electronics
> [2] https://electronics.stackexchange.com/questions/14816/what-is-metastability
> [3] https://medium.com/@decodoku/how-to-program-a-quantum-computer-982a9329ed02
>
>
> --
> Nick Coghlan   |   [email protected]   |   Brisbane, Australia
>


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Chris Angelico
On Mon, Oct 23, 2017 at 2:06 AM, Wes Turner wrote:
> What about bus latency (and variance)?

I'm currently in Los Angeles. Bus latency is measured in minutes, and
may easily exceed sixty of them. :|

Seriously though: For applications requiring accurate representation
of relativistic effects, the stdlib datetime module has a good few
problems besides lacking sub-nanosecond precision. I'd be inclined to
YAGNI this away unless/until some third-party module demonstrates that
there's actually a use for a datetime module that can handle all that.

ChrisA


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Nick Coghlan
On 23 October 2017 at 01:06, Wes Turner wrote:

> On Saturday, October 21, 2017, Nick Coghlan wrote:
>
>> So yeah, for nanosecond resolution to not be good enough for programs
>> running in Python, we're going to be talking about some genuinely
>> fundamental changes in the nature of computing hardware, and it's currently
>> unclear if or how established programming languages will make that jump
>> (see [3] for a gentle introduction to the current state of practical
>> quantum computing). At that point, picoseconds vs nanoseconds is likely to
>> be the least of our conceptual modeling challenges :)
>>
> There are current applications with finer-than-nanosecond precision:
>
> - relativity experiments
> - particle experiments
>
> Must they always use their own implementations of time.*,
> datetime.__init__, fromordinal, fromtimestamp?!
>
Yes, as time is a critical part of their experimental setup - when you're
operating at relativistic speeds and the kinds of energy levels that
particle accelerators hit, it's a bad idea to assume that regular time
libraries that assume Newtonian physics applies are going to be up to the
task.

Normal software assumes a nanosecond is almost no time at all - in high
energy particle physics, a nanosecond is enough time for light to travel 30
centimetres, and a high energy particle that stuck around that long before
decaying into a lower energy state would be classified as "long lived".

Cheers.
Nick.

P.S. "Don't take code out of the environment it was designed for and assume
it will just keep working normally" is one of the main lessons folks
learned from the destruction of the first Ariane 5 launch rocket in 1996
(see the first paragraph in
https://en.wikipedia.org/wiki/Ariane_5#Notable_launches )

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread David Mertz
I worked at a molecular dynamics lab for a number of years. I advocated
switching all our code to using attosecond units (rather than fractional
picoseconds).

However, this had nothing whatsoever to do with the machine clock speeds,
but only with the physical quantities represented and the scaling/rounding
math.

It didn't happen, for various reasons. But if it had, I certainly wouldn't
have expected standard library support for this. The 'time' module is about
wall clock or calendar time, not about *simulation time*.

FWIW, a very long simulation might cover a millisecond of simulated
time; we're a very long way from looking at molecular behavior over 104
days.
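
To illustrate the scaling/rounding point, a toy sketch (my own, not the
lab's code):

    # Accumulate a 0.5 fs timestep a million times: once as exact integer
    # attoseconds, once as a float in fractional picoseconds.
    STEP_AS = 500        # 0.5 femtoseconds, as integer attoseconds
    STEP_PS = 0.0005     # the same step, as a float in picoseconds

    t_as = 0
    t_ps = 0.0
    for _ in range(10**6):
        t_as += STEP_AS
        t_ps += STEP_PS

    print(t_as)                 # 500000000 attoseconds, exact
    print(t_ps)                 # close to 500.0 ps, but not exact
    print(t_ps * 10**6 - t_as)  # accumulated drift, in attoseconds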

On Oct 22, 2017 8:10 AM, "Wes Turner" wrote:

> [...]

Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Wes Turner
On Sunday, October 22, 2017, David Mertz wrote:

> I worked at a molecular dynamics lab for a number of years. I advocated
> switching all our code to using attosecond units (rather than fractional
> picoseconds).
>
> However, this had nothing whatsoever to do with the machine clock speeds,
> but only with the physical quantities represented and the scaling/rounding
> math.
>
> It didn't happen, for various reasons. But if it had, I certainly wouldn't
> have expected standard library support for this. The 'time' module is about
> wall clock or calendar time, not about *simulation time*.
>
> FWIW, a very long simulation might cover a millisecond of simulated
> time; we're a very long way from looking at molecular behavior over 104
> days.
>

Maybe that's why we haven't found any CTCs (closed timelike curves) yet.

Aligning simulation data with other events may be enlightening: is
there a good library for handling high-precision time units in Python
(and/or CFFI)?
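
For attosecond-level epoch offsets, the stdlib 'decimal' module can at
least carry the digits (a sketch, not a recommendation of a dedicated
library):

    from decimal import Decimal, getcontext

    getcontext().prec = 40  # plenty of digits for attoseconds since the epoch

    # A hypothetical attosecond-resolution timestamp, in seconds:
    # 123 attoseconds past an epoch second.
    t = Decimal("1508688000") + Decimal("1.23e-16")
    print(t)  # every digit preserved, unlike a 64-bit float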

...

http://opendata.cern.ch/

http://opendata.cern.ch/getting-started/CMS



Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-22 Thread Victor Stinner
On 22 Oct 2017 17:06, "Wes Turner" wrote:

Must they always use their own implementations of time.*,
datetime.__init__, fromordinal, fromtimestamp?!


Yes, exactly.

Note: Adding resolution better than 1 us to datetime is not in the scope of
the PEP, but there is an issue that has been open for a long time.

I don't think that time.time_ns() is usable for such experiments. Again,
calling a function in Python takes around 50 ns.
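
A quick way to check that on a given machine (a sketch; numbers vary):

    import timeit

    n = 10**6
    # Average cost of one time.time_ns() call, in nanoseconds.
    cost = timeit.timeit("time.time_ns()", setup="import time", number=n)
    print(f"{cost / n * 1e9:.0f} ns per call")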

Victor