Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-21 Thread francismb
Hi Victor,

On 10/18/2017 01:14 AM, Victor Stinner wrote:
> I updated my PEP 564 to add time.process_time_ns():
> https://github.com/python/peps/blob/master/pep-0564.rst
> 
> The HTML version should be updated shortly:
> https://www.python.org/dev/peps/pep-0564/

** In practive, the resolution of 1 nanosecond **

** no need for resolution better than 1 nanosecond in practive in the
Python standard library.**

practive vs. practice (typo, in both quoted sentences)



If I understood you correctly on Python-ideas (repeated here just for
the record, otherwise please ignore it):

why not something like (please change '_in' to whatever you like):

time.time_in(precision)
time.monotonic_in(precision)


where precision is an enumeration for: 'seconds', 'milliseconds',
'microseconds', ... (or 's', 'ms', 'us', 'ns', ...)
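
For illustration, a minimal sketch of what such an API could look like,
built on top of the time.time_ns()/time.monotonic_ns() functions from the
PEP (the TimeUnit enum and the time_in()/monotonic_in() wrappers are
hypothetical names, just to make the idea concrete):

import enum
import time

class TimeUnit(enum.Enum):
    # Each value is the number of nanoseconds in one unit.
    S  = 1_000_000_000
    MS = 1_000_000
    US = 1_000
    NS = 1

def time_in(unit):
    # Current time as an integer count of 'unit' since the epoch.
    return time.time_ns() // unit.value

def monotonic_in(unit):
    # Monotonic clock as an integer count of 'unit'.
    return time.monotonic_ns() // unit.value

# e.g. time_in(TimeUnit.US) -> 1508594368123456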


Thanks,
--francis


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-21 Thread Guido van Rossum
That sounds like unnecessary generality, and also suggests that the API
might support precisions way beyond what is realistic.

On Sat, Oct 21, 2017 at 4:39 AM, francismb wrote:

> If I understood you correctly on Python-ideas (repeated here just for
> the record, otherwise please ignore it):
>
> why not something like (please change '_in' to whatever you like):
>
> time.time_in(precision)
> time.monotonic_in(precision)
>
> where precision is an enumeration for: 'seconds', 'milliseconds',
> 'microseconds', ... (or 's', 'ms', 'us', 'ns', ...)



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-21 Thread francismb
If it sounds as if there is no need, or it is unnecessary to you, then
that's OK :-), thank you for the feedback! I'm just curious about the following:

On 10/21/2017 05:45 PM, Guido van Rossum wrote:
> That sounds like unnecessary generality, 
Meaning that selecting the precision at run time 'costs'?

I understand that one can just multiply/divide the returned nanoseconds
(or it could be a factory), but wouldn't it help future enhancements to
reduce the number of functions (the 'pico' question)?
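
To spell out the multiply/divide part, a tiny sketch (assuming the
time.time_ns() function proposed in the PEP):

import time

t_ns = time.time_ns()           # int: nanoseconds since the epoch
t_us = t_ns // 1_000            # microseconds
t_ms = t_ns // 1_000_000        # milliseconds
t_s  = t_ns // 1_000_000_000    # whole seconds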

> and also suggests that the API
> might support precisions way beyond what is realistic.
Doesn't that depend on the offered/supported enum values (in this case
down to 'ns', as Victor proposed)?

Thanks,
--francis


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-21 Thread Victor Stinner
On 21 Oct 2017 at 20:31, "francismb" wrote:

> I understand that one can just multiply/divide the returned nanoseconds
> (or it could be a factory), but wouldn't it help future enhancements to
> reduce the number of functions (the 'pico' question)?


If you are asking me to predict the future, I predict that CPU frequencies
will be stuck below 10 GHz for the next 10 years :-)

Did you hear that Moore's law has no longer held since 2012 (Intel says
since 2015)? Since 2002, CPU frequencies have been stuck around 3 GHz.
Overclocking records are around 8 GHz, and only with very specialized
hardware that is not usable for a regular PC.

I don't want to overengineer an API "just in case". Let's provide
nanoseconds. We can discuss picoseconds later, maybe in 10 years?

You can now start betting on whether decimal128 will come before or after
picoseconds in mainstream CPUs :-)

By the way, we are talking about a resolution of 1 ns, but remember that a
Python function call costs closer to 50 ns. I am not sure that picoseconds
make sense if CPUs don't become much faster.
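
For reference, a rough sketch of how to measure that call overhead with
timeit (the exact figure obviously depends on the hardware and the Python
version):

import timeit

def f():
    pass

n = 10_000_000
total = timeit.timeit(f, number=n)            # total seconds for n calls
print("per call: %.1f ns" % (total / n * 1e9))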

I am too shy to put such predictions in a very official PEP ;-)

Victor


Re: [Python-Dev] PEP 564: Add new time functions with nanosecond resolution

2017-10-21 Thread Nick Coghlan
On 22 October 2017 at 09:32, Victor Stinner wrote:

> If you are asking me to predict the future, I predict that CPU frequencies
> will be stuck below 10 GHz for the next 10 years :-)
>

There are actually solid physical reasons why that prediction is likely to
hold. Aside from the power consumption, heat dissipation, and EM radiation
issues that arise with higher switching frequencies, you also start running
into more problems with digital circuit metastability ([1], [2]): the more
clock edges you have per second, the higher the chances of an asynchronous
input changing state at a bad time.

So yeah, for nanosecond resolution to not be good enough for programs
running in Python, we're going to be talking about some genuinely
fundamental changes in the nature of computing hardware, and it's currently
unclear if or how established programming languages will make that jump
(see [3] for a gentle introduction to the current state of practical
quantum computing). At that point, picoseconds vs nanoseconds is likely to
be the least of our conceptual modeling challenges :)

Cheers,
Nick.

[1] https://en.wikipedia.org/wiki/Metastability_in_electronics
[2] https://electronics.stackexchange.com/questions/14816/what-is-metastability
[3] https://medium.com/@decodoku/how-to-program-a-quantum-computer-982a9329ed02


-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia