On Thu, Dec 15, 2011 at 11:50, AndrewVSutherland wrote:
> e.g. by including some "warm-up" code that runs
> before you start timing
That's a good idea! Next to the histogram I did above, I also want to
plot the time series; that should reveal this or other trends.
h
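A rough sketch of how such a sample of timings could be collected and plotted outside of timeit (plain Python with matplotlib; the statement, loop count, and sample size are placeholders, not tuned values):

import timeit
import matplotlib.pyplot as plt

stmt = "sum(range(10000))"   # placeholder for the expression under test

# One entry per repeat; each entry is the total time for `number` executions.
samples = timeit.repeat(stmt, number=100, repeat=200)

fig, (ax_hist, ax_series) = plt.subplots(1, 2, figsize=(10, 4))
ax_hist.hist(samples, bins=30)        # histogram: shape of the distribution
ax_hist.set_xlabel("seconds per 100 runs")
ax_series.plot(samples)               # time series: warm-up or drift shows up here
ax_series.set_xlabel("sample index")
ax_series.set_ylabel("seconds per 100 runs")
plt.show()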
Just to amplify what Bill said below, most CPUs these days have
some sort of power-saving mode that will be in use by default. This
means that if the CPU is not fully loaded when you start your test,
it will take some time (typically not more than a few seconds) for it
to ramp up to full speed (ir
On Wed, Dec 14, 2011 at 8:59 AM, William Stein wrote:
>
> On Dec 14, 2011 12:40 AM, "Nils Bruin" wrote:
>>
>> On Dec 13, 11:27 pm, William Stein wrote:
>> > At least since we're mathematically savvy, we know that there's a lot
>> > more to statistics than the normal distribution.
>> >
>> > Havin
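A minimal sketch of the warm-up idea mentioned above, using stdlib timeit and a placeholder statement; the warm-up count is a guess, and the only point is that the first, untimed runs give the CPU a chance to leave its power-saving state before anything is measured:

import timeit

stmt = "sum(range(10000))"   # placeholder for the code under test
t = timeit.Timer(stmt)

# Warm-up: run the statement untimed first so the CPU can ramp up to full
# clock speed before any measurement is recorded.
t.timeit(number=10000)       # result deliberately discarded

# Real measurement afterwards.
samples = t.repeat(repeat=7, number=10000)
print(min(samples))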
On 12/14/11 4:50 PM, Harald Schilly wrote:
I hacked something together that plots a histogram. Well, it looks odd,
maybe there is a big obvious bug -- but nevertheless, it's not symmetric
and in no way a normal distribution!
You could also use http://trac.sagemath.org/sage_trac/ticket/9671. I
People have hit on some of the issues, e.g.
(a) the algorithm + its input must be deterministic (no calls to
"random")
(b) the computation might be slow because of multiprocessing (time
slices scheduled to something else)
but also there are other often quite critical issues affecting
timing, like:
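For point (b), one hedged mitigation is to pin the benchmark to a single core so other processes' time slices at least don't bounce it between CPUs; os.sched_setaffinity is Linux-only and the chosen core is arbitrary. Point (a) comes up again below with the RandomGNP example.

import os
import timeit

# Pin this process to core 0 (Linux only); on other platforms this is skipped.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, {0})

stmt = "sum(range(10000))"   # placeholder statement
print(min(timeit.repeat(stmt, number=1000, repeat=5)))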
On Dec 14, 2011 12:40 AM, "Nils Bruin" wrote:
>
> On Dec 13, 11:27 pm, William Stein wrote:
> > At least since we're mathematically savvy, we know that there's a lot
> > more to statistics than the normal distribution.
> >
> > Having never seen any timing distributions (since they are hidden by
>
On Dec 13, 11:27 pm, William Stein wrote:
> At least since we're mathematically savvy, we know that there's a lot
> more to statistics than the normal distribution.
>
> Having never seen any timing distributions (since they are hidden by
> timeit), I don't even know anything about how timings are
Hi,
> For low-level assembly language we sometimes compute the exact number
> of cycles using the cycle counter rather than doing a timing. This varies
> per architecture and assumes cache effects are not relevant.
I remember doing that on a 6502 or 8086/80286/80386. I'm curious: Is it still
r
On 13 December 2011 21:39, William Stein wrote:
> On Tue, Dec 13, 2011 at 1:15 PM, Nils Bruin wrote:
>> I recall reading something about that in the Python documentation and
>> indeed, quoting from
>>
>> http://docs.python.org/library/timeit.html
>>
>> we find:
>>
>> """
>> Note
>>
>> It’s tempti
Hi!
Currently, "timeit" often does not more than finding the "best of
three". I suspect that computing a standard deviation from just three
samples is not quite reliable. Hence, if one wants to get a statistic
then one needs a lot more runs, thus, the test will take much more
time.
Do we want tha
On Tue, Dec 13, 2011 at 1:41 PM, Nathann Cohen wrote:
>> we find:
>
> Yep, but it looks like they have a purely deterministic algorithm in mind.
> I often do things like this:
>
> graphs.RandomGNP(10, .2).chromatic_number()
>
> Well, the "chromatic_number()" method is deterministic, but each time
>
> we find:
>
Yep, but it looks like they have a purely deterministic algorithm in mind.
I often do things like this:
graphs.RandomGNP(10, .2).chromatic_number()
Well, the "chromatic_number()" method is deterministic, but each time this
method is run it is run on a different random graph.
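A small sketch of how the two questions can be separated in a Sage session (Nathann's example values kept as-is): build the random input once if the deterministic method is what is being measured, or time the whole pipeline if the cost on a typical random instance is the question.

G = graphs.RandomGNP(10, .2)      # random input built once, outside the timing
timeit("G.chromatic_number()")    # measures only the deterministic method

# Timing the whole pipeline instead measures "cost on a typical random instance"
# and will show a wider (possibly multi-modal) distribution:
timeit("graphs.RandomGNP(10, .2).chromatic_number()")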
On Tue, Dec 13, 2011 at 1:15 PM, Nils Bruin wrote:
> I recall reading something about that in the Python documentation and
> indeed, quoting from
>
> http://docs.python.org/library/timeit.html
>
> we find:
>
> """
> Note
>
> It’s tempting to calculate mean and standard deviation from the result
>
I recall reading something about that in the Python documentation and
indeed, quoting from
http://docs.python.org/library/timeit.html
we find:
"""
Note
It’s tempting to calculate mean and standard deviation from the result
vector and report these. However, this is not very useful. In a
typical
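The same note goes on to recommend taking the minimum over several repeats rather than reporting mean and standard deviation, since the larger values are usually caused by other processes interfering rather than by the code itself. Roughly, with a placeholder statement:

import timeit

# Report only the minimum, as the timeit documentation suggests.
results = timeit.repeat("sum(range(10000))", number=1000, repeat=5)
print("best of %d: %.6f s per 1000 loops" % (len(results), min(results)))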
On 12/13/11 2:34 PM, William Stein wrote:
Hi,
I was just looking at some timings for trac 12149, and it occurred to
me that our "timeout" command may be fine for programmers, but for us
mathematicians surely we want something that gives a better measure of
the distribution of timings? Wouldn't
On Mon, Nov 24, 2008 at 2:47 PM, Simon King <[EMAIL PROTECTED]> wrote:
>
> Dear Tim,
>
> On Nov 24, 8:41 pm, "Tim Lahey" <[EMAIL PROTECTED]> wrote:
>> var('x')
>> f = 2*x/sin(x)^2
>> f.integrate(x)
>> axiom.integrate(f,x)
>> timeit(f.integrate(x))
>> timeit(axiom.integrate(f,x))
>
> AFAIK, unlike
Dear Tim,
On Nov 24, 8:41 pm, "Tim Lahey" <[EMAIL PROTECTED]> wrote:
> var('x')
> f = 2*x/sin(x)^2
> f.integrate(x)
> axiom.integrate(f,x)
> timeit(f.integrate(x))
> timeit(axiom.integrate(f,x))
AFAIK, unlike "time", which is prefixed to an actual command, "timeit"
is a function that expects a s
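In other words, a sketch of the distinction in an interactive Sage session, using Tim's definitions:

sage: var('x')
sage: f = 2*x/sin(x)^2
sage: time f.integrate(x)          # "time" is preparser syntax, prefixed to a command
sage: timeit("f.integrate(x)")     # "timeit" is a function call on a string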
On Mon, Nov 24, 2008 at 2:09 PM, William Stein <[EMAIL PROTECTED]> wrote:
>
>
> Precisely *exactly* how are you using timeit?! Paste in an exact session.
>
> Also, note that there is timeit('stuff'), which is a function call, and
> %timeit stuff
> which is an ipython magic command that is "buggy
On Mon, Nov 24, 2008 at 8:29 AM, Tim Lahey <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> In exploring integration using FriCAS vs. Maxima, I've
> noticed that timeit doesn't seem to want to work
> for integration.
Precisely *exactly* how are you using timeit?! Paste in an exact session.
Also, note tha
On Mon, Feb 25, 2008 at 10:06 AM, Nick Alexander <[EMAIL PROTECTED]> wrote:
>
> > It might be. I don't like the time function as it is written now, since
> > it's done with the preparser and doesn't work when it isn't the first
> > thing on a line, which is annoying.
> >
> > sage:
> It might be. I don't like the time function as it is written now, since
> it's done with the preparser and doesn't work when it isn't the first
> thing on a line, which is annoying.
>
> sage: 2 + 2; time 2 + 2
>
>File "",
On Mon, Feb 25, 2008 at 9:20 AM, Joel B. Mohler <[EMAIL PROTECTED]> wrote:
>
>
> On Monday 25 February 2008 10:56, William Stein wrote:
> > On Mon, Feb 25, 2008 at 7:49 AM, Joel B. Mohler <[EMAIL PROTECTED]>
> wrote:
> > > Hi,
> > >
> > > I just noticed that the timeit short-cut seems more
On Monday 25 February 2008 10:56, William Stein wrote:
> On Mon, Feb 25, 2008 at 7:49 AM, Joel B. Mohler <[EMAIL PROTECTED]>
wrote:
> > Hi,
> >
> > I just noticed that the timeit short-cut seems more broken than normal
> > (at least I think this worked previous to 2.10.2:
> > sage: R.<x>=ZZ[]
> >
On Mon, Feb 25, 2008 at 7:57 AM, mabshoff
<[EMAIL PROTECTED]> wrote:
>
>
>
> On Feb 25, 4:49 pm, "Joel B. Mohler" <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > I just noticed that the timeit short-cut seems more broken than normal (at
> > least I think this worked previous to 2.10.2:
> > sage:
On Mon, Feb 25, 2008 at 4:57 PM, mabshoff
<[EMAIL PROTECTED]> wrote:
>
>
>
> On Feb 25, 4:49 pm, "Joel B. Mohler" <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > I just noticed that the timeit short-cut seems more broken than normal (at
> > least I think this worked previous to 2.10.2:
> > sage:
On Feb 25, 4:49 pm, "Joel B. Mohler" <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I just noticed that the timeit short-cut seems more broken than normal (at
> least I think this worked previous to 2.10.2:
> sage: R.<x>=ZZ[]
> sage: f=x^2-1
> sage: timeit f.factor()
> ---
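Until the short-cut is fixed, the plain function-call form, which does not rely on the preparser short-cut, may serve as a workaround; a sketch of the same session using it:

sage: R.<x> = ZZ[]
sage: f = x^2 - 1
sage: timeit('f.factor()')     # pass the statement as a string instead of the short-cut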
On Mon, Feb 25, 2008 at 7:49 AM, Joel B. Mohler <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> I just noticed that the timeit short-cut seems more broken than normal (at
> least I think this worked previous to 2.10.2:
> sage: R.<x>=ZZ[]
> sage: f=x^2-1
> sage: timeit f.factor()
> -