On Sun, 20 Dec 2009 21:02:02 -0800, Rami Chowdhury wrote:
>
> On Dec 20, 2009, at 17:41 , Peter Pearson wrote:
>
>> Why not use a good cipher, such as AES, to generate a pseudorandom
>> bit stream by encrypting successive integers?
>
> Isn't the Fortuna PRNG based around that approximate concept?
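For reference, a minimal sketch of the counter-mode idea Peter describes (encrypt successive integers with a block cipher and use the ciphertext as a pseudorandom byte stream). It assumes the third-party PyCryptodome package and a throwaway hard-coded key; Fortuna's generator is essentially this plus entropy pooling and periodic rekeying:

# Sketch: AES applied to successive counter values as a pseudorandom stream.
# Assumes PyCryptodome (pip install pycryptodome); the key is a demo value only.
from Crypto.Cipher import AES

def aes_counter_stream(key, start=0):
    # Yield 16-byte pseudorandom blocks by encrypting successive integers.
    cipher = AES.new(key, AES.MODE_ECB)   # the counter is built by hand below
    counter = start
    while True:
        yield cipher.encrypt(counter.to_bytes(16, "big"))
        counter += 1

stream = aes_counter_stream(b"0123456789abcdef")
print(next(stream).hex())
print(next(stream).hex())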
David Cournapeau wrote:
On Sun, Dec 20, 2009 at 6:47 PM, Lie Ryan wrote:
On 12/20/2009 2:53 PM, sturlamolden wrote:
On 20 Des, 01:46, Lie Ryan wrote:
Not necessarily, you only need to be certain that the two streams don't
overlap in any reasonable amount of time. For that purpose, you can u
On 2009-12-21 16:57 PM, r0g wrote:
sturlamolden wrote:
On 19 Des, 16:20, Carl Johan Rehn wrote:
How about multi-core or (perhaps more exciting) GPU and CUDA? I must
admit that I am extremely interested in trying the CUDA-alternative.
Obviously, cuBLAS is not an option here, so what is the sa
sturlamolden wrote:
On 19 Des, 22:58, sturlamolden wrote:
If you pick two random states (using any PRNG), you need error-
checking that states are always unique, i.e. that each PRNG never
reaches the starting state of the other(s).
Another note on this:
Ideally, we would e.g. know how to fi
sturlamolden wrote:
On 19 Des, 16:20, Carl Johan Rehn wrote:
How about multi-core or (perhaps more exciting) GPU and CUDA? I must
admit that I am extremely interested in trying the CUDA-alternative.
Obviously, cuBLAS is not an option here, so what is the safest route
for a novice parallel-pro
sturlamolden wrote:
On 19 Des, 14:06, Carl Johan Rehn wrote:
Matlab and numpy have (by chance?) the exact same names for the same
functionality,
Common ancestry: NumPy and Matlab borrowed the name from IDL.
LabView, Octave and SciLab use the name randn as well.
So the basic question is, h
sturlamolden wrote:
> On 19 Des, 16:20, Carl Johan Rehn wrote:
>
>> How about multi-core or (perhaps more exciting) GPU and CUDA? I must
>> admit that I am extremely interested in trying the CUDA-alternative.
>>
>> Obviously, cuBLAS is not an option here, so what is the safest route
>> for a novi
On 2009-12-19 09:14 AM, Carl Johan Rehn wrote:
On Dec 19, 2:49 pm, sturlamolden wrote:
On 19 Des, 11:05, Carl Johan Rehn wrote:
I plan to port a Monte Carlo engine from Matlab to Python. However,
when I timed randn(N1, N2) in Python and compared it with Matlab's
randn, Matlab came out as a c
On 20 Des, 01:46, Lie Ryan wrote:
> Not necessarily, you only need to be certain that the two streams don't
> overlap in any reasonable amount of time. For that purpose, you can use
> a PRNG that has an extremely high period, like the Mersenne Twister, and put
> the generators in very distant states.
E
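For what it's worth, a sketch of how this is handled in present-day NumPy (the Generator/SeedSequence API postdates this thread): spawned seeds are constructed so that the child streams do not collide, and PCG64.jumped() advances a copy of the state by an enormous jump (on the order of 2**127 draws), so the streams cannot overlap in any realistic simulation:

# Sketch, NumPy >= 1.17: two standard ways to get independent parallel streams.
import numpy as np

# 1. SeedSequence.spawn: derive child seeds that are designed not to collide.
parent = np.random.SeedSequence(12345)
streams = [np.random.default_rng(child) for child in parent.spawn(4)]

# 2. PCG64.jumped: jump a copy of the state far ahead of the original.
bit_gen = np.random.PCG64(12345)
gen_a = np.random.Generator(bit_gen)
gen_b = np.random.Generator(bit_gen.jumped())

print([g.standard_normal() for g in streams])
print(gen_a.standard_normal(), gen_b.standard_normal())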
On Dec 20, 2009, at 17:41 , Peter Pearson wrote:
> On Sun, 20 Dec 2009 07:26:03 +1100, Lie Ryan wrote:
>> On 12/20/2009 4:02 AM, Carl Johan Rehn wrote:
>>
>> Parallel PRNGs are an unsolved problem in computer science.
>>>
>>> Thanks again for sharing your knowledge. I had no idea. This mea
On Sun, 20 Dec 2009 07:26:03 +1100, Lie Ryan wrote:
> On 12/20/2009 4:02 AM, Carl Johan Rehn wrote:
>
> Parallel PRNGs are an unsolved problem in computer science.
>>
>> Thanks again for sharing your knowledge. I had no idea. This means
>> that if I want to speed up my application I have to go
On 12/21/2009 1:13 AM, David Cournapeau wrote:
But the OP's case most likely falls in your estimated 0.01% case. PRNG
quality is essential for reliable Monte Carlo procedures. I don't
think a long period is enough to guarantee those good properties for
parallel random generators - at least it is not obviou
On Sun, Dec 20, 2009 at 6:47 PM, Lie Ryan wrote:
> On 12/20/2009 2:53 PM, sturlamolden wrote:
>>
>> On 20 Des, 01:46, Lie Ryan wrote:
>>
>>> Not necessarily, you only need to be certain that the two streams don't
>>> overlap in any reasonable amount of time. For that purpose, you can use
>>> a PR
On 12/20/2009 2:53 PM, sturlamolden wrote:
On 20 Des, 01:46, Lie Ryan wrote:
Not necessarily, you only need to be certain that the two streams don't
overlap in any reasonable amount of time. For that purpose, you can use
a PRNG that has an extremely high period, like the Mersenne Twister, and put
the
On Sat, 19 Dec 2009 13:58:37 -0800, sturlamolden wrote:
> On 19 Des, 21:26, Lie Ryan wrote:
>
>> you can just start two PRNGs at two distinct states
>
> No. You have to know for certain that the outputs don't overlap.
"For certain"? Why?
Presumably you never do a Monte Carlo simulation once, y
On 12/20/2009 8:58 AM, sturlamolden wrote:
On 19 Des, 21:26, Lie Ryan wrote:
you can just start two PRNGs at two distinct states
No. You have to know for certain that the outputs don't overlap.
Not necessarily, you only need to be certain that the two streams don't
overlap in any reasonabl
On Sat, 19 Dec 2009 09:02:38 -0800, Carl Johan Rehn wrote:
> Well, in Matlab I used "tic; for i = 1:1000, randn(100, 1), end;
> toc" and in IPython i used a similar construct but with "time" instead
> of tic/(toc.
I don't know if this will make any significant difference, but for the
record
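A sketch of a more controlled measurement from the Python side, using the standard timeit module rather than a hand-rolled loop (absolute figures will of course vary with the machine and the NumPy build):

# Sketch: time 1000 calls of randn(100, 1), mirroring the Matlab tic/toc loop.
import timeit

n_calls = 1000
total = timeit.timeit("randn(100, 1)",
                      setup="from numpy.random import randn",
                      number=n_calls)
print("%d calls: %.3f s total, %.1f us per call"
      % (n_calls, total, 1e6 * total / n_calls))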
Lie Ryan wrote:
If you don't care about "repeatability" (which is already extremely
difficult in parallel processing even without random number generators),
you can just start two PRNGs at two distinct states (and probably from
two different algorithms)
There's no need for different algorithm
On 19 Dec, 23:09, sturlamolden wrote:
> On 19 Des, 22:58, sturlamolden wrote:
>
> > If you pick two random states (using any PRNG), you need error-
> > checking that states are always unique, i.e. that each PRNG never
> > reaches the starting state of the other(s).
>
> Another note on this:
>
> I
On 19 Des, 22:58, sturlamolden wrote:
> If you pick two random states (using any PRNG), you need error-
> checking that states are always unique, i.e. that each PRNG never
> reaches the starting state of the other(s).
Another note on this:
Ideally, we would e.g. know how to find (analytically)
On 19 Des, 21:26, Lie Ryan wrote:
> you can just start two PRNGs at two distinct states
No. You have to know for certain that the outputs don't overlap.
If you pick two random states (using any PRNG), you need error-
checking that states are always unique, i.e. that each PRNG never
reaches the s
On 12/20/2009 4:02 AM, Carl Johan Rehn wrote:
How did you time it?
Well, in Matlab I used "tic; for i = 1:1000, randn(100, 1), end;
toc" and in IPython i used a similar construct but with "time" instead
of tic/(toc.
Code?
Parallel PRNGs are an unsolved problem in computer science.
Tha
On Dec 19, 4:47 pm, sturlamolden wrote:
> On 19 Des, 16:20, Carl Johan Rehn wrote:
>
> > How about multi-core or (perhaps more exciting) GPU and CUDA? I must
> > admit that I am extremely interested in trying the CUDA-alternative.
>
> > Obviously, cuBLAS is not an option here, so what is the safe
On 19 Des, 16:20, Carl Johan Rehn wrote:
> How about multi-core or (perhaps more exciting) GPU and CUDA? I must
> admit that I am extremely interested in trying the CUDA-alternative.
>
> Obviously, cuBLAS is not an option here, so what is the safest route
> for a novice parallel-programmer?
The
On Sat, 19 Dec 2009 05:06:53 -0800, Carl Johan Rehn wrote:
> so I was very happy with numpy's implementation until I timed it.
How did you time it?
--
Steven
On Dec 19, 3:16 pm, sturlamolden wrote:
> On 19 Des, 14:06, Carl Johan Rehn wrote:
>
> > Matlab and numpy have (by chance?) the exact same names for the same
> > functionality,
>
> Common ancestry: NumPy and Matlab borrowed the name from IDL.
>
> LabView, Octave and SciLab use the name randn as well
On Dec 19, 2:49 pm, sturlamolden wrote:
> On 19 Des, 11:05, Carl Johan Rehn wrote:
>
> > I plan to port a Monte Carlo engine from Matlab to Python. However,
> > when I timed randn(N1, N2) in Python and compared it with Matlab's
> > randn, Matlab came out as a clear winner with a speedup of 3-4 ti
On 19 Des, 14:06, Carl Johan Rehn wrote:
> Matlab and numpy have (by chance?) the exact same names for the same
> functionality,
Common ancestry: NumPy and Matlab borrowed the name from IDL.
LabView, Octave and SciLab use the name randn as well.
> So the basic question is, how can I speed up r
On 19 Des, 12:29, Steven D'Aprano wrote:
> Perhaps
> the Matlab random number generator is a low-quality generator which is
> fast but not very random. Python uses a very high quality RNG which is
> not cheap.
Marsaglia and Matlab's implementation of ziggurat uses a slightly
lower quality RNG fo
On 19 Des, 11:05, Carl Johan Rehn wrote:
> I plan to port a Monte Carlo engine from Matlab to Python. However,
> when I timed randn(N1, N2) in Python and compared it with Matlab's
> randn, Matlab came out as a clear winner with a speedup of 3-4 times.
> This was truly disappointing. I ran this t
On Dec 19, 12:29 pm, Steven D'Aprano wrote:
> On Sat, 19 Dec 2009 02:05:17 -0800, Carl Johan Rehn wrote:
> > Dear friends,
>
> > I plan to port a Monte Carlo engine from Matlab to Python. However, when
> > I timed randn(N1, N2) in Python and compared it with Matlab's randn,
>
> What's randn? I don
On Sat, 19 Dec 2009 02:05:17 -0800, Carl Johan Rehn wrote:
> Dear friends,
>
> I plan to port a Monte Carlo engine from Matlab to Python. However, when
> I timed randn(N1, N2) in Python and compared it with Matlab's randn,
What's randn? I don't know that function. I know the randint, random, and
Dear friends,
I plan to port a Monte Carlo engine from Matlab to Python. However,
when I timed randn(N1, N2) in Python and compared it with Matlab's
randn, Matlab came out as a clear winner with a speedup of 3-4 times.
This was truly disappointing. I ran this test on a Win32 machine and
without t
On 2009-04-24 08:05, timlash wrote:
Essentially, I'm testing tens of thousands of scenarios on a
relatively small number of test cases. Each scenario requires all
elements of each test case to be scored, then summarized, then sorted
and grouped with some top scores captured for reporting.
It se
Thanks for your replies.
@Peter - My arrays are not sparse at all, but I'll take a quick look
at scipy. I also should have mentioned that my numpy arrays are of
Object type as each data point (row) has one or more text labels for
categorization.
@Robert - Thanks for the comments about how numpy
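A generic sketch of one common way to avoid object-dtype arrays (an assumption on my part, not necessarily what Robert suggested): keep the numeric scores in a homogeneous float array and the text labels in ordinary Python structures, so the vectorised operations stay fast:

# Sketch: split numeric data from text labels instead of using an object array.
import numpy as np

scores = np.random.rand(10000, 5)                           # numeric data, float64
labels = [("case-%d" % i, "groupA") for i in range(10000)]  # labels kept in Python

totals = scores.sum(axis=1)              # fast, because scores is homogeneous
top = np.argsort(totals)[-10:][::-1]     # indices of the ten best totals
for i in top:
    print(labels[i], totals[i])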
On 2009-04-23 10:32, timlash wrote:
Still fairly new to Python. I wrote a program that used a class
called RectangularArray as described here:
class RectangularArray:
    def __init__(self, rows, cols, value=0):
        self.arr = [None]*rows
        self.row = [value]*cols
    def __getitem__(se
timlash wrote:
> Still fairly new to Python. I wrote a program that used a class
> called RectangularArray as described here:
>
> class RectangularArray:
>     def __init__(self, rows, cols, value=0):
>         self.arr = [None]*rows
>         self.row = [value]*cols
>     def __getitem__(self, (i, j)
Still fairly new to Python. I wrote a program that used a class
called RectangularArray as described here:
class RectangularArray:
    def __init__(self, rows, cols, value=0):
        self.arr = [None]*rows
        self.row = [value]*cols
    def __getitem__(self, (i, j)):
        return (self.arr[i] or
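A sketch of how the lazily row-allocated class above might be completed (the __setitem__ here is an assumption, not the original code, and the tuple-argument syntax is rewritten for Python 3):

# Sketch: rows share self.row until written to, then get a private copy.
class RectangularArray:
    def __init__(self, rows, cols, value=0):
        self.arr = [None] * rows        # one slot per row, filled lazily
        self.row = [value] * cols       # shared default row

    def __getitem__(self, key):
        i, j = key
        return (self.arr[i] or self.row)[j]

    def __setitem__(self, key, value):
        i, j = key
        if self.arr[i] is None:
            self.arr[i] = self.row[:]   # copy-on-write for this row
        self.arr[i][j] = value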
TG wrote:
> Hi there.
>
> Reading the page on python performance ( http://scipy.org/PerformancePython
> ) made me realize that I can achieve tremendous code acceleration with
> numpy just by using "u[:,:]" kind of syntax in a clever way.
>
> Here is a little problem (Oja's rule of synaptic plastic
On Apr 3, 5:42 pm, "TG" <[EMAIL PROTECTED]> wrote:
> Hi there.
>
> Reading the page on python performance (http://scipy.org/PerformancePython
> ) made me realize that I can achieve tremendous code acceleration with
> numpy just by using "u[:,:]" kind of syntax in a clever way.
>
> Here is a little p
Hi there.
Reading the page on python performance ( http://scipy.org/PerformancePython
) made me realize that I can achieve tremendous code acceleration with
numpy just by using "u[:,:]" kind of syntax in a clever way.
Here is a little problem (Oja's rule of synaptic plasticity)
* W is a matrix co
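A vectorised NumPy sketch of one Oja's-rule update (the variable names, shapes and learning rate are assumptions, not TG's original code):

# Sketch: Oja's rule, dW_ij = eta * (y_i * x_j - y_i**2 * W_ij), in one step.
import numpy as np

def oja_step(W, x, eta=0.01):
    y = W @ x                                        # outputs, shape (n_out,)
    W += eta * (np.outer(y, x) - (y ** 2)[:, np.newaxis] * W)
    return W

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((5, 20))               # 5 units, 20 inputs
for _ in range(1000):
    W = oja_step(W, rng.standard_normal(20))
print(np.linalg.norm(W, axis=1))                     # row norms drift toward 1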