On Thu, Jun 16, 2016 at 2:45 AM, meInvent bbird wrote:
> in C# it is not called a concurrent list, it is called
> BlockingCollection
>
> the dictionary is called ConcurrentDictionary
>
> these are the thread-safe collections
>
> https://msdn.microsoft.com/en-us/library/dd267312(v=vs.110).aspx
>
> https://msdn.microsoft.com/en-us/library/dd997369(v=vs.110).aspx
how can a list be synchronized when multiple processors are working on it?
will one thread be updating a stale copy while another processor updates
the current one?
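One way to see how a shared list stays consistent across workers is the stdlib's multiprocessing.Manager: every update is forwarded to a single manager process, which applies them one at a time, so no worker overwrites another's change. A minimal sketch, not taken from the thread:

from multiprocessing import Manager, Process

def worker(shared, n):
    # the append is sent to the manager process, which serialises all
    # updates, so concurrent writers cannot clobber each other
    shared.append(n)

if __name__ == "__main__":
    with Manager() as manager:
        shared = manager.list()   # proxy to one list held by the manager
        procs = [Process(target=worker, args=(shared, i)) for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(sorted(shared))     # [0, 1, 2, 3]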
On Thursday, June 16, 2016 at 4:30:33 PM UTC+8, Steven D'Aprano wrote:
> On Thursday 16 June 2016 17:28, meInvent bbird wrote:
>
> > is there a concurrent list in Python, like C# has?
in C# it is not called a concurrent list, it is called
BlockingCollection
the dictionary is called ConcurrentDictionary
these are the thread-safe collections
https://msdn.microsoft.com/en-us/library/dd267312(v=vs.110).aspx
https://msdn.microsoft.com/en-us/library/dd997369(v=vs.110).aspx
https://msdn.
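The closest stdlib analogue to C#'s BlockingCollection is probably queue.Queue: it is thread-safe, and put()/get() block, which gives the usual producer/consumer pattern. A rough sketch under that assumption:

import queue
import threading

q = queue.Queue(maxsize=10)

def producer():
    for i in range(5):
        q.put(i)            # blocks if the queue is full
    q.put(None)             # sentinel telling the consumer to stop

def consumer():
    while True:
        item = q.get()      # blocks until an item is available
        if item is None:
            break
        print("got", item)

threading.Thread(target=producer).start()
t = threading.Thread(target=consumer)
t.start()
t.join()

For a ConcurrentDictionary-style structure there is no single stdlib equivalent; a plain dict guarded by a threading.Lock (or multiprocessing.Manager().dict() across processes) is the usual substitute.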
On Thursday 16 June 2016 17:28, meInvent bbird wrote:
> is there a concurrent list in Python, like C# has?
What is a concurrent list?
Can you link to the C# documentation for this?
To me, "concurrent" describes a style of execution flow, and "list" describes a
data structure. I am struggling to understand how the two fit together.
is there a concurrent list in Python, like C# has?
I found something like this, but how do I pass it the initlist variable?
Is it doing the same thing as itertools.combinations?

import itertools

def comb(n, initlist):  # the argument n is the number of items to select
    res = list(itertools.combinations(initlist, n))
    return res
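A quick check, assuming the comb() defined just above: initlist is simply the iterable handed to itertools.combinations, so the two calls below print the same result.

import itertools   # already imported above; repeated so the check reads on its own

initlist = ['a', 'b', 'c']
print(comb(2, initlist))
# [('a', 'b'), ('a', 'c'), ('b', 'c')]
print(list(itertools.combinations(initlist, 2)))
# identical output, so yes, it does the same thing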
For such tasks my choice would be Twisted combined with ampoule.
Lets you spread the work out to however many processes you like,
maxing out whatever iron you're sitting on.
HTH, Werner
http://twistedmatrix.com/trac/
https://launchpad.net/ampoule
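I can't vouch for ampoule's exact API, but the same "spread work over as many processes as the box has cores" idea is also in the stdlib via concurrent.futures; a sketch with a made-up work function:

import concurrent.futures
import os

def crunch(n):
    # stand-in for one CPU-bound unit of work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with concurrent.futures.ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        for result in pool.map(crunch, [10 ** 5] * 8):
            print(result)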
On 29.05.2012 16:43, Jabba Laci wrote:
Hehe, I just asked this question a few days ago but I didn't get
much wiser:
http://www.gossamer-threads.com/lists/python/python/985701
Best,
Laszlo
On Thu, May 10, 2012 at 2:14 PM, Jabba Laci wrote:
> Hi,
>
> I would like to do some parallel programming with Python but I don't
> know how to start.
On 05/10/2012 06:46 AM, Devin Jeanpierre wrote:
> On Thu, May 10, 2012 at 8:14 AM, Jabba Laci wrote:
>> What's the best way?
>
> From what I've heard, http://scrapy.org/ . It is a single-thread
> single-process web crawler that nonetheless can download things
> concurrently.
Yes, for i/o bound t
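To illustrate the I/O-bound point: a single process with a few threads already overlaps downloads, because each thread spends most of its time waiting on the network. A minimal Python 3 sketch with placeholder URLs:

import concurrent.futures
import urllib.request

URLS = ["http://example.com/", "http://example.org/", "http://example.net/"]

def fetch(url):
    # the GIL is released while waiting on the socket, so fetches overlap
    with urllib.request.urlopen(url, timeout=10) as resp:
        return url, len(resp.read())

with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    for url, size in pool.map(fetch, URLS):
        print(url, size, "bytes")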
Hi,
Thanks for the answer. I use Linux with CPython 2.7. I plan to work
on both CPU-bound and I/O-bound problems. Which packages should I use
in each case? Could you point me to some guides? When should I use
multiprocessing / gevent?
Thanks,
Laszlo
On Thu, May 10, 2012 at 2:34 PM, Dave Angel wrote:
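For the gevent half of that question: gevent suits workloads with many concurrent I/O waits in one process; after monkey-patching, blocking stdlib calls yield to other greenlets instead of stalling. A small sketch, assuming gevent is installed (placeholder URLs):

import gevent
from gevent import monkey
monkey.patch_all()          # make blocking stdlib I/O cooperative

import urllib.request       # imported after patching so its sockets cooperate

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        print(url, resp.status)

urls = ["http://example.com/", "http://example.org/"]
gevent.joinall([gevent.spawn(fetch, u) for u in urls])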
On Thu, May 10, 2012 at 8:14 AM, Jabba Laci wrote:
> What's the best way?
From what I've heard, http://scrapy.org/ . It is a single-thread
single-process web crawler that nonetheless can download things
concurrently.
Doing what you want in Scrapy would probably involve learning about
Twisted, t
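For reference, a Scrapy spider can stay this small; Twisted drives the concurrent downloads underneath without the spider touching it directly. A sketch against the modern Scrapy API, with a placeholder name and URL:

import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ["http://example.com/"]

    def parse(self, response):
        # Scrapy fetches pages concurrently and calls parse() per response
        yield {"url": response.url, "title": response.css("title::text").get()}

Run it with "scrapy runspider example_spider.py -o out.json".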
On 05/10/2012 08:14 AM, Jabba Laci wrote:
> Hi,
>
> I would like to do some parallel programming with Python but I don't
> know how to start. There are several ways to go but I don't know what
> the differences are between them: threads, multiprocessing, gevent,
> etc.
>
> I want to use a single machine with several cores.
Hi,
I would like to do some parallel programming with Python but I don't
know how to start. There are several ways to go but I don't know what
the differences are between them: threads, multiprocessing, gevent,
etc.
I want to use a single machine with several cores. I want to solve
problems like