> Please consider this situation:
> Each line in "massive_input.txt" needs to be churned by the
> "time_intensive_stuff" function, so I am trying to background it.
>
> import threading
>
> def time_intensive_stuff(arg):
>     # some code that sets some_conditional
>     return some_conditional
>
> with open("massive_input.txt") as fobj:
>     for i in fobj:
>         thread_thingy = threading.Thread(target=time_intensive_stuff, args=(i,))
>         thread_thingy.start()
>
>
> With the above code, it still does not feel like it is backgrounding at
> scale; I am sure there is a better, more Pythonic way.


You might be looking for the multiprocessing library:
https://docs.python.org/3.6/library/multiprocessing.html.
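
If time_intensive_stuff is CPU-bound, a pool of worker processes is the usual
answer, since CPython threads all share one interpreter lock. A rough sketch,
assuming the function takes one line and returns a result (the body below is
only a placeholder):

import multiprocessing

def time_intensive_stuff(line):
    # placeholder for the real per-line work
    return len(line)

if __name__ == "__main__":
    # One worker process per CPU core by default.
    with multiprocessing.Pool() as pool:
        with open("massive_input.txt") as fobj:
            # imap consumes the file lazily instead of reading it all at once
            for result in pool.imap(time_intensive_stuff, fobj, chunksize=100):
                print(result)

Note that the worker function has to live at module level so it can be pickled
and shipped to the worker processes.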

Can you say more about the nature of the "time_intensive_stuff" part
though?  More context may help.
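
The reason I ask: if the work is mostly waiting on I/O (network, disk, a
database), plain threads can be enough, and a bounded pool is better than
starting one thread per line. A sketch along the same lines, again with
placeholder names:

from concurrent.futures import ThreadPoolExecutor

def time_intensive_stuff(line):
    # placeholder for the real per-line work
    return len(line)

with open("massive_input.txt") as fobj:
    # A fixed number of worker threads instead of one thread per line.
    with ThreadPoolExecutor(max_workers=8) as executor:
        for result in executor.map(time_intensive_stuff, fobj):
            print(result)

(One caveat: executor.map submits every line up front, so for a truly huge
file you may want to feed it in batches.)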
