@Niphlod, @LightDot,

thanks a lot :) The script I'm referring to parses 10 different RSS feeds 
in one run and stores the news (only the latest items) in the database. 
Parsing takes around 20-30 s, plus there's a bulk_insert() performed. I was 
thinking of putting the script to sleep every time it finishes parsing a 
specific feed and inserting the results into the db. I guess I'll really 
start to worry when there's actually something to worry about (e.g. high 
server load).
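
Roughly what I have in mind (just a sketch, not the actual script -- the 
feed URLs, the "news" table and its fields are placeholders, and I'm 
assuming feedparser here, which may not be what's used in production):

import time
import feedparser

FEEDS = [
    'http://example.com/feed1.rss',
    'http://example.com/feed2.rss',
    # ... 10 feeds in total
]

def fetch_feeds(db, pause=2):
    for url in FEEDS:
        parsed = feedparser.parse(url)  # this is where most of the 20-30 s goes
        rows = [dict(title=e.title, link=e.link, published=e.get('published'))
                for e in parsed.entries]
        if rows:
            db.news.bulk_insert(rows)   # web2py DAL bulk insert
            db.commit()
        time.sleep(pause)               # pause between feeds to ease server load

So the question was basically whether that sleep() between feeds is worth 
having at all.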

On Monday, September 30, 2013 1:58:32 PM UTC+2, LightDot wrote:
>
> @lesssugar Define efficient. :) Are you after a shorter overall execution 
> time for this task, lower server load, lower power consumption etc.?
>
> I would generally start the task and leave the management to the database 
> backend, kernel governor, etc. unless there is a specific need that they 
> can't provide for.
>
> Compulsory xkcd <http://xkcd.com/1205/>. And however efficient you are, 
> you're still contributing towards the heat death of the universe, or so 
> we're told. So theoretically... don't bother unless you need to.
>
> Regards
>
> On Monday, September 30, 2013 2:29:57 AM UTC+2, lesssugar wrote:
>>
>> That was a theoretical question. I'm asking because I don't know which 
>> approach is better. There's no specific issue I'm trying to address. I just 
>> want to find out which way is more efficient.
>
>
