Since all my analytics will be done through the database, I need the log 
data in the table (by end of day at least).
So could I write all logs to Redis and then run a scheduler task to flush 
them to the db? Do you think that would be the right approach?
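
Roughly what I have in mind for the write side (a minimal, untested sketch; 
the redis-py client, the list key 'action_log' and the field names are my 
own choices, nothing web2py-specific):

    # in a model file or an importable module: one shared connection
    import json
    import datetime
    import redis

    r = redis.StrictRedis(host='localhost', port=6379, db=0)

    def log_action(user_id, action):
        # rpush is O(1), so it adds almost nothing to each request
        entry = dict(user_id=user_id, action=action,
                     ts=datetime.datetime.utcnow().isoformat())
        r.rpush('action_log', json.dumps(entry))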

On Monday, June 17, 2013 9:10:31 PM UTC+5:30, Niphlod wrote:
>
> continuous operations kinda don't play well with log files.... the issues 
> are more prominent in a multiprocess environment.
>
> Redis can do it without a hiccup, if you have it in your environment...
>
>
>
> On Monday, June 17, 2013 3:22:43 PM UTC+2, ssuresh wrote:
>>
>> hi,
>> I am planning to build some sort of user action analytics into my 
>> system. For this, I need to log all user actions into a table. Since an 
>> additional db write for every function call can be a performance issue, my 
>> idea is to introduce an async batch write (say, after every hundred log 
>> lines, a job will bulk-insert them into the table).
>>
>> Please advise on the best way of doing this using web2py.
>>
>> regds,
>> Suresh
>>
>
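
And for the batch insert described above, the flush side could be a web2py 
scheduler task along these lines (again a sketch, meant to sit in a model 
file where db is already defined; the table db.action_log and its fields 
are hypothetical and must match whatever log_action pushes):

    # drain the Redis list and bulk-insert the entries into the table
    import json
    import redis
    from gluon.scheduler import Scheduler

    r = redis.StrictRedis(host='localhost', port=6379, db=0)

    def flush_action_log(batch_size=500):
        rows = []
        for _ in range(batch_size):
            raw = r.lpop('action_log')  # returns None when the list is empty
            if raw is None:
                break
            rows.append(json.loads(raw))
        if rows:
            # one bulk_insert instead of len(rows) single inserts
            db.action_log.bulk_insert(rows)
            db.commit()
        return len(rows)

    scheduler = Scheduler(db)
    # queue it once as a repeating task, e.g. every 5 minutes:
    # scheduler.queue_task(flush_action_log, period=300, repeats=0)

With repeats=0 the task re-runs indefinitely, so a worker started with 
python web2py.py -K <appname> should keep draining the list.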
