Stack: Django 1.7 + Postgres 9.3 + Linux (No caching)

Our application has a view that is called very frequently. The view
receives a JSON payload, parses it, and responds with JSON.

Between request and response there are about 3-5 inserts and around
1,200-5,000 lookups, depending on some if/else business logic. At around
2-4 seconds per request, the view is very slow.
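
Simplified, the hot path looks roughly like this (the app, model, field,
and key names below are placeholders, not our real code):

    import json
    from django.http import JsonResponse   # available since Django 1.7
    from myapp.models import SomeModel     # placeholder app/model name

    def my_view(request):
        payload = json.loads(request.body.decode("utf-8"))
        results = []
        for item in payload["items"]:       # typically 1200-5000 items
            # One query per item: this loop is the bottleneck.
            obj = SomeModel.objects.filter(code=item["code"]).first()
            results.append(obj.value if obj else None)
        # ... plus 3-5 inserts depending on if/else business logic ...
        return JsonResponse({"results": results})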

However, many of the lookups (which are the bottleneck) could be
parallelised, but I do not know how to do that within the
request-response cycle.
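
Something like the sketch below is what I have in mind: running the
independent lookups in a thread pool inside the view. The
ThreadPoolExecutor approach and the per-thread connection handling are
just my guesses; I don't know whether this is safe with the Django ORM:

    from concurrent.futures import ThreadPoolExecutor  # "futures" backport on Python 2.7
    from django.db import connection
    from myapp.models import SomeModel                 # placeholder app/model name

    def _lookup(item):
        try:
            # One independent lookup; runs in a worker thread.
            return SomeModel.objects.filter(code=item["code"]).first()
        finally:
            # django.db.connection is thread-local, so each worker opens its
            # own connection; close it here so connections are not leaked.
            connection.close()

    def parallel_lookups(items, max_workers=8):
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            return list(pool.map(_lookup, items))

Is that a sane direction inside a Django view, or do the ORM and
connection handling make it a bad idea?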

If this were a web UI, I could use Celery plus polling. Since this is a
machine-to-machine API call, though, the parallelisation has to happen
within the view's life cycle.

If parallelisation is not possible, what alternatives do I have for
scaling and reducing the response time?
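
For example, would collapsing the per-item lookups into a handful of
batched queries be a reasonable alternative? A rough sketch of what I
mean (model and field names are made up):

    from myapp.models import SomeModel   # placeholder app/model name

    def batched_lookups(items):
        codes = [item["code"] for item in items]
        # One round trip instead of one query per item;
        # "code" is a placeholder field name.
        found = {obj.code: obj for obj in SomeModel.objects.filter(code__in=codes)}
        return [found.get(code) for code in codes]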


Thanks!
