Pretty interesting.

So, I have an idea: why don't you log this timing output to a file
(and share it)?

import time, logging
from django.http import HttpResponse

logger = logging.getLogger(__name__)

def page(request):
    start = time.time()
    # ... view logic (queries, template rendering, etc.) ...
    elapsed = time.time() - start
    logger.info("page took %.1f ms", elapsed * 1000)
    return HttpResponse()

That way you can see whether it's the framework that's taking the time or
the processing/DB connection (put the same timing calls before and after a
DB call).
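To catch every request without editing each view, the same timing can be
wrapped in a middleware — a rough sketch for the old-style Django
middleware API; the logger name and the `_timing_start` attribute are my
own choices, not anything Django provides:

```python
import logging
import time

logger = logging.getLogger("request_timing")

class TimingMiddleware(object):
    """Log the wall-clock time spent handling each request."""

    def process_request(self, request):
        # Stash the start time on the request object.
        request._timing_start = time.time()

    def process_response(self, request, response):
        start = getattr(request, "_timing_start", None)
        if start is not None:
            elapsed_ms = (time.time() - start) * 1000
            logger.info("%s took %.1f ms", request.path, elapsed_ms)
        return response
```

Add it to MIDDLEWARE_CLASSES, point the "request_timing" logger at a file
handler, and compare those numbers against ab's per-request times: the gap
is whatever nginx/uwsgi add on top of Django itself.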

Tx

On Jul 6, 12:54 am, drakkan <drakkan1...@gmail.com> wrote:
> Hi,
>
> I did a small test app that basically renders a web page and does 3
> queries to the database (these queries are needed to fetch some data
> displayed on the web page). I deployed this app with nginx+uwsgi.
>
> here is the relevant nginx conf:
>
> location / {
>     uwsgi_pass 127.0.0.1:49152;
>     include uwsgi_params;
> }
>
> I configured nginx to serve the static files too:
>
> location /media {
>     root <root path for media>;
>     autoindex on;
> }
>
> I launch uwsgi as follows:
>
> uwsgi --chdir=<path> --
> module='django.core.handlers.wsgi:WSGIHandler()' --env
> DJANGO_SETTINGS_MODULE=myapp.settings --master --pidfile=/tmp/project-
> master.pid --socket=127.0.0.1:49152 --max-requests=5000 --process=5
>
> then I benchmarked the app with ab:
>
> ab -n 1000 -c 4 http://127.0.0.1:80/
> This is ApacheBench, Version 2.3 <$Revision: 655654 $>
> Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
> Licensed to The Apache Software Foundation, http://www.apache.org/
>
> Benchmarking 127.0.0.1 (be patient)
> Completed 100 requests
> Completed 200 requests
> Completed 300 requests
> Completed 400 requests
> Completed 500 requests
> Completed 600 requests
> Completed 700 requests
> Completed 800 requests
> Completed 900 requests
> Completed 1000 requests
> Finished 1000 requests
>
> Server Software:        nginx
> Server Hostname:        127.0.0.1
> Server Port:            80
>
> Document Path:          /
> Document Length:        24293 bytes
>
> Concurrency Level:      4
> Time taken for tests:   28.914 seconds
> Complete requests:      1000
> Failed requests:        0
> Write errors:           0
> Total transferred:      24423000 bytes
> HTML transferred:       24293000 bytes
> Requests per second:    34.59 [#/sec] (mean)
> Time per request:       115.654 [ms] (mean)
> Time per request:       28.914 [ms] (mean, across all concurrent
> requests)
> Transfer rate:          824.89 [Kbytes/sec] received
>
> Connection Times (ms)
>               min  mean[+/-sd] median   max
> Connect:        0    0   0.1      0       4
> Processing:    46  115  42.6    110     636
> Waiting:       46  115  42.5    109     636
> Total:         46  116  42.6    110     636
>
> Percentage of the requests served within a certain time (ms)
>   50%    110
>   66%    121
>   75%    131
>   80%    139
>   90%    161
>   95%    177
>   98%    203
>   99%    220
>  100%    636 (longest request)
>
> now I implemented the same app using Play Framework, a Java-based
> framework, and benchmarked with ab again:
>
> ab -n 1000 -c 4 http://127.0.0.1:9000/
> This is ApacheBench, Version 2.3 <$Revision: 655654 $>
> Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
> Licensed to The Apache Software Foundation, http://www.apache.org/
>
> Benchmarking 127.0.0.1 (be patient)
> Completed 100 requests
> Completed 200 requests
> Completed 300 requests
> Completed 400 requests
> Completed 500 requests
> Completed 600 requests
> Completed 700 requests
> Completed 800 requests
> Completed 900 requests
> Completed 1000 requests
> Finished 1000 requests
>
> Server Software:        Play!
> Server Hostname:        127.0.0.1
> Server Port:            9000
>
> Document Path:          /
> Document Length:        19614 bytes
>
> Concurrency Level:      4
> Time taken for tests:   4.436 seconds
> Complete requests:      1000
> Failed requests:        0
> Write errors:           0
> Total transferred:      19961000 bytes
> HTML transferred:       19614000 bytes
> Requests per second:    225.44 [#/sec] (mean)
> Time per request:       17.743 [ms] (mean)
> Time per request:       4.436 [ms] (mean, across all concurrent
> requests)
> Transfer rate:          4394.59 [Kbytes/sec] received
>
> Connection Times (ms)
>               min  mean[+/-sd] median   max
> Connect:        0    0   0.0      0       1
> Processing:     7   18   6.6     16      47
> Waiting:        6   17   6.6     16      47
> Total:          7   18   6.6     16      47
>
> Percentage of the requests served within a certain time (ms)
>   50%     16
>   66%     19
>   75%     22
>   80%     23
>   90%     27
>   95%     30
>   98%     34
>   99%     38
>  100%     47 (longest request)
>
> so Play is outperforming Django! Obviously Django is not in debug
> mode, etc. Is there something wrong in my test setup (I already tried
> adjusting the uwsgi launch line, with more processes or 1 process with
> threads, etc., with no relevant improvement), or are Django/Python
> simply much slower than Java? I tried running Play behind an nginx
> proxy too: the results are practically identical. Note the response
> times too: the slowest Play response took 47 ms, while Django's median
> was 110 ms.
>
> any suggestion to improve performance is appreciated,
>
> thanks in advance,
> drakkan

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com.
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en.
