I'm currently playing with Amazon EC2 and S3 for data and resources.
I found that Adrian Holovaty used S3 for chicagocrime.org:
http://www.holovaty.com/blog/archive/2006/04/07/0927/?highlight=s3

2007/5/7, Kyle Fox <[EMAIL PROTECTED]>:
>
>
> 1) We found S3 to be a bit slow, both upstream and down.  Upstream is
> slow largely because of the SOAP overhead (from what I understand),
> and we were sending lots of data (about 5 resized images for each
> image uploaded to Django).  Downstream, well, there's not much you can do
> about that, regardless of how you set up your models.  We simply stored
> the image's S3 key in the database and used it when constructing
> get_absolute_url().
>
> 2) I'm not sure you can do this.  When you say you want to have a
> connection with the service, are you referring to a connection using
> the S3 Python library?  I'm not sure how you would create a global,
> persistent connection to S3, which is what I assume you want.  We
> simply made a wrapper class around S3 and added an add_to_bucket()
> method to our user's profile model that used the S3 wrapper class.
> Again, this seemed reasonably fast; the bottleneck was not managing
> the connection to S3, but the actual upload to S3 itself.
>
> Kyle.
>
>
> >
>
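For point 1), just to check my understanding: something like this, where
only the key lives in the database and the URL is built without touching
S3 at request time? (The field names, bucket name and URL format below
are my own guesses, not your actual code.)

    from django.db import models

    S3_BUCKET = 'my-bucket'  # assumed bucket name; files served straight from S3

    class Photo(models.Model):
        title = models.CharField(max_length=200)
        s3_key = models.CharField(max_length=255)  # key of the object stored in S3

        def get_absolute_url(self):
            # No call to S3 here; the URL is derived from the stored key.
            return 'http://s3.amazonaws.com/%s/%s' % (S3_BUCKET, self.s3_key)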

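And for point 2), I picture the wrapper roughly like this, with boto
standing in for whatever S3 library you actually used (the class, method
and settings names are all mine):

    import boto
    from django.conf import settings

    class S3Wrapper(object):
        """Thin S3 wrapper; a fresh connection is opened per upload, since
        the upload itself (not connection setup) is the slow part."""

        def __init__(self, bucket_name):
            self.bucket_name = bucket_name

        def add_to_bucket(self, key_name, data):
            conn = boto.connect_s3(settings.AWS_ACCESS_KEY_ID,
                                   settings.AWS_SECRET_ACCESS_KEY)
            bucket = conn.get_bucket(self.bucket_name)
            key = bucket.new_key(key_name)
            key.set_contents_from_string(data)
            return key_name

A method on the user's profile model could then just instantiate
S3Wrapper and call add_to_bucket() with the resized image data, which is
how I read your setup. Does that match what you did?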