Anders Eriksson wrote:
> I have made a short program that given an url will download all referenced
> files on that url.
>
> It works, but I'm thinking it could use some optimization since it's very
> slow.
What's slow about it? Is downloading each file slow, or is it the overhead of
connecting to the server for each file?
Anders Eriksson wrote:
> Hello,
>
> I have made a short program that given an url will download all referenced
> files on that url.
>
> It works, but I'm thinking it could use some optimization since it's very
> slow.
>
> I create a list of tuples where each tuple consists of the url to the file
> and the path to where I want to save it. Then, for each tuple:
>
> from urllib.request import urlopen
>
> srcdata = urlopen(url).read()
> dstfile = open(path, mode='wb')
> dstfile.write(srcdata)
> dstfile.close()
> print("Done!")
Have you tried reading all the files first, then saving each one in the
appropriate directory? It might work if you have enough memory.
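A minimal sketch of that suggestion (read everything into memory first, write everything afterwards), assuming `pairs` is the poster's list of (url, path) tuples:

```python
from urllib.request import urlopen

def download_then_save(pairs):
    """pairs is a list of (url, path) tuples, as in the original post."""
    # Phase 1: read every file into memory first.
    contents = [(path, urlopen(url).read()) for url, path in pairs]
    # Phase 2: write each file to its destination.
    for path, data in contents:
        with open(path, mode='wb') as dstfile:
            dstfile.write(data)
```

Note this only helps if the whole set of files fits in memory; it does not reduce the number of connections made.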
Hello,
I have made a short program that, given a URL, will download all files
referenced on that page.
It works, but I'm thinking it could use some optimization since it's very
slow.
I create a list of tuples where each tuple consists of the url to the file
and the path to where I want to save it.
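The program described above might look roughly like this (a minimal sketch, not the poster's actual code; the link extraction via `html.parser` and the flat `dest_dir` layout are assumptions):

```python
import os
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value:
                    self.links.append(value)

def build_pairs(page_url, dest_dir):
    # Build the list of (url, path) tuples the poster describes.
    parser = LinkCollector()
    parser.feed(urlopen(page_url).read().decode('utf-8', 'replace'))
    return [(urljoin(page_url, link),
             os.path.join(dest_dir, os.path.basename(link)))
            for link in parser.links]

def download_all(page_url, dest_dir):
    for url, path in build_pairs(page_url, dest_dir):
        srcdata = urlopen(url).read()
        with open(path, mode='wb') as dstfile:
            dstfile.write(srcdata)
    print("Done!")
```

Each file here is fetched with a separate `urlopen` call, which matches the per-file connection overhead discussed in the replies.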