I think this also depends on the operating system. I would say that any
development team would want to avoid loading an entire file into fast
memory; these problems come up in all kinds of applications. From the PHP
point of view, it means that file data have to be read into memory, but it
does not mean that the whole file must be held there at once.
Are you actually having a problem with memory, or simply with the fact
that you have to transfer the file over a network first? Depending on the
protocol used, you may be able to read it in chunks, but those chunks will
still have to be copied to the machine that is reading them before they
can be processed.
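The chunked reading suggested above can be sketched in PHP with `fseek()` and `fread()`. This is an illustrative sketch of the kind of function the original poster describes (the name `read_chunk` is my own); only one chunk is ever held in memory, so the total file size does not matter:

```php
<?php
// Return a chunk of binary data from a file given an offset and size,
// without loading the whole file into memory. Illustrative sketch.
function read_chunk($path, $offset, $size)
{
    $fp = fopen($path, 'rb');        // binary-safe read mode
    if ($fp === false) {
        return false;
    }
    if (fseek($fp, $offset) !== 0) { // jump straight to the offset
        fclose($fp);
        return false;
    }
    $data = fread($fp, $size);       // read at most $size bytes
    fclose($fp);
    return $data;
}
```

With a 500 MB file, calling this with a modest chunk size (say 1 MB) keeps PHP's memory use at roughly the chunk size rather than the file size.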
On Tue, 2009-09-01 at 10:43 -0700, Grace Shibley wrote:
> HTTP
>
> On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan wrote:
>
> > On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > > Is there a way to read large (possibly 500 MB) remote files without
> > > loading the whole file into memory?
> > > We are trying to write a function that will return chunks of binary
> > > data from a file on our server given a file location, specified
> > > offset and data size.
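Since the protocol turns out to be HTTP, one option is a Range request, which asks the server to send only the requested byte span (the server must support ranges and answer with 206 Partial Content). A minimal sketch, assuming an illustrative function name and URL:

```php
<?php
// Fetch only bytes [$offset, $offset + $size - 1] of a remote file over
// HTTP using a Range header. Requires the server to honour byte ranges;
// otherwise the full body comes back. Illustrative sketch.
function http_read_chunk($url, $offset, $size)
{
    $end = $offset + $size - 1;
    $ctx = stream_context_create(array(
        'http' => array(
            'method' => 'GET',
            'header' => "Range: bytes=$offset-$end\r\n",
        ),
    ));
    return file_get_contents($url, false, $ctx);
}
```

This way only the requested chunk ever crosses the network, instead of the whole 500 MB file being transferred and then sliced locally.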
From: Grace Shibley
> Is there a way to read large (possibly 500 MB) remote files without
> loading the whole file into memory?
> We are trying to write a function that will return chunks of binary
> data from a file on our server given a file location, specified offset
> and data size.