What kind of data do you store in that file? If it's something that can be
grouped by date, I'd say break it up into smaller files, e.g. one per month (mm.json).
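As a rough sketch of that idea (the `date` field name and format are assumptions; adjust to your actual records), one big JSON array could be split into per-month files like this:

```python
import json
from collections import defaultdict

def split_by_month(records, out_dir="."):
    """Group records by an assumed 'date' field ('YYYY-MM-DD')
    and write one <mm>.json file per month."""
    groups = defaultdict(list)
    for rec in records:
        month = rec["date"][5:7]          # 'YYYY-MM-DD' -> 'MM'
        groups[month].append(rec)
    for month, recs in groups.items():
        with open(f"{out_dir}/{month}.json", "w") as f:
            json.dump(recs, f)
    return sorted(groups)                 # list of months written

# records = json.load(open("big.json"))
# split_by_month(records)
```

Each request then only has to parse the month it actually needs instead of the whole 100 MB file.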
You might want to check this out as well:
http://cookbooks.adobe.com/post_Improving_speed_of_JSON_data_parsing-13226.html
Hi,
On Tue, Dec 1, 2009 at 11:46 PM, waseem sabjee wrote:
> if the user has to upload the file to you, you read it, edit and then save
> it and prompt the user to download it - I would say that's an efficient
> process... until you deal with a huge file.
>
> No, the JSON files are on the server side
If the user has to upload the file to you, you read it, edit and then save
it and prompt the user to download it - I would say that's an efficient
process... until you deal with a huge file.
Is the user uploading the file to you the only method you are willing to use
to retrieve the data?
if you kn
Well, I got it.
On the server side, I can open the JSON file (100 MB), read the data
structure directly into Python, process the required fields, and then return
them to the client. But 1) persistence: is the 100 MB file kept loaded in
memory permanently? Or 2) is it loaded multiple times, once per request?
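One common answer to that question (a sketch, not specific to your setup) is option 1 with a twist: parse the file once per server process and cache the result in a module-level variable, so every later request reuses the already-parsed structure instead of re-reading 100 MB:

```python
import json

_CACHE = None  # parsed once per process, then reused

def get_data(path="data.json"):
    """Load and cache the JSON file; later calls skip the disk entirely."""
    global _CACHE
    if _CACHE is None:
        with open(path) as f:
            _CACHE = json.load(f)
    return _CACHE

def pick_fields(path, fields):
    """Return only the requested fields from each record."""
    return [{k: rec[k] for k in fields if k in rec} for rec in get_data(path)]
```

The trade-off is that the parsed structure stays resident in memory for the life of the process, and a restart is needed to pick up changes to the file.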
One way this can be achieved is with a mixture of server-side and client-side code.
Set up a separate file or web service to get your data.
Make an AJAX call to this file, passing your parameters via the call.
On success of that call, if the data you requested is returned, make
another AJAX call with ne
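The repeated-call pattern described above can be sketched on the server side like this (function and parameter names are hypothetical): each AJAX call passes an offset, and the client keeps requesting until the server signals there is no more data.

```python
def get_page(records, offset=0, limit=100):
    """Return one page of results plus the offset for the next call.
    next_offset is None when there is no more data to fetch."""
    page = records[offset:offset + limit]
    next_offset = offset + limit if offset + limit < len(records) else None
    return {"items": page, "next_offset": next_offset}
```

This way the client never holds more than one page at a time, which keeps each request small even when the full dataset is 100 MB.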