Well, I got it. On the server side, I can open the 100MB JSON file, read the data structure directly into Python, process the required fields, and then return them to the client. But: 1) persistence - is the 100MB file always held in memory? Or 2) is it loaded again on every request? So, is there a memory-efficient way of reading big JSON files?
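For concreteness, here is a minimal sketch of the kind of streaming parse I am asking about, assuming the third-party ijson library and a top-level JSON array of objects (both are assumptions on my part, not something confirmed in this thread):

import ijson  # third-party streaming JSON parser (assumption: pip install ijson)

def extract_fields(path, wanted=("id", "name")):
    """Yield only the wanted keys from each object in a top-level JSON
    array, without ever loading the whole file into memory."""
    with open(path, "rb") as f:
        # the "item" prefix walks each element of the top-level array
        # one at a time, keeping only the current object in memory
        for obj in ijson.items(f, "item"):
            yield {k: obj[k] for k in wanted if k in obj}

# usage: stream through the 100MB file, holding only small dicts at a time
for row in extract_fields("big.json"):
    print(row)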
thanks,
Krishna

On Tue, Dec 1, 2009 at 10:50 PM, waseem sabjee <waseemsab...@gmail.com> wrote:
> One way this can be achieved is with a mixture of server-side and client-side
> code.
>
> Set up a separate file or web service to get your data.
>
> Make an AJAX call to this file, passing your parameters with the call.
> On success of that call, if the data you requested is returned, make
> another AJAX call with new parameters.
>
> Say my first call was to get data rows 1-50,
> and my second was to get rows 51-100.
>
> Do you mean something like that?
>
> 100MB is a bit much for client-side scripting, though.
>
> I would suggest using a server-side script and cleaning server-side memory
> when required.
> Otherwise you would just be eating your users' bandwidth... eating it like
> there's no tomorrow.
>
>
> On Tue, Dec 1, 2009 at 3:44 PM, MorningZ <morni...@gmail.com> wrote:
>
>> Let's put it simply: JavaScript code just isn't that smart....
>>
>> Your client-side code (1) makes a request, then (2) the server
>> responds, ** that's it **. As the person above me suggests, use your
>> server-side code, the one providing the "big JSON" data, to do the
>> filtering.
>>
>> On Dec 1, 3:44 am, km <srikrishnamo...@gmail.com> wrote:
>> > Hi all,
>> >
>> > I am currently using $.getJSON to load a big JSON file (100MB).
>> > Is there a way to selectively parse a few fields of the JSON file,
>> > so that the full file doesn't get loaded in memory?
>> > In summary, I am looking to parse a few keys in the JSON file and
>> > fetch only those values to display on the webpage.
>> >
>> > Any ideas?
>> > Thanks,
>> >
>> > Regards,
>> > Krishna
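Putting the replies above together, a rough sketch of the server side being suggested: parse the file once when the process starts, so a long-running server holds it in memory rather than re-reading the 100MB file per request, and serve row slices to the paged AJAX calls. The http.server handler and the start/count query parameters below are illustrative assumptions, not anything specified in this thread:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Loaded once at import time: the long-running server process keeps the
# parsed data in memory instead of re-reading the file on every request.
with open("big.json", "r", encoding="utf-8") as f:
    ROWS = json.load(f)  # assumes a top-level JSON array

class SliceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /rows?start=0&count=50, then /rows?start=50&count=50
        qs = parse_qs(urlparse(self.path).query)
        start = int(qs.get("start", ["0"])[0])
        count = int(qs.get("count", ["50"])[0])
        body = json.dumps(ROWS[start:start + count]).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), SliceHandler).serve_forever()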