[issue34393] json.dumps - allow compression

2018-08-22 Thread Eric V. Smith
Eric V. Smith added the comment: I'm not asking to be difficult; I'm asking because a full specification would be required in order to implement this. For example, you're excluding the "compresslevel" parameter, so presumably you'd want the default of 9, which is the slowest option. I'm not …
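
To make the speed/size tradeoff Eric mentions concrete, here is a minimal sketch using the stdlib gzip module; the payload is illustrative and exact sizes and timings will vary:

    import gzip
    import json

    # Illustrative payload: large and repetitive, so the level matters.
    data = json.dumps([{"id": i, "name": "item"} for i in range(10000)]).encode('utf-8')

    fast = gzip.compress(data, compresslevel=1)  # fastest, larger output
    best = gzip.compress(data, compresslevel=9)  # gzip's default: smallest, slowest
    print(len(data), len(fast), len(best))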

[issue34393] json.dumps - allow compression

2018-08-14 Thread liad
liad added the comment: I'm sure I will find a work-around. I posted it for others who will face the same issue as me. There are many who use cloud storage, but not many work with PB-size files. This is likely to change in the near future as more and more companies start to process huge amounts of data …

[issue34393] json.dumps - allow compression

2018-08-14 Thread Eric V. Smith
Eric V. Smith added the comment: If you really want to see this added to Python, then I suggest you put together a proposal on what the API would be, what options you would or wouldn't support, and post that on python-ideas. But I'll warn you that your chances of success are low. Your better …

[issue34393] json.dumps - allow compression

2018-08-14 Thread liad
liad added the comment: True, there are endless versions of compression, just like there are endless versions of file formats. Still, there are some built-ins, like conversion from string to JSON. For example, you don't support conversion of JSON to ORC files. The same argument could have been raised here: how would …

[issue34393] json.dumps - allow compression

2018-08-14 Thread Eric V. Smith
Eric V. Smith added the comment: There are too many types of compression for this to be built in to json. There's zlib, gzip, bzip2, zip, and no doubt dozens of others. How would we choose one, or several? Which would we leave out? How would we pass in the many parameters to all of these compression algorithms?
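
For context, a sketch of how the schemes Eric lists already compose with json.dumps via the stdlib (zipfile is omitted because it needs an archive member name, so it doesn't fit the same one-call shape):

    import bz2
    import gzip
    import json
    import lzma
    import zlib

    payload = json.dumps({'example': True}).encode('utf-8')

    # Each stdlib codec wraps the serialized bytes without json's involvement.
    compressed = {
        'zlib': zlib.compress(payload),
        'gzip': gzip.compress(payload),
        'bz2': bz2.compress(payload),
        'xz': lzma.compress(payload),
    }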

[issue34393] json.dumps - allow compression

2018-08-13 Thread liad
liad added the comment: The gzip module may work for saving a file locally, but consider this example, which uploads JSON to Google Storage:

    import datalab.storage as storage
    storage.Bucket('mybucket').item(path).write_to(json.dumps(response), 'application/json')

Yours won't work here unless I save the file …
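
A possible workaround for this upload case is to compress the serialized JSON in memory and upload the bytes. This is only a sketch: it assumes datalab's write_to accepts bytes content, which may not hold, and the 'application/gzip' content type is likewise illustrative:

    import gzip
    import json
    import datalab.storage as storage

    # Compress in memory; no local file is written.
    payload = gzip.compress(json.dumps(response).encode('utf-8'))

    # Assumption: write_to accepts bytes; the content type is illustrative.
    storage.Bucket('mybucket').item(path).write_to(payload, 'application/gzip')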

[issue34393] json.dumps - allow compression

2018-08-13 Thread rushter
rushter added the comment: You can use the gzip module:

    import gzip
    import json

    # GzipFile in 'w' mode expects bytes, so encode the serialized string.
    with gzip.GzipFile('products.json', 'w') as outfile:
        outfile.write(json.dumps(data, sort_keys=True).encode('utf-8'))

[issue34393] json.dumps - allow compression

2018-08-13 Thread liad
New submission from liad: The list of arguments of json.dump() can be seen here: https://docs.python.org/2/library/json.html Notice that there is no way to specify compression. For example, pandas allows you to do:

    df.to_csv(path_or_buf=file_name, index=False, encoding='utf-8', compression='gzip')
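
For comparison with the pandas call above, a hypothetical wrapper (dump_compressed is an invented name, not a stdlib or proposed API) showing how the requested convenience can be layered over the stdlib today:

    import bz2
    import gzip
    import json
    import lzma

    # Hypothetical helper, not part of json or any concrete proposal.
    _OPENERS = {'gzip': gzip.open, 'bz2': bz2.open, 'xz': lzma.open}

    def dump_compressed(obj, path, compression='gzip', **json_kwargs):
        opener = _OPENERS[compression]
        with opener(path, 'wt', encoding='utf-8') as f:
            json.dump(obj, f, **json_kwargs)

    # Example: dump_compressed(data, 'products.json.gz', sort_keys=True)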