[issue34393] json.dumps - allow compression

2018-08-14 Thread liad
liad added the comment: I'm sure I will find a work-around. I posted it for others who will face the same issue as me. There are many who use cloud storage, but not many work with PB-size files. This is likely to change in the near future as more and more companies start to process huge amounts of data.

[issue34393] json.dumps - allow compression

2018-08-14 Thread liad
liad added the comment: True, there are endless compression schemes, just like there are endless file formats. Still, there are some built-ins, like conversion from string to JSON. For example, you don't support conversion of JSON to ORC files; the same argument could have been raised there.

[issue34393] json.dumps - allow compression

2018-08-13 Thread liad
liad added the comment: The gzip module may work for saving a file locally, but consider this example, which uploads JSON to Google Storage:

import datalab.storage as storage
storage.Bucket('mybucket').item(path).write_to(json.dumps(response), 'application/json')

This won't work
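A minimal sketch of the in-memory workaround being discussed: compress the JSON string with gzip before handing the bytes to the storage client. The `response` dict here is hypothetical stand-in data; the original message gets it from an API call.

```python
import gzip
import json

# Hypothetical payload; in the original message `response` comes from an API.
response = {"status": "ok", "items": [1, 2, 3]}

# json.dumps returns a str; gzip operates on bytes, so encode first.
payload = gzip.compress(json.dumps(response).encode("utf-8"))

# `payload` (bytes) could then be written to cloud storage with
# content type 'application/json' and content encoding 'gzip'.
```

Note that a client method expecting a str (like write_to above) would need a variant that accepts raw bytes for this to apply.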

[issue34393] json.dumps - allow compression

2018-08-13 Thread liad
New submission from liad: The list of arguments of json.dump() can be seen here: https://docs.python.org/2/library/json.html Notice that there is no way to apply compression. For example, pandas allows you to do: df.to_csv(path_or_buf=file_name, index=False, encoding='
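For local files, the standard workaround is that json.dump() accepts any text file object, so wrapping the target file with gzip.open adds compression without a new json argument. A minimal sketch, with hypothetical data and filename:

```python
import gzip
import json

data = {"name": "example", "values": list(range(5))}  # hypothetical data

# json.dump writes to any text-mode file object; gzip.open in "wt" mode
# transparently compresses what is written.
with gzip.open("data.json.gz", "wt", encoding="utf-8") as f:
    json.dump(data, f)

# Reading back works the same way, in "rt" mode.
with gzip.open("data.json.gz", "rt", encoding="utf-8") as f:
    restored = json.load(f)
```

This is why the feature request was considered redundant for the file case: compression composes with json.dump through the file object, much as pandas' compression= parameter does internally.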