Hello,

> Am 08.04.2022 um 17:34 schrieb Lalwani, Jayesh <jlalw...@amazon.com>:
> 
> What format are you writing the file to? Are you planning on your own custom 
> format, or are you planning to use standard formats like parquet?

I’m dealing with geo-spatial data (Apache Sedona), so I have a DataFrame with 
such information and would like to export it to the LAS format (see 
https://en.wikipedia.org/wiki/LAS_file_format )
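
For illustration, a minimal sketch of what the source data looks like on my
side (the DataFrame df and the column names geometry/intensity are just
placeholders; ST_X/ST_Y/ST_Z are Sedona SQL functions, and ST_Z assumes 3D
geometries):

  import org.apache.spark.sql.functions.expr

  // Pull the raw coordinates and attributes out of the geometry column so
  // they can later be scaled/offset and packed into LAS point records.
  val points = df.select(
    expr("ST_X(geometry)").as("x"),
    expr("ST_Y(geometry)").as("y"),
    expr("ST_Z(geometry)").as("z"),
    expr("intensity"))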

> 
> Note that Spark can write numeric data in most standard formats. If you use a 
> custom format instead, whoever consumes the data needs to parse your data. 
> This adds complexity to your and your consumer's code. You will also need to 
> worry about backward compatibility. 
> 
> I would suggest that you explore standard formats first before you write 
> custom code. If you do have to write data in a custom format, a UDF is a good 
> way to serialize the data into your format.

The numerical data must be converted into the binary representation defined by 
the LAS format specification; see 
http://www.asprs.org/wp-content/uploads/2019/07/LAS_1_4_r15.pdf (section 2.6, 
Table 7).
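
For illustration, a rough sketch of how one such record could be packed,
assuming Table 7 describes Point Data Record Format 0 (a 20-byte,
little-endian record) and ignoring coordinate scaling/offsets and the file
header:

  import java.nio.{ByteBuffer, ByteOrder}
  import org.apache.spark.sql.functions.udf

  // Pack one Point Data Record Format 0: X, Y, Z as 4-byte signed integers
  // (already scaled/offset per the header), Intensity and Point Source ID
  // as unsigned shorts, plus the single-byte fields in between.
  def packPointRecord0(x: Int, y: Int, z: Int, intensity: Int,
                       flags: Byte, classification: Byte,
                       scanAngleRank: Byte, userData: Byte,
                       pointSourceId: Int): Array[Byte] = {
    val buf = ByteBuffer.allocate(20).order(ByteOrder.LITTLE_ENDIAN)
    buf.putInt(x)                       // X
    buf.putInt(y)                       // Y
    buf.putInt(z)                       // Z
    buf.putShort(intensity.toShort)     // Intensity (unsigned short)
    buf.put(flags)                      // return number / flags byte
    buf.put(classification)             // Classification
    buf.put(scanAngleRank)              // Scan Angle Rank
    buf.put(userData)                   // User Data
    buf.putShort(pointSourceId.toShort) // Point Source ID (unsigned short)
    buf.array()
  }

  // Following the UDF suggestion above, this could produce a binary column
  // that a custom writer then concatenates into the point data block:
  val packUdf = udf((x: Int, y: Int, z: Int, i: Int) =>
    packPointRecord0(x, y, z, i, 0.toByte, 0.toByte, 0.toByte, 0.toByte, 0))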

Thanks

