On 11/10/19 4:05 PM, Nicolas Paris wrote:
> Hello postgres users,
>
> Spark-postgres is designed for reliable and performant ETL in big-data
> workloads and offers read/write/scd capability to better bridge spark
> and postgres.

Interesting. FYI, the announcement list is:
https://www.postgresql.org/list/pgsql-announce/

> I would like to import (lots of) Apache parquet files to a PostgreSQL 11

You might be interested in the spark-postgres library. Basically the library
allows you to bulk load parquet files in one spark command:

  spark
  .read.format("parquet")
  .load(parquetFilesPath)   // read the parquet files
  .write.format("postgres") // bulk load into postgres (full call sketched below)
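To make that command copy-pasteable, here is a fuller sketch. The SparkSession
setup and the input path are placeholders, and the connection option names
(host, database, user, schema, table, partitions) are assumptions about the
spark-postgres datasource rather than its documented API, so check the project
README for the exact spelling:

  // Sketch: bulk load a directory of parquet files into postgres through
  // the spark-postgres datasource. Option names below are assumed.
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("parquet-to-postgres")
    .getOrCreate()

  val parquetFilesPath = "/data/parquet/events"   // hypothetical input path

  spark
    .read.format("parquet")
    .load(parquetFilesPath)                       // read the parquet files
    .write.format("postgres")                     // spark-postgres datasource
    .option("host", "localhost")                  // assumed connection options
    .option("database", "mydb")
    .option("user", "postgres")
    .option("schema", "public")
    .option("table", "events")
    .option("partitions", "4")                    // parallel copy streams (assumed)
    .save()

Submitting this still requires the spark-postgres jar on the classpath, for
example via spark-submit's --jars or --packages flags.
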
Hello postgres users,
Spark-postgres is designed for reliable and performant ETL in big-data
workloads and offers read/write/scd capability to better bridge spark and
postgres. Version 3 introduces a datasource API. It outperforms sqoop by a
factor of 8 and the apache spark core jdbc by a far larger margin.
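
As a rough illustration of the read side, pulling a postgres table into spark
through the datasource might look like the sketch below, assuming an existing
SparkSession named spark; the format name, query option, and connection
options here are assumptions rather than the library's documented API:

  // Sketch of the read direction: postgres table -> spark DataFrame.
  // Option names are assumed, not copied from the spark-postgres docs.
  val df = spark.read.format("postgres")
    .option("host", "localhost")                     // assumed connection options
    .option("database", "mydb")
    .option("user", "postgres")
    .option("query", "select * from public.events")  // hypothetical query
    .load()

  df.show(10)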