Hi Everyone,

Version 1.6.3 of the Riak Spark Connector has been released and is now
available through both GitHub
<https://github.com/basho/spark-riak-connector/releases/tag/v1.6.3> and Spark
Packages <https://spark-packages.org/package/basho/spark-riak-connector>.
I'd like to thank the team, especially Sergey Galkin
<https://github.com/srgg>, for all their hard work on this release.

This release includes the following fixes and enhancements:


   - Enhancement - Data locality support for Coverage Plan Based
   Partitioning #230
   <https://github.com/basho/spark-riak-connector/pull/230>
   - Enhancement - Improve Coverage Plan Based Partitioning: smart split
   calculation and more accurate coverage entry distribution across the
   partitions #231 <https://github.com/basho/spark-riak-connector/pull/231>
   - Critical Fix - Python serialization for empty JSON objects #226
   <https://github.com/basho/spark-riak-connector/pull/226>
   - Fix - Remove double filtering for DataFrames #228
   <https://github.com/basho/spark-riak-connector/pull/228>
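
If you want to take 1.6.3 for a spin, here is a rough sketch of pulling the
connector into a Spark shell session and reading a bucket back as an RDD.
The Maven coordinates (Scala 2.10 artifact shown), the
spark.riak.connection.host setting, and the "test-bucket" name are just
illustrative assumptions, so please double-check the project README for the
exact coordinates and API that match your Spark and Scala versions:

    $SPARK_HOME/bin/spark-shell \
      --packages com.basho.riak:spark-riak-connector_2.10:1.6.3 \
      --conf spark.riak.connection.host=127.0.0.1:8087

    // Inside the shell: read every value in a Riak KV bucket as Strings
    import com.basho.riak.spark._
    val rdd = sc.riakBucket[String]("test-bucket").queryAll()
    println(s"Fetched ${rdd.count()} values")

(Adjust the host, bucket name, and artifact suffix to match your cluster
and build.)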


As always, issues and PRs are welcome via the project's GitHub page
<https://github.com/basho/spark-riak-connector>.

Thanks,
Alex Moore
Clients Team Lead, Basho
amo...@basho.com