On Mon, Jun 19, 2023 at 9:37 AM Deepak Sharma wrote:
It can be as simple as adding a function to the Spark session builder,
specifically on the read, which can take the yaml file (the definition of the
data contracts would be in yaml) and apply it to the data frame.
It can ignore the rows not matching the data contracts defined in the yaml.
Thanks
Deepak
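
A rough sketch of what such a helper could look like in PySpark, assuming a
made-up YAML layout and a hypothetical apply_contract function (nothing like
this exists in Spark itself today):

# Hypothetical sketch only: Spark has no built-in data-contract reader.
# Assumes a contract file such as:
#   columns:
#     - name: customer_id
#       nullable: false
#     - name: amount
#       min: 0
import yaml
from pyspark.sql import DataFrame, functions as F

def apply_contract(df: DataFrame, contract_path: str) -> DataFrame:
    # Drop rows that violate the constraints declared in the YAML contract.
    with open(contract_path) as f:
        contract = yaml.safe_load(f)
    cond = F.lit(True)
    for col in contract.get("columns", []):
        c = F.col(col["name"])
        if not col.get("nullable", True):
            cond = cond & c.isNotNull()
        if "min" in col:
            cond = cond & (c >= col["min"])
        if "max" in col:
            cond = cond & (c <= col["max"])
    return df.where(cond)

# Usage sketch:
#   df = spark.read.parquet("/path/to/input")
#   clean = apply_contract(df, "orders_contract.yaml")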
… appear to be attempting to implement data contracts within their ecosystem.
Unfortunately, I think it's closed source and Python only.

Regards,

Phillip

On Sat, Jun 17, 2023 at 11:06 AM Mich Talebzadeh wrote:
> It would be interesting if we think about creating a contract validation
> lib…
… easily, like schema validation, some rules validations. Spark could also
generate an embryo of data contracts…

—jgp
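
To illustrate the "embryo of data contracts" idea, a short sketch (again
hypothetical, using the same made-up YAML layout as above) that derives a
contract skeleton from an existing DataFrame schema:

# Sketch: dump a DataFrame schema as a starting point for a data contract.
# The YAML layout is invented for illustration.
import yaml

def contract_skeleton(df):
    columns = [
        {
            "name": field.name,
            "type": field.dataType.simpleString(),
            "nullable": field.nullable,
        }
        for field in df.schema.fields
    ]
    return yaml.safe_dump({"columns": columns}, sort_keys=False)

# Usage sketch (hypothetical table name):
#   print(contract_skeleton(spark.table("orders")))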
On Jun 13, 2023, at 07:25, Mich Talebzadeh wrote:
From my limited understanding of data contracts, there are two factors that
seem necessary:

1. procedure matter
2. technical matter

I mean this is nothing new. Some tools like Cloud Data Fusion can assist
when the procedures are validated. Simply, "The process of integrating
multi…
… control. Using pull requests you could collaborate on changing the
contract and making sure that the change has gotten enough attention before
pushing it to production. Hope this helps!

Kind regards,
Fokko
On Tue, 13 Jun 2023 at 04:31, Deepak Sharma wrote:
Spark can be used with tools like Great Expectations as well to implement
the data contracts.
I am not sure though if Spark alone can do the data contracts.
I was reading a blog on data mesh and how to glue it together with data
contracts; that's where I came across this Spark and Great Expectations…
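
For context, a minimal sketch of the Great Expectations route mentioned
above. The exact API differs between Great Expectations versions; this
follows the older SparkDFDataset style, and the column names are invented:

# Check a couple of contract-style expectations against a Spark DataFrame.
# Note: newer Great Expectations releases use a different (Validator) API;
# SparkDFDataset is the older wrapper around a Spark DataFrame.
from great_expectations.dataset import SparkDFDataset

def passes_contract(df):
    gdf = SparkDFDataset(df)
    results = [
        gdf.expect_column_values_to_not_be_null("customer_id"),
        gdf.expect_column_values_to_be_between("amount", min_value=0, max_value=None),
    ]
    return all(r.success for r in results)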
… communicate a contract. Interestingly, I believe this approach has been
applied to both JsonSchema and protobuf as part of the Confluent Schema
Registry.
Elliot.
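
As a small illustration of the point about schemas carrying contract-style
constraints, here is a toy JSON Schema check in Python (field names and
values are invented; this is independent of the Confluent Schema Registry
itself):

# A schema that encodes simple contract constraints, checked with the
# jsonschema package.
from jsonschema import ValidationError, validate

order_contract = {
    "type": "object",
    "required": ["order_id", "amount"],
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number", "minimum": 0},
    },
}

try:
    # A record that violates the "minimum" constraint on amount.
    validate(instance={"order_id": "o-1", "amount": -5}, schema=order_contract)
except ValidationError as err:
    print("contract violation:", err.message)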
Hey Phillip,

You're right that we can improve tooling to help with data contracts, but I
think that a contract still needs to be an agreement between people.
Constraints help by ensuring that a data producer adheres to the contract
and by giving feedback as soon as possible when assumption…
On Mon, 12 Jun 2023 at 12:43, Phillip Henry wrote:

Hi, folks.

There currently seems to be a buzz around "data contracts". From what I can
tell, these mainly advocate a cultural solution. But instead, could big
data tools be used to enforce these contracts?
My questions really are: are there any plans to implement data constraints
in…