reviews
Messages by Thread
[PR] draft [spark]
via GitHub
Re: [PR] [SPARK-51848] Fix parsing XML records with defined schema of array/structs of Variant [spark]
via GitHub
[PR] add new parameter option to views [spark]
via GitHub
[PR] [SPARK-51836][PYTHON][CONNECT][TESTS] Avoid per-test-function connect session setup [spark]
via GitHub
Re: [PR] [SPARK-51836][PYTHON][CONNECT][TESTS] Avoid per-test-function connect session setup [spark]
via GitHub
Re: [PR] [SPARK-51836][PYTHON][CONNECT][TESTS] Avoid per-test-function connect session setup [spark]
via GitHub
Re: [PR] Delay `Join.metadataOutput` computation until `Join` is resolved [spark]
via GitHub
[PR] [SPARK-51847][PYTHON][TESTS] Extend PySpark testing framework util functions with basic data tests [spark]
via GitHub
Re: [PR] [SPARK-49488][SQL][FOLLOWUP] Do not push down extract expression if extracted field is second [spark]
via GitHub
Re: [PR] [SPARK-49386][SPARK-27734][CORE][SQL] Add memory based thresholds for shuffle spill [spark]
via GitHub
[PR] [SPARK-51846] Upgrade `gRPC Swift Protobuf` to 1.2 and `gRPC Swift NIO Transport` to 1.0.3 [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51846] Upgrade `gRPC Swift Protobuf` to 1.2 and `gRPC Swift NIO Transport` to 1.0.3 [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51846] Upgrade `gRPC Swift Protobuf` to 1.2 and `gRPC Swift NIO Transport` to 1.0.3 [spark-connect-swift]
via GitHub
[PR] [SPARK-51841] Support `isLocal` and `isStreaming` for `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51841] Support `isLocal` and `isStreaming` for `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51841] Support `isLocal` and `isStreaming` for `DataFrame` [spark-connect-swift]
via GitHub
[PR] [SPARK-51845][ML][CONNECT] Add proto message `Clean` to clean up all ml cache [spark]
via GitHub
[PR] Revert "[SPARK-51691][CORE][TESTS] SerializationDebugger should swallow exception when try to find the reason of serialization problem" [spark]
via GitHub
Re: [PR] Revert "[SPARK-51691][CORE][TESTS] SerializationDebugger should swallow exception when try to find the reason of serialization problem" [spark]
via GitHub
[PR] [SPARK-51844][PYTHON][TESTS] Add ReusedMixedTestCase for test env with both classic and connect [spark]
via GitHub
[PR] [SPARK-51843][PYTHON][ML][TESTS] Avoid per-test classic session start & stop [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] change buildLeft and buildRight to function [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] change buildLeft and buildRight to function [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] change buildLeft and buildRight to function [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] change buildLeft and buildRight to function [spark]
via GitHub
[PR] [MINOR][INFRA] https://github.com/apache/spark/pull/50447 [spark-connect-swift]
via GitHub
Re: [PR] [MINOR][INFRA] Fix `merge_spark_pr` script for no jira case [spark-connect-swift]
via GitHub
Re: [PR] [MINOR][INFRA] Fix `merge_spark_pr` script for no jira case [spark-connect-swift]
via GitHub
Re: [PR] [MINOR][INFRA] Fix `merge_spark_pr` script for no jira case [spark-connect-swift]
via GitHub
[PR] [SPARK-51774][CONNECT][FOLLOW-UP][TESTS] Skip ConnectErrorsTest if grpc is not available [spark]
via GitHub
Re: [PR] [SPARK-51774][CONNECT][FOLLOW-UP][TESTS] Skip ConnectErrorsTest if grpc is not available [spark]
via GitHub
Re: [PR] [SPARK-51774][CONNECT][FOLLOW-UP][TESTS] Skip ConnectErrorsTest if grpc is not available [spark]
via GitHub
[PR] [SPARK-51836][PYTHON][CONNECT][TESTS][FOLLOW-UP] update `test_connect_classification` [spark]
via GitHub
Re: [PR] [SPARK-51836][PYTHON][CONNECT][TESTS][FOLLOW-UP] update `test_connect_classification` [spark]
via GitHub
Re: [PR] [SPARK-51836][PYTHON][CONNECT][TESTS][FOLLOW-UP] update `test_connect_classification` [spark]
via GitHub
Re: [PR] [SPARK-49488][SQL][FOLLOWUP] Use correct MySQL datetime functions when pushing down EXTRACT [spark]
via GitHub
Re: [PR] [SPARK-49488][SQL][FOLLOWUP] Use correct MySQL datetime functions when pushing down EXTRACT [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] modify buildLeft and buildRight with lazy [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] modify buildLeft and buildRight with lazy [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] modify buildLeft and buildRight with lazy [spark]
via GitHub
Re: [PR] [SPARK-51663][SQL][FOLLOWUP] modify buildLeft and buildRight with lazy [spark]
via GitHub
[PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51839] Support `except(All)?/intersect(All)?/union(All)?/unionByName` in `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51609][SQL] Optimize Recursive CTE execution for simple queries [spark]
via GitHub
[PR] Revert "[SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion" From 4.0 [spark]
via GitHub
Re: [PR] Revert "[SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion" From 4.0 [spark]
via GitHub
Re: [PR] Revert "[SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion" From 4.0 [spark]
via GitHub
[PR] [SPARK-51838][PYTHON][TESTS] Add a test to check function wildcard import [spark]
via GitHub
Re: [PR] [SPARK-51838][PYTHON][TESTS] Add a test to check function wildcard import [spark]
via GitHub
Re: [PR] [SPARK-51838][PYTHON][TESTS] Add a test to check function wildcard import [spark]
via GitHub
[PR] [SPARK-51837] Support `inputFiles` for `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51837] Support `inputFiles` for `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51837] Support `inputFiles` for `DataFrame` [spark-connect-swift]
via GitHub
Re: [PR] [WIP][SPARK-50585][PYTHON] Visualize doctest examples for PySpark plotting [spark]
via GitHub
[PR] [Do not Merge][SQL] Investigate default is null [spark]
via GitHub
Re: [PR] [SPARK-51842][SQL] Remove unnecessary judgement from fillDefaultValue [spark]
via GitHub
[PR] [SPARK-51834][SQL] Support end-to-end table constraint management [spark]
via GitHub
Re: [PR] [SPARK-51834][SQL] Support end-to-end table constraint management [spark]
via GitHub
Re: [PR] [SPARK-51834][SQL] Support end-to-end table constraint management [spark]
via GitHub
[PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
Re: [PR] [SPARK-51272][CORE] Aborting instead of re-submitting of partially completed indeterminate result stage [spark]
via GitHub
[PR] [MINOR][FOLLOW-UP] Update `flatMapGroupsWithState` test conf [spark]
via GitHub
Re: [PR] [SPARK-50967][SS][MINOR][FOLLOW-UP] Update `flatMapGroupsWithState` test conf [spark]
via GitHub
Re: [PR] [SPARK-50967][SS][MINOR][FOLLOW-UP] Update `flatMapGroupsWithState` test conf [spark]
via GitHub
Re: [PR] [SPARK-50967][SS][MINOR][FOLLOW-UP] Update `flatMapGroupsWithState` test conf [spark]
via GitHub
[PR] [SPARK-51758] Fix test case related to extra batch causing empty df due to watermark [spark]
via GitHub
Re: [PR] [SPARK-51758][SS] Fix test case related to extra batch causing empty df due to watermark [spark]
via GitHub
Re: [PR] [SPARK-51758][SS] Fix test case related to extra batch causing empty df due to watermark [spark]
via GitHub
Re: [PR] [SPARK-51758][SS] Fix test case related to extra batch causing empty df due to watermark [spark]
via GitHub
Re: [PR] [SPARK-51758][SS] Fix test case related to extra batch causing empty df due to watermark [spark]
via GitHub
[PR] Use the same property map for TableInfo [spark]
via GitHub
Re: [PR] [SPARK-51372][SQL][FOLLOW-UP] Retain the property map for DataSourceV2 TableInfo [spark]
via GitHub
Re: [PR] [SPARK-51372][SQL][FOLLOW-UP] Retain the property map for DataSourceV2 TableInfo [spark]
via GitHub
Re: [PR] [SPARK-51372][SQL][FOLLOW-UP] Retain the property map for DataSourceV2 TableInfo [spark]
via GitHub
Re: [PR] [SPARK-51372][SQL][FOLLOW-UP] Retain the property map for DataSourceV2 TableInfo [spark]
via GitHub
Re: [PR] [SPARK-51372][SQL][FOLLOW-UP] Retain the property map for DataSourceV2 TableInfo [spark]
via GitHub
[PR] SwiftyLab [spark-connect-swift]
via GitHub
Re: [PR] SwiftyLab [spark-connect-swift]
via GitHub
Re: [PR] SwiftyLab [spark-connect-swift]
via GitHub
[PR] Revert "[SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df" [spark]
via GitHub
Re: [PR] Revert "[SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df" [spark]
via GitHub
Re: [PR] Revert "[SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df" [spark]
via GitHub
Re: [PR] [SPARK-51223][CONNECT] Always use an ephemeral port for local connect [spark]
via GitHub
Re: [PR] [SPARK-51223][CONNECT] Always use an ephemeral port for local connect [spark]
via GitHub
[PR] FIX change type hint of pyspark percentile functions to reflect tuple of variable amount of floats [spark]
via GitHub
Re: [PR] [MINOR][PYTHON] FIX change type hint of pyspark percentile functions to reflect tuple of variable amount of floats [spark]
via GitHub
Re: [PR] [MINOR][PYTHON] FIX change type hint of pyspark percentile functions to reflect tuple of variable amount of floats [spark]
via GitHub
[PR] [WIP] HiveException: Unable to alter table. partition keys can not be changed. [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51840][SQL] Restore Partition columns in HiveExternalCatalog#alterTable [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix `TransformWithStateInPandasParityTests.test_transform_with_state_with_wmark_and_non_event_time` [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix `TransformWithStateInPandasParityTests.test_transform_with_state_with_wmark_and_non_event_time` [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix `TransformWithStateInPandasParityTests.test_transform_with_state_with_wmark_and_non_event_time` [spark]
via GitHub
[PR] [SPARK-51832][BUILD] Use -q option to simplify maven version evaluation [spark]
via GitHub
Re: [PR] [SPARK-51832][BUILD] Use -q option to simplify maven version evaluation [spark]
via GitHub
Re: [PR] [SPARK-51832][BUILD] Use -q option to simplify maven version evaluation [spark]
via GitHub
[PR] Windows [spark-connect-swift]
via GitHub
Re: [PR] Windows [spark-connect-swift]
via GitHub
Re: [PR] Windows [spark-connect-swift]
via GitHub
[PR] [SPARK-51758][FOLLOWUP][SS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][FOLLOWUP][SS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][FOLLOWUP][SS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][FOLLOWUP][SS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
Re: [PR] [SPARK-51758][SS][FOLLOWUP][TESTS] Fix flaky test around watermark due to additional batch causing empty df [spark]
via GitHub
[PR] [SPARK-51828] Update `README.md` with YuniKorn 1.6.2 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-51828] Update `README.md` with YuniKorn 1.6.2 [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-51828] Update `README.md` with YuniKorn 1.6.2 [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion [spark]
via GitHub
Re: [PR] [SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion [spark]
via GitHub
Re: [PR] [SPARK-51829][PYTHON][ML] Client side should update `client.thread_local.ml_caches` after deletion [spark]
via GitHub
Re: [PR] [SPARK-51757][SQL] Fix LEAD/LAG Function Offset Exceeds Window Group Size [spark]
via GitHub
Re: [PR] [SPARK-51757][SQL] Fix LEAD/LAG Function Offset Exceeds Window Group Size [spark]
via GitHub
[PR] [SPARK-51826][K8S][DOCS] Update `YuniKorn` docs with `1.6.2` [spark]
via GitHub
Re: [PR] [SPARK-51826][K8S][DOCS] Update `YuniKorn` docs with `1.6.2` [spark]
via GitHub
Re: [PR] [SPARK-51826][K8S][DOCS] Update `YuniKorn` docs with `1.6.2` [spark]
via GitHub
Re: [PR] [SPARK-51826][K8S][DOCS] Update `YuniKorn` docs with `1.6.2` [spark]
via GitHub
[PR] [SPARK-51800][INFRA][FOLLOW-UP] Respect PYSPARK_UDS_MODE environment variable in SparkContext initialization [spark]
via GitHub
Re: [PR] [SPARK-51800][INFRA][FOLLOW-UP] Respect PYSPARK_UDS_MODE environment variable in SparkContext initialization [spark]
via GitHub
Re: [PR] [SPARK-51800][INFRA][FOLLOW-UP] Respect PYSPARK_UDS_MODE environment variable in SparkContext initialization [spark]
via GitHub
[PR] [SPARK-51825] Add `SparkFileUtils` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51825] Add `SparkFileUtils` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-51825] Add `SparkFileUtils` [spark-connect-swift]
via GitHub
[PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51823][SS] Add config to not persist state store on executors [spark]
via GitHub
Re: [PR] [SPARK-51806][BUILD][4.0] Upgrade kryo-shaded to 4.0.3 [spark]
via GitHub
Re: [PR] [SPARK-51806][BUILD][4.0] Upgrade kryo-shaded to 4.0.3 [spark]
via GitHub
Re: [PR] [SPARK-51806][BUILD][4.0] Upgrade kryo-shaded to 4.0.3 [spark]
via GitHub
Re: [PR] [SPARK-51250][K8S] Add Support for K8s PriorityClass Configuration fo… [spark]
via GitHub
Re: [PR] [SPARK-51250][K8S] Add Support for K8s PriorityClass Configuration fo… [spark]
via GitHub
Re: [PR] [SPARK-51250][K8S] Add Support for K8s PriorityClass Configuration fo… [spark]
via GitHub
Re: [PR] [SPARK-51250][K8S] Add Support for K8s PriorityClass Configuration fo… [spark]
via GitHub
Re: [PR] [SPARK-51250][K8S] Add Support for K8s PriorityClass Configuration fo… [spark]
via GitHub
[PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
[PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
Re: [PR] [SPARK-51824][ML][CONNECT][TESTS] Force to clean up the ML cache after each test [spark]
via GitHub
[PR] [SPARK-51739][PYTHON][FOLLOW-UP] Set spark.sql.execution.arrow.pyspark.validateSchema.enabled for 3.5 connect client build [spark]
via GitHub
Re: [PR] [SPARK-51739][PYTHON][FOLLOW-UP] Set spark.sql.execution.arrow.pyspark.validateSchema.enabled for 3.5 connect client build [spark]
via GitHub
Re: [PR] [SPARK-51739][PYTHON][FOLLOW-UP] Set spark.sql.execution.arrow.pyspark.validateSchema.enabled for 3.5 connect client build [spark]
via GitHub
Re: [PR] [SPARK-51739][PYTHON][FOLLOW-UP] Set spark.sql.execution.arrow.pyspark.validateSchema.enabled for 3.5 connect client build [spark]
via GitHub
[PR] Bump golang.org/x/net from 0.36.0 to 0.38.0 [spark-connect-go]
via GitHub
[PR] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-51822][SS] Throwing classified error when disallowed functions are called during StatefulProcessor.init() [spark]
via GitHub
Re: [PR] [SPARK-50639][SQL] Improve warning logging in CacheManager [spark]
via GitHub
Re: [PR] [SPARK-51149][CORE] Log classpath in SparkSubmit on ClassNotFoundException [spark]
via GitHub
[PR] Error handling for partition datatype conversion call [spark]
via GitHub
Re: [PR] Error handling for partition datatype conversion call [spark]
via GitHub
Re: [PR] [SPARK-51830] Exception handling for partition datatype conversion call [spark]
via GitHub
[PR] Split resolve ddl and view [spark]
via GitHub
Re: [PR] Split resolve ddl and view [spark]
via GitHub
[PR] [SPARK-51819][PYTHON] Update pyspark-errors test module to include missing tests [spark]
via GitHub
Re: [PR] [SPARK-51819][PYTHON] Update pyspark-errors test module to include missing tests [spark]
via GitHub
Re: [PR] [SPARK-51819][PYTHON] Update pyspark-errors test module to include missing tests [spark]
via GitHub
Re: [PR] [WIP][SPARK-51554][SQL] Add the time_trunc() function for TIME datatype [spark]
via GitHub