[SPARK-45652][SQL] SPJ: Handle empty input partitions after dynamic filtering

### What changes were proposed in this pull request?

Handle the case where input partitions become empty after V2 dynamic filtering, when SPJ (storage-partitioned join) is enabled.

### Why are the changes needed?

Currently, when all input partitions are filtered out via dynamic filtering, SPJ doesn't work but instead panics:

```
java.util.NoSuchElementException: None.get
  at scala.None$.get(Option.scala:529)
  at scala.None$.get(Option.scala:527)
  at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.filteredPartitions$lzycompute(BatchScanExec.scala:108)
  at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.filteredPartitions(BatchScanExec.scala:65)
  at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.inputRDD$lzycompute(BatchScanExec.scala:136)
  at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.inputRDD(BatchScanExec.scala:135)
  at org.apache.spark.sql.boson.BosonBatchScanExec.inputRDD$lzycompute(BosonBatchScanExec.scala:28)
```

This is because the `groupPartitions` method returns `None` in this scenario. We should handle that case.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Added a test case for this.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes apache#43531 from sunchao/SPARK-45652.

Authored-by: Chao Sun <[email protected]>
Signed-off-by: Chao Sun <[email protected]>
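For context, here is a minimal, self-contained Scala sketch of the failure mode. It is not the actual Spark code: `Partition` and the simplified `groupPartitions` below are stand-ins invented for illustration. It shows why calling `.get` on the grouping result panics when dynamic filtering leaves no partitions, and one defensive way to handle the empty case:

```scala
// Sketch only: stand-in types, not Spark's InputPartition or BatchScanExec.
object SpjEmptyPartitionSketch {

  // Simplified stand-in for an input partition.
  case class Partition(key: Int)

  // Simplified stand-in for `groupPartitions`: returns None when there is
  // nothing to group, mirroring the behavior described in the commit message.
  def groupPartitions(parts: Seq[Partition]): Option[Seq[Seq[Partition]]] =
    if (parts.isEmpty) None
    else Some(parts.groupBy(_.key).values.toSeq)

  def main(args: Array[String]): Unit = {
    // After dynamic filtering, every partition has been pruned away.
    val afterDynamicFiltering: Seq[Partition] = Seq.empty

    // Buggy pattern: assumes grouping always succeeds.
    // groupPartitions(afterDynamicFiltering).get  // throws NoSuchElementException: None.get

    // Defensive pattern: treat "no partitions after filtering" as an empty
    // grouped result rather than an error.
    val filteredPartitions: Seq[Seq[Partition]] =
      groupPartitions(afterDynamicFiltering).getOrElse(Seq.empty)

    println(s"grouped partition count = ${filteredPartitions.size}")  // 0, no panic
  }
}
```

The defensive pattern simply maps the empty-input case to an empty set of grouped partitions, which is the behavior the fix aims for instead of the `None.get` panic shown in the stack trace above.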