Add DeltaLake support without Deletion Vectors for Databricks 14.3 [databricks] #12048
Open
razajafri wants to merge 32 commits into NVIDIA:branch-25.02 from razajafri:SP-10661-db-14.3-delta-lake-support
Commits (32, all by razajafri)
36aab4b  Added delta-lake support for Databricks 14.3
ec3b55d  xfailed delta_lake_delete_test.py due to lacking deletion vector writ…
9cd136c  skip low_shuffle_merge_test for any databricks version besides 13.3
4b23ccd  added Execs to run on CPU for auto_compact tests
fe85f65  xfailed delta_lake_merge_test.py
fcce0a8  xfailed delta_lake_test.py
e5b2ff0  xfailed delta_lake_update_test.py
e482937  reverted change
4d98a62  xfailed and fixed some failing tests in delta_lake_write_test.py
f14ff42  xfailed delta_zorder_test.py
9c908f7  Signing off
6cfce00  updated copyrights and fixed line length
c6f9199  Removed multiple versions of DatabricksDeltaProviderBase
1c77a22  reverted test.sh changes
7360dd3  updated copyrights
5b3aaf2  removed fastparquet.txt
c41d731  Added copyrights from Delta-io project
013a794  Modified the logic for turning deletion vectors on/off to use the
bf90960  renamed parameter names
df583cb  Added DeltaInvariantCheckerExec to fallback
70edea2  addressed review comments
5b3acd9  Update copyrights on DeltaProbe.scala
2223cd8  Addressed review comments
f77ef04  Improved formatting
f7390e3  Merge remote-tracking branch 'origin/branch-25.02' into HEAD
88457fc  Removed DeltaInvariant from the allow_non_gpu as it's included from t…
3ef1f17  Disabling deletion vector using the conf as setting the tblproperties…
6f47d0e  hard coding disabling deletion vectors as it doesn't matter in low_sh…
1a8b0f9  added fallback tests
6e2175f  Fixed unclosed paranthesis
c235956  Add dv tblproperties only for specific versions of delta-lake
4a97097  fixed syntax error
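Several of the commits above turn deletion vectors off for the tests, either through a session conf or through table properties. The sketch below illustrates, in general terms, the two places a Delta deletion-vector setting can be applied. It is a minimal sketch under assumptions: the table name and the default-property conf key are illustrative and are not taken from this PR's diff.

// A minimal sketch, assuming a Spark session with Delta Lake available.
// "example_table" and the default-property conf key are illustrative assumptions.
import org.apache.spark.sql.SparkSession

object DisableDeletionVectorsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("disable-dv-sketch").getOrCreate()

    // Per table: flip the Delta table property that controls deletion vectors.
    spark.sql(
      "ALTER TABLE example_table SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'false')")

    // Session-wide default for newly created tables, via Delta's
    // default-table-property conf prefix (assumed here, not taken from the PR).
    spark.conf.set("spark.databricks.delta.properties.defaults.enableDeletionVectors", "false")

    spark.stop()
  }
}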
File renamed without changes.
File renamed without changes.
61 changes: 61 additions & 0 deletions
...332db/src/main/scala/com/databricks/sql/transaction/tahoe/rapids/GpuDeltaDataSource.scala
Review comment (marked resolved by mythrocks): This file is the same in every version of Databricks except for Databricks 14.3.

/*
 * Copyright (c) 2022-2025, NVIDIA CORPORATION.
 *
 * This file was derived from DeltaDataSource.scala in the
 * Delta Lake project at https://github.com/delta-io/delta.
 *
 * Copyright (2021) The Delta Lake Project Authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.nvidia.spark.rapids.delta

import com.databricks.sql.transaction.tahoe.{DeltaConfigs, DeltaErrors, DeltaOptions}
import com.databricks.sql.transaction.tahoe.commands.WriteIntoDelta
import com.databricks.sql.transaction.tahoe.rapids.{GpuDeltaLog, GpuWriteIntoDelta}
import com.databricks.sql.transaction.tahoe.sources.{DeltaDataSource, DeltaSourceUtils}
import com.nvidia.spark.rapids.{GpuCreatableRelationProvider, RapidsConf}

import org.apache.spark.sql.{DataFrame, SaveMode, SQLContext}
import org.apache.spark.sql.sources.BaseRelation

/** GPU version of DeltaDataSource from Delta Lake. */
class GpuDeltaDataSource(rapidsConf: RapidsConf) extends GpuCreatableRelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      mode: SaveMode,
      parameters: Map[String, String],
      data: DataFrame): BaseRelation = {
    val path = parameters.getOrElse("path", {
      throw DeltaErrors.pathNotSpecifiedException
    })
    val partitionColumns = parameters.get(DeltaSourceUtils.PARTITIONING_COLUMNS_KEY)
      .map(DeltaDataSource.decodePartitioningColumns)
      .getOrElse(Nil)

    val gpuDeltaLog = GpuDeltaLog.forTable(sqlContext.sparkSession, path, parameters, rapidsConf)
    GpuWriteIntoDelta(
      gpuDeltaLog,
      WriteIntoDelta(
        deltaLog = gpuDeltaLog.deltaLog,
        mode = mode,
        new DeltaOptions(parameters, sqlContext.sparkSession.sessionState.conf),
        partitionColumns = partitionColumns,
        configuration = DeltaConfigs.validateConfigurations(
          parameters.filterKeys(_.startsWith("delta.")).toMap),
        data = data)).run(sqlContext.sparkSession)

    gpuDeltaLog.deltaLog.createRelation()
  }
}
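For context, a plain Delta DataFrame write is roughly what ends up routed through createRelation above once the RAPIDS Accelerator swaps in its GPU relation provider. The sketch below is a minimal, hypothetical driver program; the output path and column names are illustrative and not part of this PR.

// A minimal sketch of the write path this provider serves, assuming a cluster where
// the RAPIDS Accelerator and Delta Lake are both on the classpath. Path and columns
// are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object DeltaWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("delta-write-sketch").getOrCreate()

    // A small partitioned dataset; partitionBy surfaces as the partition-columns
    // option that createRelation decodes via decodePartitioningColumns.
    val df = spark.range(0, 1000).toDF("id").withColumn("part", col("id") % 10)

    df.write
      .format("delta")
      .mode("append")
      .partitionBy("part")
      .save("/tmp/delta/example_table") // hypothetical path

    spark.stop()
  }
}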
73 changes: 73 additions & 0 deletions
...k332db/src/main/scala/com/databricks/sql/transaction/tahoe/rapids/GpuWriteIntoDelta.scala
Review comment: This file is the same in every version of Databricks except for 14.3.

/*
 * Copyright (c) 2022-2025, NVIDIA CORPORATION.
 *
 * This file was derived from WriteIntoDelta.scala
 * in the Delta Lake project at https://github.com/delta-io/delta.
 *
 * Copyright (2021) The Delta Lake Project Authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.databricks.sql.transaction.tahoe.rapids

import com.databricks.sql.transaction.tahoe.{DeltaOperations, OptimisticTransaction}
import com.databricks.sql.transaction.tahoe.commands.WriteIntoDelta

import org.apache.spark.sql._
import org.apache.spark.sql.execution.command.LeafRunnableCommand

/** GPU version of Delta Lake's WriteIntoDelta. */
case class GpuWriteIntoDelta(
    gpuDeltaLog: GpuDeltaLog,
    cpuWrite: WriteIntoDelta)
  extends LeafRunnableCommand {

  override def run(sparkSession: SparkSession): Seq[Row] = {
    gpuDeltaLog.withNewTransaction { txn =>
      // If this batch has already been executed within this query, then return.
      val skipExecution = hasBeenExecuted(txn)
      if (skipExecution) {
        return Seq.empty
      }

      val actions = cpuWrite.write(txn, sparkSession)
      val operation = DeltaOperations.Write(
        cpuWrite.mode,
        Option(cpuWrite.partitionColumns),
        cpuWrite.options.replaceWhere,
        cpuWrite.options.userMetadata)
      txn.commit(actions, operation)
    }
    Seq.empty
  }

  /**
   * Returns true if there is information in the spark session that indicates that this write,
   * which is part of a streaming query and a batch, has already been successfully written.
   */
  private def hasBeenExecuted(txn: OptimisticTransaction): Boolean = {
    val txnVersion = cpuWrite.options.txnVersion
    val txnAppId = cpuWrite.options.txnAppId
    for (v <- txnVersion; a <- txnAppId) {
      val currentVersion = txn.txnVersion(a)
      if (currentVersion >= v) {
        logInfo(s"Transaction write of version $v for application id $a " +
          s"has already been committed in Delta table id ${txn.deltaLog.tableId}. " +
          s"Skipping this write.")
        return true
      }
    }
    false
  }
}
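The hasBeenExecuted check above implements Delta's idempotent-write contract: a writer tags each batch with an application id and a monotonically increasing version, and the commit is skipped when that version has already been recorded for the table. The sketch below shows a writer setting those options; it assumes the standard Delta writer option names, and the path and application id are hypothetical.

// A minimal sketch of the idempotent-write options that hasBeenExecuted inspects,
// assuming standard Delta Lake writer options; path and app id are hypothetical.
import org.apache.spark.sql.SparkSession

object IdempotentDeltaWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("idempotent-write-sketch").getOrCreate()
    val df = spark.range(0, 100).toDF("id")

    // The first run commits; a re-run with the same (txnAppId, txnVersion) pair is
    // skipped because the table already records a version >= 1 for this app id.
    df.write
      .format("delta")
      .mode("append")
      .option("txnAppId", "nightly-batch-job") // hypothetical application id
      .option("txnVersion", 1L)
      .save("/tmp/delta/idempotent_table")     // hypothetical path

    spark.stop()
  }
}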
Review comment: This method is overridden in Databricks 14.3's version of GpuDeltaCatalog.