Package org.apache.iceberg.spark.source
Class SparkTable
java.lang.Object
org.apache.iceberg.spark.source.SparkTable
- All Implemented Interfaces:
- org.apache.spark.sql.connector.catalog.SupportsDeleteV2, org.apache.spark.sql.connector.catalog.SupportsMetadataColumns, org.apache.spark.sql.connector.catalog.SupportsRead, org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations, org.apache.spark.sql.connector.catalog.SupportsWrite, org.apache.spark.sql.connector.catalog.Table, org.apache.spark.sql.connector.catalog.TruncatableTable
- Direct Known Subclasses:
- StagedSparkTable
public class SparkTable
extends Object
implements org.apache.spark.sql.connector.catalog.Table, org.apache.spark.sql.connector.catalog.SupportsRead, org.apache.spark.sql.connector.catalog.SupportsWrite, org.apache.spark.sql.connector.catalog.SupportsDeleteV2, org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations, org.apache.spark.sql.connector.catalog.SupportsMetadataColumns
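SparkTable is the Spark DataSourceV2 representation of an Iceberg table. It is normally obtained through a configured Iceberg catalog rather than constructed by hand. A minimal sketch, assuming a local Hadoop-type warehouse; the catalog name `demo`, the warehouse path, and the table name are illustrative:

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.connector.catalog.Identifier;
import org.apache.spark.sql.connector.catalog.Table;
import org.apache.spark.sql.connector.catalog.TableCatalog;

public class LoadSparkTableExample {
  public static void main(String[] args) throws Exception {
    SparkSession spark = SparkSession.builder()
        .master("local[1]")
        // "demo" catalog name and warehouse path are illustrative
        .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.demo.type", "hadoop")
        .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
        .getOrCreate();

    spark.sql("CREATE TABLE IF NOT EXISTS demo.db.events (id BIGINT, data STRING) USING iceberg");

    // For Iceberg tables, SparkCatalog hands back a SparkTable
    TableCatalog catalog =
        (TableCatalog) spark.sessionState().catalogManager().catalog("demo");
    Table table = catalog.loadTable(Identifier.of(new String[] {"db"}, "events"));
    System.out.println(table.name());
    spark.stop();
  }
}
```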
- 
Constructor Summary
Constructors
- SparkTable(Table icebergTable, boolean refreshEagerly)
- SparkTable(Table icebergTable, Long snapshotId, boolean refreshEagerly)
- SparkTable(Table icebergTable, Long snapshotId, boolean refreshEagerly, boolean isTableRewrite)
- SparkTable(Table icebergTable, String branch, boolean refreshEagerly)
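The constructors wrap an existing org.apache.iceberg.Table: `refreshEagerly` controls whether Iceberg metadata is refreshed before each scan, while the `Long snapshotId` and `String branch` variants pin the wrapper to a specific version. A hedged sketch using a HadoopTables warehouse (the path is illustrative):

```java
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Table;
import org.apache.iceberg.hadoop.HadoopTables;
import org.apache.iceberg.spark.source.SparkTable;
import org.apache.iceberg.types.Types;

public class SparkTableConstructorsExample {
  public static void main(String[] args) {
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));
    Table icebergTable = new HadoopTables().create(
        schema, PartitionSpec.unpartitioned(), "/tmp/iceberg-warehouse/db/ctor_demo");

    // Track the live table; refresh metadata eagerly before scans
    SparkTable live = new SparkTable(icebergTable, true);

    // Pin to a snapshot (null means "current"); useful for time travel
    Long snapshotId = icebergTable.currentSnapshot() == null
        ? null
        : icebergTable.currentSnapshot().snapshotId();
    SparkTable pinned = new SparkTable(icebergTable, snapshotId, false);

    System.out.println(live.name());
  }
}
```

The `String branch` constructor pins reads to a named ref in the same way.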
- 
Method Summary
Modifier and Type / Method
- org.apache.iceberg.Table table()
- String name()
- Long snapshotId()
- String branch()
- SparkTable copyWithSnapshotId(long newSnapshotId)
- SparkTable copyWithBranch(String targetBranch)
- org.apache.spark.sql.types.StructType schema()
- org.apache.spark.sql.connector.expressions.Transform[] partitioning()
- Map<String,String> properties()
- Set<org.apache.spark.sql.connector.catalog.TableCapability> capabilities()
- org.apache.spark.sql.connector.catalog.MetadataColumn[] metadataColumns()
- org.apache.spark.sql.connector.read.ScanBuilder newScanBuilder(org.apache.spark.sql.util.CaseInsensitiveStringMap options)
- org.apache.spark.sql.connector.write.WriteBuilder newWriteBuilder(org.apache.spark.sql.connector.write.LogicalWriteInfo info)
- org.apache.spark.sql.connector.write.RowLevelOperationBuilder newRowLevelOperationBuilder(org.apache.spark.sql.connector.write.RowLevelOperationInfo info)
- boolean canDeleteWhere(org.apache.spark.sql.connector.expressions.filter.Predicate[] predicates)
- void deleteWhere(org.apache.spark.sql.connector.expressions.filter.Predicate[] predicates)
- String toString()
- boolean equals(Object other)
- int hashCode()

Methods inherited from class java.lang.Object: clone, finalize, getClass, notify, notifyAll, wait, wait, wait
Methods inherited from interface org.apache.spark.sql.connector.catalog.SupportsDeleteV2: truncateTable
Methods inherited from interface org.apache.spark.sql.connector.catalog.SupportsMetadataColumns: canRenameConflictingMetadataColumns
Methods inherited from interface org.apache.spark.sql.connector.catalog.Table: columns
- 
Constructor Details
- SparkTable
public SparkTable(Table icebergTable, boolean refreshEagerly)
- SparkTable
public SparkTable(Table icebergTable, Long snapshotId, boolean refreshEagerly)
- SparkTable
public SparkTable(Table icebergTable, Long snapshotId, boolean refreshEagerly, boolean isTableRewrite)
- SparkTable
public SparkTable(Table icebergTable, String branch, boolean refreshEagerly)
Method Details
- table
public org.apache.iceberg.Table table()
- 
name
public String name()
- Specified by:
- name in interface org.apache.spark.sql.connector.catalog.Table
- 
snapshotId
public Long snapshotId()
- 
branch
public String branch()
- 
copyWithSnapshotId
public SparkTable copyWithSnapshotId(long newSnapshotId)
- 
copyWithBranch
public SparkTable copyWithBranch(String targetBranch)
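copyWithSnapshotId and copyWithBranch return new wrappers over the same underlying Iceberg table, pinned to a snapshot or a branch, leaving the receiver unchanged. A hypothetical fragment, assuming `sparkTable` is an already-loaded SparkTable whose table has committed snapshots and an `audit` branch (both names are illustrative):

```java
// Hypothetical: `sparkTable` is an existing SparkTable with snapshots.
long currentId = sparkTable.table().currentSnapshot().snapshotId();

// Time travel: pin a copy to a fixed snapshot
SparkTable atSnapshot = sparkTable.copyWithSnapshotId(currentId);

// Read from a named branch ("audit" is illustrative)
SparkTable onBranch = sparkTable.copyWithBranch("audit");

// The receiver is unchanged; the copies carry the pinned version
System.out.println(atSnapshot.snapshotId());
System.out.println(onBranch.branch());
```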
- 
schema
public org.apache.spark.sql.types.StructType schema()
- Specified by:
- schema in interface org.apache.spark.sql.connector.catalog.Table
 
- 
partitioning
public org.apache.spark.sql.connector.expressions.Transform[] partitioning()
- Specified by:
- partitioning in interface org.apache.spark.sql.connector.catalog.Table
 
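schema() and partitioning() expose the Iceberg schema and partition spec in Spark's terms: the schema as a StructType and the spec as DSv2 Transform expressions. A sketch that should not require an active SparkSession, since both are type conversions (the warehouse path and column names are illustrative):

```java
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Table;
import org.apache.iceberg.hadoop.HadoopTables;
import org.apache.iceberg.spark.source.SparkTable;
import org.apache.iceberg.types.Types;
import org.apache.spark.sql.connector.expressions.Transform;

public class SchemaPartitioningExample {
  public static void main(String[] args) {
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "ts", Types.TimestampType.withZone()));
    PartitionSpec spec = PartitionSpec.builderFor(schema).day("ts").build();
    Table iceberg = new HadoopTables().create(
        schema, spec, "/tmp/iceberg-warehouse/db/part_demo");

    SparkTable table = new SparkTable(iceberg, false);

    // schema(): Iceberg schema converted to a Spark StructType
    System.out.println(table.schema().treeString());

    // partitioning(): Iceberg partition transforms as Spark Transform expressions
    for (Transform t : table.partitioning()) {
      System.out.println(t.describe());  // e.g. a day transform on ts
    }
  }
}
```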
- 
properties
public Map<String,String> properties()
- Specified by:
- properties in interface org.apache.spark.sql.connector.catalog.Table
 
- 
capabilities
public Set<org.apache.spark.sql.connector.catalog.TableCapability> capabilities()
- Specified by:
- capabilities in interface org.apache.spark.sql.connector.catalog.Table
 
- 
metadataColumns
public org.apache.spark.sql.connector.catalog.MetadataColumn[] metadataColumns()
- Specified by:
- metadataColumns in interface org.apache.spark.sql.connector.catalog.SupportsMetadataColumns
 
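metadataColumns() is what lets Spark resolve Iceberg's metadata columns, such as _file and _pos, in queries against the table. A sketch selecting them through SQL, assuming a configured Iceberg catalog named `demo` (catalog name, warehouse path, and table name are illustrative):

```java
import org.apache.spark.sql.SparkSession;

public class MetadataColumnsExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .master("local[1]")
        .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.demo.type", "hadoop")
        .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
        .getOrCreate();

    spark.sql("CREATE TABLE IF NOT EXISTS demo.db.m (id BIGINT) USING iceberg");
    spark.sql("INSERT INTO demo.db.m VALUES (1), (2)");

    // _file and _pos are Iceberg metadata columns surfaced via metadataColumns()
    spark.sql("SELECT id, _file, _pos FROM demo.db.m").show(false);
    spark.stop();
  }
}
```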
- 
newScanBuilder
public org.apache.spark.sql.connector.read.ScanBuilder newScanBuilder(org.apache.spark.sql.util.CaseInsensitiveStringMap options)
- Specified by:
- newScanBuilder in interface org.apache.spark.sql.connector.catalog.SupportsRead
 
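newScanBuilder is normally invoked by Spark during read planning, with the options map carrying per-read settings. A hypothetical manual invocation, assuming `sparkTable` is an already-loaded SparkTable; `split-size` is a real Iceberg read option, but the value here is illustrative:

```java
import org.apache.spark.sql.connector.read.Scan;
import org.apache.spark.sql.connector.read.ScanBuilder;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

// Hypothetical: `sparkTable` is an existing SparkTable.
CaseInsensitiveStringMap options = new CaseInsensitiveStringMap(
    java.util.Map.of("split-size", "134217728"));  // 128 MiB target splits

ScanBuilder builder = sparkTable.newScanBuilder(options);
Scan scan = builder.build();
System.out.println(scan.description());
```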
- 
newWriteBuilder
public org.apache.spark.sql.connector.write.WriteBuilder newWriteBuilder(org.apache.spark.sql.connector.write.LogicalWriteInfo info)
- Specified by:
- newWriteBuilder in interface org.apache.spark.sql.connector.catalog.SupportsWrite
 
- 
newRowLevelOperationBuilder
public org.apache.spark.sql.connector.write.RowLevelOperationBuilder newRowLevelOperationBuilder(org.apache.spark.sql.connector.write.RowLevelOperationInfo info)
- Specified by:
- newRowLevelOperationBuilder in interface org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations
 
- 
canDeleteWhere
public boolean canDeleteWhere(org.apache.spark.sql.connector.expressions.filter.Predicate[] predicates)
- Specified by:
- canDeleteWhere in interface org.apache.spark.sql.connector.catalog.SupportsDeleteV2
 
- 
deleteWhere
public void deleteWhere(org.apache.spark.sql.connector.expressions.filter.Predicate[] predicates)
- Specified by:
- deleteWhere in interface org.apache.spark.sql.connector.catalog.SupportsDeleteV2
 
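canDeleteWhere reports whether the predicates can be satisfied at the metadata level (only whole files match); if so, Spark calls deleteWhere, and otherwise it falls back to a row-level operation via newRowLevelOperationBuilder. Both paths are exercised by ordinary SQL; a sketch assuming an Iceberg catalog named `demo` (catalog name, warehouse path, and table name are illustrative):

```java
import org.apache.spark.sql.SparkSession;

public class DeleteWhereExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .master("local[1]")
        .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.demo.type", "hadoop")
        .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
        .getOrCreate();

    spark.sql("CREATE TABLE IF NOT EXISTS demo.db.events (id BIGINT, data STRING) USING iceberg");
    spark.sql("INSERT INTO demo.db.events VALUES (1, 'a'), (2, 'b')");

    // Spark first asks canDeleteWhere; a predicate that can be decided per
    // data file is applied as a metadata-only delete via deleteWhere,
    // otherwise rows are rewritten through a row-level operation.
    spark.sql("DELETE FROM demo.db.events WHERE id < 0");
    spark.sql("SELECT COUNT(*) FROM demo.db.events").show();
    spark.stop();
  }
}
```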
- 
toString
public String toString()
- 
equals
public boolean equals(Object other)
- 
hashCode
public int hashCode()
 