public class SparkTable extends java.lang.Object implements org.apache.spark.sql.connector.catalog.Table, org.apache.spark.sql.connector.catalog.SupportsRead, org.apache.spark.sql.connector.catalog.SupportsWrite, ExtendedSupportsDelete, SupportsMerge
Constructor and Description |
---|
SparkTable(Table icebergTable, boolean refreshEagerly) |
SparkTable(Table icebergTable, org.apache.spark.sql.types.StructType requestedSchema, boolean refreshEagerly) |
Modifier and Type | Method and Description |
---|---|
boolean | canDeleteWhere(org.apache.spark.sql.sources.Filter[] filters) Checks if it is possible to delete data from a data source table that matches filter expressions. |
java.util.Set<org.apache.spark.sql.connector.catalog.TableCapability> | capabilities() |
void | deleteWhere(org.apache.spark.sql.sources.Filter[] filters) |
boolean | equals(java.lang.Object other) |
int | hashCode() |
java.lang.String | name() |
MergeBuilder | newMergeBuilder(java.lang.String operation, org.apache.spark.sql.connector.write.LogicalWriteInfo info) Returns a MergeBuilder which can be used to create both a scan and a write for a row-level operation. |
org.apache.spark.sql.connector.read.ScanBuilder | newScanBuilder(org.apache.spark.sql.util.CaseInsensitiveStringMap options) |
org.apache.spark.sql.connector.write.WriteBuilder | newWriteBuilder(org.apache.spark.sql.connector.write.LogicalWriteInfo info) |
org.apache.spark.sql.connector.expressions.Transform[] | partitioning() |
java.util.Map<java.lang.String,java.lang.String> | properties() |
org.apache.spark.sql.types.StructType | schema() |
Table | table() |
java.lang.String | toString() |
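The methods above follow Spark's DataSource V2 table contract: before planning an operation, Spark consults capabilities(), and only then asks the table for a ScanBuilder or WriteBuilder. The following is a minimal, hypothetical sketch of that check; MiniTable, Capability, and canBatchRead are illustrative stand-ins, not the real Spark interfaces:

```java
import java.util.EnumSet;
import java.util.Set;

public class TableApiSketch {
    // Stand-in for org.apache.spark.sql.connector.catalog.TableCapability.
    enum Capability { BATCH_READ, BATCH_WRITE }

    // Stand-in for the Table contract that SparkTable implements.
    interface MiniTable {
        String name();
        Set<Capability> capabilities();
    }

    // Spark-style planner check: a batch read is only planned when the
    // table advertises the matching capability.
    static boolean canBatchRead(MiniTable table) {
        return table.capabilities().contains(Capability.BATCH_READ);
    }

    public static void main(String[] args) {
        MiniTable table = new MiniTable() {
            public String name() { return "db.events"; }
            public Set<Capability> capabilities() {
                return EnumSet.of(Capability.BATCH_READ, Capability.BATCH_WRITE);
            }
        };
        System.out.println(canBatchRead(table)); // true
    }
}
```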
public SparkTable(Table icebergTable, boolean refreshEagerly)
public SparkTable(Table icebergTable, org.apache.spark.sql.types.StructType requestedSchema, boolean refreshEagerly)
public Table table()
public java.lang.String name()
Specified by: name in interface org.apache.spark.sql.connector.catalog.Table

public org.apache.spark.sql.types.StructType schema()
Specified by: schema in interface org.apache.spark.sql.connector.catalog.Table

public org.apache.spark.sql.connector.expressions.Transform[] partitioning()
Specified by: partitioning in interface org.apache.spark.sql.connector.catalog.Table

public java.util.Map<java.lang.String,java.lang.String> properties()
Specified by: properties in interface org.apache.spark.sql.connector.catalog.Table

public java.util.Set<org.apache.spark.sql.connector.catalog.TableCapability> capabilities()
Specified by: capabilities in interface org.apache.spark.sql.connector.catalog.Table

public org.apache.spark.sql.connector.read.ScanBuilder newScanBuilder(org.apache.spark.sql.util.CaseInsensitiveStringMap options)
Specified by: newScanBuilder in interface org.apache.spark.sql.connector.catalog.SupportsRead

public org.apache.spark.sql.connector.write.WriteBuilder newWriteBuilder(org.apache.spark.sql.connector.write.LogicalWriteInfo info)
Specified by: newWriteBuilder in interface org.apache.spark.sql.connector.catalog.SupportsWrite
public MergeBuilder newMergeBuilder(java.lang.String operation, org.apache.spark.sql.connector.write.LogicalWriteInfo info)
Description copied from interface: SupportsMerge
Returns a MergeBuilder which can be used to create both a scan and a write for a row-level operation. Spark will call this method to configure each data source row-level operation.
Specified by: newMergeBuilder in interface SupportsMerge
Parameters: info - write info

public boolean canDeleteWhere(org.apache.spark.sql.sources.Filter[] filters)
Description copied from interface: ExtendedSupportsDelete
Checks if it is possible to delete data from a data source table that matches filter expressions. Rows should be deleted from the data source iff all of the filter expressions match. That is, the expressions must be interpreted as a set of filters that are ANDed together.
Spark will call this method to check if the delete is possible without significant effort. Otherwise, Spark will try to rewrite the delete operation if the data source table supports row-level operations.
Specified by: canDeleteWhere in interface ExtendedSupportsDelete
Parameters: filters - filter expressions, used to select rows to delete when all expressions match

public void deleteWhere(org.apache.spark.sql.sources.Filter[] filters)
Specified by: deleteWhere in interface org.apache.spark.sql.connector.catalog.SupportsDelete
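The ANDed-filter contract of deleteWhere can be illustrated with a small, self-contained sketch. Nothing below is Spark or Iceberg code; RowFilter and deleteWhere are hypothetical stand-ins that model a Filter as a per-row predicate and a row as a column-name to value map:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class DeleteWhereSketch {
    // Hypothetical stand-in for org.apache.spark.sql.sources.Filter:
    // each filter accepts or rejects a single row.
    interface RowFilter extends Predicate<Map<String, Object>> {}

    // A row is deleted iff ALL filters match it: the filters are ANDed
    // together, per the deleteWhere contract. Returns the surviving rows.
    static List<Map<String, Object>> deleteWhere(List<Map<String, Object>> rows,
                                                 List<RowFilter> filters) {
        List<Map<String, Object>> kept = new ArrayList<>();
        for (Map<String, Object> row : rows) {
            boolean allMatch = filters.stream().allMatch(f -> f.test(row));
            if (!allMatch) {
                kept.add(row); // a row survives unless every filter matches it
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> rows = List.of(
            Map.of("category", "a", "id", 1),
            Map.of("category", "a", "id", 2),
            Map.of("category", "b", "id", 1));
        // Equivalent of: DELETE FROM t WHERE category = 'a' AND id = 1
        List<RowFilter> filters = List.of(
            row -> "a".equals(row.get("category")),
            row -> Integer.valueOf(1).equals(row.get("id")));
        System.out.println(deleteWhere(rows, filters).size()); // 2: only one row matched both filters
    }
}
```

Note that a row matching only one of the two filters is kept, mirroring why Spark must first call canDeleteWhere: a data source may only accept the delete when it can honor the full conjunction.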
public java.lang.String toString()
Overrides: toString in class java.lang.Object

public boolean equals(java.lang.Object other)
Overrides: equals in class java.lang.Object

public int hashCode()
Overrides: hashCode in class java.lang.Object