Package org.apache.iceberg.spark
Class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces>
- java.lang.Object
  - org.apache.iceberg.spark.SparkSessionCatalog<T>
- Type Parameters:
T - CatalogPlugin class to avoid casting to TableCatalog, FunctionCatalog and SupportsNamespaces.
- All Implemented Interfaces:
HasIcebergCatalog, org.apache.spark.sql.connector.catalog.CatalogExtension, org.apache.spark.sql.connector.catalog.CatalogPlugin, org.apache.spark.sql.connector.catalog.FunctionCatalog, org.apache.spark.sql.connector.catalog.StagingTableCatalog, org.apache.spark.sql.connector.catalog.SupportsNamespaces, org.apache.spark.sql.connector.catalog.TableCatalog, ProcedureCatalog
public class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces>
extends java.lang.Object
implements org.apache.spark.sql.connector.catalog.CatalogExtension

A Spark catalog that can also load non-Iceberg tables.
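This class is typically registered as a replacement for Spark's built-in session catalog, so that Iceberg tables and pre-existing non-Iceberg tables resolve through the same spark_catalog name. A minimal sketch of that configuration follows; the hive catalog type and the application name are illustrative assumptions, not requirements of the API.

```java
import org.apache.spark.sql.SparkSession;

public class SessionCatalogSetup {
  public static void main(String[] args) {
    // Sketch: route the built-in "spark_catalog" through SparkSessionCatalog so that
    // both Iceberg and non-Iceberg tables are served by the same catalog name.
    // The catalog type ("hive") below is an illustrative assumption.
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-session-catalog")
        .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.spark_catalog",
            "org.apache.iceberg.spark.SparkSessionCatalog")
        .config("spark.sql.catalog.spark_catalog.type", "hive")
        .getOrCreate();

    // Both Iceberg and pre-existing Hive tables are now visible through spark_catalog.
    spark.sql("SHOW TABLES").show();
  }
}
```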
-
Constructor Summary
- SparkSessionCatalog()
-
Method Summary
- void alterNamespace(String[] namespace, NamespaceChange... changes)
- Table alterTable(Identifier ident, TableChange... changes)
- protected TableCatalog buildSparkCatalog(String name, CaseInsensitiveStringMap options): Build a SparkCatalog to be used for Iceberg operations.
- void createNamespace(String[] namespace, Map<String,String> metadata)
- Table createTable(Identifier ident, StructType schema, Transform[] partitions, Map<String,String> properties)
- String[] defaultNamespace()
- boolean dropNamespace(String[] namespace, boolean cascade)
- boolean dropTable(Identifier ident)
- Catalog icebergCatalog(): Returns the underlying Catalog backing this Spark Catalog.
- void initialize(String name, CaseInsensitiveStringMap options)
- void invalidateTable(Identifier ident)
- boolean isExistingNamespace(String[] namespace)
- boolean isFunctionNamespace(String[] namespace)
- default Identifier[] listFunctions(String[] namespace)
- String[][] listNamespaces()
- String[][] listNamespaces(String[] namespace)
- Identifier[] listTables(String[] namespace)
- UnboundFunction loadFunction(Identifier ident)
- Map<String,String> loadNamespaceMetadata(String[] namespace)
- Procedure loadProcedure(Identifier ident): Load a stored procedure by identifier.
- Table loadTable(Identifier ident)
- Table loadTable(Identifier ident, long timestamp)
- Table loadTable(Identifier ident, String version)
- String name()
- boolean namespaceExists(String[] namespace)
- boolean purgeTable(Identifier ident)
- void renameTable(Identifier from, Identifier to)
- void setDelegateCatalog(CatalogPlugin sparkSessionCatalog)
- StagedTable stageCreate(Identifier ident, StructType schema, Transform[] partitions, Map<String,String> properties)
- StagedTable stageCreateOrReplace(Identifier ident, StructType schema, Transform[] partitions, Map<String,String> properties)
- StagedTable stageReplace(Identifier ident, StructType schema, Transform[] partitions, Map<String,String> properties)
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.apache.spark.sql.connector.catalog.FunctionCatalog
functionExists
-
Method Detail
-
buildSparkCatalog
protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
Build a SparkCatalog to be used for Iceberg operations. The default implementation creates a new SparkCatalog with the session catalog's name and options.
- Parameters:
name - catalog name
options - catalog options
- Returns:
a SparkCatalog to be used for Iceberg tables
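Because the method is protected, a subclass can override it to customize the catalog used for Iceberg tables. The sketch below shows a hypothetical subclass that injects one extra catalog option before delegating to SparkCatalog; the option key shown is only an illustration.

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.iceberg.spark.SparkCatalog;
import org.apache.iceberg.spark.SparkSessionCatalog;
import org.apache.spark.sql.connector.catalog.FunctionCatalog;
import org.apache.spark.sql.connector.catalog.SupportsNamespaces;
import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

// Hypothetical subclass: adds one default option before building the Iceberg-aware catalog.
public class CustomSessionCatalog<
        T extends TableCatalog & FunctionCatalog & SupportsNamespaces>
    extends SparkSessionCatalog<T> {

  @Override
  protected TableCatalog buildSparkCatalog(String name, CaseInsensitiveStringMap options) {
    // Copy the session catalog's options and add a default; "cache-enabled" is illustrative.
    Map<String, String> merged = new HashMap<>(options.asCaseSensitiveMap());
    merged.putIfAbsent("cache-enabled", "false");

    SparkCatalog catalog = new SparkCatalog();
    catalog.initialize(name, new CaseInsensitiveStringMap(merged));
    return catalog;
  }
}
```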
-
defaultNamespace
public java.lang.String[] defaultNamespace()
- Specified by:
defaultNamespace in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
-
listNamespaces
public java.lang.String[][] listNamespaces() throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
listNamespaces
public java.lang.String[][] listNamespaces(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
namespaceExists
public boolean namespaceExists(java.lang.String[] namespace)
- Specified by:
namespaceExists in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
-
loadNamespaceMetadata
public java.util.Map<java.lang.String,java.lang.String> loadNamespaceMetadata(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
loadNamespaceMetadata in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
createNamespace
public void createNamespace(java.lang.String[] namespace, java.util.Map<java.lang.String,java.lang.String> metadata) throws org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
- Specified by:
createNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
-
alterNamespace
public void alterNamespace(java.lang.String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
alterNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
dropNamespace
public boolean dropNamespace(java.lang.String[] namespace, boolean cascade) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
- Specified by:
dropNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
-
listTables
public org.apache.spark.sql.connector.catalog.Identifier[] listTables(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
listTables in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String version) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-
invalidateTable
public void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
invalidateTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
-
createTable
public org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
createTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
stageCreate
public org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
stageCreate in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
stageReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
stageReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-
stageCreateOrReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
stageCreateOrReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
alterTable
public org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
alterTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-
dropTable
public boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
dropTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
-
purgeTable
public boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
purgeTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
-
renameTable
public void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException, org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
- Specified by:
renameTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
-
initialize
public final void initialize(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
- Specified by:
initialize in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
-
setDelegateCatalog
public void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
- Specified by:
setDelegateCatalog in interface org.apache.spark.sql.connector.catalog.CatalogExtension
-
name
public java.lang.String name()
- Specified by:
name in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
-
icebergCatalog
public Catalog icebergCatalog()
Description copied from interface: HasIcebergCatalog
Returns the underlying Catalog backing this Spark Catalog.
- Specified by:
icebergCatalog in interface HasIcebergCatalog
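Since the session catalog is instantiated by Spark, a caller typically fetches the plugin from the active session and checks for HasIcebergCatalog to reach the underlying Iceberg Catalog. A minimal sketch follows; the table identifier "db.events" is an illustrative assumption.

```java
import org.apache.iceberg.Table;
import org.apache.iceberg.catalog.Catalog;
import org.apache.iceberg.catalog.TableIdentifier;
import org.apache.iceberg.spark.HasIcebergCatalog;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.connector.catalog.CatalogPlugin;

public class IcebergCatalogAccess {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.active();

    // Look up the configured session catalog plugin by name.
    CatalogPlugin plugin = spark.sessionState().catalogManager().catalog("spark_catalog");

    // If the plugin exposes an Iceberg catalog, use it directly through the Iceberg API.
    if (plugin instanceof HasIcebergCatalog) {
      Catalog icebergCatalog = ((HasIcebergCatalog) plugin).icebergCatalog();
      // "db.events" is an illustrative table identifier, not part of the API.
      Table table = icebergCatalog.loadTable(TableIdentifier.parse("db.events"));
      System.out.println(table.currentSnapshot());
    }
  }
}
```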
-
loadFunction
public org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
- Specified by:
loadFunction in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
-
loadProcedure
public Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) throws NoSuchProcedureException
Description copied from interface: ProcedureCatalog
Load a stored procedure by identifier.
- Specified by:
loadProcedure in interface ProcedureCatalog
- Parameters:
ident - a stored procedure identifier
- Returns:
the stored procedure's metadata
- Throws:
NoSuchProcedureException - if there is no matching stored procedure
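In practice this method is invoked by Spark's SQL layer when a CALL statement is parsed, which requires the Iceberg SQL extensions to be enabled. A minimal sketch; the table name and snapshot id below are illustrative assumptions.

```java
import org.apache.spark.sql.SparkSession;

public class CallProcedureExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.active();

    // Iceberg stored procedures live under the catalog's "system" namespace.
    // Table name and snapshot id are illustrative assumptions.
    spark.sql(
        "CALL spark_catalog.system.rollback_to_snapshot('db.events', 1234567890123456789)")
        .show();
  }
}
```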
-
isFunctionNamespace
public boolean isFunctionNamespace(java.lang.String[] namespace)
-
isExistingNamespace
public boolean isExistingNamespace(java.lang.String[] namespace)
-
listFunctions
public default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
listFunctions in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
- Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
-
-