Package org.apache.iceberg.spark
Class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces & org.apache.spark.sql.connector.catalog.ViewCatalog>
java.lang.Object
org.apache.iceberg.spark.SparkSessionCatalog<T>
- Type Parameters:
- T - CatalogPlugin class to avoid casting to TableCatalog, FunctionCatalog and SupportsNamespaces.
- All Implemented Interfaces:
- HasIcebergCatalog, SupportsReplaceView, org.apache.spark.sql.connector.catalog.CatalogExtension, org.apache.spark.sql.connector.catalog.CatalogPlugin, org.apache.spark.sql.connector.catalog.FunctionCatalog, org.apache.spark.sql.connector.catalog.StagingTableCatalog, org.apache.spark.sql.connector.catalog.SupportsNamespaces, org.apache.spark.sql.connector.catalog.TableCatalog, org.apache.spark.sql.connector.catalog.ViewCatalog, ProcedureCatalog
public class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces & org.apache.spark.sql.connector.catalog.ViewCatalog>
extends Object
implements org.apache.spark.sql.connector.catalog.CatalogExtension
A Spark catalog that can also load non-Iceberg tables.
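As a sketch of how this class is typically wired in (the catalog type and SQL below are illustrative, not part of this API), the class is registered as Spark's built-in session catalog, spark_catalog, so that Iceberg and non-Iceberg tables resolve through the same catalog name:

import org.apache.spark.sql.SparkSession;

public class SessionCatalogSetup {
  public static void main(String[] args) {
    // Register SparkSessionCatalog as the session catalog implementation so that
    // Iceberg tables and existing non-Iceberg tables share the spark_catalog name.
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-session-catalog")
        .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
        .config("spark.sql.catalog.spark_catalog.type", "hive")
        .getOrCreate();

    // Both Iceberg and non-Iceberg tables in the default namespace are visible here.
    spark.sql("SHOW TABLES IN default").show();
  }
}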
- 
Field Summary
Fields inherited from interface org.apache.spark.sql.connector.catalog.SupportsNamespaces: PROP_COMMENT, PROP_LOCATION, PROP_OWNER
Fields inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog: OPTION_PREFIX, PROP_COMMENT, PROP_EXTERNAL, PROP_IS_MANAGED_LOCATION, PROP_LOCATION, PROP_OWNER, PROP_PROVIDER
Fields inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog: PROP_COMMENT, PROP_CREATE_ENGINE_VERSION, PROP_ENGINE_VERSION, PROP_OWNER, RESERVED_PROPERTIES
- 
Constructor Summary
Constructors
- SparkSessionCatalog()
- 
Method Summary
- void alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes)
- org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes)
- org.apache.spark.sql.connector.catalog.View alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes)
- protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options): Build a SparkCatalog to be used for Iceberg operations.
- void createNamespace(String[] namespace, Map<String, String> metadata)
- org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
- org.apache.spark.sql.connector.catalog.View createView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String, String> properties)
- String[] defaultNamespace()
- boolean dropNamespace(String[] namespace, boolean cascade)
- boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- boolean dropView(org.apache.spark.sql.connector.catalog.Identifier ident)
- Catalog icebergCatalog(): Returns the underlying Catalog backing this Spark Catalog.
- final void initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
- void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- boolean isExistingNamespace(String[] namespace)
- boolean isFunctionNamespace(String[] namespace)
- default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(String[] namespace)
- String[][] listNamespaces()
- String[][] listNamespaces(String[] namespace)
- org.apache.spark.sql.connector.catalog.Identifier[] listTables(String[] namespace)
- org.apache.spark.sql.connector.catalog.Identifier[] listViews(String... namespace)
- org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident)
- Map<String, String> loadNamespaceMetadata(String[] namespace)
- Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident): Load a stored procedure by identifier.
- org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp)
- org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version)
- org.apache.spark.sql.connector.catalog.View loadView(org.apache.spark.sql.connector.catalog.Identifier ident)
- String name()
- boolean namespaceExists(String[] namespace)
- boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to)
- void renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier)
- org.apache.spark.sql.connector.catalog.View replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String, String> properties): Replace a view in the catalog.
- void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
- org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
- org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
- org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
- boolean useNullableQuerySchema()
Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.sql.connector.catalog.FunctionCatalog: functionExists
Methods inherited from interface org.apache.spark.sql.connector.catalog.StagingTableCatalog: stageCreate, stageCreateOrReplace, stageReplace
Methods inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog: capabilities, createTable, loadTable, tableExists, useNullableQuerySchema
Methods inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog: invalidateView, viewExists
- 
Constructor Details
- 
SparkSessionCatalog
public SparkSessionCatalog()
 
- 
- 
Method Details
- 
buildSparkCatalog
protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
Build a SparkCatalog to be used for Iceberg operations.
The default implementation creates a new SparkCatalog with the session catalog's name and options.
- Parameters:
- name - catalog name
- options - catalog options
- Returns:
- a SparkCatalog to be used for Iceberg tables
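A minimal sketch of a subclass that overrides buildSparkCatalog to adjust the options passed to the Iceberg SparkCatalog; the CustomSessionCatalog name and the cache-enabled override are illustrative assumptions, not part of this API:

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

public class CustomSessionCatalog<T extends TableCatalog
    & org.apache.spark.sql.connector.catalog.FunctionCatalog
    & org.apache.spark.sql.connector.catalog.SupportsNamespaces
    & org.apache.spark.sql.connector.catalog.ViewCatalog>
    extends org.apache.iceberg.spark.SparkSessionCatalog<T> {

  @Override
  protected TableCatalog buildSparkCatalog(String name, CaseInsensitiveStringMap options) {
    // Copy the session catalog's options and tweak them before building the SparkCatalog.
    Map<String, String> adjusted = new HashMap<>(options.asCaseSensitiveMap());
    adjusted.put("cache-enabled", "false"); // illustrative override
    return super.buildSparkCatalog(name, new CaseInsensitiveStringMap(adjusted));
  }
}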
 
- 
defaultNamespace
public String[] defaultNamespace()
- Specified by:
- defaultNamespace in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 
- 
listNamespaces
public String[][] listNamespaces() throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
listNamespaces
public String[][] listNamespaces(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
namespaceExists
public boolean namespaceExists(String[] namespace)
- Specified by:
- namespaceExists in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
 
- 
loadNamespaceMetadata
public Map<String, String> loadNamespaceMetadata(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- loadNamespaceMetadata in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
createNamespace
public void createNamespace(String[] namespace, Map<String, String> metadata) throws org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
- Specified by:
- createNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
 
- 
alterNamespace
public void alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- alterNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
dropNamespace
public boolean dropNamespace(String[] namespace, boolean cascade) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
- Specified by:
- dropNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
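The namespace methods above back standard namespace DDL issued against the session catalog; a small sketch (the namespace name and property are illustrative):

import org.apache.spark.sql.SparkSession;

public class NamespaceExample {
  public static void manageNamespace(SparkSession spark) {
    // createNamespace / alterNamespace / dropNamespace are invoked by DDL like this.
    spark.sql("CREATE NAMESPACE IF NOT EXISTS spark_catalog.db WITH DBPROPERTIES ('owner' = 'analytics')");
    spark.sql("DROP NAMESPACE IF EXISTS spark_catalog.db CASCADE");
  }
}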
 
- 
listTables
public org.apache.spark.sql.connector.catalog.Identifier[] listTables(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- listTables in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
- loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 
- 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
- loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 
- 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
- loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
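The two time-travel loadTable overloads above back Spark SQL's VERSION AS OF and TIMESTAMP AS OF clauses; a sketch using an illustrative table name and snapshot id:

import org.apache.spark.sql.SparkSession;

public class TimeTravelExample {
  public static void readAsOf(SparkSession spark) {
    // loadTable(ident, version) resolves a specific snapshot id (or a branch/tag name).
    spark.sql("SELECT * FROM spark_catalog.db.events VERSION AS OF 10963874102873").show();
    // loadTable(ident, timestamp) resolves the snapshot current as of a point in time.
    spark.sql("SELECT * FROM spark_catalog.db.events TIMESTAMP AS OF '2024-01-01 00:00:00'").show();
  }
}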
 
- 
invalidateTable
public void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
- invalidateTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 
- 
createTable
public org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- createTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
stageCreate
public org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- stageCreate in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
stageReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
- stageReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 
- 
stageCreateOrReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- stageCreateOrReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
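The staged create/replace methods above are what Spark uses for atomic CTAS and RTAS against Iceberg tables; a sketch with illustrative table names:

import org.apache.spark.sql.SparkSession;

public class StagedWriteExample {
  public static void createAndReplace(SparkSession spark) {
    // CTAS goes through stageCreate and commits the new table only if the write succeeds.
    spark.sql("CREATE TABLE spark_catalog.db.events_copy USING iceberg "
        + "AS SELECT * FROM spark_catalog.db.events");
    // RTAS goes through stageReplace (or stageCreateOrReplace for CREATE OR REPLACE).
    spark.sql("REPLACE TABLE spark_catalog.db.events_copy USING iceberg "
        + "AS SELECT * FROM spark_catalog.db.events WHERE event_ts >= current_date()");
  }
}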
 
- 
alterTable
public org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- Specified by:
- alterTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 
- 
dropTable
public boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
- dropTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 
- 
purgeTable
public boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
- purgeTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 
- 
renameTable
public void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException, org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
- Specified by:
- renameTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchTableException
- org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
 
- 
initialize
public final void initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
- Specified by:
- initialize in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 
- 
setDelegateCatalog
public void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
- Specified by:
- setDelegateCatalog in interface org.apache.spark.sql.connector.catalog.CatalogExtension
 
- 
name
public String name()
- Specified by:
- name in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 
- 
icebergCatalog
public Catalog icebergCatalog()
Description copied from interface: HasIcebergCatalog
Returns the underlying Catalog backing this Spark Catalog.
- Specified by:
- icebergCatalog in interface HasIcebergCatalog
 
- 
loadFunction
public org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
- Specified by:
- loadFunction in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
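loadFunction resolves Iceberg's built-in SQL functions, which live under the catalog's system namespace; a sketch with illustrative arguments:

import org.apache.spark.sql.SparkSession;

public class FunctionExample {
  public static void callFunctions(SparkSession spark) {
    // Reports the Iceberg version used by the catalog.
    spark.sql("SELECT spark_catalog.system.iceberg_version()").show();
    // Applies Iceberg's bucket transform to a value.
    spark.sql("SELECT spark_catalog.system.bucket(16, 'some-key')").show();
  }
}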
 
- 
listViews
public org.apache.spark.sql.connector.catalog.Identifier[] listViews(String... namespace)
- Specified by:
- listViews in interface org.apache.spark.sql.connector.catalog.ViewCatalog
 
- 
loadView
public org.apache.spark.sql.connector.catalog.View loadView(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException
- Specified by:
- loadView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchViewException
 
- 
createView
public org.apache.spark.sql.connector.catalog.View createView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- createView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
- 
replaceView
public org.apache.spark.sql.connector.catalog.View replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchViewException
Description copied from interface: SupportsReplaceView
Replace a view in the catalog.
- Specified by:
- replaceView in interface SupportsReplaceView
- Parameters:
- ident - a view identifier
- sql - the SQL text that defines the view
- currentCatalog - the current catalog
- currentNamespace - the current namespace
- schema - the view query output schema
- queryColumnNames - the query column names
- columnAliases - the column aliases
- columnComments - the column comments
- properties - the view properties
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - If the identifier namespace does not exist (optional)
- org.apache.spark.sql.catalyst.analysis.NoSuchViewException - If the view doesn't exist or is a table
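createView and replaceView back CREATE VIEW and CREATE OR REPLACE VIEW when the underlying Iceberg catalog supports views; a sketch with illustrative view and table names:

import org.apache.spark.sql.SparkSession;

public class ViewExample {
  public static void defineViews(SparkSession spark) {
    // createView is used for the initial definition.
    spark.sql("CREATE VIEW spark_catalog.db.recent_events AS "
        + "SELECT * FROM spark_catalog.db.events WHERE event_ts >= current_date() - INTERVAL 7 DAYS");
    // replaceView is used when the view already exists.
    spark.sql("CREATE OR REPLACE VIEW spark_catalog.db.recent_events AS "
        + "SELECT * FROM spark_catalog.db.events WHERE event_ts >= current_date() - INTERVAL 30 DAYS");
  }
}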
 
- 
alterView
public org.apache.spark.sql.connector.catalog.View alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, IllegalArgumentException
- Specified by:
- alterView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchViewException
- IllegalArgumentException
 
- 
dropView
public boolean dropView(org.apache.spark.sql.connector.catalog.Identifier ident)
- Specified by:
- dropView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
 
- 
renameView
public void renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
- Specified by:
- renameView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchViewException
- org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
 
- 
loadProcedure
public Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) throws NoSuchProcedureException
Description copied from interface: ProcedureCatalog
Load a stored procedure by identifier.
- Specified by:
- loadProcedure in interface ProcedureCatalog
- Parameters:
- ident - a stored procedure identifier
- Returns:
- the stored procedure's metadata
- Throws:
- NoSuchProcedureException - if there is no matching stored procedure
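Procedures returned by loadProcedure are invoked with CALL under the catalog's system namespace; a sketch with an illustrative table name and snapshot id:

import org.apache.spark.sql.SparkSession;

public class ProcedureExample {
  public static void rollback(SparkSession spark) {
    // Rolls the table back to a previous snapshot using a built-in Iceberg procedure.
    spark.sql("CALL spark_catalog.system.rollback_to_snapshot('db.events', 10963874102873)");
  }
}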
 
- 
isFunctionNamespace
public boolean isFunctionNamespace(String[] namespace)
- 
isExistingNamespace
public boolean isExistingNamespace(String[] namespace)
- 
useNullableQuerySchema
public boolean useNullableQuerySchema()
- Specified by:
- useNullableQuerySchema in interface org.apache.spark.sql.connector.catalog.TableCatalog
 
- 
listFunctions
default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
- Specified by:
- listFunctions in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
- Throws:
- org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
 