public class SparkCatalog extends java.lang.Object implements org.apache.spark.sql.connector.catalog.ViewCatalog, SupportsReplaceView
A Spark TableCatalog implementation that wraps an Iceberg Catalog.
This supports the following catalog configuration options:

- type - catalog type, "hive", "hadoop", or "rest". To specify a non-Hive or non-Hadoop catalog, use the catalog-impl option.
- uri - the Hive Metastore URI for Hive catalog or the REST URI for REST catalog
- warehouse - the warehouse path (Hadoop catalog only)
- catalog-impl - a custom Catalog implementation to use
- io-impl - a custom FileIO implementation to use
- metrics-reporter-impl - a custom MetricsReporter implementation to use
- default-namespace - a namespace to use as the default
- cache-enabled - whether to enable catalog caching
- cache.case-sensitive - whether the catalog cache should compare table identifiers in a case-sensitive way
- cache.expiration-interval-ms - interval in milliseconds before tables are expired from the catalog cache. Refer to CatalogProperties.CACHE_EXPIRATION_INTERVAL_MS for further details and significant values.
- table-default.$tablePropertyKey - default value for table property $tablePropertyKey, applied at the catalog level
- table-override.$tablePropertyKey - value for table property $tablePropertyKey, enforced at the catalog level
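As an illustration, the options above are passed to the catalog through Spark properties prefixed with `spark.sql.catalog.<catalog-name>`. The sketch below registers a hypothetical catalog named `demo` backed by a REST catalog; the catalog name, URI, and namespace are placeholder values, not defaults:

```properties
spark.sql.catalog.demo                              = org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.demo.type                         = rest
spark.sql.catalog.demo.uri                          = https://rest-catalog.example.com
spark.sql.catalog.demo.default-namespace            = analytics
spark.sql.catalog.demo.cache-enabled                = true
spark.sql.catalog.demo.cache.expiration-interval-ms = 30000
spark.sql.catalog.demo.table-default.format-version = 2
```

The same keys can be supplied via `--conf` flags on `spark-sql`/`spark-submit` or through `SparkSession.builder().config(...)`.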
Fields inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog:
PROP_COMMENT, PROP_CREATE_ENGINE_VERSION, PROP_ENGINE_VERSION, PROP_OWNER, RESERVED_PROPERTIES

| Constructor and Description |
|---|
| SparkCatalog() |
| Modifier and Type | Method and Description |
|---|---|
| void | alterNamespace(java.lang.String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) |
| org.apache.spark.sql.connector.catalog.Table | alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) |
| org.apache.spark.sql.connector.catalog.View | alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes) |
| protected Catalog | buildIcebergCatalog(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options) - Build an Iceberg Catalog to be used by this Spark catalog adapter. |
| protected TableIdentifier | buildIdentifier(org.apache.spark.sql.connector.catalog.Identifier identifier) - Build an Iceberg TableIdentifier for the given Spark identifier. |
| void | createNamespace(java.lang.String[] namespace, java.util.Map<java.lang.String,java.lang.String> metadata) |
| org.apache.spark.sql.connector.catalog.Table | createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) |
| org.apache.spark.sql.connector.catalog.View | createView(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String sql, java.lang.String currentCatalog, java.lang.String[] currentNamespace, org.apache.spark.sql.types.StructType schema, java.lang.String[] queryColumnNames, java.lang.String[] columnAliases, java.lang.String[] columnComments, java.util.Map<java.lang.String,java.lang.String> properties) |
| java.lang.String[] | defaultNamespace() |
| boolean | dropNamespace(java.lang.String[] namespace, boolean cascade) |
| boolean | dropTable(org.apache.spark.sql.connector.catalog.Identifier ident) |
| boolean | dropView(org.apache.spark.sql.connector.catalog.Identifier ident) |
| Catalog | icebergCatalog() - Returns the underlying Catalog backing this Spark catalog. |
| void | initialize(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options) |
| void | invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident) |
| boolean | isExistingNamespace(java.lang.String[] namespace) |
| boolean | isFunctionNamespace(java.lang.String[] namespace) |
| default org.apache.spark.sql.connector.catalog.Identifier[] | listFunctions(java.lang.String[] namespace) |
| java.lang.String[][] | listNamespaces() |
| java.lang.String[][] | listNamespaces(java.lang.String[] namespace) |
| org.apache.spark.sql.connector.catalog.Identifier[] | listTables(java.lang.String[] namespace) |
| org.apache.spark.sql.connector.catalog.Identifier[] | listViews(java.lang.String... namespace) |
| default org.apache.spark.sql.connector.catalog.functions.UnboundFunction | loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) |
| java.util.Map<java.lang.String,java.lang.String> | loadNamespaceMetadata(java.lang.String[] namespace) |
| Procedure | loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) - Load a stored procedure by identifier. |
| org.apache.spark.sql.connector.catalog.Table | loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) |
| org.apache.spark.sql.connector.catalog.Table | loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) |
| org.apache.spark.sql.connector.catalog.Table | loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String version) |
| org.apache.spark.sql.connector.catalog.View | loadView(org.apache.spark.sql.connector.catalog.Identifier ident) |
| java.lang.String | name() |
| boolean | purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident) |
| void | renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) |
| void | renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier) |
| org.apache.spark.sql.connector.catalog.View | replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String sql, java.lang.String currentCatalog, java.lang.String[] currentNamespace, org.apache.spark.sql.types.StructType schema, java.lang.String[] queryColumnNames, java.lang.String[] columnAliases, java.lang.String[] columnComments, java.util.Map<java.lang.String,java.lang.String> properties) - Replace a view in the catalog. |
| org.apache.spark.sql.connector.catalog.StagedTable | stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) |
| org.apache.spark.sql.connector.catalog.StagedTable | stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) |
| org.apache.spark.sql.connector.catalog.StagedTable | stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) |
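The table operations above are normally exercised through Spark SQL against the configured catalog name rather than called directly. A sketch, assuming a catalog registered as `demo` and a hypothetical table `nyc.taxis` (the snapshot ID and timestamp are placeholder values):

```sql
CREATE TABLE demo.nyc.taxis (id BIGINT, fare DOUBLE) USING iceberg;
INSERT INTO demo.nyc.taxis VALUES (1, 12.50);

-- resolved via loadTable(ident, version): read a specific snapshot
SELECT * FROM demo.nyc.taxis VERSION AS OF 4348247335499535129;

-- resolved via loadTable(ident, timestamp): read the table as of a point in time
SELECT * FROM demo.nyc.taxis TIMESTAMP AS OF '2024-01-01 00:00:00';
```

`VERSION AS OF` and `TIMESTAMP AS OF` map onto the two time-travel `loadTable` overloads listed above.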
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog:
invalidateView, viewExists

Methods inherited from interface org.apache.spark.sql.connector.catalog.StagingTableCatalog:
stageCreate, stageCreateOrReplace, stageReplace

Methods inherited from interface org.apache.spark.sql.connector.catalog.SupportsNamespaces:
namespaceExists

Method Detail

buildIcebergCatalog

protected Catalog buildIcebergCatalog(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)

Build an Iceberg Catalog to be used by this Spark catalog adapter.

Parameters:
name - Spark's catalog name
options - Spark's catalog options

buildIdentifier

protected TableIdentifier buildIdentifier(org.apache.spark.sql.connector.catalog.Identifier identifier)

Build an Iceberg TableIdentifier for the given Spark identifier.

Parameters:
identifier - Spark's identifier

loadTable

public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Specified by: loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchTableException

loadTable

public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String version) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Specified by: loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchTableException

loadTable

public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Specified by: loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchTableException

createTable

public org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

Specified by: createTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

stageCreate

public org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

Specified by: stageCreate in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

stageReplace

public org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Specified by: stageReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchTableException

stageCreateOrReplace

public org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] transforms, java.util.Map<java.lang.String,java.lang.String> properties)

Specified by: stageCreateOrReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog

alterTable

public org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException

Specified by: alterTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchTableException

dropTable

public boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)

Specified by: dropTable in interface org.apache.spark.sql.connector.catalog.TableCatalog

purgeTable

public boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)

Specified by: purgeTable in interface org.apache.spark.sql.connector.catalog.TableCatalog

renameTable

public void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException, org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

Specified by: renameTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

invalidateTable

public void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)

Specified by: invalidateTable in interface org.apache.spark.sql.connector.catalog.TableCatalog

listTables

public org.apache.spark.sql.connector.catalog.Identifier[] listTables(java.lang.String[] namespace)

Specified by: listTables in interface org.apache.spark.sql.connector.catalog.TableCatalog

defaultNamespace

public java.lang.String[] defaultNamespace()

Specified by: defaultNamespace in interface org.apache.spark.sql.connector.catalog.CatalogPlugin

listNamespaces

public java.lang.String[][] listNamespaces()

Specified by: listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces

listNamespaces

public java.lang.String[][] listNamespaces(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

loadNamespaceMetadata

public java.util.Map<java.lang.String,java.lang.String> loadNamespaceMetadata(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: loadNamespaceMetadata in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

createNamespace

public void createNamespace(java.lang.String[] namespace, java.util.Map<java.lang.String,java.lang.String> metadata) throws org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException

Specified by: createNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
Throws: org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException

alterNamespace

public void alterNamespace(java.lang.String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: alterNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

dropNamespace

public boolean dropNamespace(java.lang.String[] namespace, boolean cascade) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: dropNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

listViews

public org.apache.spark.sql.connector.catalog.Identifier[] listViews(java.lang.String... namespace)

Specified by: listViews in interface org.apache.spark.sql.connector.catalog.ViewCatalog

loadView

public org.apache.spark.sql.connector.catalog.View loadView(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException

Specified by: loadView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchViewException

createView

public org.apache.spark.sql.connector.catalog.View createView(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String sql, java.lang.String currentCatalog, java.lang.String[] currentNamespace, org.apache.spark.sql.types.StructType schema, java.lang.String[] queryColumnNames, java.lang.String[] columnAliases, java.lang.String[] columnComments, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: createView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
Throws:
org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

replaceView

public org.apache.spark.sql.connector.catalog.View replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, java.lang.String sql, java.lang.String currentCatalog, java.lang.String[] currentNamespace, org.apache.spark.sql.types.StructType schema, java.lang.String[] queryColumnNames, java.lang.String[] columnAliases, java.lang.String[] columnComments, java.util.Map<java.lang.String,java.lang.String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchViewException

Description copied from interface: SupportsReplaceView
Replace a view in the catalog.

Specified by: replaceView in interface SupportsReplaceView
Parameters:
ident - a view identifier
sql - the SQL text that defines the view
currentCatalog - the current catalog
currentNamespace - the current namespace
schema - the view query output schema
queryColumnNames - the query column names
columnAliases - the column aliases
columnComments - the column comments
properties - the view properties
Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - If the identifier namespace does not exist (optional)
org.apache.spark.sql.catalyst.analysis.NoSuchViewException - If the view doesn't exist or is a table

alterView

public org.apache.spark.sql.connector.catalog.View alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, java.lang.IllegalArgumentException

Specified by: alterView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchViewException
java.lang.IllegalArgumentException

dropView

public boolean dropView(org.apache.spark.sql.connector.catalog.Identifier ident)

Specified by: dropView in interface org.apache.spark.sql.connector.catalog.ViewCatalog

renameView

public void renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException

Specified by: renameView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
Throws:
org.apache.spark.sql.catalyst.analysis.NoSuchViewException
org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException

initialize

public final void initialize(java.lang.String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)

Specified by: initialize in interface org.apache.spark.sql.connector.catalog.CatalogPlugin

name

public java.lang.String name()

Specified by: name in interface org.apache.spark.sql.connector.catalog.CatalogPlugin

icebergCatalog

public Catalog icebergCatalog()

Description copied from interface: HasIcebergCatalog
Returns the underlying Catalog backing this Spark catalog.

Specified by: icebergCatalog in interface HasIcebergCatalog

loadProcedure

public Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) throws NoSuchProcedureException

Description copied from interface: ProcedureCatalog
Load a stored procedure by identifier.

Specified by: loadProcedure in interface ProcedureCatalog
Parameters:
ident - a stored procedure identifier
Throws:
NoSuchProcedureException - if there is no matching stored procedure

isFunctionNamespace

public boolean isFunctionNamespace(java.lang.String[] namespace)

isExistingNamespace

public boolean isExistingNamespace(java.lang.String[] namespace)

listFunctions

public org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(java.lang.String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

Specified by: listFunctions in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException

loadFunction

public org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException

Specified by: loadFunction in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
Throws: org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
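The loadProcedure entry point backs Spark's CALL syntax; Iceberg exposes its maintenance procedures under the system namespace within the catalog. A sketch, assuming a catalog registered as demo, an existing table nyc.taxis, and a placeholder snapshot ID:

```sql
-- resolved via loadProcedure(Identifier.of(["system"], "rollback_to_snapshot"))
CALL demo.system.rollback_to_snapshot('nyc.taxis', 4348247335499535129);
```

If no procedure matches the identifier, loadProcedure raises NoSuchProcedureException, which Spark surfaces as an analysis error for the CALL statement.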