A
- abort() - Method in class org.apache.iceberg.io.BaseTaskWriter
- abort() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and delete the completed files if possible when aborting.
- abortFileGroup(RewriteFileGroup) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Clean up a specified file set by removing any files created for that operation; this should not throw any exceptions.
- abortJob(JobContext, int) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes the generated data files if there is a commit file already generated for them.
- abortStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- abortStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- abortTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes files generated by this task.
- abortWith(Tasks.Task<I, ?>) - Method in class org.apache.iceberg.util.Tasks.Builder
- AbstractMapredIcebergRecordReader<T> - Class in org.apache.iceberg.mr.mapred
- AbstractMapredIcebergRecordReader(IcebergInputFormat<?>, IcebergSplit, JobConf, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- accept(Path) - Method in class org.apache.iceberg.hadoop.HiddenPathFilter
- accessKeyId() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessKeySecret() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessor() - Method in class org.apache.iceberg.expressions.BoundReference
- accessor() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- Accessor<T> - Interface in org.apache.iceberg
- accessorForField(int) - Method in class org.apache.iceberg.Schema
-
Returns an accessor for retrieving the data from StructLike.
- Accessors - Class in org.apache.iceberg
-
Position2Accessor and Position3Accessor here are an optimization.
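The Accessor interface and Schema.accessorForField(int) above work together; a minimal sketch, assuming a Schema named schema that has a field with id 1 and a StructLike row named row (both hypothetical):

    import org.apache.iceberg.Accessor;
    import org.apache.iceberg.StructLike;

    // Resolve the accessor once, then reuse it to read the same field from many rows.
    Accessor<StructLike> accessor = schema.accessorForField(1);
    Object value = accessor.get(row);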
- acquire(String, String) - Method in interface org.apache.iceberg.LockManager
-
Try to acquire a lock.
- acquireIntervalMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- acquireTimeoutMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- Action<ThisT,R> - Interface in org.apache.iceberg.actions
-
An action performed on a table.
- Actions - Class in org.apache.iceberg.flink.actions
- ActionsProvider - Interface in org.apache.iceberg.actions
-
An API that should be implemented by query engine integrations for providing actions.
- add(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- add(D) - Method in interface org.apache.iceberg.io.FileAppender
- add(D) - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- add(F) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file.
- add(F, long) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file with a specific sequence number.
- add(CharSequence) - Method in class org.apache.iceberg.util.CharSequenceSet
- add(WriteResult) - Method in class org.apache.iceberg.io.WriteResult.Builder
- add(StructLike) - Method in class org.apache.iceberg.util.StructLikeSet
- add(Pair<Integer, StructLike>) - Method in class org.apache.iceberg.util.PartitionSet
- add(T) - Method in class org.apache.iceberg.io.DataWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use DataWriter.write(Object) instead.
- add(T[], T) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Copies the given array and adds the given element at the end of the new array.
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- addAll(Iterable<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addAll(Iterable<WriteResult>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addAll(Collection<? extends CharSequence>) - Method in class org.apache.iceberg.util.CharSequenceSet
- addAll(Collection<? extends StructLike>) - Method in class org.apache.iceberg.util.StructLikeSet
- addAll(Collection<? extends Pair<Integer, StructLike>>) - Method in class org.apache.iceberg.util.PartitionSet
- addAll(Iterator<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addCloseable(Closeable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register a closeable to be managed by this class.
- addCloseable(AutoCloseable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register an AutoCloseable to be managed by this class.
- addColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
- addColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
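A minimal sketch of the UpdateSchema.addColumn variants listed above, assuming table is an already-loaded Iceberg Table and the column names are hypothetical:

    import org.apache.iceberg.types.Types;

    table.updateSchema()
        .addColumn("region", Types.StringType.get(), "sales region")    // new top-level column with a doc string
        .addColumn("address", "zip_code", Types.IntegerType.get())      // new column nested under the "address" struct
        .commit();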
- addDataFiles(Iterable<DataFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDataFiles(DataFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(Iterable<DeleteFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(DeleteFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeletes(DeleteFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a DeleteFile to the table.
- ADDED_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
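A minimal sketch of the RowDelta API referenced by the addDeletes entry above, assuming table is an already-loaded Table and dataFile/deleteFile are a DataFile and DeleteFile built elsewhere:

    // Commit new rows and the deletes that apply to existing data in a single snapshot.
    table.newRowDelta()
        .addRows(dataFile)
        .addDeletes(deleteFile)
        .commit();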
- ADDED_EQ_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILE_SIZE_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- ADDED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_POS_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- addedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- addedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupRewriteResult
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.Result
- addedDeleteFilesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.Result
-
Returns the count of the added delete files.
- addedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFiles() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- addedFiles() - Method in interface org.apache.iceberg.Snapshot
-
Return all files added to the table in this snapshot.
- addedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status ADDED in the manifest file.
- addedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- addedManifest(ManifestFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedManifests() - Method in class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- addedManifests() - Method in interface org.apache.iceberg.actions.RewriteManifests.Result
-
Returns added manifests.
- addedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the added position delete files.
- addedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status ADDED in the manifest file.
- addedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- addElement(I, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- addElement(List<E>, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- addExtension(String) - Method in enum org.apache.iceberg.FileFormat
-
Returns filename with this format's extension added, if necessary.
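A small usage sketch for FileFormat.addExtension (the path is hypothetical):

    import org.apache.iceberg.FileFormat;

    // Appends ".parquet" only if the extension is not already present.
    String path = FileFormat.PARQUET.addExtension("s3://bucket/warehouse/data/part-00000");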
- addFallbackIds(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- addField(String) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from a source column.
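A minimal sketch of evolving a partition spec with the addField variants above, assuming table is an already-loaded Table and the column names are hypothetical:

    import org.apache.iceberg.expressions.Expressions;

    table.updateSpec()
        .addField("category")                     // identity partition on a source column
        .addField(Expressions.day("event_ts"))    // partition on an expression term
        .commit();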
- addField(String, Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term, with the given partition field name.
- addField(Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term.
- addFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
- addFile(DataFile) - Method in class org.apache.iceberg.BaseReplacePartitions
- addFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Add a DataFile to the table.
- addFile(DataFile) - Method in interface org.apache.iceberg.ReplacePartitions
-
Add a DataFile to the table.
- addManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- addManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Adds a manifest file to the table.
- addPair(I, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- addPair(Map<K, V>, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- AddPartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- addPartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddPartitionSpec(PartitionSpec) - Constructor for class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- addReferencedDataFiles(CharSequence...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addReferencedDataFiles(Iterable<CharSequence>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addRequiredColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
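A minimal sketch of addRequiredColumn, assuming table is an already-loaded Table; adding a required column is an incompatible change, so it has to be allowed explicitly:

    import org.apache.iceberg.types.Types;

    table.updateSchema()
        .allowIncompatibleChanges()                             // required columns can break existing writers
        .addRequiredColumn("event_id", Types.LongType.get())
        .commit();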
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.ClusteredDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.FanoutDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.RollingDataWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- addRows(DataFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a DataFile to the table.
- addSchema(Schema) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
-
Adds an Iceberg schema that can be used to decode buffers.
- addSchema(Schema, int) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSchema(Schema, int) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSchema
- addSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSnapshot(Snapshot) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSnapshot
- addSortOrder(SortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSortOrder(SortOrder) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSortOrder
- addStagedSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata
- addValue(double) - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- addValue(float) - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- ADJUST_TO_UTC_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- advance() - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- advanceNextPageCount - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- after(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- after(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- afterElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.FanoutDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- aliasToId(String) - Method in class org.apache.iceberg.Schema
-
Returns the column id for the given column alias.
- AliyunClientFactories - Class in org.apache.iceberg.aliyun
- AliyunClientFactory - Interface in org.apache.iceberg.aliyun
- aliyunProperties() - Method in interface org.apache.iceberg.aliyun.AliyunClientFactory
-
Returns an initialized AliyunProperties.
- AliyunProperties - Class in org.apache.iceberg.aliyun
- AliyunProperties() - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- AliyunProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- ALL_DATA_FILES - org.apache.iceberg.MetadataTableType
- ALL_ENTRIES - org.apache.iceberg.MetadataTableType
- ALL_MANIFESTS - org.apache.iceberg.MetadataTableType
- AllDataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid data files as rows.
- AllDataFilesTable.AllDataFilesTableScan - Class in org.apache.iceberg
- AllEntriesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's manifest entries as rows, for both delete and data files.
- allManifests() - Method in interface org.apache.iceberg.Snapshot
-
Return all ManifestFile instances for either data or delete manifests in this snapshot.
- AllManifestsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid manifest files as rows.
- AllManifestsTable.AllManifestsTableScan - Class in org.apache.iceberg
- allowIncompatibleChanges() - Method in interface org.apache.iceberg.UpdateSchema
-
Allow incompatible changes to the schema.
- AlreadyExistsException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to create a table that already exists.
- AlreadyExistsException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- AlreadyExistsException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- alterDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionColumnStatistics(ObjectPath, CatalogPartitionSpec, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionStatistics(ObjectPath, CatalogPartitionSpec, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterTableColumnStatistics(ObjectPath, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTableStatistics(ObjectPath, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alwaysFalse() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysFalse() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- alwaysNull() - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a Transform that always produces null.
- alwaysNull(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- alwaysNull(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysNull(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysTrue() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysTrue() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- ancestorIds(Snapshot, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorIdsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsOf(long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- AncestorsOfProcedure - Class in org.apache.iceberg.spark.procedures
- and(Expression, Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- and(Expression, Expression, Expression...) - Static method in class org.apache.iceberg.expressions.Expressions
- and(R, R) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- And - Class in org.apache.iceberg.expressions
- AND - org.apache.iceberg.expressions.Expression.Operation
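A minimal sketch of combining predicates with Expressions.and (the column names and values are hypothetical):

    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;

    Expression filter = Expressions.and(
        Expressions.equal("status", "ACTIVE"),
        Expressions.greaterThanOrEqual("ts", 1640995200000L));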
- ANYWHERE - Static variable in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- APP_ID - Static variable in class org.apache.iceberg.CatalogProperties
- append() - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Append the Iceberg sink operators that write records to the Iceberg table.
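A minimal sketch of wiring up the Flink sink, assuming a DataStream<RowData> named input and a TableLoader named tableLoader that points at the target table:

    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;

    FlinkSink.forRowData(input)
        .tableLoader(tableLoader)
        .append();    // attaches the Iceberg writer and committer operators to the stream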
- APPEND - Static variable in class org.apache.iceberg.DataOperations
-
New data is appended to the table and no data is removed or deleted.
- appendFile(DataFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a DataFile to the table.
- AppendFiles - Interface in org.apache.iceberg
-
API for appending new files in a table.
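A minimal sketch of AppendFiles, assuming table is an already-loaded Table and dataFile is a DataFile built elsewhere (for example with DataFiles.builder):

    // Append one data file in a new snapshot.
    table.newAppend()
        .appendFile(dataFile)
        .commit();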
- appendManifest(ManifestFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a ManifestFile to the table.
- appendsAfter(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsAfter(long) - Method in class org.apache.iceberg.DataFilesTable.FilesTableScan
- appendsAfter(long) - Method in class org.apache.iceberg.DataTableScan
- appendsAfter(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan to read appended data from fromSnapshotId exclusive to the current snapshot inclusive.
- appendsBetween(long, long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsBetween(long, long) - Method in class org.apache.iceberg.DataFilesTable.FilesTableScan
- appendsBetween(long, long) - Method in class org.apache.iceberg.DataTableScan
- appendsBetween(long, long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan to read appended data from fromSnapshotId exclusive to toSnapshotId inclusive.
- apply() - Method in class org.apache.iceberg.BaseReplaceSortOrder
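A minimal sketch of the incremental TableScan entries above (appendsAfter / appendsBetween), assuming table is an already-loaded Table and the snapshot ids are hypothetical:

    import org.apache.iceberg.TableScan;

    // Only data appended after fromSnapshotId, up to the current snapshot.
    TableScan incremental = table.newScan().appendsAfter(fromSnapshotId);
    // Or bound the range explicitly: (fromSnapshotId, toSnapshotId]
    TableScan bounded = table.newScan().appendsBetween(fromSnapshotId, toSnapshotId);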
- apply() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and return the uncommitted changes for validation.
- apply() - Method in class org.apache.iceberg.SetLocation
- apply() - Method in class org.apache.iceberg.SnapshotManager
- apply(TableMetadata) - Method in class org.apache.iceberg.BaseReplacePartitions
- apply(TableMetadata) - Method in class org.apache.iceberg.BaseRewriteManifests
- apply(S) - Method in interface org.apache.iceberg.transforms.Transform
-
Transforms a value to its corresponding partition value.
- apply(S) - Method in class org.apache.iceberg.transforms.UnknownTransform
- applyFilters(List<ResolvedExpression>) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyLimit(long) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyNameMapping(MessageType, NameMapping) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- applyOverwrite(boolean) - Method in class org.apache.iceberg.flink.IcebergTableSink
- applyProjection(int[][]) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyPropertyChanges(UpdateProperties, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateProperties operation.
- applySchemaChanges(UpdateSchema, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateSchema operation.
- applyStaticPartition(Map<String, String>) - Method in class org.apache.iceberg.flink.IcebergTableSink
- ApplyTransformContext(IcebergSqlExtensionsParser.TransformContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- arguments - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- array(Schema, Schema) - Method in class org.apache.iceberg.avro.RemoveIds
- array(Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- array(ValueReader<T>) - Static method in class org.apache.iceberg.avro.ValueReaders
- array(ValueWriter<T>) - Static method in class org.apache.iceberg.avro.ValueWriters
- array(OrcValueReader<?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- array(Types.ListType, Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- array(P, Schema, T) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- arrayElementType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- arrayElementType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- arrayElementType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- arrayMap(ValueReader<K>, ValueReader<V>) - Static method in class org.apache.iceberg.avro.ValueReaders
- arrayMap(ValueWriter<K>, ValueWriter<V>) - Static method in class org.apache.iceberg.avro.ValueWriters
- ArrayUtil - Class in org.apache.iceberg.util
- ArrowAllocation - Class in org.apache.iceberg.arrow
- ArrowReader - Class in org.apache.iceberg.arrow.vectorized
-
Vectorized reader that returns an iterator of ColumnarBatch.
- ArrowReader(TableScan, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowReader
-
Create a new instance of the reader.
- ArrowSchemaUtil - Class in org.apache.iceberg.arrow
- ArrowVectorAccessor<DecimalT,Utf8StringT,ArrayT,ChildVectorT extends java.lang.AutoCloseable> - Class in org.apache.iceberg.arrow.vectorized
- ArrowVectorAccessor(ValueVector) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessor(ValueVector, ChildVectorT[]) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessors - Class in org.apache.iceberg.spark.data.vectorized
- as(String) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets the table identifier for the newly created Iceberg table.
- as(String) - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- asc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with nulls first.
- asc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with the given null order.
- asc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with nulls first.
- asc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- asc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, ascending with the given null order.
- asc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with the given null order.
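A minimal sketch of the asc variants above via Table.replaceSortOrder, assuming table is an already-loaded Table and the column names are hypothetical:

    import org.apache.iceberg.NullOrder;

    table.replaceSortOrder()
        .asc("id")                           // ascending, nulls first by default
        .asc("ts", NullOrder.NULLS_LAST)     // ascending with an explicit null order
        .commit();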
- ASC - org.apache.iceberg.SortDirection
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- asCombinedScanTask() - Method in interface org.apache.iceberg.CombinedScanTask
- asCombinedScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to CombinedScanTask if it is one.
- asDataTask() - Method in interface org.apache.iceberg.DataTask
- asDataTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to DataTask if it is one.
- asFileScanTask() - Method in interface org.apache.iceberg.FileScanTask
- asFileScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to FileScanTask if it is one.
- asListType() - Method in interface org.apache.iceberg.types.Type
- asListType() - Method in class org.apache.iceberg.types.Types.ListType
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asMappedFields() - Method in class org.apache.iceberg.mapping.NameMapping
- asMapType() - Method in interface org.apache.iceberg.types.Type
- asMapType() - Method in class org.apache.iceberg.types.Types.MapType
- asNestedType() - Method in interface org.apache.iceberg.types.Type
- asNestedType() - Method in class org.apache.iceberg.types.Type.NestedType
- asOfTime(long) - Method in class org.apache.iceberg.AllDataFilesTable.AllDataFilesTableScan
- asOfTime(long) - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- asOfTime(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- asOfTime(long) - Method in class org.apache.iceberg.FindFiles.Builder
-
Base results on files in the snapshot that was current as of a timestamp.
- asOfTime(long) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- asOfTime(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this scan's configuration that will use the most recent snapshot as of the given time in milliseconds.
- asOfTimestamp() - Method in class org.apache.iceberg.spark.SparkReadConf
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- asOptional() - Method in class org.apache.iceberg.types.Types.NestedField
- asPrimitiveType() - Method in interface org.apache.iceberg.types.Type
- asPrimitiveType() - Method in class org.apache.iceberg.types.Type.PrimitiveType
- asRequired() - Method in class org.apache.iceberg.types.Types.NestedField
- asResult() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- assignFreshIds(int, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a schema.
- assignFreshIds(Schema, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns ids to match a given schema, and fresh ids from the nextId function for all other fields.
- assignFreshIds(Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a schema.
- assignFreshIds(Type, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a type.
- assignIncreasingFreshIds(Schema) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns strictly increasing fresh ids for all fields in a schema, starting from 1.
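A minimal sketch of TypeUtil.assignIncreasingFreshIds; the placeholder ids 10 and 20 are hypothetical and are replaced with 1 and 2:

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.TypeUtil;
    import org.apache.iceberg.types.Types;

    Schema draft = new Schema(
        Types.NestedField.required(10, "id", Types.LongType.get()),
        Types.NestedField.optional(20, "name", Types.StringType.get()));
    Schema withFreshIds = TypeUtil.assignIncreasingFreshIds(draft);    // field ids become 1 and 2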
- assignUUID() - Method in class org.apache.iceberg.TableMetadata.Builder
- AssignUUID(String) - Constructor for class org.apache.iceberg.MetadataUpdate.AssignUUID
- asStatic() - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this field as a StaticField.
- asStatic() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a StaticMethod.
- asStruct() - Method in class org.apache.iceberg.Schema
-
Returns the underlying struct type for this schema.
- asStructLike(Record) - Method in class org.apache.iceberg.data.GenericDeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.data.DeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the data as a StructLike.
- asStructLikeKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the passed-in key of a row as a StructLike.
- asStructType() - Method in interface org.apache.iceberg.types.Type
- asStructType() - Method in class org.apache.iceberg.types.Types.StructType
- AssumeRoleAwsClientFactory - Class in org.apache.iceberg.aws
- AssumeRoleAwsClientFactory() - Constructor for class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- asSummaryString() - Method in class org.apache.iceberg.flink.IcebergTableSink
- asSummaryString() - Method in class org.apache.iceberg.flink.IcebergTableSource
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- Avro - Class in org.apache.iceberg.avro
- AVRO - org.apache.iceberg.FileFormat
- AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- Avro.DataWriteBuilder - Class in org.apache.iceberg.avro
- Avro.DeleteWriteBuilder - Class in org.apache.iceberg.avro
- Avro.ReadBuilder - Class in org.apache.iceberg.avro
- Avro.WriteBuilder - Class in org.apache.iceberg.avro
- AvroEncoderUtil - Class in org.apache.iceberg.avro
- AvroIterable<D> - Class in org.apache.iceberg.avro
- AvroMetrics - Class in org.apache.iceberg.avro
- AvroSchemaUtil - Class in org.apache.iceberg.avro
- AvroSchemaVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaVisitor
- AvroSchemaWithTypeVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaWithTypeVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- AvroWithFlinkSchemaVisitor<T> - Class in org.apache.iceberg.flink.data
- AvroWithFlinkSchemaVisitor() - Constructor for class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- AvroWithPartnerByStructureVisitor<P,T> - Class in org.apache.iceberg.avro
-
An abstract Avro schema visitor with a partner type.
- AvroWithPartnerByStructureVisitor() - Constructor for class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- AvroWithSparkSchemaVisitor<T> - Class in org.apache.iceberg.spark.data
- AvroWithSparkSchemaVisitor() - Constructor for class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- AwsClientFactories - Class in org.apache.iceberg.aws
- AwsClientFactory - Interface in org.apache.iceberg.aws
-
Interface to customize AWS clients used by Iceberg.
- AwsProperties - Class in org.apache.iceberg.aws
- AwsProperties() - Constructor for class org.apache.iceberg.aws.AwsProperties
- AwsProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aws.AwsProperties
B
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BACKQUOTED_IDENTIFIER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- BASE_NAMESPACE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- BaseBatchReader<T> - Class in org.apache.iceberg.arrow.vectorized
-
A base BatchReader class that contains common functionality.
- BaseBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- BaseColumnIterator - Class in org.apache.iceberg.parquet
- BaseColumnIterator(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.BaseColumnIterator
- BaseCombinedScanTask - Class in org.apache.iceberg
- BaseCombinedScanTask(List<FileScanTask>) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseCombinedScanTask(FileScanTask...) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseDeleteOrphanFilesActionResult - Class in org.apache.iceberg.actions
- BaseDeleteOrphanFilesActionResult(Iterable<String>) - Constructor for class org.apache.iceberg.actions.BaseDeleteOrphanFilesActionResult
- BaseDeleteOrphanFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An action that removes orphan metadata and data files by listing a given location and comparing the actual files in that location with data and metadata files referenced by all valid snapshots.
- BaseDeleteOrphanFilesSparkAction(SparkSession, Table) - Constructor for class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- BaseDeleteReachableFilesActionResult - Class in org.apache.iceberg.actions
- BaseDeleteReachableFilesActionResult(long, long, long, long) - Constructor for class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- BaseDeleteReachableFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An implementation of DeleteReachableFiles that uses metadata tables in Spark to determine which files should be deleted.
- BaseDeleteReachableFilesSparkAction(SparkSession, String) - Constructor for class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- BaseEqualityDeltaWriter(StructLike, Schema, Schema) - Constructor for class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- BaseExpireSnapshotsActionResult - Class in org.apache.iceberg.actions
- BaseExpireSnapshotsActionResult(long, long, long) - Constructor for class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- BaseExpireSnapshotsSparkAction - Class in org.apache.iceberg.spark.actions
-
An action that performs the same operation as ExpireSnapshots but uses Spark to determine the delta in files between the pre- and post-expiration table metadata.
- BaseExpireSnapshotsSparkAction(SparkSession, Table) - Constructor for class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- BaseFileGroupRewriteResult - Class in org.apache.iceberg.actions
- BaseFileGroupRewriteResult(RewriteDataFiles.FileGroupInfo, int, int) - Constructor for class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- BaseFileWriterFactory<T> - Class in org.apache.iceberg.data
-
A base writer factory to be extended by query engine integrations.
- BaseFileWriterFactory(Table, FileFormat, Schema, SortOrder, FileFormat, int[], Schema, SortOrder, Schema) - Constructor for class org.apache.iceberg.data.BaseFileWriterFactory
- BaseLockManager() - Constructor for class org.apache.iceberg.util.LockManagers.BaseLockManager
- BaseMetastoreCatalog - Class in org.apache.iceberg
- BaseMetastoreCatalog() - Constructor for class org.apache.iceberg.BaseMetastoreCatalog
- BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder - Class in org.apache.iceberg
- BaseMetastoreCatalogTableBuilder(TableIdentifier, Schema) - Constructor for class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- BaseMetastoreTableOperations - Class in org.apache.iceberg
- BaseMetastoreTableOperations() - Constructor for class org.apache.iceberg.BaseMetastoreTableOperations
- BaseMetastoreTableOperations.CommitStatus - Enum in org.apache.iceberg
- BaseMigrateTableActionResult - Class in org.apache.iceberg.actions
- BaseMigrateTableActionResult(long) - Constructor for class org.apache.iceberg.actions.BaseMigrateTableActionResult
- BaseMigrateTableSparkAction - Class in org.apache.iceberg.spark.actions
-
Takes a Spark table in the source catalog and attempts to transform it into an Iceberg table in the same location with the same identifier.
- BaseMigrateTableSparkAction(SparkSession, CatalogPlugin, Identifier) - Constructor for class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- BaseOverwriteFiles - Class in org.apache.iceberg
- BaseOverwriteFiles(String, TableOperations) - Constructor for class org.apache.iceberg.BaseOverwriteFiles
- BasePageIterator - Class in org.apache.iceberg.parquet
- BasePageIterator(ColumnDescriptor, String) - Constructor for class org.apache.iceberg.parquet.BasePageIterator
- BasePageIterator.IntIterator - Class in org.apache.iceberg.parquet
- BaseParquetReaders<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetReaders() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetReaders
- BaseParquetWriter<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetWriter() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetWriter
- BasePositionDeltaWriter<T> - Class in org.apache.iceberg.io
- BasePositionDeltaWriter(PartitioningWriter<T, DataWriteResult>, PartitioningWriter<PositionDelete<T>, DeleteWriteResult>) - Constructor for class org.apache.iceberg.io.BasePositionDeltaWriter
- BaseReplacePartitions - Class in org.apache.iceberg
- BaseReplaceSortOrder - Class in org.apache.iceberg
- BaseRewriteDataFilesAction<ThisT> - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesAction(Table) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- BaseRewriteDataFilesFileGroupInfo - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesFileGroupInfo(int, int, StructLike) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- BaseRewriteDataFilesResult - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesResult(List<RewriteDataFiles.FileGroupRewriteResult>) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesResult
- BaseRewriteDataFilesSpark3Action - Class in org.apache.iceberg.spark.actions
- BaseRewriteDataFilesSpark3Action(SparkSession, Table) - Constructor for class org.apache.iceberg.spark.actions.BaseRewriteDataFilesSpark3Action
- BaseRewriteManifests - Class in org.apache.iceberg
- BaseRewriteManifestsActionResult - Class in org.apache.iceberg.actions
- BaseRewriteManifestsActionResult(Iterable<ManifestFile>, Iterable<ManifestFile>) - Constructor for class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- BaseRewriteManifestsSparkAction - Class in org.apache.iceberg.spark.actions
-
An action that rewrites manifests in a distributed manner and co-locates metadata for partitions.
- BaseRewriteManifestsSparkAction(SparkSession, Table) - Constructor for class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- BaseSnapshotTableActionResult - Class in org.apache.iceberg.actions
- BaseSnapshotTableActionResult(long) - Constructor for class org.apache.iceberg.actions.BaseSnapshotTableActionResult
- BaseSnapshotTableSparkAction - Class in org.apache.iceberg.spark.actions
-
Creates a new Iceberg table based on a source Spark table.
- BaseSnapshotTableSparkAction(SparkSession, CatalogPlugin, Identifier, CatalogPlugin, Identifier) - Constructor for class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- BaseTable - Class in org.apache.iceberg
-
Base Table implementation.
- BaseTable(TableOperations, String) - Constructor for class org.apache.iceberg.BaseTable
- BaseTaskWriter<T> - Class in org.apache.iceberg.io
- BaseTaskWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.BaseTaskWriter
- BaseTaskWriter.BaseEqualityDeltaWriter - Class in org.apache.iceberg.io
-
Base equality delta writer to write both insert records and equality-deletes.
- BaseTaskWriter.RollingEqDeleteWriter - Class in org.apache.iceberg.io
- BaseTaskWriter.RollingFileWriter - Class in org.apache.iceberg.io
- BaseVectorizedParquetValuesReader - Class in org.apache.iceberg.arrow.vectorized.parquet
-
A values reader for Parquet's run-length encoded data that reads column data in batches instead of one value at a time.
- BaseVectorizedParquetValuesReader(int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BatchReader
- before(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- before(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- beforeElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGDECIMAL_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BigDecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGINT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BigIntLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BINARY - org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
- BINARY - org.apache.iceberg.types.Type.TypeID
- BinaryAsDecimalReader(ColumnDescriptor, int) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BinaryAsDecimalReader
- BinaryType() - Constructor for class org.apache.iceberg.types.Types.BinaryType
- BinaryUtil - Class in org.apache.iceberg.util
- bind(Object) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- bind(Object) - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this method as a BoundMethod for the given receiver.
- bind(Object) - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a BoundMethod for the given receiver.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.NamedReference
- bind(Types.StructType, boolean) - Method in interface org.apache.iceberg.expressions.Unbound
-
Bind this value expression to concrete types.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundPredicate
-
Bind this UnboundPredicate.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundTransform
- bind(Types.StructType, Expression, boolean) - Static method in class org.apache.iceberg.expressions.Binder
-
Replaces all unbound/named references with bound references to fields in the given struct.
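For illustration, a minimal sketch of binding a predicate against a schema (the schema and predicate below are hypothetical, and the snippet is assumed to run inside a method):
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.expressions.Binder;
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()));
    Expression unbound = Expressions.equal("id", 5L);
    // Replaces the named reference "id" with a bound reference to field 1.
    Expression bound = Binder.bind(schema.asStruct(), unbound, true /* case sensitive */);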
- Binder - Class in org.apache.iceberg.expressions
-
Rewrites expressions by replacing unbound named references with references to fields in a struct schema.
- binPack() - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
Choose BINPACK as a strategy for this rewrite operation.
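A hedged sketch of triggering a bin-pack rewrite; the SparkActions.get(spark) entry point, the Table named table, and the option name are assumptions about the Spark integration in use:
    import org.apache.iceberg.actions.RewriteDataFiles;
    import org.apache.iceberg.spark.actions.SparkActions;

    RewriteDataFiles.Result result = SparkActions.get(spark)
        .rewriteDataFiles(table)
        .binPack()                                     // use the BINPACK strategy
        .option("target-file-size-bytes", "536870912") // option name is an assumption
        .execute();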
- BinPacking - Class in org.apache.iceberg.util
- BinPacking() - Constructor for class org.apache.iceberg.util.BinPacking
- BinPacking.ListPacker<T> - Class in org.apache.iceberg.util
- BinPacking.PackingIterable<T> - Class in org.apache.iceberg.util
- binPackStrategy() - Method in class org.apache.iceberg.spark.actions.BaseRewriteDataFilesSpark3Action
- BinPackStrategy - Class in org.apache.iceberg.actions
-
A rewrite strategy for data files which determines which files to rewrite based on their size.
- BinPackStrategy() - Constructor for class org.apache.iceberg.actions.BinPackStrategy
- blockLocations(CombinedScanTask, Configuration) - Static method in class org.apache.iceberg.hadoop.Util
- blockLocations(FileIO, CombinedScanTask) - Static method in class org.apache.iceberg.hadoop.Util
- BOOLEAN - org.apache.iceberg.types.Type.TypeID
- booleanBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- BooleanBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BooleanBatchReader
- BooleanLiteralContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleans() - Static method in class org.apache.iceberg.avro.ValueReaders
- booleans() - Static method in class org.apache.iceberg.avro.ValueWriters
- booleans() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- booleans() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- booleans(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- BooleanType() - Constructor for class org.apache.iceberg.types.Types.BooleanType
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BooleanValueContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- Bound<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound value expression.
- BoundExpressionVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- BoundLiteralPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate(Expression.Operation, BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.BoundPredicate
- BoundReference<T> - Class in org.apache.iceberg.expressions
- boundReferences(Types.StructType, List<Expression>, boolean) - Static method in class org.apache.iceberg.expressions.Binder
- BoundSetPredicate<T> - Class in org.apache.iceberg.expressions
- BoundTerm<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound term.
- BoundTransform<S,T> - Class in org.apache.iceberg.expressions
-
A transform expression.
- BoundUnaryPredicate<T> - Class in org.apache.iceberg.expressions
- BoundVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- bucket() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
-
Return OSS bucket name.
- bucket(int, String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int) - Static method in class org.apache.iceberg.expressions.Expressions
- bucket(String, int) - Method in class org.apache.iceberg.PartitionSpec.Builder
- bucket(String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- bucket(String, int, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- bucket(Type, int) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a bucket Transform for the given type and number of buckets.
- buffer() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
-
Opaque blob representing metadata about a file's encryption key.
- build() - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- build() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws RuntimeException if one was not found.
- build() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- build() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a UnboundField or throws a NoSuchFieldException if there is none.
- build() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a UnboundMethod or throws a RuntimeError if there is none.
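For example, a small sketch of the dynamic-method pattern; the target method here (List.size()) is purely illustrative:
    import java.util.Arrays;
    import java.util.List;
    import org.apache.iceberg.common.DynMethods;

    // Resolve List.size() reflectively; build() throws if no candidate implementation is found.
    DynMethods.UnboundMethod size = DynMethods.builder("size")
        .impl(List.class)
        .build();
    int n = size.bind(Arrays.asList("a", "b")).invoke();  // n == 2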
- build() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- build() - Method in class org.apache.iceberg.DataFiles.Builder
- build() - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.FileMetadata.Builder
- build() - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Deprecated. This will be removed in 0.14.0; use FlinkSink.Builder.append() because its returned DataStreamSink has a more correct data type.
- build() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- build() - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.GenericManifestFile.CopyBuilder
- build() - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- build() - Method in class org.apache.iceberg.io.WriteResult.Builder
- build() - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- build() - Method in class org.apache.iceberg.PartitionSpec.Builder
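As a sketch, a partition spec is typically assembled through this builder and finished with build(); the schema and column names below are hypothetical:
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.required(2, "event_time", Types.TimestampType.withZone()));

    // Partition by day of event_time plus 16 hash buckets of id.
    PartitionSpec spec = PartitionSpec.builderFor(schema)
        .day("event_time")
        .bucket("id", 16)
        .build();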
- build() - Method in class org.apache.iceberg.ScanSummary.Builder
-
Summarizes a table scan as a map of partition key to metrics for that partition.
- build() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- build() - Method in class org.apache.iceberg.SortOrder.Builder
- build() - Method in interface org.apache.iceberg.spark.procedures.SparkProcedures.ProcedureBuilder
- build() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- build() - Method in class org.apache.iceberg.TableMetadata.Builder
- build() - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriteBuilder
-
Returns a logical delta write.
- build() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperationBuilder
-
Returns a row-level operation that controls how Spark rewrites data for DELETE, UPDATE, MERGE commands.
- build(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundMethod or throws a RuntimeException if there is none.
- build(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a RuntimeError if there is none.
- buildAvroProjection(Schema, Schema, Map<String, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- buildChecked() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws ClassNotFoundException if one was not found.
- buildChecked() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- buildChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a UnboundField or throws a NoSuchFieldException if there is none.
- buildChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a UnboundMethod or throws a NoSuchMethodException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundMethod or throws a NoSuchMethodException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a NoSuchMethodException if there is none.
- buildCopyOnWriteDistribution(Table, RowLevelOperation.Command, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildCopyOnWriteOrdering(Table, RowLevelOperation.Command, Distribution) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildCopyOnWriteScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- builder() - Static method in class org.apache.iceberg.common.DynClasses
- builder() - Static method in class org.apache.iceberg.common.DynConstructors
- builder() - Static method in class org.apache.iceberg.common.DynFields
- builder() - Static method in class org.apache.iceberg.io.WriteResult
- builder() - Static method in class org.apache.iceberg.SnapshotSummary
- builder() - Static method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- builder(Class<?>) - Static method in class org.apache.iceberg.common.DynConstructors
- builder(String) - Static method in class org.apache.iceberg.common.DynMethods
-
Constructs a new builder for calling methods dynamically.
- builder(PartitionSpec) - Static method in class org.apache.iceberg.DataFiles
- Builder() - Constructor for class org.apache.iceberg.common.DynClasses.Builder
- Builder() - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder() - Constructor for class org.apache.iceberg.common.DynFields.Builder
- Builder() - Constructor for class org.apache.iceberg.flink.source.FlinkSource.Builder
- Builder() - Constructor for class org.apache.iceberg.SnapshotSummary.Builder
- Builder(int) - Constructor for class org.apache.iceberg.DoubleFieldMetrics.Builder
- Builder(int) - Constructor for class org.apache.iceberg.FloatFieldMetrics.Builder
- Builder(Class<?>) - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder(Iterable<I>) - Constructor for class org.apache.iceberg.util.Tasks.Builder
- Builder(String) - Constructor for class org.apache.iceberg.common.DynMethods.Builder
- Builder(PartitionSpec) - Constructor for class org.apache.iceberg.DataFiles.Builder
- Builder(Table) - Constructor for class org.apache.iceberg.FindFiles.Builder
- Builder(TableScan) - Constructor for class org.apache.iceberg.ScanSummary.Builder
- builderFor(int) - Method in class org.apache.iceberg.FloatFieldMetrics
- builderFor(DataStream<T>, MapFunction<T, RowData>, TypeInformation<RowData>) - Static method in class org.apache.iceberg.flink.sink.FlinkSink
-
Initialize a FlinkSink.Builder to export the data from a generic input data stream into an Iceberg table.
- builderFor(Schema) - Static method in class org.apache.iceberg.PartitionSpec
-
Creates a new partition spec builder for the given Schema.
- builderFor(Schema) - Static method in class org.apache.iceberg.SortOrder
-
Creates a new sort order builder for the given Schema.
- builderFor(Table, int, long) - Static method in class org.apache.iceberg.io.OutputFileFactory
- buildFormat() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- buildFrom(TableMetadata) - Static method in class org.apache.iceberg.TableMetadata
- buildIcebergCatalog(String, Map<String, String>, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Build an Iceberg Catalog based on a map of catalog properties and optional Hadoop configuration.
- buildIcebergCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg Catalog to be used by this Spark catalog adapter.
- buildIdentifier(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg TableIdentifier for the given Spark identifier.
- buildList(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- buildList(List<E>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- buildMap(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- buildMap(Map<K, V>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- buildMergeOnReadScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildOrcProjection(Schema, TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Converts an Iceberg schema to a corresponding ORC schema within the context of an existing ORC file schema.
- buildOtherMetadataFileDF(Table) - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- buildPositionDeltaDistribution(Table, RowLevelOperation.Command, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildPositionDeltaOrdering(Table, RowLevelOperation.Command, Distribution) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildPositionWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- buildReader(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkOrcReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroValueReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(Schema, MessageType, boolean) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, boolean, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, boolean, Map<Integer, ?>, DeleteFilter<InternalRow>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(MessageType, Schema, Map<Integer, Object>) - Static method in class org.apache.iceberg.pig.PigParquetReader
- buildReplacement(Schema, PartitionSpec, SortOrder, String, Map<String, String>) - Method in class org.apache.iceberg.TableMetadata
- buildRequiredDistribution(Table, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildRequiredOrdering(Table, Distribution) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildSortOrder(Schema, PartitionSpec, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSortOrder(Table) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSortOrder(Table, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSparkCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
-
Build a SparkCatalog to be used for Iceberg operations.
- buildStatic() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a StaticField or throws a RuntimeException if there is none.
- buildStatic() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a RuntimeException if there is none.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a StaticField or throws a NoSuchFieldException if there is none.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a NoSuchMethodException if there is none.
- buildStruct(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- buildTable(String, Schema) - Method in class org.apache.iceberg.hadoop.HadoopTables
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.CachingCatalog
- buildTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Instantiate a builder to either create a table or start a create/replace transaction.
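A hedged sketch of the builder flow; the catalog, namespace, schema, and spec variables are placeholders:
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;

    Table table = catalog.buildTable(TableIdentifier.of("db", "sample"), schema)
        .withPartitionSpec(spec)
        .create();  // or createTransaction() to stage the create instead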
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- buildWriter(LogicalType, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetWriters
- buildWriter(RowType, Schema) - Static method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- buildWriter(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroWriter
- buildWriter(StructType, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetWriters
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- byId() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from field ID to full name.
- byName() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from full field name to ID.
- ByteArrayReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.ByteArrayReader
- byteArrays() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueReaders
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueWriters
- byteBuffers() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- byteBuffers(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- ByteBuffers - Class in org.apache.iceberg.util
- bytes() - Static method in class org.apache.iceberg.avro.ValueReaders
- bytes() - Static method in class org.apache.iceberg.avro.ValueWriters
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- bytes() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- BytesReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BytesReader
C
- CACHE_ENABLED - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls whether the catalog will cache table entries upon load.
- CACHE_ENABLED - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- CACHE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls the duration for which entries in the catalog are cached.
- CACHE_EXPIRATION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS_OFF - Static variable in class org.apache.iceberg.CatalogProperties
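For illustration, these cache properties are supplied as string key/value pairs when constructing a catalog; the millisecond value below is an arbitrary example:
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.iceberg.CatalogProperties;

    Map<String, String> props = new HashMap<>();
    props.put(CatalogProperties.CACHE_ENABLED, "true");
    // Cached table entries expire after 10 minutes.
    props.put(CatalogProperties.CACHE_EXPIRATION_INTERVAL_MS, "600000");
    // props could then be passed to CatalogUtil.buildIcebergCatalog(name, props, hadoopConf).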
- CachedClientPool - Class in org.apache.iceberg.hive
- CachingCatalog - Class in org.apache.iceberg
-
Class that wraps an Iceberg Catalog to cache tables.
- CachingCatalog(Catalog, boolean, long, Ticker) - Constructor for class org.apache.iceberg.CachingCatalog
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- call(InternalRow) - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Executes this procedure.
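From SQL, these procedures are invoked with the CALL syntax; a hedged example, where the catalog and table names are placeholders and an active SparkSession named spark is assumed:
    // Expire snapshots older than the given timestamp via the stored procedure.
    spark.sql(
        "CALL my_catalog.system.expire_snapshots(" +
        "table => 'db.sample', older_than => TIMESTAMP '2021-06-30 00:00:00')");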
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callArgument(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CallArgumentContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallArgumentContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callInit() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- cancel() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- canContainAny(ManifestFile, Iterable<StructLike>, Function<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canContainAny(ManifestFile, Iterable<Pair<Integer, StructLike>>, Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canDeleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- canTransform(Type) - Method in interface org.apache.iceberg.transforms.Transform
-
Checks whether this function can be applied to the given Type.
- canTransform(Type) - Method in class org.apache.iceberg.transforms.UnknownTransform
- capabilities() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- capabilities() - Method in class org.apache.iceberg.spark.source.SparkTable
- CASE_SENSITIVE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CASE_SENSITIVE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- caseInsensitive() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- caseInsensitive() - Method in class org.apache.iceberg.FindFiles.Builder
- caseInsensitive() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseInsensitiveField(String) - Method in class org.apache.iceberg.types.Types.StructType
- caseInsensitiveFindField(String) - Method in class org.apache.iceberg.Schema
-
Returns a sub-field by name as a Types.NestedField.
- caseInsensitiveSelect(String...) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case-insensitive names.
- caseInsensitiveSelect(Collection<String>) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case-insensitive names.
- caseSensitive() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- caseSensitive() - Method in class org.apache.iceberg.spark.SparkReadConf
- caseSensitive(boolean) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Sets whether expression binding for this rewrite action is case sensitive.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.DeleteFiles
-
Enables or disables case sensitive expression binding for methods that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.FindFiles.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.ManifestReader
- caseSensitive(boolean) - Method in class org.apache.iceberg.MicroBatches.MicroBatchBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.OverwriteFiles
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.RowDelta
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.SortOrder.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this that, if data columns were selected via TableScan.select(java.util.Collection), controls whether the match to the schema will be done with case sensitivity.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Set whether column resolution in the source schema should be case sensitive.
- castAndThrow(Throwable, Class<E>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- catalog() - Method in class org.apache.iceberg.flink.FlinkCatalog
- catalog() - Method in class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- Catalog - Interface in org.apache.iceberg.catalog
-
A Catalog API for table create, drop, and load operations.
- CATALOG - Static variable in class org.apache.iceberg.mr.InputFormatConfig
-
Deprecated. Please use InputFormatConfig.catalogPropertyConfigKey(String, String) with config key CatalogUtil.ICEBERG_CATALOG_TYPE to specify the type of a catalog.
- CATALOG_CONFIG_PREFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CATALOG_IMPL - Static variable in class org.apache.iceberg.CatalogProperties
- CATALOG_LOADER_CLASS - Static variable in class org.apache.iceberg.mr.InputFormatConfig
-
Deprecated. Please use InputFormatConfig.catalogPropertyConfigKey(String, String) with config key CatalogProperties.CATALOG_IMPL to specify the implementation of a catalog.
- CATALOG_NAME - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- Catalog.TableBuilder - Interface in org.apache.iceberg.catalog
-
A builder used to create valid tables or start create/replace transactions.
- catalogAndIdentifier(String, SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(String, SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(List<String>, Function<String, C>, BiFunction<String[], String, T>, C, String[]) - Static method in class org.apache.iceberg.spark.SparkUtil
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply; attempts to find the catalog and identifier that a multipart identifier represents.
- catalogAndIdentifier(SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply; attempts to find the catalog and identifier that a multipart identifier represents.
- CatalogAndIdentifier(Pair<CatalogPlugin, Identifier>) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogAndIdentifier(CatalogPlugin, Identifier) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogLoader - Interface in org.apache.iceberg.flink
-
Serializable loader to load an Iceberg Catalog.
- CatalogLoader.CustomCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HadoopCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HiveCatalogLoader - Class in org.apache.iceberg.flink
- catalogName(Configuration, String) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the catalog name serialized to the configuration.
- CatalogProperties - Class in org.apache.iceberg
- catalogPropertyConfigKey(String, String) - Static method in class org.apache.iceberg.mr.InputFormatConfig
-
Get the Hadoop config key of a catalog property based on the catalog name.
- Catalogs - Class in org.apache.iceberg.mr
-
Class for catalog resolution and accessing the common functions for the Catalog API.
- CatalogUtil - Class in org.apache.iceberg
- CHANGED_PARTITION_COUNT_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- CHANGED_PARTITION_PREFIX - Static variable in class org.apache.iceberg.SnapshotSummary
- changes() - Method in class org.apache.iceberg.TableMetadata
- channelNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- channelReadChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- channelWriteChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- charAt(int) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- charSequences() - Static method in class org.apache.iceberg.types.Comparators
- CharSequenceSet - Class in org.apache.iceberg.util
- CharSequenceWrapper - Class in org.apache.iceberg.util
-
Wrapper class to adapt CharSequence for use in maps and sets.
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.NoSuchIcebergTableException
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.ValidationException
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_NULLABILITY_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_ORDERING_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- checkAndSetIoConfig(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it populates the FileIO's hadoop configuration with the input config object.
- checkAndSkipIoConfigSerialization(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it ensures that the FileIO's hadoop configuration will not be serialized.
- checkCommitStatus(String, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
-
Attempt to load the table and see if any current or past metadata location matches the one we were attempting to set.
- checkCompatibility(SortOrder, Schema) - Static method in class org.apache.iceberg.SortOrder
- CheckCompatibility - Class in org.apache.iceberg.types
- checkNullability() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOrdering() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOutputSpecs(FileSystem, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- cherrypick(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Apply supported changes in given snapshot and create a new snapshot which will be set as the current snapshot on commit.
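A minimal sketch, assuming a Table named table and the ID of a staged snapshot:
    // Apply the staged snapshot's changes and make the result the current snapshot.
    table.manageSnapshots()
        .cherrypick(stagedSnapshotId)
        .commit();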
- cherrypick(long) - Method in class org.apache.iceberg.SnapshotManager
- CherrypickAncestorCommitException - Exception in org.apache.iceberg.exceptions
-
This exception occurs when one cherrypicks an ancestor or when the picked snapshot is already linked to a published ancestor.
- CherrypickAncestorCommitException(long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- CherrypickAncestorCommitException(long, long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- child() - Method in class org.apache.iceberg.expressions.Not
- childColumn(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- classLoader(ClassLoader) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- cleanExpiredFiles(boolean) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Allows expiration of snapshots without any cleanup of underlying manifest or data files.
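For example (a sketch assuming a Table named table), snapshot metadata can be expired while leaving file cleanup to a separate process:
    import java.util.concurrent.TimeUnit;

    table.expireSnapshots()
        .expireOlderThan(System.currentTimeMillis() - TimeUnit.DAYS.toMillis(7))
        .cleanExpiredFiles(false)  // keep manifests and data files; only drop snapshot metadata
        .commit();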
- cleanUncommitted(Set<ManifestFile>) - Method in class org.apache.iceberg.BaseRewriteManifests
- clear() - Method in class org.apache.iceberg.DataFiles.Builder
- clear() - Method in class org.apache.iceberg.FileMetadata.Builder
- clear() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- clear() - Method in class org.apache.iceberg.util.CharSequenceSet
- clear() - Method in class org.apache.iceberg.util.PartitionSet
- clear() - Method in class org.apache.iceberg.util.SerializableMap
- clear() - Method in class org.apache.iceberg.util.StructLikeMap
- clear() - Method in class org.apache.iceberg.util.StructLikeSet
- clearRewrite(Table, String) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
- CLIENT_ACCESS_KEY_ID - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_ACCESS_KEY_SECRET - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_ASSUME_ROLE_ARN - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_EXTERNAL_ID - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_REGION - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- CLIENT_ENABLE_ETAG_CHECK_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
The implementation class of AliyunClientFactory to customize Aliyun client configurations.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The implementation class of AwsClientFactory to customize AWS client configurations.
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_SIZE - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- clientLibToken() - Method in class org.apache.iceberg.gcp.GCPProperties
- ClientPool<C,E extends java.lang.Exception> - Interface in org.apache.iceberg
- ClientPool.Action<R,C,E extends java.lang.Exception> - Interface in org.apache.iceberg
- ClientPoolImpl<C,E extends java.lang.Exception> - Class in org.apache.iceberg
- ClientPoolImpl(int, Class<? extends E>, boolean) - Constructor for class org.apache.iceberg.ClientPoolImpl
- clone(RowData, RowData, RowType, TypeSerializer[]) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
-
Similar to the private RowDataSerializer.copyRowData(RowData, RowData) method.
- close() - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- close() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Called to close all the columns in this batch.
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
- close() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- close() - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- close() - Method in class org.apache.iceberg.aws.s3.S3FileIO
- close() - Method in class org.apache.iceberg.ClientPoolImpl
- close() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- close() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- close() - Method in class org.apache.iceberg.flink.FlinkCatalog
- close() - Method in class org.apache.iceberg.flink.source.DataIterator
- close() - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- close() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- close() - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- close() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- close() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- close() - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- close() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- close() - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- close() - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- close() - Method in class org.apache.iceberg.io.CloseableGroup
-
Close all the registered resources.
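A small sketch of the pattern, assuming resources are registered via addCloseable, the stream variables are placeholders, and the enclosing method handles IOException:
    import org.apache.iceberg.io.CloseableGroup;

    CloseableGroup closeables = new CloseableGroup();
    closeables.addCloseable(firstStream);   // any java.io.Closeable
    closeables.addCloseable(secondStream);
    // ... use the resources ...
    closeables.close();  // closes everything registered above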
- close() - Method in class org.apache.iceberg.io.DataWriter
- close() - Method in interface org.apache.iceberg.io.FileIO
-
Close File IO to release underlying resources.
- close() - Method in class org.apache.iceberg.io.FilterIterator
- close() - Method in class org.apache.iceberg.io.PartitionedFanoutWriter
- close() - Method in class org.apache.iceberg.io.PartitionedWriter
- close() - Method in class org.apache.iceberg.io.ResolvingFileIO
- close() - Method in class org.apache.iceberg.io.UnpartitionedWriter
- close() - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- close() - Method in class org.apache.iceberg.ManifestWriter
- close() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- close() - Method in class org.apache.iceberg.nessie.NessieCatalog
- close() - Method in class org.apache.iceberg.orc.VectorizedRowBatchIterator
- close() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- close() - Method in interface org.apache.iceberg.parquet.VectorizedReader
-
Release any resources allocated.
- close() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- close() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- close() - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- close(C) - Method in class org.apache.iceberg.ClientPoolImpl
- close(Closeable, boolean) - Static method in class org.apache.iceberg.util.Exceptions
- close(IMetaStoreClient) - Method in class org.apache.iceberg.hive.HiveClientPool
- CloseableGroup - Class in org.apache.iceberg.io
-
This class acts as a helper for handling the closure of multiple resources.
- CloseableGroup() - Constructor for class org.apache.iceberg.io.CloseableGroup
- CloseableIterable<T> - Interface in org.apache.iceberg.io
- CloseableIterable.ConcatCloseableIterable<E> - Class in org.apache.iceberg.io
- CloseableIterator<T> - Interface in org.apache.iceberg.io
- closeVectors() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- ClosingIterator<T> - Class in org.apache.iceberg.io
-
A convenience wrapper around CloseableIterator, providing auto-close functionality when all of the elements in the iterator are consumed.
- ClosingIterator(CloseableIterator<T>) - Constructor for class org.apache.iceberg.io.ClosingIterator
- clusterBy(Function<DataFile, Object>) - Method in class org.apache.iceberg.BaseRewriteManifests
- clusterBy(Function<DataFile, Object>) - Method in interface org.apache.iceberg.RewriteManifests
-
Groups an existing DataFile by a cluster key produced by a function.
- ClusteredDataWriter<T> - Class in org.apache.iceberg.io
-
A data writer capable of writing to multiple specs and partitions that requires the incoming records to be properly clustered by partition spec and by partition within each spec.
- ClusteredDataWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, FileFormat, long) - Constructor for class org.apache.iceberg.io.ClusteredDataWriter
- ClusteredEqualityDeleteWriter<T> - Class in org.apache.iceberg.io
-
An equality delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredEqualityDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, FileFormat, long) - Constructor for class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- ClusteredPositionDeleteWriter<T> - Class in org.apache.iceberg.io
-
A position delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredPositionDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, FileFormat, long) - Constructor for class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- clusterHadoopConf() - Static method in class org.apache.iceberg.flink.FlinkCatalogFactory
- collect() - Method in class org.apache.iceberg.FindFiles.Builder
-
Returns all files in the table that match all of the filters.
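A hedged sketch, assuming a Table named table and a column named category:
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.FindFiles;
    import org.apache.iceberg.expressions.Expressions;

    // Collect the data files that may contain rows matching the filter.
    Iterable<DataFile> files = FindFiles.in(table)
        .withRecordsMatching(Expressions.equal("category", "books"))
        .caseSensitive(false)
        .collect();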
- collections(int, int, ParquetValueWriter<E>) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- column - Variable in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column - Variable in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- column() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- column(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Returns the column at `ordinal`.
- COLUMN_SIZES - Static variable in interface org.apache.iceberg.DataFile
- ColumnarBatch - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnarBatch.
- ColumnarBatchReader - Class in org.apache.iceberg.spark.data.vectorized
-
VectorizedReader that returns Spark's ColumnarBatch to support Spark's vectorized read path.
- ColumnarBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- ColumnIterator<T> - Class in org.apache.iceberg.parquet
- columnMode(String) - Method in class org.apache.iceberg.MetricsConfig
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- columns() - Method in class org.apache.iceberg.Schema
-
Returns a List of the columns in this Schema.
- columnSizes() - Method in interface org.apache.iceberg.ContentFile
-
Returns a map from column ID to the size of the column in bytes if collected, or null otherwise.
- columnSizes() - Method in class org.apache.iceberg.Metrics
-
Get the number of bytes for all fields in a file.
- columnSizes() - Method in class org.apache.iceberg.spark.SparkDataFile
- ColumnVector - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnVector.
- ColumnVectorWithFilter - Class in org.apache.iceberg.spark.data.vectorized
- ColumnVectorWithFilter(VectorHolder, int[]) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- ColumnWriter<T> - Class in org.apache.iceberg.parquet
- combine(Iterable<E>, Closeable) - Static method in interface org.apache.iceberg.io.CloseableIterable
- CombinedScanTask - Interface in org.apache.iceberg
-
A scan task made of several ranges from files.
- command() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperation
-
Returns the actual SQL operation being performed.
- command() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperationInfo
-
Returns the SQL command (e.g.
- commit() - Method in class org.apache.iceberg.BaseReplaceSortOrder
- commit() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and commit.
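Every PendingUpdate follows the same shape: stage changes through the fluent API, then call commit() to apply them atomically. A sketch, assuming a Table named table:
    // UpdateProperties is one PendingUpdate implementation; commit() applies the change.
    table.updateProperties()
        .set("commit.retry.num-retries", "6")
        .commit();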
- commit() - Method in class org.apache.iceberg.SetLocation
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.StaticTableOperations
- commit(TableMetadata, TableMetadata) - Method in interface org.apache.iceberg.TableOperations
-
Replace the base table metadata with a new version.
- commit(Offset) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- COMMIT_FILE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_FILE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_MAX_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MAX_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TABLE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TABLE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TOTAL_RETRY_TIME_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TOTAL_RETRY_TIME_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- commitCreateTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- commitDropTable(Table, boolean) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- CommitFailedException - Exception in org.apache.iceberg.exceptions
-
Exception raised when a commit fails because of out of date metadata.
- CommitFailedException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- CommitFailedException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- commitFileGroups(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Perform a commit operation on the table, adding and removing files as required for this set of file groups.
- commitJob(JobContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Reads the commit files stored in the temp directories and collects the generated committed data files.
- commitOrClean(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- commitStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- commitStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- CommitStateUnknownException - Exception in org.apache.iceberg.exceptions
-
Exception for a failure to confirm either affirmatively or negatively that a commit was applied.
- CommitStateUnknownException(Throwable) - Constructor for exception org.apache.iceberg.exceptions.CommitStateUnknownException
- commitTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Collects the generated data files and creates a commit file storing the data file list.
- commitTransaction() - Method in interface org.apache.iceberg.Transaction
-
Apply the pending changes from all actions and commit.
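A sketch of the transaction flow, assuming a Table named table and a DataFile named dataFile:
    import org.apache.iceberg.Transaction;

    Transaction txn = table.newTransaction();
    txn.updateProperties().set("owner", "data-eng").commit();  // staged, not yet visible
    txn.newAppend().appendFile(dataFile).commit();             // staged, not yet visible
    txn.commitTransaction();  // applies all staged changes in a single table commit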
- comparator() - Method in interface org.apache.iceberg.expressions.BoundTerm
-
Returns a Comparator for values produced by this term.
- comparator() - Method in interface org.apache.iceberg.expressions.Literal
-
Return a Comparator for values.
- Comparators - Class in org.apache.iceberg.types
- CompatibilityTaskAttemptContextImpl(Configuration, TaskAttemptID, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat.CompatibilityTaskAttemptContextImpl
- compatibleWith(PartitionSpec) - Method in class org.apache.iceberg.PartitionSpec
-
Returns true if this spec is equivalent to the other, with partition field ids ignored.
- complete() - Method in class org.apache.iceberg.io.BaseTaskWriter
- complete() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data and delete files.
- COMPRESSION_FACTOR - Static variable in class org.apache.iceberg.spark.actions.Spark3SortStrategy
-
The number of shuffle partitions and consequently the number of output files created by the Spark Sort is based on the size of the input data files used in this rewrite operation.
- concat(Iterable<File>, File, int, Schema, Map<String, String>) - Static method in class org.apache.iceberg.parquet.Parquet
-
Combines several files into one
- concat(Iterable<CloseableIterable<E>>) - Static method in interface org.apache.iceberg.io.CloseableIterable
- conf() - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- conf() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- config(String, String) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- config(String, String) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
-
Deprecated. Please use #set(String, String) instead.
- CONFIG - Static variable in class org.apache.iceberg.flink.actions.Actions
- CONFIG_CLIENT_BUILDER_IMPL - Static variable in class org.apache.iceberg.nessie.NessieUtil
- CONFIG_SERIALIZATION_DISABLED - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CONFIG_SERIALIZATION_DISABLED_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- ConfigBuilder(Configuration) - Constructor for class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- ConfigProperties - Class in org.apache.iceberg.hadoop
- Configurable<C> - Interface in org.apache.iceberg.hadoop
-
Interface used to avoid runtime dependencies on Hadoop Configurable
- configure(Configuration) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- configure(JobConf) - Static method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
-
Configures the JobConf to use the MapredIcebergInputFormat and returns a helper to add further configuration.
- configure(Job) - Static method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
-
Configures the Job to use the IcebergInputFormat and returns a helper to add further configuration.
- configureDataWrite(Avro.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(ORC.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(Parquet.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureHadoopConf(Object, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Dynamically detects whether an object is a Hadoop Configurable and calls setConf.
- configureInputJobCredentials(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureInputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureJobConf(TableDesc, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureOutputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configurePositionDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureTableJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- conflictDetectionFilter(Expression) - Method in class org.apache.iceberg.BaseOverwriteFiles
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.OverwriteFiles
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
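For illustration only, a hedged sketch of pairing the filter with an overwrite; table, addedFile, and removedFile are assumed variables, and the validation calls that actually consult this filter are omitted:
    table.newOverwrite()
        .deleteFile(removedFile)                       // replace an existing data file
        .addFile(addedFile)
        .conflictDetectionFilter(
            Expressions.equal("day", "2021-06-07"))    // scope for detecting concurrent conflicts (example value)
        .commit();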
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.RowDelta
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- constant(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant(C) - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- ConstantContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- ConstantContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- constantHolder(int, T) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- constants(C) - Static method in class org.apache.iceberg.orc.OrcValueReaders
- constantsMap(FileScanTask) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(FileScanTask, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(FileScanTask, Types.StructType, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- ConstantVectorHolder(int) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorHolder(int, T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorReader(T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- Container<T> - Class in org.apache.iceberg.mr.mapred
-
A simple container of objects that you can get and set.
- Container() - Constructor for class org.apache.iceberg.mr.mapred.Container
- contains(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.CharSequenceSet
- contains(Object) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.CharSequenceSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.PartitionSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.StructLikeSet
- containsKey(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsKey(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- containsNaN() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNaN() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one data file in the manifest has a NaN value for the field.
- containsNull() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNull() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one data file in the manifest has a null value for the field.
- containsValue(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsValue(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- content() - Method in interface org.apache.iceberg.ContentFile
-
Returns the type of content stored in the file; one of DATA, POSITION_DELETES, or EQUALITY_DELETES.
- content() - Method in interface org.apache.iceberg.DataFile
- content() - Method in class org.apache.iceberg.GenericManifestFile
- content() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the content stored in the manifest; either DATA or DELETES.
- content() - Method in class org.apache.iceberg.ManifestWriter
- content() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- CONTENT - Static variable in interface org.apache.iceberg.DataFile
- ContentFile<F> - Interface in org.apache.iceberg
-
Superinterface of DataFile and DeleteFile that exposes common methods.
- Conversions - Class in org.apache.iceberg.types
- convert(byte[]) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(byte[], int) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- convert(Object) - Method in interface org.apache.iceberg.mr.hive.serde.objectinspector.WriteObjectInspector
- convert(ByteBuffer) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(List<String>, List<TypeInfo>, List<String>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<String>, List<TypeInfo>, List<String>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<FieldSchema>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(List<FieldSchema>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert the Flink table schema to an Apache Iceberg schema.
- convert(Expression) - Static method in class org.apache.iceberg.flink.FlinkFilters
-
Convert a Flink expression to an Iceberg expression.
- convert(TypeInfo) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive typeInfo object to an Iceberg type.
- convert(Schema) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
-
Convert Iceberg schema to Arrow Schema.
- convert(Schema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Schema to a Flink type.
- convert(Schema) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Iceberg schema to a Hive schema (list of FieldSchema objects).
- convert(Schema) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.pig.SchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Schema to a Spark type.
- convert(Schema, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, String) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- convert(Schema, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- convert(Schema, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Flink TableSchema to a Schema based on the given schema.
- convert(Schema, Row) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- convert(SortOrder) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- convert(Type) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Type) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Type to a Flink type.
- convert(Type) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts an Iceberg type to a Hive TypeInfo object.
- convert(Type) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Type to a Spark type.
- convert(Type, Object) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Type, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Types.NestedField) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
- convert(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Convert an ORC schema to an Iceberg schema.
- convert(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema.
- convert(Filter) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(Filter[]) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(DataType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Type with new field ids.
- convert(StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema with new field ids.
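A rough sketch of the conversion (the Spark struct below is invented for illustration):
    // Convert a Spark StructType into an Iceberg Schema; field ids are freshly assigned.
    StructType sparkType = new StructType()
        .add("id", DataTypes.LongType)
        .add("data", DataTypes.StringType);
    Schema icebergSchema = SparkSchemaUtil.convert(sparkType);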
- convert(StructType, boolean) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema with new field ids.
- convertAndPrune(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema and prunes fields without IDs.
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.data.IdentityPartitionConverters
-
Conversions from internal representations to Iceberg generic values.
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
- convertDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Define how to convert the deletes.
- convertedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the deletes that have been converted.
- ConvertEqualityDeleteFiles - Interface in org.apache.iceberg.actions
-
An action for converting the equality delete files to position delete files.
- ConvertEqualityDeleteFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- ConvertEqualityDeleteStrategy - Interface in org.apache.iceberg.actions
-
A strategy for the action to convert equality delete to position deletes.
- convertToByteBuffer(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convertTypes(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- copy() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file.
- copy() - Method in class org.apache.iceberg.data.GenericRecord
- copy() - Method in interface org.apache.iceberg.data.Record
- copy() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- copy() - Method in class org.apache.iceberg.flink.IcebergTableSink
- copy() - Method in class org.apache.iceberg.flink.IcebergTableSource
- copy() - Method in class org.apache.iceberg.GenericManifestFile
- copy() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- copy() - Method in interface org.apache.iceberg.ManifestFile
-
Copies this manifest file.
- copy() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Copies this summary.
- copy() - Method in class org.apache.iceberg.PartitionKey
- copy() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- copy() - Method in class org.apache.iceberg.spark.SparkDataFile
- copy(String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(ByteBuffer) - Static method in class org.apache.iceberg.util.ByteBuffers
- copy(Map<String, Object>) - Method in class org.apache.iceberg.data.GenericRecord
- copy(Map<String, Object>) - Method in interface org.apache.iceberg.data.Record
- copy(DataFile) - Method in class org.apache.iceberg.DataFiles.Builder
- copy(DeleteFile) - Method in class org.apache.iceberg.FileMetadata.Builder
- copy(PartitionSpec, StructLike) - Static method in class org.apache.iceberg.DataFiles
- COPY_ON_WRITE - org.apache.iceberg.RowLevelOperationMode
- copyFrom(IcebergSqlExtensionsParser.CallArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- copyFrom(IcebergSqlExtensionsParser.ConstantContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- copyFrom(IcebergSqlExtensionsParser.IdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- copyFrom(IcebergSqlExtensionsParser.NumberContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- copyFrom(IcebergSqlExtensionsParser.StatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- copyFrom(IcebergSqlExtensionsParser.TransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- copyOf(Map<K, V>) - Static method in class org.apache.iceberg.util.SerializableMap
- copyOf(ManifestFile) - Static method in class org.apache.iceberg.GenericManifestFile
- copyOf(Table) - Static method in class org.apache.iceberg.SerializableTable
-
Creates a read-only serializable table that can be sent to other nodes in a cluster.
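A minimal sketch, assuming a table already loaded from a catalog; the copy carries the current metadata and can be shipped to executors without a live catalog connection:
    // Read-only, serializable view of the table's current state.
    Table serializableCopy = SerializableTable.copyOf(table);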
- copyOnWriteDeleteDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- copyOnWriteMergeDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- copyOnWriteUpdateDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- copyWithoutStats() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file without file stats.
- copyWithoutStats() - Method in class org.apache.iceberg.spark.SparkDataFile
- Counts() - Constructor for class org.apache.iceberg.MetricsModes.Counts
- create() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
-
Create an output stream for the specified location if the target object does not exist in S3 at the time of invocation.
- create() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- create() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Creates the table.
- create() - Static method in class org.apache.iceberg.deletes.PositionDelete
- create() - Method in class org.apache.iceberg.flink.sink.RowDataTaskWriterFactory
- create() - Method in interface org.apache.iceberg.flink.sink.TaskWriterFactory
-
Initialize a TaskWriter with given task id and attempt id.
- create() - Method in class org.apache.iceberg.gcp.gcs.GCSOutputFile
-
Create an output stream for the specified location if the target object does not exist in GCS at the time of invocation.
- create() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- create() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
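A hedged fragment showing one way to write through an OutputFile obtained from a table's FileIO; table and path are assumed variables, and create() fails if the target already exists (createOrOverwrite() replaces it instead):
    OutputFile outputFile = table.io().newOutputFile(path);
    try (PositionOutputStream out = outputFile.create()) {   // throws if the file already exists
      out.write("example bytes".getBytes(StandardCharsets.UTF_8));
    }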
- create() - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
- create(Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.PartitionSet
- create(Schema) - Static method in class org.apache.iceberg.data.avro.DataWriter
- create(RowType, Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(Schema) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Schema) - Static method in class org.apache.iceberg.mapping.MappingUtil
-
Create a name-based mapping for a schema.
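For example (a sketch, with table assumed), the resulting mapping is often serialized into the table's name-mapping property so files written without field ids can still be read:
    NameMapping mapping = MappingUtil.create(table.schema());
    table.updateProperties()
        .set(TableProperties.DEFAULT_NAME_MAPPING, NameMappingParser.toJson(mapping))
        .commit();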
- create(Schema) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Schema, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, Set<Integer>) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, Schema, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, PartitionSpec, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Create a table using the FileSystem implementation resolved from the location.
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, Schema) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Types.NestedField...) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Types.StructType) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeMap
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeSet
- create(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- createAllowMissing(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- createBatchedReaderFunc(Function<TypeDescription, OrcBatchReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createBatchedReaderFunc(Function<MessageType, VectorizedReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createBatchWriterFactory(PhysicalWriteInfo) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaBatchWrite
- createCatalog(String, Map<String, String>) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- createCatalog(String, Map<String, String>, Configuration) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- createDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createDynamicTableSink(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createDynamicTableSource(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createInputSplits(int) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- createKey() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- createMetadataTableInstance(TableOperations, String, String, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(TableOperations, String, TableIdentifier, TableIdentifier, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(Table, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- createNamespace(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- createNamespace(Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
-
Creating namespaces in Nessie is implicit, therefore this is a no-op.
- createNanValueCounts(Stream<FieldMetrics<?>>, MetricsConfig, Schema) - Static method in class org.apache.iceberg.MetricsUtil
-
Construct a mapping from column id to NaN value counts from the input metrics and metrics config.
- createOrOverwrite() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.gcp.gcs.GCSOutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- createOrOverwrite() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
- createOrReplaceTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- createOrReplaceTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createOrReplaceTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create or replace the table.
- createPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createReader(Schema, MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReader(Schema, MessageType, Map<Integer, ?>) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReaderFactory() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- createReaderFunc(BiFunction<Schema, Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<TypeDescription, OrcRowReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createReaderFunc(Function<MessageType, ParquetValueReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.pig.IcebergPigInputFormat
- CreateSnapshotEvent - Class in org.apache.iceberg.events
- CreateSnapshotEvent(String, String, long, long, Map<String, String>) - Constructor for class org.apache.iceberg.events.CreateSnapshotEvent
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- createStructReader(Types.StructType, List<ValueReader<?>>, Map<Integer, ?>) - Method in class org.apache.iceberg.data.avro.DataReader
- createStructWriter(List<ValueWriter<?>>) - Method in class org.apache.iceberg.data.avro.DataWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- createTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Creates an Iceberg table using the catalog specified by the configuration.
- createTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create an unpartitioned table.
- createTable(TableIdentifier, Schema, PartitionSpec) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
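A minimal sketch of creating a partitioned table through a Catalog; the catalog variable, namespace, and column names are illustrative assumptions:
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));
    PartitionSpec spec = PartitionSpec.builderFor(schema)
        .identity("data")
        .build();
    Table created = catalog.createTable(TableIdentifier.of("db", "events"), schema, spec);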
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- createTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- createTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create the table.
- createVectorSchemaRootFromVectors() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Create a new instance of VectorSchemaRoot from the arrow vectors stored in this arrow batch.
- createWriter(int, long) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriterFactory
- createWriter(MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- ctorImpl(Class<?>, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
- ctorImpl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
- current() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- current() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- current() - Method in class org.apache.iceberg.StaticTableOperations
- current() - Method in interface org.apache.iceberg.TableOperations
-
Return the currently loaded table metadata, without checking for updates.
- currentAncestorIds(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Return the snapshot IDs for the ancestors of the current table state.
- currentAncestors(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns an iterable that traverses the table's snapshots from the current to the last known ancestor.
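As a small sketch (table assumed), the ancestry can be walked from the current snapshot backwards:
    for (Snapshot ancestor : SnapshotUtil.currentAncestors(table)) {
      System.out.println(ancestor.snapshotId() + " committed at " + ancestor.timestampMillis());
    }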
- currentDefinitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentDL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentFieldName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentHash() - Method in class org.apache.iceberg.nessie.NessieCatalog
- currentMetadataLocation() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- currentPageCount() - Method in class org.apache.iceberg.parquet.BasePageIterator
- currentPath() - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- currentRepetitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentRL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentSchemaId() - Method in class org.apache.iceberg.TableMetadata
- currentSnapshot() - Method in class org.apache.iceberg.BaseTable
- currentSnapshot() - Method in class org.apache.iceberg.SerializableTable
- currentSnapshot() - Method in interface org.apache.iceberg.Table
-
Get the current snapshot for this table, or null if there are no snapshots.
- currentSnapshot() - Method in class org.apache.iceberg.TableMetadata
- currentVersion() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- custom(String, Map<String, String>, Configuration, String) - Static method in interface org.apache.iceberg.flink.CatalogLoader
- CustomOrderSchemaVisitor() - Constructor for class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
D
- data(PartitionSpec, String) - Static method in class org.apache.iceberg.DataFiles
- DATA - org.apache.iceberg.FileContent
- DATA - org.apache.iceberg.ManifestContent
- DATA_FILES - org.apache.iceberg.ManifestReader.FileType
- databaseExists(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DataFile - Interface in org.apache.iceberg
-
Interface for data files listed in a table manifest.
- dataFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- dataFiles() - Method in class org.apache.iceberg.io.DataWriteResult
- dataFiles() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data files; this requires that the task writer produce only data files.
- dataFiles() - Method in class org.apache.iceberg.io.WriteResult
- DataFiles - Class in org.apache.iceberg
- DataFiles.Builder - Class in org.apache.iceberg
- DataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's data files as rows.
- DataFilesTable.FilesTableScan - Class in org.apache.iceberg
- DataIterator<T> - Class in org.apache.iceberg.flink.source
-
Flink data iterator that reads CombinedScanTask into a CloseableIterator.
- DataIterator(FileScanTaskReader<T>, CombinedScanTask, FileIO, EncryptionManager) - Constructor for class org.apache.iceberg.flink.source.DataIterator
- dataManifests() - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each data manifest in this snapshot.
- DataOperations - Class in org.apache.iceberg
-
Data operations that produce snapshots.
- DataReader<T> - Class in org.apache.iceberg.data.avro
- DataReader(Schema, Schema, Map<Integer, ?>) - Constructor for class org.apache.iceberg.data.avro.DataReader
- dataSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- DataTableScan - Class in org.apache.iceberg
- DataTableScan(TableOperations, Table) - Constructor for class org.apache.iceberg.DataTableScan
- DataTableScan(TableOperations, Table, Schema, TableScanContext) - Constructor for class org.apache.iceberg.DataTableScan
- DataTask - Interface in org.apache.iceberg
-
A task that returns data as rows instead of where to read data.
- dataTimestampMillis() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- dataType() - Method in class org.apache.iceberg.spark.source.SparkMetadataColumn
- dataType() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Returns the type of this parameter.
- DataWriter<T> - Class in org.apache.iceberg.data.avro
- DataWriter<T> - Class in org.apache.iceberg.io
- DataWriter(Schema) - Constructor for class org.apache.iceberg.data.avro.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata, SortOrder) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriteResult - Class in org.apache.iceberg.io
-
A result of writing data files.
- DataWriteResult(List<DataFile>) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DataWriteResult(DataFile) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DATE - org.apache.iceberg.types.Type.TypeID
- DATE_INSPECTOR - Static variable in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- dateFromDays(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- DateTimeUtil - Class in org.apache.iceberg.util
- DateType() - Constructor for class org.apache.iceberg.types.Types.DateType
- day(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String) - Static method in class org.apache.iceberg.expressions.Expressions
- day(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- day(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a day Transform for date or timestamp types.
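In practice the transform is usually applied through the PartitionSpec builder; a sketch assuming a schema with a timestamp column named event_time:
    PartitionSpec dailySpec = PartitionSpec.builderFor(schema)
        .day("event_time")    // one partition per day of the timestamp value
        .build();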
- daysFromDate(LocalDate) - Static method in class org.apache.iceberg.util.DateTimeUtil
- daysFromInstant(Instant) - Static method in class org.apache.iceberg.util.DateTimeUtil
- decimal(int, int) - Static method in class org.apache.iceberg.avro.ValueWriters
- decimal(int, int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- decimal(ValueReader<byte[]>, int) - Static method in class org.apache.iceberg.avro.ValueReaders
- DECIMAL - org.apache.iceberg.types.Type.TypeID
- DECIMAL_64 - org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
- DECIMAL_INT32_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_INT64_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DECIMAL_VALUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalAsFixed(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsInteger(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsLong(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalBytesReader(Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- DecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalRequiredBytes(int) - Static method in class org.apache.iceberg.types.TypeUtil
- decimals() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- decimals(int, int) - Static method in class org.apache.iceberg.spark.data.SparkOrcValueReaders
- DecimalUtil - Class in org.apache.iceberg.util
- decode(byte[]) - Static method in class org.apache.iceberg.avro.AvroEncoderUtil
- decode(byte[]) - Static method in class org.apache.iceberg.ManifestFiles
-
Decode the binary data into a ManifestFile.
- decode(InputStream, D) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
- DecoderResolver - Class in org.apache.iceberg.data.avro
-
Resolver to resolve Decoder to a ResolvingDecoder.
- decomposePredicate(JobConf, Deserializer, ExprNodeDesc) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- decrypt(Iterable<EncryptedInputFile>) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Variant of EncryptionManager.decrypt(EncryptedInputFile) that provides a sequence of files that all need to be decrypted in a single context.
- decrypt(EncryptedInputFile) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Given an EncryptedInputFile.encryptedInputFile() representing the raw encrypted bytes from the underlying file system, and given metadata about how the file was encrypted via EncryptedInputFile.keyMetadata(), return an InputFile that returns decrypted input streams.
- decrypt(EncryptedInputFile) - Method in class org.apache.iceberg.encryption.PlaintextEncryptionManager
- decryptionKey() - Method in class org.apache.iceberg.gcp.GCPProperties
- dedupName() - Method in interface org.apache.iceberg.transforms.Transform
-
Return the unique transform name, used to check whether similar transforms for the same source field are added multiple times in the partition spec builder.
- DEFAULT_BATCH_SIZE - Static variable in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- DEFAULT_DATABASE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_DATABASE_NAME - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_FILE_FORMAT_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_NAME_MAPPING - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_WRITE_METRICS_MODE - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_WRITE_METRICS_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- defaultAlwaysNull() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Instructs this builder to return AlwaysNull if no implementation is found.
- defaultFactory() - Static method in class org.apache.iceberg.aliyun.AliyunClientFactories
- defaultFactory() - Static method in class org.apache.iceberg.aws.AwsClientFactories
- defaultFormat(FileFormat) - Method in interface org.apache.iceberg.UpdateProperties
-
Set the default file format for the table.
- defaultLocationProperty() - Static method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
-
The property used to set a default location for tables in a namespace.
- defaultLockManager() - Static method in class org.apache.iceberg.util.LockManagers
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkCatalog
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- defaultSortOrderId() - Method in class org.apache.iceberg.TableMetadata
- defaultSpec(PartitionSpec) - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- defaultSpecId() - Method in class org.apache.iceberg.TableMetadata
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
-
This method produces the same result as using a HiveCatalog.
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.nessie.NessieCatalog
- definitionLevels - Variable in class org.apache.iceberg.parquet.BasePageIterator
- DelegatingInputStream - Interface in org.apache.iceberg.io
- DelegatingOutputStream - Interface in org.apache.iceberg.io
- delete(long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a deleted row position.
- delete(long, long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a range of deleted row positions.
- delete(F) - Method in class org.apache.iceberg.ManifestWriter
-
Add a delete entry for a file.
- delete(CharSequence, long) - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use PositionDeleteWriter.write(PositionDelete) instead.
- delete(CharSequence, long, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition.
- delete(CharSequence, long, T) - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use PositionDeleteWriter.write(PositionDelete) instead.
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition and records the deleted row in the delete file.
- delete(T) - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use EqualityDeleteWriter.write(Object) instead.
- delete(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows whose equality fields have the same values as the given row.
- delete(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a row from the provided spec/partition.
- delete(T, T) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriter
-
Passes information for a row that must be deleted.
- DELETE - org.apache.spark.sql.connector.iceberg.write.RowLevelOperation.Command
- DELETE - Static variable in class org.apache.iceberg.DataOperations
-
Data is deleted from the table and no data is added.
- DELETE_AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_FILE_PATH - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_POS - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_DOC - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_ID - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_NAME - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_THRESHOLD - Static variable in class org.apache.iceberg.actions.BinPackStrategy
-
The minimum number of deletes that needs to be associated with a data file for it to be considered for rewriting.
- DELETE_FILE_THRESHOLD_DEFAULT - Static variable in class org.apache.iceberg.actions.BinPackStrategy
- DELETE_FILES - org.apache.iceberg.ManifestReader.FileType
- DELETE_FORMAT - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- DELETE_ISOLATION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ISOLATION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_DICT_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_PAGE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_TARGET_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_TARGET_FILE_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- deleteAll(Iterable<T>) - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use FileWriter.write(Iterable) instead.
- deleteColumn(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Delete a column in the schema.
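For instance (hypothetical column name), dropping a column is a metadata-only schema change committed through the returned UpdateSchema:
    table.updateSchema()
        .deleteColumn("legacy_flag")   // hypothetical column
        .commit();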
- DELETED_DUPLICATE_FILES - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- DELETED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- deletedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- deletedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted data files.
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted data files.
- deletedFile(PartitionSpec, ContentFile<?>) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFiles() - Method in interface org.apache.iceberg.Snapshot
-
Return all files deleted from the table in this snapshot.
- deletedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status DELETED in the manifest file.
- deletedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- deletedManifestListsCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedManifestListsCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifest lists.
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifest lists.
- deletedManifestsCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedManifestsCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifests.
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifests.
- deletedOtherFilesCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedOtherFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted metadata JSON and version hint files.
- deletedRowPositions() - Method in class org.apache.iceberg.data.DeleteFilter
- deletedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status DELETED in the manifest file.
- deletedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- deleteFile(CharSequence) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file path from the underlying table.
- deleteFile(String) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- deleteFile(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deleteFile(String) - Method in interface org.apache.iceberg.io.FileIO
-
Delete the file at the given path.
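A brief hedged example using a table's FileIO; the path is purely illustrative:
    FileIO io = table.io();
    io.deleteFile("s3://bucket/warehouse/db/events/data/00000-0-example.parquet");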
- deleteFile(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
- deleteFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
- deleteFile(DataFile) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file tracked by a DataFile from the underlying table.
- deleteFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Delete a DataFile from the table.
- deleteFile(InputFile) - Method in interface org.apache.iceberg.io.FileIO
- deleteFile(OutputFile) - Method in interface org.apache.iceberg.io.FileIO
-
Convenience method to delete an OutputFile.
- DeleteFile - Interface in org.apache.iceberg
-
Interface for delete files listed in a table delete manifest.
- deleteFileBuilder(PartitionSpec) - Static method in class org.apache.iceberg.FileMetadata
- deleteFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- deleteFiles() - Method in class org.apache.iceberg.io.DeleteWriteResult
- deleteFiles() - Method in class org.apache.iceberg.io.WriteResult
- DeleteFiles - Interface in org.apache.iceberg
-
API for deleting files from a table.
- DeleteFilter<T> - Class in org.apache.iceberg.data
- DeleteFilter(FileScanTask, Schema, Schema) - Constructor for class org.apache.iceberg.data.DeleteFilter
- deleteFromRowFilter(Expression) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete files that match an Expression on data rows from the table.
- deleteKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows with the given key.
- deleteKey(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a key from the provided spec/partition.
- deleteManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- deleteManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Deletes a manifest file from the table.
- deleteManifests() - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each delete manifest in this snapshot.
- deleteOrphanFiles(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete orphan files.
- DeleteOrphanFiles - Interface in org.apache.iceberg.actions
-
An action that deletes orphan files in a table.
- DeleteOrphanFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
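A sketch of running this action through the Spark ActionsProvider entry point; the SparkActions.get() call, the three-day cutoff, and the table handle are illustrative assumptions:

    import java.util.concurrent.TimeUnit;
    import org.apache.iceberg.actions.DeleteOrphanFiles;
    import org.apache.iceberg.spark.actions.SparkActions;

    long olderThan = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(3);   // hypothetical retention
    DeleteOrphanFiles.Result result = SparkActions.get()
        .deleteOrphanFiles(table)            // table is an org.apache.iceberg.Table (placeholder)
        .olderThan(olderThan)                // only remove files older than the cutoff
        .execute();
    result.orphanFileLocations().forEach(System.out::println);   // locations that were removed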
- deletePositions(CharSequence, List<CloseableIterable<T>>) - Static method in class org.apache.iceberg.deletes.Deletes
- deletePositions(CharSequence, CloseableIterable<StructLike>) - Static method in class org.apache.iceberg.deletes.Deletes
- deleteReachableFiles(String) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete all the files reachable from the given metadata location.
- DeleteReachableFiles - Interface in org.apache.iceberg.actions
-
An action that deletes all files referenced by a table metadata file.
- DeleteReachableFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- deletes() - Method in interface org.apache.iceberg.FileScanTask
-
A list of delete files to apply when reading the task's data file.
- Deletes - Class in org.apache.iceberg.deletes
- DELETES - org.apache.iceberg.ManifestContent
- DeleteSchemaUtil - Class in org.apache.iceberg.io
- deleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.RollbackStagedTable
- deleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes an alternative delete implementation that will be used for orphan files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Passes an alternative delete implementation that will be used for files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests and data files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests and data files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.SnapshotUpdate
-
Set a callback to delete files instead of the table's default.
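For instance, a caller might route deletes through the table's own FileIO rather than the default; a sketch with a hypothetical seven-day retention window:

    import java.util.concurrent.TimeUnit;

    long cutoff = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(7);   // hypothetical cutoff
    table.expireSnapshots()                                  // table is an org.apache.iceberg.Table (placeholder)
        .expireOlderThan(cutoff)
        .deleteWith(path -> table.io().deleteFile(path))     // alternative delete implementation
        .commit();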
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- DeleteWriteResult - Class in org.apache.iceberg.io
-
A result of writing delete files.
- DeleteWriteResult(List<DeleteFile>) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(List<DeleteFile>, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeltaBatchWrite - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface that defines how to write a delta of rows during batch processing.
- DeltaWrite - Interface in org.apache.spark.sql.connector.iceberg.write
-
A logical representation of a data source write that handles a delta of rows.
- DeltaWriteBuilder - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface for building delta writes.
- DeltaWriter<T> - Interface in org.apache.spark.sql.connector.iceberg.write
-
A data writer responsible for writing a delta of rows.
- DeltaWriterFactory - Interface in org.apache.spark.sql.connector.iceberg.write
-
A factory for creating and initializing delta writers at the executor side.
- desc - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- desc - Variable in class org.apache.iceberg.parquet.BasePageIterator
- desc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with nulls last.
- desc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with the given null order.
- desc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with nulls last.
- desc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- desc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, descending with the given null order.
- desc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with the given null order.
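A short sketch of these builder methods as exposed through Table.replaceSortOrder(); the column names are hypothetical:

    import org.apache.iceberg.NullOrder;

    table.replaceSortOrder()                        // table is an org.apache.iceberg.Table (placeholder)
        .asc("id")                                  // ascending, nulls first by default
        .desc("event_ts", NullOrder.NULLS_LAST)     // descending with an explicit null order
        .commit();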
- DESC - org.apache.iceberg.SortDirection
- DESC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DESC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DESC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- DESC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- describe(Expression) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(Schema) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(SortOrder) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(Type) - Static method in class org.apache.iceberg.spark.Spark3Util
- description() - Method in class org.apache.iceberg.spark.JobGroupInfo
- description() - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- description() - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- description() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- description() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Returns the description of this procedure.
- description() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperation
-
Returns the description associated with this row-level operation.
- descriptor() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- deserialize(int, byte[]) - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- deserialize(Writable) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- deserializeFromBase64(String) - Static method in class org.apache.iceberg.util.SerializationUtil
- deserializeFromBytes(byte[]) - Static method in class org.apache.iceberg.util.SerializationUtil
- deserializeOffset(String) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- destCatalog() - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- destCatalog() - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- destTableIdent() - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- destTableIdent() - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- destTableProps() - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- destTableProps() - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- dictionary - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- dictionary - Variable in class org.apache.iceberg.parquet.BasePageIterator
- dictionary() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- dictionaryBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- DictionaryBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DictionaryBatchReader
- dictionaryIdReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- direction - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- direction() - Method in class org.apache.iceberg.SortField
-
Returns the sort direction.
- disableRefresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- discardChanges() - Method in class org.apache.iceberg.TableMetadata.Builder
- DISTRIBUTED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DISTRIBUTED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DISTRIBUTED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- DISTRIBUTED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- distributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- distributionMode(DistributionMode) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Configure the write DistributionMode that the flink sink will use.
- DistributionMode - Enum in org.apache.iceberg
-
Enum of supported write distribution modes; it defines the write behavior of a batch or streaming job.
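A sketch of choosing a distribution mode when building the Flink sink; the input stream, table location, and HASH choice are illustrative assumptions:

    import org.apache.iceberg.DistributionMode;
    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;

    TableLoader tableLoader = TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/db/tbl");   // hypothetical location
    FlinkSink.forRowData(rowDataStream)             // rowDataStream: a DataStream<RowData> (placeholder)
        .tableLoader(tableLoader)
        .distributionMode(DistributionMode.HASH)    // shuffle records by partition key before writing
        .append();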
- doc() - Method in class org.apache.iceberg.types.Types.NestedField
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.hive.HiveTableOperations
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.nessie.NessieTableOperations
- doRefresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- doRefresh() - Method in class org.apache.iceberg.hive.HiveTableOperations
- doRefresh() - Method in class org.apache.iceberg.nessie.NessieTableOperations
- DOUBLE - org.apache.iceberg.types.Type.TypeID
- DOUBLE_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DOUBLE_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DOUBLE_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- doubleBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- DoubleBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DoubleBatchReader
- doubleDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- DoubleFieldMetrics - Class in org.apache.iceberg
-
Iceberg internally tracked field level metrics, used by Parquet and ORC writers only.
- DoubleFieldMetrics.Builder - Class in org.apache.iceberg
- DoubleLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- doubles() - Static method in class org.apache.iceberg.avro.ValueReaders
- doubles() - Static method in class org.apache.iceberg.avro.ValueWriters
- doubles() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- doubles(int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- doubles(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- DoubleType() - Constructor for class org.apache.iceberg.types.Types.DoubleType
- DROP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DROP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- dropDatabase(String, boolean, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- dropFunction(ObjectPath, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DropIdentifierFieldsContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- dropNamespace(String[]) - Method in class org.apache.iceberg.spark.SparkCatalog
- dropNamespace(String[]) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- dropNamespace(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Drop a namespace.
- dropNamespace(Namespace) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.hive.HiveCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.nessie.NessieCatalog
-
Namespaces in Nessie are implicit and therefore cannot be dropped.
- dropPartition(ObjectPath, CatalogPartitionSpec, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DropPartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- dropTable(String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Drop a table and delete all data and metadata files.
- dropTable(String, boolean) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Drop a table; optionally delete data and metadata files.
- dropTable(ObjectPath, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- dropTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Drops an Iceberg table using the catalog specified by the configuration.
- dropTable(TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Drop a table and delete all data and metadata files.
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.CachingCatalog
- dropTable(TableIdentifier, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Drop a table; optionally delete data and metadata files.
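As an illustration of the purge flag, using a HadoopCatalog; the warehouse location and identifiers are made up:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.hadoop.HadoopCatalog;

    HadoopCatalog catalog = new HadoopCatalog(new Configuration(), "file:/tmp/warehouse");   // hypothetical warehouse
    boolean dropped = catalog.dropTable(TableIdentifier.of("db", "events"), true /* also delete data and metadata files */);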
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.hive.HiveCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.nessie.NessieCatalog
- dropTable(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
- dropTable(Identifier) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- dropTableData(FileIO, TableMetadata) - Static method in class org.apache.iceberg.CatalogUtil
-
Drops all data and metadata files referenced by TableMetadata.
- dummyHolder(int) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- DuplicateWAPCommitException - Exception in org.apache.iceberg.exceptions
-
This exception occurs when the WAP workflow detects a duplicate wap commit.
- DuplicateWAPCommitException(String) - Constructor for exception org.apache.iceberg.exceptions.DuplicateWAPCommitException
- dynamo() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- dynamo() - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Create an Amazon DynamoDB client.
- DYNAMODB_TABLE_NAME - Static variable in class org.apache.iceberg.aws.AwsProperties
-
DynamoDB table name for DynamoDbCatalog
- DYNAMODB_TABLE_NAME_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- DynamoDbCatalog - Class in org.apache.iceberg.aws.dynamodb
-
DynamoDB implementation of Iceberg catalog
- DynamoDbCatalog() - Constructor for class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- dynamoDbTableName() - Method in class org.apache.iceberg.aws.AwsProperties
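A rough sketch of wiring these properties into a DynamoDbCatalog; the catalog name, DynamoDB table, and warehouse location are hypothetical:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.iceberg.CatalogProperties;
    import org.apache.iceberg.aws.AwsProperties;
    import org.apache.iceberg.aws.dynamodb.DynamoDbCatalog;

    Map<String, String> properties = new HashMap<>();
    properties.put(AwsProperties.DYNAMODB_TABLE_NAME, "iceberg-catalog");            // hypothetical DynamoDB table
    properties.put(CatalogProperties.WAREHOUSE_LOCATION, "s3://bucket/warehouse");   // hypothetical warehouse
    DynamoDbCatalog catalog = new DynamoDbCatalog();
    catalog.initialize("dynamo", properties);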
- DynClasses - Class in org.apache.iceberg.common
- DynClasses.Builder - Class in org.apache.iceberg.common
- DynConstructors - Class in org.apache.iceberg.common
-
Copied from parquet-common
- DynConstructors.Builder - Class in org.apache.iceberg.common
- DynConstructors.Ctor<C> - Class in org.apache.iceberg.common
- DynFields - Class in org.apache.iceberg.common
- DynFields.BoundField<T> - Class in org.apache.iceberg.common
- DynFields.Builder - Class in org.apache.iceberg.common
- DynFields.StaticField<T> - Class in org.apache.iceberg.common
- DynFields.UnboundField<T> - Class in org.apache.iceberg.common
-
Convenience wrapper class around Field.
- DynMethods - Class in org.apache.iceberg.common
-
Copied from parquet-common
- DynMethods.BoundMethod - Class in org.apache.iceberg.common
- DynMethods.Builder - Class in org.apache.iceberg.common
- DynMethods.StaticMethod - Class in org.apache.iceberg.common
- DynMethods.UnboundMethod - Class in org.apache.iceberg.common
-
Convenience wrapper class around Method.
E
- ELEMENT_ID_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- elementId() - Method in class org.apache.iceberg.types.Types.ListType
- elementName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- elements(L) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- elementType() - Method in class org.apache.iceberg.types.Types.ListType
- empty() - Static method in class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- empty() - Static method in class org.apache.iceberg.catalog.Namespace
- empty() - Static method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- empty() - Static method in interface org.apache.iceberg.io.CloseableIterable
- empty() - Static method in interface org.apache.iceberg.io.CloseableIterator
- empty() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- empty() - Static method in class org.apache.iceberg.util.CharSequenceSet
- EMPTY - Static variable in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- EMPTY_BOOLEAN_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_BYTE_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_DOUBLE_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_FLOAT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_INT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_LONG_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_SHORT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- encode(D) - Method in class org.apache.iceberg.data.avro.IcebergEncoder
- encode(D, OutputStream) - Method in class org.apache.iceberg.data.avro.IcebergEncoder
- encode(ManifestFile) - Static method in class org.apache.iceberg.ManifestFiles
-
Encode the ManifestFile to a byte array using the Avro encoder.
- encode(T, Schema) - Static method in class org.apache.iceberg.avro.AvroEncoderUtil
- encrypt(Iterable<OutputFile>) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Variant of EncryptionManager.encrypt(OutputFile) that provides a sequence of files that all need to be encrypted in a single context.
- encrypt(OutputFile) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Given a handle on an OutputFile that writes raw bytes to the underlying file system, return a bundle of an EncryptedOutputFile.encryptingOutputFile() that writes encrypted bytes to the underlying file system, and the EncryptedOutputFile.keyMetadata() that points to the encryption key that is being used to encrypt this file.
- encrypt(OutputFile) - Method in class org.apache.iceberg.encryption.PlaintextEncryptionManager
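A sketch of the encrypt-on-write flow described above, using whatever EncryptionManager the table exposes; the table handle and output location are placeholders:

    import org.apache.iceberg.encryption.EncryptedOutputFile;
    import org.apache.iceberg.encryption.EncryptionKeyMetadata;
    import org.apache.iceberg.encryption.EncryptionManager;
    import org.apache.iceberg.io.OutputFile;

    EncryptionManager encryption = table.encryption();   // table is an org.apache.iceberg.Table (placeholder)
    EncryptedOutputFile encrypted = encryption.encrypt(table.io().newOutputFile(location));   // location is hypothetical
    OutputFile encryptingFile = encrypted.encryptingOutputFile();   // writes encrypted bytes to the file system
    EncryptionKeyMetadata keyMetadata = encrypted.keyMetadata();    // key metadata to persist alongside the file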
- EncryptedFiles - Class in org.apache.iceberg.encryption
- encryptedInput(InputFile, byte[]) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInput(InputFile, ByteBuffer) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInput(InputFile, EncryptionKeyMetadata) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInputFile() - Method in interface org.apache.iceberg.encryption.EncryptedInputFile
-
The InputFile that is reading raw encrypted bytes from the underlying file system.
- EncryptedInputFile - Interface in org.apache.iceberg.encryption
-
Thin wrapper around an InputFile instance that is encrypted.
- encryptedOutput(OutputFile, byte[]) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedOutput(OutputFile, ByteBuffer) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedOutput(OutputFile, EncryptionKeyMetadata) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- EncryptedOutputFile - Interface in org.apache.iceberg.encryption
-
Thin wrapper around an OutputFile that is encrypting bytes written to the underlying file system, via an encryption key that is symbolized by the enclosed EncryptionKeyMetadata.
- encryptingOutputFile() - Method in interface org.apache.iceberg.encryption.EncryptedOutputFile
-
An OutputFile instance that encrypts the bytes that are written to its output streams.
- encryption() - Method in class org.apache.iceberg.BaseTable
- encryption() - Method in class org.apache.iceberg.SerializableTable
- encryption() - Method in interface org.apache.iceberg.Table
-
Returns an EncryptionManager to encrypt and decrypt data files.
- encryption() - Method in interface org.apache.iceberg.TableOperations
-
Returns an EncryptionManager to encrypt and decrypt data files.
- encryptionKey() - Method in class org.apache.iceberg.gcp.GCPProperties
- EncryptionKeyMetadata - Interface in org.apache.iceberg.encryption
-
Light typedef over a ByteBuffer that indicates that the given bytes represent metadata about an encrypted data file's encryption key.
- EncryptionKeyMetadatas - Class in org.apache.iceberg.encryption
- encryptionManager() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- EncryptionManager - Interface in org.apache.iceberg.encryption
-
Module for encrypting and decrypting table data files.
- END_SNAPSHOT_ID - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- endFileIndex() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- endSnapshotId() - Method in class org.apache.iceberg.spark.SparkReadConf
- endSnapshotId(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- ENGINE_HIVE_ENABLED - Static variable in class org.apache.iceberg.hadoop.ConfigProperties
- ENGINE_HIVE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- ENGINE_HIVE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- enrichContextWithAttemptWrapper(TaskAttemptContext) - Static method in class org.apache.iceberg.mr.hive.TezUtil
-
Creates a new taskAttemptContext by replacing the taskAttemptID with a wrapped object.
- enrichContextWithVertexId(JobContext) - Static method in class org.apache.iceberg.mr.hive.TezUtil
-
If the Tez vertex id is present in config, creates a new jobContext by appending the Tez vertex id to the jobID.
- enterAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- enterApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- enterBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- enterBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- enterCall(IcebergSqlExtensionsParser.CallContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- enterCall(IcebergSqlExtensionsParser.CallContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- enterDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- enterDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- enterDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterEveryRule(ParserRuleContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- enterExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.expression().
- enterExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.expression().
- enterFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- enterFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- enterFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- enterIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- enterIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- enterMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- enterNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- enterNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- enterNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- enterNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- enterNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterOrder(IcebergSqlExtensionsParser.OrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.order().
- enterOrder(IcebergSqlExtensionsParser.OrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.order().
- enterOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.orderField().
- enterOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.orderField().
- enterPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- enterPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- enterQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- enterQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- enterQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- enterQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- enterReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- enterSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- enterSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- enterSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- enterSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- enterSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- enterSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- enterSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- enterStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- enterStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- enterTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- enterTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- enterTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- enterTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- enterTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- enterUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- enterUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- enterWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- enterWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- enterWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- enterWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- enterWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- enterWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- ENTRIES - org.apache.iceberg.MetadataTableType
- entrySet() - Method in class org.apache.iceberg.util.SerializableMap
- entrySet() - Method in class org.apache.iceberg.util.StructLikeMap
- enums(List<String>) - Static method in class org.apache.iceberg.avro.ValueReaders
- env(StreamExecutionEnvironment) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- EOF() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- EPOCH - Static variable in class org.apache.iceberg.util.DateTimeUtil
- EPOCH_DAY - Static variable in class org.apache.iceberg.util.DateTimeUtil
- eq(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- eq(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- EQ - org.apache.iceberg.expressions.Expression.Operation
- eqDeletedRowFilter() - Method in class org.apache.iceberg.data.DeleteFilter
- equal(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- equal(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
- EQUALITY_DELETES - org.apache.iceberg.FileContent
- EQUALITY_IDS - Static variable in interface org.apache.iceberg.DataFile
- EqualityDeleteRowReader - Class in org.apache.iceberg.spark.source
- EqualityDeleteRowReader(CombinedScanTask, Table, Schema, boolean) - Constructor for class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- equalityDeleteRowSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- EqualityDeleteWriter<T> - Class in org.apache.iceberg.deletes
- EqualityDeleteWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata, SortOrder, int...) - Constructor for class org.apache.iceberg.deletes.EqualityDeleteWriter
- EqualityDeltaWriter<T> - Interface in org.apache.iceberg.io
-
A writer capable of writing data and equality deletes that may belong to different specs and partitions.
- equalityFieldColumns(List<String>) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Configures the equality field columns for an Iceberg table that accepts CDC or UPSERT events.
- equalityFieldIds() - Method in interface org.apache.iceberg.ContentFile
-
Returns the set of field IDs used for equality comparison, in equality delete files.
- equalityFieldIds() - Method in interface org.apache.iceberg.DataFile
- equalityFieldIds(int...) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- equalityFieldIds(int...) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- equalityFieldIds(int...) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- equals(Object) - Method in class org.apache.iceberg.catalog.Namespace
- equals(Object) - Method in class org.apache.iceberg.catalog.TableIdentifier
- equals(Object) - Method in class org.apache.iceberg.data.GenericRecord
- equals(Object) - Method in class org.apache.iceberg.GenericManifestFile
- equals(Object) - Method in class org.apache.iceberg.mapping.MappedField
- equals(Object) - Method in class org.apache.iceberg.mapping.MappedFields
- equals(Object) - Method in class org.apache.iceberg.MetricsModes.Truncate
- equals(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- equals(Object) - Method in class org.apache.iceberg.PartitionField
- equals(Object) - Method in class org.apache.iceberg.PartitionKey
- equals(Object) - Method in class org.apache.iceberg.PartitionSpec
- equals(Object) - Method in class org.apache.iceberg.SortField
- equals(Object) - Method in class org.apache.iceberg.SortOrder
- equals(Object) - Method in class org.apache.iceberg.spark.source.SparkTable
- equals(Object) - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- equals(Object) - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- equals(Object) - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- equals(Object) - Method in class org.apache.iceberg.transforms.UnknownTransform
- equals(Object) - Method in class org.apache.iceberg.types.Types.DecimalType
- equals(Object) - Method in class org.apache.iceberg.types.Types.FixedType
- equals(Object) - Method in class org.apache.iceberg.types.Types.ListType
- equals(Object) - Method in class org.apache.iceberg.types.Types.MapType
- equals(Object) - Method in class org.apache.iceberg.types.Types.NestedField
- equals(Object) - Method in class org.apache.iceberg.types.Types.StructType
- equals(Object) - Method in class org.apache.iceberg.types.Types.TimestampType
- equals(Object) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- equals(Object) - Method in class org.apache.iceberg.util.Pair
- equals(Object) - Method in class org.apache.iceberg.util.SerializableMap
- equals(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- equals(Object) - Method in class org.apache.iceberg.util.StructLikeWrapper
- estimateSize(StructType, long) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Estimate approximate table size based on Spark schema and total records.
- eval(ContentFile<?>) - Method in class org.apache.iceberg.expressions.InclusiveMetricsEvaluator
-
Test whether the file may contain records that match the expression.
- eval(ContentFile<?>) - Method in class org.apache.iceberg.expressions.StrictMetricsEvaluator
-
Test whether all records within the file match the expression.
- eval(ManifestFile) - Method in class org.apache.iceberg.expressions.ManifestEvaluator
-
Test whether the file may contain records that match the expression.
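A minimal sketch of using InclusiveMetricsEvaluator to decide whether a data file can be skipped based on its column metrics; the "event_ts" column and threshold are hypothetical.
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.expressions.InclusiveMetricsEvaluator;

    class MetricsPruningExample {
      // Returns true when the file's lower/upper bounds indicate it may contain matching rows.
      static boolean mayContainMatches(Schema schema, DataFile file) {
        return new InclusiveMetricsEvaluator(schema, Expressions.greaterThan("event_ts", 1000L)).eval(file);
      }
    }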
- eval(StructLike) - Method in interface org.apache.iceberg.expressions.Bound
-
Produce a value from the struct for this expression.
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundPredicate
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundReference
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundTransform
- eval(StructLike) - Method in class org.apache.iceberg.expressions.Evaluator
- Evaluator - Class in org.apache.iceberg.expressions
-
Evaluates an Expression for data described by a Types.StructType.
- Evaluator(Types.StructType, Expression) - Constructor for class org.apache.iceberg.expressions.Evaluator
- Evaluator(Types.StructType, Expression, boolean) - Constructor for class org.apache.iceberg.expressions.Evaluator
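A small sketch of evaluating an expression against an in-memory record; the schema and values below are made up for illustration.
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.data.GenericRecord;
    import org.apache.iceberg.expressions.Evaluator;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.types.Types;

    class EvaluatorExample {
      public static void main(String[] args) {
        // Hypothetical two-column schema.
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.optional(2, "category", Types.StringType.get()));

        Evaluator evaluator = new Evaluator(schema.asStruct(), Expressions.equal("category", "books"));

        GenericRecord record = GenericRecord.create(schema);
        record.setField("id", 1L);
        record.setField("category", "books");

        System.out.println(evaluator.eval(record));  // prints true for this record
      }
    }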
- Exceptions - Class in org.apache.iceberg.util
- ExceptionUtil - Class in org.apache.iceberg.util
- ExceptionUtil.Block<R,E1 extends java.lang.Exception,E2 extends java.lang.Exception,E3 extends java.lang.Exception> - Interface in org.apache.iceberg.util
- ExceptionUtil.CatchBlock - Interface in org.apache.iceberg.util
- ExceptionUtil.FinallyBlock - Interface in org.apache.iceberg.util
- execute() - Method in interface org.apache.iceberg.actions.Action
-
Executes this action.
- execute() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes an alternative executor service that will be used for removing orphaned files.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Passes an alternative executor service that will be used for files removal.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Passes an alternative executor service that will be used for manifests and data files deletion.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Passes an alternative executor service that will be used for manifests and data files deletion.
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- executeWith(ExecutorService) - Method in class org.apache.iceberg.util.Tasks.Builder
- existing(F, long, long) - Method in class org.apache.iceberg.ManifestWriter
-
Add an existing entry for a file.
- EXISTING_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- EXISTING_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- existingFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- existingFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status EXISTING in the manifest file.
- existingFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- existingRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- existingRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status EXISTING in the manifest file.
- existingRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- exists() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- exists() - Method in interface org.apache.iceberg.io.InputFile
-
Checks whether the file exists.
- exists(String) - Method in class org.apache.iceberg.hadoop.HadoopTables
- exists(String) - Method in interface org.apache.iceberg.Tables
- exitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- exitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- exitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- exitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- exitCall(IcebergSqlExtensionsParser.CallContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- exitCall(IcebergSqlExtensionsParser.CallContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- exitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- exitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- exitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitEveryRule(ParserRuleContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- exitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.expression().
- exitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.expression().
- exitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- exitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- exitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- exitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- exitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- exitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- exitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- exitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- exitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- exitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- exitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.order().
- exitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.order().
- exitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.orderField().
- exitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.orderField().
- exitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- exitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- exitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- exitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- exitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- exitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- exitReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- exitRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- exitSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- exitSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- exitSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- exitSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- exitSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- exitSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- exitSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- exitStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- exitStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- exitTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- exitTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- exitTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- exitTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- exitTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- exitUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- exitUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- exitWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- exitWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- exitWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- exitWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- exitWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- exitWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- expirationIntervalMillis - Variable in class org.apache.iceberg.CachingCatalog
- expire() - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
-
Expires snapshots and commits the changes to the table, returning a Dataset of files to delete.
- expireOlderThan(long) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Expires all snapshots older than the given timestamp.
- expireOlderThan(long) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Expires all snapshots older than the given timestamp.
- expireOlderThan(long) - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- expireSnapshotId(long) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Expires a specific Snapshot identified by id.
- expireSnapshotId(long) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Expires a specific Snapshot identified by id.
- expireSnapshotId(long) - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- expireSnapshots() - Method in class org.apache.iceberg.BaseTable
- expireSnapshots() - Method in class org.apache.iceberg.SerializableTable
- expireSnapshots() - Method in interface org.apache.iceberg.Table
-
Create a new expire API to manage snapshots in this table and commit.
- expireSnapshots() - Method in interface org.apache.iceberg.Transaction
-
Create a new expire API to manage snapshots in this table.
- expireSnapshots(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to expire snapshots.
- ExpireSnapshots - Interface in org.apache.iceberg.actions
-
An action that expires snapshots in a table.
- ExpireSnapshots - Interface in org.apache.iceberg
-
API for removing old snapshots from a table.
- ExpireSnapshots.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- ExpireSnapshotsProcedure - Class in org.apache.iceberg.spark.procedures
-
A procedure that expires snapshots in a table.
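A hedged sketch of the core org.apache.iceberg.ExpireSnapshots API listed above; the retention values below are arbitrary examples, not recommendations.
    import java.util.concurrent.TimeUnit;
    import org.apache.iceberg.Table;

    class ExpireSnapshotsExample {
      static void expireOldSnapshots(Table table) {
        long cutoff = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(7);
        table.expireSnapshots()          // returns org.apache.iceberg.ExpireSnapshots
            .expireOlderThan(cutoff)     // drop snapshots older than the cutoff
            .retainLast(5)               // but always keep the 5 most recent
            .commit();
      }
    }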
- EXPONENT_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- EXPONENT_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- EXPONENT_VALUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- exponentialBackoff(long, long, long, double) - Method in class org.apache.iceberg.util.Tasks.Builder
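A minimal sketch of Tasks with retries and exponential backoff; the items and backoff settings are placeholders.
    import java.util.Arrays;
    import org.apache.iceberg.util.Tasks;

    class RetryExample {
      public static void main(String[] args) {
        Tasks.foreach(Arrays.asList("a", "b", "c"))
            .retry(3)
            .exponentialBackoff(100, 5000, 60000, 2.0)  // min ms, max ms, total ms, scale factor
            .run(item -> System.out.println("processing " + item));
      }
    }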
- ExponentLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- expression() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- expression() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- expression() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- Expression - Interface in org.apache.iceberg.expressions
-
Represents a boolean expression tree.
- Expression.Operation - Enum in org.apache.iceberg.expressions
- ExpressionContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- Expressions - Class in org.apache.iceberg.expressions
-
Factory methods for creating expressions.
- ExpressionVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
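A small sketch of composing predicates with the Expressions factory methods; the column names are hypothetical.
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;

    class PredicateExample {
      // Builds an unbound expression tree that can be passed to scans or evaluators.
      static Expression recentBooks() {
        return Expressions.and(
            Expressions.equal("category", "books"),
            Expressions.greaterThanOrEqual("published_year", 2020));
      }
    }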
- ExpressionVisitors - Class in org.apache.iceberg.expressions
-
Utils for traversing expressions.
- ExpressionVisitors.BoundExpressionVisitor<R> - Class in org.apache.iceberg.expressions
- ExpressionVisitors.BoundVisitor<R> - Class in org.apache.iceberg.expressions
- ExpressionVisitors.ExpressionVisitor<R> - Class in org.apache.iceberg.expressions
- ExtendedLogicalWriteInfo - Interface in org.apache.spark.sql.connector.iceberg.write
-
A class that holds logical write information not covered by LogicalWriteInfo in Spark.
- extensionsEnabled(SparkSession) - Static method in class org.apache.iceberg.spark.Spark3Util
- EXTERNAL_TABLE_PURGE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- EXTRA_METADATA_PREFIX - Static variable in class org.apache.iceberg.SnapshotSummary
- extractCatalog(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.source.IcebergSource
- extractIdentifier(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.source.IcebergSource
- extraSnapshotMetadata() - Method in class org.apache.iceberg.spark.SparkWriteConf
F
- factory() - Static method in class org.apache.iceberg.util.JsonUtil
- factoryIdentifier() - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- FAILURE - org.apache.iceberg.BaseMetastoreTableOperations.CommitStatus
- False - Class in org.apache.iceberg.expressions
-
An expression that is always false.
- FALSE - org.apache.iceberg.expressions.Expression.Operation
- FALSE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- FALSE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- FALSE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- FALSE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- FANOUT_ENABLED - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- FanoutDataWriter<T> - Class in org.apache.iceberg.io
-
A data writer capable of writing to multiple specs and partitions that keeps data writers for each seen spec/partition pair open until this writer is closed.
- FanoutDataWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, FileFormat, long) - Constructor for class org.apache.iceberg.io.FanoutDataWriter
- fanoutWriterEnabled() - Method in class org.apache.iceberg.spark.SparkWriteConf
- fetchNewDataFiles(Table, String) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
- fetchSetIDs(Table) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
- fetchSetIDs(Table) - Method in class org.apache.iceberg.spark.FileScanTaskSetManager
- fetchTasks(Table, String) - Method in class org.apache.iceberg.spark.FileScanTaskSetManager
- field() - Method in class org.apache.iceberg.expressions.BoundReference
- field(int) - Method in class org.apache.iceberg.mapping.MappedFields
- field(int) - Method in class org.apache.iceberg.types.Type.NestedType
- field(int) - Method in class org.apache.iceberg.types.Types.ListType
- field(int) - Method in class org.apache.iceberg.types.Types.MapType
- field(int) - Method in class org.apache.iceberg.types.Types.StructType
- field(String) - Method in class org.apache.iceberg.types.Types.StructType
- field(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- field(Types.NestedField) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- field(Types.NestedField, Integer, Boolean) - Method in class org.apache.iceberg.schema.UnionByNameVisitor
- field(Types.NestedField, String) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- field(Types.NestedField, Supplier<List<String>>) - Method in class org.apache.iceberg.types.CheckCompatibility
- field(Types.NestedField, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- field(Types.NestedField, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- field(Types.NestedField, Supplier<Type>) - Method in class org.apache.iceberg.types.FixupTypes
- field(Types.NestedField, Supplier<T>) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- field(Types.NestedField, Map<Integer, Integer>) - Method in class org.apache.iceberg.types.IndexParents
- field(Types.NestedField, Map<String, Integer>) - Method in class org.apache.iceberg.types.IndexByName
- field(Types.NestedField, ObjectInspector) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- field(Types.NestedField, P, R) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- field(Types.NestedField, T) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- FIELD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- FIELD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- FIELD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- FIELD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- FIELD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- FIELD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- FIELD_ID_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- fieldId() - Method in class org.apache.iceberg.expressions.BoundReference
- fieldId() - Method in class org.apache.iceberg.PartitionField
-
Returns the partition field id across all the table metadata's partition specs.
- fieldId() - Method in class org.apache.iceberg.types.Types.NestedField
- fieldId(TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
- fieldList() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- fieldList() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- fieldList() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- FieldListContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- FieldMetrics<T> - Class in org.apache.iceberg
-
Iceberg internally tracked field level metrics.
- FieldMetrics(int, long, long, long, T, T) - Constructor for class org.apache.iceberg.FieldMetrics
- fieldNameAndType(LogicalType, int) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- fieldNameAndType(DataType, int) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- fieldNameAndType(P, int) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- fieldNames - Variable in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- fieldNames() - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- fieldPartner(P, int, String) - Method in interface org.apache.iceberg.schema.SchemaWithPartnerVisitor.PartnerAccessors
- fields - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- fields - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- fields() - Method in class org.apache.iceberg.mapping.MappedFields
- fields() - Method in class org.apache.iceberg.PartitionSpec
-
Returns the list of partition fields for this spec.
- fields() - Method in class org.apache.iceberg.SortOrder
-
Returns the list of sort fields for this sort order.
- fields() - Method in class org.apache.iceberg.types.Type.NestedType
- fields() - Method in class org.apache.iceberg.types.Types.ListType
- fields() - Method in class org.apache.iceberg.types.Types.MapType
- fields() - Method in class org.apache.iceberg.types.Types.StructType
- FIELDS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- FIELDS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- FIELDS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- FIELDS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- FIELDS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- fieldType(String) - Method in class org.apache.iceberg.types.Type.NestedType
- fieldType(String) - Method in class org.apache.iceberg.types.Types.ListType
- fieldType(String) - Method in class org.apache.iceberg.types.Types.MapType
- fieldType(String) - Method in class org.apache.iceberg.types.Types.StructType
- file() - Method in interface org.apache.iceberg.FileScanTask
-
The file to scan.
- file() - Method in class org.apache.iceberg.ManifestReader
- file() - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- FILE_FORMAT - Static variable in interface org.apache.iceberg.DataFile
- FILE_IO_IMPL - Static variable in class org.apache.iceberg.CatalogProperties
- FILE_OPEN_COST - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- FILE_PATH - Static variable in interface org.apache.iceberg.DataFile
- FILE_PATH - Static variable in class org.apache.iceberg.MetadataColumns
- FILE_SCAN_TASK_SET_ID - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- FILE_SIZE - Static variable in interface org.apache.iceberg.DataFile
- FileAppender<D> - Interface in org.apache.iceberg.io
- FileAppenderFactory<T> - Interface in org.apache.iceberg.io
-
Factory to create a new FileAppender to write records.
- FileContent - Enum in org.apache.iceberg
-
Content type stored in a file, one of DATA, POSITION_DELETES, or EQUALITY_DELETES.
- fileCount() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- FileFormat - Enum in org.apache.iceberg
-
Enum of supported file formats.
- fileIO() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- fileIO() - Method in class org.apache.iceberg.flink.actions.RewriteDataFilesAction
- FileIO - Interface in org.apache.iceberg.io
-
Pluggable module for reading, writing, and deleting files.
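A minimal sketch of reading through a table's configured FileIO; the path argument is a placeholder.
    import org.apache.iceberg.Table;
    import org.apache.iceberg.io.FileIO;
    import org.apache.iceberg.io.InputFile;

    class FileIOExample {
      // Returns the file length, or -1 if the file does not exist.
      static long lengthOf(Table table, String path) {
        FileIO io = table.io();
        InputFile file = io.newInputFile(path);
        return file.exists() ? file.getLength() : -1L;
      }
    }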
- FileMetadata - Class in org.apache.iceberg
- FileMetadata.Builder - Class in org.apache.iceberg
- fileMetrics(InputFile, MetricsConfig) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- fileMetrics(InputFile, MetricsConfig, NameMapping) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- fileOffset() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- FileRewriteCoordinator - Class in org.apache.iceberg.spark
- files() - Method in class org.apache.iceberg.BaseCombinedScanTask
- files() - Method in interface org.apache.iceberg.CombinedScanTask
-
Return the tasks in this combined task.
- Files - Class in org.apache.iceberg
- FILES - org.apache.iceberg.MetadataTableType
- fileScans() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- FileScanTask - Interface in org.apache.iceberg
-
A scan task over a range of a single file.
- FileScanTaskReader<T> - Interface in org.apache.iceberg.flink.source
-
Read a FileScanTask into a CloseableIterator.
- fileScanTaskSetId() - Method in class org.apache.iceberg.spark.SparkReadConf
- FileScanTaskSetManager - Class in org.apache.iceberg.spark
- fileSizeInBytes() - Method in interface org.apache.iceberg.ContentFile
-
Returns the file size in bytes.
- fileSizeInBytes() - Method in class org.apache.iceberg.spark.SparkDataFile
- FileWriter<T,R> - Interface in org.apache.iceberg.io
-
A writer capable of writing files of a single type (i.e.
- FileWriterFactory<T> - Interface in org.apache.iceberg.io
-
A factory for creating data and delete writers.
- filter() - Method in class org.apache.iceberg.events.IncrementalScanEvent
- filter() - Method in class org.apache.iceberg.events.ScanEvent
- filter() - Method in interface org.apache.iceberg.TableScan
-
Returns this scan's filter Expression.
- filter(Iterable<T>) - Method in class org.apache.iceberg.util.Filter
- filter(Expression) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Pass a row Expression to filter DataFiles to be rewritten.
- filter(Expression) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles
-
A filter for finding the equality deletes to convert.
- filter(Expression) - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
A user provided filter for determining which files will be considered by the rewrite strategy.
- filter(Expression) - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles
-
A filter for finding deletes to rewrite.
- filter(Expression) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- filter(Expression) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- filter(Expression) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- filter(Expression) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from the results of this filtered by the Expression.
- filter(CloseableIterable<E>, Predicate<E>) - Static method in interface org.apache.iceberg.io.CloseableIterable
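A small sketch of applying a row filter to a table scan and planning the matching files; the "category" column is hypothetical.
    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    class ScanFilterExample {
      // Prints the path of every data file that may contain matching rows.
      static void planMatchingFiles(Table table) throws java.io.IOException {
        try (CloseableIterable<FileScanTask> tasks =
                 table.newScan().filter(Expressions.equal("category", "books")).planFiles()) {
          tasks.forEach(task -> System.out.println(task.file().path()));
        }
      }
    }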
- filter(CloseableIterable<T>) - Method in class org.apache.iceberg.data.DeleteFilter
- filter(CloseableIterable<T>) - Method in class org.apache.iceberg.util.Filter
- filter(CloseableIterable<T>, Function<T, Long>, PositionDeleteIndex) - Static method in class org.apache.iceberg.deletes.Deletes
- filter(CloseableIterable<T>, Function<T, StructLike>, StructLikeSet) - Static method in class org.apache.iceberg.deletes.Deletes
- Filter<T> - Class in org.apache.iceberg.util
-
A Class for generic filters
- Filter() - Constructor for class org.apache.iceberg.util.Filter
- FILTER_EXPRESSION - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- FilterIterator<T> - Class in org.apache.iceberg.io
-
An Iterator that filters another Iterator.
- FilterIterator(Iterator<T>) - Constructor for class org.apache.iceberg.io.FilterIterator
- filterPartitions(List<SparkTableUtil.SparkPartition>, Map<String, String>) - Static method in class org.apache.iceberg.spark.SparkTableUtil
- filterPartitions(Expression) - Method in class org.apache.iceberg.ManifestReader
- filterRecords(boolean) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- filterRows(Expression) - Method in class org.apache.iceberg.ManifestReader
- filters(List<Expression>) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- finalize() - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- finalize() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- find(int) - Method in class org.apache.iceberg.mapping.NameMapping
- find(String...) - Method in class org.apache.iceberg.mapping.NameMapping
- find(List<String>) - Method in class org.apache.iceberg.mapping.NameMapping
- find(Schema, Predicate<Type>) - Static method in class org.apache.iceberg.types.TypeUtil
- findColumnName(int) - Method in class org.apache.iceberg.Schema
-
Returns the full column name for the given id.
- findEqualityDeleteRows(CloseableIterable<T>) - Method in class org.apache.iceberg.data.DeleteFilter
- findField(int) - Method in class org.apache.iceberg.Schema
-
Returns the sub-field identified by the field id as a Types.NestedField.
- findField(String) - Method in class org.apache.iceberg.Schema
-
Returns a sub-field by name as a Types.NestedField.
- FindFiles - Class in org.apache.iceberg
- FindFiles.Builder - Class in org.apache.iceberg
- findType(int) - Method in class org.apache.iceberg.Schema
-
Returns the Type of a sub-field identified by the field id.
- findType(String) - Method in class org.apache.iceberg.Schema
-
Returns the Type of a sub-field identified by the field name.
- finish() - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
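A small sketch of the Schema lookup methods listed above; the schema is made up for illustration.
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Type;
    import org.apache.iceberg.types.Types;

    class SchemaLookupExample {
      public static void main(String[] args) {
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.optional(2, "data", Types.StringType.get()));

        Types.NestedField byId = schema.findField(2);        // field "data"
        Types.NestedField byName = schema.findField("id");   // field "id"
        Type type = schema.findType("data");                 // string type
        String name = schema.findColumnName(1);              // "id"
        System.out.println(byId + " " + byName + " " + type + " " + name);
      }
    }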
- first() - Method in class org.apache.iceberg.util.Pair
- FIRST - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- FIRST - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- FIRST() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- FIRST() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- fixed(int) - Static method in class org.apache.iceberg.avro.ValueReaders
- fixed(int) - Static method in class org.apache.iceberg.avro.ValueWriters
- fixed(Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- FIXED - org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
- FIXED - org.apache.iceberg.types.Type.TypeID
- fixedLengthDecimalBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- FixedLengthDecimalBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedLengthDecimalBatchReader
- fixedLengthDecimalDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- fixedSizeBinaryBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- FixedSizeBinaryBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedSizeBinaryBatchReader
- fixedSizeBinaryDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- fixedWidthBinaryDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- fixedWidthTypeBinaryBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- FixedWidthTypeBinaryBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedWidthTypeBinaryBatchReader
- fixupPrimitive(Type.PrimitiveType, Type) - Method in class org.apache.iceberg.types.FixupTypes
- FixupTypes - Class in org.apache.iceberg.types
-
This is used to fix primitive types to match a table schema.
- FixupTypes(Schema) - Constructor for class org.apache.iceberg.types.FixupTypes
- FlinkAppenderFactory - Class in org.apache.iceberg.flink.sink
- FlinkAppenderFactory(Schema, RowType, Map<String, String>, PartitionSpec) - Constructor for class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- FlinkAppenderFactory(Schema, RowType, Map<String, String>, PartitionSpec, int[], Schema, Schema) - Constructor for class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- FlinkAvroReader - Class in org.apache.iceberg.flink.data
- FlinkAvroReader(Schema, Schema) - Constructor for class org.apache.iceberg.flink.data.FlinkAvroReader
- FlinkAvroReader(Schema, Schema, Map<Integer, ?>) - Constructor for class org.apache.iceberg.flink.data.FlinkAvroReader
- FlinkAvroWriter - Class in org.apache.iceberg.flink.data
- FlinkAvroWriter(RowType) - Constructor for class org.apache.iceberg.flink.data.FlinkAvroWriter
- FlinkCatalog - Class in org.apache.iceberg.flink
-
A Flink Catalog implementation that wraps an Iceberg Catalog.
- FlinkCatalog(String, String, Namespace, CatalogLoader, boolean) - Constructor for class org.apache.iceberg.flink.FlinkCatalog
- FlinkCatalogFactory - Class in org.apache.iceberg.flink
-
A Flink Catalog factory implementation that creates FlinkCatalog.
- FlinkCatalogFactory() - Constructor for class org.apache.iceberg.flink.FlinkCatalogFactory
- FlinkCompatibilityUtil - Class in org.apache.iceberg.flink.util
-
This is a small util class that tries to hide calls to Flink internal or PublicEvolving interfaces, as Flink can change those APIs during minor version releases.
- flinkConf(ReadableConfig) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- FlinkConfigOptions - Class in org.apache.iceberg.flink
- FlinkDynamicTableFactory - Class in org.apache.iceberg.flink
- FlinkDynamicTableFactory() - Constructor for class org.apache.iceberg.flink.FlinkDynamicTableFactory
- FlinkDynamicTableFactory(FlinkCatalog) - Constructor for class org.apache.iceberg.flink.FlinkDynamicTableFactory
- FlinkFilters - Class in org.apache.iceberg.flink
- FlinkInputFormat - Class in org.apache.iceberg.flink.source
-
Flink InputFormat for Iceberg.
- FlinkInputSplit - Class in org.apache.iceberg.flink.source
-
TODO Implement LocatableInputSplit.
- FlinkOrcReader - Class in org.apache.iceberg.flink.data
- FlinkOrcReader(Schema, TypeDescription) - Constructor for class org.apache.iceberg.flink.data.FlinkOrcReader
- FlinkOrcReader(Schema, TypeDescription, Map<Integer, ?>) - Constructor for class org.apache.iceberg.flink.data.FlinkOrcReader
- FlinkOrcWriter - Class in org.apache.iceberg.flink.data
- FlinkParquetReaders - Class in org.apache.iceberg.flink.data
- FlinkParquetWriters - Class in org.apache.iceberg.flink.data
- FlinkSchemaUtil - Class in org.apache.iceberg.flink
-
Converter between Flink types and Iceberg types.
- FlinkSink - Class in org.apache.iceberg.flink.sink
- FlinkSink.Builder - Class in org.apache.iceberg.flink.sink
- FlinkSource - Class in org.apache.iceberg.flink.source
- FlinkSource.Builder - Class in org.apache.iceberg.flink.source
-
Source builder to build DataStream.
- FlinkSplitPlanner - Class in org.apache.iceberg.flink.source
- FlinkTypeVisitor<T> - Class in org.apache.iceberg.flink
- FlinkTypeVisitor() - Constructor for class org.apache.iceberg.flink.FlinkTypeVisitor
- FlinkValueReaders - Class in org.apache.iceberg.flink.data
- FlinkValueWriters - Class in org.apache.iceberg.flink.data
- flipLR() - Method in enum org.apache.iceberg.expressions.Expression.Operation
-
Returns the equivalent operation when the left and right operands are exchanged.
- FLOAT - org.apache.iceberg.types.Type.TypeID
- FLOAT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- FLOAT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- FLOAT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- FloatAsDoubleReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.FloatAsDoubleReader
- floatBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- FloatBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FloatBatchReader
- floatDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- FloatFieldMetrics - Class in org.apache.iceberg
-
Iceberg internally tracked field level metrics, used by Parquet and ORC writers only.
- FloatFieldMetrics.Builder - Class in org.apache.iceberg
- FloatLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- floats() - Static method in class org.apache.iceberg.avro.ValueReaders
- floats() - Static method in class org.apache.iceberg.avro.ValueWriters
- floats() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- floats(int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- floats(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- FloatType() - Constructor for class org.apache.iceberg.types.Types.FloatType
- flush() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- footerMetrics(ParquetMetadata, Stream<FieldMetrics<?>>, MetricsConfig) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- footerMetrics(ParquetMetadata, Stream<FieldMetrics<?>>, MetricsConfig, NameMapping) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- foreach(I...) - Static method in class org.apache.iceberg.util.Tasks
- foreach(Iterable<I>) - Static method in class org.apache.iceberg.util.Tasks
- foreach(Iterator<I>) - Static method in class org.apache.iceberg.util.Tasks
- foreach(Stream<I>) - Static method in class org.apache.iceberg.util.Tasks
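A minimal sketch of the Tasks.foreach(...) entry points above; the file names and retry count are arbitrary, and the chained builder methods (retry, throwFailureWhenFinished, run) are the commonly used ones rather than an exhaustive list.

    import java.util.Arrays;
    import java.util.List;
    import org.apache.iceberg.util.Tasks;

    public class TasksForeachExample {
      public static void main(String[] args) {
        List<String> paths = Arrays.asList("a.parquet", "b.parquet", "c.parquet");

        // Run a task over each element, retrying transient failures before giving up.
        Tasks.foreach(paths)
            .retry(3)
            .throwFailureWhenFinished()
            .run(path -> System.out.println("processing " + path));
      }
    }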
- forHolder(VectorHolder, int[], int) - Static method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- format() - Method in interface org.apache.iceberg.ContentFile
-
Returns format of the file.
- format() - Method in class org.apache.iceberg.spark.SparkDataFile
- format(FileFormat) - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- FORMAT_VERSION - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for table format version.
- formatTimestampMillis(long) - Static method in class org.apache.iceberg.util.DateTimeUtil
- formatVersion() - Method in class org.apache.iceberg.MetadataUpdate.UpgradeFormatVersion
- formatVersion() - Method in class org.apache.iceberg.TableMetadata
- forPartitionFilter(Expression, PartitionSpec, boolean) - Static method in class org.apache.iceberg.expressions.ManifestEvaluator
- forPositionDelete(Table) - Static method in class org.apache.iceberg.MetricsConfig
-
Creates a metrics config for a position delete file.
- forRow(DataStream<Row>, TableSchema) - Static method in class org.apache.iceberg.flink.sink.FlinkSink
-
Initialize a FlinkSink.Builder to export the data from an input data stream of Rows into an Iceberg table.
- forRowData() - Static method in class org.apache.iceberg.flink.source.FlinkSource
-
Initialize a FlinkSource.Builder to read data from an Iceberg table.
- forRowData(DataStream<RowData>) - Static method in class org.apache.iceberg.flink.sink.FlinkSink
-
Initialize a FlinkSink.Builder to export the data from an input data stream of RowData into an Iceberg table.
- forRowFilter(Expression, PartitionSpec, boolean) - Static method in class org.apache.iceberg.expressions.ManifestEvaluator
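As a rough sketch of how the FlinkSource and FlinkSink builders above fit together; the table paths are hypothetical, and the builder methods env, tableLoader, build, and append are assumed from the Flink module rather than listed in this section.

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.data.RowData;
    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;
    import org.apache.iceberg.flink.source.FlinkSource;

    public class FlinkReadWriteExample {
      public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Load source and sink tables by their Hadoop locations (paths are made up).
        TableLoader sourceLoader = TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/db/events");
        TableLoader sinkLoader = TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/db/events_copy");

        // Read RowData records from the source table as a bounded stream.
        DataStream<RowData> rows = FlinkSource.forRowData()
            .env(env)
            .tableLoader(sourceLoader)
            .build();

        // Append the same records to the sink table.
        FlinkSink.forRowData(rows)
            .tableLoader(sinkLoader)
            .append();

        env.execute("iceberg-copy");
      }
    }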
- forTable(StreamExecutionEnvironment, Table) - Static method in class org.apache.iceberg.flink.actions.Actions
- forTable(Table) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- forTable(Table) - Static method in class org.apache.iceberg.flink.actions.Actions
- forTable(Table) - Static method in class org.apache.iceberg.MetricsConfig
-
Creates a metrics config from a table.
- forTable(Table) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- forTable(Table) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- forType(Type) - Static method in interface org.apache.iceberg.types.JavaHash
- forType(Type.PrimitiveType) - Static method in class org.apache.iceberg.types.Comparators
- forType(Types.ListType) - Static method in class org.apache.iceberg.types.Comparators
- forType(Types.StructType) - Static method in class org.apache.iceberg.types.Comparators
- forType(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeWrapper
- from(String) - Static method in enum org.apache.iceberg.MetadataTableType
- from(Map<String, String>) - Static method in class org.apache.iceberg.aliyun.AliyunClientFactories
- from(Map<String, String>) - Static method in class org.apache.iceberg.aws.AwsClientFactories
- from(Map<String, String>) - Static method in class org.apache.iceberg.util.LockManagers
- from(Snapshot, FileIO) - Static method in class org.apache.iceberg.MicroBatches
- fromByteBuffer(Type, ByteBuffer) - Static method in class org.apache.iceberg.types.Conversions
- fromCatalog(CatalogLoader, TableIdentifier) - Static method in interface org.apache.iceberg.flink.TableLoader
- fromCombinedScanTask(CombinedScanTask) - Static method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- fromCombinedScanTask(CombinedScanTask, int, long) - Static method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- fromFileName(CharSequence) - Static method in enum org.apache.iceberg.FileFormat
- fromFileName(String) - Static method in enum org.apache.iceberg.TableMetadataParser.Codec
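For instance, a data file format or a metadata codec can be inferred from a file name with the two fromFileName methods above; the file names here are illustrative only.

    import org.apache.iceberg.FileFormat;
    import org.apache.iceberg.TableMetadataParser;

    public class FromFileNameExample {
      public static void main(String[] args) {
        // Infer the data file format from the file extension.
        FileFormat format = FileFormat.fromFileName("00000-0-data.parquet");

        // Infer the metadata codec (e.g. gzip) from a metadata file name.
        TableMetadataParser.Codec codec = TableMetadataParser.Codec.fromFileName("v3.gz.metadata.json");

        System.out.println(format + " / " + codec);
      }
    }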
- fromFixed(GenericFixed, Schema, LogicalType) - Method in class org.apache.iceberg.avro.UUIDConversion
- fromHadoopTable(String) - Static method in interface org.apache.iceberg.flink.TableLoader
- fromHadoopTable(String, Configuration) - Static method in interface org.apache.iceberg.flink.TableLoader
- fromInputFile(InputFile) - Static method in class org.apache.iceberg.orc.OrcMetrics
- fromInputFile(InputFile, MetricsConfig) - Static method in class org.apache.iceberg.orc.OrcMetrics
- fromInputFile(InputFile, MetricsConfig, NameMapping) - Static method in class org.apache.iceberg.orc.OrcMetrics
- fromJson(JsonNode) - Static method in class org.apache.iceberg.SchemaParser
- fromJson(String) - Static method in class org.apache.iceberg.mapping.NameMappingParser
- fromJson(String) - Static method in class org.apache.iceberg.SchemaParser
- fromJson(FileIO, String) - Static method in class org.apache.iceberg.SnapshotParser
- fromJson(FileIO, String) - Static method in class org.apache.iceberg.TableMetadataParser
-
Read TableMetadata from a JSON string.
- fromJson(FileIO, String, String) - Static method in class org.apache.iceberg.TableMetadataParser
-
Read TableMetadata from a JSON string.
- fromJson(Schema, JsonNode) - Static method in class org.apache.iceberg.PartitionSpecParser
- fromJson(Schema, JsonNode) - Static method in class org.apache.iceberg.SortOrderParser
- fromJson(Schema, String) - Static method in class org.apache.iceberg.PartitionSpecParser
- fromJson(Schema, String) - Static method in class org.apache.iceberg.SortOrderParser
- fromLocation(CharSequence, long, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromLocation(CharSequence, long, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromLocation(CharSequence, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromLocation(CharSequence, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopOutputFile
- fromLocation(CharSequence, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromLocation(CharSequence, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopOutputFile
- fromLocation(String, Storage, GCPProperties) - Static method in class org.apache.iceberg.gcp.gcs.GCSInputFile
- fromLocation(String, Storage, GCPProperties) - Static method in class org.apache.iceberg.gcp.gcs.GCSOutputFile
- fromLocation(String, S3Client) - Static method in class org.apache.iceberg.aws.s3.S3InputFile
- fromLocation(String, S3Client) - Static method in class org.apache.iceberg.aws.s3.S3OutputFile
- fromLocation(String, S3Client, AwsProperties) - Static method in class org.apache.iceberg.aws.s3.S3InputFile
- fromLocation(String, S3Client, AwsProperties) - Static method in class org.apache.iceberg.aws.s3.S3OutputFile
- fromManifest(ManifestFile) - Static method in class org.apache.iceberg.DataFiles
- fromName(String) - Static method in enum org.apache.iceberg.DistributionMode
- fromName(String) - Static method in enum org.apache.iceberg.IsolationLevel
- fromName(String) - Static method in enum org.apache.iceberg.RowLevelOperationMode
- fromName(String) - Static method in enum org.apache.iceberg.TableMetadataParser.Codec
- fromPartitionString(Type, String) - Static method in class org.apache.iceberg.types.Conversions
- fromPath(Path, long, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, long, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, long, FileSystem, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopOutputFile
- fromPath(Path, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopOutputFile
- fromPath(Path, FileSystem, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromPath(Path, FileSystem, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopOutputFile
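A small sketch combining the fromLocation/fromPath factories above; the HDFS paths are placeholders.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.iceberg.hadoop.HadoopInputFile;
    import org.apache.iceberg.hadoop.HadoopOutputFile;
    import org.apache.iceberg.io.InputFile;
    import org.apache.iceberg.io.OutputFile;

    public class HadoopFileExample {
      public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Wrap an existing file as an Iceberg InputFile.
        InputFile in = HadoopInputFile.fromLocation("hdfs://nn:8020/data/part-00000.parquet", conf);
        System.out.println(in.location() + " is " + in.getLength() + " bytes");

        // Create an OutputFile for a path that will be written later.
        OutputFile out = HadoopOutputFile.fromPath(new Path("hdfs://nn:8020/data/part-00001.parquet"), conf);
        System.out.println("will write " + out.location());
      }
    }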
- fromPrimitiveString(String) - Static method in class org.apache.iceberg.types.Types
- fromProperties(Map<String, String>) - Static method in class org.apache.iceberg.MetricsConfig
-
Deprecated.
- fromSnapshotId() - Method in class org.apache.iceberg.events.IncrementalScanEvent
- fromStatus(FileStatus, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromStatus(FileStatus, FileSystem) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromStatus(FileStatus, FileSystem, Configuration) - Static method in class org.apache.iceberg.hadoop.HadoopInputFile
- fromString(String) - Static method in class org.apache.iceberg.MetricsModes
- fromString(Type, String) - Static method in class org.apache.iceberg.transforms.Transforms
- Full() - Constructor for class org.apache.iceberg.MetricsModes.Full
- fullTableName(String, TableIdentifier) - Static method in class org.apache.iceberg.BaseMetastoreCatalog
- functionExists(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
G
- GC_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- GC_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- GCPProperties - Class in org.apache.iceberg.gcp
- GCPProperties() - Constructor for class org.apache.iceberg.gcp.GCPProperties
- GCPProperties(Map<String, String>) - Constructor for class org.apache.iceberg.gcp.GCPProperties
- GCS_CHANNEL_READ_CHUNK_SIZE - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_CHANNEL_WRITE_CHUNK_SIZE - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_CLIENT_LIB_TOKEN - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_DECRYPTION_KEY - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_ENCRYPTION_KEY - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_PROJECT_ID - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_SERVICE_HOST - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCS_USER_PROJECT - Static variable in class org.apache.iceberg.gcp.GCPProperties
- GCSFileIO - Class in org.apache.iceberg.gcp.gcs
-
FileIO implementation backed by Google Cloud Storage (GCS).
- GCSFileIO() - Constructor for class org.apache.iceberg.gcp.gcs.GCSFileIO
-
No-arg constructor to load the FileIO dynamically.
- GCSFileIO(SerializableSupplier<Storage>, GCPProperties) - Constructor for class org.apache.iceberg.gcp.gcs.GCSFileIO
-
Constructor with custom storage supplier and GCP properties.
- GCSInputFile - Class in org.apache.iceberg.gcp.gcs
- GCSOutputFile - Class in org.apache.iceberg.gcp.gcs
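A minimal sketch of loading GCSFileIO dynamically and reading a file, assuming the GCP project id is supplied through the GCPProperties key; the bucket and object names are placeholders.

    import java.util.Collections;
    import org.apache.iceberg.gcp.GCPProperties;
    import org.apache.iceberg.gcp.gcs.GCSFileIO;
    import org.apache.iceberg.io.InputFile;

    public class GcsFileIOExample {
      public static void main(String[] args) {
        // Use the no-arg constructor and initialize with catalog properties.
        GCSFileIO io = new GCSFileIO();
        io.initialize(Collections.singletonMap(GCPProperties.GCS_PROJECT_ID, "my-project"));

        InputFile file = io.newInputFile("gs://my-bucket/warehouse/db/events/metadata/v1.metadata.json");
        System.out.println(file.location() + " exists: " + file.exists());
      }
    }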
- generate(long, long, boolean) - Method in class org.apache.iceberg.MicroBatches.MicroBatchBuilder
- generateFilterExpression(SearchArgument) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergFilterFactory
- GENERIC - org.apache.iceberg.mr.InputFormatConfig.InMemoryDataModel
- GenericAppenderFactory - Class in org.apache.iceberg.data
-
Factory to create a new FileAppender to write Records.
- GenericAppenderFactory(Schema) - Constructor for class org.apache.iceberg.data.GenericAppenderFactory
- GenericAppenderFactory(Schema, PartitionSpec) - Constructor for class org.apache.iceberg.data.GenericAppenderFactory
- GenericAppenderFactory(Schema, PartitionSpec, int[], Schema, Schema) - Constructor for class org.apache.iceberg.data.GenericAppenderFactory
- GenericArrowVectorAccessorFactory<DecimalT,Utf8StringT,ArrayT,ChildVectorT extends java.lang.AutoCloseable> - Class in org.apache.iceberg.arrow.vectorized
-
This class creates typed ArrowVectorAccessor from VectorHolder.
- GenericArrowVectorAccessorFactory(Supplier<GenericArrowVectorAccessorFactory.DecimalFactory<DecimalT>>, Supplier<GenericArrowVectorAccessorFactory.StringFactory<Utf8StringT>>, Supplier<GenericArrowVectorAccessorFactory.StructChildFactory<ChildVectorT>>, Supplier<GenericArrowVectorAccessorFactory.ArrayFactory<ChildVectorT, ArrayT>>) - Constructor for class org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory
-
The constructor is parameterized using the decimal, string, struct and array factories.
- GenericArrowVectorAccessorFactory.ArrayFactory<ChildVectorT,ArrayT> - Interface in org.apache.iceberg.arrow.vectorized
-
Create an array value of type ArrayT from arrow vector value.
- GenericArrowVectorAccessorFactory.DecimalFactory<DecimalT> - Interface in org.apache.iceberg.arrow.vectorized
-
Create a decimal value of type DecimalT from arrow vector value.
- GenericArrowVectorAccessorFactory.StringFactory<Utf8StringT> - Interface in org.apache.iceberg.arrow.vectorized
-
Create a UTF8 String value of type Utf8StringT from arrow vector value.
- GenericArrowVectorAccessorFactory.StructChildFactory<ChildVectorT> - Interface in org.apache.iceberg.arrow.vectorized
-
Create a struct child vector of type ChildVectorT from arrow vector value.
- GenericDeleteFilter - Class in org.apache.iceberg.data
- GenericDeleteFilter(FileIO, FileScanTask, Schema, Schema) - Constructor for class org.apache.iceberg.data.GenericDeleteFilter
- genericFixed(int) - Static method in class org.apache.iceberg.avro.ValueWriters
- GenericManifestFile - Class in org.apache.iceberg
- GenericManifestFile(String, long, int, ManifestContent, long, long, Long, int, long, int, long, int, long, List<ManifestFile.PartitionFieldSummary>, ByteBuffer) - Constructor for class org.apache.iceberg.GenericManifestFile
- GenericManifestFile(Schema) - Constructor for class org.apache.iceberg.GenericManifestFile
-
Used by Avro reflection to instantiate this class when reading manifest files.
- GenericManifestFile.CopyBuilder - Class in org.apache.iceberg
- GenericOrcReader - Class in org.apache.iceberg.data.orc
- GenericOrcReader(Schema, TypeDescription, Map<Integer, ?>) - Constructor for class org.apache.iceberg.data.orc.GenericOrcReader
- GenericOrcReaders - Class in org.apache.iceberg.data.orc
- GenericOrcWriter - Class in org.apache.iceberg.data.orc
- GenericOrcWriters - Class in org.apache.iceberg.data.orc
- GenericOrcWriters.StructWriter<S> - Class in org.apache.iceberg.data.orc
- GenericParquetReaders - Class in org.apache.iceberg.data.parquet
- GenericParquetWriter - Class in org.apache.iceberg.data.parquet
- GenericPartitionFieldSummary - Class in org.apache.iceberg
- GenericPartitionFieldSummary(boolean, boolean, ByteBuffer, ByteBuffer) - Constructor for class org.apache.iceberg.GenericPartitionFieldSummary
- GenericPartitionFieldSummary(Schema) - Constructor for class org.apache.iceberg.GenericPartitionFieldSummary
-
Used by Avro reflection to instantiate this class when reading manifest files.
- GenericRecord - Class in org.apache.iceberg.data
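A brief sketch of building an in-memory GenericRecord for a hypothetical schema; the field names and ids are made up.

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.data.GenericRecord;
    import org.apache.iceberg.types.Types;

    public class GenericRecordExample {
      public static void main(String[] args) {
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.optional(2, "data", Types.StringType.get()));

        // Create a record for the schema and set its fields by name.
        GenericRecord record = GenericRecord.create(schema);
        record.setField("id", 1L);
        record.setField("data", "hello");
        System.out.println(record.getField("data"));
      }
    }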
- get() - Method in class org.apache.iceberg.common.DynFields.BoundField
- get() - Method in class org.apache.iceberg.common.DynFields.StaticField
- get() - Method in class org.apache.iceberg.data.InternalRecordWrapper
- get() - Static method in class org.apache.iceberg.hadoop.HiddenPathFilter
- get() - Method in class org.apache.iceberg.hadoop.SerializableConfiguration
- get() - Static method in class org.apache.iceberg.MetricsModes.Counts
- get() - Static method in class org.apache.iceberg.MetricsModes.Full
- get() - Static method in class org.apache.iceberg.MetricsModes.None
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- get() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- get() - Method in class org.apache.iceberg.mr.mapred.Container
- get() - Static method in class org.apache.iceberg.spark.actions.SparkActions
- get() - Static method in class org.apache.iceberg.spark.FileRewriteCoordinator
- get() - Static method in class org.apache.iceberg.spark.FileScanTaskSetManager
- get() - Static method in class org.apache.iceberg.types.Types.BinaryType
- get() - Static method in class org.apache.iceberg.types.Types.BooleanType
- get() - Static method in class org.apache.iceberg.types.Types.DateType
- get() - Static method in class org.apache.iceberg.types.Types.DoubleType
- get() - Static method in class org.apache.iceberg.types.Types.FloatType
- get() - Static method in class org.apache.iceberg.types.Types.IntegerType
- get() - Static method in class org.apache.iceberg.types.Types.LongType
- get() - Static method in class org.apache.iceberg.types.Types.StringType
- get() - Static method in class org.apache.iceberg.types.Types.TimeType
- get() - Static method in class org.apache.iceberg.types.Types.UUIDType
- get() - Method in interface org.apache.iceberg.types.TypeUtil.NextID
- get() - Method in class org.apache.iceberg.util.CharSequenceWrapper
- get() - Method in class org.apache.iceberg.util.StructLikeWrapper
- get(int) - Method in class org.apache.iceberg.data.GenericRecord
- get(int) - Method in interface org.apache.iceberg.data.Record
- get(int) - Method in class org.apache.iceberg.GenericManifestFile
- get(int) - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- get(int) - Method in class org.apache.iceberg.util.Pair
- get(int, int) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- get(int, Class<T>) - Method in class org.apache.iceberg.data.GenericRecord
- get(int, Class<T>) - Method in class org.apache.iceberg.data.InternalRecordWrapper
- get(int, Class<T>) - Method in class org.apache.iceberg.deletes.PositionDelete
- get(int, Class<T>) - Method in class org.apache.iceberg.flink.RowDataWrapper
- get(int, Class<T>) - Method in class org.apache.iceberg.GenericManifestFile
- get(int, Class<T>) - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- get(int, Class<T>) - Method in class org.apache.iceberg.PartitionKey
- get(int, Class<T>) - Method in class org.apache.iceberg.spark.SparkStructLike
- get(int, Class<T>) - Method in interface org.apache.iceberg.StructLike
- get(int, Class<T>) - Method in class org.apache.iceberg.util.StructProjection
- get(Object) - Method in class org.apache.iceberg.common.DynFields.UnboundField
- get(Object) - Method in class org.apache.iceberg.util.SerializableMap
- get(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- get(PositionDelete<R>, int) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PositionDeleteStructWriter
- get(SparkSession) - Static method in class org.apache.iceberg.spark.actions.SparkActions
- get(S, int) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- get(S, int) - Method in class org.apache.iceberg.avro.ValueWriters.StructWriter
- get(S, int) - Method in class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- get(S, int) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- get(T) - Method in interface org.apache.iceberg.Accessor
- getAddedSnapshotId() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- getAliases() - Method in class org.apache.iceberg.Schema
-
Returns an alias map for this schema, if set.
- getAllStructFieldRefs() - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getArity() - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getArray(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getArray(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getArray(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getArray(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getArray(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getATN() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getATN() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- getAuthorizationProvider() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getBinary(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getBinary(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getBinary(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getBinary(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getBinary(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getBinary(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getBlockLocations(long, long) - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getBool(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getBoolean(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getBoolean(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getBoolean(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getBoolean(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getBoolean(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getBoolean(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getBuffer(int) - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- getByte(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getByte(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getByte(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getCategory() - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getChangelogMode() - Method in class org.apache.iceberg.flink.IcebergTableSource
- getChangelogMode(ChangelogMode) - Method in class org.apache.iceberg.flink.IcebergTableSink
- getChannelNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getChild(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getChild(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getConf() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- getConf() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- getConf() - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- getConf() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getConf() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- getConf() - Method in class org.apache.iceberg.hadoop.HadoopTables
- getConf() - Method in class org.apache.iceberg.hive.HiveCatalog
- getConf() - Method in class org.apache.iceberg.io.ResolvingFileIO
- getConf() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getConf() - Method in class org.apache.iceberg.nessie.NessieCatalog
- getConstant() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- getConstructedClass() - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- getConvertedType() - Method in class org.apache.iceberg.avro.UUIDConversion
- getCurrentKey() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- getCurrentValue() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- getDatabase(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getDecimal(int, int, int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getDecimal(int, int, int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getDecimal(int, int, int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getDecimal(int, int, int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getDecimal(int, int, int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getDefault() - Static method in class org.apache.iceberg.MetricsConfig
- getDelegate() - Method in interface org.apache.iceberg.io.DelegatingInputStream
- getDelegate() - Method in interface org.apache.iceberg.io.DelegatingOutputStream
- getDouble(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getDouble(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getDouble(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getDouble(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getDouble(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getDouble(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getElement(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- getElement(List<E>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- getElementId(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- getFactory() - Method in class org.apache.iceberg.flink.FlinkCatalog
- getFeatures() - Method in class org.apache.iceberg.pig.IcebergStorage
- getField(I, int) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- getField(String) - Method in class org.apache.iceberg.data.GenericRecord
- getField(String) - Method in interface org.apache.iceberg.data.Record
- getFieldId(Schema.Field) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- getFieldsBySourceId(int) - Method in class org.apache.iceberg.PartitionSpec
-
Returns the field that partitions the given source field.
- getFieldVector() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getFileExtension(String) - Static method in class org.apache.iceberg.TableMetadataParser
- getFileExtension(TableMetadataParser.Codec) - Static method in class org.apache.iceberg.TableMetadataParser
- getFileSystem() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getFileSystem() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- getFileSystem(Path, Configuration) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- getFloat(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getFloat(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getFloat(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getFloat(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getFloat(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getFloat(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getFormat() - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- getFs(Path, Configuration) - Static method in class org.apache.iceberg.hadoop.Util
- getFunction(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getGenericClass() - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.DecimalFactory
-
Class of concrete decimal type.
- getGenericClass() - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.StringFactory
-
Class of concrete UTF8 String type.
- getGenericClass() - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.StructChildFactory
-
Class of concrete child vector type.
- getGrammarFileName() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getGrammarFileName() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- getHiveRecordWriter(JobConf, Path, Class, boolean, Properties, Progressable) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- getInputFile(String) - Method in class org.apache.iceberg.data.DeleteFilter
- getInputFile(String) - Method in class org.apache.iceberg.data.GenericDeleteFilter
- getInputFile(String) - Method in class org.apache.iceberg.encryption.InputFilesDecryptor
- getInputFile(FileScanTask) - Method in class org.apache.iceberg.encryption.InputFilesDecryptor
- getInputFormat() - Method in class org.apache.iceberg.pig.IcebergStorage
- getInputFormatClass() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getInputSplitAssigner(FlinkInputSplit[]) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- getInt(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getInt(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getInt(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getInt(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getInt(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getInt(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getInt(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getIntegerSetOrNull(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getIntOrNull(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getJobGroupInfo(SparkContext) - Static method in class org.apache.iceberg.spark.JobGroupUtils
- getKey() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ReusableEntry
- getKeyId(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- getLegacyReporter() - Method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat.CompatibilityTaskAttemptContextImpl
- getLength() - Method in class org.apache.iceberg.aliyun.oss.OSSInputFile
- getLength() - Method in class org.apache.iceberg.aws.s3.S3InputFile
-
Note: this may be stale if the file was deleted, since metadata is cached for size/existence checks.
- getLength() - Method in class org.apache.iceberg.gcp.gcs.GCSInputFile
- getLength() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getLength() - Method in interface org.apache.iceberg.io.InputFile
-
Returns the total length of the file, in bytes
- getLength() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- getLength() - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- getLength() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- getLocations() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- getLocations() - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- getLogicalTypeName() - Method in class org.apache.iceberg.avro.UUIDConversion
- getLong(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getLong(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getLong(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getLong(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getLong(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getLong(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getLong(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getMap(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getMap(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getMap(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getMetadata() - Method in class org.apache.iceberg.avro.AvroIterable
- getMetaHook() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getModeNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getNext() - Method in class org.apache.iceberg.pig.IcebergStorage
- getObjectInspector() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- getOldFileExtension(TableMetadataParser.Codec) - Static method in class org.apache.iceberg.TableMetadataParser
- getOutputFormatClass() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getPair(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- getPair(Map<K, V>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- getPartition(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getPartitionColumnStatistics(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getPartitionKeys(String, Job) - Method in class org.apache.iceberg.pig.IcebergStorage
- getPartitions(SparkSession, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns all partitions in the table.
- getPartitions(SparkSession, Path, String, Map<String, String>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Use Spark to list all partitions in the table.
- getPartitions(SparkSession, TableIdentifier, Map<String, String>) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns all partitions in the table.
- getPartitionsByFilter(SparkSession, String, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns partitions that match the specified 'predicate'.
- getPartitionsByFilter(SparkSession, TableIdentifier, Expression) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns partitions that match the specified 'predicate'.
- getPartitionSpecId() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- getPartitionStatistics(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getPath() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getPath() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- getPath() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- getPath() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- getPos() - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- getPos() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- getPos() - Method in class org.apache.iceberg.io.PositionOutputStream
-
Return the current position in the OutputStream.
- getPos() - Method in class org.apache.iceberg.io.SeekableInputStream
-
Return the current position in the InputStream.
- getPredicateFields(String, Job) - Method in class org.apache.iceberg.pig.IcebergStorage
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- getPrimitiveJavaObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- getPrimitiveWritableObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- getProgress() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- getProgress() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- getProjectedIds(Schema) - Static method in class org.apache.iceberg.types.TypeUtil
- getProjectedIds(Type) - Static method in class org.apache.iceberg.types.TypeUtil
- getRawValue(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getRecordReader(InputSplit, JobConf, Reporter) - Method in class org.apache.iceberg.mr.hive.HiveIcebergInputFormat
- getRecordReader(InputSplit, JobConf, Reporter) - Method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
- getRecordWriter(FileSystem, JobConf, String, Progressable) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- getResultType(Type) - Method in interface org.apache.iceberg.transforms.Transform
-
Returns the Type produced by this transform given a source type.
- getResultType(Type) - Method in class org.apache.iceberg.transforms.UnknownTransform
- getRow(int, int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getRowKind() - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- getRuleIndex() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- getRuleNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getRuleNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- getScanRuntimeProvider(ScanTableSource.ScanContext) - Method in class org.apache.iceberg.flink.IcebergTableSource
- getSchema() - Method in class org.apache.iceberg.GenericManifestFile
- getSchema() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- getSchema() - Method in class org.apache.iceberg.util.Pair
- getSchema(String, Job) - Method in class org.apache.iceberg.pig.IcebergStorage
- getSerDeClass() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- getSerDeStats() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- getSerializedATN() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getSerializedATN() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- getSerializedClass() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- getShort(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getShort(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getShort(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getSinkRuntimeProvider(DynamicTableSink.Context) - Method in class org.apache.iceberg.flink.IcebergTableSink
- getSplitNumber() - Method in class org.apache.iceberg.flink.source.FlinkInputSplit
- getSplitOffsets(ParquetMetadata) - Static method in class org.apache.iceberg.parquet.ParquetUtil
-
Returns a list of offsets in ascending order determined by the starting position of the row groups.
- getSplits(JobConf, int) - Method in class org.apache.iceberg.mr.hive.HiveIcebergInputFormat
- getSplits(JobConf, int) - Method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
- getSplits(JobContext) - Method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
- getSplits(JobContext) - Method in class org.apache.iceberg.pig.IcebergPigInputFormat
- getStart() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- getStat() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- getStatistics(String, Job) - Method in class org.apache.iceberg.pig.IcebergStorage
- getStatistics(BaseStatistics) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- getString(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- getString(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getString(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getStringList(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getStringMap(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
- getStringOrNull(String, JsonNode) - Static method in class org.apache.iceberg.util.JsonUtil
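The JsonUtil accessors above operate on Jackson JsonNode values; a small sketch follows, where the JSON payload is made up and Jackson's ObjectMapper is used only to build the node.

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.iceberg.util.JsonUtil;

    public class JsonUtilExample {
      public static void main(String[] args) throws Exception {
        JsonNode node = new ObjectMapper()
            .readTree("{\"format-version\": 2, \"location\": \"s3://bucket/tbl\"}");

        int version = JsonUtil.getInt("format-version", node);
        String location = JsonUtil.getString("location", node);
        String comment = JsonUtil.getStringOrNull("comment", node);  // missing key -> null

        System.out.println(version + " " + location + " " + comment);
      }
    }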
- getStructFieldData(Object, StructField) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getStructFieldRef(String) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getStructFieldsDataAsList(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getSupportedExpressionTypes() - Method in class org.apache.iceberg.pig.IcebergStorage
- getSupportedFeatures() - Method in class org.apache.iceberg.mr.hive.HiveIcebergInputFormat
- getTable(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getTable(StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.source.IcebergSource
- getTableColumnStatistics(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getTableStatistics(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- getTimestamp(int, int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- getTokenNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
-
Deprecated.
- getTokenNames() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
-
Deprecated.
- getType() - Static method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
- getType(Types.StructType) - Static method in interface org.apache.iceberg.DataFile
- getTypeName() - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- getUri() - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- getUTF8String(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getUTF8String(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- getUTF8String(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- getUTF8String(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- getValue() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ReusableEntry
- getValueId(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- getValues() - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- getVector() - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- getVectorAccessor(VectorHolder) - Method in class org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory
- getVersion() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- getVocabulary() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- getVocabulary() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- getWorkerPool() - Static method in class org.apache.iceberg.util.ThreadPools
-
Return an
ExecutorService
that uses the "worker" thread-pool. - globalIndex() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- globalIndex() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupInfo
-
Returns which file group this is out of the total set of file groups for this rewrite.
- glue() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- glue() - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Create an AWS Glue client.
- GLUE_CATALOG_ID - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The ID of the Glue Data Catalog where the tables reside.
- GLUE_CATALOG_SKIP_ARCHIVE - Static variable in class org.apache.iceberg.aws.AwsProperties
-
If Glue should skip archiving an old table version when creating a new version in a commit.
- GLUE_CATALOG_SKIP_ARCHIVE_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- GlueCatalog - Class in org.apache.iceberg.aws.glue
- GlueCatalog() - Constructor for class org.apache.iceberg.aws.glue.GlueCatalog
-
No-arg constructor to load the catalog dynamically.
- glueCatalogId() - Method in class org.apache.iceberg.aws.AwsProperties
- glueCatalogSkipArchive() - Method in class org.apache.iceberg.aws.AwsProperties
- greaterThan(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- greaterThan(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
- greaterThanOrEqual(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- greaterThanOrEqual(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
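For example, the comparison factories above can be combined with Expressions.and to build a filter; the column names and values are illustrative.

    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;

    public class FilterExpressionExample {
      public static void main(String[] args) {
        // id >= 5 AND ts > 1000
        Expression filter = Expressions.and(
            Expressions.greaterThanOrEqual("id", 5),
            Expressions.greaterThan("ts", 1000L));
        System.out.println(filter);
      }
    }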
- groupId() - Method in class org.apache.iceberg.spark.JobGroupInfo
- gt(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- gt(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- GT - org.apache.iceberg.expressions.Expression.Operation
- GT_EQ - org.apache.iceberg.expressions.Expression.Operation
- gtEq(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- gtEq(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- GuavaClasses - Class in org.apache.iceberg
- GuavaClasses() - Constructor for class org.apache.iceberg.GuavaClasses
- GZIP - org.apache.iceberg.TableMetadataParser.Codec
H
- hadoop(String, Configuration, Map<String, String>) - Static method in interface org.apache.iceberg.flink.CatalogLoader
- HADOOP_CATALOG - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- HADOOP_CATALOG_WAREHOUSE_LOCATION - Static variable in class org.apache.iceberg.mr.InputFormatConfig
-
Deprecated. Please use InputFormatConfig.catalogPropertyConfigKey(String, String) with config key CatalogProperties.WAREHOUSE_LOCATION to specify the warehouse location of a catalog.
- HADOOP_TABLES - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- HadoopCatalog - Class in org.apache.iceberg.hadoop
-
HadoopCatalog provides a way to use table names like db.table to work with path-based tables under a common location.
- HadoopCatalog() - Constructor for class org.apache.iceberg.hadoop.HadoopCatalog
- HadoopCatalog(Configuration, String) - Constructor for class org.apache.iceberg.hadoop.HadoopCatalog
-
The constructor of the HadoopCatalog.
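A brief sketch of the HadoopCatalog(Configuration, String) constructor above; the warehouse path, database, and table names are placeholders, and loadTable/TableIdentifier.of are the usual Catalog entry points:
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.hadoop.HadoopCatalog;

    public class HadoopCatalogExample {
      static Table load() {
        Configuration conf = new Configuration();
        // Path-based tables under a common warehouse location; "db.table" maps to a directory.
        HadoopCatalog catalog = new HadoopCatalog(conf, "hdfs://nn:8020/warehouse"); // placeholder path
        return catalog.loadTable(TableIdentifier.of("db", "table"));                 // placeholder names
      }
    }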
- hadoopConfCatalogOverrides(SparkSession, String) - Static method in class org.apache.iceberg.spark.SparkUtil
-
Pulls any catalog-specific overrides for the Hadoop conf from the current SparkSession, which can be set via `spark.sql.catalog.$catalogName.hadoop.*`. Mirrors the override of Hadoop configurations for a given Spark session using `spark.hadoop.*`.
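To illustrate the key pattern mentioned above, catalog-specific Hadoop settings might be supplied as Spark conf entries like the following; the catalog name my_catalog and the S3A values are placeholders:
    import org.apache.spark.sql.SparkSession;

    public class CatalogHadoopConfExample {
      static SparkSession session() {
        return SparkSession.builder()
            .appName("iceberg-example")
            .master("local[*]") // local master only for illustration
            // Override a Hadoop property only for the "my_catalog" Iceberg catalog (placeholder values).
            .config("spark.sql.catalog.my_catalog.hadoop.fs.s3a.endpoint", "http://localhost:9000")
            // Session-wide equivalent mirrored by spark.hadoop.*:
            .config("spark.hadoop.fs.s3a.connection.timeout", "60000")
            .getOrCreate();
      }
    }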
- HadoopConfigurable - Interface in org.apache.iceberg.hadoop
-
An interface that extends the Hadoop Configurable interface to offer better serialization support for customizable Iceberg objects such as FileIO.
- HadoopFileIO - Class in org.apache.iceberg.hadoop
- HadoopFileIO() - Constructor for class org.apache.iceberg.hadoop.HadoopFileIO
-
Constructor used for dynamic FileIO loading.
- HadoopFileIO(Configuration) - Constructor for class org.apache.iceberg.hadoop.HadoopFileIO
- HadoopFileIO(SerializableSupplier<Configuration>) - Constructor for class org.apache.iceberg.hadoop.HadoopFileIO
- HadoopInputFile - Class in org.apache.iceberg.hadoop
-
InputFile implementation using the Hadoop FileSystem API.
- HadoopOutputFile - Class in org.apache.iceberg.hadoop
-
OutputFile implementation using the Hadoop FileSystem API.
- HadoopTableOperations - Class in org.apache.iceberg.hadoop
-
TableOperations implementation for file systems that support atomic rename.
- HadoopTableOperations(Path, FileIO, Configuration, LockManager) - Constructor for class org.apache.iceberg.hadoop.HadoopTableOperations
- HadoopTables - Class in org.apache.iceberg.hadoop
-
Implementation of Iceberg tables that uses the Hadoop FileSystem to store metadata and manifests.
- HadoopTables() - Constructor for class org.apache.iceberg.hadoop.HadoopTables
- HadoopTables(Configuration) - Constructor for class org.apache.iceberg.hadoop.HadoopTables
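A minimal sketch of using HadoopTables with a path-based table; the load(String) call is the usual Tables entry point and the location is a placeholder:
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.hadoop.HadoopTables;

    public class HadoopTablesExample {
      static Table load() {
        // Tables are addressed directly by their file system location.
        HadoopTables tables = new HadoopTables(new Configuration());
        return tables.load("hdfs://nn:8020/warehouse/db/table"); // placeholder location
      }
    }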
- HANDLE_TIMESTAMP_WITHOUT_TIMEZONE - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- HANDLE_TIMESTAMP_WITHOUT_TIMEZONE - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- HANDLE_TIMESTAMP_WITHOUT_TIMEZONE - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- HANDLE_TIMESTAMP_WITHOUT_TIMEZONE_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- handleNonReference(Bound<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
-
Handle a non-reference value in this visitor.
- handleTimestampWithoutZone() - Method in class org.apache.iceberg.spark.SparkReadConf
-
Enables reading a timestamp without time zone as a timestamp with time zone.
- handleTimestampWithoutZone() - Method in class org.apache.iceberg.spark.SparkWriteConf
-
Enables writing a timestamp with time zone as a timestamp without time zone.
- hasAddedFiles() - Method in interface org.apache.iceberg.ManifestFile
-
Returns true if the manifest contains ADDED entries or if the count is not known.
- hasBounds() - Method in class org.apache.iceberg.FieldMetrics
-
Returns if the metrics has bounds (i.e.
- hasBucketField(PartitionSpec) - Static method in class org.apache.iceberg.Partitioning
-
Check whether the spec contains a bucketed partition field.
- hasDeletedFiles() - Method in interface org.apache.iceberg.ManifestFile
-
Returns true if the manifest contains DELETED entries or if the count is not known.
- hasDeletes(CombinedScanTask) - Static method in class org.apache.iceberg.util.TableScanUtil
- hasDeletes(FileScanTask) - Static method in class org.apache.iceberg.util.TableScanUtil
- hasEqDeletes() - Method in class org.apache.iceberg.data.DeleteFilter
- hasEqDeletes(CombinedScanTask) - Static method in class org.apache.iceberg.util.TableScanUtil
-
This is temporarily introduced since we plan to support pos-delete vectorized read first, then get to the equality-delete support.
- hasExistingFiles() - Method in interface org.apache.iceberg.ManifestFile
-
Returns true if the manifest contains EXISTING entries or if the count is not known.
- hasFieldId(Schema.Field) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- hash(T) - Method in interface org.apache.iceberg.types.JavaHash
- HASH - org.apache.iceberg.DistributionMode
- hashCode() - Method in class org.apache.iceberg.catalog.Namespace
- hashCode() - Method in class org.apache.iceberg.catalog.TableIdentifier
- hashCode() - Method in class org.apache.iceberg.data.GenericRecord
- hashCode() - Method in class org.apache.iceberg.GenericManifestFile
- hashCode() - Method in class org.apache.iceberg.mapping.MappedField
- hashCode() - Method in class org.apache.iceberg.mapping.MappedFields
- hashCode() - Method in class org.apache.iceberg.MetricsModes.Truncate
- hashCode() - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- hashCode() - Method in class org.apache.iceberg.PartitionField
- hashCode() - Method in class org.apache.iceberg.PartitionKey
- hashCode() - Method in class org.apache.iceberg.PartitionSpec
- hashCode() - Method in class org.apache.iceberg.SortField
- hashCode() - Method in class org.apache.iceberg.SortOrder
- hashCode() - Method in class org.apache.iceberg.spark.source.SparkTable
- hashCode() - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- hashCode() - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- hashCode() - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- hashCode() - Method in class org.apache.iceberg.transforms.UnknownTransform
- hashCode() - Method in class org.apache.iceberg.types.Types.DecimalType
- hashCode() - Method in class org.apache.iceberg.types.Types.FixedType
- hashCode() - Method in class org.apache.iceberg.types.Types.ListType
- hashCode() - Method in class org.apache.iceberg.types.Types.MapType
- hashCode() - Method in class org.apache.iceberg.types.Types.NestedField
- hashCode() - Method in class org.apache.iceberg.types.Types.StructType
- hashCode() - Method in class org.apache.iceberg.types.Types.TimestampType
- hashCode() - Method in class org.apache.iceberg.util.CharSequenceWrapper
- hashCode() - Method in class org.apache.iceberg.util.Pair
- hashCode() - Method in class org.apache.iceberg.util.SerializableMap
- hashCode() - Method in class org.apache.iceberg.util.StructLikeSet
- hashCode() - Method in class org.apache.iceberg.util.StructLikeWrapper
- hashCode(CharSequence) - Static method in class org.apache.iceberg.types.JavaHashes
- hasIds(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- HasIds() - Constructor for class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- hasMetadataTableName(TableIdentifier) - Static method in class org.apache.iceberg.MetadataTableUtils
- hasNamespace() - Method in class org.apache.iceberg.catalog.TableIdentifier
-
Whether the namespace is empty.
- hasNext - Variable in class org.apache.iceberg.parquet.BasePageIterator
- hasNext() - Method in class org.apache.iceberg.flink.source.DataIterator
- hasNext() - Method in class org.apache.iceberg.io.ClosingIterator
- hasNext() - Method in class org.apache.iceberg.io.FilterIterator
- hasNext() - Method in class org.apache.iceberg.orc.VectorizedRowBatchIterator
- hasNext() - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- hasNext() - Method in class org.apache.iceberg.parquet.BasePageIterator
- hasNonDictionaryPages(ColumnChunkMetaData) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- hasNull() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- hasNull() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- hasNull() - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- hasNulls() - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- hasPosDeletes() - Method in class org.apache.iceberg.data.DeleteFilter
- HasTableOperations - Interface in org.apache.iceberg
-
Used to expose a table's TableOperations.
- hasTimestampWithoutZone(Schema) - Static method in class org.apache.iceberg.spark.SparkUtil
-
Checks whether the table schema has a timestamp without timezone column.
- heartbeatIntervalMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- heartbeatThreads() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- heartbeatTimeoutMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- hiddenImpl(Class<?>...) - Method in class org.apache.iceberg.common.DynConstructors.Builder
- hiddenImpl(Class<?>, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for a method implementation.
- hiddenImpl(Class<?>, String) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Checks for a hidden implementation.
- hiddenImpl(Class<?>, String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for a method implementation.
- hiddenImpl(Class<T>, Class<?>...) - Method in class org.apache.iceberg.common.DynConstructors.Builder
- hiddenImpl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynConstructors.Builder
- hiddenImpl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for an implementation, first finding the given class by name.
- hiddenImpl(String, String) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Checks for a hidden implementation, first finding the class by name.
- hiddenImpl(String, String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for an implementation, first finding the given class by name.
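These DynMethods builders resolve method implementations reflectively at runtime; a rough sketch, assuming the orNoop/build/invoke helpers and a hypothetical target class and method:
    import org.apache.iceberg.common.DynMethods;

    public class DynMethodsExample {
      static String callHidden(Object target) {
        // Look up a (possibly private) instance method "render(String)" on a hypothetical class.
        DynMethods.UnboundMethod render = DynMethods.builder("render")
            .hiddenImpl("com.example.Renderer", String.class) // hypothetical class name
            .orNoop()                                         // fall back to a no-op if not found
            .build();
        if (render.isNoop()) {
          return null;
        }
        return render.invoke(target, "hello");
      }
    }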
- HiddenPathFilter - Class in org.apache.iceberg.hadoop
-
A PathFilter that filters out hidden paths.
- history() - Method in class org.apache.iceberg.BaseTable
- history() - Method in class org.apache.iceberg.SerializableTable
- history() - Method in interface org.apache.iceberg.Table
-
Get the snapshot history of this table.
- HISTORY - org.apache.iceberg.MetadataTableType
- HistoryEntry - Interface in org.apache.iceberg
-
Table history entry.
- HistoryTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's history as rows.
- hive(String, Configuration, Map<String, String>) - Static method in interface org.apache.iceberg.flink.CatalogLoader
- HIVE - org.apache.iceberg.mr.InputFormatConfig.InMemoryDataModel
- HIVE_CATALOG - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- HIVE_CONF_DIR - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- hive3PresentOnClasspath() - Static method in class org.apache.iceberg.hive.MetastoreUtil
-
Returns true if Hive3 dependencies are found on classpath, false otherwise.
- hiveCatalog(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Returns true if HiveCatalog is used.
- HiveCatalog - Class in org.apache.iceberg.hive
- HiveCatalog() - Constructor for class org.apache.iceberg.hive.HiveCatalog
- HiveClientPool - Class in org.apache.iceberg.hive
- HiveClientPool(int, Configuration) - Constructor for class org.apache.iceberg.hive.HiveClientPool
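For reference, a HiveCatalog is configured the same way via initialize(String, Map); the metastore URI and warehouse location below are placeholders, and CatalogProperties.URI is assumed as the standard property key:
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.iceberg.CatalogProperties;
    import org.apache.iceberg.hive.HiveCatalog;

    public class HiveCatalogExample {
      static HiveCatalog newCatalog() {
        Map<String, String> props = new HashMap<>();
        props.put(CatalogProperties.URI, "thrift://metastore:9083");              // placeholder metastore URI
        props.put(CatalogProperties.WAREHOUSE_LOCATION, "s3://bucket/warehouse"); // placeholder
        HiveCatalog catalog = new HiveCatalog();
        catalog.initialize("hive", props);
        return catalog;
      }
    }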
- HiveIcebergFilterFactory - Class in org.apache.iceberg.mr.hive
- HiveIcebergInputFormat - Class in org.apache.iceberg.mr.hive
- HiveIcebergInputFormat() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergInputFormat
- HiveIcebergMetaHook - Class in org.apache.iceberg.mr.hive
- HiveIcebergMetaHook(Configuration) - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- HiveIcebergOutputCommitter - Class in org.apache.iceberg.mr.hive
-
An Iceberg table committer for adding data files to the Iceberg tables.
- HiveIcebergOutputCommitter() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
- HiveIcebergOutputFormat<T> - Class in org.apache.iceberg.mr.hive
- HiveIcebergOutputFormat() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- HiveIcebergSerDe - Class in org.apache.iceberg.mr.hive
- HiveIcebergSerDe() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- HiveIcebergSplit - Class in org.apache.iceberg.mr.hive
- HiveIcebergSplit() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergSplit
- HiveIcebergStorageHandler - Class in org.apache.iceberg.mr.hive
- HiveIcebergStorageHandler() - Constructor for class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- HiveSchemaUtil - Class in org.apache.iceberg.hive
- HiveTableOperations - Class in org.apache.iceberg.hive
-
TODO we should be able to extract some more commonalities to BaseMetastoreTableOperations to avoid code duplication between this class and Metacat Tables.
- HiveTableOperations(Configuration, ClientPool, FileIO, String, String, String) - Constructor for class org.apache.iceberg.hive.HiveTableOperations
- hour(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- hour(String) - Static method in class org.apache.iceberg.expressions.Expressions
- hour(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- hour(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- hour(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- hour(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- hour(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns an hour Transform for timestamps.
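The hour transform above is typically applied through the PartitionSpec builder (hour(String) is listed earlier in this index); a small sketch with a hypothetical schema, assuming the usual Schema and Types factories:
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    public class HourPartitionExample {
      static PartitionSpec spec() {
        // Hypothetical schema with a timestamp column.
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.required(2, "event_ts", Types.TimestampType.withZone()));
        // Partition by the hour of event_ts.
        return PartitionSpec.builderFor(schema).hour("event_ts").build();
      }
    }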
I
- ICEBERG_BINARY_TYPE_ATTRIBUTE - Static variable in class org.apache.iceberg.orc.ORCSchemaUtil
-
The name of the ORC TypeDescription attribute indicating the Iceberg type corresponding to an ORC binary type.
- ICEBERG_CATALOG_HADOOP - Static variable in class org.apache.iceberg.CatalogUtil
- ICEBERG_CATALOG_HIVE - Static variable in class org.apache.iceberg.CatalogUtil
- ICEBERG_CATALOG_TYPE - Static variable in class org.apache.iceberg.CatalogUtil
-
Shortcut catalog property to load a catalog implementation through a short type name, instead of specifying a full java class through CatalogProperties.CATALOG_IMPL.
- ICEBERG_CATALOG_TYPE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- ICEBERG_CATALOG_TYPE_HADOOP - Static variable in class org.apache.iceberg.CatalogUtil
- ICEBERG_CATALOG_TYPE_HADOOP - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- ICEBERG_CATALOG_TYPE_HIVE - Static variable in class org.apache.iceberg.CatalogUtil
- ICEBERG_CATALOG_TYPE_HIVE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- ICEBERG_DEFAULT_CATALOG_NAME - Static variable in class org.apache.iceberg.mr.Catalogs
- ICEBERG_FIELD_NAME_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- ICEBERG_HADOOP_TABLE_NAME - Static variable in class org.apache.iceberg.mr.Catalogs
- ICEBERG_LONG_TYPE_ATTRIBUTE - Static variable in class org.apache.iceberg.orc.ORCSchemaUtil
-
The name of the ORC TypeDescription attribute indicating the Iceberg type corresponding to an ORC long type.
- ICEBERG_SNAPSHOTS_TABLE_SUFFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- ICEBERG_TABLE_TYPE_VALUE - Static variable in class org.apache.iceberg.BaseMetastoreTableOperations
- IcebergArrowColumnVector - Class in org.apache.iceberg.spark.data.vectorized
-
Implementation of Spark's ColumnVector interface.
- IcebergArrowColumnVector(VectorHolder) - Constructor for class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- IcebergBinaryObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergDateObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergDecimalObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergDecoder<D> - Class in org.apache.iceberg.data.avro
- IcebergDecoder(Schema) - Constructor for class org.apache.iceberg.data.avro.IcebergDecoder
-
Creates a new decoder that constructs datum instances described by an Iceberg schema.
- IcebergDecoder(Schema, SchemaStore) - Constructor for class org.apache.iceberg.data.avro.IcebergDecoder
-
Creates a new decoder that constructs datum instances described by an Iceberg schema.
- IcebergEncoder<D> - Class in org.apache.iceberg.data.avro
- IcebergEncoder(Schema) - Constructor for class org.apache.iceberg.data.avro.IcebergEncoder
-
Creates a new MessageEncoder that will deconstruct datum instances described by the schema.
- IcebergEncoder(Schema, boolean) - Constructor for class org.apache.iceberg.data.avro.IcebergEncoder
-
Creates a new MessageEncoder that will deconstruct datum instances described by the schema.
- IcebergFixedObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
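A minimal sketch of round-tripping a record through the IcebergEncoder/IcebergDecoder entries above; the schema is hypothetical and the GenericRecord helpers and encode/decode calls are the usual Avro single-message entry points:
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.data.GenericRecord;
    import org.apache.iceberg.data.Record;
    import org.apache.iceberg.data.avro.IcebergDecoder;
    import org.apache.iceberg.data.avro.IcebergEncoder;
    import org.apache.iceberg.types.Types;

    public class AvroRoundTripExample {
      static Record roundTrip() throws IOException {
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get())); // hypothetical schema
        Record record = GenericRecord.create(schema);
        record.setField("id", 1L);

        IcebergEncoder<Record> encoder = new IcebergEncoder<>(schema);
        IcebergDecoder<Record> decoder = new IcebergDecoder<>(schema);
        ByteBuffer payload = encoder.encode(record); // Avro single-message encoding
        return decoder.decode(payload);
      }
    }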
- IcebergGenerics - Class in org.apache.iceberg.data
- IcebergGenerics.ScanBuilder - Class in org.apache.iceberg.data
- IcebergInputFormat<T> - Class in org.apache.iceberg.mr.mapreduce
-
Generic MRv2 InputFormat API for Iceberg.
- IcebergInputFormat() - Constructor for class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
- IcebergObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergObjectInspector() - Constructor for class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- IcebergPigInputFormat<T> - Class in org.apache.iceberg.pig
- IcebergPigInputFormat.IcebergRecordReader<T> - Class in org.apache.iceberg.pig
- IcebergRecordObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergRecordObjectInspector(Types.StructType, List<ObjectInspector>) - Constructor for class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- IcebergRecordReader() - Constructor for class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- IcebergSource - Class in org.apache.iceberg.spark.source
-
The IcebergSource loads/writes tables with format "iceberg".
- IcebergSource() - Constructor for class org.apache.iceberg.spark.source.IcebergSource
- IcebergSourceSplit - Class in org.apache.iceberg.flink.source.split
- IcebergSourceSplitSerializer - Class in org.apache.iceberg.flink.source.split
-
TODO: use Java serialization for now.
- IcebergSourceSplitSerializer() - Constructor for class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- IcebergSpark - Class in org.apache.iceberg.spark
- icebergSplit() - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- icebergSplit() - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- icebergSplit() - Method in interface org.apache.iceberg.mr.mapreduce.IcebergSplitContainer
- IcebergSplit - Class in org.apache.iceberg.mr.mapreduce
- IcebergSplit() - Constructor for class org.apache.iceberg.mr.mapreduce.IcebergSplit
- IcebergSplitContainer - Interface in org.apache.iceberg.mr.mapreduce
- IcebergSqlExtensionsBaseListener - Class in org.apache.spark.sql.catalyst.parser.extensions
-
This class provides an empty implementation of IcebergSqlExtensionsListener, which can be extended to create a listener which only needs to handle a subset of the available methods.
- IcebergSqlExtensionsBaseListener() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- IcebergSqlExtensionsBaseVisitor<T> - Class in org.apache.spark.sql.catalyst.parser.extensions
-
This class provides an empty implementation of IcebergSqlExtensionsVisitor, which can be extended to create a visitor which only needs to handle a subset of the available methods.
- IcebergSqlExtensionsBaseVisitor() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
- IcebergSqlExtensionsLexer - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsLexer(CharStream) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- IcebergSqlExtensionsListener - Interface in org.apache.spark.sql.catalyst.parser.extensions
-
This interface defines a complete listener for a parse tree produced by IcebergSqlExtensionsParser.
- IcebergSqlExtensionsParser - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser(TokenStream) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- IcebergSqlExtensionsParser.AddPartitionFieldContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.ApplyTransformContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.BigDecimalLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.BigIntLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.BooleanLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.BooleanValueContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.CallArgumentContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.CallContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.ConstantContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.DecimalLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.DoubleLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.DropIdentifierFieldsContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.DropPartitionFieldContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.ExponentLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.ExpressionContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.FieldListContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.FloatLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.IdentifierContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.IdentityTransformContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.IntegerLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.MultipartIdentifierContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.NamedArgumentContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.NonReservedContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.NumberContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.NumericLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.OrderContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.OrderFieldContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.PositionalArgumentContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.QuotedIdentifierContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.ReplacePartitionFieldContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.SetIdentifierFieldsContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.SingleStatementContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.SmallIntLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.StatementContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.StringLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.StringMapContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.TinyIntLiteralContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.TransformArgumentContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.TransformContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.TypeConstructorContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.UnquotedIdentifierContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.WriteDistributionSpecContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.WriteOrderingSpecContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsParser.WriteSpecContext - Class in org.apache.spark.sql.catalyst.parser.extensions
- IcebergSqlExtensionsVisitor<T> - Interface in org.apache.spark.sql.catalyst.parser.extensions
-
This interface defines a complete generic visitor for a parse tree produced by IcebergSqlExtensionsParser.
- IcebergStorage - Class in org.apache.iceberg.pig
- IcebergStorage() - Constructor for class org.apache.iceberg.pig.IcebergStorage
- IcebergTableSink - Class in org.apache.iceberg.flink
- IcebergTableSink(TableLoader, TableSchema) - Constructor for class org.apache.iceberg.flink.IcebergTableSink
- IcebergTableSource - Class in org.apache.iceberg.flink
-
Flink Iceberg table source.
- IcebergTableSource(TableLoader, TableSchema, Map<String, String>, ReadableConfig) - Constructor for class org.apache.iceberg.flink.IcebergTableSource
- IcebergTimeObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergTimestampObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- IcebergTimestampWithZoneObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- icebergType() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- IcebergUUIDObjectInspector - Class in org.apache.iceberg.mr.hive.serde.objectinspector
- id() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the id of the field that the metrics within this class are associated with.
- id() - Method in enum org.apache.iceberg.FileContent
- id() - Method in enum org.apache.iceberg.ManifestContent
- id() - Method in class org.apache.iceberg.mapping.MappedField
- id(String) - Method in class org.apache.iceberg.mapping.MappedFields
- identifier - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- identifier() - Method in class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- identifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- identifier(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- IDENTIFIER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- IDENTIFIER_KW - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- IDENTIFIER_KW - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- IDENTIFIER_KW() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- IDENTIFIER_KW() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- IDENTIFIER_KW() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- IdentifierContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- IdentifierContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- identifierFieldIds() - Method in class org.apache.iceberg.Schema
-
The set of identifier field IDs.
- identifierFieldNames() - Method in class org.apache.iceberg.Schema
-
Returns the set of identifier field names.
- identifierToTableIdentifier(Identifier) - Static method in class org.apache.iceberg.spark.Spark3Util
- identity(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- identity(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- identity(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- identity(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns an identity Transform that can be used for any type.
- IdentityPartitionConverters - Class in org.apache.iceberg.data
- identitySourceIds() - Method in class org.apache.iceberg.PartitionSpec
-
Returns the source field ids for identity partitions.
- IdentityTransformContext(IcebergSqlExtensionsParser.TransformContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- idToAlias(Integer) - Method in class org.apache.iceberg.Schema
-
Returns the full column name in the unconverted data schema for the given column id.
- idToOrcName(Schema) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Generates mapping from field IDs to ORC qualified names.
- ignoreResiduals() - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this that applies data filtering to files but not to rows in those files.
- immutableMap() - Method in class org.apache.iceberg.util.SerializableMap
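For instance, the TableScan.ignoreResiduals() entry above fits into a scan that pushes a filter to file planning only; this is a sketch, assuming the usual newScan(), filter(Expression), and planFiles() entry points and a hypothetical column:
    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.TableScan;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    public class ScanExample {
      static CloseableIterable<FileScanTask> planTasks(Table table) {
        TableScan scan = table.newScan()
            .filter(Expressions.greaterThan("id", 100)) // hypothetical column
            .ignoreResiduals();                         // filter files, but not rows within them
        return scan.planFiles();
      }
    }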
- impl(Class<?>, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for a method implementation.
- impl(Class<?>, String) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Checks for an implementation.
- impl(Class<?>, String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for a method implementation.
- impl(Class<T>, Class<?>...) - Method in class org.apache.iceberg.common.DynConstructors.Builder
- impl(String) - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Checks for an implementation of the class by name.
- impl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynConstructors.Builder
- impl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for an implementation, first finding the given class by name.
- impl(String, String) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Checks for an implementation, first finding the class by name.
- impl(String, String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Checks for an implementation, first finding the given class by name.
- importedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseSnapshotTableActionResult
- importedDataFilesCount() - Method in interface org.apache.iceberg.actions.SnapshotTable.Result
-
Returns the number of imported data files.
- importSparkPartitions(SparkSession, List<SparkTableUtil.SparkPartition>, Table, PartitionSpec, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Import files from given partitions to an Iceberg table.
- importSparkPartitions(SparkSession, List<SparkTableUtil.SparkPartition>, Table, PartitionSpec, String, boolean) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Import files from given partitions to an Iceberg table.
- importSparkTable(SparkSession, TableIdentifier, Table, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Import files from an existing Spark table to an Iceberg table.
- importSparkTable(SparkSession, TableIdentifier, Table, String, boolean) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Import files from an existing Spark table to an Iceberg table.
- importSparkTable(SparkSession, TableIdentifier, Table, String, Map<String, String>, boolean) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Import files from an existing Spark table to an Iceberg table.
- in(String, Iterable<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- in(String, T...) - Static method in class org.apache.iceberg.expressions.Expressions
- in(Bound<T>, Set<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- in(BoundReference<T>, Set<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- in(UnboundTerm<T>, Iterable<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- in(UnboundTerm<T>, T...) - Static method in class org.apache.iceberg.expressions.Expressions
- in(Table) - Static method in class org.apache.iceberg.FindFiles
- IN - org.apache.iceberg.expressions.Expression.Operation
- IN_MEMORY_DATA_MODEL - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- includeColumnStats() - Method in class org.apache.iceberg.FindFiles.Builder
- includeColumnStats() - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this that loads the column stats with each data file.
- inclusive(PartitionSpec) - Static method in class org.apache.iceberg.expressions.Projections
-
Creates an inclusive ProjectionEvaluator for the spec, defaulting to case sensitive mode.
- inclusive(PartitionSpec, boolean) - Static method in class org.apache.iceberg.expressions.Projections
-
Creates an inclusive ProjectionEvaluator for the spec.
- InclusiveMetricsEvaluator - Class in org.apache.iceberg.expressions
-
Evaluates an Expression on a DataFile to test whether rows in the file may match.
- InclusiveMetricsEvaluator(Schema, Expression) - Constructor for class org.apache.iceberg.expressions.InclusiveMetricsEvaluator
- InclusiveMetricsEvaluator(Schema, Expression, boolean) - Constructor for class org.apache.iceberg.expressions.InclusiveMetricsEvaluator
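A sketch of how the Projections.inclusive(PartitionSpec) entries above convert a row-level filter into a partition filter; the project(Expression) call is assumed as the ProjectionEvaluator entry point and the column name is hypothetical:
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.expressions.Projections;

    public class ProjectionExample {
      // Projects a row-level filter onto the partition space of the given spec.
      static Expression partitionFilter(PartitionSpec spec) {
        Expression rowFilter = Expressions.greaterThanOrEqual("event_ts", 0L); // hypothetical column
        return Projections.inclusive(spec).project(rowFilter);
      }
    }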
- IncrementalScanEvent - Class in org.apache.iceberg.events
-
Event sent to listeners when an incremental table scan is planned.
- IncrementalScanEvent(String, long, long, Expression, Schema) - Constructor for class org.apache.iceberg.events.IncrementalScanEvent
- incrementDuplicateDeletes() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- incrementDuplicateDeletes(int) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- indexById(Types.StructType) - Static method in class org.apache.iceberg.types.TypeUtil
- indexByLowerCaseName(Types.StructType) - Static method in class org.apache.iceberg.types.TypeUtil
- indexByName(Types.StructType) - Static method in class org.apache.iceberg.types.TypeUtil
- IndexByName - Class in org.apache.iceberg.types
- IndexByName() - Constructor for class org.apache.iceberg.types.IndexByName
- IndexByName(Function<String, String>) - Constructor for class org.apache.iceberg.types.IndexByName
- indexNameById(Types.StructType) - Static method in class org.apache.iceberg.types.TypeUtil
- indexParents(Types.StructType) - Static method in class org.apache.iceberg.types.TypeUtil
- IndexParents - Class in org.apache.iceberg.types
- IndexParents() - Constructor for class org.apache.iceberg.types.IndexParents
- indexQuotedNameById(Schema) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
- indexQuotedNameById(Types.StructType, Function<String, String>) - Static method in class org.apache.iceberg.types.TypeUtil
- inferPartitioning(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.source.IcebergSource
- inferSchema(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.source.IcebergSource
- info() - Method in class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- info() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupRewriteResult
- info() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- initDataReader(Encoding, ByteBufferInputStream, int) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- initDataReader(Encoding, ByteBufferInputStream, int) - Method in class org.apache.iceberg.parquet.BasePageIterator
- initDefinitionLevelsReader(DataPageV1, ColumnDescriptor, ByteBufferInputStream, int) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- initDefinitionLevelsReader(DataPageV1, ColumnDescriptor, ByteBufferInputStream, int) - Method in class org.apache.iceberg.parquet.BasePageIterator
- initDefinitionLevelsReader(DataPageV2, ColumnDescriptor) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- initDefinitionLevelsReader(DataPageV2, ColumnDescriptor) - Method in class org.apache.iceberg.parquet.BasePageIterator
- initFromPage(int, ByteBufferInputStream) - Method in class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- initFromPage(int, ByteBufferInputStream) - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- initFromPage(DataPageV1) - Method in class org.apache.iceberg.parquet.BasePageIterator
- initFromPage(DataPageV2) - Method in class org.apache.iceberg.parquet.BasePageIterator
- initialize(int, int) - Method in class org.apache.iceberg.flink.sink.RowDataTaskWriterFactory
- initialize(int, int) - Method in interface org.apache.iceberg.flink.sink.TaskWriterFactory
-
Initialize the factory with a given taskId and attemptId.
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- initialize(String, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Initialize a catalog given a custom name and a map of catalog properties.
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- initialize(String, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
- initialize(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkCatalog
- initialize(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- initialize(Map<String, String>) - Method in interface org.apache.iceberg.aliyun.AliyunClientFactory
-
Initialize Aliyun client factory from catalog properties.
- initialize(Map<String, String>) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- initialize(Map<String, String>) - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- initialize(Map<String, String>) - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Initialize AWS client factory from catalog properties.
- initialize(Map<String, String>) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- initialize(Map<String, String>) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- initialize(Map<String, String>) - Method in interface org.apache.iceberg.io.FileIO
-
Initialize File IO from catalog properties.
- initialize(Map<String, String>) - Method in class org.apache.iceberg.io.ResolvingFileIO
- initialize(Map<String, String>) - Method in interface org.apache.iceberg.LockManager
-
Initialize lock manager from catalog properties.
- initialize(Map<String, String>) - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- initialize(Configuration, Properties) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- initialize(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- initializeState(FunctionInitializationContext) - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- initializeState(StateInitializationContext) - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- initialOffset() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- innerReader - Variable in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- inPartition(PartitionSpec, StructLike) - Method in class org.apache.iceberg.FindFiles.Builder
-
Filter results to files in any one of the given partitions.
- inPartitions(PartitionSpec, List<StructLike>) - Method in class org.apache.iceberg.FindFiles.Builder
-
Filter results to files in any one of the given partitions.
- inPartitions(PartitionSpec, StructLike...) - Method in class org.apache.iceberg.FindFiles.Builder
-
Filter results to files in any one of the given partitions.
- InputFile - Interface in org.apache.iceberg.io
-
An interface used to read input files using SeekableInputStream instances.
- InputFilesDecryptor - Class in org.apache.iceberg.encryption
- InputFilesDecryptor(CombinedScanTask, FileIO, EncryptionManager) - Constructor for class org.apache.iceberg.encryption.InputFilesDecryptor
- inputFileSize(List<FileScanTask>) - Method in class org.apache.iceberg.actions.BinPackStrategy
- InputFormatConfig - Class in org.apache.iceberg.mr
- InputFormatConfig.ConfigBuilder - Class in org.apache.iceberg.mr
- InputFormatConfig.InMemoryDataModel - Enum in org.apache.iceberg.mr
- insert(T) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriter
-
Passes a row to insert.
- insert(T, PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- insert(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Inserts a row to the provided spec/partition.
- insert(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Inserts a row to the provided spec/partition.
- inSnapshot(long) - Method in class org.apache.iceberg.FindFiles.Builder
-
Base results on the given snapshot.
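A rough sketch of the FindFiles builder flow described by the in(Table) and inSnapshot(long) entries; withRecordsMatching and collect are assumed builder methods not shown in this part of the index, and the filter column is hypothetical:
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.FindFiles;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    public class FindFilesExample {
      static CloseableIterable<DataFile> matchingFiles(Table table, long snapshotId) {
        return FindFiles.in(table)
            .inSnapshot(snapshotId)                                  // base results on a specific snapshot
            .withRecordsMatching(Expressions.greaterThan("id", 100)) // hypothetical column filter
            .collect();
      }
    }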
- INSTANCE - Static variable in class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- IntAsLongReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.IntAsLongReader
- intBackedDecimalBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- IntBackedDecimalBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.IntBackedDecimalBatchReader
- intBackedDecimalDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- INTEGER - org.apache.iceberg.types.Type.TypeID
- INTEGER_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- INTEGER_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- INTEGER_VALUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- IntegerAsDecimalReader(ColumnDescriptor, int) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.IntegerAsDecimalReader
- integerBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- IntegerBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.IntegerBatchReader
- integerDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- IntegerLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- IntegerType() - Constructor for class org.apache.iceberg.types.Types.IntegerType
- InternalRecordWrapper - Class in org.apache.iceberg.data
- InternalRecordWrapper(Types.StructType) - Constructor for class org.apache.iceberg.data.InternalRecordWrapper
- internalWrite(TableMetadata, OutputFile, boolean) - Static method in class org.apache.iceberg.TableMetadataParser
- interruptOnCancel() - Method in class org.apache.iceberg.spark.JobGroupInfo
- IntIterator() - Constructor for class org.apache.iceberg.parquet.BasePageIterator.IntIterator
- ints() - Static method in class org.apache.iceberg.avro.ValueReaders
- ints() - Static method in class org.apache.iceberg.avro.ValueWriters
- ints() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- ints() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- ints(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- invalidateTable(TableIdentifier) - Method in class org.apache.iceberg.CachingCatalog
- invalidateTable(TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Invalidate cached table metadata from current catalog.
- invalidateTable(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
- invalidateTable(Identifier) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- invoke(Object...) - Method in class org.apache.iceberg.common.DynMethods.BoundMethod
- invoke(Object...) - Method in class org.apache.iceberg.common.DynMethods.StaticMethod
- invoke(Object, Object...) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- invoke(Object, Object...) - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
- invokeChecked(Object...) - Method in class org.apache.iceberg.common.DynMethods.BoundMethod
- invokeChecked(Object...) - Method in class org.apache.iceberg.common.DynMethods.StaticMethod
- invokeChecked(Object, Object...) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- invokeChecked(Object, Object...) - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
- io() - Method in class org.apache.iceberg.BaseTable
- io() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- io() - Method in class org.apache.iceberg.hive.HiveTableOperations
- io() - Method in class org.apache.iceberg.nessie.NessieTableOperations
- io() - Method in class org.apache.iceberg.SerializableTable
- io() - Method in class org.apache.iceberg.StaticTableOperations
- io() - Method in interface org.apache.iceberg.Table
-
Returns a FileIO to read and write table data and metadata files.
- io() - Method in interface org.apache.iceberg.TableOperations
-
Returns a FileIO to read and write table data and metadata files.
- io(FileIO) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Set the FileIO to be used for file removal.
- io(FileIO) - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- IS_DELETED - Static variable in class org.apache.iceberg.MetadataColumns
- IS_NAN - org.apache.iceberg.expressions.Expression.Operation
- IS_NULL - org.apache.iceberg.expressions.Expression.Operation
- isAlwaysNull() - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns whether the field is always null.
- isAncestorOf(Table, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns whether ancestorSnapshotId is an ancestor of the table's current state.
- isAncestorOf(Table, long, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns whether ancestorSnapshotId is an ancestor of snapshotId.
- isBounded(Map<String, String>) - Static method in class org.apache.iceberg.flink.source.FlinkSource
- isCaseSensitive() - Method in interface org.apache.iceberg.TableScan
-
Returns whether this scan should apply column name case sensitiveness as per TableScan.caseSensitive(boolean).
- isCharHighSurrogate(char) - Static method in class org.apache.iceberg.util.UnicodeUtil
-
Determines if the given character value is a unicode high-surrogate code unit.
- isConnectionException(Exception) - Method in class org.apache.iceberg.ClientPoolImpl
- isConnectionException(Exception) - Method in class org.apache.iceberg.hive.HiveClientPool
- isDataTask() - Method in interface org.apache.iceberg.DataTask
- isDataTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns true if this is a DataTask, false otherwise.
- isDeleted(long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Checks whether a row at the position is deleted.
- isDeleteManifestReader() - Method in class org.apache.iceberg.ManifestReader
- isDictionaryEncoded() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- isDummy() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- isElementOptional() - Method in class org.apache.iceberg.types.Types.ListType
- isElementRequired() - Method in class org.apache.iceberg.types.Types.ListType
- isEmpty() - Method in class org.apache.iceberg.catalog.Namespace
- isEmpty() - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Returns true if this collection contains no elements.
- isEmpty() - Method in class org.apache.iceberg.util.CharSequenceSet
- isEmpty() - Method in class org.apache.iceberg.util.PartitionSet
- isEmpty() - Method in class org.apache.iceberg.util.SerializableMap
- isEmpty() - Method in class org.apache.iceberg.util.StructLikeMap
- isEmpty() - Method in class org.apache.iceberg.util.StructLikeSet
- isFileScanTask() - Method in interface org.apache.iceberg.FileScanTask
- isFileScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns true if this is a FileScanTask, false otherwise.
- isHint() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
-
This method will be called when we see '/*' and try to match it as a bracketed comment.
- isIdentity() - Method in interface org.apache.iceberg.transforms.Transform
-
Return whether this transform is the identity transform.
- isIntType(PrimitiveType) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- isKeyValueSchema(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- isListType() - Method in interface org.apache.iceberg.types.Type
- isListType() - Method in class org.apache.iceberg.types.Types.ListType
- isLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- isLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- isMapType() - Method in interface org.apache.iceberg.types.Type
- isMapType() - Method in class org.apache.iceberg.types.Types.MapType
- isMapType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- isMapType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- isMapType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- isMetadataColumn(String) - Static method in class org.apache.iceberg.MetadataColumns
- isNaN(Object) - Static method in class org.apache.iceberg.util.NaNUtil
- isNaN(String) - Static method in class org.apache.iceberg.expressions.Expressions
- isNaN(Bound<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- isNaN(BoundReference<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- isNaN(UnboundTerm<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- isNestedType() - Method in interface org.apache.iceberg.types.Type
- isNestedType() - Method in class org.apache.iceberg.types.Type.NestedType
- isNoop() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns whether the method is a noop.
- isNull(String) - Static method in class org.apache.iceberg.expressions.Expressions
- isNull(Bound<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- isNull(BoundReference<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- isNull(UnboundTerm<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- isNullable() - Method in class org.apache.iceberg.spark.source.SparkMetadataColumn
- isNullAt(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- isNullAt(int) - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
-
Returns 1 if null, 0 otherwise.
- isNullAt(int) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- isNullAt(int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- isNullAt(int) - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- isNullAt(int) - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- IsolationLevel - Enum in org.apache.iceberg
-
An isolation level in a table.
- isOptional() - Method in class org.apache.iceberg.types.Types.NestedField
- isOptionSchema(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- isPartitioned() - Method in class org.apache.iceberg.PartitionSpec
- isPhysicalColumn(TableColumn) - Static method in class org.apache.iceberg.flink.util.FlinkCompatibilityUtil
- isPrimitiveType() - Method in interface org.apache.iceberg.types.Type
- isPrimitiveType() - Method in class org.apache.iceberg.types.Type.PrimitiveType
- isPromotionAllowed(Type, Type.PrimitiveType) - Static method in class org.apache.iceberg.types.TypeUtil
- isRequired() - Method in class org.apache.iceberg.types.Types.NestedField
- isS3ChecksumEnabled() - Method in class org.apache.iceberg.aws.AwsProperties
- isSetPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- isSetPredicate() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- isSorted() - Method in class org.apache.iceberg.SortOrder
-
Returns true if the sort order is sorted.
- isSplittable() - Method in enum org.apache.iceberg.FileFormat
- isStatic() - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- isStatic() - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns whether the field is a static field.
- isStatic() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns whether the method is a static method.
- isStringType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- isStringType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- isStringType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- isStructType() - Method in interface org.apache.iceberg.types.Type
- isStructType() - Method in class org.apache.iceberg.types.Types.StructType
- isTimestamptz(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- isUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- isUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- isUnpartitioned() - Method in class org.apache.iceberg.PartitionSpec
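For context, a minimal sketch of how the isPartitioned()/isUnpartitioned() checks indexed above behave for specs built from a hypothetical schema:

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    public class SpecExample {
      // Hypothetical two-column schema.
      static final Schema SCHEMA = new Schema(
          Types.NestedField.required(1, "id", Types.LongType.get()),
          Types.NestedField.optional(2, "ts", Types.TimestampType.withZone()));

      static void show() {
        PartitionSpec byMonth = PartitionSpec.builderFor(SCHEMA).month("ts").build();
        PartitionSpec none = PartitionSpec.unpartitioned();
        System.out.println(byMonth.isPartitioned());   // true
        System.out.println(none.isUnpartitioned());    // true
      }
    }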
- isUnsorted() - Method in class org.apache.iceberg.SortOrder
-
Returns true if the sort order is unsorted.
- isValidDecimal() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
-
Verifies whether the current token is a valid decimal token (one that contains a dot).
- isValidIdentifier(TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- isValidIdentifier(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- isValidIdentifier(TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- isValidIdentifier(TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- isValueOptional() - Method in class org.apache.iceberg.types.Types.MapType
- isValueRequired() - Method in class org.apache.iceberg.types.Types.MapType
- iterator() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
- iterator() - Method in class org.apache.iceberg.avro.AvroIterable
- iterator() - Method in class org.apache.iceberg.io.CloseableIterable.ConcatCloseableIterable
- iterator() - Method in interface org.apache.iceberg.io.CloseableIterable
-
Returns a closeable iterator over elements of type T.
- iterator() - Method in class org.apache.iceberg.ManifestReader
- iterator() - Method in class org.apache.iceberg.parquet.ParquetIterable
- iterator() - Method in class org.apache.iceberg.parquet.ParquetReader
- iterator() - Method in class org.apache.iceberg.parquet.VectorizedParquetReader
- iterator() - Method in class org.apache.iceberg.util.BinPacking.PackingIterable
- iterator() - Method in class org.apache.iceberg.util.CharSequenceSet
- iterator() - Method in class org.apache.iceberg.util.ParallelIterable
- iterator() - Method in class org.apache.iceberg.util.PartitionSet
- iterator() - Method in class org.apache.iceberg.util.SortedMerge
- iterator() - Method in class org.apache.iceberg.util.StructLikeSet
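As a usage sketch (not taken from the generated documentation), CloseableIterable instances such as those returned by table scans should be closed after iteration, for example with try-with-resources; the Table here is assumed to be already loaded:

    import java.io.IOException;
    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.io.CloseableIterable;

    public class ScanExample {
      static void printFiles(Table table) throws IOException {
        // planFiles() returns a CloseableIterable; closing it releases underlying resources.
        try (CloseableIterable<FileScanTask> tasks = table.newScan().planFiles()) {
          for (FileScanTask task : tasks) {
            System.out.println(task.file().path());
          }
        }
      }
    }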
J
- javaClass() - Method in enum org.apache.iceberg.types.Type.TypeID
- javaClasses() - Method in class org.apache.iceberg.PartitionSpec
- JavaHash<T> - Interface in org.apache.iceberg.types
- JavaHashes - Class in org.apache.iceberg.types
- JdbcCatalog - Class in org.apache.iceberg.jdbc
- JdbcCatalog() - Constructor for class org.apache.iceberg.jdbc.JdbcCatalog
- JobGroupInfo - Class in org.apache.iceberg.spark
-
Captures information about the current job, which is used for display on the UI.
- JobGroupInfo(String, String, boolean) - Constructor for class org.apache.iceberg.spark.JobGroupInfo
- JobGroupUtils - Class in org.apache.iceberg.spark
- join(Schema, Schema) - Static method in class org.apache.iceberg.types.TypeUtil
- JsonUtil - Class in org.apache.iceberg.util
K
- KEEP_HIVE_STATS - Static variable in class org.apache.iceberg.hadoop.ConfigProperties
- key() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
-
Return OSS object key name.
- KEY_ID_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- KEY_METADATA - Static variable in interface org.apache.iceberg.DataFile
- KEY_METADATA - Static variable in interface org.apache.iceberg.ManifestFile
- keyId() - Method in class org.apache.iceberg.types.Types.MapType
- keyMetadata() - Method in interface org.apache.iceberg.ContentFile
-
Returns metadata about how this file is encrypted, or null if the file is stored in plain text.
- keyMetadata() - Method in interface org.apache.iceberg.encryption.EncryptedInputFile
-
Metadata pointing to some encryption key that would be used to decrypt the input file provided by EncryptedInputFile.encryptedInputFile().
- keyMetadata() - Method in interface org.apache.iceberg.encryption.EncryptedOutputFile
-
Metadata about the encryption key that is being used to encrypt the associated EncryptedOutputFile.encryptingOutputFile().
- keyMetadata() - Method in class org.apache.iceberg.GenericManifestFile
- keyMetadata() - Method in interface org.apache.iceberg.ManifestFile
-
Returns metadata about how this manifest file is encrypted, or null if the file is stored in plain text.
- keyMetadata() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- keyMetadata() - Method in class org.apache.iceberg.spark.SparkDataFile
- keyName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- keySet() - Method in class org.apache.iceberg.util.SerializableMap
- keySet() - Method in class org.apache.iceberg.util.StructLikeMap
- keyType() - Method in class org.apache.iceberg.types.Types.MapType
- kms() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- kms() - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Create an AWS KMS client.
L
- LAST - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- LAST - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- LAST() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- LAST() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- lastColumnId() - Method in class org.apache.iceberg.MetadataUpdate.AddSchema
- lastColumnId() - Method in class org.apache.iceberg.TableMetadata
- lastIndexOfSnapshot() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- lastSequenceNumber() - Method in class org.apache.iceberg.TableMetadata
- lastUpdatedMillis() - Method in class org.apache.iceberg.TableMetadata
- latestOffset() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- lazyLength() - Method in class org.apache.iceberg.GenericManifestFile
- left() - Method in class org.apache.iceberg.expressions.And
- left() - Method in class org.apache.iceberg.expressions.Or
- length() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- length() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- length() - Method in interface org.apache.iceberg.FileScanTask
-
The number of bytes to scan from the FileScanTask.start() position in the file.
- length() - Method in class org.apache.iceberg.GenericManifestFile
- length() - Method in class org.apache.iceberg.io.DataWriter
- length() - Method in interface org.apache.iceberg.io.FileAppender
-
Returns the length of this file.
- length() - Method in interface org.apache.iceberg.io.FileWriter
-
Returns the number of bytes currently written by this writer.
- length() - Method in interface org.apache.iceberg.ManifestFile
-
Returns length of the manifest file.
- length() - Method in class org.apache.iceberg.ManifestWriter
- length() - Method in class org.apache.iceberg.MetricsModes.Truncate
- length() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- length() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- length() - Method in class org.apache.iceberg.types.Types.FixedType
- length() - Method in class org.apache.iceberg.util.CharSequenceWrapper
- LENGTH - Static variable in interface org.apache.iceberg.ManifestFile
- lessThan(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- lessThan(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
- lessThanOrEqual(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- lessThanOrEqual(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
- level(int) - Method in class org.apache.iceberg.catalog.Namespace
- levels() - Method in class org.apache.iceberg.catalog.Namespace
- limit(int) - Method in class org.apache.iceberg.ScanSummary.Builder
- limit(long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- list(ArrayType, GroupType, T) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- list(OrcValueWriter<T>) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- list(Types.ListType, Integer, Boolean) - Method in class org.apache.iceberg.schema.UnionByNameVisitor
- list(Types.ListType, String) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- list(Types.ListType, Supplier<List<String>>) - Method in class org.apache.iceberg.types.CheckCompatibility
- list(Types.ListType, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- list(Types.ListType, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- list(Types.ListType, Supplier<Type>) - Method in class org.apache.iceberg.types.FixupTypes
- list(Types.ListType, Supplier<T>) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- list(Types.ListType, Map<Integer, Integer>) - Method in class org.apache.iceberg.types.IndexParents
- list(Types.ListType, Map<String, Integer>) - Method in class org.apache.iceberg.types.IndexByName
- list(Types.ListType, ObjectInspector) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- list(Types.ListType, TypeDescription, T) - Method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- list(Types.ListType, GroupType, T) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- list(Types.ListType, Type.Repetition, int, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- list(Types.ListType, P, R) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- list(Types.ListType, T) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- list(TypeDescription, T) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- list(GroupType, Boolean) - Method in class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- list(GroupType, Type) - Method in class org.apache.iceberg.parquet.RemoveIds
- list(GroupType, T) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- list(ArrayType, GroupType, T) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- LIST - org.apache.iceberg.types.Type.TypeID
- LIST_ALL_TABLES - Static variable in class org.apache.iceberg.hive.HiveCatalog
- LIST_ALL_TABLES_DEFAULT - Static variable in class org.apache.iceberg.hive.HiveCatalog
- listDatabases() - Method in class org.apache.iceberg.flink.FlinkCatalog
- listElementPartner(P) - Method in interface org.apache.iceberg.schema.SchemaWithPartnerVisitor.PartnerAccessors
- Listener<E> - Interface in org.apache.iceberg.events
-
A listener interface that can receive notifications.
- Listeners - Class in org.apache.iceberg.events
-
Static registration and notification for listeners.
- listFunctions(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- listNamespaces() - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
List top-level namespaces from the catalog.
- listNamespaces() - Method in class org.apache.iceberg.spark.SparkCatalog
- listNamespaces() - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- listNamespaces(String[]) - Method in class org.apache.iceberg.spark.SparkCatalog
- listNamespaces(String[]) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- listNamespaces(Namespace) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- listNamespaces(Namespace) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- listNamespaces(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
List namespaces from the namespace.
- listNamespaces(Namespace) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- listNamespaces(Namespace) - Method in class org.apache.iceberg.hive.HiveCatalog
- listNamespaces(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- listNamespaces(Namespace) - Method in class org.apache.iceberg.nessie.NessieCatalog
- ListPacker(long, int, boolean) - Constructor for class org.apache.iceberg.util.BinPacking.ListPacker
- listPartition(Map<String, String>, String, String, PartitionSpec, Configuration, MetricsConfig, NameMapping) - Static method in class org.apache.iceberg.data.TableMigrationUtil
-
Returns the data files in a partition by listing the partition location.
- listPartition(Map<String, String>, String, String, PartitionSpec, Configuration, MetricsConfig, NameMapping, int) - Static method in class org.apache.iceberg.data.TableMigrationUtil
- listPartition(SparkTableUtil.SparkPartition, PartitionSpec, SerializableConfiguration, MetricsConfig) - Static method in class org.apache.iceberg.spark.SparkTableUtil
- listPartition(SparkTableUtil.SparkPartition, PartitionSpec, SerializableConfiguration, MetricsConfig, NameMapping) - Static method in class org.apache.iceberg.spark.SparkTableUtil
- listPartitions(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- listPartitions(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.iceberg.flink.FlinkCatalog
- listPartitionsByFilter(ObjectPath, List<Expression>) - Method in class org.apache.iceberg.flink.FlinkCatalog
- ListReader(int, int, ParquetValueReader<E>) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- listTables(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- listTables(String[]) - Method in class org.apache.iceberg.spark.SparkCatalog
- listTables(String[]) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.CachingCatalog
- listTables(Namespace) - Method in interface org.apache.iceberg.catalog.Catalog
-
Return all the identifiers under this namespace.
- listTables(Namespace) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.hive.HiveCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- listTables(Namespace) - Method in class org.apache.iceberg.nessie.NessieCatalog
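A small sketch of the namespace and table listing calls indexed above, assuming a Catalog instance that also implements SupportsNamespaces (as the built-in catalogs do):

    import java.util.List;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.Namespace;
    import org.apache.iceberg.catalog.SupportsNamespaces;
    import org.apache.iceberg.catalog.TableIdentifier;

    public class ListingExample {
      static void listAll(Catalog catalog) {
        if (catalog instanceof SupportsNamespaces) {
          for (Namespace ns : ((SupportsNamespaces) catalog).listNamespaces()) {
            // listTables returns the identifiers directly under the namespace.
            List<TableIdentifier> tables = catalog.listTables(ns);
            tables.forEach(System.out::println);
          }
        }
      }
    }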
- listViews(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- literal() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- literal() - Method in class org.apache.iceberg.expressions.UnboundPredicate
- Literal<T> - Interface in org.apache.iceberg.expressions
-
Represents a literal fixed value in an expression predicate
- literals() - Method in class org.apache.iceberg.expressions.UnboundPredicate
- literalSet() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- load(String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Loads a table from a FileSystem path location.
- load(String) - Method in interface org.apache.iceberg.Tables
- loadCatalog() - Method in class org.apache.iceberg.flink.CatalogLoader.CustomCatalogLoader
- loadCatalog() - Method in class org.apache.iceberg.flink.CatalogLoader.HadoopCatalogLoader
- loadCatalog() - Method in class org.apache.iceberg.flink.CatalogLoader.HiveCatalogLoader
- loadCatalog() - Method in interface org.apache.iceberg.flink.CatalogLoader
-
Create a new catalog with the provided properties.
- loadCatalog(String, String, Map<String, String>, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Load a custom catalog implementation.
- loadCatalogMetadataTable(SparkSession, Table, MetadataTableType) - Static method in class org.apache.iceberg.spark.SparkTableUtil
- loader(ClassLoader) - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Set the ClassLoader used to lookup classes by name.
- loader(ClassLoader) - Method in class org.apache.iceberg.common.DynConstructors.Builder
-
Set the ClassLoader used to lookup classes by name.
- loader(ClassLoader) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Set the ClassLoader used to lookup classes by name.
- loader(ClassLoader) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Set the ClassLoader used to lookup classes by name.
- loadFileIO(String, Map<String, String>, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Load a custom FileIO implementation.
- loadIcebergTable(SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Returns an Iceberg Table by its name from a Spark V2 Catalog.
- loadMetadataTable(SparkSession, Table, MetadataTableType) - Static method in class org.apache.iceberg.spark.SparkTableUtil
- loadNamespaceMetadata(String[]) - Method in class org.apache.iceberg.spark.SparkCatalog
- loadNamespaceMetadata(String[]) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- loadNamespaceMetadata(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Load metadata properties for a namespace.
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.hive.HiveCatalog
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- loadNamespaceMetadata(Namespace) - Method in class org.apache.iceberg.nessie.NessieCatalog
-
Namespace metadata is not supported in Nessie, so an empty map is returned.
- loadProcedure(Identifier) - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureCatalog
-
Load a stored procedure by identifier.
- loadTable() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- loadTable() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- loadTable() - Method in interface org.apache.iceberg.flink.TableLoader
- loadTable(Configuration) - Static method in class org.apache.iceberg.mr.Catalogs
-
Load an Iceberg table using the catalog and table identifier (or table path) specified by the configuration.
- loadTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Load an Iceberg table using the catalog specified by the configuration.
- loadTable(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- loadTable(TableIdentifier) - Method in class org.apache.iceberg.CachingCatalog
- loadTable(TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Load a table.
- loadTable(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
- loadTable(Identifier) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
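For example (a hedged sketch; the identifier and warehouse path are made up), a table can be loaded through a catalog or directly from a filesystem location:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.hadoop.HadoopTables;

    public class LoadExample {
      static Table byIdentifier(Catalog catalog) {
        // "db.events" is a hypothetical identifier.
        return catalog.loadTable(TableIdentifier.of("db", "events"));
      }

      static Table byPath(String warehousePath) {
        // HadoopTables.load resolves a table from a filesystem location.
        return new HadoopTables(new Configuration()).load(warehousePath);
      }
    }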
- localInput(File) - Static method in class org.apache.iceberg.Files
- localInput(String) - Static method in class org.apache.iceberg.Files
- LOCALITY - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- LOCALITY - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- localityEnabled() - Method in class org.apache.iceberg.spark.SparkReadConf
- LOCALLY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- LOCALLY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- LOCALLY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- LOCALLY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- localOutput(File) - Static method in class org.apache.iceberg.Files
- localOutput(String) - Static method in class org.apache.iceberg.Files
- location() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
-
Return original, unmodified OSS URI location.
- location() - Method in class org.apache.iceberg.BaseTable
- location() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- location() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- location() - Method in interface org.apache.iceberg.io.InputFile
-
The fully-qualified location of the input file as a String.
- location() - Method in interface org.apache.iceberg.io.OutputFile
-
Return the location this output file will create.
- location() - Method in class org.apache.iceberg.MetadataUpdate.SetLocation
- location() - Method in class org.apache.iceberg.SerializableTable
- location() - Method in class org.apache.iceberg.spark.PathIdentifier
- location() - Method in interface org.apache.iceberg.Table
-
Return the table's base location.
- location() - Method in class org.apache.iceberg.TableMetadata
- location(String) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes a location which should be scanned for orphan files.
- location(String) - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- LOCATION - Static variable in class org.apache.iceberg.mr.Catalogs
- locationProvider() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- locationProvider() - Method in class org.apache.iceberg.BaseTable
- locationProvider() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- locationProvider() - Method in class org.apache.iceberg.SerializableTable
- locationProvider() - Method in class org.apache.iceberg.StaticTableOperations
- locationProvider() - Method in interface org.apache.iceberg.Table
-
Returns a LocationProvider to provide locations for new data files.
- locationProvider() - Method in interface org.apache.iceberg.TableOperations
-
Returns a LocationProvider that supplies locations for new data files.
- LocationProvider - Interface in org.apache.iceberg.io
-
Interface for providing data file locations to write tasks.
- LocationProviders - Class in org.apache.iceberg
- locationsFor(String, Map<String, String>) - Static method in class org.apache.iceberg.LocationProviders
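As a sketch of how the LocationProvider entries above fit together (assuming an already-loaded Table), new data file locations are usually obtained from the table's provider rather than constructed by hand:

    import org.apache.iceberg.Table;
    import org.apache.iceberg.io.LocationProvider;

    public class LocationExample {
      static String newDataFileLocation(Table table, String filename) {
        // The provider encodes the table's location scheme (e.g. object-storage layout).
        LocationProvider locations = table.locationProvider();
        return locations.newDataLocation(filename);
      }
    }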
- LOCK_ACQUIRE_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_ACQUIRE_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_ACQUIRE_TIMEOUT_MS - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_ACQUIRE_TIMEOUT_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_THREADS - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_THREADS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_TIMEOUT_MS - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_HEARTBEAT_TIMEOUT_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_IMPL - Static variable in class org.apache.iceberg.CatalogProperties
- LOCK_PROPERTY_PREFIX - Static variable in class org.apache.iceberg.hadoop.HadoopTables
- LOCK_TABLE - Static variable in class org.apache.iceberg.CatalogProperties
- LockManager - Interface in org.apache.iceberg
-
An interface for locking, used to ensure commit isolation.
- LockManagers - Class in org.apache.iceberg.util
- LockManagers.BaseLockManager - Class in org.apache.iceberg.util
- LogicalMap - Class in org.apache.iceberg.avro
- LONG - org.apache.iceberg.orc.ORCSchemaUtil.LongType
- LONG - org.apache.iceberg.types.Type.TypeID
- LongAsDecimalReader(ColumnDescriptor, int) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.LongAsDecimalReader
- longBackedDecimalBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- LongBackedDecimalBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.LongBackedDecimalBatchReader
- longBackedDecimalDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- longBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- LongBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.LongBatchReader
- longDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- longs() - Static method in class org.apache.iceberg.avro.ValueReaders
- longs() - Static method in class org.apache.iceberg.avro.ValueWriters
- longs() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- longs() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- longs(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- LongType() - Constructor for class org.apache.iceberg.types.Types.LongType
- LOOKBACK - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- LOWER_BOUNDS - Static variable in interface org.apache.iceberg.DataFile
- lowerBound() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the lower bound value of this field.
- lowerBound() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- lowerBound() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns a ByteBuffer that contains a serialized bound lower than all values of the field.
- lowerBounds() - Method in interface org.apache.iceberg.ContentFile
-
Returns a map from column ID to value lower bounds if collected, or null otherwise.
- lowerBounds() - Method in class org.apache.iceberg.Metrics
-
Get the non-null lower bound values for all fields in a file.
- lowerBounds() - Method in class org.apache.iceberg.spark.SparkDataFile
- lt(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- lt(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- LT - org.apache.iceberg.expressions.Expression.Operation
- LT_EQ - org.apache.iceberg.expressions.Expression.Operation
- ltEq(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- ltEq(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
M
- makeColumnOptional(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Update a column to optional.
- makeCompatibleName(String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- manageSnapshots() - Method in class org.apache.iceberg.BaseTable
- manageSnapshots() - Method in class org.apache.iceberg.SerializableTable
- manageSnapshots() - Method in interface org.apache.iceberg.Table
-
Create a new manage snapshots API to manage snapshots in this table and commit.
- ManageSnapshots - Interface in org.apache.iceberg
-
API for managing snapshots.
- MANIFEST_CONTENT - Static variable in interface org.apache.iceberg.ManifestFile
- MANIFEST_LISTS_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_LISTS_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_MERGE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_MERGE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_MIN_MERGE_COUNT - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_MIN_MERGE_COUNT_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_TARGET_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- MANIFEST_TARGET_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- ManifestContent - Enum in org.apache.iceberg
-
Content type stored in a manifest file, either DATA or DELETES.
- ManifestEntriesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's manifest entries as rows, for both delete and data files.
- ManifestEvaluator - Class in org.apache.iceberg.expressions
-
Evaluates an Expression on a ManifestFile to test whether the file contains matching partitions.
- ManifestFile - Interface in org.apache.iceberg
-
Represents a manifest file that can be scanned to find data files in a table.
- ManifestFile.PartitionFieldSummary - Interface in org.apache.iceberg
-
Summarizes the values of one partition field stored in a manifest file.
- ManifestFileBean - Class in org.apache.iceberg.spark.actions
- ManifestFileBean() - Constructor for class org.apache.iceberg.spark.actions.ManifestFileBean
- ManifestFiles - Class in org.apache.iceberg
- ManifestFileUtil - Class in org.apache.iceberg.util
- manifestListLocation() - Method in interface org.apache.iceberg.Snapshot
-
Return the location of this snapshot's manifest list, or null if it is not separate.
- manifestListLocations(Table) - Static method in class org.apache.iceberg.ReachableFileUtil
-
Returns locations of manifest lists in a table.
- ManifestReader<F extends ContentFile<F>> - Class in org.apache.iceberg
-
Base reader for data and delete manifest files.
- ManifestReader(InputFile, Map<Integer, PartitionSpec>, InheritableMetadata, ManifestReader.FileType) - Constructor for class org.apache.iceberg.ManifestReader
- ManifestReader.FileType - Enum in org.apache.iceberg
- MANIFESTS - org.apache.iceberg.MetadataTableType
- ManifestsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's manifest files as rows.
- ManifestWriter<F extends ContentFile<F>> - Class in org.apache.iceberg
-
Writer for manifest files.
- map(Schema, Schema) - Method in class org.apache.iceberg.avro.RemoveIds
- map(Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- map(MapType, GroupType, T, T) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- map(ValueReader<K>, ValueReader<V>) - Static method in class org.apache.iceberg.avro.ValueReaders
- map(ValueWriter<K>, ValueWriter<V>) - Static method in class org.apache.iceberg.avro.ValueWriters
- map(CombinedScanTask) - Method in class org.apache.iceberg.flink.source.RowDataRewriter.RewriteMap
- map(OrcValueReader<?>, OrcValueReader<?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- map(OrcValueWriter<K>, OrcValueWriter<V>) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- map(Types.MapType, Integer, Boolean, Boolean) - Method in class org.apache.iceberg.schema.UnionByNameVisitor
- map(Types.MapType, String, String) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- map(Types.MapType, Supplier<List<String>>, Supplier<List<String>>) - Method in class org.apache.iceberg.types.CheckCompatibility
- map(Types.MapType, Supplier<Type>, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- map(Types.MapType, Supplier<Type>, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- map(Types.MapType, Supplier<Type>, Supplier<Type>) - Method in class org.apache.iceberg.types.FixupTypes
- map(Types.MapType, Supplier<T>, Supplier<T>) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- map(Types.MapType, Map<Integer, Integer>, Map<Integer, Integer>) - Method in class org.apache.iceberg.types.IndexParents
- map(Types.MapType, Map<String, Integer>, Map<String, Integer>) - Method in class org.apache.iceberg.types.IndexByName
- map(Types.MapType, Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- map(Types.MapType, Schema, T, T) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- map(Types.MapType, ObjectInspector, ObjectInspector) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- map(Types.MapType, TypeDescription, T, T) - Method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- map(Types.MapType, GroupType, T, T) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- map(Types.MapType, Type.Repetition, int, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- map(Types.MapType, P, R, R) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- map(Types.MapType, T, T) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- map(TypeDescription, T, T) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- map(GroupType, Boolean, Boolean) - Method in class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- map(GroupType, Type, Type) - Method in class org.apache.iceberg.parquet.RemoveIds
- map(GroupType, T, T) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- map(MapType, GroupType, T, T) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- map(P, Schema, T) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- map(P, Schema, T, T) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- MAP - org.apache.iceberg.types.Type.TypeID
- MAP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- MAP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- MAP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- MAP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- MAP_JOINER - Static variable in class org.apache.iceberg.SnapshotSummary
- mapKeyPartner(P) - Method in interface org.apache.iceberg.schema.SchemaWithPartnerVisitor.PartnerAccessors
- mapKeyType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- mapKeyType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- mapKeyType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- MappedField - Class in org.apache.iceberg.mapping
-
An immutable mapping between a field ID and a set of names.
- MappedFields - Class in org.apache.iceberg.mapping
- mapper() - Static method in class org.apache.iceberg.util.JsonUtil
- MappingUtil - Class in org.apache.iceberg.mapping
- MapReader(int, int, ParquetValueReader<K>, ParquetValueReader<V>) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- MapredIcebergInputFormat<T> - Class in org.apache.iceberg.mr.mapred
-
Generic MR v1 InputFormat API for Iceberg.
- MapredIcebergInputFormat() - Constructor for class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
- MapredIcebergInputFormat.CompatibilityTaskAttemptContextImpl - Class in org.apache.iceberg.mr.mapred
- maps(int, int, ParquetValueWriter<K>, ParquetValueWriter<V>) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- mapValuePartner(P) - Method in interface org.apache.iceberg.schema.SchemaWithPartnerVisitor.PartnerAccessors
- mapValueType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- mapValueType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- mapValueType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- MAX_CONCURRENT_FILE_GROUP_REWRITES - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
The max number of file groups to be simultaneously rewritten by the rewrite strategy.
- MAX_CONCURRENT_FILE_GROUP_REWRITES_DEFAULT - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
- MAX_FILE_GROUP_SIZE_BYTES - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
The entire rewrite operation is broken down into pieces based on partitioning and, within partitions, into groups based on size.
- MAX_FILE_GROUP_SIZE_BYTES_DEFAULT - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
- MAX_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.actions.BinPackStrategy
-
Adjusts files which will be considered for rewriting.
- MAX_FILE_SIZE_DEFAULT_RATIO - Static variable in class org.apache.iceberg.actions.BinPackStrategy
- MAX_SNAPSHOT_AGE_MS - Static variable in class org.apache.iceberg.TableProperties
- MAX_SNAPSHOT_AGE_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- maxGroupSize() - Method in class org.apache.iceberg.actions.BinPackStrategy
- maxParallelism(int) - Method in class org.apache.iceberg.flink.actions.RewriteDataFilesAction
- merge(SnapshotSummary.Builder) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- MERGE - org.apache.spark.sql.connector.iceberg.write.RowLevelOperation.Command
- MERGE_CARDINALITY_CHECK_ENABLED - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Will be removed in 0.14.0; the cardinality check is always performed starting from 0.13.0.
- MERGE_CARDINALITY_CHECK_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Will be removed in 0.14.0; the cardinality check is always performed starting from 0.13.0.
- MERGE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- MERGE_ISOLATION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- MERGE_ISOLATION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- MERGE_MODE - Static variable in class org.apache.iceberg.TableProperties
- MERGE_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- MERGE_ON_READ - org.apache.iceberg.RowLevelOperationMode
- message(RowType, MessageType, List<T>) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- message(Types.StructType, MessageType, List<VectorizedReader<?>>) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedReaderBuilder
- message(Types.StructType, MessageType, List<T>) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- message(MessageType, List<Boolean>) - Method in class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- message(MessageType, List<Type>) - Method in class org.apache.iceberg.parquet.RemoveIds
- message(MessageType, List<T>) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- message(StructType, MessageType, List<T>) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- meta(String, String) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- meta(String, String) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- meta(Map<String, String>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- meta(Map<String, String>) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- metadata(String, String) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- METADATA - org.apache.iceberg.FileFormat
- METADATA_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- METADATA_COMPRESSION_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- METADATA_DELETE_AFTER_COMMIT_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- METADATA_DELETE_AFTER_COMMIT_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- METADATA_LOCATION_PROP - Static variable in class org.apache.iceberg.BaseMetastoreTableOperations
- METADATA_PREVIOUS_VERSIONS_MAX - Static variable in class org.apache.iceberg.TableProperties
- METADATA_PREVIOUS_VERSIONS_MAX_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- METADATA_SPLIT_SIZE - Static variable in class org.apache.iceberg.TableProperties
- METADATA_SPLIT_SIZE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- metadataColumn(Table, String) - Static method in class org.apache.iceberg.MetadataColumns
- metadataColumns() - Method in class org.apache.iceberg.spark.source.SparkTable
- MetadataColumns - Class in org.apache.iceberg
- metadataFieldIds() - Static method in class org.apache.iceberg.MetadataColumns
- metadataFileLocation() - Method in class org.apache.iceberg.TableMetadata
- metadataFileLocation(String) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- metadataFileLocation(String) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- metadataFileLocation(String) - Method in class org.apache.iceberg.StaticTableOperations
- metadataFileLocation(String) - Method in interface org.apache.iceberg.TableOperations
-
Given the name of a metadata file, obtain the full path of that file using an appropriate base location of the implementation's choosing.
- metadataFileLocations(Table, boolean) - Static method in class org.apache.iceberg.ReachableFileUtil
-
Returns locations of JSON metadata files in a table.
- metadataSchema() - Method in interface org.apache.spark.sql.connector.iceberg.write.ExtendedLogicalWriteInfo
-
The schema of the input metadata from Spark to data source.
- MetadataTableType - Enum in org.apache.iceberg
- MetadataTableUtils - Class in org.apache.iceberg
- MetadataUpdate - Interface in org.apache.iceberg
-
Represents a change to table metadata.
- MetadataUpdate.AddPartitionSpec - Class in org.apache.iceberg
- MetadataUpdate.AddSchema - Class in org.apache.iceberg
- MetadataUpdate.AddSnapshot - Class in org.apache.iceberg
- MetadataUpdate.AddSortOrder - Class in org.apache.iceberg
- MetadataUpdate.AssignUUID - Class in org.apache.iceberg
- MetadataUpdate.RemoveProperties - Class in org.apache.iceberg
- MetadataUpdate.RemoveSnapshot - Class in org.apache.iceberg
- MetadataUpdate.SetCurrentSchema - Class in org.apache.iceberg
- MetadataUpdate.SetCurrentSnapshot - Class in org.apache.iceberg
- MetadataUpdate.SetDefaultPartitionSpec - Class in org.apache.iceberg
- MetadataUpdate.SetDefaultSortOrder - Class in org.apache.iceberg
- MetadataUpdate.SetLocation - Class in org.apache.iceberg
- MetadataUpdate.SetProperties - Class in org.apache.iceberg
- MetadataUpdate.UpgradeFormatVersion - Class in org.apache.iceberg
- MetastoreUtil - Class in org.apache.iceberg.hive
- metrics() - Method in interface org.apache.iceberg.avro.MetricsAwareDatumWriter
-
Returns a stream of FieldMetrics that this MetricsAwareDatumWriter keeps track of.
- metrics() - Method in interface org.apache.iceberg.avro.ValueWriter
- metrics() - Method in class org.apache.iceberg.data.avro.DataWriter
- metrics() - Method in class org.apache.iceberg.data.orc.GenericOrcWriter
- metrics() - Method in class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- metrics() - Method in class org.apache.iceberg.flink.data.FlinkAvroWriter
- metrics() - Method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- metrics() - Method in interface org.apache.iceberg.io.FileAppender
-
Returns Metrics for this file.
- metrics() - Method in class org.apache.iceberg.ManifestWriter
- metrics() - Method in interface org.apache.iceberg.orc.OrcRowWriter
-
Returns a stream of FieldMetrics that this OrcRowWriter keeps track of.
- metrics() - Method in interface org.apache.iceberg.orc.OrcValueWriter
-
Returns a stream of FieldMetrics that this OrcValueWriter keeps track of.
- metrics() - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
-
Returns a stream of FieldMetrics that this ParquetValueWriter keeps track of.
- metrics() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- metrics() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- metrics() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- metrics() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- metrics() - Method in class org.apache.iceberg.spark.data.SparkAvroWriter
- metrics() - Method in class org.apache.iceberg.spark.data.SparkOrcWriter
- Metrics - Class in org.apache.iceberg
-
Iceberg file format metrics.
- Metrics() - Constructor for class org.apache.iceberg.Metrics
- Metrics(Long, Map<Integer, Long>, Map<Integer, Long>, Map<Integer, Long>, Map<Integer, Long>) - Constructor for class org.apache.iceberg.Metrics
- Metrics(Long, Map<Integer, Long>, Map<Integer, Long>, Map<Integer, Long>, Map<Integer, Long>, Map<Integer, ByteBuffer>, Map<Integer, ByteBuffer>) - Constructor for class org.apache.iceberg.Metrics
- METRICS_MODE_COLUMN_CONF_PREFIX - Static variable in class org.apache.iceberg.TableProperties
- MetricsAwareDatumWriter<D> - Interface in org.apache.iceberg.avro
-
Wrapper writer around DatumWriter with metrics support.
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- metricsConfig(MetricsConfig) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- MetricsConfig - Class in org.apache.iceberg
- metricsMode(Schema, MetricsConfig, int) - Static method in class org.apache.iceberg.MetricsUtil
-
Extract MetricsMode for the given field id from metrics config.
- MetricsModes - Class in org.apache.iceberg
-
This class defines different metrics modes, which allow users to control the collection of value_counts, null_value_counts, nan_value_counts, lower_bounds, upper_bounds for different columns in metadata.
- MetricsModes.Counts - Class in org.apache.iceberg
-
Under this mode, only value_counts, null_value_counts, nan_value_counts are persisted.
- MetricsModes.Full - Class in org.apache.iceberg
-
Under this mode, value_counts, null_value_counts, nan_value_counts and full lower_bounds, upper_bounds are persisted.
- MetricsModes.MetricsMode - Interface in org.apache.iceberg
-
A metrics calculation mode.
- MetricsModes.None - Class in org.apache.iceberg
-
Under this mode, value_counts, null_value_counts, nan_value_counts, lower_bounds, upper_bounds are not persisted.
- MetricsModes.Truncate - Class in org.apache.iceberg
-
Under this mode, value_counts, null_value_counts, nan_value_counts and truncated lower_bounds, upper_bounds are persisted.
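The metrics modes above are normally selected per column through table properties rather than constructed directly; a hedged example (the column names are hypothetical):

    import org.apache.iceberg.Table;

    public class MetricsModeExample {
      static void configure(Table table) {
        // Collect only counts for "payload" and truncated 16-byte bounds for "id".
        table.updateProperties()
            .set("write.metadata.metrics.column.payload", "counts")
            .set("write.metadata.metrics.column.id", "truncate(16)")
            .commit();
      }
    }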
- MetricsUtil - Class in org.apache.iceberg
- MicroBatches - Class in org.apache.iceberg
- MicroBatches.MicroBatch - Class in org.apache.iceberg
- MicroBatches.MicroBatchBuilder - Class in org.apache.iceberg
- MICROS_PER_MILLIS - Static variable in class org.apache.iceberg.util.DateTimeUtil
- microsFromInstant(Instant) - Static method in class org.apache.iceberg.util.DateTimeUtil
- microsFromTime(LocalTime) - Static method in class org.apache.iceberg.util.DateTimeUtil
- microsFromTimestamp(LocalDateTime) - Static method in class org.apache.iceberg.util.DateTimeUtil
- microsFromTimestamptz(OffsetDateTime) - Static method in class org.apache.iceberg.util.DateTimeUtil
- microsToMillis(long) - Static method in class org.apache.iceberg.util.DateTimeUtil
- migratedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseMigrateTableActionResult
- migratedDataFilesCount() - Method in interface org.apache.iceberg.actions.MigrateTable.Result
-
Returns the number of migrated data files.
- migrateTable(String) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to migrate an existing table to Iceberg.
- migrateTable(String) - Method in class org.apache.iceberg.spark.actions.SparkActions
- MigrateTable - Interface in org.apache.iceberg.actions
-
An action that migrates an existing table to Iceberg.
- MigrateTable.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- MIN_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.actions.BinPackStrategy
-
Adjusts files which will be considered for rewriting.
- MIN_FILE_SIZE_DEFAULT_RATIO - Static variable in class org.apache.iceberg.actions.BinPackStrategy
- MIN_INPUT_FILES - Static variable in class org.apache.iceberg.actions.BinPackStrategy
-
The minimum number of files that need to be in a file group for it to be considered for compaction if the total size of that group is less than the RewriteDataFiles.TARGET_FILE_SIZE_BYTES.
- MIN_INPUT_FILES_DEFAULT - Static variable in class org.apache.iceberg.actions.BinPackStrategy
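A sketch of how these bin-pack options are typically passed to the Spark rewrite action (table loading omitted; the option values are illustrative only):

    import org.apache.iceberg.Table;
    import org.apache.iceberg.actions.BinPackStrategy;
    import org.apache.iceberg.actions.RewriteDataFiles;
    import org.apache.iceberg.spark.actions.SparkActions;

    public class CompactionExample {
      static RewriteDataFiles.Result compact(Table table) {
        return SparkActions.get()
            .rewriteDataFiles(table)
            .binPack()
            // Only rewrite groups that contain at least 5 input files.
            .option(BinPackStrategy.MIN_INPUT_FILES, "5")
            .option(RewriteDataFiles.MAX_CONCURRENT_FILE_GROUP_REWRITES, "2")
            .execute();
      }
    }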
- MIN_SEQUENCE_NUMBER - Static variable in interface org.apache.iceberg.ManifestFile
- MIN_SNAPSHOTS_TO_KEEP - Static variable in class org.apache.iceberg.TableProperties
- MIN_SNAPSHOTS_TO_KEEP_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- minSequenceNumber() - Method in class org.apache.iceberg.GenericManifestFile
- minSequenceNumber() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the lowest sequence number of any data file in the manifest.
- minSequenceNumber() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- MINUS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- MINUS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- MINUS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- modeName() - Method in enum org.apache.iceberg.DistributionMode
- modeNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- month(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- month(String) - Static method in class org.apache.iceberg.expressions.Expressions
- month(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- month(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- month(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- month(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- month(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a month Transform for date or timestamp types.
- moveAfter(String, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Move a column from its current position to directly after a reference column.
- moveBefore(String, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Move a column from its current position to directly before a reference column.
- moveFirst(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Move a column from its current position to the start of the schema or its parent struct.
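These schema-evolution calls are chained on UpdateSchema and applied with commit(); a hedged example with made-up column names:

    import org.apache.iceberg.Table;

    public class SchemaEvolutionExample {
      static void reorder(Table table) {
        table.updateSchema()
            .makeColumnOptional("legacy_flag")   // hypothetical column
            .moveFirst("id")
            .moveAfter("event_time", "id")
            .commit();
      }
    }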
- multipartIdentifier - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- multipartIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- multipartIdentifier(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- MultipartIdentifierContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
N
- name - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- name - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- name() - Method in class org.apache.iceberg.actions.BinPackStrategy
- name() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Returns the name of this convert deletes strategy.
- name() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Returns the name of this rewrite deletes strategy.
- name() - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Returns the name of this rewrite strategy.
- name() - Method in class org.apache.iceberg.actions.SortStrategy
- name() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- name() - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- name() - Method in class org.apache.iceberg.BaseTable
- name() - Method in class org.apache.iceberg.CachingCatalog
- name() - Method in interface org.apache.iceberg.catalog.Catalog
-
Return the name for this catalog.
- name() - Method in class org.apache.iceberg.catalog.TableIdentifier
-
Returns the identifier name.
- name() - Method in class org.apache.iceberg.expressions.NamedReference
- name() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- name() - Method in class org.apache.iceberg.hive.HiveCatalog
- name() - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- name() - Method in class org.apache.iceberg.nessie.NessieCatalog
- name() - Method in class org.apache.iceberg.PartitionField
-
Returns the name of this partition field.
- name() - Method in class org.apache.iceberg.SerializableTable
- name() - Method in class org.apache.iceberg.spark.PathIdentifier
- name() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- name() - Method in class org.apache.iceberg.spark.source.SparkMetadataColumn
- name() - Method in class org.apache.iceberg.spark.source.SparkTable
- name() - Method in class org.apache.iceberg.spark.SparkCatalog
- name() - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- name() - Method in interface org.apache.iceberg.Table
-
Return the full name for this table.
- name() - Method in class org.apache.iceberg.types.Types.NestedField
- name() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Returns the name of this parameter.
- NAME - Static variable in class org.apache.iceberg.mr.Catalogs
- named(String) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- named(String) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- NamedArgumentContext(IcebergSqlExtensionsParser.CallArgumentContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- NamedReference<T> - Class in org.apache.iceberg.expressions
- nameMapping(String) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- NameMapping - Class in org.apache.iceberg.mapping
-
Represents a mapping from external schema names to Iceberg type IDs.
- NameMappingParser - Class in org.apache.iceberg.mapping
-
Parses external name mappings from a JSON representation.
- names() - Method in class org.apache.iceberg.mapping.MappedField
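For orientation, a minimal Java sketch of the name-mapping entries above (MappedField.of, NameMapping.of, NameMappingParser); the field IDs and column names are made up for illustration:
import org.apache.iceberg.mapping.MappedField;
import org.apache.iceberg.mapping.NameMapping;
import org.apache.iceberg.mapping.NameMappingParser;

// Map external column names to Iceberg field IDs, then round-trip through JSON.
NameMapping mapping = NameMapping.of(
    MappedField.of(1, "id"),
    MappedField.of(2, "data"));
String json = NameMappingParser.toJson(mapping);
NameMapping parsed = NameMappingParser.fromJson(json);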
- namespace() - Method in class org.apache.iceberg.catalog.TableIdentifier
-
Returns the identifier namespace.
- namespace() - Method in class org.apache.iceberg.spark.PathIdentifier
- Namespace - Class in org.apache.iceberg.catalog
-
A namespace in a Catalog.
- namespaceExists(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Checks whether the Namespace exists.
- namespaceExists(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- NamespaceNotEmptyException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to drop a namespace that is not empty.
- NamespaceNotEmptyException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NamespaceNotEmptyException
- NamespaceNotEmptyException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NamespaceNotEmptyException
- nameToSupportMap - Static variable in enum org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
- NAN_VALUE_COUNTS - Static variable in interface org.apache.iceberg.DataFile
- NaNUtil - Class in org.apache.iceberg.util
- nanValueCount() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the number of NaN values for this field.
- nanValueCounts() - Method in interface org.apache.iceberg.ContentFile
-
Returns, if collected, a map from column ID to its NaN value count; null otherwise.
- nanValueCounts() - Method in class org.apache.iceberg.Metrics
-
Get the number of NaN values for all float and double fields in a file.
- nanValueCounts() - Method in class org.apache.iceberg.spark.SparkDataFile
- needsTaskCommit(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
- negate() - Method in class org.apache.iceberg.expressions.And
- negate() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- negate() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- negate() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- negate() - Method in interface org.apache.iceberg.expressions.Expression
-
Returns the negation of this expression, equivalent to not(this).
- negate() - Method in enum org.apache.iceberg.expressions.Expression.Operation
-
Returns the operation used when this is negated.
- negate() - Method in class org.apache.iceberg.expressions.False
- negate() - Method in class org.apache.iceberg.expressions.Not
- negate() - Method in class org.apache.iceberg.expressions.Or
- negate() - Method in class org.apache.iceberg.expressions.True
- negate() - Method in class org.apache.iceberg.expressions.UnboundPredicate
- NESSIE_CONFIG_PREFIX - Static variable in class org.apache.iceberg.nessie.NessieUtil
- NessieCatalog - Class in org.apache.iceberg.nessie
-
Nessie implementation of Iceberg Catalog.
- NessieCatalog() - Constructor for class org.apache.iceberg.nessie.NessieCatalog
- NessieTableOperations - Class in org.apache.iceberg.nessie
-
Nessie implementation of Iceberg TableOperations.
- NessieUtil - Class in org.apache.iceberg.nessie
- nestedMapping() - Method in class org.apache.iceberg.mapping.MappedField
- NestedType() - Constructor for class org.apache.iceberg.types.Type.NestedType
- newAppend() - Method in class org.apache.iceberg.BaseTable
- newAppend() - Method in class org.apache.iceberg.SerializableTable
- newAppend() - Method in interface org.apache.iceberg.Table
-
Create a new append API to add files to this table and commit.
- newAppend() - Method in interface org.apache.iceberg.Transaction
-
Create a new append API to add files to this table.
- newAppender(OutputFile, FileFormat) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- newAppender(OutputFile, FileFormat) - Method in class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- newAppender(OutputFile, FileFormat) - Method in interface org.apache.iceberg.io.FileAppenderFactory
-
Create a new FileAppender.
- newAppender(PartitionSpec, OutputFile) - Method in class org.apache.iceberg.ManifestWriter
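As a minimal sketch of the append API indexed above (Table.newAppend), assuming a Table and an already-written DataFile obtained elsewhere:
import org.apache.iceberg.DataFile;
import org.apache.iceberg.Table;

static void appendOne(Table table, DataFile dataFile) {
  table.newAppend()          // AppendFiles operation
      .appendFile(dataFile)  // stage the completed data file
      .commit();             // produces a new table snapshot
}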
- newBuilder(String) - Static method in class org.apache.iceberg.spark.procedures.SparkProcedures
- newClient() - Method in class org.apache.iceberg.ClientPoolImpl
- newClient() - Method in class org.apache.iceberg.hive.HiveClientPool
- newCreateTableTransaction(String, Schema, PartitionSpec, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Start a transaction to create a table.
- newCreateTableTransaction(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to create a table.
- newCreateTableTransaction(TableIdentifier, Schema, PartitionSpec) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to create a table.
- newCreateTableTransaction(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to create a table.
- newCreateTableTransaction(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to create a table.
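A hedged sketch of Catalog.newCreateTableTransaction; the catalog instance, identifier, and field IDs are illustrative only:
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Transaction;
import org.apache.iceberg.catalog.Catalog;
import org.apache.iceberg.catalog.TableIdentifier;
import org.apache.iceberg.types.Types;

static void createEvents(Catalog catalog) {
  Schema schema = new Schema(
      Types.NestedField.required(1, "id", Types.LongType.get()),
      Types.NestedField.optional(2, "data", Types.StringType.get()));
  PartitionSpec spec = PartitionSpec.builderFor(schema).bucket("id", 16).build();

  Transaction create = catalog.newCreateTableTransaction(
      TableIdentifier.of("db", "events"), schema, spec);
  // Stage further changes through create.table() if needed, then commit atomically.
  create.commitTransaction();
}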
- newDataLocation(String) - Method in interface org.apache.iceberg.io.LocationProvider
-
Return a fully-qualified data file location for the given filename.
- newDataLocation(PartitionSpec, StructLike, String) - Method in interface org.apache.iceberg.io.LocationProvider
-
Return a fully-qualified data file location for the given partition and filename.
- newDataWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- newDataWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- newDataWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in interface org.apache.iceberg.io.FileAppenderFactory
-
Create a new DataWriter.
- newDataWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- newDataWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.FileWriterFactory
-
Creates a new DataWriter.
- newDelete() - Method in class org.apache.iceberg.BaseTable
- newDelete() - Method in class org.apache.iceberg.SerializableTable
- newDelete() - Method in interface org.apache.iceberg.Table
-
Create a new delete API to replace files in this table and commit.
- newDelete() - Method in interface org.apache.iceberg.Transaction
-
Create a new delete API to replace files in this table.
- newEqDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- newEqDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- newEqDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in interface org.apache.iceberg.io.FileAppenderFactory
-
Create a new EqualityDeleteWriter.
- newEqualityDeleteWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- newEqualityDeleteWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.FileWriterFactory
-
Creates a new EqualityDeleteWriter.
- newFastAppend() - Method in class org.apache.iceberg.BaseTable
- newFastAppend() - Method in interface org.apache.iceberg.Table
-
Create a new append API to add files to this table and commit.
- newFastAppend() - Method in interface org.apache.iceberg.Transaction
-
Create a new append API to add files to this table.
- newFiles(Long, long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- newInputFile(String) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- newInputFile(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- newInputFile(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- newInputFile(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- newInputFile(String) - Method in interface org.apache.iceberg.io.FileIO
-
Get an InputFile instance to read bytes from the file at the given path.
- newInputFile(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
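A small sketch, assuming a FileIO instance (for example, from Table.io()) and a valid path are available, showing newInputFile together with InputFile.newStream:
import java.io.IOException;
import org.apache.iceberg.io.FileIO;
import org.apache.iceberg.io.InputFile;
import org.apache.iceberg.io.SeekableInputStream;

static int readFirstByte(FileIO io, String path) throws IOException {
  InputFile file = io.newInputFile(path);        // resolve the path through FileIO
  try (SeekableInputStream in = file.newStream()) {
    return in.read();                            // read a single byte from the file
  }
}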
- newInstance(Object...) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- newInstanceChecked(Object...) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- newListData(List<E>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- newListData(T) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- newMapData(Map<K, V>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- newMapData(M) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- newOSSClient() - Method in interface org.apache.iceberg.aliyun.AliyunClientFactory
-
Create an aliyun OSS client.
- newOutputFile() - Method in class org.apache.iceberg.io.OutputFileFactory
-
Generates an EncryptedOutputFile for unpartitioned writes.
- newOutputFile(String) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- newOutputFile(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- newOutputFile(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- newOutputFile(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- newOutputFile(String) - Method in interface org.apache.iceberg.io.FileIO
-
Get an OutputFile instance to write bytes to the file at the given path.
- newOutputFile(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
- newOutputFile(PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.OutputFileFactory
-
Generates an EncryptedOutputFile for partitioned writes in a given spec.
- newOutputFile(StructLike) - Method in class org.apache.iceberg.io.OutputFileFactory
-
Generates an EncryptedOutputFile for partitioned writes in the default spec.
- newOverwrite() - Method in class org.apache.iceberg.BaseTable
- newOverwrite() - Method in class org.apache.iceberg.SerializableTable
- newOverwrite() - Method in interface org.apache.iceberg.Table
-
Create a new overwrite API to overwrite files by a filter expression.
- newOverwrite() - Method in interface org.apache.iceberg.Transaction
-
Create a new overwrite API to overwrite files by a filter expression.
- newPosDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- newPosDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in class org.apache.iceberg.flink.sink.FlinkAppenderFactory
- newPosDeleteWriter(EncryptedOutputFile, FileFormat, StructLike) - Method in interface org.apache.iceberg.io.FileAppenderFactory
-
Create a new PositionDeleteWriter.
- newPositionDeleteWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- newPositionDeleteWriter(EncryptedOutputFile, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.FileWriterFactory
-
Creates a new PositionDeleteWriter.
- newRefinedScan(TableOperations, Table, Schema, TableScanContext) - Method in class org.apache.iceberg.AllDataFilesTable.AllDataFilesTableScan
- newRefinedScan(TableOperations, Table, Schema, TableScanContext) - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- newRefinedScan(TableOperations, Table, Schema, TableScanContext) - Method in class org.apache.iceberg.DataFilesTable.FilesTableScan
- newRefinedScan(TableOperations, Table, Schema, TableScanContext) - Method in class org.apache.iceberg.DataTableScan
- newReplacePartitions() - Method in class org.apache.iceberg.BaseTable
- newReplacePartitions() - Method in class org.apache.iceberg.SerializableTable
- newReplacePartitions() - Method in interface org.apache.iceberg.Table
-
Not recommended: Create a new replace partitions API to dynamically overwrite partitions in the table with new data.
- newReplacePartitions() - Method in interface org.apache.iceberg.Transaction
-
Not recommended: Create a new replace partitions API to dynamically overwrite partitions in the table with new data.
- newReplaceTableTransaction(String, Schema, PartitionSpec, Map<String, String>, boolean) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Start a transaction to replace a table.
- newReplaceTableTransaction(TableIdentifier, Schema, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to replace a table.
- newReplaceTableTransaction(TableIdentifier, Schema, PartitionSpec, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to replace a table.
- newReplaceTableTransaction(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to replace a table.
- newReplaceTableTransaction(TableIdentifier, Schema, PartitionSpec, Map<String, String>, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Start a transaction to replace a table.
- newRewrite() - Method in class org.apache.iceberg.BaseTable
- newRewrite() - Method in class org.apache.iceberg.SerializableTable
- newRewrite() - Method in interface org.apache.iceberg.Table
-
Create a new rewrite API to replace files in this table and commit.
- newRewrite() - Method in interface org.apache.iceberg.Transaction
-
Create a new rewrite API to replace files in this table.
- newRowDelta() - Method in class org.apache.iceberg.BaseTable
- newRowDelta() - Method in class org.apache.iceberg.SerializableTable
- newRowDelta() - Method in interface org.apache.iceberg.Table
-
Create a new row-level delta API to remove or replace rows in existing data files.
- newRowDelta() - Method in interface org.apache.iceberg.Transaction
-
Create a new row-level delta API to remove or replace rows in existing data files.
- newRowLevelOperationBuilder(RowLevelOperationInfo) - Method in class org.apache.iceberg.spark.source.SparkTable
- newRowLevelOperationBuilder(RowLevelOperationInfo) - Method in interface org.apache.spark.sql.connector.iceberg.catalog.SupportsRowLevelOperations
-
Returns a RowLevelOperationBuilder to build a RowLevelOperation.
- newScan() - Method in class org.apache.iceberg.AllDataFilesTable
- newScan() - Method in class org.apache.iceberg.AllEntriesTable
- newScan() - Method in class org.apache.iceberg.AllManifestsTable
- newScan() - Method in class org.apache.iceberg.BaseTable
- newScan() - Method in class org.apache.iceberg.DataFilesTable
- newScan() - Method in class org.apache.iceberg.HistoryTable
- newScan() - Method in class org.apache.iceberg.ManifestEntriesTable
- newScan() - Method in class org.apache.iceberg.ManifestsTable
- newScan() - Method in class org.apache.iceberg.PartitionsTable
- newScan() - Method in class org.apache.iceberg.SerializableTable
- newScan() - Method in class org.apache.iceberg.SnapshotsTable
- newScan() - Method in interface org.apache.iceberg.Table
-
Create a new scan for this table.
- newScanBuilder(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.RollbackStagedTable
- newScanBuilder(CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.source.SparkTable
- newScanBuilder(CaseInsensitiveStringMap) - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperation
-
Returns a scan builder to configure a scan for this row-level operation.
- newSnapshotId() - Method in interface org.apache.iceberg.TableOperations
-
Create a new ID for a Snapshot
- newStream() - Method in class org.apache.iceberg.aliyun.oss.OSSInputFile
- newStream() - Method in class org.apache.iceberg.aws.s3.S3InputFile
- newStream() - Method in class org.apache.iceberg.gcp.gcs.GCSInputFile
- newStream() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- newStream() - Method in interface org.apache.iceberg.io.InputFile
-
Opens a new SeekableInputStream for the underlying data file.
- newStructData(T) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- newTable(TableOperations, String) - Method in class org.apache.iceberg.SerializableTable
- newTableMetadata(Schema, PartitionSpec, String, Map<String, String>) - Static method in class org.apache.iceberg.TableMetadata
- newTableMetadata(Schema, PartitionSpec, SortOrder, String, Map<String, String>) - Static method in class org.apache.iceberg.TableMetadata
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- newTableOps(TableIdentifier) - Method in class org.apache.iceberg.nessie.NessieCatalog
- newTaskAttemptContext(JobConf, Reporter) - Static method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
- newTransaction() - Method in class org.apache.iceberg.BaseTable
- newTransaction() - Method in class org.apache.iceberg.SerializableTable
- newTransaction() - Method in interface org.apache.iceberg.Table
-
Create a new transaction API to commit multiple table operations at once.
- newTransaction(String, TableOperations) - Static method in class org.apache.iceberg.Transactions
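A minimal sketch of the transaction API above (Table.newTransaction), assuming the data files involved were produced elsewhere:
import org.apache.iceberg.DataFile;
import org.apache.iceberg.Table;
import org.apache.iceberg.Transaction;

static void swapFiles(Table table, DataFile added, DataFile removed) {
  Transaction txn = table.newTransaction();
  txn.newAppend().appendFile(added).commit();    // staged; not yet visible to readers
  txn.newDelete().deleteFile(removed).commit();  // staged; not yet visible to readers
  txn.commitTransaction();                       // apply all staged changes to the table
}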
- newWriteBuilder(ExtendedLogicalWriteInfo) - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperation
-
Returns a write builder to configure a write for this row-level operation.
- newWriteBuilder(ExtendedLogicalWriteInfo) - Method in interface org.apache.spark.sql.connector.iceberg.write.SupportsDelta
- newWriteBuilder(LogicalWriteInfo) - Method in class org.apache.iceberg.spark.RollbackStagedTable
- newWriteBuilder(LogicalWriteInfo) - Method in class org.apache.iceberg.spark.source.SparkTable
- newWriter(EncryptedOutputFile) - Method in class org.apache.iceberg.io.RollingDataWriter
- newWriter(EncryptedOutputFile) - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- newWriter(EncryptedOutputFile) - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- newWriter(PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.ClusteredDataWriter
- newWriter(PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- newWriter(PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- newWriter(PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.FanoutDataWriter
- next() - Method in class org.apache.iceberg.flink.source.DataIterator
- next() - Method in class org.apache.iceberg.io.ClosingIterator
- next() - Method in class org.apache.iceberg.io.FilterIterator
- next() - Method in class org.apache.iceberg.orc.VectorizedRowBatchIterator
- nextBatch(FieldVector, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BatchReader
- nextBatchDictionaryIds(IntVector, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
-
Method for reading a batch of dictionary ids from the dictionary encoded data pages.
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BooleanBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DictionaryBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DoubleBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedLengthDecimalBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedSizeBinaryBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FixedWidthTypeBinaryBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.FloatBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.IntBackedDecimalBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.IntegerBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.LongBackedDecimalBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.LongBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.TimestampMillisBatchReader
- nextBatchOf(FieldVector, int, int, int, NullabilityHolder) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.VarWidthTypeBatchReader
- nextBinary() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextBoolean() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextDouble() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextFloat() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextInteger() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextKeyValue() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- nextLong() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextNull() - Method in class org.apache.iceberg.parquet.ColumnIterator
- nextRecord(RowData) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- nextSequenceNumber() - Method in class org.apache.iceberg.TableMetadata
- NO_LOCATION_PREFERENCE - Static variable in class org.apache.iceberg.hadoop.HadoopInputFile
- None() - Constructor for class org.apache.iceberg.MetricsModes.None
- NONE - org.apache.iceberg.DistributionMode
- NONE - org.apache.iceberg.TableMetadataParser.Codec
- nonMetadataColumn(String) - Static method in class org.apache.iceberg.MetadataColumns
- nonNullRead(ColumnVector, int) - Method in interface org.apache.iceberg.orc.OrcValueReader
- nonNullRead(ColumnVector, int) - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
- nonNullWrite(int, S, ColumnVector) - Method in class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- nonNullWrite(int, T, ColumnVector) - Method in interface org.apache.iceberg.orc.OrcValueWriter
- nonReserved() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- nonReserved() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- NonReservedContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- noRetry() - Method in class org.apache.iceberg.util.Tasks.Builder
- NoSuchIcebergTableException - Exception in org.apache.iceberg.exceptions
-
NoSuchTableException thrown when a table is found but it is not an Iceberg table.
- NoSuchIcebergTableException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NoSuchIcebergTableException
- NoSuchNamespaceException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to load a namespace that does not exist.
- NoSuchNamespaceException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NoSuchNamespaceException
- NoSuchNamespaceException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NoSuchNamespaceException
- NoSuchProcedureException - Exception in org.apache.spark.sql.catalyst.analysis
- NoSuchProcedureException(Identifier) - Constructor for exception org.apache.spark.sql.catalyst.analysis.NoSuchProcedureException
- NoSuchTableException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to load a table that does not exist.
- NoSuchTableException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NoSuchTableException
- NoSuchTableException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NoSuchTableException
- not(Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- not(R) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- Not - Class in org.apache.iceberg.expressions
- NOT - org.apache.iceberg.expressions.Expression.Operation
- NOT_EQ - org.apache.iceberg.expressions.Expression.Operation
- NOT_IN - org.apache.iceberg.expressions.Expression.Operation
- NOT_NAN - org.apache.iceberg.expressions.Expression.Operation
- NOT_NULL - org.apache.iceberg.expressions.Expression.Operation
- NOT_STARTS_WITH - org.apache.iceberg.expressions.Expression.Operation
- notEq(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- notEq(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- notEqual(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- notEqual(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
- NotFoundException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to read a file that does not exist.
- NotFoundException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NotFoundException
- NotFoundException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.NotFoundException
- notify(E) - Method in interface org.apache.iceberg.events.Listener
- notifyAll(E) - Static method in class org.apache.iceberg.events.Listeners
- notIn(String, Iterable<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- notIn(String, T...) - Static method in class org.apache.iceberg.expressions.Expressions
- notIn(Bound<T>, Set<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- notIn(BoundReference<T>, Set<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- notIn(UnboundTerm<T>, Iterable<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- notIn(UnboundTerm<T>, T...) - Static method in class org.apache.iceberg.expressions.Expressions
- notNaN(String) - Static method in class org.apache.iceberg.expressions.Expressions
- notNaN(Bound<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- notNaN(BoundReference<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- notNaN(UnboundTerm<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- notNull(String) - Static method in class org.apache.iceberg.expressions.Expressions
- notNull(Bound<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- notNull(BoundReference<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- notNull(UnboundTerm<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- notStartsWith(String, String) - Static method in class org.apache.iceberg.expressions.Expressions
- notStartsWith(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- notStartsWith(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- notStartsWith(UnboundTerm<String>, String) - Static method in class org.apache.iceberg.expressions.Expressions
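To show how the Expressions factory methods above are typically combined, a hedged sketch that filters a table scan; the column name is hypothetical:
import org.apache.iceberg.FileScanTask;
import org.apache.iceberg.Table;
import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;
import org.apache.iceberg.io.CloseableIterable;

static CloseableIterable<FileScanTask> planFiltered(Table table) {
  Expression filter = Expressions.and(
      Expressions.notNull("event_type"),
      Expressions.notIn("event_type", "debug", "trace"));
  return table.newScan()
      .filter(filter)   // used for manifest and data file pruning
      .planFiles();
}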
- NULL_VALUE_COUNTS - Static variable in interface org.apache.iceberg.DataFile
- nullabilityHolder() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- nullabilityHolder() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- NullabilityHolder - Class in org.apache.iceberg.arrow.vectorized
-
Instances of this class simply track whether a value at an index is null.
- NullabilityHolder(int) - Constructor for class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- nullOrder - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- nullOrder() - Method in class org.apache.iceberg.SortField
-
Returns the null order
- NullOrder - Enum in org.apache.iceberg
- nulls() - Static method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- nulls() - Static method in class org.apache.iceberg.avro.ValueReaders
- nulls() - Static method in class org.apache.iceberg.avro.ValueWriters
- nulls() - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- NULLS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- NULLS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- NULLS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- NULLS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- NULLS_FIRST - org.apache.iceberg.NullOrder
- NULLS_LAST - org.apache.iceberg.NullOrder
- nullsFirst() - Static method in class org.apache.iceberg.types.Comparators
- nullsLast() - Static method in class org.apache.iceberg.types.Comparators
- nullType() - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- nullType() - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- nullType() - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- nullValueCount() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the number of null values for this field.
- nullValueCounts() - Method in interface org.apache.iceberg.ContentFile
-
Returns, if collected, a map from column ID to its null value count; null otherwise.
- nullValueCounts() - Method in class org.apache.iceberg.Metrics
-
Get the number of null values for all fields in a file.
- nullValueCounts() - Method in class org.apache.iceberg.spark.SparkDataFile
- nullWrite() - Method in interface org.apache.iceberg.orc.OrcValueWriter
- number() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- number() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- NumberContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- NumberContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- numCols() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Returns the number of columns that make up this batch.
- NumericLiteralContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- numNulls() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- numNulls() - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- numNulls() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- numNulls() - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- numOutputFiles(long) - Method in class org.apache.iceberg.actions.BinPackStrategy
-
Determine how many output files to create when rewriting.
- numRows() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Returns the number of rows for read, including filtered rows.
- numValues() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- numValues() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
O
- OBJECT_STORE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- OBJECT_STORE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- OBJECT_STORE_PATH - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Use TableProperties.WRITE_DATA_LOCATION instead.
- of(boolean) - Static method in interface org.apache.iceberg.expressions.Literal
- of(byte[]) - Static method in class org.apache.iceberg.encryption.EncryptionKeyMetadatas
- of(byte[]) - Static method in interface org.apache.iceberg.expressions.Literal
- of(double) - Static method in interface org.apache.iceberg.expressions.Literal
- of(float) - Static method in interface org.apache.iceberg.expressions.Literal
- of(int) - Static method in interface org.apache.iceberg.expressions.Literal
- of(int, boolean, String, Type) - Static method in class org.apache.iceberg.types.Types.NestedField
- of(int, boolean, String, Type, String) - Static method in class org.apache.iceberg.types.Types.NestedField
- of(int, int) - Static method in class org.apache.iceberg.types.Types.DecimalType
- of(long) - Static method in interface org.apache.iceberg.expressions.Literal
- of(CharSequence) - Static method in interface org.apache.iceberg.expressions.Literal
- of(Integer, Iterable<String>) - Static method in class org.apache.iceberg.mapping.MappedField
- of(Integer, Iterable<String>, MappedFields) - Static method in class org.apache.iceberg.mapping.MappedField
- of(Integer, String) - Static method in class org.apache.iceberg.mapping.MappedField
- of(Integer, String, MappedFields) - Static method in class org.apache.iceberg.mapping.MappedField
- of(Iterable<CharSequence>) - Static method in class org.apache.iceberg.util.CharSequenceSet
- of(String...) - Static method in class org.apache.iceberg.catalog.Namespace
- of(String...) - Static method in class org.apache.iceberg.catalog.TableIdentifier
- of(BigDecimal) - Static method in interface org.apache.iceberg.expressions.Literal
- of(ByteBuffer) - Static method in class org.apache.iceberg.encryption.EncryptionKeyMetadatas
- of(ByteBuffer) - Static method in interface org.apache.iceberg.expressions.Literal
- of(List<MappedField>) - Static method in class org.apache.iceberg.mapping.MappedFields
- of(List<MappedField>) - Static method in class org.apache.iceberg.mapping.NameMapping
- of(List<Types.NestedField>) - Static method in class org.apache.iceberg.types.Types.StructType
- of(UUID) - Static method in interface org.apache.iceberg.expressions.Literal
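A brief sketch of the Literal.of factories listed above, converting between compatible types with Literal.to:
import org.apache.iceberg.expressions.Literal;
import org.apache.iceberg.types.Types;

// Typed literal values, as used inside predicates.
Literal<Integer> intLit = Literal.of(34);
Literal<Long> longLit = intLit.to(Types.LongType.get());
Literal<CharSequence> strLit = Literal.of("2021-01-01");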
- of(ValueVector) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.StructChildFactory
-
Create the child vector of a type such as Spark's ArrowColumnVector from the arrow child vector.
- of(Namespace, String) - Static method in class org.apache.iceberg.catalog.TableIdentifier
- of(MappedField...) - Static method in class org.apache.iceberg.mapping.MappedFields
- of(MappedField...) - Static method in class org.apache.iceberg.mapping.NameMapping
- of(MappedFields) - Static method in class org.apache.iceberg.mapping.NameMapping
- of(PartitionSpec, Expression, boolean) - Static method in class org.apache.iceberg.expressions.ResidualEvaluator
-
Return a residual evaluator for a spec and expression.
- of(TableScan) - Static method in class org.apache.iceberg.ScanSummary
-
Create a scan summary builder for a table scan.
- of(Types.NestedField...) - Static method in class org.apache.iceberg.types.Types.StructType
- of(X, Y) - Static method in class org.apache.iceberg.util.Pair
- ofBigDecimal(BigDecimal, int, int) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.DecimalFactory
-
Create a decimal from the given BigDecimal value, precision and scale.
- ofBytes(byte[]) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.StringFactory
-
Create a UTF8 String from the byte array.
- ofChild(ValueVector) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.ArrayFactory
-
Create a child vector of type ChildVectorT from the arrow child vector.
- ofEqualityDeletes(int...) - Method in class org.apache.iceberg.FileMetadata.Builder
- offer(RewriteFileGroup) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
-
Places a file group in the queue to be asynchronously committed either when the queue has enough elements to do a batch of size RewriteDataFilesCommitManager.CommitService.rewritesPerCommit or the service has been closed.
- ofLength(int) - Static method in class org.apache.iceberg.types.Types.FixedType
- ofLong(long, int, int) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.DecimalFactory
-
Create a decimal from the given long value, precision and scale.
- ofOptional(int, int, Type, Type) - Static method in class org.apache.iceberg.types.Types.MapType
- ofOptional(int, Type) - Static method in class org.apache.iceberg.types.Types.ListType
- ofPositionDeletes() - Method in class org.apache.iceberg.FileMetadata.Builder
- ofRequired(int, int, Type, Type) - Static method in class org.apache.iceberg.types.Types.MapType
- ofRequired(int, Type) - Static method in class org.apache.iceberg.types.Types.ListType
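Putting the type factories above together, a sketch of building a schema; the field IDs and names are arbitrary:
import org.apache.iceberg.Schema;
import org.apache.iceberg.types.Types;

Schema schema = new Schema(
    Types.NestedField.required(1, "id", Types.LongType.get()),
    Types.NestedField.optional(2, "tags",
        Types.ListType.ofOptional(3, Types.StringType.get())),
    Types.NestedField.optional(4, "props",
        Types.MapType.ofRequired(5, 6, Types.StringType.get(), Types.StringType.get())));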
- ofRow(ValueVector, ChildVectorT, int) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.ArrayFactory
-
Create an array of type ArrayT from the row value in the arrow child vector.
- ofRow(VarCharVector, int) - Method in interface org.apache.iceberg.arrow.vectorized.GenericArrowVectorAccessorFactory.StringFactory
-
Create a UTF8 String from the row value in the arrow vector.
- olderThan(long) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Removes orphan files only if they are older than the given timestamp.
- olderThan(long) - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
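A hedged sketch of DeleteOrphanFiles.olderThan, assuming the SparkActions entry point from the org.apache.iceberg.spark.actions package is used to obtain the action:
import java.util.concurrent.TimeUnit;
import org.apache.iceberg.Table;
import org.apache.iceberg.actions.DeleteOrphanFiles;
import org.apache.iceberg.spark.actions.SparkActions;

static DeleteOrphanFiles.Result removeOrphans(Table table) {
  long cutoff = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(3);
  return SparkActions.get()
      .deleteOrphanFiles(table)
      .olderThan(cutoff)   // only consider files older than the cutoff
      .execute();
}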
- oldestAncestor(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Traverses the history of the table's current snapshot and finds the oldest Snapshot.
- oldestAncestorAfter(Table, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Traverses the history of the table's current snapshot and finds the first snapshot committed after the given time.
- onFailure(Tasks.FailureTask<I, ?>) - Method in class org.apache.iceberg.util.Tasks.Builder
- onlyRetryOn(Class<? extends Exception>) - Method in class org.apache.iceberg.util.Tasks.Builder
- onlyRetryOn(Class<? extends Exception>...) - Method in class org.apache.iceberg.util.Tasks.Builder
- op() - Method in class org.apache.iceberg.expressions.And
- op() - Method in interface org.apache.iceberg.expressions.Expression
-
Returns the operation for an expression node.
- op() - Method in class org.apache.iceberg.expressions.False
- op() - Method in class org.apache.iceberg.expressions.Not
- op() - Method in class org.apache.iceberg.expressions.Or
- op() - Method in class org.apache.iceberg.expressions.Predicate
- op() - Method in class org.apache.iceberg.expressions.True
- open() - Method in class org.apache.iceberg.flink.FlinkCatalog
- open() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- open() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- open() - Method in interface org.apache.iceberg.flink.TableLoader
- open(Configuration) - Method in class org.apache.iceberg.flink.source.RowDataRewriter.RewriteMap
- open(FileScanTask, InputFilesDecryptor) - Method in interface org.apache.iceberg.flink.source.FileScanTaskReader
- open(FileScanTask, InputFilesDecryptor) - Method in class org.apache.iceberg.flink.source.RowDataFileScanTaskReader
- open(FlinkInputSplit) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- open(CloseableIterable<CombinedScanTask>) - Method in class org.apache.iceberg.arrow.vectorized.ArrowReader
-
Returns a new iterator of ColumnarBatch objects.
- operation() - Method in class org.apache.iceberg.BaseOverwriteFiles
- operation() - Method in class org.apache.iceberg.BaseReplacePartitions
- operation() - Method in class org.apache.iceberg.BaseRewriteManifests
- operation() - Method in class org.apache.iceberg.events.CreateSnapshotEvent
- operation() - Method in interface org.apache.iceberg.Snapshot
-
Return the name of the data operation that produced this snapshot.
- operation() - Method in class org.apache.iceberg.SnapshotManager
- operationId(String) - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- operations() - Method in class org.apache.iceberg.BaseTable
- operations() - Method in interface org.apache.iceberg.HasTableOperations
- option(int, ValueWriter<T>) - Static method in class org.apache.iceberg.avro.ValueWriters
- option(String, String) - Method in interface org.apache.iceberg.actions.Action
-
Configures this action with an extra option.
- option(String, String) - Method in interface org.apache.iceberg.TableScan
- option(Type, int, ParquetValueReader<T>) - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- option(Type, int, ParquetValueWriter<T>) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- optional(int, String, Type) - Static method in class org.apache.iceberg.types.Types.NestedField
- optional(int, String, Type, String) - Static method in class org.apache.iceberg.types.Types.NestedField
- optional(String, DataType) - Static method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Creates an optional input parameter.
- optionalOptions() - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- options() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperationInfo
-
Returns options that the user specified when performing the row-level operation.
- options(Map<String, String>) - Method in interface org.apache.iceberg.actions.Action
-
Configures this action with extra options.
- options(Map<String, String>) - Method in class org.apache.iceberg.actions.BinPackStrategy
- options(Map<String, String>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Sets options to be used with this strategy
- options(Map<String, String>) - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Sets options to be used with this strategy
- options(Map<String, String>) - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Sets options to be used with this strategy
- options(Map<String, String>) - Method in class org.apache.iceberg.actions.SortStrategy
- options(Map<String, String>) - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- or(Expression, Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- or(R, R) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- Or - Class in org.apache.iceberg.expressions
- OR - org.apache.iceberg.expressions.Expression.Operation
- ORC - Class in org.apache.iceberg.orc
- ORC - org.apache.iceberg.FileFormat
- ORC_BATCH_SIZE - Static variable in class org.apache.iceberg.TableProperties
- ORC_BATCH_SIZE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- ORC_VECTORIZATION_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- ORC_VECTORIZATION_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- ORC.DataWriteBuilder - Class in org.apache.iceberg.orc
- ORC.DeleteWriteBuilder - Class in org.apache.iceberg.orc
- ORC.ReadBuilder - Class in org.apache.iceberg.orc
- ORC.WriteBuilder - Class in org.apache.iceberg.orc
- OrcBatchReader<T> - Interface in org.apache.iceberg.orc
-
Used for implementing ORC batch readers.
- orcBatchSize() - Method in class org.apache.iceberg.spark.SparkReadConf
- OrcMetrics - Class in org.apache.iceberg.orc
- OrcRowReader<T> - Interface in org.apache.iceberg.orc
-
Used for implementing ORC row readers.
- OrcRowWriter<T> - Interface in org.apache.iceberg.orc
-
Write data value of a schema.
- ORCSchemaUtil - Class in org.apache.iceberg.orc
-
Utilities for mapping Iceberg to ORC schemas.
- ORCSchemaUtil.BinaryType - Enum in org.apache.iceberg.orc
- ORCSchemaUtil.LongType - Enum in org.apache.iceberg.orc
- OrcSchemaVisitor<T> - Class in org.apache.iceberg.orc
-
Generic visitor of an ORC Schema.
- OrcSchemaVisitor() - Constructor for class org.apache.iceberg.orc.OrcSchemaVisitor
- OrcSchemaWithTypeVisitor<T> - Class in org.apache.iceberg.orc
- OrcSchemaWithTypeVisitor() - Constructor for class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- OrcValueReader<T> - Interface in org.apache.iceberg.orc
- OrcValueReaders - Class in org.apache.iceberg.orc
- OrcValueReaders.StructReader<T> - Class in org.apache.iceberg.orc
- OrcValueWriter<T> - Interface in org.apache.iceberg.orc
- orcVectorizationEnabled() - Method in class org.apache.iceberg.spark.SparkReadConf
- order() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- order() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- OrderContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- ORDERED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ORDERED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ORDERED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ORDERED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- orderField - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- orderField() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- orderField() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- orderField(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- OrderFieldContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- orderId() - Method in class org.apache.iceberg.SortOrder
-
Returns the ID of this sort order
- orderPreservingSortedColumns(SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
- org.apache.hadoop.hive.ql.exec.vector - package org.apache.hadoop.hive.ql.exec.vector
- org.apache.iceberg - package org.apache.iceberg
- org.apache.iceberg.actions - package org.apache.iceberg.actions
- org.apache.iceberg.aliyun - package org.apache.iceberg.aliyun
- org.apache.iceberg.aliyun.oss - package org.apache.iceberg.aliyun.oss
- org.apache.iceberg.arrow - package org.apache.iceberg.arrow
- org.apache.iceberg.arrow.vectorized - package org.apache.iceberg.arrow.vectorized
- org.apache.iceberg.arrow.vectorized.parquet - package org.apache.iceberg.arrow.vectorized.parquet
- org.apache.iceberg.avro - package org.apache.iceberg.avro
- org.apache.iceberg.aws - package org.apache.iceberg.aws
- org.apache.iceberg.aws.dynamodb - package org.apache.iceberg.aws.dynamodb
- org.apache.iceberg.aws.glue - package org.apache.iceberg.aws.glue
- org.apache.iceberg.aws.s3 - package org.apache.iceberg.aws.s3
- org.apache.iceberg.catalog - package org.apache.iceberg.catalog
- org.apache.iceberg.common - package org.apache.iceberg.common
- org.apache.iceberg.data - package org.apache.iceberg.data
- org.apache.iceberg.data.avro - package org.apache.iceberg.data.avro
- org.apache.iceberg.data.orc - package org.apache.iceberg.data.orc
- org.apache.iceberg.data.parquet - package org.apache.iceberg.data.parquet
- org.apache.iceberg.deletes - package org.apache.iceberg.deletes
- org.apache.iceberg.encryption - package org.apache.iceberg.encryption
- org.apache.iceberg.events - package org.apache.iceberg.events
- org.apache.iceberg.exceptions - package org.apache.iceberg.exceptions
- org.apache.iceberg.expressions - package org.apache.iceberg.expressions
- org.apache.iceberg.flink - package org.apache.iceberg.flink
- org.apache.iceberg.flink.actions - package org.apache.iceberg.flink.actions
- org.apache.iceberg.flink.data - package org.apache.iceberg.flink.data
- org.apache.iceberg.flink.sink - package org.apache.iceberg.flink.sink
- org.apache.iceberg.flink.source - package org.apache.iceberg.flink.source
- org.apache.iceberg.flink.source.split - package org.apache.iceberg.flink.source.split
- org.apache.iceberg.flink.util - package org.apache.iceberg.flink.util
- org.apache.iceberg.gcp - package org.apache.iceberg.gcp
- org.apache.iceberg.gcp.gcs - package org.apache.iceberg.gcp.gcs
- org.apache.iceberg.hadoop - package org.apache.iceberg.hadoop
- org.apache.iceberg.hive - package org.apache.iceberg.hive
- org.apache.iceberg.io - package org.apache.iceberg.io
- org.apache.iceberg.jdbc - package org.apache.iceberg.jdbc
- org.apache.iceberg.mapping - package org.apache.iceberg.mapping
- org.apache.iceberg.mr - package org.apache.iceberg.mr
- org.apache.iceberg.mr.hive - package org.apache.iceberg.mr.hive
- org.apache.iceberg.mr.hive.serde.objectinspector - package org.apache.iceberg.mr.hive.serde.objectinspector
- org.apache.iceberg.mr.mapred - package org.apache.iceberg.mr.mapred
- org.apache.iceberg.mr.mapreduce - package org.apache.iceberg.mr.mapreduce
- org.apache.iceberg.nessie - package org.apache.iceberg.nessie
- org.apache.iceberg.orc - package org.apache.iceberg.orc
- org.apache.iceberg.parquet - package org.apache.iceberg.parquet
- org.apache.iceberg.pig - package org.apache.iceberg.pig
- org.apache.iceberg.schema - package org.apache.iceberg.schema
- org.apache.iceberg.spark - package org.apache.iceberg.spark
- org.apache.iceberg.spark.actions - package org.apache.iceberg.spark.actions
- org.apache.iceberg.spark.data - package org.apache.iceberg.spark.data
- org.apache.iceberg.spark.data.vectorized - package org.apache.iceberg.spark.data.vectorized
- org.apache.iceberg.spark.procedures - package org.apache.iceberg.spark.procedures
- org.apache.iceberg.spark.source - package org.apache.iceberg.spark.source
- org.apache.iceberg.transforms - package org.apache.iceberg.transforms
- org.apache.iceberg.types - package org.apache.iceberg.types
- org.apache.iceberg.util - package org.apache.iceberg.util
- org.apache.spark.sql.catalyst.analysis - package org.apache.spark.sql.catalyst.analysis
- org.apache.spark.sql.catalyst.parser.extensions - package org.apache.spark.sql.catalyst.parser.extensions
- org.apache.spark.sql.connector.iceberg.catalog - package org.apache.spark.sql.connector.iceberg.catalog
- org.apache.spark.sql.connector.iceberg.write - package org.apache.spark.sql.connector.iceberg.write
- orNoop() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
If no implementation has been found, adds a NOOP method.
- orNull() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Instructs this builder to return null if no class is found, rather than throwing an Exception.
- orphanFileLocations() - Method in class org.apache.iceberg.actions.BaseDeleteOrphanFilesActionResult
- orphanFileLocations() - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles.Result
-
Returns locations of orphan files.
- OSS_ENDPOINT - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
The domain name used to access OSS.
- OSS_STAGING_DIRECTORY - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Location to put staging files for uploading to OSS, defaults to the directory value of java.io.tmpdir.
- ossEndpoint() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- OSSFileIO - Class in org.apache.iceberg.aliyun.oss
-
FileIO implementation backed by OSS.
- OSSFileIO() - Constructor for class org.apache.iceberg.aliyun.oss.OSSFileIO
-
No-arg constructor to load the FileIO dynamically.
- OSSFileIO(SerializableSupplier<OSS>) - Constructor for class org.apache.iceberg.aliyun.oss.OSSFileIO
-
Constructor with custom oss supplier and default aliyun properties.
- OSSInputFile - Class in org.apache.iceberg.aliyun.oss
- OSSInputStream - Class in org.apache.iceberg.aliyun.oss
- OSSInputStream(OSS, OSSURI) - Constructor for class org.apache.iceberg.aliyun.oss.OSSInputStream
- OSSOutputStream - Class in org.apache.iceberg.aliyun.oss
- ossStagingDirectory() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- OSSURI - Class in org.apache.iceberg.aliyun.oss
-
This class represents a fully qualified location in OSS for input/output operations, expressed as a URI.
- OSSURI(String) - Constructor for class org.apache.iceberg.aliyun.oss.OSSURI
-
Creates a new OSSURI based on the bucket and key parsed from the location. The location in string form has the syntax below, which refers to RFC 2396: [scheme:][//bucket][object key][#fragment] or [scheme:][//bucket][object key][?query][#fragment]. The characters permitted in the various components of a URI reference are specified in the Aliyun OSS documentation: Bucket: https://help.aliyun.com/document_detail/257087.html, Object: https://help.aliyun.com/document_detail/273129.html. Scheme: https or oss.
- OUTPUT_TABLES - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- OutputFile - Interface in org.apache.iceberg.io
-
An interface used to create output files using PositionOutputStream instances.
- OutputFileFactory - Class in org.apache.iceberg.io
-
Factory responsible for generating unique but recognizable data file names.
- OutputFileFactory.Builder - Class in org.apache.iceberg.io
- outputSpecId(int) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Pass a PartitionSpec id to specify which PartitionSpec should be used in DataFile rewrite
- outputTables(Configuration) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the names of the output tables stored in the configuration.
- outputType() - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- outputType() - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- outputType() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- outputType() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Returns the type of rows produced by this procedure.
- overwrite() - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- overwrite() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- overwrite() - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- overwrite() - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- overwrite() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- overwrite() - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- overwrite() - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- overwrite() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- overwrite() - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
- overwrite(boolean) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- overwrite(boolean) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- overwrite(TableMetadata, OutputFile) - Static method in class org.apache.iceberg.TableMetadataParser
- OVERWRITE - Static variable in class org.apache.iceberg.DataOperations
-
New data is added to overwrite existing data.
- OVERWRITE_MODE - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- overwriteByRowFilter(Expression) - Method in class org.apache.iceberg.BaseOverwriteFiles
- overwriteByRowFilter(Expression) - Method in interface org.apache.iceberg.OverwriteFiles
-
Delete files that match an Expression on data rows from the table.
- OverwriteFiles - Interface in org.apache.iceberg
-
API for overwriting files in a table.
- overwriteMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
P
- pack(Iterable<T>, Function<T, Long>) - Method in class org.apache.iceberg.util.BinPacking.ListPacker
- packEnd(List<T>, Function<T, Long>) - Method in class org.apache.iceberg.util.BinPacking.ListPacker
- PackingIterable(Iterable<T>, long, int, Function<T, Long>) - Constructor for class org.apache.iceberg.util.BinPacking.PackingIterable
- PackingIterable(Iterable<T>, long, int, Function<T, Long>, boolean) - Constructor for class org.apache.iceberg.util.BinPacking.PackingIterable
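A small sketch of the related ListPacker, assuming items are weighted by their own value; the sizes and target weight are illustrative.
    import java.util.Arrays;
    import java.util.List;
    import org.apache.iceberg.util.BinPacking;

    List<Long> sizes = Arrays.asList(30L, 70L, 90L, 40L, 10L);
    // target weight 128, lookback of 1 bin, do not pack the largest bin first
    BinPacking.ListPacker<Long> packer = new BinPacking.ListPacker<>(128L, 1, false);
    List<List<Long>> bins = packer.pack(sizes, Long::longValue);  // groups whose sums stay near the target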
- page - Variable in class org.apache.iceberg.parquet.BasePageIterator
- pageIterator() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- pageIterator() - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- pageIterator() - Method in class org.apache.iceberg.parquet.ColumnIterator
- pageSource - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- Pair<X,Y> - Class in org.apache.iceberg.util
- pairs(M) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- ParallelIterable<T> - Class in org.apache.iceberg.util
- ParallelIterable(Iterable<? extends Iterable<T>>, ExecutorService) - Constructor for class org.apache.iceberg.util.ParallelIterable
- parameters() - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- parameters() - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- parameters() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- parameters() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Returns the input parameters of this procedure.
- parentId() - Method in interface org.apache.iceberg.Snapshot
-
Return this snapshot's parent ID or null.
- Parquet - Class in org.apache.iceberg.parquet
- PARQUET - org.apache.iceberg.FileFormat
- PARQUET_BATCH_SIZE - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_BATCH_SIZE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_COMPRESSION_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_COMPRESSION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_DICT_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_DICT_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_PAGE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_PAGE_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_ROW_GROUP_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_ROW_GROUP_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_VECTORIZATION_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- PARQUET_VECTORIZATION_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- Parquet.DataWriteBuilder - Class in org.apache.iceberg.parquet
- Parquet.DeleteWriteBuilder - Class in org.apache.iceberg.parquet
- Parquet.ReadBuilder - Class in org.apache.iceberg.parquet
- Parquet.WriteBuilder - Class in org.apache.iceberg.parquet
- ParquetAvroReader - Class in org.apache.iceberg.parquet
- ParquetAvroReader() - Constructor for class org.apache.iceberg.parquet.ParquetAvroReader
- ParquetAvroValueReaders - Class in org.apache.iceberg.parquet
- ParquetAvroValueReaders.TimeMillisReader - Class in org.apache.iceberg.parquet
- ParquetAvroValueReaders.TimestampMillisReader - Class in org.apache.iceberg.parquet
- ParquetAvroWriter - Class in org.apache.iceberg.parquet
- parquetBatchSize() - Method in class org.apache.iceberg.spark.SparkReadConf
- ParquetDictionaryRowGroupFilter - Class in org.apache.iceberg.parquet
- ParquetDictionaryRowGroupFilter(Schema, Expression) - Constructor for class org.apache.iceberg.parquet.ParquetDictionaryRowGroupFilter
- ParquetDictionaryRowGroupFilter(Schema, Expression, boolean) - Constructor for class org.apache.iceberg.parquet.ParquetDictionaryRowGroupFilter
- ParquetIterable<T> - Class in org.apache.iceberg.parquet
- ParquetMetricsRowGroupFilter - Class in org.apache.iceberg.parquet
- ParquetMetricsRowGroupFilter(Schema, Expression) - Constructor for class org.apache.iceberg.parquet.ParquetMetricsRowGroupFilter
- ParquetMetricsRowGroupFilter(Schema, Expression, boolean) - Constructor for class org.apache.iceberg.parquet.ParquetMetricsRowGroupFilter
- ParquetReader<T> - Class in org.apache.iceberg.parquet
- ParquetReader(InputFile, Schema, ParquetReadOptions, Function<MessageType, ParquetValueReader<?>>, NameMapping, Expression, boolean, boolean) - Constructor for class org.apache.iceberg.parquet.ParquetReader
- ParquetSchemaUtil - Class in org.apache.iceberg.parquet
- ParquetSchemaUtil.HasIds - Class in org.apache.iceberg.parquet
- ParquetTypeVisitor<T> - Class in org.apache.iceberg.parquet
- ParquetTypeVisitor() - Constructor for class org.apache.iceberg.parquet.ParquetTypeVisitor
- ParquetUtil - Class in org.apache.iceberg.parquet
- ParquetValueReader<T> - Interface in org.apache.iceberg.parquet
- ParquetValueReaders - Class in org.apache.iceberg.parquet
- ParquetValueReaders.BinaryAsDecimalReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.ByteArrayReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.BytesReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.FloatAsDoubleReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.IntAsLongReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.IntegerAsDecimalReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.ListReader<E> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.LongAsDecimalReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.MapReader<K,V> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.PrimitiveReader<T> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.RepeatedKeyValueReader<M,I,K,V> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.RepeatedReader<T,I,E> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.ReusableEntry<K,V> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.StringReader - Class in org.apache.iceberg.parquet
- ParquetValueReaders.StructReader<T,I> - Class in org.apache.iceberg.parquet
- ParquetValueReaders.UnboxedReader<T> - Class in org.apache.iceberg.parquet
- ParquetValueWriter<T> - Interface in org.apache.iceberg.parquet
- ParquetValueWriters - Class in org.apache.iceberg.parquet
- ParquetValueWriters.PositionDeleteStructWriter<R> - Class in org.apache.iceberg.parquet
- ParquetValueWriters.PrimitiveWriter<T> - Class in org.apache.iceberg.parquet
- ParquetValueWriters.RepeatedKeyValueWriter<M,K,V> - Class in org.apache.iceberg.parquet
- ParquetValueWriters.RepeatedWriter<L,E> - Class in org.apache.iceberg.parquet
- ParquetValueWriters.StructWriter<S> - Class in org.apache.iceberg.parquet
- parquetVectorizationEnabled() - Method in class org.apache.iceberg.spark.SparkReadConf
- ParquetWithFlinkSchemaVisitor<T> - Class in org.apache.iceberg.flink.data
- ParquetWithFlinkSchemaVisitor() - Constructor for class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- ParquetWithSparkSchemaVisitor<T> - Class in org.apache.iceberg.spark.data
-
Visitor for traversing a Parquet type with a companion Spark type.
- ParquetWithSparkSchemaVisitor() - Constructor for class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- ParquetWriteAdapter<D> - Class in org.apache.iceberg.parquet
-
Deprecated. Use ParquetWriter instead.
- ParquetWriteAdapter(ParquetWriter<D>, MetricsConfig) - Constructor for class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- parse(String) - Static method in class org.apache.iceberg.catalog.TableIdentifier
- PARTIAL_PROGRESS_ENABLED - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
Enable committing groups of files (see max-file-group-size-bytes) prior to the entire rewrite completing.
- PARTIAL_PROGRESS_ENABLED_DEFAULT - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
- PARTIAL_PROGRESS_MAX_COMMITS - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
The maximum number of Iceberg commits that this rewrite is allowed to produce if partial progress is enabled.
- PARTIAL_PROGRESS_MAX_COMMITS_DEFAULT - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
- partition() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- partition() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupInfo
-
Returns which partition this file group contains files from.
- partition() - Method in interface org.apache.iceberg.ContentFile
-
Returns partition for this file as a StructLike.
- partition() - Method in class org.apache.iceberg.spark.SparkDataFile
- partition(StructLike) - Method in class org.apache.iceberg.PartitionKey
-
Replace this key's partition values with the partition values for the row.
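A hedged sketch of deriving a partition key from a data row; spec, schema, and row are assumed to exist, and the resulting path is illustrative.
    import org.apache.iceberg.PartitionKey;
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.StructLike;

    PartitionKey key = new PartitionKey(spec, schema);
    key.partition(row);                               // copy the row's partition values into the key
    String relativePath = spec.partitionToPath(key);  // e.g. "category=books/ts_day=2021-01-01"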
- partition(InternalRow) - Method in class org.apache.iceberg.spark.source.SparkPartitionedFanoutWriter
- partition(InternalRow) - Method in class org.apache.iceberg.spark.source.SparkPartitionedWriter
- partition(T) - Method in class org.apache.iceberg.io.PartitionedFanoutWriter
-
Create a PartitionKey from the values in row.
- partition(T) - Method in class org.apache.iceberg.io.PartitionedWriter
-
Create a PartitionKey from the values in row.
- PARTITION - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- PARTITION - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- PARTITION() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- PARTITION() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- PARTITION() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- PARTITION() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- PARTITION() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- PARTITION_COLUMN_DOC - Static variable in class org.apache.iceberg.MetadataColumns
- PARTITION_COLUMN_ID - Static variable in class org.apache.iceberg.MetadataColumns
- PARTITION_COLUMN_NAME - Static variable in class org.apache.iceberg.MetadataColumns
- PARTITION_DOC - Static variable in interface org.apache.iceberg.DataFile
- PARTITION_ID - Static variable in interface org.apache.iceberg.DataFile
- PARTITION_NAME - Static variable in interface org.apache.iceberg.DataFile
- PARTITION_SPEC - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- PARTITION_SUMMARIES - Static variable in interface org.apache.iceberg.ManifestFile
- PARTITION_SUMMARY_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- PARTITION_SUMMARY_TYPE - Static variable in interface org.apache.iceberg.ManifestFile
- partitionDF(SparkSession, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns a DataFrame with a row for each partition in the table.
- partitionDFByFilter(SparkSession, String, String) - Static method in class org.apache.iceberg.spark.SparkTableUtil
-
Returns a DataFrame with a row for each partition that matches the specified 'expression'.
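An illustrative sketch of listing partitions of a Spark session catalog table with the utilities above; the table name and filter are hypothetical.
    import org.apache.iceberg.spark.SparkTableUtil;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    Dataset<Row> partitions = SparkTableUtil.partitionDF(spark, "db.sample_table");
    Dataset<Row> recent = SparkTableUtil.partitionDFByFilter(spark, "db.sample_table", "year = 2021");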
- PartitionedFanoutWriter<T> - Class in org.apache.iceberg.io
- PartitionedFanoutWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.PartitionedFanoutWriter
- PartitionedWriter<T> - Class in org.apache.iceberg.io
- PartitionedWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.PartitionedWriter
- partitionExists(ObjectPath, CatalogPartitionSpec) - Method in class org.apache.iceberg.flink.FlinkCatalog
- PartitionField - Class in org.apache.iceberg
-
Represents a single field in a PartitionSpec.
- partitionIndex() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- partitionIndex() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupInfo
-
Returns which file group this is out of the set of file groups for this partition.
- partitioning() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- partitioning() - Method in class org.apache.iceberg.spark.source.SparkTable
- Partitioning - Class in org.apache.iceberg
- PartitioningWriter<T,R> - Interface in org.apache.iceberg.io
-
A writer capable of writing files of a single type (i.e. data or delete files) to multiple specs and partitions.
- PartitionKey - Class in org.apache.iceberg
-
A struct of partition values.
- PartitionKey(PartitionSpec, Schema) - Constructor for class org.apache.iceberg.PartitionKey
- partitionMapToExpression(StructType, Map<String, String>) - Static method in class org.apache.iceberg.spark.SparkUtil
-
Gets a list of Spark filter Expressions.
- PartitionMetrics() - Constructor for class org.apache.iceberg.ScanSummary.PartitionMetrics
- partitions() - Method in class org.apache.iceberg.GenericManifestFile
- partitions() - Method in interface org.apache.iceberg.ManifestFile
-
Returns a list of partition field summaries.
- partitions() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- PARTITIONS - org.apache.iceberg.MetadataTableType
- PartitionSet - Class in org.apache.iceberg.util
- PartitionSpec - Class in org.apache.iceberg
-
Represents how to produce partition data for a table.
- PartitionSpec.Builder - Class in org.apache.iceberg
-
Used to create valid partition specs.
- partitionSpecId() - Method in class org.apache.iceberg.GenericManifestFile
- partitionSpecId() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the ID of the PartitionSpec used to write the manifest file.
- partitionSpecId() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- PartitionSpecParser - Class in org.apache.iceberg
- PartitionSpecVisitor<T> - Interface in org.apache.iceberg.transforms
- PartitionsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's partitions as rows.
- partitionToPath(StructLike) - Method in class org.apache.iceberg.PartitionSpec
- partitionType() - Method in class org.apache.iceberg.PartitionSpec
-
Returns a Types.StructType for partition data defined by this spec.
- partitionType(Table) - Static method in class org.apache.iceberg.Partitioning
-
Builds a common partition type for all specs in a table.
- PartitionUtil - Class in org.apache.iceberg.util
- parts - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- path() - Method in interface org.apache.iceberg.ContentFile
-
Returns fully qualified path to the file, suitable for constructing a Hadoop Path.
- path() - Method in class org.apache.iceberg.deletes.PositionDelete
- path() - Method in class org.apache.iceberg.GenericManifestFile
- path() - Method in interface org.apache.iceberg.ManifestFile
-
Returns fully qualified path to the file, suitable for constructing a Hadoop Path.
- path() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- path() - Method in class org.apache.iceberg.spark.SparkDataFile
- path(String) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- path(String) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- path(String) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- path(String) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- path(String) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- PATH - Static variable in interface org.apache.iceberg.ManifestFile
- PathIdentifier - Class in org.apache.iceberg.spark
- PathIdentifier(String) - Constructor for class org.apache.iceberg.spark.PathIdentifier
- pathPosSchema() - Static method in class org.apache.iceberg.io.DeleteSchemaUtil
- PendingUpdate<T> - Interface in org.apache.iceberg
-
API for table metadata changes.
- PIG - org.apache.iceberg.mr.InputFormatConfig.InMemoryDataModel
- PIG_ICEBERG_TABLES_IMPL - Static variable in class org.apache.iceberg.pig.IcebergStorage
- PigParquetReader - Class in org.apache.iceberg.pig
- PlaintextEncryptionManager - Class in org.apache.iceberg.encryption
- PlaintextEncryptionManager() - Constructor for class org.apache.iceberg.encryption.PlaintextEncryptionManager
- planDeleteFileGroups(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Groups delete files into lists which will be processed in a single executable unit.
- planDeleteFileGroups(Iterable<FileScanTask>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Groups delete files into lists which will be processed in a single executable unit.
- planFileGroups(Iterable<FileScanTask>) - Method in class org.apache.iceberg.actions.BinPackStrategy
- planFileGroups(Iterable<FileScanTask>) - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Groups file scans into lists which will be processed in a single executable unit.
- planFileGroups(Iterable<FileScanTask>) - Method in class org.apache.iceberg.actions.SortStrategy
- planFiles() - Method in interface org.apache.iceberg.TableScan
-
Plan the files that will be read by this scan.
- planFiles(TableOperations, Snapshot, Expression, boolean, boolean, boolean) - Method in class org.apache.iceberg.AllDataFilesTable.AllDataFilesTableScan
- planFiles(TableOperations, Snapshot, Expression, boolean, boolean, boolean) - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- planFiles(TableOperations, Snapshot, Expression, boolean, boolean, boolean) - Method in class org.apache.iceberg.DataFilesTable.FilesTableScan
- planFiles(TableOperations, Snapshot, Expression, boolean, boolean, boolean) - Method in class org.apache.iceberg.DataTableScan
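A minimal sketch of planning the files for a scan via the TableScan API listed above; the column name and value in the filter are hypothetical, and the snippet assumes an enclosing method that handles IOException from the closeable iterable.
    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    try (CloseableIterable<FileScanTask> tasks = table.newScan()
            .filter(Expressions.equal("category", "books"))
            .planFiles()) {
      for (FileScanTask task : tasks) {
        System.out.println(task.file().path());  // print the data file backing each task
      }
    }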
- planIcebergSourceSplits(Table, ScanContext) - Static method in class org.apache.iceberg.flink.source.FlinkSplitPlanner
-
Returns splits for the FLIP-27 source.
- planInputPartitions(Offset, Offset) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- planTasks() - Method in interface org.apache.iceberg.TableScan
-
Plan the tasks for this scan.
- planTasks(CloseableIterable<FileScanTask>, long, int, long) - Static method in class org.apache.iceberg.util.TableScanUtil
- PLUS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- PLUS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- poolSize() - Method in class org.apache.iceberg.ClientPoolImpl
- pos() - Method in interface org.apache.iceberg.ContentFile
-
Returns the ordinal position of the file in a manifest, or null if it was not read from a manifest.
- pos() - Method in class org.apache.iceberg.deletes.PositionDelete
- pos() - Method in class org.apache.iceberg.spark.SparkDataFile
- pos(Record) - Method in class org.apache.iceberg.data.GenericDeleteFilter
- pos(T) - Method in class org.apache.iceberg.data.DeleteFilter
- posDeleteSchema(Schema) - Static method in class org.apache.iceberg.io.DeleteSchemaUtil
- position() - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- POSITION_DELETES - org.apache.iceberg.FileContent
- PositionalArgumentContext(IcebergSqlExtensionsParser.CallArgumentContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- positionDelete(OrcRowWriter<T>, Function<CharSequence, ?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- PositionDelete<R> - Class in org.apache.iceberg.deletes
- positionDeleteDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- PositionDeleteIndex - Interface in org.apache.iceberg.deletes
- positionDeleteRowSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- PositionDeleteStructWriter(ParquetValueWriters.StructWriter<?>, Function<CharSequence, ?>) - Constructor for class org.apache.iceberg.parquet.ParquetValueWriters.PositionDeleteStructWriter
- PositionDeleteWriter<T> - Class in org.apache.iceberg.deletes
- PositionDeleteWriter(FileAppender<StructLike>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata) - Constructor for class org.apache.iceberg.deletes.PositionDeleteWriter
- PositionDeltaWriter<T> - Interface in org.apache.iceberg.io
-
A writer capable of writing data and position deletes that may belong to different specs and partitions.
- PositionOutputStream - Class in org.apache.iceberg.io
- PositionOutputStream() - Constructor for class org.apache.iceberg.io.PositionOutputStream
- positions() - Static method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- positionsWithSetArrowValidityVector() - Static method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- PositionVectorHolder(FieldVector, Type, NullabilityHolder) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.PositionVectorHolder
- precision() - Method in class org.apache.iceberg.types.Types.DecimalType
- preCreateTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- predicate(BoundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- predicate(BoundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- predicate(BoundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- predicate(Expression.Operation, String) - Static method in class org.apache.iceberg.expressions.Expressions
- predicate(Expression.Operation, String, Iterable<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- predicate(Expression.Operation, String, Literal<T>) - Static method in class org.apache.iceberg.expressions.Expressions
- predicate(Expression.Operation, String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- predicate(UnboundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- predicate(UnboundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- predicate(UnboundPredicate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- Predicate<T,C extends Term> - Class in org.apache.iceberg.expressions
- preDropTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- preferLocality() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
-
If this API is called, input splits will be generated with preferred host locations.
- prepare(ManifestEntry<F>) - Method in class org.apache.iceberg.ManifestWriter
- prepareToRead(RecordReader, PigSplit) - Method in class org.apache.iceberg.pig.IcebergStorage
- preservesOrder() - Method in interface org.apache.iceberg.transforms.Transform
-
Whether the transform preserves the order of values (is monotonic).
- PREVIOUS_METADATA_LOCATION_PROP - Static variable in class org.apache.iceberg.BaseMetastoreTableOperations
- previousFiles() - Method in class org.apache.iceberg.TableMetadata
- primitive(Schema) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- primitive(Schema) - Method in class org.apache.iceberg.avro.RemoveIds
- primitive(LogicalType, PrimitiveType) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.CheckCompatibility
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.FixupTypes
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.IndexByName
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.IndexParents
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- primitive(Type.PrimitiveType) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- primitive(Type.PrimitiveType, Integer) - Method in class org.apache.iceberg.schema.UnionByNameVisitor
- primitive(Type.PrimitiveType, Schema) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- primitive(Type.PrimitiveType, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- primitive(Type.PrimitiveType, PrimitiveType) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedReaderBuilder
- primitive(Type.PrimitiveType, PrimitiveType) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- primitive(Type.PrimitiveType, Type.Repetition, int, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- primitive(Type.PrimitiveType, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- primitive(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- primitive(PrimitiveType) - Method in class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- primitive(PrimitiveType) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- primitive(PrimitiveType) - Method in class org.apache.iceberg.parquet.RemoveIds
- primitive(DataType, PrimitiveType) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- primitive(P, Schema) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- PrimitiveReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- PrimitiveType() - Constructor for class org.apache.iceberg.types.Type.PrimitiveType
- PrimitiveWriter(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- Procedure - Interface in org.apache.spark.sql.connector.iceberg.catalog
-
An interface representing a stored procedure available for execution.
- ProcedureCatalog - Interface in org.apache.spark.sql.connector.iceberg.catalog
-
A catalog API for working with stored procedures.
- ProcedureParameter - Interface in org.apache.spark.sql.connector.iceberg.catalog
-
An input parameter of a stored procedure.
- processElement(StreamRecord<FlinkInputSplit>) - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- processWatermark(Watermark) - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- producesDictionaryEncodedVector() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- producesDictionaryEncodedVector() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- project(String, BoundPredicate<S>) - Method in interface org.apache.iceberg.transforms.Transform
-
Transforms a predicate to an inclusive predicate on the partition values produced by Transform.apply(Object).
- project(String, BoundPredicate<S>) - Method in class org.apache.iceberg.transforms.UnknownTransform
- project(TableSchema) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- project(Expression) - Method in class org.apache.iceberg.expressions.Projections.ProjectionEvaluator
-
Project the given row expression to a partition expression.
- project(Schema) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- project(Schema) - Method in class org.apache.iceberg.ManifestReader
- project(Schema) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- project(Schema) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- project(Schema) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- project(Schema) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this with the schema as its projection.
- project(Schema, List<String>) - Static method in class org.apache.iceberg.pig.SchemaUtil
- project(Schema, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
-
Project extracts particular fields from a schema by ID.
- project(Types.StructType, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
- projectId() - Method in class org.apache.iceberg.gcp.GCPProperties
- projection() - Method in class org.apache.iceberg.events.IncrementalScanEvent
- projection() - Method in class org.apache.iceberg.events.ScanEvent
- ProjectionDatumReader<D> - Class in org.apache.iceberg.avro
- ProjectionDatumReader(Function<Schema, DatumReader<?>>, Schema, Map<String, String>, NameMapping) - Constructor for class org.apache.iceberg.avro.ProjectionDatumReader
- ProjectionEvaluator() - Constructor for class org.apache.iceberg.expressions.Projections.ProjectionEvaluator
- Projections - Class in org.apache.iceberg.expressions
-
Utils to project expressions on rows to expressions on partitions.
- Projections.ProjectionEvaluator - Class in org.apache.iceberg.expressions
-
A class that projects expressions for a table's data rows into expressions on the table's partition values, for a table's partition spec.
- projectStrict(String, BoundPredicate<S>) - Method in interface org.apache.iceberg.transforms.Transform
-
Transforms a predicate to a strict predicate on the partition values produced by Transform.apply(Object).
- projectStrict(String, BoundPredicate<S>) - Method in class org.apache.iceberg.transforms.UnknownTransform
- properties() - Method in class org.apache.iceberg.BaseTable
- properties() - Method in class org.apache.iceberg.SerializableTable
- properties() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- properties() - Method in class org.apache.iceberg.spark.source.SparkTable
- properties() - Method in interface org.apache.iceberg.Table
-
Return a map of string properties for this table.
- properties() - Method in class org.apache.iceberg.TableMetadata
- properties(Map<String, String>) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- property(String, String) - Method in class org.apache.iceberg.TableMetadata
- PROPERTY_PREFIX - Static variable in class org.apache.iceberg.jdbc.JdbcCatalog
- PROPERTY_VERSION - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- propertyAsBoolean(String, boolean) - Method in class org.apache.iceberg.TableMetadata
- propertyAsBoolean(Map<String, String>, String, boolean) - Static method in class org.apache.iceberg.util.PropertyUtil
- propertyAsDouble(Map<String, String>, String, double) - Static method in class org.apache.iceberg.util.PropertyUtil
- propertyAsInt(String, int) - Method in class org.apache.iceberg.TableMetadata
- propertyAsInt(Map<String, String>, String, int) - Static method in class org.apache.iceberg.util.PropertyUtil
- propertyAsLong(String, long) - Method in class org.apache.iceberg.TableMetadata
- propertyAsLong(Map<String, String>, String, long) - Static method in class org.apache.iceberg.util.PropertyUtil
- propertyAsString(Map<String, String>, String, String) - Static method in class org.apache.iceberg.util.PropertyUtil
- PropertyUtil - Class in org.apache.iceberg.util
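A brief sketch of reading a typed table property with a default via PropertyUtil; the fallback value shown is illustrative, and table is assumed to be a loaded Iceberg table.
    import org.apache.iceberg.TableProperties;
    import org.apache.iceberg.util.PropertyUtil;

    long rowGroupSize = PropertyUtil.propertyAsLong(
        table.properties(),
        TableProperties.PARQUET_ROW_GROUP_SIZE_BYTES,
        134217728L);  // illustrative 128 MB fallback when the property is unset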
- prune(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Prune columns from a Schema using a Spark type projection.
- prune(Schema, StructType, List<Expression>) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Prune columns from a Schema using a Spark type projection.
- prune(Schema, StructType, Expression, boolean) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Prune columns from a Schema using a Spark type projection.
- pruneColumns(Schema, Set<Integer>, NameMapping) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- pruneColumns(MessageType, Schema) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- pruneColumns(StructType) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- pruneColumnsFallback(MessageType, Schema) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Prunes columns from a Parquet file schema that was written without field ids.
- PruneColumnsWithoutReordering - Class in org.apache.iceberg.spark
- PruneColumnsWithReordering - Class in org.apache.iceberg.spark
- PUBLISHED_WAP_ID_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- publishedWapId(Snapshot) - Static method in class org.apache.iceberg.util.WapUtil
- pushedFilters() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- pushFilters(Filter[]) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- pushProjection(LoadPushDown.RequiredFieldList) - Method in class org.apache.iceberg.pig.IcebergStorage
- put(int, Object) - Method in class org.apache.iceberg.GenericManifestFile
- put(int, Object) - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- put(int, Object) - Method in class org.apache.iceberg.util.Pair
- put(K, V) - Method in class org.apache.iceberg.util.SerializableMap
- put(StructLike, T) - Method in class org.apache.iceberg.util.StructLikeMap
- putAll(Map<? extends K, ? extends V>) - Method in class org.apache.iceberg.util.SerializableMap
Q
- quotedIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- quotedIdentifier() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- QuotedIdentifierAlternativeContext(IcebergSqlExtensionsParser.IdentifierContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- QuotedIdentifierContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
R
- range(int) - Static method in class org.apache.iceberg.util.Tasks
- RANGE - org.apache.iceberg.DistributionMode
- ReachableFileUtil - Class in org.apache.iceberg
- reachedEnd() - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- read() - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- read(byte[]) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ByteArrayReader
- read(byte[], int, int) - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- read(D, Decoder) - Method in class org.apache.iceberg.avro.ProjectionDatumReader
- read(Double) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.FloatAsDoubleReader
- read(Long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.IntAsLongReader
- read(String) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StringReader
- read(BigDecimal) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.BinaryAsDecimalReader
- read(BigDecimal) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.IntegerAsDecimalReader
- read(BigDecimal) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.LongAsDecimalReader
- read(ByteBuffer) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.BytesReader
- read(M) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- read(Decoder, Object) - Method in interface org.apache.iceberg.avro.ValueReader
- read(Decoder, Object) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- read(RowData, Decoder) - Method in class org.apache.iceberg.flink.data.FlinkAvroReader
- read(VectorHolder, int) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- read(VectorHolder, int) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- read(FileIO, String) - Static method in class org.apache.iceberg.TableMetadataParser
- read(FileIO, InputFile) - Static method in class org.apache.iceberg.TableMetadataParser
- read(InputFile) - Static method in class org.apache.iceberg.avro.Avro
- read(InputFile) - Static method in class org.apache.iceberg.orc.ORC
- read(InputFile) - Static method in class org.apache.iceberg.parquet.Parquet
- read(ManifestFile, FileIO) - Static method in class org.apache.iceberg.ManifestFiles
-
Returns a new ManifestReader for a ManifestFile.
- read(ManifestFile, FileIO, Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.ManifestFiles
-
Returns a new ManifestReader for a ManifestFile.
- read(Table) - Static method in class org.apache.iceberg.data.IcebergGenerics
-
Returns a builder to configure a read of the given table that produces generic records.
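A minimal sketch of the generic-record read path described above; the filter column is hypothetical, and IOException handling is assumed to happen in the enclosing method.
    import org.apache.iceberg.data.IcebergGenerics;
    import org.apache.iceberg.data.Record;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    try (CloseableIterable<Record> records = IcebergGenerics.read(table)
            .where(Expressions.greaterThan("id", 100))
            .build()) {
      records.forEach(System.out::println);  // iterate generic records matching the filter
    }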
- read(ColumnVector, int) - Method in interface org.apache.iceberg.orc.OrcValueReader
- read(VectorizedRowBatch) - Method in interface org.apache.iceberg.orc.OrcBatchReader
-
Reads a row batch.
- read(VectorizedRowBatch, int) - Method in class org.apache.iceberg.data.orc.GenericOrcReader
- read(VectorizedRowBatch, int) - Method in class org.apache.iceberg.flink.data.FlinkOrcReader
- read(VectorizedRowBatch, int) - Method in interface org.apache.iceberg.orc.OrcRowReader
-
Reads a row.
- read(VectorizedRowBatch, int) - Method in class org.apache.iceberg.spark.data.SparkOrcReader
- read(InternalRow, Decoder) - Method in class org.apache.iceberg.spark.data.SparkAvroReader
- read(ColumnarBatch, int) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- read(T) - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- read(T) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- read(T) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- read(T) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- read(T, int) - Method in interface org.apache.iceberg.parquet.VectorizedReader
-
Reads a batch of type <T> with numRows rows.
- read(T, Decoder) - Method in class org.apache.iceberg.data.avro.DataReader
- READ_SCHEMA - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- readBinary() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readBoolean() - Method in class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- readBoolean() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readBoolean() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- readBooleanAsInt() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
-
Returns 1 if true, 0 otherwise.
- readCompatibilityErrors(Schema, Schema) - Static method in class org.apache.iceberg.types.CheckCompatibility
-
Returns a list of compatibility errors for reading with the given read schema.
- readDeleteManifest(ManifestFile, FileIO, Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.ManifestFiles
-
Returns a new ManifestReader for a ManifestFile.
- readDictionary(ColumnDescriptor, PageReader) - Static method in class org.apache.iceberg.parquet.ParquetUtil
- readDouble() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.FloatAsDoubleReader
- readDouble() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readDouble() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- reader(int) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- reader(int) - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
- readers - Variable in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- readFields(DataInput) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- readFields(DataInput) - Method in class org.apache.iceberg.mr.mapred.Container
- readFields(DataInput) - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- readFloat() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readFloat() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- readFrom(String) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- readFrom(TableIdentifier) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- readInteger() - Method in class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- readInteger() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readInteger() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- readLong() - Method in class org.apache.iceberg.parquet.ParquetAvroValueReaders.TimeMillisReader
- readLong() - Method in class org.apache.iceberg.parquet.ParquetAvroValueReaders.TimestampMillisReader
- readLong() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.IntAsLongReader
- readLong() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- readLong() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- readPaths(ManifestFile, FileIO) - Static method in class org.apache.iceberg.ManifestFiles
-
Returns a CloseableIterable of file paths in the ManifestFile.
- readSchema(Configuration) - Static method in class org.apache.iceberg.mr.InputFormatConfig
- readSupport(ReadSupport<?>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- readValueDictionaryId() - Method in class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- reassignIds(Schema, Schema) - Static method in class org.apache.iceberg.types.TypeUtil
-
Reassigns ids in a schema from another schema.
- rebuildCreateProperties(Map<String, String>) - Static method in class org.apache.iceberg.spark.Spark3Util
- reconnect(C) - Method in class org.apache.iceberg.ClientPoolImpl
- reconnect(IMetaStoreClient) - Method in class org.apache.iceberg.hive.HiveClientPool
- record(List<ValueReader<?>>, Class<R>, Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- record(List<ValueReader<?>>, Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- record(List<ValueWriter<?>>) - Static method in class org.apache.iceberg.avro.ValueWriters
- record(Schema, List<String>, List<Schema>) - Method in class org.apache.iceberg.avro.RemoveIds
- record(Schema, List<String>, List<T>) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- record(Types.StructType, Schema, List<String>, List<T>) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- record(Types.StructType, TypeDescription, List<String>, List<T>) - Method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- record(TypeDescription, List<String>, List<T>) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- record(P, Schema, List<String>, List<T>) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- Record - Interface in org.apache.iceberg.data
- RECORD_COUNT - Static variable in interface org.apache.iceberg.DataFile
- recordCount() - Method in interface org.apache.iceberg.ContentFile
-
Returns the number of top-level records in the file.
- recordCount() - Method in class org.apache.iceberg.Metrics
-
Get the number of records (rows) in file.
- recordCount() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- recordCount() - Method in class org.apache.iceberg.spark.SparkDataFile
- recordOffset() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- recordsPerBatch(int) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- recordsPerBatch(int) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- ref() - Method in interface org.apache.iceberg.expressions.Bound
-
Returns the underlying reference.
- ref() - Method in class org.apache.iceberg.expressions.BoundPredicate
- ref() - Method in class org.apache.iceberg.expressions.BoundReference
- ref() - Method in class org.apache.iceberg.expressions.BoundTransform
- ref() - Method in class org.apache.iceberg.expressions.NamedReference
- ref() - Method in interface org.apache.iceberg.expressions.Unbound
-
Returns this expression's underlying reference.
- ref() - Method in class org.apache.iceberg.expressions.UnboundPredicate
- ref() - Method in class org.apache.iceberg.expressions.UnboundTransform
- ref(String) - Static method in class org.apache.iceberg.expressions.Expressions
-
Constructs a reference for a given column.
- Reference<T> - Interface in org.apache.iceberg.expressions
-
Represents a variable reference in an expression.
- referencedDataFiles() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- referencedDataFiles() - Method in class org.apache.iceberg.io.DeleteWriteResult
- referencedDataFiles() - Method in class org.apache.iceberg.io.WriteResult
- referencesDataFiles() - Method in class org.apache.iceberg.io.DeleteWriteResult
- refresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- refresh() - Method in class org.apache.iceberg.BaseTable
- refresh() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- refresh() - Method in class org.apache.iceberg.nessie.NessieCatalog
- refresh() - Method in class org.apache.iceberg.SerializableTable
- refresh() - Method in class org.apache.iceberg.StaticTableOperations
-
StaticTableOperations works on the same version of TableMetadata, and it will never refer to a different TableMetadata object than the one it was created with.
- refresh() - Method in interface org.apache.iceberg.Table
-
Refresh the current table metadata.
- refresh() - Method in interface org.apache.iceberg.TableOperations
-
Return the current table metadata after checking for updates.
- refreshFromMetadataLocation(String) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- refreshFromMetadataLocation(String, int) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- refreshFromMetadataLocation(String, Predicate<Exception>, int) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- refreshFromMetadataLocation(String, Predicate<Exception>, int) - Method in class org.apache.iceberg.nessie.NessieTableOperations
- refreshFromMetadataLocation(String, Predicate<Exception>, int, Function<String, TableMetadata>) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- refreshIdentifierFields(Types.StructType, Schema) - Static method in class org.apache.iceberg.types.TypeUtil
-
Get the identifier fields in the fresh schema based on the identifier fields in the base schema.
- register(Listener<E>, Class<E>) - Static method in class org.apache.iceberg.events.Listeners
- registerBucketUDF(SparkSession, String, DataType, int) - Static method in class org.apache.iceberg.spark.IcebergSpark
- registerTable(TableIdentifier, String) - Method in interface org.apache.iceberg.catalog.Catalog
-
Register a table with the catalog if it does not exist.
- registerTable(TableIdentifier, String) - Method in class org.apache.iceberg.hive.HiveCatalog
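A hedged sketch of registering an existing table's metadata file with a catalog; the identifier and metadata location are hypothetical, and catalog is assumed to be an initialized Catalog implementation.
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;

    TableIdentifier id = TableIdentifier.of("db", "existing_table");
    String metadataLocation = "s3://bucket/warehouse/db/existing_table/metadata/v3.metadata.json";
    Table registered = catalog.registerTable(id, metadataLocation);  // no-op data copy, metadata only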
- registerTruncateUDF(SparkSession, String, DataType, int) - Static method in class org.apache.iceberg.spark.IcebergSpark
- relativeToAbsolutePath(String, Path) - Method in class org.apache.iceberg.pig.IcebergStorage
- release(String, String) - Method in interface org.apache.iceberg.LockManager
-
Release a lock; an exception must not be thrown by this method.
- remove(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- remove(Object) - Method in class org.apache.iceberg.util.CharSequenceSet
- remove(Object) - Method in class org.apache.iceberg.util.PartitionSet
- remove(Object) - Method in class org.apache.iceberg.util.SerializableMap
- remove(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- remove(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- remove(String) - Method in interface org.apache.iceberg.UpdateProperties
-
Remove the given property key from the table.
- removeAll(Collection<?>) - Method in class org.apache.iceberg.util.CharSequenceSet
- removeAll(Collection<?>) - Method in class org.apache.iceberg.util.PartitionSet
- removeAll(Collection<?>) - Method in class org.apache.iceberg.util.StructLikeSet
- removed() - Method in class org.apache.iceberg.MetadataUpdate.RemoveProperties
- REMOVED_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- REMOVED_EQ_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- REMOVED_FILE_SIZE_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- REMOVED_POS_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- removeField(String) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Remove a partition field by name.
- removeField(Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Remove a partition field by its transform expression term.
- removeIds(Schema) - Static method in class org.apache.iceberg.avro.RemoveIds
- removeIds(MessageType) - Static method in class org.apache.iceberg.parquet.RemoveIds
- RemoveIds - Class in org.apache.iceberg.avro
- RemoveIds - Class in org.apache.iceberg.parquet
- RemoveIds() - Constructor for class org.apache.iceberg.avro.RemoveIds
- RemoveIds() - Constructor for class org.apache.iceberg.parquet.RemoveIds
- RemoveOrphanFilesProcedure - Class in org.apache.iceberg.spark.procedures
-
A procedure that removes orphan files in a table.
- removeProperties(Set<String>) - Method in class org.apache.iceberg.TableMetadata.Builder
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- removeProperties(Namespace, Set<String>) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Remove a set of property keys from a namespace in the catalog.
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- removeProperties(Namespace, Set<String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
- RemoveProperties(Set<String>) - Constructor for class org.apache.iceberg.MetadataUpdate.RemoveProperties
- RemoveSnapshot(long) - Constructor for class org.apache.iceberg.MetadataUpdate.RemoveSnapshot
- removeSnapshots(List<Snapshot>) - Method in class org.apache.iceberg.TableMetadata.Builder
- removeSnapshotsIf(Predicate<Snapshot>) - Method in class org.apache.iceberg.TableMetadata
- removeTasks(Table, String) - Method in class org.apache.iceberg.spark.FileScanTaskSetManager
- rename(String, String) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- renameColumn(String, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Rename a column in the schema.
- renameField(String, String) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Rename a field in the partition spec.
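A short sketch combining the rename/remove update APIs listed above (UpdateSchema.renameColumn, UpdatePartitionSpec.removeField, UpdateProperties.remove); the column, partition field, and property names are hypothetical.
    // Rename a column in the table schema.
    table.updateSchema()
        .renameColumn("data", "payload")
        .commit();

    // Drop a partition field by name.
    table.updateSpec()
        .removeField("category_bucket")
        .commit();

    // Remove a table property.
    table.updateProperties()
        .remove("custom.example-property")
        .commit();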
- renameTable(ObjectPath, String, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
-
Renaming a table in Glue is implemented as a drop table followed by a create table.
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.CachingCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Rename a table.
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- renameTable(TableIdentifier, TableIdentifier) - Method in class org.apache.iceberg.nessie.NessieCatalog
- renameTable(Identifier, Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
- renameTable(Identifier, Identifier) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- RepeatedKeyValueReader(int, int, ParquetValueReader<K>, ParquetValueReader<V>) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- RepeatedKeyValueWriter(int, int, ParquetValueWriter<K>, ParquetValueWriter<V>) - Constructor for class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- RepeatedReader(int, int, ParquetValueReader<E>) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- RepeatedWriter(int, int, ParquetValueWriter<E>) - Constructor for class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- repetitionLevels - Variable in class org.apache.iceberg.parquet.BasePageIterator
- REPLACE - Static variable in class org.apache.iceberg.DataOperations
-
Files are removed and replaced, without changing the data in the table.
- REPLACE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- REPLACE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- REPLACE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- REPLACE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- REPLACE_PARTITIONS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- replaceCurrentSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata
- ReplacePartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- ReplacePartitions - Interface in org.apache.iceberg
-
Not recommended: API for overwriting files in a table by partition.
- replaceProperties(Map<String, String>) - Method in class org.apache.iceberg.TableMetadata
- replaceSortOrder() - Method in class org.apache.iceberg.BaseTable
- replaceSortOrder() - Method in class org.apache.iceberg.SerializableTable
- replaceSortOrder() - Method in interface org.apache.iceberg.Table
-
Create a new ReplaceSortOrder to set the table sort order and commit the change.
- replaceSortOrder() - Method in interface org.apache.iceberg.Transaction
-
Create a new ReplaceSortOrder to set a table sort order and commit the change.
- replaceSortOrder(SortOrder) - Method in class org.apache.iceberg.TableMetadata
- ReplaceSortOrder - Interface in org.apache.iceberg
-
API for replacing table sort order with a newly created order.
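A minimal sketch of replacing a table's sort order through Table.replaceSortOrder (the table variable and column names are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Build a new sort order and commit it as the table default.
    table.replaceSortOrder()
        .asc("event_ts")
        .desc("id")
        .commit();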
- replaceTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- replaceTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- replaceTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to replace the table.
- requestRefresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- requireColumn(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Update a column to required.
- required() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Returns true if this parameter is required.
- required(int, String, Type) - Static method in class org.apache.iceberg.types.Types.NestedField
- required(int, String, Type, String) - Static method in class org.apache.iceberg.types.Types.NestedField
- required(String, DataType) - Static method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Creates a required input parameter.
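A minimal sketch of declaring a required procedure parameter; the parameter name is hypothetical and DataTypes comes from Spark SQL (org.apache.spark.sql.types.DataTypes):

    // Declare a required string-typed parameter for a stored procedure.
    ProcedureParameter tableParam =
        ProcedureParameter.required("table", DataTypes.StringType);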
- requiredContext() - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- requiredMetadataAttributes() - Method in interface org.apache.spark.sql.connector.iceberg.write.RowLevelOperation
-
Returns metadata attributes that are required to perform this row-level operation.
- requiredOptions() - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- requiredSchema() - Method in class org.apache.iceberg.data.DeleteFilter
- RESERVED_PROPERTIES - Static variable in class org.apache.iceberg.TableProperties
-
Reserved Iceberg table properties list.
- reset() - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- reset() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- reset() - Method in class org.apache.iceberg.parquet.BasePageIterator
- residual() - Method in interface org.apache.iceberg.FileScanTask
-
Returns the residual expression that should be applied to rows in this file scan.
- ResidualEvaluator - Class in org.apache.iceberg.expressions
-
Finds the residuals for an Expression for the partitions in the given PartitionSpec.
- residualFor(StructLike) - Method in class org.apache.iceberg.expressions.ResidualEvaluator
-
Returns a residual expression for the given partition values.
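A minimal sketch of residual evaluation; the spec, filter, and partitionData values are hypothetical and assumed to exist:

    // assumes: PartitionSpec spec; StructLike partitionData
    // Build an evaluator for a row filter, then reduce it to the residual expression
    // that still needs to be applied to rows of one partition.
    ResidualEvaluator residuals =
        ResidualEvaluator.of(spec, Expressions.greaterThan("ts", 0L), true /* case sensitive */);
    Expression residual = residuals.residualFor(partitionData);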
- resolveAndRead(Decoder, Schema, Schema, ValueReader<T>, T) - Static method in class org.apache.iceberg.data.avro.DecoderResolver
- ResolvingFileIO - Class in org.apache.iceberg.io
-
FileIO implementation that uses the location scheme to choose the correct FileIO implementation.
- ResolvingFileIO() - Constructor for class org.apache.iceberg.io.ResolvingFileIO
-
No-arg constructor to load the FileIO dynamically.
- result() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- result() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- result() - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- result() - Method in class org.apache.iceberg.io.DataWriter
- result() - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Returns a result that contains information about written DataFiles or DeleteFiles.
- result() - Method in interface org.apache.iceberg.io.FileWriter
-
Returns a result that contains information about written DataFiles or DeleteFiles.
- result() - Method in interface org.apache.iceberg.io.PartitioningWriter
-
Returns a result that contains information about written DataFiles or DeleteFiles.
- result() - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Returns a result that contains information about written DataFiles or DeleteFiles.
- results() - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
-
Returns all file groups that have been committed.
- retainAll(Collection<?>) - Method in class org.apache.iceberg.util.CharSequenceSet
- retainAll(Collection<?>) - Method in class org.apache.iceberg.util.PartitionSet
- retainAll(Collection<?>) - Method in class org.apache.iceberg.util.StructLikeSet
- retainLast(int) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Retains the most recent ancestors of the current snapshot.
- retainLast(int) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Retains the most recent ancestors of the current snapshot.
- retainLast(int) - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
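A minimal sketch of snapshot expiration using the table-level ExpireSnapshots API described above (the table variable and retention values are hypothetical; TimeUnit is java.util.concurrent.TimeUnit):

    // assumes: org.apache.iceberg.Table table
    // Expire snapshots older than 7 days but always keep the last 10 ancestors.
    long cutoff = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(7);
    table.expireSnapshots()
        .expireOlderThan(cutoff)
        .retainLast(10)
        .commit();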
- retry(int) - Method in class org.apache.iceberg.util.Tasks.Builder
- ReusableEntry() - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.ReusableEntry
- REUSE_CONTAINERS - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- reuseContainers() - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- reuseContainers() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- reuseContainers() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- reuseContainers(boolean) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- reuseContainers(boolean) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- reuseOrCreate(Object) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- revertWith(Tasks.Task<I, ?>) - Method in class org.apache.iceberg.util.Tasks.Builder
- REWRITE_ALL - Static variable in class org.apache.iceberg.actions.SortStrategy
-
Rewrites all files, regardless of their size.
- REWRITE_ALL_DEFAULT - Static variable in class org.apache.iceberg.actions.SortStrategy
- rewriteDataFiles() - Method in class org.apache.iceberg.flink.actions.Actions
- rewriteDataFiles(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to rewrite data files.
- rewriteDataFiles(Table) - Method in class org.apache.iceberg.spark.actions.SparkActions
- RewriteDataFiles - Interface in org.apache.iceberg.actions
-
An action for rewriting data files according to a rewrite strategy.
- RewriteDataFiles.FileGroupInfo - Interface in org.apache.iceberg.actions
-
A description of a file group, when it was processed, and within which partition.
- RewriteDataFiles.FileGroupRewriteResult - Interface in org.apache.iceberg.actions
-
For a particular file group, the number of files which are newly created and the number of files which were formerly part of the table but have been rewritten.
- RewriteDataFiles.Result - Interface in org.apache.iceberg.actions
-
A map of file group information to the results of rewriting that file group.
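A minimal sketch of running the Spark implementation of this action (the table variable and partition filter are hypothetical, and an active Spark session is assumed):

    // assumes: org.apache.iceberg.Table table
    // Compact the data files of one partition and capture the per-group results.
    RewriteDataFiles.Result result = SparkActions.get()
        .rewriteDataFiles(table)
        .filter(Expressions.equal("date", "2021-06-01"))
        .execute();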
- RewriteDataFilesAction - Class in org.apache.iceberg.flink.actions
- RewriteDataFilesAction(StreamExecutionEnvironment, Table) - Constructor for class org.apache.iceberg.flink.actions.RewriteDataFilesAction
- RewriteDataFilesActionResult - Class in org.apache.iceberg.actions
- RewriteDataFilesActionResult(List<DataFile>, List<DataFile>) - Constructor for class org.apache.iceberg.actions.RewriteDataFilesActionResult
- RewriteDataFilesCommitManager - Class in org.apache.iceberg.actions
-
Functionality used by RewriteDataFiles actions from different platforms to handle commits.
- RewriteDataFilesCommitManager(Table) - Constructor for class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- RewriteDataFilesCommitManager(Table, long) - Constructor for class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- RewriteDataFilesCommitManager(Table, long, boolean) - Constructor for class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- RewriteDataFilesCommitManager.CommitService - Class in org.apache.iceberg.actions
- rewriteDataForTasks(List<CombinedScanTask>) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- rewriteDataForTasks(List<CombinedScanTask>) - Method in class org.apache.iceberg.flink.actions.RewriteDataFilesAction
- rewriteDataForTasks(DataStream<CombinedScanTask>, int) - Method in class org.apache.iceberg.flink.source.RowDataRewriter
- rewriteDataForTasks(JavaRDD<CombinedScanTask>) - Method in class org.apache.iceberg.spark.source.RowDataRewriter
- rewriteDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Define how to rewrite the deletes.
- RewriteFileGroup - Class in org.apache.iceberg.actions
-
Container class representing a set of files to be rewritten by a RewriteAction and the new files which have been written by the action.
- RewriteFileGroup(RewriteDataFiles.FileGroupInfo, List<FileScanTask>) - Constructor for class org.apache.iceberg.actions.RewriteFileGroup
- rewriteFiles(List<FileScanTask>) - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Method which will rewrite files based on this particular RewriteStrategy's algorithm.
- rewriteFiles(List<FileScanTask>) - Method in class org.apache.iceberg.spark.actions.Spark3BinPackStrategy
- rewriteFiles(List<FileScanTask>) - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- rewriteFiles(Set<DataFile>, Set<DataFile>) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a rewrite that replaces one set of data files with another set that contains the same data.
- rewriteFiles(Set<DataFile>, Set<DataFile>, long) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a rewrite that replaces one set of data files with another set that contains the same data.
- rewriteFiles(Set<DataFile>, Set<DeleteFile>, Set<DataFile>, Set<DeleteFile>) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a rewrite that replaces one set of files with another set that contains the same data.
- RewriteFiles - Interface in org.apache.iceberg
-
API for replacing files in a table.
- rewriteIf(Predicate<ManifestFile>) - Method in interface org.apache.iceberg.actions.RewriteManifests
-
Rewrites only manifests that match the given predicate.
- rewriteIf(Predicate<ManifestFile>) - Method in class org.apache.iceberg.BaseRewriteManifests
- rewriteIf(Predicate<ManifestFile>) - Method in interface org.apache.iceberg.RewriteManifests
-
Determines which existing ManifestFile for the table should be rewritten.
- rewriteIf(Predicate<ManifestFile>) - Method in class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- rewriteManifests() - Method in class org.apache.iceberg.BaseTable
- rewriteManifests() - Method in class org.apache.iceberg.SerializableTable
- rewriteManifests() - Method in interface org.apache.iceberg.Table
-
Create a new rewrite manifests API to replace manifests for this table and commit.
- rewriteManifests() - Method in interface org.apache.iceberg.Transaction
-
Create a new rewrite manifests API to replace manifests for this table.
- rewriteManifests(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to rewrite manifests.
- RewriteManifests - Interface in org.apache.iceberg.actions
-
An action that rewrites manifests.
- RewriteManifests - Interface in org.apache.iceberg
-
API for rewriting manifests for a table.
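A minimal sketch of the table-level rewrite manifests API (the table variable and size threshold are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Cluster small manifests together by rewriting only those under 8 MB.
    table.rewriteManifests()
        .rewriteIf(manifest -> manifest.length() < 8L * 1024 * 1024)
        .commit();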
- RewriteManifests.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- RewriteMap(Schema, String, FileIO, boolean, EncryptionManager, TaskWriterFactory<RowData>) - Constructor for class org.apache.iceberg.flink.source.RowDataRewriter.RewriteMap
- rewriteNot(Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- RewritePositionDeleteFiles - Interface in org.apache.iceberg.actions
-
An action for rewriting position delete files.
- RewritePositionDeleteFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- RewritePositionDeleteStrategy - Interface in org.apache.iceberg.actions
-
A strategy for an action to rewrite position delete files.
- rewriteResults() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesResult
- rewriteResults() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.Result
- RewriteStrategy - Interface in org.apache.iceberg.actions
- REWRITTEN_FILE_SCAN_TASK_SET_ID - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- rewrittenDataFilesCount() - Method in class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- rewrittenDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupRewriteResult
- rewrittenDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.Result
- rewrittenDeleteFilesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.Result
-
Returns the count of the position deletes that have been rewritten.
- rewrittenFiles() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- rewrittenFileSetId() - Method in class org.apache.iceberg.spark.SparkWriteConf
- rewrittenManifests() - Method in class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- rewrittenManifests() - Method in interface org.apache.iceberg.actions.RewriteManifests.Result
-
Returns rewritten manifests.
- right() - Method in class org.apache.iceberg.expressions.And
- right() - Method in class org.apache.iceberg.expressions.Or
- rollback() - Method in class org.apache.iceberg.BaseTable
- rollback() - Method in class org.apache.iceberg.SerializableTable
- rollback() - Method in interface org.apache.iceberg.Table
-
Deprecated. Replaced by Table.manageSnapshots().
- Rollback - Interface in org.apache.iceberg
-
API for rolling table data back to the state at an older table snapshot.
- rollbackCreateTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- rollbackDropTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- RollbackStagedTable - Class in org.apache.iceberg.spark
-
An implementation of StagedTable that mimics the behavior of Spark's non-atomic CTAS and RTAS.
- RollbackStagedTable(TableCatalog, Identifier, Table) - Constructor for class org.apache.iceberg.spark.RollbackStagedTable
- rollbackTo(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Roll the table's state back to a specific Snapshot identified by id.
- rollbackTo(long) - Method in class org.apache.iceberg.SnapshotManager
- rollbackToTime(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Roll this table's data back to the last Snapshot before the given timestamp.
- rollbackToTime(long) - Method in class org.apache.iceberg.SnapshotManager
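A minimal sketch of rolling back through the ManageSnapshots API (the table variable and snapshot id are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Restore the table state to a known good snapshot.
    long snapshotId = 1234567890L;
    table.manageSnapshots()
        .rollbackTo(snapshotId)
        .commit();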
- RollingDataWriter<T> - Class in org.apache.iceberg.io
-
A rolling data writer that splits incoming data into multiple files within one spec/partition based on the target file size.
- RollingDataWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long, PartitionSpec, StructLike) - Constructor for class org.apache.iceberg.io.RollingDataWriter
- RollingEqualityDeleteWriter<T> - Class in org.apache.iceberg.io
-
A rolling equality delete writer that splits incoming deletes into multiple files within one spec/partition based on the target file size.
- RollingEqualityDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long, PartitionSpec, StructLike) - Constructor for class org.apache.iceberg.io.RollingEqualityDeleteWriter
- RollingFileWriter(StructLike) - Constructor for class org.apache.iceberg.io.BaseTaskWriter.RollingFileWriter
- RollingPositionDeleteWriter<T> - Class in org.apache.iceberg.io
-
A rolling position delete writer that splits incoming deletes into multiple files within one spec/partition based on the target file size.
- RollingPositionDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long, PartitionSpec, StructLike) - Constructor for class org.apache.iceberg.io.RollingPositionDeleteWriter
- rootAllocator() - Static method in class org.apache.iceberg.arrow.ArrowAllocation
- row() - Method in class org.apache.iceberg.deletes.PositionDelete
- ROW_POSITION - Static variable in class org.apache.iceberg.MetadataColumns
- rowCount(InputFile) - Static method in class org.apache.iceberg.avro.Avro
-
Returns the number of rows in the specified Avro file.
- RowDataFileScanTaskReader - Class in org.apache.iceberg.flink.source
- RowDataFileScanTaskReader(Schema, Schema, String, boolean) - Constructor for class org.apache.iceberg.flink.source.RowDataFileScanTaskReader
- RowDataProjection - Class in org.apache.iceberg.flink.data
- RowDataRewriter - Class in org.apache.iceberg.flink.source
- RowDataRewriter - Class in org.apache.iceberg.spark.source
- RowDataRewriter(Table, boolean, FileIO, EncryptionManager) - Constructor for class org.apache.iceberg.flink.source.RowDataRewriter
- RowDataRewriter(Broadcast<Table>, PartitionSpec, boolean) - Constructor for class org.apache.iceberg.spark.source.RowDataRewriter
- RowDataRewriter.RewriteMap - Class in org.apache.iceberg.flink.source
- RowDataTaskWriterFactory - Class in org.apache.iceberg.flink.sink
- RowDataTaskWriterFactory(Table, RowType, long, FileFormat, List<Integer>, boolean) - Constructor for class org.apache.iceberg.flink.sink.RowDataTaskWriterFactory
- RowDataUtil - Class in org.apache.iceberg.flink.data
- RowDataWrapper - Class in org.apache.iceberg.flink
- RowDataWrapper(RowType, Types.StructType) - Constructor for class org.apache.iceberg.flink.RowDataWrapper
- RowDelta - Interface in org.apache.iceberg
-
API for encoding row-level changes to a table.
- rowId() - Method in interface org.apache.spark.sql.connector.iceberg.write.SupportsDelta
-
Returns the row ID column references that should be used for row equality.
- rowIdSchema() - Method in interface org.apache.spark.sql.connector.iceberg.write.ExtendedLogicalWriteInfo
-
The schema of the ID columns from Spark to data source.
- RowLevelOperation - Interface in org.apache.spark.sql.connector.iceberg.write
-
A logical representation of a data source DELETE, UPDATE, or MERGE operation that requires rewriting data.
- RowLevelOperation.Command - Enum in org.apache.spark.sql.connector.iceberg.write
-
The SQL operation being performed.
- RowLevelOperationBuilder - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface for building a row-level operation.
- RowLevelOperationInfo - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface with logical information for a row-level operation such as DELETE or MERGE.
- RowLevelOperationMode - Enum in org.apache.iceberg
-
Iceberg supports two ways to modify records in a table: copy-on-write and merge-on-read.
- RowPositionColumnVector - Class in org.apache.iceberg.spark.data.vectorized
- rows() - Method in interface org.apache.iceberg.DataTask
-
Returns an iterable of StructLike rows.
- rowSchema(Schema) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- rowSchema(Schema) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- rowSchema(Schema) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- RULE_booleanValue - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_callArgument - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_constant - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_expression - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_fieldList - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_identifier - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_multipartIdentifier - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_nonReserved - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_number - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_order - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_orderField - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_quotedIdentifier - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_singleStatement - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_statement - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_stringMap - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_transform - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_transformArgument - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_writeDistributionSpec - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_writeOrderingSpec - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- RULE_writeSpec - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ruleNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ruleNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- run() - Method in interface org.apache.iceberg.util.ExceptionUtil.Block
- run() - Method in interface org.apache.iceberg.util.ExceptionUtil.FinallyBlock
- run(C) - Method in interface org.apache.iceberg.ClientPool.Action
- run(I) - Method in interface org.apache.iceberg.util.Tasks.Task
- run(I, Exception) - Method in interface org.apache.iceberg.util.Tasks.FailureTask
- run(Throwable) - Method in interface org.apache.iceberg.util.ExceptionUtil.CatchBlock
- run(SourceFunction.SourceContext<FlinkInputSplit>) - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- run(ClientPool.Action<R, C, E>) - Method in interface org.apache.iceberg.ClientPool
- run(ClientPool.Action<R, C, E>) - Method in class org.apache.iceberg.ClientPoolImpl
- run(ClientPool.Action<R, C, E>, boolean) - Method in interface org.apache.iceberg.ClientPool
- run(ClientPool.Action<R, C, E>, boolean) - Method in class org.apache.iceberg.ClientPoolImpl
- run(ClientPool.Action<R, IMetaStoreClient, TException>) - Method in class org.apache.iceberg.hive.CachedClientPool
- run(ClientPool.Action<R, IMetaStoreClient, TException>, boolean) - Method in class org.apache.iceberg.hive.CachedClientPool
- run(Tasks.Task<I, E>, Class<E>) - Method in class org.apache.iceberg.util.Tasks.Builder
- run(Tasks.Task<I, RuntimeException>) - Method in class org.apache.iceberg.util.Tasks.Builder
- runSafely(ExceptionUtil.Block<R, E1, E2, E3>, ExceptionUtil.CatchBlock, ExceptionUtil.FinallyBlock, Class<? extends E1>, Class<? extends E2>, Class<? extends E3>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- runSafely(ExceptionUtil.Block<R, E1, E2, RuntimeException>, ExceptionUtil.CatchBlock, ExceptionUtil.FinallyBlock, Class<? extends E1>, Class<? extends E2>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- runSafely(ExceptionUtil.Block<R, E1, RuntimeException, RuntimeException>, ExceptionUtil.CatchBlock, ExceptionUtil.FinallyBlock, Class<? extends E1>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- runSafely(ExceptionUtil.Block<R, RuntimeException, RuntimeException, RuntimeException>, ExceptionUtil.CatchBlock, ExceptionUtil.FinallyBlock) - Static method in class org.apache.iceberg.util.ExceptionUtil
- RuntimeIOException - Exception in org.apache.iceberg.exceptions
-
Deprecated. Use java.io.UncheckedIOException directly instead. Exception used to wrap IOException as a RuntimeException and add context.
- RuntimeIOException(IOException) - Constructor for exception org.apache.iceberg.exceptions.RuntimeIOException
-
Deprecated.
- RuntimeIOException(IOException, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.RuntimeIOException
-
Deprecated.
- RuntimeIOException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.RuntimeIOException
-
Deprecated.
- RuntimeMetaException - Exception in org.apache.iceberg.hive
-
Exception used to wrap MetaException as a RuntimeException and add context.
- RuntimeMetaException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.hive.RuntimeMetaException
- RuntimeMetaException(MetaException) - Constructor for exception org.apache.iceberg.hive.RuntimeMetaException
- RuntimeMetaException(MetaException, String, Object...) - Constructor for exception org.apache.iceberg.hive.RuntimeMetaException
S
- s3() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- s3() - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Create an Amazon S3 client.
- S3_CHECKSUM_ENABLED - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Enables eTag checks for S3 PUT and MULTIPART upload requests.
- S3FileIO - Class in org.apache.iceberg.aws.s3
-
FileIO implementation backed by S3.
- S3FileIO() - Constructor for class org.apache.iceberg.aws.s3.S3FileIO
-
No-arg constructor to load the FileIO dynamically.
- S3FileIO(SerializableSupplier<S3Client>) - Constructor for class org.apache.iceberg.aws.s3.S3FileIO
-
Constructor with custom s3 supplier and default AWS properties.
- S3FileIO(SerializableSupplier<S3Client>, AwsProperties) - Constructor for class org.apache.iceberg.aws.s3.S3FileIO
-
Constructor with custom s3 supplier and AWS properties.
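A minimal sketch of loading S3FileIO dynamically and configuring it from properties, assuming the properties-based initialization path; the keys are the AwsProperties constants listed in this section and the values are placeholders:

    // Configure an endpoint and SSE-KMS encryption, then initialize the FileIO.
    Map<String, String> properties = Map.of(
        AwsProperties.S3FILEIO_ENDPOINT, "https://s3.us-east-1.amazonaws.com",
        AwsProperties.S3FILEIO_SSE_TYPE, AwsProperties.S3FILEIO_SSE_TYPE_KMS,
        AwsProperties.S3FILEIO_SSE_KEY, "arn:aws:kms:region:account:key/key-id");
    S3FileIO io = new S3FileIO();
    io.initialize(properties);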
- S3FILEIO_ACCESS_KEY_ID - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Configure the static access key ID used to access S3FileIO.
- S3FILEIO_ACL - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used to configure canned access control list (ACL) for S3 client to use during write.
- S3FILEIO_ENDPOINT - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Configure an alternative endpoint of the S3 service for S3FileIO to access.
- S3FILEIO_MULTIPART_SIZE - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The size of a single part for multipart upload requests in bytes (default: 32MB).
- S3FILEIO_MULTIPART_SIZE_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- S3FILEIO_MULTIPART_SIZE_MIN - Static variable in class org.apache.iceberg.aws.AwsProperties
- S3FILEIO_MULTIPART_THRESHOLD_FACTOR - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The threshold expressed as a factor times the multipart size at which to switch from uploading using a single put object request to uploading using multipart upload (default: 1.5).
- S3FILEIO_MULTIPART_THRESHOLD_FACTOR_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- S3FILEIO_MULTIPART_UPLOAD_THREADS - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Number of threads to use for uploading parts to S3 (shared pool across all output streams); defaults to Runtime.availableProcessors().
- S3FILEIO_SECRET_ACCESS_KEY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Configure the static secret access key used to access S3FileIO.
- S3FILEIO_SESSION_TOKEN - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Configure the static session token used to access S3FileIO.
- S3FILEIO_SSE_KEY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
If S3 encryption type is SSE-KMS, input is a KMS Key ID or ARN.
- S3FILEIO_SSE_MD5 - Static variable in class org.apache.iceberg.aws.AwsProperties
-
If S3 encryption type is SSE-C, input is the base-64 MD5 digest of the secret key.
- S3FILEIO_SSE_TYPE - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Type of S3 server-side encryption used; defaults to AwsProperties.S3FILEIO_SSE_TYPE_NONE.
- S3FILEIO_SSE_TYPE_CUSTOM - Static variable in class org.apache.iceberg.aws.AwsProperties
-
S3 SSE-C encryption.
- S3FILEIO_SSE_TYPE_KMS - Static variable in class org.apache.iceberg.aws.AwsProperties
-
S3 SSE-KMS encryption.
- S3FILEIO_SSE_TYPE_NONE - Static variable in class org.apache.iceberg.aws.AwsProperties
-
No server side encryption.
- S3FILEIO_SSE_TYPE_S3 - Static variable in class org.apache.iceberg.aws.AwsProperties
-
S3 SSE-S3 encryption.
- S3FILEIO_STAGING_DIRECTORY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Location to put staging files for upload to S3; defaults to the temp directory set in java.io.tmpdir.
- s3FileIoAcl() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIoMultiPartSize() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIOMultipartThresholdFactor() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIoMultipartUploadThreads() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIoSseKey() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIoSseMd5() - Method in class org.apache.iceberg.aws.AwsProperties
- s3FileIoSseType() - Method in class org.apache.iceberg.aws.AwsProperties
- s3fileIoStagingDirectory() - Method in class org.apache.iceberg.aws.AwsProperties
- S3InputFile - Class in org.apache.iceberg.aws.s3
- S3OutputFile - Class in org.apache.iceberg.aws.s3
- S3RequestUtil - Class in org.apache.iceberg.aws.s3
- sameOrder(SortOrder) - Method in class org.apache.iceberg.SortOrder
-
Checks whether this order is equivalent to another order while ignoring the order id.
- sameSchema(Schema) - Method in class org.apache.iceberg.Schema
-
Checks whether this schema is equivalent to another schema while ignoring the schema ID.
- satisfies(SortField) - Method in class org.apache.iceberg.SortField
-
Checks whether this field's order satisfies another field's order.
- satisfies(SortOrder) - Method in class org.apache.iceberg.SortOrder
-
Checks whether this order satisfies another order.
- satisfiesOrderOf(Transform<?, ?>) - Method in interface org.apache.iceberg.transforms.Transform
-
Whether ordering by this transform's result satisfies the ordering of another transform's result.
- scale() - Method in class org.apache.iceberg.types.Types.DecimalType
- SCAN_THREAD_POOL_ENABLED - Static variable in class org.apache.iceberg.SystemProperties
-
Whether to use the shared worker pool when planning table scans.
- ScanBuilder(Table) - Constructor for class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- ScanEvent - Class in org.apache.iceberg.events
-
Event sent to listeners when a table scan is planned.
- ScanEvent(String, long, Expression, Schema) - Constructor for class org.apache.iceberg.events.ScanEvent
- ScanSummary - Class in org.apache.iceberg
- ScanSummary.Builder - Class in org.apache.iceberg
- ScanSummary.PartitionMetrics - Class in org.apache.iceberg
- ScanTask - Interface in org.apache.iceberg
-
A scan task.
- scheduler() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- schema() - Method in class org.apache.iceberg.AllDataFilesTable
- schema() - Method in class org.apache.iceberg.AllEntriesTable
- schema() - Method in class org.apache.iceberg.AllManifestsTable
- schema() - Method in class org.apache.iceberg.BaseTable
- schema() - Method in class org.apache.iceberg.DataFilesTable
- schema() - Method in class org.apache.iceberg.HistoryTable
- schema() - Method in class org.apache.iceberg.ManifestEntriesTable
- schema() - Static method in interface org.apache.iceberg.ManifestFile
- schema() - Method in class org.apache.iceberg.ManifestReader
- schema() - Method in class org.apache.iceberg.ManifestsTable
- schema() - Method in class org.apache.iceberg.MetadataUpdate.AddSchema
- schema() - Method in class org.apache.iceberg.PartitionSpec
-
Returns the Schema for this spec.
- schema() - Method in class org.apache.iceberg.PartitionsTable
- schema() - Method in class org.apache.iceberg.SerializableTable
- schema() - Method in class org.apache.iceberg.SnapshotsTable
- schema() - Method in class org.apache.iceberg.SortOrder
-
Returns the Schema for this sort order.
- schema() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- schema() - Method in class org.apache.iceberg.spark.source.SparkTable
- schema() - Method in interface org.apache.iceberg.Table
-
Return the schema for this table.
- schema() - Method in class org.apache.iceberg.TableMetadata
- schema() - Method in interface org.apache.iceberg.TableScan
-
Returns this scan's projection Schema.
- schema(Configuration) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the Table Schema serialized to the configuration.
- schema(Schema) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- schema(Schema) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- schema(Schema) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- schema(Schema) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- schema(Schema) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- schema(Schema) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- schema(Schema) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- schema(Schema, String) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- schema(Schema, Supplier<List<String>>) - Method in class org.apache.iceberg.types.CheckCompatibility
- schema(Schema, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- schema(Schema, Supplier<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- schema(Schema, Supplier<Type>) - Method in class org.apache.iceberg.types.FixupTypes
- schema(Schema, Supplier<T>) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- schema(Schema, Map<Integer, Integer>) - Method in class org.apache.iceberg.types.IndexParents
- schema(Schema, Map<String, Integer>) - Method in class org.apache.iceberg.types.IndexByName
- schema(Schema, ObjectInspector) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- schema(Schema, P, R) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- schema(Schema, T) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- Schema - Class in org.apache.iceberg
-
The schema of a data table.
- Schema(int, List<Types.NestedField>) - Constructor for class org.apache.iceberg.Schema
- Schema(int, List<Types.NestedField>, Map<String, Integer>, Set<Integer>) - Constructor for class org.apache.iceberg.Schema
- Schema(int, List<Types.NestedField>, Set<Integer>) - Constructor for class org.apache.iceberg.Schema
- Schema(int, Types.NestedField...) - Constructor for class org.apache.iceberg.Schema
- Schema(List<Types.NestedField>) - Constructor for class org.apache.iceberg.Schema
- Schema(List<Types.NestedField>, Map<String, Integer>) - Constructor for class org.apache.iceberg.Schema
- Schema(List<Types.NestedField>, Map<String, Integer>, Set<Integer>) - Constructor for class org.apache.iceberg.Schema
- Schema(List<Types.NestedField>, Set<Integer>) - Constructor for class org.apache.iceberg.Schema
- Schema(Types.NestedField...) - Constructor for class org.apache.iceberg.Schema
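A minimal sketch of building a Schema with the nested-field factory methods (field ids and names are hypothetical):

    // Define a three-column schema; field ids must be unique within the schema.
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.required(2, "event_ts", Types.TimestampType.withZone()),
        Types.NestedField.optional(3, "payload", Types.StringType.get()));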
- SCHEMA - Static variable in interface org.apache.iceberg.ManifestFile
- SCHEMA_AUTO_CONVERSION - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- schemaFor(Table, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns the schema of the table for the specified snapshot.
- schemaFor(Table, Long, Long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Convenience method for returning the schema of the table for a snapshot, when we have a snapshot id or a timestamp.
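A minimal sketch of SnapshotUtil.schemaFor (the table variable and snapshot id are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Look up the schema that was current when the given snapshot was committed.
    Schema snapshotSchema = SnapshotUtil.schemaFor(table, 1234567890L);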
- schemaForTable(SparkSession, String) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Returns a Schema for the given table with fresh field ids.
- schemaId() - Method in class org.apache.iceberg.MetadataUpdate.SetCurrentSchema
- schemaId() - Method in class org.apache.iceberg.Schema
-
Returns the schema ID for this schema.
- schemaId() - Method in interface org.apache.iceberg.Snapshot
-
Return the id of the schema used when this snapshot was created, or null if this information is not available.
- SchemaParser - Class in org.apache.iceberg
- schemas() - Method in class org.apache.iceberg.BaseTable
- schemas() - Method in class org.apache.iceberg.SerializableTable
- schemas() - Method in interface org.apache.iceberg.Table
-
Return a map of schemas for this table.
- schemas() - Method in class org.apache.iceberg.TableMetadata
- schemasById() - Method in class org.apache.iceberg.TableMetadata
- SchemaUtil - Class in org.apache.iceberg.pig
- SchemaVisitor() - Constructor for class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- SchemaWithPartnerVisitor<P,R> - Class in org.apache.iceberg.schema
- SchemaWithPartnerVisitor() - Constructor for class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- SchemaWithPartnerVisitor.PartnerAccessors<P> - Interface in org.apache.iceberg.schema
- second() - Method in class org.apache.iceberg.util.Pair
- seek(long) - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
- seek(long) - Method in class org.apache.iceberg.io.SeekableInputStream
-
Seek to a new position in the InputStream.
- SeekableInputStream - Class in org.apache.iceberg.io
-
SeekableInputStream is an interface with the methods needed to read data from a file or Hadoop data stream.
- SeekableInputStream() - Constructor for class org.apache.iceberg.io.SeekableInputStream
- select(String...) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- select(String...) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- select(String...) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by name.
- select(String...) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this that will read the given data columns.
- select(Collection<String>) - Method in class org.apache.iceberg.ManifestReader
- select(Collection<String>) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by name.
- select(Collection<String>) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this that will read the given data columns.
- select(List<String>) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- select(Schema, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
- select(Types.StructType, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
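A minimal sketch of column projection with the select methods described above (the table variable and column names are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Project the same columns on the table schema and on a new scan.
    Schema projection = table.schema().select("id", "event_ts");
    TableScan scan = table.newScan().select("id", "event_ts");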
- selectDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Select the delete files to convert.
- selectDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Select the delete files to rewrite.
- SELECTED_COLUMNS - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- selectedColumns(Configuration) - Static method in class org.apache.iceberg.mr.InputFormatConfig
- selectFilesToRewrite(Iterable<FileScanTask>) - Method in class org.apache.iceberg.actions.BinPackStrategy
- selectFilesToRewrite(Iterable<FileScanTask>) - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Selects files which this strategy believes are valid targets to be rewritten.
- selectFilesToRewrite(Iterable<FileScanTask>) - Method in class org.apache.iceberg.actions.SortStrategy
- selectNot(Schema, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
- selectNot(Types.StructType, Set<Integer>) - Static method in class org.apache.iceberg.types.TypeUtil
- self() - Method in class org.apache.iceberg.BaseOverwriteFiles
- self() - Method in class org.apache.iceberg.BaseReplacePartitions
- self() - Method in class org.apache.iceberg.BaseRewriteManifests
- self() - Method in class org.apache.iceberg.flink.actions.RewriteDataFilesAction
- self() - Method in class org.apache.iceberg.SnapshotManager
- self() - Method in class org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction
- self() - Method in class org.apache.iceberg.spark.actions.BaseDeleteReachableFilesSparkAction
- self() - Method in class org.apache.iceberg.spark.actions.BaseExpireSnapshotsSparkAction
- self() - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- self() - Method in class org.apache.iceberg.spark.actions.BaseRewriteDataFilesSpark3Action
- self() - Method in class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- self() - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- sempred(RuleContext, int, int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- SEQUENCE_NUMBER - Static variable in interface org.apache.iceberg.ManifestFile
- sequenceNumber() - Method in class org.apache.iceberg.events.CreateSnapshotEvent
- sequenceNumber() - Method in class org.apache.iceberg.GenericManifestFile
- sequenceNumber() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the sequence number of the commit that added the manifest file.
- sequenceNumber() - Method in interface org.apache.iceberg.Snapshot
-
Return this snapshot's sequence number.
- sequenceNumber() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- SERIALIZABLE - org.apache.iceberg.IsolationLevel
- SerializableConfiguration - Class in org.apache.iceberg.hadoop
-
Wraps a Configuration object in a Serializable layer.
- SerializableConfiguration(Configuration) - Constructor for class org.apache.iceberg.hadoop.SerializableConfiguration
- serializableFileIO(Table) - Static method in class org.apache.iceberg.spark.SparkUtil
- SerializableMap<K,V> - Class in org.apache.iceberg.util
- SerializableSupplier<T> - Interface in org.apache.iceberg.util
- SerializableTable - Class in org.apache.iceberg
-
A read-only serializable table that can be sent to other nodes in a cluster.
- SerializationUtil - Class in org.apache.iceberg.util
- serialize(Object, ObjectInspector) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- serialize(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- serializeConfWith(Function<Configuration, SerializableSupplier<Configuration>>) - Method in interface org.apache.iceberg.hadoop.HadoopConfigurable
-
Take a function that serializes Hadoop configuration into a supplier.
- serializeConfWith(Function<Configuration, SerializableSupplier<Configuration>>) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- serializeConfWith(Function<Configuration, SerializableSupplier<Configuration>>) - Method in class org.apache.iceberg.io.ResolvingFileIO
- SERIALIZED_TABLE_PREFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- serializeToBase64(Object) - Static method in class org.apache.iceberg.util.SerializationUtil
- serializeToBytes(Object) - Static method in class org.apache.iceberg.util.SerializationUtil
-
Serialize an object to bytes.
- serializeToBytes(Object, Function<Configuration, SerializableSupplier<Configuration>>) - Static method in class org.apache.iceberg.util.SerializationUtil
-
Serialize an object to bytes.
- service(int) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
An async service which allows for committing multiple file groups as their rewrites complete.
- serviceHost() - Method in class org.apache.iceberg.gcp.GCPProperties
- set(int, T) - Method in class org.apache.iceberg.data.GenericRecord
- set(int, T) - Method in class org.apache.iceberg.data.InternalRecordWrapper
- set(int, T) - Method in class org.apache.iceberg.deletes.PositionDelete
- set(int, T) - Method in class org.apache.iceberg.flink.RowDataWrapper
- set(int, T) - Method in class org.apache.iceberg.GenericManifestFile
- set(int, T) - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- set(int, T) - Method in class org.apache.iceberg.PartitionKey
- set(int, T) - Method in class org.apache.iceberg.spark.SparkStructLike
- set(int, T) - Method in interface org.apache.iceberg.StructLike
- set(int, T) - Method in class org.apache.iceberg.util.StructProjection
- set(I, int, Object) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
-
Used to set a struct value by position.
- set(CharSequence) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- set(CharSequence, long, R) - Method in class org.apache.iceberg.deletes.PositionDelete
- set(Object, T) - Method in class org.apache.iceberg.common.DynFields.UnboundField
- set(String, String) - Method in interface org.apache.iceberg.actions.SnapshotUpdateAction
- set(String, String) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- set(String, String) - Method in class org.apache.iceberg.BaseRewriteManifests
- set(String, String) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- set(String, String) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- set(String, String) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- set(String, String) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- set(String, String) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- set(String, String) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- set(String, String) - Method in interface org.apache.iceberg.SnapshotUpdate
-
Set a summary property in the snapshot produced by this update.
- set(String, String) - Method in interface org.apache.iceberg.UpdateProperties
-
Add a key/value property to the table.
- set(K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ReusableEntry
- set(StructLike) - Method in class org.apache.iceberg.util.StructLikeWrapper
- set(S, int, Object) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- set(T) - Method in class org.apache.iceberg.common.DynFields.BoundField
- set(T) - Method in class org.apache.iceberg.common.DynFields.StaticField
- set(T) - Method in class org.apache.iceberg.mr.mapred.Container
- set(T, int, Object) - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
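A minimal sketch of the UpdateProperties.set entry listed above (the table variable is hypothetical; the property key shown is illustrative):

    // assumes: org.apache.iceberg.Table table
    // Add or update a table property and commit the change.
    table.updateProperties()
        .set("commit.retry.num-retries", "10")
        .commit();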
- SET - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- SET - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- SET() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- SET() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- setAddedSnapshotId(Long) - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- setAll(Map<String, String>) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.data.GenericAppenderFactory
- setAll(Map<String, String>) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- setAll(Map<String, String>) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- setAllPagesDictEncoded(boolean) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- setBatchContext(long) - Method in class org.apache.iceberg.data.orc.GenericOrcReader
- setBatchContext(long) - Method in class org.apache.iceberg.flink.data.FlinkOrcReader
- setBatchContext(long) - Method in interface org.apache.iceberg.orc.OrcBatchReader
- setBatchContext(long) - Method in interface org.apache.iceberg.orc.OrcRowReader
- setBatchContext(long) - Method in interface org.apache.iceberg.orc.OrcValueReader
- setBatchContext(long) - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
- setBatchContext(long) - Method in class org.apache.iceberg.spark.data.SparkOrcReader
- setBatchSize(int) - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- setBatchSize(int) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- setBatchSize(int) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- setBatchSize(int) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- setBatchSize(int) - Method in interface org.apache.iceberg.parquet.VectorizedReader
- setBoolean(I, int, boolean) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setColumnStore(ColumnWriteStore) - Method in class org.apache.iceberg.parquet.ColumnWriter
- setColumnStore(ColumnWriteStore) - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
- setColumnStore(ColumnWriteStore) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- setColumnStore(ColumnWriteStore) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- setColumnStore(ColumnWriteStore) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- setColumnStore(ColumnWriteStore) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- setConf(C) - Method in interface org.apache.iceberg.hadoop.Configurable
- setConf(Configuration) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- setConf(Configuration) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- setConf(Configuration) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- setConf(Configuration) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- setConf(Configuration) - Method in class org.apache.iceberg.hadoop.HadoopTables
- setConf(Configuration) - Method in class org.apache.iceberg.hive.HiveCatalog
- setConf(Configuration) - Method in class org.apache.iceberg.io.ResolvingFileIO
- setConf(Configuration) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- setConf(Configuration) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- setConf(Configuration) - Method in class org.apache.iceberg.nessie.NessieCatalog
- setCurrentSchema(int) - Method in class org.apache.iceberg.TableMetadata.Builder
- setCurrentSchema(Schema, int) - Method in class org.apache.iceberg.TableMetadata.Builder
- SetCurrentSchema(int) - Constructor for class org.apache.iceberg.MetadataUpdate.SetCurrentSchema
- setCurrentSnapshot(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Roll this table's data back to a specific Snapshot identified by id.
- setCurrentSnapshot(long) - Method in class org.apache.iceberg.SnapshotManager
- setCurrentSnapshot(long) - Method in class org.apache.iceberg.TableMetadata.Builder
- setCurrentSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata.Builder
- SetCurrentSnapshot(Long) - Constructor for class org.apache.iceberg.MetadataUpdate.SetCurrentSnapshot
- setDefaultPartitionSpec(int) - Method in class org.apache.iceberg.TableMetadata.Builder
- setDefaultPartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- SetDefaultPartitionSpec(int) - Constructor for class org.apache.iceberg.MetadataUpdate.SetDefaultPartitionSpec
- setDefaultSortOrder(int) - Method in class org.apache.iceberg.TableMetadata.Builder
- setDefaultSortOrder(SortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- SetDefaultSortOrder(int) - Constructor for class org.apache.iceberg.MetadataUpdate.SetDefaultSortOrder
- setDelegateCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- setDeleteFilter(DeleteFilter<InternalRow>) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- setDictionary(Dictionary) - Method in class org.apache.iceberg.parquet.BasePageIterator
- setDouble(I, int, double) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setDynamoDbTableName(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setField(String, Object) - Method in class org.apache.iceberg.data.GenericRecord
- setField(String, Object) - Method in interface org.apache.iceberg.data.Record
- setFloat(I, int, float) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setGlueCatalogId(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setGlueCatalogSkipArchive(boolean) - Method in class org.apache.iceberg.aws.AwsProperties
- setIdentifierFields(String...) - Method in interface org.apache.iceberg.UpdateSchema
-
Set the identifier fields given some field names.
- setIdentifierFields(Collection<String>) - Method in interface org.apache.iceberg.UpdateSchema
-
Set the identifier fields given a set of field names.
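A minimal sketch of declaring identifier fields via the schema update API (the table variable and field names are hypothetical):

    // assumes: org.apache.iceberg.Table table
    // Mark the columns that uniquely identify rows.
    table.updateSchema()
        .setIdentifierFields("id", "region")
        .commit();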
- SetIdentifierFieldsContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- setInteger(I, int, int) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setJobGroupInfo(SparkContext, JobGroupInfo) - Static method in class org.apache.iceberg.spark.JobGroupUtils
- setLength(Long) - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- setLocation(String) - Method in class org.apache.iceberg.SetLocation
- setLocation(String) - Method in class org.apache.iceberg.TableMetadata.Builder
- setLocation(String) - Method in interface org.apache.iceberg.UpdateLocation
-
Set the table's location.
- setLocation(String, Job) - Method in class org.apache.iceberg.pig.IcebergStorage
- SetLocation - Class in org.apache.iceberg
- SetLocation(String) - Constructor for class org.apache.iceberg.MetadataUpdate.SetLocation
- SetLocation(TableOperations) - Constructor for class org.apache.iceberg.SetLocation
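A minimal sketch of the UpdateLocation.setLocation entry above (the table variable and path are placeholders):

    // assumes: org.apache.iceberg.Table table
    // Point the table at a new root location and commit.
    table.updateLocation()
        .setLocation("s3://bucket/warehouse/db/events_v2")
        .commit();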
- setLong(I, int, long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setNotNull(int) - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- setNotNulls(int, int) - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- setNull(int) - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- setNull(I, int) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setNulls(int, int) - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- setOption(String, String, CaseInsensitiveStringMap) - Static method in class org.apache.iceberg.spark.Spark3Util
- setOutputFiles(Set<DataFile>) - Method in class org.apache.iceberg.actions.RewriteFileGroup
- setPage(DataPage) - Method in class org.apache.iceberg.parquet.BasePageIterator
- setPageSource(PageReader) - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- setPageSource(PageReadStore, long) - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- setPageSource(PageReadStore, long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- setPageSource(PageReadStore, long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- setPageSource(PageReadStore, long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- setPageSource(PageReadStore, long) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- setPartitionFilter(Expression) - Method in class org.apache.iceberg.pig.IcebergStorage
- setPartitionSpecId(Integer) - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- setPartitionSummaryLimit(int) - Method in class org.apache.iceberg.SnapshotSummary.Builder
-
Sets the maximum number of changed partitions before partition summaries will be excluded.
- setPath(String) - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- setProperties(Map<String, String>) - Method in class org.apache.iceberg.TableMetadata.Builder
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- setProperties(Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Set a collection of properties on a namespace in the catalog.
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- setProperties(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
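For the SupportsNamespaces.setProperties overloads above, a hedged sketch of tagging a namespace through any catalog that implements the interface; the namespace and property values are assumptions.

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.iceberg.catalog.Namespace;
    import org.apache.iceberg.catalog.SupportsNamespaces;

    class NamespacePropertiesExample {
      // Sets (or overwrites) properties on an existing namespace.
      static void tagNamespace(SupportsNamespaces catalog) {
        Map<String, String> props = new HashMap<>();
        props.put("owner", "data-eng"); // hypothetical property
        props.put("tier", "gold");      // hypothetical property
        catalog.setProperties(Namespace.of("analytics"), props); // hypothetical namespace
      }
    }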
- SetProperties(Map<String, String>) - Constructor for class org.apache.iceberg.MetadataUpdate.SetProperties
- setPushdownPredicate(Expression) - Method in class org.apache.iceberg.pig.IcebergStorage
- setRowGroupInfo(PageReader, boolean) - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- setRowGroupInfo(PageReadStore, Map<ColumnPath, ColumnChunkMetaData>, long) - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- setRowGroupInfo(PageReadStore, Map<ColumnPath, ColumnChunkMetaData>, long) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- setRowGroupInfo(PageReadStore, Map<ColumnPath, ColumnChunkMetaData>, long) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- setRowGroupInfo(PageReadStore, Map<ColumnPath, ColumnChunkMetaData>, long) - Method in interface org.apache.iceberg.parquet.VectorizedReader
-
Sets the row group information to be used with this reader
- setRowGroupInfo(PageReadStore, Map<ColumnPath, ColumnChunkMetaData>, long) - Method in class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- setRowKind(RowKind) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- setRowPositionSupplier(Supplier<Long>) - Method in class org.apache.iceberg.avro.ProjectionDatumReader
- setRowPositionSupplier(Supplier<Long>) - Method in interface org.apache.iceberg.avro.SupportsRowPosition
- setRowPositionSupplier(Supplier<Long>) - Method in class org.apache.iceberg.avro.ValueReaders.StructReader
- setRowPositionSupplier(Supplier<Long>) - Method in class org.apache.iceberg.data.avro.DataReader
- setRowPositionSupplier(Supplier<Long>) - Method in class org.apache.iceberg.flink.data.FlinkAvroReader
- setRowPositionSupplier(Supplier<Long>) - Method in class org.apache.iceberg.spark.data.SparkAvroReader
- setS3ChecksumEnabled(boolean) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoAcl(ObjectCannedACL) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoMultiPartSize(int) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoMultipartThresholdFactor(double) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoMultipartUploadThreads(int) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoSseKey(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoSseMd5(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3FileIoSseType(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setS3fileIoStagingDirectory(String) - Method in class org.apache.iceberg.aws.AwsProperties
- setSchema(Schema) - Method in class org.apache.iceberg.avro.ProjectionDatumReader
- setSchema(Schema) - Method in class org.apache.iceberg.data.avro.DataReader
- setSchema(Schema) - Method in class org.apache.iceberg.data.avro.DataWriter
- setSchema(Schema) - Method in class org.apache.iceberg.flink.data.FlinkAvroReader
- setSchema(Schema) - Method in class org.apache.iceberg.flink.data.FlinkAvroWriter
- setSchema(Schema) - Method in class org.apache.iceberg.spark.data.SparkAvroReader
- setSchema(Schema) - Method in class org.apache.iceberg.spark.data.SparkAvroWriter
- setSuppressCloseFailure(boolean) - Method in class org.apache.iceberg.io.CloseableGroup
-
Whether to suppress failures when any of the closeables this class tracks throws an exception during closing.
- setUDFContextSignature(String) - Method in class org.apache.iceberg.pig.IcebergStorage
- setupJob(JobContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
- setupTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
- setValue(V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ReusableEntry
- SetWriteDistributionAndOrderingContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- shortName() - Method in class org.apache.iceberg.spark.source.IcebergSource
- shorts() - Static method in class org.apache.iceberg.avro.ValueWriters
- shorts() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- shorts(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- shouldAdjustToUTC() - Method in class org.apache.iceberg.types.Types.TimestampType
- shouldKeep(T) - Method in class org.apache.iceberg.io.FilterIterator
- shouldKeep(T) - Method in class org.apache.iceberg.util.Filter
- shouldRead(MessageType, BlockMetaData) - Method in class org.apache.iceberg.parquet.ParquetMetricsRowGroupFilter
-
Test whether the file may contain records that match the expression.
- shouldRead(MessageType, BlockMetaData, DictionaryPageReadStore) - Method in class org.apache.iceberg.parquet.ParquetDictionaryRowGroupFilter
-
Test whether the dictionaries for a row group may contain records that match the expression.
- shouldRetryTest(Predicate<Exception>) - Method in class org.apache.iceberg.util.Tasks.Builder
- shouldSkipCombine(Path, Configuration) - Method in class org.apache.iceberg.mr.hive.HiveIcebergInputFormat
- signedBytes() - Static method in class org.apache.iceberg.types.Comparators
- SIMPLE_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- SIMPLE_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- singleStatement() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- SingleStatementContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- size() - Method in class org.apache.iceberg.arrow.vectorized.NullabilityHolder
- size() - Method in class org.apache.iceberg.data.GenericRecord
- size() - Method in class org.apache.iceberg.data.InternalRecordWrapper
- size() - Method in class org.apache.iceberg.deletes.PositionDelete
- size() - Method in class org.apache.iceberg.flink.RowDataWrapper
- size() - Method in class org.apache.iceberg.GenericManifestFile
- size() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- size() - Method in class org.apache.iceberg.mapping.MappedFields
- size() - Method in class org.apache.iceberg.PartitionKey
- size() - Method in class org.apache.iceberg.spark.SparkStructLike
- size() - Method in interface org.apache.iceberg.StructLike
- size() - Method in class org.apache.iceberg.util.CharSequenceSet
- size() - Method in class org.apache.iceberg.util.PartitionSet
- size() - Method in class org.apache.iceberg.util.SerializableMap
- size() - Method in class org.apache.iceberg.util.StructLikeMap
- size() - Method in class org.apache.iceberg.util.StructLikeSet
- size() - Method in class org.apache.iceberg.util.StructProjection
- sizeInBytes() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- skip() - Method in class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- skip() - Method in class org.apache.iceberg.parquet.ValuesAsBytesReader
- SKIP_RESIDUAL_FILTERING - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- skipResidualFiltering() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
-
Compute platforms pass down filters to data sources.
- SMALLINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- SMALLINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- SMALLINT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- SmallIntLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- snapshot() - Method in class org.apache.iceberg.MetadataUpdate.AddSnapshot
- snapshot() - Method in interface org.apache.iceberg.TableScan
-
Returns the Snapshot that will be used by this scan.
- snapshot(long) - Method in class org.apache.iceberg.BaseTable
- snapshot(long) - Method in class org.apache.iceberg.SerializableTable
- snapshot(long) - Method in interface org.apache.iceberg.Table
-
Get the snapshot of this table with the given id, or null if there is no matching snapshot.
- snapshot(long) - Method in class org.apache.iceberg.TableMetadata
- Snapshot - Interface in org.apache.iceberg
-
A snapshot of the data in a table at a point in time.
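As a small usage sketch for the Snapshot interface described above, the following lists every known snapshot of a loaded table with its id, commit timestamp, and summary, using only methods that appear in this index.

    import org.apache.iceberg.Snapshot;
    import org.apache.iceberg.Table;

    class ListSnapshotsExample {
      // Prints one line per snapshot recorded in the table's metadata.
      static void listSnapshots(Table table) {
        for (Snapshot snap : table.snapshots()) {
          System.out.printf("%d at %d: %s%n",
              snap.snapshotId(), snap.timestampMillis(), snap.summary());
        }
      }
    }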
- SNAPSHOT - org.apache.iceberg.IsolationLevel
- SNAPSHOT_ID - Static variable in interface org.apache.iceberg.ManifestFile
- SNAPSHOT_ID - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- SNAPSHOT_ID - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- SNAPSHOT_ID_INHERITANCE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- SNAPSHOT_ID_INHERITANCE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- SNAPSHOT_PROPERTY_PREFIX - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- SNAPSHOT_TABLE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- SNAPSHOT_TABLE_SUFFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- snapshotAfter(Table, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Traverses the history of the table's current snapshot and finds the snapshot with the given snapshot id as its parent.
- snapshotId() - Method in class org.apache.iceberg.events.CreateSnapshotEvent
- snapshotId() - Method in class org.apache.iceberg.events.ScanEvent
- snapshotId() - Method in class org.apache.iceberg.GenericManifestFile
- snapshotId() - Method in interface org.apache.iceberg.HistoryEntry
-
Returns ID of the new current snapshot.
- snapshotId() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the ID of the snapshot that added the manifest file to table metadata.
- snapshotId() - Method in class org.apache.iceberg.MetadataUpdate.RemoveSnapshot
- snapshotId() - Method in class org.apache.iceberg.MetadataUpdate.SetCurrentSnapshot
- snapshotId() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- snapshotId() - Method in interface org.apache.iceberg.Snapshot
-
Return this snapshot's ID.
- snapshotId() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- snapshotId() - Method in class org.apache.iceberg.spark.SparkReadConf
- snapshotId() - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- snapshotId(long) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- snapshotId(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- snapshotIdAsOfTime(Table, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns the ID of the most recent snapshot for the table as of the timestamp.
- snapshotIdsBetween(Table, long, long) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns the list of snapshot IDs in the range (fromSnapshotId, toSnapshotId].
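A sketch combining the SnapshotUtil helpers above: resolve the snapshot that was current at a timestamp, then collect the ids committed after it up to the current snapshot. It assumes the table has a current snapshot and a history covering the timestamp.

    import java.util.List;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.util.SnapshotUtil;

    class SnapshotRangeExample {
      // Returns snapshot ids committed after the given point in time.
      static List<Long> snapshotIdsSince(Table table, long timestampMillis) {
        long fromId = SnapshotUtil.snapshotIdAsOfTime(table, timestampMillis);
        long toId = table.currentSnapshot().snapshotId();
        return SnapshotUtil.snapshotIdsBetween(table, fromId, toId);
      }
    }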
- snapshotLog() - Method in class org.apache.iceberg.TableMetadata
- SnapshotManager - Class in org.apache.iceberg
- SnapshotParser - Class in org.apache.iceberg
- snapshotProperty(String, String) - Method in interface org.apache.iceberg.actions.SnapshotUpdate
-
Sets a summary property in the snapshot produced by this action.
- snapshots() - Method in class org.apache.iceberg.BaseTable
- snapshots() - Method in class org.apache.iceberg.SerializableTable
- snapshots() - Method in interface org.apache.iceberg.Table
-
Get the snapshots of this table.
- snapshots() - Method in class org.apache.iceberg.TableMetadata
- SNAPSHOTS - org.apache.iceberg.MetadataTableType
- SnapshotsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's known snapshots as rows.
- snapshotState(FunctionSnapshotContext) - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- snapshotState(StateSnapshotContext) - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- SnapshotSummary - Class in org.apache.iceberg
- SnapshotSummary.Builder - Class in org.apache.iceberg
- snapshotTable(String) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to snapshot an existing table as a new Iceberg table.
- snapshotTable(String) - Method in class org.apache.iceberg.spark.actions.SparkActions
- SnapshotTable - Interface in org.apache.iceberg.actions
-
An action that creates an independent snapshot of an existing table.
- SnapshotTable.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- SnapshotUpdate<ThisT,R> - Interface in org.apache.iceberg.actions
-
An action that produces snapshots.
- SnapshotUpdate<ThisT> - Interface in org.apache.iceberg
-
API for table changes that produce snapshots.
- SnapshotUpdateAction<ThisT,R> - Interface in org.apache.iceberg.actions
- SnapshotUtil - Class in org.apache.iceberg.util
- sort() - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
Choose SORT as a strategy for this rewrite operation using the table's sortOrder
- sort(SortOrder) - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
Choose SORT as a strategy for this rewrite operation and manually specify the sortOrder to use
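A hedged sketch of choosing the SORT strategy with an explicit sort order, as the RewriteDataFiles.sort entries above describe. It assumes an active Spark session and a loaded table; the "event_ts" column is hypothetical.

    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.actions.RewriteDataFiles;
    import org.apache.iceberg.spark.actions.SparkActions;

    class SortRewriteExample {
      // Rewrites data files, clustering rows by a timestamp column.
      static RewriteDataFiles.Result rewriteSorted(Table table) {
        SortOrder order = SortOrder.builderFor(table.schema())
            .asc("event_ts") // hypothetical column
            .build();
        return SparkActions.get()
            .rewriteDataFiles(table)
            .sort(order)
            .execute();
      }
    }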
- SORT_ORDER_ID - Static variable in interface org.apache.iceberg.DataFile
- sortBy(String, SortDirection, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
- sortBy(Term, SortDirection, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
- SortDirection - Enum in org.apache.iceberg
- SortedMerge<T> - Class in org.apache.iceberg.util
-
An Iterable that merges the items from other Iterables in order.
- SortedMerge(Comparator<T>, List<CloseableIterable<T>>) - Constructor for class org.apache.iceberg.util.SortedMerge
- SortField - Class in org.apache.iceberg
-
A field in a SortOrder.
- sortOrder() - Method in class org.apache.iceberg.actions.SortStrategy
- sortOrder() - Method in class org.apache.iceberg.BaseTable
- sortOrder() - Method in class org.apache.iceberg.SerializableTable
- sortOrder() - Method in interface org.apache.iceberg.Table
-
Return the sort order for this table.
- sortOrder() - Method in class org.apache.iceberg.TableMetadata
- sortOrder(SortOrder) - Method in class org.apache.iceberg.actions.SortStrategy
-
Sets the sort order to be used in this strategy when rewriting files
- SortOrder - Class in org.apache.iceberg
-
A sort order that defines how data and delete files should be ordered in a table.
- SortOrder.Builder - Class in org.apache.iceberg
-
A builder used to create valid sort orders.
- SortOrderBuilder<R> - Interface in org.apache.iceberg
-
Methods for building a sort order.
- sortOrderFor(PartitionSpec) - Static method in class org.apache.iceberg.Partitioning
-
Create a sort order that will group data for a partition spec.
- sortOrderId() - Method in interface org.apache.iceberg.ContentFile
-
Returns the sort order id of this file, which describes how the file is ordered.
- sortOrderId() - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultSortOrder
- sortOrderId() - Method in class org.apache.iceberg.spark.SparkDataFile
- SortOrderParser - Class in org.apache.iceberg
- sortOrders() - Method in class org.apache.iceberg.BaseTable
- sortOrders() - Method in class org.apache.iceberg.SerializableTable
- sortOrders() - Method in interface org.apache.iceberg.Table
-
Return a map of sort order IDs to sort orders for this table.
- sortOrders() - Method in class org.apache.iceberg.TableMetadata
- sortOrdersById() - Method in class org.apache.iceberg.TableMetadata
- SortOrderUtil - Class in org.apache.iceberg.util
- SortOrderVisitor<T> - Interface in org.apache.iceberg.transforms
- sortPlan(Distribution, SortOrder[], LogicalPlan, SQLConf) - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- sortStrategy() - Method in class org.apache.iceberg.spark.actions.BaseRewriteDataFilesSpark3Action
- SortStrategy - Class in org.apache.iceberg.actions
-
A rewrite strategy for data files which aims to reorder data within data files to optimally lay them out in relation to a column.
- SortStrategy() - Constructor for class org.apache.iceberg.actions.SortStrategy
- SOURCE_SNAPSHOT_ID_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- sourceId() - Method in class org.apache.iceberg.PartitionField
-
Returns the field id of the source field in the spec's table schema.
- sourceId() - Method in class org.apache.iceberg.SortField
-
Returns the field id of the source field in the sort order's table schema.
- spark() - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- SPARK_WRITE_PARTITIONED_FANOUT_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- SPARK_WRITE_PARTITIONED_FANOUT_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- Spark3BinPackStrategy - Class in org.apache.iceberg.spark.actions
- Spark3BinPackStrategy(Table, SparkSession) - Constructor for class org.apache.iceberg.spark.actions.Spark3BinPackStrategy
- Spark3SortStrategy - Class in org.apache.iceberg.spark.actions
- Spark3SortStrategy(Table, SparkSession) - Constructor for class org.apache.iceberg.spark.actions.Spark3SortStrategy
- Spark3Util - Class in org.apache.iceberg.spark
- Spark3Util.CatalogAndIdentifier - Class in org.apache.iceberg.spark
-
This mimics a class inside of Spark which is private inside of LookupCatalog.
- Spark3Util.DescribeSchemaVisitor - Class in org.apache.iceberg.spark
- SparkActions - Class in org.apache.iceberg.spark.actions
-
An implementation of ActionsProvider for Spark.
- SparkAvroReader - Class in org.apache.iceberg.spark.data
- SparkAvroReader(Schema, Schema) - Constructor for class org.apache.iceberg.spark.data.SparkAvroReader
- SparkAvroReader(Schema, Schema, Map<Integer, ?>) - Constructor for class org.apache.iceberg.spark.data.SparkAvroReader
- SparkAvroWriter - Class in org.apache.iceberg.spark.data
- SparkAvroWriter(StructType) - Constructor for class org.apache.iceberg.spark.data.SparkAvroWriter
- SparkCatalog - Class in org.apache.iceberg.spark
-
A Spark TableCatalog implementation that wraps an Iceberg Catalog.
- SparkCatalog() - Constructor for class org.apache.iceberg.spark.SparkCatalog
- SparkDataFile - Class in org.apache.iceberg.spark
- SparkDataFile(Types.StructType, StructType) - Constructor for class org.apache.iceberg.spark.SparkDataFile
- SparkDistributionAndOrderingUtil - Class in org.apache.iceberg.spark
- SparkExceptionUtil - Class in org.apache.iceberg.spark
- SparkFilters - Class in org.apache.iceberg.spark
- SparkMetadataColumn - Class in org.apache.iceberg.spark.source
- SparkMetadataColumn(String, DataType, boolean) - Constructor for class org.apache.iceberg.spark.source.SparkMetadataColumn
- SparkMicroBatchStream - Class in org.apache.iceberg.spark.source
- SparkOrcReader - Class in org.apache.iceberg.spark.data
-
Converts the OrcIterator, which returns ORC's VectorizedRowBatch, to a set of Spark's UnsafeRows.
- SparkOrcReader(Schema, TypeDescription) - Constructor for class org.apache.iceberg.spark.data.SparkOrcReader
- SparkOrcReader(Schema, TypeDescription, Map<Integer, ?>) - Constructor for class org.apache.iceberg.spark.data.SparkOrcReader
- SparkOrcValueReaders - Class in org.apache.iceberg.spark.data
- SparkOrcWriter - Class in org.apache.iceberg.spark.data
-
This class acts as an adaptor from an OrcFileAppender to a FileAppender<InternalRow>.
- SparkOrcWriter(Schema, TypeDescription) - Constructor for class org.apache.iceberg.spark.data.SparkOrcWriter
- SparkParquetReaders - Class in org.apache.iceberg.spark.data
- SparkParquetWriters - Class in org.apache.iceberg.spark.data
- SparkPartition(Map<String, String>, String, String) - Constructor for class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- SparkPartitionedFanoutWriter - Class in org.apache.iceberg.spark.source
- SparkPartitionedFanoutWriter(PartitionSpec, FileFormat, FileAppenderFactory<InternalRow>, OutputFileFactory, FileIO, long, Schema, StructType) - Constructor for class org.apache.iceberg.spark.source.SparkPartitionedFanoutWriter
- SparkPartitionedWriter - Class in org.apache.iceberg.spark.source
- SparkPartitionedWriter(PartitionSpec, FileFormat, FileAppenderFactory<InternalRow>, OutputFileFactory, FileIO, long, Schema, StructType) - Constructor for class org.apache.iceberg.spark.source.SparkPartitionedWriter
- SparkProcedures - Class in org.apache.iceberg.spark.procedures
- SparkProcedures.ProcedureBuilder - Interface in org.apache.iceberg.spark.procedures
- SparkReadConf - Class in org.apache.iceberg.spark
-
A class for common Iceberg configs for Spark reads.
- SparkReadConf(SparkSession, Table, Map<String, String>) - Constructor for class org.apache.iceberg.spark.SparkReadConf
- SparkReadOptions - Class in org.apache.iceberg.spark
-
Spark DF read options
- SparkScanBuilder - Class in org.apache.iceberg.spark.source
- SparkSchemaUtil - Class in org.apache.iceberg.spark
-
Helper methods for working with Spark/Hive metadata.
- SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces> - Class in org.apache.iceberg.spark
-
A Spark catalog that can also load non-Iceberg tables.
- SparkSessionCatalog() - Constructor for class org.apache.iceberg.spark.SparkSessionCatalog
- SparkSQLProperties - Class in org.apache.iceberg.spark
- SparkStructLike - Class in org.apache.iceberg.spark
- SparkStructLike(Types.StructType) - Constructor for class org.apache.iceberg.spark.SparkStructLike
- SparkTable - Class in org.apache.iceberg.spark.source
- SparkTable(Table, boolean) - Constructor for class org.apache.iceberg.spark.source.SparkTable
- SparkTable(Table, Long, boolean) - Constructor for class org.apache.iceberg.spark.source.SparkTable
- SparkTableUtil - Class in org.apache.iceberg.spark
-
Java version of the original SparkTableUtil.scala https://github.com/apache/iceberg/blob/apache-iceberg-0.8.0-incubating/spark/src/main/scala/org/apache/iceberg/spark/SparkTableUtil.scala
- SparkTableUtil.SparkPartition - Class in org.apache.iceberg.spark
-
Class representing a table partition.
- SparkUtil - Class in org.apache.iceberg.spark
- SparkValueConverter - Class in org.apache.iceberg.spark
-
A utility class that converts Spark values to Iceberg's internal representation.
- SparkValueReaders - Class in org.apache.iceberg.spark.data
- SparkValueWriters - Class in org.apache.iceberg.spark.data
- SparkWriteConf - Class in org.apache.iceberg.spark
-
A class for common Iceberg configs for Spark writes.
- SparkWriteConf(SparkSession, Table, Map<String, String>) - Constructor for class org.apache.iceberg.spark.SparkWriteConf
- SparkWriteOptions - Class in org.apache.iceberg.spark
-
Spark DF write options
- spec() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- spec() - Method in class org.apache.iceberg.BaseTable
- spec() - Method in interface org.apache.iceberg.FileScanTask
-
The spec used to store this file.
- spec() - Method in class org.apache.iceberg.io.BaseTaskWriter
- spec() - Method in class org.apache.iceberg.ManifestReader
- spec() - Method in class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- spec() - Method in class org.apache.iceberg.MetadataUpdate.AddSortOrder
- spec() - Method in class org.apache.iceberg.SerializableTable
- spec() - Method in interface org.apache.iceberg.Table
-
Return the partition spec for this table.
- spec() - Method in class org.apache.iceberg.TableMetadata
- spec(int) - Method in class org.apache.iceberg.TableMetadata
- spec(Schema, List<FieldSchema>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive partition columns to Iceberg identity partition specification.
- SPEC_ID - Static variable in interface org.apache.iceberg.DataFile
- SPEC_ID - Static variable in interface org.apache.iceberg.ManifestFile
- SPEC_ID - Static variable in class org.apache.iceberg.MetadataColumns
- specForTable(SparkSession, String) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Returns a PartitionSpec for the given table.
- specId() - Method in interface org.apache.iceberg.ContentFile
-
Returns the ID of the partition spec used for partition metadata.
- specId() - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultPartitionSpec
- specId() - Method in class org.apache.iceberg.PartitionSpec
-
Returns the ID of this spec.
- specId() - Method in class org.apache.iceberg.spark.SparkDataFile
- specId(int) - Method in interface org.apache.iceberg.actions.RewriteManifests
-
Rewrites manifests for a given spec id.
- specId(int) - Method in class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- specs() - Method in class org.apache.iceberg.BaseTable
- specs() - Method in class org.apache.iceberg.SerializableTable
- specs() - Method in interface org.apache.iceberg.Table
-
Return a map of partition specs for this table.
- specs() - Method in class org.apache.iceberg.TableMetadata
- specsById() - Method in class org.apache.iceberg.TableMetadata
- specsById(Map<Integer, PartitionSpec>) - Method in class org.apache.iceberg.MicroBatches.MicroBatchBuilder
- split(long) - Method in interface org.apache.iceberg.FileScanTask
-
Splits this scan task into component scan tasks, each of splitSize size.
- split(long, long) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
-
Restricts the read to the given range: [start, end = start + length).
- split(long, long) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
-
Restricts the read to the given range: [start, start + length).
- split(long, long) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
-
Restricts the read to the given range: [start, start + length).
- SPLIT_LOOKBACK - Static variable in class org.apache.iceberg.TableProperties
- SPLIT_LOOKBACK_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- SPLIT_OFFSETS - Static variable in interface org.apache.iceberg.DataFile
- SPLIT_OPEN_FILE_COST - Static variable in class org.apache.iceberg.TableProperties
- SPLIT_OPEN_FILE_COST_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- SPLIT_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- SPLIT_SIZE - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- SPLIT_SIZE - Static variable in class org.apache.iceberg.TableProperties
- SPLIT_SIZE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- splitFiles(CloseableIterable<FileScanTask>, long) - Static method in class org.apache.iceberg.util.TableScanUtil
- splitId() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- splitLookback() - Method in class org.apache.iceberg.spark.SparkReadConf
- splitLookback() - Method in interface org.apache.iceberg.TableScan
-
Returns the split lookback for this scan.
- splitLookback(int) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Specify the number of "bins" considered when trying to pack the next file split into a task.
- splitLookback(Integer) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- splitLookbackOption() - Method in class org.apache.iceberg.spark.SparkReadConf
- splitOffsets() - Method in interface org.apache.iceberg.ContentFile
-
Returns a list of recommended split locations, if applicable, null otherwise.
- splitOffsets() - Method in interface org.apache.iceberg.DeleteFile
- splitOffsets() - Method in interface org.apache.iceberg.io.FileAppender
-
Returns a list of recommended split locations, if applicable, null otherwise.
- splitOffsets() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- splitOffsets() - Method in class org.apache.iceberg.spark.SparkDataFile
- splitOpenFileCost() - Method in class org.apache.iceberg.spark.SparkReadConf
- splitOpenFileCost() - Method in interface org.apache.iceberg.TableScan
-
Returns the split open file cost for this scan.
- splitOpenFileCost(long) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Specify the minimum file size to count when packing files into one "bin".
- splitOpenFileCost(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- splitOpenFileCostOption() - Method in class org.apache.iceberg.spark.SparkReadConf
- splitSize() - Method in class org.apache.iceberg.spark.SparkReadConf
- splitSize(long) - Method in class org.apache.iceberg.actions.BinPackStrategy
-
Returns the smaller of our max write file threshold and our estimated split size based on the number of output files we want to generate.
- splitSize(long) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- splitSize(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- splitSizeOption() - Method in class org.apache.iceberg.spark.SparkReadConf
- stageCreate(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- stageCreate(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- stageCreateOrReplace(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- stageCreateOrReplace(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- STAGED_WAP_ID_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- StagedSparkTable - Class in org.apache.iceberg.spark.source
- StagedSparkTable(Transaction) - Constructor for class org.apache.iceberg.spark.source.StagedSparkTable
- stagedWapId(Snapshot) - Static method in class org.apache.iceberg.util.WapUtil
- stageOnly() - Method in interface org.apache.iceberg.SnapshotUpdate
-
Called to stage a snapshot in table metadata, but not update the current snapshot id.
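A minimal sketch of SnapshotUpdate.stageOnly from the entry above, staging an append for a later audit-and-publish step; the Table and DataFile are assumed to come from elsewhere.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;

    class StageOnlyExample {
      // Writes the snapshot into table metadata without advancing the current snapshot.
      static void stageAppend(Table table, DataFile file) {
        table.newAppend()
            .appendFile(file)
            .stageOnly()
            .commit();
      }
    }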
- stageReplace(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- stageReplace(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- stageRewrite(Table, String, Set<DataFile>) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
-
Called to persist the output of a rewrite action for a specific group.
- stageTasks(Table, String, List<FileScanTask>) - Method in class org.apache.iceberg.spark.FileScanTaskSetManager
- stagingLocation(String) - Method in interface org.apache.iceberg.actions.RewriteManifests
-
Passes a location where the staged manifests should be written.
- stagingLocation(String) - Method in class org.apache.iceberg.spark.actions.BaseRewriteManifestsSparkAction
- start() - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
-
Starts a single threaded executor service for handling file group commits.
- start() - Method in interface org.apache.iceberg.FileScanTask
-
The starting position of this scan range in the file.
- START_SNAPSHOT_ID - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- startFileIndex() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- STARTS_WITH - org.apache.iceberg.expressions.Expression.Operation
- startSnapshotId() - Method in class org.apache.iceberg.spark.SparkReadConf
- startSnapshotId(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- startsWith(String, String) - Static method in class org.apache.iceberg.expressions.Expressions
- startsWith(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- startsWith(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- startsWith(UnboundTerm<String>, String) - Static method in class org.apache.iceberg.expressions.Expressions
- statement() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- statement() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- StatementContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- StatementContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- StaticTableOperations - Class in org.apache.iceberg
-
TableOperations implementation that provides access to metadata for a Table at some point in time, using a table metadata location.
- StaticTableOperations(String, FileIO) - Constructor for class org.apache.iceberg.StaticTableOperations
-
Creates a StaticTableOperations tied to a specific static version of the TableMetadata
- StaticTableOperations(String, FileIO, LocationProvider) - Constructor for class org.apache.iceberg.StaticTableOperations
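A sketch of using the StaticTableOperations constructors above to open a read-only table pinned to one metadata file; wrapping the operations in a BaseTable is an assumption about typical usage, and the display name passed to BaseTable is arbitrary.

    import org.apache.iceberg.BaseTable;
    import org.apache.iceberg.StaticTableOperations;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.io.FileIO;

    class StaticTableExample {
      // Loads a table view frozen at the given metadata.json location.
      static Table loadPinned(FileIO io, String metadataLocation) {
        StaticTableOperations ops = new StaticTableOperations(metadataLocation, io);
        return new BaseTable(ops, metadataLocation);
      }
    }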
- stop() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- stopAbortsOnFailure() - Method in class org.apache.iceberg.util.Tasks.Builder
- stopOnFailure() - Method in class org.apache.iceberg.util.Tasks.Builder
- stopRetryOn(Class<? extends Exception>...) - Method in class org.apache.iceberg.util.Tasks.Builder
- stopRevertsOnFailure() - Method in class org.apache.iceberg.util.Tasks.Builder
- STREAM_FROM_TIMESTAMP - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- streamFromTimestamp() - Method in class org.apache.iceberg.spark.SparkReadConf
- streaming(boolean) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- STREAMING_SKIP_DELETE_SNAPSHOTS - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- STREAMING_SKIP_DELETE_SNAPSHOTS_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- streamingFilter(CloseableIterable<T>, Function<T, Long>, CloseableIterable<Long>) - Static method in class org.apache.iceberg.deletes.Deletes
- StreamingMonitorFunction - Class in org.apache.iceberg.flink.source
-
This is the single (non-parallel) monitoring task which takes a FlinkInputFormat; it is responsible for monitoring snapshots of the Iceberg table, creating the splits corresponding to the incremental files, and assigning them to downstream tasks for further processing.
- StreamingMonitorFunction(TableLoader, ScanContext) - Constructor for class org.apache.iceberg.flink.source.StreamingMonitorFunction
- StreamingReaderOperator - Class in org.apache.iceberg.flink.source
-
The operator that reads the splits received from the preceding StreamingMonitorFunction.
- streamingSkipDeleteSnapshots() - Method in class org.apache.iceberg.spark.SparkReadConf
- strict(PartitionSpec) - Static method in class org.apache.iceberg.expressions.Projections
-
Creates a strict ProjectionEvaluator for the spec, defaulting to case sensitive mode.
- strict(PartitionSpec, boolean) - Static method in class org.apache.iceberg.expressions.Projections
-
Creates a strict ProjectionEvaluator for the spec.
- StrictMetricsEvaluator - Class in org.apache.iceberg.expressions
-
Evaluates an Expression on a DataFile to test whether all rows in the file match.
- StrictMetricsEvaluator(Schema, Expression) - Constructor for class org.apache.iceberg.expressions.StrictMetricsEvaluator
- StrictMetricsEvaluator(Schema, Expression, boolean) - Constructor for class org.apache.iceberg.expressions.StrictMetricsEvaluator
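To connect the Projections.strict and StrictMetricsEvaluator entries above, a hedged sketch that strictly projects a row filter to a partition filter; the column name and literal are hypothetical.

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.expressions.Projections;

    class StrictProjectionExample {
      // Produces a partition predicate that matches only partitions in which
      // every row satisfies the original row-level filter.
      static Expression strictPartitionFilter(PartitionSpec spec) {
        Expression rowFilter = Expressions.greaterThanOrEqual("event_ts", "2021-01-01");
        return Projections.strict(spec).project(rowFilter);
      }
    }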
- STRING - org.apache.iceberg.types.Type.TypeID
- STRING - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- STRING - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- STRING() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- STRING() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- STRING(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- StringLiteralContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- stringMap() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- stringMap() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- StringMapContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- StringReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.StringReader
- strings() - Static method in class org.apache.iceberg.avro.ValueReaders
- strings() - Static method in class org.apache.iceberg.avro.ValueWriters
- strings() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- strings() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- strings(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- StringType() - Constructor for class org.apache.iceberg.types.Types.StringType
- struct() - Method in class org.apache.iceberg.data.GenericRecord
- struct() - Method in interface org.apache.iceberg.data.Record
- struct(List<OrcValueReader<?>>, Types.StructType, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- struct(RowType, GroupType, List<T>) - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- struct(Types.StructType, Integer, List<Boolean>) - Method in class org.apache.iceberg.schema.UnionByNameVisitor
- struct(Types.StructType, Iterable<List<String>>) - Method in class org.apache.iceberg.types.CheckCompatibility
- struct(Types.StructType, Iterable<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithoutReordering
- struct(Types.StructType, Iterable<Type>) - Method in class org.apache.iceberg.spark.PruneColumnsWithReordering
- struct(Types.StructType, Iterable<Type>) - Method in class org.apache.iceberg.types.FixupTypes
- struct(Types.StructType, Iterable<T>) - Method in class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
- struct(Types.StructType, List<String>) - Method in class org.apache.iceberg.spark.Spark3Util.DescribeSchemaVisitor
- struct(Types.StructType, List<Map<Integer, Integer>>) - Method in class org.apache.iceberg.types.IndexParents
- struct(Types.StructType, List<Map<String, Integer>>) - Method in class org.apache.iceberg.types.IndexByName
- struct(Types.StructType, List<ObjectInspector>) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- struct(Types.StructType, List<T>) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- struct(Types.StructType, GroupType, List<VectorizedReader<?>>) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedReaderBuilder
- struct(Types.StructType, GroupType, List<T>) - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- struct(Types.StructType, Type.Repetition, int, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- struct(Types.StructType, P, List<R>) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- struct(GroupType, List<Boolean>) - Method in class org.apache.iceberg.parquet.ParquetSchemaUtil.HasIds
- struct(GroupType, List<Type>) - Method in class org.apache.iceberg.parquet.RemoveIds
- struct(GroupType, List<T>) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- struct(StructType, GroupType, List<T>) - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- STRUCT - org.apache.iceberg.types.Type.TypeID
- StructLike - Interface in org.apache.iceberg
-
Interface for accessing data by position in a schema.
- StructLikeMap<T> - Class in org.apache.iceberg.util
- StructLikeSet - Class in org.apache.iceberg.util
- StructLikeWrapper - Class in org.apache.iceberg.util
-
Wrapper to adapt StructLike for use in maps and sets by implementing equals and hashCode.
- StructProjection - Class in org.apache.iceberg.util
- StructReader(List<ValueReader<?>>, Schema) - Constructor for class org.apache.iceberg.avro.ValueReaders.StructReader
- StructReader(List<ValueReader<?>>, Types.StructType, Map<Integer, ?>) - Constructor for class org.apache.iceberg.avro.ValueReaders.StructReader
- StructReader(List<OrcValueReader<?>>, Types.StructType, Map<Integer, ?>) - Constructor for class org.apache.iceberg.orc.OrcValueReaders.StructReader
- StructReader(List<Type>, List<ParquetValueReader<?>>) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- StructWriter(List<ValueWriter<?>>) - Constructor for class org.apache.iceberg.avro.ValueWriters.StructWriter
- StructWriter(List<OrcValueWriter<?>>) - Constructor for class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- StructWriter(List<ParquetValueWriter<?>>) - Constructor for class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- subSequence(int, int) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- SUCCESS - org.apache.iceberg.BaseMetastoreTableOperations.CommitStatus
- summary() - Method in class org.apache.iceberg.BaseRewriteManifests
- summary() - Method in class org.apache.iceberg.events.CreateSnapshotEvent
- summary() - Method in interface org.apache.iceberg.Snapshot
-
Return a string map of summary data for the operation that produced this snapshot.
- supportedProperties() - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- SupportsDelta - Interface in org.apache.spark.sql.connector.iceberg.write
-
A mix-in interface for RowLevelOperation.
- supportsExternalMetadata() - Method in class org.apache.iceberg.spark.source.IcebergSource
- SupportsNamespaces - Interface in org.apache.iceberg.catalog
-
Catalog methods for working with namespaces.
- supportsNestedProjection() - Method in class org.apache.iceberg.flink.IcebergTableSource
- SupportsRowLevelOperations - Interface in org.apache.spark.sql.connector.iceberg.catalog
-
A mix-in interface for row-level operations support.
- SupportsRowPosition - Interface in org.apache.iceberg.avro
-
Interface for readers that accept a callback to determine the starting row position of an Avro split.
- suppressAndThrow(E, Runnable) - Static method in class org.apache.iceberg.util.Exceptions
- suppressExceptions(E, Runnable) - Static method in class org.apache.iceberg.util.Exceptions
- suppressFailureWhenFinished() - Method in class org.apache.iceberg.util.Tasks.Builder
- SystemProperties - Class in org.apache.iceberg
-
Configuration properties that are controlled by Java system properties.
T
- T__0 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- T__0 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- T__1 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- T__1 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- T__2 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- T__2 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- T__3 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- T__3 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- T__4 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- T__4 - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- table() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- table() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Returns the table being modified by this convert strategy
- table() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Returns the table being modified by this rewrite strategy
- table() - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Returns the table being modified by this rewrite strategy
- table() - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- table() - Method in class org.apache.iceberg.spark.actions.Spark3BinPackStrategy
- table() - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- table() - Method in class org.apache.iceberg.spark.source.SparkTable
- table() - Method in interface org.apache.iceberg.TableScan
-
Returns the Table from which this scan loads data.
- table() - Method in interface org.apache.iceberg.Transaction
-
Return the Table that this transaction will update.
- table(Configuration, String) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the Table serialized to the configuration based on the table name.
- table(Table) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
- table(Table) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- Table - Interface in org.apache.iceberg
-
Represents a table.
- TABLE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- TABLE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- TABLE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- TABLE_CATALOG_PREFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- TABLE_EXEC_ICEBERG_INFER_SOURCE_PARALLELISM - Static variable in class org.apache.iceberg.flink.FlinkConfigOptions
- TABLE_EXEC_ICEBERG_INFER_SOURCE_PARALLELISM_MAX - Static variable in class org.apache.iceberg.flink.FlinkConfigOptions
- TABLE_IDENTIFIER - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- TABLE_LOCATION - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- TABLE_SCHEMA - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- TABLE_TYPE_PROP - Static variable in class org.apache.iceberg.BaseMetastoreTableOperations
- tableCache - Variable in class org.apache.iceberg.CachingCatalog
- tableExists(ObjectPath) - Method in class org.apache.iceberg.flink.FlinkCatalog
- tableExists(TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Check whether table exists.
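A short sketch of Catalog.tableExists from the entry above; the database and table names are placeholders.

    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;

    class TableExistsExample {
      // Checks for a table before deciding whether to create or load it.
      static boolean exists(Catalog catalog) {
        return catalog.tableExists(TableIdentifier.of("db", "events")); // placeholder identifier
      }
    }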
- TableIdentifier - Class in org.apache.iceberg.catalog
-
Identifies a table in an Iceberg catalog.
- tableLoader(TableLoader) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
The table loader is used for loading tables in IcebergFilesCommitter lazily; we need this loader because Table is not serializable, so the loaded table from Builder#table cannot simply be used in the remote task manager.
- tableLoader(TableLoader) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- TableLoader - Interface in org.apache.iceberg.flink
-
Serializable loader to load an Iceberg Table.
- TableLoader.CatalogTableLoader - Class in org.apache.iceberg.flink
- TableLoader.HadoopTableLoader - Class in org.apache.iceberg.flink
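As a hedged sketch of the Flink TableLoader above, the snippet opens a loader for a path-based (HadoopTables) table and loads it; the warehouse path is an assumption.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.flink.TableLoader;

    class TableLoaderExample {
      // Opens a serializable loader and loads the table; open() initializes
      // the underlying catalog or file IO before loadTable() is called.
      static Table load(String tableLocation) { // e.g. "hdfs://nn/warehouse/db/t" (placeholder)
        TableLoader loader = TableLoader.fromHadoopTable(tableLocation);
        loader.open();
        return loader.loadTable();
      }
    }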
- tableLocation(String) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets the table location for the newly created Iceberg table.
- tableLocation(String) - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
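A hedged sketch of the snapshot action configured with tableLocation as listed above; the source and destination identifiers and the staging location are placeholders, and the as(...) call for naming the new table is assumed from the SnapshotTable API.

    import org.apache.iceberg.actions.SnapshotTable;
    import org.apache.iceberg.spark.actions.SparkActions;

    class SnapshotTableExample {
      // Creates an independent Iceberg snapshot of an existing Spark table.
      static SnapshotTable.Result snapshotSource() {
        return SparkActions.get()
            .snapshotTable("db.source_table")                   // placeholder source
            .as("db.source_table_snapshot")                     // placeholder destination
            .tableLocation("s3://bucket/tmp/snapshot_location") // placeholder location
            .execute();
      }
    }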
- TableMetadata - Class in org.apache.iceberg
-
Metadata for a table.
- TableMetadata.Builder - Class in org.apache.iceberg
- TableMetadata.MetadataLogEntry - Class in org.apache.iceberg
- TableMetadata.SnapshotLogEntry - Class in org.apache.iceberg
- TableMetadataParser - Class in org.apache.iceberg
- TableMetadataParser.Codec - Enum in org.apache.iceberg
- TableMigrationUtil - Class in org.apache.iceberg.data
- tableName() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
-
The full name of the table used for logging purposes only.
- tableName() - Method in class org.apache.iceberg.events.CreateSnapshotEvent
- tableName() - Method in class org.apache.iceberg.events.IncrementalScanEvent
- tableName() - Method in class org.apache.iceberg.events.ScanEvent
- tableName() - Method in class org.apache.iceberg.hive.HiveTableOperations
- tableName() - Method in class org.apache.iceberg.nessie.NessieTableOperations
- TableOperations - Interface in org.apache.iceberg
-
SPI interface to abstract table metadata access and updates.
- tableProperties(Map<String, String>) - Method in interface org.apache.iceberg.actions.MigrateTable
-
Sets table properties in the newly created Iceberg table.
- tableProperties(Map<String, String>) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets table properties in the newly created Iceberg table.
- tableProperties(Map<String, String>) - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- tableProperties(Map<String, String>) - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- TableProperties - Class in org.apache.iceberg
- tableProperty(String, String) - Method in interface org.apache.iceberg.actions.MigrateTable
-
Sets a table property in the newly created Iceberg table.
- tableProperty(String, String) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets a table property in the newly created Iceberg table.
- tableProperty(String, String) - Method in class org.apache.iceberg.spark.actions.BaseMigrateTableSparkAction
- tableProperty(String, String) - Method in class org.apache.iceberg.spark.actions.BaseSnapshotTableSparkAction
- Tables - Interface in org.apache.iceberg
-
Generic interface for creating and loading a table implementation.
- TableScan - Interface in org.apache.iceberg
-
API for configuring a table scan.
- TableScanUtil - Class in org.apache.iceberg.util
- tableSchema(TableSchema) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
- tableSchema(Configuration) - Static method in class org.apache.iceberg.mr.InputFormatConfig
- tableType() - Method in class org.apache.iceberg.AllDataFilesTable.AllDataFilesTableScan
- tableType() - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- TARGET_DELETE_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- TARGET_FILE_SIZE_BYTES - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
The output file size that this rewrite strategy will attempt to generate when rewriting files.
- TARGET_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- targetDataFileSize() - Method in class org.apache.iceberg.spark.SparkWriteConf
- targetDeleteFileSize() - Method in class org.apache.iceberg.spark.SparkWriteConf
- targetFileSize() - Method in class org.apache.iceberg.actions.BinPackStrategy
- targetSizeInBytes(long) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Specify the target rewrite data file size in bytes
- targetSplitSize() - Method in class org.apache.iceberg.DataTableScan
- targetSplitSize() - Method in interface org.apache.iceberg.TableScan
-
Returns the target split size for this scan.
- task() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- task() - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- task(TableScan) - Method in class org.apache.iceberg.ManifestsTable
- taskAttemptWrapper(JobConf) - Static method in class org.apache.iceberg.mr.hive.TezUtil
- taskAttemptWrapper(TaskAttemptID) - Static method in class org.apache.iceberg.mr.hive.TezUtil
- tasks() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- Tasks - Class in org.apache.iceberg.util
- Tasks.Builder<I> - Class in org.apache.iceberg.util
- Tasks.FailureTask<I,E extends java.lang.Exception> - Interface in org.apache.iceberg.util
- Tasks.Task<I,E extends java.lang.Exception> - Interface in org.apache.iceberg.util
- Tasks.UnrecoverableException - Exception in org.apache.iceberg.util
- TaskWriter<T> - Interface in org.apache.iceberg.io
-
A writer interface that accepts records and provides the generated data files.
- TaskWriterFactory<T> - Interface in org.apache.iceberg.flink.sink
-
Factory to create TaskWriter.
- temp(TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- temp(TableMetadata) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- temp(TableMetadata) - Method in interface org.apache.iceberg.TableOperations
-
Return a temporary TableOperations instance that uses configuration from uncommitted metadata.
- term() - Method in class org.apache.iceberg.expressions.Predicate
- Term - Interface in org.apache.iceberg.expressions
-
An expression that evaluates to a value.
- test(StructLike) - Method in class org.apache.iceberg.expressions.BoundPredicate
- test(T) - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- test(T) - Method in class org.apache.iceberg.expressions.BoundPredicate
- test(T) - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- test(T) - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- TezUtil - Class in org.apache.iceberg.mr.hive
- ThreadPools - Class in org.apache.iceberg.util
- throwFailureWhenFinished() - Method in class org.apache.iceberg.util.Tasks.Builder
- throwFailureWhenFinished(boolean) - Method in class org.apache.iceberg.util.Tasks.Builder
- throwIfLimited() - Method in class org.apache.iceberg.ScanSummary.Builder
- TIME - org.apache.iceberg.orc.ORCSchemaUtil.LongType
- TIME - org.apache.iceberg.types.Type.TypeID
- timeFromMicros(long) - Static method in class org.apache.iceberg.util.DateTimeUtil
- times() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- times() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- timestamp() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- TIMESTAMP - org.apache.iceberg.types.Type.TypeID
- TIMESTAMP_INSPECTOR - Static variable in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- TIMESTAMP_INSPECTOR_WITH_TZ - Static variable in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- TIMESTAMP_WITHOUT_TIMEZONE_ERROR - Static variable in class org.apache.iceberg.spark.SparkUtil
- timestampFromMicros(long) - Static method in class org.apache.iceberg.util.DateTimeUtil
- timestampMillis() - Method in interface org.apache.iceberg.HistoryEntry
-
Returns the timestamp in milliseconds of the change.
- timestampMillis() - Method in interface org.apache.iceberg.Snapshot
-
Return this snapshot's timestamp.
- timestampMillis() - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- timestampMillis() - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- timestampMillisBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- TimestampMillisBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.TimestampMillisBatchReader
- timestampMillisDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- timestamps() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- timestampTz() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- timestamptzFromMicros(long) - Static method in class org.apache.iceberg.util.DateTimeUtil
- timestampTzs() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- timestampTzs() - Static method in class org.apache.iceberg.spark.data.SparkOrcValueReaders
- TINYINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- TINYINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- TINYINT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- TinyIntLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- tinyints() - Static method in class org.apache.iceberg.avro.ValueWriters
- tinyints(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- to(Type) - Method in interface org.apache.iceberg.expressions.Literal
-
Converts this literal to a literal of the given type.
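As a sketch of the conversion above, a string literal can be converted to the literal type expected by a column; the values here are illustrative:

    import org.apache.iceberg.expressions.Literal;
    import org.apache.iceberg.types.Types;

    Literal<CharSequence> raw = Literal.of("2021-06-30");
    // Converts the string to a date literal; Iceberg stores dates as days from 1970-01-01.
    Literal<Integer> date = raw.to(Types.DateType.get());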
- toArray() - Method in class org.apache.iceberg.util.CharSequenceSet
- toArray() - Method in class org.apache.iceberg.util.PartitionSet
- toArray() - Method in class org.apache.iceberg.util.StructLikeSet
- toArray(T[]) - Method in class org.apache.iceberg.util.CharSequenceSet
- toArray(T[]) - Method in class org.apache.iceberg.util.PartitionSet
- toArray(T[]) - Method in class org.apache.iceberg.util.StructLikeSet
- toBatch() - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWrite
- toByteArray(ByteBuffer) - Static method in class org.apache.iceberg.util.ByteBuffers
- toByteBuffer() - Method in interface org.apache.iceberg.expressions.Literal
-
Serializes the value wrapped by this literal to binary using the single-value serialization format described in the Iceberg table specification.
- toByteBuffer(Type.TypeID, Object) - Static method in class org.apache.iceberg.types.Conversions
- toByteBuffer(Type, Object) - Static method in class org.apache.iceberg.types.Conversions
- toDataFile() - Method in class org.apache.iceberg.io.DataWriter
- toDeleteFile() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- toDeleteFile() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- toEqualitySet(CloseableIterable<StructLike>, Types.StructType) - Static method in class org.apache.iceberg.deletes.Deletes
- toFixed(UUID, Schema, LogicalType) - Method in class org.apache.iceberg.avro.UUIDConversion
- toHumanString(T) - Method in interface org.apache.iceberg.transforms.Transform
-
Returns a human-readable String representation of a transformed value.
- toIceberg(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- toIcebergTable(Table) - Static method in class org.apache.iceberg.spark.Spark3Util
- toIcebergTerm(Expression) - Static method in class org.apache.iceberg.spark.Spark3Util
- toInputFile() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
- toInputFile() - Method in class org.apache.iceberg.gcp.gcs.GCSOutputFile
- toInputFile() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- toInputFile() - Method in interface org.apache.iceberg.io.OutputFile
-
Return an InputFile for the location of this output file.
- toIntArray(List<Integer>) - Static method in class org.apache.iceberg.util.ArrayUtil
- toIntList(int[]) - Static method in class org.apache.iceberg.util.ArrayUtil
- toJson(NameMapping) - Static method in class org.apache.iceberg.mapping.NameMappingParser
- toJson(PartitionSpec) - Static method in class org.apache.iceberg.PartitionSpecParser
- toJson(PartitionSpec, boolean) - Static method in class org.apache.iceberg.PartitionSpecParser
- toJson(PartitionSpec, JsonGenerator) - Static method in class org.apache.iceberg.PartitionSpecParser
- toJson(Schema) - Static method in class org.apache.iceberg.SchemaParser
- toJson(Schema, boolean) - Static method in class org.apache.iceberg.SchemaParser
- toJson(Schema, JsonGenerator) - Static method in class org.apache.iceberg.SchemaParser
- toJson(Snapshot) - Static method in class org.apache.iceberg.SnapshotParser
- toJson(SortOrder) - Static method in class org.apache.iceberg.SortOrderParser
- toJson(SortOrder, boolean) - Static method in class org.apache.iceberg.SortOrderParser
- toJson(SortOrder, JsonGenerator) - Static method in class org.apache.iceberg.SortOrderParser
- toJson(TableMetadata) - Static method in class org.apache.iceberg.TableMetadataParser
- tokenNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
-
Deprecated. Use IcebergSqlExtensionsLexer.VOCABULARY instead.
- tokenNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
-
Deprecated. Use IcebergSqlExtensionsParser.VOCABULARY instead.
- toLongArray(List<Long>) - Static method in class org.apache.iceberg.util.ArrayUtil
- toLongList(long[]) - Static method in class org.apache.iceberg.util.ArrayUtil
- toLowerCase() - Method in class org.apache.iceberg.catalog.TableIdentifier
- toManifestFile() - Method in class org.apache.iceberg.ManifestWriter
- toNamedReference(String) - Static method in class org.apache.iceberg.spark.Spark3Util
- toPartitionSpec(Schema, Transform[]) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Converts Spark transforms into a PartitionSpec.
- toPath() - Method in class org.apache.iceberg.PartitionKey
- toPosition(Accessor<StructLike>) - Static method in class org.apache.iceberg.Accessors
- toPositionIndex(CharSequence, List<CloseableIterable<T>>) - Static method in class org.apache.iceberg.deletes.Deletes
- toPositionIndex(CloseableIterable<Long>) - Static method in class org.apache.iceberg.deletes.Deletes
- toPrimitive(Boolean[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Booleans to primitives.
- toPrimitive(Byte[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Bytes to primitives.
- toPrimitive(Double[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Doubles to primitives.
- toPrimitive(Float[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Floats to primitives.
- toPrimitive(Integer[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Integers to primitives.
- toPrimitive(Long[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Longs to primitives.
- toPrimitive(Short[]) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Converts an array of object Shorts to primitives.
- toReusedFixLengthBytes(int, int, BigDecimal, byte[]) - Static method in class org.apache.iceberg.util.DecimalUtil
-
Convert a BigDecimal to reused fixed-length bytes; the extra bytes are filled according to the signum.
- toSchema(RowType) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a RowType to a TableSchema.
- toSchema(Schema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Schema to a TableSchema.
- toSnapshotAtTime(long) - Method in interface org.apache.iceberg.Rollback
-
Deprecated. Replaced by ManageSnapshots.rollbackToTime(long).
- toSnapshotId() - Method in class org.apache.iceberg.events.IncrementalScanEvent
- toSnapshotId(long) - Method in interface org.apache.iceberg.Rollback
-
Deprecated. Replaced by ManageSnapshots.setCurrentSnapshot(long).
- toString() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- toString() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- toString() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
- toString() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- toString() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- toString() - Method in class org.apache.iceberg.BaseCombinedScanTask
- toString() - Method in class org.apache.iceberg.BaseMetastoreCatalog
- toString() - Method in class org.apache.iceberg.BaseTable
- toString() - Method in class org.apache.iceberg.catalog.Namespace
- toString() - Method in class org.apache.iceberg.catalog.TableIdentifier
- toString() - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- toString() - Method in class org.apache.iceberg.common.DynFields.UnboundField
- toString() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
- toString() - Method in class org.apache.iceberg.data.GenericRecord
- toString() - Method in class org.apache.iceberg.expressions.And
- toString() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- toString() - Method in class org.apache.iceberg.expressions.BoundReference
- toString() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- toString() - Method in class org.apache.iceberg.expressions.BoundTransform
- toString() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- toString() - Method in class org.apache.iceberg.expressions.False
- toString() - Method in class org.apache.iceberg.expressions.NamedReference
- toString() - Method in class org.apache.iceberg.expressions.Not
- toString() - Method in class org.apache.iceberg.expressions.Or
- toString() - Method in class org.apache.iceberg.expressions.True
- toString() - Method in class org.apache.iceberg.expressions.UnboundPredicate
- toString() - Method in class org.apache.iceberg.expressions.UnboundTransform
- toString() - Method in class org.apache.iceberg.flink.CatalogLoader.CustomCatalogLoader
- toString() - Method in class org.apache.iceberg.flink.CatalogLoader.HadoopCatalogLoader
- toString() - Method in class org.apache.iceberg.flink.CatalogLoader.HiveCatalogLoader
- toString() - Method in class org.apache.iceberg.flink.source.FlinkInputSplit
- toString() - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- toString() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- toString() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- toString() - Method in class org.apache.iceberg.GenericManifestFile
- toString() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- toString() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- toString() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- toString() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- toString() - Method in class org.apache.iceberg.hive.HiveCatalog
- toString() - Method in class org.apache.iceberg.mapping.MappedField
- toString() - Method in class org.apache.iceberg.mapping.MappedFields
- toString() - Method in class org.apache.iceberg.mapping.NameMapping
- toString() - Method in class org.apache.iceberg.MetricsModes.Counts
- toString() - Method in class org.apache.iceberg.MetricsModes.Full
- toString() - Method in class org.apache.iceberg.MetricsModes.None
- toString() - Method in class org.apache.iceberg.MetricsModes.Truncate
- toString() - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- toString() - Method in enum org.apache.iceberg.NullOrder
- toString() - Method in class org.apache.iceberg.PartitionField
- toString() - Method in class org.apache.iceberg.PartitionKey
- toString() - Method in class org.apache.iceberg.PartitionSpec
- toString() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- toString() - Method in class org.apache.iceberg.Schema
- toString() - Method in class org.apache.iceberg.SortField
- toString() - Method in class org.apache.iceberg.SortOrder
- toString() - Method in class org.apache.iceberg.spark.source.SparkTable
- toString() - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- toString() - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- toString() - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- toString() - Method in class org.apache.iceberg.transforms.UnknownTransform
- toString() - Method in class org.apache.iceberg.types.Types.BinaryType
- toString() - Method in class org.apache.iceberg.types.Types.BooleanType
- toString() - Method in class org.apache.iceberg.types.Types.DateType
- toString() - Method in class org.apache.iceberg.types.Types.DecimalType
- toString() - Method in class org.apache.iceberg.types.Types.DoubleType
- toString() - Method in class org.apache.iceberg.types.Types.FixedType
- toString() - Method in class org.apache.iceberg.types.Types.FloatType
- toString() - Method in class org.apache.iceberg.types.Types.IntegerType
- toString() - Method in class org.apache.iceberg.types.Types.ListType
- toString() - Method in class org.apache.iceberg.types.Types.LongType
- toString() - Method in class org.apache.iceberg.types.Types.MapType
- toString() - Method in class org.apache.iceberg.types.Types.NestedField
- toString() - Method in class org.apache.iceberg.types.Types.StringType
- toString() - Method in class org.apache.iceberg.types.Types.StructType
- toString() - Method in class org.apache.iceberg.types.Types.TimestampType
- toString() - Method in class org.apache.iceberg.types.Types.TimeType
- toString() - Method in class org.apache.iceberg.types.Types.UUIDType
- toString() - Method in class org.apache.iceberg.util.CharSequenceWrapper
- toString() - Method in class org.apache.iceberg.util.Pair
- TOTAL_DATA_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- TOTAL_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- TOTAL_EQ_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- TOTAL_FILE_SIZE_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- TOTAL_POS_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- TOTAL_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- totalSize() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- toTransforms(PartitionSpec) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Converts a PartitionSpec to Spark transforms.
- toTypeInfo(RowType) - Static method in class org.apache.iceberg.flink.util.FlinkCompatibilityUtil
- toUncheckedException(Throwable, String, Object...) - Static method in class org.apache.iceberg.spark.SparkExceptionUtil
-
Converts checked exceptions to unchecked exceptions.
- toV1TableIdentifier(Identifier) - Static method in class org.apache.iceberg.spark.Spark3Util
- Transaction - Interface in org.apache.iceberg
-
A transaction for performing multiple updates to a table.
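A minimal sketch of how a Transaction is typically used, assuming an existing Table and a prepared DataFile (both hypothetical here):

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.Transaction;

    Transaction txn = table.newTransaction();
    txn.updateProperties()
        .set("commit.retry.num-retries", "5")
        .commit();                      // stages the property change in the transaction
    txn.newAppend()
        .appendFile(dataFile)           // stages a file append in the same transaction
        .commit();
    txn.commitTransaction();            // applies both changes as a single table commit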
- Transactions - Class in org.apache.iceberg
- transform() - Method in class org.apache.iceberg.expressions.BoundTransform
- transform() - Method in class org.apache.iceberg.expressions.UnboundTransform
- transform() - Method in class org.apache.iceberg.PartitionField
-
Returns the transform used to produce partition values from source values.
- transform() - Method in class org.apache.iceberg.SortField
-
Returns the transform used to produce sort values from source values.
- transform() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- transform() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- transform() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- transform() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- transform() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- transform(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- transform(String, Transform<?, T>) - Static method in class org.apache.iceberg.expressions.Expressions
-
Constructs a transform expression for a given column.
- transform(CloseableIterable<I>, Function<I, O>) - Static method in interface org.apache.iceberg.io.CloseableIterable
- transform(CloseableIterator<I>, Function<I, O>) - Static method in interface org.apache.iceberg.io.CloseableIterator
- Transform<S,T> - Interface in org.apache.iceberg.transforms
-
A transform function used for partitioning.
- transformArgument - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- transformArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- transformArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- transformArgument(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- TransformArgumentContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- TransformContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- TransformContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- transformName - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- transformPaths(Function<CharSequence, ?>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- transformPaths(Function<CharSequence, ?>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- Transforms - Class in org.apache.iceberg.transforms
-
Factory methods for transforms.
- translateToIcebergProp(String) - Static method in class org.apache.iceberg.hive.HiveTableOperations
-
Provides key translation where necessary between Iceberg and HMS props.
- triplesCount - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- triplesCount - Variable in class org.apache.iceberg.parquet.BasePageIterator
- triplesRead - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- triplesRead - Variable in class org.apache.iceberg.parquet.BasePageIterator
- TripleWriter<T> - Interface in org.apache.iceberg.parquet
- True - Class in org.apache.iceberg.expressions
-
An expression that is always true.
- TRUE - org.apache.iceberg.expressions.Expression.Operation
- TRUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- TRUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- TRUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- TRUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- truncate(int, String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- truncate(String, int) - Static method in class org.apache.iceberg.expressions.Expressions
- truncate(String, int) - Method in class org.apache.iceberg.PartitionSpec.Builder
- truncate(String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- truncate(String, int, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- truncate(String, int, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
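A brief sketch of the PartitionSpec builder form of truncate listed above, assuming a Schema with a string column named last_name:

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;

    // Partition by the first two characters of last_name.
    PartitionSpec spec = PartitionSpec.builderFor(schema)
        .truncate("last_name", 2)
        .build();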
- truncate(Type, int) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a truncate Transform for the given type and width.
- truncateBinary(ByteBuffer, int) - Static method in class org.apache.iceberg.util.BinaryUtil
-
Truncates the input byte buffer to the given length.
- truncateBinaryMax(Literal<ByteBuffer>, int) - Static method in class org.apache.iceberg.util.BinaryUtil
-
Returns a byte buffer whose length is less than or equal to truncateLength and which is greater than the given input.
- truncateBinaryMin(Literal<ByteBuffer>, int) - Static method in class org.apache.iceberg.util.BinaryUtil
-
Returns a byte buffer whose length is less than or equal to truncateLength and which is lower than the given input.
- truncateString(CharSequence, int) - Static method in class org.apache.iceberg.util.UnicodeUtil
-
Truncates the input charSequence such that the truncated charSequence is a valid Unicode string and the number of Unicode characters in the truncated charSequence is less than or equal to length.
- truncateStringMax(Literal<CharSequence>, int) - Static method in class org.apache.iceberg.util.UnicodeUtil
-
Returns a valid Unicode charSequence that is greater than the given input, such that the number of Unicode characters in the truncated charSequence is less than or equal to length.
- truncateStringMin(Literal<CharSequence>, int) - Static method in class org.apache.iceberg.util.UnicodeUtil
-
Returns a valid Unicode charSequence that is lower than the given input, such that the number of Unicode characters in the truncated charSequence is less than or equal to length.
- type() - Method in interface org.apache.iceberg.Accessor
- type() - Method in class org.apache.iceberg.expressions.BoundReference
- type() - Method in interface org.apache.iceberg.expressions.BoundTerm
-
Returns the type produced by this expression.
- type() - Method in class org.apache.iceberg.expressions.BoundTransform
- type() - Method in class org.apache.iceberg.types.Types.NestedField
- Type - Interface in org.apache.iceberg.types
- TYPE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- Type.NestedType - Class in org.apache.iceberg.types
- Type.PrimitiveType - Class in org.apache.iceberg.types
- Type.TypeID - Enum in org.apache.iceberg.types
- typeCompatibilityErrors(Schema, Schema) - Static method in class org.apache.iceberg.types.CheckCompatibility
-
Returns a list of compatibility errors for writing with the given write schema.
- typeCompatibilityErrors(Schema, Schema, boolean) - Static method in class org.apache.iceberg.types.CheckCompatibility
-
Returns a list of compatibility errors for writing with the given write schema.
- TypeConstructorContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- typeId() - Method in interface org.apache.iceberg.types.Type
- typeId() - Method in class org.apache.iceberg.types.Types.BinaryType
- typeId() - Method in class org.apache.iceberg.types.Types.BooleanType
- typeId() - Method in class org.apache.iceberg.types.Types.DateType
- typeId() - Method in class org.apache.iceberg.types.Types.DecimalType
- typeId() - Method in class org.apache.iceberg.types.Types.DoubleType
- typeId() - Method in class org.apache.iceberg.types.Types.FixedType
- typeId() - Method in class org.apache.iceberg.types.Types.FloatType
- typeId() - Method in class org.apache.iceberg.types.Types.IntegerType
- typeId() - Method in class org.apache.iceberg.types.Types.ListType
- typeId() - Method in class org.apache.iceberg.types.Types.LongType
- typeId() - Method in class org.apache.iceberg.types.Types.MapType
- typeId() - Method in class org.apache.iceberg.types.Types.StringType
- typeId() - Method in class org.apache.iceberg.types.Types.StructType
- typeId() - Method in class org.apache.iceberg.types.Types.TimestampType
- typeId() - Method in class org.apache.iceberg.types.Types.TimeType
- typeId() - Method in class org.apache.iceberg.types.Types.UUIDType
- Types - Class in org.apache.iceberg.types
- Types.BinaryType - Class in org.apache.iceberg.types
- Types.BooleanType - Class in org.apache.iceberg.types
- Types.DateType - Class in org.apache.iceberg.types
- Types.DecimalType - Class in org.apache.iceberg.types
- Types.DoubleType - Class in org.apache.iceberg.types
- Types.FixedType - Class in org.apache.iceberg.types
- Types.FloatType - Class in org.apache.iceberg.types
- Types.IntegerType - Class in org.apache.iceberg.types
- Types.ListType - Class in org.apache.iceberg.types
- Types.LongType - Class in org.apache.iceberg.types
- Types.MapType - Class in org.apache.iceberg.types
- Types.NestedField - Class in org.apache.iceberg.types
- Types.StringType - Class in org.apache.iceberg.types
- Types.StructType - Class in org.apache.iceberg.types
- Types.TimestampType - Class in org.apache.iceberg.types
- Types.TimeType - Class in org.apache.iceberg.types
- Types.UUIDType - Class in org.apache.iceberg.types
- TypeToMessageType - Class in org.apache.iceberg.parquet
- TypeToMessageType() - Constructor for class org.apache.iceberg.parquet.TypeToMessageType
- TypeUtil - Class in org.apache.iceberg.types
- TypeUtil.CustomOrderSchemaVisitor<T> - Class in org.apache.iceberg.types
- TypeUtil.NextID - Interface in org.apache.iceberg.types
-
Interface for passing a function that assigns column IDs.
- TypeUtil.SchemaVisitor<T> - Class in org.apache.iceberg.types
- TypeWithSchemaVisitor<T> - Class in org.apache.iceberg.parquet
-
Visitor for traversing a Parquet type with a companion Iceberg type.
- TypeWithSchemaVisitor() - Constructor for class org.apache.iceberg.parquet.TypeWithSchemaVisitor
U
- uidPrefix(String) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Set the uid prefix for FlinkSink operators.
- Unbound<T,B> - Interface in org.apache.iceberg.expressions
-
Represents an unbound expression node.
- UnboundPredicate<T> - Class in org.apache.iceberg.expressions
- UnboundTerm<T> - Interface in org.apache.iceberg.expressions
-
Represents an unbound term.
- UnboundTransform<S,T> - Class in org.apache.iceberg.expressions
- UnboxedReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.UnboxedReader
- UncheckedInterruptedException - Exception in org.apache.iceberg.jdbc
- UncheckedInterruptedException(String, Object...) - Constructor for exception org.apache.iceberg.jdbc.UncheckedInterruptedException
- UncheckedInterruptedException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.jdbc.UncheckedInterruptedException
- UncheckedSQLException - Exception in org.apache.iceberg.jdbc
- UncheckedSQLException(String, Object...) - Constructor for exception org.apache.iceberg.jdbc.UncheckedSQLException
- UncheckedSQLException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.jdbc.UncheckedSQLException
- UnicodeUtil - Class in org.apache.iceberg.util
- union(List<ValueReader<?>>) - Static method in class org.apache.iceberg.avro.ValueReaders
- union(Schema, List<Schema>) - Method in class org.apache.iceberg.avro.RemoveIds
- union(Schema, List<T>) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- union(Type, Schema, List<T>) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- union(P, Schema, List<T>) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- UnionByNameVisitor - Class in org.apache.iceberg.schema
-
Visitor class that accumulates the set of changes needed to evolve an existing schema into the union of the existing and a new schema.
- unionByNameWith(Schema) - Method in interface org.apache.iceberg.UpdateSchema
-
Applies all field additions and updates from the provided new schema to the existing schema in order to create a union schema.
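A minimal sketch of the union-by-name schema update above, assuming an existing Table and a new Schema named newSchema to merge in:

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;

    // Add any columns present in newSchema but missing from the table,
    // and update existing columns where the new schema requires it.
    table.updateSchema()
        .unionByNameWith(newSchema)
        .commit();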
- unknown(int, String, int, String) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- unknown(String, int, String, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- UNKNOWN - org.apache.iceberg.BaseMetastoreTableOperations.CommitStatus
- UnknownTransform<S,T> - Class in org.apache.iceberg.transforms
- UNORDERED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- UNORDERED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- UNORDERED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- UNORDERED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- unpartitioned() - Static method in class org.apache.iceberg.PartitionSpec
-
Returns a spec for unpartitioned tables.
- unpartitioned(Expression) - Static method in class org.apache.iceberg.expressions.ResidualEvaluator
-
Return a residual evaluator for an unpartitioned spec.
- UnpartitionedWriter<T> - Class in org.apache.iceberg.io
- UnpartitionedWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.UnpartitionedWriter
- UnquotedIdentifierContext(IcebergSqlExtensionsParser.IdentifierContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- UNRECOGNIZED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- UNRECOGNIZED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- UnrecoverableException(String) - Constructor for exception org.apache.iceberg.util.Tasks.UnrecoverableException
- UnrecoverableException(String, Throwable) - Constructor for exception org.apache.iceberg.util.Tasks.UnrecoverableException
- UnrecoverableException(Throwable) - Constructor for exception org.apache.iceberg.util.Tasks.UnrecoverableException
- unsignedByteArrays() - Static method in class org.apache.iceberg.types.Comparators
- unsignedBytes() - Static method in class org.apache.iceberg.types.Comparators
- unsorted() - Static method in class org.apache.iceberg.SortOrder
-
Returns a sort order for unsorted tables.
- update(NameMapping, Map<Integer, Types.NestedField>, Multimap<Integer, Types.NestedField>) - Static method in class org.apache.iceberg.mapping.MappingUtil
-
Update a name-based mapping using changes to a schema.
- update(T, T, T) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriter
-
Passes information for a row that must be updated together with the updated row.
- UPDATE - org.apache.spark.sql.connector.iceberg.write.RowLevelOperation.Command
- UPDATE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- UPDATE_ISOLATION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- UPDATE_ISOLATION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- UPDATE_MODE - Static variable in class org.apache.iceberg.TableProperties
- UPDATE_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- updateColumn(String, Type.PrimitiveType) - Method in interface org.apache.iceberg.UpdateSchema
-
Update a column in the schema to a new primitive type.
- updateColumn(String, Type.PrimitiveType, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Update a column in the schema to a new primitive type.
- updateColumnDoc(String, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Update the documentation string for a column in the schema.
- updated() - Method in class org.apache.iceberg.MetadataUpdate.SetProperties
- updateEvent() - Method in interface org.apache.iceberg.PendingUpdate
-
Generates an update event to notify about metadata changes.
- updateEvent() - Method in class org.apache.iceberg.SnapshotManager
- updateLocation() - Method in class org.apache.iceberg.BaseTable
- updateLocation() - Method in class org.apache.iceberg.SerializableTable
- updateLocation() - Method in interface org.apache.iceberg.Table
-
Create a new UpdateLocation to update table location and commit the changes.
- updateLocation() - Method in interface org.apache.iceberg.Transaction
-
Create a new UpdateLocation to update table location.
- updateLocation(String) - Method in class org.apache.iceberg.TableMetadata
- UpdateLocation - Interface in org.apache.iceberg
-
API for setting a table's base location.
- updatePartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.TableMetadata
- UpdatePartitionSpec - Interface in org.apache.iceberg
-
API for partition spec evolution.
- updatePosition(int, long) - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplit
- updateProperties() - Method in class org.apache.iceberg.BaseTable
- updateProperties() - Method in class org.apache.iceberg.SerializableTable
- updateProperties() - Method in interface org.apache.iceberg.Table
-
Create a new UpdateProperties to update table properties and commit the changes.
- updateProperties() - Method in interface org.apache.iceberg.Transaction
-
Create a new UpdateProperties to update table properties.
- UpdateProperties - Interface in org.apache.iceberg
-
API for updating table properties.
- updateSchema() - Method in class org.apache.iceberg.BaseTable
- updateSchema() - Method in class org.apache.iceberg.SerializableTable
- updateSchema() - Method in interface org.apache.iceberg.Table
-
Create a new UpdateSchema to alter the columns of this table and commit the change.
- updateSchema() - Method in interface org.apache.iceberg.Transaction
-
Create a new UpdateSchema to alter the columns of this table.
- updateSchema(Schema, int) - Method in class org.apache.iceberg.TableMetadata
- UpdateSchema - Interface in org.apache.iceberg
-
API for schema evolution.
- updateSpec() - Method in class org.apache.iceberg.BaseTable
- updateSpec() - Method in class org.apache.iceberg.SerializableTable
- updateSpec() - Method in interface org.apache.iceberg.Table
-
Create a new UpdatePartitionSpec to alter the partition spec of this table and commit the change.
- updateSpec() - Method in interface org.apache.iceberg.Transaction
-
Create a new UpdatePartitionSpec to alter the partition spec of this table.
- upgradeFormatVersion(int) - Method in class org.apache.iceberg.TableMetadata.Builder
- UpgradeFormatVersion(int) - Constructor for class org.apache.iceberg.MetadataUpdate.UpgradeFormatVersion
- upgradeToFormatVersion(int) - Method in class org.apache.iceberg.TableMetadata
- UPPER_BOUNDS - Static variable in interface org.apache.iceberg.DataFile
- upperBound() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the upper bound value of this field.
- upperBound() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- upperBound() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns a ByteBuffer that contains a serialized bound higher than all values of the field.
- upperBounds() - Method in interface org.apache.iceberg.ContentFile
-
Returns a map from column ID to value upper bounds if collected, or null otherwise.
- upperBounds() - Method in class org.apache.iceberg.Metrics
-
Get the non-null upper bound values for all fields in a file.
- upperBounds() - Method in class org.apache.iceberg.spark.SparkDataFile
- upsert(boolean) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
All INSERT/UPDATE_AFTER events from the input stream will be transformed to UPSERT events, which means the sink will DELETE the old records and then INSERT the new records.
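A sketch of enabling upsert mode on the Flink sink builder, assuming a DataStream<RowData> named rowDataStream and a TableLoader named tableLoader (both hypothetical); upsert mode also requires equality field columns:

    import java.util.Collections;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.table.data.RowData;
    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;

    FlinkSink.forRowData(rowDataStream)
        .tableLoader(tableLoader)
        .equalityFieldColumns(Collections.singletonList("id"))  // key used to match old rows
        .upsert(true)                                           // INSERT/UPDATE_AFTER become UPSERT events
        .append();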
- UPSERT_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- UPSERT_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- URI - Static variable in class org.apache.iceberg.CatalogProperties
- uriToString(URI) - Static method in class org.apache.iceberg.hadoop.Util
-
From Apache Spark: converts a URI to a String.
- USE_STARTING_SEQUENCE_NUMBER - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
-
If the compaction should use the sequence number of the snapshot at compaction start time for new data files, instead of using the sequence number of the newly produced snapshot.
- USE_STARTING_SEQUENCE_NUMBER_DEFAULT - Static variable in interface org.apache.iceberg.actions.RewriteDataFiles
- USE_TABLE_DISTRIBUTION_AND_ORDERING - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- USE_TABLE_DISTRIBUTION_AND_ORDERING_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- USE_TIMESTAMP_WITHOUT_TIME_ZONE_IN_NEW_TABLES - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- USE_TIMESTAMP_WITHOUT_TIME_ZONE_IN_NEW_TABLES_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- useHiveRows() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- usePigTuples() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- USER - Static variable in class org.apache.iceberg.CatalogProperties
- userProject() - Method in class org.apache.iceberg.gcp.GCPProperties
- useSnapshot(long) - Method in class org.apache.iceberg.AllDataFilesTable.AllDataFilesTableScan
- useSnapshot(long) - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- useSnapshot(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- useSnapshot(long) - Method in class org.apache.iceberg.DataTableScan
- useSnapshot(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this scan's configuration that will use the given snapshot by ID.
- useTableDistributionAndOrdering() - Method in class org.apache.iceberg.spark.SparkWriteConf
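A short sketch of the useSnapshot call described above, pinning a scan to a specific snapshot ID; table and snapshotId are assumed to exist:

    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.TableScan;

    TableScan scan = table.newScan().useSnapshot(snapshotId);
    for (FileScanTask task : scan.planFiles()) {
      // each task covers one data file (or part of one) as of that snapshot
    }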
- useTimestampWithoutZoneInNewTables(RuntimeConfig) - Static method in class org.apache.iceberg.spark.SparkUtil
-
Checks whether timestamp types for new tables should be stored with timezone info.
- utf8s() - Static method in class org.apache.iceberg.avro.ValueReaders
- utf8s() - Static method in class org.apache.iceberg.avro.ValueWriters
- utf8String() - Static method in class org.apache.iceberg.spark.data.SparkOrcValueReaders
- Util - Class in org.apache.iceberg.hadoop
- uuid() - Method in class org.apache.iceberg.MetadataUpdate.AssignUUID
- uuid() - Method in class org.apache.iceberg.TableMetadata
- UUID - org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
- UUID - org.apache.iceberg.types.Type.TypeID
- UUID - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for UUID.
- UUIDConversion - Class in org.apache.iceberg.avro
- UUIDConversion() - Constructor for class org.apache.iceberg.avro.UUIDConversion
- uuids() - Static method in class org.apache.iceberg.avro.ValueReaders
- uuids() - Static method in class org.apache.iceberg.avro.ValueWriters
- uuids() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- uuids() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- UUIDType() - Constructor for class org.apache.iceberg.types.Types.UUIDType
- UUIDUtil - Class in org.apache.iceberg.util
V
- validate(Schema) - Method in class org.apache.iceberg.avro.LogicalMap
- validate(TableMetadata) - Method in class org.apache.iceberg.BaseOverwriteFiles
- validate(TableMetadata) - Method in class org.apache.iceberg.SnapshotManager
- validateAddedFilesMatchOverwriteFilter() - Method in class org.apache.iceberg.BaseOverwriteFiles
- validateAddedFilesMatchOverwriteFilter() - Method in interface org.apache.iceberg.OverwriteFiles
-
Signal that each file added to the table must match the overwrite expression.
- validateAppendOnly() - Method in class org.apache.iceberg.BaseReplacePartitions
- validateAppendOnly() - Method in interface org.apache.iceberg.ReplacePartitions
-
Validate that no partitions will be replaced and the operation is append-only.
- validateDataFilesExist(Iterable<? extends CharSequence>) - Method in interface org.apache.iceberg.RowDelta
-
Add data file paths that must not be removed by conflicting commits for this RowDelta to succeed.
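A sketch of a row-delta commit using the validation above, assuming prepared data/delete files and a list of referenced data file paths (all names here are hypothetical):

    import org.apache.iceberg.Table;

    table.newRowDelta()
        .addRows(newDataFile)                        // DataFile with inserted rows, assumed
        .addDeletes(positionDeleteFile)              // DeleteFile produced by the writer, assumed
        .validateDataFilesExist(referencedDataFiles) // paths whose rows the deletes reference
        .validateDeletedFiles()
        .commit();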
- validateDeletedFiles() - Method in interface org.apache.iceberg.RowDelta
-
Enable validation that referenced data files passed to RowDelta.validateDataFilesExist(Iterable) have not been removed by a delete operation.
- validateFromSnapshot(long) - Method in class org.apache.iceberg.BaseOverwriteFiles
- validateFromSnapshot(long) - Method in interface org.apache.iceberg.OverwriteFiles
-
Set the snapshot ID used in any reads for this operation.
- validateFromSnapshot(long) - Method in interface org.apache.iceberg.RewriteFiles
-
Set the snapshot ID used in any reads for this operation.
- validateFromSnapshot(long) - Method in interface org.apache.iceberg.RowDelta
-
Set the snapshot ID used in any reads for this operation.
- validateMetadataColumnReferences(Schema, Schema) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
- validateNoConflictingAppends(Expression) - Method in interface org.apache.iceberg.OverwriteFiles
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use OverwriteFiles.conflictDetectionFilter(Expression) and OverwriteFiles.validateNoConflictingData() instead.
- validateNoConflictingAppends(Expression) - Method in interface org.apache.iceberg.RowDelta
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use RowDelta.conflictDetectionFilter(Expression) and RowDelta.validateNoConflictingDataFiles() instead.
- validateNoConflictingData() - Method in class org.apache.iceberg.BaseOverwriteFiles
- validateNoConflictingData() - Method in interface org.apache.iceberg.OverwriteFiles
-
Enables validation that data added concurrently does not conflict with this commit's operation.
- validateNoConflictingDataFiles() - Method in interface org.apache.iceberg.RowDelta
-
Enables validation that data files added concurrently do not conflict with this commit's operation.
- validateNoConflictingDeleteFiles() - Method in interface org.apache.iceberg.RowDelta
-
Enables validation that delete files added concurrently do not conflict with this commit's operation.
- validateNoConflictingDeletes() - Method in class org.apache.iceberg.BaseOverwriteFiles
- validateNoConflictingDeletes() - Method in interface org.apache.iceberg.OverwriteFiles
-
Enables validation that deletes that happened concurrently do not conflict with this commit's operation.
- validateOptions() - Method in class org.apache.iceberg.actions.SortStrategy
- validatePartitionTransforms(PartitionSpec) - Static method in class org.apache.iceberg.spark.SparkUtil
-
Check whether the partition transforms in a spec can be used to write data.
- validateReferencedColumns(Schema) - Method in class org.apache.iceberg.MetricsConfig
- validateSchema(String, Schema, Schema, boolean, boolean) - Static method in class org.apache.iceberg.types.TypeUtil
-
Validates whether the provided schema is compatible with the expected schema.
- validateWapPublish(TableMetadata, long) - Static method in class org.apache.iceberg.util.WapUtil
-
Check if a given staged snapshot's associated wap-id was already published.
- validateWriteSchema(Schema, Schema, Boolean, Boolean) - Static method in class org.apache.iceberg.types.TypeUtil
-
Check whether the Iceberg table can be written with the user-provided write schema.
- ValidationException - Exception in org.apache.iceberg.exceptions
-
Exception raised when validation checks fail.
- ValidationException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.ValidationException
- ValidationException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.ValidationException
- validOptions() - Method in class org.apache.iceberg.actions.BinPackStrategy
- validOptions() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Returns a set of options which this convert strategy can use.
- validOptions() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteStrategy
-
Returns a set of options which this rewrite strategy can use.
- validOptions() - Method in interface org.apache.iceberg.actions.RewriteStrategy
-
Returns a set of options which this rewrite strategy can use.
- validOptions() - Method in class org.apache.iceberg.actions.SortStrategy
- validOptions() - Method in class org.apache.iceberg.spark.actions.Spark3SortStrategy
- value() - Method in interface org.apache.iceberg.expressions.Literal
-
Returns the value wrapped by this literal.
- VALUE_COUNTS - Static variable in interface org.apache.iceberg.DataFile
- VALUE_ID_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- valueCount() - Method in class org.apache.iceberg.FieldMetrics
-
Returns the number of all values, including nulls, NaN and repeated, for the given field.
- valueCounts() - Method in interface org.apache.iceberg.ContentFile
-
Returns a map from column ID to the count of its non-null values if collected, or null otherwise.
- valueCounts() - Method in class org.apache.iceberg.Metrics
-
Get the number of all values, including nulls, NaN and repeated.
- valueCounts() - Method in class org.apache.iceberg.spark.SparkDataFile
- valueEncoding - Variable in class org.apache.iceberg.parquet.BasePageIterator
- valueId() - Method in class org.apache.iceberg.types.Types.MapType
- valueName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- valueOf(String) - Static method in enum org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.BaseMetastoreTableOperations.CommitStatus
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.DistributionMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.expressions.Expression.Operation
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.FileContent
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.FileFormat
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.IsolationLevel
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.ManifestContent
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.ManifestReader.FileType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.MetadataTableType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.mr.InputFormatConfig.InMemoryDataModel
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.NullOrder
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.orc.ORCSchemaUtil.LongType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.RowLevelOperationMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.SortDirection
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.TableMetadataParser.Codec
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.iceberg.types.Type.TypeID
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.spark.sql.connector.iceberg.write.RowLevelOperation.Command
-
Returns the enum constant of this type with the specified name.
- ValueReader<T> - Interface in org.apache.iceberg.avro
- ValueReaders - Class in org.apache.iceberg.avro
- ValueReaders.StructReader<S> - Class in org.apache.iceberg.avro
- values - Variable in class org.apache.iceberg.parquet.BasePageIterator
- values() - Static method in enum org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.BaseMetastoreTableOperations.CommitStatus
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.DistributionMode
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.expressions.Expression.Operation
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.FileContent
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.FileFormat
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.IsolationLevel
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.ManifestContent
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.ManifestReader.FileType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.MetadataTableType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.mr.InputFormatConfig.InMemoryDataModel
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.NullOrder
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.orc.ORCSchemaUtil.LongType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.RowLevelOperationMode
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.SortDirection
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.TableMetadataParser.Codec
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.iceberg.types.Type.TypeID
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Method in class org.apache.iceberg.util.SerializableMap
- values() - Method in class org.apache.iceberg.util.StructLikeMap
- values() - Static method in enum org.apache.spark.sql.connector.iceberg.write.RowLevelOperation.Command
-
Returns an array containing the constants of this enum type, in the order they are declared.
- ValuesAsBytesReader - Class in org.apache.iceberg.parquet
-
Implements a ValuesReader specifically to read a given number of bytes from the underlying ByteBufferInputStream.
- ValuesAsBytesReader() - Constructor for class org.apache.iceberg.parquet.ValuesAsBytesReader
- valueType() - Method in class org.apache.iceberg.types.Types.MapType
- ValueWriter<D> - Interface in org.apache.iceberg.avro
- ValueWriters - Class in org.apache.iceberg.avro
- ValueWriters.StructWriter<S> - Class in org.apache.iceberg.avro
- varWidthBinaryDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- varWidthTypeBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- VarWidthTypeBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.VarWidthTypeBatchReader
- vector() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- vectorAccessor() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- VectorHolder - Class in org.apache.iceberg.arrow.vectorized
-
Container class for holding the Arrow vector storing a batch of values along with other state needed for reading values out of it.
- VectorHolder(ColumnDescriptor, FieldVector, boolean, Dictionary, NullabilityHolder, Type) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder
- VectorHolder.ConstantVectorHolder<T> - Class in org.apache.iceberg.arrow.vectorized
-
A VectorHolder that does not actually produce values; consumers of this class should use the constantValue to populate their ColumnVector implementation.
- VectorHolder.PositionVectorHolder - Class in org.apache.iceberg.arrow.vectorized
- vectorHolders - Variable in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- VECTORIZATION_BATCH_SIZE - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- VECTORIZATION_ENABLED - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- VECTORIZATION_ENABLED - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- VectorizedArrowReader - Class in org.apache.iceberg.arrow.vectorized
-
VectorReader(s) that read in a batch of values into Arrow vectors.
- VectorizedArrowReader(ColumnDescriptor, Types.NestedField, BufferAllocator, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- VectorizedArrowReader.ConstantVectorReader<T> - Class in org.apache.iceberg.arrow.vectorized
-
A dummy vector reader that does not actually read files; instead it returns a dummy VectorHolder indicating the constant value that should be used for this column.
- VectorizedColumnIterator - Class in org.apache.iceberg.arrow.vectorized.parquet
-
Vectorized version of the ColumnIterator that reads column values in data pages of a column in a row group in a batched fashion.
- VectorizedColumnIterator(ColumnDescriptor, String, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- VectorizedColumnIterator.BatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.BooleanBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.DictionaryBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.DoubleBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.FixedLengthDecimalBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.FixedSizeBinaryBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.FixedWidthTypeBinaryBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.FloatBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.IntBackedDecimalBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.IntegerBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.LongBackedDecimalBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.LongBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.TimestampMillisBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedColumnIterator.VarWidthTypeBatchReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedDictionaryEncodedParquetValuesReader - Class in org.apache.iceberg.arrow.vectorized.parquet
-
This decoder reads Parquet dictionary encoded data in a vectorized fashion.
- VectorizedDictionaryEncodedParquetValuesReader(int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- VectorizedPageIterator - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedPageIterator(ColumnDescriptor, String, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedPageIterator
- VectorizedParquetDefinitionLevelReader - Class in org.apache.iceberg.arrow.vectorized.parquet
- VectorizedParquetDefinitionLevelReader(int, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedParquetDefinitionLevelReader
- VectorizedParquetDefinitionLevelReader(int, int, boolean, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedParquetDefinitionLevelReader
- VectorizedParquetReader<T> - Class in org.apache.iceberg.parquet
- VectorizedParquetReader(InputFile, Schema, ParquetReadOptions, Function<MessageType, VectorizedReader<?>>, NameMapping, Expression, boolean, boolean, int) - Constructor for class org.apache.iceberg.parquet.VectorizedParquetReader
- vectorizedReader(List<VectorizedReader<?>>) - Method in class org.apache.iceberg.arrow.vectorized.VectorizedReaderBuilder
- VectorizedReader<T> - Interface in org.apache.iceberg.parquet
-
Interface for vectorized Iceberg readers.
- VectorizedReaderBuilder - Class in org.apache.iceberg.arrow.vectorized
- VectorizedReaderBuilder(Schema, MessageType, boolean, Map<Integer, ?>, Function<List<VectorizedReader<?>>, VectorizedReader<?>>) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedReaderBuilder
- VectorizedRowBatchIterator - Class in org.apache.iceberg.orc
-
An adaptor so that the ORC RecordReader can be used as an Iterator.
- VectorizedSparkOrcReaders - Class in org.apache.iceberg.spark.data.vectorized
- VectorizedSparkParquetReaders - Class in org.apache.iceberg.spark.data.vectorized
- VectorizedSupport - Class in org.apache.hadoop.hive.ql.exec.vector
-
Copied here from Hive for compatibility
- VectorizedSupport() - Constructor for class org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport
- VectorizedSupport.Support - Enum in org.apache.hadoop.hive.ql.exec.vector
- VectorizedTableScanIterable - Class in org.apache.iceberg.arrow.vectorized
-
A vectorized implementation of the Iceberg reader that iterates over the table scan.
- VectorizedTableScanIterable(TableScan) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
-
Create a new instance using default values for batchSize and reuseContainers.
- VectorizedTableScanIterable(TableScan, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
-
Create a new instance.
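As a rough, hedged sketch of how the iterable above might be driven (the column names and batch size are invented, and it is assumed the iterable yields Arrow-backed ColumnarBatch values):

    import org.apache.iceberg.Table;
    import org.apache.iceberg.TableScan;
    import org.apache.iceberg.arrow.vectorized.ColumnarBatch;
    import org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable;

    public class VectorizedScanExample {
      // Iterates a table scan in Arrow batches and counts the rows seen.
      public static long countRows(Table table) throws Exception {
        TableScan scan = table.newScan().select("id", "data"); // hypothetical column names
        long rows = 0;
        try (VectorizedTableScanIterable batches = new VectorizedTableScanIterable(scan, 1024, false)) {
          for (ColumnarBatch batch : batches) {
            rows += batch.numRows();
          }
        }
        return rows;
      }
    }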
- VERSION_HINT_FILENAME - Static variable in class org.apache.iceberg.hadoop.Util
- versionHintLocation(Table) - Static method in class org.apache.iceberg.ReachableFileUtil
-
Returns the location of the version hint file
- visit(Schema, AvroSchemaVisitor<T>) - Static method in class org.apache.iceberg.avro.AvroSchemaVisitor
- visit(DayTimeIntervalType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(DistinctType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(LogicalType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(LogicalType, Type, ParquetWithFlinkSchemaVisitor<T>) - Static method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- visit(NullType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(RawType<?>) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(StructuredType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(SymbolType<?>) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(YearMonthIntervalType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(ZonedTimestampType) - Method in class org.apache.iceberg.flink.FlinkTypeVisitor
- visit(Expression, ExpressionVisitors.ExpressionVisitor<R>) - Static method in class org.apache.iceberg.expressions.ExpressionVisitors
-
Traverses the given expression with a visitor.
- visit(PartitionSpec, PartitionSpecVisitor<R>) - Static method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
-
Visit the fields of a PartitionSpec.
- visit(Schema, Schema, AvroSchemaWithTypeVisitor<T>) - Static method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- visit(Schema, PartitionField, PartitionSpecVisitor<R>) - Static method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- visit(Schema, TypeUtil.CustomOrderSchemaVisitor<T>) - Static method in class org.apache.iceberg.types.TypeUtil
- visit(Schema, TypeUtil.SchemaVisitor<T>) - Static method in class org.apache.iceberg.types.TypeUtil
- visit(Schema, TypeDescription, OrcSchemaWithTypeVisitor<T>) - Static method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- visit(Schema, P, SchemaWithPartnerVisitor<P, T>, SchemaWithPartnerVisitor.PartnerAccessors<P>) - Static method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- visit(SortOrder, SortOrderVisitor<R>) - Static method in interface org.apache.iceberg.transforms.SortOrderVisitor
-
Visit the fields of a SortOrder.
- visit(Type, Schema, AvroSchemaWithTypeVisitor<T>) - Static method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- visit(Type, TypeUtil.CustomOrderSchemaVisitor<T>) - Static method in class org.apache.iceberg.types.TypeUtil
-
Used to traverse types with traversals other than pre-order.
- visit(Type, TypeUtil.SchemaVisitor<T>) - Static method in class org.apache.iceberg.types.TypeUtil
- visit(Type, TypeDescription, OrcSchemaWithTypeVisitor<T>) - Static method in class org.apache.iceberg.orc.OrcSchemaWithTypeVisitor
- visit(Type, Type, TypeWithSchemaVisitor<T>) - Static method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- visit(Type, P, SchemaWithPartnerVisitor<P, T>, SchemaWithPartnerVisitor.PartnerAccessors<P>) - Static method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- visit(UpdateSchema, Schema, Schema) - Static method in class org.apache.iceberg.schema.UnionByNameVisitor
-
Adds changes needed to produce a union of two schemas to an UpdateSchema operation.
- visit(TypeDescription, OrcSchemaVisitor<T>) - Static method in class org.apache.iceberg.orc.OrcSchemaVisitor
- visit(Type, ParquetTypeVisitor<T>) - Static method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- visit(DataType, Type, ParquetWithSparkSchemaVisitor<T>) - Static method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- visit(P, Schema, AvroWithPartnerByStructureVisitor<P, T>) - Static method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- visitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the addPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- visitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the applyTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- visitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the bigDecimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the bigIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the booleanLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- visitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.booleanValue().
- visitCall(IcebergSqlExtensionsParser.CallContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- visitCall(IcebergSqlExtensionsParser.CallContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the call labeled alternative in IcebergSqlExtensionsParser.statement().
- visitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the decimalLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the doubleLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- visitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the dropIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- visitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the dropPartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitErrorNode(ErrorNode) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- visitEvaluator(Expression, ExpressionVisitors.ExpressionVisitor<Boolean>) - Static method in class org.apache.iceberg.expressions.ExpressionVisitors
-
Traverses the given expression with a visitor.
- visitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the exponentLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.expression().
- visitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.expression().
- visitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- visitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.fieldList().
- visitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the floatLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- visitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the identityTransform labeled alternative in IcebergSqlExtensionsParser.transform().
- visitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the integerLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- visitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.multipartIdentifier().
- visitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- visitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the namedArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- visitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- visitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.nonReserved().
- visitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the numericLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.order().
- visitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.order().
- visitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.orderField().
- visitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.orderField().
- visitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- visitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the positionalArgument labeled alternative in IcebergSqlExtensionsParser.callArgument().
- visitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- visitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- visitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- visitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- visitReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the replacePartitionField labeled alternative in IcebergSqlExtensionsParser.statement().
- visitSchema(TypeDescription, OrcSchemaVisitor<T>) - Static method in class org.apache.iceberg.orc.OrcSchemaVisitor
- visitSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- visitSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the setIdentifierFields labeled alternative in IcebergSqlExtensionsParser.statement().
- visitSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- visitSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the setWriteDistributionAndOrdering labeled alternative in IcebergSqlExtensionsParser.statement().
- visitSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- visitSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.singleStatement().
- visitSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the smallIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the stringLiteral labeled alternative in IcebergSqlExtensionsParser.constant().
- visitStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- visitStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.stringMap().
- visitTerminal(TerminalNode) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- visitTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the tinyIntLiteral labeled alternative in IcebergSqlExtensionsParser.number().
- visitTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- visitTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.transformArgument().
- visitTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- visitTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the typeConstructor labeled alternative in IcebergSqlExtensionsParser.constant().
- visitUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- visitUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by the unquotedIdentifier labeled alternative in IcebergSqlExtensionsParser.identifier().
- visitWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- visitWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeDistributionSpec().
- visitWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- visitWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeOrderingSpec().
- visitWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- visitWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsVisitor
-
Visit a parse tree produced by IcebergSqlExtensionsParser.writeSpec().
- VOCABULARY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- VOCABULARY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
W
- wapEnabled() - Method in class org.apache.iceberg.spark.SparkWriteConf
- wapId() - Method in class org.apache.iceberg.spark.SparkWriteConf
- WapUtil - Class in org.apache.iceberg.util
- WAREHOUSE_LOCATION - Static variable in class org.apache.iceberg.CatalogProperties
- where(Expression) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- WITH - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- WITH - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- WITH() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- WITH() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- withClose(Iterator<E>) - Static method in interface org.apache.iceberg.io.CloseableIterator
- withEncryptedOutputFile(EncryptedOutputFile) - Method in class org.apache.iceberg.DataFiles.Builder
- withEncryptedOutputFile(EncryptedOutputFile) - Method in class org.apache.iceberg.FileMetadata.Builder
- withEncryptionKeyMetadata(ByteBuffer) - Method in class org.apache.iceberg.DataFiles.Builder
- withEncryptionKeyMetadata(ByteBuffer) - Method in class org.apache.iceberg.FileMetadata.Builder
- withEncryptionKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.DataFiles.Builder
- withEncryptionKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.FileMetadata.Builder
- withFileSizeInBytes(long) - Method in class org.apache.iceberg.DataFiles.Builder
- withFileSizeInBytes(long) - Method in class org.apache.iceberg.FileMetadata.Builder
- withFormat(String) - Method in class org.apache.iceberg.DataFiles.Builder
- withFormat(String) - Method in class org.apache.iceberg.FileMetadata.Builder
- withFormat(FileFormat) - Method in class org.apache.iceberg.DataFiles.Builder
- withFormat(FileFormat) - Method in class org.apache.iceberg.FileMetadata.Builder
- withInputFile(InputFile) - Method in class org.apache.iceberg.DataFiles.Builder
- withInputFile(InputFile) - Method in class org.apache.iceberg.FileMetadata.Builder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- withKeyMetadata(EncryptionKeyMetadata) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- withLength(int) - Static method in class org.apache.iceberg.MetricsModes.Truncate
- withLocation(String) - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- withLocation(String) - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Sets a location for the table.
- withMetadataColumns(String...) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- withMetadataLocation(String) - Method in class org.apache.iceberg.TableMetadata.Builder
- withMetadataMatching(Expression) - Method in class org.apache.iceberg.FindFiles.Builder
-
Filter results using a metadata filter for the data in a DataFile.
- withMetrics(Metrics) - Method in class org.apache.iceberg.DataFiles.Builder
- withMetrics(Metrics) - Method in class org.apache.iceberg.FileMetadata.Builder
- withNameMapping(NameMapping) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- withNameMapping(NameMapping) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- withNameMapping(NameMapping) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- withNoopClose(E) - Static method in interface org.apache.iceberg.io.CloseableIterable
- withNoopClose(Iterable<E>) - Static method in interface org.apache.iceberg.io.CloseableIterable
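As a small, hedged illustration of the no-op close wrappers above, the sketch below adapts a plain java.util.List so it can be passed to APIs that expect a CloseableIterable; the data is invented for the example.

    import java.io.IOException;
    import java.util.Arrays;
    import java.util.List;
    import org.apache.iceberg.io.CloseableIterable;

    public class NoopCloseExample {
      public static void main(String[] args) throws IOException {
        List<String> names = Arrays.asList("a", "b", "c"); // example data
        // Wraps a plain Iterable; close() does nothing, which is fine for in-memory collections.
        try (CloseableIterable<String> iterable = CloseableIterable.withNoopClose(names)) {
          for (String name : iterable) {
            System.out.println(name);
          }
        }
      }
    }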
- withOrderId(int) - Method in class org.apache.iceberg.SortOrder.Builder
- withoutZone() - Static method in class org.apache.iceberg.types.Types.TimestampType
- withPartition(StructLike) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- withPartition(StructLike) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- withPartition(StructLike) - Method in class org.apache.iceberg.DataFiles.Builder
- withPartition(StructLike) - Method in class org.apache.iceberg.FileMetadata.Builder
- withPartition(StructLike) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- withPartition(StructLike) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- withPartition(StructLike) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- withPartition(StructLike) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- withPartitionPath(String) - Method in class org.apache.iceberg.DataFiles.Builder
- withPartitionPath(String) - Method in class org.apache.iceberg.FileMetadata.Builder
- withPartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- withPartitionSpec(PartitionSpec) - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Sets a partition spec for the table.
- withPath(String) - Method in class org.apache.iceberg.DataFiles.Builder
- withPath(String) - Method in class org.apache.iceberg.FileMetadata.Builder
- withProperties(Map<String, String>) - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- withProperties(Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Adds key/value properties to the table.
- withProperty(String, String) - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- withProperty(String, String) - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Adds a key/value property to the table.
- withRecordCount(long) - Method in class org.apache.iceberg.DataFiles.Builder
- withRecordCount(long) - Method in class org.apache.iceberg.FileMetadata.Builder
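The DataFiles.Builder methods listed above compose into a DataFile description. A minimal sketch follows, with an invented file path, partition path, size, and record count; the exact set of required fields depends on the table and its spec.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.DataFiles;
    import org.apache.iceberg.FileFormat;
    import org.apache.iceberg.Table;

    public class RegisterDataFileExample {
      // Describes an existing Parquet file and appends it to the table; all values are illustrative.
      public static void register(Table table) {
        DataFile dataFile = DataFiles.builder(table.spec())
            .withPath("s3://bucket/warehouse/db/events/data/00000-0-abc.parquet") // hypothetical path
            .withFormat(FileFormat.PARQUET)
            .withPartitionPath("event_date=2021-01-01") // hypothetical partition path
            .withFileSizeInBytes(1024L * 1024L)
            .withRecordCount(10_000L)
            .build();

        table.newAppend()
            .appendFile(dataFile)
            .commit();
      }
    }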
- withRecordsMatching(Expression) - Method in class org.apache.iceberg.FindFiles.Builder
-
Filter results using a record filter.
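A hedged sketch of how the two FindFiles filters above can be combined; the column names and predicate values are invented, and it is assumed that metadata filters may reference DataFile metadata fields such as file_path.

    import java.io.IOException;
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.FindFiles;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    public class FindFilesExample {
      // Prints the paths of data files whose metadata and records match the filters.
      public static void printMatchingFiles(Table table) throws IOException {
        try (CloseableIterable<DataFile> files = FindFiles.in(table)
            .withMetadataMatching(Expressions.startsWith("file_path", "s3://bucket/")) // hypothetical prefix
            .withRecordsMatching(Expressions.equal("event_type", "click"))             // hypothetical column
            .collect()) {
          for (DataFile file : files) {
            System.out.println(file.path());
          }
        }
      }
    }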
- withSnapshotId(Long) - Method in class org.apache.iceberg.GenericManifestFile.CopyBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- withSortOrder(SortOrder) - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Sets a sort order for the table.
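A hedged sketch of the Catalog.TableBuilder flow that these with* methods belong to. The namespace, column names, spec, and property values below are invented, and it is assumed the catalog implementation supports buildTable.

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.types.Types;

    public class CreateTableExample {
      // Builds a partitioned, sorted table with one custom property.
      public static Table create(Catalog catalog) {
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.optional(2, "event_time", Types.TimestampType.withZone()));
        PartitionSpec spec = PartitionSpec.builderFor(schema).day("event_time").build();
        SortOrder sortOrder = SortOrder.builderFor(schema).asc("id").build();

        return catalog.buildTable(TableIdentifier.of("db", "events"), schema) // hypothetical identifier
            .withPartitionSpec(spec)
            .withSortOrder(sortOrder)
            .withProperty("write.format.default", "parquet")
            .create();
      }
    }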
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.DataFiles.Builder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.FileMetadata.Builder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- withSortOrder(SortOrder) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- withSpec(PartitionSpec) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
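The withSpec, withPartition, withKeyMetadata, and withSortOrder methods above belong to the format-specific data and delete write builders. A hedged sketch of one possible use is shown below; the generic record writer function and an already-computed partition key are assumptions for the example.

    import java.io.IOException;
    import org.apache.iceberg.PartitionKey;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.data.Record;
    import org.apache.iceberg.data.parquet.GenericParquetWriter;
    import org.apache.iceberg.io.DataWriter;
    import org.apache.iceberg.io.OutputFile;
    import org.apache.iceberg.parquet.Parquet;

    public class ParquetDataWriterExample {
      // Opens a writer that produces a single Parquet data file for one partition.
      public static DataWriter<Record> open(Table table, OutputFile outputFile, PartitionKey partition)
          throws IOException {
        return Parquet.writeData(outputFile)
            .schema(table.schema())
            .createWriterFunc(GenericParquetWriter::buildWriter)
            .withSpec(table.spec())
            .withPartition(partition)
            .overwrite()
            .build();
      }
    }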
- withSpecId(int) - Method in class org.apache.iceberg.PartitionSpec.Builder
- withSplitOffsets(List<Long>) - Method in class org.apache.iceberg.DataFiles.Builder
- withStatus(FileStatus) - Method in class org.apache.iceberg.DataFiles.Builder
- withStatus(FileStatus) - Method in class org.apache.iceberg.FileMetadata.Builder
- withTableCatalog(TableCatalog) - Method in interface org.apache.iceberg.spark.procedures.SparkProcedures.ProcedureBuilder
- withUUID() - Method in class org.apache.iceberg.TableMetadata
- withZone() - Static method in class org.apache.iceberg.types.Types.TimestampType
- WORKER_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.util.ThreadPools
- WORKER_THREAD_POOL_SIZE_PROP - Static variable in class org.apache.iceberg.SystemProperties
-
Sets the size of the worker pool.
- WORKER_THREAD_POOL_SIZE_PROP - Static variable in class org.apache.iceberg.util.ThreadPools
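The worker-pool size is read from a system property. As a hedged sketch (the pool is created from static state, so the property needs to be set before the shared pool is first used in the process), it can be configured like this:

    import java.util.concurrent.ExecutorService;
    import org.apache.iceberg.SystemProperties;
    import org.apache.iceberg.util.ThreadPools;

    public class WorkerPoolSizeExample {
      public static void main(String[] args) {
        // Must run before any code in this JVM touches the shared Iceberg worker pool.
        System.setProperty(SystemProperties.WORKER_THREAD_POOL_SIZE_PROP, "8");
        ExecutorService workerPool = ThreadPools.getWorkerPool();
        System.out.println("worker pool: " + workerPool);
      }
    }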
- wrap(CharSequence) - Static method in class org.apache.iceberg.util.CharSequenceWrapper
- wrap(RowData) - Method in class org.apache.iceberg.flink.data.RowDataProjection
- wrap(RowData) - Method in class org.apache.iceberg.flink.RowDataWrapper
- wrap(Catalog) - Static method in class org.apache.iceberg.CachingCatalog
- wrap(Catalog, boolean, long) - Static method in class org.apache.iceberg.CachingCatalog
- wrap(Catalog, long) - Static method in class org.apache.iceberg.CachingCatalog
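A hedged sketch of the wrapping methods above. The underlying catalog is whatever Catalog implementation is already configured, and the long argument is assumed to be a cache expiration interval in milliseconds.

    import java.util.concurrent.TimeUnit;
    import org.apache.iceberg.CachingCatalog;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;

    public class CachingCatalogExample {
      // Wraps an existing catalog so repeated loads of the same table hit an in-memory cache.
      public static Table loadCached(Catalog underlying) {
        Catalog cached = CachingCatalog.wrap(underlying, TimeUnit.MINUTES.toMillis(5)); // assumed: expiration millis
        return cached.loadTable(TableIdentifier.of("db", "events")); // hypothetical identifier
      }
    }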
- wrap(StructLike) - Method in class org.apache.iceberg.data.InternalRecordWrapper
- wrap(StructLike) - Method in class org.apache.iceberg.util.StructProjection
- wrap(Row) - Method in class org.apache.iceberg.spark.SparkDataFile
- wrap(Row) - Method in class org.apache.iceberg.spark.SparkStructLike
- write(byte[], int, int) - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- write(int) - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- write(int, L) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- write(int, M) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- write(int, PartitionSpec, OutputFile, Long) - Static method in class org.apache.iceberg.ManifestFiles
-
Create a new ManifestWriter for the given format version.
- write(int, S) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- write(int, T) - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
- write(int, T) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- write(int, T) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a value.
- write(int, T, ColumnVector) - Method in interface org.apache.iceberg.orc.OrcValueWriter
-
Take a value from the input data and add it to the ORC output.
- write(D, Encoder) - Method in interface org.apache.iceberg.avro.ValueWriter
- write(DataOutput) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSplit
- write(DataOutput) - Method in class org.apache.iceberg.mr.mapred.Container
- write(DataOutput) - Method in class org.apache.iceberg.mr.mapreduce.IcebergSplit
- write(Iterable<T>) - Method in interface org.apache.iceberg.io.FileWriter
-
Writes rows to a predefined spec/partition.
- write(RowData, Encoder) - Method in class org.apache.iceberg.flink.data.FlinkAvroWriter
- write(RowData, VectorizedRowBatch) - Method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- write(Record, VectorizedRowBatch) - Method in class org.apache.iceberg.data.orc.GenericOrcWriter
- write(PositionDelete<T>) - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- write(OutputFile) - Static method in class org.apache.iceberg.avro.Avro
- write(OutputFile) - Static method in class org.apache.iceberg.orc.ORC
- write(OutputFile) - Static method in class org.apache.iceberg.parquet.Parquet
- write(PartitionSpec, OutputFile) - Static method in class org.apache.iceberg.ManifestFiles
-
Create a new ManifestWriter.
- write(TableMetadata, OutputFile) - Static method in class org.apache.iceberg.TableMetadataParser
- write(InternalRow, Encoder) - Method in class org.apache.iceberg.spark.data.SparkAvroWriter
- write(InternalRow, VectorizedRowBatch) - Method in class org.apache.iceberg.spark.data.SparkOrcWriter
- write(S, Encoder) - Method in class org.apache.iceberg.avro.ValueWriters.StructWriter
- write(T) - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- write(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- write(T) - Method in class org.apache.iceberg.io.DataWriter
- write(T) - Method in interface org.apache.iceberg.io.FileWriter
-
Writes a row to a predefined spec/partition.
- write(T) - Method in class org.apache.iceberg.io.PartitionedFanoutWriter
- write(T) - Method in class org.apache.iceberg.io.PartitionedWriter
- write(T) - Method in interface org.apache.iceberg.io.TaskWriter
-
Write the row into the data files.
- write(T) - Method in class org.apache.iceberg.io.UnpartitionedWriter
- write(T) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriter
- write(T, Encoder) - Method in class org.apache.iceberg.data.avro.DataWriter
- write(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PartitioningWriter
-
Writes a row to the provided spec/partition.
- write(T, VectorizedRowBatch) - Method in interface org.apache.iceberg.orc.OrcRowWriter
-
Writes or appends a row to ORC's VectorizedRowBatch.
- WRITE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- WRITE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- WRITE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- WRITE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- WRITE_AUDIT_PUBLISH_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- WRITE_AUDIT_PUBLISH_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- WRITE_DATA_LOCATION - Static variable in class org.apache.iceberg.TableProperties
- WRITE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- WRITE_DISTRIBUTION_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Will be removed in 0.14.0; use specific modes instead.
- WRITE_DISTRIBUTION_MODE_HASH - Static variable in class org.apache.iceberg.TableProperties
- WRITE_DISTRIBUTION_MODE_NONE - Static variable in class org.apache.iceberg.TableProperties
- WRITE_DISTRIBUTION_MODE_RANGE - Static variable in class org.apache.iceberg.TableProperties
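The distribution-mode constants above are plain table property values. One hedged way to apply them is through the table's property update API, sketched below.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.TableProperties;

    public class DistributionModeExample {
      // Switches the table's write distribution mode to hash; takes effect for subsequent writes.
      public static void useHashDistribution(Table table) {
        table.updateProperties()
            .set(TableProperties.WRITE_DISTRIBUTION_MODE, TableProperties.WRITE_DISTRIBUTION_MODE_HASH)
            .commit();
      }
    }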
- WRITE_FOLDER_STORAGE_LOCATION - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Use TableProperties.WRITE_DATA_LOCATION instead.
- WRITE_FORMAT - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- WRITE_LOCATION_PROVIDER_IMPL - Static variable in class org.apache.iceberg.TableProperties
- WRITE_METADATA_LOCATION - Static variable in class org.apache.iceberg.TableProperties
- WRITE_NEW_DATA_LOCATION - Static variable in class org.apache.iceberg.TableProperties
-
Deprecated. Will be removed in 0.14.0; use TableProperties.WRITE_DATA_LOCATION instead.
- WRITE_PARTITION_SUMMARY_LIMIT - Static variable in class org.apache.iceberg.TableProperties
- WRITE_PARTITION_SUMMARY_LIMIT_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- WRITE_TARGET_FILE_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- WRITE_TARGET_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- WRITE_TARGET_FILE_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- writeBinary(int, Binary) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeBinary(int, Binary) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeBoolean(int, boolean) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeBoolean(int, boolean) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeCompatibilityErrors(Schema, Schema) - Static method in class org.apache.iceberg.types.CheckCompatibility
-
Returns a list of compatibility errors for writing with the given write schema.
- writeCompatibilityErrors(Schema, Schema, boolean) - Static method in class org.apache.iceberg.types.CheckCompatibility
-
Returns a list of compatibility errors for writing with the given write schema.
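A hedged sketch of how the compatibility check above is typically consumed: validate an incoming write schema against the table schema and fail with the reported errors. The argument order (table schema first, write schema second) is assumed from the parameter names.

    import java.util.List;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.CheckCompatibility;

    public class WriteCompatibilityExample {
      // Throws if the write schema cannot safely be written into the table schema.
      public static void validate(Schema tableSchema, Schema writeSchema) {
        List<String> errors = CheckCompatibility.writeCompatibilityErrors(tableSchema, writeSchema);
        if (!errors.isEmpty()) {
          throw new IllegalArgumentException("Cannot write with this schema: " + String.join("; ", errors));
        }
      }
    }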
- writeData(OutputFile) - Static method in class org.apache.iceberg.avro.Avro
- writeData(OutputFile) - Static method in class org.apache.iceberg.orc.ORC
- writeData(OutputFile) - Static method in class org.apache.iceberg.parquet.Parquet
- writeDeleteManifest(int, PartitionSpec, OutputFile, Long) - Static method in class org.apache.iceberg.ManifestFiles
-
Create a new
ManifestWriter
for the given format version. - writeDeletes(OutputFile) - Static method in class org.apache.iceberg.avro.Avro
- writeDeletes(OutputFile) - Static method in class org.apache.iceberg.orc.ORC
- writeDeletes(OutputFile) - Static method in class org.apache.iceberg.parquet.Parquet
- writeDistributionSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- writeDistributionSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- writeDistributionSpec(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- WriteDistributionSpecContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- writeDouble(int, double) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeDouble(int, double) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeFloat(int, float) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeFloat(int, float) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeInteger(int, int) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeInteger(int, int) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeLong(int, long) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeLong(int, long) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple.
- writeMaxFileSize() - Method in class org.apache.iceberg.actions.BinPackStrategy
-
Estimates a max target file size larger than the target size used in task creation, to avoid the case where a task is predicted to be a certain size but exceeds that target once serde is complete, leaving tiny remainder files.
- writeNewMetadata(TableMetadata, int) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- writeNull(int, int) - Method in class org.apache.iceberg.parquet.ColumnWriter
- writeNull(int, int) - Method in interface org.apache.iceberg.parquet.TripleWriter
-
Write a triple for a null value.
- WriteObjectInspector - Interface in org.apache.iceberg.mr.hive.serde.objectinspector
-
Interface for converting Hive primitive objects into objects that can be added to an Iceberg Record.
- writeOrderingSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- writeOrderingSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- writeOrderingSpec(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- WriteOrderingSpecContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- writeParallelism(int) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Configures the write parallelism for the Iceberg stream writer.
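A hedged sketch of where the parallelism setting above fits in the Flink sink builder; the RowData stream and table loader are assumed to exist already, and the parallelism value is illustrative.

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.table.data.RowData;
    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;

    public class FlinkSinkExample {
      // Appends a RowData stream to an Iceberg table with four parallel writer tasks.
      public static void append(DataStream<RowData> stream, TableLoader tableLoader) {
        FlinkSink.forRowData(stream)
            .tableLoader(tableLoader)
            .writeParallelism(4)
            .append();
      }
    }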
- writer(int) - Method in class org.apache.iceberg.avro.ValueWriters.StructWriter
- WriteResult - Class in org.apache.iceberg.io
- WriteResult.Builder - Class in org.apache.iceberg.io
- writeRow(S, VectorizedRowBatch) - Method in class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- writers() - Method in class org.apache.iceberg.data.orc.GenericOrcWriter
- writers() - Method in class org.apache.iceberg.data.orc.GenericOrcWriters.StructWriter
- writers() - Method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- writers() - Method in interface org.apache.iceberg.orc.OrcRowWriter
- writers() - Method in class org.apache.iceberg.spark.data.SparkOrcWriter
- writerVersion - Variable in class org.apache.iceberg.parquet.BasePageIterator
- writerVersion(ParquetProperties.WriterVersion) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- writeSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- writeSpec() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- WriteSpecContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- writeSupport(WriteSupport<?>) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- WS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- WS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
Y
- year(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- year(String) - Static method in class org.apache.iceberg.expressions.Expressions
- year(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- year(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- year(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- year(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- year(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a year Transform for date or timestamp types.
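As a brief, hedged illustration of the year transform entries above (the schema and field name are invented), a spec that partitions rows by the year of a timestamp column can be built like this:

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    public class YearPartitionExample {
      public static PartitionSpec yearSpec() {
        Schema schema = new Schema(
            Types.NestedField.required(1, "id", Types.LongType.get()),
            Types.NestedField.required(2, "event_time", Types.TimestampType.withZone()));
        // Partitions rows by the year extracted from event_time.
        return PartitionSpec.builderFor(schema)
            .year("event_time")
            .build();
      }
    }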
_
- _ATN - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- _ATN - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- _decisionToDFA - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- _decisionToDFA - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- _serializedATN - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- _serializedATN - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- _sharedContextCache - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- _sharedContextCache - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser