A
- abort() - Method in class org.apache.iceberg.io.BaseTaskWriter
- abort() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and delete the completed files if possible when aborting.
- abortFileGroup(RewriteFileGroup) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Clean up a specified file set by removing any files created for that operation; this should not throw any exceptions.
- abortJob(JobContext, int) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes the generated data files if there is a commit file already generated for them.
- abortStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- abortStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- abortTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes files generated by this task.
- abortWith(Tasks.Task<I, ?>) - Method in class org.apache.iceberg.util.Tasks.Builder
- AbstractMapredIcebergRecordReader<T> - Class in org.apache.iceberg.mr.mapred
- AbstractMapredIcebergRecordReader(IcebergInputFormat<?>, IcebergSplit, JobConf, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleOrderContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- accept(Path) - Method in class org.apache.iceberg.hadoop.HiddenPathFilter
- ACCESS_TOKEN_TYPE - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
- accessKeyId() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessKeySecret() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessor() - Method in class org.apache.iceberg.expressions.BoundReference
- accessor() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- Accessor<T> - Interface in org.apache.iceberg
- accessorForField(int) - Method in class org.apache.iceberg.Schema
-
Returns an accessor for retrieving the data from StructLike.
- Accessors - Class in org.apache.iceberg
-
The Position2Accessor and Position3Accessor here are an optimization.
- acquire(String, String) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
- acquire(String, String) - Method in interface org.apache.iceberg.LockManager
-
Try to acquire a lock
- acquireIntervalMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- acquireTimeoutMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- Action<ThisT,R> - Interface in org.apache.iceberg.actions
-
An action performed on a table.
- Actions - Class in org.apache.iceberg.flink.actions
- ActionsProvider - Interface in org.apache.iceberg.actions
-
An API that should be implemented by query engine integrations for providing actions.
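Note: a minimal sketch of how an ActionsProvider implementation is typically invoked, assuming an existing SparkSession and Iceberg Table; SparkActions is the Spark-side provider and the option-free rewrite shown here is illustrative only.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.actions.RewriteDataFiles;
    import org.apache.iceberg.spark.actions.SparkActions;
    import org.apache.spark.sql.SparkSession;

    public class RewriteDataFilesExample {
      // Sketch only: "spark" and "table" are assumed to be created elsewhere.
      static void compact(SparkSession spark, Table table) {
        RewriteDataFiles.Result result = SparkActions.get(spark)
            .rewriteDataFiles(table)   // action obtained from the ActionsProvider
            .execute();
        System.out.println("Added data files: " + result.addedDataFilesCount());
      }
    }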
- add(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- add(D) - Method in interface org.apache.iceberg.io.FileAppender
- add(D) - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- add(F) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file.
- add(F, long) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file with a specific sequence number.
- add(CharSequence) - Method in class org.apache.iceberg.util.CharSequenceSet
- add(String, Table) - Method in class org.apache.iceberg.spark.SparkTableCache
- add(Namespace) - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- add(TableIdentifier) - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- add(WriteResult) - Method in class org.apache.iceberg.io.WriteResult.Builder
- add(Blob) - Method in class org.apache.iceberg.puffin.PuffinWriter
- add(StructLike) - Method in class org.apache.iceberg.util.StructLikeSet
- add(Pair<Integer, StructLike>) - Method in class org.apache.iceberg.util.PartitionSet
- add(T) - Method in class org.apache.iceberg.io.DataWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use DataWriter.write(Object) instead.
- add(T[], T) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Copies the given array and adds the given element at the end of the new array.
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ADD_EQ_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADD_POS_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- addAll(Iterable<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addAll(Iterable<WriteResult>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addAll(Collection<? extends CharSequence>) - Method in class org.apache.iceberg.util.CharSequenceSet
- addAll(Collection<? extends StructLike>) - Method in class org.apache.iceberg.util.StructLikeSet
- addAll(Collection<? extends Pair<Integer, StructLike>>) - Method in class org.apache.iceberg.util.PartitionSet
- addAll(Collection<Namespace>) - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- addAll(Collection<TableIdentifier>) - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- addAll(Iterator<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addAllConfig(Map<String, String>) - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- addCloseable(Closeable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register a closeable to be managed by this class.
- addCloseable(AutoCloseable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register an AutoCloseable to be managed by this class.
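Note: a small usage sketch for CloseableGroup and the two addCloseable overloads above; the resources registered here are throwaway examples.

    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import org.apache.iceberg.io.CloseableGroup;

    public class CloseableGroupExample {
      static void closeTogether() throws IOException {
        CloseableGroup closeables = new CloseableGroup();
        closeables.addCloseable(new ByteArrayInputStream(new byte[0])); // java.io.Closeable
        closeables.addCloseable((AutoCloseable) () -> { });             // java.lang.AutoCloseable
        closeables.close(); // closes every registered resource
      }
    }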
- addColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
- addColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
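Note: a hedged sketch of the addColumn calls above, assuming an existing Table named table; the column names, the nested struct field "address", and the types are illustrative.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.types.Types;

    public class AddColumnExample {
      static void addColumns(Table table) {
        table.updateSchema()
            .addColumn("comment", Types.StringType.get())          // new optional top-level column
            .addColumn("address", "zip", Types.IntegerType.get())  // new column inside an existing struct "address"
            .commit();
      }
    }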
- addConfig(String, String) - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- addDataFiles(Iterable<DataFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDataFiles(DataFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(Iterable<DeleteFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(DeleteFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeletes(DeleteFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a DeleteFile to the table.
- ADDED_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_EQ_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILE_SIZE_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- ADDED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_POS_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- addedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- addedDataFiles(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all data files added to the table in this snapshot.
- addedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupRewriteResult
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.Result
- addedDeleteFiles(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all delete files added to the table in this snapshot.
- addedDeleteFilesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.Result
-
Returns the count of the added delete files.
- addedDeletes() - Method in interface org.apache.iceberg.DeletedRowsScanTask
-
A list of added delete files that apply to the task's data file.
- addedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFiles() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- addedFiles() - Method in interface org.apache.iceberg.Snapshot
-
Deprecated. Since 0.14.0, will be removed in 1.0.0; use Snapshot.addedDataFiles(FileIO) instead.
- addedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status ADDED in the manifest file.
- addedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- addedManifest(ManifestFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedManifests() - Method in class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- addedManifests() - Method in interface org.apache.iceberg.actions.RewriteManifests.Result
-
Returns added manifests.
- addedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the added position delete files.
- addedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status ADDED in the manifest file.
- addedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- AddedRowsScanTask - Interface in org.apache.iceberg
-
A scan task for inserts generated by adding a data file to the table.
- addElement(I, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- addElement(List<E>, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- addExtension(String) - Method in enum org.apache.iceberg.FileFormat
-
Returns filename with this format's extension added, if necessary.
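Note: a small illustration of addExtension; the paths are made up.

    import org.apache.iceberg.FileFormat;

    public class FileFormatExtensionExample {
      public static void main(String[] args) {
        // ".parquet" is appended only when the location does not already end with it.
        System.out.println(FileFormat.PARQUET.addExtension("s3://bucket/data/part-00000"));
        System.out.println(FileFormat.PARQUET.addExtension("s3://bucket/data/part-00000.parquet"));
      }
    }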
- addFallbackIds(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- addField(String) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from a source column.
- addField(String, Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term, with the given partition field name.
- addField(Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term.
- addFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
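Note: a sketch of the UpdatePartitionSpec.addField variants listed above, assuming a Table named table with columns category and ts; the transform and field names are illustrative.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expressions;

    public class UpdatePartitionSpecExample {
      static void evolveSpec(Table table) {
        table.updateSpec()
            .addField("category")                       // identity partition from a source column
            .addField("ts_day", Expressions.day("ts"))  // named field from an expression term
            .commit();
      }
    }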
- addFile(DataFile) - Method in class org.apache.iceberg.BaseReplacePartitions
- addFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Add a DataFile to the table.
- addFile(DataFile) - Method in interface org.apache.iceberg.ReplacePartitions
-
Add a DataFile to the table.
- addManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- addManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Adds a manifest file to the table.
- addMissing(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addMissing(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addPair(I, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- addPair(Map<K, V>, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- AddPartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- addPartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- addPartitionSpec(UnboundPartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddPartitionSpec(PartitionSpec) - Constructor for class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- AddPartitionSpec(UnboundPartitionSpec) - Constructor for class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- addReferencedDataFiles(CharSequence...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addReferencedDataFiles(Iterable<CharSequence>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addRemoved(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addRemoved(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addRequiredColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.ClusteredDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.FanoutDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.RollingDataWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- addRows(DataFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a
DataFile
to the table. - addSchema(Schema) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
-
Adds an Iceberg schema that can be used to decode buffers.
- addSchema(Schema, int) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSchema(Schema, int) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSchema
- addScope(String) - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- addScopes(List<String>) - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- addSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSnapshot(Snapshot) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSnapshot
- addSortOrder(SortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- addSortOrder(UnboundSortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSortOrder(SortOrder) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSortOrder
- AddSortOrder(UnboundSortOrder) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSortOrder
- addUpdated(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addUpdated(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addValue(double) - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- addValue(float) - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- ADJUST_TO_UTC_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- advance() - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- advanceNextPageCount - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- AES_CTR - org.apache.iceberg.encryption.EncryptionAlgorithm
-
Counter mode (CTR) allows fast encryption with high throughput.
- AES_GCM - org.apache.iceberg.encryption.EncryptionAlgorithm
-
Galois/Counter mode (GCM) combines CTR with the new Galois mode of authentication.
- AES_GCM_CTR - org.apache.iceberg.encryption.EncryptionAlgorithm
-
A combination of GCM and CTR that can be used for file types like Parquet, so that all modules except pages are encrypted by GCM to ensure integrity, and CTR is used for efficient encryption of bulk data.
- AesGcmDecryptor(byte[]) - Constructor for class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- AesGcmEncryptor(byte[]) - Constructor for class org.apache.iceberg.encryption.Ciphers.AesGcmEncryptor
- after(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- after(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- afterElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.FanoutDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- aliasToId(String) - Method in class org.apache.iceberg.Schema
-
Returns the column id for the given column alias.
- AliyunClientFactories - Class in org.apache.iceberg.aliyun
- AliyunClientFactory - Interface in org.apache.iceberg.aliyun
- aliyunProperties() - Method in interface org.apache.iceberg.aliyun.AliyunClientFactory
-
Returns an initialized AliyunProperties
- AliyunProperties - Class in org.apache.iceberg.aliyun
- AliyunProperties() - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- AliyunProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- ALL_DATA_FILES - org.apache.iceberg.MetadataTableType
- ALL_DELETE_FILES - org.apache.iceberg.MetadataTableType
- ALL_ENTRIES - org.apache.iceberg.MetadataTableType
- ALL_FILES - org.apache.iceberg.MetadataTableType
- ALL_MANIFESTS - org.apache.iceberg.MetadataTableType
- AllDataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid data files as rows.
- AllDataFilesTable.AllDataFilesTableScan - Class in org.apache.iceberg
- AllDeleteFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes its valid delete files as rows.
- AllDeleteFilesTable.AllDeleteFilesTableScan - Class in org.apache.iceberg
- AllEntriesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's manifest entries as rows, for both delete and data files.
- AllFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes its valid files as rows.
- AllFilesTable.AllFilesTableScan - Class in org.apache.iceberg
- allManifests() - Method in interface org.apache.iceberg.Snapshot
-
Deprecated. Since 0.14.0, will be removed in 1.0.0; use Snapshot.allManifests(FileIO) instead.
- allManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all ManifestFile instances for either data or delete manifests in this snapshot.
- AllManifestsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid manifest files as rows.
- AllManifestsTable.AllManifestsTableScan - Class in org.apache.iceberg
- allowIncompatibleChanges() - Method in interface org.apache.iceberg.UpdateSchema
-
Allow incompatible changes to the schema.
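Note: a sketch combining allowIncompatibleChanges with the addRequiredColumn entries above; the Table and the column name are assumptions.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.types.Types;

    public class RequiredColumnExample {
      static void addRequired(Table table) {
        table.updateSchema()
            .allowIncompatibleChanges() // adding a required column is an incompatible change
            .addRequiredColumn("ingest_id", Types.LongType.get())
            .commit();
      }
    }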
- AlreadyExistsException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to create a table that already exists.
- AlreadyExistsException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- AlreadyExistsException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- alterDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionColumnStatistics(ObjectPath, CatalogPartitionSpec, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionStatistics(ObjectPath, CatalogPartitionSpec, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(IMetaStoreClient, String, String, Table) - Static method in class org.apache.iceberg.hive.MetastoreUtil
-
Calls alter_table method using the metastore client.
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkCachedTableCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterTableColumnStatistics(ObjectPath, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTableStatistics(ObjectPath, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alwaysFalse() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysFalse() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- alwaysNull() - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a Transform that always produces null.
- alwaysNull(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- alwaysNull(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysNull(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysTrue() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysTrue() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- ancestorIds(Snapshot, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorIdsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsOf(long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- AncestorsOfProcedure - Class in org.apache.iceberg.spark.procedures
- and(Expression, Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- and(Expression, Expression, Expression...) - Static method in class org.apache.iceberg.expressions.Expressions
- and(R, R) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- And - Class in org.apache.iceberg.expressions
- AND - org.apache.iceberg.expressions.Expression.Operation
- APACHE_DATASKETCHES_THETA_V1 - Static variable in class org.apache.iceberg.puffin.StandardBlobTypes
-
A serialized form of a "compact" Theta sketch produced by the Apache DataSketches library
- APP_ID - Static variable in class org.apache.iceberg.CatalogProperties
- append() - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Append the Iceberg sink operators to write records to an Iceberg table.
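Note: a minimal sketch of wiring the Flink sink whose append() step is indexed above; the DataStream and TableLoader are assumed to be configured elsewhere.

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.table.data.RowData;
    import org.apache.iceberg.flink.TableLoader;
    import org.apache.iceberg.flink.sink.FlinkSink;

    public class FlinkSinkExample {
      static void writeTo(DataStream<RowData> rowStream, TableLoader tableLoader) {
        FlinkSink.forRowData(rowStream)
            .tableLoader(tableLoader)
            .append(); // attaches the Iceberg writer and committer operators to the stream
      }
    }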
- APPEND - Static variable in class org.apache.iceberg.DataOperations
-
New data is appended to the table and no data is removed or deleted.
- appendFile(DataFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a DataFile to the table.
- AppendFiles - Interface in org.apache.iceberg
-
API for appending new files in a table.
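Note: a sketch of the AppendFiles API described above, assuming a Table named table and an already-built DataFile.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;

    public class AppendFilesExample {
      static void append(Table table, DataFile dataFile) {
        table.newAppend()
            .appendFile(dataFile) // stage the new data file
            .commit();            // produce a new append snapshot
      }
    }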
- appendManifest(ManifestFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a ManifestFile to the table.
- appendsAfter(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsAfter(long) - Method in class org.apache.iceberg.DataTableScan
- appendsAfter(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan to read appended data from fromSnapshotId exclusive to the current snapshot inclusive.
- appendsBetween(long, long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsBetween(long, long) - Method in class org.apache.iceberg.DataTableScan
- appendsBetween(long, long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan to read appended data from fromSnapshotId exclusive to toSnapshotId inclusive.
- apply() - Method in class org.apache.iceberg.BaseReplaceSortOrder
- apply() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and return the uncommitted changes for validation.
- apply() - Method in class org.apache.iceberg.SetLocation
- apply() - Method in class org.apache.iceberg.SnapshotManager
- apply(Map<String, String>) - Method in class org.apache.iceberg.rest.HTTPClientFactory
- apply(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- apply(TableMetadata) - Method in class org.apache.iceberg.BaseReplacePartitions
- apply(TableMetadata) - Method in class org.apache.iceberg.BaseRewriteManifests
- apply(S) - Method in interface org.apache.iceberg.transforms.Transform
-
Transforms a value to its corresponding partition value.
- apply(S) - Method in class org.apache.iceberg.transforms.UnknownTransform
- applyFilters(List<ResolvedExpression>) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyLimit(long) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyNameMapping(MessageType, NameMapping) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- applyOverwrite(boolean) - Method in class org.apache.iceberg.flink.IcebergTableSink
- applyProjection(int[][]) - Method in class org.apache.iceberg.flink.IcebergTableSource
- applyPropertyChanges(UpdateProperties, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateProperties operation.
- applySchemaChanges(UpdateSchema, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateSchema operation.
- applyStaticPartition(Map<String, String>) - Method in class org.apache.iceberg.flink.IcebergTableSink
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSchema
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSnapshot
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSortOrder
- applyTo(TableMetadata.Builder) - Method in interface org.apache.iceberg.MetadataUpdate
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AssignUUID
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveProperties
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveSnapshot
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveSnapshotRef
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetCurrentSchema
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultPartitionSpec
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultSortOrder
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetLocation
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetProperties
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetSnapshotRef
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.UpgradeFormatVersion
- ApplyTransformContext(IcebergSqlExtensionsParser.TransformContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- arguments - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- array(Schema, Schema) - Method in class org.apache.iceberg.avro.RemoveIds
- array(Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- array(ValueReader<T>) - Static method in class org.apache.iceberg.avro.ValueReaders
- array(ValueWriter<T>) - Static method in class org.apache.iceberg.avro.ValueWriters
- array(OrcValueReader<?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- array(Types.ListType, Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- array(P, Schema, T) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- arrayElementType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- arrayElementType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- arrayElementType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- arrayMap(ValueReader<K>, ValueReader<V>) - Static method in class org.apache.iceberg.avro.ValueReaders
- arrayMap(ValueWriter<K>, ValueWriter<V>) - Static method in class org.apache.iceberg.avro.ValueWriters
- ArrayUtil - Class in org.apache.iceberg.util
- ArrowAllocation - Class in org.apache.iceberg.arrow
- ArrowReader - Class in org.apache.iceberg.arrow.vectorized
-
Vectorized reader that returns an iterator of ColumnarBatch.
- ArrowReader(TableScan, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowReader
-
Create a new instance of the reader.
- ArrowSchemaUtil - Class in org.apache.iceberg.arrow
- ArrowVectorAccessor<DecimalT,Utf8StringT,ArrayT,ChildVectorT extends java.lang.AutoCloseable> - Class in org.apache.iceberg.arrow.vectorized
- ArrowVectorAccessor(ValueVector) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessor(ValueVector, ChildVectorT[]) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessors - Class in org.apache.iceberg.spark.data.vectorized
- as(String) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets the table identifier for the newly created Iceberg table.
- as(String) - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- asc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with nulls first.
- asc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with the given null order.
- asc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with nulls first.
- asc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- asc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, ascending with the given null order.
- asc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with the given null order.
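Note: a sketch of the asc builder calls on SortOrderBuilder, reached here through Table.replaceSortOrder(); the column names are illustrative.

    import org.apache.iceberg.NullOrder;
    import org.apache.iceberg.Table;

    public class SortOrderExample {
      static void setSortOrder(Table table) {
        table.replaceSortOrder()
            .asc("id")                        // ascending, nulls first by default
            .asc("ts", NullOrder.NULLS_LAST)  // ascending with an explicit null order
            .commit();
      }
    }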
- ASC - org.apache.iceberg.SortDirection
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- asCatalog(SessionCatalog.SessionContext) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog
- asCombinedScanTask() - Method in interface org.apache.iceberg.CombinedScanTask
- asCombinedScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to CombinedScanTask if it is one.
- asDataTask() - Method in interface org.apache.iceberg.DataTask
- asDataTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to DataTask if it is one.
- asFileScanTask() - Method in interface org.apache.iceberg.FileScanTask
- asFileScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to FileScanTask if it is one.
- asListType() - Method in interface org.apache.iceberg.types.Type
- asListType() - Method in class org.apache.iceberg.types.Types.ListType
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asMappedFields() - Method in class org.apache.iceberg.mapping.NameMapping
- asMapType() - Method in interface org.apache.iceberg.types.Type
- asMapType() - Method in class org.apache.iceberg.types.Types.MapType
- asNestedType() - Method in interface org.apache.iceberg.types.Type
- asNestedType() - Method in class org.apache.iceberg.types.Type.NestedType
- asOfTime(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- asOfTime(long) - Method in class org.apache.iceberg.FindFiles.Builder
-
Base results on files in the snapshot that was current as of a timestamp.
- asOfTime(long) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- asOfTime(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new TableScan from this scan's configuration that will use the most recent snapshot as of the given time in milliseconds.
- asOfTimestamp() - Method in class org.apache.iceberg.flink.source.ScanContext
- asOfTimestamp() - Method in class org.apache.iceberg.spark.SparkReadConf
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- asOptional() - Method in class org.apache.iceberg.types.Types.NestedField
- asPrimitiveType() - Method in interface org.apache.iceberg.types.Type
- asPrimitiveType() - Method in class org.apache.iceberg.types.Type.PrimitiveType
- asRequired() - Method in class org.apache.iceberg.types.Types.NestedField
- asResult() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- AssertLastAssignedFieldId(int) - Constructor for class org.apache.iceberg.rest.requests.UpdateTableRequest.UpdateRequirement.AssertLastAssignedFieldId
- AssertLastAssignedPartitionId(int) - Constructor for class org.apache.iceberg.rest.requests.UpdateTableRequest.UpdateRequirement.AssertLastAssignedPartitionId
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- ASSIGNED - org.apache.iceberg.flink.source.split.IcebergSourceSplitStatus
- ASSIGNED_BYTES - Static variable in class org.apache.iceberg.flink.source.reader.ReaderMetricsContext
- ASSIGNED_SPLITS - Static variable in class org.apache.iceberg.flink.source.reader.ReaderMetricsContext
- assignerFactory(SplitAssignerFactory) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- assignFreshIds(int, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a schema.
- assignFreshIds(Schema, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns ids to match a given schema, and fresh ids from the nextId function for all other fields.
- assignFreshIds(Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a schema.
- assignFreshIds(Type, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the nextId function for all fields in a type.
- assignIncreasingFreshIds(Schema) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns strictly increasing fresh ids for all fields in a schema, starting from 1.
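Note: a small illustration of assignIncreasingFreshIds on a hand-built Schema; the field names and original ids are arbitrary.

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.TypeUtil;
    import org.apache.iceberg.types.Types;

    public class FreshIdsExample {
      public static void main(String[] args) {
        Schema schema = new Schema(
            Types.NestedField.required(100, "id", Types.LongType.get()),
            Types.NestedField.optional(200, "name", Types.StringType.get()));
        // Field ids are reassigned as 1, 2, ... regardless of the ids given above.
        System.out.println(TypeUtil.assignIncreasingFreshIds(schema));
      }
    }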
- assignUUID() - Method in class org.apache.iceberg.TableMetadata.Builder
- AssignUUID(String) - Constructor for class org.apache.iceberg.MetadataUpdate.AssignUUID
- asStatic() - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this field as a StaticField.
- asStatic() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a StaticMethod.
- asStruct() - Method in class org.apache.iceberg.Schema
-
Returns the underlying struct type for this schema.
- asStructLike(Record) - Method in class org.apache.iceberg.data.GenericDeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.data.DeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the data as a StructLike.
- asStructLikeKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the passed in key of a row as a StructLike.
- asStructType() - Method in interface org.apache.iceberg.types.Type
- asStructType() - Method in class org.apache.iceberg.types.Types.StructType
- AssumeRoleAwsClientFactory - Class in org.apache.iceberg.aws
- AssumeRoleAwsClientFactory() - Constructor for class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- asSummaryString() - Method in class org.apache.iceberg.flink.IcebergTableSink
- asSummaryString() - Method in class org.apache.iceberg.flink.IcebergTableSource
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- AUTH_DEFAULT_REFRESH_ENABLED - Static variable in class org.apache.iceberg.CatalogProperties
- AUTH_DEFAULT_REFRESH_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- AUTH_SESSION_TIMEOUT_MS - Static variable in class org.apache.iceberg.CatalogProperties
- AUTH_SESSION_TIMEOUT_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- authHeaders(String) - Static method in class org.apache.iceberg.rest.auth.OAuth2Util
- AuthSession(Map<String, String>, String, String) - Constructor for class org.apache.iceberg.rest.auth.OAuth2Util.AuthSession
- AVAILABLE - org.apache.iceberg.flink.source.assigner.GetSplitResult.Status
- Avro - Class in org.apache.iceberg.avro
- AVRO - org.apache.iceberg.FileFormat
- AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- Avro.DataWriteBuilder - Class in org.apache.iceberg.avro
- Avro.DeleteWriteBuilder - Class in org.apache.iceberg.avro
- Avro.ReadBuilder - Class in org.apache.iceberg.avro
- Avro.WriteBuilder - Class in org.apache.iceberg.avro
- AvroEncoderUtil - Class in org.apache.iceberg.avro
- AvroIterable<D> - Class in org.apache.iceberg.avro
- AvroMetrics - Class in org.apache.iceberg.avro
- AvroSchemaUtil - Class in org.apache.iceberg.avro
- AvroSchemaVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaVisitor
- AvroSchemaWithTypeVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaWithTypeVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- AvroWithFlinkSchemaVisitor<T> - Class in org.apache.iceberg.flink.data
- AvroWithFlinkSchemaVisitor() - Constructor for class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- AvroWithPartnerByStructureVisitor<P,T> - Class in org.apache.iceberg.avro
-
An abstract Avro schema visitor with partner type.
- AvroWithPartnerByStructureVisitor() - Constructor for class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- AvroWithSparkSchemaVisitor<T> - Class in org.apache.iceberg.spark.data
- AvroWithSparkSchemaVisitor() - Constructor for class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- AwsClientFactories - Class in org.apache.iceberg.aws
- AwsClientFactory - Interface in org.apache.iceberg.aws
-
Interface to customize AWS clients used by Iceberg.
- AwsProperties - Class in org.apache.iceberg.aws
- AwsProperties() - Constructor for class org.apache.iceberg.aws.AwsProperties
- AwsProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aws.AwsProperties
B
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BACKQUOTED_IDENTIFIER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- BadRequestException - Exception in org.apache.iceberg.exceptions
-
Exception thrown on HTTP 400 - Bad Request
- BadRequestException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.BadRequestException
- BadRequestException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.BadRequestException
- BASE_NAMESPACE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- BaseBatchReader<T> - Class in org.apache.iceberg.arrow.vectorized
-
A base BatchReader class that contains common functionality
- BaseBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- BaseColumnIterator - Class in org.apache.iceberg.parquet
- BaseColumnIterator(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.BaseColumnIterator
- BaseCombinedScanTask - Class in org.apache.iceberg
- BaseCombinedScanTask(List<FileScanTask>) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseCombinedScanTask(FileScanTask...) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseDeleteOrphanFilesActionResult - Class in org.apache.iceberg.actions
- BaseDeleteOrphanFilesActionResult(Iterable<String>) - Constructor for class org.apache.iceberg.actions.BaseDeleteOrphanFilesActionResult
- BaseDeleteReachableFilesActionResult - Class in org.apache.iceberg.actions
- BaseDeleteReachableFilesActionResult(long, long, long, long) - Constructor for class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- BaseEqualityDeltaWriter(StructLike, Schema, Schema) - Constructor for class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- BaseExpireSnapshotsActionResult - Class in org.apache.iceberg.actions
- BaseExpireSnapshotsActionResult(long, long, long) - Constructor for class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- BaseExpireSnapshotsActionResult(long, long, long, long, long) - Constructor for class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- BaseFileGroupRewriteResult - Class in org.apache.iceberg.actions
- BaseFileGroupRewriteResult(RewriteDataFiles.FileGroupInfo, int, int) - Constructor for class org.apache.iceberg.actions.BaseFileGroupRewriteResult
- BaseFileScanTask - Class in org.apache.iceberg
- BaseFileScanTask(DataFile, DeleteFile[], String, String, ResidualEvaluator) - Constructor for class org.apache.iceberg.BaseFileScanTask
- BaseFileWriterFactory<T> - Class in org.apache.iceberg.data
-
A base writer factory to be extended by query engine integrations.
- BaseFileWriterFactory(Table, FileFormat, Schema, SortOrder, FileFormat, int[], Schema, SortOrder, Schema) - Constructor for class org.apache.iceberg.data.BaseFileWriterFactory
- BaseLockManager() - Constructor for class org.apache.iceberg.util.LockManagers.BaseLockManager
- BaseMetadataTable - Class in org.apache.iceberg
-
Base class for metadata tables.
- BaseMetadataTable(TableOperations, Table, String) - Constructor for class org.apache.iceberg.BaseMetadataTable
- BaseMetastoreCatalog - Class in org.apache.iceberg
- BaseMetastoreCatalog() - Constructor for class org.apache.iceberg.BaseMetastoreCatalog
- BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder - Class in org.apache.iceberg
- BaseMetastoreCatalogTableBuilder(TableIdentifier, Schema) - Constructor for class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- BaseMetastoreTableOperations - Class in org.apache.iceberg
- BaseMetastoreTableOperations() - Constructor for class org.apache.iceberg.BaseMetastoreTableOperations
- BaseMetastoreTableOperations.CommitStatus - Enum in org.apache.iceberg
- BaseMigrateTableActionResult - Class in org.apache.iceberg.actions
- BaseMigrateTableActionResult(long) - Constructor for class org.apache.iceberg.actions.BaseMigrateTableActionResult
- BaseOverwriteFiles - Class in org.apache.iceberg
- BaseOverwriteFiles(String, TableOperations) - Constructor for class org.apache.iceberg.BaseOverwriteFiles
- BasePageIterator - Class in org.apache.iceberg.parquet
- BasePageIterator(ColumnDescriptor, String) - Constructor for class org.apache.iceberg.parquet.BasePageIterator
- BasePageIterator.IntIterator - Class in org.apache.iceberg.parquet
- BaseParquetReaders<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetReaders() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetReaders
- BaseParquetWriter<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetWriter() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetWriter
- BasePositionDeltaWriter<T> - Class in org.apache.iceberg.io
- BasePositionDeltaWriter(PartitioningWriter<T, DataWriteResult>, PartitioningWriter<T, DataWriteResult>, PartitioningWriter<PositionDelete<T>, DeleteWriteResult>) - Constructor for class org.apache.iceberg.io.BasePositionDeltaWriter
- BaseReplacePartitions - Class in org.apache.iceberg
- BaseReplaceSortOrder - Class in org.apache.iceberg
- BaseRewriteDataFilesAction<ThisT> - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesAction(Table) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- BaseRewriteDataFilesFileGroupInfo - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesFileGroupInfo(int, int, StructLike) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesFileGroupInfo
- BaseRewriteDataFilesResult - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesResult(List<RewriteDataFiles.FileGroupRewriteResult>) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesResult
- BaseRewriteManifests - Class in org.apache.iceberg
- BaseRewriteManifestsActionResult - Class in org.apache.iceberg.actions
- BaseRewriteManifestsActionResult(Iterable<ManifestFile>, Iterable<ManifestFile>) - Constructor for class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- BaseScanTaskGroup<T extends ScanTask> - Class in org.apache.iceberg
- BaseScanTaskGroup(Collection<T>) - Constructor for class org.apache.iceberg.BaseScanTaskGroup
- BaseSessionCatalog - Class in org.apache.iceberg.catalog
- BaseSessionCatalog() - Constructor for class org.apache.iceberg.catalog.BaseSessionCatalog
- BaseSessionCatalog.AsCatalog - Class in org.apache.iceberg.catalog
- BaseSnapshotTableActionResult - Class in org.apache.iceberg.actions
- BaseSnapshotTableActionResult(long) - Constructor for class org.apache.iceberg.actions.BaseSnapshotTableActionResult
- BaseTable - Class in org.apache.iceberg
-
Base Table implementation.
- BaseTable(TableOperations, String) - Constructor for class org.apache.iceberg.BaseTable
- BaseTaskWriter<T> - Class in org.apache.iceberg.io
- BaseTaskWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.BaseTaskWriter
- BaseTaskWriter.BaseEqualityDeltaWriter - Class in org.apache.iceberg.io
-
Base equality delta writer to write both insert records and equality-deletes.
- BaseTaskWriter.RollingEqDeleteWriter - Class in org.apache.iceberg.io
- BaseTaskWriter.RollingFileWriter - Class in org.apache.iceberg.io
- BaseTransaction - Class in org.apache.iceberg
- BaseTransaction.TransactionTable - Class in org.apache.iceberg
- BaseTransaction.TransactionTableOperations - Class in org.apache.iceberg
- BaseVectorizedParquetValuesReader - Class in org.apache.iceberg.arrow.vectorized.parquet
-
A values reader for Parquet's run-length encoded data that reads column data in batches instead of one value at a time.
- BaseVectorizedParquetValuesReader(int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- batch(String, DataIterator<T>) - Method in interface org.apache.iceberg.flink.source.reader.DataIteratorBatcher
- BatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BatchReader
- before(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- before(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- beforeElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGDECIMAL_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BigDecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGINT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BigIntLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BINARY - org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
- BINARY - org.apache.iceberg.types.Type.TypeID
- BinaryAsDecimalReader(ColumnDescriptor, int) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BinaryAsDecimalReader
- BinaryType() - Constructor for class org.apache.iceberg.types.Types.BinaryType
- BinaryUtil - Class in org.apache.iceberg.util
- bind(Object) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- bind(Object) - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this field as a BoundField for the given receiver.
- bind(Object) - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a BoundMethod for the given receiver.
- bind(Schema) - Method in class org.apache.iceberg.UnboundPartitionSpec
- bind(Schema) - Method in class org.apache.iceberg.UnboundSortOrder
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.NamedReference
- bind(Types.StructType, boolean) - Method in interface org.apache.iceberg.expressions.Unbound
-
Bind this value expression to concrete types.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundPredicate
-
Bind this UnboundPredicate.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundTransform
- bind(Types.StructType, Expression, boolean) - Static method in class org.apache.iceberg.expressions.Binder
-
Replaces all unbound/named references with bound references to fields in the given struct.
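For illustration, a minimal sketch of binding an unbound predicate; only Binder.bind and Expressions.equal come from the entries above, the single-column schema is an assumed example.

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.expressions.Binder;
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.types.Types;

    // hypothetical schema with a single long column named "id"
    Schema schema = new Schema(Types.NestedField.required(1, "id", Types.LongType.get()));

    // replace the unbound reference to "id" with a bound reference, using case sensitive matching
    Expression bound = Binder.bind(schema.asStruct(), Expressions.equal("id", 5L), true);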
- Binder - Class in org.apache.iceberg.expressions
-
Rewrites expressions by replacing unbound named references with references to fields in a struct schema.
- binPack() - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
Choose BINPACK as a strategy for this rewrite operation
- binPack() - Method in class org.apache.iceberg.spark.actions.RewriteDataFilesSparkAction
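As a sketch of selecting the BINPACK strategy through the Spark action; spark and table are assumed to be an existing SparkSession and a loaded Iceberg Table, and the "min-input-files" option name is taken from BinPackStrategy.

    import org.apache.iceberg.actions.RewriteDataFiles;
    import org.apache.iceberg.spark.actions.SparkActions;

    RewriteDataFiles.Result result = SparkActions.get(spark)
        .rewriteDataFiles(table)
        .binPack()                          // choose BINPACK as the rewrite strategy
        .option("min-input-files", "5")     // only rewrite groups with at least 5 input files
        .execute();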
- BinPacking - Class in org.apache.iceberg.util
- BinPacking() - Constructor for class org.apache.iceberg.util.BinPacking
- BinPacking.ListPacker<T> - Class in org.apache.iceberg.util
- BinPacking.PackingIterable<T> - Class in org.apache.iceberg.util
- BinPackStrategy - Class in org.apache.iceberg.actions
-
A rewrite strategy for data files which determines which files to rewrite based on their size.
- BinPackStrategy() - Constructor for class org.apache.iceberg.actions.BinPackStrategy
- Blob - Class in org.apache.iceberg.puffin
- Blob(String, List<Integer>, long, long, ByteBuffer) - Constructor for class org.apache.iceberg.puffin.Blob
- Blob(String, List<Integer>, long, long, ByteBuffer, PuffinCompressionCodec, Map<String, String>) - Constructor for class org.apache.iceberg.puffin.Blob
- blobData() - Method in class org.apache.iceberg.puffin.Blob
- BlobMetadata - Class in org.apache.iceberg.puffin
- BlobMetadata(String, List<Integer>, long, long, long, long, String, Map<String, String>) - Constructor for class org.apache.iceberg.puffin.BlobMetadata
- blobs() - Method in class org.apache.iceberg.puffin.FileMetadata
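A minimal sketch of writing one blob to a Puffin file; the blob type, field IDs, snapshot ID, sequence number, payload bytes, and output path are all assumed placeholders, and table is a loaded Iceberg Table.

    import java.nio.ByteBuffer;
    import java.util.Collections;
    import org.apache.iceberg.io.OutputFile;
    import org.apache.iceberg.puffin.Blob;
    import org.apache.iceberg.puffin.Puffin;
    import org.apache.iceberg.puffin.PuffinWriter;

    OutputFile outputFile = table.io().newOutputFile("/warehouse/db/t/metadata/stats.puffin");  // placeholder path
    Blob blob = new Blob("example-blob-type", Collections.singletonList(1), snapshotId, sequenceNumber,
        ByteBuffer.wrap(payload));
    try (PuffinWriter writer = Puffin.write(outputFile).createdBy("example-app").build()) {
        writer.add(blob);  // PuffinWriter is a FileAppender<Blob>
    }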
- blockLocations(CombinedScanTask, Configuration) - Static method in class org.apache.iceberg.hadoop.Util
- blockLocations(FileIO, CombinedScanTask) - Static method in class org.apache.iceberg.hadoop.Util
- BOOLEAN - org.apache.iceberg.types.Type.TypeID
- booleanBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- BooleanBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BooleanBatchReader
- BooleanLiteralContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleans() - Static method in class org.apache.iceberg.avro.ValueReaders
- booleans() - Static method in class org.apache.iceberg.avro.ValueWriters
- booleans() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- booleans() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- booleans(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- BooleanType() - Constructor for class org.apache.iceberg.types.Types.BooleanType
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BooleanValueContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- Bound<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound value expression.
- BoundExpressionVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- BoundLiteralPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate(Expression.Operation, BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.BoundPredicate
- BoundReference<T> - Class in org.apache.iceberg.expressions
- boundReferences(Types.StructType, List<Expression>, boolean) - Static method in class org.apache.iceberg.expressions.Binder
- BoundSetPredicate<T> - Class in org.apache.iceberg.expressions
- BoundTerm<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound term.
- BoundTransform<S,T> - Class in org.apache.iceberg.expressions
-
A transform expression.
- BoundUnaryPredicate<T> - Class in org.apache.iceberg.expressions
- BoundVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- branchBuilder(long) - Static method in class org.apache.iceberg.SnapshotRef
- bucket() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
-
Return OSS bucket name.
- bucket(int, String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int) - Static method in class org.apache.iceberg.expressions.Expressions
- bucket(String, int) - Method in class org.apache.iceberg.PartitionSpec.Builder
- bucket(String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- bucket(String, int, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
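For example, a sketch of declaring a 16-bucket partition field with the spec builder; the schema and column name are assumptions.

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(Types.NestedField.required(1, "id", Types.LongType.get()));

    // partition by bucket(16, id); the partition field name defaults to "id_bucket"
    PartitionSpec spec = PartitionSpec.builderFor(schema).bucket("id", 16).build();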
- bucket(Type, int) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a bucket Transform for the given type and number of buckets.
- buffer() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
-
Opaque blob representing metadata about a file's encryption key.
- build() - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- build() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws RuntimeException if one was not found.
- build() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- build() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a UnboundField or throws a RuntimeException if there is none.
- build() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a UnboundMethod or throws a RuntimeException if there is none.
- build() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- build() - Method in class org.apache.iceberg.DataFiles.Builder
- build() - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.encryption.NativeFileCryptoParameters.Builder
- build() - Method in class org.apache.iceberg.FileMetadata.Builder
- build() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- build() - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- build() - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- build() - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.GenericManifestFile.CopyBuilder
- build() - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- build() - Method in class org.apache.iceberg.io.WriteResult.Builder
- build() - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- build() - Method in class org.apache.iceberg.PartitionSpec.Builder
- build() - Method in class org.apache.iceberg.puffin.Puffin.ReadBuilder
- build() - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
- build() - Method in class org.apache.iceberg.rest.HTTPClient.Builder
- build() - Method in class org.apache.iceberg.rest.requests.CreateNamespaceRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.CreateTableRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.RenameTableRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.UpdateNamespacePropertiesRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.UpdateTableRequest.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ConfigResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.CreateNamespaceResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ErrorResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.GetNamespaceResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- build() - Method in class org.apache.iceberg.ScanSummary.Builder
-
Summarizes a table scan as a map of partition key to metrics for that partition.
- build() - Method in class org.apache.iceberg.SnapshotRef.Builder
- build() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- build() - Method in class org.apache.iceberg.SortOrder.Builder
- build() - Method in interface org.apache.iceberg.spark.procedures.SparkProcedures.ProcedureBuilder
- build() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- build() - Method in class org.apache.iceberg.TableMetadata.Builder
- build() - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriteBuilder
-
Returns a logical delta write.
- build(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundField or throws a RuntimeException if there is none.
- build(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a RuntimeException if there is none.
- buildAvroProjection(Schema, Schema, Map<String, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- buildChecked() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws ClassNotFoundException if one was not found.
- buildChecked() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- buildChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a UnboundField or throws a NoSuchFieldException if there is none.
- buildChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a UnboundMethod or throws a NoSuchMethodException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundField or throws a NoSuchFieldException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a NoSuchMethodException if there is none.
- buildCopyOnWriteDistribution(Table, RowLevelOperation.Command, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildCopyOnWriteOrdering(Table, RowLevelOperation.Command, Distribution) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildCopyOnWriteScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- builder() - Static method in class org.apache.iceberg.common.DynClasses
- builder() - Static method in class org.apache.iceberg.common.DynConstructors
- builder() - Static method in class org.apache.iceberg.common.DynFields
- builder() - Static method in class org.apache.iceberg.flink.source.IcebergSource
- builder() - Static method in class org.apache.iceberg.flink.source.ScanContext
- builder() - Static method in class org.apache.iceberg.io.WriteResult
- builder() - Static method in class org.apache.iceberg.rest.HTTPClient
- builder() - Static method in class org.apache.iceberg.rest.requests.CreateNamespaceRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.CreateTableRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.RenameTableRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.UpdateNamespacePropertiesRequest
- builder() - Static method in class org.apache.iceberg.rest.responses.ConfigResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.CreateNamespaceResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ErrorResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.GetNamespaceResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ListNamespacesResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ListTablesResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.LoadTableResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.OAuthTokenResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse
- builder() - Static method in class org.apache.iceberg.SnapshotSummary
- builder() - Static method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- builder(Class<?>) - Static method in class org.apache.iceberg.common.DynConstructors
- builder(String) - Static method in class org.apache.iceberg.common.DynMethods
-
Constructs a new builder for calling methods dynamically.
- builder(PartitionSpec) - Static method in class org.apache.iceberg.DataFiles
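A hedged sketch of describing an existing file with DataFiles.Builder and appending it to a table; the path, size, and record count are placeholders, and table is a loaded unpartitioned Table.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.DataFiles;
    import org.apache.iceberg.FileFormat;
    import org.apache.iceberg.PartitionSpec;

    DataFile dataFile = DataFiles.builder(PartitionSpec.unpartitioned())
        .withPath("/warehouse/db/t/data/file-a.parquet")   // placeholder path
        .withFormat(FileFormat.PARQUET)
        .withFileSizeInBytes(1024L)                        // placeholder size
        .withRecordCount(100L)                             // placeholder count
        .build();

    table.newAppend().appendFile(dataFile).commit();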
- Builder() - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder() - Constructor for class org.apache.iceberg.flink.source.FlinkSource.Builder
- Builder(int) - Constructor for class org.apache.iceberg.DoubleFieldMetrics.Builder
- Builder(int) - Constructor for class org.apache.iceberg.FloatFieldMetrics.Builder
- Builder(Class<?>) - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder(Iterable<I>) - Constructor for class org.apache.iceberg.util.Tasks.Builder
- Builder(String) - Constructor for class org.apache.iceberg.common.DynMethods.Builder
- Builder(PartitionSpec) - Constructor for class org.apache.iceberg.DataFiles.Builder
- Builder(Table) - Constructor for class org.apache.iceberg.FindFiles.Builder
- Builder(TableMetadata, boolean) - Constructor for class org.apache.iceberg.rest.requests.UpdateTableRequest.Builder
- Builder(TableScan) - Constructor for class org.apache.iceberg.ScanSummary.Builder
- builderFor(int) - Method in class org.apache.iceberg.FloatFieldMetrics
- builderFor(long, SnapshotRefType) - Static method in class org.apache.iceberg.SnapshotRef
- builderFor(DataStream<T>, MapFunction<T, RowData>, TypeInformation<RowData>) - Static method in class org.apache.iceberg.flink.sink.FlinkSink
-
Initialize a FlinkSink.Builder to export data from a generic input data stream into an Iceberg table.
- builderFor(Schema) - Static method in class org.apache.iceberg.PartitionSpec
-
Creates a new partition spec builder for the given Schema.
- builderFor(Schema) - Static method in class org.apache.iceberg.SortOrder
-
Creates a new sort order builder for the given Schema.
- builderFor(TableMetadata) - Static method in class org.apache.iceberg.rest.requests.UpdateTableRequest
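Referring to the SortOrder.builderFor(Schema) entry above, a sketch of building a sort order; the schema and the "category" and "ts" column names are assumptions.

    import org.apache.iceberg.NullOrder;
    import org.apache.iceberg.SortOrder;

    SortOrder order = SortOrder.builderFor(schema)
        .asc("category")                    // sort ascending by category
        .desc("ts", NullOrder.NULLS_LAST)   // then descending by ts, nulls last
        .build();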
- builderFor(Table, int, long) - Static method in class org.apache.iceberg.io.OutputFileFactory
- builderForCreate() - Static method in class org.apache.iceberg.rest.requests.UpdateTableRequest
- builderForReplace(TableMetadata) - Static method in class org.apache.iceberg.rest.requests.UpdateTableRequest
- builderFrom(SnapshotRef) - Static method in class org.apache.iceberg.SnapshotRef
- builderFrom(SnapshotRef, long) - Static method in class org.apache.iceberg.SnapshotRef
-
Creates a ref builder from the given ref and its properties but the ref will now point to the given snapshotId.
- buildFormat() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- buildFrom(TableMetadata) - Static method in class org.apache.iceberg.TableMetadata
- buildFromEmpty() - Static method in class org.apache.iceberg.TableMetadata
- buildIcebergCatalog(String, Map<String, String>, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Build an Iceberg Catalog based on a map of catalog properties and optional Hadoop configuration.
- buildIcebergCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg Catalog to be used by this Spark catalog adapter.
- buildIdentifier(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg TableIdentifier for the given Spark identifier.
- buildList(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- buildList(List<E>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- buildMap(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- buildMap(Map<K, V>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- buildMergeOnReadScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildOrcProjection(Schema, TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Converts an Iceberg schema to a corresponding ORC schema within the context of an existing ORC file schema.
- buildPositionDeltaDistribution(Table, RowLevelOperation.Command, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildPositionDeltaOrdering(Table, RowLevelOperation.Command) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildPositionWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- buildReader(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkOrcReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroValueReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(Schema, MessageType, boolean) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, boolean, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, boolean, Map<Integer, ?>, DeleteFilter<InternalRow>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(MessageType, Schema, Map<Integer, Object>) - Static method in class org.apache.iceberg.pig.PigParquetReader
- buildReplacement(Schema, PartitionSpec, SortOrder, String, Map<String, String>) - Method in class org.apache.iceberg.TableMetadata
- buildRequiredDistribution(Table, DistributionMode) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildRequiredOrdering(Table, Distribution) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- buildSortOrder(Schema, PartitionSpec, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
-
Build a final sort order that satisfies the clustering required by the partition spec.
- buildSortOrder(Table) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSortOrder(Table, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
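As a sketch, given a loaded Table (assumed), the combined order can be obtained directly:

    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.util.SortOrderUtil;

    // a sort order that also satisfies the clustering required by the table's partition spec
    SortOrder writeOrder = SortOrderUtil.buildSortOrder(table);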
- buildSparkCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
-
Build a SparkCatalog to be used for Iceberg operations.
- buildStatic() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a StaticField or throws a RuntimeException if there is none.
- buildStatic() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a RuntimeException if there is none.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a StaticField or throws a NoSuchFieldException if there is none.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a NoSuchMethodException if there is none.
- buildStruct(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- buildTable(String, Schema) - Method in class org.apache.iceberg.hadoop.HadoopTables
- buildTable(SessionCatalog.SessionContext, TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a builder to create a table or start a create/replace transaction.
- buildTable(SessionCatalog.SessionContext, TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.CachingCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- buildTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Instantiate a builder to either create a table or start a create/replace transaction.
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTCatalog
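A minimal sketch of using the returned table builder; the catalog, schema, partition spec, and identifier are assumptions.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;

    Table created = catalog.buildTable(TableIdentifier.of("db", "events"), schema)
        .withPartitionSpec(spec)
        .withProperty("write.format.default", "parquet")
        .create();  // or createTransaction() / replaceTransaction() to stage the change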
- buildWriter(LogicalType, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetWriters
- buildWriter(RowType, Schema) - Static method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- buildWriter(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroWriter
- buildWriter(StructType, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetWriters
- BulkDeletionFailureException - Exception in org.apache.iceberg.io
- BulkDeletionFailureException(int) - Constructor for exception org.apache.iceberg.io.BulkDeletionFailureException
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- byId() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from field ID to full name.
- byName() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from full field name to ID.
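A sketch of the TypeUtil helpers that apply this visitor to a schema; the schema itself is assumed.

    import java.util.Map;
    import org.apache.iceberg.types.TypeUtil;

    Map<String, Integer> idsByName = TypeUtil.indexByName(schema.asStruct());   // full name -> field ID
    Map<Integer, String> namesById = TypeUtil.indexNameById(schema.asStruct()); // field ID -> full name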
- ByteArrayReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.ByteArrayReader
- byteArrays() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- ByteBufferInputStream - Class in org.apache.iceberg.io
- ByteBufferInputStream() - Constructor for class org.apache.iceberg.io.ByteBufferInputStream
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueReaders
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueWriters
- byteBuffers() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- byteBuffers(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- ByteBuffers - Class in org.apache.iceberg.util
- bytes() - Static method in class org.apache.iceberg.avro.ValueReaders
- bytes() - Static method in class org.apache.iceberg.avro.ValueWriters
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- bytes() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- BYTES - org.apache.iceberg.metrics.MetricsContext.Unit
- BYTES_ASC - org.apache.iceberg.RewriteJobOrder
- BYTES_DESC - org.apache.iceberg.RewriteJobOrder
- BytesReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BytesReader
- byteTruncateOrFill(byte[], int, ByteBuffer) - Static method in class org.apache.iceberg.util.ZOrderByteUtils
-
Return a bytebuffer with the given bytes truncated to length, or filled with 0's to length depending on whether the given bytes are larger or smaller than the given length.
C
- CACHE_ENABLED - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls whether the catalog will cache table entries upon load.
- CACHE_ENABLED - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- CACHE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls the duration for which entries in the catalog are cached.
- CACHE_EXPIRATION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS_OFF - Static variable in class org.apache.iceberg.CatalogProperties
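As a sketch, the caching properties above can be passed when building a catalog through CatalogUtil.buildIcebergCatalog; the catalog name and metastore URI are placeholders.

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.CatalogProperties;
    import org.apache.iceberg.CatalogUtil;
    import org.apache.iceberg.catalog.Catalog;

    Map<String, String> props = new HashMap<>();
    props.put(CatalogUtil.ICEBERG_CATALOG_TYPE, CatalogUtil.ICEBERG_CATALOG_TYPE_HIVE);
    props.put(CatalogProperties.URI, "thrift://metastore:9083");          // placeholder URI
    props.put(CatalogProperties.CACHE_ENABLED, "true");
    props.put(CatalogProperties.CACHE_EXPIRATION_INTERVAL_MS, "60000");   // expire cached tables after 1 minute

    Catalog catalog = CatalogUtil.buildIcebergCatalog("hive_catalog", props, new Configuration());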
- CachedClientPool - Class in org.apache.iceberg.hive
- CachingCatalog - Class in org.apache.iceberg
-
Class that wraps an Iceberg Catalog to cache tables.
- CachingCatalog(Catalog, boolean, long, Ticker) - Constructor for class org.apache.iceberg.CachingCatalog
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- call(InternalRow) - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Executes this procedure.
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callArgument(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CallArgumentContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallArgumentContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callInit() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- cancel() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- canContainAny(ManifestFile, Iterable<StructLike>, Function<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canContainAny(ManifestFile, Iterable<Pair<Integer, StructLike>>, Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canDeleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- canMerge(ScanTask) - Method in interface org.apache.iceberg.MergeableScanTask
-
Checks if this task can merge with a given task.
- canTransform(Type) - Method in interface org.apache.iceberg.transforms.Transform
-
Checks whether this function can be applied to the given Type.
- canTransform(Type) - Method in class org.apache.iceberg.transforms.UnknownTransform
- capabilities() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- capabilities() - Method in class org.apache.iceberg.spark.source.SparkTable
- CASE_SENSITIVE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CASE_SENSITIVE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- caseInsensitive() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- caseInsensitive() - Method in class org.apache.iceberg.FindFiles.Builder
- caseInsensitive() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseInsensitiveField(String) - Method in class org.apache.iceberg.types.Types.StructType
- caseInsensitiveFindField(String) - Method in class org.apache.iceberg.Schema
-
Returns a sub-field by name as a Types.NestedField.
- caseInsensitiveSelect(String...) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case insensitive names
- caseInsensitiveSelect(Collection<String>) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case insensitive names
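For example, a sketch of a case-insensitive projection; the schema and column names are assumptions.

    import org.apache.iceberg.Schema;

    // matches columns named "id" and "event_time" regardless of the case used here
    Schema projection = schema.caseInsensitiveSelect("ID", "Event_Time");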
- caseSensitive() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- caseSensitive() - Method in class org.apache.iceberg.flink.source.ScanContext
- caseSensitive() - Method in class org.apache.iceberg.spark.SparkReadConf
- caseSensitive(boolean) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Enables or disables case sensitive expression binding.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.DeleteFiles
-
Enables or disables case sensitive expression binding for methods that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.FindFiles.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.ManifestReader
- caseSensitive(boolean) - Method in class org.apache.iceberg.MicroBatches.MicroBatchBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.OverwriteFiles
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.RowDelta
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.Scan
-
Create a new scan from this that, if data columns were selected via Scan.select(java.util.Collection), controls whether the match to the schema will be done with case sensitivity.
- caseSensitive(boolean) - Method in class org.apache.iceberg.SortOrder.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Set whether column resolution in the source schema should be case sensitive.
- castAndThrow(Throwable, Class<E>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- catalog() - Method in class org.apache.iceberg.flink.FlinkCatalog
- catalog() - Method in class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- Catalog - Interface in org.apache.iceberg.catalog
-
A Catalog API for table create, drop, and load operations.
- CATALOG - Static variable in class org.apache.iceberg.mr.InputFormatConfig
-
Deprecated. Please use InputFormatConfig.catalogPropertyConfigKey(String, String) with config key CatalogUtil.ICEBERG_CATALOG_TYPE to specify the type of a catalog.
- CATALOG_CONFIG_PREFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CATALOG_IMPL - Static variable in class org.apache.iceberg.CatalogProperties
- CATALOG_LOADER_CLASS - Static variable in class org.apache.iceberg.mr.InputFormatConfig
-
Deprecated. Please use InputFormatConfig.catalogPropertyConfigKey(String, String) with config key CatalogProperties.CATALOG_IMPL to specify the implementation of a catalog.
- CATALOG_NAME - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CATALOG_SCOPE - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
-
Scope for OAuth2 flows.
- Catalog.TableBuilder - Interface in org.apache.iceberg.catalog
-
A builder used to create valid tables or start create/replace transactions.
- catalogAndIdentifier(String, SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(String, SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(List<String>, Function<String, C>, BiFunction<String[], String, T>, C, String[]) - Static method in class org.apache.iceberg.spark.SparkUtil
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply; attempts to find the catalog and identifier that a multipart identifier represents.
- catalogAndIdentifier(SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply; attempts to find the catalog and identifier that a multipart identifier represents.
- CatalogAndIdentifier(Pair<CatalogPlugin, Identifier>) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogAndIdentifier(CatalogPlugin, Identifier) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogHandlers - Class in org.apache.iceberg.rest
- CatalogLoader - Interface in org.apache.iceberg.flink
-
Serializable loader to load an Iceberg Catalog.
- CatalogLoader.CustomCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HadoopCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HiveCatalogLoader - Class in org.apache.iceberg.flink
- catalogName(Configuration, String) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the catalog name serialized to the configuration.
- CatalogProperties - Class in org.apache.iceberg
- catalogPropertyConfigKey(String, String) - Static method in class org.apache.iceberg.mr.InputFormatConfig
-
Get Hadoop config key of a catalog property based on catalog name
- Catalogs - Class in org.apache.iceberg.mr
-
Class for catalog resolution and accessing the common functions for the Catalog API.
- CatalogUtil - Class in org.apache.iceberg
- CHANGED_PARTITION_COUNT_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- CHANGED_PARTITION_PREFIX - Static variable in class org.apache.iceberg.SnapshotSummary
- ChangelogOperation - Enum in org.apache.iceberg
-
An enum representing possible operations in a changelog.
- ChangelogScanTask - Interface in org.apache.iceberg
-
A changelog scan task.
- changeOrdinal() - Method in interface org.apache.iceberg.ChangelogScanTask
-
Returns the ordinal of changes produced by this task.
- changes() - Method in class org.apache.iceberg.TableMetadata
- channelNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- channelReadChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- channelWriteChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- charAt(int) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- charSequences() - Static method in class org.apache.iceberg.types.Comparators
- CharSequenceSet - Class in org.apache.iceberg.util
- CharSequenceWrapper - Class in org.apache.iceberg.util
-
Wrapper class to adapt CharSequence for use in maps and sets.
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.NoSuchIcebergTableException
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.ValidationException
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_NULLABILITY_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_ORDERING_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- checkAndSetIoConfig(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it populates the FileIO's hadoop configuration with the input config object.
- checkAndSkipIoConfigSerialization(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it ensures that the FileIO's hadoop configuration will not be serialized.
- checkCommitStatus(String, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
-
Attempt to load the table and see if any current or past metadata location matches the one we were attempting to set.
- checkCompatibility(SortOrder, Schema) - Static method in class org.apache.iceberg.SortOrder
- CheckCompatibility - Class in org.apache.iceberg.types
- checkNullability() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOrdering() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOutputSpecs(FileSystem, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- cherrypick(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Apply supported changes in given snapshot and create a new snapshot which will be set as the current snapshot on commit.
- cherrypick(long) - Method in class org.apache.iceberg.SnapshotManager
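A minimal sketch, assuming table is a loaded Table and snapshotId identifies the snapshot to pick:

    // apply the snapshot's supported changes and set the result as the current snapshot
    table.manageSnapshots()
        .cherrypick(snapshotId)
        .commit();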
- CherrypickAncestorCommitException - Exception in org.apache.iceberg.exceptions
-
This exception occurs when one cherrypicks an ancestor or when the picked snapshot is already linked to a published ancestor.
- CherrypickAncestorCommitException(long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- CherrypickAncestorCommitException(long, long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- child() - Method in class org.apache.iceberg.expressions.Not
- childColumn(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- Ciphers - Class in org.apache.iceberg.encryption
- Ciphers() - Constructor for class org.apache.iceberg.encryption.Ciphers
- Ciphers.AesGcmDecryptor - Class in org.apache.iceberg.encryption
- Ciphers.AesGcmEncryptor - Class in org.apache.iceberg.encryption
- classLoader(ClassLoader) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- cleanExpiredFiles(boolean) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Allows expiration of snapshots without any cleanup of underlying manifest or data files.
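For example, a sketch that expires week-old snapshots without deleting their files; table is a loaded Table and the cutoff is a placeholder.

    long olderThanMillis = System.currentTimeMillis() - 7L * 24 * 60 * 60 * 1000;  // seven days ago

    table.expireSnapshots()
        .expireOlderThan(olderThanMillis)
        .cleanExpiredFiles(false)   // keep manifests and data files; only drop snapshot metadata
        .commit();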
- cleanUncommitted(Set<ManifestFile>) - Method in class org.apache.iceberg.BaseRewriteManifests
- clear() - Method in class org.apache.iceberg.DataFiles.Builder
- clear() - Method in class org.apache.iceberg.FileMetadata.Builder
- clear() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- clear() - Method in class org.apache.iceberg.util.CharSequenceSet
- clear() - Method in class org.apache.iceberg.util.PartitionSet
- clear() - Method in class org.apache.iceberg.util.SerializableMap
- clear() - Method in class org.apache.iceberg.util.StructLikeMap
- clear() - Method in class org.apache.iceberg.util.StructLikeSet
- clearRewrite(Table, String) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
- CLIENT_ACCESS_KEY_ID - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_ACCESS_KEY_SECRET - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_ASSUME_ROLE_ARN - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_EXTERNAL_ID - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_REGION - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TAGS_PREFIX - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory to pass a list of sessions.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- CLIENT_ENABLE_ETAG_CHECK_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Deprecated. Will be removed at 0.15.0, please use AwsProperties.S3_CHECKSUM_ENABLED_DEFAULT instead.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
The implementation class of AliyunClientFactory to customize Aliyun client configurations.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The implementation class of AwsClientFactory to customize AWS client configurations.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.dell.DellProperties
-
The implementation class of DellClientFactory to customize Dell client configurations.
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_SIZE - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- clientLibToken() - Method in class org.apache.iceberg.gcp.GCPProperties
- ClientPool<C,E extends java.lang.Exception> - Interface in org.apache.iceberg
- ClientPool.Action<R,C,E extends java.lang.Exception> - Interface in org.apache.iceberg
- ClientPoolImpl<C,E extends java.lang.Exception> - Class in org.apache.iceberg
- ClientPoolImpl(int, Class<? extends E>, boolean) - Constructor for class org.apache.iceberg.ClientPoolImpl
- clone(RowData, RowData, RowType, TypeSerializer[]) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
-
Similar to the private RowDataSerializer.copyRowData(RowData, RowData) method.
- close() - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSInputStream
-
Deprecated.
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- close() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Called to close all the columns in this batch.
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
- close() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- close() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
- close() - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- close() - Method in class org.apache.iceberg.aws.s3.S3FileIO
- close() - Method in class org.apache.iceberg.ClientPoolImpl
- close() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- close() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- close() - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- close() - Method in class org.apache.iceberg.dell.ecs.EcsFileIO
- close() - Method in class org.apache.iceberg.flink.FlinkCatalog
- close() - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssigner
-
Some assigners may need to perform certain actions when their corresponding enumerators are closed
- close() - Method in class org.apache.iceberg.flink.source.DataIterator
- close() - Method in class org.apache.iceberg.flink.source.enumerator.ContinuousIcebergEnumerator
- close() - Method in class org.apache.iceberg.flink.source.enumerator.ContinuousSplitPlannerImpl
- close() - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- close() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- close() - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- close() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- close() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- close() - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- close() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- close() - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- close() - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- close() - Method in class org.apache.iceberg.io.CloseableGroup
-
Close all the registered resources.
- close() - Method in class org.apache.iceberg.io.DataWriter
- close() - Method in interface org.apache.iceberg.io.FileIO
-
Close File IO to release underlying resources.
- close() - Method in class org.apache.iceberg.io.FilterIterator
- close() - Method in class org.apache.iceberg.io.PartitionedFanoutWriter
- close() - Method in class org.apache.iceberg.io.PartitionedWriter
- close() - Method in class org.apache.iceberg.io.ResolvingFileIO
- close() - Method in class org.apache.iceberg.io.UnpartitionedWriter
- close() - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- close() - Method in class org.apache.iceberg.ManifestWriter
- close() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- close() - Method in class org.apache.iceberg.nessie.NessieCatalog
- close() - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- close() - Method in class org.apache.iceberg.orc.VectorizedRowBatchIterator
- close() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- close() - Method in interface org.apache.iceberg.parquet.VectorizedReader
-
Release any resources allocated.
- close() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- close() - Method in class org.apache.iceberg.puffin.PuffinReader
- close() - Method in class org.apache.iceberg.puffin.PuffinWriter
- close() - Method in class org.apache.iceberg.rest.HTTPClient
- close() - Method in class org.apache.iceberg.rest.RESTCatalog
- close() - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- close() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- close() - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- close(C) - Method in class org.apache.iceberg.ClientPoolImpl
- close(Closeable, boolean) - Static method in class org.apache.iceberg.util.Exceptions
- close(IMetaStoreClient) - Method in class org.apache.iceberg.hive.HiveClientPool
- CloseableGroup - Class in org.apache.iceberg.io
-
This class acts as a helper for handling the closure of multiple resources.
- CloseableGroup() - Constructor for class org.apache.iceberg.io.CloseableGroup
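Example (non-normative): a minimal sketch of the CloseableGroup helper indexed above; addCloseable(...) is assumed here, since only the constructor and close() appear in this excerpt.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.hadoop.HadoopFileIO;
    import org.apache.iceberg.io.CloseableGroup;

    CloseableGroup closeables = new CloseableGroup();
    HadoopFileIO io = new HadoopFileIO(new Configuration());
    closeables.addCloseable(io);   // addCloseable(...) is assumed, not shown in this excerpt
    // ... use io ...
    closeables.close();            // releases every registered resource in one call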
- CloseableIterable<T> - Interface in org.apache.iceberg.io
- CloseableIterable.ConcatCloseableIterable<E> - Class in org.apache.iceberg.io
- CloseableIterator<T> - Interface in org.apache.iceberg.io
- closeVectors() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- ClosingIterator<T> - Class in org.apache.iceberg.io
-
A convenience wrapper around CloseableIterator, providing auto-close functionality when all of the elements in the iterator are consumed.
- ClosingIterator(CloseableIterator<T>) - Constructor for class org.apache.iceberg.io.ClosingIterator
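Example (non-normative): a minimal sketch of wrapping a CloseableIterator with the ClosingIterator constructor indexed above; the files iterable is a hypothetical scan result.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.io.CloseableIterator;
    import org.apache.iceberg.io.ClosingIterator;

    // files is a hypothetical CloseableIterable<DataFile>, e.g. produced by a table scan
    CloseableIterator<DataFile> source = files.iterator();
    ClosingIterator<DataFile> iterator = new ClosingIterator<>(source);
    while (iterator.hasNext()) {
      DataFile file = iterator.next();
      // the wrapped iterator is closed automatically once the last element is consumed
    }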
- clusterBy(Function<DataFile, Object>) - Method in class org.apache.iceberg.BaseRewriteManifests
- clusterBy(Function<DataFile, Object>) - Method in interface org.apache.iceberg.RewriteManifests
-
Groups an existing DataFile by a cluster key produced by a function.
- ClusteredDataWriter<T> - Class in org.apache.iceberg.io
-
A data writer capable of writing to multiple specs and partitions that requires the incoming records to be properly clustered by partition spec and by partition within each spec.
- ClusteredDataWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredDataWriter
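Example (non-normative): a minimal sketch of the RewriteManifests.clusterBy(Function<DataFile, Object>) call indexed above; Table.rewriteManifests() is assumed as the entry point.

    import org.apache.iceberg.Table;

    table.rewriteManifests()                    // rewriteManifests() on Table is assumed here
        .clusterBy(file -> file.partition())    // group manifest entries by partition value
        .commit();                              // commit() from PendingUpdate, indexed below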
- ClusteredEqualityDeleteWriter<T> - Class in org.apache.iceberg.io
-
An equality delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredEqualityDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- ClusteredPositionDeleteWriter<T> - Class in org.apache.iceberg.io
-
A position delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredPositionDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- clusterHadoopConf() - Static method in class org.apache.iceberg.flink.FlinkCatalogFactory
- code() - Method in class org.apache.iceberg.rest.responses.ErrorResponse
- codecName() - Method in enum org.apache.iceberg.puffin.PuffinCompressionCodec
- collect() - Method in class org.apache.iceberg.FindFiles.Builder
-
Returns all files in the table that match all of the filters.
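Example (non-normative): a minimal sketch of FindFiles.Builder.collect() indexed above; FindFiles.in(...) and withRecordsMatching(...) are assumed entry points not shown in this excerpt.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.FindFiles;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.io.CloseableIterable;

    CloseableIterable<DataFile> files = FindFiles.in(table)                    // assumed entry point
        .withRecordsMatching(Expressions.equal("event_date", "2022-01-01"))   // assumed filter method
        .collect();                                                            // indexed above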
- collections(int, int, ParquetValueWriter<E>) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- column - Variable in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column - Variable in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- column() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- column(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Returns the column at `ordinal`.
- COLUMN_SIZES - Static variable in interface org.apache.iceberg.DataFile
- ColumnarBatch - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnarBatch.
- ColumnarBatchReader - Class in org.apache.iceberg.spark.data.vectorized
-
VectorizedReader that returns Spark's ColumnarBatch to support Spark's vectorized read path.
- ColumnarBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- columnIsDeletedPosition() - Method in class org.apache.iceberg.data.DeleteFilter
- ColumnIterator<T> - Class in org.apache.iceberg.parquet
- columnMode(String) - Method in class org.apache.iceberg.MetricsConfig
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- columns() - Method in class org.apache.iceberg.Schema
-
Returns a List of the columns in this Schema.
- columnSizes() - Method in interface org.apache.iceberg.ContentFile
-
Returns, if collected, a map from column ID to the size of the column in bytes; null otherwise.
- columnSizes() - Method in class org.apache.iceberg.Metrics
-
Get the number of bytes for all fields in a file.
- columnSizes() - Method in class org.apache.iceberg.spark.SparkDataFile
- ColumnVector - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnVector.
- ColumnVectorWithFilter - Class in org.apache.iceberg.spark.data.vectorized
- ColumnVectorWithFilter(VectorHolder, int[]) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- ColumnWriter<T> - Class in org.apache.iceberg.parquet
- combine(Iterable<E>, Closeable) - Static method in interface org.apache.iceberg.io.CloseableIterable
- CombinedScanTask - Interface in org.apache.iceberg
-
A scan task made of several ranges from files.
- commit() - Method in class org.apache.iceberg.BaseReplaceSortOrder
- commit() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and commit.
- commit() - Method in class org.apache.iceberg.SetLocation
- commit() - Method in class org.apache.iceberg.SnapshotManager
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseTransaction.TransactionTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.StaticTableOperations
- commit(TableMetadata, TableMetadata) - Method in interface org.apache.iceberg.TableOperations
-
Replace the base table metadata with a new version.
- commit(Offset) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- COMMIT_FILE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_FILE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_MAX_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MAX_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TABLE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TABLE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TOTAL_RETRY_TIME_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TOTAL_RETRY_TIME_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- commitCreateTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- commitDropTable(Table, boolean) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- CommitFailedException - Exception in org.apache.iceberg.exceptions
-
Exception raised when a commit fails because of out of date metadata.
- CommitFailedException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- CommitFailedException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- commitFileGroups(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Perform a commit operation on the table adding and removing files as required for this set of file groups
- commitJob(JobContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Reads the commit files stored in the temp directories and collects the generated committed data files.
- CommitMetadata - Class in org.apache.iceberg.spark
-
Utility class to accept thread-local commit properties.
- commitOrClean(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- commitProperties() - Static method in class org.apache.iceberg.spark.CommitMetadata
- commitSnapshotId() - Method in interface org.apache.iceberg.ChangelogScanTask
-
Returns the snapshot ID in which the changes were committed.
- commitStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- commitStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- CommitStateUnknownException - Exception in org.apache.iceberg.exceptions
-
Exception for a failure to confirm either affirmatively or negatively that a commit was applied.
- CommitStateUnknownException(Throwable) - Constructor for exception org.apache.iceberg.exceptions.CommitStateUnknownException
- commitTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Collects the generated data files and creates a commit file storing the data file list.
- commitTransaction() - Method in class org.apache.iceberg.BaseTransaction
- commitTransaction() - Method in interface org.apache.iceberg.Transaction
-
Apply the pending changes from all actions and commit.
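Example (non-normative): a minimal sketch of committing through a transaction; Table.newTransaction() and the newAppend()/appendFile(...) update APIs are assumed, while commit() and commitTransaction() are indexed here.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.Transaction;

    Transaction txn = table.newTransaction();   // newTransaction() is assumed here
    txn.newAppend()                             // newAppend()/appendFile(...) are assumed here
        .appendFile(dataFile)                   // dataFile: a previously built DataFile
        .commit();                              // stages the append inside the transaction
    txn.commitTransaction();                    // applies all pending changes in a single commit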
- comparator() - Method in interface org.apache.iceberg.expressions.BoundTerm
-
Returns a Comparator for values produced by this term.
- comparator() - Method in interface org.apache.iceberg.expressions.Literal
-
Return a Comparator for values.
- Comparators - Class in org.apache.iceberg.types
- compareToFileList(Dataset<Row>) - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- CompatibilityTaskAttemptContextImpl(Configuration, TaskAttemptID, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat.CompatibilityTaskAttemptContextImpl
- compatibleWith(PartitionSpec) - Method in class org.apache.iceberg.PartitionSpec
-
Returns true if this spec is equivalent to the other, with partition field ids ignored.
- complete() - Method in class org.apache.iceberg.io.BaseTaskWriter
- complete() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data and delete files.
- COMPLETED - org.apache.iceberg.flink.source.split.IcebergSourceSplitStatus
- compressBlobs(PuffinCompressionCodec) - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Configures the writer to compress the blobs.
- compressFooter() - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Configures the writer to compress the footer.
- COMPRESSION_FACTOR - Static variable in class org.apache.iceberg.spark.actions.SparkSortStrategy
-
The number of shuffle partitions and consequently the number of output files created by the Spark Sort is based on the size of the input data files used in this rewrite operation.
- compressionCodec() - Method in class org.apache.iceberg.puffin.BlobMetadata
- concat(Iterable<File>, File, int, Schema, Map<String, String>) - Static method in class org.apache.iceberg.parquet.Parquet
-
Combines several files into one
- concat(Iterable<CloseableIterable<E>>) - Static method in interface org.apache.iceberg.io.CloseableIterable
- conf() - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- conf() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- config() - Static method in class org.apache.iceberg.rest.ResourcePaths
- config() - Method in class org.apache.iceberg.rest.responses.LoadTableResponse
- config(String, String) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- config(String, String) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
-
Deprecated. Please use #set(String, String) instead
- CONFIG - Static variable in class org.apache.iceberg.flink.actions.Actions
- CONFIG_SERIALIZATION_DISABLED - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CONFIG_SERIALIZATION_DISABLED_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- ConfigBuilder(Configuration) - Constructor for class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- ConfigProperties - Class in org.apache.iceberg.hadoop
- ConfigResponse - Class in org.apache.iceberg.rest.responses
-
Represents a response to requesting server-side provided configuration for the REST catalog.
- ConfigResponse() - Constructor for class org.apache.iceberg.rest.responses.ConfigResponse
- ConfigResponse.Builder - Class in org.apache.iceberg.rest.responses
- Configurable<C> - Interface in org.apache.iceberg.hadoop
-
Interface used to avoid runtime dependencies on Hadoop Configurable
- configure(Configuration) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- configure(JobConf) - Static method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
-
Configures the JobConf to use the MapredIcebergInputFormat and returns a helper to add further configuration.
- configure(Job) - Static method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
-
Configures the Job to use the IcebergInputFormat and returns a helper to add further configuration.
- configure(T) - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
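Example (non-normative): a minimal sketch of IcebergInputFormat.configure(Job) indexed above; the ConfigBuilder's readFrom(...) call and the table identifier are assumptions.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.iceberg.mr.InputFormatConfig;
    import org.apache.iceberg.mr.mapreduce.IcebergInputFormat;

    Job job = Job.getInstance(new Configuration());
    // configure(Job) sets the input format and returns a helper for further configuration
    InputFormatConfig.ConfigBuilder builder = IcebergInputFormat.configure(job);
    builder.readFrom("db.events");   // readFrom(...) and the identifier are assumed here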
- configureDataWrite(Avro.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(ORC.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(Parquet.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEndpoint(T, String) - Static method in class org.apache.iceberg.aws.AwsClientFactories
- configureEqualityDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureHadoopConf(Object, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Dynamically detects whether an object is a Hadoop Configurable and calls setConf.
- configureHttpClientBuilder(String) - Static method in class org.apache.iceberg.aws.AwsClientFactories
- configureInputJobCredentials(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureInputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureJobConf(TableDesc, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureOutputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configurePositionDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureTableJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- conflictDetectionFilter(Expression) - Method in class org.apache.iceberg.BaseOverwriteFiles
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.OverwriteFiles
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.RowDelta
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- constant(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant(C) - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- ConstantContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- ConstantContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- constantHolder(int, T) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- constants(C) - Static method in class org.apache.iceberg.orc.OrcValueReaders
- constantsMap(FileScanTask) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(FileScanTask, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(FileScanTask, Types.StructType, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- ConstantVectorHolder(int) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorHolder(int, T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorReader(T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- constrained() - Static method in class org.apache.iceberg.flink.source.assigner.GetSplitResult
- CONSTRAINED - org.apache.iceberg.flink.source.assigner.GetSplitResult.Status
-
There are pending splits.
- Container<T> - Class in org.apache.iceberg.mr.mapred
-
A simple container of objects that you can get and set.
- Container() - Constructor for class org.apache.iceberg.mr.mapred.Container
- contains(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.CharSequenceSet
- contains(Object) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- contains(String) - Method in class org.apache.iceberg.spark.SparkTableCache
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.CharSequenceSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.PartitionSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.StructLikeSet
- containsKey(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsKey(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- containsNaN() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNaN() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one data file in the manifest has a NaN value for the field.
- containsNull() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNull() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one data file in the manifest has a null value for the field.
- containsValue(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsValue(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- content() - Method in interface org.apache.iceberg.ContentFile
-
Returns type of content stored in the file; one of DATA, POSITION_DELETES, or EQUALITY_DELETES.
- content() - Method in interface org.apache.iceberg.DataFile
- content() - Method in class org.apache.iceberg.GenericManifestFile
- content() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the content stored in the manifest; either DATA or DELETES.
- content() - Method in class org.apache.iceberg.ManifestWriter
- content() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- CONTENT - Static variable in interface org.apache.iceberg.DataFile
- ContentFile<F> - Interface in org.apache.iceberg
-
Superinterface of DataFile and DeleteFile that exposes common methods.
- ContentScanTask<F extends ContentFile<F>> - Interface in org.apache.iceberg
-
A scan task over a range of bytes in a content file.
- ContinuousIcebergEnumerator - Class in org.apache.iceberg.flink.source.enumerator
- ContinuousIcebergEnumerator(SplitEnumeratorContext<IcebergSourceSplit>, SplitAssigner, ScanContext, ContinuousSplitPlanner, IcebergEnumeratorState) - Constructor for class org.apache.iceberg.flink.source.enumerator.ContinuousIcebergEnumerator
- ContinuousSplitPlanner - Interface in org.apache.iceberg.flink.source.enumerator
-
This interface is introduced so that different split planners can be plugged in for unit tests
- ContinuousSplitPlannerImpl - Class in org.apache.iceberg.flink.source.enumerator
- ContinuousSplitPlannerImpl(Table, ScanContext, String) - Constructor for class org.apache.iceberg.flink.source.enumerator.ContinuousSplitPlannerImpl
- Conversions - Class in org.apache.iceberg.types
- convert(byte[]) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(byte[], int) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- convert(Object) - Method in interface org.apache.iceberg.mr.hive.serde.objectinspector.WriteObjectInspector
- convert(ByteBuffer) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(List<String>, List<TypeInfo>, List<String>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<String>, List<TypeInfo>, List<String>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<FieldSchema>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(List<FieldSchema>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert the Flink table schema to an Apache Iceberg schema.
- convert(Expression) - Static method in class org.apache.iceberg.flink.FlinkFilters
-
Convert a Flink expression to an Iceberg expression.
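Example (non-normative): a minimal sketch of FlinkSchemaUtil.convert(TableSchema) indexed above.

    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.api.TableSchema;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.flink.FlinkSchemaUtil;

    TableSchema flinkSchema = TableSchema.builder()
        .field("id", DataTypes.BIGINT())
        .field("data", DataTypes.STRING())
        .build();
    Schema icebergSchema = FlinkSchemaUtil.convert(flinkSchema);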
- convert(TypeInfo) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive typeInfo object to an Iceberg type.
- convert(Schema) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
-
Convert Iceberg schema to Arrow Schema.
- convert(Schema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Schema to a Flink type.
- convert(Schema) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Iceberg schema to a Hive schema (list of FieldSchema objects).
- convert(Schema) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.pig.SchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Schema to a Spark type.
- convert(Schema, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, String) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- convert(Schema, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- convert(Schema, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Flink TableSchema to a Schema based on the given schema.
- convert(Schema, Row) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- convert(SortOrder) - Static method in class org.apache.iceberg.spark.SparkDistributionAndOrderingUtil
- convert(Type) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Type) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Type to a Flink type.
- convert(Type) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts an Iceberg type to a Hive TypeInfo object.
- convert(Type) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Type to a Spark type.
- convert(Type, Object) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Type, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Types.NestedField) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
- convert(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Convert an ORC schema to an Iceberg schema.
- convert(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema.
- convert(Filter) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(Filter[]) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(DataType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Type with new field ids.
- convert(StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema with new field ids.
- convert(StructType, boolean) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema with new field ids.
- convertAndPrune(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema and prunes fields without IDs.
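Example (non-normative): a minimal sketch of the SparkSchemaUtil.convert(StructType) entry above, which assigns new field ids.

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.spark.SparkSchemaUtil;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.StructType;

    StructType sparkType = new StructType()
        .add("id", DataTypes.LongType)
        .add("data", DataTypes.StringType);
    Schema converted = SparkSchemaUtil.convert(sparkType);   // fresh field ids are assigned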
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.data.IdentityPartitionConverters
-
Conversions from internal representations to Iceberg generic values.
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
- convertDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Define how to convert the deletes.
- convertedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the deletes that have been converted.
- ConvertEqualityDeleteFiles - Interface in org.apache.iceberg.actions
-
An action for converting the equality delete files to position delete files.
- ConvertEqualityDeleteFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- ConvertEqualityDeleteStrategy - Interface in org.apache.iceberg.actions
-
A strategy for the action to convert equality delete to position deletes.
- convertToByteBuffer(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convertTypes(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convertWithFreshIds(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- copy() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file.
- copy() - Method in class org.apache.iceberg.data.GenericRecord
- copy() - Method in interface org.apache.iceberg.data.Record
- copy() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- copy() - Method in class org.apache.iceberg.flink.IcebergTableSink
- copy() - Method in class org.apache.iceberg.flink.IcebergTableSource
- copy() - Method in class org.apache.iceberg.GenericManifestFile
- copy() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- copy() - Method in interface org.apache.iceberg.ManifestFile
-
Copies this manifest file.
- copy() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Copies this summary.
- copy() - Method in class org.apache.iceberg.PartitionKey
- copy() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- copy() - Method in class org.apache.iceberg.spark.SparkDataFile
- copy(boolean) - Method in interface org.apache.iceberg.ContentFile
-
Copies this file (potentially without file stats).
- copy(String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(ByteBuffer) - Static method in class org.apache.iceberg.util.ByteBuffers
- copy(Map<String, Object>) - Method in class org.apache.iceberg.data.GenericRecord
- copy(Map<String, Object>) - Method in interface org.apache.iceberg.data.Record
- copy(DataFile) - Method in class org.apache.iceberg.DataFiles.Builder
- copy(DeleteFile) - Method in class org.apache.iceberg.FileMetadata.Builder
- copy(PartitionSpec, StructLike) - Static method in class org.apache.iceberg.DataFiles
- COPY_ON_WRITE - org.apache.iceberg.RowLevelOperationMode
- copyFor(StructLike) - Method in class org.apache.iceberg.data.InternalRecordWrapper
- copyFor(StructLike) - Method in class org.apache.iceberg.util.StructLikeWrapper
-
Creates a copy of this wrapper that wraps a struct.
- copyFrom(IcebergSqlExtensionsParser.CallArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- copyFrom(IcebergSqlExtensionsParser.ConstantContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- copyFrom(IcebergSqlExtensionsParser.IdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- copyFrom(IcebergSqlExtensionsParser.NumberContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- copyFrom(IcebergSqlExtensionsParser.StatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- copyFrom(IcebergSqlExtensionsParser.TransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- copyOf(Map<K, V>) - Static method in class org.apache.iceberg.util.SerializableMap
- copyOf(ManifestFile) - Static method in class org.apache.iceberg.GenericManifestFile
- copyOf(Table) - Static method in class org.apache.iceberg.SerializableTable
-
Creates a read-only serializable table that can be sent to other nodes in a cluster.
- copyOf(Table) - Static method in class org.apache.iceberg.spark.source.SerializableTableWithSize
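Example (non-normative): a minimal sketch of SerializableTable.copyOf(Table) indexed above.

    import org.apache.iceberg.SerializableTable;
    import org.apache.iceberg.Table;

    // Produces a read-only copy whose metadata can be serialized and shipped to cluster workers.
    Table serializableCopy = SerializableTable.copyOf(table);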
- copyOnWriteMergeDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- copyWithAppendsBetween(Long, long) - Method in class org.apache.iceberg.flink.source.ScanContext
- copyWithoutStats() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file without file stats.
- copyWithoutStats() - Method in class org.apache.iceberg.spark.SparkDataFile
- copyWithSnapshotId(long) - Method in class org.apache.iceberg.flink.source.ScanContext
- count() - Method in interface org.apache.iceberg.metrics.MetricsContext.Counter
-
Reporting count is optional if the counter is reporting externally.
- COUNT - org.apache.iceberg.metrics.MetricsContext.Unit
- counter(String, Class<T>, MetricsContext.Unit) - Method in class org.apache.iceberg.flink.source.reader.ReaderMetricsContext
- counter(String, Class<T>, MetricsContext.Unit) - Method in class org.apache.iceberg.hadoop.HadoopMetricsContext
-
The Hadoop implementation delegates to the FileSystem.Statistics implementation and therefore does not require support for operations like unit() and count(), as the counter values are not directly consumed.
- counter(String, Class<T>, MetricsContext.Unit) - Method in interface org.apache.iceberg.metrics.MetricsContext
-
Get a named counter of a specific type.
- Counts() - Constructor for class org.apache.iceberg.MetricsModes.Counts
- create() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
-
Create an output stream for the specified location if the target object does not exist in S3 at the time of invocation.
- create() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- create() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Creates the table.
- create() - Static method in class org.apache.iceberg.deletes.PositionDelete
- create() - Method in class org.apache.iceberg.flink.sink.RowDataTaskWriterFactory
- create() - Method in interface org.apache.iceberg.flink.sink.TaskWriterFactory
-
Initialize a TaskWriter with given task id and attempt id.
- create() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- create() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
- create() - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
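Example (non-normative): a minimal sketch of OutputFile.create() indexed above, using HadoopFileIO; the path is hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.hadoop.HadoopFileIO;
    import org.apache.iceberg.io.OutputFile;
    import org.apache.iceberg.io.PositionOutputStream;

    HadoopFileIO io = new HadoopFileIO(new Configuration());
    OutputFile outputFile = io.newOutputFile("hdfs://namenode/warehouse/tmp/example.bin");
    try (PositionOutputStream out = outputFile.create()) {   // fails if the target already exists
      out.write(new byte[] {1, 2, 3});
    }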
- create(ByteBuffer) - Static method in class org.apache.iceberg.encryption.NativeFileCryptoParameters
-
Creates the builder.
- create(Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.PartitionSet
- create(Schema) - Static method in class org.apache.iceberg.data.avro.DataWriter
- create(RowType, Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(Schema) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Schema) - Static method in class org.apache.iceberg.mapping.MappingUtil
-
Create a name-based mapping for a schema.
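Example (non-normative): a minimal sketch of MappingUtil.create(Schema) indexed above; NameMappingParser, updateProperties(), and the schema.name-mapping.default property key are assumptions not shown in this excerpt.

    import org.apache.iceberg.mapping.MappingUtil;
    import org.apache.iceberg.mapping.NameMapping;
    import org.apache.iceberg.mapping.NameMappingParser;

    NameMapping mapping = MappingUtil.create(table.schema());
    table.updateProperties()   // updateProperties() is assumed here
        .set("schema.name-mapping.default", NameMappingParser.toJson(mapping))
        .commit();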
- create(Schema) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Schema, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, Set<Integer>) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, Schema, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, PartitionSpec, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Create a table using the FileSystem implementation resolved from the location.
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
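Example (non-normative): a minimal sketch of HadoopTables.create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) indexed above; the warehouse location is hypothetical.

    import java.util.Collections;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.hadoop.HadoopTables;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));

    HadoopTables tables = new HadoopTables(new Configuration());
    Table table = tables.create(
        schema,
        PartitionSpec.unpartitioned(),
        SortOrder.unsorted(),
        Collections.emptyMap(),
        "hdfs://namenode/warehouse/db/events");   // hypothetical location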
- create(Schema, Schema) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Types.NestedField...) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Types.StructType) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeMap
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeSet
- create(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- createAllowMissing(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- createAssigner() - Method in class org.apache.iceberg.flink.source.assigner.SimpleSplitAssignerFactory
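Example (non-normative): a minimal sketch of StructProjection.create(Types.StructType, Types.StructType) indexed above; wrap(...) is assumed and not shown in this excerpt.

    import org.apache.iceberg.StructLike;
    import org.apache.iceberg.types.Types;
    import org.apache.iceberg.util.StructProjection;

    Types.StructType fullType = Types.StructType.of(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));
    Types.StructType idOnly = Types.StructType.of(
        Types.NestedField.required(1, "id", Types.LongType.get()));

    StructProjection projection = StructProjection.create(fullType, idOnly);
    StructLike projected = projection.wrap(row);   // wrap(...) is assumed; row is any StructLike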
- createAssigner() - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssignerFactory
- createAssigner(Collection<IcebergSourceSplitState>) - Method in class org.apache.iceberg.flink.source.assigner.SimpleSplitAssignerFactory
- createAssigner(Collection<IcebergSourceSplitState>) - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssignerFactory
- createBatchedReaderFunc(Function<TypeDescription, OrcBatchReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createBatchedReaderFunc(Function<MessageType, VectorizedReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createBatchWriterFactory(PhysicalWriteInfo) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaBatchWrite
- createBranch(String, long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Create a new branch pointing to the given snapshot id.
- createBranch(String, long) - Method in class org.apache.iceberg.SnapshotManager
- createCatalog(String, Map<String, String>) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- createCatalog(String, Map<String, String>, Configuration) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- CREATED_BY_PROPERTY - Static variable in class org.apache.iceberg.puffin.StandardPuffinProperties
-
human-readable identification of the application writing the file, along with its version.
- createDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.RowDataReaderFunction
- createdAtMillis() - Method in class org.apache.iceberg.io.FileInfo
- createdBy(String) - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Sets file-level "created-by" property.
- createDynamicTableSink(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createDynamicTableSource(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createEmpty() - Static method in class org.apache.iceberg.catalog.SessionCatalog.SessionContext
- createEnumerator(SplitEnumeratorContext<IcebergSourceSplit>) - Method in class org.apache.iceberg.flink.source.IcebergSource
- createFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createInputSplits(int) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- createKey() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- createMetadataTableInstance(TableOperations, String, String, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(TableOperations, String, TableIdentifier, TableIdentifier, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(Table, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- createNamespace(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- createNamespace(Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
- createNamespace(SessionCatalog.SessionContext, Namespace) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a namespace in the catalog.
- createNamespace(SessionCatalog.SessionContext, Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a namespace in the catalog.
- createNamespace(SessionCatalog.SessionContext, Namespace, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- createNamespace(SupportsNamespaces, CreateNamespaceRequest) - Static method in class org.apache.iceberg.rest.CatalogHandlers
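Example (non-normative): a minimal sketch of SupportsNamespaces.createNamespace(Namespace, Map<String, String>) indexed above; catalog is any Catalog implementation that also supports namespaces.

    import java.util.Collections;
    import org.apache.iceberg.catalog.Namespace;
    import org.apache.iceberg.catalog.SupportsNamespaces;

    if (catalog instanceof SupportsNamespaces) {
      ((SupportsNamespaces) catalog)
          .createNamespace(Namespace.of("db"), Collections.singletonMap("owner", "analytics"));
    }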
- CreateNamespaceRequest - Class in org.apache.iceberg.rest.requests
-
A REST request to create a namespace, with an optional set of properties.
- CreateNamespaceRequest() - Constructor for class org.apache.iceberg.rest.requests.CreateNamespaceRequest
- CreateNamespaceRequest.Builder - Class in org.apache.iceberg.rest.requests
- CreateNamespaceResponse - Class in org.apache.iceberg.rest.responses
-
Represents a REST response for a request to create a namespace / database.
- CreateNamespaceResponse() - Constructor for class org.apache.iceberg.rest.responses.CreateNamespaceResponse
- CreateNamespaceResponse.Builder - Class in org.apache.iceberg.rest.responses
- createNanValueCounts(Stream<FieldMetrics<?>>, MetricsConfig, Schema) - Static method in class org.apache.iceberg.MetricsUtil
-
Construct a mapping from column id to NaN value counts using the input metrics and metrics config.
- createOrOverwrite() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- createOrOverwrite() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
- createOrReplaceTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- createOrReplaceTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createOrReplaceTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create or replace the table.
- createPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createReader(SourceReaderContext) - Method in class org.apache.iceberg.flink.source.IcebergSource
- createReader(Schema, MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReader(Schema, MessageType, Map<Integer, ?>) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReaderFactory() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- createReaderFunc(BiFunction<Schema, Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<TypeDescription, OrcRowReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createReaderFunc(Function<MessageType, ParquetValueReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.pig.IcebergPigInputFormat
- CreateSnapshotEvent - Class in org.apache.iceberg.events
- CreateSnapshotEvent(String, String, long, long, Map<String, String>) - Constructor for class org.apache.iceberg.events.CreateSnapshotEvent
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- createStructReader(Types.StructType, List<ValueReader<?>>, Map<Integer, ?>) - Method in class org.apache.iceberg.data.avro.DataReader
- createStructWriter(List<ValueWriter<?>>) - Method in class org.apache.iceberg.data.avro.DataWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- createTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Creates an Iceberg table using the catalog specified by the configuration.
- createTable(Catalog, Namespace, CreateTableRequest) - Static method in class org.apache.iceberg.rest.CatalogHandlers
- createTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create an unpartitioned table.
- createTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
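Example (non-normative): a minimal sketch of Catalog.createTable(TableIdentifier, Schema, PartitionSpec) indexed above; the identifier and columns are hypothetical.

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));
    PartitionSpec spec = PartitionSpec.builderFor(schema).bucket("id", 16).build();

    // catalog is any org.apache.iceberg.catalog.Catalog implementation
    Table created = catalog.createTable(TableIdentifier.of("db", "events"), schema, spec);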
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCachedTableCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- CreateTableRequest - Class in org.apache.iceberg.rest.requests
-
A REST request to create a table, either via direct commit or staging the creation of the table as part of a transaction.
- CreateTableRequest() - Constructor for class org.apache.iceberg.rest.requests.CreateTableRequest
- CreateTableRequest.Builder - Class in org.apache.iceberg.rest.requests
- createTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- createTag(String, long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Create a new tag pointing to the given snapshot id
- createTag(String, long) - Method in class org.apache.iceberg.SnapshotManager
- createTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create the table.
- createVectorSchemaRootFromVectors() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Create a new instance of VectorSchemaRoot from the arrow vectors stored in this arrow batch.
- createWriter(int, long) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriterFactory
- createWriter(MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- CREDENTIAL - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
-
A credential to exchange for a token in the OAuth2 client credentials flow.
- credentials() - Method in class org.apache.iceberg.catalog.SessionCatalog.SessionContext
-
Returns the session's credential map.
- CredentialSupplier - Interface in org.apache.iceberg.io
-
Interface used to expose credentials held by a FileIO instance.
- ctorImpl(Class<?>, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
- ctorImpl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
- current() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- current() - Method in class org.apache.iceberg.BaseTransaction.TransactionTableOperations
- current() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- current() - Method in class org.apache.iceberg.StaticTableOperations
- current() - Method in interface org.apache.iceberg.TableOperations
-
Return the currently loaded table metadata, without checking for updates.
- CURRENT_SCHEMA - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of current schema.
- CURRENT_SNAPSHOT_ID - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot id.
- CURRENT_SNAPSHOT_SUMMARY - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot summary.
- CURRENT_SNAPSHOT_TIMESTAMP - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot timestamp.
- currentAncestorIds(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Return the snapshot IDs for the ancestors of the current table state.
- currentAncestors(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns an iterable that traverses the table's snapshots from the current to the last known ancestor.
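For example (a sketch, assuming "table" is a loaded Table), the ancestor chain can be walked like this:
    for (Snapshot ancestor : SnapshotUtil.currentAncestors(table)) {
      System.out.println(ancestor.snapshotId() + " committed at " + ancestor.timestampMillis());
    }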
- currentDefinitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentDL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentFieldName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentFileHasNext() - Method in class org.apache.iceberg.flink.source.DataIterator
- currentMetadataLocation() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- currentPageCount() - Method in class org.apache.iceberg.parquet.BasePageIterator
- currentPath() - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- currentRepetitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentRL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentSchemaId() - Method in class org.apache.iceberg.TableMetadata
- currentSnapshot() - Method in class org.apache.iceberg.BaseMetadataTable
- currentSnapshot() - Method in class org.apache.iceberg.BaseTable
- currentSnapshot() - Method in class org.apache.iceberg.BaseTransaction.TransactionTable
- currentSnapshot() - Method in class org.apache.iceberg.SerializableTable
- currentSnapshot() - Method in interface org.apache.iceberg.Table
-
Get the current snapshot for this table, or null if there are no snapshots.
- currentSnapshot() - Method in class org.apache.iceberg.TableMetadata
- currentVersion() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- currentVersion() - Static method in class org.apache.iceberg.dell.ecs.PropertiesSerDesUtil
-
Get version of current serializer implementation.
- custom(String, Map<String, String>, Configuration, String) - Static method in interface org.apache.iceberg.flink.CatalogLoader
- CustomOrderSchemaVisitor() - Constructor for class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
D
- data(PartitionSpec, String) - Static method in class org.apache.iceberg.DataFiles
- DATA - org.apache.iceberg.FileContent
- DATA - org.apache.iceberg.ManifestContent
- DATA_FILES - org.apache.iceberg.ManifestReader.FileType
- DATA_FILES - org.apache.iceberg.MetadataTableType
- databaseExists(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DataFile - Interface in org.apache.iceberg
-
Interface for data files listed in a table manifest.
- dataFileFormat() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- dataFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- dataFiles() - Method in class org.apache.iceberg.io.DataWriteResult
- dataFiles() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data files; this requires that the task writer produces only data files.
- dataFiles() - Method in class org.apache.iceberg.io.WriteResult
- DataFiles - Class in org.apache.iceberg
- DataFiles.Builder - Class in org.apache.iceberg
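A hedged sketch of registering an existing Parquet file with DataFiles.Builder; the path, size, and record count are placeholders, and the table is assumed to be unpartitioned (partitioned tables also need withPartition):
    DataFile dataFile = DataFiles.builder(table.spec())
        .withPath("/warehouse/db/events/data/00000-0-data.parquet")  // placeholder path
        .withFormat(FileFormat.PARQUET)
        .withFileSizeInBytes(4096L)     // placeholder file size
        .withRecordCount(100L)          // placeholder record count
        .build();

    table.newAppend().appendFile(dataFile).commit();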
- DataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's data files as rows.
- DataFilesTable.DataFilesTableScan - Class in org.apache.iceberg
- DataIterator<T> - Class in org.apache.iceberg.flink.source
-
Flink data iterator that reads CombinedScanTask into a CloseableIterator.
- DataIterator(FileScanTaskReader<T>, CombinedScanTask, FileIO, EncryptionManager) - Constructor for class org.apache.iceberg.flink.source.DataIterator
- DataIteratorBatcher<T> - Interface in org.apache.iceberg.flink.source.reader
-
Batcher converts iterator of T into iterator of batched RecordsWithSplitIds<RecordAndPosition<T>>, as FLIP-27's SplitReader.fetch() returns batched records.
- DataIteratorReaderFunction<T> - Class in org.apache.iceberg.flink.source.reader
-
A ReaderFunction implementation that uses DataIterator.
- DataIteratorReaderFunction(DataIteratorBatcher<T>) - Constructor for class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- dataManifests() - Method in interface org.apache.iceberg.Snapshot
-
Deprecated. Since 0.14.0, will be removed in 1.0.0; use Snapshot.dataManifests(FileIO) instead.
- dataManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each data manifest in this snapshot.
- DataOperations - Class in org.apache.iceberg
-
Data operations that produce snapshots.
- DataReader<T> - Class in org.apache.iceberg.data.avro
- DataReader(Schema, Schema, Map<Integer, ?>) - Constructor for class org.apache.iceberg.data.avro.DataReader
- dataSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- DataTableScan - Class in org.apache.iceberg
- DataTableScan(TableOperations, Table) - Constructor for class org.apache.iceberg.DataTableScan
- DataTableScan(TableOperations, Table, Schema, TableScanContext) - Constructor for class org.apache.iceberg.DataTableScan
- DataTask - Interface in org.apache.iceberg
-
A task that returns data as rows instead of where to read data.
- dataTimestampMillis() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- dataType() - Method in class org.apache.iceberg.spark.source.SparkMetadataColumn
- dataType() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Returns the type of this parameter.
- DataWriter<T> - Class in org.apache.iceberg.data.avro
- DataWriter<T> - Class in org.apache.iceberg.io
- DataWriter(Schema) - Constructor for class org.apache.iceberg.data.avro.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata, SortOrder) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriteResult - Class in org.apache.iceberg.io
-
A result of writing data files.
- DataWriteResult(List<DataFile>) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DataWriteResult(DataFile) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DATE - org.apache.iceberg.types.Type.TypeID
- DATE_INSPECTOR - Static variable in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- dateFromDays(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- DateTimeUtil - Class in org.apache.iceberg.util
- DateType() - Constructor for class org.apache.iceberg.types.Types.DateType
- day(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String) - Static method in class org.apache.iceberg.expressions.Expressions
- day(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- day(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a day Transform for date or timestamp types.
- daysFromDate(LocalDate) - Static method in class org.apache.iceberg.util.DateTimeUtil
- daysFromInstant(Instant) - Static method in class org.apache.iceberg.util.DateTimeUtil
- decimal(int, int) - Static method in class org.apache.iceberg.avro.ValueWriters
- decimal(int, int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- decimal(ValueReader<byte[]>, int) - Static method in class org.apache.iceberg.avro.ValueReaders
- DECIMAL - org.apache.iceberg.types.Type.TypeID
- DECIMAL_64 - org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
- DECIMAL_INT32_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_INT64_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DECIMAL_VALUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalAsFixed(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsInteger(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsLong(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalBytesReader(Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- DecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalRequiredBytes(int) - Static method in class org.apache.iceberg.types.TypeUtil
- decimals() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- decimals(int, int) - Static method in class org.apache.iceberg.spark.data.SparkOrcValueReaders
- DecimalUtil - Class in org.apache.iceberg.util
- DecimalVectorUtil - Class in org.apache.iceberg.arrow.vectorized.parquet
- decode(byte[]) - Static method in class org.apache.iceberg.avro.AvroEncoderUtil
- decode(byte[]) - Static method in class org.apache.iceberg.ManifestFiles
-
Decode the binary data into a ManifestFile.
- decode(InputStream, D) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
- decodeNamespace(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Takes in a string representation of a namespace as used for a URL parameter and returns the corresponding namespace.
- DecoderResolver - Class in org.apache.iceberg.data.avro
-
Resolver to resolve Decoder to a ResolvingDecoder.
- decodeString(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Decodes a URL-encoded string.
- decomposePredicate(JobConf, Deserializer, ExprNodeDesc) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- decrypt(byte[], byte[]) - Method in class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- decrypt(Iterable<EncryptedInputFile>) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Variant of EncryptionManager.decrypt(EncryptedInputFile) that provides a sequence of files that all need to be decrypted in a single context.
- decrypt(EncryptedInputFile) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Given an EncryptedInputFile.encryptedInputFile() representing the raw encrypted bytes from the underlying file system, and given metadata about how the file was encrypted via EncryptedInputFile.keyMetadata(), return an InputFile that returns decrypted input streams.
- decrypt(EncryptedInputFile) - Method in class org.apache.iceberg.encryption.PlaintextEncryptionManager
- decryptionKey() - Method in class org.apache.iceberg.gcp.GCPProperties
- dedupName() - Method in interface org.apache.iceberg.transforms.Transform
-
Return the unique transform name, used to check whether similar transforms for the same source field are added multiple times in the partition spec builder.
- DEFAULT_BATCH_SIZE - Static variable in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- DEFAULT_DATABASE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_DATABASE_NAME - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_FILE_FORMAT_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_NAME_MAPPING - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_PARTITION_SPEC - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of current (default) partition spec.
- DEFAULT_SORT_ORDER - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of current (default) sort order.
- DEFAULT_WRITE_METRICS_MODE - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_WRITE_METRICS_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- defaultAlwaysNull() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Instructs this builder to return AlwaysNull if no implementation is found.
- defaultErrorHandler() - Static method in class org.apache.iceberg.rest.ErrorHandlers
-
Request error handler that handles the common cases that are included with all responses, such as 400, 500, etc.
- defaultFactory() - Static method in class org.apache.iceberg.aliyun.AliyunClientFactories
- defaultFactory() - Static method in class org.apache.iceberg.aws.AwsClientFactories
- defaultFormat(FileFormat) - Method in interface org.apache.iceberg.UpdateProperties
-
Set the default file format for the table.
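For instance (a sketch over a loaded Table), the default write format can be switched like so:
    table.updateProperties()
        .defaultFormat(FileFormat.ORC)   // sets the write.format.default table property
        .commit();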
- defaultLocationProperty() - Static method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
-
The property used to set a default location for tables in a namespace.
- defaultLockManager() - Static method in class org.apache.iceberg.util.LockManagers
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkCatalog
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- defaults() - Method in class org.apache.iceberg.rest.responses.ConfigResponse
-
Properties that should be used as default configuration.
- defaultSortOrderId() - Method in class org.apache.iceberg.TableMetadata
- defaultSpec(PartitionSpec) - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- defaultSpecId() - Method in class org.apache.iceberg.TableMetadata
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
-
This method produces the same result as using a HiveCatalog.
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.nessie.NessieCatalog
- definitionLevels - Variable in class org.apache.iceberg.parquet.BasePageIterator
- DelegatingInputStream - Interface in org.apache.iceberg.io
- DelegatingOutputStream - Interface in org.apache.iceberg.io
- delete(long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a deleted row position.
- delete(long, long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a range of deleted row positions.
- delete(F) - Method in class org.apache.iceberg.ManifestWriter
-
Add a delete entry for a file.
- delete(CharSequence, long) - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use PositionDeleteWriter.write(PositionDelete) instead.
- delete(CharSequence, long, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition.
- delete(CharSequence, long, T) - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use PositionDeleteWriter.write(PositionDelete) instead.
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition and records the deleted row in the delete file.
- delete(String, Class<T>, Supplier<Map<String, String>>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(String, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in class org.apache.iceberg.rest.HTTPClient
- delete(String, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(T) - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use EqualityDeleteWriter.write(Object) instead.
- delete(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows whose equality fields have the same values as the given row.
- delete(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a row from the provided spec/partition.
- delete(T, T) - Method in interface org.apache.spark.sql.connector.iceberg.write.DeltaWriter
-
Passes information for a row that must be deleted.
- DELETE - org.apache.iceberg.ChangelogOperation
- DELETE - Static variable in class org.apache.iceberg.DataOperations
-
Data is deleted from the table and no data is added.
- DELETE_AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_AVRO_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_FILE_PATH - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_POS - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_DOC - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_ID - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_NAME - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_THRESHOLD - Static variable in class org.apache.iceberg.actions.BinPackStrategy
-
The minimum number of deletes that needs to be associated with a data file for it to be considered for rewriting.
- DELETE_FILE_THRESHOLD_DEFAULT - Static variable in class org.apache.iceberg.actions.BinPackStrategy
- DELETE_FILES - org.apache.iceberg.ManifestReader.FileType
- DELETE_FILES - org.apache.iceberg.MetadataTableType
- DELETE_FORMAT - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- DELETE_ISOLATION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ISOLATION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_BLOCK_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_COMPRESSION_STRATEGY - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_STRIPE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_WRITE_BATCH_SIZE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_DICT_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_PAGE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_CHECK_MAX_RECORD_COUNT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_CHECK_MIN_RECORD_COUNT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_TARGET_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_TARGET_FILE_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- deleteAll(Iterable<T>) - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
-
Deprecated. Since 0.13.0, will be removed in 0.14.0; use FileWriter.write(Iterable) instead.
- deleteColumn(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Delete a column in the schema.
- DELETED_DUPLICATE_FILES - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- DELETED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- deletedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- DeletedDataFileScanTask - Interface in org.apache.iceberg
-
A scan task for deletes generated by removing a data file from the table.
- deletedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedDataFilesCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted data files.
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted data files.
- deletedEqualityDeleteFilesCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted equality delete files.
- deletedFile(PartitionSpec, ContentFile<?>) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFiles() - Method in interface org.apache.iceberg.Snapshot
-
Deprecated. Since 0.14.0, will be removed in 1.0.0; use Snapshot.removedDataFiles(FileIO) instead.
- deletedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status DELETED in the manifest file.
- deletedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- deleteDistributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- deletedManifestListsCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedManifestListsCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifest lists.
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifest lists.
- deletedManifestsCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedManifestsCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifests.
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifests.
- deletedOtherFilesCount() - Method in class org.apache.iceberg.actions.BaseDeleteReachableFilesActionResult
- deletedOtherFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted metadata JSON and version hint files.
- deletedPositionDeleteFilesCount() - Method in class org.apache.iceberg.actions.BaseExpireSnapshotsActionResult
- deletedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted position delete files.
- deletedRowPositions() - Method in class org.apache.iceberg.data.DeleteFilter
- deletedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status DELETED in the manifest file.
- deletedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- DeletedRowsScanTask - Interface in org.apache.iceberg
-
A scan task for deletes generated by adding delete files to the table.
- deleteFile(CharSequence) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file path from the underlying table.
- deleteFile(CharSequence) - Method in class org.apache.iceberg.StreamingDelete
- deleteFile(String) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- deleteFile(String) - Method in class org.apache.iceberg.dell.ecs.EcsFileIO
- deleteFile(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deleteFile(String) - Method in interface org.apache.iceberg.io.FileIO
-
Delete the file at the given path.
- deleteFile(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
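As a small, hedged example (the path below is a placeholder), a file can be removed through the table's FileIO:
    FileIO io = table.io();
    io.deleteFile("s3://bucket/warehouse/db/events/data/orphan-file.parquet");  // placeholder path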
- deleteFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
- deleteFile(DataFile) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file tracked by a DataFile from the underlying table.
- deleteFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Delete a DataFile from the table.
- deleteFile(DataFile) - Method in class org.apache.iceberg.StreamingDelete
- deleteFile(InputFile) - Method in interface org.apache.iceberg.io.FileIO
- deleteFile(OutputFile) - Method in interface org.apache.iceberg.io.FileIO
-
Convenience method to delete an OutputFile.
- DeleteFile - Interface in org.apache.iceberg
-
Interface for delete files listed in a table delete manifest.
- deleteFileBuilder(PartitionSpec) - Static method in class org.apache.iceberg.FileMetadata
- deleteFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- deleteFiles() - Method in class org.apache.iceberg.io.DeleteWriteResult
- deleteFiles() - Method in class org.apache.iceberg.io.WriteResult
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.aws.s3.S3FileIO
-
Deletes the given paths in a batched manner.
- deleteFiles(Iterable<String>) - Method in interface org.apache.iceberg.io.SupportsBulkOperations
-
Delete the files at the given paths.
- DeleteFiles - Interface in org.apache.iceberg
-
API for deleting files from a table.
- DeleteFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's delete files as rows.
- DeleteFilesTable.DeleteFilesTableScan - Class in org.apache.iceberg
- DeleteFilter<T> - Class in org.apache.iceberg.data
- DeleteFilter(String, List<DeleteFile>, Schema, Schema) - Constructor for class org.apache.iceberg.data.DeleteFilter
- deleteFromRowFilter(Expression) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete files that match an Expression on data rows from the table.
- deleteFromRowFilter(Expression) - Method in class org.apache.iceberg.StreamingDelete
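A minimal sketch (not from the Javadoc) of a metadata-only delete by expression, assuming the table has an "event_date" partition column; only files whose rows all match the expression are removed:
    table.newDelete()
        .deleteFromRowFilter(Expressions.equal("event_date", "2024-01-01"))  // placeholder predicate
        .commit();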
- deleteKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows with the given key.
- deleteKey(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a key from the provided spec/partition.
- deleteManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- deleteManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Deletes a manifest file from the table.
- deleteManifests() - Method in interface org.apache.iceberg.Snapshot
-
Deprecated. Since 0.14.0, will be removed in 1.0.0; use Snapshot.deleteManifests(FileIO) instead.
- deleteManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each delete manifest in this snapshot.
- deleteOrphanFiles(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete orphan files.
- deleteOrphanFiles(Table) - Method in class org.apache.iceberg.spark.actions.SparkActions
- DeleteOrphanFiles - Interface in org.apache.iceberg.actions
-
An action that deletes orphan metadata, data and delete files in a table.
- DeleteOrphanFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- DeleteOrphanFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An action that removes orphan metadata, data and delete files by listing a given location and comparing the actual files in that location with content and metadata files referenced by all valid snapshots.
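A hedged Spark example, assuming "spark" is an active SparkSession and "table" is a loaded Table; the three-day retention window is illustrative only:
    DeleteOrphanFiles.Result result = SparkActions.get(spark)
        .deleteOrphanFiles(table)
        .olderThan(System.currentTimeMillis() - TimeUnit.DAYS.toMillis(3))  // keep recently written files
        .execute();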
- deletePositions(CharSequence, List<CloseableIterable<T>>) - Static method in class org.apache.iceberg.deletes.Deletes
- deletePositions(CharSequence, CloseableIterable<StructLike>) - Static method in class org.apache.iceberg.deletes.Deletes
- deletePrefix(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
-
This method makes a best-effort attempt to delete all objects under the given prefix.
- deletePrefix(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deletePrefix(String) - Method in interface org.apache.iceberg.io.SupportsPrefixOperations
-
Delete all files under a prefix.
- deleteReachableFiles(String) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete all the files reachable from given metadata location.
- deleteReachableFiles(String) - Method in class org.apache.iceberg.spark.actions.SparkActions
- DeleteReachableFiles - Interface in org.apache.iceberg.actions
-
An action that deletes all files referenced by a table metadata file.
- DeleteReachableFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- DeleteReachableFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An implementation of DeleteReachableFiles that uses metadata tables in Spark to determine which files should be deleted.
- deletes() - Method in interface org.apache.iceberg.AddedRowsScanTask
-
A list of delete files to apply when reading the data file in this task.
- deletes() - Method in class org.apache.iceberg.BaseFileScanTask
- deletes() - Method in interface org.apache.iceberg.FileScanTask
-
A list of delete files to apply when reading the task's data file.
- Deletes - Class in org.apache.iceberg.deletes
- DELETES - org.apache.iceberg.ManifestContent
- DeleteSchemaUtil - Class in org.apache.iceberg.io
- deleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.RollbackStagedTable
- deleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes an alternative delete implementation that will be used for orphan files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Passes an alternative delete implementation that will be used for files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests, data and delete files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests and data files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.SnapshotUpdate
-
Set a callback to delete files instead of the table's default.
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.ExpireSnapshotsSparkAction
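As a sketch (assuming "table" is a loaded Table and customDelete is a hypothetical helper, not an Iceberg API), snapshot expiration can route physical deletes through a custom callback:
    table.expireSnapshots()
        .expireOlderThan(System.currentTimeMillis() - TimeUnit.DAYS.toMillis(7))
        .deleteWith(path -> customDelete(path))  // customDelete is a placeholder callback
        .commit();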
- DeleteWriteResult - Class in org.apache.iceberg.io
-
A result of writing delete files.
- DeleteWriteResult(List<DeleteFile>) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(List<DeleteFile>, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DellClientFactories - Class in org.apache.iceberg.dell
- DellClientFactory - Interface in org.apache.iceberg.dell
- DellProperties - Class in org.apache.iceberg.dell
- DellProperties() - Constructor for class org.apache.iceberg.dell.DellProperties
- DellProperties(Map<String, String>) - Constructor for class org.apache.iceberg.dell.DellProperties
- DeltaBatchWrite - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface that defines how to write a delta of rows during batch processing.
- DeltaWrite - Interface in org.apache.spark.sql.connector.iceberg.write
-
A logical representation of a data source write that handles a delta of rows.
- DeltaWriteBuilder - Interface in org.apache.spark.sql.connector.iceberg.write
-
An interface for building delta writes.
- DeltaWriter<T> - Interface in org.apache.spark.sql.connector.iceberg.write
-
A data writer responsible for writing a delta of rows.
- DeltaWriterFactory - Interface in org.apache.spark.sql.connector.iceberg.write
-
A factory for creating and initializing delta writers at the executor side.
- desc - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- desc - Variable in class org.apache.iceberg.parquet.BasePageIterator
- desc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with nulls last.
- desc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with the given null order.
- desc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with nulls last.
- desc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- desc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, descending with the given null order.
- desc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with the given null order.
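For example (a sketch over a loaded Table), a sort order can be replaced with mixed directions:
    table.replaceSortOrder()
        .asc("category")                   // ascending, nulls first by default
        .desc("ts", NullOrder.NULLS_LAST)  // descending with an explicit null order
        .commit();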
- DESC - org.apache.iceberg.SortDirection
- DESC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DESC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DESC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- DESC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- describe(Expression) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(Schema) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(SortOrder) - Static method in class org.apache.iceberg.spark.Spark3Util
- describe(Type) - Static method in class org.apache.iceberg.spark.Spark3Util
- description() - Method in class org.apache.iceberg.spark.JobGroupInfo
- description() - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- description() - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- description() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- description() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Returns the description of this procedure.
- descriptor() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- deserialize(int, byte[]) - Method in class org.apache.iceberg.flink.source.enumerator.IcebergEnumeratorStateSerializer
- deserialize(int, byte[]) - Method in class org.apache.iceberg.flink.source.split.IcebergSourceSplitSerializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.ErrorResponseDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.MetadataUpdateDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.NamespaceDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.OAuthTokenResponseDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.SchemaDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.TableIdentifierDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.TableMetadataDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.UnboundPartitionSpecDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.UnboundSortOrderDeserializer
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.iceberg.rest.RESTSerializers.UpdateRequirementDeserializer
- deserialize(Writable) - Method in class org.apache.iceberg.mr.hive.HiveIcebergSerDe
- deserializeFromBase64(String) - Static method in class org.apache.iceberg.util.SerializationUtil
- deserializeFromBytes(byte[]) - Static method in class org.apache.iceberg.util.SerializationUtil
- deserializeOffset(String) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- destCatalog() - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- destCatalog() - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- destination() - Method in class org.apache.iceberg.rest.requests.RenameTableRequest
- destTableIdent() - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- destTableIdent() - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- destTableProps() - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- destTableProps() - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- determineListElementType(GroupType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- dictionary - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- dictionary - Variable in class org.apache.iceberg.parquet.BasePageIterator
- dictionary() - Method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- dictionaryBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- DictionaryBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DictionaryBatchReader
- dictionaryIdReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- direction - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- direction() - Method in class org.apache.iceberg.SortField
-
Returns the sort direction.
- direction() - Method in class org.apache.iceberg.spark.ExtendedParser.RawOrderField
- disableRefresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- discardChanges() - Method in class org.apache.iceberg.TableMetadata.Builder
- displayName() - Method in enum org.apache.iceberg.metrics.MetricsContext.Unit
- DISTRIBUTED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DISTRIBUTED - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DISTRIBUTED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- DISTRIBUTED() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.flink.FlinkWriteOptions
- DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- distributionMode() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- distributionMode() - Method in class org.apache.iceberg.spark.SparkWriteConf
- distributionMode(DistributionMode) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Configure the write DistributionMode that the Flink sink will use.
- DistributionMode - Enum in org.apache.iceberg
-
Enum of supported write distribution modes; defines the write behavior of batch or streaming jobs.
- doc() - Method in class org.apache.iceberg.types.Types.NestedField
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.dell.ecs.EcsTableOperations
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.hive.HiveTableOperations
- doCommit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.nessie.NessieTableOperations
- doPlanFiles() - Method in class org.apache.iceberg.AllManifestsTable.AllManifestsTableScan
- doPlanFiles() - Method in class org.apache.iceberg.DataTableScan
- doRefresh() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- doRefresh() - Method in class org.apache.iceberg.dell.ecs.EcsTableOperations
- doRefresh() - Method in class org.apache.iceberg.hive.HiveTableOperations
- doRefresh() - Method in class org.apache.iceberg.nessie.NessieTableOperations
- DOUBLE - org.apache.iceberg.types.Type.TypeID
- DOUBLE_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DOUBLE_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DOUBLE_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- doubleBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- DoubleBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.DoubleBatchReader
- doubleDictEncodedReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedDictionaryEncodedParquetValuesReader
- DoubleFieldMetrics - Class in org.apache.iceberg
-
Iceberg internally tracked field level metrics, used by Parquet and ORC writers only.
- DoubleFieldMetrics.Builder - Class in org.apache.iceberg
- DoubleLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- doubles() - Static method in class org.apache.iceberg.avro.ValueReaders
- doubles() - Static method in class org.apache.iceberg.avro.ValueWriters
- doubles() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- doubles(int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- doubles(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- doubleToOrderedBytes(double, ByteBuffer) - Static method in class org.apache.iceberg.util.ZOrderByteUtils
-
Doubles are treated the same as floats in ZOrderByteUtils.floatToOrderedBytes(float, ByteBuffer).
- DoubleType() - Constructor for class org.apache.iceberg.types.Types.DoubleType
- DROP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DROP - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- DROP() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- dropDatabase(String, boolean, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- dropFunction(ObjectPath, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DropIdentifierFieldsContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- dropNamespace(String[], boolean) - Method in class org.apache.iceberg.spark.SparkCatalog
- dropNamespace(String[], boolean) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- dropNamespace(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Drop a namespace.
- dropNamespace(Namespace) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.hive.HiveCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.nessie.NessieCatalog
- dropNamespace(Namespace) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- dropNamespace(Namespace) - Method in class org.apache.iceberg.rest.RESTCatalog
- dropNamespace(SessionCatalog.SessionContext, Namespace) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Drop a namespace.
- dropNamespace(SessionCatalog.SessionContext, Namespace) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- dropNamespace(SupportsNamespaces, Namespace) - Static method in class org.apache.iceberg.rest.CatalogHandlers
- dropPartition(ObjectPath, CatalogPartitionSpec, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DropPartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- dropTable(String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Drop a table and delete all data and metadata files.
- dropTable(String, boolean) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Drop a table; optionally delete data and metadata files.
- dropTable(ObjectPath, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- dropTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Drops an Iceberg table using the catalog specified by the configuration.
- dropTable(Catalog, TableIdentifier) - Static method in class org.apache.iceberg.rest.CatalogHandlers
- dropTable(SessionCatalog.SessionContext, TableIdentifier) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Drop a table, without requesting that files are immediately deleted.
- dropTable(SessionCatalog.SessionContext, TableIdentifier) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- dropTable(TableIdentifier) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- dropTable(TableIdentifier) - Method in interface org.apache.iceberg.catalog.Catalog
-
Drop a table and delete all data and metadata files.
- dropTable(TableIdentifier) - Method in class org.apache.iceberg.rest.RESTCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.CachingCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- dropTable(TableIdentifier, boolean) - Method in interface org.apache.iceberg.catalog.Catalog
-
Drop a table; optionally delete data and metadata files.
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
-
Remove table object.
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.hive.HiveCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.nessie.NessieCatalog
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- dropTable(TableIdentifier, boolean) - Method in class org.apache.iceberg.rest.RESTCatalog
- dropTable(Identifier) - Method in class org.apache.iceberg.spark.SparkCachedTableCatalog
- dropTable(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
- dropTable(Identifier) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- dropTableData(FileIO, TableMetadata) - Static method in class org.apache.iceberg.CatalogUtil
-
Drops all data and metadata files referenced by TableMetadata.
- dummyHolder(int) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- DuplicateWAPCommitException - Exception in org.apache.iceberg.exceptions
-
This exception occurs when the WAP workflow detects a duplicate WAP commit.
- DuplicateWAPCommitException(String) - Constructor for exception org.apache.iceberg.exceptions.DuplicateWAPCommitException
- dynamo() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- dynamo() - Method in interface org.apache.iceberg.aws.AwsClientFactory
-
Create an Amazon DynamoDB client.
- DYNAMODB_ENDPOINT - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Configure an alternative endpoint of the DynamoDB service to access.
- DYNAMODB_TABLE_NAME - Static variable in class org.apache.iceberg.aws.AwsProperties
-
DynamoDB table name for DynamoDbCatalog.
- DYNAMODB_TABLE_NAME_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- DynamoDbCatalog - Class in org.apache.iceberg.aws.dynamodb
-
DynamoDB implementation of the Iceberg catalog.
- DynamoDbCatalog() - Constructor for class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- DynamoDbLockManager - Class in org.apache.iceberg.aws.dynamodb
-
DynamoDB implementation for the lock manager.
- DynamoDbLockManager() - Constructor for class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
-
Constructor for dynamic initialization; DynamoDbLockManager.initialize(Map) must be called later.
- DynamoDbLockManager(DynamoDbClient, String) - Constructor for class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
-
Constructor used for testing purposes.
- dynamoDbTableName() - Method in class org.apache.iceberg.aws.AwsProperties
- DynClasses - Class in org.apache.iceberg.common
- DynClasses.Builder - Class in org.apache.iceberg.common
- DynConstructors - Class in org.apache.iceberg.common
-
Copied from parquet-common
- DynConstructors.Builder - Class in org.apache.iceberg.common
- DynConstructors.Ctor<C> - Class in org.apache.iceberg.common
- DynFields - Class in org.apache.iceberg.common
- DynFields.BoundField<T> - Class in org.apache.iceberg.common
- DynFields.Builder - Class in org.apache.iceberg.common
- DynFields.StaticField<T> - Class in org.apache.iceberg.common
- DynFields.UnboundField<T> - Class in org.apache.iceberg.common
-
Convenience wrapper class around Field.
- DynMethods - Class in org.apache.iceberg.common
-
Copied from parquet-common
- DynMethods.BoundMethod - Class in org.apache.iceberg.common
- DynMethods.Builder - Class in org.apache.iceberg.common
- DynMethods.StaticMethod - Class in org.apache.iceberg.common
- DynMethods.UnboundMethod - Class in org.apache.iceberg.common
-
Convenience wrapper class around Method.
E
- ECS_S3_ACCESS_KEY_ID - Static variable in class org.apache.iceberg.dell.DellProperties
-
S3 access key ID of Dell EMC ECS.
- ECS_S3_ENDPOINT - Static variable in class org.apache.iceberg.dell.DellProperties
-
S3 endpoint of Dell EMC ECS.
- ECS_S3_SECRET_ACCESS_KEY - Static variable in class org.apache.iceberg.dell.DellProperties
-
S3 secret access key of Dell EMC ECS.
- EcsCatalog - Class in org.apache.iceberg.dell.ecs
- EcsCatalog() - Constructor for class org.apache.iceberg.dell.ecs.EcsCatalog
-
No-arg constructor to load the catalog dynamically.
- EcsFileIO - Class in org.apache.iceberg.dell.ecs
-
FileIO implementation backed by Dell EMC ECS.
- EcsFileIO() - Constructor for class org.apache.iceberg.dell.ecs.EcsFileIO
- ecsS3() - Method in interface org.apache.iceberg.dell.DellClientFactory
-
Create a Dell EMC ECS S3 client
- ecsS3AccessKeyId() - Method in class org.apache.iceberg.dell.DellProperties
- ecsS3Endpoint() - Method in class org.apache.iceberg.dell.DellProperties
- ecsS3SecretAccessKey() - Method in class org.apache.iceberg.dell.DellProperties
- EcsTableOperations - Class in org.apache.iceberg.dell.ecs
- EcsTableOperations(String, EcsURI, FileIO, EcsCatalog) - Constructor for class org.apache.iceberg.dell.ecs.EcsTableOperations
- ELEMENT_ID_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- elementId() - Method in class org.apache.iceberg.types.Types.ListType
- elementName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- elements(L) - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- elementType() - Method in class org.apache.iceberg.types.Types.ListType
- empty() - Static method in class org.apache.iceberg.actions.BaseRewriteManifestsActionResult
- empty() - Static method in class org.apache.iceberg.catalog.Namespace
- empty() - Static method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- empty() - Static method in interface org.apache.iceberg.io.CloseableIterable
- empty() - Static method in interface org.apache.iceberg.io.CloseableIterator
- empty() - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- empty() - Static method in class org.apache.iceberg.util.CharSequenceSet
- EMPTY - Static variable in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- EMPTY_BOOLEAN_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_BYTE_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_DOUBLE_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_FLOAT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_INT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_LONG_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- EMPTY_SHORT_ARRAY - Static variable in class org.apache.iceberg.util.ArrayUtil
- encode(D) - Method in class org.apache.iceberg.data.avro.IcebergEncoder
- encode(D, OutputStream) - Method in class org.apache.iceberg.data.avro.IcebergEncoder
- encode(ManifestFile) - Static method in class org.apache.iceberg.ManifestFiles
-
Encode the
ManifestFile
to a byte array using the Avro encoder. - encode(T, Schema) - Static method in class org.apache.iceberg.avro.AvroEncoderUtil
- encodeFormData(Map<?, ?>) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Encodes a map of form data as application/x-www-form-urlencoded.
- encodeNamespace(Namespace) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Returns a String representation of a namespace that is suitable for use in a URL / URI.
- encodeString(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Encodes a string using URL encoding
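As a rough illustration of these RESTUtil helpers (the namespace, string, and form values below are made up):
```java
import java.util.Map;
import org.apache.iceberg.catalog.Namespace;
import org.apache.iceberg.rest.RESTUtil;

// Illustrative only: URL-encode identifiers and form data before building a REST request.
public class RestEncodingExample {
  public static void main(String[] args) {
    String ns = RESTUtil.encodeNamespace(Namespace.of("prod", "web")); // URL-safe namespace
    String table = RESTUtil.encodeString("events/2024");               // reserved characters escaped
    String form = RESTUtil.encodeFormData(Map.of("grant_type", "client_credentials"));
    System.out.printf("%s %s %s%n", ns, table, form);
  }
}
```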
- encrypt(byte[], byte[]) - Method in class org.apache.iceberg.encryption.Ciphers.AesGcmEncryptor
- encrypt(Iterable<OutputFile>) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Variant of
EncryptionManager.encrypt(OutputFile)
that provides a sequence of files that all need to be encrypted in a single context. - encrypt(OutputFile) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Given a handle on an
OutputFile
that writes raw bytes to the underlying file system, return a bundle of an EncryptedOutputFile.encryptingOutputFile()
that writes encrypted bytes to the underlying file system, and the EncryptedOutputFile.keyMetadata()
that points to the encryption key that is being used to encrypt this file. - encrypt(OutputFile) - Method in class org.apache.iceberg.encryption.PlaintextEncryptionManager
- EncryptedFiles - Class in org.apache.iceberg.encryption
- encryptedInput(InputFile, byte[]) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInput(InputFile, ByteBuffer) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInput(InputFile, EncryptionKeyMetadata) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedInputFile() - Method in interface org.apache.iceberg.encryption.EncryptedInputFile
-
The
InputFile
that is reading raw encrypted bytes from the underlying file system. - EncryptedInputFile - Interface in org.apache.iceberg.encryption
-
Thin wrapper around an
InputFile
instance that is encrypted. - encryptedOutput(OutputFile, byte[]) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedOutput(OutputFile, ByteBuffer) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- encryptedOutput(OutputFile, EncryptionKeyMetadata) - Static method in class org.apache.iceberg.encryption.EncryptedFiles
- EncryptedOutputFile - Interface in org.apache.iceberg.encryption
-
Thin wrapper around an
OutputFile
that encrypts bytes written to the underlying file system, using an encryption key symbolized by the enclosed EncryptionKeyMetadata
. - encryptingOutputFile() - Method in interface org.apache.iceberg.encryption.EncryptedOutputFile
-
An OutputFile instance that encrypts the bytes that are written to its output streams.
- encryption() - Method in class org.apache.iceberg.BaseMetadataTable
- encryption() - Method in class org.apache.iceberg.BaseTable
- encryption() - Method in class org.apache.iceberg.BaseTransaction.TransactionTable
- encryption() - Method in class org.apache.iceberg.BaseTransaction.TransactionTableOperations
- encryption() - Method in class org.apache.iceberg.SerializableTable
- encryption() - Method in interface org.apache.iceberg.Table
-
Returns an
EncryptionManager
to encrypt and decrypt data files. - encryption() - Method in interface org.apache.iceberg.TableOperations
-
Returns an
EncryptionManager
to encrypt and decrypt data files. - encryptionAlgorithm() - Method in class org.apache.iceberg.encryption.NativeFileCryptoParameters
- encryptionAlgorithm(EncryptionAlgorithm) - Method in class org.apache.iceberg.encryption.NativeFileCryptoParameters.Builder
- EncryptionAlgorithm - Enum in org.apache.iceberg.encryption
-
Algorithm supported for file encryption.
- encryptionKey() - Method in class org.apache.iceberg.gcp.GCPProperties
- EncryptionKeyMetadata - Interface in org.apache.iceberg.encryption
-
Light typedef over a ByteBuffer that indicates that the given bytes represent metadata about an encrypted data file's encryption key.
- EncryptionKeyMetadatas - Class in org.apache.iceberg.encryption
- encryptionManager() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- EncryptionManager - Interface in org.apache.iceberg.encryption
-
Module for encrypting and decrypting table data files.
- END_SNAPSHOT_ID - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- endFileIndex() - Method in class org.apache.iceberg.MicroBatches.MicroBatch
- endSnapshotId() - Method in class org.apache.iceberg.flink.source.ScanContext
- endSnapshotId() - Method in class org.apache.iceberg.spark.SparkReadConf
- endSnapshotId(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- endSnapshotId(Long) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- endSnapshotId(Long) - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- ENGINE_HIVE_ENABLED - Static variable in class org.apache.iceberg.hadoop.ConfigProperties
- ENGINE_HIVE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- ENGINE_HIVE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- enrichContextWithAttemptWrapper(TaskAttemptContext) - Static method in class org.apache.iceberg.mr.hive.TezUtil
-
Creates a new taskAttemptContext by replacing the taskAttemptID with a wrapped object.
- enrichContextWithVertexId(JobContext) - Static method in class org.apache.iceberg.mr.hive.TezUtil
-
If the Tez vertex id is present in config, creates a new jobContext by appending the Tez vertex id to the jobID.
- enterAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
addPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
addPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
applyTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - enterApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
applyTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - enterBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
bigDecimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
bigDecimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
bigIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
bigIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
booleanLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
booleanLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.booleanValue()
. - enterBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.booleanValue()
. - enterCall(IcebergSqlExtensionsParser.CallContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
call
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterCall(IcebergSqlExtensionsParser.CallContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
call
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
decimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
decimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
doubleLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
doubleLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
dropIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
dropIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
dropPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
dropPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterEveryRule(ParserRuleContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- enterExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
exponentLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
exponentLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.expression()
. - enterExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.expression()
. - enterFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.fieldList()
. - enterFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.fieldList()
. - enterFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
floatLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
floatLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
identityTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - enterIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
identityTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - enterIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
integerLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
integerLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.multipartIdentifier()
. - enterMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.multipartIdentifier()
. - enterNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
namedArgument
labeled alternative in IcebergSqlExtensionsParser.callArgument()
. - enterNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
namedArgument
labeled alternative in IcebergSqlExtensionsParser.callArgument()
. - enterNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.nonReserved()
. - enterNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.nonReserved()
. - enterNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
numericLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
numericLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterOrder(IcebergSqlExtensionsParser.OrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.order()
. - enterOrder(IcebergSqlExtensionsParser.OrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.order()
. - enterOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.orderField()
. - enterOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.orderField()
. - enterPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
positionalArgument
labeled alternative in IcebergSqlExtensionsParser.callArgument()
. - enterPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
positionalArgument
labeled alternative in IcebergSqlExtensionsParser.callArgument()
. - enterQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.quotedIdentifier()
. - enterQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.quotedIdentifier()
. - enterQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
quotedIdentifierAlternative
labeled alternative in IcebergSqlExtensionsParser.identifier()
. - enterQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
quotedIdentifierAlternative
labeled alternative in IcebergSqlExtensionsParser.identifier()
. - enterReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
replacePartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterReplacePartitionField(IcebergSqlExtensionsParser.ReplacePartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
replacePartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleOrderContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- enterRule(ParseTreeListener) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- enterSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
setIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterSetIdentifierFields(IcebergSqlExtensionsParser.SetIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
setIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
setWriteDistributionAndOrdering
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterSetWriteDistributionAndOrdering(IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
setWriteDistributionAndOrdering
labeled alternative in IcebergSqlExtensionsParser.statement()
. - enterSingleOrder(IcebergSqlExtensionsParser.SingleOrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.singleOrder()
. - enterSingleOrder(IcebergSqlExtensionsParser.SingleOrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.singleOrder()
. - enterSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.singleStatement()
. - enterSingleStatement(IcebergSqlExtensionsParser.SingleStatementContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.singleStatement()
. - enterSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
smallIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterSmallIntLiteral(IcebergSqlExtensionsParser.SmallIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
smallIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
stringLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterStringLiteral(IcebergSqlExtensionsParser.StringLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
stringLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.stringMap()
. - enterStringMap(IcebergSqlExtensionsParser.StringMapContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.stringMap()
. - enterTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
tinyIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterTinyIntLiteral(IcebergSqlExtensionsParser.TinyIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
tinyIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - enterTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.transformArgument()
. - enterTransformArgument(IcebergSqlExtensionsParser.TransformArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.transformArgument()
. - enterTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
typeConstructor
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterTypeConstructor(IcebergSqlExtensionsParser.TypeConstructorContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
typeConstructor
labeled alternative in IcebergSqlExtensionsParser.constant()
. - enterUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by the
unquotedIdentifier
labeled alternative in IcebergSqlExtensionsParser.identifier()
. - enterUnquotedIdentifier(IcebergSqlExtensionsParser.UnquotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by the
unquotedIdentifier
labeled alternative in IcebergSqlExtensionsParser.identifier()
. - enterWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeDistributionSpec()
. - enterWriteDistributionSpec(IcebergSqlExtensionsParser.WriteDistributionSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeDistributionSpec()
. - enterWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeOrderingSpec()
. - enterWriteOrderingSpec(IcebergSqlExtensionsParser.WriteOrderingSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeOrderingSpec()
. - enterWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeSpec()
. - enterWriteSpec(IcebergSqlExtensionsParser.WriteSpecContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Enter a parse tree produced by
IcebergSqlExtensionsParser.writeSpec()
. - ENTRIES - org.apache.iceberg.MetadataTableType
- entrySet() - Method in class org.apache.iceberg.util.SerializableMap
- entrySet() - Method in class org.apache.iceberg.util.StructLikeMap
- enums(List<String>) - Static method in class org.apache.iceberg.avro.ValueReaders
- env(StreamExecutionEnvironment) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- EnvironmentUtil - Class in org.apache.iceberg.util
- EOF() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleOrderContext
- EOF() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- EPOCH - Static variable in class org.apache.iceberg.util.DateTimeUtil
- EPOCH_DAY - Static variable in class org.apache.iceberg.util.DateTimeUtil
- eq(Bound<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- eq(BoundReference<T>, Literal<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- EQ - org.apache.iceberg.expressions.Expression.Operation
- eqDeletedRowFilter() - Method in class org.apache.iceberg.data.DeleteFilter
- equal(String, T) - Static method in class org.apache.iceberg.expressions.Expressions
- equal(UnboundTerm<T>, T) - Static method in class org.apache.iceberg.expressions.Expressions
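A small illustrative sketch of building a predicate with Expressions.equal; the column names and values below are hypothetical.
```java
import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;

// Combine an equality predicate with another comparison into one filter expression.
public class ExpressionExample {
  public static void main(String[] args) {
    Expression filter = Expressions.and(
        Expressions.equal("event_type", "click"),
        Expressions.greaterThanOrEqual("ts_day", 19000));
    System.out.println(filter);
  }
}
```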
- EQUALITY_DELETES - org.apache.iceberg.FileContent
- EQUALITY_IDS - Static variable in interface org.apache.iceberg.DataFile
- EqualityDeleteRowReader - Class in org.apache.iceberg.spark.source
- EqualityDeleteRowReader(CombinedScanTask, Table, Schema, boolean) - Constructor for class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- equalityDeleteRowSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- EqualityDeleteWriter<T> - Class in org.apache.iceberg.deletes
- EqualityDeleteWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata, SortOrder, int...) - Constructor for class org.apache.iceberg.deletes.EqualityDeleteWriter
- EqualityDeltaWriter<T> - Interface in org.apache.iceberg.io
-
A writer capable of writing data and equality deletes that may belong to different specs and partitions.
- equalityFieldColumns(List<String>) - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Configures the equality field columns for an Iceberg table that accepts CDC or UPSERT events.
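A hedged sketch of passing equality field columns through the Flink sink builder; the input stream, table location, and column name are assumptions for illustration.
```java
import java.util.Arrays;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.data.RowData;
import org.apache.iceberg.flink.TableLoader;
import org.apache.iceberg.flink.sink.FlinkSink;

// Wire a RowData stream into an Iceberg table with upsert semantics keyed on "id".
public class FlinkUpsertSinkExample {
  static void addSink(DataStream<RowData> rowDataStream) {
    TableLoader tableLoader = TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/db/events");
    FlinkSink.forRowData(rowDataStream)
        .tableLoader(tableLoader)
        .equalityFieldColumns(Arrays.asList("id")) // rows with the same id are upserted
        .upsert(true)
        .append();
  }
}
```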
- equalityFieldIds() - Method in interface org.apache.iceberg.ContentFile
-
Returns the set of field IDs used for equality comparison, in equality delete files.
- equalityFieldIds() - Method in interface org.apache.iceberg.DataFile
- equalityFieldIds(int...) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- equalityFieldIds(int...) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- equalityFieldIds(int...) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- equalityFieldIds(List<Integer>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- equals(Object) - Method in class org.apache.iceberg.catalog.Namespace
- equals(Object) - Method in class org.apache.iceberg.catalog.TableIdentifier
- equals(Object) - Method in class org.apache.iceberg.data.GenericRecord
- equals(Object) - Method in class org.apache.iceberg.GenericManifestFile
- equals(Object) - Method in class org.apache.iceberg.mapping.MappedField
- equals(Object) - Method in class org.apache.iceberg.mapping.MappedFields
- equals(Object) - Method in class org.apache.iceberg.MetricsModes.Truncate
- equals(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergRecordObjectInspector
- equals(Object) - Method in class org.apache.iceberg.PartitionField
- equals(Object) - Method in class org.apache.iceberg.PartitionKey
- equals(Object) - Method in class org.apache.iceberg.PartitionSpec
- equals(Object) - Method in class org.apache.iceberg.SnapshotRef
- equals(Object) - Method in class org.apache.iceberg.SortField
- equals(Object) - Method in class org.apache.iceberg.SortOrder
- equals(Object) - Method in class org.apache.iceberg.spark.source.SparkTable
- equals(Object) - Method in class org.apache.iceberg.spark.SparkTableUtil.SparkPartition
- equals(Object) - Method in class org.apache.iceberg.TableMetadata.MetadataLogEntry
- equals(Object) - Method in class org.apache.iceberg.TableMetadata.SnapshotLogEntry
- equals(Object) - Method in class org.apache.iceberg.transforms.UnknownTransform
- equals(Object) - Method in class org.apache.iceberg.types.Types.DecimalType
- equals(Object) - Method in class org.apache.iceberg.types.Types.FixedType
- equals(Object) - Method in class org.apache.iceberg.types.Types.ListType
- equals(Object) - Method in class org.apache.iceberg.types.Types.MapType
- equals(Object) - Method in class org.apache.iceberg.types.Types.NestedField
- equals(Object) - Method in class org.apache.iceberg.types.Types.StructType
- equals(Object) - Method in class org.apache.iceberg.types.Types.TimestampType
- equals(Object) - Method in class org.apache.iceberg.util.CharSequenceSet
- equals(Object) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- equals(Object) - Method in class org.apache.iceberg.util.Pair
- equals(Object) - Method in class org.apache.iceberg.util.SerializableMap
- equals(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- equals(Object) - Method in class org.apache.iceberg.util.StructLikeWrapper
- equivalent(Expression, Expression, Types.StructType, boolean) - Static method in class org.apache.iceberg.expressions.ExpressionUtil
-
Returns whether two unbound expressions will accept the same inputs.
- ErrorHandlers - Class in org.apache.iceberg.rest
-
A set of consumers that handle errors from table or namespace requests by throwing the correct exception.
- ErrorResponse - Class in org.apache.iceberg.rest.responses
-
Standard response body for all API errors
- ErrorResponse.Builder - Class in org.apache.iceberg.rest.responses
- ErrorResponseDeserializer() - Constructor for class org.apache.iceberg.rest.RESTSerializers.ErrorResponseDeserializer
- ErrorResponseParser - Class in org.apache.iceberg.rest.responses
- ErrorResponseSerializer() - Constructor for class org.apache.iceberg.rest.RESTSerializers.ErrorResponseSerializer
- estimatedSize() - Method in class org.apache.iceberg.spark.source.SerializableTableWithSize
- estimatedSize() - Method in class org.apache.iceberg.spark.source.SerializableTableWithSize.SerializableMetadataTableWithSize
- EstimateOrcAvgWidthVisitor - Class in org.apache.iceberg.orc
- EstimateOrcAvgWidthVisitor() - Constructor for class org.apache.iceberg.orc.EstimateOrcAvgWidthVisitor
- estimateSize(StructType, long) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Estimate approximate table size based on Spark schema and total records.
- estimateStatistics() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- eval(ContentFile<?>) - Method in class org.apache.iceberg.expressions.InclusiveMetricsEvaluator
-
Test whether the file may contain records that match the expression.
- eval(ContentFile<?>) - Method in class org.apache.iceberg.expressions.StrictMetricsEvaluator
-
Test whether all records within the file match the expression.
- eval(ManifestFile) - Method in class org.apache.iceberg.expressions.ManifestEvaluator
-
Test whether the file may contain records that match the expression.
- eval(StructLike) - Method in interface org.apache.iceberg.expressions.Bound
-
Produce a value from the struct for this expression.
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundPredicate
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundReference
- eval(StructLike) - Method in class org.apache.iceberg.expressions.BoundTransform
- eval(StructLike) - Method in class org.apache.iceberg.expressions.Evaluator
- Evaluator - Class in org.apache.iceberg.expressions
-
Evaluates an
Expression
for data described by a Types.StructType
. - Evaluator(Types.StructType, Expression) - Constructor for class org.apache.iceberg.expressions.Evaluator
- Evaluator(Types.StructType, Expression, boolean) - Constructor for class org.apache.iceberg.expressions.Evaluator
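A minimal sketch of evaluating an expression against an in-memory record with Evaluator; the schema and field values are invented for illustration.
```java
import org.apache.iceberg.Schema;
import org.apache.iceberg.data.GenericRecord;
import org.apache.iceberg.expressions.Evaluator;
import org.apache.iceberg.expressions.Expressions;
import org.apache.iceberg.types.Types;

// Bind an unbound expression to a struct type and test it against a record.
public class EvaluatorExample {
  public static void main(String[] args) {
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "category", Types.StringType.get()));

    Evaluator evaluator = new Evaluator(schema.asStruct(), Expressions.equal("category", "books"));

    GenericRecord record = GenericRecord.create(schema);
    record.setField("id", 1L);
    record.setField("category", "books");

    System.out.println(evaluator.eval(record)); // prints true
  }
}
```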
- Exceptions - Class in org.apache.iceberg.util
- ExceptionUtil - Class in org.apache.iceberg.util
- ExceptionUtil.Block<R,E1 extends java.lang.Exception,E2 extends java.lang.Exception,E3 extends java.lang.Exception> - Interface in org.apache.iceberg.util
- ExceptionUtil.CatchBlock - Interface in org.apache.iceberg.util
- ExceptionUtil.FinallyBlock - Interface in org.apache.iceberg.util
- exchangeToken(RESTClient, Map<String, String>, String, String, String, String, String) - Static method in class org.apache.iceberg.rest.auth.OAuth2Util
- execute() - Method in interface org.apache.iceberg.actions.Action
-
Executes this action.
- execute() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- execute() - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.ExpireSnapshotsSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.RewriteDataFilesSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.RewriteManifestsSparkAction
- execute() - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes an alternative executor service that will be used for removing orphaned files.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Passes an alternative executor service that will be used for file removal.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Passes an alternative executor service that will be used for deleting manifests, data files, and delete files.
- executeDeleteWith(ExecutorService) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Passes an alternative executor service that will be used for deleting manifests and data files.
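A hedged sketch of combining executeDeleteWith with snapshot expiration; the thread-pool size and retention window below are arbitrary examples.
```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.apache.iceberg.Table;

// Expire snapshots older than one week and delete unreachable files on a custom pool.
public class ExpireExample {
  static void expire(Table table) {
    ExecutorService deletePool = Executors.newFixedThreadPool(4);
    try {
      table.expireSnapshots()
          .expireOlderThan(System.currentTimeMillis() - 7 * 24 * 60 * 60 * 1000L)
          .executeDeleteWith(deletePool)
          .commit();
    } finally {
      deletePool.shutdown();
    }
  }
}
```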
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- executeDeleteWith(ExecutorService) - Method in class org.apache.iceberg.spark.actions.ExpireSnapshotsSparkAction
- executeWith(ExecutorService) - Method in class org.apache.iceberg.util.Tasks.Builder
- existing(F, long, long) - Method in class org.apache.iceberg.ManifestWriter
-
Add an existing entry for a file.
- EXISTING_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- EXISTING_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- existingDeletes() - Method in interface org.apache.iceberg.DeletedDataFileScanTask
-
A list of previously added
delete files
to apply when reading the data file in this task. - existingDeletes() - Method in interface org.apache.iceberg.DeletedRowsScanTask
-
A list of
delete files
that existed before and must be applied prior to determining which records are deleted by delete files in DeletedRowsScanTask.addedDeletes()
. - existingFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- existingFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of data files with status EXISTING in the manifest file.
- existingFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- existingRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- existingRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all data files with status EXISTING in the manifest file.
- existingRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- exists() - Method in class org.apache.iceberg.hadoop.HadoopInputFile
- exists() - Method in interface org.apache.iceberg.io.InputFile
-
Checks whether the file exists.
- exists(String) - Method in class org.apache.iceberg.hadoop.HadoopTables
- exists(String) - Method in interface org.apache.iceberg.Tables
- exitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
addPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitAddPartitionField(IcebergSqlExtensionsParser.AddPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
addPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
applyTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - exitApplyTransform(IcebergSqlExtensionsParser.ApplyTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
applyTransform
labeled alternative in IcebergSqlExtensionsParser.transform()
. - exitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
bigDecimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitBigDecimalLiteral(IcebergSqlExtensionsParser.BigDecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
bigDecimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
bigIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitBigIntLiteral(IcebergSqlExtensionsParser.BigIntLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
bigIntLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
booleanLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - exitBooleanLiteral(IcebergSqlExtensionsParser.BooleanLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
booleanLiteral
labeled alternative in IcebergSqlExtensionsParser.constant()
. - exitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.booleanValue()
. - exitBooleanValue(IcebergSqlExtensionsParser.BooleanValueContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.booleanValue()
. - exitCall(IcebergSqlExtensionsParser.CallContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
call
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitCall(IcebergSqlExtensionsParser.CallContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
call
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
decimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitDecimalLiteral(IcebergSqlExtensionsParser.DecimalLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
decimalLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
doubleLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitDoubleLiteral(IcebergSqlExtensionsParser.DoubleLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
doubleLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
dropIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitDropIdentifierFields(IcebergSqlExtensionsParser.DropIdentifierFieldsContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
dropIdentifierFields
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
dropPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitDropPartitionField(IcebergSqlExtensionsParser.DropPartitionFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
dropPartitionField
labeled alternative in IcebergSqlExtensionsParser.statement()
. - exitEveryRule(ParserRuleContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
- exitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
exponentLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitExponentLiteral(IcebergSqlExtensionsParser.ExponentLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
exponentLiteral
labeled alternative in IcebergSqlExtensionsParser.number()
. - exitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.expression()
. - exitExpression(IcebergSqlExtensionsParser.ExpressionContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.expression()
. - exitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.fieldList()
. - exitFieldList(IcebergSqlExtensionsParser.FieldListContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.fieldList()
. - exitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
floatLiteral
labeled alternative inIcebergSqlExtensionsParser.number()
. - exitFloatLiteral(IcebergSqlExtensionsParser.FloatLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
floatLiteral
labeled alternative inIcebergSqlExtensionsParser.number()
. - exitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
identityTransform
labeled alternative inIcebergSqlExtensionsParser.transform()
. - exitIdentityTransform(IcebergSqlExtensionsParser.IdentityTransformContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
identityTransform
labeled alternative inIcebergSqlExtensionsParser.transform()
. - exitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
integerLiteral
labeled alternative inIcebergSqlExtensionsParser.number()
. - exitIntegerLiteral(IcebergSqlExtensionsParser.IntegerLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
integerLiteral
labeled alternative inIcebergSqlExtensionsParser.number()
. - exitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.multipartIdentifier()
. - exitMultipartIdentifier(IcebergSqlExtensionsParser.MultipartIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.multipartIdentifier()
. - exitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
namedArgument
labeled alternative inIcebergSqlExtensionsParser.callArgument()
. - exitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
namedArgument
labeled alternative inIcebergSqlExtensionsParser.callArgument()
. - exitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.nonReserved()
. - exitNonReserved(IcebergSqlExtensionsParser.NonReservedContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.nonReserved()
. - exitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
numericLiteral
labeled alternative inIcebergSqlExtensionsParser.constant()
. - exitNumericLiteral(IcebergSqlExtensionsParser.NumericLiteralContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
numericLiteral
labeled alternative inIcebergSqlExtensionsParser.constant()
. - exitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.order()
. - exitOrder(IcebergSqlExtensionsParser.OrderContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.order()
. - exitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.orderField()
. - exitOrderField(IcebergSqlExtensionsParser.OrderFieldContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by
IcebergSqlExtensionsParser.orderField()
. - exitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the
positionalArgument
labeled alternative inIcebergSqlExtensionsParser.callArgument()
. - exitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by the
positionalArgument
labeled alternative inIcebergSqlExtensionsParser.callArgument()
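Within IcebergSqlExtensionsParser.callArgument(), the namedArgument and positionalArgument labeled alternatives above distinguish name => value arguments from purely positional ones. As a further illustrative sketch (again not part of the generated Javadoc), a listener can tell the two apart by overriding both callbacks; it relies only on getText() rather than assuming any particular accessor methods on the generated contexts, and it can be driven by ParseTreeWalker exactly as in the earlier sketch. The class name CallArgumentCollector is made up.

import org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener;
import org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser;

// Prints the raw text of each CALL argument, tagged by which
// callArgument alternative produced it.
public class CallArgumentCollector extends IcebergSqlExtensionsBaseListener {
  @Override
  public void exitNamedArgument(IcebergSqlExtensionsParser.NamedArgumentContext ctx) {
    // Fired when leaving an argument written as name => value.
    System.out.println("named argument: " + ctx.getText());
  }

  @Override
  public void exitPositionalArgument(IcebergSqlExtensionsParser.PositionalArgumentContext ctx) {
    // Fired when leaving an argument given by position only.
    System.out.println("positional argument: " + ctx.getText());
  }
}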
- exitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- exitQuotedIdentifier(IcebergSqlExtensionsParser.QuotedIdentifierContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener
-
Exit a parse tree produced by IcebergSqlExtensionsParser.quotedIdentifier().
- exitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsBaseListener
-
Exit a parse tree produced by the quotedIdentifierAlternative labeled alternative in IcebergSqlExtensionsParser.identifier().
- exitQuotedIdentifierAlternative(IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext) - Method in interface org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsListener