Index
A
- aadPrefix() - Method in interface org.apache.iceberg.encryption.NativeEncryptionKeyMetadata
-
Additional authentication data as a ByteBuffer.
- abort() - Method in class org.apache.iceberg.io.BaseTaskWriter
- abort() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and delete the completed files if possible when aborting.
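A minimal sketch of the abort contract described above, assuming a TaskWriter<Record> and its input rows are supplied by the caller (both are placeholders here):

import org.apache.iceberg.DataFile;
import org.apache.iceberg.data.Record;
import org.apache.iceberg.io.TaskWriter;

class TaskWriterAbortSketch {
  // Writes all rows, or aborts to delete any completed files if writing fails.
  static DataFile[] writeOrAbort(TaskWriter<Record> writer, Iterable<Record> rows) throws Exception {
    try {
      for (Record row : rows) {
        writer.write(row);
      }
      writer.close();
      return writer.dataFiles(); // completed files, ready for a commit
    } catch (Exception e) {
      writer.abort(); // best-effort cleanup of files written so far
      throw e;
    }
  }
}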
- abort(RewritePositionDeletesGroup) - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager
-
Clean up a specified file set by removing any files created for that operation; should not throw any exceptions.
- abortFileGroup(RewriteFileGroup) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Clean up a specified file set by removing any files created for that operation; should not throw any exceptions.
- abortFileGroup(RewriteFileGroup) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
- abortFileGroup(RewritePositionDeletesGroup) - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager.CommitService
- abortJob(JobContext, int) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes the generated data files if there is a commit file already generated for them.
- abortStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- abortStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- abortTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Removes files generated by this task.
- abortWith(Tasks.Task<I, ?>) - Method in class org.apache.iceberg.util.Tasks.Builder
- AbstractMapredIcebergRecordReader<T> - Class in org.apache.iceberg.mr.mapred
- AbstractMapredIcebergRecordReader(IcebergInputFormat<?>, IcebergSplit, JobConf, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- ACCELERATION_ENABLED - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Determines whether the S3 client will use Acceleration Mode; defaults to false.
- ACCELERATION_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BranchOptionsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceBranchContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceTagContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceBranchClauseContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceTagClauseContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DoubleLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropBranchContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropTagContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExponentLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FieldListContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.FloatLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentityTransformContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IntegerLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MaxSnapshotAgeContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MinSnapshotsToKeepContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.MultipartIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NamedArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumericLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumSnapshotsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.PositionalArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierAlternativeContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.RefRetainContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleOrderContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SingleStatementContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SmallIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SnapshotIdContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SnapshotRetentionContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringArrayContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TagOptionsContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TimeUnitContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TinyIntLiteralContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TypeConstructorContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.UnquotedIdentifierContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteSpecContext
- accept(Path) - Method in class org.apache.iceberg.hadoop.HiddenPathFilter
- ACCESS_KEY_ID - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Configure the static access key ID used to access S3FileIO.
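A sketch of configuring S3FileIO with static credentials through these properties, assuming a SECRET_ACCESS_KEY constant also exists on S3FileIOProperties; the credential values are placeholders:

import java.util.Map;
import org.apache.iceberg.aws.s3.S3FileIO;
import org.apache.iceberg.aws.s3.S3FileIOProperties;

class S3FileIOConfigSketch {
  // Initializes S3FileIO from a flat property map keyed by the constants in this index.
  static S3FileIO newS3FileIO() {
    Map<String, String> props = Map.of(
        S3FileIOProperties.ACCESS_KEY_ID, "<access-key-id>",         // static credentials
        S3FileIOProperties.SECRET_ACCESS_KEY, "<secret-access-key>", // assumed constant, see lead-in
        S3FileIOProperties.ACCELERATION_ENABLED, "true");            // opt into S3 acceleration
    S3FileIO io = new S3FileIO();
    io.initialize(props);
    return io;
  }
}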
- ACCESS_POINTS_PREFIX - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Prefix used by S3FileIO for bucket access point configuration.
- ACCESS_TOKEN_TYPE - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
- accessKeyId() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessKeyId() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- accessKeySecret() - Method in class org.apache.iceberg.aliyun.AliyunProperties
- accessor() - Method in class org.apache.iceberg.expressions.BoundReference
- accessor() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- Accessor<T> - Interface in org.apache.iceberg
- accessorForField(int) - Method in class org.apache.iceberg.Schema
-
Returns an accessor for retrieving the data from StructLike.
- Accessors - Class in org.apache.iceberg
-
Position2Accessor and Position3Accessor here are an optimization.
- acl() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- ACL - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Used to configure the canned access control list (ACL) for the S3 client to use during writes.
- acquire(String, String) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
- acquire(String, String) - Method in interface org.apache.iceberg.LockManager
-
Try to acquire a lock.
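A sketch of the acquire/release pattern, assuming the LockManager also exposes a matching release(String, String) method (as DynamoDbLockManager does):

import org.apache.iceberg.LockManager;

class LockSketch {
  // Runs work while holding a lock on entityId; locks may be any LockManager implementation,
  // such as the DynamoDbLockManager listed above.
  static void withLock(LockManager locks, String entityId, String ownerId, Runnable work) {
    if (!locks.acquire(entityId, ownerId)) {
      throw new IllegalStateException("Could not acquire lock for " + entityId);
    }
    try {
      work.run();
    } finally {
      locks.release(entityId, ownerId); // assumed release method, see lead-in
    }
  }
}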
- acquireIntervalMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- acquireTimeoutMs() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- Action<ThisT, R> - Interface in org.apache.iceberg.actions
-
An action performed on a table.
- actions() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- Actions - Class in org.apache.iceberg.flink.actions
- ActionsProvider - Interface in org.apache.iceberg.actions
-
An API that should be implemented by query engine integrations for providing actions.
- ADAPTIVE_SPLIT_SIZE_ENABLED - Static variable in class org.apache.iceberg.TableProperties
- ADAPTIVE_SPLIT_SIZE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- adaptiveSplitSizeEnabled() - Method in class org.apache.iceberg.spark.SparkReadConf
- add(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- add(D) - Method in interface org.apache.iceberg.io.FileAppender
- add(D) - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- add(F) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file.
- add(F) - Method in class org.apache.iceberg.RollingManifestWriter
-
Add an added entry for a file.
- add(F, long) - Method in class org.apache.iceberg.ManifestWriter
-
Add an added entry for a file with a specific sequence number.
- add(F, long) - Method in class org.apache.iceberg.RollingManifestWriter
-
Add an added entry for a file with a specific sequence number.
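A sketch of producing a manifest with these add methods, assuming ManifestFiles.write(PartitionSpec, OutputFile) is the factory used to create the writer:

import java.io.IOException;
import org.apache.iceberg.DataFile;
import org.apache.iceberg.ManifestFile;
import org.apache.iceberg.ManifestFiles;
import org.apache.iceberg.ManifestWriter;
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.io.OutputFile;

class ManifestWriteSketch {
  // Adds each data file as an ADDED entry and returns the resulting manifest metadata.
  static ManifestFile writeManifest(PartitionSpec spec, OutputFile out, Iterable<DataFile> files)
      throws IOException {
    ManifestWriter<DataFile> writer = ManifestFiles.write(spec, out);
    try {
      for (DataFile file : files) {
        writer.add(file);
      }
    } finally {
      writer.close();
    }
    return writer.toManifestFile();
  }
}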
- add(CharSequence) - Method in class org.apache.iceberg.util.CharSequenceSet
- add(String, Table) - Method in class org.apache.iceberg.spark.SparkTableCache
- add(Namespace) - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- add(TableIdentifier) - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- add(DataFile) - Method in class org.apache.iceberg.StreamingDelete
-
Add a data file to the new snapshot.
- add(DeleteFile) - Method in class org.apache.iceberg.StreamingDelete
-
Add a delete file to the new snapshot.
- add(DeleteFile, long) - Method in class org.apache.iceberg.StreamingDelete
-
Add a delete file to the new snapshot.
- add(WriteResult) - Method in class org.apache.iceberg.io.WriteResult.Builder
- add(ManifestFile) - Method in class org.apache.iceberg.StreamingDelete
-
Add all files in a manifest to the new snapshot.
- add(Blob) - Method in class org.apache.iceberg.puffin.PuffinWriter
- add(StructLike) - Method in class org.apache.iceberg.util.StructLikeSet
- add(Pair<Integer, StructLike>) - Method in class org.apache.iceberg.util.PartitionSet
- add(T) - Method in class org.apache.iceberg.spark.actions.SetAccumulator
- add(T[], T) - Static method in class org.apache.iceberg.util.ArrayUtil
-
Copies the given array and adds the given element at the end of the new array.
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ADD - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ADD() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ADD_EQ_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADD_POS_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- addAll(Iterable<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addAll(Iterable<WriteResult>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addAll(Collection<? extends CharSequence>) - Method in class org.apache.iceberg.util.CharSequenceSet
- addAll(Collection<? extends StructLike>) - Method in class org.apache.iceberg.util.StructLikeSet
- addAll(Collection<? extends Pair<Integer, StructLike>>) - Method in class org.apache.iceberg.util.PartitionSet
- addAll(Collection<Namespace>) - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- addAll(Collection<TableIdentifier>) - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- addAll(Iterator<D>) - Method in interface org.apache.iceberg.io.FileAppender
- addAllConfig(Map<String, String>) - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- addCloseable(Closeable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register a closeable to be managed by this class.
- addCloseable(AutoCloseable) - Method in class org.apache.iceberg.io.CloseableGroup
-
Register an AutoCloseable to be managed by this class.
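A sketch of registering resources with a CloseableGroup so they are released together; the two resources are placeholders for anything a catalog or reader might own:

import java.io.Closeable;
import java.io.IOException;
import org.apache.iceberg.io.CloseableGroup;

class CloseableGroupSketch {
  // Registers both resources and closes them with a single call.
  static void closeTogether(Closeable fileIO, AutoCloseable client) throws IOException {
    CloseableGroup group = new CloseableGroup();
    group.addCloseable(fileIO);
    group.addCloseable(client);
    group.close(); // closes everything registered above
  }
}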
- addColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new column to a nested struct.
- addColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
- addColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new top-level column.
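A sketch of these addColumn variants on a loaded Table; the table, column names, and doc strings are placeholders:

import org.apache.iceberg.Table;
import org.apache.iceberg.types.Types;

class AddColumnSketch {
  // Adds a documented top-level column and a field nested inside an existing struct column.
  static void addColumns(Table table) {
    table.updateSchema()
        .addColumn("category", Types.StringType.get(), "product category")
        .addColumn("address", "zip", Types.IntegerType.get())
        .commit();
  }
}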
- addConfig(String, String) - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- addDataFiles(Iterable<DataFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDataFiles(DataFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(Iterable<DeleteFile>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeleteFiles(DeleteFile...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addDeletes(DeleteFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a DeleteFile to the table.
- ADDED_DATA_FILES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_DELETE_FILES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_DELETE_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_EQ_DELETE_FILES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_EQ_DELETES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_EQ_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILE_SIZE_BYTES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_FILE_SIZE_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- ADDED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_POS_DELETE_FILES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_POS_DELETES - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_POS_DELETES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_RECORDS - Static variable in interface org.apache.iceberg.metrics.CommitMetricsResult
- ADDED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- ADDED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- addedBytes() - Method in class org.apache.iceberg.actions.RewritePositionDeletesGroup
- addedBytesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.FileGroupRewriteResult
-
Returns the number of bytes of newly added position delete files in this group.
- addedBytesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.Result
-
Returns the number of bytes of newly added position delete files.
- addedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- addedDataFiles() - Method in class org.apache.iceberg.StreamingDelete
- addedDataFiles() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedDataFiles(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all data files added to the table in this snapshot.
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupRewriteResult
- addedDataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.Result
- addedDeleteFiles() - Method in class org.apache.iceberg.actions.RewritePositionDeletesGroup
- addedDeleteFiles() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedDeleteFiles(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all delete files added to the table in this snapshot.
- addedDeleteFiles(TableMetadata, Long, Expression, PartitionSet, Snapshot) - Method in class org.apache.iceberg.StreamingDelete
-
Returns matching delete files that have been added to the table since a starting snapshot.
- addedDeleteFilesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.FileGroupRewriteResult
-
Returns the count of the added position delete files in this group.
- addedDeleteFilesCount() - Method in interface org.apache.iceberg.actions.RewritePositionDeleteFiles.Result
-
Returns the count of the added position delete files.
- addedDeletes() - Method in interface org.apache.iceberg.DeletedRowsScanTask
-
A list of added delete files that apply to the task's data file.
- addedEqualityDeleteFiles() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedEqualityDeletes() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedFiles() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- addedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of files with status ADDED in the manifest file.
- addedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- addedFilesSizeInBytes() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedManifest(ManifestFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- addedManifests() - Method in interface org.apache.iceberg.actions.RewriteManifests.Result
-
Returns added manifests.
- addedPositionalDeleteFiles() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedPositionalDeletes() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the added position delete files.
- addedRecords() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- addedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- addedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all files with status ADDED in the manifest file.
- addedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- AddedRowsScanTask - Interface in org.apache.iceberg
-
A scan task for inserts generated by adding a data file to the table.
- addElement(I, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- addElement(List<E>, E) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- addExtension(String) - Method in enum class org.apache.iceberg.FileFormat
-
Returns the filename with this format's extension added, if necessary.
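A small example of addExtension; the suffix is only appended when the filename does not already end with it:

import org.apache.iceberg.FileFormat;

class FileFormatExtensionSketch {
  public static void main(String[] args) {
    System.out.println(FileFormat.PARQUET.addExtension("part-00000"));         // part-00000.parquet
    System.out.println(FileFormat.PARQUET.addExtension("part-00000.parquet")); // already suffixed, unchanged
  }
}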
- addFallbackIds(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- addField(String) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from a source column.
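A sketch of evolving a partition spec with addField, assuming Table.updateSpec() returns an UpdatePartitionSpec and using Expressions.bucket as the transform term:

import org.apache.iceberg.Table;
import org.apache.iceberg.expressions.Expressions;

class AddPartitionFieldSketch {
  // Adds an identity partition field from a source column plus a bucket-transform field.
  static void evolveSpec(Table table) {
    table.updateSpec()
        .addField("category")                   // identity partition on an existing column
        .addField(Expressions.bucket("id", 16)) // partition by a transform term
        .commit();
  }
}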
- addField(String, Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term, with the given partition field name.
- addField(Term) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Add a new partition field from an expression term.
- addFile(String, byte[]) - Method in class org.apache.iceberg.inmemory.InMemoryFileIO
- addFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
- addFile(DataFile) - Method in class org.apache.iceberg.BaseReplacePartitions
- addFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Add a DataFile to the table.
- addFile(DataFile) - Method in interface org.apache.iceberg.ReplacePartitions
-
Add a DataFile to the table.
- addFile(DataFile) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a new data file.
- addFile(DeleteFile) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a new delete file.
- addFile(DeleteFile, long) - Method in interface org.apache.iceberg.RewriteFiles
-
Add a new delete file with the given data sequence number.
- additionalProperties() - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- addManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- addManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Adds a manifest file to the table.
- addMissing(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addMissing(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addPair(I, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- addPair(Map<K, V>, K, V) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- AddPartitionFieldContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- addPartitionSpec(PartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- addPartitionSpec(UnboundPartitionSpec) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddPartitionSpec(PartitionSpec) - Constructor for class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- AddPartitionSpec(UnboundPartitionSpec) - Constructor for class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- addReader(int) - Method in class org.apache.iceberg.flink.source.enumerator.StaticIcebergEnumerator
- addReferencedDataFiles(CharSequence...) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addReferencedDataFiles(Iterable<CharSequence>) - Method in class org.apache.iceberg.io.WriteResult.Builder
- addRemoved(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addRemoved(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addRequiredColumn(String, String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addRequiredColumn(String, Type, String) - Method in interface org.apache.iceberg.UpdateSchema
-
Add a new required top-level column.
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.ClusteredDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.FanoutDataWriter
- addResult(DataWriteResult) - Method in class org.apache.iceberg.io.RollingDataWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.FanoutPositionOnlyDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- addResult(DeleteWriteResult) - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- addRows(DataFile) - Method in interface org.apache.iceberg.RowDelta
-
Add a DataFile to the table.
- addSchema(Schema) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
-
Adds an Iceberg schema that can be used to decode buffers.
- addSchema(Schema) - Method in class org.apache.iceberg.view.ViewMetadata.Builder
- addSchema(Schema, int) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSchema(Schema, int) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSchema
- addScope(String) - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- addScopes(List<String>) - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- addsDataFiles() - Method in class org.apache.iceberg.StreamingDelete
- addsDeleteFiles() - Method in class org.apache.iceberg.StreamingDelete
- addSnapshot(Snapshot) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSnapshot(Snapshot) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSnapshot
- addSortOrder(SortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- addSortOrder(UnboundSortOrder) - Method in class org.apache.iceberg.TableMetadata.Builder
- AddSortOrder(SortOrder) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSortOrder
- AddSortOrder(UnboundSortOrder) - Constructor for class org.apache.iceberg.MetadataUpdate.AddSortOrder
- addSplitsBack(List<IcebergSourceSplit>, int) - Method in class org.apache.iceberg.flink.source.enumerator.StaticIcebergEnumerator
- addUpdated(String) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addUpdated(Collection<String>) - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- addValue(double) - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- addValue(float) - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- addVersion(ViewVersion) - Method in class org.apache.iceberg.view.ViewMetadata.Builder
- AddViewVersion(ViewVersion) - Constructor for class org.apache.iceberg.MetadataUpdate.AddViewVersion
- ADJUST_TO_UTC_PROP - Static variable in class org.apache.iceberg.avro.AvroSchemaUtil
- adjustSplitSize(long, int, long) - Static method in class org.apache.iceberg.util.TableScanUtil
- ADLS_CONNECTION_STRING_PREFIX - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLS_READ_BLOCK_SIZE - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLS_SAS_TOKEN_PREFIX - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLS_SHARED_KEY_ACCOUNT_KEY - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLS_SHARED_KEY_ACCOUNT_NAME - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLS_WRITE_BLOCK_SIZE - Static variable in class org.apache.iceberg.azure.AzureProperties
- ADLSFileIO - Class in org.apache.iceberg.azure.adlsv2
-
FileIO implementation backed by Azure Data Lake Storage Gen2.
- ADLSFileIO() - Constructor for class org.apache.iceberg.azure.adlsv2.ADLSFileIO
-
No-arg constructor to load the FileIO dynamically.
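A sketch of that dynamic loading path, assuming CatalogUtil.loadFileIO(String, Map, Object) is the reflective entry point and that passing a null Hadoop configuration is acceptable:

import java.util.Map;
import org.apache.iceberg.CatalogUtil;
import org.apache.iceberg.io.FileIO;

class DynamicFileIOSketch {
  // Instantiates ADLSFileIO through its no-arg constructor and initializes it with the given properties.
  static FileIO loadAdlsFileIO(Map<String, String> properties) {
    return CatalogUtil.loadFileIO("org.apache.iceberg.azure.adlsv2.ADLSFileIO", properties, null);
  }
}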
- adlsReadBlockSize() - Method in class org.apache.iceberg.azure.AzureProperties
- adlsWriteBlockSize() - Method in class org.apache.iceberg.azure.AzureProperties
- advance() - Method in class org.apache.iceberg.parquet.BaseColumnIterator
- advanceNextPageCount - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- ADVISORY_PARTITION_SIZE - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- ADVISORY_PARTITION_SIZE - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- advisoryPartitionSize() - Method in class org.apache.iceberg.spark.SparkWriteRequirements
- AES_CTR - Enum constant in enum class org.apache.iceberg.encryption.EncryptionAlgorithm
-
Counter mode (CTR) allows fast encryption with high throughput.
- AES_GCM - Enum constant in enum class org.apache.iceberg.encryption.EncryptionAlgorithm
-
Galois/Counter mode (GCM) combines CTR with the new Galois mode of authentication.
- AES_GCM_CTR - Enum constant in enum class org.apache.iceberg.encryption.EncryptionAlgorithm
-
A combination of GCM and CTR that can be used for file types like Parquet, so that all modules except pages are encrypted by GCM to ensure integrity, and CTR is used for efficient encryption of bulk data.
- AesGcmDecryptor(byte[]) - Constructor for class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- AesGcmEncryptor(byte[]) - Constructor for class org.apache.iceberg.encryption.Ciphers.AesGcmEncryptor
- AesGcmInputFile - Class in org.apache.iceberg.encryption
- AesGcmInputFile(InputFile, byte[], byte[]) - Constructor for class org.apache.iceberg.encryption.AesGcmInputFile
- AesGcmInputStream - Class in org.apache.iceberg.encryption
- AesGcmOutputFile - Class in org.apache.iceberg.encryption
- AesGcmOutputFile(OutputFile, byte[], byte[]) - Constructor for class org.apache.iceberg.encryption.AesGcmOutputFile
- AesGcmOutputStream - Class in org.apache.iceberg.encryption
- after(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- after(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- afterElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- afterField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- afterMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- afterMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- afterRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- afterValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- afterValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- aggregate(BoundAggregate<T, C>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- aggregate(UnboundAggregate<T>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- Aggregate<C extends Term> - Class in org.apache.iceberg.expressions
-
The aggregate functions that can be pushed and evaluated in Iceberg.
- AGGREGATE_PUSH_DOWN_ENABLED - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- AGGREGATE_PUSH_DOWN_ENABLED - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- AGGREGATE_PUSH_DOWN_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.FanoutDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.FanoutPositionOnlyDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingDataWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingEqualityDeleteWriter
- aggregatedResult() - Method in class org.apache.iceberg.io.RollingPositionDeleteWriter
- AggregatedStatisticsSerializer - Class in org.apache.iceberg.flink.sink.shuffle
- AggregatedStatisticsSerializer.AggregatedStatisticsSerializerSnapshot - Class in org.apache.iceberg.flink.sink.shuffle
- AggregatedStatisticsSerializerSnapshot() - Constructor for class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer.AggregatedStatisticsSerializerSnapshot
-
Constructor for read instantiation.
- AggregatedStatisticsSerializerSnapshot(AggregatedStatisticsSerializer) - Constructor for class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer.AggregatedStatisticsSerializerSnapshot
- AggregateEvaluator - Class in org.apache.iceberg.expressions
-
A class for evaluating aggregates.
- aggregatePushDownEnabled() - Method in class org.apache.iceberg.spark.SparkReadConf
- aggregates() - Method in class org.apache.iceberg.expressions.AggregateEvaluator
- aggregateTaskMetrics(long[]) - Method in class org.apache.iceberg.spark.source.metrics.NumDeletes
- aggregateTaskMetrics(long[]) - Method in class org.apache.iceberg.spark.source.metrics.NumSplits
- aliasToId(String) - Method in class org.apache.iceberg.Schema
-
Returns the column id for the given column alias.
- AliyunClientFactories - Class in org.apache.iceberg.aliyun
- AliyunClientFactory - Interface in org.apache.iceberg.aliyun
- aliyunProperties() - Method in interface org.apache.iceberg.aliyun.AliyunClientFactory
-
Returns an initialized AliyunProperties.
- AliyunProperties - Class in org.apache.iceberg.aliyun
- AliyunProperties() - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- AliyunProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aliyun.AliyunProperties
- ALL_DATA_FILES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- ALL_DELETE_FILES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- ALL_ENTRIES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- ALL_FILES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- ALL_MANIFESTS - Enum constant in enum class org.apache.iceberg.MetadataTableType
- allAggregatorsValid() - Method in class org.apache.iceberg.expressions.AggregateEvaluator
- AllDataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid data files as rows.
- AllDataFilesTable.AllDataFilesTableScan - Class in org.apache.iceberg
- AllDeleteFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes its valid delete files as rows.
- AllDeleteFilesTable.AllDeleteFilesTableScan - Class in org.apache.iceberg
- AllEntriesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's manifest entries as rows, for both delete and data files.
- AllFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes its valid files as rows.
- AllFilesTable.AllFilesTableScan - Class in org.apache.iceberg
- allManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return all ManifestFile instances for either data or delete manifests in this snapshot.
- AllManifestsTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's valid manifest files as rows.
- AllManifestsTable.AllManifestsTableScan - Class in org.apache.iceberg
- allowIncompatibleChanges() - Method in interface org.apache.iceberg.UpdateSchema
-
Allow incompatible changes to the schema.
- allReachableOtherMetadataFileDS(Table) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- AlreadyExistsException - Exception in org.apache.iceberg.exceptions
-
Exception raised when attempting to create a table that already exists.
- AlreadyExistsException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- AlreadyExistsException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.AlreadyExistsException
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ALTER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceBranchContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceTagContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropBranchContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropPartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropTagContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetIdentifierFieldsContext
- ALTER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.SetWriteDistributionAndOrderingContext
- alterDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterNamespace(String[], NamespaceChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionColumnStatistics(ObjectPath, CatalogPartitionSpec, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterPartitionStatistics(ObjectPath, CatalogPartitionSpec, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
-
This alterTable API only supports altering table properties.
- alterTable(ObjectPath, CatalogBaseTable, List<TableChange>, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTable(IMetaStoreClient, String, String, Table) - Static method in class org.apache.iceberg.hive.MetastoreUtil
-
Calls alter_table method using the metastore client.
- alterTable(IMetaStoreClient, String, String, Table, Map<String, String>) - Static method in class org.apache.iceberg.hive.MetastoreUtil
-
Calls alter_table method using the metastore client.
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkCachedTableCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alterTable(Identifier, TableChange...) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- alterTableColumnStatistics(ObjectPath, CatalogColumnStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterTableStatistics(ObjectPath, CatalogTableStatistics, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- alterView(Identifier, ViewChange...) - Method in class org.apache.iceberg.spark.SparkCatalog
- alwaysFalse() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysFalse() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.CustomOrderExpressionVisitor
- alwaysFalse() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- alwaysNull() - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a Transform that always produces null.
- alwaysNull(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- alwaysNull(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysNull(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- alwaysTrue() - Static method in class org.apache.iceberg.expressions.Expressions
- alwaysTrue() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.CustomOrderExpressionVisitor
- alwaysTrue() - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- ancestorIds(Snapshot, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorIdsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsBetween(long, Long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsBetween(Table, long, Long) - Static method in class org.apache.iceberg.util.SnapshotUtil
- ancestorsOf(long, Function<Long, Snapshot>) - Static method in class org.apache.iceberg.util.SnapshotUtil
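A sketch of walking snapshot ancestry with ancestorsOf, using the table's own snapshot lookup as the Function<Long, Snapshot> argument; assumes the table already has a current snapshot:

import org.apache.iceberg.Snapshot;
import org.apache.iceberg.Table;
import org.apache.iceberg.util.SnapshotUtil;

class AncestrySketch {
  // Prints the current snapshot and its ancestors, newest first.
  static void printAncestors(Table table) {
    long currentId = table.currentSnapshot().snapshotId();
    for (Snapshot ancestor : SnapshotUtil.ancestorsOf(currentId, table::snapshot)) {
      System.out.println(ancestor.snapshotId() + " @ " + ancestor.timestampMillis());
    }
  }
}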
- AncestorsOfProcedure - Class in org.apache.iceberg.spark.procedures
- and(Supplier<R>, Supplier<R>) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.CustomOrderExpressionVisitor
- and(Expression, Expression) - Static method in class org.apache.iceberg.expressions.Expressions
- and(Expression, Expression, Expression...) - Static method in class org.apache.iceberg.expressions.Expressions
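A small sketch of composing predicates with and(); the column names and values are placeholders:

import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;

class FilterSketch {
  // Builds a filter equivalent to: category = 'books' AND price > 10.
  static Expression booksOverTen() {
    return Expressions.and(
        Expressions.equal("category", "books"),
        Expressions.greaterThan("price", 10));
  }
}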
- and(R, R) - Method in class org.apache.iceberg.expressions.ExpressionVisitors.ExpressionVisitor
- And - Class in org.apache.iceberg.expressions
- AND - Enum constant in enum class org.apache.iceberg.expressions.Expression.Operation
- APACHE_CONNECTION_ACQUISITION_TIMEOUT_MS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the connection acquisition timeout in milliseconds for ApacheHttpClient.Builder.
- APACHE_CONNECTION_MAX_IDLE_TIME_MS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the connection max idle time in milliseconds for ApacheHttpClient.Builder.
- APACHE_CONNECTION_TIME_TO_LIVE_MS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the connection time to live in milliseconds for ApacheHttpClient.Builder.
- APACHE_CONNECTION_TIMEOUT_MS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the connection timeout in milliseconds for ApacheHttpClient.Builder.
- APACHE_DATASKETCHES_THETA_V1 - Static variable in class org.apache.iceberg.puffin.StandardBlobTypes
-
A serialized form of a "compact" Theta sketch produced by the Apache DataSketches library.
- APACHE_EXPECT_CONTINUE_ENABLED - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure whether to enable the expect continue setting for ApacheHttpClient.Builder.
- APACHE_MAX_CONNECTIONS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the maximum number of connections for ApacheHttpClient.Builder.
- APACHE_SOCKET_TIMEOUT_MS - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure the socket timeout in milliseconds for ApacheHttpClient.Builder.
- APACHE_TCP_KEEP_ALIVE_ENABLED - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure whether to enable the TCP keep-alive setting for ApacheHttpClient.Builder.
- APACHE_USE_IDLE_CONNECTION_REAPER_ENABLED - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
Used to configure whether to use the idle connection reaper for ApacheHttpClient.Builder.
- APP_ID - Static variable in class org.apache.iceberg.CatalogProperties
- append() - Method in class org.apache.iceberg.flink.sink.FlinkSink.Builder
-
Append the Iceberg sink operators to write records to an Iceberg table.
- APPEND - Static variable in class org.apache.iceberg.DataOperations
-
New data is appended to the table and no data is removed or deleted.
- appendFile(DataFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a DataFile to the table.
- AppendFiles - Interface in org.apache.iceberg
-
API for appending new files in a table.
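A sketch of the usual append flow, assuming Table.newAppend() returns an AppendFiles and the DataFile was produced elsewhere:

import org.apache.iceberg.DataFile;
import org.apache.iceberg.Table;

class AppendSketch {
  // Appends one data file in a new snapshot; nothing is removed or deleted.
  static void appendOne(Table table, DataFile dataFile) {
    table.newAppend()
        .appendFile(dataFile)
        .commit();
  }
}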
- appendManifest(ManifestFile) - Method in interface org.apache.iceberg.AppendFiles
-
Append a ManifestFile to the table.
- appendsAfter(long) - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- appendsAfter(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsAfter(long) - Method in class org.apache.iceberg.DataTableScan
- appendsAfter(long) - Method in interface org.apache.iceberg.TableScan
-
Deprecated.
- appendsBetween(long, long) - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- appendsBetween(long, long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- appendsBetween(long, long) - Method in class org.apache.iceberg.DataTableScan
- appendsBetween(long, long) - Method in interface org.apache.iceberg.TableScan
-
Deprecated. Since 1.0.0, will be removed in 2.0.0; use Table.newIncrementalAppendScan() instead.
- apply() - Method in class org.apache.iceberg.BaseReplaceSortOrder
- apply() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and return the uncommitted changes for validation.
- apply() - Method in class org.apache.iceberg.SetLocation
- apply() - Method in class org.apache.iceberg.SetPartitionStatistics
- apply() - Method in class org.apache.iceberg.SetStatistics
- apply() - Method in class org.apache.iceberg.SnapshotManager
- apply() - Method in class org.apache.iceberg.BaseRewriteManifests
- apply(int, int) - Static method in class org.apache.iceberg.spark.functions.BucketFunction.BucketBase
- apply(RowData) - Method in class org.apache.iceberg.flink.source.RowDataToAvroGenericRecordConverter
- apply(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- apply(TableMetadata, Snapshot) - Method in class org.apache.iceberg.BaseReplacePartitions
- apply(TableMetadata, Snapshot) - Method in class org.apache.iceberg.BaseRewriteManifests
- apply(TableMetadata, Snapshot) - Method in class org.apache.iceberg.StreamingDelete
- apply(S) - Method in interface org.apache.iceberg.transforms.Transform
-
Deprecated. Use Transform.bind(Type) instead; will be removed in 2.0.0.
- apply(S) - Method in class org.apache.iceberg.transforms.UnknownTransform
- applyAssumeRoleConfigurations(T) - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- applyClientConfiguration(String, DataLakeFileSystemClientBuilder) - Method in class org.apache.iceberg.azure.AzureProperties
- applyClientCredentialConfigurations(T) - Method in class org.apache.iceberg.aws.AwsClientProperties
-
Configure the credential provider for AWS clients.
- applyClientRegionConfiguration(T) - Method in class org.apache.iceberg.aws.AwsClientProperties
-
Configure a client AWS region.
- applyCredentialConfigurations(AwsClientProperties, T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- applyDynamoDbEndpointConfigurations(T) - Method in class org.apache.iceberg.aws.AwsProperties
-
Override the endpoint for a DynamoDB client.
- applyEndpointConfigurations(T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Override the endpoint for an S3 client.
- applyFilters(List<ResolvedExpression>) - Method in class org.apache.iceberg.flink.source.IcebergTableSource
- applyGlueEndpointConfigurations(T) - Method in class org.apache.iceberg.aws.AwsProperties
-
Override the endpoint for a Glue client.
- applyHttpClientConfigurations(T) - Method in class org.apache.iceberg.aws.HttpClientProperties
-
Configure the httpClient for a client according to the HttpClientType.
- applyLimit(long) - Method in class org.apache.iceberg.flink.source.IcebergTableSource
- applyNameMapping(Schema, NameMapping) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- applyNameMapping(MessageType, NameMapping) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- ApplyNameMapping - Class in org.apache.iceberg.avro
-
An Avro Schema visitor to apply a name mapping to add Iceberg field IDs.
- ApplyNameMapping(NameMapping) - Constructor for class org.apache.iceberg.avro.ApplyNameMapping
- applyOverwrite(boolean) - Method in class org.apache.iceberg.flink.IcebergTableSink
- applyProjection(int[][]) - Method in class org.apache.iceberg.flink.source.IcebergTableSource
- applyPropertyChanges(UpdateProperties, List<TableChange>) - Static method in class org.apache.iceberg.flink.util.FlinkAlterTableUtil
-
Applies a list of Flink table property changes to an UpdateProperties operation.
- applyPropertyChanges(UpdateProperties, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateProperties operation.
- applyS3AccessGrantsConfigurations(T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Add the S3 Access Grants Plugin for an S3 client.
- applySchemaChanges(Map<String, String>, List<String>, Map<String, String>, Set<String>) - Static method in class org.apache.iceberg.util.PropertyUtil
- applySchemaChanges(UpdateSchema, List<TableChange>) - Static method in class org.apache.iceberg.flink.util.FlinkAlterTableUtil
-
Applies a list of Flink table changes to an UpdateSchema operation.
- applySchemaChanges(UpdateSchema, List<TableChange>) - Static method in class org.apache.iceberg.spark.Spark3Util
-
Applies a list of Spark table changes to an UpdateSchema operation.
- applyServiceConfigurations(T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Configure service settings for an S3 client.
- applySignerConfiguration(T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Configure a signer for an S3 client.
- applyStaticPartition(Map<String, String>) - Method in class org.apache.iceberg.flink.IcebergTableSink
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddPartitionSpec
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSchema
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSnapshot
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSortOrder
- applyTo(TableMetadata.Builder) - Method in interface org.apache.iceberg.MetadataUpdate
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AssignUUID
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemovePartitionStatistics
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveProperties
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveSnapshot
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveSnapshotRef
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveStatistics
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetCurrentSchema
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultPartitionSpec
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetDefaultSortOrder
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetLocation
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetPartitionStatistics
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetProperties
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetSnapshotRef
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetStatistics
- applyTo(TableMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.UpgradeFormatVersion
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddSchema
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AddViewVersion
- applyTo(ViewMetadata.Builder) - Method in interface org.apache.iceberg.MetadataUpdate
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.AssignUUID
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.RemoveProperties
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetCurrentViewVersion
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetLocation
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.SetProperties
- applyTo(ViewMetadata.Builder) - Method in class org.apache.iceberg.MetadataUpdate.UpgradeFormatVersion
- ApplyTransformContext(IcebergSqlExtensionsParser.TransformContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- applyUserAgentConfigurations(T) - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- arguments - Variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ApplyTransformContext
- array(Schema, Schema) - Method in class org.apache.iceberg.avro.ApplyNameMapping
- array(Schema, Schema) - Method in class org.apache.iceberg.avro.RemoveIds
- array(Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaVisitor
- array(ValueReader<T>) - Static method in class org.apache.iceberg.avro.ValueReaders
- array(ValueWriter<T>) - Static method in class org.apache.iceberg.avro.ValueWriters
- array(OrcValueReader<?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- array(Type, Schema, MappedFields) - Method in class org.apache.iceberg.avro.NameMappingWithAvroSchema
- array(Types.ListType, Schema, T) - Method in class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- array(P, Schema, R) - Method in class org.apache.iceberg.avro.AvroWithPartnerVisitor
- array(P, Schema, T) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- ARRAY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ARRAY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ARRAY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringArrayContext
- arrayElementType(LogicalType) - Method in class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- arrayElementType(Type) - Method in class org.apache.iceberg.avro.AvroWithTypeByStructureVisitor
- arrayElementType(DataType) - Method in class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- arrayElementType(P) - Method in class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- arrayMap(ValueReader<K>, ValueReader<V>) - Static method in class org.apache.iceberg.avro.ValueReaders
- arrayMap(ValueWriter<K>, ValueWriter<V>) - Static method in class org.apache.iceberg.avro.ValueWriters
- arrayMap(P, Schema, R, R) - Method in class org.apache.iceberg.avro.AvroWithPartnerVisitor
- ArrayUtil - Class in org.apache.iceberg.util
- ArrowAllocation - Class in org.apache.iceberg.arrow
- ArrowReader - Class in org.apache.iceberg.arrow.vectorized
-
Vectorized reader that returns an iterator of
ColumnarBatch.
- ArrowReader(TableScan, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowReader
-
Create a new instance of the reader.
- ArrowSchemaUtil - Class in org.apache.iceberg.arrow
- ArrowVectorAccessor<DecimalT, Utf8StringT, ArrayT, ChildVectorT extends AutoCloseable> - Class in org.apache.iceberg.arrow.vectorized
- ArrowVectorAccessor(ValueVector) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessor(ValueVector, ChildVectorT[]) - Constructor for class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- ArrowVectorAccessors - Class in org.apache.iceberg.spark.data.vectorized
- as(String) - Method in interface org.apache.iceberg.actions.SnapshotTable
-
Sets the table identifier for the newly created Iceberg table.
- as(String) - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- as(TableIdentifier) - Method in interface org.apache.iceberg.delta.SnapshotDeltaLakeTable
-
Sets the identifier of the newly created Iceberg table.
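A minimal sketch of the as(...) naming step above, using the Spark snapshot action as an example; the catalog and table identifiers are hypothetical:

import org.apache.iceberg.actions.SnapshotTable;
import org.apache.iceberg.spark.actions.SparkActions;
import org.apache.spark.sql.SparkSession;

public class SnapshotTableExample {
  static void snapshot(SparkSession spark) {
    SnapshotTable.Result result = SparkActions.get(spark)
        .snapshotTable("spark_catalog.db.source")        // hypothetical source table
        .as("spark_catalog.db.source_iceberg_snapshot")   // identifier for the new Iceberg table
        .execute();
    System.out.println("Imported data files: " + result.importedDataFilesCount());
  }
}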
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- AS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.AddPartitionFieldContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BranchOptionsContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ReplacePartitionFieldContext
- AS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TagOptionsContext
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.flink.FlinkReadOptions
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- AS_OF_TIMESTAMP - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- asc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with nulls first.
- asc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, ascending with the given null order.
- asc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with nulls first.
- asc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- asc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, ascending with the given null order.
- asc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, ascending with the given null order.
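A minimal sketch of building a sort order with the asc(...) variants above; the example schema and column names are assumptions:

import org.apache.iceberg.NullOrder;
import org.apache.iceberg.Schema;
import org.apache.iceberg.SortOrder;
import org.apache.iceberg.types.Types;

public class SortOrderExample {
  static SortOrder idThenTs() {
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "ts", Types.TimestampType.withZone()));

    // asc(String) defaults to nulls first; an explicit NullOrder can also be supplied.
    return SortOrder.builderFor(schema)
        .asc("id")
        .asc("ts", NullOrder.NULLS_LAST)
        .build();
  }
}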
- ASC - Enum constant in enum class org.apache.iceberg.SortDirection
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- ASC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- ASC() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.OrderFieldContext
- asCatalog(SessionCatalog.SessionContext) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog
- asCombinedScanTask() - Method in interface org.apache.iceberg.CombinedScanTask
- asCombinedScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to
CombinedScanTask
if it is one
- asDataTask() - Method in interface org.apache.iceberg.DataTask
- asDataTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to
DataTask
if it is one
- asFile() - Method in class org.apache.iceberg.spark.SparkContentFile
- asFile() - Method in class org.apache.iceberg.spark.SparkDataFile
- asFile() - Method in class org.apache.iceberg.spark.SparkDeleteFile
- asFileScanTask() - Method in interface org.apache.iceberg.FileScanTask
- asFileScanTask() - Method in interface org.apache.iceberg.ScanTask
-
Returns this cast to
FileScanTask
if it is one
- asListType() - Method in interface org.apache.iceberg.types.Type
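A minimal sketch of the asFileScanTask()/asDataTask() style casts above, assuming the tasks come from a planned scan; the isFileScanTask() check guards the cast:

import org.apache.iceberg.FileScanTask;
import org.apache.iceberg.ScanTask;
import org.apache.iceberg.ScanTaskGroup;

public class ScanTaskExample {
  static void printDataFiles(ScanTaskGroup<ScanTask> taskGroup) {
    for (ScanTask task : taskGroup.tasks()) {
      if (task.isFileScanTask()) {                    // check before casting
        FileScanTask fileTask = task.asFileScanTask();
        System.out.println(fileTask.file().path());
      }
    }
  }
}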
- asListType() - Method in class org.apache.iceberg.types.Types.ListType
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundLiteralPredicate
- asLiteralPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asMappedFields() - Method in class org.apache.iceberg.mapping.NameMapping
- asMapType() - Method in interface org.apache.iceberg.types.Type
- asMapType() - Method in class org.apache.iceberg.types.Types.MapType
- asNestedType() - Method in interface org.apache.iceberg.types.Type
- asNestedType() - Method in class org.apache.iceberg.types.Type.NestedType
- asOfTime(long) - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- asOfTime(long) - Method in interface org.apache.iceberg.BatchScan
-
Create a new
BatchScan
from this scan's configuration that will use the most recent snapshot as of the given time in milliseconds on the branch in the scan or main if no branch is set.
- asOfTime(long) - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- asOfTime(long) - Method in class org.apache.iceberg.FindFiles.Builder
-
Base results on files in the snapshot that was current as of a timestamp.
- asOfTime(long) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- asOfTime(long) - Method in class org.apache.iceberg.SnapshotScan
- asOfTime(long) - Method in interface org.apache.iceberg.TableScan
-
Create a new
TableScan
from this scan's configuration that will use the most recent snapshot as of the given time in milliseconds on the branch in the scan or main if no branch is set.
- asOfTimestamp() - Method in class org.apache.iceberg.flink.FlinkReadConf
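A minimal sketch of a time-travel scan with TableScan.asOfTime(long) as described above; the table and the ten-minute offset are assumptions:

import java.io.IOException;
import java.util.concurrent.TimeUnit;
import org.apache.iceberg.FileScanTask;
import org.apache.iceberg.Table;
import org.apache.iceberg.TableScan;
import org.apache.iceberg.io.CloseableIterable;

public class AsOfTimeExample {
  static void planAsOfTenMinutesAgo(Table table) throws IOException {
    long tenMinutesAgo = System.currentTimeMillis() - TimeUnit.MINUTES.toMillis(10);
    TableScan scan = table.newScan().asOfTime(tenMinutesAgo);   // snapshot current at that time
    try (CloseableIterable<FileScanTask> tasks = scan.planFiles()) {
      tasks.forEach(task -> System.out.println(task.file().path()));
    }
  }
}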
- asOfTimestamp() - Method in class org.apache.iceberg.flink.source.ScanContext
- asOfTimestamp() - Method in class org.apache.iceberg.spark.SparkReadConf
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- asOfTimestamp(Long) - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- asOptional() - Method in class org.apache.iceberg.types.Types.NestedField
- asPrimitiveType() - Method in interface org.apache.iceberg.types.Type
- asPrimitiveType() - Method in class org.apache.iceberg.types.Type.PrimitiveType
- asRequired() - Method in class org.apache.iceberg.types.Types.NestedField
- asResult() - Method in class org.apache.iceberg.actions.RewriteFileGroup
- asResult() - Method in class org.apache.iceberg.actions.RewritePositionDeletesGroup
- asSchema() - Method in class org.apache.iceberg.types.Types.StructType
-
Returns a schema containing the columns of this struct type.
- AssertCurrentSchemaID(int) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertCurrentSchemaID
- AssertDefaultSortOrderID(int) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertDefaultSortOrderID
- AssertDefaultSpecID(int) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertDefaultSpecID
- AssertLastAssignedFieldId(int) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertLastAssignedFieldId
- AssertLastAssignedPartitionId(int) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertLastAssignedPartitionId
- AssertRefSnapshotID(String, Long) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertRefSnapshotID
- AssertTableDoesNotExist() - Constructor for class org.apache.iceberg.UpdateRequirement.AssertTableDoesNotExist
- AssertTableUUID(String) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertTableUUID
- AssertViewUUID(String) - Constructor for class org.apache.iceberg.UpdateRequirement.AssertViewUUID
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asSetPredicate() - Method in class org.apache.iceberg.expressions.BoundSetPredicate
- ASSIGNED - Enum constant in enum class org.apache.iceberg.flink.source.split.IcebergSourceSplitStatus
- assignerFactory(SplitAssignerFactory) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- assignFreshIds(int, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the
nextId function
for all fields in a schema.
- assignFreshIds(Schema, Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns ids to match a given schema, and fresh ids from the
nextId function
for all other fields.
- assignFreshIds(Schema, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the
nextId function
for all fields in a schema.
- assignFreshIds(Type, TypeUtil.NextID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the
nextId function
for all fields in a type.
- assignIds(Type, TypeUtil.GetID) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns fresh ids from the
getId function
for all fields in a type.
- assignIncreasingFreshIds(Schema) - Static method in class org.apache.iceberg.types.TypeUtil
-
Assigns strictly increasing fresh ids for all fields in a schema, starting from 1.
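A minimal sketch of the TypeUtil id-assignment helpers above; the input schema is an assumption, and the NextID function is supplied here by an AtomicInteger:

import java.util.concurrent.atomic.AtomicInteger;
import org.apache.iceberg.Schema;
import org.apache.iceberg.types.TypeUtil;

public class FreshIdsExample {
  static Schema renumber(Schema schema) {
    // Convenience form: IDs become 1, 2, 3, ... in field order.
    Schema increasing = TypeUtil.assignIncreasingFreshIds(schema);

    // General form: IDs come from a caller-supplied NextID function.
    AtomicInteger counter = new AtomicInteger(100);
    return TypeUtil.assignFreshIds(increasing, counter::incrementAndGet);
  }
}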
- assignments() - Method in class org.apache.iceberg.connect.events.DataComplete
- assignUUID() - Method in class org.apache.iceberg.TableMetadata.Builder
- assignUUID(String) - Method in class org.apache.iceberg.TableMetadata.Builder
- assignUUID(String) - Method in class org.apache.iceberg.view.ViewMetadata.Builder
- AssignUUID(String) - Constructor for class org.apache.iceberg.MetadataUpdate.AssignUUID
- asStatic() - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this field as a StaticField.
- asStatic() - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a StaticMethod.
- asStruct() - Method in class org.apache.iceberg.Schema
-
Returns the underlying
struct type
for this schema.
- asStructLike(Record) - Method in class org.apache.iceberg.data.GenericDeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.data.DeleteFilter
- asStructLike(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the data as a
StructLike.
- asStructLikeKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Wrap the passed in key of a row as a
StructLike
- asStructType() - Method in interface org.apache.iceberg.types.Type
- asStructType() - Method in class org.apache.iceberg.types.Types.StructType
- AssumeRoleAwsClientFactory - Class in org.apache.iceberg.aws
- AssumeRoleAwsClientFactory() - Constructor for class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- asSummaryString() - Method in class org.apache.iceberg.flink.IcebergTableSink
- asSummaryString() - Method in class org.apache.iceberg.flink.source.IcebergTableSource
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundPredicate
- asUnaryPredicate() - Method in class org.apache.iceberg.expressions.BoundUnaryPredicate
- asViewCatalog(SessionCatalog.SessionContext) - Method in class org.apache.iceberg.catalog.BaseViewSessionCatalog
- attempts() - Method in class org.apache.iceberg.metrics.CommitMetrics
- attempts() - Method in interface org.apache.iceberg.metrics.CommitMetricsResult
- ATTEMPTS - Static variable in class org.apache.iceberg.metrics.CommitMetrics
- AUDIENCE - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
-
Optional param audience for OAuth2.
- AUTH_SESSION_TIMEOUT_MS - Static variable in class org.apache.iceberg.CatalogProperties
- AUTH_SESSION_TIMEOUT_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- AuthConfig - Interface in org.apache.iceberg.rest.auth
-
The purpose of this class is to hold configuration options for
OAuth2Util.AuthSession.
- authHeaders(String) - Static method in class org.apache.iceberg.rest.auth.OAuth2Util
- AuthSession(Map<String, String>, String, String, String, String, String) - Constructor for class org.apache.iceberg.rest.auth.OAuth2Util.AuthSession
-
Deprecated. Since 1.6.0, will be removed in 1.7.0
- AuthSession(Map<String, String>, AuthConfig) - Constructor for class org.apache.iceberg.rest.auth.OAuth2Util.AuthSession
- Auto - Enum constant in enum class org.apache.iceberg.flink.sink.shuffle.StatisticsType
-
Initially use Map for statistics tracking.
- AUTO - Enum constant in enum class org.apache.iceberg.PlanningMode
- autoCreateEnabled() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- autoCreateProps() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- available() - Method in class org.apache.iceberg.encryption.AesGcmInputStream
- AVAILABLE - Enum constant in enum class org.apache.iceberg.flink.source.assigner.GetSplitResult.Status
- Avro - Class in org.apache.iceberg.avro
- AVRO - Enum constant in enum class org.apache.iceberg.FileFormat
- AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- AVRO_COMPRESSION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- Avro.DataWriteBuilder - Class in org.apache.iceberg.avro
- Avro.DeleteWriteBuilder - Class in org.apache.iceberg.avro
- Avro.ReadBuilder - Class in org.apache.iceberg.avro
- Avro.WriteBuilder - Class in org.apache.iceberg.avro
- avroCompressionCodec() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- avroCompressionLevel() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- AvroEncoderUtil - Class in org.apache.iceberg.avro
- AvroGenericRecordFileScanTaskReader - Class in org.apache.iceberg.flink.source
- AvroGenericRecordFileScanTaskReader(RowDataFileScanTaskReader, RowDataToAvroGenericRecordConverter) - Constructor for class org.apache.iceberg.flink.source.AvroGenericRecordFileScanTaskReader
- AvroGenericRecordReaderFunction - Class in org.apache.iceberg.flink.source.reader
-
Read Iceberg rows as
GenericRecord.
- AvroGenericRecordReaderFunction(String, ReadableConfig, Schema, Schema, String, boolean, FileIO, EncryptionManager, List<Expression>) - Constructor for class org.apache.iceberg.flink.source.reader.AvroGenericRecordReaderFunction
- AvroGenericRecordToRowDataMapper - Class in org.apache.iceberg.flink.sink
-
This util class converts Avro GenericRecord to Flink RowData.
- AvroIterable<D> - Class in org.apache.iceberg.avro
- AvroMetrics - Class in org.apache.iceberg.avro
- AvroSchemaUtil - Class in org.apache.iceberg.avro
- AvroSchemaVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaVisitor
- AvroSchemaWithTypeVisitor<T> - Class in org.apache.iceberg.avro
- AvroSchemaWithTypeVisitor() - Constructor for class org.apache.iceberg.avro.AvroSchemaWithTypeVisitor
- AvroUtil - Class in org.apache.iceberg.connect.events
-
Class for Avro-related utility methods.
- AvroWithFlinkSchemaVisitor<T> - Class in org.apache.iceberg.flink.data
- AvroWithFlinkSchemaVisitor() - Constructor for class org.apache.iceberg.flink.data.AvroWithFlinkSchemaVisitor
- AvroWithPartnerByStructureVisitor<P, T> - Class in org.apache.iceberg.avro
-
An abstract Avro schema visitor with a partner type.
- AvroWithPartnerByStructureVisitor() - Constructor for class org.apache.iceberg.avro.AvroWithPartnerByStructureVisitor
- AvroWithPartnerVisitor<P, R> - Class in org.apache.iceberg.avro
- AvroWithPartnerVisitor() - Constructor for class org.apache.iceberg.avro.AvroWithPartnerVisitor
- AvroWithPartnerVisitor.PartnerAccessors<P> - Interface in org.apache.iceberg.avro
- AvroWithSparkSchemaVisitor<T> - Class in org.apache.iceberg.spark.data
- AvroWithSparkSchemaVisitor() - Constructor for class org.apache.iceberg.spark.data.AvroWithSparkSchemaVisitor
- AvroWithTypeByStructureVisitor<T> - Class in org.apache.iceberg.avro
- AvroWithTypeByStructureVisitor() - Constructor for class org.apache.iceberg.avro.AvroWithTypeByStructureVisitor
- AwsClientFactories - Class in org.apache.iceberg.aws
- AwsClientFactory - Interface in org.apache.iceberg.aws
-
Interface to customize AWS clients used by Iceberg.
- AwsClientProperties - Class in org.apache.iceberg.aws
- AwsClientProperties() - Constructor for class org.apache.iceberg.aws.AwsClientProperties
- AwsClientProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aws.AwsClientProperties
- awsProperties() - Method in class org.apache.iceberg.aws.AssumeRoleAwsClientFactory
- AwsProperties - Class in org.apache.iceberg.aws
- AwsProperties() - Constructor for class org.apache.iceberg.aws.AwsProperties
- AwsProperties(Map<String, String>) - Constructor for class org.apache.iceberg.aws.AwsProperties
- AzureProperties - Class in org.apache.iceberg.azure
- AzureProperties() - Constructor for class org.apache.iceberg.azure.AzureProperties
- AzureProperties(Map<String, String>) - Constructor for class org.apache.iceberg.azure.AzureProperties
B
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BACKQUOTED_IDENTIFIER - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BACKQUOTED_IDENTIFIER() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.QuotedIdentifierContext
- backupTableName(String) - Method in interface org.apache.iceberg.actions.MigrateTable
-
Sets a table name for the backup of the original table.
- backupTableName(String) - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- BadRequestException - Exception in org.apache.iceberg.exceptions
-
Exception thrown on HTTP 400 - Bad Request
- BadRequestException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.BadRequestException
- BadRequestException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.BadRequestException
- BASE_NAMESPACE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- BaseBatchReader<T> - Class in org.apache.iceberg.arrow.vectorized
-
A base BatchReader class that contains common functionality
- BaseBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- BaseColumnIterator - Class in org.apache.iceberg.parquet
- BaseColumnIterator(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.BaseColumnIterator
- BaseCombinedScanTask - Class in org.apache.iceberg
- BaseCombinedScanTask(List<FileScanTask>) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseCombinedScanTask(FileScanTask...) - Constructor for class org.apache.iceberg.BaseCombinedScanTask
- BaseDeleteLoader - Class in org.apache.iceberg.data
- BaseDeleteLoader(Function<DeleteFile, InputFile>) - Constructor for class org.apache.iceberg.data.BaseDeleteLoader
- BaseDeleteLoader(Function<DeleteFile, InputFile>, ExecutorService) - Constructor for class org.apache.iceberg.data.BaseDeleteLoader
- BaseEqualityDeltaWriter(StructLike, Schema, Schema) - Constructor for class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- BaseEqualityDeltaWriter(StructLike, Schema, Schema, DeleteGranularity) - Constructor for class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- BaseFileScanTask - Class in org.apache.iceberg
- BaseFileScanTask(DataFile, DeleteFile[], String, String, ResidualEvaluator) - Constructor for class org.apache.iceberg.BaseFileScanTask
- BaseFileWriterFactory<T> - Class in org.apache.iceberg.data
-
A base writer factory to be extended by query engine integrations.
- BaseFileWriterFactory(Table, FileFormat, Schema, SortOrder, FileFormat, int[], Schema, SortOrder, Schema) - Constructor for class org.apache.iceberg.data.BaseFileWriterFactory
- BaseLockManager() - Constructor for class org.apache.iceberg.util.LockManagers.BaseLockManager
- BaseMetadataTable - Class in org.apache.iceberg
-
Base class for metadata tables.
- BaseMetadataTable(Table, String) - Constructor for class org.apache.iceberg.BaseMetadataTable
- BaseMetastoreCatalog - Class in org.apache.iceberg
- BaseMetastoreCatalog() - Constructor for class org.apache.iceberg.BaseMetastoreCatalog
- BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder - Class in org.apache.iceberg
- BaseMetastoreCatalogTableBuilder(TableIdentifier, Schema) - Constructor for class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- BaseMetastoreOperations - Class in org.apache.iceberg
- BaseMetastoreOperations() - Constructor for class org.apache.iceberg.BaseMetastoreOperations
- BaseMetastoreOperations.CommitStatus - Enum Class in org.apache.iceberg
- BaseMetastoreTableOperations - Class in org.apache.iceberg
- BaseMetastoreTableOperations() - Constructor for class org.apache.iceberg.BaseMetastoreTableOperations
- BaseMetastoreTableOperations.CommitStatus - Enum Class in org.apache.iceberg
-
Deprecated. Since 1.6.0, will be removed in 1.7.0; use
BaseMetastoreOperations.CommitStatus
instead
- BaseMetastoreViewCatalog - Class in org.apache.iceberg.view
- BaseMetastoreViewCatalog() - Constructor for class org.apache.iceberg.view.BaseMetastoreViewCatalog
- BaseMetastoreViewCatalog.BaseMetastoreViewCatalogTableBuilder - Class in org.apache.iceberg.view
-
The purpose of this class is to add view detection when replacing a table
- BaseMetastoreViewCatalog.BaseViewBuilder - Class in org.apache.iceberg.view
- BaseMetastoreViewCatalogTableBuilder(TableIdentifier, Schema) - Constructor for class org.apache.iceberg.view.BaseMetastoreViewCatalog.BaseMetastoreViewCatalogTableBuilder
- BaseOverwriteFiles - Class in org.apache.iceberg
- BaseOverwriteFiles(String, TableOperations) - Constructor for class org.apache.iceberg.BaseOverwriteFiles
- BasePageIterator - Class in org.apache.iceberg.parquet
- BasePageIterator(ColumnDescriptor, String) - Constructor for class org.apache.iceberg.parquet.BasePageIterator
- BasePageIterator.IntIterator - Class in org.apache.iceberg.parquet
- BaseParquetReaders<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetReaders() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetReaders
- BaseParquetWriter<T> - Class in org.apache.iceberg.data.parquet
- BaseParquetWriter() - Constructor for class org.apache.iceberg.data.parquet.BaseParquetWriter
- BasePositionDeltaWriter<T> - Class in org.apache.iceberg.io
- BasePositionDeltaWriter(PartitioningWriter<T, DataWriteResult>, PartitioningWriter<PositionDelete<T>, DeleteWriteResult>) - Constructor for class org.apache.iceberg.io.BasePositionDeltaWriter
- BasePositionDeltaWriter(PartitioningWriter<T, DataWriteResult>, PartitioningWriter<T, DataWriteResult>, PartitioningWriter<PositionDelete<T>, DeleteWriteResult>) - Constructor for class org.apache.iceberg.io.BasePositionDeltaWriter
- BaseReplacePartitions - Class in org.apache.iceberg
- BaseReplaceSortOrder - Class in org.apache.iceberg
- BaseRewriteDataFilesAction<ThisT> - Class in org.apache.iceberg.actions
- BaseRewriteDataFilesAction(Table) - Constructor for class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- BaseRewriteManifests - Class in org.apache.iceberg
- BaseScanTaskGroup<T extends ScanTask> - Class in org.apache.iceberg
- BaseScanTaskGroup(Collection<T>) - Constructor for class org.apache.iceberg.BaseScanTaskGroup
- BaseScanTaskGroup(StructLike, Collection<T>) - Constructor for class org.apache.iceberg.BaseScanTaskGroup
- BaseSessionCatalog - Class in org.apache.iceberg.catalog
- BaseSessionCatalog() - Constructor for class org.apache.iceberg.catalog.BaseSessionCatalog
- BaseSessionCatalog.AsCatalog - Class in org.apache.iceberg.catalog
- baseSignerUri() - Method in class org.apache.iceberg.aws.s3.signer.S3V4RestSignerClient
- BaseTable - Class in org.apache.iceberg
-
Base
Table
implementation.
- BaseTable(TableOperations, String) - Constructor for class org.apache.iceberg.BaseTable
- BaseTable(TableOperations, String, MetricsReporter) - Constructor for class org.apache.iceberg.BaseTable
- baseTableFilter(Expression) - Method in class org.apache.iceberg.PositionDeletesTable.PositionDeletesBatchScan
-
Sets a filter on the base table of this position deletes table, to use for this scan.
- BaseTaskWriter<T> - Class in org.apache.iceberg.io
- BaseTaskWriter(PartitionSpec, FileFormat, FileAppenderFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.BaseTaskWriter
- BaseTaskWriter.BaseEqualityDeltaWriter - Class in org.apache.iceberg.io
-
Base equality delta writer to write both insert records and equality-deletes.
- BaseTaskWriter.RollingEqDeleteWriter - Class in org.apache.iceberg.io
- BaseTaskWriter.RollingFileWriter - Class in org.apache.iceberg.io
- BaseTransaction - Class in org.apache.iceberg
- BaseTransaction.TransactionTable - Class in org.apache.iceberg
- BaseTransaction.TransactionTableOperations - Class in org.apache.iceberg
- BaseVectorizedParquetValuesReader - Class in org.apache.iceberg.arrow.vectorized.parquet
-
A values reader for Parquet's run-length encoded data that reads column data in batches instead of one value at a time.
- BaseVectorizedParquetValuesReader(int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseVectorizedParquetValuesReader(int, int, boolean, boolean) - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.BaseVectorizedParquetValuesReader
- BaseView - Class in org.apache.iceberg.view
- BaseView(ViewOperations, String) - Constructor for class org.apache.iceberg.view.BaseView
- BaseViewBuilder(TableIdentifier) - Constructor for class org.apache.iceberg.view.BaseMetastoreViewCatalog.BaseViewBuilder
- BaseViewOperations - Class in org.apache.iceberg.view
- BaseViewOperations() - Constructor for class org.apache.iceberg.view.BaseViewOperations
- BaseViewSessionCatalog - Class in org.apache.iceberg.catalog
- BaseViewSessionCatalog() - Constructor for class org.apache.iceberg.catalog.BaseViewSessionCatalog
- BaseViewSessionCatalog.AsViewCatalog - Class in org.apache.iceberg.catalog
- basicAuthHeaders(String) - Static method in class org.apache.iceberg.rest.auth.OAuth2Util
- batch(String, DataIterator<T>) - Method in interface org.apache.iceberg.flink.source.reader.DataIteratorBatcher
- BatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BatchReader
- BatchScan - Interface in org.apache.iceberg
-
API for configuring a batch scan.
- before(long) - Method in class org.apache.iceberg.ScanSummary.Builder
- before(String) - Method in class org.apache.iceberg.ScanSummary.Builder
- beforeElementField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeElementField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeField(String, TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.IndexParents
- beforeField(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeField(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeKeyField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeKeyField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeListElement(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeListElement(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapKey(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapKey(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.IndexByName
- beforeMapValue(Types.NestedField) - Method in class org.apache.iceberg.types.TypeUtil.SchemaVisitor
- beforeMapValue(Types.NestedField, P) - Method in class org.apache.iceberg.schema.SchemaWithPartnerVisitor
- beforeRepeatedElement(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeRepeatedKeyValue(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- beforeValueField(TypeDescription) - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- beforeValueField(Type) - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGDECIMAL_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGDECIMAL_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BigDecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigDecimalLiteralContext
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BIGINT_LITERAL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BIGINT_LITERAL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BigIntLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BigIntLiteralContext
- BINARY - Enum constant in enum class org.apache.iceberg.orc.ORCSchemaUtil.BinaryType
- BINARY - Enum constant in enum class org.apache.iceberg.types.Type.TypeID
- BinaryAsDecimalReader(ColumnDescriptor, int) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BinaryAsDecimalReader
- BinaryType() - Constructor for class org.apache.iceberg.types.Types.BinaryType
- BinaryUtil - Class in org.apache.iceberg.util
- bind(Object) - Method in class org.apache.iceberg.common.DynConstructors.Ctor
- bind(Object) - Method in class org.apache.iceberg.common.DynFields.UnboundField
-
Returns this method as a BoundMethod for the given receiver.
- bind(Object) - Method in class org.apache.iceberg.common.DynMethods.UnboundMethod
-
Returns this method as a BoundMethod for the given receiver.
- bind(Schema) - Method in class org.apache.iceberg.UnboundPartitionSpec
- bind(Schema) - Method in class org.apache.iceberg.UnboundSortOrder
- bind(Type) - Method in class org.apache.iceberg.transforms.Months
- bind(Type) - Method in interface org.apache.iceberg.transforms.Transform
-
Returns a function that applies this transform to values of the given
type.
- bind(Type) - Method in class org.apache.iceberg.transforms.UnknownTransform
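A minimal sketch of Transform.bind(Type) as described above: binding a bucket transform to a concrete type yields a function that can be applied to values (16 buckets is an arbitrary choice):

import org.apache.iceberg.transforms.Transform;
import org.apache.iceberg.transforms.Transforms;
import org.apache.iceberg.types.Types;
import org.apache.iceberg.util.SerializableFunction;

public class BindTransformExample {
  static int bucketOf(String value) {
    Transform<CharSequence, Integer> bucket = Transforms.bucket(16);
    // bind(Type) returns a function that applies the transform to string values
    SerializableFunction<CharSequence, Integer> toBucket = bucket.bind(Types.StringType.get());
    return toBucket.apply(value);
  }
}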
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.NamedReference
- bind(Types.StructType, boolean) - Method in interface org.apache.iceberg.expressions.Unbound
-
Bind this value expression to concrete types.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundAggregate
-
Bind this UnboundAggregate.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundPredicate
-
Bind this UnboundPredicate.
- bind(Types.StructType, boolean) - Method in class org.apache.iceberg.expressions.UnboundTransform
- bind(Types.StructType, Expression, boolean) - Static method in class org.apache.iceberg.expressions.Binder
-
Replaces all unbound/named references with bound references to fields in the given struct.
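A minimal sketch of Binder.bind as described above, binding an unbound predicate against a schema's struct type; the schema and the "id" column are assumptions:

import org.apache.iceberg.Schema;
import org.apache.iceberg.expressions.Binder;
import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;
import org.apache.iceberg.types.Types;

public class BinderExample {
  static Expression boundIdFilter() {
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()));

    Expression unbound = Expressions.greaterThanOrEqual("id", 100L);
    // true = case-sensitive column resolution
    return Binder.bind(schema.asStruct(), unbound, true);
  }
}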
- bind(StructType) - Method in class org.apache.iceberg.spark.functions.BucketFunction
- bind(StructType) - Method in class org.apache.iceberg.spark.functions.HoursFunction
- bind(StructType) - Method in class org.apache.iceberg.spark.functions.IcebergVersionFunction
- bind(StructType) - Method in class org.apache.iceberg.spark.functions.TruncateFunction
- Binder - Class in org.apache.iceberg.expressions
-
Rewrites
expressions
by replacing unbound named references with references to fields in a struct schema.
- binPack() - Method in interface org.apache.iceberg.actions.RewriteDataFiles
-
Choose BINPACK as a strategy for this rewrite operation
- binPack() - Method in class org.apache.iceberg.spark.actions.RewriteDataFilesSparkAction
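A minimal sketch of choosing the BINPACK strategy through the Spark action above; the Spark session, table, and target file size are assumptions:

import org.apache.iceberg.Table;
import org.apache.iceberg.actions.RewriteDataFiles;
import org.apache.iceberg.spark.actions.SparkActions;
import org.apache.spark.sql.SparkSession;

public class BinPackExample {
  static void compact(SparkSession spark, Table table) {
    RewriteDataFiles.Result result = SparkActions.get(spark)
        .rewriteDataFiles(table)
        .binPack()                                                    // bin-pack small files
        .option("target-file-size-bytes", String.valueOf(512L * 1024 * 1024))
        .execute();
    System.out.println("Rewritten data files: " + result.rewrittenDataFilesCount());
  }
}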
- BinPacking - Class in org.apache.iceberg.util
- BinPacking() - Constructor for class org.apache.iceberg.util.BinPacking
- BinPacking.ListPacker<T> - Class in org.apache.iceberg.util
- BinPacking.PackingIterable<T> - Class in org.apache.iceberg.util
- Blob - Class in org.apache.iceberg.puffin
- Blob(String, List<Integer>, long, long, ByteBuffer) - Constructor for class org.apache.iceberg.puffin.Blob
- Blob(String, List<Integer>, long, long, ByteBuffer, PuffinCompressionCodec, Map<String, String>) - Constructor for class org.apache.iceberg.puffin.Blob
- blobData() - Method in class org.apache.iceberg.puffin.Blob
- blobMetadata() - Method in class org.apache.iceberg.GenericStatisticsFile
- blobMetadata() - Method in interface org.apache.iceberg.StatisticsFile
-
List of statistics contained in the file.
- BlobMetadata - Class in org.apache.iceberg.puffin
- BlobMetadata - Interface in org.apache.iceberg
-
Metadata about a statistics or indices blob.
- BlobMetadata(String, List<Integer>, long, long, long, long, String, Map<String, String>) - Constructor for class org.apache.iceberg.puffin.BlobMetadata
- blobs() - Method in class org.apache.iceberg.puffin.FileMetadata
- blockLocations(CombinedScanTask, Configuration) - Static method in class org.apache.iceberg.hadoop.Util
- blockLocations(FileIO, ScanTaskGroup<?>) - Static method in class org.apache.iceberg.hadoop.Util
- body() - Method in interface org.apache.iceberg.aws.s3.signer.S3SignRequest
- BOOLEAN - Enum constant in enum class org.apache.iceberg.types.Type.TypeID
- booleanBatchReader() - Method in class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator
- BooleanBatchReader() - Constructor for class org.apache.iceberg.arrow.vectorized.parquet.VectorizedColumnIterator.BooleanBatchReader
- BooleanLiteralContext(IcebergSqlExtensionsParser.ConstantContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleans() - Static method in class org.apache.iceberg.avro.ValueReaders
- booleans() - Static method in class org.apache.iceberg.avro.ValueWriters
- booleans() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- booleans() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- booleans(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- BooleanType() - Constructor for class org.apache.iceberg.types.Types.BooleanType
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanLiteralContext
- booleanValue() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BooleanValueContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BooleanValueContext
- Bound<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound value expression.
- BoundAggregate<T, C> - Class in org.apache.iceberg.expressions
- BoundAggregate(Expression.Operation, BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.BoundAggregate
- BoundExpressionVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundExpressionVisitor
- BoundLiteralPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate<T> - Class in org.apache.iceberg.expressions
- BoundPredicate(Expression.Operation, BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.BoundPredicate
- BoundReference<T> - Class in org.apache.iceberg.expressions
- boundReferences(Types.StructType, List<Expression>, boolean) - Static method in class org.apache.iceberg.expressions.Binder
- BoundSetPredicate<T> - Class in org.apache.iceberg.expressions
- BoundTerm<T> - Interface in org.apache.iceberg.expressions
-
Represents a bound term.
- BoundTransform<S, T> - Class in org.apache.iceberg.expressions
-
A transform expression.
- BoundUnaryPredicate<T> - Class in org.apache.iceberg.expressions
- BoundVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.BoundVisitor
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BRACKETED_COMMENT - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- branch() - Method in class org.apache.iceberg.flink.FlinkReadConf
- branch() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- branch() - Method in class org.apache.iceberg.flink.source.ScanContext
- branch() - Method in class org.apache.iceberg.spark.source.SparkTable
- branch() - Method in class org.apache.iceberg.spark.SparkReadConf
- branch() - Method in class org.apache.iceberg.spark.SparkWriteConf
- branch(String) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- branch(String) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- BRANCH - Static variable in class org.apache.iceberg.flink.FlinkReadOptions
- BRANCH - Static variable in class org.apache.iceberg.flink.FlinkWriteOptions
- BRANCH - Static variable in class org.apache.iceberg.spark.SparkReadOptions
- BRANCH - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BRANCH - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BRANCH() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceBranchClauseContext
- BRANCH() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DropBranchContext
- BRANCH() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- branchBuilder(long) - Static method in class org.apache.iceberg.SnapshotRef
- branchOptions() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- branchOptions() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceBranchClauseContext
- BranchOptionsContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.BranchOptionsContext
- bucket() - Method in class org.apache.iceberg.aliyun.oss.OSSURI
-
Return OSS bucket name.
- bucket(int) - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a bucket
Transform
for the given number of buckets.
- bucket(int, String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int) - Static method in class org.apache.iceberg.expressions.Expressions
- bucket(String, int) - Method in class org.apache.iceberg.PartitionSpec.Builder
- bucket(String, int, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- bucket(String, int, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- bucket(String, int, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- bucket(Type, int) - Static method in class org.apache.iceberg.transforms.Transforms
-
Deprecated. Use
Transforms.bucket(int)
instead; will be removed in 2.0.0
- BucketBase() - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketBase
- BucketBinary() - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketBinary
- BucketDecimal(DataType) - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketDecimal
- BucketFunction - Class in org.apache.iceberg.spark.functions
-
A Spark function implementation for the Iceberg bucket transform.
- BucketFunction() - Constructor for class org.apache.iceberg.spark.functions.BucketFunction
- BucketFunction.BucketBase - Class in org.apache.iceberg.spark.functions
- BucketFunction.BucketBinary - Class in org.apache.iceberg.spark.functions
- BucketFunction.BucketDecimal - Class in org.apache.iceberg.spark.functions
- BucketFunction.BucketInt - Class in org.apache.iceberg.spark.functions
- BucketFunction.BucketLong - Class in org.apache.iceberg.spark.functions
- BucketFunction.BucketString - Class in org.apache.iceberg.spark.functions
- BucketInt(DataType) - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketInt
- BucketLong(DataType) - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketLong
- BucketString() - Constructor for class org.apache.iceberg.spark.functions.BucketFunction.BucketString
- bucketToAccessPointMapping() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- BucketUtil - Class in org.apache.iceberg.util
-
Contains the logic for hashing various types for use with the
bucket
partition transformations - buffer() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
-
Opaque blob representing metadata about a file's encryption key.
- build() - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- build() - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- build() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws RuntimeException if one was not found.
- build() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- build() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as an UnboundField or throws a NoSuchFieldException if there is none.
- build() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as an UnboundMethod or throws a RuntimeException if there is none.
- build() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- build() - Method in class org.apache.iceberg.DataFiles.Builder
- build() - Method in class org.apache.iceberg.DoubleFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.encryption.NativeFileCryptoParameters.Builder
- build() - Method in class org.apache.iceberg.FileMetadata.Builder
- build() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- build() - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- build() - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- build() - Method in class org.apache.iceberg.FloatFieldMetrics.Builder
- build() - Method in class org.apache.iceberg.GenericManifestFile.CopyBuilder
- build() - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- build() - Method in class org.apache.iceberg.io.WriteResult.Builder
- build() - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- build() - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- build() - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- build() - Method in class org.apache.iceberg.PartitionSpec.Builder
- build() - Method in class org.apache.iceberg.puffin.Puffin.ReadBuilder
- build() - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
- build() - Method in class org.apache.iceberg.rest.HTTPClient.Builder
- build() - Method in class org.apache.iceberg.rest.requests.CreateNamespaceRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.CreateTableRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.RenameTableRequest.Builder
- build() - Method in class org.apache.iceberg.rest.requests.UpdateNamespacePropertiesRequest.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ConfigResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.CreateNamespaceResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ErrorResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.GetNamespaceResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ListNamespacesResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.ListTablesResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.LoadTableResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.OAuthTokenResponse.Builder
- build() - Method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse.Builder
- build() - Method in class org.apache.iceberg.ScanSummary.Builder
-
Summarizes a table scan as a map of partition key to metrics for that partition.
- build() - Method in class org.apache.iceberg.SnapshotRef.Builder
- build() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- build() - Method in class org.apache.iceberg.SortOrder.Builder
- build() - Method in interface org.apache.iceberg.spark.procedures.SparkProcedures.ProcedureBuilder
- build() - Method in class org.apache.iceberg.spark.source.SparkPositionDeletesRewriteBuilder
- build() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- build() - Method in class org.apache.iceberg.TableMetadata.Builder
- build() - Method in class org.apache.iceberg.view.ViewMetadata.Builder
- build(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundMethod or throws a RuntimeException if there is none.
- build(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a RuntimeException if there is none.
- buildAvroProjection(Schema, Schema, Map<String, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- buildChangelogScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildChecked() - Method in class org.apache.iceberg.common.DynClasses.Builder
-
Returns the first implementation or throws ClassNotFoundException if one was not found.
- buildChecked() - Method in class org.apache.iceberg.common.DynConstructors.Builder
- buildChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as an UnboundField or throws a NoSuchFieldException if there is none.
- buildChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as an UnboundMethod or throws a NoSuchMethodException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a BoundMethod or throws a NoSuchMethodException if there is none.
- buildChecked(Object) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a BoundMethod or throws a NoSuchMethodException if there is none.
- buildCopyOnWriteScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildEqualityWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- builder() - Static method in class org.apache.iceberg.common.DynClasses
- builder() - Static method in class org.apache.iceberg.common.DynConstructors
- builder() - Static method in class org.apache.iceberg.common.DynFields
- builder() - Static method in class org.apache.iceberg.flink.source.IcebergSource
- builder() - Static method in class org.apache.iceberg.flink.source.ScanContext
- builder() - Static method in class org.apache.iceberg.io.WriteResult
- builder() - Static method in interface org.apache.iceberg.rest.auth.AuthConfig
- builder() - Static method in class org.apache.iceberg.rest.requests.CreateNamespaceRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.CreateTableRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.RenameTableRequest
- builder() - Static method in class org.apache.iceberg.rest.requests.UpdateNamespacePropertiesRequest
- builder() - Static method in class org.apache.iceberg.rest.responses.ConfigResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.CreateNamespaceResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ErrorResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.GetNamespaceResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ListNamespacesResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.ListTablesResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.LoadTableResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.OAuthTokenResponse
- builder() - Static method in class org.apache.iceberg.rest.responses.UpdateNamespacePropertiesResponse
- builder() - Static method in class org.apache.iceberg.SnapshotSummary
- builder() - Static method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.CreateChangelogViewProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.FastForwardBranchProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- builder() - Static method in class org.apache.iceberg.spark.procedures.RewritePositionDeleteFilesProcedure
- builder() - Static method in interface org.apache.iceberg.view.ViewMetadata
- builder(Class<?>) - Static method in class org.apache.iceberg.common.DynConstructors
- builder(String) - Static method in class org.apache.iceberg.common.DynMethods
-
Constructs a new builder for calling methods dynamically.
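A minimal sketch of the dynamic method builder described above; the target method here, String.toUpperCase, is only an illustration of the lookup-and-invoke pattern:

import org.apache.iceberg.common.DynMethods;

public class DynMethodsExample {
  static String upper(String value) {
    DynMethods.UnboundMethod toUpperCase = DynMethods.builder("toUpperCase")
        .impl(String.class)               // look up toUpperCase() on String
        .build();
    return toUpperCase.invoke(value);     // invoke with "value" as the receiver
  }
}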
- builder(Map<String, String>) - Static method in class org.apache.iceberg.rest.HTTPClient
- builder(PartitionSpec) - Static method in class org.apache.iceberg.DataFiles
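A minimal sketch of describing a data file with DataFiles.builder(PartitionSpec); the path, size, and record count are placeholder values, and an unpartitioned spec keeps the example small:

import org.apache.iceberg.DataFile;
import org.apache.iceberg.DataFiles;
import org.apache.iceberg.FileFormat;
import org.apache.iceberg.PartitionSpec;

public class DataFilesExample {
  static DataFile describe() {
    return DataFiles.builder(PartitionSpec.unpartitioned())
        .withPath("s3://bucket/path/data-00000.parquet")   // hypothetical file location
        .withFormat(FileFormat.PARQUET)
        .withFileSizeInBytes(4L * 1024 * 1024)
        .withRecordCount(100_000L)
        .build();
  }
}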
- Builder() - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder() - Constructor for class org.apache.iceberg.flink.source.FlinkSource.Builder
- Builder(int) - Constructor for class org.apache.iceberg.DoubleFieldMetrics.Builder
- Builder(int) - Constructor for class org.apache.iceberg.FloatFieldMetrics.Builder
- Builder(Class<?>) - Constructor for class org.apache.iceberg.common.DynConstructors.Builder
- Builder(Iterable<I>) - Constructor for class org.apache.iceberg.util.Tasks.Builder
- Builder(String) - Constructor for class org.apache.iceberg.common.DynMethods.Builder
- Builder(PartitionSpec) - Constructor for class org.apache.iceberg.DataFiles.Builder
- Builder(Table) - Constructor for class org.apache.iceberg.FindFiles.Builder
- Builder(TableScan) - Constructor for class org.apache.iceberg.ScanSummary.Builder
- builderFor(int) - Method in class org.apache.iceberg.FloatFieldMetrics
- builderFor(long, SnapshotRefType) - Static method in class org.apache.iceberg.SnapshotRef
- builderFor(DataStream<T>, MapFunction<T, RowData>, TypeInformation<RowData>) - Static method in class org.apache.iceberg.flink.sink.FlinkSink
-
Initialize a FlinkSink.Builder to export data from a generic input data stream into an Iceberg table.
- builderFor(Schema) - Static method in class org.apache.iceberg.PartitionSpec
-
Creates a new partition spec builder for the given Schema.
- builderFor(Schema) - Static method in class org.apache.iceberg.SortOrder
-
Creates a new sort order builder for the given Schema.
- builderFor(Table, int, long) - Static method in class org.apache.iceberg.io.OutputFileFactory
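As a quick illustration of the two builderFor(Schema) factories above (PartitionSpec and SortOrder), a sketch with made-up field names:

    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "ts", Types.TimestampType.withZone()));

    // Partition by day(ts) plus a 16-way bucket on id.
    PartitionSpec spec = PartitionSpec.builderFor(schema)
        .day("ts")
        .bucket("id", 16)
        .build();

    // Sort rows by id ascending.
    SortOrder order = SortOrder.builderFor(schema)
        .asc("id")
        .build();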
- builderFrom(SnapshotRef) - Static method in class org.apache.iceberg.SnapshotRef
- builderFrom(SnapshotRef, long) - Static method in class org.apache.iceberg.SnapshotRef
-
Creates a ref builder from the given ref and its properties but the ref will now point to the given snapshotId.
- buildFormat() - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- buildFrom(TableMetadata) - Static method in class org.apache.iceberg.TableMetadata
- buildFrom(ViewMetadata) - Static method in interface org.apache.iceberg.view.ViewMetadata
- buildFromEmpty() - Static method in class org.apache.iceberg.TableMetadata
- buildFromEmpty(int) - Static method in class org.apache.iceberg.TableMetadata
- buildIcebergCatalog(String, Map<String, String>, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Build an Iceberg Catalog based on a map of catalog properties and optional Hadoop configuration.
- buildIcebergCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg Catalog to be used by this Spark catalog adapter.
- buildIdentifier(Identifier) - Method in class org.apache.iceberg.spark.SparkCatalog
-
Build an Iceberg TableIdentifier for the given Spark identifier.
- buildList(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
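For CatalogUtil.buildIcebergCatalog listed above, a minimal sketch; the catalog name, implementation class, and URI are placeholder values, not defaults.

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.CatalogProperties;
    import org.apache.iceberg.CatalogUtil;
    import org.apache.iceberg.catalog.Catalog;

    Map<String, String> props = new HashMap<>();
    props.put(CatalogProperties.CATALOG_IMPL, "org.apache.iceberg.rest.RESTCatalog");
    props.put(CatalogProperties.URI, "http://localhost:8181");  // placeholder REST endpoint

    // The last argument is the optional Hadoop Configuration, typed as Object so that
    // core modules do not depend on Hadoop directly.
    Catalog catalog = CatalogUtil.buildIcebergCatalog("demo", props, new Configuration());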
- buildList(List<E>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.ListReader
- buildMap(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- buildMap(Map<K, V>) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.MapReader
- buildMergeOnReadScan() - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- buildOptionalParam(Map<String, String>) - Static method in class org.apache.iceberg.rest.auth.OAuth2Util
- buildOrcProjection(Schema, TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Converts an Iceberg schema to a corresponding ORC schema within the context of an existing ORC file schema.
- buildPositionWriter() - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- buildPositionWriter() - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- buildReader(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.orc.GenericOrcReader
- buildReader(Schema, TypeDescription, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkOrcReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroValueReaders
- buildReader(Schema, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.flink.data.FlinkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>) - Static method in class org.apache.iceberg.spark.data.SparkParquetReaders
- buildReader(Schema, MessageType, Map<Integer, ?>, DeleteFilter<InternalRow>) - Static method in class org.apache.iceberg.spark.data.vectorized.VectorizedSparkParquetReaders
- buildReader(MessageType, Schema, Map<Integer, Object>) - Static method in class org.apache.iceberg.pig.PigParquetReader
- buildReplacement(Schema, PartitionSpec, SortOrder, String, Map<String, String>) - Method in class org.apache.iceberg.TableMetadata
- buildSortOrder(Schema, PartitionSpec, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
-
Build a final sort order that satisfies the clustering required by the partition spec.
- buildSortOrder(Table) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSortOrder(Table, SortOrder) - Static method in class org.apache.iceberg.util.SortOrderUtil
- buildSparkCatalog(String, CaseInsensitiveStringMap) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
-
Build a SparkCatalog to be used for Iceberg operations.
- buildStatic() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Returns the first valid implementation as a StaticField or throws a RuntimeException if there is none.
- buildStatic() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a RuntimeException if there is none.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Deprecated. Since 1.6.0, will be removed in 1.7.0.
- buildStaticChecked() - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Returns the first valid implementation as a StaticMethod or throws a NoSuchMethodException if there is none.
- buildStruct(I) - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- buildTable(String, Schema) - Method in class org.apache.iceberg.hadoop.HadoopTables
- buildTable(SessionCatalog.SessionContext, TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a builder to create a table or start a create/replace transaction.
- buildTable(SessionCatalog.SessionContext, TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.CachingCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- buildTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Instantiate a builder to either create a table or start a create/replace transaction.
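For the Catalog.buildTable entry above, a create flow might look like this sketch, assuming catalog, schema, and spec were created earlier; the identifier and property value are illustrative.

    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;

    Table table = catalog.buildTable(TableIdentifier.of("db", "events"), schema)
        .withPartitionSpec(spec)                          // optional; defaults to unpartitioned
        .withProperty("write.format.default", "parquet")  // any table property
        .create();                                        // or createTransaction() / replaceTransaction()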
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTCatalog
- buildTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.view.BaseMetastoreViewCatalog
- buildView(SessionCatalog.SessionContext, TableIdentifier) - Method in interface org.apache.iceberg.catalog.ViewSessionCatalog
-
Instantiate a builder to create or replace a SQL view.
- buildView(SessionCatalog.SessionContext, TableIdentifier) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- buildView(TableIdentifier) - Method in class org.apache.iceberg.catalog.BaseViewSessionCatalog.AsViewCatalog
- buildView(TableIdentifier) - Method in interface org.apache.iceberg.catalog.ViewCatalog
-
Instantiate a builder to create or replace a SQL view.
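A hedged sketch of the buildView flow described above, assuming an existing ViewCatalog (viewCatalog) and a Schema (schema); the namespace, view name, dialect, and SQL text are placeholders.

    import org.apache.iceberg.catalog.Namespace;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.view.View;

    View view = viewCatalog.buildView(TableIdentifier.of("db", "event_counts"))
        .withSchema(schema)                          // output schema of the view
        .withDefaultNamespace(Namespace.of("db"))    // namespace used to resolve unqualified names
        .withQuery("spark", "SELECT level, count(*) AS cnt FROM db.events GROUP BY level")
        .create();                                   // or replace() / createOrReplace()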
- buildView(TableIdentifier) - Method in class org.apache.iceberg.rest.RESTCatalog
- buildView(TableIdentifier) - Method in class org.apache.iceberg.view.BaseMetastoreViewCatalog
- buildWriter(LogicalType, MessageType) - Static method in class org.apache.iceberg.flink.data.FlinkParquetWriters
- buildWriter(RowType, Schema) - Static method in class org.apache.iceberg.flink.data.FlinkOrcWriter
- buildWriter(Schema, TypeDescription) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- buildWriter(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetAvroWriter
- buildWriter(StructType, MessageType) - Static method in class org.apache.iceberg.spark.data.SparkParquetWriters
- bulkDecrypt(Iterable<? extends ContentFile<?>>) - Method in class org.apache.iceberg.encryption.EncryptingFileIO
- BulkDeletionFailureException - Exception in org.apache.iceberg.io
- BulkDeletionFailureException(int) - Constructor for exception org.apache.iceberg.io.BulkDeletionFailureException
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- BY - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteDistributionSpecContext
- BY() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.WriteOrderingSpecContext
- byId() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from field ID to full name.
- byName() - Method in class org.apache.iceberg.types.IndexByName
-
Returns a mapping from full field name to ID.
- ByteArrayReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.ByteArrayReader
- byteArrays() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- ByteBufferInputStream - Class in org.apache.iceberg.io
- ByteBufferInputStream() - Constructor for class org.apache.iceberg.io.ByteBufferInputStream
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueReaders
- byteBuffers() - Static method in class org.apache.iceberg.avro.ValueWriters
- byteBuffers() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- byteBuffers(ColumnDescriptor) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- ByteBuffers - Class in org.apache.iceberg.util
- bytes() - Static method in class org.apache.iceberg.avro.ValueReaders
- bytes() - Static method in class org.apache.iceberg.avro.ValueWriters
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- bytes() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- bytes() - Static method in class org.apache.iceberg.orc.OrcValueReaders
- BYTES - Enum constant in enum class org.apache.iceberg.metrics.MetricsContext.Unit
- BYTES_ASC - Enum constant in enum class org.apache.iceberg.RewriteJobOrder
- BYTES_DESC - Enum constant in enum class org.apache.iceberg.RewriteJobOrder
- BytesReader(ColumnDescriptor) - Constructor for class org.apache.iceberg.parquet.ParquetValueReaders.BytesReader
- byteTruncateOrFill(byte[], int, ByteBuffer) - Static method in class org.apache.iceberg.util.ZOrderByteUtils
-
Return a bytebuffer with the given bytes truncated to length, or filled with 0's to length depending on whether the given bytes are larger or smaller than the given length.
C
- CACHE_CASE_SENSITIVE - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls whether the caching catalog will cache table entries using case sensitive keys.
- CACHE_CASE_SENSITIVE_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_ENABLED - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls whether the catalog will cache table entries upon load.
- CACHE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
-
Controls the duration for which entries in the catalog are cached.
- CACHE_EXPIRATION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CACHE_EXPIRATION_INTERVAL_MS_OFF - Static variable in class org.apache.iceberg.CatalogProperties
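The CACHE_* constants above are ordinary catalog properties; one plausible way to set them when configuring a catalog (values here are examples, not defaults):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.TimeUnit;
    import org.apache.iceberg.CatalogProperties;

    Map<String, String> props = new HashMap<>();
    props.put(CatalogProperties.CACHE_ENABLED, "true");
    props.put(CatalogProperties.CACHE_EXPIRATION_INTERVAL_MS,
        String.valueOf(TimeUnit.MINUTES.toMillis(10)));  // expire cached table entries after 10 minutes
    // props can then be passed to Catalog.initialize(name, props) or CatalogUtil.buildIcebergCatalog(...).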
- CachedClientPool - Class in org.apache.iceberg.hive
-
A ClientPool that caches the underlying HiveClientPool instances.
- CachingCatalog - Class in org.apache.iceberg
-
Class that wraps an Iceberg Catalog to cache tables.
- CachingCatalog(Catalog, boolean, long, Ticker) - Constructor for class org.apache.iceberg.CachingCatalog
- calculateContentHashPresign(SdkHttpFullRequest.Builder, Aws4PresignerParams) - Method in class org.apache.iceberg.aws.s3.signer.S3V4RestSignerClient
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.AncestorsOfProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.CreateChangelogViewProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.ExpireSnapshotsProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.FastForwardBranchProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
- call(InternalRow) - Method in class org.apache.iceberg.spark.procedures.RewritePositionDeleteFilesProcedure
- call(InternalRow) - Method in interface org.apache.spark.sql.connector.iceberg.catalog.Procedure
-
Executes this procedure.
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- CALL - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CALL() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- callArgument() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callArgument(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- CallArgumentContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallArgumentContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- CallContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallContext
- callInit() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
-
Deprecated. Will be removed in 2.0.0; use Parquet.ReadBuilder.createReaderFunc(Function) instead.
- canCache(long) - Method in class org.apache.iceberg.data.BaseDeleteLoader
-
Checks if the given number of bytes can be cached.
- cancel() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- canContainAny(ManifestFile, Iterable<StructLike>, Function<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canContainAny(ManifestFile, Iterable<Pair<Integer, StructLike>>, Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.ManifestFileUtil
- canDeleteWhere(Predicate[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- canInheritSnapshotId() - Method in class org.apache.iceberg.BaseRewriteManifests
- canMerge(ScanTask) - Method in interface org.apache.iceberg.MergeableScanTask
-
Checks if this task can merge with a given task.
- canonicalName() - Method in class org.apache.iceberg.spark.functions.BucketFunction.BucketBinary
- canonicalName() - Method in class org.apache.iceberg.spark.functions.BucketFunction.BucketDecimal
- canonicalName() - Method in class org.apache.iceberg.spark.functions.BucketFunction.BucketInt
- canonicalName() - Method in class org.apache.iceberg.spark.functions.BucketFunction.BucketLong
- canonicalName() - Method in class org.apache.iceberg.spark.functions.BucketFunction.BucketString
- canonicalName() - Method in class org.apache.iceberg.spark.functions.DaysFunction.DateToDaysFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.DaysFunction.TimestampNtzToDaysFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.DaysFunction.TimestampToDaysFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.HoursFunction.TimestampNtzToHoursFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.HoursFunction.TimestampToHoursFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.MonthsFunction.DateToMonthsFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.MonthsFunction.TimestampNtzToMonthsFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.MonthsFunction.TimestampToMonthsFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateBigInt
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateBinary
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateDecimal
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateInt
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateSmallInt
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateString
- canonicalName() - Method in class org.apache.iceberg.spark.functions.TruncateFunction.TruncateTinyInt
- canonicalName() - Method in class org.apache.iceberg.spark.functions.YearsFunction.DateToYearsFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.YearsFunction.TimestampNtzToYearsFunction
- canonicalName() - Method in class org.apache.iceberg.spark.functions.YearsFunction.TimestampToYearsFunction
- canTransform(Type) - Method in class org.apache.iceberg.transforms.Hours
- canTransform(Type) - Method in class org.apache.iceberg.transforms.Months
- canTransform(Type) - Method in interface org.apache.iceberg.transforms.Transform
-
Checks whether this function can be applied to the given Type.
- canTransform(Type) - Method in class org.apache.iceberg.transforms.UnknownTransform
- capabilities() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- capabilities() - Method in class org.apache.iceberg.spark.source.SparkChangelogTable
- capabilities() - Method in class org.apache.iceberg.spark.source.SparkTable
- CASE_SENSITIVE - Static variable in class org.apache.iceberg.flink.FlinkReadOptions
- CASE_SENSITIVE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CASE_SENSITIVE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CASE_SENSITIVE_OPTION - Static variable in class org.apache.iceberg.flink.FlinkReadOptions
- caseInsensitive() - Method in class org.apache.iceberg.data.IcebergGenerics.ScanBuilder
- caseInsensitive() - Method in class org.apache.iceberg.FindFiles.Builder
- caseInsensitive() - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseInsensitiveField(String) - Method in class org.apache.iceberg.types.Types.StructType
- caseInsensitiveFindField(String) - Method in class org.apache.iceberg.Schema
-
Returns a sub-field by name as a Types.NestedField.
- caseInsensitiveSelect(String...) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case insensitive names
- caseInsensitiveSelect(Collection<String>) - Method in class org.apache.iceberg.Schema
-
Creates a projection schema for a subset of columns, selected by case insensitive names
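For the two caseInsensitiveSelect overloads above, a projection can be built as in this small sketch (column names are illustrative):

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.types.Types;

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "event_ts", Types.TimestampType.withZone()));

    // Names are matched ignoring case, so "ID" resolves to the "id" field.
    Schema projection = schema.caseInsensitiveSelect("ID", "Event_TS");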
- caseSensitive() - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- caseSensitive() - Method in class org.apache.iceberg.flink.FlinkReadConf
- caseSensitive() - Method in class org.apache.iceberg.flink.source.ScanContext
- caseSensitive() - Method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- caseSensitive() - Method in class org.apache.iceberg.spark.SparkReadConf
- caseSensitive() - Method in class org.apache.iceberg.spark.SparkWriteConf
- caseSensitive(boolean) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
-
Is it case sensitive
- caseSensitive(boolean) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- caseSensitive(boolean) - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- caseSensitive(boolean) - Method in interface org.apache.iceberg.DeleteFiles
-
Enables or disables case sensitive expression binding for methods that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.FindFiles.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.FlinkSource.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.IcebergSource.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.flink.source.ScanContext.Builder
- caseSensitive(boolean) - Method in class org.apache.iceberg.ManifestReader
- caseSensitive(boolean) - Method in class org.apache.iceberg.StreamingDelete
- caseSensitive(boolean) - Method in class org.apache.iceberg.MicroBatches.MicroBatchBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- caseSensitive(boolean) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.OverwriteFiles
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.RowDelta
-
Enables or disables case sensitive expression binding for validations that accept expressions.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.Scan
-
Create a new scan from this that, if data columns were selected via Scan.select(java.util.Collection), controls whether the match to the schema will be done with case sensitivity.
- caseSensitive(boolean) - Method in class org.apache.iceberg.SortOrder.Builder
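The Scan.caseSensitive entry above composes with select(...); a typical scan might be configured like this sketch, assuming an existing Table named table and a surrounding method that may throw IOException:

    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.TableScan;
    import org.apache.iceberg.io.CloseableIterable;

    TableScan scan = table.newScan()
        .caseSensitive(false)        // let "ID" match the "id" column
        .select("ID", "event_ts");

    try (CloseableIterable<FileScanTask> tasks = scan.planFiles()) {
      tasks.forEach(task -> System.out.println(task.file().path()));
    }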
- caseSensitive(boolean) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Set case sensitivity of sort column name resolution.
- caseSensitive(boolean) - Method in class org.apache.iceberg.spark.source.SparkScanBuilder
- caseSensitive(boolean) - Method in interface org.apache.iceberg.UpdatePartitionSpec
-
Set whether column resolution in the source schema should be case sensitive.
- caseSensitive(boolean) - Method in interface org.apache.iceberg.UpdateSchema
-
Determines whether the case of the schema needs to be considered when comparing column names.
- caseSensitive(SparkSession) - Static method in class org.apache.iceberg.spark.SparkUtil
- castAndThrow(Throwable, Class<E>) - Static method in class org.apache.iceberg.util.ExceptionUtil
- catalog() - Method in class org.apache.iceberg.connect.events.TableReference
- catalog() - Method in class org.apache.iceberg.flink.FlinkCatalog
- catalog() - Method in class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- Catalog - Interface in org.apache.iceberg.catalog
-
A Catalog API for table create, drop, and load operations.
- CATALOG_CONFIG_PREFIX - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CATALOG_IMPL - Static variable in class org.apache.iceberg.CatalogProperties
- CATALOG_NAME - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CATALOG_SCOPE - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
-
Scope for OAuth2 flows.
- Catalog.TableBuilder - Interface in org.apache.iceberg.catalog
-
A builder used to create valid tables or start create/replace transactions.
- catalogAndIdentifier(String, SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(String, SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(List<String>, Function<String, C>, BiFunction<String[], String, T>, C, String[]) - Static method in class org.apache.iceberg.spark.SparkUtil
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply. Attempts to find the catalog and identifier a multipart identifier represents.
- catalogAndIdentifier(SparkSession, String) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, String, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>) - Static method in class org.apache.iceberg.spark.Spark3Util
- catalogAndIdentifier(SparkSession, List<String>, CatalogPlugin) - Static method in class org.apache.iceberg.spark.Spark3Util
-
A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply. Attempts to find the catalog and identifier a multipart identifier represents.
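A sketch of resolving a multipart name with the Spark3Util.catalogAndIdentifier helpers above; the table name is a placeholder.

    import org.apache.iceberg.spark.Spark3Util;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.connector.catalog.CatalogPlugin;
    import org.apache.spark.sql.connector.catalog.Identifier;

    SparkSession spark = SparkSession.active();
    Spark3Util.CatalogAndIdentifier catalogAndIdent =
        Spark3Util.catalogAndIdentifier(spark, "my_catalog.db.events");

    CatalogPlugin catalog = catalogAndIdent.catalog();   // the resolved Spark catalog plugin
    Identifier ident = catalogAndIdent.identifier();     // the remaining namespace + name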
- CatalogAndIdentifier(Pair<CatalogPlugin, Identifier>) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogAndIdentifier(CatalogPlugin, Identifier) - Constructor for class org.apache.iceberg.spark.Spark3Util.CatalogAndIdentifier
- CatalogHandlers - Class in org.apache.iceberg.rest
- CatalogLoader - Interface in org.apache.iceberg.flink
-
Serializable loader to load an Iceberg Catalog.
- CatalogLoader.CustomCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HadoopCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.HiveCatalogLoader - Class in org.apache.iceberg.flink
- CatalogLoader.RESTCatalogLoader - Class in org.apache.iceberg.flink
- catalogName() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- catalogName(Configuration, String) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
Returns the catalog name serialized to the configuration.
- CatalogProperties - Class in org.apache.iceberg
- catalogPropertyConfigKey(String, String) - Static method in class org.apache.iceberg.mr.InputFormatConfig
-
Get Hadoop config key of a catalog property based on catalog name
- catalogProps() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- Catalogs - Class in org.apache.iceberg.mr
-
Class for catalog resolution and accessing the common functions for Catalog API.
- CatalogUtil - Class in org.apache.iceberg
- CHANGE_ORDINAL - Static variable in class org.apache.iceberg.MetadataColumns
- CHANGE_TYPE - Static variable in class org.apache.iceberg.MetadataColumns
- CHANGED_PARTITION_COUNT_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- CHANGED_PARTITION_PREFIX - Static variable in class org.apache.iceberg.SnapshotSummary
- ChangelogIterator - Class in org.apache.iceberg.spark
-
An iterator that transforms rows from changelog tables within a single Spark task.
- ChangelogIterator(Iterator<Row>, StructType) - Constructor for class org.apache.iceberg.spark.ChangelogIterator
- ChangelogOperation - Enum Class in org.apache.iceberg
-
An enum representing possible operations in a changelog.
- ChangelogScanTask - Interface in org.apache.iceberg
-
A changelog scan task.
- changelogSchema(Schema) - Static method in class org.apache.iceberg.ChangelogUtil
- ChangelogUtil - Class in org.apache.iceberg
- changeOrdinal() - Method in interface org.apache.iceberg.ChangelogScanTask
-
Returns the ordinal of changes produced by this task.
- changes() - Method in class org.apache.iceberg.TableMetadata
- changes() - Method in interface org.apache.iceberg.view.ViewMetadata
- changeType(Row) - Method in class org.apache.iceberg.spark.ChangelogIterator
- changeTypeIndex() - Method in class org.apache.iceberg.spark.ChangelogIterator
- channelNames - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- channelReadChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- channelWriteChunkSize() - Method in class org.apache.iceberg.gcp.GCPProperties
- charAt(int) - Method in class org.apache.iceberg.util.CharSequenceWrapper
- CharSequenceMap<V> - Class in org.apache.iceberg.util
-
A map that uses char sequences as keys.
- charSequences() - Static method in class org.apache.iceberg.types.Comparators
- CharSequenceSet - Class in org.apache.iceberg.util
- CharSequenceUtil - Class in org.apache.iceberg.util
- CharSequenceWrapper - Class in org.apache.iceberg.util
-
Wrapper class to adapt CharSequence for use in maps and sets.
- check() - Method in class org.apache.iceberg.aws.s3.signer.S3V4RestSignerClient
- check() - Method in interface org.apache.iceberg.view.ViewMetadata
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.NoSuchIcebergTableException
- check(boolean, String, Object...) - Static method in exception org.apache.iceberg.exceptions.ValidationException
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_NULLABILITY - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_NULLABILITY_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- CHECK_ORDERING - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- CHECK_ORDERING_DEFAULT - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- checkAndSetIoConfig(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it populates the FileIO's hadoop configuration with the input config object.
- checkAndSkipIoConfigSerialization(Configuration, Table) - Static method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
-
If enabled, it ensures that the FileIO's hadoop configuration will not be serialized.
- checkCommitStatus(String, String, Map<String, String>, Supplier<Boolean>) - Method in class org.apache.iceberg.BaseMetastoreOperations
-
Attempt to load the content and see if any current or past metadata location matches the one we were attempting to set.
- checkCommitStatus(String, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
-
Attempt to load the table and see if any current or past metadata location matches the one we were attempting to set.
- checkCompatibility(SortOrder, Schema) - Static method in class org.apache.iceberg.SortOrder
- CheckCompatibility - Class in org.apache.iceberg.types
- checkDestinationCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- checkNullability() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOrdering() - Method in class org.apache.iceberg.spark.SparkWriteConf
- checkOutputSpecs(FileSystem, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputFormat
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.MigrateTableSparkAction
- checkSourceCatalog(CatalogPlugin) - Method in class org.apache.iceberg.spark.actions.SnapshotTableSparkAction
- CHECKSUM_ENABLED - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Enables eTag checks for S3 PUT and MULTIPART upload requests.
- CHECKSUM_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
- cherrypick(long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Apply supported changes in given snapshot and create a new snapshot which will be set as the current snapshot on commit.
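The cherrypick operation above is reached from a Table; a minimal sketch, assuming table is loaded and snapshotId points at a staged snapshot:

    long snapshotId = 1234567890L;  // placeholder id of the snapshot to cherry-pick

    table.manageSnapshots()
        .cherrypick(snapshotId)
        .commit();                  // applies the changes and updates the current snapshot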
- cherrypick(long) - Method in class org.apache.iceberg.SnapshotManager
- CherrypickAncestorCommitException - Exception in org.apache.iceberg.exceptions
-
This exception occurs when one cherrypicks an ancestor or when the picked snapshot is already linked to a published ancestor.
- CherrypickAncestorCommitException(long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- CherrypickAncestorCommitException(long, long) - Constructor for exception org.apache.iceberg.exceptions.CherrypickAncestorCommitException
- child() - Method in class org.apache.iceberg.expressions.Not
- childColumn(int) - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- CIPHER_BLOCK_SIZE - Static variable in class org.apache.iceberg.encryption.Ciphers
- Ciphers - Class in org.apache.iceberg.encryption
- Ciphers.AesGcmDecryptor - Class in org.apache.iceberg.encryption
- Ciphers.AesGcmEncryptor - Class in org.apache.iceberg.encryption
- classLoader(ClassLoader) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- CleanableFailure - Interface in org.apache.iceberg.exceptions
-
A marker interface for commit exceptions where the state is known to be failure and uncommitted metadata can be cleaned up.
- cleanAll() - Method in class org.apache.iceberg.BaseRewriteManifests
- cleanExpiredFiles(boolean) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Allows expiration of snapshots without any cleanup of underlying manifest or data files.
- cleanUncommitted(Set<ManifestFile>) - Method in class org.apache.iceberg.BaseRewriteManifests
- cleanUncommitted(Set<ManifestFile>) - Method in class org.apache.iceberg.StreamingDelete
- cleanUp() - Method in class org.apache.iceberg.io.ContentCache
- clear() - Method in class org.apache.iceberg.DataFiles.Builder
- clear() - Method in class org.apache.iceberg.FileMetadata.Builder
- clear() - Method in class org.apache.iceberg.PartitionData
- clear() - Method in class org.apache.iceberg.SnapshotSummary.Builder
- clear() - Method in class org.apache.iceberg.util.CharSequenceMap
- clear() - Method in class org.apache.iceberg.util.CharSequenceSet
- clear() - Method in class org.apache.iceberg.util.PartitionMap
- clear() - Method in class org.apache.iceberg.util.PartitionSet
- clear() - Method in class org.apache.iceberg.util.SerializableMap
- clear() - Method in class org.apache.iceberg.util.StructLikeMap
- clear() - Method in class org.apache.iceberg.util.StructLikeSet
- clearCache() - Static method in class org.apache.iceberg.data.avro.DecoderResolver
- clearRewrite(Table, String) - Method in class org.apache.iceberg.spark.FileRewriteCoordinator
- client() - Method in class org.apache.iceberg.aws.s3.S3FileIO
- client() - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- client(String) - Method in class org.apache.iceberg.azure.adlsv2.ADLSFileIO
- CLIENT_ACCESS_KEY_ID - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_ACCESS_KEY_SECRET - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun uses an AccessKey pair, which includes an AccessKey ID and an AccessKey secret to implement symmetric encryption and verify the identity of a requester.
- CLIENT_API_VERSION - Static variable in class org.apache.iceberg.nessie.NessieUtil
- CLIENT_ASSUME_ROLE_ARN - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_EXTERNAL_ID - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_REGION - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_SESSION_NAME - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TAGS_PREFIX - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory to pass a list of sessions.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC - Static variable in class org.apache.iceberg.aws.AwsProperties
-
Used by AssumeRoleAwsClientFactory.
- CLIENT_ASSUME_ROLE_TIMEOUT_SEC_DEFAULT - Static variable in class org.apache.iceberg.aws.AwsProperties
- CLIENT_CREDENTIAL_PROVIDER_PREFIX - Static variable in class org.apache.iceberg.aws.AwsClientProperties
-
Used by the client.credentials-provider configured value that will be used by AwsClientFactories.defaultFactory() and other AWS client factory classes to pass provider-specific properties.
- CLIENT_CREDENTIALS_PROVIDER - Static variable in class org.apache.iceberg.aws.AwsClientProperties
-
Configure the AWS credentials provider used to create AWS clients.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
The implementation class of AliyunClientFactory to customize Aliyun client configurations.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aws.AwsProperties
-
The implementation class of AwsClientFactory to customize AWS client configurations.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
This property is used to pass in the aws client factory implementation class for S3 FileIO.
- CLIENT_FACTORY - Static variable in class org.apache.iceberg.dell.DellProperties
-
The implementation class of DellClientFactory to customize Dell client configurations.
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_CACHE_EVICTION_INTERVAL_MS_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_CACHE_KEYS - Static variable in class org.apache.iceberg.CatalogProperties
-
A comma separated list of elements used, in addition to the CatalogProperties.URI, to compose the key of the client pool cache.
- CLIENT_POOL_SIZE - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.CatalogProperties
- CLIENT_REGION - Static variable in class org.apache.iceberg.aws.AwsClientProperties
-
Used by AwsClientFactories.DefaultAwsClientFactory and also other client factory classes.
- CLIENT_SECURITY_TOKEN - Static variable in class org.apache.iceberg.aliyun.AliyunProperties
-
Aliyun supports Security Token Service (STS) to generate temporary access credentials to authorize a user to access the Object Storage Service (OSS) resources within a specific period of time.
- CLIENT_TYPE - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
The type of SdkHttpClient implementation used by AwsClientFactory. If set, all AWS clients will use this specified HTTP client.
- CLIENT_TYPE_APACHE - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
If this is set under HttpClientProperties.CLIENT_TYPE, ApacheHttpClient will be used as the HTTP Client in AwsClientFactory.
- CLIENT_TYPE_DEFAULT - Static variable in class org.apache.iceberg.aws.HttpClientProperties
- CLIENT_TYPE_URLCONNECTION - Static variable in class org.apache.iceberg.aws.HttpClientProperties
-
If this is set under HttpClientProperties.CLIENT_TYPE, UrlConnectionHttpClient will be used as the HTTP Client in AwsClientFactory.
- clientAssumeRoleArn() - Method in class org.apache.iceberg.aws.AwsProperties
- clientAssumeRoleExternalId() - Method in class org.apache.iceberg.aws.AwsProperties
- clientAssumeRoleRegion() - Method in class org.apache.iceberg.aws.AwsProperties
- clientAssumeRoleSessionName() - Method in class org.apache.iceberg.aws.AwsProperties
- clientAssumeRoleTimeoutSec() - Method in class org.apache.iceberg.aws.AwsProperties
- clientLibToken() - Method in class org.apache.iceberg.gcp.GCPProperties
- ClientPool<C, E extends Exception> - Interface in org.apache.iceberg
- ClientPool.Action<R, C, E extends Exception> - Interface in org.apache.iceberg
- ClientPoolImpl<C, E extends Exception> - Class in org.apache.iceberg
- ClientPoolImpl(int, Class<? extends E>, boolean, int) - Constructor for class org.apache.iceberg.ClientPoolImpl
- clientRegion() - Method in class org.apache.iceberg.aws.AwsClientProperties
- clone() - Method in interface org.apache.iceberg.flink.CatalogLoader
-
Clone a CatalogLoader.
- clone() - Method in class org.apache.iceberg.flink.CatalogLoader.CustomCatalogLoader
- clone() - Method in class org.apache.iceberg.flink.CatalogLoader.HadoopCatalogLoader
- clone() - Method in class org.apache.iceberg.flink.CatalogLoader.HiveCatalogLoader
- clone() - Method in class org.apache.iceberg.flink.CatalogLoader.RESTCatalogLoader
- clone() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- clone() - Method in interface org.apache.iceberg.flink.TableLoader
-
Clone a TableLoader.
- clone() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- clone(RowData, RowData, RowType, TypeSerializer[]) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
-
Deprecated. Will be removed in 1.7.0; not reusing FieldGetter in this method could lead to performance degradation, use RowDataUtil.clone(RowData, RowData, RowType, TypeSerializer[], RowData.FieldGetter[]) instead.
- clone(RowData, RowData, RowType, TypeSerializer[], RowData.FieldGetter[]) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
-
Similar to the private RowDataSerializer.copyRowData(RowData, RowData) method.
- close() - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager.CommitService
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- close() - Method in class org.apache.iceberg.aliyun.oss.OSSOutputStream
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ArrowVectorAccessor
- close() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Called to close all the columns in this batch.
- close() - Method in class org.apache.iceberg.arrow.vectorized.ColumnVector
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- close() - Method in class org.apache.iceberg.arrow.vectorized.VectorizedTableScanIterable
- close() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- close() - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbLockManager
- close() - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- close() - Method in class org.apache.iceberg.aws.s3.S3FileIO
- close() - Method in class org.apache.iceberg.aws.util.RetryDetector
- close() - Method in class org.apache.iceberg.BaseMetastoreCatalog
- close() - Method in class org.apache.iceberg.ClientPoolImpl
- close() - Method in class org.apache.iceberg.connect.data.SinkWriter
- close() - Method in class org.apache.iceberg.deletes.EqualityDeleteWriter
- close() - Method in class org.apache.iceberg.deletes.FileScopedPositionDeleteWriter
- close() - Method in class org.apache.iceberg.deletes.PositionDeleteWriter
- close() - Method in class org.apache.iceberg.deletes.SortingPositionOnlyDeleteWriter
- close() - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- close() - Method in class org.apache.iceberg.dell.ecs.EcsFileIO
- close() - Method in class org.apache.iceberg.encryption.AesGcmInputStream
- close() - Method in class org.apache.iceberg.encryption.AesGcmOutputStream
- close() - Method in class org.apache.iceberg.encryption.EncryptingFileIO
- close() - Method in class org.apache.iceberg.flink.FlinkCatalog
- close() - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssigner
-
Some assigners may need to perform certain actions when their corresponding enumerators are closed
- close() - Method in class org.apache.iceberg.flink.source.DataIterator
- close() - Method in class org.apache.iceberg.flink.source.enumerator.StaticIcebergEnumerator
- close() - Method in class org.apache.iceberg.flink.source.enumerator.ContinuousIcebergEnumerator
- close() - Method in class org.apache.iceberg.flink.source.enumerator.ContinuousSplitPlannerImpl
- close() - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- close() - Method in class org.apache.iceberg.flink.source.StreamingMonitorFunction
- close() - Method in class org.apache.iceberg.flink.source.StreamingReaderOperator
- close() - Method in class org.apache.iceberg.flink.TableLoader.CatalogTableLoader
- close() - Method in class org.apache.iceberg.flink.TableLoader.HadoopTableLoader
- close() - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- close() - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- close() - Method in class org.apache.iceberg.inmemory.InMemoryCatalog
- close() - Method in class org.apache.iceberg.inmemory.InMemoryFileIO
- close() - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- close() - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
- close() - Method in class org.apache.iceberg.io.BaseTaskWriter.RollingEqDeleteWriter
- close() - Method in class org.apache.iceberg.io.CloseableGroup
-
Close all the registered resources.
- close() - Method in class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- close() - Method in class org.apache.iceberg.io.DataWriter
- close() - Method in class org.apache.iceberg.io.FanoutPositionOnlyDeleteWriter
- close() - Method in interface org.apache.iceberg.io.FileIO
-
Close File IO to release underlying resources.
- close() - Method in class org.apache.iceberg.io.FilterIterator
- close() - Method in class org.apache.iceberg.io.PartitionedFanoutWriter
- close() - Method in class org.apache.iceberg.io.PartitionedWriter
- close() - Method in class org.apache.iceberg.io.ResolvingFileIO
- close() - Method in class org.apache.iceberg.io.RollingDataWriter
- close() - Method in class org.apache.iceberg.io.UnpartitionedWriter
- close() - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- close() - Method in class org.apache.iceberg.ManifestWriter
- close() - Method in interface org.apache.iceberg.metrics.MetricsReporter
- close() - Method in interface org.apache.iceberg.metrics.Timer.Timed
- close() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- close() - Method in class org.apache.iceberg.nessie.NessieCatalog
- close() - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- close() - Method in class org.apache.iceberg.orc.VectorizedRowBatchIterator
- close() - Method in class org.apache.iceberg.parquet.ParquetWriteAdapter
-
Deprecated.
- close() - Method in interface org.apache.iceberg.parquet.VectorizedReader
-
Release any resources allocated.
- close() - Method in class org.apache.iceberg.pig.IcebergPigInputFormat.IcebergRecordReader
- close() - Method in class org.apache.iceberg.puffin.PuffinReader
- close() - Method in class org.apache.iceberg.puffin.PuffinWriter
- close() - Method in class org.apache.iceberg.rest.HTTPClient
- close() - Method in class org.apache.iceberg.rest.RESTCatalog
- close() - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- close() - Method in class org.apache.iceberg.RollingManifestWriter
- close() - Method in class org.apache.iceberg.snowflake.SnowflakeCatalog
- close() - Method in class org.apache.iceberg.spark.data.vectorized.DeletedColumnVector
- close() - Method in class org.apache.iceberg.spark.data.vectorized.IcebergArrowColumnVector
- close() - Method in class org.apache.iceberg.spark.data.vectorized.RowPositionColumnVector
- close() - Method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- close() - Method in class org.apache.iceberg.spark.source.SerializableTableWithSize
- close() - Method in class org.apache.iceberg.spark.source.SerializableTableWithSize.SerializableMetadataTableWithSize
- close() - Method in class org.apache.iceberg.util.LockManagers.BaseLockManager
- close(C) - Method in class org.apache.iceberg.ClientPoolImpl
- close(Closeable, boolean) - Static method in class org.apache.iceberg.util.Exceptions
- close(Connection) - Method in class org.apache.iceberg.jdbc.JdbcClientPool
- close(Collection<TopicPartition>) - Method in class org.apache.iceberg.connect.IcebergSinkTask
- close(IMetaStoreClient) - Method in class org.apache.iceberg.hive.HiveClientPool
- CloseableGroup - Class in org.apache.iceberg.io
-
This class acts as a helper for handling the closure of multiple resources.
- CloseableGroup() - Constructor for class org.apache.iceberg.io.CloseableGroup
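A small sketch of the CloseableGroup helper described above, assuming fileIO and reader are Closeable resources created earlier and a surrounding method that may throw IOException:

    import org.apache.iceberg.io.CloseableGroup;

    CloseableGroup closeables = new CloseableGroup();
    closeables.addCloseable(fileIO);            // register resources in the order they were created
    closeables.addCloseable(reader);
    closeables.setSuppressCloseFailure(true);   // log close() failures instead of rethrowing them

    // ... use the resources ...

    closeables.close();                         // closes everything that was registered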
- CloseableIterable<T> - Interface in org.apache.iceberg.io
- CloseableIterable.ConcatCloseableIterable<E> - Class in org.apache.iceberg.io
- CloseableIterator<T> - Interface in org.apache.iceberg.io
- closeService() - Method in class org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure
-
Closes this procedure's executor service if a new one was created with BaseProcedure.executorService(int, String).
- closeVectors() - Method in class org.apache.iceberg.arrow.vectorized.BaseBatchReader
- ClosingIterator<T> - Class in org.apache.iceberg.io
-
A convenience wrapper around CloseableIterator, providing auto-close functionality when all of the elements in the iterator are consumed.
- ClosingIterator(CloseableIterator<T>) - Constructor for class org.apache.iceberg.io.ClosingIterator
- clusterBy(Function<DataFile, Object>) - Method in class org.apache.iceberg.BaseRewriteManifests
- clusterBy(Function<DataFile, Object>) - Method in interface org.apache.iceberg.RewriteManifests
-
Groups an existing DataFile by a cluster key produced by a function.
- ClusteredDataWriter<T> - Class in org.apache.iceberg.io
-
A data writer capable of writing to multiple specs and partitions that requires the incoming records to be properly clustered by partition spec and by partition within each spec.
- ClusteredDataWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredDataWriter
- ClusteredEqualityDeleteWriter<T> - Class in org.apache.iceberg.io
-
An equality delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredEqualityDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredEqualityDeleteWriter
- ClusteredPositionDeleteWriter<T> - Class in org.apache.iceberg.io
-
A position delete writer capable of writing to multiple specs and partitions that requires the incoming delete records to be properly clustered by partition spec and by partition within each spec.
- ClusteredPositionDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long) - Constructor for class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- ClusteredPositionDeleteWriter(FileWriterFactory<T>, OutputFileFactory, FileIO, long, DeleteGranularity) - Constructor for class org.apache.iceberg.io.ClusteredPositionDeleteWriter
- clusterHadoopConf() - Static method in class org.apache.iceberg.flink.FlinkCatalogFactory
- code() - Method in class org.apache.iceberg.rest.responses.ErrorResponse
- codecName() - Method in enum class org.apache.iceberg.puffin.PuffinCompressionCodec
- coercePartition(Types.StructType, PartitionSpec, StructLike) - Static method in class org.apache.iceberg.util.PartitionUtil
- collect() - Method in class org.apache.iceberg.FindFiles.Builder
-
Returns all files in the table that match all of the filters.
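The collect() entry above pairs with the FindFiles.Builder(Table) constructor and caseInsensitive() listed earlier in this index; a minimal sketch, assuming an existing table:

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.FindFiles;
    import org.apache.iceberg.io.CloseableIterable;

    CloseableIterable<DataFile> files = new FindFiles.Builder(table)
        .caseInsensitive()   // resolve column names in filter expressions without case sensitivity
        .collect();          // all data files matching the configured filters

    files.forEach(file -> System.out.println(file.path()));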
- collections(int, int, ParquetValueWriter<E>) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- column - Variable in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column - Variable in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- column() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- column() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- column(int) - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Returns the column at `ordinal`.
- COLUMN_SIZES - Static variable in interface org.apache.iceberg.DataFile
- columnAliases() - Method in class org.apache.iceberg.spark.source.SparkView
- ColumnarBatch - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnarBatch.
- ColumnarBatchReader - Class in org.apache.iceberg.spark.data.vectorized
-
VectorizedReader that returns Spark's ColumnarBatch to support Spark's vectorized read path.
- ColumnarBatchReader(List<VectorizedReader<?>>) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnarBatchReader
- columnComments() - Method in class org.apache.iceberg.spark.source.SparkView
- columnIsDeletedPosition() - Method in class org.apache.iceberg.data.DeleteFilter
- ColumnIterator<T> - Class in org.apache.iceberg.parquet
- columnMode(String) - Method in class org.apache.iceberg.MetricsConfig
- columnName() - Method in class org.apache.iceberg.expressions.BoundAggregate
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.PrimitiveReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedKeyValueReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.RepeatedReader
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueReaders.StructReader
- columns() - Method in interface org.apache.iceberg.parquet.ParquetValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.PrimitiveWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedKeyValueWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.RepeatedWriter
- columns() - Method in class org.apache.iceberg.parquet.ParquetValueWriters.StructWriter
- columns() - Method in class org.apache.iceberg.Schema
-
Returns a List of the columns in this Schema.
- columnSizes() - Method in interface org.apache.iceberg.ContentFile
-
Returns, if collected, a map from column ID to the size of the column in bytes; null otherwise.
- columnSizes() - Method in class org.apache.iceberg.Metrics
-
Get the number of bytes for all fields in a file.
- columnSizes() - Method in class org.apache.iceberg.spark.SparkContentFile
- ColumnStatsWatermarkExtractor - Class in org.apache.iceberg.flink.source.reader
-
SplitWatermarkExtractor implementation which uses Iceberg timestamp column statistics to get the watermarks for the IcebergSourceSplit.
- ColumnStatsWatermarkExtractor(Schema, String, TimeUnit) - Constructor for class org.apache.iceberg.flink.source.reader.ColumnStatsWatermarkExtractor
-
Creates the extractor.
- columnsToKeepStats() - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- ColumnVector - Class in org.apache.iceberg.arrow.vectorized
-
This class is inspired by Spark's ColumnVector.
- ColumnVectorWithFilter - Class in org.apache.iceberg.spark.data.vectorized
- ColumnVectorWithFilter(VectorHolder, int[]) - Constructor for class org.apache.iceberg.spark.data.vectorized.ColumnVectorWithFilter
- ColumnWriter<T> - Class in org.apache.iceberg.parquet
- combine(Iterable<E>, Closeable) - Static method in interface org.apache.iceberg.io.CloseableIterable
- combine(FileIO, EncryptionManager) - Static method in class org.apache.iceberg.encryption.EncryptingFileIO
- combine(MetricsReporter, MetricsReporter) - Static method in class org.apache.iceberg.metrics.MetricsReporters
- CombinedScanTask - Interface in org.apache.iceberg
-
A scan task made of several ranges from files.
- COMMA_JOINER - Static variable in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- COMMA_SPLITTER - Static variable in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- COMMENT - Static variable in class org.apache.iceberg.view.ViewProperties
- commit() - Method in class org.apache.iceberg.BaseReplaceSortOrder
- commit() - Method in interface org.apache.iceberg.PendingUpdate
-
Apply the pending changes and commit.
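For context, a PendingUpdate is usually obtained from a Table and applied atomically by calling commit(); the following is a minimal sketch (the loaded table handle and the chosen property/column names are illustrative assumptions):

    import org.apache.iceberg.Table;
    import org.apache.iceberg.types.Types;

    public class PendingUpdateSketch {
      // Applies two independent pending updates to an already-loaded table.
      static void tuneAndEvolve(Table table) {
        // UpdateProperties is a PendingUpdate; nothing changes until commit() is called.
        table.updateProperties()
            .set("commit.retry.num-retries", "6")
            .commit();

        // UpdateSchema is also a PendingUpdate and commits independently.
        table.updateSchema()
            .addColumn("ingested_at", Types.TimestampType.withZone())
            .commit();
      }
    }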
- commit() - Method in class org.apache.iceberg.SetLocation
- commit() - Method in class org.apache.iceberg.SetPartitionStatistics
- commit() - Method in class org.apache.iceberg.SetStatistics
- commit() - Method in class org.apache.iceberg.SnapshotManager
- commit() - Method in class org.apache.iceberg.BaseRewriteManifests
- commit(Set<RewritePositionDeletesGroup>) - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager
-
Perform a commit operation on the table, adding and removing files as required for this set of file groups.
- commit(SnapshotUpdate<?>) - Method in class org.apache.iceberg.actions.BaseRewriteDataFilesAction
- commit(SnapshotUpdate<?>) - Method in class org.apache.iceberg.spark.actions.RewriteDataFilesSparkAction
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.BaseTransaction.TransactionTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- commit(TableMetadata, TableMetadata) - Method in class org.apache.iceberg.StaticTableOperations
- commit(TableMetadata, TableMetadata) - Method in interface org.apache.iceberg.TableOperations
-
Replace the base table metadata with a new version.
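As a hedged sketch of the optimistic pattern this contract implies: a commit against a stale base fails with CommitFailedException, and the caller refreshes and retries. Building the updated TableMetadata is intentionally left out of scope here and assumed to happen elsewhere.

    import org.apache.iceberg.TableMetadata;
    import org.apache.iceberg.TableOperations;
    import org.apache.iceberg.exceptions.CommitFailedException;

    public class MetadataCommitSketch {
      // Attempts one optimistic commit; returns true on success, false if the base was stale.
      static boolean tryCommit(TableOperations ops, TableMetadata updated) {
        TableMetadata base = ops.current();
        try {
          ops.commit(base, updated);  // atomically replaces base with updated
          return true;
        } catch (CommitFailedException e) {
          ops.refresh();              // reload the latest metadata before rebuilding and retrying
          return false;
        }
      }
    }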
- commit(ViewMetadata, ViewMetadata) - Method in class org.apache.iceberg.view.BaseViewOperations
- commit(ViewMetadata, ViewMetadata) - Method in interface org.apache.iceberg.view.ViewOperations
-
Replace the base view metadata with a new version.
- commit(Offset) - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- COMMIT_COMPLETE - Enum constant in enum class org.apache.iceberg.connect.events.PayloadType
-
Maps to payload of type CommitComplete.
- COMMIT_FILE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_FILE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_MAX_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MAX_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_MIN_RETRY_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_RETRIES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_NUM_STATUS_CHECKS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_REPORT - Enum constant in enum class org.apache.iceberg.rest.requests.ReportMetricsRequest.ReportType
- COMMIT_SNAPSHOT_ID - Static variable in class org.apache.iceberg.MetadataColumns
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MAX_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_MIN_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_STATUS_CHECKS_TOTAL_WAIT_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TABLE_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TABLE_THREAD_POOL_SIZE_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- COMMIT_TO_TABLE - Enum constant in enum class org.apache.iceberg.connect.events.PayloadType
-
Maps to payload of type CommitToTable.
- COMMIT_TOTAL_RETRY_TIME_MS - Static variable in class org.apache.iceberg.TableProperties
- COMMIT_TOTAL_RETRY_TIME_MS_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- commitBranch() - Method in class org.apache.iceberg.connect.TableSinkConfig
- commitChanges(Table, String, String, String, List<TableChange>, List<TableChange>) - Static method in class org.apache.iceberg.flink.util.FlinkAlterTableUtil
- commitChanges(Table, String, String, String, Map<String, String>) - Static method in class org.apache.iceberg.flink.util.FlinkAlterTableUtil
- CommitComplete - Class in org.apache.iceberg.connect.events
-
A control event payload for events sent by a coordinator that indicates it has completed a commit cycle.
- CommitComplete(UUID, OffsetDateTime) - Constructor for class org.apache.iceberg.connect.events.CommitComplete
- CommitComplete(Schema) - Constructor for class org.apache.iceberg.connect.events.CommitComplete
- commitCreateTable(Table) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- commitDropTable(Table, boolean) - Method in class org.apache.iceberg.mr.hive.HiveIcebergMetaHook
- CommitFailedException - Exception in org.apache.iceberg.exceptions
-
Exception raised when a commit fails because of out of date metadata.
- CommitFailedException(String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- CommitFailedException(Throwable) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- CommitFailedException(Throwable, String, Object...) - Constructor for exception org.apache.iceberg.exceptions.CommitFailedException
- commitFileGroups(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
-
Perform a commit operation on the table, adding and removing files as required for this set of file groups.
- commitId() - Method in class org.apache.iceberg.connect.events.CommitComplete
- commitId() - Method in class org.apache.iceberg.connect.events.CommitToTable
- commitId() - Method in class org.apache.iceberg.connect.events.DataComplete
- commitId() - Method in class org.apache.iceberg.connect.events.DataWritten
- commitId() - Method in class org.apache.iceberg.connect.events.StartCommit
- commitIntervalMs() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- commitJob(JobContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Reads the commit files stored in the temp directories and collects the generated committed data files.
- commitManageSnapshots(Table, String, String) - Static method in class org.apache.iceberg.flink.util.FlinkAlterTableUtil
- CommitMetadata - Class in org.apache.iceberg.spark
-
Utility class to accept thread-local commit properties.
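A hedged sketch of how thread-local commit properties might be attached to a write. The withCommitProperties(Map, Callable, Class) entry point and its parameter order are assumptions; verify the exact signature against your Iceberg version, and the property key and job callback are illustrative.

    import java.util.Collections;
    import org.apache.iceberg.spark.CommitMetadata;

    public class CommitMetadataSketch {
      // Runs a write callback with an extra summary property attached to the resulting commit.
      static void writeWithAuditInfo(Runnable sparkWrite) {
        // Assumed signature: withCommitProperties(Map<String, String>, Callable<R>, Class<E>)
        CommitMetadata.withCommitProperties(
            Collections.singletonMap("writer-id", "nightly-job"),
            () -> {
              sparkWrite.run(); // e.g. df.writeTo("catalog.db.tbl").append()
              return null;
            },
            RuntimeException.class);
      }
    }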
- commitMetrics() - Method in interface org.apache.iceberg.metrics.CommitReport
- commitMetrics() - Method in class org.apache.iceberg.BaseRewriteManifests
- CommitMetrics - Class in org.apache.iceberg.metrics
-
Carries all metrics for a particular commit.
- CommitMetrics() - Constructor for class org.apache.iceberg.metrics.CommitMetrics
- CommitMetricsResult - Interface in org.apache.iceberg.metrics
-
A serializable version of CommitMetrics that carries its results.
- commitOrClean(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager
- commitOrClean(Set<RewriteFileGroup>) - Method in class org.apache.iceberg.actions.RewriteDataFilesCommitManager.CommitService
- commitOrClean(Set<RewritePositionDeletesGroup>) - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager
- commitOrClean(Set<RewritePositionDeletesGroup>) - Method in class org.apache.iceberg.actions.RewritePositionDeletesCommitManager.CommitService
- commitProperties() - Static method in class org.apache.iceberg.spark.CommitMetadata
- CommitReport - Interface in org.apache.iceberg.metrics
-
A commit report that contains all relevant information from a Snapshot.
- CommitReportParser - Class in org.apache.iceberg.metrics
- commitSnapshotId() - Method in interface org.apache.iceberg.ChangelogScanTask
-
Returns the snapshot ID in which the changes were committed.
- commitStagedChanges() - Method in class org.apache.iceberg.spark.RollbackStagedTable
- commitStagedChanges() - Method in class org.apache.iceberg.spark.source.StagedSparkTable
- CommitStateUnknownException - Exception in org.apache.iceberg.exceptions
-
Exception for a failure to confirm either affirmatively or negatively that a commit was applied.
- CommitStateUnknownException(String, Throwable) - Constructor for exception org.apache.iceberg.exceptions.CommitStateUnknownException
- CommitStateUnknownException(Throwable) - Constructor for exception org.apache.iceberg.exceptions.CommitStateUnknownException
- commitSummary() - Method in class org.apache.iceberg.spark.actions.RewriteDataFilesSparkAction
- commitTable(TableMetadata, TableMetadata, String, String, ContentKey) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- commitTask(TaskAttemptContext) - Method in class org.apache.iceberg.mr.hive.HiveIcebergOutputCommitter
-
Collects the generated data files and creates a commit file storing the data file list.
- Committer - Interface in org.apache.iceberg.connect
- CommitterImpl - Class in org.apache.iceberg.connect.channel
- CommitterImpl() - Constructor for class org.apache.iceberg.connect.channel.CommitterImpl
- commitThreads() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- commitTimeoutMs() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- CommitToTable - Class in org.apache.iceberg.connect.events
-
A control event payload for events sent by a coordinator that indicates it has completed a commit cycle.
- CommitToTable(UUID, TableReference, Long, OffsetDateTime) - Constructor for class org.apache.iceberg.connect.events.CommitToTable
- CommitToTable(Schema) - Constructor for class org.apache.iceberg.connect.events.CommitToTable
- commitTransaction() - Method in class org.apache.iceberg.BaseTransaction
- commitTransaction() - Method in class org.apache.iceberg.rest.ResourcePaths
- commitTransaction() - Method in interface org.apache.iceberg.Transaction
-
Apply the pending changes from all actions and commit.
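For example, a multi-operation transaction stages each action's commit() and makes everything visible only at commitTransaction(); a minimal sketch, assuming a pre-built DataFile and an illustrative property name:

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.Transaction;

    public class TransactionSketch {
      // Appends a file and updates a property in one atomic commit.
      static void appendAndTag(Table table, DataFile dataFile) {
        Transaction txn = table.newTransaction();
        txn.newAppend().appendFile(dataFile).commit();               // staged, not yet visible
        txn.updateProperties().set("last-job", "backfill").commit(); // also staged
        txn.commitTransaction();                                     // single atomic commit
      }
    }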
- commitTransaction(List<TableCommit>) - Method in class org.apache.iceberg.rest.RESTCatalog
- commitTransaction(SessionCatalog.SessionContext, List<TableCommit>) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- commitTransaction(TableCommit...) - Method in class org.apache.iceberg.rest.RESTCatalog
- CommitTransactionRequest - Class in org.apache.iceberg.rest.requests
- CommitTransactionRequest(List<UpdateTableRequest>) - Constructor for class org.apache.iceberg.rest.requests.CommitTransactionRequest
- CommitTransactionRequestDeserializer() - Constructor for class org.apache.iceberg.rest.RESTSerializers.CommitTransactionRequestDeserializer
- CommitTransactionRequestParser - Class in org.apache.iceberg.rest.requests
- CommitTransactionRequestSerializer() - Constructor for class org.apache.iceberg.rest.RESTSerializers.CommitTransactionRequestSerializer
- commitView(ViewMetadata, ViewMetadata, String, String, ContentKey) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- comparator() - Method in interface org.apache.iceberg.expressions.BoundTerm
-
Returns a Comparator for values produced by this term.
- comparator() - Method in interface org.apache.iceberg.expressions.Literal
-
Return a Comparator for values.
- comparator(RewriteJobOrder) - Static method in class org.apache.iceberg.actions.RewriteFileGroup
- comparator(RewriteJobOrder) - Static method in class org.apache.iceberg.actions.RewritePositionDeletesGroup
- Comparators - Class in org.apache.iceberg.types
- compareTo(Offset) - Method in class org.apache.iceberg.connect.data.Offset
- compareToFileList(Dataset<Row>) - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- CompatibilityTaskAttemptContextImpl(Configuration, TaskAttemptID, Reporter) - Constructor for class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat.CompatibilityTaskAttemptContextImpl
- compatibleWith(PartitionSpec) - Method in class org.apache.iceberg.PartitionSpec
-
Returns true if this spec is equivalent to the other, with partition field ids ignored.
- complete() - Method in class org.apache.iceberg.io.BaseTaskWriter
- complete() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data and delete files.
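A sketch of the typical lifecycle, assuming a TaskWriter<Record> obtained from one of the writer factories: write rows, call complete(), then commit the produced data files; how the writer is constructed is out of scope here.

    import java.io.IOException;
    import org.apache.iceberg.AppendFiles;
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.data.Record;
    import org.apache.iceberg.io.TaskWriter;
    import org.apache.iceberg.io.WriteResult;

    public class TaskWriterSketch {
      // Writes the rows, closes out the task, and appends the resulting files to the table.
      static void writeAndCommit(Table table, TaskWriter<Record> writer, Iterable<Record> rows)
          throws IOException {
        try {
          for (Record row : rows) {
            writer.write(row);
          }
          WriteResult result = writer.complete(); // closes the writer and returns completed files
          AppendFiles append = table.newAppend();
          for (DataFile file : result.dataFiles()) {
            append.appendFile(file);
          }
          append.commit();
        } catch (IOException e) {
          writer.abort(); // delete any files the task already produced
          throw e;
        }
      }
    }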
- COMPLETED - Enum constant in enum class org.apache.iceberg.flink.source.split.IcebergSourceSplitStatus
- completeWrite() - Method in class org.apache.iceberg.connect.data.SinkWriter
- compressBlobs(PuffinCompressionCodec) - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Configures the writer to compress the blobs.
- compressFooter() - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Configures the writer to compress the footer.
- COMPRESSION_CODEC - Static variable in class org.apache.iceberg.flink.FlinkWriteOptions
- COMPRESSION_CODEC - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- COMPRESSION_CODEC - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.flink.FlinkWriteOptions
- COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- COMPRESSION_STRATEGY - Static variable in class org.apache.iceberg.flink.FlinkWriteOptions
- COMPRESSION_STRATEGY - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- COMPRESSION_STRATEGY - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- compressionCodec() - Method in class org.apache.iceberg.puffin.BlobMetadata
- computeIfAbsent(int, StructLike, Supplier<V>) - Method in class org.apache.iceberg.util.PartitionMap
- computeIfAbsent(CharSequence, Supplier<V>) - Method in class org.apache.iceberg.util.CharSequenceMap
- ComputeUpdateIterator - Class in org.apache.iceberg.spark
-
An iterator that finds delete/insert rows which represent an update, and converts them into update records from changelog tables within a single Spark task.
- computeUpdates(Iterator<Row>, StructType, String[]) - Static method in class org.apache.iceberg.spark.ChangelogIterator
-
Creates an iterator composing RemoveCarryoverIterator and ComputeUpdateIterator to remove carry-over rows and compute update rows.
- concat(Class<T>, T[]...) - Static method in class org.apache.iceberg.util.ArrayUtil
- concat(Iterable<File>, File, int, Schema, Map<String, String>) - Static method in class org.apache.iceberg.parquet.Parquet
-
Combines several files into one.
- concat(Iterable<CloseableIterable<E>>) - Static method in interface org.apache.iceberg.io.CloseableIterable
- conf() - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- conf() - Method in class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- config() - Method in class org.apache.iceberg.connect.IcebergSinkConnector
- config() - Method in class org.apache.iceberg.rest.auth.OAuth2Util.AuthSession
- config() - Static method in class org.apache.iceberg.rest.ResourcePaths
- config() - Method in class org.apache.iceberg.rest.responses.LoadTableResponse
- config() - Method in interface org.apache.iceberg.rest.responses.LoadViewResponse
- config(String, String) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- CONFIG - Static variable in class org.apache.iceberg.flink.actions.Actions
- CONFIG_DEF - Static variable in class org.apache.iceberg.connect.IcebergSinkConfig
- CONFIG_SERIALIZATION_DISABLED - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- CONFIG_SERIALIZATION_DISABLED_DEFAULT - Static variable in class org.apache.iceberg.mr.InputFormatConfig
- ConfigBuilder(Configuration) - Constructor for class org.apache.iceberg.mr.InputFormatConfig.ConfigBuilder
- ConfigProperties - Class in org.apache.iceberg.hadoop
- ConfigResponse - Class in org.apache.iceberg.rest.responses
-
Represents a response to requesting server-side provided configuration for the REST catalog.
- ConfigResponse() - Constructor for class org.apache.iceberg.rest.responses.ConfigResponse
- ConfigResponse.Builder - Class in org.apache.iceberg.rest.responses
- ConfigResponseParser - Class in org.apache.iceberg.rest.responses
- Configurable<C> - Interface in org.apache.iceberg.hadoop
-
Interface used to avoid runtime dependencies on Hadoop Configurable.
- configure(Configuration) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- configure(JobConf) - Static method in class org.apache.iceberg.mr.mapred.MapredIcebergInputFormat
-
Configures the JobConf to use the MapredIcebergInputFormat and returns a helper to add further configuration.
- configure(Job) - Static method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
-
Configures the Job to use the IcebergInputFormat and returns a helper to add further configuration.
- configureDataWrite(Avro.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(ORC.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureDataWrite(Parquet.DataWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEndpoint(T, String) - Static method in class org.apache.iceberg.aws.AwsClientFactories
-
Deprecated. Not for public use. To configure the endpoint for a client, please use S3FileIOProperties.applyEndpointConfigurations(S3ClientBuilder), AwsProperties.applyGlueEndpointConfigurations(GlueClientBuilder), or AwsProperties.applyDynamoDbEndpointConfigurations(DynamoDbClientBuilder) accordingly. It will be removed in 2.0.0.
- configureEqualityDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureEqualityDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureHadoopConf(Object, Object) - Static method in class org.apache.iceberg.CatalogUtil
-
Dynamically detects whether an object is a Hadoop Configurable and calls setConf.
- configureHttpClientBuilder(String) - Static method in class org.apache.iceberg.aws.AwsClientFactories
-
Deprecated. Not for public use. To configure the httpClient for a client, please use HttpClientProperties.applyHttpClientConfigurations(AwsSyncClientBuilder). It will be removed in 2.0.0.
- configureInputJobCredentials(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureInputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureJobConf(TableDesc, JobConf) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configureOutputJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- configurePositionDelete(Avro.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(ORC.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configurePositionDelete(Parquet.DeleteWriteBuilder) - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- configureTableJobProperties(TableDesc, Map<String, String>) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- conflictDetectionFilter(Expression) - Method in class org.apache.iceberg.BaseOverwriteFiles
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.OverwriteFiles
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
- conflictDetectionFilter(Expression) - Method in interface org.apache.iceberg.RowDelta
-
Sets a conflict detection filter used to validate concurrently added data and delete files.
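A sketch of how a conflict detection filter is typically supplied on a row-level commit; building the data/delete files and tracking the starting snapshot are assumed to happen elsewhere, and the partition column and value are illustrative.

    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.DeleteFile;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expression;
    import org.apache.iceberg.expressions.Expressions;

    public class RowDeltaSketch {
      // Commits new data and deletes, validating only the touched partition for conflicts.
      static void commitDelta(Table table, long startSnapshotId, DataFile added, DeleteFile deletes) {
        Expression touched = Expressions.equal("event_date", "2024-01-01");
        table.newRowDelta()
            .addRows(added)
            .addDeletes(deletes)
            .validateFromSnapshot(startSnapshotId)
            .conflictDetectionFilter(touched)   // limit conflict validation to matching files
            .validateNoConflictingDataFiles()
            .commit();
      }
    }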
- connectGroupId() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- connectorName() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ExpressionContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringArrayContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformArgumentContext
- constant(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringArrayContext
- constant(int) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StringMapContext
- constant(C) - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- constant(C, int) - Static method in class org.apache.iceberg.parquet.ParquetValueReaders
- constant(T) - Static method in class org.apache.iceberg.avro.ValueReaders
- ConstantContext() - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- ConstantContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- constantHolder(Types.NestedField, int, T) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- constants(C) - Static method in class org.apache.iceberg.orc.OrcValueReaders
- constantsMap(ContentScanTask<?>) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(ContentScanTask<?>, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- constantsMap(ContentScanTask<?>, Schema) - Method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- constantsMap(ContentScanTask<?>, Types.StructType, BiFunction<Type, Object, Object>) - Static method in class org.apache.iceberg.util.PartitionUtil
- ConstantVectorHolder(int) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorHolder(Types.NestedField, int, T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.ConstantVectorHolder
- ConstantVectorReader(Types.NestedField, T) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.ConstantVectorReader
- constrained() - Static method in class org.apache.iceberg.flink.source.assigner.GetSplitResult
- CONSTRAINED - Enum constant in enum class org.apache.iceberg.flink.source.assigner.GetSplitResult.Status
-
There are pending splits.
- Container<T> - Class in org.apache.iceberg.mr.mapred
-
A simple container of objects that you can get and set.
- Container() - Constructor for class org.apache.iceberg.mr.mapred.Container
- contains(int, StructLike) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.CharSequenceSet
- contains(Object) - Method in class org.apache.iceberg.util.PartitionSet
- contains(Object) - Method in class org.apache.iceberg.util.StructLikeSet
- contains(String) - Method in class org.apache.iceberg.spark.SparkTableCache
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.CharSequenceSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.PartitionSet
- containsAll(Collection<?>) - Method in class org.apache.iceberg.util.StructLikeSet
- containsKey(int, StructLike) - Method in class org.apache.iceberg.util.PartitionMap
- containsKey(Object) - Method in class org.apache.iceberg.util.CharSequenceMap
- containsKey(Object) - Method in class org.apache.iceberg.util.PartitionMap
- containsKey(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsKey(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- containsNaN() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNaN() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one file in the manifest has a NaN value for the field.
- containsNull() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- containsNull() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Returns true if at least one file in the manifest has a null value for the field.
- containsValue(Object) - Method in class org.apache.iceberg.util.CharSequenceMap
- containsValue(Object) - Method in class org.apache.iceberg.util.PartitionMap
- containsValue(Object) - Method in class org.apache.iceberg.util.SerializableMap
- containsValue(Object) - Method in class org.apache.iceberg.util.StructLikeMap
- content() - Method in interface org.apache.iceberg.ContentFile
-
Returns type of content stored in the file; one of DATA, POSITION_DELETES, or EQUALITY_DELETES.
- content() - Method in interface org.apache.iceberg.DataFile
- content() - Method in class org.apache.iceberg.GenericManifestFile
- content() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the content stored in the manifest; either DATA or DELETES.
- content() - Method in class org.apache.iceberg.ManifestWriter
- content() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- content() - Method in class org.apache.iceberg.spark.SparkContentFile
- CONTENT - Static variable in interface org.apache.iceberg.DataFile
- ContentCache - Class in org.apache.iceberg.io
-
Class that provides file-content caching during reading.
- ContentCache(long, long, long) - Constructor for class org.apache.iceberg.io.ContentCache
-
Constructor for ContentCache class.
- ContentFile<F> - Interface in org.apache.iceberg
-
Superinterface of DataFile and DeleteFile that exposes common methods.
- contentFileDS(Table) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- contentFileDS(Table, Set<Long>) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- ContentFileUtil - Class in org.apache.iceberg.util
- ContentScanTask<F extends ContentFile<F>> - Interface in org.apache.iceberg
-
A scan task over a range of bytes in a content file.
- context() - Method in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- ContinuousIcebergEnumerator - Class in org.apache.iceberg.flink.source.enumerator
- ContinuousIcebergEnumerator(SplitEnumeratorContext<IcebergSourceSplit>, SplitAssigner, ScanContext, ContinuousSplitPlanner, IcebergEnumeratorState) - Constructor for class org.apache.iceberg.flink.source.enumerator.ContinuousIcebergEnumerator
- ContinuousSplitPlanner - Interface in org.apache.iceberg.flink.source.enumerator
-
This interface is introduced so that we can plug in different split planners for unit tests.
- ContinuousSplitPlannerImpl - Class in org.apache.iceberg.flink.source.enumerator
- ContinuousSplitPlannerImpl(TableLoader, ScanContext, String) - Constructor for class org.apache.iceberg.flink.source.enumerator.ContinuousSplitPlannerImpl
- controlTopic() - Method in class org.apache.iceberg.connect.IcebergSinkConfig
- Conversions - Class in org.apache.iceberg.types
- convert(byte[]) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(byte[], int) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- convert(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- convert(Object) - Method in interface org.apache.iceberg.mr.hive.serde.objectinspector.WriteObjectInspector
- convert(ByteBuffer) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(List<String>, List<TypeInfo>, List<String>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<String>, List<TypeInfo>, List<String>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Hive list of column names and column types to an Iceberg schema.
- convert(List<FieldSchema>) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(List<FieldSchema>, boolean) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive schema (list of FieldSchema objects) to an Iceberg schema.
- convert(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convert(Schema) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Deprecated. Use FlinkSchemaUtil.convert(ResolvedSchema) instead.
- convert(ResolvedSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert the Flink table schema to an Apache Iceberg schema with column comments.
- convert(Expression) - Static method in class org.apache.iceberg.flink.FlinkFilters
-
Convert a Flink expression to an Iceberg expression.
- convert(LogicalType) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Flink type to a Type.
- convert(TypeInfo) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts a Hive typeInfo object to an Iceberg type.
- convert(Schema) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
-
Convert Iceberg schema to Arrow Schema.
- convert(Schema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Schema to a Flink type.
- convert(Schema) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts the Iceberg schema to a Hive schema (list of FieldSchema objects).
- convert(Schema) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.pig.SchemaUtil
- convert(Schema) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Schema to a Spark type.
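For example, the Iceberg and Spark schema representations can be converted in both directions; note that converting a Spark StructType back to an Iceberg Schema assigns new field ids. The table handle below is assumed to be loaded elsewhere.

    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.spark.SparkSchemaUtil;
    import org.apache.spark.sql.types.StructType;

    public class SchemaConversionSketch {
      static Schema roundTrip(Table table) {
        // Iceberg schema -> Spark StructType
        StructType sparkType = SparkSchemaUtil.convert(table.schema());

        // Spark StructType -> Iceberg schema (field ids are newly assigned)
        return SparkSchemaUtil.convert(sparkType);
      }
    }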
- convert(Schema, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, String) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
- convert(Schema, String) - Method in class org.apache.iceberg.parquet.TypeToMessageType
- convert(Schema, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Schema, TableSchema) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Flink TableSchema to a Schema based on the given schema.
- convert(Schema, Row) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- convert(Schema, StructType, boolean) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- convert(Type) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Type) - Static method in class org.apache.iceberg.flink.FlinkSchemaUtil
-
Convert a Type to a Flink type.
- convert(Type) - Static method in class org.apache.iceberg.hive.HiveSchemaUtil
-
Converts an Iceberg type to a Hive TypeInfo object.
- convert(Type) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Type to a Spark type.
- convert(Type, Object) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convert(Type, BiFunction<Integer, Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Type, Map<Types.StructType, String>) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(Types.NestedField) - Static method in class org.apache.iceberg.arrow.ArrowSchemaUtil
- convert(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convert(TypeDescription) - Static method in class org.apache.iceberg.orc.ORCSchemaUtil
-
Convert an ORC schema to an Iceberg schema.
- convert(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema.
- convert(AggregateFunc) - Static method in class org.apache.iceberg.spark.SparkAggregates
- convert(Predicate) - Static method in class org.apache.iceberg.spark.SparkV2Filters
- convert(Predicate[]) - Static method in class org.apache.iceberg.spark.SparkV2Filters
- convert(Filter) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(Filter[]) - Static method in class org.apache.iceberg.spark.SparkFilters
- convert(DataType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Type with new field ids.
- convert(StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema with new field ids.
- convertAndPrune(MessageType) - Static method in class org.apache.iceberg.parquet.ParquetSchemaUtil
-
Converts a Parquet schema to an Iceberg schema and prunes fields without IDs.
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.data.IdentityPartitionConverters
-
Conversions from internal representations to Iceberg generic values.
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.flink.data.RowDataUtil
- convertConstant(Type, Object) - Static method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- convertDeleteFiles(Iterable<DeleteFile>) - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteStrategy
-
Define how to convert the deletes.
- convertedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ConvertEqualityDeleteFiles.Result
-
Returns the count of the deletes that have been converted.
- ConvertEqualityDeleteFiles - Interface in org.apache.iceberg.actions
-
An action for converting the equality delete files to position delete files.
- ConvertEqualityDeleteFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
- ConvertEqualityDeleteStrategy - Interface in org.apache.iceberg.actions
-
A strategy for the action to convert equality delete to position deletes.
- convertToByteBuffer(UUID) - Static method in class org.apache.iceberg.util.UUIDUtil
- convertToByteBuffer(UUID, ByteBuffer) - Static method in class org.apache.iceberg.util.UUIDUtil
- convertToSpark(Type, Object) - Static method in class org.apache.iceberg.spark.SparkValueConverter
- convertTypes(Types.StructType, String) - Static method in class org.apache.iceberg.avro.AvroSchemaUtil
- convertWithFreshIds(Schema, StructType) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- convertWithFreshIds(Schema, StructType, boolean) - Static method in class org.apache.iceberg.spark.SparkSchemaUtil
-
Convert a Spark struct to a Schema based on the given schema.
- copy() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file.
- copy() - Method in class org.apache.iceberg.data.GenericRecord
- copy() - Method in interface org.apache.iceberg.data.Record
- copy() - Method in interface org.apache.iceberg.encryption.EncryptionKeyMetadata
- copy() - Method in class org.apache.iceberg.flink.IcebergTableSink
- copy() - Method in class org.apache.iceberg.flink.source.IcebergTableSource
- copy() - Method in class org.apache.iceberg.GenericManifestFile
- copy() - Method in class org.apache.iceberg.GenericPartitionFieldSummary
- copy() - Method in interface org.apache.iceberg.ManifestFile
-
Copies this manifest file.
- copy() - Method in interface org.apache.iceberg.ManifestFile.PartitionFieldSummary
-
Copies this summary.
- copy() - Method in class org.apache.iceberg.PartitionData
- copy() - Method in class org.apache.iceberg.PartitionKey
- copy() - Method in class org.apache.iceberg.SortKey
- copy() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- copy() - Method in class org.apache.iceberg.spark.actions.SetAccumulator
- copy() - Method in class org.apache.iceberg.spark.SparkContentFile
- copy(boolean) - Method in interface org.apache.iceberg.ContentFile
-
Copies this file (potentially without file stats).
- copy(F, boolean, Set<Integer>) - Static method in class org.apache.iceberg.util.ContentFileUtil
-
Copies the ContentFile with the specific stat settings.
- copy(String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(String, Object, String, Object, String, Object) - Method in interface org.apache.iceberg.data.Record
- copy(ByteBuffer) - Static method in class org.apache.iceberg.util.ByteBuffers
- copy(Map<String, Object>) - Method in class org.apache.iceberg.data.GenericRecord
- copy(Map<String, Object>) - Method in interface org.apache.iceberg.data.Record
- copy(DataInputView, DataOutputView) - Method in class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer
- copy(DataFile) - Method in class org.apache.iceberg.DataFiles.Builder
- copy(DeleteFile) - Method in class org.apache.iceberg.FileMetadata.Builder
- copy(AggregatedStatistics) - Method in class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer
- copy(AggregatedStatistics, AggregatedStatistics) - Method in class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer
- copy(PartitionSpec, StructLike) - Static method in class org.apache.iceberg.DataFiles
- COPY_ON_WRITE - Enum constant in enum class org.apache.iceberg.RowLevelOperationMode
- copyData(Types.StructType, Object[]) - Static method in class org.apache.iceberg.PartitionData
- copyFor(StructLike) - Method in class org.apache.iceberg.data.InternalRecordWrapper
- copyFor(StructLike) - Method in class org.apache.iceberg.PartitionData
- copyFor(StructLike) - Method in class org.apache.iceberg.util.StructLikeWrapper
-
Creates a copy of this wrapper that wraps a struct.
- copyFor(StructLike) - Method in class org.apache.iceberg.util.StructProjection
- copyFrom(IcebergSqlExtensionsParser.CallArgumentContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CallArgumentContext
- copyFrom(IcebergSqlExtensionsParser.ConstantContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.ConstantContext
- copyFrom(IcebergSqlExtensionsParser.IdentifierContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.IdentifierContext
- copyFrom(IcebergSqlExtensionsParser.NumberContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NumberContext
- copyFrom(IcebergSqlExtensionsParser.StatementContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.StatementContext
- copyFrom(IcebergSqlExtensionsParser.TransformContext) - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TransformContext
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergBinaryObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDateObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergDecimalObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergFixedObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimeObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergTimestampWithZoneObjectInspector
- copyObject(Object) - Method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergUUIDObjectInspector
- copyOf(Map<K, V>) - Static method in class org.apache.iceberg.util.SerializableMap
- copyOf(ManifestFile) - Static method in class org.apache.iceberg.GenericManifestFile
- copyOf(Table) - Static method in class org.apache.iceberg.SerializableTable
-
Creates a read-only serializable table that can be sent to other nodes in a cluster.
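For example, distributed engines typically ship a lightweight, read-only copy of the table to workers instead of the live catalog-backed instance; a minimal sketch:

    import org.apache.iceberg.SerializableTable;
    import org.apache.iceberg.Table;

    public class SerializableTableSketch {
      // Produces a read-only snapshot of the table that is safe to serialize to cluster nodes.
      static Table toBroadcastable(Table liveTable) {
        return SerializableTable.copyOf(liveTable);
      }
    }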
- copyOf(Table) - Static method in class org.apache.iceberg.spark.source.SerializableTableWithSize
- copyOnWriteRequirements(Table, RowLevelOperation.Command, DistributionMode, boolean, long) - Static method in class org.apache.iceberg.spark.SparkWriteUtil
-
Builds requirements for copy-on-write DELETE, UPDATE, MERGE operations.
- copyOnWriteRequirements(RowLevelOperation.Command) - Method in class org.apache.iceberg.spark.SparkWriteConf
- copyWithAppendsBetween(Long, long) - Method in class org.apache.iceberg.flink.source.ScanContext
- copyWithBranch(String) - Method in class org.apache.iceberg.spark.source.SparkTable
- copyWithoutFieldCounts(Metrics, Set<Integer>) - Static method in class org.apache.iceberg.MetricsUtil
-
Copies a metrics object without value, NULL and NaN counts for given fields.
- copyWithoutFieldCountsAndBounds(Metrics, Set<Integer>) - Static method in class org.apache.iceberg.MetricsUtil
-
Copies a metrics object without counts and bounds for given fields.
- copyWithoutStats() - Method in interface org.apache.iceberg.ContentFile
-
Copies this file without file stats.
- copyWithoutStats() - Method in class org.apache.iceberg.spark.SparkContentFile
- copyWithSnapshotId(long) - Method in class org.apache.iceberg.flink.source.ScanContext
- copyWithSnapshotId(long) - Method in class org.apache.iceberg.spark.source.SparkTable
- copyWithStats(Set<Integer>) - Method in interface org.apache.iceberg.ContentFile
-
Copies this file with column stats only for specific columns.
- count() - Method in class org.apache.iceberg.metrics.DefaultTimer
- count() - Method in class org.apache.iceberg.metrics.FixedReservoirHistogram
- count() - Method in interface org.apache.iceberg.metrics.Histogram
-
Return the number of observations.
- count() - Method in interface org.apache.iceberg.metrics.MetricsContext.Counter
-
Deprecated.
- count() - Method in interface org.apache.iceberg.metrics.Timer
-
The number of times Timer.time(Duration) was called.
- count() - Method in interface org.apache.iceberg.metrics.TimerResult
- count(String) - Static method in class org.apache.iceberg.expressions.Expressions
- count(Counter, CloseableIterable<T>) - Static method in interface org.apache.iceberg.io.CloseableIterable
-
Counts the number of elements in the given CloseableIterable by incrementing the Counter instance for each Iterator.next() call.
- count(Counter, CloseableIterator<T>) - Static method in interface org.apache.iceberg.io.CloseableIterator
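A small sketch of wiring a Counter through an iterable; the same pattern applies to the CloseableIterator overload. The counter name is illustrative, and counter.value() is assumed to expose the running total.

    import java.io.IOException;
    import org.apache.iceberg.io.CloseableIterable;
    import org.apache.iceberg.metrics.Counter;
    import org.apache.iceberg.metrics.DefaultMetricsContext;
    import org.apache.iceberg.metrics.MetricsContext;

    public class CountingIterableSketch {
      // Wraps an iterable so that every next() call increments the counter.
      static <T> long consumeAndCount(CloseableIterable<T> rows) throws IOException {
        Counter counter = new DefaultMetricsContext().counter("rows-read", MetricsContext.Unit.COUNT);
        try (CloseableIterable<T> counted = CloseableIterable.count(counter, rows)) {
          for (T ignored : counted) {
            // consume the stream; counting happens inside the wrapper
          }
        }
        return counter.value();
      }
    }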
- COUNT - Enum constant in enum class org.apache.iceberg.expressions.Expression.Operation
- COUNT - Enum constant in enum class org.apache.iceberg.metrics.MetricsContext.Unit
- COUNT_STAR - Enum constant in enum class org.apache.iceberg.expressions.Expression.Operation
- CountAggregate<T> - Class in org.apache.iceberg.expressions
- CountAggregate(Expression.Operation, BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.CountAggregate
- countAttempts(Counter) - Method in class org.apache.iceberg.util.Tasks.Builder
- counter() - Method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- counter(String) - Method in interface org.apache.iceberg.metrics.MetricsContext
-
Get a named counter using MetricsContext.Unit.COUNT.
- counter(String, Class<T>, MetricsContext.Unit) - Method in class org.apache.iceberg.hadoop.HadoopMetricsContext
-
The Hadoop implementation delegates to the FileSystem.Statistics implementation and therefore does not require support for operations like unit() and count() as the counter values are not directly consumed.
- counter(String, Class<T>, MetricsContext.Unit) - Method in class org.apache.iceberg.metrics.DefaultMetricsContext
-
Deprecated. Will be removed in 2.0.0, use Counter instead.
- counter(String, Class<T>, MetricsContext.Unit) - Method in interface org.apache.iceberg.metrics.MetricsContext
-
Deprecated. Will be removed in 2.0.0, use MetricsContext.counter(String, Unit) instead.
- counter(String, MetricsContext.Unit) - Method in class org.apache.iceberg.hadoop.HadoopMetricsContext
-
The Hadoop implementation delegates to the FileSystem.Statistics implementation and therefore does not require support for operations like unit() as the counter values are not directly consumed.
- counter(String, MetricsContext.Unit) - Method in class org.apache.iceberg.metrics.DefaultMetricsContext
- counter(String, MetricsContext.Unit) - Method in interface org.apache.iceberg.metrics.MetricsContext
-
Get a named counter.
- Counter - Interface in org.apache.iceberg.metrics
-
Generalized Counter interface for creating telemetry-related instances when counting events.
- counterFrom(Map<String, String>, String) - Static method in interface org.apache.iceberg.metrics.CommitMetricsResult
- counterFrom(Map<String, String>, String, MetricsContext.Unit) - Static method in interface org.apache.iceberg.metrics.CommitMetricsResult
- CounterResult - Interface in org.apache.iceberg.metrics
-
A serializable version of a Counter that carries its result.
- countFor(DataFile) - Method in class org.apache.iceberg.expressions.CountAggregate
- countFor(DataFile) - Method in class org.apache.iceberg.expressions.CountNonNull
- countFor(DataFile) - Method in class org.apache.iceberg.expressions.CountStar
- countFor(StructLike) - Method in class org.apache.iceberg.expressions.CountAggregate
- countFor(StructLike) - Method in class org.apache.iceberg.expressions.CountNonNull
- countFor(StructLike) - Method in class org.apache.iceberg.expressions.CountStar
- CountNonNull<T> - Class in org.apache.iceberg.expressions
- CountNonNull(BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.CountNonNull
- Counts() - Constructor for class org.apache.iceberg.MetricsModes.Counts
- countStar() - Static method in class org.apache.iceberg.expressions.Expressions
- CountStar<T> - Class in org.apache.iceberg.expressions
- CountStar(BoundTerm<T>) - Constructor for class org.apache.iceberg.expressions.CountStar
- create() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
-
Create an output stream for the specified location if the target object does not exist in S3 at the time of invocation.
- create() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- create() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Creates the table.
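For example, a table builder is obtained from a catalog and configured fluently before the final create(); the identifier, column names, and property below are illustrative assumptions:

    import java.util.Collections;
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.Catalog;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.types.Types;

    public class TableBuilderSketch {
      static Table createEventsTable(Catalog catalog) {
        Schema schema =
            new Schema(
                Types.NestedField.required(1, "id", Types.LongType.get()),
                Types.NestedField.optional(2, "event_date", Types.DateType.get()));
        PartitionSpec spec = PartitionSpec.builderFor(schema).day("event_date").build();

        return catalog
            .buildTable(TableIdentifier.of("db", "events"), schema)
            .withPartitionSpec(spec)
            .withProperties(Collections.singletonMap("format-version", "2"))
            .create();
      }
    }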
- create() - Static method in class org.apache.iceberg.deletes.PositionDelete
- create() - Method in class org.apache.iceberg.encryption.AesGcmOutputFile
- create() - Method in interface org.apache.iceberg.encryption.NativeEncryptionOutputFile
- create() - Method in class org.apache.iceberg.flink.sink.RowDataTaskWriterFactory
- create() - Method in interface org.apache.iceberg.flink.sink.TaskWriterFactory
-
Initialize a TaskWriter with given task id and attempt id.
- create() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- create() - Method in class org.apache.iceberg.inmemory.InMemoryOutputFile
- create() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
- create() - Method in class org.apache.iceberg.orc.OrcValueReaders.StructReader
- create() - Static method in class org.apache.iceberg.util.CharSequenceMap
- create() - Method in class org.apache.iceberg.view.BaseMetastoreViewCatalog.BaseViewBuilder
- create() - Method in interface org.apache.iceberg.view.ViewBuilder
-
Create the view.
- create(ByteBuffer) - Static method in class org.apache.iceberg.encryption.NativeFileCryptoParameters
-
Creates the builder.
- create(List<BoundAggregate<?, ?>>) - Static method in class org.apache.iceberg.expressions.AggregateEvaluator
- create(Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.PartitionMap
- create(Map<Integer, PartitionSpec>) - Static method in class org.apache.iceberg.util.PartitionSet
- create(Map<String, String>) - Static method in class org.apache.iceberg.aws.s3.signer.S3V4RestSignerClient
- create(Schema) - Static method in class org.apache.iceberg.avro.GenericAvroReader
- create(Schema) - Static method in class org.apache.iceberg.avro.GenericAvroWriter
- create(Schema) - Static method in class org.apache.iceberg.data.avro.DataWriter
- create(RowType, Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(TableIdentifier, List<UpdateRequirement>, List<MetadataUpdate>) - Static method in class org.apache.iceberg.rest.requests.UpdateTableRequest
- create(TableIdentifier, TableMetadata, TableMetadata) - Static method in interface org.apache.iceberg.catalog.TableCommit
-
This creates a TableCommit instance to be applied for a single table with UpdateRequirements to be validated and MetadataUpdates that have been applied.
- create(Schema) - Static method in class org.apache.iceberg.avro.GenericAvroReader
- create(Schema) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Schema) - Static method in class org.apache.iceberg.mapping.MappingUtil
-
Create a name-based mapping for a schema.
- create(Schema) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Schema, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, List<Expression>) - Static method in class org.apache.iceberg.expressions.AggregateEvaluator
- create(Schema, Set<Integer>) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, Schema, Map<Integer, ?>) - Static method in class org.apache.iceberg.data.avro.DataReader
- create(Schema, PartitionSpec, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in class org.apache.iceberg.hadoop.HadoopTables
-
Create a table using the FileSystem implementation resolved from the location.
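A minimal sketch of creating a path-based table with HadoopTables; the warehouse path and column are illustrative assumptions:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.PartitionSpec;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.hadoop.HadoopTables;
    import org.apache.iceberg.types.Types;

    public class HadoopTablesSketch {
      static Table createLocalTable() {
        Schema schema = new Schema(Types.NestedField.required(1, "id", Types.LongType.get()));
        HadoopTables tables = new HadoopTables(new Configuration());
        // For HadoopTables, the table location doubles as its identifier.
        return tables.create(schema, PartitionSpec.unpartitioned(), "file:///tmp/warehouse/logs");
      }
    }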
- create(Schema, PartitionSpec, SortOrder, Map<String, String>, String) - Method in interface org.apache.iceberg.Tables
- create(Schema, Schema) - Static method in class org.apache.iceberg.flink.data.RowDataProjection
-
Creates a projecting wrapper for RowData rows.
- create(Schema, Schema) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- create(Types.NestedField...) - Static method in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- create(Types.StructType) - Static method in class org.apache.iceberg.data.GenericRecord
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeMap
- create(Types.StructType) - Static method in class org.apache.iceberg.util.StructLikeSet
- create(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- CREATE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- CREATE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CREATE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceBranchClauseContext
- CREATE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceTagClauseContext
- CREATE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- CREATE_TABLE_RETRIES - Static variable in class org.apache.iceberg.connect.IcebergSinkConfig
- createAllowMissing(Types.StructType, Types.StructType) - Static method in class org.apache.iceberg.util.StructProjection
-
Creates a projecting wrapper for StructLike rows.
- createAssigner() - Method in class org.apache.iceberg.flink.source.assigner.OrderedSplitAssignerFactory
- createAssigner() - Method in class org.apache.iceberg.flink.source.assigner.SimpleSplitAssignerFactory
- createAssigner() - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssignerFactory
- createAssigner(Collection<IcebergSourceSplitState>) - Method in class org.apache.iceberg.flink.source.assigner.OrderedSplitAssignerFactory
- createAssigner(Collection<IcebergSourceSplitState>) - Method in class org.apache.iceberg.flink.source.assigner.SimpleSplitAssignerFactory
- createAssigner(Collection<IcebergSourceSplitState>) - Method in interface org.apache.iceberg.flink.source.assigner.SplitAssignerFactory
- createBatchedReaderFunc(Function<TypeDescription, OrcBatchReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createBatchedReaderFunc(Function<MessageType, VectorizedReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createBranch(String) - Method in interface org.apache.iceberg.ManageSnapshots
-
Create a new branch.
- createBranch(String) - Method in class org.apache.iceberg.SnapshotManager
- createBranch(String, long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Create a new branch pointing to the given snapshot id.
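For example, branch management is itself a PendingUpdate, so the new branch only appears once commit() is called; the branch name below is illustrative:

    import org.apache.iceberg.Table;

    public class BranchSketch {
      // Creates an audit branch at a known snapshot without moving the main branch.
      static void createAuditBranch(Table table, long snapshotId) {
        table.manageSnapshots()
            .createBranch("audit", snapshotId)
            .commit();
      }
    }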
- createBranch(String, long) - Method in class org.apache.iceberg.SnapshotManager
- createCatalog(String, Map<String, String>) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- createCatalog(String, Map<String, String>, Configuration) - Method in class org.apache.iceberg.flink.FlinkCatalogFactory
- CreateChangelogViewProcedure - Class in org.apache.iceberg.spark.procedures
-
A procedure that creates a view for changed rows.
- CREATED_BY_PROPERTY - Static variable in class org.apache.iceberg.puffin.StandardPuffinProperties
-
Human-readable identification of the application writing the file, along with its version.
- createDatabase(String, CatalogDatabase, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.AvroGenericRecordReaderFunction
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.MetaDataReaderFunction
- createDataIterator(IcebergSourceSplit) - Method in class org.apache.iceberg.flink.source.reader.RowDataReaderFunction
- createdAtMillis() - Method in class org.apache.iceberg.io.FileInfo
- createdBy(String) - Method in class org.apache.iceberg.puffin.Puffin.WriteBuilder
-
Sets file-level "created-by" property.
- createDynamicTableSink(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createDynamicTableSource(DynamicTableFactory.Context) - Method in class org.apache.iceberg.flink.FlinkDynamicTableFactory
- createEmpty() - Static method in class org.apache.iceberg.catalog.SessionCatalog.SessionContext
- createEncryptionManager(Map<String, String>, KeyManagementClient) - Static method in class org.apache.iceberg.encryption.EncryptionUtil
- createEnumerator(SplitEnumeratorContext<SingleThreadedIteratorSource.GlobalSplit<T>>) - Method in class org.apache.iceberg.flink.maintenance.operator.SingleThreadedIteratorSource
- createEnumerator(SplitEnumeratorContext<IcebergSourceSplit>) - Method in class org.apache.iceberg.flink.source.IcebergSource
- createFunction(ObjectPath, CatalogFunction, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createInputSplits(int) - Method in class org.apache.iceberg.flink.source.FlinkInputFormat
- createInstance() - Method in class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer
- createKey() - Method in class org.apache.iceberg.mr.mapred.AbstractMapredIcebergRecordReader
- createKmsClient(Map<String, String>) - Static method in class org.apache.iceberg.encryption.EncryptionUtil
- createMetadataTableInstance(TableOperations, String, String, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(TableOperations, String, TableIdentifier, TableIdentifier, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
- createMetadataTableInstance(Table, MetadataTableType) - Static method in class org.apache.iceberg.MetadataTableUtils
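As a hedged sketch of the Table-based overload above, assuming a loaded Table named "table"; the scan is only illustrative:

    Table filesTable =
        MetadataTableUtils.createMetadataTableInstance(table, MetadataTableType.FILES);
    // The files metadata table can then be scanned like any other table
    // (the returned CloseableIterable should be closed when done).
    filesTable.newScan().planFiles().forEach(task -> System.out.println(task.file().path()));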
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createNamespace(String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- createNamespace(Namespace) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace) - Method in class org.apache.iceberg.inmemory.InMemoryCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.catalog.BaseSessionCatalog.AsCatalog
- createNamespace(Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SupportsNamespaces
-
Create a namespace in the catalog.
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.hive.HiveCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.inmemory.InMemoryCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.nessie.NessieIcebergClient
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
- createNamespace(Namespace, Map<String, String>) - Method in class org.apache.iceberg.snowflake.SnowflakeCatalog
- createNamespace(SessionCatalog.SessionContext, Namespace) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a namespace in the catalog.
- createNamespace(SessionCatalog.SessionContext, Namespace, Map<String, String>) - Method in interface org.apache.iceberg.catalog.SessionCatalog
-
Create a namespace in the catalog.
- createNamespace(SessionCatalog.SessionContext, Namespace, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTSessionCatalog
- createNamespace(SupportsNamespaces, CreateNamespaceRequest) - Static method in class org.apache.iceberg.rest.CatalogHandlers
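A minimal sketch of namespace creation via SupportsNamespaces, assuming a configured Catalog named "catalog" whose implementation also supports namespaces; the namespace name and property key are hypothetical:

    if (catalog instanceof SupportsNamespaces) {
      SupportsNamespaces nsCatalog = (SupportsNamespaces) catalog;
      nsCatalog.createNamespace(
          Namespace.of("analytics"),
          Map.of("owner", "data-platform"));  // optional namespace properties
    }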
- CreateNamespaceRequest - Class in org.apache.iceberg.rest.requests
-
A REST request to create a namespace, with an optional set of properties.
- CreateNamespaceRequest() - Constructor for class org.apache.iceberg.rest.requests.CreateNamespaceRequest
- CreateNamespaceRequest.Builder - Class in org.apache.iceberg.rest.requests
- CreateNamespaceResponse - Class in org.apache.iceberg.rest.responses
-
Represents a REST response for a request to create a namespace / database.
- CreateNamespaceResponse() - Constructor for class org.apache.iceberg.rest.responses.CreateNamespaceResponse
- CreateNamespaceResponse.Builder - Class in org.apache.iceberg.rest.responses
- createNanValueCounts(Stream<FieldMetrics<?>>, MetricsConfig, Schema) - Static method in class org.apache.iceberg.MetricsUtil
-
Construct the mapping from column id to NaN value counts using the input metrics and metrics config.
- createOrOverwrite() - Method in class org.apache.iceberg.aws.s3.S3OutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.encryption.AesGcmOutputFile
- createOrOverwrite() - Method in interface org.apache.iceberg.encryption.NativeEncryptionOutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.hadoop.HadoopOutputFile
- createOrOverwrite() - Method in class org.apache.iceberg.inmemory.InMemoryOutputFile
- createOrOverwrite() - Method in interface org.apache.iceberg.io.OutputFile
-
Create a new file and return a PositionOutputStream to it.
- createOrReplace() - Method in class org.apache.iceberg.view.BaseMetastoreViewCatalog.BaseViewBuilder
- createOrReplace() - Method in interface org.apache.iceberg.view.ViewBuilder
-
Create or replace the view.
- CreateOrReplaceBranchContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceBranchContext
- createOrReplaceTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- CreateOrReplaceTagContext(IcebergSqlExtensionsParser.StatementContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceTagContext
- createOrReplaceTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createOrReplaceTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create or replace the table.
- createOuterSerializerWithNestedSerializers(TypeSerializer<?>[]) - Method in class org.apache.iceberg.flink.sink.shuffle.AggregatedStatisticsSerializer.AggregatedStatisticsSerializerSnapshot
- createPartition(ObjectPath, CatalogPartitionSpec, CatalogPartition, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createReader(SourceReaderContext) - Method in class org.apache.iceberg.flink.maintenance.operator.MonitorSource
- createReader(SourceReaderContext) - Method in class org.apache.iceberg.flink.maintenance.operator.SingleThreadedIteratorSource
- createReader(SourceReaderContext) - Method in class org.apache.iceberg.flink.source.IcebergSource
- createReader(Schema, MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReader(Schema, MessageType, Map<Integer, ?>) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createReaderFactory() - Method in class org.apache.iceberg.spark.source.SparkMicroBatchStream
- createReaderFunc(BiFunction<Schema, Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- createReaderFunc(Function<TypeDescription, OrcRowReader<?>>) - Method in class org.apache.iceberg.orc.ORC.ReadBuilder
- createReaderFunc(Function<MessageType, ParquetValueReader<?>>) - Method in class org.apache.iceberg.parquet.Parquet.ReadBuilder
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.mr.mapreduce.IcebergInputFormat
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.iceberg.pig.IcebergPigInputFormat
- createReplaceBranchClause() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceBranchContext
- createReplaceBranchClause() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CreateReplaceBranchClauseContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceBranchClauseContext
- createReplaceTagClause() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateOrReplaceTagContext
- createReplaceTagClause() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- CreateReplaceTagClauseContext(ParserRuleContext, int) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.CreateReplaceTagClauseContext
- createResolvingReader(Function<Schema, DatumReader<?>>) - Method in class org.apache.iceberg.avro.Avro.ReadBuilder
- CreateSnapshotEvent - Class in org.apache.iceberg.events
- CreateSnapshotEvent(String, String, long, long, Map<String, String>) - Constructor for class org.apache.iceberg.events.CreateSnapshotEvent
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.BaseParquetReaders
- createStructReader(List<Type>, List<ParquetValueReader<?>>, Types.StructType) - Method in class org.apache.iceberg.data.parquet.GenericParquetReaders
- createStructReader(Types.StructType, List<ValueReader<?>>, Map<Integer, ?>) - Method in class org.apache.iceberg.data.avro.DataReader
- createStructWriter(List<ValueWriter<?>>) - Method in class org.apache.iceberg.data.avro.DataWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createStructWriter(List<ParquetValueWriter<?>>) - Method in class org.apache.iceberg.data.parquet.GenericParquetWriter
- createTable(ObjectPath, CatalogBaseTable, boolean) - Method in class org.apache.iceberg.flink.FlinkCatalog
- createTable(Configuration, Properties) - Static method in class org.apache.iceberg.mr.Catalogs
-
Creates an Iceberg table using the catalog specified by the configuration.
- createTable(Catalog, Namespace, CreateTableRequest) - Static method in class org.apache.iceberg.rest.CatalogHandlers
- createTable(TableIdentifier, Schema) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create an unpartitioned table.
- createTable(TableIdentifier, Schema) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, String, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in interface org.apache.iceberg.catalog.Catalog
-
Create a table.
- createTable(TableIdentifier, Schema, PartitionSpec, Map<String, String>) - Method in class org.apache.iceberg.rest.RESTCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCachedTableCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkSessionCatalog
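For the Catalog.createTable overloads above, a minimal sketch assuming a configured Catalog named "catalog"; the identifier, field names, and property value are hypothetical:

    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "ts", Types.TimestampType.withZone()),
        Types.NestedField.optional(3, "data", Types.StringType.get()));
    PartitionSpec spec = PartitionSpec.builderFor(schema).day("ts").build();
    Table table = catalog.createTable(
        TableIdentifier.of("db", "events"),
        schema,
        spec,
        Map.of("write.format.default", "parquet"));  // write.format.default is the default file format property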
- CreateTableRequest - Class in org.apache.iceberg.rest.requests
-
A REST request to create a table, either via direct commit or staging the creation of the table as part of a transaction.
- CreateTableRequest() - Constructor for class org.apache.iceberg.rest.requests.CreateTableRequest
- CreateTableRequest.Builder - Class in org.apache.iceberg.rest.requests
- createTableTransaction(String, TableOperations, TableMetadata) - Static method in class org.apache.iceberg.Transactions
- createTableTransaction(String, TableOperations, TableMetadata, MetricsReporter) - Static method in class org.apache.iceberg.Transactions
- createTag(String, long) - Method in interface org.apache.iceberg.ManageSnapshots
-
Create a new tag pointing to the given snapshot id
- createTag(String, long) - Method in class org.apache.iceberg.SnapshotManager
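A short sketch of tagging, assuming a Table named "table" with at least one snapshot; the tag name is hypothetical:

    long snapshotId = table.currentSnapshot().snapshotId();
    table.manageSnapshots()
        .createTag("v1.0", snapshotId)  // tag the snapshot so it stays addressable by name
        .commit();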
- createTransaction() - Method in class org.apache.iceberg.BaseMetastoreCatalog.BaseMetastoreCatalogTableBuilder
- createTransaction() - Method in interface org.apache.iceberg.catalog.Catalog.TableBuilder
-
Starts a transaction to create the table.
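A hedged sketch of a staged table creation through Catalog.TableBuilder, assuming "catalog", "identifier", "schema", "spec", and an already-built DataFile named "dataFile" are in scope:

    Transaction txn = catalog.buildTable(identifier, schema)
        .withPartitionSpec(spec)
        .createTransaction();                       // stages the creation; nothing is visible yet
    txn.newAppend().appendFile(dataFile).commit();  // commits to the transaction only
    txn.commitTransaction();                        // creates the table and its first snapshot together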
- createVectorSchemaRootFromVectors() - Method in class org.apache.iceberg.arrow.vectorized.ColumnarBatch
-
Create a new instance of VectorSchemaRoot from the arrow vectors stored in this arrow batch.
- createView(ViewCatalog, Namespace, CreateViewRequest) - Static method in class org.apache.iceberg.rest.CatalogHandlers
- createView(Identifier, String, String, String[], StructType, String[], String[], String[], Map<String, String>) - Method in class org.apache.iceberg.spark.SparkCatalog
- CreateViewRequest - Interface in org.apache.iceberg.rest.requests
- CreateViewRequestParser - Class in org.apache.iceberg.rest.requests
- createWriter(MessageType) - Method in class org.apache.iceberg.data.parquet.BaseParquetWriter
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DataWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.DeleteWriteBuilder
- createWriterFunc(BiFunction<Schema, TypeDescription, OrcRowWriter<?>>) - Method in class org.apache.iceberg.orc.ORC.WriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DataWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.DeleteWriteBuilder
- createWriterFunc(Function<Schema, DatumWriter<?>>) - Method in class org.apache.iceberg.avro.Avro.WriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DataWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.DeleteWriteBuilder
- createWriterFunc(Function<MessageType, ParquetValueWriter<?>>) - Method in class org.apache.iceberg.parquet.Parquet.WriteBuilder
- credential() - Method in class org.apache.iceberg.aws.s3.signer.S3V4RestSignerClient
-
A credential to exchange for a token in the OAuth2 client credentials flow.
- credential() - Method in interface org.apache.iceberg.rest.auth.AuthConfig
- credential() - Method in class org.apache.iceberg.rest.auth.OAuth2Util.AuthSession
- CREDENTIAL - Static variable in class org.apache.iceberg.rest.auth.OAuth2Properties
-
A credential to exchange for a token in the OAuth2 client credentials flow.
- credentials() - Method in class org.apache.iceberg.catalog.SessionCatalog.SessionContext
-
Returns the session's credential map.
- credentialsProvider(String, String, String) - Method in class org.apache.iceberg.aws.AwsClientProperties
-
Returns a credentials provider instance.
- CredentialSupplier - Interface in org.apache.iceberg.io
-
Interface used to expose credentials held by a FileIO instance.
- ctorImpl(Class<?>, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Deprecated. Since 1.6.0, will be removed in 1.7.0
- ctorImpl(String, Class<?>...) - Method in class org.apache.iceberg.common.DynMethods.Builder
-
Deprecated. Since 1.6.0, will be removed in 1.7.0
- current() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- current() - Method in class org.apache.iceberg.BaseTransaction.TransactionTableOperations
- current() - Method in class org.apache.iceberg.hadoop.HadoopTableOperations
- current() - Static method in enum class org.apache.iceberg.hive.HiveVersion
- current() - Method in class org.apache.iceberg.BaseRewriteManifests
- current() - Method in class org.apache.iceberg.StaticTableOperations
- current() - Method in interface org.apache.iceberg.TableOperations
-
Return the currently loaded table metadata, without checking for updates.
- current() - Method in class org.apache.iceberg.view.BaseViewOperations
- current() - Method in interface org.apache.iceberg.view.ViewOperations
-
Return the currently loaded view metadata, without checking for updates.
- CURRENT_SCHEMA - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of current schema.
- CURRENT_SNAPSHOT_ID - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot id.
- CURRENT_SNAPSHOT_SUMMARY - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot summary.
- CURRENT_SNAPSHOT_TIMESTAMP - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for current snapshot timestamp.
- currentAncestorIds(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Return the snapshot IDs for the ancestors of the current table state.
- currentAncestors(Table) - Static method in class org.apache.iceberg.util.SnapshotUtil
-
Returns an iterable that traverses the table's snapshots from the current to the last known ancestor.
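A minimal sketch of walking the current ancestry, assuming a Table named "table":

    for (Snapshot ancestor : SnapshotUtil.currentAncestors(table)) {
      System.out.println(ancestor.snapshotId() + " committed at " + ancestor.timestampMillis());
    }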
- currentCatalog() - Method in class org.apache.iceberg.spark.source.SparkView
- currentDefinitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentDL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentFieldName() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentFileHasNext() - Method in class org.apache.iceberg.flink.source.DataIterator
- currentFilePath() - Method in class org.apache.iceberg.io.RollingDataWriter
- currentFileRows() - Method in class org.apache.iceberg.io.RollingDataWriter
- currentMetadata() - Method in class org.apache.iceberg.BaseTransaction
- currentMetadataLocation() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- currentMetadataLocation() - Method in class org.apache.iceberg.view.BaseViewOperations
- currentMetricsValues() - Method in class org.apache.iceberg.spark.source.EqualityDeleteRowReader
- currentNamespace() - Method in class org.apache.iceberg.spark.source.SparkView
- currentPageCount() - Method in class org.apache.iceberg.parquet.BasePageIterator
- currentPath() - Method in class org.apache.iceberg.flink.data.ParquetWithFlinkSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.io.BaseTaskWriter.RollingEqDeleteWriter
- currentPath() - Method in class org.apache.iceberg.orc.OrcSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.ParquetTypeVisitor
- currentPath() - Method in class org.apache.iceberg.parquet.TypeWithSchemaVisitor
- currentPath() - Method in class org.apache.iceberg.spark.data.ParquetWithSparkSchemaVisitor
- currentRepetitionLevel() - Method in class org.apache.iceberg.parquet.ColumnIterator
- currentRL - Variable in class org.apache.iceberg.parquet.BasePageIterator
- currentRows() - Method in class org.apache.iceberg.io.BaseTaskWriter.RollingEqDeleteWriter
- currentSchemaId() - Method in class org.apache.iceberg.TableMetadata
- currentSchemaId() - Method in interface org.apache.iceberg.view.ViewMetadata
- currentSnapshot() - Method in class org.apache.iceberg.BaseMetadataTable
- currentSnapshot() - Method in class org.apache.iceberg.BaseTable
- currentSnapshot() - Method in class org.apache.iceberg.BaseTransaction.TransactionTable
- currentSnapshot() - Method in class org.apache.iceberg.SerializableTable
- currentSnapshot() - Method in interface org.apache.iceberg.Table
-
Get the current snapshot for this table, or null if there are no snapshots.
- currentSnapshot() - Method in class org.apache.iceberg.TableMetadata
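Since Table.currentSnapshot() may return null for a table with no snapshots, callers typically guard the result; a minimal sketch assuming a Table named "table":

    Snapshot current = table.currentSnapshot();
    if (current != null) {
      System.out.println("snapshot " + current.snapshotId() + " (" + current.operation() + ")");
    }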
- currentUser() - Static method in class org.apache.iceberg.hive.HiveHadoopUtil
- currentVersion() - Method in class org.apache.iceberg.BaseMetastoreTableOperations
- currentVersion() - Static method in class org.apache.iceberg.dell.ecs.PropertiesSerDesUtil
-
Get version of current serializer implementation.
- currentVersion() - Method in class org.apache.iceberg.view.BaseView
- currentVersion() - Method in class org.apache.iceberg.view.BaseViewOperations
- currentVersion() - Method in interface org.apache.iceberg.view.View
-
Get the current version for this view, or null if there are no versions.
- currentVersion() - Method in interface org.apache.iceberg.view.ViewMetadata
- currentVersionId() - Method in interface org.apache.iceberg.view.ViewMetadata
- custom(String, Map<String, String>, Configuration, String) - Static method in interface org.apache.iceberg.flink.CatalogLoader
- CustomOrderExpressionVisitor() - Constructor for class org.apache.iceberg.expressions.ExpressionVisitors.CustomOrderExpressionVisitor
- CustomOrderSchemaVisitor() - Constructor for class org.apache.iceberg.types.TypeUtil.CustomOrderSchemaVisitor
D
- data(PartitionSpec, String) - Static method in class org.apache.iceberg.DataFiles
- DATA - Enum constant in enum class org.apache.iceberg.FileContent
- DATA - Enum constant in enum class org.apache.iceberg.ManifestContent
- DATA_COMPLETE - Enum constant in enum class org.apache.iceberg.connect.events.PayloadType
-
Maps to payload of type DataComplete
- DATA_FILES - Enum constant in enum class org.apache.iceberg.ManifestReader.FileType
- DATA_FILES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- DATA_PLANNING_MODE - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- DATA_PLANNING_MODE - Static variable in class org.apache.iceberg.TableProperties
- DATA_WRITTEN - Enum constant in enum class org.apache.iceberg.connect.events.PayloadType
-
Maps to payload of type DataWritten
- database() - Method in class org.apache.iceberg.hive.HiveTableOperations
- databaseExists(String) - Method in class org.apache.iceberg.flink.FlinkCatalog
- DataComplete - Class in org.apache.iceberg.connect.events
-
A control event payload for events sent by a worker that indicates it has finished sending all data for a commit request.
- DataComplete(UUID, List<TopicPartitionOffset>) - Constructor for class org.apache.iceberg.connect.events.DataComplete
- DataComplete(Schema) - Constructor for class org.apache.iceberg.connect.events.DataComplete
- DataFile - Interface in org.apache.iceberg
-
Interface for data files listed in a table manifest.
- dataFileFormat() - Method in class org.apache.iceberg.flink.FlinkWriteConf
- dataFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- dataFiles() - Method in class org.apache.iceberg.connect.data.IcebergWriterResult
- dataFiles() - Method in class org.apache.iceberg.connect.events.DataWritten
- dataFiles() - Method in class org.apache.iceberg.io.DataWriteResult
- dataFiles() - Method in interface org.apache.iceberg.io.TaskWriter
-
Close the writer and get the completed data files; this requires that the task writer produces data files only.
- dataFiles() - Method in class org.apache.iceberg.io.WriteResult
- DataFiles - Class in org.apache.iceberg
- DataFiles.Builder - Class in org.apache.iceberg
- dataFilesCount() - Method in interface org.apache.iceberg.actions.RewriteDataFiles.FileGroupFailureResult
- DataFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's data files as rows.
- DataFilesTable.DataFilesTableScan - Class in org.apache.iceberg
- DataIterator<T> - Class in org.apache.iceberg.flink.source
-
Flink data iterator that reads CombinedScanTask into a CloseableIterator
- DataIterator(FileScanTaskReader<T>, CombinedScanTask, FileIO, EncryptionManager) - Constructor for class org.apache.iceberg.flink.source.DataIterator
- DataIteratorBatcher<T> - Interface in org.apache.iceberg.flink.source.reader
-
Batcher converts iterator of T into iterator of batched RecordsWithSplitIds<RecordAndPosition<T>>, as FLIP-27's SplitReader.fetch() returns batched records.
- DataIteratorReaderFunction<T> - Class in org.apache.iceberg.flink.source.reader
-
A ReaderFunction implementation that uses DataIterator.
- DataIteratorReaderFunction(DataIteratorBatcher<T>) - Constructor for class org.apache.iceberg.flink.source.reader.DataIteratorReaderFunction
- dataManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each data manifest in this snapshot.
- DataOperations - Class in org.apache.iceberg
-
Data operations that produce snapshots.
- dataPlanningMode() - Method in class org.apache.iceberg.spark.SparkReadConf
- dataPlanningMode() - Method in class org.apache.iceberg.SparkDistributedDataScan
- DataReader<T> - Class in org.apache.iceberg.data.avro
- DataReader(Schema, Schema, Map<Integer, ?>) - Constructor for class org.apache.iceberg.data.avro.DataReader
- dataSchema() - Method in class org.apache.iceberg.data.BaseFileWriterFactory
- dataSequenceNumber() - Method in interface org.apache.iceberg.ContentFile
-
Returns the data sequence number of the file.
- dataSequenceNumber(long) - Method in interface org.apache.iceberg.RewriteFiles
-
Configure the data sequence number for this rewrite operation.
- dataSpec() - Method in class org.apache.iceberg.StreamingDelete
- DataStatisticsCoordinatorProvider - Class in org.apache.iceberg.flink.sink.shuffle
-
DataStatisticsCoordinatorProvider provides the method to create a new DataStatisticsCoordinator
- DataStatisticsCoordinatorProvider(String, OperatorID, Schema, SortOrder, int, StatisticsType) - Constructor for class org.apache.iceberg.flink.sink.shuffle.DataStatisticsCoordinatorProvider
- DataStatisticsOperator - Class in org.apache.iceberg.flink.sink.shuffle
-
DataStatisticsOperator collects traffic distribution statistics.
- DataTableScan - Class in org.apache.iceberg
- DataTableScan(Table, Schema, TableScanContext) - Constructor for class org.apache.iceberg.DataTableScan
- DataTask - Interface in org.apache.iceberg
-
A task that returns data as rows instead of where to read data.
- DataTaskReader - Class in org.apache.iceberg.flink.source
- DataTaskReader(Schema) - Constructor for class org.apache.iceberg.flink.source.DataTaskReader
- dataTimestampMillis() - Method in class org.apache.iceberg.ScanSummary.PartitionMetrics
- dataType() - Method in class org.apache.iceberg.spark.source.SparkMetadataColumn
- dataType() - Method in interface org.apache.spark.sql.connector.iceberg.catalog.ProcedureParameter
-
Returns the type of this parameter.
- DataWriter<T> - Class in org.apache.iceberg.data.avro
- DataWriter<T> - Class in org.apache.iceberg.io
- DataWriter(Schema) - Constructor for class org.apache.iceberg.data.avro.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriter(FileAppender<T>, FileFormat, String, PartitionSpec, StructLike, EncryptionKeyMetadata, SortOrder) - Constructor for class org.apache.iceberg.io.DataWriter
- DataWriteResult - Class in org.apache.iceberg.io
-
A result of writing data files.
- DataWriteResult(List<DataFile>) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DataWriteResult(DataFile) - Constructor for class org.apache.iceberg.io.DataWriteResult
- DataWritten - Class in org.apache.iceberg.connect.events
-
A control event payload for events sent by a worker that contains the table data that has been written and is ready to commit.
- DataWritten(Schema) - Constructor for class org.apache.iceberg.connect.events.DataWritten
- DataWritten(Types.StructType, UUID, TableReference, List<DataFile>, List<DeleteFile>) - Constructor for class org.apache.iceberg.connect.events.DataWritten
- DATE - Enum constant in enum class org.apache.iceberg.types.Type.TypeID
- DATE_INSPECTOR - Static variable in class org.apache.iceberg.mr.hive.serde.objectinspector.IcebergObjectInspector
- dateFromDays(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- dates() - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- DateTimeUtil - Class in org.apache.iceberg.util
- DateToDaysFunction() - Constructor for class org.apache.iceberg.spark.functions.DaysFunction.DateToDaysFunction
- DateToMonthsFunction() - Constructor for class org.apache.iceberg.spark.functions.MonthsFunction.DateToMonthsFunction
- DateToYearsFunction() - Constructor for class org.apache.iceberg.spark.functions.YearsFunction.DateToYearsFunction
- DateType() - Constructor for class org.apache.iceberg.types.Types.DateType
- day() - Static method in class org.apache.iceberg.transforms.Transforms
-
Returns a day Transform for date or timestamp types.
- day(int, String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String) - Static method in class org.apache.iceberg.expressions.Expressions
- day(String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(String, int) - Method in interface org.apache.iceberg.transforms.PartitionSpecVisitor
- day(String, int, SortDirection, NullOrder) - Method in interface org.apache.iceberg.transforms.SortOrderVisitor
- day(String, String) - Method in class org.apache.iceberg.PartitionSpec.Builder
- day(Type) - Static method in class org.apache.iceberg.transforms.Transforms
-
Deprecated. Use Transforms.day() instead; will be removed in 2.0.0
- Days<T> - Class in org.apache.iceberg.transforms
- Days() - Constructor for class org.apache.iceberg.transforms.Days
- DAYS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DAYS - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DAYS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.NonReservedContext
- DAYS() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.TimeUnitContext
- daysFromDate(LocalDate) - Static method in class org.apache.iceberg.util.DateTimeUtil
- daysFromInstant(Instant) - Static method in class org.apache.iceberg.util.DateTimeUtil
- DaysFunction - Class in org.apache.iceberg.spark.functions
-
A Spark function implementation for the Iceberg day transform.
- DaysFunction() - Constructor for class org.apache.iceberg.spark.functions.DaysFunction
- DaysFunction.DateToDaysFunction - Class in org.apache.iceberg.spark.functions
- DaysFunction.TimestampNtzToDaysFunction - Class in org.apache.iceberg.spark.functions
- DaysFunction.TimestampToDaysFunction - Class in org.apache.iceberg.spark.functions
- daysToIsoDate(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
- daysToMonths(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
- daysToYears(int) - Static method in class org.apache.iceberg.util.DateTimeUtil
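A small sketch of the day-based helpers above, assuming the usual epoch-day convention used by Iceberg's date transforms (the daysToIsoDate result shown in the comment is an assumption):

    int days = DateTimeUtil.daysFromDate(LocalDate.of(2024, 3, 1));  // days since 1970-01-01
    String iso = DateTimeUtil.daysToIsoDate(days);                   // expected "2024-03-01"
    int months = DateTimeUtil.daysToMonths(days);
    int years = DateTimeUtil.daysToYears(days);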
- decimal(int, int) - Static method in class org.apache.iceberg.avro.ValueWriters
- decimal(int, int) - Static method in class org.apache.iceberg.data.orc.GenericOrcWriters
- decimal(ValueReader<byte[]>, int) - Static method in class org.apache.iceberg.avro.ValueReaders
- DECIMAL - Enum constant in enum class org.apache.iceberg.types.Type.TypeID
- DECIMAL_64 - Enum constant in enum class org.apache.hadoop.hive.ql.exec.vector.VectorizedSupport.Support
- DECIMAL_INT32_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_INT64_MAX_DIGITS - Static variable in class org.apache.iceberg.parquet.TypeToMessageType
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsLexer
- DECIMAL_VALUE - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser
- DECIMAL_VALUE() - Method in class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalAsFixed(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsInteger(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalAsLong(ColumnDescriptor, int, int) - Static method in class org.apache.iceberg.parquet.ParquetValueWriters
- decimalBytesReader(Schema) - Static method in class org.apache.iceberg.avro.ValueReaders
- DecimalLiteralContext(IcebergSqlExtensionsParser.NumberContext) - Constructor for class org.apache.spark.sql.catalyst.parser.extensions.IcebergSqlExtensionsParser.DecimalLiteralContext
- decimalRequiredBytes(int) - Static method in class org.apache.iceberg.types.TypeUtil
- decimals() - Static method in class org.apache.iceberg.data.orc.GenericOrcReaders
- decimals(int, int) - Static method in class org.apache.iceberg.spark.data.SparkOrcValueReaders
- DecimalUtil - Class in org.apache.iceberg.util
- DecimalVectorUtil - Class in org.apache.iceberg.arrow.vectorized.parquet
- decode(byte[]) - Static method in class org.apache.iceberg.avro.AvroEncoderUtil
- decode(byte[]) - Static method in class org.apache.iceberg.connect.events.AvroUtil
- decode(byte[]) - Static method in class org.apache.iceberg.ManifestFiles
-
Decode the binary data into a ManifestFile.
- decode(InputStream, D) - Method in class org.apache.iceberg.data.avro.IcebergDecoder
- decode(InputStream, D) - Method in class org.apache.iceberg.data.avro.RawDecoder
- decodeFormData(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Decodes a map of form data from application/x-www-form-urlencoded.
- decodeNamespace(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Takes in a string representation of a namespace as used for a URL parameter and returns the corresponding namespace.
- DecoderResolver - Class in org.apache.iceberg.data.avro
-
Resolver to resolve Decoder to a ResolvingDecoder.
- decodeString(String) - Static method in class org.apache.iceberg.rest.RESTUtil
-
Decodes a URL-encoded string.
- decomposePredicate(JobConf, Deserializer, ExprNodeDesc) - Method in class org.apache.iceberg.mr.hive.HiveIcebergStorageHandler
- decrypt(byte[], byte[]) - Method in class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- decrypt(byte[], int, int, byte[]) - Method in class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- decrypt(byte[], int, int, byte[], int, byte[]) - Method in class org.apache.iceberg.encryption.Ciphers.AesGcmDecryptor
- decrypt(Iterable<EncryptedInputFile>) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Variant of EncryptionManager.decrypt(EncryptedInputFile) that provides a sequence of files that all need to be decrypted in a single context.
- decrypt(Iterable<EncryptedInputFile>) - Method in class org.apache.iceberg.encryption.StandardEncryptionManager
- decrypt(EncryptedInputFile) - Method in interface org.apache.iceberg.encryption.EncryptionManager
-
Given an EncryptedInputFile.encryptedInputFile() representing the raw encrypted bytes from the underlying file system, and given metadata about how the file was encrypted via EncryptedInputFile.keyMetadata(), return an InputFile that returns decrypted input streams.
- decrypt(EncryptedInputFile) - Method in class org.apache.iceberg.encryption.PlaintextEncryptionManager
- decrypt(EncryptedInputFile) - Method in class org.apache.iceberg.encryption.StandardEncryptionManager
- decryptionKey() - Method in class org.apache.iceberg.gcp.GCPProperties
- dedupName() - Method in class org.apache.iceberg.transforms.Months
- dedupName() - Method in interface org.apache.iceberg.transforms.Transform
-
Return the unique transform name to check if similar transforms for the same source field are added multiple times in partition spec builder.
- DEFAULT_BATCH_SIZE - Static variable in class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader
- DEFAULT_CATALOG_NAME - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_CONTROL_GROUP_PREFIX - Static variable in class org.apache.iceberg.connect.IcebergSinkConfig
- DEFAULT_DATABASE - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_DATABASE_NAME - Static variable in class org.apache.iceberg.flink.FlinkCatalogFactory
- DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_FILE_FORMAT_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_NAME_MAPPING - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_PARTITION_SPEC - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of the current (default) partition spec.
- DEFAULT_SORT_ORDER - Static variable in class org.apache.iceberg.TableProperties
-
Reserved table property for the JSON representation of the current (default) sort order.
- DEFAULT_VIEW_FORMAT_VERSION - Static variable in interface org.apache.iceberg.view.ViewMetadata
- DEFAULT_WRITE_METRICS_MODE - Static variable in class org.apache.iceberg.TableProperties
- DEFAULT_WRITE_METRICS_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- defaultActions() - Static method in interface org.apache.iceberg.delta.DeltaLakeToIcebergMigrationActionsProvider
-
Get the default implementation of DeltaLakeToIcebergMigrationActionsProvider
- defaultAlwaysNull() - Method in class org.apache.iceberg.common.DynFields.Builder
-
Instructs this builder to return AlwaysNull if no implementation is found.
- defaultCatalog() - Method in interface org.apache.iceberg.view.ViewVersion
-
The default catalog when the view is created.
- DefaultCounter - Class in org.apache.iceberg.metrics
-
A default Counter implementation that uses an AtomicLong to count events.
- defaultEmitter() - Static method in interface org.apache.iceberg.flink.source.reader.SerializableRecordEmitter
- defaultErrorHandler() - Static method in class org.apache.iceberg.rest.ErrorHandlers
- defaultFactory() - Static method in class org.apache.iceberg.aliyun.AliyunClientFactories
- defaultFactory() - Static method in class org.apache.iceberg.aws.AwsClientFactories
- defaultFormat(FileFormat) - Method in interface org.apache.iceberg.UpdateProperties
-
Set the default file format for the table.
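A one-line sketch, assuming a Table named "table":

    table.updateProperties()
        .defaultFormat(FileFormat.PARQUET)  // equivalent to setting write.format.default
        .commit();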
- defaultLocationProperty() - Static method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
-
The property used to set a default location for tables in a namespace.
- defaultLockManager() - Static method in class org.apache.iceberg.util.LockManagers
- DefaultMetricsContext - Class in org.apache.iceberg.metrics
-
A default MetricsContext implementation that uses native Java counters/timers.
- DefaultMetricsContext() - Constructor for class org.apache.iceberg.metrics.DefaultMetricsContext
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkCatalog
- defaultNamespace() - Method in class org.apache.iceberg.spark.SparkSessionCatalog
- defaultNamespace() - Method in interface org.apache.iceberg.view.ViewVersion
-
The default namespace to use when the SQL does not contain a namespace.
- defaults() - Method in class org.apache.iceberg.rest.responses.ConfigResponse
-
Properties that should be used as default configuration.
- defaultSortOrderId() - Method in class org.apache.iceberg.TableMetadata
- defaultSpec(PartitionSpec) - Method in class org.apache.iceberg.io.OutputFileFactory.Builder
- defaultSpecId() - Method in class org.apache.iceberg.TableMetadata
- DefaultSplitAssigner - Class in org.apache.iceberg.flink.source.assigner
-
Since all methods are called in the source coordinator thread by the enumerator, there is no need for locking.
- DefaultSplitAssigner(SerializableComparator<IcebergSourceSplit>) - Constructor for class org.apache.iceberg.flink.source.assigner.DefaultSplitAssigner
- DefaultSplitAssigner(SerializableComparator<IcebergSourceSplit>, Collection<IcebergSourceSplitState>) - Constructor for class org.apache.iceberg.flink.source.assigner.DefaultSplitAssigner
- defaultTargetFileSize() - Method in class org.apache.iceberg.actions.SizeBasedDataRewriter
- defaultTargetFileSize() - Method in class org.apache.iceberg.actions.SizeBasedFileRewriter
- defaultTargetFileSize() - Method in class org.apache.iceberg.actions.SizeBasedPositionDeletesRewriter
- DefaultTimer - Class in org.apache.iceberg.metrics
-
A default Timer implementation that uses a Stopwatch instance internally to measure time.
- DefaultTimer(TimeUnit) - Constructor for class org.apache.iceberg.metrics.DefaultTimer
- defaultValue() - Method in class org.apache.iceberg.SystemConfigs.ConfigEntry
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.dynamodb.DynamoDbCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.aws.glue.GlueCatalog
-
This method produces the same result as using a HiveCatalog.
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.BaseMetastoreCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.dell.ecs.EcsCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hadoop.HadoopCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.hive.HiveCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.inmemory.InMemoryCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.jdbc.JdbcCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.nessie.NessieCatalog
- defaultWarehouseLocation(TableIdentifier) - Method in class org.apache.iceberg.snowflake.SnowflakeCatalog
- definitionLevels - Variable in class org.apache.iceberg.parquet.BasePageIterator
- DelegateFileIO - Interface in org.apache.iceberg.io
-
This interface is intended as an extension for FileIO implementations that support being a delegate target.
- DelegatingInputStream - Interface in org.apache.iceberg.io
- DelegatingOutputStream - Interface in org.apache.iceberg.io
- delete(long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a deleted row position.
- delete(long, long) - Method in interface org.apache.iceberg.deletes.PositionDeleteIndex
-
Set a range of deleted row positions.
- delete(F, long, Long) - Method in class org.apache.iceberg.ManifestWriter
-
Add a delete entry for a file.
- delete(F, long, Long) - Method in class org.apache.iceberg.RollingManifestWriter
-
Add a delete entry for a file.
- delete(CharSequence) - Method in class org.apache.iceberg.StreamingDelete
-
Add a specific data path to be deleted in the new snapshot.
- delete(CharSequence, long, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition.
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in class org.apache.iceberg.io.BasePositionDeltaWriter
- delete(CharSequence, long, T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.PositionDeltaWriter
-
Deletes a position in the provided spec/partition and records the deleted row in the delete file.
- delete(String, Class<T>, Supplier<Map<String, String>>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(String, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in class org.apache.iceberg.rest.HTTPClient
- delete(String, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(String, Map<String, String>, Class<T>, Supplier<Map<String, String>>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(String, Map<String, String>, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in class org.apache.iceberg.rest.HTTPClient
- delete(String, Map<String, String>, Class<T>, Map<String, String>, Consumer<ErrorResponse>) - Method in interface org.apache.iceberg.rest.RESTClient
- delete(DataFile) - Method in class org.apache.iceberg.StreamingDelete
-
Add a specific data file to be deleted in the new snapshot.
- delete(DeleteFile) - Method in class org.apache.iceberg.StreamingDelete
-
Add a specific delete file to be deleted in the new snapshot.
- delete(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows whose equality fields have the same values as the given row.
- delete(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a row from the provided spec/partition.
- DELETE - Enum constant in enum class org.apache.iceberg.actions.DeleteOrphanFiles.PrefixMismatchMode
- DELETE - Enum constant in enum class org.apache.iceberg.ChangelogOperation
- DELETE - Static variable in class org.apache.iceberg.DataOperations
-
Data is deleted from the table and no data is added.
- DELETE - Static variable in class org.apache.iceberg.spark.ChangelogIterator
- DELETE_AVRO_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_AVRO_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_BATCH_SIZE - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Configure the batch size used when deleting multiple files from a given S3 bucket
- DELETE_BATCH_SIZE_DEFAULT - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Default batch size used when deleting files.
- DELETE_BATCH_SIZE_MAX - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Max possible batch size for deletion.
- DELETE_DEFAULT_FILE_FORMAT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_DISTRIBUTION_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ENABLED - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Determines if S3FileIO deletes the object when io.delete() is called, default to true.
- DELETE_ENABLED_DEFAULT - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
- DELETE_FILE_PATH - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_PATH - Static variable in class org.apache.iceberg.PositionDeletesTable
- DELETE_FILE_POS - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_DOC - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_ID - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_ROW_FIELD_NAME - Static variable in class org.apache.iceberg.MetadataColumns
- DELETE_FILE_THRESHOLD - Static variable in class org.apache.iceberg.actions.SizeBasedDataRewriter
-
The minimum number of deletes that needs to be associated with a data file for it to be considered for rewriting.
- DELETE_FILE_THRESHOLD_DEFAULT - Static variable in class org.apache.iceberg.actions.SizeBasedDataRewriter
- DELETE_FILES - Enum constant in enum class org.apache.iceberg.ManifestReader.FileType
- DELETE_FILES - Enum constant in enum class org.apache.iceberg.MetadataTableType
- DELETE_FORMAT - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- DELETE_GRANULARITY - Static variable in class org.apache.iceberg.spark.SparkWriteOptions
- DELETE_GRANULARITY - Static variable in class org.apache.iceberg.TableProperties
- DELETE_GRANULARITY_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ISOLATION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ISOLATION_LEVEL_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_MODE_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_BLOCK_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_COMPRESSION_STRATEGY - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_STRIPE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_ORC_WRITE_BATCH_SIZE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_COMPRESSION_LEVEL - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_DICT_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_PAGE_ROW_LIMIT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_PAGE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_CHECK_MAX_RECORD_COUNT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_CHECK_MIN_RECORD_COUNT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PARQUET_ROW_GROUP_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_PLANNING_MODE - Static variable in class org.apache.iceberg.spark.SparkSQLProperties
- DELETE_PLANNING_MODE - Static variable in class org.apache.iceberg.TableProperties
- DELETE_SCAN_COLUMNS - Static variable in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- DELETE_SCAN_WITH_STATS_COLUMNS - Static variable in class org.apache.iceberg.AllDeleteFilesTable.AllDeleteFilesTableScan
- DELETE_TAGS_PREFIX - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Used by S3FileIO to tag objects when deleting.
- DELETE_TARGET_FILE_SIZE_BYTES - Static variable in class org.apache.iceberg.TableProperties
- DELETE_TARGET_FILE_SIZE_BYTES_DEFAULT - Static variable in class org.apache.iceberg.TableProperties
- DELETE_THREADS - Static variable in class org.apache.iceberg.aws.s3.S3FileIOProperties
-
Number of threads to use for adding delete tags to S3 objects, default to Runtime.availableProcessors()
- DELETE_WORKER_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.SystemConfigs
-
Sets the size of the delete worker pool.
- DELETE_WORKER_THREAD_POOL_SIZE - Static variable in class org.apache.iceberg.util.ThreadPools
- deleteBatchSize() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- deleteBatchSize() - Method in class org.apache.iceberg.gcp.GCPProperties
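As a hedged configuration sketch for the S3 delete properties above, assuming AWS credentials and region are resolved from the environment; the batch size value is illustrative:

    Map<String, String> props = Map.of(
        S3FileIOProperties.DELETE_BATCH_SIZE, "200",
        S3FileIOProperties.DELETE_ENABLED, "true");
    S3FileIO io = new S3FileIO();
    io.initialize(props);  // properties are read during initialization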
- deleteByRowFilter(Expression) - Method in class org.apache.iceberg.StreamingDelete
-
Add a filter to match files to delete.
- deleteColumn(String) - Method in interface org.apache.iceberg.UpdateSchema
-
Delete a column in the schema.
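A minimal sketch, assuming a Table named "table" and a hypothetical column named "legacy_col":

    table.updateSchema()
        .deleteColumn("legacy_col")
        .commit();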
- DeleteCounter - Class in org.apache.iceberg.deletes
-
A counter to be used to count deletes as they are applied.
- DeleteCounter() - Constructor for class org.apache.iceberg.deletes.DeleteCounter
- DELETED_DUPLICATE_FILES - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_FILES_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- DELETED_FILES_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_RECORDS_PROP - Static variable in class org.apache.iceberg.SnapshotSummary
- DELETED_ROWS_COUNT - Static variable in interface org.apache.iceberg.ManifestFile
- DeletedColumnVector - Class in org.apache.iceberg.spark.data.vectorized
- DeletedColumnVector(Type, boolean[]) - Constructor for class org.apache.iceberg.spark.data.vectorized.DeletedColumnVector
- deletedDataFiles() - Method in class org.apache.iceberg.actions.RewriteDataFilesActionResult
- DeletedDataFileScanTask - Interface in org.apache.iceberg
-
A scan task for deletes generated by removing a data file from the table.
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted data files.
- deletedDataFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted data files.
- deletedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted equality delete files.
- deletedEqualityDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted equality delete files.
- deletedFile(PartitionSpec, ContentFile<?>) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DataFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFile(PartitionSpec, DeleteFile) - Method in class org.apache.iceberg.SnapshotSummary.Builder
- deletedFilesCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedFilesCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the number of files with status DELETED in the manifest file.
- deletedFilesCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifest lists.
- deletedManifestListsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifest lists.
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted manifests.
- deletedManifestsCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted manifests.
- deletedOtherFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted metadata JSON files and version hint files.
- deletedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.DeleteReachableFiles.Result
-
Returns the number of deleted position delete files.
- deletedPositionDeleteFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted position delete files.
- deletedRowPositions() - Method in class org.apache.iceberg.data.DeleteFilter
- deletedRowsCount() - Method in class org.apache.iceberg.GenericManifestFile
- deletedRowsCount() - Method in interface org.apache.iceberg.ManifestFile
-
Returns the total number of rows in all files with status DELETED in the manifest file.
- deletedRowsCount() - Method in class org.apache.iceberg.spark.actions.ManifestFileBean
- DeletedRowsScanTask - Interface in org.apache.iceberg
-
A scan task for deletes generated by adding delete files to the table.
- deletedStatisticsFilesCount() - Method in interface org.apache.iceberg.actions.ExpireSnapshots.Result
-
Returns the number of deleted statistics files.
- deletedVectorHolder(int) - Static method in class org.apache.iceberg.arrow.vectorized.VectorHolder
- DeletedVectorHolder(int) - Constructor for class org.apache.iceberg.arrow.vectorized.VectorHolder.DeletedVectorHolder
- DeletedVectorReader() - Constructor for class org.apache.iceberg.arrow.vectorized.VectorizedArrowReader.DeletedVectorReader
- deleteFile(CharSequence) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file path from the underlying table.
- deleteFile(CharSequence) - Method in class org.apache.iceberg.StreamingDelete
- deleteFile(String) - Method in class org.apache.iceberg.aliyun.oss.OSSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
- deleteFile(String) - Method in class org.apache.iceberg.azure.adlsv2.ADLSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.dell.ecs.EcsFileIO
- deleteFile(String) - Method in class org.apache.iceberg.encryption.EncryptingFileIO
- deleteFile(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- deleteFile(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deleteFile(String) - Method in class org.apache.iceberg.inmemory.InMemoryFileIO
- deleteFile(String) - Method in interface org.apache.iceberg.io.FileIO
-
Delete the file at the given path.
- deleteFile(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
- deleteFile(String) - Method in class org.apache.iceberg.BaseRewriteManifests
- deleteFile(DataFile) - Method in class org.apache.iceberg.BaseOverwriteFiles
- deleteFile(DataFile) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete a file tracked by a DataFile from the underlying table.
- deleteFile(DataFile) - Method in interface org.apache.iceberg.OverwriteFiles
-
Delete a DataFile from the table.
- deleteFile(DataFile) - Method in interface org.apache.iceberg.RewriteFiles
-
Remove a data file from the current table state.
- deleteFile(DataFile) - Method in class org.apache.iceberg.StreamingDelete
- deleteFile(DeleteFile) - Method in interface org.apache.iceberg.RewriteFiles
-
Remove a delete file from the table state.
- deleteFile(InputFile) - Method in interface org.apache.iceberg.io.FileIO
- deleteFile(OutputFile) - Method in interface org.apache.iceberg.io.FileIO
-
Convenience method to delete an OutputFile.
- DeleteFile - Interface in org.apache.iceberg
-
Interface for delete files listed in a table delete manifest.
- deleteFileBuilder(PartitionSpec) - Static method in class org.apache.iceberg.FileMetadata
- deleteFileFormat() - Method in class org.apache.iceberg.spark.SparkWriteConf
- deleteFiles() - Method in class org.apache.iceberg.connect.data.IcebergWriterResult
- deleteFiles() - Method in class org.apache.iceberg.connect.events.DataWritten
- deleteFiles() - Method in class org.apache.iceberg.io.DeleteWriteResult
- deleteFiles() - Method in class org.apache.iceberg.io.WriteResult
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.aws.s3.S3FileIO
-
Deletes the given paths in a batched manner.
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.azure.adlsv2.ADLSFileIO
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deleteFiles(Iterable<String>) - Method in class org.apache.iceberg.io.ResolvingFileIO
- deleteFiles(Iterable<String>) - Method in interface org.apache.iceberg.io.SupportsBulkOperations
-
Delete the files at the given paths.
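Where a FileIO also implements SupportsBulkOperations, paths can be removed in one bulk call; a hedged sketch, assuming HadoopFileIO and hypothetical paths:

    import java.util.Arrays;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.hadoop.HadoopFileIO;
    import org.apache.iceberg.io.FileIO;
    import org.apache.iceberg.io.SupportsBulkOperations;

    public class BulkDeleteExample {
      public static void main(String[] args) throws Exception {
        try (FileIO io = new HadoopFileIO(new Configuration())) {
          if (io instanceof SupportsBulkOperations) {
            // Batch the deletes instead of issuing one call per file.
            ((SupportsBulkOperations) io).deleteFiles(Arrays.asList(
                "hdfs://warehouse/db/tbl/data/file-a.parquet",
                "hdfs://warehouse/db/tbl/data/file-b.parquet"));
          }
        }
      }
    }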
- deleteFiles(ExecutorService, Consumer<String>, Iterator<FileInfo>) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
-
Deletes files and keeps track of how many files were removed for each file type.
- deleteFiles(FileIO, Iterable<String>, String, boolean) - Static method in class org.apache.iceberg.CatalogUtil
-
Helper to delete files.
- deleteFiles(SupportsBulkOperations, Iterator<FileInfo>) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- DeleteFiles - Interface in org.apache.iceberg
-
API for deleting files from a table.
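A minimal sketch of the DeleteFiles API, assuming a HadoopCatalog with a hypothetical warehouse location, table name, and file path:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.catalog.TableIdentifier;
    import org.apache.iceberg.hadoop.HadoopCatalog;

    public class DeleteFilesExample {
      public static void main(String[] args) {
        HadoopCatalog catalog = new HadoopCatalog(new Configuration(), "hdfs://warehouse");
        Table table = catalog.loadTable(TableIdentifier.of("db", "tbl"));
        table.newDelete()
            .deleteFile("hdfs://warehouse/db/tbl/data/file-a.parquet") // must be a tracked data file
            .commit();
      }
    }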
- DeleteFilesTable - Class in org.apache.iceberg
-
A Table implementation that exposes a table's delete files as rows.
- DeleteFilesTable.DeleteFilesTableScan - Class in org.apache.iceberg
- DeleteFilter<T> - Class in org.apache.iceberg.data
- DeleteFilter(String, List<DeleteFile>, Schema, Schema) - Constructor for class org.apache.iceberg.data.DeleteFilter
- DeleteFilter(String, List<DeleteFile>, Schema, Schema, DeleteCounter) - Constructor for class org.apache.iceberg.data.DeleteFilter
- deleteFromRowFilter(Expression) - Method in interface org.apache.iceberg.DeleteFiles
-
Delete files that match an Expression on data rows from the table.
- deleteFromRowFilter(Expression) - Method in class org.apache.iceberg.StreamingDelete
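A hedged sketch of DeleteFiles.deleteFromRowFilter, assuming the caller already holds a loaded Table and that event_date is a partition column; as documented, the expression generally has to select whole data files:

    import org.apache.iceberg.Table;
    import org.apache.iceberg.expressions.Expressions;

    public class DeleteByFilterExample {
      // Table and column name are assumptions made for illustration.
      static void dropDay(Table table) {
        table.newDelete()
            .deleteFromRowFilter(Expressions.equal("event_date", "2024-01-01"))
            .commit();
      }
    }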
- deleteGranularity() - Method in class org.apache.iceberg.spark.SparkWriteConf
- DeleteGranularity - Enum Class in org.apache.iceberg.deletes
-
An enum that represents the granularity of deletes.
- deleteKey(T) - Method in class org.apache.iceberg.io.BaseTaskWriter.BaseEqualityDeltaWriter
-
Delete those rows with the given key.
- deleteKey(T, PartitionSpec, StructLike) - Method in interface org.apache.iceberg.io.EqualityDeltaWriter
-
Deletes a key from the provided spec/partition.
- DeleteLoader - Interface in org.apache.iceberg.data
-
An API for loading delete file content into in-memory data structures.
- deleteManifest(ManifestFile) - Method in class org.apache.iceberg.BaseRewriteManifests
- deleteManifest(ManifestFile) - Method in interface org.apache.iceberg.RewriteManifests
-
Deletes a manifest file from the table.
- deleteManifests(FileIO) - Method in interface org.apache.iceberg.Snapshot
-
Return a ManifestFile for each delete manifest in this snapshot.
- deleteOrphanFiles(Table) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete orphan files.
- deleteOrphanFiles(Table) - Method in class org.apache.iceberg.spark.actions.SparkActions
- DeleteOrphanFiles - Interface in org.apache.iceberg.actions
-
An action that deletes orphan metadata, data and delete files in a table.
- DeleteOrphanFiles.PrefixMismatchMode - Enum Class in org.apache.iceberg.actions
-
Defines the action behavior when location prefixes (scheme/authority) mismatch.
- DeleteOrphanFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
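A hedged sketch of running the DeleteOrphanFiles action through SparkActions; the SparkSession, Table, and three-day cutoff are assumptions:

    import org.apache.iceberg.Table;
    import org.apache.iceberg.actions.DeleteOrphanFiles;
    import org.apache.iceberg.spark.actions.SparkActions;
    import org.apache.spark.sql.SparkSession;

    public class DeleteOrphanFilesExample {
      static DeleteOrphanFiles.Result removeOrphans(SparkSession spark, Table table) {
        return SparkActions.get(spark)
            .deleteOrphanFiles(table)
            .olderThan(System.currentTimeMillis() - 3L * 24 * 60 * 60 * 1000) // keep recent files safe
            .execute();
      }
    }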
- DeleteOrphanFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An action that removes orphan metadata, data and delete files by listing a given location and comparing the actual files in that location with content and metadata files referenced by all valid snapshots.
- DeleteOrphanFilesSparkAction.FileURI - Class in org.apache.iceberg.spark.actions
- deletePlanningMode() - Method in class org.apache.iceberg.spark.SparkReadConf
- deletePlanningMode() - Method in class org.apache.iceberg.SparkDistributedDataScan
- deletePositions(CharSequence, List<CloseableIterable<T>>) - Static method in class org.apache.iceberg.deletes.Deletes
- deletePositions(CharSequence, CloseableIterable<StructLike>) - Static method in class org.apache.iceberg.deletes.Deletes
- deletePrefix(String) - Method in class org.apache.iceberg.aws.s3.S3FileIO
-
Provides a best-effort attempt to delete all objects under the given prefix.
- deletePrefix(String) - Method in class org.apache.iceberg.azure.adlsv2.ADLSFileIO
- deletePrefix(String) - Method in class org.apache.iceberg.gcp.gcs.GCSFileIO
- deletePrefix(String) - Method in class org.apache.iceberg.hadoop.HadoopFileIO
- deletePrefix(String) - Method in class org.apache.iceberg.io.ResolvingFileIO
- deletePrefix(String) - Method in interface org.apache.iceberg.io.SupportsPrefixOperations
-
Delete all files under a prefix.
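A minimal sketch of SupportsPrefixOperations.deletePrefix, assuming HadoopFileIO and a hypothetical prefix:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.hadoop.HadoopFileIO;
    import org.apache.iceberg.io.FileIO;
    import org.apache.iceberg.io.SupportsPrefixOperations;

    public class DeletePrefixExample {
      public static void main(String[] args) throws Exception {
        try (FileIO io = new HadoopFileIO(new Configuration())) {
          if (io instanceof SupportsPrefixOperations) {
            // Removes every file under the given (hypothetical) prefix.
            ((SupportsPrefixOperations) io).deletePrefix("hdfs://warehouse/db/tbl/stale-data/");
          }
        }
      }
    }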
- deleteReachableFiles(String) - Method in interface org.apache.iceberg.actions.ActionsProvider
-
Instantiates an action to delete all the files reachable from a given metadata location.
- deleteReachableFiles(String) - Method in class org.apache.iceberg.spark.actions.SparkActions
- DeleteReachableFiles - Interface in org.apache.iceberg.actions
-
An action that deletes all files referenced by a table metadata file.
- DeleteReachableFiles.Result - Interface in org.apache.iceberg.actions
-
The action result that contains a summary of the execution.
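A hedged sketch of the DeleteReachableFiles action via SparkActions; the metadata JSON location is hypothetical, and the action is intended for tables already dropped from the catalog:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.actions.DeleteReachableFiles;
    import org.apache.iceberg.hadoop.HadoopFileIO;
    import org.apache.iceberg.spark.actions.SparkActions;
    import org.apache.spark.sql.SparkSession;

    public class DeleteReachableFilesExample {
      static DeleteReachableFiles.Result purge(SparkSession spark, String metadataJsonLocation) {
        return SparkActions.get(spark)
            .deleteReachableFiles(metadataJsonLocation) // e.g. a ".../metadata/vN.metadata.json" path (hypothetical)
            .io(new HadoopFileIO(new Configuration()))
            .execute();
      }
    }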
- DeleteReachableFilesSparkAction - Class in org.apache.iceberg.spark.actions
-
An implementation of DeleteReachableFiles that uses metadata tables in Spark to determine which files should be deleted.
- deletes() - Method in interface org.apache.iceberg.AddedRowsScanTask
-
A list of delete files to apply when reading the data file in this task.
- deletes() - Method in class org.apache.iceberg.BaseFileScanTask
- deletes() - Method in interface org.apache.iceberg.FileScanTask
-
A list of delete files to apply when reading the task's data file.
- Deletes - Class in org.apache.iceberg.deletes
- DELETES - Enum constant in enum class org.apache.iceberg.ManifestContent
- DeleteSchemaUtil - Class in org.apache.iceberg.io
- deletesDataFiles() - Method in class org.apache.iceberg.StreamingDelete
- deletesDeleteFiles() - Method in class org.apache.iceberg.StreamingDelete
- deleteTags() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- deleteThreads() - Method in class org.apache.iceberg.aws.s3.S3FileIOProperties
- deleteWhere(Predicate[]) - Method in class org.apache.iceberg.spark.source.SparkTable
- deleteWhere(Filter[]) - Method in class org.apache.iceberg.spark.RollbackStagedTable
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteOrphanFiles
-
Passes an alternative delete implementation that will be used for orphan files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.DeleteReachableFiles
-
Passes an alternative delete implementation that will be used for files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.actions.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests, data and delete files.
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.ExpireSnapshots
-
Passes an alternative delete implementation that will be used for manifests and data files.
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.BaseRewriteManifests
- deleteWith(Consumer<String>) - Method in interface org.apache.iceberg.SnapshotUpdate
-
Set a callback to delete files instead of the table's default.
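A minimal sketch of deleteWith on the table-level ExpireSnapshots operation, using a logging callback as a stand-in for a custom delete implementation; the cutoff timestamp is supplied by the caller:

    import org.apache.iceberg.Table;

    public class ExpireWithCallbackExample {
      static void expire(Table table, long olderThanMillis) {
        table.expireSnapshots()
            .expireOlderThan(olderThanMillis)
            .deleteWith(path -> System.out.println("would delete: " + path)) // dry-run style callback
            .commit();
      }
    }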
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.DeleteOrphanFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.DeleteReachableFilesSparkAction
- deleteWith(Consumer<String>) - Method in class org.apache.iceberg.spark.actions.ExpireSnapshotsSparkAction
- DeleteWriteResult - Class in org.apache.iceberg.io
-
A result of writing delete files.
- DeleteWriteResult(List<DeleteFile>) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(List<DeleteFile>, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DeleteWriteResult(DeleteFile, CharSequenceSet) - Constructor for class org.apache.iceberg.io.DeleteWriteResult
- DellClientFactories - Class in org.apache.iceberg.dell
- DellClientFactory - Interface in org.apache.iceberg.dell
- DellProperties - Class in org.apache.iceberg.dell
- DellProperties() - Constructor for class org.apache.iceberg.dell.DellProperties
- DellProperties(Map<String, String>) - Constructor for class org.apache.iceberg.dell.DellProperties
- deltaLakeConfiguration(Configuration) - Method in interface org.apache.iceberg.delta.SnapshotDeltaLakeTable
-
Sets the Hadoop configuration used to access the Delta Lake table's logs and data files.
- DeltaLakeToIcebergMigrationActionsProvider - Interface in org.apache.iceberg.delta
-
An API that provides actions for migrating a Delta Lake table to an Iceberg table.
- DeltaLakeToIcebergMigrationActionsProvider.DefaultDeltaLakeToIcebergMigrationActions - Class in org.apache.iceberg.delta
- desc - Variable in class org.apache.iceberg.parquet.BaseColumnIterator
- desc - Variable in class org.apache.iceberg.parquet.BasePageIterator
- desc(String) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with nulls first.
- desc(String, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add a field to the sort by field name, descending with the given null order.
- desc(Term) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with nulls first.
- desc(Term, NullOrder) - Method in class org.apache.iceberg.BaseReplaceSortOrder
- desc(Term, NullOrder) - Method in class org.apache.iceberg.SortOrder.Builder
-
Add an expression term to the sort, descending with the given null order.
- desc(Term, NullOrder) - Method in interface org.apache.iceberg.SortOrderBuilder
-
Add an expression term to the sort, descending with the given null order.
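A minimal sketch of building a descending sort order with SortOrder.Builder; the two-column schema is a hypothetical example:

    import org.apache.iceberg.NullOrder;
    import org.apache.iceberg.Schema;
    import org.apache.iceberg.SortOrder;
    import org.apache.iceberg.types.Types;

    public class SortOrderDescExample {
      public static void main(String[] args) {
        Schema schema = new Schema(
            Types.NestedField.required(1, "event_time", Types.TimestampType.withZone()),
            Types.NestedField.optional(2, "level", Types.StringType.get()));

        SortOrder order = SortOrder.builderFor(schema)
            .desc("event_time", NullOrder.NULLS_LAST) // descending, nulls last
            .desc("level")                            // descending, nulls first by default
            .build();
        System.out.println(order);
      }
    }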
- DESC - Enum constant in enum class org.apache.iceberg.SortDirection
- DESC - Static variable in class org.apache.spark.sql.catalyst.parser.extensions.