| Interface | Description | 
|---|---|
| ExtendedParser | |
| SupportsReplaceView | |

| Class | Description | 
|---|---|
| ChangelogIterator | An iterator that transforms rows from changelog tables within a single Spark task. | 
| CommitMetadata | A utility class that accepts thread-local commit properties. | 
| ComputeUpdateIterator | An iterator that finds delete/insert rows which represent an update, and converts them into update records from changelog tables within a single Spark task. | 
| ExtendedParser.RawOrderField | |
| FileRewriteCoordinator | |
| IcebergSpark | |
| JobGroupInfo | Captures information about the current job, used for display in the UI. | 
| JobGroupUtils | |
| PathIdentifier | |
| PositionDeletesRewriteCoordinator | |
| PruneColumnsWithoutReordering | |
| PruneColumnsWithReordering | |
| RemoveNetCarryoverIterator | This class computes the net changes across multiple snapshots. | 
| RollbackStagedTable | An implementation of StagedTable that mimics the behavior of Spark's non-atomic CTAS and RTAS. | 
| ScanTaskSetManager | |
| Spark3Util | |
| Spark3Util.CatalogAndIdentifier | Mimics a class inside Spark that is private to LookupCatalog. | 
| Spark3Util.DescribeSchemaVisitor | |
| SparkAggregates | |
| SparkCachedTableCatalog | An internal table catalog that is capable of loading tables from a cache. | 
| SparkCatalog | A Spark TableCatalog implementation that wraps an Iceberg Catalog (see the configuration sketch after this table). | 
| SparkContentFile<F> | |
| SparkDataFile | |
| SparkDeleteFile | |
| SparkExceptionUtil | |
| SparkExecutorCache | An executor cache for reducing the computation and IO overhead in tasks. | 
| SparkFilters | |
| SparkFunctionCatalog | A function catalog that can be used to resolve Iceberg functions without a metastore connection. | 
| SparkReadConf | A class for common Iceberg configs for Spark reads. | 
| SparkReadOptions | Spark DataFrame read options (a read/write usage sketch follows this table). | 
| SparkSchemaUtil | Helper methods for working with Spark/Hive metadata (a schema conversion sketch follows this table). | 
| SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces> | A Spark catalog that can also load non-Iceberg tables. | 
| SparkSQLProperties | |
| SparkStructLike | |
| SparkTableCache | |
| SparkTableUtil | Java version of the original SparkTableUtil.scala: https://github.com/apache/iceberg/blob/apache-iceberg-0.8.0-incubating/spark/src/main/scala/org/apache/iceberg/spark/SparkTableUtil.scala | 
| SparkTableUtil.SparkPartition | Class representing a table partition. | 
| SparkUtil | |
| SparkV2Filters | |
| SparkValueConverter | A utility class that converts Spark values to Iceberg's internal representation. | 
| SparkWriteConf | A class for common Iceberg configs for Spark writes. | 
| SparkWriteOptions | Spark DataFrame write options. | 
| SparkWriteRequirements | A set of requirements such as distribution and ordering reported to Spark during writes. | 
| SparkWriteUtil | A utility that contains helper methods for working with Spark writes. |
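
SparkCatalog and SparkSessionCatalog are normally wired up through Spark SQL configuration rather than instantiated directly. The sketch below shows one possible setup, assuming a Spark 3 session; the catalog name `local`, the warehouse path, and the table names are illustrative placeholders, not part of the API.

```java
import org.apache.spark.sql.SparkSession;

public class CatalogConfigExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-catalog-example")
        .master("local[*]")
        // Register an Iceberg-aware catalog named "local" backed by a Hadoop warehouse.
        .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.local.type", "hadoop")
        .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
        // SparkSessionCatalog wraps the built-in session catalog so that
        // non-Iceberg tables keep resolving through it (requires Hive support here).
        .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
        .config("spark.sql.catalog.spark_catalog.type", "hive")
        .getOrCreate();

    spark.sql("CREATE TABLE IF NOT EXISTS local.db.events (id BIGINT, data STRING) USING iceberg");
    spark.sql("SELECT * FROM local.db.events").show();
    spark.stop();
  }
}
```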
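SparkReadOptions and SparkWriteOptions define the string keys accepted by the DataFrame reader and writer. A minimal sketch, assuming the `local.db.events` table from the previous example exists, a copy table has already been created, and the snapshot id is a placeholder value:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ReadWriteOptionsExample {
  public static void main(String[] args) throws Exception {
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-read-write-options")
        .getOrCreate();

    // Time-travel read: "snapshot-id" is one of the keys listed in SparkReadOptions.
    Dataset<Row> df = spark.read()
        .format("iceberg")
        .option("snapshot-id", 1234567890123456789L)
        .load("local.db.events");

    // Append write: "target-file-size-bytes" is one of the keys listed in SparkWriteOptions.
    // The target table local.db.events_copy is assumed to exist.
    df.writeTo("local.db.events_copy")
        .option("target-file-size-bytes", "134217728")
        .append();

    spark.stop();
  }
}
```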
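SparkSchemaUtil is mostly used for converting between Iceberg and Spark schema representations. A small sketch of that round trip, assuming the `convert` overloads for `Schema` and `StructType`; note that Spark's StructType carries no field ids, so the converted-back schema gets fresh ids.

```java
import org.apache.iceberg.Schema;
import org.apache.iceberg.spark.SparkSchemaUtil;
import org.apache.iceberg.types.Types;
import org.apache.spark.sql.types.StructType;

public class SchemaConversionExample {
  public static void main(String[] args) {
    // An Iceberg schema with a required id column and an optional data column.
    Schema icebergSchema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));

    // Iceberg -> Spark, then Spark -> Iceberg.
    StructType sparkType = SparkSchemaUtil.convert(icebergSchema);
    Schema roundTripped = SparkSchemaUtil.convert(sparkType);

    System.out.println(sparkType.treeString());
    System.out.println(roundTripped);
  }
}
```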