Interface ExtendedSupportsDelete

  • All Superinterfaces:
    org.apache.spark.sql.connector.catalog.SupportsDelete

  • All Known Implementing Classes:
    SparkTable, StagedSparkTable

    public interface ExtendedSupportsDelete
    extends org.apache.spark.sql.connector.catalog.SupportsDelete
    • Method Summary

      Modifier and Type    Method and Description
      default boolean      canDeleteWhere(org.apache.spark.sql.sources.Filter[] filters)
                           Checks if it is possible to delete data from a data source table that matches filter expressions.
      • Methods inherited from interface org.apache.spark.sql.connector.catalog.SupportsDelete

        deleteWhere
    • Method Detail

      • canDeleteWhere

        default boolean canDeleteWhere(org.apache.spark.sql.sources.Filter[] filters)
        Checks if it is possible to delete data from a data source table that matches filter expressions.

        Rows should be deleted from the data source if and only if all of the filter expressions match. That is, the expressions must be interpreted as a set of filters that are ANDed together.

        Spark will call this method to check whether the delete can be performed without significant effort. If it cannot, Spark will try to rewrite the delete operation, provided the data source table supports row-level operations.

        Parameters:
        filters - filter expressions, used to select rows to delete when all expressions match
        Returns:
        true if the delete operation can be performed
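
        A minimal sketch of how a table might implement this interface, assuming ExtendedSupportsDelete is on the classpath (its package is not shown on this page). The class name PartitionedDeleteTable, the isMetadataFilter and isPartitionColumn helpers, and the "dt" partition column are illustrative assumptions rather than part of any API; the class is left abstract so the remaining Table methods inherited through SupportsDelete can be omitted.

        import org.apache.spark.sql.sources.EqualTo;
        import org.apache.spark.sql.sources.Filter;
        import org.apache.spark.sql.sources.In;
        import org.apache.spark.sql.sources.IsNull;

        // Hypothetical table implementation; names below are illustrative only.
        public abstract class PartitionedDeleteTable implements ExtendedSupportsDelete {

          @Override
          public boolean canDeleteWhere(Filter[] filters) {
            // The filters are implicitly ANDed together, so the delete is cheap
            // only if every expression can be answered with metadata alone.
            for (Filter filter : filters) {
              if (!isMetadataFilter(filter)) {
                return false;
              }
            }
            return true;
          }

          @Override
          public void deleteWhere(Filter[] filters) {
            // Inherited from SupportsDelete: perform the actual metadata-only
            // delete, e.g. by dropping the matching partitions (details omitted).
          }

          // Treat simple predicates on partition columns as metadata-only filters.
          private boolean isMetadataFilter(Filter filter) {
            if (filter instanceof EqualTo) {
              return isPartitionColumn(((EqualTo) filter).attribute());
            } else if (filter instanceof In) {
              return isPartitionColumn(((In) filter).attribute());
            } else if (filter instanceof IsNull) {
              return isPartitionColumn(((IsNull) filter).attribute());
            }
            return false;
          }

          private boolean isPartitionColumn(String name) {
            // Illustrative stand-in for a lookup against the table's partition spec.
            return "dt".equals(name);
          }
        }

        With such a table, a statement like DELETE FROM t WHERE dt = '2021-11-01' would arrive as a single EqualTo filter on dt, canDeleteWhere would return true, and Spark could call deleteWhere directly instead of rewriting the delete as a row-level operation.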