[SQL][DOC][MINOR] update documents for Table and WriteBuilder
## What changes were proposed in this pull request?

Update the docs to reflect the changes made by https://github.com/apache/spark/pull/24129

## How was this patch tested?

N/A

Closes #24658 from cloud-fan/comment.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
commit 1e0facb60d
parent 20fb01bbea
```diff
@@ -30,8 +30,8 @@ import java.util.Set;
  * implementation can be a directory on the file system, a topic of Kafka, or a table in the
  * catalog, etc.
  * <p>
- * This interface can mixin the following interfaces to support different operations, like
- * {@code SupportsRead}.
+ * This interface can mixin {@code SupportsRead} and {@code SupportsWrite} to provide data reading
+ * and writing ability.
  * <p>
  * The default implementation of {@link #partitioning()} returns an empty array of partitions, and
  * the default implementation of {@link #properties()} returns an empty map. These should be
```
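The rewritten Javadoc describes `Table` as a host for capability mixins such as `SupportsRead` and `SupportsWrite`, with empty-by-default `partitioning()` and `properties()`. A minimal standalone sketch of that pattern (the names mirror the Spark interfaces, but every signature here is a simplified placeholder, not the real Spark API):

```java
import java.util.Collections;
import java.util.Map;

// Simplified, self-contained sketch of the mixin pattern described in the diff above.
// These are NOT the real Spark DataSource V2 interfaces; signatures are placeholders.
interface Table {
    String name();

    // As the updated Javadoc notes: defaults are an empty partitioning array
    // and an empty properties map.
    default String[] partitioning() { return new String[0]; }
    default Map<String, String> properties() { return Collections.emptyMap(); }
}

// Capability mixins: a source opts into reading/writing by implementing them.
interface SupportsRead extends Table {
    String describeScan();  // placeholder for a scan-builder factory
}

interface SupportsWrite extends Table {
    String describeWrite(); // placeholder for a write-builder factory
}

// A table that supports both operations simply mixes in both interfaces.
class KafkaTopicTable implements SupportsRead, SupportsWrite {
    @Override public String name() { return "events_topic"; }
    @Override public String describeScan() { return "scan of events_topic"; }
    @Override public String describeWrite() { return "write to events_topic"; }
}
```

The mixin style keeps the base `Table` contract small: capabilities are advertised by the interfaces a source chooses to implement rather than by optional methods on one large interface.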
```diff
@@ -70,6 +70,12 @@ public interface WriteBuilder {
       " does not support batch write");
   }
 
+  /**
+   * Returns a {@link StreamingWrite} to write data to streaming source. By default this method
+   * throws exception, data sources must overwrite this method to provide an implementation, if the
+   * {@link Table} that creates this write returns {@link TableCapability#STREAMING_WRITE} support
+   * in its {@link Table#capabilities()}.
+   */
   default StreamingWrite buildForStreaming() {
     throw new UnsupportedOperationException(getClass().getName() +
       " does not support streaming write");
```
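The Javadoc added above documents the convention that `buildForStreaming()` has a throwing default, so only sources that advertise streaming-write capability need to override it. A self-contained sketch of that "default method throws unless overridden" pattern (the class names are illustrative stand-ins, not the actual Spark classes):

```java
// Standalone illustration of the pattern used by WriteBuilder.buildForStreaming();
// StreamingWrite and WriteBuilder here are local stand-ins, not the Spark types.
interface StreamingWrite { }

interface WriteBuilder {
    // Sources that do not support streaming writes inherit this throwing default,
    // mirroring the behavior documented in the diff above.
    default StreamingWrite buildForStreaming() {
        throw new UnsupportedOperationException(getClass().getName() +
            " does not support streaming write");
    }
}

// A batch-only builder: inherits the default, so calling it throws.
class BatchOnlyBuilder implements WriteBuilder { }

// A streaming-capable builder overrides the default with a real implementation.
class StreamingCapableBuilder implements WriteBuilder {
    @Override
    public StreamingWrite buildForStreaming() {
        return new StreamingWrite() { };
    }
}
```

The benefit of the default is API evolution: adding `buildForStreaming()` to the interface does not break existing batch-only implementations, while callers get a clear `UnsupportedOperationException` naming the offending class.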