[SPARK-27046][DSTREAMS] Remove SPARK-19185 related references from documentation

## What changes were proposed in this pull request?

SPARK-19185 is resolved, so the related references can be removed from the documentation.

## How was this patch tested?

    cd docs/
    SKIP_API=1 jekyll build

Manual webpage check.

Closes #23959 from gaborgsomogyi/SPARK-27046.

Authored-by: Gabor Somogyi <gabor.g.somogyi@gmail.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>

@@ -96,7 +96,7 @@ In most cases, you should use `LocationStrategies.PreferConsistent` as shown above.
 The cache for consumers has a default maximum size of 64. If you expect to be handling more than (64 * number of executors) Kafka partitions, you can change this setting via `spark.streaming.kafka.consumer.cache.maxCapacity`.
-If you would like to disable the caching for Kafka consumers, you can set `spark.streaming.kafka.consumer.cache.enabled` to `false`. Disabling the cache may be needed to workaround the problem described in SPARK-19185. This property may be removed in later versions of Spark, once SPARK-19185 is resolved.
+If you would like to disable the caching for Kafka consumers, you can set `spark.streaming.kafka.consumer.cache.enabled` to `false`.
 The cache is keyed by topicpartition and group.id, so use a **separate** `group.id` for each call to `createDirectStream`.
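
For context (not part of this patch), here is a minimal sketch of how the cache settings and the per-stream `group.id` advice from the hunk above might be applied with the spark-streaming-kafka-0-10 API. The broker address, topic, group id, batch interval, and the chosen `maxCapacity` value are illustrative placeholders:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val conf = new SparkConf()
  .setAppName("KafkaConsumerCacheExample") // placeholder app name
  // Raise the consumer cache size if you expect more than (64 * number of executors) partitions.
  .set("spark.streaming.kafka.consumer.cache.maxCapacity", "128")
  // Or disable the cache entirely (this is the option the removed SPARK-19185 note referred to).
  // .set("spark.streaming.kafka.consumer.cache.enabled", "false")

val ssc = new StreamingContext(conf, Seconds(10))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092", // placeholder broker
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  // The cache is keyed by topic partition and group.id, so each createDirectStream
  // call should use its own group.id.
  "group.id" -> "example-group-1",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](Seq("topicA"), kafkaParams)
)
```

A second `createDirectStream` call in the same application would use a different `group.id` so that its cached consumers do not collide with the ones created above.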