[SPARK-29148][CORE][FOLLOW-UP] Don't show the dynamic allocation warning when it's disabled

### What changes were proposed in this pull request?

Currently, after https://github.com/apache/spark/pull/27313, Spark shows a warning about dynamic allocation even though dynamic allocation is disabled by default:

```bash
$ ./bin/spark-shell
```

```
...
20/02/18 11:04:56 WARN ResourceProfile: Please ensure that the number of slots available on your executors is
limited by the number of cores to task cpus and not another custom resource. If cores is not the limiting resource
then dynamic allocation will not work properly!
```

This PR brings back the configuration check for this warning. The check seems to have been mistakenly removed at https://github.com/apache/spark/pull/27313/files#diff-364713d7776956cb8b0a771e9b62f82dL2841.
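
For illustration, here is a minimal, self-contained sketch of the restored guard (not the PR's code verbatim). The `isDynamicAllocationEnabled` helper below is a simplified stand-in for `org.apache.spark.util.Utils.isDynamicAllocationEnabled`, which also accounts for local mode and the testing flag:

```scala
import org.apache.spark.SparkConf

object WarnOnlyWhenEnabled {
  // Simplified stand-in for Utils.isDynamicAllocationEnabled: the real helper
  // also considers local mode and spark.dynamicAllocation.testing.
  def isDynamicAllocationEnabled(conf: SparkConf): Boolean =
    conf.getBoolean("spark.dynamicAllocation.enabled", defaultValue = false)

  def maybeWarn(conf: SparkConf, shouldCheckExecCores: Boolean): Unit = {
    // Only warn when the warning is actionable, i.e. dynamic allocation is on.
    if (!shouldCheckExecCores && isDynamicAllocationEnabled(conf)) {
      Console.err.println("WARN: cores may not be the limiting resource, " +
        "so dynamic allocation may not work properly!")
    }
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false)
    maybeWarn(conf, shouldCheckExecCores = false) // silent: disabled by default

    conf.set("spark.dynamicAllocation.enabled", "true")
    maybeWarn(conf, shouldCheckExecCores = false) // now prints the warning
  }
}
```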

### Why are the changes needed?

To remove a false warning.

### Does this PR introduce any user-facing change?

Yes, the warning is no longer shown when dynamic allocation is disabled. However, this is a master-only change, so it is not user-facing for end users of released versions.

### How was this patch tested?

Manually tested.
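
One hypothetical way to spot-check the gating condition from `./bin/spark-shell` (not part of the PR's testing; it only inspects the configuration the restored check reads):

```scala
// Run inside ./bin/spark-shell. With defaults, dynamic allocation is off,
// so the ResourceProfile warning should no longer appear at startup.
sc.getConf.getBoolean("spark.dynamicAllocation.enabled", false)
// res0: Boolean = false
```

Launching the shell with `--conf spark.dynamicAllocation.enabled=true` should still surface the warning when executor cores cannot be validated.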

Closes #27615 from HyukjinKwon/SPARK-29148.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>

```diff
@@ -183,7 +183,7 @@ class ResourceProfile(
          "no corresponding task resource request was specified.")
      }
    }
-    if(!shouldCheckExecCores) {
+    if(!shouldCheckExecCores && Utils.isDynamicAllocationEnabled(sparkConf)) {
      // if we can't rely on the executor cores config throw a warning for user
      logWarning("Please ensure that the number of slots available on your " +
        "executors is limited by the number of cores to task cpus and not another " +
```