6c2d351f54
## What changes were proposed in this pull request?

Although we use the shebang `#!/usr/bin/env bash`, `minikube docker-env` emits commands for the user's default shell, so in a non-`bash` environment it returns commands that are invalid in `bash` and causes failures at `eval`. We had better add the `--shell bash` option explicitly in our `bash` script.

```bash
$ bash -c 'eval $(minikube docker-env)'
bash: line 0: set: -g: invalid option
set: usage: set [-abefhkmnptuvxBCHP] [-o option-name] [--] [arg ...]
bash: line 0: set: -g: invalid option
set: usage: set [-abefhkmnptuvxBCHP] [-o option-name] [--] [arg ...]
bash: line 0: set: -g: invalid option
set: usage: set [-abefhkmnptuvxBCHP] [-o option-name] [--] [arg ...]
bash: line 0: set: -g: invalid option
set: usage: set [-abefhkmnptuvxBCHP] [-o option-name] [--] [arg ...]

$ bash -c 'eval $(minikube docker-env --shell bash)'
```

## How was this patch tested?

Manual. Run the script in a non-bash shell environment.

```
bin/docker-image-tool.sh -m -t testing build
```

Closes #24517 from dongjoon-hyun/SPARK-27626.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
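The failure mode can be reproduced without minikube. `minikube docker-env` prints export commands in the syntax of the *detected* login shell; for a `fish` user it emits `set -gx VAR value`, which the `bash` `set` builtin rejects, while `--shell bash` forces `export`-style output. A minimal sketch (the `DOCKER_HOST` value is a hypothetical placeholder, not real minikube output):

```shell
#!/usr/bin/env bash
# Simulated minikube output for a fish login shell vs. forced bash output.
fish_style='set -gx DOCKER_HOST "tcp://192.168.99.100:2376"'
bash_style='export DOCKER_HOST="tcp://192.168.99.100:2376"'

# The fish-style command fails under bash: `set` has no -g option.
if ! eval "$fish_style" 2>/dev/null; then
  echo "fish-style output is not valid bash"
fi

# The bash-style command (what --shell bash guarantees) works as intended.
eval "$bash_style"
echo "DOCKER_HOST=$DOCKER_HOST"
```

This is why the shebang alone is not enough: the script runs under `bash`, but `minikube docker-env` keys its output off the invoking user's shell unless told otherwise.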
Contents of `bin/` at this commit:

- beeline
- beeline.cmd
- docker-image-tool.sh
- find-spark-home
- find-spark-home.cmd
- load-spark-env.cmd
- load-spark-env.sh
- pyspark
- pyspark.cmd
- pyspark2.cmd
- run-example
- run-example.cmd
- spark-class
- spark-class.cmd
- spark-class2.cmd
- spark-shell
- spark-shell.cmd
- spark-shell2.cmd
- spark-sql
- spark-sql.cmd
- spark-sql2.cmd
- spark-submit
- spark-submit.cmd
- spark-submit2.cmd
- sparkR
- sparkR.cmd
- sparkR2.cmd