[SPARK-29433][WEBUI] Fix tooltip stages table
### What changes were proposed in this pull request?

In the Web UI Stages table, the tooltips of the Input and Output columns are not correct. The current tooltip messages are "Bytes and records read from Hadoop or from Spark storage." and "Bytes and records written to Hadoop." These columns show only bytes, not records.

![image](https://user-images.githubusercontent.com/12819544/66608286-85a0e480-ebb6-11e9-812a-9760bea53664.png)
![image](https://user-images.githubusercontent.com/12819544/66608323-96515a80-ebb6-11e9-9e5f-e3f2cc99a3b3.png)
![image](https://user-images.githubusercontent.com/12819544/66608450-de707d00-ebb6-11e9-84e3-0917b5cfe6f6.png)
![image](https://user-images.githubusercontent.com/12819544/66608468-eaf4d580-ebb6-11e9-8c5b-2a9a290bea9c.png)

### Why are the changes needed?

Simple correction of a tooltip.

### Does this PR introduce any user-facing change?

Yes, tooltip correction.

### How was this patch tested?

Manual testing.

Closes #26084 from planga82/feature/SPARK-29433_Tooltip_correction.

Lead-authored-by: Pablo <soypab@gmail.com>
Co-authored-by: Unknown <soypab@gmail.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>
This commit is contained in:
parent b5b1b69f79
commit 782a94d289
```diff
@@ -31,9 +31,9 @@ private[spark] object ToolTips {
   val SHUFFLE_READ_BLOCKED_TIME =
     "Time that the task spent blocked waiting for shuffle data to be read from remote machines."

-  val INPUT = "Bytes and records read from Hadoop or from Spark storage."
+  val INPUT = "Bytes read from Hadoop or from Spark storage."

-  val OUTPUT = "Bytes and records written to Hadoop."
+  val OUTPUT = "Bytes written to Hadoop."

   val STORAGE_MEMORY =
     "Memory used / total available memory for storage of data " +
```
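For reference, the corrected constants can be sketched as a standalone object. In Spark the object is declared `private[spark]`; that modifier is dropped here so the snippet compiles on its own:

```scala
// Sketch of the fixed tooltip constants from this change. In Spark the
// object is `private[spark]`; the modifier is omitted so this compiles
// standalone.
object ToolTips {
  val SHUFFLE_READ_BLOCKED_TIME =
    "Time that the task spent blocked waiting for shuffle data to be read from remote machines."

  // The Stages table's Input and Output columns show bytes only, so the
  // tooltips no longer mention records.
  val INPUT = "Bytes read from Hadoop or from Spark storage."

  val OUTPUT = "Bytes written to Hadoop."
}
```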