[SPARK-15749][SQL] make the error message more meaningful (commit 62d2fa5e99 by Huaxin Gao)
## What changes were proposed in this pull request?

For a table test1 (C1 varchar(10), C2 varchar(10)), when I insert a row using
```
sqlContext.sql("insert into test1 values ('abc', 'def', 1)")
```
I got the following error message:

```
Exception in thread "main" java.lang.RuntimeException: RelationC1#0,C2#1 JDBCRelation(test1)
requires that the query in the SELECT clause of the INSERT INTO/OVERWRITE statement
generates the same number of columns as its schema.
```
The error message is confusing: my simple INSERT statement does not contain a SELECT clause at all.
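
For reference, the same kind of column-count mismatch can be reproduced without a JDBC source. The sketch below is not part of this patch: it assumes a local SparkSession that can create a managed parquet table, and the exact exception text may differ from the JDBC case above, but it hits the same column-count validation on insert.

```
import org.apache.spark.sql.SparkSession

object InsertMismatchRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("insert-column-count-mismatch")
      .master("local[*]")
      .getOrCreate()

    // Target table with two columns.
    spark.sql("CREATE TABLE test1 (C1 STRING, C2 STRING) USING parquet")

    // Three values for a two-column table: the analyzer rejects the insert.
    spark.sql("INSERT INTO test1 VALUES ('abc', 'def', 1)")

    spark.stop()
  }
}
```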

This patch changes the error message to a more general one:

```
Exception in thread "main" java.lang.RuntimeException: RelationC1#0,C2#1 JDBCRelation(test1)
requires that the data to be inserted have the same number of columns as the target table.
```
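
For context, the condition that produces this message boils down to comparing the number of columns in the target relation's schema with the number of columns the inserted data provides. Below is a simplified, standalone sketch of that comparison (a hypothetical helper, not the actual code in Spark's pre-write analysis rules):

```
// Hypothetical standalone illustration of the column-count check, not Spark source.
object InsertColumnCountCheck {
  def checkColumnCount(relationDesc: String,
                       targetColumns: Seq[String],
                       insertedValues: Seq[Any]): Unit = {
    if (targetColumns.size != insertedValues.size) {
      throw new RuntimeException(
        s"Relation $relationDesc requires that the data to be inserted have the same " +
          "number of columns as the target table.")
    }
  }

  def main(args: Array[String]): Unit = {
    // Two-column target, three inserted values: this throws with the new wording.
    checkColumnCount("JDBCRelation(test1)", Seq("C1", "C2"), Seq("abc", "def", 1))
  }
}
```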

## How was this patch tested?

I tested the patch with a simple manual unit test, but since this is a very trivial change, I don't think a new test needs to be checked in.

Author: Huaxin Gao <huaxing@us.ibm.com>

Closes #13492 from huaxingao/spark-15749.