Struct Data Type (Part-1): New struct type, DDL statements and Describe #1114

Merged: 63 commits, May 15, 2018
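For context, a minimal sketch of the DDL and DESCRIBE flow this change enables (stream and column names here are hypothetical; the STRUCT<...> shape follows the grammar change further down):

CREATE STREAM orders (ordertime BIGINT, orderid BIGINT, address STRUCT<number BIGINT, street VARCHAR, zipcode BIGINT>) WITH (kafka_topic='orders_topic', value_format='JSON');

DESCRIBE orders;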
Commits
322f8f2
Cleaned up the grammar.
hjafarpour Jan 29, 2018
0911202
Merge remote-tracking branch 'upstream/master'
hjafarpour Jan 29, 2018
f69c25c
More clean up
hjafarpour Jan 30, 2018
3e6200b
Merge remote-tracking branch 'upstream/master' into KSQL-665-Remove-U…
hjafarpour Jan 30, 2018
415aa49
Added parent reference to AST nodes.
hjafarpour Jan 31, 2018
0253884
Merge remote-tracking branch 'upstream/master'
hjafarpour Jan 31, 2018
3846f4c
Merge branch 'master' of https://github.com/confluentinc/ksql
hjafarpour Jan 31, 2018
7893e2c
Merge branch 'master' of https://github.com/confluentinc/ksql
hjafarpour Feb 6, 2018
f391466
Merge branch 'master' of https://github.com/confluentinc/ksql
hjafarpour Feb 7, 2018
b45efb0
Merge remote-tracking branch 'upstream/master'
hjafarpour Feb 13, 2018
4af44e3
Merge branch 'master' of https://github.com/confluentinc/ksql
hjafarpour Feb 14, 2018
38c5456
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour Feb 14, 2018
f84d561
remove unused/old suppressions. reduce scope on some (#915)
norwood Mar 13, 2018
e885e75
Merge remote-tracking branch 'upstream/4.1.x'
hjafarpour Mar 13, 2018
fa1a261
Merge branch 'master' of https://github.com/confluentinc/ksql
hjafarpour Mar 13, 2018
0f68684
rename kafkaTopic->topic to match SourceInfo
norwood Mar 12, 2018
ef41a33
proper typing for KafkaTopicInfo
norwood Mar 12, 2018
e1821cf
Fix integration test annotations and remove maven-surefire-plugin ove…
ewencp Mar 13, 2018
511f5e3
Merge remote-tracking branch 'origin/4.1.x'
Mar 14, 2018
7032379
KSQL-739 return kafka cluster id as part of server info (#932)
xvrl Mar 14, 2018
8b58d47
Merge remote-tracking branch 'origin/4.1.x'
dguy Mar 15, 2018
1e96708
reduce checkstyle suppressions (#938)
dguy Mar 15, 2018
41eeb16
remove AbbreviationAsWordInName suppression (#937)
dguy Mar 15, 2018
2e1d04d
fix build (#946)
dguy Mar 15, 2018
c2e05f3
Merge remote-tracking branch 'origin/4.1.x'
rodesai Mar 15, 2018
02983f6
Merge branch '4.1.x'
Mar 15, 2018
e19bfb7
Merge remote-tracking branch 'origin/4.1.x'
Mar 15, 2018
dabfab2
Merge remote-tracking branch 'upstream/4.1.x'
hjafarpour Mar 16, 2018
46ce7e8
Merge remote-tracking branch 'origin/4.1.x'
dguy Mar 16, 2018
d5493ba
cluster id check subsumes broker compatibility (#959)
xvrl Mar 16, 2018
b873594
Guava 21.0 -> 24.0 (#869)
sullis Mar 16, 2018
6a60bf7
Merge branch '4.1.x'
rodesai Mar 16, 2018
7c84bf5
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour Mar 16, 2018
01b2871
Minor changes.
hjafarpour Mar 19, 2018
2050f03
Merge branch 'master' into Add-Parent-Reference-To-AST-Nodes
hjafarpour Mar 20, 2018
aab1f09
Need to go to another branch.
hjafarpour Mar 21, 2018
1bf87ba
Merge remote-tracking branch 'upstream/master' into Add-Struct-Type
hjafarpour Mar 21, 2018
f4083d8
Struct type support in DDL statements.
hjafarpour Mar 22, 2018
127babb
Added unit tests.
hjafarpour Mar 22, 2018
17334bb
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour Mar 28, 2018
df70636
Merged with master.
hjafarpour Mar 28, 2018
c4626bc
Cleaned the merge!
hjafarpour Mar 28, 2018
f75ea11
More clean up.
hjafarpour Mar 28, 2018
604cbfd
More clean up.
hjafarpour Mar 28, 2018
dc6e63c
Initialize the node parent to Optional.empty().
hjafarpour Mar 29, 2018
f4745ea
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour Mar 29, 2018
c3bfd72
Minor fix.
hjafarpour Mar 29, 2018
096d37e
Merge remote-tracking branch 'upstream/master' into Add-Struct-Type
hjafarpour Apr 3, 2018
e7399d6
Describe with struct type works!
hjafarpour Apr 3, 2018
c106874
Removed old doc files.
hjafarpour Apr 3, 2018
434bf5d
Merge remote-tracking branch 'upstream/master' into Add-Struct-Type
hjafarpour Apr 12, 2018
6fb6bcb
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour May 2, 2018
5517b54
Merge branch 'Add-Parent-Reference-To-AST-Nodes' into Add-Struct-Type
hjafarpour May 2, 2018
787a8a8
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour May 7, 2018
30bcab0
Applied review feedback.
hjafarpour May 7, 2018
cb50d56
Merge branch 'Add-Parent-Reference-To-AST-Nodes' into Add-Struct-Type
hjafarpour May 7, 2018
573c28a
Minor change
hjafarpour May 7, 2018
8a6e836
Merge remote-tracking branch 'upstream/master' into Add-Parent-Refere…
hjafarpour May 9, 2018
83e06c0
Merge branch 'Add-Parent-Reference-To-AST-Nodes' into Add-Struct-Type
hjafarpour May 9, 2018
4be7b47
Applied review feedback
hjafarpour May 10, 2018
d625753
Applied the feedback.
hjafarpour May 14, 2018
ab80df9
Merge remote-tracking branch 'upstream/master' into Add-Struct-Type
hjafarpour May 14, 2018
87dc0da
Fix findbug issue.
hjafarpour May 14, 2018
@@ -157,7 +157,7 @@ private List<FieldSchemaInfo> buildTestSchema(int size) {
List<FieldSchemaInfo> res = new ArrayList<>();
List<Field> fields = dataSourceBuilder.build().fields();
for (Field field : fields) {
res.add(new FieldSchemaInfo(field.name(), SchemaUtil.getSchemaFieldName(field)));
res.add(new FieldSchemaInfo(field.name(), SchemaUtil.getSchemaFieldType(field)));
}

return res;
@@ -27,6 +27,7 @@
import java.util.List;
import java.util.Optional;
import java.util.Set;
import java.util.stream.Collectors;

import static org.apache.avro.Schema.create;
import static org.apache.avro.Schema.createArray;
@@ -171,17 +172,52 @@ public static Schema buildSchemaWithAlias(final Schema schema, final String alia
.put("MAP", "MAP")
.build();

public static String getSchemaFieldName(Field field) {
public static String getSchemaFieldType(Field field) {
if (field.schema().type() == Schema.Type.ARRAY) {
return "ARRAY[" + TYPE_MAP.get(field.schema().valueSchema().type().name()) + "]";
return "ARRAY[" + getSchemaFieldType(field.schema().valueSchema().fields().get(0)) + "]";
} else if (field.schema().type() == Schema.Type.MAP) {
return "MAP[" + TYPE_MAP.get(field.schema().keySchema().type().name()) + ","
+ TYPE_MAP.get(field.schema().valueSchema().type().name()) + "]";
return "MAP[" + getSchemaFieldType(field.schema().keySchema().fields().get(0)) + ","
+ getSchemaFieldType(field.schema().valueSchema().fields().get(0)) + "]";
} else if (field.schema().type() == Schema.Type.STRUCT) {
StringBuilder stringBuilder = new StringBuilder("STRUCT <");
stringBuilder.append(
field.schema().fields().stream()
.map(schemaField -> getSchemaFieldType(schemaField))
.collect(Collectors.joining(", ")));
stringBuilder.append(">");
return stringBuilder.toString();
} else {
return TYPE_MAP.get(field.schema().type().name());
}
}
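// Illustration (hypothetical field, not part of this diff): for a struct
// field with an INT64 "NUMBER" and a STRING "STREET", getSchemaFieldType
// renders "STRUCT <BIGINT, VARCHAR(STRING)>" -- only the field types are
// listed; the named, multi-line form is produced by describeSchema below.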


//TODO: Improve the format with proper indentation.
public static String describeSchema(Schema schema) {
if (schema.type() == Schema.Type.ARRAY) {
return "ARRAY[" + describeSchema(schema.valueSchema()) + "]";
} else if (schema.type() == Schema.Type.MAP) {
return "MAP[" + describeSchema(schema.keySchema()) + ","
+ describeSchema(schema.valueSchema()) + "]";
} else if (schema.type() == Schema.Type.STRUCT) {
StringBuilder stringBuilder = new StringBuilder("STRUCT < ");
boolean addComma = false;
for (Field structField: schema.fields()) {
if (addComma) {
stringBuilder.append(", ");
} else {
addComma = true;
}
stringBuilder
.append("\n\t " + structField.name() + " " + describeSchema(structField.schema()));
}
stringBuilder.append("\n >");
return stringBuilder.toString();
} else {
return TYPE_MAP.get(schema.type().name());
}
}
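// Illustration (hypothetical two-field struct): describeSchema returns
//   "STRUCT < \n\t ID BIGINT, \n\t NAME VARCHAR(STRING)\n >"
// i.e. one indented "name TYPE" entry per field, recursing into nested
// ARRAY/MAP/STRUCT schemas; see the unit test below for a full example.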

public static String getJavaCastString(Schema schema) {
switch (schema.type()) {
case INT32:
@@ -295,4 +295,44 @@ public void shouldReturnFieldNameWithoutAliasAsIs() {
assertThat("Invalid field name", SchemaUtil.getFieldNameWithNoAlias(schema.fields().get(0)),
equalTo(schema.fields().get(0).name()));
}

@Test
public void shouldCreateCorrectSchemaDescription() {
Schema addressSchema = SchemaBuilder.struct()
.field("NUMBER", Schema.INT64_SCHEMA)
.field("STREET", Schema.STRING_SCHEMA)
.field("CITY", Schema.STRING_SCHEMA)
.field("STATE", Schema.STRING_SCHEMA)
.field("ZIPCODE", Schema.INT64_SCHEMA)
.build();

SchemaBuilder schemaBuilder = SchemaBuilder.struct();
Schema structSchema = schemaBuilder
.field("ordertime", Schema.INT64_SCHEMA)
.field("orderid", Schema.INT64_SCHEMA)
.field("itemid", Schema.STRING_SCHEMA)
.field("orderunits", Schema.FLOAT64_SCHEMA)
.field("arraycol",schemaBuilder.array(Schema.FLOAT64_SCHEMA))
.field("mapcol", schemaBuilder.map(Schema.STRING_SCHEMA, Schema.FLOAT64_SCHEMA))
.field("address", addressSchema).build();

String schemaDescription = SchemaUtil.describeSchema(structSchema);

assertThat(schemaDescription, equalTo("STRUCT < \n"
+ "\t ordertime BIGINT, \n"
+ "\t orderid BIGINT, \n"
+ "\t itemid VARCHAR(STRING), \n"
+ "\t orderunits DOUBLE, \n"
+ "\t arraycol ARRAY[DOUBLE], \n"
+ "\t mapcol MAP[VARCHAR(STRING),DOUBLE], \n"
+ "\t address STRUCT < \n"
+ "\t NUMBER BIGINT, \n"
Contributor: would be nice to add an indent here. I'm fine with punting this to another PR though.

Contributor Author: Yes, for now I'll leave it as is. We may want to provide a unified UX for both the web-based UI and the CLI.

+ "\t STREET VARCHAR(STRING), \n"
+ "\t CITY VARCHAR(STRING), \n"
+ "\t STATE VARCHAR(STRING), \n"
+ "\t ZIPCODE BIGINT\n"
+ " >\n"
+ " >"));
}

}
@@ -35,10 +35,12 @@
import io.confluent.ksql.util.KsqlPreconditions;
import io.confluent.ksql.util.SchemaUtil;
import io.confluent.ksql.util.StringUtil;
import io.confluent.ksql.util.TypeUtil;
import io.confluent.ksql.util.timestamp.TimestampExtractionPolicy;
import io.confluent.ksql.util.timestamp.TimestampExtractionPolicyFactory;



/**
* Base class of create table/stream command
*/
@@ -140,7 +142,7 @@ private SchemaBuilder getStreamTableSchema(List<TableElement> tableElementList)
}
tableSchema = tableSchema.field(
tableElement.getName(),
SchemaUtil.getTypeSchema(tableElement.getType())
TypeUtil.getTypeSchema(tableElement.getType())
);
}

@@ -143,8 +143,8 @@ private AbstractStreamCreateStatement addAvroFields(
) {
List<TableElement> elements = new ArrayList<>();
for (Field field : schema.fields()) {
TableElement tableElement = new TableElement(field.name().toUpperCase(), SchemaUtil
.getSqlTypeName(field.schema()));
TableElement tableElement = new TableElement(field.name().toUpperCase(),
TypeUtil.getKsqlType(field.schema()));
elements.add(tableElement);
}
StringLiteral schemaIdLiteral = new StringLiteral(String.format("%d", schemaId));
@@ -35,11 +35,13 @@
import io.confluent.ksql.parser.tree.DropTable;
import io.confluent.ksql.parser.tree.DropTopic;
import io.confluent.ksql.parser.tree.Expression;
import io.confluent.ksql.parser.tree.PrimitiveType;
import io.confluent.ksql.parser.tree.QualifiedName;
import io.confluent.ksql.parser.tree.RegisterTopic;
import io.confluent.ksql.parser.tree.SetProperty;
import io.confluent.ksql.parser.tree.StringLiteral;
import io.confluent.ksql.parser.tree.TableElement;
import io.confluent.ksql.parser.tree.Type;
import io.confluent.ksql.util.KafkaTopicClient;
import io.confluent.ksql.util.KsqlException;

@@ -92,8 +94,8 @@ public void shouldCreateCommandForCreateTable() {
tableProperties.put(DdlConfig.KEY_NAME_PROPERTY, new StringLiteral("COL1"));
final DdlCommand result = commandFactories.create(sqlExpression,
new CreateTable(QualifiedName.of("foo"),
Arrays.asList(new TableElement("COL1", "BIGINT"), new TableElement
("COL2", "VARCHAR")), true,
Arrays.asList(new TableElement("COL1", new PrimitiveType(Type.KsqlType.BIGINT)), new TableElement
("COL2", new PrimitiveType(Type.KsqlType.STRING))), true,
tableProperties),
Collections.emptyMap());

@@ -108,8 +110,8 @@ public void shouldFailCreateTableIfKeyNameIsIncorrect() {
try {
final DdlCommand result = commandFactories.create(sqlExpression,
new CreateTable(QualifiedName.of("foo"),
Arrays.asList(new TableElement("COL1", "BIGINT"), new TableElement
("COL2", "VARCHAR")), true,
Arrays.asList(new TableElement("COL1", new PrimitiveType(Type.KsqlType.BIGINT)), new TableElement
("COL2", new PrimitiveType(Type.KsqlType.STRING))), true,
tableProperties),
Collections.emptyMap());

@@ -128,8 +130,8 @@ public void shouldFailCreateTableIfTimestampColumnNameIsIncorrect() {
try {
commandFactories.create(sqlExpression,
new CreateTable(QualifiedName.of("foo"),
Arrays.asList(new TableElement("COL1", "BIGINT"), new TableElement
("COL2", "VARCHAR")), true,
Arrays.asList(new TableElement("COL1", new PrimitiveType(Type.KsqlType.BIGINT)), new TableElement
("COL2", new PrimitiveType(Type.KsqlType.STRING))), true,
tableProperties),
Collections.emptyMap());

@@ -144,8 +146,8 @@ public void shouldFailCreateTableIfKeyIsNotProvided() {
try {
commandFactories.create(sqlExpression,
new CreateTable(QualifiedName.of("foo"),
Arrays.asList(new TableElement("COL1", "BIGINT"), new TableElement
("COL2", "VARCHAR")), true, properties),
Arrays.asList(new TableElement("COL1", new PrimitiveType(Type.KsqlType.BIGINT)), new TableElement
("COL2", new PrimitiveType(Type.KsqlType.STRING))), true, properties),
Collections.emptyMap());

} catch (KsqlException e) {
@@ -33,8 +33,11 @@
import io.confluent.ksql.metastore.MetaStoreImpl;
import io.confluent.ksql.parser.KsqlParser;
import io.confluent.ksql.parser.tree.AbstractStreamCreateStatement;
import io.confluent.ksql.parser.tree.Array;
import io.confluent.ksql.parser.tree.Map;
import io.confluent.ksql.parser.tree.Statement;
import io.confluent.ksql.parser.tree.TableElement;
import io.confluent.ksql.parser.tree.Type;
import io.confluent.ksql.serde.avro.KsqlAvroTopicSerDe;

import static org.easymock.EasyMock.anyObject;
@@ -70,19 +73,33 @@ public void shouldPassAvroCheck() throws Exception {
expect(schemaRegistryClient.getLatestSchemaMetadata(anyString())).andReturn(schemaMetadata);
replay(schemaRegistryClient);
AbstractStreamCreateStatement abstractStreamCreateStatement = getAbstractStreamCreateStatement
("CREATE STREAM S1 WITH "
+ "(kafka_topic='s1_topic', "
("CREATE STREAM S1 WITH (kafka_topic='s1_topic', "
+ "value_format='avro' );");
Pair<AbstractStreamCreateStatement, String> checkResult = avroUtil.checkAndSetAvroSchema(abstractStreamCreateStatement, new HashMap<>(), schemaRegistryClient);
AbstractStreamCreateStatement newAbstractStreamCreateStatement = checkResult.getLeft();
assertThat(newAbstractStreamCreateStatement.getElements(), equalTo(Arrays.asList(
new TableElement("ORDERTIME", "BIGINT"),
new TableElement("ORDERID", "BIGINT"),
new TableElement("ITEMID", "VARCHAR"),
new TableElement("ORDERUNITS", "DOUBLE"),
new TableElement("ARRAYCOL", "ARRAY<DOUBLE>"),
new TableElement("MAPCOL", "MAP<VARCHAR,DOUBLE>")
)));
List<TableElement> tableElements = newAbstractStreamCreateStatement.getElements();
assertThat(tableElements.size(), equalTo(6));
assertThat(tableElements.get(0).getName(), equalTo("ORDERTIME"));
assertThat(tableElements.get(0).getType().getKsqlType(), equalTo(Type.KsqlType.BIGINT));

assertThat(tableElements.get(1).getName(), equalTo("ORDERID"));
assertThat(tableElements.get(1).getType().getKsqlType(), equalTo(Type.KsqlType.BIGINT));

assertThat(tableElements.get(2).getName(), equalTo("ITEMID"));
assertThat(tableElements.get(2).getType().getKsqlType(), equalTo(Type.KsqlType.STRING));

assertThat(tableElements.get(3).getName(), equalTo("ORDERUNITS"));
assertThat(tableElements.get(3).getType().getKsqlType(), equalTo(Type.KsqlType.DOUBLE));

assertThat(tableElements.get(4).getName(), equalTo("ARRAYCOL"));
assertThat(tableElements.get(4).getType().getKsqlType(), equalTo(Type.KsqlType.ARRAY));
assertThat(((Array) tableElements.get(4).getType()).getItemType().getKsqlType(),
equalTo(Type.KsqlType.DOUBLE));

assertThat(tableElements.get(5).getName(), equalTo("MAPCOL"));
assertThat(tableElements.get(5).getType().getKsqlType(), equalTo(Type.KsqlType.MAP));
assertThat(((Map) tableElements.get(5).getType()).getValueType().getKsqlType(),
equalTo(Type.KsqlType.DOUBLE));
}

@Test
@@ -11,7 +11,7 @@
{
"name": "max integer group by",
"statements": [
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE int) WITH (kafka_topic='test_topic', value_format='DELIMITED', key='ID');",
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE integer) WITH (kafka_topic='test_topic',value_format='DELIMITED', key='ID');",
"CREATE TABLE S2 as SELECT id, max(value) FROM test group by id;"
],
"inputs": [
@@ -11,7 +11,7 @@
{
"name": "min integer group by",
"statements": [
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE int) WITH (kafka_topic='test_topic', value_format='DELIMITED', key='ID');",
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE integer) WITH (kafka_topic='test_topic',value_format='DELIMITED', key='ID');",
"CREATE TABLE S2 as SELECT id, min(value) FROM test group by id;"
],
"inputs": [
@@ -11,7 +11,7 @@
{
"name": "sum int",
"statements": [
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE int) WITH (kafka_topic='test_topic', value_format='DELIMITED', key='ID');",
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE integer) WITH (kafka_topic='test_topic',value_format='DELIMITED', key='ID');",
"CREATE TABLE S2 as SELECT id, sum(value) FROM test group by id;"
],
"inputs": [
@@ -11,7 +11,7 @@
{
"name": "topk distinct integer",
"statements": [
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE int) WITH (kafka_topic='test_topic', value_format='JSON', key='ID');",
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE integer) WITH (kafka_topic='test_topic',value_format='JSON', key='ID');",
"CREATE TABLE S2 as SELECT id, topkdistinct(value, 3) as topk FROM test group by id;"
],
"inputs": [
@@ -11,7 +11,7 @@
{
"name": "topk integer",
"statements": [
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE int) WITH (kafka_topic='test_topic', value_format='JSON', key='ID');",
"CREATE STREAM TEST (ID bigint, NAME varchar, VALUE integer) WITH (kafka_topic='test_topic',value_format='JSON', key='ID');",
"CREATE TABLE S2 as SELECT id, topk(value, 3) as topk FROM test group by id;"
],
"inputs": [
@@ -262,10 +262,8 @@ primaryExpression
| STRING #stringLiteral
| BINARY_LITERAL #binaryLiteral
| POSITION '(' valueExpression IN valueExpression ')' #position
| '(' expression (',' expression)+ ')' #rowConstructor
| ROW '(' expression (',' expression)* ')' #rowConstructor
| qualifiedName '(' ASTERISK ')' over? #functionCall
| qualifiedName '(' (expression (',' expression)*)? ')' over? #functionCall
| identifier '->' expression #lambda
| '(' identifier (',' identifier)* ')' '->' expression #lambda
| '(' query ')' #subqueryExpression
@@ -310,7 +308,7 @@ type
: type ARRAY
| ARRAY '<' type '>'
| MAP '<' type ',' type '>'
| ROW '(' identifier type (',' identifier type)* ')'
| STRUCT '<' identifier type (',' identifier type)* '>'
| baseType ('(' typeParameter (',' typeParameter)* ')')?
;
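// e.g. the new STRUCT alternative accepts (hypothetical column type)
//   STRUCT<NUMBER BIGINT, STREET VARCHAR>
// and composes with the other alternatives, as in ARRAY<STRUCT<ID BIGINT>>.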

@@ -386,7 +384,7 @@ number
nonReserved
: SHOW | TABLES | COLUMNS | COLUMN | PARTITIONS | FUNCTIONS | SCHEMAS | CATALOGS | SESSION
| ADD
| OVER | PARTITION | RANGE | ROWS | PRECEDING | FOLLOWING | CURRENT | ROW | MAP | ARRAY
| OVER | PARTITION | RANGE | ROWS | PRECEDING | FOLLOWING | CURRENT | ROW | STRUCT | MAP | ARRAY
| TINYINT | SMALLINT | INTEGER | DATE | TIME | TIMESTAMP | INTERVAL | ZONE
| YEAR | MONTH | DAY | HOUR | MINUTE | SECOND
| EXPLAIN | ANALYZE | FORMAT | TYPE | TEXT | GRAPHVIZ | LOGICAL | DISTRIBUTED
@@ -509,6 +507,7 @@ PRECEDING: 'PRECEDING';
FOLLOWING: 'FOLLOWING';
CURRENT: 'CURRENT';
ROW: 'ROW';
STRUCT: 'STRUCT';
WITH: 'WITH';
RECURSIVE: 'RECURSIVE';
VALUES: 'VALUES';