Use spark-submit (v2.4.6) to write a parquet file to s3://redact/fol/part_filename=ju.txt/ (the parquet file is https://github.com/tooptoop4/presto-1/blob/master/part-00000-e35ff4f1-5a4b-4b5e-afde-fe00fcc1a7b2-c000.snappy.parquet?raw=true).
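For reference, a minimal PySpark sketch of how such a file could be produced (the actual job submitted via spark-submit is not included in this report; the app name and the idea of re-writing the linked sample file are assumptions):

# Sketch only, not the job actually submitted via spark-submit: rewrite the
# sample parquet into the partition directory the external table will point at.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("write-parquet-repro").getOrCreate()

# The sample file from the link above, downloaded locally first.
df = spark.read.parquet("part-00000-e35ff4f1-5a4b-4b5e-afde-fe00fcc1a7b2-c000.snappy.parquet")

# Write one snappy-compressed parquet file under the partition directory
# (s3:// vs s3a:// depends on the cluster's S3 filesystem configuration).
(df.coalesce(1)
   .write
   .mode("overwrite")
   .option("compression", "snappy")
   .parquet("s3a://redact/fol/part_filename=ju.txt/"))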
Use spark-sql to run:
CREATE EXTERNAL TABLE s.t (
  opr_date TIMESTAMP, city STRING, hour DOUBLE, temp DOUBLE, temp_diff DOUBLE,
  temp_normal DOUBLE, dew DOUBLE, cloud_cover DOUBLE, feels_like DOUBLE,
  feels_like_diff DOUBLE, precip DOUBLE, wind_dir DOUBLE, wind_speed DOUBLE,
  hdd DOUBLE, cdd DOUBLE
)
PARTITIONED BY (part_filename STRING)
STORED AS PARQUET
LOCATION 's3a://redact/fol/';
Use spark-sql to run: MSCK REPAIR TABLE s.t;
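The two spark-sql steps above can equivalently be issued from a Hive-enabled PySpark session; a sketch (not the exact CLI invocation used in the report):

# Sketch only: the DDL and MSCK REPAIR above run from a Hive-enabled
# PySpark session instead of the spark-sql CLI.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("create-table-repro")
         .enableHiveSupport()
         .getOrCreate())

spark.sql("""
    CREATE EXTERNAL TABLE s.t (
        opr_date TIMESTAMP, city STRING, hour DOUBLE, temp DOUBLE,
        temp_diff DOUBLE, temp_normal DOUBLE, dew DOUBLE, cloud_cover DOUBLE,
        feels_like DOUBLE, feels_like_diff DOUBLE, precip DOUBLE,
        wind_dir DOUBLE, wind_speed DOUBLE, hdd DOUBLE, cdd DOUBLE
    )
    PARTITIONED BY (part_filename STRING)
    STORED AS PARQUET
    LOCATION 's3a://redact/fol/'
""")

# Discover the part_filename=ju.txt partition that was written outside Hive.
spark.sql("MSCK REPAIR TABLE s.t")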
Query in HiveServer2 (v2.3.4): select * from s.t;
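(For illustration only: the same HiveServer2 query issued through PyHive rather than whatever beeline/JDBC client was actually used; host and username are placeholders.)

# Sketch only, placeholder connection details: run the query against HiveServer2.
from pyhive import hive

conn = hive.connect(host="hiveserver2.example.com", port=10000, username="redact")
cur = conn.cursor()
cur.execute("SELECT * FROM s.t")  # PyHive expects no trailing semicolon
print(cur.fetchall())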
Query in PrestoSQL (server 336, JDBC driver 336): select *, date_format(opr_date,'%f') from s.t;
show create table s.t;
CREATE TABLE s.t (
   opr_date timestamp(3),
   city varchar,
   hour double,
   temp double,
   temp_diff double,
   temp_normal double,
   dew double,
   cloud_cover double,
   feels_like double,
   feels_like_diff double,
   precip double,
   wind_dir double,
   wind_speed double,
   hdd double,
   cdd double,
   part_filename varchar
)
WITH (
   external_location = 's3a://redact/fol',
   format = 'PARQUET',
   partitioned_by = ARRAY['part_filename']
)
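For completeness, a sketch of issuing the same PrestoSQL query programmatically. The report used the JDBC driver (336); this uses presto-python-client instead, purely as an illustration, with placeholder connection details:

# Sketch only: same query via presto-python-client (not the JDBC driver 336
# that was actually used); coordinator host/port/user are placeholders.
import prestodb

conn = prestodb.dbapi.connect(
    host="presto-coordinator.example.com",
    port=8080,
    user="redact",
    catalog="hive",
    schema="s",
)
cur = conn.cursor()
cur.execute("select *, date_format(opr_date, '%f') from s.t")
for row in cur.fetchall():
    print(row)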
cc @martint
This is covered by #3977