
[VL] Support config velox parquet writer option storeDecimalAsInteger for …#11839

Open
lifulong wants to merge 1 commit into apache:main from
lifulong:support_config_parquet_store_decimal_as_integer_for_velox

Conversation

@lifulong (Contributor) commented Mar 27, 2026

…compatible with spark conf spark.sql.parquet.writeLegacyFormat

What changes are proposed in this pull request?

Support the spark.sql.parquet.writeLegacyFormat config when native write is used, for compatibility with vanilla Spark.
Velox doesn't expose any config to control how Parquet decimal columns are physically written.
I added this option via PR facebookincubator/velox#16941.
This feature is useful when Spark or Flink reads Hive tables that use ParquetHiveSerDe in their Hive CREATE TABLE statements, especially with older Hive versions such as 2.1.
With its current write logic, Velox decides whether to write decimals as integers or as fixed_len_byte_array based on precision.
Writing decimals as INT32/INT64 causes Spark and Flink to throw exceptions when reading those Hive tables.

Depends on facebookincubator/velox#16941

How was this patch tested?

Tested in our production environment.

Was this patch authored or co-authored using generative AI tooling?

Co-authored with Cursor.

@github-actions

Run Gluten Clickhouse CI on x86

@zhouyuan zhouyuan changed the title support config velox parquet writer option storeDecimalAsInteger for … [VL] Support config velox parquet writer option storeDecimalAsInteger for … Mar 27, 2026

Labels

CORE (works for Gluten Core), VELOX
