Map conversion fails for decimal types in test_all_types Iceberg table
Description
The test_daft_iceberg_table_collect_correct test is failing for the test_all_types table. The error occurs when trying to convert a Map with decimal types to an Arrow array. This issue seems to be related to a mismatch in the decimal precision between the Iceberg schema and the Daft/Arrow representation.
Error Message
pyo3_runtime.PanicException: called `Result::unwrap()` on an `Err` value: InvalidArgumentError("MapArray expects `field.data_type` to match its inner DataType, but found \nStruct([Field { name: \"key\", data_type: Int64, is_nullable: true, metadata: {} }, Field { name: \"value\", data_type: Decimal(32, 32), is_nullable: true, metadata: {} }])\nvs\n\n\nField { name: \"entries\", data_type: Struct([Field { name: \"key\", data_type: Int64, is_nullable: true, metadata: {} }, Field { name: \"value\", data_type: Decimal(10, 2), is_nullable: true, metadata: {} }]), is_nullable: true, metadata: {} }")
Context
The test_all_types table was commented out of the WORKING_SHOW_COLLECT list in a previous PR as a temporary workaround for this issue.
The problem appears to involve the round-trip conversion of decimal types, particularly when they are nested inside Map data structures.
It is likely connected to recent changes that recursively convert types.
Steps to Reproduce
Run the Iceberg integration tests with pytest tests/integration/iceberg -m 'integration'
Observe the failure in test_daft_iceberg_table_collect_correct for the test_all_types table.
Expected Behavior
The test should pass, correctly handling the decimal types within the Map structure.
Actual Behavior
The test fails due to a mismatch in decimal precision and scale between the Iceberg schema (Decimal(32, 32)) and the Daft/Arrow representation (Decimal(10, 2)).
Possible Solutions
Investigate the decimal type conversion process in the Iceberg to Daft/Arrow pipeline.
Consider adding a mechanism to preserve the original decimal precision during the conversion.
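To make the second suggestion concrete, here is a minimal, hypothetical sketch of the suspected failure mode. It uses a toy type model (DecimalType, MapType, and the two convert functions are invented for illustration and are not Daft's actual code): a recursive type-conversion pass that substitutes a fixed default for decimals produces exactly the Decimal(32, 32) vs Decimal(10, 2) mismatch seen in the panic, while threading the original precision/scale through the recursion keeps the round trip lossless.

```python
# Illustrative sketch only: a simplified type model, not Daft's conversion code.
from dataclasses import dataclass


@dataclass(frozen=True)
class DecimalType:
    precision: int
    scale: int


@dataclass(frozen=True)
class MapType:
    key: object
    value: object


def convert_lossy(t):
    """Buggy: replaces every decimal with a fixed default, dropping (10, 2)."""
    if isinstance(t, MapType):
        return MapType(convert_lossy(t.key), convert_lossy(t.value))
    if isinstance(t, DecimalType):
        return DecimalType(32, 32)  # original precision/scale are lost here
    return t


def convert_preserving(t):
    """Fixed: carries the original precision/scale through the recursion."""
    if isinstance(t, MapType):
        return MapType(convert_preserving(t.key), convert_preserving(t.value))
    if isinstance(t, DecimalType):
        return DecimalType(t.precision, t.scale)
    return t


schema = MapType("int64", DecimalType(10, 2))
assert convert_lossy(schema) == MapType("int64", DecimalType(32, 32))
assert convert_preserving(schema) == schema  # round trip keeps Decimal(10, 2)
```

The fix would likely amount to ensuring the recursive conversion path behaves like convert_preserving for parameterized types such as decimals, so the Map's value field and its child data agree on the same type.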