[SPARK-53342][SQL] Fix Arrow converter to handle multiple record batches in single IPC stream #52090
Conversation
I would like @hvanhovell to take a look though, if he finds some time.
s"Expected ${inputRows.length} rows processed, got ${iterator.totalRowsProcessed}")
}

test("multiple record batches in single IPC stream") {
Thank you for adding this test coverage.
+1, LGTM. Thank you, @grundprinzip.
Merged to master. Could you make backporting PRs for all live release branches, please, @grundprinzip?
…hes in single IPC stream

### What changes were proposed in this pull request?

This PR adds a new method in ArrowConverters that allows properly decoding an Arrow IPC stream, which can contain multiple record batches. All of the other methods can only deal with message streams that contain exactly one record batch.

### Why are the changes needed?

Previously, when an Arrow IPC stream contained multiple record batches, only the first batch would be processed and the remaining batches would be ignored. This resulted in data loss and incorrect results when working with Arrow data that was serialized as a single stream with multiple batches.

### Does this PR introduce _any_ user-facing change?

Yes. This fixes a data correctness issue where users would lose data when processing Arrow streams with multiple batches. The behavior change is that all batches in a stream are now correctly processed instead of only the first one.

### How was this patch tested?

Added comprehensive test cases.

### Was this patch authored or co-authored using generative AI tooling?

Tests Generated-by: Claude Code

🤖 Generated with [Claude Code](https://claude.ai/code)

Closes apache#52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
Thank you, @grundprinzip.
Oh, we cannot touch
Thanks!
… batches in single IPC stream

### What changes were proposed in this pull request?

Cherry-Pick: #52090

This PR adds a new method in ArrowConverters that allows properly decoding an Arrow IPC stream, which can contain multiple record batches. All of the other methods can only deal with message streams that contain exactly one record batch.

### Why are the changes needed?

Previously, when an Arrow IPC stream contained multiple record batches, only the first batch would be processed and the remaining batches would be ignored. This resulted in data loss and incorrect results when working with Arrow data that was serialized as a single stream with multiple batches.

### Does this PR introduce _any_ user-facing change?

Yes. This fixes a data correctness issue where users would lose data when processing Arrow streams with multiple batches. The behavior change is that all batches in a stream are now correctly processed instead of only the first one.

### How was this patch tested?

Added comprehensive test cases.

### Was this patch authored or co-authored using generative AI tooling?

Tests Generated-by: Claude Code

🤖 Generated with [Claude Code](https://claude.ai/code)

Closes #52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <martin.grunddatabricks.com>

Closes #52130 from grundprinzip/SPARK-53342-4.0.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
### What changes were proposed in this pull request?
This PR adds a new method in ArrowConverters that allows properly decoding an Arrow IPC stream, which can contain multiple record batches. All of the other methods can only deal with message streams that contain exactly one record batch.
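As a rough illustration of the decode loop such a method needs, here is a minimal Python sketch using a toy length-prefixed framing in place of the real Arrow IPC format. The names `encode_stream` and `decode_all_batches` are invented for this example and are not the ArrowConverters API; the point is only that the reader must keep consuming batches until it sees the end-of-stream marker, rather than stopping after the first one.

```python
import io
import struct

# Toy model of an Arrow-style IPC stream: zero or more length-prefixed
# "record batches" followed by a zero-length end-of-stream marker.
# Batches are assumed non-empty so a zero length unambiguously means EOS.
END_OF_STREAM = struct.pack("<i", 0)

def encode_stream(batches):
    """Serialize several batches (lists of row strings) into one stream."""
    buf = io.BytesIO()
    for batch in batches:
        payload = "\n".join(batch).encode()
        buf.write(struct.pack("<i", len(payload)))  # length prefix
        buf.write(payload)
    buf.write(END_OF_STREAM)
    return buf.getvalue()

def decode_all_batches(data):
    """Keep reading batches until the end-of-stream marker is reached."""
    buf = io.BytesIO(data)
    batches = []
    while True:
        (length,) = struct.unpack("<i", buf.read(4))
        if length == 0:  # end-of-stream marker: stop here, not earlier
            break
        batches.append(buf.read(length).decode().split("\n"))
    return batches

stream = encode_stream([["a", "b"], ["c"], ["d", "e", "f"]])
print(decode_all_batches(stream))  # all three batches survive the round trip
```

A single-batch reader would return after the first `buf.read(length)`; the loop above is the shape of the fix.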
### Why are the changes needed?
Previously, when an Arrow IPC stream contained multiple record batches, only the first batch would be processed and the remaining batches would be ignored. This resulted in data loss and incorrect results when working with Arrow data that was serialized as a single stream with multiple batches.
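The data loss described above can be sketched in a few lines of illustrative Python. The nested lists stand in for record batches within one stream, and both function names are invented for the example, not taken from the Spark codebase.

```python
# Toy illustration of the bug: a decoder that stops after the first
# record batch silently drops every row in the remaining batches.
batches = [["row1", "row2"], ["row3"], ["row4", "row5"]]

def rows_first_batch_only(stream):
    # Pre-fix behavior: only the first batch is ever consumed.
    return len(stream[0])

def rows_all_batches(stream):
    # Post-fix behavior: every batch in the stream is processed.
    return sum(len(batch) for batch in stream)

print(rows_first_batch_only(batches))  # 2 -> three rows silently lost
print(rows_all_batches(batches))       # 5 -> correct total
```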
### Does this PR introduce _any_ user-facing change?
Yes. This fixes a data correctness issue where users would lose data when processing Arrow streams with multiple batches. The behavior change is that all batches in a stream are now correctly processed instead of only the first one.
### How was this patch tested?
Added comprehensive test cases.
### Was this patch authored or co-authored using generative AI tooling?
Tests Generated-by: Claude Code
🤖 Generated with Claude Code