Conversation


@grundprinzip grundprinzip commented Aug 21, 2025

What changes were proposed in this pull request?

This PR adds a new method in ArrowConverters that allows properly decoding an Arrow IPC stream, which can contain multiple record batches. All of the other methods can only deal with message streams that contain exactly one record batch.

Why are the changes needed?

Previously, when an Arrow IPC stream contained multiple record batches, only the first batch would be processed and the remaining batches would be ignored. This resulted in data loss and incorrect results when working with Arrow data that was serialized as a single stream with multiple batches.
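The failure mode described above can be sketched with a toy stream. This is a hedged illustration of the bug pattern, not Spark's actual ArrowConverters code: the frame layout (4-byte big-endian length prefix, zero length as the end-of-stream marker, loosely mirroring Arrow's EOS convention) and all function names are assumptions made for this sketch.

```python
import io
import struct

def write_stream(batches):
    # Serialize several "record batches" into one length-prefixed stream,
    # terminated by a zero-length end-of-stream marker.
    buf = io.BytesIO()
    for payload in batches:
        buf.write(struct.pack(">I", len(payload)))
        buf.write(payload)
    buf.write(struct.pack(">I", 0))  # end-of-stream marker
    return buf.getvalue()

def read_first_batch(data):
    # Buggy pattern: decode exactly one batch and stop, silently
    # dropping everything after it.
    stream = io.BytesIO(data)
    (length,) = struct.unpack(">I", stream.read(4))
    return [stream.read(length)] if length else []

def read_all_batches(data):
    # Fixed pattern: keep decoding until the end-of-stream marker.
    stream = io.BytesIO(data)
    batches = []
    while True:
        (length,) = struct.unpack(">I", stream.read(4))
        if length == 0:
            break
        batches.append(stream.read(length))
    return batches

data = write_stream([b"batch-1", b"batch-2", b"batch-3"])
print(len(read_first_batch(data)))  # 1 -- silent data loss
print(len(read_all_batches(data)))  # 3 -- all batches recovered
```

The point of the sketch is that the single-batch reader returns without error, so the data loss is invisible to the caller; only a reader that loops until the stream's terminator sees every batch.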

Does this PR introduce any user-facing change?

Yes. This fixes a data correctness issue where users would lose data when processing Arrow streams with multiple batches. The behavior change is that all batches in a stream are now correctly processed instead of only the first one.

How was this patch tested?

Added comprehensive test cases.

Was this patch authored or co-authored using generative AI tooling?

Tests Generated-by: Claude Code

🤖 Generated with Claude Code

@github-actions github-actions bot added the SQL label Aug 21, 2025
@HyukjinKwon

I would like @hvanhovell to take a look though if he finds some time.

s"Expected ${inputRows.length} rows processed, got ${iterator.totalRowsProcessed}")
}

test("multiple record batches in single IPC stream") {

Thank you for adding this test coverage.


@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM. Thank you, @grundprinzip .


dongjoon-hyun commented Aug 26, 2025

Merged to master.

Could you make backporting PRs for all live release branches, please, @grundprinzip ?

grundprinzip added a commit to grundprinzip/spark that referenced this pull request Aug 26, 2025
…hes in single IPC stream

Closes apache#52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
grundprinzip added a commit to grundprinzip/spark that referenced this pull request Aug 26, 2025
…hes in single IPC stream

Closes apache#52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
grundprinzip added a commit to grundprinzip/spark that referenced this pull request Aug 26, 2025
…hes in single IPC stream

Closes apache#52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
@dongjoon-hyun

Thank you, @grundprinzip .

@dongjoon-hyun

dongjoon-hyun commented Aug 26, 2025

Oh, we cannot touch branch-3.4 because it's officially in end-of-life status. It's not a live release branch.

@grundprinzip

Thanks!

dongjoon-hyun pushed a commit that referenced this pull request Aug 26, 2025
… batches in single IPC stream

### What changes were proposed in this pull request?

Cherry-Pick: #52090

Closes #52090 from grundprinzip/SPARK-53342.

Authored-by: Martin Grund <martin.grunddatabricks.com>

Closes #52130 from grundprinzip/SPARK-53342-4.0.

Authored-by: Martin Grund <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>