[C] Fix negative block size validation in datafile reader #26
martin-augment wants to merge 1 commit into main from
Conversation
The file_read_block_count() function in datafile.c reads the block size
using zigzag varint encoding, which can yield negative numbers from
malicious Avro container files. These negative values were passed
directly to avro_malloc(), where the implicit cast to size_t turns
them into enormous allocation requests.
This patch adds validation that rejects negative block size values with
a clear error message before any memory allocation is attempted.
Bug: Negative block size from varint decoding causes
allocation-size-too-big when cast to size_t
Impact: DoS via crafted .avro file
Co-Authored-By: Claude <noreply@anthropic.com>
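The shape of the fix can be sketched in isolation. This is an illustrative standalone helper, not the actual avro-c code: the real change lives inside file_read_block_count(), and the function name and error handling here are assumptions for demonstration only.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch of the validation: a negative zigzag-decoded
 * length, implicitly cast to size_t, becomes a gigantic allocation
 * request, so it must be rejected before malloc is ever called. */
static void *alloc_block(int64_t block_size)
{
    if (block_size < 0) {
        fprintf(stderr, "Invalid (negative) block size: %lld\n",
                (long long)block_size);
        return NULL;
    }
    return malloc((size_t)block_size);
}
```

A caller can then treat a NULL return as a corrupt-file error rather than crashing on a failed oversized allocation.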
🤖 Augment PR Summary
Summary: Prevents the C datafile reader from accepting negative Avro block sizes, avoiding invalid allocations/reads when files are corrupt.
Changes:
PR Review: Fix negative block size validation in datafile reader

Summary
This PR adds a critical security fix to validate that block sizes read from Avro datafiles are non-negative before attempting memory allocation. The fix prevents potential undefined behavior and security issues from maliciously crafted files.

Code Quality ✅
Positive aspects:

Style consistency:

Security Analysis 🔒
Critical security fix:

Attack vector:

Potential Issues & Recommendations
1. Missing NULL check after allocation
value:useful; category:bug; feedback:The Claude AI reviewer is correct! This is an old but valid issue that has to be fixed! The result of malloc()/realloc() should be checked for NULL before being used. Prevents null pointer usage |
value:useful; category:bug; feedback:The Claude AI reviewer is correct! An attacker could craft an .avro file with a really big long value, like i64::MAX, and cause an out-of-memory error on the machine that reads this file. Prevents an attack with an otherwise perfectly valid Avro file. |
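The huge-value concern follows directly from how zigzag decoding works: odd encoded values decode to negative longs, so a crafted varint can legally carry a negative or extreme "block size". The formula below is the standard Avro zigzag mapping, though this standalone function name is ours, not avro-c's:

```c
#include <stdint.h>

/* Zigzag decoding as specified by Avro: small magnitudes get small
 * encodings, and the sign lives in the low bit. Odd inputs decode to
 * negatives; an all-ones varint decodes to INT64_MIN. */
static int64_t zigzag_decode(uint64_t n)
{
    return (int64_t)(n >> 1) ^ -(int64_t)(n & 1);
}
```

For example, an encoded value of 1 decodes to -1, and UINT64_MAX decodes to INT64_MIN, which is why the reader must validate the sign before using the result as a length.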