
Conversation

@blueberrymuffin3
Contributor

The code that sets up the input buffers currently causes a crash when the input type is an enum, because it calculates the size of the input using core::mem::size_of() instead of bincode, and the two can report different sizes. I wrote a new test to demonstrate the issue; without the changes in this PR, it panics:

[screenshot: panic output from the new test]

By updating the initial size to use bincode::serialized_size(&T::Input::default()), the problem is fixed. This does not handle cases where the enum variants carry data and are encoded to different sizes, which would require a much larger change.
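
For illustration, here is a minimal sketch of the size mismatch, assuming bincode 1.x with its default options (fixed-width integer encoding) and a hypothetical fieldless input enum; it is not taken from this repository's test suite:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical fieldless input enum, similar in shape to the inputs that
// triggered the crash (not the actual type from this repository).
#[derive(Serialize, Deserialize, Default, Clone, Copy, PartialEq)]
enum Input {
    #[default]
    None,
    Up,
    Down,
}

fn main() {
    // In memory, a fieldless enum with three variants fits in a single byte.
    println!("size_of: {}", core::mem::size_of::<Input>()); // prints 1

    // bincode 1.x (default options) encodes the serde variant tag as a u32,
    // so the serialized form is larger. Sizing the input buffers with
    // size_of() therefore under-allocates, which is what caused the panic.
    let encoded = bincode::serialized_size(&Input::default()).unwrap();
    println!("serialized_size: {encoded}"); // prints 4 with bincode 1.x defaults
}
```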

@caspark
Contributor

caspark commented Jan 3, 2025

Ah, this was my fault - didn't consider this when I swapped out bytemuck for bincode serialization of inputs! Thanks for the fix.

> This does not handle cases where the enum variants carry data and are encoded to different sizes, which would require a much larger change.

FYI I do have support for this in my fork as a pretty self-contained commit on main; hopefully I can PR it sometime, if I can scrape together some free time.
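
To illustrate the quoted limitation (a sketch only, not the change in caspark's fork): with a hypothetical data-carrying enum, each variant serializes to a different length under bincode 1.x defaults, so a single fixed buffer size derived from the default value is no longer enough.

```rust
use serde::{Deserialize, Serialize};

// Hypothetical input with a payload-carrying variant (illustration only).
#[derive(Serialize, Deserialize)]
enum Input {
    Idle,                   // just the variant tag
    Aim { x: f32, y: f32 }, // tag plus 8 bytes of payload
}

fn main() {
    // With bincode 1.x default options, both variants start with a 4-byte
    // u32 tag, but only `Aim` appends its fields, so the lengths differ.
    let idle = bincode::serialized_size(&Input::Idle).unwrap();
    let aim = bincode::serialized_size(&Input::Aim { x: 0.0, y: 0.0 }).unwrap();
    println!("Idle: {idle} bytes, Aim: {aim} bytes"); // e.g. 4 vs 12
}
```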

@gschup gschup merged commit 989d2eb into gschup:main Jan 9, 2025
2 checks passed
@gschup
Owner

gschup commented Jan 9, 2025

Thanks for spotting & fixing :)
