Dev/wasmtime #41
base: main
Conversation
This currently cannot retrieve outputs after function execution. I will have to rework part of the memory model.
The wasm backend finally works on matmul now. Fixed a few bugs I had introduced in input writing and context base offsets.
I tested on the wasm backend, not on CHERI.
The wasm memory is still of fixed size, but we can now provide our own buffer for it, which can also live on the heap. This is optional and can be toggled easily.
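A minimal sketch of what such an optional heap-backed buffer could look like, assuming a simple enum toggle; the names and the memory size are illustrative, not the PR's actual API:

```rust
// Hedged sketch: a fixed-size Wasm linear memory whose backing buffer
// can either be an inline array or a heap allocation. All names and
// the 64 KiB size are illustrative, not the repo's real types.
const MEM_SIZE: usize = 64 * 1024;

enum MemoryBacking {
    Inline([u8; MEM_SIZE]), // "heapless" variant
    Heap(Vec<u8>),          // caller-provided / heap-allocated variant
}

impl MemoryBacking {
    /// Toggle between the two backings at construction time.
    fn new(use_heap: bool) -> Self {
        if use_heap {
            MemoryBacking::Heap(vec![0u8; MEM_SIZE])
        } else {
            MemoryBacking::Inline([0u8; MEM_SIZE])
        }
    }

    /// Either way, the backend sees the same fixed-size byte slice.
    fn as_slice(&self) -> &[u8] {
        match self {
            MemoryBacking::Inline(a) => &a[..],
            MemoryBacking::Heap(v) => v.as_slice(),
        }
    }
}
```

The point of the toggle is that the rest of the backend only ever sees a `&[u8]` of fixed length, regardless of where the buffer lives.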
...when using heap for wasm memory.
Removes heapless wasm memory initialization as announced in #13 (comment) and fixes a bug in memory read.
…into dev/wasmtime
I don't know why the solution with the constant did not work, but compiling server crate with feature "wasmtime-precompiled" did not make it "true".
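One plausible explanation, sketched under the assumption that the constant was built with `cfg!` (the feature name is from this PR; the constant name is hypothetical): `cfg!` is expanded in the crate where it is written, so enabling a feature on the server crate does not flip a constant defined in a dependency unless the feature is forwarded to that dependency.

```rust
// Hedged sketch: `cfg!` evaluates against the features of the crate
// currently being compiled. If this constant lives in a library crate,
// building the *server* crate with `--features wasmtime-precompiled`
// only makes it true when the feature is also enabled on the library.
// The constant name is illustrative.
pub const WASMTIME_PRECOMPILED: bool = cfg!(feature = "wasmtime-precompiled");
```

In a build where the feature is not enabled on the crate containing this line, the constant stays `false` no matter which features downstream crates enable.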
We should also add the new features to the workflow so they are tested on updates (the merge conflicts need to be resolved for the workflow to run, as I understand it).
Contexts are allowed to allocate more memory than they were asked for. This is specifically a requirement for the Wasm backends, where the Wasm standard dictates that linear memory be a multiple of the Wasm page size (64 KiB), so we need to round the requested size up to the next multiple. This mainly adapts the OOB tests to take the actual context size into account.
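The rounding described above can be sketched as follows; a minimal illustration, with the function and constant names hypothetical rather than the repo's actual API:

```rust
// Wasm linear memory must be a whole number of 64 KiB pages, so a
// context may end up larger than the requested size. Names are
// illustrative, not the repo's actual API.
const WASM_PAGE_SIZE: usize = 64 * 1024;

/// Round `requested` up to the next multiple of the Wasm page size.
fn round_up_to_page(requested: usize) -> usize {
    (requested + WASM_PAGE_SIZE - 1) / WASM_PAGE_SIZE * WASM_PAGE_SIZE
}
```

An OOB test would then bound-check against `round_up_to_page(requested)` rather than `requested` itself, since that is the size the context actually got.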
This PR adds a backend that uses Wasmtime for function isolation. The backend has two flavors:
- dandelion-wasmtime-jit
- dandelion-wasmtime-precomp