Running CAAL locally on Mobile or WatchOS yet?👀 #67
Replies: 2 comments 4 replies
@MaxSikorski, welcome! Thanks for bringing the discussion over to GitHub. I haven't tried training very small models yet, but it's on my radar and in my dev path. I think a CAAL Lite for on-device use is on the roadmap. It might be more basic and less conversational, but it could cover simple needs. Then, if needed, it could route back to your server running CAAL with a bigger model.
Glad to be here; I like your project and need to spin up my own instance ASAP.
After seeing the most recent video on the YouTube channel, "Your LLM Doesn't Need Your Keys," I left a comment about running small, fully local models on edge devices like phones, watches, and even glasses. Has anyone played around with local fine-tuning of very small models — sub-1B-parameter models that could easily run on any iPhone or Pixel device, or even an Apple Watch?
I believe this is the future, and I'm very interested to hear whether anybody else has done this. I haven't tried it myself yet, but I plan to.