[Docs] Simple example of using MLX distributed #2973
Conversation
@awni where would be the ideal place for these docs?

I think we should make a new subsection under "Neural Networks", call it, say, "Distributed", and add all the helper routines and maybe also the distributed layers there.
## Doc Development Setup

To enable live refresh of the docs while writing:

Install sphinx-autobuild:

```
pip install sphinx-autobuild
```

Run the auto build on the docs/src folder:

```
sphinx-autobuild ./src ./build/html
```
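Once running, sphinx-autobuild serves the rendered pages (by default at http://127.0.0.1:8000) and rebuilds them automatically whenever a source file changes.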
For a data parallel training example, should I pull the existing example from the usage section?
|
We might not need an additional example given we already have something in the usage section. Wdyt? |
|
I think the existing example is good. Maybe we could put it in a dedicated doc under the examples section and then reference both the DP and TP examples from the usage section.
|
I have extracted the data parallel training example into a dedicated doc under the examples section.
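For context, the data-parallel pattern under discussion looks roughly like the sketch below. This is a minimal illustration rather than the exact example from the docs: it relies on MLX's `mx.distributed.init()` and `nn.average_gradients` helpers, and the model, data, and `train_step` name are placeholders.

```
# Minimal data-parallel training sketch (illustrative, not the docs example).
# Run on several ranks with MLX's launcher, e.g.: mlx.launch -n 2 train.py
import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

world = mx.distributed.init()  # singleton group of size 1 without a launcher

model = nn.Linear(16, 1)  # placeholder model
optimizer = optim.SGD(learning_rate=1e-2)

def loss_fn(model, x, y):
    return nn.losses.mse_loss(model(x), y)

loss_and_grad = nn.value_and_grad(model, loss_fn)

def train_step(x, y):
    loss, grads = loss_and_grad(model, x, y)
    # Average gradients across ranks so every replica applies the same update.
    grads = nn.average_gradients(grads)
    optimizer.update(model, grads)
    return loss

# Each rank would train on its own shard of the data.
x = mx.random.normal((32, 16))
y = mx.random.normal((32, 1))
for _ in range(10):
    loss = train_step(x, y)
    mx.eval(model.parameters(), optimizer.state)
```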
awni left a comment:
This is super nice, thanks for the contribution!
Related issue: #2930