Thanks for contributing to lyenv!
This guide focuses on the GUI-first workflow experience, including:
- how data moves between nodes (ports + wiring)
- how to write node code using the Hybrid Node Runtime
- a complete example you can reproduce: KV Set → KV Get
- how to export and publish plugins to the Plugin Center
The GUI concepts map directly onto plugin structure:
- Workflow = Plugin
- Group = Command
- Nodes = Steps
- Edges = Data dependencies
Nodes exchange data through ports:
- Output ports produce values
- Input ports consume values
- A connection means: `Upstream.output_port → Downstream.input_port`

At runtime, values are stored in a flow "bus":

```
flow.outputs.<node_id>.<port_name>
```
The GUI exporter generates a wiring map (flow_wiring.json) so each node can resolve its inputs from upstream outputs.
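To make the bus and wiring map concrete, here is a minimal Python sketch of how a node's inputs could be resolved from upstream outputs. The dict shapes (`flow_outputs`, `wiring`) and node/port names are illustrative assumptions, not the real `flow_wiring.json` format:

```python
# Hypothetical sketch: resolving a node's inputs from the flow "bus".
# The data shapes below are assumptions for illustration only.

# Bus of upstream results: flow.outputs.<node_id>.<port_name>
flow_outputs = {
    "start": {"key": "foo", "val": "bar"},
}

# Wiring map: input port -> (upstream node id, upstream port name)
wiring = {
    "write_kv": {
        "key": ("start", "key"),
        "val": ("start", "val"),
    },
}

def resolve_inputs(node_id: str) -> dict:
    """Look up each input port's value on the bus."""
    return {
        port: flow_outputs[src_node][src_port]
        for port, (src_node, src_port) in wiring[node_id].items()
    }

print(resolve_inputs("write_kv"))  # {'key': 'foo', 'val': 'bar'}
```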
Create an env:

```shell
lyenv create ./demo
lyenv init ./demo
cd ./demo
```

Activate:

- Linux/macOS:

  ```shell
  eval "$(lyenv activate)"
  ```

- Windows PowerShell:

  ```powershell
  lyenv activate | Invoke-Expression
  ```

Start the GUI and register the env:

```shell
lyenv gui start --open
lyenv gui add . --name=demo
```

lyenv GUI uses a Hybrid Node Runtime, so node authors can choose between two styles; the node type itself does not split.
- Inputs come from `argv` (port order)
- Outputs go to `stdout`
- For multi-output, print a JSON array: `["o1","o2"]`
Example (`simple_node.py`):

```python
import sys, json

a = sys.argv[1] if len(sys.argv) > 1 else ""
b = sys.argv[2] if len(sys.argv) > 2 else ""
print(json.dumps([a.upper(), b.lower()], ensure_ascii=False))
```

- You can call `read_request()` in the node script
- You can use `mutate` / `config_plugin` / `emit_artifact` / `log`
- You can return a full stdio JSON response via `respond_ok` / `respond_error`
- To map outputs reliably to ports, return:

  ```python
  respond_ok("", extra={"outputs": ["out1", "out2"]})
  ```

Why it works: the runner forwards the request JSON into the child's stdin and merges the child's stdio responses automatically.
Goal:

- User inputs `key val`
- Write plugin config: `kv.<key> = <val>`
- Read it back and print the value

Expected output: `bar`
Create nodes:

- Start
- WriteKV (Code node, Python)
- ReadKV (Code node, Python)
- End

Ports:

- Start: outputs `key`, `val`
- WriteKV: inputs `key`, `val`; outputs `key`
- ReadKV: inputs `key`; outputs `val`
- End: inputs `val`

Edges:

```
Start.key   → WriteKV.key
Start.val   → WriteKV.val
WriteKV.key → ReadKV.key
ReadKV.val  → End.val
```
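For this flow, the exported wiring map could look roughly like the following. The actual `flow_wiring.json` schema is not documented here, so this shape (input port mapped to `UpstreamNode.port`) is only an illustrative assumption:

```json
{
  "WriteKV": { "key": "Start.key", "val": "Start.val" },
  "ReadKV":  { "key": "WriteKV.key" },
  "End":     { "val": "ReadKV.val" }
}
```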
This node writes `kv.<key> = <val>` into plugin config via mutations, and outputs `key`.

```python
import sys
from lyenv_sdk import read_request, mutate, respond_ok, respond_error, log

def main():
    # Load request (available because the hybrid runtime forwards stdin)
    read_request()

    key = sys.argv[1] if len(sys.argv) > 1 else ""
    val = sys.argv[2] if len(sys.argv) > 2 else ""
    key = key.strip()
    if not key:
        respond_error("empty key")
        return

    mutate(f"kv.{key}", val, scope="plugin")
    log(f"write kv.{key}={val}")

    # Output for downstream port mapping:
    respond_ok("", extra={"outputs": [key]})

if __name__ == "__main__":
    main()
```

This node reads `kv.<key>` from plugin config and outputs `val`.
```python
import sys
from lyenv_sdk import read_request, config_plugin, respond_ok, respond_error, log

def main():
    read_request()

    key = sys.argv[1] if len(sys.argv) > 1 else ""
    key = key.strip()
    if not key:
        respond_error("empty key")
        return

    val = config_plugin(f"kv.{key}", "")
    log(f"read kv.{key}={val}")
    respond_ok("", extra={"outputs": [str(val)]})

if __name__ == "__main__":
    main()
```

Click Run, choose the group command, and enter the args: `foo bar`

Expected final output: `bar`
If a node has outputs `a`, `b`, `c`, your node program can simply print:

```python
import json
print(json.dumps(["A", "B", "C"], ensure_ascii=False))
```

The runner maps outputs to ports in order.

Tip: prefer JSON array outputs to avoid space-splitting issues.
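The tip above is easy to demonstrate: a value containing a space survives a JSON round trip but breaks under whitespace splitting. This comparison is purely illustrative and not part of the lyenv SDK:

```python
import json

outputs = ["hello world", "B"]

# Naive space-separated output: the first value splits into two tokens.
space_form = " ".join(outputs)
print(space_form.split())     # ['hello', 'world', 'B']  -- 3 values, wrong

# JSON array output: round-trips exactly.
json_form = json.dumps(outputs)
print(json.loads(json_form))  # ['hello world', 'B']     -- 2 values, correct
```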
Export the workflow as a plugin from the GUI.

Install locally:

```shell
lyenv plugin add /path/to/exported-plugin --name=myflow
```

Run:

```shell
lyenv run myflow run -- foo bar
```

✅ Commit source only:

```
plugins/<NAME>/
  manifest.yaml
  scripts/
  config.yaml   (optional)
```

❌ Do NOT commit zip artifacts.
PR flow:

- Fork the plugin-center repo
- Add/modify `plugins/<NAME>/...`
- Bump the version in `manifest.yaml`
- Open a PR to `main`

After merge, CI will:

- build `<NAME>-<VERSION>.zip`
- upload it to the GitHub Release assets (tag = `artifacts`)
- update `index.yaml` and open an automatic PR

Merge the index PR to publish.
- Most common cause: missing edges
- Ensure `Upstream.output → Downstream.input` connections exist
- Ensure input port names match what you connect
- Ensure each input port has only one incoming edge
- With the hybrid runtime you can safely call `read_request()`
- Use `mutate` / `config_plugin` / `log` / `respond_ok(extra={"outputs": [...]})`
As the plugin-center repository grows, cloning the entire repository can become slow and unnecessary, especially when you only want to add or update one plugin.
To improve contributor experience, we strongly recommend using Git partial clone and sparse checkout.
This allows you to fetch only the files you need.
```shell
git clone --filter=blob:none --no-checkout https://github.com/<ORG>/<PLUGIN-CENTER-REPO>.git
cd <PLUGIN-CENTER-REPO>
git sparse-checkout init --cone
```

For example, to work on a single plugin:

```shell
git sparse-checkout set plugins/my-plugin
git checkout main
```

Or, if you also need the index file:

```shell
git sparse-checkout set plugins/my-plugin index.yaml
git checkout main
```

✅ Result:
- Only `plugins/my-plugin/` (and optionally `index.yaml`) is downloaded
- No need to fetch the full repository history or all plugin files
```shell
git sparse-checkout set plugins/my-plugin
git checkout main
# edit files
git add plugins/my-plugin
git commit -m "Update my-plugin: fix inputs/outputs"
git push origin my-branch
```

Open a Pull Request as usual.
When adding a new plugin, only commit source files:

```
plugins/my-plugin/
  manifest.yaml
  scripts/
  config.yaml   (optional)
```

❌ Do NOT commit:

- zip artifacts
- build outputs
- generated files

Artifacts are built automatically by CI and uploaded as GitHub Release assets.
If sparse checkout is not available, you may use a shallow clone instead:

```shell
git clone --depth=1 https://github.com/<ORG>/<PLUGIN-CENTER-REPO>.git
```

Using sparse checkout:
- reduces clone time dramatically
- lowers disk usage
- makes contributing feasible even on slow networks
- scales better as the plugin ecosystem grows
We highly encourage all contributors to adopt this workflow.
Thanks again for contributing to lyenv 🚀