A step-by-step guide to creating an offline autonomous Minecraft AI bot, powered by a local LLM (LM Studio Meta‑LLaMA‑3.1‑Instruct), playing alongside you on a Fabric 1.21 server managed with TLauncher.
- Java 17+ installed (required for Fabric and Minecraft ≥ 1.18)
- Node.js installed
- TLauncher with Fabric 1.21 profile
- LM Studio v0.3.6+ with local API enabled
- Official Fabric Server installer (universal JAR)
Install Java 17+ and verify with:

```shell
java -version
# Expect something like "openjdk version \"17.x.x\""
```

Minecraft ≥ 1.18 and Fabric ≥ 1.17 require Java 17+ ([wiki.fabricmc.net][1], [github.com][2]).
- Download the Fabric Installer JAR from the Fabric website.
- In an empty folder, run:

  ```shell
  java -jar fabric-installer.jar
  ```

- Select Server.
- Choose Minecraft 1.21.x.
- Click Install, generating:
  - `fabric-server-launch.jar`
  - `start.sh` / `start.bat`
  - Dependencies, including `server.jar` ([wiki.fabricmc.net][3], [wiki.fabricmc.net][1])
- In the server folder, update:
  - `eula.txt` → `eula=true`
  - `server.properties` → `online-mode=false`, `enable-command-block=true`

These settings allow TLauncher (cracked) players to connect ([wiki.fabricmc.net][1], [wiki.fabricmc.net][3]).
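For reference, the changed entries should look like this (a minimal sketch; every other `server.properties` default stays untouched):

```properties
# eula.txt
eula=true

# server.properties (only the changed keys shown)
online-mode=false
enable-command-block=true
```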
- On Windows, run:

  ```shell
  start.bat
  ```

- On Linux/macOS:

  ```shell
  chmod +x start.sh
  ./start.sh
  ```

The console should show the server is online on port 25565.
- Open TLauncher with the Fabric 1.21 profile.
- Navigate to Multiplayer → Direct Connect.
- Enter `localhost`.

You should now join your offline Fabric server world.
- Install LM Studio v0.3.6+.
- In the Developer tab, enable the OpenAI‑compatible HTTP API at `http://localhost:1234/v1/…`
- Test with:

  ```shell
  curl http://localhost:1234/v1/models
  ```

  You should see `"meta-llama-3.1-8b-instruct:2"` listed ([youtube.com][4], [wiki.fabricmc.net][3], [wiki.fabricmc.net][1], [lmstudio.ai][5]).
- Test tool-calling:

  ```shell
  curl -i http://localhost:1234/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
      "model": "meta-llama-3.1-8b-instruct:2",
      "messages": [
        {"role": "system", "content": "You can call functions."},
        {"role": "user", "content": "Call echo with text \"hello world\""}
      ],
      "functions": [{
        "name": "echo",
        "description": "Returns the same text",
        "parameters": {"type": "object", "properties": {"text": {"type": "string"}}, "required": ["text"]}
      }]
    }'
  ```

  ✅ Confirm the response contains `tool_calls` JSON.
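On the Node.js side, the interesting part of such a response lives at `choices[0].message.tool_calls`. A minimal sketch of a helper that extracts the call (the response shape shown is the standard OpenAI-style schema, not anything specific to this repo's bot.js):

```javascript
// Extract the first tool call from an OpenAI-style chat completion.
// Returns { name, args } or null if the model replied with plain text.
function extractToolCall(completion) {
  const message = completion?.choices?.[0]?.message;
  const call = message?.tool_calls?.[0];
  if (!call) return null;
  return {
    name: call.function.name,
    // Arguments arrive as a JSON *string* and must be parsed.
    args: JSON.parse(call.function.arguments),
  };
}

// Example response shaped like the curl test above:
const sample = {
  choices: [{
    message: {
      role: "assistant",
      tool_calls: [{
        id: "call_1",
        type: "function",
        function: { name: "echo", arguments: "{\"text\":\"hello world\"}" },
      }],
    },
  }],
};

console.log(extractToolCall(sample));
// → { name: 'echo', args: { text: 'hello world' } }
```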
```shell
mkdir mc-ai-bot
cd mc-ai-bot
npm init -y
npm install mineflayer mineflayer-pathfinder mineflayer-auto-eat mineflayer-collectblock node-fetch minecraft-data
```

Save the complete bot.js script (provided in the code block below).
The script:

- Connects to `localhost:25565` and identifies as `LLM_Bot`
- Defines tools `mineBlock` and `followPlayer` for the LLM to call
- Handles chat:
  - Parses `tool_calls` JSON and runs in-game actions
  - Cleans markdown, supports the alternative `action` JSON format
- Sends feedback to the LLM to maintain context

(See code in bot.js)
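The tool definitions and the dispatch described above can be sketched as follows. The handler bodies here are placeholder stubs (the real bot.js drives mineflayer-pathfinder and collectblock, and its exact schemas may differ):

```javascript
// Sketch: the function schemas advertised to the model, plus the
// name → handler dispatch used when a tool call comes back.
const functions = [
  {
    name: "mineBlock",
    description: "Mine the nearest block of the given type",
    parameters: {
      type: "object",
      properties: { blockType: { type: "string", description: "e.g. oak_log" } },
      required: ["blockType"],
    },
  },
  {
    name: "followPlayer",
    description: "Follow the named player around",
    parameters: {
      type: "object",
      properties: { playerName: { type: "string" } },
      required: ["playerName"],
    },
  },
];

// Stub handlers: in the real bot these trigger in-game actions.
const handlers = {
  mineBlock: ({ blockType }) => `mining ${blockType}`,
  followPlayer: ({ playerName }) => `following ${playerName}`,
};

function runToolCall(name, args) {
  const handler = handlers[name];
  if (!handler) return `Unknown tool: ${name}`;
  return handler(args);
}

console.log(runToolCall("mineBlock", { blockType: "oak_log" }));
// → mining oak_log
```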
Ensure both the server and TLauncher are running, then in your bot directory:

```shell
node bot.js
```

You should see in-game chat:

```
LLM_Bot: Hello! I'm your AI teammate. Chat with me!
```

In the Minecraft chat, type:

```
@LLM_Bot please mine oak_log
@LLM_Bot follow me
```

Your bot should mine logs and follow you accordingly!
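Before the LLM ever sees a message, the bot has to recognize that a chat line is addressed to it. That is a simple prefix check; a sketch assuming the `@LLM_Bot` prefix shown above (the parsing in your bot.js may differ):

```javascript
// Returns the command text if the message addresses the bot, else null.
function parseMention(username, message, botName = "LLM_Bot") {
  if (username === botName) return null;        // ignore the bot's own chat
  const prefix = `@${botName}`;
  if (!message.startsWith(prefix)) return null; // not addressed to the bot
  return message.slice(prefix.length).trim();
}

console.log(parseMention("Steve", "@LLM_Bot please mine oak_log"));
// → please mine oak_log
```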
Add more tools:

- `storeItems`, `eatFood`, `fightMob`, etc.
- Provide contextual feedback (inventory, health).
- Refine prompts or use function-calling for behavior chaining.
| Issue | Solution |
| --- | --- |
| Fabric server crashes | Ensure Java 17+ is used ([docs.mcserversoft.com][6], [wiki.fabricmc.net][1], [github.com][7], [youtube.com][8]) |
| Bot fails to parse JSON | Use `cleanJson()` to strip markdown |
| Function calls not triggered | Confirm LM Studio v0.3.6+ supports the OpenAI-style API |
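The `cleanJson()` fix mentioned in the table strips the markdown code fences that local models often wrap around JSON output. A minimal sketch:

```javascript
// Strip ```json ... ``` fences and stray backticks so JSON.parse succeeds.
function cleanJson(text) {
  return text
    .replace(/```(?:json)?/gi, "") // remove opening/closing fences
    .replace(/`/g, "")             // remove any leftover backticks
    .trim();
}

const raw = '```json\n{"action":"mineBlock","blockType":"oak_log"}\n```';
console.log(JSON.parse(cleanJson(raw)).action);
// → mineBlock
```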
Use the bot.js that is present in this repository.
You’ve set up a fully offline, autonomous AI companion using TLauncher + Fabric + LM Studio + Mineflayer, complete with structured tool-calling for in-game behaviors. Ready to play and extend!
Happy coding & gaming! ⚔️🚀