[TOC]
This project demonstrates the use of smolagents from Hugging Face to create AI agents. The samples were created on Windows and write their output files to the `C:\tmp` directory.
- Python >= 3.13
- Ollama
- Hugging Face Token
To start the Ollama server, execute the following command:
```
ollama serve
```

The server listens on port 11434 by default.
While the Ollama server is running, download the model using the following command:
```
ollama run qwen2.5
```

This only needs to be done once.
Note: This is a small model and may get confused easily. For a more robust implementation, use a model with more parameters.
Create a .env file with the following content:
```
HF_TOKEN=<HUGGING_FACE_TOKEN>
```

Replace `<HUGGING_FACE_TOKEN>` with your actual Hugging Face token. This is required because all the samples use the Inference API with the following model by default:

```
Qwen/QwQ-32B
```
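For illustration, a minimal sketch of how a sample might pick the token up from `.env`; the use of `python-dotenv` here is an assumption, and the repository's actual loading code may differ:

```python
# Hedged sketch: reading HF_TOKEN from a .env file.
# python-dotenv is an assumption; the real samples may load it differently.
import os

def load_hf_token():
    """Load .env (via python-dotenv) and return the HF_TOKEN value, if any."""
    from dotenv import load_dotenv  # lazy import: only needed when actually loading
    load_dotenv()  # reads key=value pairs from .env into os.environ
    return os.environ.get("HF_TOKEN")
```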
Switch to the Ollama model if needed by changing the `active_model` to `local_model` in the `model.ModelMgr` module.
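The switch can be pictured with a minimal, hypothetical sketch; the names `active_model` and `local_model` mirror this README, but the real `model.ModelMgr` module may look different:

```python
# Hypothetical sketch of the model switch described above.
# The Ollama id uses LiteLLM's "ollama_chat/<name>" notation (an assumption).
remote_model = "Qwen/QwQ-32B"        # default: Hugging Face Inference API model
local_model = "ollama_chat/qwen2.5"  # local Ollama model on port 11434

# Point active_model at local_model to switch the samples to Ollama.
active_model = remote_model

def get_active_model_id() -> str:
    """Return the model identifier the samples should use."""
    return active_model
```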
- Clone the repository:

  ```
  git clone https://github.com/rcw3bb/sample-smolagents.git
  cd sample-smolagents
  ```

- Install the dependencies:
If Poetry is not yet installed, use the following command to install it:
```
python -m pip install poetry
```
After installation, make `poetry` available to the CLI by updating the `PATH` environment variable to include the following if you are using Windows:

```
%LOCALAPPDATA%\Programs\Python\Python313\Scripts
```
If your system Python version is lower than Python 3.13, use the following command to install it:
```
poetry python install 3.13
```
```
poetry install
```
```
poetry run python -m sample.simple.file_management_sample
```

Observe whether it uses the provided tools.
```
poetry run python -m sample.simple.file_management_managed_sample
```

Observe whether it uses the provided tools.
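The file-management samples presumably expose tools along these lines; a minimal self-contained sketch (in the real project the functions would likely be wrapped with smolagents' `@tool` decorator, and their actual names and signatures may differ):

```python
# Hedged sketch: plausible shapes of the file-management tools the samples
# provide. Plain functions are shown so the sketch runs without smolagents.
from pathlib import Path

def write_file(path: str, content: str) -> str:
    """Write `content` to `path`, creating parent directories as needed."""
    file = Path(path)
    file.parent.mkdir(parents=True, exist_ok=True)
    file.write_text(content, encoding="utf-8")
    return f"Wrote {len(content)} characters to {path}"

def read_file(path: str) -> str:
    """Return the text content of `path`."""
    return Path(path).read_text(encoding="utf-8")
```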
```
poetry run python -m sample.mcp.stdio.file_management_sample
```

Observe whether it uses the provided tools.
```
poetry run python -m sample.mcp.stdio.file_management_managed_sample
```

Observe whether it uses the provided tools.
```
poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_stdio
```

Where `<ROOT_DIR>` is the directory that contains the `mcp_servers` directory.
Use the following prompt to test the server:
```
Write the text "Hello, World!" to "C:/tmp/mcp-stdio-hello.txt" and show me its content.
```
Expect to see that the `write_file` and `read_file` tools were utilized.
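For illustration, attaching an agent to a stdio MCP server can be sketched with the `ToolCollection.from_mcp` pattern from the smolagents documentation; the repository's own sample may be wired differently, and `<ROOT_DIR>` remains a placeholder:

```python
# Hedged sketch: connecting an agent to the stdio MCP server above.
# The exact sample code in this repository may differ; <ROOT_DIR> is kept as-is.
def server_command() -> list[str]:
    """Command line that launches the stdio MCP server."""
    return ["poetry", "-C", "<ROOT_DIR>", "run", "python",
            "-m", "mcp_servers.file_manager_server_stdio"]

def run_agent(prompt: str):
    # Lazy imports so the sketch can be read without smolagents/mcp installed.
    from mcp import StdioServerParameters
    from smolagents import CodeAgent, InferenceClientModel, ToolCollection

    command, *args = server_command()
    params = StdioServerParameters(command=command, args=args)
    with ToolCollection.from_mcp(params, trust_remote_code=True) as tools:
        agent = CodeAgent(tools=[*tools.tools], model=InferenceClientModel())
        return agent.run(prompt)
```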
The server must be running before running any sample from this section.
Start the server:

```
poetry run python -m mcp_servers.file_manager_server_sse
```

Then run the sample:

```
poetry run python -m sample.mcp.sse.file_management_sample
```

Observe whether it uses the provided tools.
```
poetry run python -m sample.mcp.sse.file_management_managed_sample
```

Observe whether it uses the provided tools.
- Run the server using the following command:

  ```
  poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_sse
  ```

  Where `<ROOT_DIR>` is the directory that contains the `mcp_servers` directory. This will run a server on port `8000`.

- Use the following address to attach to an agent:

  ```
  http://localhost:8000/sse
  ```
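Attaching an agent to that address can be sketched as follows, again following the `ToolCollection.from_mcp` pattern from the smolagents documentation; the `transport` key and the model choice are assumptions, and the repository's samples may differ:

```python
# Hedged sketch: connecting an agent to the SSE MCP server above.
SSE_URL = "http://localhost:8000/sse"

def server_config() -> dict:
    """Server parameters for an SSE MCP connection."""
    return {"url": SSE_URL, "transport": "sse"}

def run_agent(prompt: str):
    # Lazy import so the configuration can be inspected without smolagents installed.
    from smolagents import CodeAgent, InferenceClientModel, ToolCollection

    with ToolCollection.from_mcp(server_config(), trust_remote_code=True) as tools:
        agent = CodeAgent(tools=[*tools.tools], model=InferenceClientModel())
        return agent.run(prompt)
```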
Use the following prompt to test the server:
```
Write the text "Hello, World!" to "C:/tmp/mcp-sse-hello.txt" and show me its content.
```
Expect to see that the `write_file` and `read_file` tools were utilized.
Ronaldo Webb