AI Notes
Unlike Fizzy Hydra, which was developed with significant AI assistance through "vibe coding", Live 4 Bubbles made very minimal use of AI during its development. This page documents the actual role of AI in both projects and clarifies the extent of its use.
Live 4 Bubbles development has been primarily manual, with minimal AI assistance:
0% AI - Concept, vision, and architecture. The entire concept, vision, and architecture of Live 4 Bubbles emerged from human creativity and experience with:
- Live 4 Life (spatial audio system developed since 2011),
- SuperCollider expertise accumulated over years,
- Live coding practice, community knowledge, and watching Live Coding Algorave videos.
0% AI - SuperCollider code. All SuperCollider code was written manually, including:
- Multi-track control system architecture,
- Parameter management systems,
- GUI layout and interaction design,
- OSC/MIDI controller integration.
~30% AI - Haskell code, mostly manual. Minimal AI assistance was used for some Haskell code (help with a few functions, code indentation and organization, updates), representing around 30% of that codebase. All suggestions were reviewed, tested, and always modified, refined, or corrected.
Sometimes AI never managed to help me implement a function, despite the time spent debugging with it. Fortunately, the answer sometimes came from a human being: a question that had remained unresolved for weeks with AI was finally solved by a human, in this case thanks to Alex McLean on the Tidal Cycles forum, by chance.
~30% AI - Documentation, extensively revised. As can be seen from the abundant emojis throughout the documentation, which AI seems to love so much, I used AI assistance; I adjusted and reduced the emojis as much as I could. AI provided some help with README structure and wording (~30% of the content), but all text was subsequently reviewed, corrected, and rewritten to accurately reflect the project and my ideas.
For example, I generated the workshop GIF with AI by selecting and ordering a series of images with 2 interpolations between them, and adjusting the background and framing.
In contrast, Fizzy Hydra was developed with much more extensive use of "vibe coding" (AI-assisted development):
Human-conceived. I designed the overall vision and control architecture.
Significant AI assistance for code generation. As acknowledged in the Fizzy Hydra repository, Claude Code was instrumental in developing:
- Hydra visual code generation,
- Sequencer control mechanisms,
- Interactive visual sequence creation,
- Implementation of various algorithms.
Developed with Claude Code assistance. AI played a substantial role in accelerating development and implementing features.
When AI is used in tool development, a core principle is maintained: AI is only used upstream in the development process, never in real-time performance.
- Upstream: AI may help write, optimize, and improve the code that creates the tools,
- Performance: The live coding performance remains a purely human gesture, with all its imperfections and spontaneity.
This approach preserves the "authenticity" of the performative gesture.
It's important to remember that current LLMs are fundamentally word-guessing machines. As Emily Bender, Timnit Gebru, and their co-authors aptly described, they are "stochastic parrots": they generate text based on statistical patterns rather than true understanding. This limitation must be kept in mind when using AI as a development assistant.
I view current AI as a tool to help me achieve my creative and performance goals. It is neither inherently good nor bad: these are human concepts. Like any tool, it embodies both potentials simultaneously. Like nuclear energy, which can both generate power and destroy, everything depends on how humans choose to use it. And humans themselves are both good and bad at once.
I used ChatGPT, Claude on the web, as well as Ollama. I have not yet tried Gemini or the agentic version of OpenAI Codex. Results improved markedly once I switched to Claude Code, which can take the entire codebase into account and act on it more easily.
Claude Code has been the primary AI assistant used in developing Fizzy Hydra. It has proven particularly effective for:
- Code generation: Creating new features and functions,
- Debugging: Identifying and fixing bugs,
- Documentation: Helping structure and write documentation,
- Refactoring: Restructuring code for better maintainability,
- Optimization: Improving algorithm efficiency, but only when verified with real performance tests.
However, there were significant negative points with AI-assisted development:
Many optimization proposals were not actual optimizations and were in fact harmful. AI models base their suggestions on averages or on outdated or inaccurate data. Only when I verified the so-called optimizations with real performance tests in the browser did I realize I needed to revert changes. At times I lost control, and the AI generated a lot of code that became difficult to manage.
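A minimal sketch of the kind of check this implies (the two implementations are hypothetical stand-ins, not code from either project): before accepting an AI-proposed "optimization", confirm it is equivalent to the original and actually faster under real timings.

```javascript
// Sketch: verify an AI-proposed "optimization" with real measurements
// before merging it. Both functions below are hypothetical examples.

function original(xs) {
  return xs.map(x => x * 2).filter(x => x % 3 === 0);
}

// AI-suggested single-pass "optimized" version to be verified.
function proposed(xs) {
  const out = [];
  for (const x of xs) {
    const y = x * 2;
    if (y % 3 === 0) out.push(y);
  }
  return out;
}

// Mean milliseconds per run over several repetitions.
function bench(fn, input, runs = 200) {
  const t0 = performance.now();
  for (let i = 0; i < runs; i++) fn(input);
  return (performance.now() - t0) / runs;
}

const data = Array.from({ length: 100000 }, (_, i) => i);

// 1. Check the proposal is actually equivalent to the original...
const same = JSON.stringify(original(data)) === JSON.stringify(proposed(data));

// 2. ...then compare real timings; revert the change if it is not faster.
const tOrig = bench(original, data);
const tProp = bench(proposed, data);
console.log({ same, tOrig, tProp, keep: same && tProp < tOrig });
```

The same discipline applies in the browser: measure with the profiler or `performance.now()` rather than trusting the model's claim that a rewrite is faster.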
While AI is a powerful tool, the artist maintains full creative control:
- All AI-generated code should be reviewed and modified,
- Artistic decisions remain human-driven,
- The performance itself is never AI-generated.
Important principles to follow:
- Understand the code AI generates; don't just copy it or let it work unchecked. AI models are very verbose and redundant, which is why it's crucial to always verify in small steps.
- Learn from AI's approaches and techniques while always being precise and restricting it to small tasks. Carefully circumscribe its scope of action to limit hallucinations.
While Live 4 Bubbles was developed mostly without AI, future work may explore:
- Pattern Generation and Database Construction: Using AI to analyze and generate Tidal patterns, including:
- Analyzing pattern databases from other livecoders available online,
- Transcribing classical pieces (e.g., Erik Satie's Gnossiennes and Gymnopédies) into Tidal patterns,
- Using AI to reconstruct and analyze these patterns,
- Generating stylistically coherent new patterns.
- Sample Classification: AI-assisted organization of sound libraries.
- Open Source Models: In alignment with the project's open source philosophy (CC BY-NC-SA 4.0), exploring open source AI models through tools like Ollama to:
- Compare performance with proprietary models like Claude,
- Ensure the entire development pipeline can remain open source,
- Contribute findings back to the community,
- Maintain independence from proprietary services.
- Performance Tools: Creating AI-assisted (but not AI-driven) performance aids.
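As a hedged sketch of the pattern-generation idea (the corpus, token granularity, and sample names are hypothetical): a tiny first-order Markov chain over mini-notation tokens from a small curated corpus could propose new patterns, which a human would then review and refine before use.

```javascript
// Sketch: first-order Markov chain over Tidal mini-notation tokens.
// The corpus is a hypothetical, hand-curated set of patterns;
// generated output is only a proposal for human review, never played as-is.

const corpus = [
  "bd sn bd sn",
  "bd bd sn hh",
  "hh sn bd sn",
];

// Build a transition table: token -> list of observed next tokens.
function buildChain(patterns) {
  const chain = {};
  for (const p of patterns) {
    const toks = p.trim().split(/\s+/);
    for (let i = 0; i < toks.length - 1; i++) {
      (chain[toks[i]] ??= []).push(toks[i + 1]);
    }
  }
  return chain;
}

// Walk the chain from `start` to emit up to `len` tokens.
function generate(chain, start, len, rand = Math.random) {
  const out = [start];
  let cur = start;
  for (let i = 1; i < len; i++) {
    const nexts = chain[cur];
    if (!nexts || nexts.length === 0) break;
    cur = nexts[Math.floor(rand() * nexts.length)];
    out.push(cur);
  }
  return out.join(" ");
}

const chain = buildChain(corpus);
console.log(generate(chain, "bd", 8)); // prints a new mini-notation proposal
```

A real version would work on a larger curated database and richer tokens (durations, groupings, effects), but the principle stays the same: the machine proposes, the human disposes.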
The goal would be to experiment with AI upstream (for tool building and database creation) while keeping the performance itself purely human. Open questions include:
- How do open source models compare to proprietary ones for live coding development?
- What are the best practices for AI-assisted live coding tool development?
- How can AI help with pattern analysis and database organization without replacing human creativity?
- What ethical considerations arise in AI-assisted artistic tool creation?
- Ecological considerations: As mentioned by artist friend Julien Dajez, it would be interesting to measure the amount of energy consumed by AI use, which remains considerable. This highlights the necessity of using small, curated local databases.
Xon - ChristOn - Live 4 Bubbles
ResearchGate | YouTube | Vimeo | FesseBook | Buy me a coffee | Patreon