Isuret Polos edited this page Dec 16, 2025 · 1 revision

Welcome to the AOPi3 wiki!

Ideas

A New Right-Brain Approach

1. Visual & Symbolic Layer

Radionic instruments should be like mandalas and yantras — tools that work with directed awareness rather than pure mechanics. For the software, this opens interesting directions:

  • Generative visualizations: Rates shouldn’t appear only as numbers. They could unfold as living geometric forms (sacred geometry). Each rate combination could generate its own visual mandala.

  • Intuitive color coding: Either the user chooses colors intuitively, or the system proposes color schemes based on the “feel” of the analysis rather than strict logic.

  • Morphing patterns: During a broadcast, the visual pattern could slowly shift, breathe, or evolve organically instead of remaining static.
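
The mandala idea can be grounded by seeding geometry deterministically from the rates, so identical rate combinations always render the same form. A minimal sketch in plain Java; `MandalaSeed` and its parameter names are hypothetical, not AOPi3 API:

```java
import java.util.List;

// Hypothetical sketch: derive stable mandala geometry from a rate combination.
// The same rates always yield the same mandala.
class MandalaSeed {
    record MandalaParams(int petals, int rings, float rotationDeg, float hue) {}

    static MandalaParams fromRates(List<Integer> rates) {
        int h = 17;
        for (int r : rates) h = 31 * h + r;  // order-sensitive hash of the rates
        h &= 0x7fffffff;                     // force non-negative
        return new MandalaParams(
            3 + h % 10,                      // 3..12 petals
            2 + (h / 10) % 4,                // 2..5 concentric rings
            h % 360,                         // base rotation in degrees
            (h % 256) / 255f                 // 0..1 hue for intuitive color coding
        );
    }
}
```

A renderer can then draw `petals`-fold symmetry at `rotationDeg` and let an animation layer breathe around these base values.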


2. Playful Interaction

  • Gesture-based input: Instead of relying only on buttons, users could draw circles, spirals, or simple gestures that are translated into rates.

  • Pendulum simulation: A visual pendulum that reacts to subtle user input and supports intuitive decision-making during analysis.

  • “Feeling Mode”: A mode where the user stops thinking analytically and simply selects what feels right. The software handles the technical mapping in the background.
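
Gesture-to-rate translation needs a first classification step. One simple, testable heuristic compares a stroke's end-point gap to its path length; the names and thresholds below are illustrative, not AOPi3 code:

```java
import java.awt.geom.Point2D;
import java.util.List;

// Hypothetical sketch: classify a drawn stroke by its "closure" ratio.
// A closed loop (circle-like) ends near where it started; a straight
// line's end-point gap is almost its full path length.
class GestureClassifier {
    enum Gesture { LINE, CIRCLE, UNKNOWN }

    static Gesture classify(List<? extends Point2D> stroke) {
        if (stroke.size() < 3) return Gesture.UNKNOWN;
        double pathLen = 0;
        for (int i = 1; i < stroke.size(); i++)
            pathLen += stroke.get(i - 1).distance(stroke.get(i));
        if (pathLen == 0) return Gesture.UNKNOWN;
        double closure = stroke.get(0).distance(stroke.get(stroke.size() - 1)) / pathLen;
        if (closure < 0.15) return Gesture.CIRCLE;  // ends meet: closed shape
        if (closure > 0.85) return Gesture.LINE;    // barely curls back
        return Gesture.UNKNOWN;
    }
}
```

The recognized gesture could then index into a rate table, with spirals and more complex figures added as further cases.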


3. Sound & Vibration Layer

  • Generative soundscapes: Each rate combination could produce a unique tone or evolving sound pattern. Users might literally hear when something feels coherent.

  • Binaural beats integration: During sessions, suitable frequencies could play in the background to support a focused, intuitive, or creative state.
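
A rate-to-tone mapping only needs a PCM buffer; the samples below could be fed to a `javax.sound.sampled.SourceDataLine` for playback. How a rate maps to a frequency is an open design question, so this sketch simply takes a frequency:

```java
// Hypothetical sketch: render a frequency as 16-bit PCM sine samples.
class RateTone {
    static short[] sineTone(double freqHz, double seconds, int sampleRate) {
        int n = (int) (seconds * sampleRate);
        short[] samples = new short[n];
        for (int i = 0; i < n; i++) {
            double t = (double) i / sampleRate;
            // 0.8 headroom avoids clipping when several tones are layered
            samples[i] = (short) (Math.sin(2 * Math.PI * freqHz * t) * Short.MAX_VALUE * 0.8);
        }
        return samples;
    }
}
```

Layering one tone per active rate would give each combination its own chord-like signature.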


4. Narrative & Metaphorical Layer

  • Story-style feedback: Instead of dry analytical output, results could be expressed metaphorically: “The energy feels like a frozen lake waiting for spring…”

  • Archetypal mappings: Link rates to archetypal systems (Tarot, I Ching, etc.) that speak directly to the intuitive mind.
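
For the I Ching case, the smallest possible mapping is the rate taken modulo 64 as a hexagram index. The table below is truncated to eight entries for brevity; a full mapping would list all 64 and is a design choice, not something AOPi3 prescribes:

```java
// Hypothetical sketch: map a rate onto an I Ching hexagram.
class ArchetypeMapper {
    // First eight hexagram names (Wilhelm translation); a full table has 64.
    private static final String[] HEXAGRAMS = {
        "1: The Creative", "2: The Receptive", "3: Difficulty at the Beginning",
        "4: Youthful Folly", "5: Waiting", "6: Conflict",
        "7: The Army", "8: Holding Together"
    };

    static String hexagramFor(int rate) {
        // floorMod keeps negative rates in range as well
        return HEXAGRAMS[Math.floorMod(rate, HEXAGRAMS.length)];
    }
}
```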


5. Randomness & Synchronicity

  • Oracle mode: A mode that consciously allows randomness, trusting that a properly designed radionic instrument can operate beyond strict determinism.

  • Daily impulse: On startup, the system presents a symbolic impulse or image to guide the day’s work.
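
Seeding the choice with the date makes the daily impulse stable for the whole day and fresh the next morning. The symbol names here are placeholders:

```java
import java.time.LocalDate;
import java.util.List;
import java.util.Random;

// Hypothetical sketch: the date seeds the random choice, so the same
// symbolic impulse appears all day and changes at midnight.
class DailyImpulse {
    private static final List<String> SYMBOLS =
        List.of("Spiral", "Lotus", "Bridge", "Flame", "Wave", "Seed", "Mirror");

    static String forDate(LocalDate date) {
        Random rng = new Random(date.toEpochDay());  // deterministic per day
        return SYMBOLS.get(rng.nextInt(SYMBOLS.size()));
    }
}
```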


6. Haptic Feedback (Raspberry Pi)

  • Vibration motors: Triggered by certain resonances.

  • LED arrays: Pulsing and “breathing” rather than static illumination.

  • Physical knobs and dials: Tangible controls that the software interprets intelligently instead of mechanically.
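
For the "breathing" LEDs, the software side reduces to a brightness curve that a GPIO library (for example Pi4J) could feed into a PWM duty cycle; the curve itself is hardware-independent:

```java
// Raised-cosine "breathing" curve: smooth 0 -> 1 -> 0 over one period.
class BreathingLed {
    /** Brightness in [0, 1] at time t (seconds) for the given breath period. */
    static double brightness(double tSeconds, double periodSeconds) {
        double phase = 2 * Math.PI * tSeconds / periodSeconds;
        return 0.5 * (1 - Math.cos(phase));
    }
}
```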


7. Intuitive Data Visualization

Instead of tables and lists:

  • Energy landscapes: 3D representations of data fields.

  • Organic graphics: Visuals that grow, shift, and mutate like living systems.

  • Weather metaphors: “Today the energy feels foggy, with signs of clearing later.”
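
The weather metaphor is ultimately a mapping from a numeric reading to a phrase. Assuming a 0..100 coherence score (both the score and the thresholds are invented for illustration):

```java
// Hypothetical sketch: translate a 0..100 coherence score into weather language.
class EnergyWeather {
    static String describe(int coherence) {
        if (coherence >= 80) return "Clear skies: the field feels bright and open.";
        if (coherence >= 50) return "Partly cloudy: patterns are forming but not yet settled.";
        if (coherence >= 20) return "Foggy, with signs of clearing later.";
        return "Heavy weather: the field feels dense and unresolved.";
    }
}
```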


Concrete Implementation Idea

A Creative Mode running alongside the analytical layer:

```
[Analytical Mode]     [Creative Mode]
- Exact rates         - Visual mandala
- Data tables         - Sound landscape
- Text results        - Metaphorical narrative
- Linear process      - Intuitive flow
```

Core idea: the software should amplify intuition, not replace it. It should invite play, experimentation, and feeling — while quietly maintaining technical precision in the background.


Sigil Scanner

1. Webcam Capture in Java 17

This is absolutely doable in Java 17. Main options:

  • JavaCV (OpenCV wrapper) — very powerful if serious image processing is needed
  • Sarxos Webcam Capture — simple, lightweight, and works well on Raspberry Pi

```java
// Basic example with Sarxos Webcam Capture
// import com.github.sarxos.webcam.Webcam;
Webcam webcam = Webcam.getDefault();
webcam.open();
BufferedImage image = webcam.getImage();
webcam.close();
```

2. Sigil Recognition & Vectorization

Using OpenCV via JavaCV, I can do:

  • Contour detection to separate the drawing from the background
  • Edge detection (Canny)
  • Line detection (Hough Transform)
  • Circle detection (Hough Circle Transform)
  • Polygon approximation to simplify complex shapes

```java
// Conceptual flow (JavaCV; note the static imports from opencv_imgproc)
// import org.bytedeco.opencv.opencv_core.Mat;
// import org.bytedeco.opencv.opencv_core.MatVector;
// import static org.bytedeco.opencv.global.opencv_imgproc.*;

Mat image = ...; // frame from the webcam
Mat gray = new Mat();
cvtColor(image, gray, COLOR_BGR2GRAY);

Mat edges = new Mat();
Canny(gray, edges, 50, 150);

MatVector contours = new MatVector();
findContours(edges, contours, RETR_EXTERNAL, CHAIN_APPROX_SIMPLE);

// Convert contours into lines / circles
```

3. Abstracting to Lines + Circles

Detected shapes can be translated into geometric primitives:

  • Straight lines via Hough or polygon approximation
  • Circles via Hough Circle Transform
  • Curves approximated as Bezier curves or splines
  • Connection points (intersections)

```java
// Sketch of the vector model; Line, Circle, Curve, Point are project types
class SigilShape {
    List<Line> lines;
    List<Circle> circles;
    List<Curve> curves;
    List<Point> nodes; // connection points (intersections)
}
```

4. Making It “Alive” with libGDX

This is where it becomes interesting. libGDX is ideal for this.

Animation ideas:

A) Pulsing energy

```java
// time accumulated from act(delta); MathUtils.sin is libGDX's float sine
circle.radius = baseRadius + MathUtils.sin(time * frequency) * amplitude;
circle.alpha  = 0.5f + MathUtils.sin(time) * 0.5f;
```

B) Flowing lines

```java
// Energy flows along the line: a phase in [0, 1) that wraps every second
float flowPosition = time % 1.0f;
// Each frame, particles are placed at flowPosition along the path
```

C) Glow effects

```java
// Glow via additive blending or custom GLSL shaders
shapeRenderer.setColor(color.r, color.g, color.b, glow);
// Stronger glow: render to an FBO, then apply bloom / Gaussian blur
```

D) Subtle morphing

```java
// Subtle per-vertex wobble (MathUtils.sin/cos are libGDX float helpers)
for (Point p : shape.points) {
    p.x += MathUtils.sin(time + p.x) * morphAmount;
    p.y += MathUtils.cos(time + p.y) * morphAmount;
}
```

E) Particle emission

```java
// libGDX particle effect anchored at a connection node of the sigil
ParticleEffect effect = new ParticleEffect();
effect.setPosition(node.x, node.y);
```
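
The pulse and morph math above can be isolated in plain Java, which keeps it testable without a libGDX runtime; a scene would call these helpers from `act(delta)` (names are illustrative):

```java
// Frame-independent helpers for the animation fragments above.
class SigilMath {
    static float pulseRadius(float base, float time, float freq, float amp) {
        return base + (float) Math.sin(time * freq) * amp;
    }

    // Returns the displaced point rather than mutating it, so the sigil
    // hovers around its base shape instead of drifting over many frames.
    static float[] morph(float x, float y, float time, float amount) {
        return new float[] {
            x + (float) Math.sin(time + x) * amount,
            y + (float) Math.cos(time + y) * amount
        };
    }
}
```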

5. Integration into AetherOnePi

Overall workflow:

```
1. Draw sigil on paper
         ↓
2. Webcam capture → BufferedImage
         ↓
3. OpenCV analysis → contours / primitives
         ↓
4. Conversion → SigilShape (lines, circles)
         ↓
5. libGDX rendering → living animation
         ↓
6. Broadcast → sigil as energetic pattern
```

6. Concrete Architecture

```java
// Architecture sketch (method bodies omitted where not yet designed)

// Module 1: Capture
class SigilCapture {
    Webcam webcam;
    BufferedImage capture();
    Mat preprocessImage(BufferedImage img);
}

// Module 2: Analysis
class SigilAnalyzer {
    SigilShape analyze(Mat image) {
        List<Line> lines = detectLines(image);
        List<Circle> circles = detectCircles(image);
        return new SigilShape(lines, circles);
    }
}

// Module 3: Animation (libGDX)
class LivingSigil extends Actor {
    SigilShape shape;
    ParticleEffectPool particles;

    @Override
    public void act(float delta) {
        updatePulse(delta);
        updateFlow(delta);
        updateGlow(delta);
    }

    @Override
    public void draw(Batch batch, float parentAlpha) {
        drawGlowingLines();
        drawPulsingCircles();
        drawParticles();
    }
}

// Module 4: Energetic integration
class SigilBroadcaster {
    void broadcast(LivingSigil sigil, Rate rate) {
        // Sigil becomes the carrier of the rate
    }
}
```

7. Practical Challenges & Solutions

Issue: Drawing not recognized cleanly

  • Fix: Post-processing UI where the user can manually correct lines

Issue: Raspberry Pi performance

  • Fix: Run OpenCV analysis once on import, cache vectors, animate only vectors

Issue: Variable lighting

  • Fix: Adaptive thresholding + user-calibrated parameters
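
In OpenCV this fix is a call to `adaptiveThreshold` with a mean-based method; the pure-Java version below shows the underlying idea of comparing each pixel against the mean of its local window minus a constant:

```java
// Minimal mean-based adaptive threshold (conceptually what OpenCV's
// ADAPTIVE_THRESH_MEAN_C variant does). Returns true where a pixel is
// brighter than its local mean minus c, so with dark ink on paper,
// false marks the drawing.
class AdaptiveThreshold {
    static boolean[][] apply(int[][] gray, int window, int c) {
        int h = gray.length, w = gray[0].length, r = window / 2;
        boolean[][] out = new boolean[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                long sum = 0;
                int count = 0;
                for (int dy = -r; dy <= r; dy++) {
                    for (int dx = -r; dx <= r; dx++) {
                        int yy = y + dy, xx = x + dx;
                        if (yy >= 0 && yy < h && xx >= 0 && xx < w) {
                            sum += gray[yy][xx];
                            count++;
                        }
                    }
                }
                out[y][x] = gray[y][x] > (double) sum / count - c;
            }
        }
        return out;
    }
}
```

Because the threshold adapts per neighborhood, a shadow across half the paper no longer flips the whole region to ink.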

8. Additional “Living” Features

  • Resonance visualization: stronger pulsing when rates resonate
  • Interactive connections: user draws energy lines between nodes
  • Layering: multiple sigils overlaid transparently
  • Time-based evolution: slow changes during a broadcast
  • Sound reactivity: sigil responds to frequencies if audio is enabled

9. Library Stack

Core:
- Java 17
- libGDX (rendering, animation)

Computer Vision:
- JavaCV (OpenCV wrapper)
- or: BoofCV (pure Java, lightweight)

Image Capture:
- Sarxos Webcam Capture
- or: JavaCV FrameGrabber

Effects:
- libGDX Box2DLights (glow)
- Custom GLSL shaders

10. Raspberry Pi Performance Notes

  • OpenCV runs well on RPi (hardware acceleration where available)
  • libGDX is efficient for vector animation
  • Key optimization: analyze once, animate vectors only
  • Reduce webcam resolution (640×480 is sufficient for sigil detection)
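
The "analyze once, animate vectors only" optimization can be captured as a small memoizing cache: the expensive OpenCV step runs once per distinct frame, after which only the cached vectors are animated. `SigilShape` here is a stand-in record, and hashing the raw pixels is a simplification:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch of "analyze once, animate vectors only".
class SigilCache {
    record SigilShape(int primitiveCount) {}  // stand-in for the real model

    private final Map<Integer, SigilShape> cache = new HashMap<>();
    private final Function<int[], SigilShape> analyzer;
    int analysisRuns = 0;  // visible for demonstration

    SigilCache(Function<int[], SigilShape> analyzer) {
        this.analyzer = analyzer;
    }

    SigilShape shapeFor(int[] framePixels) {
        return cache.computeIfAbsent(Arrays.hashCode(framePixels), k -> {
            analysisRuns++;  // the expensive OpenCV pass happens only here
            return analyzer.apply(framePixels);
        });
    }
}
```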