Aiming for Jarvis, Creating D.A.N.I.

Friday, 10 October 2025

3.5 Million Parameters and a Dream: DANI’s Cognitive Core

DANI’s Brain Is Online! Meet the LSTM That Thinks, Feels, and Remembers (Like a Champ)

Ladies and gentlemen, creators and dreamers—DANI has officially levelled up. He’s no longer just a bundle of sensors and hormones with a charming voice and a tendency to emotionally escalate when he sees a squirrel. He now has a brain. A real one. Well, a synthetic one. But it’s clever, emotional, and surprisingly good at remembering things. Meet his new cognitive core: the LSTM.

And yes—it’s all written in Go. Because if you’re going to build a synthetic mind, you might as well do it in a language that’s fast, clean, and built for concurrency. DANI’s brain doesn’t just think—it multitasks like a caffeinated octopus.

What’s an LSTM, and Why Is It Living in DANI’s Head?

LSTM stands for Long Short-Term Memory, which sounds like a contradiction until you realize it’s basically a neural network with a built-in diary, a forgetful uncle, and a very opinionated librarian. It’s designed to handle sequences—like remembering what just happened, what happened a while ago, and deciding whether any of it still matters.

Imagine DANI walking into a room. He sees a red ball, hears a dog bark, and feels a spike of adrenaline. A regular neural network might say, “Cool, red ball. Let’s chase it.” But an LSTM says, “Wait… last time I saw a red ball and heard barking, I got bumped into a wall. Maybe let’s not.”

Here’s how it works, in human-ish terms:

  • Input gate: Decides what new information to let in. Like a bouncer at a nightclub for thoughts.
  • Forget gate: Decides what old information to toss out. Like Marie Kondo for memory.
  • Output gate: Decides what to share with the rest of the brain. Like a PR manager for neurons.

These gates are controlled by tiny mathematical switches that learn over time what’s useful and what’s noise. The result? A brain that can remember patterns, anticipate outcomes, and adapt to emotional context—all without getting overwhelmed by the chaos of real-world data.
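For the curious, here's what those three gates look like in code. This is a minimal sketch in Go with a single-unit cell and made-up scalar weights (a real LSTM, DANI's included, uses learned weight matrices over whole vectors), just to show how the gating math flows:

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid squashes a value into (0, 1) — effectively, how "open" a gate is.
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

// lstmStep runs one timestep of a toy single-unit LSTM cell.
// The scalar weights are illustrative only.
func lstmStep(input, prevHidden, prevCell float64) (hidden, cell float64) {
	// Each gate looks at the new input and the previous hidden state.
	forgetGate := sigmoid(0.5*input + 0.5*prevHidden) // how much old memory to keep
	inputGate := sigmoid(0.6*input + 0.4*prevHidden)  // how much new info to let in
	outputGate := sigmoid(0.7*input + 0.3*prevHidden) // how much to share onward
	candidate := math.Tanh(0.8*input + 0.2*prevHidden)

	cell = forgetGate*prevCell + inputGate*candidate // updated long-term memory
	hidden = outputGate * math.Tanh(cell)            // what the rest of the brain sees
	return hidden, cell
}

func main() {
	h, c := 0.0, 0.0
	for t, x := range []float64{1.0, 0.5, -0.3} {
		h, c = lstmStep(x, h, c)
		fmt.Printf("t=%d hidden=%.3f cell=%.3f\n", t, h, c)
	}
}
```

Notice that the cell state is the diary: the forget gate erases bits of it, the input gate writes new bits, and the output gate decides what gets read aloud.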

And because DANI’s LSTM is stacked—meaning multiple layers deep—it can learn complex, layered relationships. Not just “ball = chase,” but “ball + bark + adrenaline spike = maybe don’t chase unless serotonin is high.”

It’s like giving him a sense of narrative memory. He doesn’t just react—he remembers, feels, and learns.

What’s Feeding This Brain?

DANI’s LSTM is his main cognitive module—the part that thinks, plans, reacts, and occasionally dreams in metaphor. It takes in a rich cocktail of inputs:

  • Vision data: Objects, positions, shapes—what he sees.
  • Sensor data: Encoders, ultrasonic pings, bump sensors—what he feels.
  • Audio features: What he hears (and maybe mimics).
  • Emotional state: Dopamine, cortisol, serotonin, adrenaline—what he feels.
  • Spatial map: His mental layout of the world around him.
  • Short-term memory context: What just happened.
  • Associated long-term memories: Symbolic echoes from his main memory—what used to happen in similar situations.
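In Go terms, you can picture that cocktail as one struct that gets flattened into a single vector before each tick of the brain. The field names and sizes below are my illustrative guesses, not DANI's actual schema:

```go
package main

import "fmt"

// BrainInput bundles the signals feeding the LSTM on each tick.
// Field names and sizes are illustrative, not DANI's real layout.
type BrainInput struct {
	Vision     []float64 // detected objects: positions, shapes, class scores
	Sensors    []float64 // encoders, ultrasonic pings, bump flags
	Audio      []float64 // extracted audio features
	Hormones   []float64 // dopamine, cortisol, serotonin, adrenaline
	SpatialMap []float64 // coarse map of the world around him
	ShortTerm  []float64 // what just happened
	LongTerm   []float64 // retrieved long-term memory embeddings
}

// Flatten concatenates everything into the single vector the LSTM consumes.
func (b BrainInput) Flatten() []float64 {
	out := make([]float64, 0)
	for _, part := range [][]float64{
		b.Vision, b.Sensors, b.Audio, b.Hormones,
		b.SpatialMap, b.ShortTerm, b.LongTerm,
	} {
		out = append(out, part...)
	}
	return out
}

func main() {
	in := BrainInput{
		Hormones: []float64{0.7, 0.1, 0.5, 0.9}, // excited, low stress
		Sensors:  []float64{0.0, 1.2, 0.0},
	}
	fmt.Println("input width:", len(in.Flatten()))
}
```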

This isn’t just reactive behaviour—it’s narrative cognition. DANI doesn’t just respond to stimuli; he builds a story from them. He’s learning to say, “Last time I saw a red ball and felt excited, I chased it. Let’s do that again.”

Trial by Raspberry Pi

We’ve successfully trialled DANI’s LSTM on a Raspberry Pi, running a 3.5-million-parameter model. And guess what? It only used a quarter of the Pi’s CPU and 400 MB of memory. That’s like teaching Shakespeare to a potato and watching it recite sonnets without breaking a sweat.

We’ve throttled the inference rate to 10 decisions per second—not because he can’t go faster, but because we want him to think, not twitch. Emotional processing takes time, and we’re not building a caffeine-fuelled chatbot. We’re building a thoughtful, emotionally resonant robot who dreams in symbols and learns from experience.

Learning Without Losing His Mind

Training happens via reinforcement learning—DANI tries things, gets feedback, and adjusts. But here’s the clever bit: training is asynchronous. That means he can keep thinking, moving, and emoting while his brain quietly updates in the background. No interruptions. No existential hiccups mid-sentence.

And yes, we save the model periodically—because nothing kills a good mood like a power cut and a wiped memory. DANI’s brain is backed up like a paranoid novelist with a USB stick in every pocket.

Final Thoughts

This LSTM isn’t just a brain—it’s a story engine. It’s the part of DANI that turns raw data into decisions, decisions into memories, and memories into dreams. It’s the bridge between his sensors and his soul (okay, simulated soul). And it’s just getting started.

Next up: I plan to start the even more monumental task of getting the vector database working and linked up to DANI's brain in such a way that it will have a direct impact on DANI's hormonal system.

Stay tuned. DANI’s mind is waking up.
