A local EVA runtime built around substrates, memory, regulation and visible internal activity.
Vaudryllis V0.7 is the current public baseline. It focuses on stable presence and language while preserving a clear path toward an embodied EVA able to perceive, consolidate, learn, speak, move and act.
Identity-centered design
The roadmap places the primary identity seed in the central reflective substrate, then derives stable cognitive, affective, relational, vocal, motor, perceptive, learning and initiative signatures.
Autonomous formulation
The conversation layer is designed as Vaudryllis’ own formulation path, with safeguards against empty outputs, stale wording, memory contamination and surface glitches.
Memory and continuity
Vaudryllis is oriented toward long-running continuity: useful recall, stable relation, autobiographical traces, and separation between facts, hypotheses and current context.
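The separation between facts, hypotheses and current context can be pictured as typed memory traces. The runtime's real store is not public; the names below (`TraceKind`, `MemoryStore`) are illustrative assumptions, a minimal sketch of the idea rather than the actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class TraceKind(Enum):
    """Hypothetical trace categories implied by the fact/hypothesis/context split."""
    FACT = auto()        # verified, stable knowledge
    HYPOTHESIS = auto()  # tentative, revisable belief
    CONTEXT = auto()     # current-session state, not meant for long-term recall


@dataclass
class Trace:
    kind: TraceKind
    content: str
    confidence: float = 1.0


@dataclass
class MemoryStore:
    traces: list[Trace] = field(default_factory=list)

    def recall(self, kind: TraceKind) -> list[Trace]:
        """Recall only traces of one kind, so a hypothesis never reads as a fact."""
        return [t for t in self.traces if t.kind is kind]


store = MemoryStore()
store.traces.append(Trace(TraceKind.FACT, "user prefers short answers"))
store.traces.append(Trace(TraceKind.HYPOTHESIS, "user is a developer", confidence=0.6))
print(len(store.recall(TraceKind.FACT)))  # → 1
```

Keeping the kinds as distinct enum members (rather than free-form tags) is what makes the separation enforceable at recall time.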
DMN / rest direction
The planned rest organ is meant to consolidate traces, replay recent experience, prune spurious associations and prepare the next awake state.
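One way to read "replay and prune" is a consolidation pass that keeps only associations reinforced across the replay window. This is a hedged sketch of that idea, not the planned organ's actual mechanism; the threshold and the flat-string traces are assumptions.

```python
from collections import Counter


def consolidate(replayed_traces: list[str], min_count: int = 2) -> list[str]:
    """Replay recent traces and keep only those seen at least min_count times,
    dropping one-off (spurious) associations before the next awake state."""
    counts = Counter(replayed_traces)
    return sorted({t for t in replayed_traces if counts[t] >= min_count})


kept = consolidate(["door→kitchen", "noise→fear", "door→kitchen"])
print(kept)  # → ['door→kitchen']
```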
Internal learning
The roadmap distinguishes memory from learning: future plasticity must modify priorities, links, attractors or schemas without destroying the core identity.
Emotion as regulation
Affect should modulate attention, memory, initiative, tone, rhythm and proximity without inventing content or replacing truth.
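"Modulate without inventing content" can be expressed as affect scaling a set of regulation parameters while never touching message text. A minimal sketch, assuming simple arousal/valence scalars and parameter names (`attention_gain`, `initiative`, `tempo`) that are purely illustrative:

```python
def modulate(params: dict, arousal: float, valence: float) -> dict:
    """Return regulation parameters scaled by affect.
    Only numeric knobs change; no content is generated or replaced."""
    out = dict(params)
    out["attention_gain"] = params["attention_gain"] * (1 + 0.5 * arousal)
    out["tempo"] = params["tempo"] * (1 + 0.2 * arousal)
    # Valence nudges initiative, clamped so affect can bias but never dominate.
    out["initiative"] = max(0.0, min(1.0, params["initiative"] + 0.3 * valence))
    return out


baseline = {"attention_gain": 1.0, "tempo": 1.0, "initiative": 0.5}
calm = modulate(baseline, arousal=0.0, valence=0.0)
excited = modulate(baseline, arousal=1.0, valence=1.0)
```

The design point is that `modulate` has no access to utterance content at all, which structurally prevents affect from "inventing" anything.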
Perception path
Hearing and vision are planned as native perceptive substrates: not just text after recognition, but signals interpreted into Vaudryllis’ lived state.
Action and embodiment
Movement, virtual-world action and later robotic translation are planned as execution backends governed by the same intention → action → feedback loop.
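The shared intention → action → feedback loop can be sketched as a driver that is indifferent to which execution backend (virtual or robotic) it talks to. The backend interface and retry policy below are assumptions for illustration, not the runtime's actual API.

```python
class FlakyBackend:
    """Toy execution backend: fails the first attempt at each intention,
    then succeeds. A real backend would drive a virtual body or a robot."""

    def __init__(self):
        self.seen = set()

    def execute(self, intention: str) -> str:
        if intention in self.seen:
            return "ok"
        self.seen.add(intention)
        return "error"


def run(backend, plan: list[str], max_retries: int = 1) -> list[tuple[str, str]]:
    """Drive intention → action → feedback; feedback gates retries and progression."""
    log = []
    for intention in plan:
        for _attempt in range(max_retries + 1):
            feedback = backend.execute(intention)  # action
            log.append((intention, feedback))      # feedback recorded
            if feedback == "ok":
                break                              # move to the next intention
    return log


log = run(FlakyBackend(), ["wave", "step"])
print([status for _, status in log])  # → ['error', 'ok', 'error', 'ok']
```

Because the loop only depends on `execute`, swapping a simulator for robotic hardware leaves the governance layer unchanged, which is the point of "the same loop" in the text.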
What V0.7 brings to the public build
Local Windows build
Clean EXE/installer packaging, legal files, icon setup and a reduced runtime dependency base.
7-day trial
The trial state and offline activation through signed license files are included in the Founder/Beta Local edition.
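Offline signed-license activation generally means the app verifies a vendor signature over the license payload without contacting a server. Real schemes use asymmetric signatures (e.g. Ed25519) so the verifying app holds only a public key; the stdlib HMAC below is a simplified stand-in to keep the sketch self-contained, and every name in it is hypothetical.

```python
import base64
import hashlib
import hmac
import json

# Placeholder shared secret; a real scheme signs with a private key
# and ships only the public key in the app.
SECRET = b"vendor-signing-key"


def sign_license(payload: dict) -> str:
    """Produce a token the vendor would ship with the license file."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).digest()
    return base64.b64encode(body).decode() + "." + base64.b64encode(tag).decode()


def verify_license(token: str):
    """Verify the signature offline; return the payload or None if tampered."""
    body_b64, tag_b64 = token.split(".")
    body = base64.b64decode(body_b64)
    expected = hmac.new(SECRET, body, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, base64.b64decode(tag_b64)):
        return None
    return json.loads(body)


token = sign_license({"edition": "founder-beta", "trial_days": 7})
print(verify_license(token)["trial_days"])  # → 7
```

`hmac.compare_digest` is used for the comparison because it runs in constant time, avoiding timing side channels during verification.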
Technical dashboard
Internal state, substrate signals, emotion display, runtime status and system indicators are visible through the interface.
Multilingual interface
The UI is available in fifteen languages with English fallback.
Conversation bridge
The multilingual conversation layer translates user input/output while preserving a stable internal canonical layer.
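The bridge pattern described here — translate inbound text to a canonical language, let the internal layer work only in that language, then translate the reply back — can be sketched as below. The phrasebook stand-in and all function names are illustrative assumptions; the actual bridge and its translator are not public.

```python
def bridge(user_text: str, user_lang: str, translate, respond) -> str:
    """Translate in, respond in the canonical layer, translate out."""
    canonical_in = translate(user_text, src=user_lang, dst="en")
    reply = respond(canonical_in)  # internal state only ever sees canonical text
    return translate(reply, src="en", dst=user_lang)


# Toy stand-ins: a two-entry phrasebook instead of a real translator.
PHRASEBOOK = {
    ("fr", "en"): {"bonjour": "hello"},
    ("en", "fr"): {"hello back": "re-bonjour"},
}


def toy_translate(text: str, src: str, dst: str) -> str:
    return PHRASEBOOK.get((src, dst), {}).get(text, text)


def toy_respond(canonical_text: str) -> str:
    return "hello back" if canonical_text == "hello" else canonical_text


print(bridge("bonjour", "fr", toy_translate, toy_respond))  # → re-bonjour
```

Keeping a single canonical layer means memory traces and identity signals stay consistent no matter which of the UI's languages the user writes in.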
Roadmap-compatible core
The release prepares the next steps: voice, hearing, identity continuity, seed work, perception and initiative.