
HMN
Human Movement Notation

The AIUNITES family of open protocols for encoding human movement as plain text. Write once — use in a gym log, a game engine, a clinical record, a cable rig, and an AI voice synthesizer. No conversion. No proprietary software. No paying anyone to read your own data.

How can you have AI gains in this world without having unity?

💪
MNN — Muscular Neuro Notation
BODWAVE™ product line
The body. Muscles, nerves, joints, resistance vectors, compensation. LOD 1–4, ~149 muscles. v1.6.0
Read spec →
🎙️
VRN
Voice Resonance Notation
Vocal production. Resonance chambers, articulators, breath placement, register. 75+ symbols.
VoiceStry →
🤖
VNN
Voice Neural Notation
AI voice synthesis. Neural voice mapping, formants, phoneme targeting for AI systems.
Spec in development
{Push.H} [Con:Pec.S+++, Dlt.A+] → MedPec/Axil
[Pos:L.Sh(IR:25,Flex:90)] [Vec:H:Mid,A:0°,Src:Cable]
One MNN string (HMN protocol): readable by humans, parseable by machines, executable by avatars and cable rigs.
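The tag grammar shown above is regular enough to parse in a few lines. A minimal Python sketch, assuming the [Pos:] syntax follows the examples on this page (the full spec may allow more):

```python
import re

def parse_pos(tag: str) -> dict:
    """Parse an MNN [Pos:] tag like '[Pos:L.Sh(IR:25,Flex:90)]' into
    {joint: {axis: degrees}}. Joint symbols and axes come straight from
    the string; no taxonomy lookup is attempted here."""
    m = re.fullmatch(r"\[Pos:(.+)\]", tag.strip())
    if not m:
        raise ValueError(f"not a [Pos:] tag: {tag!r}")
    pose = {}
    # Each joint clause looks like 'L.Sh(IR:25,Flex:90)'
    for joint, axes in re.findall(r"([\w.]+)\(([^)]*)\)", m.group(1)):
        pose[joint] = {
            axis: float(value)
            for axis, value in (pair.split(":") for pair in axes.split(","))
        }
    return pose

print(parse_pos("[Pos:L.Sh(IR:25,Flex:90)]"))
# {'L.Sh': {'IR': 25.0, 'Flex': 90.0}}
```

The same parser handles multi-joint poses such as the walk-cycle strings further down the page.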

Every field that touches human movement built its own silo. Whether by design or by accident, the result is the same: your data doesn't travel with you.

A biomechanics lab uses files only their software can read. A gym manufacturer stores joint angles in firmware only their machines can interpret. A clinic documents exercises in prose that no other system can parse. A game studio uses motion capture files that encode skeleton data but nothing about which muscles fired or why. These fields evolved independently, in different decades, solving different problems. Nobody planned the fragmentation — but nobody solved it either.

The practical effect is the same regardless of intent: when your movement data lives inside a vendor's format, taking it with you is expensive and painful. Exporting, converting, and re-importing between systems requires specialized software, technical expertise, or both. For most people, the data effectively stays behind when they switch providers, facilities, or platforms.

Three Domains, One Notation

MNN is a notation for human movement — not just exercise. The same string works across all three.

🏋️

Exercise & Rehabilitation

→ BodSpas (BODWAVE)

Gym logging, physical therapy, clinical documentation, personal training. Track which angle clears the acromion, log nerve flare-ups alongside sets, document compensation patterns over time.

3×12×20lb RPE:6
[Con:Pec.S+++] [Pos:L.Sh(IR:25)]
🌍

Virtual Worlds & Avatars

→ InThisWorld (inthisworld.com)

Virtual worlds, VR training, Second Life / OpenSim, digital twins, animation. Pose an avatar precisely using joint angles, animate contraction sequences, build training simulations.

[Pos:L.Sh(IR:25,Flex:90)]
→ llEuler2Rot(<x,y,z> * DEG_TO_RAD)
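Bridging to a virtual world is a unit conversion plus an axis mapping. A Python sketch of the MNN-to-LSL step above; the Flex/Abd/IR to x/y/z axis order is an assumption, since the real mapping depends on the avatar rig's bone conventions:

```python
import math

# Assumed axis order: Flex -> x, Abd -> y, IR -> z (illustrative only).
AXIS_ORDER = ("Flex", "Abd", "IR")

def pos_to_euler_rad(axes_deg: dict) -> tuple:
    """Turn one joint's MNN angles (degrees) into an <x,y,z> Euler
    tuple in radians, ready for llEuler2Rot on the LSL side."""
    return tuple(math.radians(axes_deg.get(axis, 0.0)) for axis in AXIS_ORDER)

x, y, z = pos_to_euler_rad({"IR": 25, "Flex": 90})
print(f"llEuler2Rot(<{x:.4f}, {y:.4f}, {z:.4f}>)")
```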
🤖

Remote Control & Robotics

→ BodSpas (BODWAVE)

Cable rigs, exoskeletons, robotic rehabilitation, isokinetic machines, teleoperation. Drive a pulley to the exact height and angle, set joint limits, reproduce a prescribed position.

[Vec:H:Mid,A:0°,Src:Cable]
→ pulley.setHeight(1.2m), angle(0°)
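On the rig side, a [Vec:] tag decomposes into device commands. A Python sketch under assumed names: the HEIGHT_M presets and the setHeight/setAngle command names are illustrative, not from any real rig SDK:

```python
# Hypothetical rig presets: MNN height symbols -> meters (assumed values).
HEIGHT_M = {"Low": 0.3, "Mid": 1.2, "High": 2.1}

def vec_to_commands(vec: str) -> list:
    """Translate '[Vec:H:Mid,A:0°,Src:Cable]' into (command, value) pairs
    for a hypothetical pulley controller."""
    body = vec.strip()[len("[Vec:"):-1]
    fields = dict(pair.split(":", 1) for pair in body.split(","))
    cmds = []
    if "H" in fields:
        cmds.append(("setHeight", HEIGHT_M[fields["H"]]))
    if "A" in fields:
        cmds.append(("setAngle", float(fields["A"].rstrip("°"))))
    return cmds

print(vec_to_commands("[Vec:H:Mid,A:0°,Src:Cable]"))
# [('setHeight', 1.2), ('setAngle', 0.0)]
```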

Why No Unified Standard Existed

Each silo grew for its own reasons, in its own decade, and none was necessarily planned. The barriers are real all the same.

🔒 Equipment Data

Gym machines track joint angles in proprietary firmware. That data lives on their hardware and doesn't follow you to the next facility.

💻 Software Formats

Motion capture, biomechanics, and EMG each have their own file formats. Reading the data often requires the same software that generated it.

🏥 Clinical Records

Physical therapy and clinical notes are written in prose. Rich in detail, but no other system can parse or reuse them.

🎓 Separate Vocabularies

Biomechanics, sports medicine, dance, and animation each developed their own terminology for the same body doing the same things.

🎮 Virtual Worlds

Game engines use skeleton data with no concept of which muscles fired or why. The movement looks right but carries no physiological meaning.

📖 MNN's Answer

The string you write in the gym is the same string a game engine executes. Your data. Your format. Portable.

Joint Taxonomy

MNN models the body as a tree of joints rooted at the pelvis. Every joint has a symbol, defined degrees of freedom, and physiological range limits. The same taxonomy is used in the gym, in the avatar rig, and in the cable controller.

Pelvis (root)
├── Spine
│   ├── Sp.L  Lumbar
│   ├── Sp.T  Thoracic
│   ├── Sp.C  Cervical
│   │   └── AA    Atlantoaxial (C1–C2)
│   └── Head
│       └── TMJ   Jaw
├── SC    Sternoclavicular  (L/R)
│   └── AC    Acromioclavicular (L/R)
│       └── Scap  Scapula        (L/R)
│           └── Sh    Shoulder       (L/R)
│               └── El    Elbow         (L/R)
│                   └── RU    Forearm       (L/R)
│                       └── Wr    Wrist         (L/R)
│                           └── MCP/PIP/DIP  Fingers 1–5
└── Hip   Hip            (L/R)
    └── Kn    Knee          (L/R)
        └── Ank   Ankle        (L/R)
            └── Sub   Subtalar    (L/R)
                └── MTP/PIP/DIP  Toes 1–5
Symbol | Joint | DOF | Key Axes | Notes
Axial — Midline
Sp.L | Lumbar Spine | 3 | Flex, Lat, Rot | −30 to 80° flex
Sp.T | Thoracic Spine | 3 | Flex, Lat, Rot | Primary rotation segment
Sp.C | Cervical Spine | 3 | Flex, Lat, Rot | C3–C7 aggregated
AA | Atlantoaxial (C1–C2) | 1 | Rot | Up to 45° each side
TMJ | Temporomandibular (Jaw) | 2 | Open, Lat | Value in mm, not degrees
Shoulder Girdle — Paired
SC | Sternoclavicular | 3 | Elev, Pro, Rot | Often implicit in Scap
AC | Acromioclavicular | 3 | UpRot, Tilt, Rot | Often implicit in Scap
Scap | Scapulothoracic | 3 | Pro, Elev, UpRot | Composite SC + AC motion
Sh | Glenohumeral (Shoulder) | 3 | Flex, Abd, IR, ER | −45 to 180° flex; 0–180° abd
Upper Limb — Paired
El | Elbow (Humeroulnar) | 1 | Flex | 0–145°
RU | Radioulnar / Forearm | 1 | Pro, Sup | 0–90° each
Wr | Wrist | 2 | Flex, Rad, Uln | −70 to 80° flex
MCP/PIP/DIP | Fingers (F1–F5) | 1–2 | Flex, Abd | Digit suffix: F1=thumb … F5=little
Lower Limb — Paired
Hip | Hip (Femoroacetabular) | 3 | Flex, Abd, IR, ER | −30 to 125° flex
Kn | Knee (Tibiofemoral) | 1 | Flex | 0–140°
Ank | Ankle (Talocrural) | 1 | Dors, Plan | Dorsi/plantarflexion only
Sub | Subtalar | 1 | Inv, Ev | Inversion/eversion separate
MTP/PIP/DIP | Toes (T1–T5) | 1 | Flex | T1=hallux; negative = extension

~139 DOF full body · ~25 DOF for typical exercise & rehab use · ~149 muscles LOD 1–4 · Full spec: MNN Spec v1.5 →
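In code, the taxonomy reduces to a lookup of parent joint and per-axis range. A Python sketch covering a few joints from the table above, with ranges copied from its Notes column:

```python
# A minimal slice of the taxonomy: symbol -> (parent, {axis: (min_deg, max_deg)}).
# Only a few joints are included; ranges come from the table above.
JOINTS = {
    "Sh": ("Scap", {"Flex": (-45, 180), "Abd": (0, 180)}),
    "El": ("Sh",   {"Flex": (0, 145)}),
    "Kn": ("Hip",  {"Flex": (0, 140)}),
}

def validate(joint: str, axis: str, deg: float) -> bool:
    """True if the angle is inside the joint's physiological range."""
    _, axes = JOINTS[joint]
    lo, hi = axes[axis]
    return lo <= deg <= hi

assert validate("El", "Flex", 130)      # a deep curl is fine
assert not validate("El", "Flex", 160)  # beyond 145° is out of range
```

The same table drives all three domains: the gym logger warns on it, the avatar rig clamps to it, the cable controller refuses to exceed it.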

MNN v1.6 — Transition Notation

Static poses are snapshots. A living avatar needs duration, easing, and sequencing. Transition notation is the layer that maps to what the cerebellum actually does — timing and smoothing movement between intended states.

The Transition Operator
{Curl}
[Pos:L.El(Flex:0)] ~400ms.ease-in [Con:Bic+++]
[Pos:L.El(Flex:130)] ~600ms.ease-out [Con:Bic+]
[Pos:L.El(Flex:0)]
One rep of a bicep curl. Origin pose → duration → easing → muscle peak → target pose → repeat. The ~Nms operator is the transition. The .ease- suffix is the curve. [Con:] marks peak activation at the midpoint.

🧠 ease-bio — the default

The cerebellum produces a bell-shaped velocity profile on every voluntary movement: acceleration at onset, peak at midpoint, deceleration at arrival. ease-bio is this curve. Any avatar using a linear lerp looks mechanical. ease-bio is the default when no curve is specified.
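The page does not publish the exact ease-bio curve; a common stand-in for a bell-shaped velocity profile is the minimum-jerk polynomial, sketched here in Python:

```python
def ease_bio(t: float) -> float:
    """Minimum-jerk position profile: progress 0..1 for normalized time
    t in 0..1. An assumed model of ease-bio; the spec may define a
    different curve with the same bell-shaped velocity."""
    return 10 * t**3 - 15 * t**4 + 6 * t**5

def velocity(t: float) -> float:
    """d(ease_bio)/dt: accelerates from rest, peaks at the midpoint,
    decelerates to rest, matching the cerebellar profile described."""
    return 30 * t**2 - 60 * t**3 + 30 * t**4

assert ease_bio(0.0) == 0.0 and ease_bio(1.0) == 1.0
assert velocity(0.5) > velocity(0.25) > velocity(0.05)
```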

Other curves

ease-linear — mechanical, cable rigs & robots.
ease-snap — ballistic/reflex, fast and abrupt.
ease-in — slow start, reaching & extending.
ease-out — fast start, catching & landing.
ease-spring — overshoot and return, tremor.

🔁 Loop & cycle tags

Living avatars need repeating motion. @loop repeats a sequence indefinitely. @cycle:800ms sets the period. @phase:0.5 offsets bilateral limbs out of phase — the walk cycle.
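The loop tags reduce to modular arithmetic on time. A Python sketch of how @cycle and @phase might combine; the exact semantics are an assumption based on the walk-cycle description:

```python
def cycle_t(time_ms: float, cycle_ms: float, phase: float = 0.0) -> float:
    """Normalized position (0..1) within a looping cycle. @cycle sets
    the period; @phase shifts a limb within it, so @phase:0.5 puts left
    and right legs half a cycle apart."""
    return ((time_ms / cycle_ms) + phase) % 1.0

# At t=200ms into an 800ms walk cycle:
left = cycle_t(200, 800)         # 0.25 through the cycle
right = cycle_t(200, 800, 0.5)   # 0.75: exactly out of phase
assert abs(right - left - 0.5) < 1e-9
```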

Squat — simultaneous joints
{Squat}
[Pos:L.Hip(Flex:0) R.Hip(Flex:0)
     L.Kn(Flex:0) R.Kn(Flex:0)] ~800ms.ease-bio
[Pos:L.Hip(Flex:90) R.Hip(Flex:90)
     L.Kn(Flex:100) R.Kn(Flex:100)]
[Pos:L.Hip(Flex:0) R.Hip(Flex:0)
     L.Kn(Flex:0) R.Kn(Flex:0)] ~900ms.ease-bio
Walk cycle — loop + phase
{Walk} @loop @cycle:800ms
[Pos:L.Hip(Flex:30) R.Hip(Flex:-10)]
~400ms.ease-bio
[Pos:L.Hip(Flex:-10) R.Hip(Flex:30)]
~400ms.ease-bio @phase:0.5
Idle breathing — infinite loop
{Breathe} @loop @cycle:4000ms
[Pos:Sp.T(Flex:2)]
~1800ms.ease-in
[Pos:Sp.T(Flex:0)]
~2200ms.ease-out
Neural ↔ Notation Mapping
Neural layer | Role | MNN equivalent
Motor cortex | Target joint angle | [Pos:] pair
Cerebellum | Timing & smoothing | ~Nms.ease-bio
Alpha motor neurons | Peak activation | [Con:] on transition
Proprioceptive loop | Cycle & phase | @cycle, @phase
🎭 Try MNN Avatar Viewer → | Full HMN Spec v1.2 →

MNN v1.6 — Avatar Surface Layer

Transition notation drives skeleton and timing. The Surface Layer drives what the body looks like — morph targets, muscle definition, body composition. Two optional tags, backward compatible with every existing MNN string.

👤 [Body:] — Baseline

Declares the avatar’s body composition once at session level. Mass, body fat %, frame size, height. Body fat is the key variable — it scales how visible muscle contraction is on the surface. A BF:8% avatar and a BF:30% avatar have the same notation; the rendering engine scales morph magnitudes accordingly.

[Body:Mass:85kg, BF:12%, Frame:L, Height:182cm]
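How a rendering engine might scale morphs by body fat, in Python. The specific scaling curve is invented for illustration; the text says only that BF% scales morph magnitude:

```python
def surface_weight(morph_weight: float, body_fat_pct: float) -> float:
    """Scale a morph weight by body composition. Assumed curve: full
    visibility at 5% BF falling linearly to 10% visibility at 40% BF.
    The real mapping is up to the rendering engine."""
    leanness = max(0.0, min(1.0, (40.0 - body_fat_pct) / 35.0))
    return morph_weight * (0.1 + 0.9 * leanness)

print(surface_weight(1.0, 8))    # lean: contraction clearly visible
print(surface_weight(1.0, 30))   # higher BF: same notation, subtler surface
```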

💪 [Morph:] — Override

Explicit morph target weights for when the surface should differ from the default. Use for isometric contractions (joint doesn’t move but surface must show), muscle head specificity (peaked vs round bicep), vascular pump, or clinical atrophy. For standard contractions, the default mapping handles it — no tag needed.

[Con:Quad.VL++++] [Pos:L.Kn(Flex:0)]
[Morph:Quad.VL:1.0] // isometric

[Morph:Bic.Long:0.95, Bic.Short:0.4] // peaked

📈 Default mapping

For standard contractions, no [Morph:] is needed. The activation level maps directly to a morph weight: + → 0.25, ++ → 0.50, +++ → 0.75, ++++ → 1.00. The morph target identifier is the MNN muscle symbol. An LOD 1 engine that receives [Morph:Bic.Long:0.9] falls back to the parent Bic target.
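The default mapping and the LOD fallback fit in a few lines of Python. The LOD1_TARGETS set below is a hypothetical engine's morph inventory:

```python
ACTIVATION_TO_WEIGHT = {"+": 0.25, "++": 0.50, "+++": 0.75, "++++": 1.00}

# Muscles this hypothetical LOD 1 engine has morph targets for:
LOD1_TARGETS = {"Bic", "Pec", "Quad"}

def morph_for(symbol: str, marks: str) -> tuple:
    """Map an MNN activation (e.g. 'Bic.Long', '+++') to a
    (target, weight) pair, falling back to the parent muscle when the
    engine lacks the specific head."""
    weight = ACTIVATION_TO_WEIGHT[marks]
    target = symbol
    while target not in LOD1_TARGETS and "." in target:
        target = target.rsplit(".", 1)[0]   # Bic.Long -> Bic
    return target, weight

print(morph_for("Bic.Long", "+++"))  # ('Bic', 0.75)
```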

Avatar Surface Layer spec → | Full HMN Spec v1.2 →

Where HMN Sits

Each system below covers one slice of human movement. HMN is the first that crosses all of them in portable plain text.

System | Domain | Text-portable | Neuromuscular | Joint angles | Voice | Avatar-ready
Labanotation | Dance / theater | ✗ | ✗ | ✗ | ✗ | ✗
Eshkol-Wachman | Movement research | ∼ | ✗ | ∼ | ✗ | ✗
HamNoSys / SiGML | Sign language / avatars | ∼ | ✗ | ✓ (arms/hands) | ✗ | ✓ (arms only)
BVH / FBX | Motion capture / animation | ✗ | ✗ | ✓ | ✗ | ✓
SMPL / SMPL-X | AI / ML body models | ✗ | ✗ | ✓ | ∼ (face) | ✓
ISB JCS | Biomechanics research | ✗ | ✗ | ✓ | ✗ | ✗
EMG + SENIAM | Muscle activity measurement | ✗ | ✓ (sensor) | ✗ | ✗ | ✗
VRN (HMN) | Vocal production | ✓ | ✓ | ∼ (jaw/larynx) | ✓ | ✓
MNN (HMN) | Universal body | ✓ | ✓ | ✓ | via VRN | ✓

∼ = partial coverage · ✗ = not supported · ✓ = fully supported

Build a Notation String

The HMN Builder lets you assemble an MNN string interactively — pick muscles, set joint angles, choose a movement pattern. Copy the result into a gym log, game engine, or cable rig.