AIUNITES is a family of open protocols for encoding human movement as plain text. Write once, then use it in a gym log, a game engine, a clinical record, a cable rig, and an AI voice synthesizer. No conversion. No proprietary software. No paying anyone to read your own data.
How can you have AI gains in this world without having unity?
Every field that touches human movement built its own silo. Whether by design or by accident, the result is the same: your data doesn't travel with you.
A biomechanics lab uses files only their software can read. A gym manufacturer stores joint angles in firmware only their machines can interpret. A clinic documents exercises in prose that no other system can parse. A game studio uses motion capture files that encode skeleton data but nothing about which muscles fired or why. These fields evolved independently, in different decades, solving different problems. Nobody planned the fragmentation — but nobody solved it either.
The practical effect is the same regardless of intent: when your movement data lives inside a vendor's format, taking it with you is expensive and painful. Exporting, converting, and re-importing between systems requires specialized software, technical expertise, or both. For most people, the data effectively stays behind when they switch providers, facilities, or platforms.
MNN is a notation for human movement, not just exercise. The same string works across all three of the domains that follow.
- Gym logging, physical therapy, clinical documentation, personal training. Track which angle clears the acromion, log nerve flare-ups alongside sets, document compensation patterns over time.
- Virtual worlds, VR training, Second Life / OpenSim, digital twins, animation. Pose an avatar precisely using joint angles, animate contraction sequences, build training simulations.
- Cable rigs, exoskeletons, robotic rehabilitation, isokinetic machines, teleoperation. Drive a pulley to the exact height and angle, set joint limits, reproduce a prescribed position.
The silos weren't necessarily planned, but they're real:
Gym machines track joint angles in proprietary firmware. That data lives on their hardware and doesn't follow you to the next facility.
Motion capture, biomechanics, and EMG each have their own file formats. Reading the data often requires the same software that generated it.
Physical therapy and clinical notes are written in prose. Rich in detail, but no other system can parse or reuse them.
Biomechanics, sports medicine, dance, and animation each developed their own terminology for the same body doing the same things.
Game engines use skeleton data with no concept of which muscles fired or why. The movement looks right but carries no physiological meaning.
The string you write in the gym is the same string a game engine executes. Your data. Your format. Portable.
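Because the format is plain text, reading it takes nothing more than a regular expression — no vendor SDK on either end. A minimal parsing sketch in Python, assuming a hypothetical fragment assembled from operators that appear on this page (`[Pos:]`, `~Nms.ease-bio`, `[Con:]`); the sample string, the `.R` side marker, and the overall ordering are illustrative assumptions, and the authoritative grammar is the MNN spec:

```python
import re

# Hypothetical MNN-style fragment: joint symbol + side, a [Pos:] target,
# a ~Nms transition with an easing suffix, and a [Con:] activation level.
SAMPLE = "Sh.R[Pos:Flex:90]~600ms.ease-bio[Con:++]"

TOKEN = re.compile(
    r"(?P<joint>[A-Za-z.]+?)\.(?P<side>[LR])"   # joint symbol and side
    r"\[Pos:(?P<axis>\w+):(?P<angle>-?\d+)\]"   # target joint angle
    r"~(?P<ms>\d+)ms\.(?P<ease>[\w-]+)"         # transition duration + curve
    r"\[Con:(?P<con>\++)\]"                     # peak activation ('+' .. '++++')
)

def parse(s: str) -> dict:
    m = TOKEN.fullmatch(s)
    if m is None:
        raise ValueError(f"not a recognized fragment: {s!r}")
    d = m.groupdict()
    return {
        "joint": d["joint"], "side": d["side"],
        "axis": d["axis"], "angle_deg": int(d["angle"]),
        "duration_ms": int(d["ms"]), "ease": d["ease"],
        "activation": len(d["con"]),  # count of '+', 1..4
    }

print(parse(SAMPLE))
```

The same `parse` call works whether the consumer is a gym log, a game engine, or a cable rig: that is the portability claim in miniature.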
MNN models the body as a tree of joints rooted at the pelvis. Every joint has a symbol, defined degrees of freedom, and physiological range limits. The same taxonomy is used in the gym, in the avatar rig, and in the cable controller.
```
Pelvis (root)
├── Spine
│   ├── Sp.L  Lumbar
│   ├── Sp.T  Thoracic
│   ├── Sp.C  Cervical
│   │   └── AA  Atlantoaxial (C1–C2)
│   └── Head
│       └── TMJ  Jaw
├── SC  Sternoclavicular (L/R)
│   └── AC  Acromioclavicular (L/R)
│       └── Scap  Scapula (L/R)
│           └── Sh  Shoulder (L/R)
│               └── El  Elbow (L/R)
│                   └── RU  Forearm (L/R)
│                       └── Wr  Wrist (L/R)
│                           └── MCP/PIP/DIP  Fingers 1–5
└── Hip  Hip (L/R)
    └── Kn  Knee (L/R)
        └── Ank  Ankle (L/R)
            └── Sub  Subtalar (L/R)
                └── MTP/PIP/DIP  Toes 1–5
```
| Symbol | Joint | DOF | Key Axes | Notes |
|---|---|---|---|---|
| Axial — Midline | | | | |
| Sp.L | Lumbar Spine | 3 | Flex, Lat, Rot | −30 to 80° flex |
| Sp.T | Thoracic Spine | 3 | Flex, Lat, Rot | Primary rotation segment |
| Sp.C | Cervical Spine | 3 | Flex, Lat, Rot | C3–C7 aggregated |
| AA | Atlantoaxial (C1–C2) | 1 | Rot | Up to 45° each side |
| TMJ | Temporomandibular (Jaw) | 2 | Open, Lat | Value in mm, not degrees |
| Shoulder Girdle — Paired | | | | |
| SC | Sternoclavicular | 3 | Elev, Pro, Rot | Often implicit in Scap |
| AC | Acromioclavicular | 3 | UpRot, Tilt, Rot | Often implicit in Scap |
| Scap | Scapulothoracic | 3 | Pro, Elev, UpRot | Composite SC + AC motion |
| Sh | Glenohumeral (Shoulder) | 3 | Flex, Abd, IR, ER | −45 to 180° flex; 0–180° abd |
| Upper Limb — Paired | | | | |
| El | Elbow (Humeroulnar) | 1 | Flex | 0–145° |
| RU | Radioulnar / Forearm | 1 | Pro, Sup | 0–90° each |
| Wr | Wrist | 2 | Flex, Rad, Uln | −70 to 80° flex |
| MCP/PIP/DIP | Fingers (F1–F5) | 1–2 | Flex, Abd | Digit suffix: F1=thumb … F5=little |
| Lower Limb — Paired | | | | |
| Hip | Hip (Femoroacetabular) | 3 | Flex, Abd, IR, ER | −30 to 125° flex |
| Kn | Knee (Tibiofemoral) | 1 | Flex | 0–140° |
| Ank | Ankle (Talocrural) | 1 | Dors, Plan | Dorsi/plantarflexion only |
| Sub | Subtalar | 1 | Inv, Ev | Inversion/eversion separate |
| MTP/PIP/DIP | Toes (T1–T5) | 1 | Flex | T1=hallux; negative=extension |
~139 DOF full body · ~25 DOF for typical exercise & rehab use · ~149 muscles LOD 1–4 · Full spec: MNN Spec v1.5 →
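The range limits in the table above are machine-checkable. A minimal sketch of validating a pose against them, with a small slice of the taxonomy as data — the symbols and limits are copied from the table, but the dict layout and `in_range` function are illustrative assumptions, not the MNN spec's schema:

```python
# A few joints from the taxonomy table: symbol -> axes -> (min°, max°).
JOINTS = {
    "Sh":  {"name": "Glenohumeral",     "axes": {"Flex": (-45, 180), "Abd": (0, 180)}},
    "El":  {"name": "Humeroulnar",      "axes": {"Flex": (0, 145)}},
    "Kn":  {"name": "Tibiofemoral",     "axes": {"Flex": (0, 140)}},
    "Hip": {"name": "Femoroacetabular", "axes": {"Flex": (-30, 125)}},
}

def in_range(joint: str, axis: str, deg: float) -> bool:
    """True if the angle lies within the physiological range for that axis."""
    lo, hi = JOINTS[joint]["axes"][axis]
    return lo <= deg <= hi

print(in_range("Sh", "Flex", 90))    # 90° shoulder flexion: within -45..180
print(in_range("El", "Flex", 160))   # 160° elbow flexion: beyond 145, rejected
```

The same check serves every consumer the page lists: a gym log flags an implausible entry, an avatar rig clamps the pose, a cable controller refuses to drive past the limit.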
Static poses are snapshots. A living avatar needs duration, easing, and sequencing. Transition notation is the layer that maps to what the cerebellum actually does — timing and smoothing movement between intended states.
The ~Nms operator is the transition. The .ease- suffix is the curve. [Con:] marks peak activation at the midpoint. The cerebellum produces a bell-shaped velocity profile on every voluntary movement: acceleration at onset, peak velocity at midpoint, deceleration at arrival. ease-bio encodes this curve, and it is the default when no curve is specified; any avatar using a linear lerp looks mechanical.
ease-linear — mechanical, cable rigs & robots.
ease-snap — ballistic/reflex, fast and abrupt.
ease-in — slow start, reaching & extending.
ease-out — fast start, catching & landing.
ease-spring — overshoot and return, tremor.
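One plausible realization of the bell-shaped ease-bio profile is the classic minimum-jerk polynomial, whose velocity accelerates at onset, peaks at the midpoint, and decelerates at arrival. This is a sketch under that assumption — the spec may define ease-bio differently:

```python
def ease_bio(t: float) -> float:
    """Minimum-jerk position profile: t in [0, 1] -> progress in [0, 1].
    Its derivative is a bell-shaped velocity curve peaking at t = 0.5."""
    return 10 * t**3 - 15 * t**4 + 6 * t**5

def ease_linear(t: float) -> float:
    """Mechanical lerp, as used by cable rigs and robots."""
    return t

def interpolate(start_deg: float, end_deg: float, t: float, ease=ease_bio) -> float:
    """Joint angle at normalized time t of a ~Nms transition."""
    return start_deg + (end_deg - start_deg) * ease(t)

# Halfway through a transition from 0° to 90° of flexion:
print(interpolate(0, 90, 0.5))   # 45.0 (curve midpoint, where [Con:] peaks)
```

Swapping `ease` for `ease_linear` reproduces the mechanical look the text warns about; the other suffixes above would be further functions of the same shape.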
Living avatars need repeating motion. @loop repeats a sequence indefinitely. @cycle:800ms sets the period. @phase:0.5 offsets bilateral limbs out of phase — the walk cycle.
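The @cycle and @phase modifiers can be sketched as a periodic function of time. A sinusoid and the amplitude are illustrative assumptions here, not the spec's waveform; the point is that @cycle:800ms sets the period and @phase:0.5 puts the opposite limb half a period out of step:

```python
import math

def cyclic_angle(t_ms: float, amp_deg: float = 30.0,
                 cycle_ms: float = 800.0, phase: float = 0.0) -> float:
    """Joint angle for a looping motion: @cycle sets the period,
    @phase offsets the start as a fraction of one cycle."""
    return amp_deg * math.sin(2 * math.pi * (t_ms / cycle_ms + phase))

# Bilateral walk cycle: the left hip runs at @phase:0.5, half a period
# behind the right, so the limbs swing in opposition.
right = cyclic_angle(200, phase=0.0)
left  = cyclic_angle(200, phase=0.5)
print(round(right, 1), round(left, 1))   # equal magnitude, opposite sign
```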
| Neural layer | Role | MNN equivalent |
|---|---|---|
| Motor cortex | Target joint angle | [Pos:] pair |
| Cerebellum | Timing & smoothing | ~Nms.ease-bio |
| Alpha motor neurons | Peak activation | [Con:] on transition |
| Proprioceptive loop | Cycle & phase | @cycle, @phase |
Transition notation drives skeleton and timing. The Surface Layer drives what the body looks like — morph targets, muscle definition, body composition. Two optional tags, backward compatible with every existing MNN string.
Declares the avatar’s body composition once at session level. Mass, body fat %, frame size, height. Body fat is the key variable — it scales how visible muscle contraction is on the surface. A BF:8% avatar and a BF:30% avatar have the same notation; the rendering engine scales morph magnitudes accordingly.
Explicit morph target weights for when the surface should differ from the default. Use for isometric contractions (joint doesn’t move but surface must show), muscle head specificity (peaked vs round bicep), vascular pump, or clinical atrophy. For standard contractions, the default mapping handles it — no tag needed.
The default mapping takes the activation level directly to a morph weight: + → 0.25, ++ → 0.50, +++ → 0.75, ++++ → 1.00. The morph target identifier is the MNN muscle symbol. An LOD 1 engine that receives [Morph:Bic.Long:0.9] falls back to the parent Bic target.
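The surface-layer rules above can be sketched in a few lines. The activation-to-weight table is taken from the text; the LOD parent table, the function names, and the linear body-fat attenuation are illustrative assumptions, not spec definitions:

```python
# Default activation-to-morph mapping from the text.
ACTIVATION_TO_WEIGHT = {"+": 0.25, "++": 0.50, "+++": 0.75, "++++": 1.00}

# Hypothetical LOD table: muscle-head targets and their LOD 1 parents.
LOD1_PARENTS = {"Bic.Long": "Bic", "Bic.Short": "Bic"}

def resolve_morph(target: str, weight: float, lod: int) -> tuple[str, float]:
    """An LOD 1 engine falls back to the parent muscle target."""
    if lod == 1 and target in LOD1_PARENTS:
        return LOD1_PARENTS[target], weight
    return target, weight

def visible_weight(weight: float, body_fat_pct: float) -> float:
    """Scale surface visibility by body fat: a BF:8% avatar shows
    contraction at near-full magnitude, a BF:30% avatar attenuated.
    Linear falloff is an assumed rendering policy, not the spec's."""
    return weight * max(0.0, 1.0 - body_fat_pct / 50.0)

print(resolve_morph("Bic.Long", 0.9, lod=1))   # falls back to parent 'Bic'
print(visible_weight(ACTIVATION_TO_WEIGHT["++"], 8.0))
```

Note that the same notation string drives both calls: the session-level body composition only changes how the renderer scales the result.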
HMN is the first system that crosses all of these silos in portable plain text.
| System | Domain | Text-portable | Neuromuscular | Joint angles | Voice | Avatar-ready |
|---|---|---|---|---|---|---|
| Labanotation | Dance / theater | ✗ | ✗ | ✗ | ✗ | ✗ |
| Eshkol-Wachman | Movement research | ~ | ✗ | ✓ | ✗ | ~ |
| HamNoSys / SiGML | Sign language / avatars | ~ | ✗ | ✓ (arms/hands) | ✗ | ✓ (arms only) |
| BVH / FBX | Motion capture / animation | ✗ | ✗ | ✓ | ✗ | ✓ |
| SMPL / SMPL-X | AI / ML body models | ✗ | ✗ | ✓ | ~ (face) | ✓ |
| ISB JCS | Biomechanics research | ✗ | ✗ | ✓ | ✗ | ✗ |
| EMG + SENIAM | Muscle activity measurement | ✗ | ✓ (sensor) | ✗ | ✗ | ✗ |
| VRN (HMN) | Vocal production | ✓ | ✓ | ~ (jaw/larynx) | ✓ | ✓ |
| MNN (HMN) | Universal body | ✓ | ✓ | ✓ | via VRN | ✓ |
~ = partial coverage · ✗ = not supported · ✓ = fully supported
The HMN Builder lets you assemble an MNN string interactively — pick muscles, set joint angles, choose a movement pattern. Copy the result into a gym log, game engine, or cable rig.