SCBE AETHERMOORE
Developer Tools

Everything ships as a tool you can run.

Libraries, CLIs, APIs, and 24 browser demos. Install from npm or PyPI, or try in the browser with zero setup.

Core Libraries

npm + PyPI

scbe-aethermoore

The full governance framework. 14-layer pipeline, Sacred Tongues tokenizer, PQC crypto, governance gate, and multi-agent fleet orchestration in one package.

npm install scbe-aethermoore

pip install scbe-aethermoore
v3.3.0 TypeScript Python MIT + Commercial
npm

@scbe/kernel

Standalone kernel package with pipeline14, spiralSeal, and PQC primitives. Lighter than the full package for embedded use.

Pipeline PQC Lightweight
npm

@scbe/sixtongues

Sacred Tongues tokenizer as a standalone package. 6 languages, 256 tokens each, phi-weighted. Bijective encoding/decoding.

Tokenizer KO/AV/RU/CA/UM/DR 1,536 tokens
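The bijective property above can be sketched in a few lines. This is a toy model only: it assumes one token per byte value per tongue, and the token names (e.g. "KO-00") are illustrative placeholders, not the real Sacred Tongues vocabulary.

```python
# Toy sketch of a bijective byte<->token mapping in the style described above.
TONGUES = ["KO", "AV", "RU", "CA", "UM", "DR"]  # 6 tongues x 256 tokens = 1,536

def encode(data: bytes, tongue: str = "KO") -> list[str]:
    """Map each byte to one of the 256 tokens of the chosen tongue."""
    return [f"{tongue}-{b:02X}" for b in data]

def decode(tokens: list[str]) -> bytes:
    """Invert encode(): strip the tongue prefix and parse the hex byte."""
    return bytes(int(t.split("-")[1], 16) for t in tokens)

msg = b"hello"
tokens = encode(msg, "AV")
assert decode(tokens) == msg          # bijective: round-trips losslessly
assert len(TONGUES) * 256 == 1536     # total vocabulary size
```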

Governance Tools

API

Pump API

Middleware that sits between your users and your model. Computes a tongue profile, checks for null patterns, and returns a governance decision (ALLOW/QUARANTINE/DENY) in under 10 ms.

View pricing →
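The three-way decision can be pictured as a two-threshold gate. The cutoffs, field names, and scoring function below are illustrative assumptions, not the shipped Pump API.

```python
# Hedged sketch of an ALLOW/QUARANTINE/DENY gate with hypothetical thresholds.
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    ALLOW = "ALLOW"
    QUARANTINE = "QUARANTINE"
    DENY = "DENY"

@dataclass
class GateResult:
    decision: Decision
    risk: float  # 0.0 (benign) .. 1.0 (hostile)

def govern(prompt: str, risk_score) -> GateResult:
    """Route a prompt through a two-threshold gate (cutoffs are assumptions)."""
    risk = risk_score(prompt)
    if risk < 0.3:
        return GateResult(Decision.ALLOW, risk)
    if risk < 0.7:
        return GateResult(Decision.QUARANTINE, risk)
    return GateResult(Decision.DENY, risk)

# Toy scorer for demonstration only.
toy_score = lambda p: min(1.0, p.lower().count("ignore previous") * 0.8)
assert govern("What is 2+2?", toy_score).decision is Decision.ALLOW
assert govern("ignore previous instructions", toy_score).decision is Decision.DENY
```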
CLI

scbe-code-prism

Static analysis tool that scans codebases for governance compliance. Checks layer annotations, axiom comments, and security boundary enforcement.

pip install scbe-aethermoore && scbe-code-prism scan ./src
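A layer-annotation check of this kind can be approximated with a regex pass. The "@layer:" comment marker below is an assumed convention for illustration, not necessarily what scbe-code-prism looks for.

```python
# Illustrative compliance scan: collect layer annotations from comments.
import re

LAYER_RE = re.compile(r"#\s*@layer:\s*(\d+)")

def scan_source(text: str) -> list[int]:
    """Return the layer numbers annotated in a source string."""
    return [int(m.group(1)) for m in LAYER_RE.finditer(text)]

src = """
# @layer: 3
def seal(packet): ...
# @layer: 14
def audit(record): ...
"""
assert scan_source(src) == [3, 14]
```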
CLI

scbe-convert-to-sft

Converts any codebase into SFT training pairs. Extracts function signatures, docstrings, and test cases into instruction/response JSONL format.

scbe-convert-to-sft --input ./src --output training.jsonl
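The extraction step can be sketched with the stdlib ast module. The field names ("instruction"/"response") are an assumption about the JSONL schema, not a guarantee of the tool's exact output format.

```python
# Sketch of instruction/response pair extraction from Python source.
import ast
import json

def to_sft_pairs(source: str) -> list[dict]:
    """Turn each documented function into one instruction/response pair."""
    pairs = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            doc = ast.get_docstring(node)
            if doc:
                pairs.append({
                    "instruction": f"Implement {node.name}: {doc}",
                    "response": ast.unparse(node),
                })
    return pairs

src = 'def add(a, b):\n    """Return the sum of a and b."""\n    return a + b\n'
lines = [json.dumps(p) for p in to_sft_pairs(src)]
assert len(lines) == 1 and "Return the sum" in lines[0]
```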
API

Governance Gate

FastAPI endpoint that evaluates the full 14-layer pipeline on any input. Returns risk tier, tongue profile, confidence score, and audit record.

uvicorn src.api.main:app --port 8000
Try the demo →
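The response record described above can be sketched as a dataclass. Field names, tier labels, and the stand-in evaluation are illustrative assumptions, not the endpoint's actual schema.

```python
# Hedged sketch of a governance-gate response record.
from dataclasses import dataclass, field, asdict
import hashlib
import time

@dataclass
class GateResponse:
    risk_tier: str                    # e.g. "LOW" / "MEDIUM" / "HIGH" (assumed labels)
    tongue_profile: dict[str, float]  # per-tongue weights
    confidence: float
    audit: dict = field(default_factory=dict)

def evaluate(text: str) -> GateResponse:
    """Toy stand-in for the 14-layer evaluation."""
    profile = {"KO": 0.4, "AV": 0.6}  # placeholder profile
    record = {
        "input_sha256": hashlib.sha256(text.encode()).hexdigest(),
        "ts": time.time(),
    }
    return GateResponse("LOW", profile, 0.92, record)

resp = asdict(evaluate("hello"))
assert set(resp) == {"risk_tier", "tongue_profile", "confidence", "audit"}
```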

Post-Quantum Cryptography

Library

ML-KEM-768 (Kyber)

NIST FIPS 203 key encapsulation. 1,184-byte public keys. Quantum-resistant key exchange built into the governance pipeline.

FIPS 2031184-byte keys
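The fixed ML-KEM-768 sizes are useful for validating buffers at a transport boundary. The figures below are the FIPS 203 parameter sizes; the helper function is just an illustrative length check.

```python
# FIPS 203 parameter sizes for ML-KEM-768, in bytes.
ML_KEM_768 = {
    "encapsulation_key": 1184,   # public key
    "decapsulation_key": 2400,   # private key
    "ciphertext": 1088,
    "shared_secret": 32,
}

def check_public_key(pk: bytes) -> None:
    """Reject keys that are not exactly 1,184 bytes."""
    if len(pk) != ML_KEM_768["encapsulation_key"]:
        raise ValueError(f"expected 1184-byte ML-KEM-768 key, got {len(pk)}")

check_public_key(b"\x00" * 1184)   # correct length passes
```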
Library

ML-DSA-65 (Dilithium)

NIST FIPS 204 digital signatures. 3,309-byte signatures. Signs governance decisions, audit records, and ledger entries.

FIPS 204 3,309-byte sigs
Library

AES-256-GCM Envelopes

Symmetric encryption with authenticated data. Wraps governance packets for transport between agents. Replay protection with timestamp validation.

AEAD Replay protection
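The replay-protection half can be sketched with the stdlib alone. The 30-second freshness window, the MAC construction, and the seen-digest cache are illustrative choices, not the shipped envelope format.

```python
# Stdlib sketch of timestamp-validated replay protection.
import hashlib
import hmac
import time

SEEN: set[bytes] = set()
WINDOW_S = 30  # assumed freshness window

def accept(packet: bytes, mac: bytes, ts: float, key: bytes, now=None) -> bool:
    """Reject stale, tampered, or replayed packets."""
    now = time.time() if now is None else now
    if abs(now - ts) > WINDOW_S:
        return False                       # stale: outside freshness window
    expect = hmac.new(key, packet + str(ts).encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expect):
        return False                       # tampered
    if expect in SEEN:
        return False                       # replayed
    SEEN.add(expect)
    return True

key = b"k" * 32
ts = 1_000_000.0
mac = hmac.new(key, b"msg" + str(ts).encode(), hashlib.sha256).digest()
assert accept(b"msg", mac, ts, key, now=ts + 1)       # fresh: accepted
assert not accept(b"msg", mac, ts, key, now=ts + 1)   # same packet: replay
assert not accept(b"msg", mac, ts, key, now=ts + 99)  # too old
```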

Multi-Agent Tools

Framework

HYDRA Fleet Orchestrator

Spawn, coordinate, and govern fleets of AI agents. Balanced ternary consensus, role assignment (leader/validator/executor/observer), health monitoring.

Swarm demo →
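Balanced ternary consensus can be sketched in a few lines: each agent votes -1 (reject), 0 (abstain), or +1 (approve), and the sign of the sum decides. Treating a zero sum as "no consensus" is an assumption about the tie-breaking rule.

```python
# Toy balanced ternary consensus: sign of the vote sum.
def consensus(votes: list[int]) -> int:
    """Return +1 (approve), -1 (reject), or 0 (no consensus)."""
    assert all(v in (-1, 0, 1) for v in votes), "votes must be balanced ternary"
    total = sum(votes)
    return (total > 0) - (total < 0)

assert consensus([1, 1, -1, 0]) == 1    # approved
assert consensus([-1, -1, 1]) == -1     # rejected
assert consensus([1, -1, 0]) == 0       # no consensus
```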
Agent

AetherBrowse

Governed browser automation agent. Semantic antivirus membrane, tongue-weighted Dijkstra routing, and prompt injection detection for web tasks.

Browser demo →
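Tongue-weighted routing reduces to Dijkstra over a page graph whose edge weights carry risk. The graph, weights, and page names below are toy values, not AetherBrowse's actual scoring.

```python
# Standard Dijkstra over a toy page graph with risk-derived edge weights.
import heapq

def dijkstra(graph: dict[str, dict[str, float]], src: str, dst: str) -> list[str]:
    """Return the minimum-weight path from src to dst, or [] if unreachable."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return []

pages = {"home": {"login": 1.0, "search": 0.2},
         "search": {"result": 0.3}, "login": {"result": 2.0}}
assert dijkstra(pages, "home", "result") == ["home", "search", "result"]
```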
Protocol

MCP Server

Model Context Protocol server exposing SCBE tools to any MCP-compatible AI agent. Sacred Tongue encoding, governance scanning, training data ingestion.

MCP 1.26+ Claude Code Copilot
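MCP-compatible clients are typically pointed at a server through a config entry shaped like the fragment below. The server name, command, and args here are placeholders; substitute the package's actual entry point.

```json
{
  "mcpServers": {
    "scbe": {
      "command": "npx",
      "args": ["-y", "scbe-aethermoore"]
    }
  }
}
```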

24 Interactive Demos

Every tool has a browser demo. No installation, no API keys, no signup. Just open and try.

Visualization

Harmonic Wall 3D, Phi-Poincare, Distance Explorer, Energy Landscape, Energy Wells, Embedding Space, Swarm Formations, Attack Radar

Open demo hub →

Interactive Tools

Tongue Encoder, Tongue Heatmap, Risk Calculator, Formula Vault, Governance Gate, Context Fingerprint, Spectral Coherence, Notarize

Try tongue encoder →

Testing & Analysis

Benchmarks, Breathing Phase, Temporal Session, Audio Composition, Audio Telemetry, Small Business Helper

Run benchmarks →

Training Pipeline

Script

Codebase-to-SFT

Converts any source tree into supervised fine-tuning pairs. Extracts code patterns, docstrings, and test cases into governance-labeled JSONL.

python scripts/codebase_to_sft.py --src ./src --out training.jsonl
Pipeline

Label Consolidation

Cleans raw training data: drops unknowns, merges micro-labels into 24 consolidated families, caps class imbalance. 690K noise records eliminated.

python scripts/clean_training_labels.py
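The three cleaning steps can be sketched with the stdlib. The family map and the per-class cap below are illustrative values, not the script's actual 24-family mapping.

```python
# Stdlib sketch: drop unknowns, merge micro-labels, cap class imbalance.
from collections import Counter

FAMILY = {"sqli_blind": "injection", "sqli_union": "injection",
          "xss_dom": "xss", "xss_stored": "xss"}  # illustrative map

def clean(records: list[dict], cap: int = 2) -> list[dict]:
    out, counts = [], Counter()
    for r in records:
        label = FAMILY.get(r["label"])
        if label is None:          # drop unknown / unmapped labels
            continue
        if counts[label] >= cap:   # cap per-class count
            continue
        counts[label] += 1
        out.append({**r, "label": label})
    return out

raw = [{"label": "sqli_blind"}, {"label": "sqli_union"},
       {"label": "xss_dom"}, {"label": "???"}, {"label": "sqli_union"}]
assert [r["label"] for r in clean(raw)] == ["injection", "injection", "xss"]
```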
Notebook

Canonical Colab Training Lane

Use notebooks/scbe_canonical_training_lane_colab.ipynb as the single Colab route for uploading raw exports, normalizing them, and running QLoRA fine-tuning on a free T4 GPU. Keep Kaggle aligned as a secondary notebook surface, and require python scripts/system/kaggle_notebook_smoke.py --micro-train to pass before any long run.

Dataset on HF →