SCBE AETHERMOORE
Automation

67 workflows. From commit to production.

14 n8n orchestration workflows and 53 GitHub Actions covering CI/CD, content publishing, training pipelines, multi-cloud deployment, and governance scanning. Every workflow is governed.

14
n8n workflows
53
GitHub Actions
149
Automation scripts
5
Cloud targets

n8n Orchestration Workflows

Self-hosted n8n with a FastAPI bridge for governance scanning, Sacred Tongue encoding, and training data ingestion.

User/Scheduler → n8n Trigger → FastAPI Bridge (port 8001) → Governance Scan → Execute → Log to Airtable/HF
n8n

Content Publisher

Multi-platform content publishing with governance scanning. Publishes to nine platforms including Medium, LinkedIn, Dev.to, X/Twitter, Bluesky, GitHub, and HuggingFace. Every post passes through the governance gate before publishing.

9 platforms · Governance gate · Rate limiting
n8n

Web Agent Task Orchestrator

Dispatches autonomous web research tasks to the AetherBrowse agent fleet. Handles URL extraction, content summarization, and training data generation from web sources.

Web research · Auto-summarize · SFT generation
n8n

M5 Mesh Data Funnel

Ingests data from Notion, Dropbox, and local sources. Runs governance checks on every record, syncs to Airtable, and publishes governed datasets to HuggingFace.

Notion · Dropbox · Airtable · HuggingFace
n8n

AI-to-AI Research Debate

Orchestrates multi-model debates where different AI models argue positions on research questions. Generates DPO training pairs from the debate transcripts.

Multi-model · DPO pairs · Research
n8n

Vertex AI + HuggingFace Pipeline

Bidirectional sync between Google Vertex AI and HuggingFace. Trains on Vertex, deploys to HF. Governance gate validates model artifacts before promotion.

Vertex AI · HuggingFace · Model sync
n8n

X Growth + Merch Ops

Social media growth automation with merchandise operations. Scheduled posts, engagement tracking, and merch inventory sync.

X/Twitter · Engagement · Merch
Bridge

FastAPI Bridge Server

Connects n8n to the SCBE governance engine. Exposes /health, /v1/governance/scan, /v1/tongue/encode, /v1/agent/task, /v1/training/ingest, and /v1/vertex/push-to-hf endpoints.

python -m uvicorn workflows.n8n.scbe_n8n_bridge:app --port 8001
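Once the bridge is running, n8n (or any HTTP client) talks to it over plain JSON. A minimal stdlib-only sketch of calling the governance scan endpoint — the request body shape ({"content": ...}) is an assumption here; check the bridge's actual schema before relying on it:

```python
import json
import urllib.request

BRIDGE_URL = "http://localhost:8001"  # the uvicorn command above binds port 8001

def build_scan_request(text: str) -> urllib.request.Request:
    """Build a POST request for /v1/governance/scan.

    The JSON payload key ("content") is a guess at the schema,
    not taken from the bridge's source.
    """
    body = json.dumps({"content": text}).encode("utf-8")
    return urllib.request.Request(
        f"{BRIDGE_URL}/v1/governance/scan",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires the bridge to be running):
# with urllib.request.urlopen(build_scan_request("draft post text")) as resp:
#     print(json.load(resp))
```

The same pattern applies to the other endpoints (/v1/tongue/encode, /v1/agent/task, and so on); only the path and payload change.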

CI/CD Pipeline (53 GitHub Actions)

Core

Build + Test + Lint

Main CI pipeline: TypeScript compile, Vitest (5,957 tests), pytest (785 tests), Prettier, flake8, circular dependency check. Runs on every push and PR.

Security

Security Gates

Vulnerability scanning, conflict marker detection, weekly automated security audits. PQC key rotation checks and dependency auditing.

Publish

Auto-Publish

Automated npm and Docker publishing on tagged releases. Dry-run verification, pre-publish safety checks, and changelog generation.

Deploy

Multi-Cloud Deploy

AWS Lambda, EKS, GKE, and Cloud Run deployment workflows. Docker image building and registry pushing included.

Nightly

Automated Checks

Nightly connector health, multicloud training, daily ops, social media updates, and review workflows. Runs on schedule, not on push.

Sync

Platform Sync

HuggingFace model sync, Notion integration, cloud kernel data pipeline, and Vertex AI training triggers.

Training Pipelines

Raw Data (824K JSONL records)
  ↓ clean_training_labels.py (drop 690K unknowns, consolidate 254 → 24 labels)
Cleaned Data (41,703 samples, 24 categories)
  ↓ scbe_canonical_training_lane_colab.ipynb (one-notebook upload → normalize → QLoRA route on Colab T4)
  ↓ finetune_qwen_governance.ipynb (QLoRA with Colab-first T4 lane; Kaggle aligned second after kaggle_notebook_smoke.py passes)
Fine-Tuned Model (Qwen2.5-0.5B + LoRA adapter)
  ↓ push_to_hub (HuggingFace)
Production Model (issdandavis/scbe-governance-qwen-0.5b)
Data

Multi-Source Ingestion

Ingests from Notion, Obsidian, Dropbox, GitHub, web research, and game session logs. Deduplicates, labels, and validates before entering the training pool.

Quality

Label Consolidation

254 raw labels consolidated to 24 families. 690,479 "unknown" records dropped. Class imbalance capped at 5,000 samples per label. Minimum 10 samples per label.
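The consolidation logic above (drop unknowns, map raw labels to families, cap at 5,000 and floor at 10 per label) can be sketched in a few lines. This is an illustrative reconstruction, not the actual clean_training_labels.py; the label map here is a hypothetical two-entry stand-in for the real 254-to-24 mapping:

```python
from collections import Counter

# Hypothetical stand-in for the real 254 -> 24 label-family map.
LABEL_MAP = {"pii_email": "pii", "prompt_injection_v2": "prompt_injection"}
MAX_PER_LABEL = 5_000   # cap class imbalance
MIN_PER_LABEL = 10      # drop ultra-rare labels

def consolidate(records):
    """Drop 'unknown' rows, collapse raw labels to families, cap and floor counts."""
    kept, counts = [], Counter()
    for rec in records:
        label = LABEL_MAP.get(rec["label"], rec["label"])
        if label == "unknown":
            continue                      # 690K of these dropped in the real run
        if counts[label] >= MAX_PER_LABEL:
            continue                      # cap: at most 5,000 samples per label
        counts[label] += 1
        kept.append({**rec, "label": label})
    # Floor: labels with fewer than 10 samples are removed entirely.
    return [r for r in kept if counts[r["label"]] >= MIN_PER_LABEL]
```

Applied to the raw 824K-record JSONL dump, this style of pass yields the 41,703-sample, 24-category set the pipeline trains on.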

Train

QLoRA Fine-Tuning

Start with scbe_canonical_training_lane_colab.ipynb when you want one Colab surface for upload, normalization, and QLoRA fine-tune. Keep Kaggle aligned to the same contract as a secondary lane, not a CPU-first fallback. Run python scripts/system/kaggle_notebook_smoke.py --micro-train before any long Kaggle job.
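The pre-flight smoke check can be wired in as a hard gate. A minimal sketch, assuming only that the script exits nonzero on failure (the gate function itself is hypothetical, not part of the repo):

```python
import subprocess
import sys

def kaggle_smoke_passes() -> bool:
    """Run the micro-train smoke check; a nonzero exit code fails the gate.

    Hypothetical wrapper around the repo's kaggle_notebook_smoke.py script.
    """
    result = subprocess.run(
        [sys.executable, "scripts/system/kaggle_notebook_smoke.py", "--micro-train"],
        capture_output=True,
    )
    return result.returncode == 0

# Guard a long Kaggle job behind the gate:
# if not kaggle_smoke_passes():
#     sys.exit("smoke check failed; aborting Kaggle run")
```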

Deployment Targets

Docker

Container Stack

Multi-stage Dockerfile: Node 20 (TS compile) → liboqs (PQC) → Python 3.11 → runtime. Ports 8080 (API) + 3000 (gateway). Docker Compose for full stack.

Kubernetes

K8s Manifests

Pre-built manifests for EKS (AWS), GKE (Google), and generic Kubernetes. Includes service accounts, RBAC, and ingress configs.

Serverless

Cloud Functions

AWS Lambda and Google Cloud Run deployment configs. Trust policy JSON for IAM roles. Stateless governance evaluation on demand.