My first test build of the MCP (Master Control Program) is live. It’s simple, raw, and local-first but I officially have a working dashboard, secure routing, and NKN messaging between machines. The system is starting to form its own nervous system.
Today’s work pulled together several key parts of the sovereign stack: decentralized messaging, HTTPS routing, reverse proxies, and service aggregation through a dashboard. While this is still foundational, it sets the tone for the kind of command center I want...fast, private, modular.
I installed nkn-client and created a lightweight listener on each node. One on the server:
// Server node: join the NKN network under a named identifier
const NKNClient = require('nkn-client');
const client = new NKNClient({ identifier: 'server-node' });
// Log the client's full NKN address once it connects
client.on('connect', () => console.log('Connected:', client.addr));
// Print anything another node on the mesh sends us
client.on('message', (src, payload) => console.log(`From ${src}: ${payload}`));
And another on LoreForge using lore-node as the identifier. Messaging works both ways, a small but powerful confirmation that my mesh is functional.
Configured the following A records on Netlify:
server.arynwood.com → xxx.xxx.x.xxx
n8n.arynwood.com → xxx.xxx.x.xxx
This lets me separate services across named endpoints, a key step for readability and future cert renewals.
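A quick way to confirm the records have propagated, sketched with Node's built-in dns module (nothing project-specific here; dig or nslookup works just as well):

// Confirm the new subdomains resolve, using Node's built-in dns module
const { resolve4 } = require('dns').promises;

for (const host of ['server.arynwood.com', 'n8n.arynwood.com']) {
  resolve4(host)
    .then((addrs) => console.log(`${host} -> ${addrs.join(', ')}`))
    .catch((err) => console.error(`${host} did not resolve:`, err.code));
}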
I installed NGINX and configured reverse proxies for both the base dashboard and n8n. Then I used Certbot to issue Let's Encrypt SSL certificates.
# Example config for n8n reverse proxy
server {
    listen 80;
    server_name n8n.arynwood.com;

    location / {
        proxy_pass http://localhost:5678;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
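The Upgrade and Connection headers matter here: they allow WebSocket connections to upgrade through the proxy, which the n8n editor relies on for live execution updates.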
Ran sudo certbot --nginx -d n8n.arynwood.com and then the same for server.arynwood.com; everything passed.
Here’s the barebones HTML index served at /var/www/html/index.html:
<h1>Arynwood MCP Dashboard</h1>
<ul>
<li><a href="https://n8n.arynwood.com">n8n Workflows</a></li>
<li><a href="https://ollama.arynwood.com">Ollama LLM</a></li>
<li><a href="/whisper-ui">Whisper Transcriber</a></li>
<li><a href="/forge">LoreForge GPU Panel</a></li>
<li><a href="/bots">MCP Bots</a></li>
</ul>
I kept it simple on purpose; future versions will include dynamic routing and authentication.
I wired a test workflow with a Webhook node pointing to /mistral:

- /webhook-test/mistral worked as expected.
- /webhook/mistral returns success.

Tested it from the command line:

curl -X POST https://n8n.arynwood.com/webhook/mistral \
  -H "Content-Type: application/json" \
  -d '{"prompt":"Your prompt here"}'
At this stage, it's not about polish... it's about wiring the right neurons. This setup proves the basic infrastructure works: secure routing, message transport, service orchestration. The MCP is still embryonic, but it’s alive.
Next up: integrating Ollama directly into workflows, building auth around dashboards, and giving this system a face.
Lorelei Noble