DevLog 250624 MCP Status Update with Automatic1111 & Ollama Serve

> Log Date: 2025-06-25

Spent today advancing the MCP infrastructure by installing Automatic1111, confirming Ollama model access, and refining persona response behavior with Estra.

I made strong forward movement today on the MCP stack. A new repo for aryncore-mcp has been established to contain modular tools, handler scripts, and persona logic. This repo is now the backbone for my orchestrated LLM system. Ollama is functioning, models are available, and Stable Diffusion via Automatic1111 is in progress.


Ollama Serve + Persona Orchestration

Ran the main entrypoint:

python3 -m backend.mcp_orchestrator

Confirmed Ollama is running and listening at :11434. Queried models using:

curl http://localhost:11434/api/tags

Available models include mistral, llama3:8b-instruct, and codellama. An LLM error surfaced during a persona interaction; it traced back to a transient connection issue, and I verified it was not firewall-related (UFW is inactive).
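
For the health check I want to fold into the orchestrator, here is a minimal Python sketch against Ollama's documented /api/tags and /api/generate endpoints, with a simple retry to ride out transient connection drops. The function names and retry policy are my own placeholders, not anything that exists in aryncore-mcp yet:

import time
import requests

OLLAMA_URL = "http://localhost:11434"

def list_models():
    # GET /api/tags lists the models Ollama has pulled locally
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]

def generate(prompt, model="mistral", retries=3):
    # POST /api/generate with streaming disabled returns a single JSON object;
    # retry a few times to ride out transient connection drops
    for attempt in range(retries):
        try:
            resp = requests.post(
                f"{OLLAMA_URL}/api/generate",
                json={"model": model, "prompt": prompt, "stream": False},
                timeout=120,
            )
            resp.raise_for_status()
            return resp.json()["response"]
        except requests.ConnectionError:
            time.sleep(2 ** attempt)
    raise RuntimeError("Ollama unreachable after retries")

if __name__ == "__main__":
    print(list_models())
    print(generate("Respond as Estra with a one-line status check."))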


SSH and Port Forwarding

Attempted to forward Ollama’s port over SSH:

ssh -L 11434:localhost:11434 user@remote

The tunnel could not bind because port 11434 is already in use locally (presumably by the local Ollama daemon); remapping the remote service to a free local port, e.g. 11435:localhost:11434, should avoid the clash. Confirmed the conflict with:

ss -tuln | grep 11434

Estra Persona Enhancement

Estra now prompts the user with a mission-oriented questionnaire. Next, I plan to introduce a persistent memory prefix for each response: prototype it with hardcoded headers first, then migrate to dynamic prefix injection per LLM response.
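
A rough sketch of how the two phases could look in the persona handler. Everything here is illustrative; the header text, function names, and session-state shape are placeholders rather than existing aryncore-mcp code:

# Phase 1: hardcoded header prepended to every prompt sent to the LLM.
ESTRA_PREFIX = "[Estra | mission focus: MCP build-out | keep user goals and open tasks in view]\n"

def with_memory_prefix(user_prompt: str) -> str:
    # Hardcoded prototype: the same header on every request
    return ESTRA_PREFIX + user_prompt

# Phase 2: derive the prefix from persisted session state instead of a constant.
def build_dynamic_prefix(session_state: dict) -> str:
    goals = ", ".join(session_state.get("goals", [])) or "unset"
    return f"[Estra | mission focus: {goals}]\n"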


Automatic1111 Stable Diffusion Setup

No previous installation found. Followed setup instructions:


sudo apt install wget git python3 python3-venv libgl1 libglib2.0-0
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui
mkdir -p models/Stable-diffusion/
bash webui.sh

For LAN access, use:

COMMANDLINE_ARGS="--listen" bash webui.sh
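
Once the WebUI is running, the orchestrator should be able to drive it over HTTP via the built-in API (launch with --api, which can be combined with --listen). A minimal sketch against the standard /sdapi/v1/txt2img endpoint; the prompt, steps, and output filename are illustrative:

import base64
import requests

A1111_URL = "http://localhost:7860"  # WebUI default port

def txt2img(prompt: str, steps: int = 20) -> bytes:
    # POST to the built-in API; webui.sh must be started with --api
    resp = requests.post(
        f"{A1111_URL}/sdapi/v1/txt2img",
        json={"prompt": prompt, "steps": steps},
        timeout=300,
    )
    resp.raise_for_status()
    # The API returns base64-encoded PNGs in the "images" list
    return base64.b64decode(resp.json()["images"][0])

if __name__ == "__main__":
    with open("estra_test.png", "wb") as handle:
        handle.write(txt2img("isometric server rack, soft neon lighting"))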

Next Actions

Drop a checkpoint into models/Stable-diffusion/ and finish the Automatic1111 first run, prototype the hardcoded Estra memory prefix before moving to dynamic injection, rerun the SSH tunnel against a free local port, and track down the transient Ollama connection error seen during persona calls.

Repository: skyevault/aryncore-mcp

> Written and deployed by Lorelei Noble
