Currently Working On

Local LLM deployment and configuration

Configuring a local LLM using Ollama, installed on an old gaming machine that I wiped clean and set up with Debian 12. The intention is to leverage the API provided by Ollama to create local agents that assist Abby and me with our work and day-to-day tasks.

I am hoping to use this experience to show clients how they could set up a local LLM within their organization. These LLMs could handle specialized initiatives like drafting product documentation or automatically compiling consumable release notes.
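As a rough illustration of what those agents would build on, here is a minimal sketch of calling Ollama's local HTTP API from Python. It assumes Ollama is running on its default port (11434) with a model already pulled; the model name "llama3" is an assumption, not necessarily what I'm running.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama3") -> str:
    # Send the prompt to the local server and return the generated text.
    payload = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires a running Ollama instance):
# print(generate("Summarize today's release notes in three bullets."))
```

An agent for something like release notes would wrap calls like this with a prompt template and whatever source data (commit logs, tickets) the task needs.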

MCP Experimentation

Over the last few weeks I have been experimenting with MCP and Google's equivalent, the A2A protocol. I am keeping notes as I go and plan to publish them soon.

Contact Me

Thank you for visiting my site. I hope you found something useful or interesting. Please use this form to send feedback, ask a question, or just to connect. Have a wonderful day!