The Applied Go Weekly Newsletter

August 11, 2025

Blue Gopher Meets Stochastic Parrot • The Applied Go Weekly Newsletter 2025-08-10

Your weekly source of Go news, tips, and projects

Blue Gopher Meets Stochastic Parrot

Hi ,

Geez, the previous Spotlight was a bit long and code-heavy (for a Spotlight), wasn't it? Even though this newsletter is in 🌴 "Summer Break" 🌴 mode and basically consists of a Spotlight section only...

So for this issue, I'll refrain from posting any code and instead look at packages, tools, and apps that help marry Go to AI (continuing the summer break's topic "AI and LLMs").

Speaking of LLMs, just a few days ago, OpenAI announced GPT-5. If you tried it already (or tried to try it) via the API, you may have noticed that OpenAI demands biometric identification before letting you talk to the model, for no apparent reason. Luckily, there is a way around it: the identification is only required for streaming responses. Switch streaming mode off in your LLM client and enjoy GPT-5 responses without exposing private information.

After I did so in Open WebUI (Workspace > Models > Edit Model > Advanced Parameters > Streaming Mode: Off), I asked five LLMs, including GPT-5, a question inspired by a GPT-5 hands-on article:

"What is your favorite obscure fact in the world? Use as few words as possible."

Here is how they answered:

GLM-4.5: "Wombats poop cubes."

Kimi K2: "Octopuses have three hearts; two pump blood to the gills, one to the body. Blue blood."

GPT-OSS 120B: "Bananas are berries; strawberries aren’t."

GPT-5 mini: 'Turritopsis dohrnii can revert to juvenile form—biologically "immortal."'

I knew that fact about wombats, and the other three sounded plausible and were quickly verified with Wikipedia. But GPT-5's answer was kind of unexpected:

GPT-5: "Humans glow in the dark."

Holy mackerel, what has GPT-5 been smoking?

It turned out that GPT-5's answer isn't wrong at all; it's just slightly exaggerated. Humans do emit light, but at an intensity too faint to see with the naked eye. You'd need ultrasensitive cameras in complete darkness to record the ultra-weak "biophoton" emission from human bodies, caused by oxidative reactions.

Spotlight: About Go and AI

Go code can make use of LLMs and LLMs can make use of Go code, as I demonstrated in the previous two Spotlights. Because I wanted to see the concepts of LLM calling and MCP server clearly in front of me, I decided to use no library, SDK, or framework. These would only obscure what's going on behind the scenes. Pure, dependency-free Go code, on the other hand, reveals the underlying mechanics of Go-LLM communication.

This doesn't mean that packages, SDKs, or frameworks are useless. They can hide boilerplate code behind convenient APIs and save developers from reinventing the wheel. This Spotlight is about them.

The Gopher and the Parrot

You may have heard the term "stochastic parrots" being used for LLMs, as they statistically mimic thinking and reasoning without real understanding. Still, they're capable of tasks that few would have deemed possible before the release of GPT-3.5 attracted worldwide attention.

With Go, you can tap the capabilities of LLMs as well as provide extra functionality to them. Moreover, many useful applications and tools are written in Go (which, from a pure user perspective, isn't thaaaat interesting, but as a Gopher, you can read the sources and maybe participate in the development in one way or another).

The following list is by no means exhaustive; it's meant to serve as an entry point for digging deeper.

Libraries

I'll start with some libraries that you can use to enrich your apps with AI capabilities or build AI tools.

LangChainGo

The first one, LangChainGo, is a classic. Created in 2023 (which is quite a while back, considering the rather short (public) history of LLMs) as a port of the Python project LangChain, LangChainGo enables Go applications to integrate with LLMs, process documents for LLM use, and develop agentic tools and workflows.

LangChainGo predates the model context protocol (MCP) but builds upon a similar idea: to chain (hence the name) LLMs, tools, and processing steps together to achieve what we would call today an agentic workflow. Built-in chains can call LLMs, chain steps together, do map-reduce operations to process large documents, chain LLM conversations together to build conversation memory, or provide document retrieval. A wealth of specialized components help with prompting, integrating models and embeddings, parsing output, splitting data into smaller chunks, retrieving text from vector databases, and constructing agents.

LangChainGo is a good choice if you seek LangChain-style composability or if you want to port a LangChain project over to Go.

tmc/langchaingo: LangChain for Go, the easiest way to write LLM-based programs in Go
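To give a feel for the API, here's a minimal sketch of a single LLM call with LangChainGo, loosely following the project's examples (the OpenAI backend reads OPENAI_API_KEY from the environment; helper names may shift between releases):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	ctx := context.Background()

	// Create an OpenAI-backed model client. The API key is taken
	// from the OPENAI_API_KEY environment variable.
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	// Single-prompt convenience call; chains and agents build on
	// the same llms.Model interface.
	completion, err := llms.GenerateFromSinglePrompt(ctx, llm,
		"What is a stochastic parrot? Answer in one sentence.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(completion)
}
```

Swapping the backend (Ollama, Anthropic, and others) means swapping the constructor while the rest of the code stays the same.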

The MCP Go SDK

When building LLM agents today, there is no way around the Model Context Protocol, or MCP for short. Although made by a single company (Anthropic), it's open-source and has been adopted across vendors. Anthropic offers SDKs for multiple languages, Go included. The Go SDK is still in an early phase, however, so expect breaking changes and a fluctuating API. Still, for future projects, the official SDK is a robust choice.

modelcontextprotocol/go-sdk: The official Go SDK for Model Context Protocol servers and clients. Maintained in collaboration with Google.

mark3labs/mcp-go

If you want an MCP library that's more stable than the official SDK, mark3labs/mcp-go is a popular package and certainly a good alternative to the official SDK while it's in its pre-1.0 phase. The author of mcp-go, Ed Zynda, was added to the Anthropic MCP steering committee back in May, which underscores mcp-go's central position in the Go MCP landscape. Moreover, the official Go SDK is heavily inspired by mcp-go (alongside other SDKs such as mcp-golang and go-mcp).

mark3labs/mcp-go: A Go implementation of the Model Context Protocol (MCP), enabling seamless integration between LLM applications and external data sources and tools.
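As a taste of what tool registration looks like, here's a hedged sketch of a minimal mcp-go stdio server with one tool, adapted from the project's quickstart (helper names such as RequireString may vary across pre-1.0 releases):

```go
package main

import (
	"context"
	"fmt"

	"github.com/mark3labs/mcp-go/mcp"
	"github.com/mark3labs/mcp-go/server"
)

func main() {
	// An MCP server identifies itself with a name and a version.
	s := server.NewMCPServer("demo", "0.1.0")

	// Declare a tool with a typed, required parameter. The LLM uses
	// the descriptions to decide when and how to call the tool.
	tool := mcp.NewTool("greet",
		mcp.WithDescription("Greet someone by name"),
		mcp.WithString("name", mcp.Required(), mcp.Description("Who to greet")),
	)

	s.AddTool(tool, func(ctx context.Context, req mcp.CallToolRequest) (*mcp.CallToolResult, error) {
		name, err := req.RequireString("name")
		if err != nil {
			return mcp.NewToolResultError(err.Error()), nil
		}
		return mcp.NewToolResultText(fmt.Sprintf("Hello, %s!", name)), nil
	})

	// Serve over stdio so MCP hosts (like mcphost below) can spawn
	// and talk to this server as a subprocess.
	if err := server.ServeStdio(s); err != nil {
		fmt.Println(err)
	}
}
```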

Related to mcp-go is mark3labs/mcphost. An MCP Host brings LLMs, MCP clients, and MCP servers together. Being a CLI tool, it's configured through JSON config files and can run interactively or in batch mode.

mark3labs/mcphost

Dive

Dive by Deep Noodle (I love the company name! Here's where it originates from) is a modern AI/LLM toolkit. If you want to build agent frameworks (including hierarchical multi-agent systems) and workflow automation with multiple LLM backends and streaming support, Dive is worth a closer look.

(BTW, Deep Noodle are also the makers of Risor, an embeddable scripting language for Go.)

deepnoodle-ai/dive: Dive is an AI toolkit for Go that can be used to create specialized AI agents, automate workflows, and quickly integrate with the leading LLMs.

Chromem

If you want to build AI apps with retrieval-augmented generation (RAG) or other embedding-based features, you need a vector database. Chromem is an alternative to stand-alone vector databases, as you can embed it into your app—no extra running binary required. Moreover, Chromem has zero dependencies. Think "The SQLite of vector databases."

philippgille/chromem-go: Embeddable vector database for Go with Chroma-like interface and zero third-party dependencies. In-memory with optional persistence.
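Here's a rough sketch of the embed-and-query flow, loosely following the chromem-go README. Note that the default embedding function calls OpenAI's embeddings API, so OPENAI_API_KEY must be set (or pass your own EmbeddingFunc, e.g. one backed by a local Ollama model):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"runtime"

	chromem "github.com/philippgille/chromem-go"
)

func main() {
	ctx := context.Background()

	// In-memory database; NewPersistentDB would write to disk instead.
	db := chromem.NewDB()

	// A nil embedding func selects the default (OpenAI embeddings API).
	c, err := db.CreateCollection("knowledge-base", nil, nil)
	if err != nil {
		log.Fatal(err)
	}

	// Adding documents computes and stores their embeddings,
	// parallelized across the given number of goroutines.
	err = c.AddDocuments(ctx, []chromem.Document{
		{ID: "1", Content: "The sky is blue because of Rayleigh scattering."},
		{ID: "2", Content: "The Martian sky appears reddish due to iron oxide dust."},
	}, runtime.NumCPU())
	if err != nil {
		log.Fatal(err)
	}

	// Semantic query: returns the most similar document(s).
	res, err := c.Query(ctx, "Why is the sky blue?", 1, nil, nil)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(res[0].Content)
}
```

That's the core of a RAG pipeline: feed the returned documents into the LLM prompt as context.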

Applications

Libraries and SDKs aren't the only way to make your AI-development life easier. Applications can provide infrastructure and tooling as an alternative to hosted services.

Ollama

For running LLMs on consumer hardware, Ollama is a great choice. With Ollama, installing and running an LLM locally is like pulling and running Docker images. With a decent choice of open-source LLMs, most offered in several sizes to match the available hardware, you can spend hours playing with models and watching your task monitor as the GPU hits 100%. The OpenAI-compatible /chat/completions API makes Ollama accessible to LLM clients that speak the OpenAI protocol.

ollama/ollama: Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models.
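A quick taste of the workflow (the model name is just an example; pick any model from the Ollama library that fits your hardware):

```shell
# Pull and chat with a model, Docker-style
ollama pull gemma3
ollama run gemma3 "Why is the sky blue?"

# Or talk to the OpenAI-compatible endpoint with any HTTP client
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemma3", "messages": [{"role": "user", "content": "Hi"}], "stream": false}'
```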

LocalAI

LocalAI, together with its sister projects LocalAGI and LocalRecall, goes beyond pure text LLMs. LocalAI aims at providing a drop-in replacement for OpenAI’s API backed by locally running models. LocalAGI adds support for building no-code agents, and LocalRecall manages knowledge bases stored in local vector databases powered by Chromem (see above).

A unique selling point for LocalAI is its support of multimodal LLMs. Whether you want to generate audio from text, generate or analyze images, recognize speech, detect voice activity, or generate videos, LocalAI can run the required models and orchestrate your workflow.

mudler/LocalAI: 🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference

Crush

A local, terminal-based, open source, no-strings-attached tool for AI assisted coding, written in Go? Meet Crush, the AI coding TUI app that you can connect to LLMs and language servers of your choice (gopls comes preconfigured, yay) and let the LLM hack away.

Crush came into existence as an offspring of OpenCode after Charm hired OpenCode's main developer. OpenCode continues as a separate project under a new GitHub org. There has been some confusion about the two projects, as Charm kept the original name for a while, leaving two identically named projects living side by side. But the name "OpenCode" wouldn't have been a good fit for Charm anyway, who name their software "Bubble Tea", "Lip Gloss", "Huh", "Glamour", and "VHS", to list a few. So eventually, OpenCode became Crush.

Anyway, I tested Crush and I love it. Hope it'll help me crush(*) development times for the stockpile of apps I have in my head...

(*) haha, silly pun, haha, yes, pun intended, lol

charmbracelet/crush: The glamourous AI coding agent for your favourite terminal 💘

The Final Verdict™

Outside the realm of AI science and research (where researchers seem to love Python and Jupyter Notebooks to pieces), AI becomes increasingly language-agnostic, and Go, being a perfect glue language, steadily grows into a first-class AI apps language.

The above list of Go AI libraries and tools should pave the way for exploring AI with Go. If you try out one or more of these and suddenly find that time flew by in an instant... well, it wasn't me!! No, no, certainly not!

Happy coding! ʕ◔ϖ◔ʔ

Questions or feedback? Drop me a line. I'd love to hear from you.

Best from Munich, Christoph

Not a subscriber yet?

If you read this newsletter issue online, or if someone forwarded it to you, subscribe to get every new issue earlier than the online version, and more reliably than by occasional forwarding.

Find the subscription form at the end of this page.

How I can help

If you're looking for more useful content around Go, here are some ways I can help you become a better Gopher (or a Gopher at all):

On AppliedGo.net, I blog about Go projects, algorithms and data structures in Go, and other fun stuff.

Or visit the AppliedGo.com blog and learn about language specifics, Go updates, and programming-related stuff. 

My AppliedGo YouTube channel hosts quick tip and crash course videos that help you get more productive and creative with Go.

Enroll in my Go course for developers, which stands out for its extensive use of animated graphics to explain abstract concepts in an intuitive way. Numerous short and concise lectures let you schedule your learning flow as you like.

Check it out.


Christoph Berger IT Products and Services
Dachauer Straße 29
Bergkirchen
Germany
