GenAI in a box: Docker, Neo4j, LangChain, and Ollama offer new stack to developers

A new, free GenAI stack aimed at making it easy for developers to get started has been previewed at DockerCon, underway in Los Angeles, with components from Docker, Neo4j, LangChain, and Ollama.

“There’s a lot of excitement around GenAI, and developers keep asking us: how do I get started? We also hear that they want to be able to experiment locally without having to get OpenAI keys, and they want to try with their company data without putting that in the cloud,” said Docker CEO Scott Johnston, speaking to DevClass. “We’ve defined this GenAI stack which addresses many use cases, and containerised each of these different services … all the developers have to do is docker compose up, which stands up the stack, and they’re good to go.”

Alongside Docker technology for running containers, Neo4j provides a graph database, LangChain a framework for working with language models, and Ollama a runtime for downloading and running models, plus a set of open source models to run.
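The single-command experience Johnston describes implies a Compose file wiring these services together. The sketch below is an illustrative assumption of what such a file might look like: the service names, build paths, and image tags are hypothetical, not the stack’s actual manifest, though the ports shown are the standard defaults for Ollama and Neo4j.

```yaml
# Hypothetical sketch of a GenAI-stack style compose file.
# Service names and build paths are assumptions for illustration.
services:
  llm:
    image: ollama/ollama:latest      # Ollama runtime, serving models such as Llama 2 locally
    ports:
      - "11434:11434"                # Ollama's default API port
  database:
    image: neo4j:5-community         # Neo4j Community Edition, running locally
    ports:
      - "7474:7474"                  # Neo4j browser
      - "7687:7687"                  # Bolt protocol for drivers
  bot:
    build: ./support-bot             # LangChain-based starter app (hypothetical path)
    depends_on:
      - llm
      - database
```

With a file along these lines in place, running docker compose up brings up all three services together, which is the one-command workflow described above.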

Two starter applications are provided, one a support agent bot and the other a Python coding assistant. “The LLM that’s in the box is Llama 2 because it’s open source and available,” said Johnston. “But if [developers] want to swap out that LLM (Large Language Model) for their own, or bring their own data to it, they can.”

A benefit of Ollama, Johnston told us, is that “it’s an abstraction across the LLMs,” making it easier to swap one for another.

Why Neo4j? Sudhir Hasbe, chief product officer at Neo4j, speaking to DevClass, said that “graphs are really good at implicit or explicit relationships. You know exactly the relationship between all your entities, and therefore whenever you’re making decisions, it’s much easier to know exactly why you arrived at a specific decision.

“You can absolutely go store a relational database with many tables and foreign keys and all of that, but then you have to stitch that together when an application is being built, and every time the traversal changes you will have to rebuild that kind of complex query, versus in graphs, it’s pretty natural.”
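Hasbe’s point can be made concrete with a toy example. The sketch below is plain Python, not Neo4j or Cypher, and all the names in it are invented for illustration: the same “chain of command” question is answered from relational-style rows with foreign keys (where each extra hop corresponds to another self-join) and from a graph-style structure where the relationship is stored directly.

```python
# Toy illustration (not Neo4j code): answering "who is above Alan?"
# from relational-style rows vs. a graph-style edge structure.

# Relational style: rows with a manager_id foreign key.
# Each additional hop up the hierarchy is another self-join.
employees = [
    {"id": 1, "name": "Ada", "manager_id": None},
    {"id": 2, "name": "Grace", "manager_id": 1},
    {"id": 3, "name": "Alan", "manager_id": 2},
]

def chain_of_command_relational(name):
    """Walk manager_id foreign keys; each iteration mimics a self-join."""
    by_id = {row["id"]: row for row in employees}
    row = next(r for r in employees if r["name"] == name)
    chain = []
    while row["manager_id"] is not None:
        row = by_id[row["manager_id"]]  # the per-hop "join"
        chain.append(row["name"])
    return chain

# Graph style: the REPORTS_TO relationship is stored directly,
# so traversal is just following edges, at any depth.
reports_to = {"Alan": "Grace", "Grace": "Ada"}

def chain_of_command_graph(name):
    chain = []
    while name in reports_to:
        name = reports_to[name]
        chain.append(name)
    return chain

print(chain_of_command_relational("Alan"))  # ['Grace', 'Ada']
print(chain_of_command_graph("Alan"))       # ['Grace', 'Ada']
```

Both return the same answer, but note that changing the traversal depth or direction forces the relational version to restructure its join logic, while the graph version simply keeps following edges — which is the contrast Hasbe is drawing.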

Is Neo4j in the new stack limited so that developers will have to purchase a license beyond a certain point? “What gets deployed with Docker is the Community Edition and it’s a local deployment,” Hasbe said. “They can develop everything. Once they graduate that application to be enterprise grade, they want better SLAs (Service Level Agreements), support, high availability, clustering, then they can purchase the enterprise edition.” Paid-for options include both self-managed and cloud-hosted offerings.

The idea, he told us, is that “developers can try out the new technologies, learn from it on their desktop without having to spend any money, and then figure out their applications and use cases … hopefully the applications go into production and then we will make money.”

What programming languages are supported? The focus is on Python, Hasbe said. “Python has been the most dominant language with LangChain, and that’s where we’ve seen most of the requests come in.”

Along with the GenAI stack, Docker also introduced Docker AI today, a coding assistant for Dockerfiles and Docker Compose files. Don’t the likes of GitHub Copilot already do that? “Our tests are showing that we can get them to a much better state faster, because of the focused nature of our LLM,” claimed Johnston.