An AI that knows you and your business

Cohere's new Command-R model does two things that many AI models do not and that are essential for business: Retrieval Augmented Generation (RAG) and tool use.

It matters that AI has easily updatable memory. Without dynamic memory, AI cannot act sequentially on a plan or get to know you or your business.

Until now, one method of addressing this, besides expensive one-off fine-tuning, has been so-called Retrieval Augmented Generation, or RAG. At the heart of RAG lies the principle of leveraging a repository of information (documents, web pages, or database entries). The relevant documents are retrieved, usually via a search function, and fed into the generative AI's context window. RAG is therefore only as good as its retrieval step, and it is also limited by the size and accuracy of the context window.
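To make that loop concrete, here is a minimal sketch of a RAG pipeline: retrieve the most relevant documents, then pack them into a prompt for the model. The document store, the keyword-overlap scoring, and the prompt format are all illustrative assumptions, not Cohere's implementation.

```python
# Minimal RAG sketch: retrieve the most relevant documents, then feed them
# to the generative model inside its context window. Everything here is illustrative.

DOCUMENTS = [
    {"title": "Returns policy", "text": "Customers may return items within 30 days."},
    {"title": "Shipping", "text": "Orders ship within two business days."},
]

def score(query: str, doc: dict) -> int:
    """Toy relevance score: count query words that appear in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in doc["text"].lower())

def retrieve(query: str, k: int = 2) -> list[dict]:
    """Return the top-k documents by the toy score.
    A real system would use a search index or embedding similarity."""
    return sorted(DOCUMENTS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[dict]) -> str:
    """Concatenate the retrieved documents and the question into one prompt.
    This is the step constrained by the model's context-window size."""
    context = "\n\n".join(f"{d['title']}:\n{d['text']}" for d in docs)
    return f"Answer using only the context below.\n\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How long do I have to return a purchase?", retrieve("return purchase"))
print(prompt)  # this prompt would then be sent to the generative model
```

If the retriever surfaces the wrong documents, or the relevant passage falls outside the window, the model has nothing useful to ground its answer on, which is why the next two points matter.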

Cohere claims its model, Command-R, features a longer context length, supporting up to 128k tokens (about 80,000 words in English), and that, like Google's Gemini, it scores perfectly on a needle-in-a-haystack test, in which the model must find a piece of information placed anywhere in a long document.

GPT-4's and Claude's context windows, by comparison, are lossy, missing much of what sits in the middle of long documents.

Command-R can be accessed via an API or self-hosted.
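As a rough sketch of the hosted-API route, the call below uses Cohere's Python SDK chat endpoint with an attached documents list, the pattern Cohere describes for grounded (RAG) generation with Command-R. The field names and response attributes reflect the SDK at the time of the Command-R launch and may have changed; treat them as assumptions to verify against the current docs.

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # hosted API; self-hosting exposes a similar chat interface

# Chat call with attached documents so the model can ground its answer (RAG).
# The "title"/"snippet" fields follow Cohere's documented documents format;
# check the current SDK reference before relying on them.
response = co.chat(
    model="command-r",
    message="How long do customers have to return a purchase?",
    documents=[
        {"title": "Returns policy", "snippet": "Customers may return items within 30 days."},
        {"title": "Shipping", "snippet": "Orders ship within two business days."},
    ],
)
print(response.text)  # grounded answer, generated from the supplied documents
```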
