Running Mamba Models on Oobabooga's Text-Generation-Webui

What is Mamba?

Before we dive into using Mamba models with the text-generation WebUI, let’s first understand what Mamba is. Mamba is a new type of language model architecture that offers an alternative to the widely-used Transformer models. It is often described as a “linear-time state-space model,” a name that captures its two key properties:

Linear Time Scaling: Unlike Transformers, whose computational complexity grows quadratically with the sequence length, Mamba models scale linearly. This means they can handle much longer sequences far more efficiently.

State-Space Formulation: Mamba uses a state-space formulation, which is a mathematical framework for modeling dynamic systems. This allows it to capture long-range dependencies more effectively than traditional approaches.

The end result is that Mamba models can achieve performance similar to or better than Transformers, while being significantly more computationally efficient, especially on long sequences.
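To make the linear-time idea concrete, here is a minimal, illustrative sketch of a plain (non-selective) state-space recurrence in Python. This is not Mamba’s actual selective-scan kernel, and the function name, dimensions, and parameter values are made up for illustration, but it shows why the per-token cost stays constant and the total cost grows linearly with sequence length:

```python
# Simplified illustration of a (non-selective) state-space recurrence.
# Real Mamba uses input-dependent ("selective") parameters and a fused
# scan kernel, but the linear-time idea is the same: one fixed-size
# state update per token, so cost grows linearly with sequence length.
import numpy as np

def ssm_scan(x, A, B, C):
    """x: (seq_len, d_in); returns y: (seq_len, d_out)."""
    d_state = A.shape[0]
    h = np.zeros(d_state)            # fixed-size hidden state
    outputs = []
    for x_t in x:                    # one pass over the sequence -> O(seq_len)
        h = A @ h + B @ x_t          # state update
        outputs.append(C @ h)        # readout for this token
    return np.stack(outputs)

# Toy usage: 1,000 tokens, 16-dimensional state
rng = np.random.default_rng(0)
seq_len, d_in, d_state, d_out = 1000, 8, 16, 8
y = ssm_scan(rng.normal(size=(seq_len, d_in)),
             A=0.9 * np.eye(d_state),
             B=0.1 * rng.normal(size=(d_state, d_in)),
             C=0.1 * rng.normal(size=(d_out, d_state)))
print(y.shape)  # (1000, 8)
```

Because the hidden state never grows with the sequence, doubling the input length simply doubles the work, whereas a Transformer’s attention over all previous tokens makes the cost grow quadratically.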

Prerequisites

This tutorial assumes you already have Oobabooga’s text-generation WebUI set up and running on your system. If not, head over to the project’s GitHub repository and follow the installation instructions.

Installing the Latest Transformers Version

Since Mamba support is a relatively new addition to the Transformers library, you’ll need to install the latest version from the GitHub repository:

  1. Open a terminal and navigate to your text-generation-webui directory.
  2. Activate the virtual environment by running the appropriate command for your operating system (e.g., ./cmd_linux.sh on Linux).
  3. Install the latest Transformers version: pip install git+https://github.com/huggingface/transformers@main (a quick way to verify the install is sketched just after this list).
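If you want to confirm that the install actually picked up Mamba support before launching the WebUI, a quick check from the same activated environment might look like the sketch below. The MambaForCausalLM class name assumes a sufficiently recent main-branch build:

```python
# Sanity check: a Transformers build new enough for Mamba exposes MambaForCausalLM.
import transformers
print("Transformers version:", transformers.__version__)

from transformers import MambaForCausalLM  # ImportError here means the build is too old
print("Mamba support looks available")
```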

Loading a Mamba Model

With the latest Transformers installed, the text-generation-webui will now automatically recognize and run Mamba models. You can test this by downloading a Mamba model from the Hugging Face Hub, such as state-spaces/mamba-2.8b-hf.

Note: Many older Mamba models (especially those created before March 2024) may not be compatible with the Transformers Mamba runtime, as support for this is still relatively new. Over time, more Mamba models will be released with full compatibility.

Once you’ve downloaded a compatible Mamba model, you can load it into the text-generation-webui just like any other model.
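If a model refuses to load, it can help to rule out the WebUI itself. The minimal sketch below loads the checkpoint directly with Transformers from the same activated environment; it assumes the state-spaces/mamba-2.8b-hf checkpoint and enough memory for a 2.8B-parameter model:

```python
# Load a Mamba checkpoint directly with Transformers, outside the WebUI.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "state-spaces/mamba-2.8b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Mamba is a state-space model that"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If this runs but the WebUI still fails, the problem is likely in the WebUI setup rather than the model or your Transformers install.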

That’s it! You’re now ready to experiment with Mamba models and experience their impressive performance, especially on longer sequences.

This article was updated on March 19, 2024

My name is Hunter and I’m a senior computer science student at New College of Florida. I spend much of my free time working on my homelab and personal tech-related projects. I hope this blog will fix a longstanding failing of mine: I almost never record my adventures, solutions, or projects. I relied heavily on tech blogs growing up, and I feel it would be irresponsible not to give back to that community now that I have the knowledge to do so.