mergekit

Tools for merging pretrained large language models.

6.9k Stars · +60 Stars/month · 1 Release (6m)

Star Growth

+12 (0.2%), Mar 27 to Apr 1

Overview

mergekit is a comprehensive toolkit for merging pretrained large language models, designed to combine multiple specialized models into a single, more versatile model without the computational overhead of traditional ensembling. Its out-of-core approach enables sophisticated model merging even in resource-constrained environments, running efficiently on CPU or with as little as 8 GB of VRAM. The tool supports major model architectures, including Llama, Mistral, GPT-NeoX, and StableLM, and offers multiple merge algorithms to suit different use cases. Because it operates directly in weight space, mergekit lets researchers and practitioners transfer capabilities between models, find trade-offs between different behaviors, and create new functionality through creative combinations, all while keeping the inference cost of a single model. With over 6,900 GitHub stars, it has become a staple in the AI research community for model experimentation and capability enhancement.

Deep Analysis

Key Differentiator

The definitive model merging toolkit — supports more merge algorithms (SLERP, TIES, DARE, frankenmerging, evolutionary) than any alternative, with uniquely efficient out-of-core processing that enables elaborate merges on 8GB VRAM
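To make the SLERP method concrete: spherical linear interpolation blends two weight vectors along the great-circle arc between them rather than along a straight line, which tends to preserve the geometry of the weights better than plain averaging. A minimal NumPy sketch of the standard SLERP formula follows; this is an illustration of the technique, not mergekit's internal implementation:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t: interpolation factor in [0, 1]; t=0 returns v0, t=1 returns v1.
    Falls back to linear interpolation when the vectors are
    (near-)parallel, where the spherical formula is numerically unstable.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the two vectors, computed on normalized copies.
    n0 = v0 / np.linalg.norm(v0)
    n1 = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    if abs(dot) > 1.0 - eps:
        # Nearly parallel: linear interpolation is a safe approximation.
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)
    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1

# Halfway between two orthogonal unit vectors stays on the unit circle.
mid = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

In a real merge this interpolation is applied tensor-by-tensor across the two models' parameters, often with a different `t` per layer or per parameter group.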

Capabilities

  • Merge pre-trained language models with multiple algorithms
  • Out-of-core merging for resource-constrained environments
  • LoRA extraction from merged models
  • Mixture of Experts (MoE) merging
  • Evolutionary merge optimization
  • Multi-stage merging workflows
  • Tokenizer transplantation
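The simplest of these merge strategies, linear merging, is just a weighted average of each parameter tensor across the source models. A minimal sketch over plain dict-of-array "state dicts" follows, assuming all models share an identical architecture (same parameter names and shapes); mergekit's actual implementation additionally streams tensors out-of-core:

```python
import numpy as np

def linear_merge(state_dicts, weights):
    """Weighted average of per-parameter tensors across models.

    state_dicts: list of {name: ndarray} with identical keys and shapes.
    weights: one scalar weight per model; should sum to 1.
    """
    if not np.isclose(sum(weights), 1.0):
        raise ValueError("merge weights should sum to 1")
    merged = {}
    for name in state_dicts[0]:
        # Each merged tensor is the weighted sum of the source tensors.
        merged[name] = sum(w * sd[name] for w, sd in zip(weights, state_dicts))
    return merged

# Two toy "models", merged 50/50.
model_a = {"w": np.array([1.0, 2.0])}
model_b = {"w": np.array([3.0, 4.0])}
merged = linear_merge([model_a, model_b], [0.5, 0.5])
```

The more elaborate methods (TIES, DARE) refine this idea by pruning or rescaling each model's delta from a shared base before combining, which reduces destructive interference between the source models.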

🔗 Integrations

Hugging Face Hub · Llama · Mistral · GPT-NeoX · StableLM

Best For

  • Creating custom merged LLMs by combining specialized models
  • Research into model merging techniques and weight-space optimization

Not Ideal For

  • Teams without ML expertise to evaluate merge quality
  • Production model training (merging is a complement, not replacement)

Languages

Python

Deployment

pip install (editable) · CLI tool

Pricing Detail

Free: Fully open source (LGPL v3)
Paid: N/A — free

Known Limitations

  • Requires understanding of model architectures for good merges
  • Merge quality is unpredictable — results need evaluation
  • LGPL license may be restrictive for some commercial uses
  • CPU-heavy process even with GPU acceleration

Pros

  • + Memory-efficient architecture enables complex merges on modest hardware (8GB VRAM minimum) using lazy tensor loading and out-of-core processing
  • + Comprehensive algorithm support includes linear interpolation, SLERP, DARE, and evolutionary methods for diverse merging strategies
  • + Production-ready with support for major model families (Llama, Mistral, GPT-NeoX) and flexible CPU/GPU execution options

Cons

  • - Requires deep understanding of model architectures and merge parameters to achieve optimal results without degrading performance
  • - Limited documentation for advanced techniques may require experimentation to find best practices for specific use cases
  • - Merge quality heavily depends on compatibility between source models and their training distributions

Use Cases

  • Combining domain-specific fine-tuned models (e.g., code + math specialists) into a single multi-capability model for deployment efficiency
  • Creating custom models by merging open-source base models with specialized fine-tunes for specific applications or languages
  • Research and experimentation with model capabilities, testing different merge ratios and algorithms to discover emergent behaviors

Getting Started

1. Install: `pip install mergekit` and ensure you have sufficient disk space for model weights
2. Configure: Create a YAML merge configuration specifying source models, merge method (linear/slerp), and parameters
3. Execute: Run `mergekit-yaml config.yaml ./output-model --copy-tokenizer` to perform the merge and save the resulting model
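A minimal YAML configuration for step 2, using the linear merge method, might look like the following sketch; the model names are placeholders for real Hugging Face Hub repositories, and both models must share the same architecture:

```yaml
# Hypothetical example: equal-weight linear merge of two
# same-architecture models (placeholder names).
models:
  - model: org/model-a
    parameters:
      weight: 0.5
  - model: org/model-b
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
```

Other merge methods take different parameters (for example, SLERP uses an interpolation factor `t` and requires a `base_model`), so consult the mergekit documentation for the exact schema of the method you choose.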
