diff --git a/README.md b/README.md
index cd03abf003..8ee959d218 100644
--- a/README.md
+++ b/README.md
@@ -41,9 +41,12 @@ Features:
## Table of Contents
- [Axolotl](#axolotl)
- [Table of Contents](#table-of-contents)
- - [Axolotl supports](#axolotl-supports)
- [Quickstart ⚡](#quickstart-)
- [Usage](#usage)
+ - [Badge ❤🏷️](#badge-️)
+ - [Contributing 🤝](#contributing-)
+ - [Sponsors 🤝❤](#sponsors-)
+ - [Axolotl supports](#axolotl-supports)
- [Advanced Setup](#advanced-setup)
- [Environment](#environment)
- [Docker](#docker)
@@ -75,14 +78,6 @@ Features:
- [Tokenization Mismatch b/w Inference \& Training](#tokenization-mismatch-bw-inference--training)
- [Debugging Axolotl](#debugging-axolotl)
- [Need help? 🙋](#need-help-)
- - [Badge ❤🏷️](#badge-️)
- - [Community Showcase](#community-showcase)
- - [Contributing 🤝](#contributing-)
- - [Sponsors 🤝❤](#sponsors-)
- - [💎 Diamond Sponsors - Contact directly](#-diamond-sponsors---contact-directly)
- - [🥇 Gold Sponsors - $5000/mo](#-gold-sponsors---5000mo)
- - [🥈 Silver Sponsors - $1000/mo](#-silver-sponsors---1000mo)
- - [🥉 Bronze Sponsors - $500/mo](#-bronze-sponsors---500mo)
@@ -105,36 +100,11 @@ Features:
-## Axolotl supports
-
-|             | fp16/fp32 | lora | qlora | gptq | gptq w/flash attn | flash attn | xformers attn |
-|-------------|:----------|:-----|-------|------|-------------------|------------|---------------|
-| llama       | ✅        | ✅   | ✅    | ✅   | ✅                | ✅         | ✅            |
-| Mistral     | ✅        | ✅   | ✅    | ✅   | ✅                | ✅         | ✅            |
-| Mixtral-MoE | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
-| Mixtral8X22 | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
-| Pythia      | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
-| cerebras    | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
-| btlm        | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
-| mpt         | ✅        | ❌   | ❓    | ❌   | ❌                | ❌         | ❓            |
-| falcon      | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
-| gpt-j       | ✅        | ✅   | ✅    | ❌   | ❌                | ❓         | ❓            |
-| XGen        | ✅        | ❓   | ✅    | ❓   | ❓                | ❓         | ✅            |
-| phi         | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
-| RWKV        | ✅        | ❓   | ❓    | ❓   | ❓                | ❓         | ❓            |
-| Qwen        | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
-| Gemma       | ✅        | ✅   | ✅    | ❓   | ❓                | ✅         | ❓            |
-| Jamba       | ✅        | ✅   | ✅    | ❓   | ❓                | ✅         | ❓            |
-
-✅: supported
-❌: not supported
-❓: untested
-
## Quickstart ⚡
Get started with Axolotl in just a few steps! This quickstart guide will walk you through setting up and running a basic fine-tuning task.
-**Requirements**: Nvidia GPU (Ampere architecture or newer for `bf16` and Flash Attention), Python >=3.10 and PyTorch >=2.3.1.
+**Requirements**: *Nvidia* GPU (Ampere architecture or newer for `bf16` and Flash Attention) or *AMD* GPU, Python >=3.10 and PyTorch >=2.3.1.
```bash
git clone https://github.com/axolotl-ai-cloud/axolotl
@@ -165,6 +135,78 @@ accelerate launch -m axolotl.cli.inference examples/openllama-3b/lora.yml \
accelerate launch -m axolotl.cli.train https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/examples/openllama-3b/lora.yml
```
+## Badge ❤🏷️
+
+Building something cool with Axolotl? Consider adding a badge to your model card.
+
+```markdown
+[](https://github.com/axolotl-ai-cloud/axolotl)
+```
+
+[](https://github.com/axolotl-ai-cloud/axolotl)
+
+## Sponsors 🤝❤
+
+If you love axolotl, consider sponsoring the project by reaching out directly to [wing@axolotl.ai](mailto:wing@axolotl.ai).
+
+---
+
+- [Modal](https://modal.com/) Modal lets you run data/AI jobs in the cloud, by just writing a few lines of Python. Customers use Modal to deploy Gen AI models at large scale, fine-tune LLM models, run protein folding simulations, and much more.
+
+---
+
+## Contributing 🤝
+
+Please read the [contributing guide](./.github/CONTRIBUTING.md).
+
+Bugs? Please check the [open issues](https://github.com/axolotl-ai-cloud/axolotl/issues/bug); if yours isn't already reported, open a new issue.
+
+PRs are **greatly welcome**!
+
+Please follow the quickstart instructions, then run the commands below to set up your dev environment:
+```bash
+pip3 install -r requirements-dev.txt -r requirements-tests.txt
+pre-commit install
+
+# test
+pytest tests/
+
+# optional: run against all files
+pre-commit run --all-files
+```
+
+Thanks to all of our contributors to date. Help drive open source AI progress forward by contributing to Axolotl.
+
+
+
+
+
+## Axolotl supports
+
+|             | fp16/fp32 | lora | qlora | gptq | gptq w/flash attn | flash attn | xformers attn |
+|-------------|:----------|:-----|-------|------|-------------------|------------|---------------|
+| llama       | ✅        | ✅   | ✅    | ✅   | ✅                | ✅         | ✅            |
+| Mistral     | ✅        | ✅   | ✅    | ✅   | ✅                | ✅         | ✅            |
+| Mixtral-MoE | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
+| Mixtral8X22 | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
+| Pythia      | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
+| cerebras    | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
+| btlm        | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
+| mpt         | ✅        | ❌   | ❓    | ❌   | ❌                | ❌         | ❓            |
+| falcon      | ✅        | ✅   | ✅    | ❌   | ❌                | ❌         | ❓            |
+| gpt-j       | ✅        | ✅   | ✅    | ❌   | ❌                | ❓         | ❓            |
+| XGen        | ✅        | ❓   | ✅    | ❓   | ❓                | ❓         | ✅            |
+| phi         | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
+| RWKV        | ✅        | ❓   | ❓    | ❓   | ❓                | ❓         | ❓            |
+| Qwen        | ✅        | ✅   | ✅    | ❓   | ❓                | ❓         | ❓            |
+| Gemma       | ✅        | ✅   | ✅    | ❓   | ❓                | ✅         | ❓            |
+| Jamba       | ✅        | ✅   | ✅    | ❓   | ❓                | ✅         | ❓            |
+
+✅: supported
+❌: not supported
+❓: untested
+
+
## Advanced Setup
### Environment
@@ -682,86 +724,6 @@ See [this debugging guide](docs/debugging.qmd) for tips on debugging Axolotl, al
## Need help? 🙋
-Join our [Discord server](https://discord.gg/HhrNrHJPRb) where we our community members can help you.
-
-Need dedicated support? Please contact us at [✉️wing@openaccessaicollective.org](mailto:wing@openaccessaicollective.org) for dedicated support options.
-
-## Badge ❤🏷️
-
-Building something cool with Axolotl? Consider adding a badge to your model card.
-
-```markdown
-[](https://github.com/axolotl-ai-cloud/axolotl)
-```
-
-[](https://github.com/axolotl-ai-cloud/axolotl)
-
-## Community Showcase
-
-Check out some of the projects and models that have been built using Axolotl! Have a model you'd like to add to our Community Showcase? Open a PR with your model.
-
-Open Access AI Collective
-- [Minotaur 13b](https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed)
-- [Manticore 13b](https://huggingface.co/openaccess-ai-collective/manticore-13b)
-- [Hippogriff 30b](https://huggingface.co/openaccess-ai-collective/hippogriff-30b-chat)
-
-PocketDoc Labs
-- [Dan's PersonalityEngine 13b LoRA](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-13b-LoRA)
-
-## Contributing 🤝
-
-Please read the [contributing guide](./.github/CONTRIBUTING.md)
-
-Bugs? Please check the [open issues](https://github.com/axolotl-ai-cloud/axolotl/issues/bug) else create a new Issue.
-
-PRs are **greatly welcome**!
-
-Please run the quickstart instructions followed by the below to setup env:
-```bash
-pip3 install -r requirements-dev.txt -r requirements-tests.txt
-pre-commit install
-
-# test
-pytest tests/
-
-# optional: run against all files
-pre-commit run --all-files
-```
-
-Thanks to all of our contributors to date. Help drive open source AI progress forward by contributing to Axolotl.
-
-
-
-
-
-## Sponsors 🤝❤
-
-OpenAccess AI Collective is run by volunteer contributors such as [winglian](https://github.com/winglian),
-[NanoCode012](https://github.com/NanoCode012), [tmm1](https://github.com/tmm1),
-[mhenrichsen](https://github.com/mhenrichsen), [casper-hansen](https://github.com/casper-hansen),
-[hamelsmu](https://github.com/hamelsmu) and many more who help us accelerate forward by fixing bugs, answering
-community questions and implementing new features. Axolotl needs donations from sponsors for the compute needed to
-run our unit & integration tests, troubleshooting community issues, and providing bounties. If you love axolotl,
-consider sponsoring the project via [GitHub Sponsors](https://github.com/sponsors/OpenAccess-AI-Collective),
-[Ko-fi](https://ko-fi.com/axolotl_ai) or reach out directly to
-[wing@openaccessaicollective.org](mailto:wing@openaccessaicollective.org).
-
----
-
-#### 💎 Diamond Sponsors - [Contact directly](mailto:wing@openaccessaicollective.org)
-
----
-
-#### 🥇 Gold Sponsors - $5000/mo
-
----
-
-#### 🥈 Silver Sponsors - $1000/mo
+Join our [Discord server](https://discord.gg/HhrNrHJPRb) where our community members can help you.
----
-
-#### 🥉 Bronze Sponsors - $500/mo
-
- - [JarvisLabs.ai](https://jarvislabs.ai)
-
----
+Need dedicated support? Please contact us at [✉️wing@axolotl.ai](mailto:wing@axolotl.ai) for dedicated support options.