Marqo 🤖 is an end-to-end vector search engine that handles embedding generation, storage, and retrieval through a single API, aiming to make advanced semantic search applications easy to build. Quickly build multimodal search apps over text and images with open source or custom models, without creating embeddings yourself.
At its core, Marqo handles:
✅ Vector generation: Plug in state-of-the-art models like CLIP 🖼 without creating embeddings yourself (see the quickstart sketch after this list). Bring your own models or use Marqo's defaults.
✅ Vector storage: Uses HNSW indexes for lightning-fast approximate nearest neighbor search. Scale to 100M+ docs.
✅ Vector retrieval: Search text, images, or combinations via a simple API. Build multimodal search apps seamlessly.
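For a feel of the end-to-end flow, here is a minimal quickstart sketch in Python based on Marqo's documented client API; the index name and documents are illustrative, and the `tensor_fields` argument follows recent client versions, so exact parameters may differ in older releases.

```python
import marqo

# Connect to a locally running Marqo instance (default Docker port).
mq = marqo.Client(url="http://localhost:8882")

# Create an index; Marqo falls back to a default text embedding model
# when no model is specified.
mq.create_index("my-first-index")

# Add documents. Marqo generates and stores the embeddings itself;
# tensor_fields lists the fields to embed for vector search.
mq.index("my-first-index").add_documents(
    [
        {
            "Title": "The Travels of Marco Polo",
            "Description": "A 13th-century travelogue describing Polo's travels",
        },
        {
            "Title": "Extravehicular Mobility Unit (EMU)",
            "Description": "The EMU is a spacesuit that provides environmental "
                           "protection and life support for astronauts.",
        },
    ],
    tensor_fields=["Description"],
)

# Search with natural language; retrieval runs over the stored vectors.
results = mq.index("my-first-index").search(
    q="What is the best outfit to wear on the moon?"
)
print(results["hits"][0]["Title"])
```

The point is that embedding generation, storage, and retrieval all happen behind the `add_documents` and `search` calls; no separate embedding pipeline is needed.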
- ⬆️ Horizontally scalable - scale inference and storage separately.
- 🌎 Multilingual - leverage models that support 100+ languages.
- 🧮 Ranking modifiers - use numeric fields to influence result order.
- 🔎 Filtering query DSL - narrow results with an expressive filter syntax (filters and ranking modifiers are shown in the sketch after this list).
- 📈 Bulk indexing/querying.
- 🎯 Context vectors to tailor searches.
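To illustrate the filtering and ranking features above, here is a hedged sketch against a hypothetical `products` index. The filter string follows Marqo's documented filter DSL; the `score_modifiers` structure is based on Marqo's documentation but is an assumption that may vary between client versions.

```python
import marqo

mq = marqo.Client(url="http://localhost:8882")

# Filtering query DSL: restrict the vector search to documents matching
# a filter string (field:value terms, boolean operators, numeric ranges).
filtered = mq.index("products").search(
    q="comfortable running shoes",
    filter_string="category:(footwear) AND price:[50 TO 150]",
)

# Ranking modifiers: let a numeric field (here a hypothetical "popularity"
# score) influence the final ordering on top of vector similarity.
# NOTE: the score_modifiers structure below is an assumption based on
# Marqo's documentation and may differ between client versions.
boosted = mq.index("products").search(
    q="comfortable running shoes",
    score_modifiers={
        "add_to_score": [{"field_name": "popularity", "weight": 0.5}],
    },
)
```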
The goal of Marqo is to make building advanced vector search functionality easy for developers. You focus on your application logic while Marqo handles the behind-the-scenes machine learning complexity.
Whether you're looking to build a multimodal search engine, enable chatbots to leverage custom knowledge bases, or take advantage of transformer models for search, Marqo is worth checking out.
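As a concrete example of the multimodal case, here is a hedged sketch of text-to-image search with a CLIP model. The index options (`model`, `treat_urls_and_pointers_as_images`) follow Marqo's documented image-search setup but may differ between versions, and the image URLs are placeholders.

```python
import marqo

mq = marqo.Client(url="http://localhost:8882")

# Create an index backed by a CLIP model so that string fields holding
# image URLs are downloaded and embedded as images.
mq.create_index(
    "my-multimodal-index",
    model="ViT-L/14",
    treat_urls_and_pointers_as_images=True,
)

# Index documents whose "Image" field is a URL (placeholder URLs here).
mq.index("my-multimodal-index").add_documents(
    [
        {"Title": "A hippo in the river", "Image": "https://example.com/hippo.png"},
        {"Title": "A red sports car", "Image": "https://example.com/car.png"},
    ],
    tensor_fields=["Image"],
)

# Query with plain text; CLIP places text and images in the same vector
# space, so the closest image documents come back first.
results = mq.index("my-multimodal-index").search(q="animal swimming in water")
```

Because CLIP embeds text and images into a shared vector space, a plain text query retrieves the closest images without any extra glue code.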
- 👩💻 Saves engineering effort: Marqo handles the complexity of vector search so you can focus on your application logic—no need to create and manage embeddings and indexes.
- ⚡️ Accelerates development: Go from documents to searchable index in just a few lines of code—rapid iteration and prototyping.
- 🧠 Leverages SOTA models: Pluggable architecture allows you to easily integrate and experiment with semantic models (CLIP, GPT, etc).
- 📈 Scalable vector search: Horizontally scalable to 100M+ docs while maintaining speed. No need to shard indexes yourself.
- 🔎 Developer friendly: Rich query syntax, highlighting, filtering, multimodal search, and more built-in. Optimize search without low-level index tweaking.
The central value proposition for an AI engineer is faster and easier development of vector search functionality to power applications. Marqo handles machine learning complexity such as inference and indexing, enabling engineers to focus on building their solutions rather than wrestling with matrices.
The pluggable architecture, scalability, and developer-friendly query language are additional reasons an engineer may find Marqo worth exploring.
- 👷🏽♀️ Builders: Pandu Kerr, Li Wan, Joshua Kim, Tom Hamer
- 👩🏽💼 Builders on LinkedIn: https://www.linkedin.com/in/pandukerr/, https://www.linkedin.com/in/liwan94/, https://www.linkedin.com/in/joshua-kim-cs/, https://www.linkedin.com/in/tom-hamer-%F0%9F%A6%9B-04a6369b/
- 👩🏽🏭 Builders on X: https://twitter.com/jn2clark
- 👩🏽💻 Contributors: 26
- 💫 GitHub Stars: 3.8k
- 🍴 Forks: 158
- 👁️ Watch: 35
- 🪪 License: Apache-2.0
- 🔗 Links: Below 👇🏽
- GitHub Repository: https://github.com/marqo-ai/marqo
- Official Website: https://www.marqo.ai/
- Slack Community: https://marqo-community.slack.com/join/shared_invite/zt-22hhps0bo-cB9mZKQIw2x3KCkYpAu9AA#/shared-invite
- LinkedIn Page: https://www.linkedin.com/company/marqo-ai
- X Page: https://twitter.com/marqo_ai
- Profile in The AI Engineer: https://github.com/theaiengineer/awesome-opensource-ai-engineering/blob/main/libraries/marqo/README.md