Repositories list

    • Qwen2-VL-7B-Instruct

      Public template
      Qwen2-VL-7B-Instruct is a 7-billion-parameter multimodal language model developed by Alibaba Cloud’s Qwen team, designed for instruction-based tasks with advanced visual and multilingual capabilities.
      Python · Updated Dec 2, 2024
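      A minimal usage sketch (assuming the Qwen/Qwen2-VL-7B-Instruct checkpoint on Hugging Face and a transformers release with Qwen2-VL support; the image path is a placeholder):

        from transformers import Qwen2VLForConditionalGeneration, AutoProcessor
        from PIL import Image
        import torch

        model = Qwen2VLForConditionalGeneration.from_pretrained(
            "Qwen/Qwen2-VL-7B-Instruct", torch_dtype=torch.float16, device_map="auto"
        )
        processor = AutoProcessor.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")

        image = Image.open("photo.jpg")  # placeholder local image
        messages = [{"role": "user", "content": [
            {"type": "image"},
            {"type": "text", "text": "Describe this image."},
        ]}]
        prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
        inputs = processor(text=[prompt], images=[image], return_tensors="pt").to(model.device)
        output = model.generate(**inputs, max_new_tokens=128)
        print(processor.batch_decode(output, skip_special_tokens=True)[0])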
    • Python · Updated Nov 19, 2024
    • Python · Updated Nov 19, 2024
    • Python · Updated Nov 4, 2024
    • Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the 34B instruct-tuned version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding.
      Python · Updated Nov 4, 2024
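      A minimal usage sketch (assuming the codellama/CodeLlama-34b-Instruct-hf checkpoint; the instruct variants expect prompts wrapped in Llama's [INST] format, and the 34B model needs substantial GPU memory):

        from transformers import AutoTokenizer, AutoModelForCausalLM
        import torch

        model_id = "codellama/CodeLlama-34b-Instruct-hf"
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(
            model_id, torch_dtype=torch.float16, device_map="auto"
        )

        prompt = "[INST] Write a Python function that checks whether a string is a palindrome. [/INST]"
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        output = model.generate(**inputs, max_new_tokens=200)
        print(tokenizer.decode(output[0], skip_special_tokens=True))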
    • Ministral-8B-Instruct is an LLM developed by Mistral AI, specifically designed for instruction-based tasks.
      Python · Updated Nov 4, 2024
    • Whisper-large-v3-turbo is an efficient automatic speech recognition model by OpenAI. With 809 million parameters, it is significantly faster than its predecessor, Whisper large-v3.
      Python · Updated Oct 23, 2024
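      A minimal usage sketch (assuming the openai/whisper-large-v3-turbo checkpoint; "speech.wav" is a placeholder for any local audio file):

        from transformers import pipeline

        asr = pipeline("automatic-speech-recognition", model="openai/whisper-large-v3-turbo")
        # return_timestamps=True lets the pipeline chunk audio longer than 30 seconds
        result = asr("speech.wav", return_timestamps=True)
        print(result["text"])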
    • bark

      Public template
      Bark is a transformer-based text-to-audio model created by Suno. Bark can generate highly realistic, multilingual speech as well as other audio, including music, background noise and simple sound effects. The model can also produce nonverbal communication such as laughing, sighing and crying.
      Python · Updated Oct 10, 2024
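      A minimal usage sketch using the transformers port of Bark (assuming the suno/bark checkpoint; nonverbal cues such as [laughs] can be embedded directly in the prompt):

        from transformers import AutoProcessor, BarkModel
        import scipy.io.wavfile

        processor = AutoProcessor.from_pretrained("suno/bark")
        model = BarkModel.from_pretrained("suno/bark")

        inputs = processor("Hello, my name is Suno. [laughs] And I like pizza.")
        audio = model.generate(**inputs)

        # Bark's generation config carries the output sample rate (24 kHz)
        sample_rate = model.generation_config.sample_rate
        scipy.io.wavfile.write("bark_out.wav", rate=sample_rate, data=audio[0].cpu().numpy())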
    • Python · Updated Oct 4, 2024
    • The Llama 3.2 11B Vision Instruct model is part of Meta's latest series of large language models, which introduces significant advancements in multimodal AI capabilities by allowing both text and image inputs.
      Python · Updated Sep 28, 2024
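      A minimal usage sketch (assuming the gated meta-llama/Llama-3.2-11B-Vision-Instruct checkpoint, which requires accepting Meta's license on Hugging Face, and a transformers release with Mllama support; the image path is a placeholder):

        from transformers import MllamaForConditionalGeneration, AutoProcessor
        from PIL import Image
        import torch

        model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"
        model = MllamaForConditionalGeneration.from_pretrained(
            model_id, torch_dtype=torch.bfloat16, device_map="auto"
        )
        processor = AutoProcessor.from_pretrained(model_id)

        image = Image.open("photo.jpg")  # placeholder local image
        messages = [{"role": "user", "content": [
            {"type": "image"},
            {"type": "text", "text": "What is in this image?"},
        ]}]
        prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
        inputs = processor(image, prompt, return_tensors="pt").to(model.device)
        output = model.generate(**inputs, max_new_tokens=64)
        print(processor.decode(output[0], skip_special_tokens=True))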
    • Python · Updated Sep 26, 2024
    • Python · Updated Sep 26, 2024
    • Python · Updated Sep 25, 2024
    • Python · Updated Sep 25, 2024
    • Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B fine-tuned model, optimized for dialogue use cases and converted to the Hugging Face Transformers format.
      Python · Updated Sep 22, 2024
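      A minimal usage sketch (assuming the gated meta-llama/Llama-2-13b-chat-hf checkpoint; the tokenizer ships a chat template that wraps turns in Llama 2's [INST] format):

        from transformers import AutoTokenizer, AutoModelForCausalLM
        import torch

        model_id = "meta-llama/Llama-2-13b-chat-hf"
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(
            model_id, torch_dtype=torch.float16, device_map="auto"
        )

        messages = [{"role": "user", "content": "Explain the difference between a list and a tuple in Python."}]
        input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
        output = model.generate(input_ids, max_new_tokens=256)
        # Decode only the newly generated tokens, not the echoed prompt
        print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))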
    • Python · Updated Sep 21, 2024
    • Python · MIT License · Updated Sep 17, 2024
    • DINet

      Public
      Python · Updated Sep 17, 2024
    • GPT-Neo 125M is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 125M represents the number of parameters of this particular pre-trained model.
      Python · Updated Sep 16, 2024
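      A minimal usage sketch (assuming the EleutherAI/gpt-neo-125m checkpoint; at 125M parameters the model is small enough to run on CPU):

        from transformers import pipeline

        generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125m")
        out = generator("The meaning of life is", max_new_tokens=40, do_sample=True)
        print(out[0]["generated_text"])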
    • Python · Updated Sep 16, 2024
    • Python · Updated Sep 16, 2024
    • Python · Updated Sep 16, 2024
    • Python · Updated Sep 16, 2024
    • ControlNet is a neural network structure to control diffusion models by adding extra conditions. This checkpoint corresponds to the ControlNet conditioned on Canny edges. It can be used in combination with Stable Diffusion.
      Python · Updated Sep 16, 2024
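      A minimal usage sketch with the diffusers library (assuming the lllyasviel/sd-controlnet-canny checkpoint paired with runwayml/stable-diffusion-v1-5; "input.png" is a placeholder conditioning image and a CUDA GPU is assumed):

        from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
        from diffusers.utils import load_image
        from PIL import Image
        import numpy as np
        import cv2
        import torch

        controlnet = ControlNetModel.from_pretrained(
            "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
        )
        pipe = StableDiffusionControlNetPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
        ).to("cuda")

        # Extract Canny edges from the conditioning image and stack to 3 channels
        image = np.array(load_image("input.png"))
        edges = cv2.Canny(image, 100, 200)
        edges = Image.fromarray(np.stack([edges] * 3, axis=-1))

        result = pipe("a bird on a branch", image=edges, num_inference_steps=30).images[0]
        result.save("out.png")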
    • BART model pre-trained on English and fine-tuned on CNN/Daily Mail. It was introduced in the paper "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension" by Lewis et al. and first released in this repository: https://github.com/pytorch/fairseq/tree/master/examples/bart
      Python · Updated Sep 13, 2024
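      A minimal usage sketch (assuming the facebook/bart-large-cnn checkpoint; the article text is a placeholder):

        from transformers import pipeline

        summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
        article = (
            "The James Webb Space Telescope has captured new images of a distant "
            "galaxy cluster. Researchers say the data will help refine estimates "
            "of the universe's expansion rate."
        )
        out = summarizer(article, max_length=130, min_length=30, do_sample=False)
        print(out[0]["summary_text"])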
    • IDEFICS (Image-aware Decoder Enhanced à la Flamingo with Interleaved Cross-attentionS) is an open-access reproduction of Flamingo, a closed-source visual language model developed by DeepMind. Like GPT-4, the multimodal model accepts arbitrary sequences of image and text inputs and produces text outputs.
      Python · Updated Sep 13, 2024
    • Vicuna is a chat assistant trained by fine-tuning Llama 2 on user-shared conversations collected from ShareGPT.
      Python · Updated Sep 13, 2024
    • Universal-Sentence-Encoder-Multilingual-QA is a model developed by researchers at Google, mainly for question-answer retrieval. You can use this template to import the model into Inferless.
      Python · Updated Sep 13, 2024
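      A minimal usage sketch (assuming the TF Hub module universal-sentence-encoder-multilingual-qa/3 and its documented question_encoder/response_encoder signatures; tensorflow_text must be imported to register the ops the model uses):

        import tensorflow as tf
        import tensorflow_hub as hub
        import tensorflow_text  # noqa: F401  (registers SentencePiece ops)

        module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-qa/3")

        questions = ["How old is the universe?"]
        responses = ["The universe is about 13.8 billion years old."]
        contexts = [""]  # optional surrounding context for each response

        q_emb = module.signatures["question_encoder"](tf.constant(questions))["outputs"]
        r_emb = module.signatures["response_encoder"](
            input=tf.constant(responses), context=tf.constant(contexts)
        )["outputs"]

        # Dot products between the embeddings score question/answer relevance
        print(tf.matmul(q_emb, r_emb, transpose_b=True).numpy())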
    • ControlNet v1.1 was released in lllyasviel/ControlNet-v1-1 by Lvmin Zhang. This checkpoint is a conversion of the original checkpoint into the diffusers format. It can be used in combination with Stable Diffusion, such as runwayml/stable-diffusion-v1-5.
      Python · Updated Sep 13, 2024
    • Falcon-7B-Instruct is a 7B-parameter causal decoder-only model built by TII, based on Falcon-7B and fine-tuned on a mixture of chat/instruct datasets. It is made available under the Apache 2.0 license.
      Python · Updated Sep 13, 2024
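      A minimal usage sketch (assuming the tiiuae/falcon-7b-instruct checkpoint):

        from transformers import AutoTokenizer, pipeline
        import torch

        model_id = "tiiuae/falcon-7b-instruct"
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        generator = pipeline(
            "text-generation",
            model=model_id,
            tokenizer=tokenizer,
            torch_dtype=torch.bfloat16,
            device_map="auto",
        )
        out = generator("Write a haiku about the sea.", max_new_tokens=60, do_sample=True, top_k=10)
        print(out[0]["generated_text"])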