@aiming-lab

AIMING Lab

AIMING Lab @ UNC-Chapel Hill. AIMING stands for Adaptive Intelligence through Alignment, Interaction and Learning.

Popular repositories

  1. GRAPE Public

    GRAPE: Guided-Reinforced Vision-Language-Action Preference Optimization

    Python · 84 stars · 3 forks

  2. MMedPO Public

    MMedPO: Aligning Medical Vision-Language Models with Clinical-Aware Multimodal Preference Optimization

    Python · 20 stars · 1 fork

  3. MJ-Video Public

    MJ-VIDEO: Fine-Grained Benchmarking and Rewarding Video Preferences in Video Generation

    Python · 9 stars · 1 fork

  4. CITER Public

    CITER: Collaborative Inference for Efficient Large Language Model Decoding with Token-Level Routing

    2 stars

  5. CARES Public

    Forked from richard-peng-xia/CARES

    [NeurIPS'24] CARES: A Comprehensive Benchmark of Trustworthiness in Medical Vision Language Models

    Python

  6. CREAM Public

    Forked from Raibows/CREAM

    [ICLR'25] Code for paper "CREAM: Consistency Regularized Self-Rewarding Language Models".

Repositories

Showing 10 of 13 repositories
  • MJ-Video Public

    MJ-VIDEO: Fine-Grained Benchmarking and Rewarding Video Preferences in Video Generation

    Python · 9 stars · 1 fork · 0 open issues · 0 open pull requests · Updated Feb 23, 2025
  • MMedPO Public

    MMedPO: Aligning Medical Vision-Language Models with Clinical-Aware Multimodal Preference Optimization

    Python · 20 stars · Apache-2.0 license · 1 fork · 0 open issues · 0 open pull requests · Updated Feb 11, 2025
  • MJ-VIDEO.github.io Public

    HTML · 0 stars · 0 forks · 0 open issues · 0 open pull requests · Updated Feb 6, 2025
  • CITER Public

    CITER: Collaborative Inference for Efficient Large Language Model Decoding with Token-Level Routing

    2 stars · 0 forks · 2 open issues · 0 open pull requests · Updated Feb 5, 2025
  • GRAPE Public

    GRAPE: Guided-Reinforced Vision-Language-Action Preference Optimization

    Python · 84 stars · MIT license · 3 forks · 3 open issues · 0 open pull requests · Updated Feb 3, 2025
  • MMIE Public Forked from Lillianwei-h/MMIE

    [ICLR'25 Oral] MMIE: Massive Multimodal Interleaved Comprehension Benchmark for Large Vision-Language Models

    Python · 0 stars · MIT license · 3 forks · 0 open issues · 0 open pull requests · Updated Nov 3, 2024
  • RULE Public Forked from richard-peng-xia/RULE

    [EMNLP'24] RULE: Reliable Multimodal RAG for Factuality in Medical Vision Language Models

    Python · 0 stars · MIT license · 4 forks · 0 open issues · 0 open pull requests · Updated Oct 22, 2024
  • MMed-RAG Public Forked from richard-peng-xia/MMed-RAG

    [ICLR'25] MMed-RAG: Versatile Multimodal RAG System for Medical Vision Language Models

    Python · 0 stars · MIT license · 12 forks · 0 open issues · 0 open pull requests · Updated Oct 20, 2024
  • CREAM Public Forked from Raibows/CREAM

    [ICLR'25] Code for paper "CREAM: Consistency Regularized Self-Rewarding Language Models".

    0 stars · 1 fork · 0 open issues · 0 open pull requests · Updated Oct 15, 2024
  • CARES Public Forked from richard-peng-xia/CARES

    [NeurIPS'24] CARES: A Comprehensive Benchmark of Trustworthiness in Medical Vision Language Models

    Python · 0 stars · CC-BY-4.0 license · 5 forks · 0 open issues · 0 open pull requests · Updated Sep 26, 2024

People

This organization has no public members.
