
Add LMMSEvalInferenceEngine #1301

Merged
merged 7 commits into main from add-llms-inference on Oct 28, 2024

Conversation

elronbandel
Member

No description provided.

Signed-off-by: elronbandel <[email protected]>
from unitxt.text_utils import print_dict

with settings.context(
disable_hf_datasets_cache=False,
Member
We should document this new flag prominently (e.g. in one of the first tutorials that loads data from HF).

Member
Need to add this example to the examples main page, and disable running this example by default in regressions.

streaming=True,
)
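The `settings.context(...)` pattern in the snippet above temporarily overrides global flags for the duration of a `with` block and restores them on exit. A minimal sketch of how such an override context manager can work (`SimpleSettings` here is a hypothetical stand-in, not unitxt's actual implementation):

```python
from contextlib import contextmanager

class SimpleSettings:
    """Hypothetical stand-in for a global settings object with temporary overrides."""

    def __init__(self, **defaults):
        self.__dict__["_values"] = dict(defaults)

    def __getattr__(self, name):
        return self._values[name]

    @contextmanager
    def context(self, **overrides):
        # Save the current values, apply the overrides, and restore on exit.
        saved = {k: self._values[k] for k in overrides}
        self._values.update(overrides)
        try:
            yield self
        finally:
            self._values.update(saved)

settings = SimpleSettings(disable_hf_datasets_cache=True, streaming=False)

with settings.context(disable_hf_datasets_cache=False, streaming=True):
    assert settings.streaming is True       # override active inside the block
assert settings.streaming is False          # restored after the block
```

The `try`/`finally` ensures the old values come back even if the body of the `with` block raises.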

test_dataset = list(tqdm(dataset["test"], total=30))
Member
What does this do? Why do you need tqdm?


predictions = inference_model.infer(test_dataset)
evaluated_dataset = evaluate(predictions=predictions, data=test_dataset)
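On the tqdm question above: wrapping the iterable in `tqdm` does not change the result of `list(...)`; it only reports progress while the streaming split is materialized, and `total=30` supplies the expected length since a streaming dataset has no `__len__`. A pure-stdlib sketch with a hypothetical `progress` stand-in for `tqdm`:

```python
def progress(iterable, total=None):
    """Hypothetical minimal stand-in for tqdm: yield items, report progress."""
    for i, item in enumerate(iterable, start=1):
        if total is not None:
            print(f"\r{i}/{total}", end="")  # progress display, like tqdm's bar
        yield item
    print()

# A generator stands in for the streaming dataset["test"] split.
stream = (x * x for x in range(30))
test_dataset = list(progress(stream, total=30))
assert len(test_dataset) == 30  # same contents as list(stream), just with progress output
```

So the line is equivalent to `list(dataset["test"])` plus a progress bar, which is useful feedback when the split is downloaded lazily over the network.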
Member
Do you get the same results as with the HFLlavaInferenceEngine?

elronbandel and others added 6 commits October 27, 2024 10:12
Signed-off-by: Elron Bandel <[email protected]>
Signed-off-by: Elron Bandel <[email protected]>
Signed-off-by: Elron Bandel <[email protected]>
Signed-off-by: elronbandel <[email protected]>
@elronbandel elronbandel merged commit 62127a6 into main Oct 28, 2024
9 checks passed
@elronbandel elronbandel deleted the add-llms-inference branch October 28, 2024 14:57
2 participants