Be more specific about missing RoPE parameters (#1781)
rasbt authored Oct 7, 2024
1 parent 1809a39 commit 3f4992a
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion litgpt/model.py
@@ -129,8 +129,9 @@ def rope_cache(self, device: Optional[torch.device] = None) -> Tuple[torch.Tenso
             }
         else:
             # Some but not all parameters are specified; raise an error
+            missing_params = [param for param, present in zip(adjusted_params_required, params_present) if not present]
             raise ValueError(
-                "The following adjusted RoPE parameters are missing in rope_adjustments."
+                f"The following adjusted RoPE parameters are missing in rope_adjustments: {', '.join(missing_params)}. "
                 "All adjusted RoPE parameters must be specified together."
             )
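To illustrate the validation pattern this commit improves, here is a minimal, self-contained sketch. The function name `check_rope_adjustments` and the parameter names in `adjusted_params_required` are hypothetical stand-ins; the real list lives in `litgpt/model.py`. The point is the all-or-nothing check: either every adjusted RoPE parameter is given, or none, and a partial set now produces an error naming exactly which parameters are missing.

```python
# Hypothetical parameter names for illustration; the actual required
# names are defined in litgpt/model.py, not reproduced here.
adjusted_params_required = ["factor", "low_freq_factor", "high_freq_factor"]

def check_rope_adjustments(rope_adjustments: dict) -> None:
    """Raise if only some of the adjusted RoPE parameters are specified."""
    params_present = [p in rope_adjustments for p in adjusted_params_required]
    if all(params_present):
        return  # all adjusted RoPE parameters specified together: OK
    if not any(params_present):
        return  # none specified: defaults apply, also OK
    # Some but not all parameters are specified; name the missing ones,
    # mirroring the error message introduced by this commit.
    missing_params = [
        p for p, present in zip(adjusted_params_required, params_present) if not present
    ]
    raise ValueError(
        f"The following adjusted RoPE parameters are missing in rope_adjustments: "
        f"{', '.join(missing_params)}. "
        "All adjusted RoPE parameters must be specified together."
    )
```

Before this change, the error only said that parameters were missing; joining `missing_params` into the message tells the user exactly which keys to add.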
