fix(export): GPT models w/ bias=False convert properly (#11255)
Signed-off-by: Terry Kong <[email protected]>
terrykong authored and yashaswikarnati committed Nov 21, 2024
1 parent 86f267c commit 41f1bca
Showing 1 changed file with 8 additions and 0 deletions.
8 changes: 8 additions & 0 deletions nemo/export/trt_llm/tensorrt_llm_build.py
@@ -118,6 +118,14 @@ def build_and_save_engine(
build_config.lora_config = lora_config

model = model_cls.from_config(model_config)
if not model_config.bias and model_config.architecture == 'GPTForCausalLM':
    # NOTE: GPT models in megatron-core that set bias=False disable bias globally,
    # whereas bias=False in TRTLLM GPT models disables it everywhere except
    # LayerNorm. This change makes TRTLLM's implementation match megatron-core.
    for name, module in model.named_modules():
        if isinstance(module, tensorrt_llm.layers.normalization.LayerNorm):
            module.bias = None
            module.register_parameter('bias', None)
model = optimize_model(
model,
use_parallel_embedding=model_config.use_parallel_embedding,
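The pattern the fix applies (walk the module tree, null out only the LayerNorm biases) can be sketched without the `tensorrt_llm` dependency. The `Mock*` classes below are hypothetical stand-ins for TRT-LLM modules, not real library types; the `strip_layernorm_bias` helper mirrors the two-step removal in the diff, where `module.bias = None` clears the attribute and `register_parameter('bias', None)` removes it from the module's parameter registry so the engine build skips it.

```python
# Dependency-free sketch of the commit's pattern. MockLayerNorm, MockLinear,
# and MockModel are illustrative stand-ins, not tensorrt_llm classes.

class MockLayerNorm:
    def __init__(self):
        # Mimic a module that registers weight and bias parameters.
        self._parameters = {"weight": [1.0], "bias": [0.0]}
        self.bias = self._parameters["bias"]

    def register_parameter(self, name, value):
        # Registering None removes the parameter from the build's view.
        self._parameters[name] = value


class MockLinear:
    def __init__(self):
        # With bias=False, non-LayerNorm modules already have no bias.
        self.bias = None


class MockModel:
    def __init__(self):
        self.ln = MockLayerNorm()
        self.proj = MockLinear()

    def named_modules(self):
        # Yield (name, module) pairs, like torch-style module trees.
        yield "ln", self.ln
        yield "proj", self.proj


def strip_layernorm_bias(model):
    """Null out bias only on LayerNorm modules, matching the fix."""
    for name, module in model.named_modules():
        if isinstance(module, MockLayerNorm):
            module.bias = None
            module.register_parameter("bias", None)


model = MockModel()
strip_layernorm_bias(model)
print(model.ln.bias)                  # None
print(model.ln._parameters["bias"])   # None
```

The two-step removal matters: clearing only the attribute would leave a stale entry in the parameter registry, while re-registering `None` ensures downstream serialization sees no LayerNorm bias at all.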
