Hi,
While building a LoRA model whose adapter config has no modules_to_save value, the error below occurs.
Is a Llama-based LoRA model without a modules_to_save value not supported yet?
[error log]
Traceback (most recent call last):
  File "/code/tensorrt_llm/examples/llama/build.py", line 828, in <module>
    args = parse_arguments()
  File "/code/tensorrt_llm/examples/llama/build.py", line 499, in parse_arguments
    lora_config = LoraConfig.from_hf(args.hf_lora_dir,
  File "/usr/local/lib/python3.10/dist-packages/tensorrt_llm/runtime/lora_manager.py", line 60, in from_hf
    if "lm_head" in adapter_config["modules_to_save"]:
TypeError: argument of type 'NoneType' is not iterable
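For context: when an adapter is trained without any fully fine-tuned modules, the config contains "modules_to_save": null, so the membership test on line 60 of lora_manager.py evaluates "lm_head" in None and raises the TypeError above. Below is a minimal None-safe sketch of that check, not the actual from_hf implementation; the adapter_config.json path and the print statements are illustrative assumptions:

import json

# Load the adapter config (path is an assumption for this sketch)
with open("adapter_config.json") as f:
    adapter_config = json.load(f)

# `or []` converts a null or missing modules_to_save into an empty list,
# so the membership test no longer raises TypeError
modules_to_save = adapter_config.get("modules_to_save") or []
if "lm_head" in modules_to_save:
    print("lm_head was fully fine-tuned and saved with the adapter")
else:
    print("plain LoRA adapter; no extra modules saved")

As a local workaround until this is guarded upstream, editing the adapter config to use "modules_to_save": [] instead of null should have the same effect.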
[adapter_config.json]
{ "alpha_pattern": {}, "auto_mapping": null, "base_model_name_or_path": "/home/model/Llama2-7b-hf", "bias": "none", "fan_in_fan_out": false, "inference_mode": true, "init_lora_weights": true, "layers_pattern": null, "layers_to_transform": null, "lora_alpha": 32, "lora_dropout": 0.1, "modules_to_save": null, "peft_type": "LORA", "r": 8, "rank_pattern": {}, "revision": null, "target_modules": [ "v_proj", "q_proj", "down_proj", "k_proj", "up_proj" ], "task_type": "CAUSAL_LM" }