
Which source file contains the code for loading the model? #2068

Open
HongfengDu opened this issue Jul 31, 2024 · 3 comments
Labels: question (Further information is requested), stale

Comments

@HongfengDu

I want to customize the model loading process and modify the model-loading logic. Which source file contains this code?

@nv-guomingz
Collaborator

nv-guomingz added the question (Further information is requested) label on Jul 31, 2024
@HongfengDu
Author

https://github.com/NVIDIA/TensorRT-LLM/blob/main/examples/llama/convert_checkpoint.py#L414
Thank you. I would like to modify the C++ side instead. Another question: how do I call this interface?

`Executor(std::vector<uint8_t> const& engineBuffer, std::string const& jsonConfigStr, ModelType modelType, ExecutorConfig const& executorConfig);`
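For reference, here is a minimal sketch of how that overload might be called. It assumes the `tensorrt_llm::executor` API from `cpp/include/tensorrt_llm/executor/executor.h`, a decoder-only model (`ModelType::kDECODER_ONLY`), default `ExecutorConfig` settings, and placeholder paths for the serialized engine and its `config.json`; it has not been verified against a specific TensorRT-LLM release.

```cpp
// Sketch: constructing an Executor from an in-memory engine buffer.
// Paths and the ModelType value are placeholders; adjust them to your setup.
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

#include "tensorrt_llm/executor/executor.h"

namespace tle = tensorrt_llm::executor;

int main()
{
    // Read the serialized engine into a byte buffer.
    std::ifstream engineFile("/path/to/rank0.engine", std::ios::binary);
    std::vector<uint8_t> engineBuffer(
        (std::istreambuf_iterator<char>(engineFile)), std::istreambuf_iterator<char>());

    // Read the config.json produced at engine-build time into a string.
    std::ifstream configFile("/path/to/config.json");
    std::string jsonConfigStr(
        (std::istreambuf_iterator<char>(configFile)), std::istreambuf_iterator<char>());

    // Default executor configuration; tune as needed.
    tle::ExecutorConfig executorConfig;

    // Call the buffer-based overload quoted above.
    tle::Executor executor(
        engineBuffer, jsonConfigStr, tle::ModelType::kDECODER_ONLY, executorConfig);

    return 0;
}
```

If you only need to load an engine from disk, the same header also declares an overload that takes a `std::filesystem::path` to the engine directory plus the `ModelType` and `ExecutorConfig`; the buffer-based overload above is mainly useful when the engine bytes are already in memory.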

@github-actions (bot)

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 15 days.

github-actions bot added the stale label on Aug 31, 2024