
How to use multi-gpu in running llava? #1003

Open · 2 of 4 tasks
xiaocaoxu opened this issue Jan 30, 2024 · 1 comment

Labels
bug Something isn't working
System Info

GPU: 3090
CUDA: 12.2

Who can help?

@ncomly-nvidia @symphonylyh

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

build llm engine:
python ../llama/build.py --model_dir /models/llava-llama-2-finetune_full-mmcm-2023-11-01-03-15-20/ \
    --dtype float32 \
    --remove_input_padding \
    --use_gpt_attention_plugin float32 \
    --enable_context_fmha \
    --use_gemm_plugin float32 \
    --output_dir /models/llava_trt/1.0/fp32/2-gpu/ \
    --max_batch_size 1 \
    --world_size 2 \
    --tp_size 2 \
    --max_prompt_embedding_table_size 576
build visual engine:
python build_visual_engine.py --model_name llava-v1.5-7b --model_path /models/llava-llama-2-finetune_full-mmcm-2023-11-01-03-15-20
run:
mpirun -n 2 --allow-run-as-root python run.py --max_new_tokens 512 --input_text "Question: which city is this? Answer:" --hf_model_dir /models/llava-llama-2-finetune_full-mmcm-2023-11-01-03-15-20 --visual_engine_dir visual_engines/llava-v1.5-7b --llm_engine_dir /models/llava_trt/1.0/fp32/2-gpu --decoder_llm

get error:
(error screenshot was attached as an image; not reproduced here)

Expected behavior

LLaVA runs inference across multiple GPUs.

actual behavior

The run fails with the error shown in the attached screenshot.

additional notes

None

@xiaocaoxu xiaocaoxu added the bug Something isn't working label Jan 30, 2024
@kaiyux
Member

kaiyux commented Feb 28, 2024

Hi @xiaocaoxu, we pushed an update to the main branch which should contain the fix for this issue. Can you please verify it on the latest main branch? Thank you.
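A minimal sketch of the suggested verification, assuming a local source checkout of TensorRT-LLM (branch names are the standard git defaults; the actual rebuild commands are the ones in the Reproduction section above):

```shell
# Update the local TensorRT-LLM checkout to the latest main branch
git fetch origin
git checkout main
git pull origin main

# Then re-run the engine builds and the mpirun command from the
# Reproduction section against the updated sources. Engines built
# from an older commit should be rebuilt, since serialized engines
# are not guaranteed to be compatible across versions.
```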
