Hi, thank you for your great work!
I tried to test the model you released, but I found that the evaluation results differ from those in the LLaVA repo. I evaluated the POPE benchmark:
| Model | LLaVA repo | SeVa repo |
|---|---|---|
| LLaVA-1.5 | 85.91 | 86.288 |
| SeVa-7b-diffu500 | 85.10 | 86.719 |
BTW, I found that the temperature used in SeVa inference is 1.0. But when I evaluate SeVa with temperature=1.0 in the LLaVA repo, I also get 85.10.
Have any comments on these? Thank you very much!
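For context on why the temperature setting matters: temperature only rescales the logits before sampling, so temperature=1.0 samples from the raw softmax distribution, while lower values sharpen it toward greedy decoding. A minimal sketch of this effect (the helper name is hypothetical, not the actual LLaVA/SeVa inference code):

```python
import numpy as np

def apply_temperature(logits, temperature):
    """Scale logits by temperature and return softmax probabilities.

    temperature=1.0 leaves the distribution unchanged; values < 1.0
    sharpen it (approaching greedy decoding as T -> 0), which can
    change which tokens get sampled and thus the benchmark score.
    """
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    probs = np.exp(scaled)
    return probs / probs.sum()

logits = [2.0, 1.0, 0.5]
p_default = apply_temperature(logits, 1.0)  # plain softmax
p_sharp = apply_temperature(logits, 0.1)    # nearly greedy
print(p_default, p_sharp)
```

Since sampling at temperature=1.0 is stochastic, small score differences between repos could also come from the random seed or other decoding flags, not just the temperature itself.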