This repository was archived by the owner on Jun 24, 2024. It is now read-only.

Warning: Bad token in vocab at index xxx #11

@CheatCod

Description

Running cargo run --release -- -m ~/dev/llama.cpp/models/7B/ggml-model-f16.bin -f prompt prints a series of "Warning: Bad token in vocab at index ..." messages.

The path points to a GGML-converted LLaMA model, which I have verified works with llama.cpp.
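For anyone debugging this: a plausible source of such a warning is the vocab-loading step, where each token's raw bytes are converted to a Rust String and rejected if they are not valid UTF-8. The sketch below is hypothetical (the function name load_token and the exact warning path are assumptions, not llama-rs's actual code); it only illustrates how a byte sequence that llama.cpp accepts as raw bytes could trigger a per-index warning in a loader that insists on UTF-8.

```rust
/// Hypothetical sketch: a vocab loader that requires each token's
/// bytes to be valid UTF-8, warning and skipping the token otherwise.
fn load_token(index: usize, bytes: &[u8]) -> Option<String> {
    match String::from_utf8(bytes.to_vec()) {
        Ok(token) => Some(token),
        Err(_) => {
            // A model file whose vocab stores raw (non-UTF-8) bytes
            // would hit this branch once per bad entry.
            eprintln!("Warning: Bad token in vocab at index {}", index);
            None
        }
    }
}

fn main() {
    // A valid UTF-8 token loads cleanly.
    assert_eq!(load_token(0, b"hello"), Some("hello".to_string()));
    // A lone 0xFF byte is never valid UTF-8, so the loader warns and skips it.
    assert_eq!(load_token(1, &[0xFF]), None);
}
```

If the loader works like this, the warnings would indicate a stricter decoding policy than llama.cpp (which treats tokens as byte strings), not necessarily a corrupt model file.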

Metadata

Labels

issue:bug (Something isn't working)
