
ModelNotFoundError: Cannot find model record in appConfig for DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC #663

Open
jvjmarinello opened this issue Feb 1, 2025 · 0 comments


jvjmarinello commented Feb 1, 2025

I upgraded my npm package to the latest version (@mlc-ai/[email protected]) to try the new DeepSeek models.

Everything works as expected when I load older models, like Llama 3.2. However, when I try to load R1, I get an error. I'm not sure whether this is a bug or an issue in my config. Please advise.

How I load the model:

```javascript
const modelID = "DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC";
const engine = await webllm.CreateServiceWorkerMLCEngine(modelID, {
  initProgressCallback: (report) => console.log(report.text),
});
```

Error:

```
Error loading model DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC: ModelNotFoundError: Cannot find model record in appConfig for DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC. Please check if the model ID is correct and included in the model_list configuration.
```
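For context, the failure mode the error message describes can be sketched as follows: web-llm resolves the requested model ID against `appConfig.model_list` (defaulting to its bundled `prebuiltAppConfig`) and throws `ModelNotFoundError` when no record matches. The snippet below is a minimal sketch with a hypothetical stand-in config, not web-llm's actual source; the `Llama-3.2-1B-Instruct-q4f16_1-MLC` record is an assumption used only for illustration.

```javascript
// Minimal sketch of the model-record lookup that fails here.
// web-llm searches appConfig.model_list for a record whose model_id
// matches the requested ID; no match -> ModelNotFoundError.
function findModelRecord(appConfig, modelId) {
  return appConfig.model_list.find((record) => record.model_id === modelId);
}

// Hypothetical stand-in for a prebuilt config whose release predates
// the DeepSeek records (the real object is webllm.prebuiltAppConfig).
const appConfig = {
  model_list: [{ model_id: "Llama-3.2-1B-Instruct-q4f16_1-MLC" }],
};

console.log(
  findModelRecord(appConfig, "DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC")
); // → undefined, which is what triggers the error
```

If the installed release's `prebuiltAppConfig` genuinely lacks the DeepSeek record, one possible workaround is to pass a custom `appConfig` in the engine config, e.g. `webllm.CreateServiceWorkerMLCEngine(modelID, { appConfig, initProgressCallback })`, after appending your own record to `model_list`; any record you add must point at a real compiled model artifact (`model` and `model_lib` URLs) or loading will fail later for a different reason.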
