I upgraded my npm package to the latest version (@mlc-ai/[email protected]) to try the new DeepSeek models.
Everything works as expected when I load older models, like Llama 3.2. However, when I try to load R1 I get an error. I'm not sure whether this is a bug or an issue in my config. Please advise.
How I load the model:
import * as webllm from "@mlc-ai/web-llm";

const modelID = "DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC";
const engine = await webllm.CreateServiceWorkerMLCEngine(modelID, {
  initProgressCallback: (report) => console.log(report.text),
});
Error
Error loading model DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC: ModelNotFoundError: Cannot find model record in appConfig for DeepSeek-R1-Distill-Llama-8B-q4f16_1-MLC. Please check if the model ID is correct and included in the model_list configuration.
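As a sanity check, a minimal sketch of how I can inspect which model IDs ship with the installed package (this assumes the DeepSeek record is expected to be present in webllm.prebuiltAppConfig and that I am not passing a custom appConfig that overrides the default list):

import * as webllm from "@mlc-ai/web-llm";

// Log every model ID in the default app config of the installed
// web-llm build, then filter for the DeepSeek distill records.
const ids = webllm.prebuiltAppConfig.model_list.map((m) => m.model_id);
console.log(ids.filter((id) => id.includes("DeepSeek")));

If the ID does not appear in that list, the installed build may not include the DeepSeek records yet, or a custom appConfig passed to the engine may be shadowing the default model_list.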