To use this model you need to have the `@mlc-ai/web-llm` module installed. It can be installed with `npm install -S @mlc-ai/web-llm`. You can see a list of available model records here: https://github.com/mlc-ai/web-llm/blob/main/src/config.ts

Example

```typescript
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";
import { HumanMessage } from "@langchain/core/messages";

// Initialize the ChatWebLLM model with the model record.
const model = new ChatWebLLM({
  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
  chatOptions: {
    temperature: 0.5,
  },
});

// Call the model with a message and await the response.
const response = await model.invoke([
  new HumanMessage({ content: "My name is John." }),
]);
```