Public Alpha

Introduction to LLMs

Learn the basics of large language models (LLMs) and how MUI Chat uses them to generate code for your projects.

What is an LLM?

LLM stands for large language model. MUI Chat generates code using several different LLMs that provide robust programming and web development capabilities.

An LLM works like a very capable autocomplete. It doesn't 'know' information as such; instead, based on the patterns it detected across its training data, it predicts the text most likely to come next. To learn more about how this works, watch this video.
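As a toy illustration of pattern-based next-word prediction, here's a minimal bigram "autocomplete" sketch in Python. This is not how a production LLM works (real models use neural networks trained on vast corpora); the corpus and function names here are invented purely for illustration:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    words = text.split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def autocomplete(following, word, length=5):
    """Greedily extend `word` with its most frequent follower."""
    out = [word]
    for _ in range(length):
        if word not in following:
            break
        word = following[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

# Tiny invented corpus; the model can only reproduce patterns it has seen.
corpus = "the model predicts the next word and the next word follows the pattern"
model = train_bigrams(corpus)
print(autocomplete(model, "the", length=3))
```

The model never "understands" the sentence; it just continues with whatever most often followed each word during training, which is the core intuition behind generation at a much larger scale.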

In the case of MUI Chat, the LLM helps you generate code that matches your prompt, drawing on the similar applications and specifications it was trained on.

Prompts

A prompt is a message you send to the LLM. How you write your prompts is key to getting good results. Refer to our effective prompting guide for more guidance.

Context

Context refers to all the information about the project that's available to the AI: your prompts, previous responses, and the existing code. The context window is the amount of text that an LLM can process at once.

MUI Chat provides a large context window, but you still can't rely on the LLM remembering your whole conversation. There's a tradeoff here, as larger context windows consume tokens faster, which increases costs.

One way to preserve context while keeping the context window small is to get the LLM to summarize the conversation so far, copy that summary, then reset the context window and paste the summary.
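That summarize-and-reset workflow can be sketched in Python. The `summarize` function below is a placeholder for the step where you ask the LLM itself for a summary; everything else (names, messages) is invented for illustration:

```python
def summarize(messages):
    """Stand-in for asking the LLM for a summary; a real workflow would send
    a 'summarize this conversation' prompt and use the model's reply."""
    return "Summary: " + "; ".join(m.split(":", 1)[-1].strip() for m in messages)

def reset_with_summary(messages):
    """Replace the whole history with a single summary message,
    freeing almost the entire context window."""
    return [summarize(messages)]

history = [
    "User: scaffold a login form",
    "Assistant: generated LoginForm component",
    "User: add validation",
]
print(reset_with_summary(history))
```

The new history is one short message instead of the full transcript, so subsequent prompts start from a nearly empty window while still carrying the gist of what came before.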

MUI Chat credits system

MUI Chat usage is billed in credits. Each successful reply costs 1 credit, so with 30 credits you can get 30 responses from MUI Chat.