What is Local LLM Plugin?

Local LLM Plugin lets you load a large language model (LLM) in GGUF format and run it inside Unreal Engine.

Run locally and within BP/C++

  • Runs offline on a local PC.
  • Just add a single component to your BP and you are ready to use it.
  • No Python or dedicated server is required.

Useful features

  • Works asynchronously: additional questions can be asked at any time while an answer is being generated.
  • Supports saving and loading of "state", which preserves the context of a conversation with an LLM, allowing you to resume a previous conversation later.
  • Supports multibyte characters.