The LLM Inference Plugin provides simplified access to offline large language model (LLM) chat capabilities for mobile games.
It includes samples and guides on building your own purpose-driven AI chatbot for in-game characters and general chat, along with guides on using the extensive and growing list of open-source LLM models and LoRAs. The plugin is simple to use and suitable for newcomers to large language models.
The plugin implements a GameInstance Subsystem that exposes configuration options for an on-device LLM inference runtime.