This project demonstrates a complete pipeline for generating adaptive, context-sensitive NPC dialogue inside a Unity 2D game, using lightweight LLMs. The system combines gameplay events, persistent NPC memory, personality profiles, and dynamic prompt construction to produce dialogues that change meaningfully across playthroughs.
The goal is to turn simple NPCs into reactive characters influenced by world events, prior interactions, and player-defined personalities, immersing the player more fully in the fictional world.

The GIF illustrates one of the system’s core features: NPC mood evolution based on repeated interactions, which directly affects the generated dialogue.
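To make the mechanic concrete, here is a minimal sketch of interaction-driven mood evolution. It is written in Python for readability (the project itself implements this in C# inside Unity), and all names are illustrative, not the project's actual API.

```python
# Minimal sketch of interaction-driven mood evolution.
# MOODS and update_mood are hypothetical names; the real logic lives in C#.
MOODS = ["hostile", "wary", "neutral", "friendly", "warm"]

def update_mood(mood: str, interaction: str) -> str:
    """Shift the NPC's mood one step up or down after an interaction."""
    delta = {"kind": 1, "neutral": 0, "rude": -1}[interaction]
    i = max(0, min(len(MOODS) - 1, MOODS.index(mood) + delta))
    return MOODS[i]

mood = "neutral"
for action in ["kind", "kind", "rude"]:
    mood = update_mood(mood, action)
# repeated kindness followed by one insult leaves the NPC "friendly"
```

Because the current mood is injected into the prompt, even this one-dimensional scale is enough to make the same NPC greet the player very differently across interactions.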
A carriage and its escort are trapped in a dark, gloomy forest. One wheel is broken and needs urgent repair before they can continue their journey. The player is a mercenary hired by a merchant, alongside another mercenary. The goal is to find a way to repair the carriage and complete the mission.
Beware of those who lurk in the shadows…
https://github.com/user-attachments/assets/32a1a5d0-7c80-4b64-8e25-06338d90514a
If the embedded video does not play, see https://axidix.github.io/RPGdibAI/
Below is the technical diagram summarizing how gameplay logic, memory, and AI inference communicate to produce adaptive interactions.

Each NPC maintains a structured memory including:
Memory allows NPCs to react based on the unfolding narrative and previous interactions with the player instead of generating isolated lines.
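One plausible shape for that structured memory is sketched below. This is a Python illustration with hypothetical field names (the real data lives in C# objects inside Unity), chosen to show the kinds of state the generation step can draw on.

```python
# Hypothetical shape of an NPC's structured memory.
# Field and method names are illustrative, not the project's actual API.
from dataclasses import dataclass, field

@dataclass
class NpcMemory:
    personality: str                  # player-chosen personality line
    mood: str = "neutral"             # evolves with repeated interactions
    world_events: list = field(default_factory=list)      # e.g. "bandit ambush"
    conversation_log: list = field(default_factory=list)  # prior exchanges

    def remember_event(self, event: str) -> None:
        """Record a narrative signal emitted by the game."""
        self.world_events.append(event)

    def remember_line(self, speaker: str, text: str) -> None:
        """Append one line of dialogue for future prompts."""
        self.conversation_log.append((speaker, text))
```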
Dialogue is dynamically generated at runtime:
The output is fully contextual: the same NPC speaks differently depending on personality, the world, and the player’s actions.
At the start of the game, the player chooses a personality line for each NPC.
This creates strong variation across playthroughs. The sets used in the demo were:
Set A:
Set B:
These personalities directly influence every generated line.
The dialogue system is embedded into actual game events:
Game events modify the world state, which then modifies the prompt and, consequently, the generated dialogue.
Handles core gameplay events such as the bandit encounter, the repair station, and world-state updates. It emits narrative signals that the Memory Manager consumes.
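The signal flow between these two components can be sketched as a simple observer pattern. The class and method names below are illustrative Python, not the project's actual C# API.

```python
# Sketch of the event flow: the game manager emits narrative signals,
# and subscribed consumers (here, a memory manager) record them.
# All names are hypothetical.
class GameManager:
    def __init__(self):
        self.listeners = []

    def subscribe(self, listener) -> None:
        self.listeners.append(listener)

    def emit(self, signal: str) -> None:
        """Broadcast a narrative signal to every subscribed consumer."""
        for listener in self.listeners:
            listener.on_signal(signal)

class MemoryManager:
    def __init__(self):
        self.signals = []

    def on_signal(self, signal: str) -> None:
        self.signals.append(signal)

game = GameManager()
memory = MemoryManager()
game.subscribe(memory)
game.emit("bandit_encounter")
game.emit("carriage_wheel_repaired")
# memory.signals now holds both narrative signals, in order
```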
Manages persistent memory for each NPC:
This allows for evolving NPC behavior across the demo.
Builds the context provided to the LLM:
The output is a prompt specific to the exact situation, ensuring context-aware dialogue.
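The assembly step can be illustrated as follows. This Python sketch uses hypothetical parameter names and a generic prompt template; the project's actual prompt format in C# may differ.

```python
# Illustrative prompt assembly from personality, mood, world state,
# and conversation memory. Names and template are hypothetical.
def build_prompt(npc_name, personality, mood, world_events, log, player_line):
    # Keep only the last few exchanges to bound prompt size.
    recent = "\n".join(f"{who}: {text}" for who, text in log[-4:])
    events = ", ".join(world_events) or "nothing notable"
    return (
        f"You are {npc_name}, an NPC in a fantasy RPG.\n"
        f"Personality: {personality}\n"
        f"Current mood: {mood}\n"
        f"Recent world events: {events}\n"
        f"Conversation so far:\n{recent}\n"
        f"Player: {player_line}\n"
        f"{npc_name}:"
    )
```

Truncating the log to the last few exchanges is one simple way to keep the prompt within a lightweight model's context window.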
Sends API requests to the Hugging Face Inference API, receives the generated text, trims unnecessary tokens, and forwards the result to the dialogue system.
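A minimal sketch of such a call and the trimming step is shown below in Python (the game makes the equivalent request from C# inside Unity). `MODEL_ID` and the token are placeholders, and the response handling assumes the standard text-generation output shape, a list containing a `generated_text` field.

```python
# Sketch of a Hugging Face Inference API call with output trimming.
# MODEL_ID and the token are placeholders.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/MODEL_ID"

def generate(prompt: str, token: str) -> str:
    """Send the prompt to the hosted model and return one clean line."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"inputs": prompt,
                         "parameters": {"max_new_tokens": 60}}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        text = json.load(resp)[0]["generated_text"]
    return trim(text, prompt)

def trim(generated: str, prompt: str) -> str:
    """Drop the echoed prompt and stop at the first newline."""
    if generated.startswith(prompt):
        generated = generated[len(prompt):]
    return generated.strip().split("\n")[0]
```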
Displays dialogue in the UI and updates the NPC’s conversation log for future interactions.
Introduce more nuanced emotional states (fear, trust, excitement, suspicion) with temporal smoothing and mood decay.
Replace long conversation logs with:
This will reduce prompt size and improve long-term coherence.
Allow NPCs to react to each other’s dialogue, not only the player’s interactions.
Enrich the world by producing NPC interactions without player action (e.g. NPCs randomly starting conversations with each other, or calling the player over to make a request).
Optional support for:
This would remove the API dependency. However, on-device performance would need to be evaluated.
Leverage AI tools to produce voice adapted to the NPC’s personality and lines, to deepen player immersion.