Show HN: LLM-powered NPCs running on your hardware

https://github.com/GigaxGames/gigax

πŸ‘Ÿ Runtime, LLM-powered NPCs

[Demo video: Interactive_NPCs.mp4]

Features

  • πŸ•ΉοΈ NPCs that <speak>, <jump>, <attack> and perform any other action you've defined
  • ⚑ <1 second GPU inference on most machines
  • πŸ€— Open-weights models available, fine-tuned from Llama-3, Phi-3, Mistral, and more
  • πŸ”’ Structured generation with Outlines 〰️ means the output format is always respected (see the sketch at the end of this section)
  • πŸ—„οΈ Coming soon: Local server mode, with language-agnostic API
  • πŸ“œ Available via the API: Runtime quest generation for players and NPCs
  • πŸ˜Άβ€πŸŒ«οΈ Available via the API: Memory creation, storage and retrieval with a Vector DB

Gigax has new releases and features on the way. Make sure to ⭐ star and πŸ‘€ watch this repository!
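
As an illustration of the structured-generation bullet above, here is a minimal sketch of regex-constrained decoding with Outlines (assuming the Outlines 0.x API); the model repo, prompt, and action grammar are illustrative choices, not Gigax's actual prompting setup:

import outlines

# Any Outlines-compatible backend works; the Gigax Phi-3 fine-tune is used
# here purely as an example.
model = outlines.models.transformers("Gigax/NPC-LLM-3_8B")

# Hypothetical action grammar: a command keyword followed by a free-text argument.
action_pattern = r"(speak|jump|attack|greet|say) [A-Za-z !',.]+"
generator = outlines.generate.regex(model, action_pattern)

action = generator(
    "You are John the Brave, a fearless warrior. A stranger draws a sword. Next action:",
    max_tokens=20,
)
print(action)  # guaranteed to match action_pattern, e.g. "attack stranger"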

Usage

Model instantiation

  • We provide various models on the πŸ€— Huggingface hub: NPC-LLM-7B (our Mistral-7B fine-tune), NPC-LLM-3_8B (our Phi-3 fine-tune), and NPC-LLM-3_8B-128k (our Phi-3 128k-context fine-tune)
  • All of these are also available in gguf format to run on CPU with llama_cpp: NPC-LLM-7B-GGUF, NPC-LLM-3_8B-GGUF, and NPC-LLM-3_8B-128k-GGUF
  • Start by instantiating one of them using outlines:

from outlines import models
from gigax.step import NPCStepper
from llama_cpp import Llama
# Download model from the Hub
llm = Llama.from_pretrained(
    repo_id="Gigax/NPC-LLM-3_8B-GGUF",
    filename="npc-llm-3_8B.gguf",
    # n_gpu_layers=-1, # Uncomment to use GPU acceleration
    # n_ctx=2048, # Uncomment to increase the context window
)
model = models.LlamaCpp(llm) 
# Instantiate a stepper: handles prompting + output parsing
stepper = NPCStepper(model=model)
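
If you prefer the full-precision weights over the GGUF build, the same stepper setup can go through Outlines' transformers backend. A sketch, assuming the Outlines 0.x API and a CUDA device (drop the device argument to stay on CPU):

from outlines import models
from gigax.step import NPCStepper

# Full-precision Phi-3 fine-tune from the Hub (GPU recommended)
model = models.transformers("Gigax/NPC-LLM-3_8B", device="cuda")
stepper = NPCStepper(model=model)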

Stepping an NPC

  • From there, stepping an NPC is a one-liner:
action = await stepper.get_action(
    context=context,
    locations=locations,
    NPCs=NPCs,
    protagonist=protagonist,
    items=items,
    events=events,
)
  • We provide classes to instantiate Locations, NPCs, etc.:
from gigax.parse import CharacterAction
from gigax.scene import (
    Character,
    Item,
    Location,
    ProtagonistCharacter,
    Skill,
    ParameterType,
)
# Use sample data
context = "Medieval world"
current_location = Location(name="Old Town", description="A quiet and peaceful town.")
locations = [current_location] # you can add more locations to the scene
NPCs = [
    Character(
        name="John the Brave",
        description="A fearless warrior",
        current_location=current_location,
    )
]
protagonist = ProtagonistCharacter(
    name="Aldren",
    description="Brave and curious",
    current_location=current_location,
    memories=["Saved the village", "Lost a friend"],
    quests=["Find the ancient artifact", "Defeat the evil warlock"],
    skills=[
        Skill(
            name="Attack",
            description="Deliver a powerful blow",
            parameter_types=[ParameterType.character],
        )
    ],
    psychological_profile="Determined and compassionate",
)
items = [Item(name="Sword", description="A sharp blade")]
events = [
    CharacterAction(
        command="Say",
        protagonist=protagonist,
        parameters=[items[0], "What a fine sword!"],
    )
]
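
Putting the pieces together, a minimal way to run a step from a script (a sketch: stepper is the instance created in the model-instantiation section, and the scene objects are the sample data above):

import asyncio

async def main():
    action = await stepper.get_action(
        context=context,
        locations=locations,
        NPCs=NPCs,
        protagonist=protagonist,
        items=items,
        events=events,
    )
    # The returned action is whatever gigax parses from the model output,
    # e.g. an attack or speech command aimed at a character in the scene.
    print(action)

asyncio.run(main())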

API

Contact us to give our NPC API a try - we'll take care of model serving, NPC memory, and more!

Show HN post

Posted by Tantris on April 30, 2024 (24 points, 4 comments):

Hey HN,

My team is open-sourcing the inference stack and fine-tuned models we use to create LLM-powered NPCs: https://github.com/GigaxGames/gigax

The generative agents paper [1] pioneered the idea of prompting LLMs to create autonomous NPCs. But existing implementations require multiple calls to an LLM to make the agent plan its day, chat with people, and interact with its environment [2].

Our approach allows NPCs to be stepped at runtime with a single pass on consumer-grade hardware, with reasonable latency.

To achieve this, we've fine-tuned open-source LLMs [3] to parse a text description of a 3D scene and respond with custom actions like `greet <someone>`, `grab <item>`, or `say <utterance>`. This simple whitespace-separated "function calling" format is less verbose than JSON and thus helps with inference speed. We're also using the Outlines library [4] to force the model to adhere to this format.

We're launching an API, and we're looking for partnerships with studios to integrate our tech into upcoming games. Would love to connect if this is of any interest to you!

Thanks in advance for your feedback :)

[1] https://arxiv.org/abs/2304.03442
[2] https://github.com/joonspk-research/generative_agents/tree/main/reverie/backend_server/persona/prompt_template/v3_ChatGPT
[3] https://huggingface.co/Gigax (Phi-3 fine-tune)
[4] https://github.com/outlines-dev/outlines/tree/main
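
To make the whitespace-separated "function calling" format concrete, here is a small illustrative sketch; the parse_action helper and the example commands are hypothetical, not gigax's actual parser:

def parse_action(raw: str) -> tuple[str, str]:
    """Split a command like 'greet John' or 'say What a fine sword!'
    into (command, argument); the argument is kept verbatim."""
    command, _, argument = raw.strip().partition(" ")
    return command, argument

print(parse_action("greet John"))              # ('greet', 'John')
print(parse_action("say What a fine sword!"))  # ('say', 'What a fine sword!')

# The JSON equivalent, e.g. {"command": "say", "argument": "What a fine sword!"},
# spends extra tokens on braces, quotes, and key names, which is the verbosity
# the post is avoiding.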