NobodyWho


NobodyWho is a library that lets you run LLMs locally and efficiently on any device.

We currently support Python and Godot, with more integrations on the way.

At a Glance

  • πŸƒ Run any LLM locally, offline, for free
  • βš’οΈ Fast, simple tool calling - just pass a normal function
  • πŸ‘Œ Guaranteed perfect tool calling every time, automatically derives a grammar from your function signature
  • πŸ—¨οΈ Conversation-aware preemptive context shifting, for lobotomy-free conversations of infinite length
  • πŸ’» Ship optimized native code for multiple platforms: Windows, Linux, macOS, Android
  • ⚑ Super fast inference on GPU powered by Vulkan or Metal
  • πŸ€– Compatible with thousands of pre-trained LLMs - use any LLM in the GGUF format
  • πŸ¦™ Powered by the wonderful llama.cpp
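
The "guaranteed tool calling" point deserves a sketch: a function signature already carries the parameter names and types needed to constrain the model's output. The helper below is purely illustrative (it is not NobodyWho's internal code) and shows how a schema can be read off a signature with the standard `inspect` module:

```python
import inspect
import math

def circle_area(radius: float) -> str:
    """Calculate the area of a circle given its radius."""
    return f"{math.pi * radius ** 2:.2f}"

def param_schema(fn):
    # Map Python type annotations to JSON-schema-style type names.
    # A grammar constraining the model's tool-call output can be
    # generated from a mapping like this, so a generated call always
    # matches the function signature.
    type_names = {float: "number", int: "integer", str: "string", bool: "boolean"}
    sig = inspect.signature(fn)
    return {name: type_names.get(p.annotation, "string")
            for name, p in sig.parameters.items()}

print(param_schema(circle_area))  # {'radius': 'number'}
```

Constraining generation with a grammar derived this way is what makes tool calls well-formed by construction, rather than relying on the model to emit valid arguments.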

Python

Quick Start

Start by installing NobodyWho:

pip install nobodywho

Next, download a model. For a quick start we recommend this one: it is quite small, but it will get the job done.

Then you can try to get a response from the model with the following code snippet:

from nobodywho import Chat
chat = Chat("./path/to/your/model.gguf")
response = chat.ask("Is water wet?").completed()
print(response)

You can also set up a basic chatbot very quickly with the code snippet below:

from nobodywho import Chat, TokenStream
chat = Chat("./path/to/your/model.gguf")
while True:
    prompt = input("Enter your prompt: ")
    response: TokenStream = chat.ask(prompt)
    for token in response:
        print(token, end="", flush=True)
    print()

Tool calling

Once you have a chat up and running, you will likely want to give it access to tools. This is very easy in NobodyWho:

import math
from nobodywho import tool, Chat

@tool(description="Calculates the area of a circle given its radius")
def circle_area(radius: float) -> str:
    area = math.pi * radius ** 2
    return f"Circle with radius {radius} has area {area:.2f}"

chat = Chat("./path/to/your/model.gguf", tools=[circle_area])

Adding tools to your chat as shown above automatically makes them available to the model. There are plenty of things you can do with tools, and many of them are covered in our docs.
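
Before wiring the tool to a model, you can sanity-check its logic: the function body is ordinary Python, so it is easy to verify directly. It is re-declared below without the `@tool` decorator so the check runs without a downloaded model:

```python
import math

def circle_area(radius: float) -> str:
    # Same body as the decorated tool above.
    area = math.pi * radius ** 2
    return f"Circle with radius {radius} has area {area:.2f}"

print(circle_area(2.0))  # Circle with radius 2.0 has area 12.57
```

With the tool registered as in the snippet above, asking the chat for the area of a circle with radius 2 lets the model invoke `circle_area` and use its result in the reply.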

Godot

You can install it from inside the Godot editor: In Godot 4.5+, go to AssetLib and search for "NobodyWho".

...or you can grab a specific version from our GitHub releases page. You can install these zip files by going to the "AssetLib" tab in Godot and selecting "Import".

Make sure that the "Ignore asset root" option is set in the import dialog.

For further instructions on how to set up NobodyWho in Godot, please refer to our docs.

Documentation

The documentation has everything you might want to know: https://docs.nobodywho.ooo/

How to Help

  • ⭐ Star the repo and spread the word about NobodyWho!
  • Join our Discord or Matrix communities
  • Found a bug? Open an issue!
  • Submit your own PR - contributions welcome
  • Help improve docs or write tutorials

Can I export to HTML5 or iOS?

Currently, only Linux, macOS, Android, and Windows are supported platforms.

iOS exports seem very feasible. See issue #114.

Web exports will be a bit trickier to get right. See issue #111.

Licensing

There has been some confusion about the licensing terms of NobodyWho. To clarify:

Linking two programs or linking an existing software with your own work does not – at least under European law – produce a derivative or extend the coverage of the linked software licence to your own work. [1]

You are allowed to use this plugin in proprietary and commercial projects, free of charge.

If you distribute modified versions of the code in this repo, you must open source those changes.

Feel free to make proprietary projects using NobodyWho, but don't make a proprietary fork of NobodyWho.
