
@i

@i is an AI agent application with tool calling, subagent execution, long-term memory, and workspace operations.

Core Capabilities

  • Unified multi-provider access: built-in adapters for OpenAI-compatible / Claude-compatible / Gemini-compatible providers.
  • Agent toolchain: built-in file read/write, directory traversal, command execution, web search/fetch, subagent spawn/wait, plan management, skill loading, memory read/write, and scheduled tasks.
  • MCP support: connect to local or remote MCP servers, and search/import configs from the MCP Registry.
  • Skills system: scan local folders to import SKILL.md files, and enable skills per chat.
  • Long-term memory: main process uses better-sqlite3 + sqlite-vec for semantic memory storage and vector retrieval.
  • Tasks and scheduling: plan review, step status management, and scheduled prompt delivery to specific chats.
  • Subagents: spawn background researcher/coder/reviewer-style subagents with isolated execution context, live status updates, and parent-run confirmation bridging.
  • Artifacts / Workspace: each session is bound to an isolated workspace for file browsing and previewing dev services.
  • Telegram bot support: receive Telegram messages and attachments through a gateway, map them into the shared chat runtime, and reply back through the same unified agent pipeline.
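The toolchain above implies a provider-agnostic tool-definition shape that every adapter can translate into its provider's format. A minimal sketch of that idea, with hypothetical names (`ToolDefinition`, `dispatchToolCall`) that are not the project's actual types:

```typescript
// Hypothetical sketch of a provider-agnostic tool registry and dispatcher.
// All names here are illustrative, not taken from the repository.

type ToolDefinition = {
  name: string;
  description: string;
  // JSON-Schema-like parameter spec, the shape most providers expect
  parameters: Record<string, unknown>;
  execute: (args: Record<string, unknown>) => Promise<string>;
};

const tools: Map<string, ToolDefinition> = new Map();

function registerTool(tool: ToolDefinition): void {
  tools.set(tool.name, tool);
}

// Tool errors are returned as strings so the model can see and recover
// from them, rather than crashing the agent loop.
async function dispatchToolCall(
  name: string,
  args: Record<string, unknown>,
): Promise<string> {
  const tool = tools.get(name);
  if (!tool) return `error: unknown tool "${name}"`;
  try {
    return await tool.execute(args);
  } catch (err) {
    return `error: ${(err as Error).message}`;
  }
}

// Example registration: a trivial "echo" tool
registerTool({
  name: "echo",
  description: "Echo the input text back",
  parameters: { type: "object", properties: { text: { type: "string" } } },
  execute: async (args) => String(args.text ?? ""),
});
```

Keeping `execute` on the definition lets the same registry back both built-in tools and MCP-imported ones behind one dispatch path.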

Project Structure

src/main          Electron main process, IPC, database, tool execution, scheduler
src/preload       preload bridge
src/renderer/src  React UI, Zustand store, chat and settings screens
src/shared        shared constants, prompts, tool definitions, schema
src/data          built-in provider definitions
resources         bundled resources
docs              design and data-flow docs

Local Development

pnpm install
pnpm dev

Architecture

The app follows a clear split across Electron's three layers:

  1. renderer handles UI, state, and event-driven streaming rendering.
  2. preload exposes a controlled Electron API to the renderer.
  3. main owns the database, model requests, tool execution, subagent runtime, MCP connections, memory retrieval, scheduling, and host adapters such as Telegram.

On submission, the renderer triggers MainChatSubmitService via IPC. The main process builds system prompts, skill prompts, message context, and tool definitions, then sends a unified model request. Streaming output is parsed into text segments, tool calls, and tool results, and pushed back to the UI. When needed, the main agent can also spawn background subagents that run with their own runtime context and report status/results back into the same chat flow.
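The streaming step described above can be sketched as a small fold that turns raw chunks into typed segments for the UI. This is an illustrative shape only; the chunk and segment names are assumptions, not the real MainChatSubmitService types:

```typescript
// Illustrative sketch: fold streamed model chunks into typed UI segments.
// Chunk/Segment shapes are assumptions, not the repository's actual types.

type Chunk =
  | { kind: "text"; delta: string }
  | { kind: "tool_call"; name: string; args: string }
  | { kind: "tool_result"; name: string; output: string };

type Segment =
  | { type: "text"; content: string }
  | { type: "tool_call"; name: string; args: string }
  | { type: "tool_result"; name: string; output: string };

// Consecutive text deltas are merged into one growing segment so the
// renderer re-renders a single text block instead of many fragments.
function foldChunks(chunks: Chunk[]): Segment[] {
  const segments: Segment[] = [];
  for (const chunk of chunks) {
    if (chunk.kind === "text") {
      const last = segments[segments.length - 1];
      if (last && last.type === "text") {
        last.content += chunk.delta;
      } else {
        segments.push({ type: "text", content: chunk.delta });
      }
    } else if (chunk.kind === "tool_call") {
      segments.push({ type: "tool_call", name: chunk.name, args: chunk.args });
    } else {
      segments.push({ type: "tool_result", name: chunk.name, output: chunk.output });
    }
  }
  return segments;
}
```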

Telegram follows the same main-process path through a host adapter and gateway layer. Incoming Telegram text, commands, and supported attachments are normalized into the shared chat/message model, executed by the same runtime, and formatted back into Telegram replies.
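The normalization step might look like the following sketch. The Telegram fields (`message.chat.id`, `text`, `caption`) follow the Bot API update shape; the output `ChatMessage` fields are hypothetical, not the project's shared model:

```typescript
// Illustrative sketch of normalizing a Telegram update into a shared
// chat-message shape. Output field names are assumptions; input fields
// follow the Telegram Bot API update object.

type TelegramUpdate = {
  message?: {
    chat: { id: number };
    text?: string;
    caption?: string; // text attached to a media message
  };
};

type ChatMessage = {
  chatKey: string; // hypothetical key mapping a Telegram chat to a shared chat
  role: "user";
  content: string;
};

function normalizeUpdate(update: TelegramUpdate): ChatMessage | null {
  const msg = update.message;
  if (!msg) return null;
  const content = msg.text ?? msg.caption ?? "";
  if (!content) return null; // updates with no usable text are skipped here
  return { chatKey: `telegram:${msg.chat.id}`, role: "user", content };
}
```

Once normalized, the message takes the same runtime path as one typed into the desktop UI, which is what keeps the reply pipeline unified.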

Screenshots

Main chat window (screenshot: chat-windows)

Chat sidebar (screenshot: chat-sheet)

Settings section (screenshot: setting-sections)

Task plan bar (screenshot: task-plan-bar)
FAQ

macOS: app cannot be opened after install

sudo xattr -r -d com.apple.quarantine /Applications/at-i.app

Linux: icons not refreshed

sudo gtk-update-icon-cache /usr/share/icons/hicolor
sudo update-icon-caches /usr/share/icons/hicolor
sudo update-desktop-database /usr/share/applications

License

This project is licensed under the GNU General Public License v3.0 or later.

  • SPDX identifier: GPL-3.0-or-later
  • See LICENSE for the full text.
