This AI agent is a fully private, locally running personal assistant that helps users manage and understand their digital workspace without relying on external services. Its core aim is to provide a free, secure alternative to cloud-based tools by enabling intelligent operations on local documents (such as searching for information, summarizing files or entire folders, classifying and organizing content, and ranking documents by relevance) while also augmenting its capabilities with internet-based research. By combining local data access with AI-driven reasoning, it lets users efficiently navigate and act on their information while keeping full control over their data.
Domo is your personal assistant on your computer.

Start the app with:

```sh
./run_app.sh
```

Run the job workflow directly with:

```sh
.venv/bin/python -m tools.job.main
```

Process a single local job folder with:

```sh
.venv/bin/python -m tools.job.main data/jobs/my-job
```

Notes:
- Open the local URL shown by Streamlit, typically `http://localhost:8051`.
- The Streamlit assistant now uses a session-scoped chat UI with three panes:
  - chat for clarification and confirmation
  - a context panel showing retained parameter values, their source, and their status
  - an activity panel showing agent decisions and workflow progress, with raw workflow output available per run
- Domo always asks for confirmation in chat before executing a workflow.
- This repo must be run with the local `.venv`, not a global Streamlit install. `./run_app.sh` uses `.venv/bin/streamlit` explicitly, which avoids `ModuleNotFoundError` from Anaconda/global Python.
- If you prefer the raw command, use `.venv/bin/streamlit run app/streamlit_app.py --server.port 8051`.
- Use the `streamlit` CLI to run the app, not `python app/streamlit_app.py`.
- Start Ollama on `http://localhost:11434` before using the assistant or the job workflow.
- `python -m tools.job.main` with no argument runs the default ATS search using `config.yaml`.
- `python -m tools.job.main <folder>` accepts a folder containing `job_description_raw.txt`, `cleaned_job_description.txt`, `job_description.txt`, `job description.txt`, `job_description.pdf`, or `job description.pdf`.
- You can also pass a company-name-style hint like `checkr`; the workflow will search `data/jobs/` and pick the closest dated matching folder.
- Generated artifacts are written under the configured outputs folder, which defaults to `data/outputs/`.
- Main application settings now live in `config.yaml`, including paths, debug mode, Ollama settings, and job-search parameters.
- In the assistant, `run_job_agent` is the online search workflow and `create_job_files` is the local-folder workflow for generating output files from an existing job folder or company-name hint.
- The assistant can also override `job_search.role`, `job_search.location`, `job_search.ignore_location`, and `job_search.remote_only` for a single run based on your prompt.
- The allowed per-prompt overrides are listed explicitly in `config.yaml` under `prompt_overrides`; currently only `run_job_agent` has overrideable fields, while `create_job_files` and `match_cv` are empty.
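As a rough illustration, the `prompt_overrides` allowlist described above might be laid out in `config.yaml` like this (the key layout is an assumption for illustration; the actual file in the repo is authoritative):

```yaml
# Illustrative sketch only; the exact structure in the repo may differ.
prompt_overrides:
  run_job_agent:            # only this workflow currently has overrideable fields
    - job_search.role
    - job_search.location
    - job_search.ignore_location
    - job_search.remote_only
  create_job_files: []      # no per-prompt overrides allowed
  match_cv: []              # no per-prompt overrides allowed
```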
domo/
├── README.md
├── config.yaml
├── pyproject.toml
├── requirements.txt
├── run_app.sh
├── app/
│ └── streamlit_app.py
├── assistant/
│ ├── audit.py
│ ├── domo_agent.py
│ ├── policy.py
│ ├── registry.py
│ ├── router.py
│ ├── runtime.py
│ └── schemas.py
├── integrations/
│ └── ollama_client.py
├── img/
│ ├── domo_blue.webp
│ └── domo_yellow.webp
├── tools/
│ ├── job/
│ │ ├── __init__.py
│ │ ├── clean_job_description.py
│ │ ├── create_job_files.py
│ │ ├── discover_jobs.py
│ │ ├── export_job_pdf.py
│ │ ├── filesystem.py
│ │ ├── generate_application_materials.py
│ │ ├── main.py
│ │ ├── models.py
│ │ ├── prompts.py
│ │ └── run_job_agent.py
└── data/
├── jobs/
├── logs/
└── outputs/
- `assistant/runtime.py` now manages a conversational session, retains structured context values, prepares confirmations, and records structured activity events before dispatching tools.
- `assistant/policy.py` validates tool arguments and restricts job folder paths to the project data roots.
- `tools/job/run_job_agent.py` launches the online job-search workflow through the project `.venv`.
- `tools/job/create_job_files.py` launches the local job-folder workflow through the same processor.
- `tools/job/main.py` supports the following modes:
  - no argument: search ATS sources using the parameters in `config.yaml`
  - folder containing `job_description_raw.txt`: clean the raw job ad, generate a PDF, and generate application materials
  - folder containing `cleaned_job_description.txt`: skip cleaning and generate the remaining outputs from the cleaned text
  - folder containing `job_description.txt` or `job description.txt`: normalize the text into `job_description_raw.txt`, create `job_metadata.json` if needed, then continue through the normal flow
  - folder containing `job_description.pdf` or `job description.pdf`: extract the PDF text into `job_description_raw.txt`, create `job_metadata.json` if needed, then continue through the normal flow
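The folder-mode selection above can be sketched as a simple priority check. This is an illustrative sketch, not the repo's actual code: the file names come from the list above, but the function name, return labels, and the precedence of `cleaned_job_description.txt` over `job_description_raw.txt` are assumptions.

```python
from pathlib import Path

# Input files checked in priority order; names taken from the modes listed above.
CLEANED = "cleaned_job_description.txt"
RAW = "job_description_raw.txt"
TEXT_VARIANTS = ("job_description.txt", "job description.txt")
PDF_VARIANTS = ("job_description.pdf", "job description.pdf")

def detect_mode(folder: Path) -> str:
    """Return which processing mode a local job folder would trigger (sketch)."""
    if (folder / CLEANED).is_file():
        return "generate-from-cleaned"   # skip cleaning, generate remaining outputs
    if (folder / RAW).is_file():
        return "clean-then-generate"     # clean the raw ad, then generate outputs
    if any((folder / name).is_file() for name in TEXT_VARIANTS):
        return "normalize-text"          # copy text into job_description_raw.txt first
    if any((folder / name).is_file() for name in PDF_VARIANTS):
        return "extract-pdf"             # extract PDF text into job_description_raw.txt first
    raise FileNotFoundError(f"No job description input found in {folder}")
```

Checking the cleaned file first means a folder that has already been processed once skips straight to output generation instead of re-cleaning the raw ad.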
- Main configuration lives in `config.yaml`, including:
  - `debug.enabled`
  - `paths`
  - `ollama`
  - `job_workflow`
  - `prompt_overrides`
  - `job_search`
- Job search parameters in `config.yaml` include:
  - `role`
  - `location`
  - `ignore_location`
  - `remote_only`
  - `sources`
  - `companies`
  - `max_jobs`
  - `max_results_per_source`
  - `max_company_attempts_per_source`
- Batch search writes discovered job inputs under the configured jobs folder, which defaults to `data/jobs/`.
- Local folder mode can resolve a company-name hint such as `checkr` to the closest dated matching folder under the jobs directory.
- The assistant workflow `create_job_files` uses that local-folder mode directly and writes the six generated job files under the outputs directory.
- Generated artifacts are written under timestamped folders in the configured outputs folder, which defaults to `data/outputs/`.
- Current generated files are:
  - `cleaned_job_description.txt`
  - `job_description.pdf`
  - `application_notes.txt`
  - `summary.txt`
  - `skills.txt`
  - `sample_cv.txt`
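Resolving a company-name hint like `checkr` to the closest dated folder could look like the sketch below. The function name and the folder naming convention (a leading ISO date, e.g. `2025-02-01_checkr`) are assumptions for illustration; the actual matching logic lives in the repo's job tools.

```python
from pathlib import Path

def resolve_job_folder(jobs_dir: Path, hint: str) -> Path:
    """Pick the most recent jobs folder whose name contains the hint (sketch).

    Assumes folder names start with an ISO date, e.g. '2025-02-01_checkr';
    the real naming convention under data/jobs/ may differ.
    """
    matches = [p for p in jobs_dir.iterdir()
               if p.is_dir() and hint.lower() in p.name.lower()]
    if not matches:
        raise FileNotFoundError(f"No folder in {jobs_dir} matches {hint!r}")
    # A leading ISO date sorts lexicographically, so max() picks the newest folder.
    return max(matches, key=lambda p: p.name)
```

With that convention, `resolve_job_folder(Path("data/jobs"), "checkr")` would return the most recently dated `checkr` folder.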
- The workflow can search and prepare documents, but it does not currently submit applications through portals or monitor application status.
This project is licensed under the MIT License.
