
Commit 50f8094

Merge pull request #32 from WEHI-ResearchComputing/gh-pages
Initial Docusaurus setup for GH-Pages with all mdx and typescript
2 parents 0be0f4d + 2fee02f commit 50f8094

39 files changed

Lines changed: 21365 additions & 56 deletions

README.md

Lines changed: 120 additions & 56 deletions
# PartiNet 🔬

PartiNet is a three-stage pipeline for automated particle picking in cryo-EM micrographs, combining advanced denoising with state-of-the-art deep learning detection.

- Documentation: https://mihinp.github.io/partinet_documentation/
- Model weights (Hugging Face): https://huggingface.co/MihinP/PartiNet

## Features

- 🧹 Advanced denoising for improved signal-to-noise ratio
- 🎯 Deep learning-based particle detection
- ⚡ Multi-GPU support for faster processing
- 🔄 Seamless integration with RELION workflows
- 📊 Confidence-based particle filtering
- 🖼️ Visual detection validation

## Prerequisites

Before starting, ensure you have:

- Motion-corrected micrographs
- GPU access (recommended)
- A PartiNet installation (see the Installation section below)

## Installation

```bash
git clone git@github.com:WEHI-ResearchComputing/PartiNet.git
cd PartiNet
pip install -e .   # editable install into your (virtual) environment
```

Alternatively, use our containers:

```bash
# Docker (add --gpus all and a -v mount such as /data:/data for GPU runs)
docker run --gpus all -v /data:/data ghcr.io/wehi-researchcomputing/partinet:latest

# Singularity/Apptainer (--nv enables GPUs; -B binds your data paths)
singularity run --nv -B /data oras://ghcr.io/wehi-researchcomputing/partinet:latest
```

Download the model weights from Hugging Face:

```bash
pip install huggingface_hub
python -c "from huggingface_hub import hf_hub_download; hf_hub_download(repo_id='MihinP/PartiNet', filename='best.pt', repo_type='model')"
```

## Directory Structure

```
project_directory/
├── motion_corrected/          # 📁 Input micrographs
├── denoised/                  # 🧹 Denoised outputs
├── exp/                       # 🎯 Detection results
│   ├── labels/                # 📋 Coordinates
│   └── ...                    # 🖼️ Visualizations
└── partinet_particles.star    # ⭐ Final output
```
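Only `motion_corrected/` needs to exist before you start; the other entries in the tree are created by the pipeline stages themselves. A tiny pre-flight check (the `missing_inputs` helper is hypothetical, not part of PartiNet):

```python
from pathlib import Path

def missing_inputs(project):
    """Report required input directories that don't exist yet.
    Only motion_corrected/ is needed up front; denoised/, exp/ and the
    STAR file are produced by the denoise, detect and star stages."""
    required = ["motion_corrected"]
    return [d for d in required if not (Path(project) / d).is_dir()]

print(missing_inputs("."))
```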

## Pipeline Stages

### 1. Denoise

```bash
partinet denoise \
    --source /data/my_project/motion_corrected \
    --project /data/my_project
```

### 2. Detect

```bash
partinet detect \
    --weight /path/to/model_weights.pt \
    --source /data/my_project/denoised \
    --device 0,1,2,3 \
    --project /data/my_project
```

### 3. Generate STAR File

```bash
partinet star \
    --labels /data/my_project/exp/labels \
    --images /data/my_project/denoised \
    --output /data/my_project/partinet_particles.star \
    --conf 0.1
```
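The three stages can also be chained from a driver script. A minimal sketch, with paths taken from the examples above (the `stage_commands` helper is hypothetical):

```python
def stage_commands(project, weights, conf=0.1):
    """Build the argv list for each pipeline stage, in run order.
    Paths follow the project layout shown above."""
    return [
        ["partinet", "denoise",
         "--source", f"{project}/motion_corrected",
         "--project", project],
        ["partinet", "detect",
         "--weight", weights,
         "--source", f"{project}/denoised",
         "--project", project],
        ["partinet", "star",
         "--labels", f"{project}/exp/labels",
         "--images", f"{project}/denoised",
         "--output", f"{project}/partinet_particles.star",
         "--conf", str(conf)],
    ]

# Preview the commands; swap print for subprocess.run(cmd, check=True) to execute.
for cmd in stage_commands("/data/my_project", "/path/to/model_weights.pt"):
    print(" ".join(cmd))
```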

## Key Parameters

### Detection

- `--backbone-detector`: Choice of neural network architecture
- `--weight`: Path to model weights
- `--conf-thres`: Detection confidence threshold
- `--iou-thres`: Overlap filtering (IoU) threshold
- `--device`: GPU device selection

### STAR Generation

- `--conf`: Confidence threshold for particle filtering
- `--output`: Path for final STAR file
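The two detection thresholds work together: boxes below `--conf-thres` are dropped, then boxes that overlap a stronger detection by more than `--iou-thres` are suppressed. A minimal sketch of this standard confidence plus non-maximum-suppression logic (illustrative only, not PartiNet's actual implementation):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def filter_detections(dets, conf_thres=0.25, iou_thres=0.45):
    """dets: (x1, y1, x2, y2, conf) tuples. Drop low-confidence boxes,
    then greedily suppress overlaps, strongest box first (NMS)."""
    dets = sorted((d for d in dets if d[4] >= conf_thres),
                  key=lambda d: d[4], reverse=True)
    kept = []
    for d in dets:
        if all(iou(d[:4], k[:4]) < iou_thres for k in kept):
            kept.append(d)
    return kept

# Two heavily overlapping candidates and one weak one:
dets = [(10, 10, 50, 50, 0.9), (12, 12, 52, 52, 0.8), (100, 100, 140, 140, 0.1)]
print(filter_detections(dets))  # only the 0.9 box survives
```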

## Output Files

1. **Denoised micrographs** (`denoised/*.mrc`)
   - Cleaned micrographs with improved SNR

2. **Detection results** (`exp/`)
   - `labels/*.txt`: Particle coordinates
   - `*.png`: Visualization overlays

3. **STAR file** (`partinet_particles.star`)
   - Ready for RELION processing
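The STAR output is a plain-text table. As a rough sketch of what a minimal RELION-style particle STAR file looks like (the column names are standard RELION labels, but PartiNet's exact columns are an assumption here):

```python
def write_star(particles, path):
    """Write a minimal particles STAR file.
    particles: list of (micrograph_name, x, y) tuples."""
    lines = [
        "data_particles",
        "",
        "loop_",
        "_rlnMicrographName #1",
        "_rlnCoordinateX #2",
        "_rlnCoordinateY #3",
    ]
    for name, x, y in particles:
        lines.append(f"{name} {x:.1f} {y:.1f}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_star([("micrograph1.mrc", 512.0, 1024.0)], "particles.star")
```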

## Advanced Usage

For detailed information about specific commands:

```bash
partinet --help
partinet <command> --help
```

Available commands:

- `denoise`: Clean input micrographs
- `detect`: Identify particles
- `star`: Generate STAR files
- `train`: Train custom models (step1/step2)
- `test`: Evaluate model performance
## Troubleshooting

- **GPU issues**
  - Verify GPU availability: `nvidia-smi`
  - Check CUDA installation
  - Ensure proper device selection

- **Path issues**
  - Verify directory permissions
  - Check mount points in container setups
  - Use absolute paths

## Contributing

We welcome contributions! Please see our [Contributing Guidelines](CONTRIBUTING.md) for details. Tests and CI pipelines live in `.github/workflows/`.

## License

This project is licensed under the terms of the LICENSE file included in the repository.

## Citation

If you use PartiNet in your research, please cite:

```
Citation information will be added upon publication
```

## Support

For issues and questions:

- Open an [Issue](https://github.com/WEHI-ResearchComputing/PartiNet/issues)
- Check existing [Discussions](https://github.com/WEHI-ResearchComputing/PartiNet/discussions)

docs/.gitignore

Lines changed: 20 additions & 0 deletions
```
# Dependencies
/node_modules

# Production
/build

# Generated files
.docusaurus
.cache-loader

# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
```

docs/docs/getting-started.md

Lines changed: 135 additions & 0 deletions
---
sidebar_position: 3
---

# Getting Started

This guide walks you through your first PartiNet analysis using the three-stage pipeline. We'll process cryo-EM micrographs from start to finish.

## Prerequisites

Before starting, ensure you have:

- PartiNet installed (see [Installation](installation.md))
- Motion-corrected micrographs in a source directory
- A project directory where outputs will be saved
- GPU access for optimal performance

## Directory Structure

PartiNet expects and creates the following directory structure:

```
project_directory/
├── motion_corrected/           # 📁 Your input micrographs
│   ├── micrograph1.mrc
│   ├── micrograph2.mrc
│   └── ...
├── denoised/                   # 🧹 Created by denoise stage
│   ├── micrograph1.mrc
│   ├── micrograph2.mrc
│   └── ...
├── exp/                        # 🎯 Created by detect stage
│   ├── labels/                 # 📋 Detection coordinates
│   │   ├── micrograph1.txt
│   │   ├── micrograph2.txt
│   │   └── ...
│   ├── micrograph1.png         # 🖼️ Micrographs with detections drawn
│   ├── micrograph2.png
│   └── ...
└── partinet_particles.star     # ⭐ Final STAR file (created by star stage)
```

**Pipeline Flow:**

1. **Input** → `motion_corrected/` (your micrographs)
2. **Stage 1** → `denoised/` (cleaned micrographs)
3. **Stage 2** → `exp*/` (detections + visualizations)
4. **Stage 3** → `*.star` (final particle coordinates)

## Stage 1: Denoise

The first stage removes noise from your micrographs and improves signal-to-noise ratios:

<div class="container-tabs">

```shell title="Local Installation"
partinet denoise \
    --source /data/my_project/motion_corrected \
    --project /data/my_project
```

</div>

**What this does:**

- Reads micrographs from the `motion_corrected/` directory
- Applies denoising algorithms
- Saves cleaned micrographs to the `denoised/` directory in your project folder

## Stage 2: Detect

The detection stage identifies particles in your denoised micrographs:

<div class="container-tabs">

```shell title="Local Installation"
partinet detect \
    --weight /path/to/downloaded/model_weights.pt \
    --source /data/my_project/denoised \
    --device 0,1,2,3 \
    --project /data/my_project
```

</div>

**What this creates:**

- An `exp/` directory in your project folder
- An `exp/labels/` directory containing detection coordinates for each micrograph
- Micrographs with detection boxes drawn on top (saved in `exp/`)

**Key parameters:**

- `--backbone-detector`: Neural network architecture to use
- `--weight`: Path to trained model weights
- `--conf-thres`: Confidence threshold for detections (0.0 = accept all)
- `--iou-thres`: Intersection over Union threshold for filtering overlapping detections
- `--device`: GPU devices to use (`0,1,2,3` = use 4 GPUs)

## Stage 3: Star

The final stage converts detections to STAR format and applies confidence filtering:

<div class="container-tabs">

```shell title="Local Installation"
partinet star \
    --labels /data/my_project/exp/labels \
    --images /data/my_project/denoised \
    --output /data/my_project/partinet_particles.star \
    --conf 0.1
```

</div>

**What this does:**

- Reads detection labels from `exp/labels/`
- Filters particles based on the confidence threshold (0.1 in this example)
- Creates a STAR file ready for further processing in RELION or other software

## Output Files

After running all three stages, you'll have:

1. **Denoised micrographs** (`denoised/`) - Cleaned input for particle detection
2. **Detection visualizations** (`exp/*.png`) - Micrographs with particle boxes drawn
3. **Detection coordinates** (`exp/labels/*.txt`) - Raw detection data
4. **STAR file** (`*.star`) - Final particle coordinates ready for downstream processing

## Next Steps

- Learn more about the individual stages: [Denoise](stages/denoise.md), [Detect](stages/detect.md), [STAR](stages/star.md)

## Troubleshooting

If you encounter issues:

- Ensure all paths exist and are accessible
- Check GPU availability with `nvidia-smi`
- Verify that container mounting with `-B` flags includes all necessary paths
