Tap elements in your app, add notes, and copy structured output that helps AI coding agents find the exact views you're referring to. Based on the original agentation web implementation.
Add the package via Swift Package Manager:
```swift
dependencies: [
    .package(url: "https://github.com/ertembiyik/swift-agentation.git", from: "1.0.0")
]
```

Or in Xcode: File → Add Package Dependencies → paste the repository URL.
Requirements: iOS 17.0+, Swift 5.9+
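If you are consuming the package from a Package.swift manifest, the product also needs to be listed on your target. A minimal sketch, where "MyApp" is a placeholder target name and "Agentation" is assumed to be the product name exposed by the package:

```swift
// Package.swift (sketch): "MyApp" is a placeholder target name and
// "Agentation" is assumed to be the product name exposed by the package.
.target(
    name: "MyApp",
    dependencies: [
        .product(name: "Agentation", package: "swift-agentation")
    ]
)
```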
Call install() once at app launch to add the floating toolbar:
```swift
#if DEBUG
import Agentation
#endif

@main
struct MyApp: App {
    init() {
        #if DEBUG
        Agentation.shared.install()
        #endif
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```

You can also install into a specific UIWindowScene:

```swift
Agentation.shared.install(in: scene)
```

Agentation manages and configures the capture session through an overlay window with visual controls, but you can also change the defaults or drive the session programmatically via the Agentation class:
```swift
await Agentation.shared.start()     // Begin a capture session
Agentation.shared.pause()           // Pause element detection
await Agentation.shared.resume()    // Resume detection with a fresh snapshot
Agentation.shared.stop()            // End the session

Agentation.shared.copyFeedback()    // Copy annotations to clipboard
Agentation.shared.clearFeedback()   // Clear all annotations

Agentation.shared.showToolbar()
Agentation.shared.hideToolbar()
```

Feedback can be copied as Markdown (default) or JSON:
```swift
Agentation.shared.outputFormat = .markdown // or .json
```

Agentation supports two strategies for discovering elements on screen. Data sources are types conforming to the HierarchyDataSource protocol; you can implement your own if needed.
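Putting the programmatic controls together: a sketch of a debug helper that runs one capture session and copies the results as JSON. The helper name is hypothetical; the Agentation calls and properties are the ones listed above.

```swift
// Hypothetical helper name; the Agentation calls and properties used here
// are the documented ones. Must run in an async context.
func captureAndCopyFeedback() async {
    let agentation = Agentation.shared
    agentation.outputFormat = .json   // copy as JSON instead of Markdown
    await agentation.start()          // begin the capture session

    // ...tap elements and add notes in the running app...

    agentation.copyFeedback()         // annotations land on the clipboard
    agentation.stop()                 // end the session
    agentation.clearFeedback()        // reset for the next run
}
```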
The default data source is AccessibilityHierarchyDataSource. You can change it through the settings screen on the Agentation toolbar, or set the default programmatically:

```swift
Agentation.shared.selectedDataSourceType = .accessibility
```

The accessibility data source uses the iOS accessibility tree. It captures elements that have isAccessibilityElement = true and extracts their accessibilityLabel, accessibilityValue, accessibilityHint, and traits.
This is the recommended source for both SwiftUI and UIKit. Use standard accessibility modifiers to tag your elements.
SwiftUI:
```swift
struct ProfileView: View {
    var body: some View {
        VStack {
            Image(systemName: "person.circle")
                .accessibilityLabel("Profile picture")
            Text("Jane Doe")
            Button("Edit Profile") { }
        }
    }
}
```

UIKit:
```swift
class ProfileViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        avatarImageView.accessibilityLabel = "Profile picture"
        avatarImageView.isAccessibilityElement = true
        editButton.accessibilityLabel = "Edit Profile"
    }
}
```

Example output:
```markdown
**Viewport:** 390×844

## ProfileViewController

### 1. image "Profile picture"
**Location:** ProfileViewController > "Profile picture"
**Frame:** x:165 y:120 w:60 h:60
**Feedback:** Make the avatar larger

### 2. statictext "Jane Doe"
**Location:** ProfileViewController > "Jane Doe"
**Frame:** x:140 y:190 w:110 h:20
**Feedback:** Use bold font

### 3. button "Edit Profile"
**Location:** ProfileViewController > "Edit Profile"
**Frame:** x:100 y:230 w:190 h:44
**Feedback:** Add a loading state
```

The view hierarchy data source walks the UIView tree directly. It captures every leaf view regardless of accessibility settings, making it useful when you need full coverage of views that aren't exposed to VoiceOver.
This source works best with UIKit because it can traverse the entire view tree. Most SwiftUI views don't expose their backing UIViews, so the source cannot identify them without help. Use the .agentationTag() modifier to register SwiftUI views: it creates a frame-based mapping that Agentation looks up during capture.
UIKit:
No setup required — the data source traverses the UIView tree directly and captures all leaf views. You can optionally set accessibilityIdentifier or accessibilityLabel to improve element naming in the output.
```swift
class ProfileViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(avatarImageView)
        view.addSubview(nameLabel)
        view.addSubview(editButton)
    }
}
```

SwiftUI (requires .agentationTag()):
```swift
struct ProfileView: View {
    var body: some View {
        VStack {
            Image(systemName: "person.circle")
                .agentationTag("Avatar")
            Text("Jane Doe")
                .agentationTag("UserName")
            Button("Edit Profile") { }
                .agentationTag("EditButton")
        }
    }
}
```

Paths use [tag] for .agentationTag(), #id for accessibilityIdentifier, and "label" for accessibilityLabel.
Example output:
```markdown
**Viewport:** 390×844

## ProfileViewController

### 1. image "avatarImage"
**Location:** ProfileViewController > #avatarImage
**Frame:** x:165 y:120 w:60 h:60
**Feedback:** Make the avatar larger

### 2. text "Jane Doe"
**Location:** ProfileViewController > "Jane Doe"
**Frame:** x:140 y:190 w:110 h:20
**Feedback:** Use bold font

### 3. button "editProfileButton"
**Location:** ProfileViewController > #editProfileButton
**Frame:** x:100 y:230 w:190 h:44
**Feedback:** Add a loading state
```

For the SwiftUI example with .agentationTag():

```markdown
### 1. image "Avatar"
**Location:** ProfileViewController > [Avatar]
...
```

These properties on Agentation.shared control capture behavior:
| Property | Default | Description |
|---|---|---|
| selectedDataSourceType | .accessibility | Which data source to use (.accessibility or .viewHierarchy). |
| outputFormat | .markdown | Output format when copying feedback (.markdown or .json). |
| includeHiddenElements | false | When enabled, elements with very low alpha or zero size are included in the snapshot. |
| includeSystemViews | false | When enabled, system-provided views (keyboard, status bar internals) are included. |
| experimentalFrameTracking | false | Enables CADisplayLink-based tracking so highlights follow elements that move or animate. |
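A sketch of configuring these at launch: the property names are the documented ones from the table above, while the specific values are just one possible debug setup, not the defaults.

```swift
#if DEBUG
// Values below are illustrative, not the defaults.
func configureAgentation() {
    let agentation = Agentation.shared
    agentation.selectedDataSourceType = .viewHierarchy
    agentation.outputFormat = .json
    agentation.includeHiddenElements = true      // keep near-invisible views in the snapshot
    agentation.includeSystemViews = false        // skip keyboard/status-bar internals
    agentation.experimentalFrameTracking = true  // highlights follow animating elements
}
#endif
```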
- agentation — the original TypeScript/React implementation
- UniversalGlass — cross-version glass material effects used for the toolbar UI