Agentic Workflow: AI-Assisted Development with Claude Code
Table of Contents
- 1. Introduction
- 2. Architecture Overview
- 3. Core Components
- 4. Complete Workflow
- 5. Scheduling with Cron
- 6. Agentic Patterns
- 7. Team Onboarding Checklist
- 8. Benefits
- 9. Advanced Agent Patterns (sage.el Session Notes - 2026-01-11)
- 10. Lessons Learned: File Restructuring (2026-01-11)
- 11. FreeBSD-Specific Patterns
- 12. References
1. Introduction
This document describes a comprehensive workflow for AI-assisted software development using Claude Code integrated with modern development tooling. The approach emphasizes:
- Repository management with consistent organization
- Task tracking with dependency-aware issue management
- Branch isolation through git worktrees
- Session persistence via terminal multiplexers
- AI assistance with context-aware instructions
The goal is to create a seamless environment where human developers and AI assistants can collaborate effectively across multiple projects.
2. Architecture Overview
┌─────────────────────────────────────────────────────────────────┐
│                     Developer Workstation                       │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────┐       │
│  │    tmux      │    │ Claude Code  │    │    Task      │       │
│  │   sessions   │◄──►│    (AI)      │◄──►│   Tracker    │       │
│  └──────────────┘    └──────────────┘    └──────────────┘       │
│         │                   │                   │               │
│         ▼                   ▼                   ▼               │
│  ┌─────────────────────────────────────────────────────────────┐│
│  │                     Repository Root                         ││
│  │  ┌─────────────────────────────────────────────────────┐    ││
│  │  │ project/                                            │    ││
│  │  │ ├── .git/              # Version control            │    ││
│  │  │ ├── .tasks/            # Task database              │    ││
│  │  │ ├── CLAUDE.md          # AI instructions            │    ││
│  │  │ ├── worktrees/         # Branch checkouts           │    ││
│  │  │ │   ├── feature-auth/  # Feature branch             │    ││
│  │  │ │   ├── fix-login/     # Bugfix branch              │    ││
│  │  │ │   └── refactor-api/  # Refactor branch            │    ││
│  │  │ └── (main checkout)                                 │    ││
│  │  └─────────────────────────────────────────────────────┘    ││
│  └─────────────────────────────────────────────────────────────┘│
│                              │                                  │
│                              ▼                                  │
│  ┌─────────────────────────────────────────────────────────────┐│
│  │                      Scheduled Sync                         ││
│  │                     (cron every 4h)                         ││
│  │  ├── Pull all repos                                         ││
│  │  └── Create/update worktrees                                ││
│  └─────────────────────────────────────────────────────────────┘│
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
3. Core Components
3.1. Repository Management
Use a consistent directory structure for all repositories:
~/repos/
├── github.com/
│ ├── org-a/
│ │ ├── project-1/
│ │ └── project-2/
│ └── org-b/
│ └── project-3/
└── gitlab.com/
└── ...
3.1.1. Sync Script
Create a script to sync all repositories with push access:
```bash
#!/usr/bin/env bash
# sync.sh - Parallel sync of all repos with commit access
# Note: bash (not plain sh) is required for `export -f` and `local`.

# Configuration
JOBS="${JOBS:-$(nproc)}"
REPO_ROOT="${REPO_ROOT:-$HOME/repos}"
MAX_RETRIES=3
RETRY_DELAY=30

echo "Syncing repositories with $JOBS parallel jobs..."

# Rate-limiting helper with linearly increasing backoff
sync_with_retry() {
    local repo="$1"
    local attempt=1
    local repo_path="$REPO_ROOT/github.com/$repo"
    while [ "$attempt" -le "$MAX_RETRIES" ]; do
        if [ -d "$repo_path" ]; then
            git -C "$repo_path" pull --ff-only && return 0
        else
            git clone "https://github.com/$repo" "$repo_path" && return 0
        fi
        echo "Retry $attempt/$MAX_RETRIES for $repo (rate limit?)"
        sleep $((RETRY_DELAY * attempt))
        attempt=$((attempt + 1))
    done
    return 1
}
export -f sync_with_retry
export REPO_ROOT MAX_RETRIES RETRY_DELAY

# Get repos with push access, sync in parallel
gh repo list --limit 1000 --json nameWithOwner,viewerPermission \
    --jq '.[] | select(.viewerPermission == "ADMIN" or .viewerPermission == "WRITE") | .nameWithOwner' |
    parallel -j "$JOBS" 'sync_with_retry {}'

echo "Sync complete!"
```
3.1.2. SSH vs HTTPS Authentication
By default, the sync script uses HTTPS. For organizations with SSO or if you prefer SSH:
```sh
# Configure git to use SSH instead of HTTPS
git config --global url."git@github.com:".insteadOf "https://github.com/"

# Ensure SSH agent is running with your key
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519

# For organizations with SSO, authorize your SSH key:
# https://github.com/settings/keys → Configure SSO
```
Note: never store credentials in ~/.netrc; use gh auth or SSH keys instead.
3.2. Git Worktrees
Worktrees allow working on multiple branches simultaneously without stashing or switching contexts.
3.2.1. Directory Structure
project/
├── .git/                  # Shared git database
├── worktrees/             # All branch checkouts
│   ├── feature-auth/      # origin/feature/auth
│   ├── fix-login-bug/     # origin/fix/login-bug
│   └── release-v2/        # origin/release/v2
└── (main checkout)        # Default branch
3.2.2. Common Commands
```sh
# List all worktrees
git worktree list

# Create worktree for existing remote branch
git worktree add worktrees/feature-x origin/feature/x

# Create worktree with new branch
git worktree add -b feature/new worktrees/feature-new

# Remove worktree
git worktree remove worktrees/feature-x

# Clean up stale references
git worktree prune
```
3.2.3. Worktree Locking
Git worktrees have built-in locking to prevent concurrent modifications:
```sh
# Lock a worktree (prevents accidental removal)
git worktree lock worktrees/feature-x --reason "Long-running experiment"

# Check lock status
git worktree list --porcelain | grep -A2 "worktree"

# Unlock when done
git worktree unlock worktrees/feature-x
```
Warning: Avoid working in the same worktree from multiple terminals simultaneously. If you need concurrent access, create separate worktrees for each terminal session.
3.2.4. Scale Considerations
At scale (400+ repos × 5 branches = 2000+ worktrees), monitor resource usage:
```sh
# Check total worktree disk usage
du -sh ~/repos/github.com/*/*/worktrees 2>/dev/null | sort -h | tail -20

# Count total worktrees
find ~/repos -type d -name worktrees -exec ls -1 {} \; 2>/dev/null | wc -l

# Prune all stale worktrees across repos
find ~/repos -name .git -type d -execdir git worktree prune \;
```
Recommendations for large installations:
- Set up incremental sync (only repos with recent activity)
- Create priority tiers (critical repos sync hourly, others every 4h)
- Archive inactive branches older than 90 days
- Monitor inode usage with df -i
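The incremental-sync and priority-tier ideas can be sketched as a small helper that checks how recently a repo was fetched. This is a sketch, assuming the mtime of .git/FETCH_HEAD is a good-enough proxy for "last sync"; the tier patterns and thresholds are illustrative:

```shell
#!/bin/sh
# needs_sync: true (exit 0) if the repo's last fetch is older than
# the given age in minutes, or if it has never been fetched.
needs_sync() {
    repo_path="$1"; max_age_min="$2"
    marker="$repo_path/.git/FETCH_HEAD"
    # Never fetched (or not a repo yet): sync it
    [ -f "$marker" ] || return 0
    # find -mmin +N matches files modified more than N minutes ago
    [ -n "$(find "$marker" -mmin +"$max_age_min" 2>/dev/null)" ]
}

# Example tiering: critical org hourly, everything else every 4 hours
sync_tier() {
    case "$1" in
        */org-a/*) needs_sync "$1" 60 ;;    # critical tier: 1h
        *)         needs_sync "$1" 240 ;;   # default tier: 4h
    esac
}
```

The sync script's repo loop can then skip any repo for which sync_tier returns false, so quiet repos cost nothing per run.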
3.2.5. Automated Worktree Creation
Add to your sync script:
```bash
# For each repo, create worktrees for all remote branches
create_worktrees() {
    local repo_path="$1"
    local worktrees_dir="$repo_path/worktrees"

    mkdir -p "$worktrees_dir"
    cd "$repo_path" || return 1

    # Get default branch
    default_branch=$(git symbolic-ref refs/remotes/origin/HEAD |
        sed 's@^refs/remotes/origin/@@')

    # Fetch and create worktrees
    git fetch --prune origin
    for branch in $(git branch -r | grep -v HEAD | sed 's/origin\///'); do
        [ "$branch" = "$default_branch" ] && continue
        safe_name=$(echo "$branch" | tr '/' '-')
        worktree_path="$worktrees_dir/$safe_name"
        if [ ! -d "$worktree_path" ]; then
            git worktree add "$worktree_path" "origin/$branch"
        fi
    done
    git worktree prune
}
```
3.3. Task Tracking
Use a git-native task tracker that stores issues alongside code.
3.3.1. Initialization
```sh
# Initialize task tracking in a project
cd ~/repos/github.com/org/project
task init
git add .tasks && git commit -m "chore: initialize task tracking"
```
3.3.2. Daily Workflow
```sh
# Discovery
task list                  # List all issues
task ready                 # Show unblocked work
task blocked               # Show blocked issues
task search "keyword"      # Search issues

# Working
task create "Title" -p 1   # Create issue (priority 0-4)
task show <id>             # View details
task update <id> --status in_progress
task close <id>            # Complete issue

# Sync
task sync && git push      # Always push when done
```
3.3.3. Task-Driven Development
┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ task ready │────►│ Work on │────►│ task close │
│ │ │ task │ │ │
└─────────────┘ └─────────────┘ └─────────────┘
│ │ │
▼ ▼ ▼
┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Pick issue │ │ git commit │ │ task sync │
│ (unblocked) │ │ small steps │ │ git push │
└─────────────┘ └─────────────┘ └─────────────┘
3.4. Terminal Multiplexer (tmux)
Use tmux for session persistence and multi-window workflows.
3.4.1. Standard Session Layout
```sh
# Create named session
tmux new -s project-name

# Recommended window layout:
#   Window 0: Claude Code / AI assistant
#   Window 1: Editor (vim/emacs/code)
#   Window 2: Logs / monitoring
#   Window 3: Git / tasks
```
3.4.2. Key Bindings Reference
| Key | Action |
|---|---|
| C-b c | New window |
| C-b n/p | Next/prev window |
| C-b % | Vertical split |
| C-b " | Horizontal split |
| C-b d | Detach |
| C-b [ | Scroll mode |
| C-b z | Zoom current pane |
| C-b & | Kill window |
3.4.3. Session Script
```sh
#!/bin/sh
# dev-session.sh - Start development session

PROJECT="${1:-default}"

if ! tmux has-session -t "$PROJECT" 2>/dev/null; then
    tmux new-session -d -s "$PROJECT" -n claude
    tmux new-window -t "$PROJECT" -n editor
    tmux new-window -t "$PROJECT" -n logs
    tmux new-window -t "$PROJECT" -n git
    tmux select-window -t "$PROJECT:0"
fi

tmux attach -t "$PROJECT"
```
3.5. Claude Code Integration
Claude Code reads instructions from CLAUDE.md files at global and project levels.
3.5.1. Prompt Caching Architecture
The CLAUDE.md hierarchy enables efficient prompt caching:
┌─────────────────────────────────────────────────────┐
│                 Prompt Cache Layers                 │
├─────────────────────────────────────────────────────┤
│                                                     │
│  ~/.claude/CLAUDE.md   ──►  CACHED (static)         │
│  Global instructions        Rarely changes          │
│                                                     │
│  repo/CLAUDE.md        ──►  CACHED (per-repo)       │
│  Project instructions       Changes with repo       │
│                                                     │
│  repo/CONTINUE.md      ──►  DYNAMIC                 │
│  Session state              Changes each session    │
│                                                     │
└─────────────────────────────────────────────────────┘
This structure minimizes token usage by caching stable instructions while keeping session state dynamic.
3.5.2. Context Window Optimization
For large repositories, help the AI focus on relevant files:
```markdown
## AI Context Hints

### Key Files (read these first)
- src/main.py - Application entry point
- src/api/routes.py - API definitions
- src/models/ - Data models

### Ignore Patterns
- node_modules/, vendor/, .git/
- *.log, *.tmp, *.cache
- build/, dist/, coverage/

### Architecture Summary
- Pattern: MVC with service layer
- Database: PostgreSQL with SQLAlchemy
- API: REST, OpenAPI 3.0 spec in docs/api.yaml
```
3.5.3. Global Instructions
Create ~/.claude/CLAUDE.md for settings that apply to all projects:
```markdown
# Claude Code Global Instructions

## Session Startup
At the start of every session:
- Check task tracker: `task ready`
- Check worktrees: `git worktree list`
- Read CONTINUE.md if it exists

## Tool Preferences
- Read tool for reading files
- Write tool for creating new files
- Edit tool for modifying existing files
- Never use `git add .` or `git add -A`

## Commit Protocol
- Atomic changes: one logical change per commit
- Always buildable: never commit broken code
- Test before commit: run tests before each commit

## Commit Format
<type>(<scope>): <description>

Types: feat, fix, docs, test, chore, refactor
```
3.5.4. Project Instructions
Create CLAUDE.md in each project root for project-specific guidance:
````markdown
# Project Instructions

## Quick Start
```bash
# Build
make build

# Test
make test

# Run
make run
```

## Architecture Notes
- Uses MVC pattern
- Database: PostgreSQL
- API: REST with OpenAPI spec

## Key Files
- src/main.py - Application entry point
- src/api/ - REST endpoints
- tests/ - Test suite
````
3.5.5. Session Handoff
Use CONTINUE.md for session continuity:
```markdown
## Next Session
- **Active Task:** project-123
- **Current Focus:** Implementing auth middleware
- **Blockers:** Waiting for API spec review

## Recent Changes
- Added JWT validation
- Updated user model

## Notes
See experiments/notes/session-2026-01-10.md
```
4. Complete Workflow
4.1. Morning Startup
```sh
# 1. Start tmux session
tmux attach -t dev || tmux new -s dev

# 2. Sync repositories (or let cron handle it)
~/repos/sync.sh

# 3. Check available work
cd ~/repos/github.com/org/project
task ready

# 4. Start Claude Code
claude
```
4.2. Development Cycle
```sh
# 1. Pick a task
task update proj-42 --status in_progress

# 2. Create or use worktree
cd worktrees/feature-x

# 3. Work with Claude Code assistance
# ... coding, testing, iterating ...

# 4. Commit incrementally
git add -p
git commit -m "feat(auth): add token refresh"

# 5. Complete task
task close proj-42
task sync && git push
```
4.3. End of Session
```sh
# 1. Update CONTINUE.md with next steps

# 2. Sync tasks
task sync && git push

# 3. Detach tmux (don't exit)
# C-b d
```
5. Scheduling with Cron
```sh
# Sync repositories every 4 hours
0 */4 * * * $HOME/repos/sync.sh >> $HOME/.local/log/sync.log 2>&1
```
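If a sync run can ever outlast the 4-hour interval (large repo counts, slow networks), guard the cron job against overlapping runs. A minimal sketch using mkdir as a portable atomic lock; the lock path is illustrative:

```shell
#!/bin/sh
# run_exclusive: run a command only if no other instance holds the lock.
# mkdir is atomic on POSIX filesystems, so no flock(1) is needed.
LOCK_DIR="${LOCK_DIR:-/tmp/sync.lock}"

run_exclusive() {
    if mkdir "$LOCK_DIR" 2>/dev/null; then
        # Remove the lock when this shell exits, even on interrupt
        trap 'rmdir "$LOCK_DIR"' EXIT INT TERM
        "$@"
    else
        echo "Previous sync still running; skipping." >&2
        return 1
    fi
}

# In the crontab, wrap the sync:
#   0 */4 * * * . $HOME/repos/lock.sh && run_exclusive $HOME/repos/sync.sh
```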
6. Agentic Patterns
This workflow aligns with emerging agentic AI development patterns:
| Pattern | Implementation | Notes |
|---|---|---|
| Tool Use | task, git, gh commands | AI executes real commands |
| Memory | CONTINUE.md, task tracker | Persists across sessions |
| Planning | Task dependencies, blockers | Structured work breakdown |
| Reflection | Session retrospectives | Learn from each session |
6.0.1. Session Retrospectives
Add to your end-of-session routine:
```markdown
## Session Retrospective

### What worked
- Claude understood the codebase quickly via CLAUDE.md
- Worktree isolation prevented merge conflicts

### What didn't
- Context window filled up on large file
- Needed to break task into smaller pieces

### Improvements for next time
- Add key file hints to CLAUDE.md
- Create sub-tasks for complex features
```
6.0.2. Tracking AI Contributions
Measure AI-assisted development:
```sh
# Count AI-assisted commits
git log --grep="Co-Authored-By: Claude" --oneline | wc -l

# Percentage of recent commits with AI
total=$(git log --oneline -100 | wc -l)
ai=$(git log --grep="Co-Authored-By: Claude" --oneline -100 | wc -l)
echo "AI-assisted: $((ai * 100 / total))%"
```
7. Team Onboarding Checklist
For teams adopting this workflow:
7.1. Prerequisites
- [ ] Install required tools: git, gh, tmux, parallel
- [ ] Configure GitHub CLI: gh auth login
- [ ] Set up SSH keys (optional but recommended)
- [ ] Clone repository management tool
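The tool check on the first item can be automated with a small helper. A sketch; extend the argument list to match your team's stack:

```shell
#!/bin/sh
# check_tools: report any commands from the list that are not in PATH
check_tools() {
    missing=""
    for tool in "$@"; do
        command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
    done
    if [ -z "$missing" ]; then
        echo "ok"
    else
        echo "missing:$missing"
    fi
}

# Usage: check_tools git gh tmux parallel
```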
7.2. Initial Setup
- [ ] Create ~/repos/ directory structure
- [ ] Copy and customize sync.sh
- [ ] Add sync to crontab
- [ ] Create ~/.claude/CLAUDE.md with team standards
7.3. Per-Project Setup
- [ ] Initialize task tracker: task init
- [ ] Create project CLAUDE.md
- [ ] Add worktrees/ to .gitignore
- [ ] Document key files and architecture
7.4. First Week
- [ ] Complete one full task cycle (create → work → close)
- [ ] Practice worktree workflow
- [ ] Review and update CLAUDE.md based on experience
- [ ] Share learnings with team
8. Benefits
- Consistency: Same workflow across all projects
- Context Preservation: Worktrees prevent context switching overhead
- AI Continuity: CLAUDE.md and CONTINUE.md maintain AI context
- Parallel Work: Multiple branches active simultaneously
- Auditability: Tasks tracked in git alongside code
- No Vendor Lock-in: All tools are open source (git, tmux, parallel)
- Portable: Works on macOS, Linux, BSD, WSL
9. Advanced Agent Patterns (sage.el Session Notes - 2026-01-11)
Patterns discovered while building sage.el, an AI coding assistant for Emacs.
9.1. Multi-Agent Review Fan-Out
Use parallel specialized agents for comprehensive code reviews:
┌─────────────────────┐
│ Code Changes │
└──────────┬──────────┘
┌───────────────────┼───────────────────┐
▼ ▼ ▼
┌──────────────────┐ ┌──────────────────┐ ┌──────────────────┐
│ Domain Expert │ │ L7/Protocol │ │ CTO/Architect │
│ - idioms │ │ - tool patterns │ │ - security │
│ - conventions │ │ - API contracts │ │ - extensibility │
│ - integration │ │ - error handling│ │ - scale │
└──────────────────┘ └──────────────────┘ └──────────────────┘
Key insight: Different expert perspectives catch different issue types.
| Reviewer | Catches |
|---|---|
| Domain Expert | Idiomatic patterns, library integration |
| L7/Protocol | API contracts, error handling, edge cases |
| CTO/Architect | Security, extensibility, scale concerns |
9.2. Tool Evolution Lifecycle
Tools naturally evolve through distinct phases:
Phase 1: Shell-Based (Prototype)
├── Quick to implement using CLI tools (rg, grep, curl)
└── May fail on edge cases (unknown file types, network timeouts)

Phase 2: Native Primitives (Production)
├── Uses language-native functions
├── No external process dependencies
└── Predictable, testable behavior

Phase 3: Self-Extending (Advanced)
├── Tools can create new tools dynamically
├── Persistent across sessions
└── AI-discoverable via introspection
Example evolution:
```elisp
;; Phase 1: Shell command (problematic)
(shell-command-to-string "rg pattern *.el")
;; Error: rg: unrecognized file type: clj

;; Phase 2: Native primitive (reliable)
(directory-files-recursively dir "\\.el$")
(string-match-p pattern content)

;; Phase 3: Self-extending factory
(create-tool "custom_search" "Search files" ...)
```
9.3. Hard Dependencies Over Graceful Degradation
Principle: Fail fast with clear errors. Fallbacks create inconsistent behavior.
```elisp
;; WRONG: Silent fallback with different behavior
(if (require 'magit nil t)
    (magit-git-insert ...)          ; Returns structured data
  (shell-command-to-string ...))    ; Returns raw string - different!

;; CORRECT: Hard dependency with clear error
(unless (require 'magit nil t)
  (error "Git tools require magit. M-x package-install RET magit"))
```
Why this matters:
- Consistent behavior across all environments
- Easier debugging (one code path)
- Clear installation requirements
- No "works on my machine" issues
9.4. Tool Usage Mining
Analyze agent's own tool usage to prioritize feature development:
```sh
# Extract tool usage from Claude Code logs
find ~/.claude/projects -name "*.jsonl" -exec cat {} \; |
    grep -o '"tool":"[^"]*"' | sort | uniq -c | sort -rn

# Results from sage.el development:
#   9277 Bash      → prioritize eval_elisp
#   1364 Read      → prioritize read_file
#   1267 Edit      → prioritize edit_file
#    442 Glob      → prioritize glob_files
#    415 Write     → prioritize write_file
#    392 TodoWrite → prioritize org_todo tools
```
Key insight: Implement tools the agent actually uses most, not what seems theoretically important.
9.5. Self-Extending Tool Factory
Enable AI to create persistent tools for itself:
┌─────────────────────────────────────────────────────────┐
│                  Tool Factory Pattern                   │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  create_tool                                            │
│  │                                                      │
│  ├──► JSON schema for parameters                        │
│  ├──► Code body (evaluated at runtime)                  │
│  └──► Persist to ~/.config/tools.el                     │
│                                                         │
│  On startup:                                            │
│  └──► reload_tools loads persisted definitions          │
│                                                         │
└─────────────────────────────────────────────────────────┘
Example: AI creates Hacker News tool:
```elisp
(create_tool
 :name "hackernews_summary"
 :description "Fetch top HN stories"
 :parameters '((count . "number of stories"))
 :code '(fetch-json
         "https://hacker-news.firebaseio.com/v0/topstories.json"
         (lambda (ids)
           (mapcar #'fetch-story (take count ids)))))
```
9.6. Testing with Mocking
Unit test agent tools by mocking external dependencies:
```elisp
(ert-deftest test-web-fetch ()
  "Test web_fetch with mocked HTTP."
  (cl-letf (((symbol-function 'url-retrieve-synchronously)
             (lambda (url &rest _)
               (let ((buf (generate-new-buffer " *mock*")))
                 (with-current-buffer buf
                   (insert "HTTP/1.1 200 OK\n\n<html>Mock</html>"))
                 buf))))
    (should (string-match-p "Mock"
                            (tool-web-fetch "http://example.com")))))
```
9.7. Demo-Driven Documentation
Executable demos serve as both documentation and tests:
```sh
make demo          # Interactive GUI demo
make demo-batch    # Terminal output (CI-friendly)
make demo-quick    # Quick screencast demo

# Remote testing for cross-platform verification
./bin/run-demo --remote pi.lan
./bin/run-demo --screenshot pi.lan   # Capture visual output
```
9.8. Observation-Driven Development
Document the observation → root cause → solution chain:
```markdown
## Observation: code_search fails with "unrecognized file type"

**Context:** REPL error output
**Error:** `rg: unrecognized file type: clj`
**Root Cause:** Shell command with hardcoded file types
**Solution:** Replace with `directory-files-recursively` + `string-match-p`
**Bead:** project-bij (closed)
```
Key insight: Always trace errors back to root cause, don't just fix symptoms.
9.9. Background Agent Pattern
For long-running operations, use background agents:
1. Launch: Task tool with run_in_background: true
2. Continue: Do other work while agent runs
3. Monitor: tail -f /tmp/tasks/xxx.output
4. Retrieve: Read output file when complete
Use cases:
- SSH to remote machines
- Long compilation/test runs
- Screenshot capture
- Network operations with timeouts
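The launch/monitor/retrieve cycle above can be approximated in plain shell; Claude Code's Task tool does this bookkeeping for you, so this sketch (with illustrative paths and names) is mainly useful for scripting the same pattern outside the agent:

```shell
#!/bin/sh
OUT_DIR="${OUT_DIR:-/tmp/tasks}"

# launch_bg: run a command in the background, capture its output,
# record its PID, and print the output path for the caller to tail
launch_bg() {
    name="$1"; shift
    mkdir -p "$OUT_DIR"
    out="$OUT_DIR/$name.output"
    "$@" > "$out" 2>&1 &
    echo "$!" > "$OUT_DIR/$name.pid"
    echo "$out"
}

# Usage:
#   out=$(launch_bg build make -j4)
#   tail -f "$out"                        # monitor progress
#   kill "$(cat /tmp/tasks/build.pid)"    # cancel if needed
```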
10. Lessons Learned: File Restructuring (2026-01-11)
Patterns discovered during large-scale directory restructuring of event files.
10.1. Org-Publish Cache Invalidation
When moving files, the org-publish cache retains references to old paths:
```sh
# Symptom: publish fails with "No such file" for moved files
#   Error: No such file: "events/defcon-32-schedule.org"

# Solution: clear the timestamp cache before republishing
rm -f ~/.org-timestamps/wal.sh-main.cache
gmake publish
```
Key insight: The cache stores absolute paths. Moving files requires cache invalidation.
10.2. Internal Reference Updates
When restructuring directories, update all internal references:
| Reference Type | Example | Action |
|---|---|---|
| #+INCLUDE: | #+INCLUDE: "old-file.org" | Update relative path |
| [[file:...]] | [[file:images/old.png]] | Update to new location |
| Asset paths | thumbnails/image.png | Move assets with parent |
```sh
# Find all references to old paths
grep -rn 'old-path/' --include='*.org' .

# Bulk update with replace_all in Edit tool
```
10.3. Server-Side Cleanup
Org-publish only uploads new/modified files; it doesn't delete orphans:
```sh
# Old files remain accessible at old URLs
# Bots continue indexing stale paths
# Manual cleanup required after restructuring
ssh server 'rm -rf ~/site/events/old-directory/'

# Or create redirects for SEO
# .htaccess: Redirect 301 /old-path /new-path
```
10.4. Restructuring Workflow
Recommended approach for large file reorganizations:
1. Create beads epic with subtasks for each component
2. Move files with git mv (preserves history)
3. Update all internal links
4. Clear org-publish cache
5. Publish and verify
6. Clean up orphaned server files
7. Close beads and sync
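Step 3 (updating internal links) is the easiest one to miss. A rough helper to list org files whose file: links or #+INCLUDE targets no longer exist; this is a sketch that only handles simple relative paths, not org's full link syntax:

```shell
#!/bin/sh
# check_stale_refs: print file:/#+INCLUDE references whose target
# does not exist, resolved relative to the referencing file
check_stale_refs() {
    root="${1:-.}"
    grep -rn --include='*.org' -Eo '(file:|#[+]INCLUDE: ")[^]">]+' "$root" |
    while IFS=: read -r file _line match; do
        # Strip whichever prefix matched
        path=${match#file:}
        path=${path#'#+INCLUDE: "'}
        target="$(dirname "$file")/$path"
        [ -e "$target" ] || echo "stale: $file -> $path"
    done
}

# Usage: check_stale_refs events/
```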
10.5. Asset Co-location
Keep generated assets (images from dot/graphviz blocks) with their source:
BEFORE (fragmented):
events/
├── conf-2019.org      # Source file
├── aws.png            # Generated image (orphaned after move)
└── terraform.png      # Another orphan

AFTER (co-located):
events/conf/2019/
├── index.org          # Source file
├── aws.png            # Generated image
└── terraform.png      # Stays with source
11. FreeBSD-Specific Patterns
This workflow runs on FreeBSD 14.3. Platform differences to consider:
11.1. Command Differences
| Linux/macOS | FreeBSD | Notes |
|---|---|---|
| make | gmake | GNU Make required for complex Makefiles |
| md5sum | md5 | Different output format |
| nproc | sysctl -n hw.ncpu | CPU count |
| sed -i | sed -i '' | In-place edit syntax |
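Scripts shared across platforms can paper over these differences with small wrappers. For example, a portable CPU-count helper covering the nproc/sysctl split above (a sketch; the MAKE selection is an assumption about which make you want first):

```shell
#!/bin/sh
# cpu_count: portable replacement for nproc
cpu_count() {
    if command -v nproc >/dev/null 2>&1; then
        nproc                                      # Linux (coreutils)
    else
        sysctl -n hw.ncpu 2>/dev/null || echo 1    # BSD/macOS, fallback 1
    fi
}

# Prefer GNU make where plain `make` is BSD make
MAKE=$(command -v gmake || command -v make)

# Usage:
#   "$MAKE" -j "$(cpu_count)" build
```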
11.2. Tool Availability
```sh
# ripgrep may not be available
# Fall back to grep when needed
grep -rn 'pattern' --include='*.org' .

# Or install via pkg
pkg install ripgrep
```
11.3. Poetry/Emacs Integration
```sh
# Emacs runs through Poetry for dependency isolation
poetry run emacs --batch -l project-config.el ...

# Not just 'emacs' directly
```