#47 Part 4 2026-02-11 15 min

From Zero to Dev Box: How I Used Claude to Clone My Mac Environment to a Windows PC in One Session

I bought a mini PC with no monitor or keyboard. An hour later, it was a fully configured dev box - and I never touched it.


The future of cross-platform development setup is conversational


I bought a mini PC off Amazon. AMD Ryzen, 16GB RAM, 512GB SSD. No monitor. No keyboard. No mouse. Just a box with potential and an ethernet cable.

An hour later, it was a fully configured development machine mirroring my M3 Mac environment---Python, Node, Go, MCP servers, Cloudflare Workers connections, the works. And I never plugged in a monitor.

Here’s how Claude did it.


The Setup Challenge

The Hardware Situation

The mini PC (MKPC, as I named it) arrived as a bare box. Windows 11 pre-installed, but nothing else. The plan was simple: make it a secondary dev machine that mirrors my Mac environment closely enough that I can seamlessly work across both.

The traditional approach would be: hook up a temporary monitor, grab a keyboard, spend a weekend manually installing tools, Googling “how to set PATH on Windows 11” for the forty-seventh time, forgetting half the tools I actually use, and ending up with a machine that sort of works but doesn’t quite match.

I didn’t do any of that.

The AI-Assisted Approach

Here’s what I actually did:

  1. Plugged in ethernet
  2. Found MKPC’s IP address with arp -a from my Mac
  3. Borrowed a keyboard for just long enough to complete Windows OOBE and enable Remote Desktop
  4. Connected via Microsoft’s “Windows App” from my Mac
  5. Let Claude take over

Total manual intervention: about 10 minutes.
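Step 2 is the only part that takes any thought: spotting the new box in `arp -a` output. A minimal sketch of that parsing, assuming macOS-style output (the sample IP and MAC are made up, not MKPC's real ones):

```python
import re

def parse_arp(output: str) -> list[tuple[str, str]]:
    """Extract (ip, mac) pairs from macOS-style `arp -a` output."""
    pairs = []
    for line in output.splitlines():
        m = re.search(r"\((\d+\.\d+\.\d+\.\d+)\) at ([0-9a-f:]+)", line)
        if m:
            pairs.append((m.group(1), m.group(2)))
    return pairs

sample = "? (192.168.1.42) at a1:b2:c3:d4:e5:f6 on en0 ifscope [ethernet]"
print(parse_arp(sample))  # [('192.168.1.42', 'a1:b2:c3:d4:e5:f6')]
```

Run `arp -a` before and after plugging in the ethernet cable and diff the two lists; the new entry is the headless machine.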


Phase 1: The Discovery --- Analyzing the Source Machine

Before building the target, Claude needed to understand the source. This is where things got interesting.

Claude as System Auditor

Using Desktop Commander MCP, Claude systematically scanned my M3 Mac. Not just running brew list --- actually analyzing what I had installed, categorizing it, and building a migration plan.

The commands were straightforward: brew list, pip list, node --version, go version, checking MCP configurations, inspecting global npm packages. But the output wasn’t a dump. Claude organized everything into a structured inventory:

Core Runtimes:
  Python 3.14.0
  Node.js 25.2.1
  Git 2.52.0
  Go 1.25.4

Package Managers:
  Homebrew (47 formulas, 12 casks)
  pip packages (23 packages)
  npm globals (8 packages)
  uv (modern Python tooling)

MCP Servers:
  17 blockchain MCP servers
  Desktop Commander
  Longterm Memory (PostgreSQL-backed)
  Backchannel (Cloudflare Worker)
  PAI History Search (Cloudflare Worker)
  Voice mode, PDF tools, multi-agent coordinator
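Nothing about the audit requires MCP specifically; a rough stand-in for the version-check pass (the tool list is illustrative, and missing tools are recorded rather than crashing the scan) could shell out to each runtime:

```python
import shutil
import subprocess

def audit(tools: dict[str, list[str]]) -> dict[str, str]:
    """Run each version command and record its output, or mark the tool missing."""
    inventory = {}
    for name, cmd in tools.items():
        if shutil.which(cmd[0]) is None:
            inventory[name] = "not installed"
            continue
        result = subprocess.run(cmd, capture_output=True, text=True)
        # Some tools (older Pythons, for one) print versions to stderr.
        inventory[name] = result.stdout.strip() or result.stderr.strip()
    return inventory

print(audit({"git": ["git", "--version"], "node": ["node", "--version"]}))
```

The real value wasn't the raw checks, though; it was what Claude did with the results next.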

The Categorization That Mattered

Here’s where Claude stopped being a tool runner and started being an engineer. It automatically sorted every component into migration categories:

  1. NPX-based (easy --- just need Node): Desktop Commander, PDF tools, several blockchain MCPs
  2. Python/uvx-based (need Python + uv): longterm memory server, voice mode
  3. Custom Node servers (need repo clone + npm install): backchannel, history search
  4. macOS-only (won’t work on Windows): AppleScript tools, iMessage MCP, Raycast integration
  5. Remote/Cloudflare (just need URLs): backchannel worker, history search worker

This was the magic moment. An AI understood my development environment better than I could have documented it myself. I’ve been using this Mac for a year and I couldn’t have produced that categorized inventory without spending an hour manually. Claude did it in about 30 seconds.
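The sorting itself is mechanical once the facts are gathered. A hypothetical rule-based version, mirroring the five categories above (the field names are mine, not anything Claude actually emitted):

```python
def categorize(server: dict) -> str:
    """Sort an MCP server entry into one of the five migration buckets."""
    if server.get("remote_url"):
        return "remote/cloudflare"      # just need the URL
    if server.get("macos_only"):
        return "macos-only"             # won't port to Windows
    runner = server.get("runner", "")
    if runner == "npx":
        return "npx-based"              # easy: just need Node
    if runner in ("uvx", "python"):
        return "python/uvx-based"       # need Python + uv
    return "custom-node"                # need repo clone + npm install

servers = [
    {"name": "desktop-commander", "runner": "npx"},
    {"name": "longterm-memory", "runner": "uvx"},
    {"name": "imessage", "runner": "npx", "macos_only": True},
    {"name": "backchannel", "remote_url": "https://backchannel.example.workers.dev/sse"},
]
for s in servers:
    print(s["name"], "->", categorize(s))
```

The hard part was never the if-statements; it was assembling accurate facts about 20+ servers in the first place.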


Phase 2: The Execution --- Building the Windows Environment

Remote Desktop + Desktop Commander = Full Control

The architecture for the migration was almost absurd in its simplicity: Remote Desktop from the Mac for eyes on the screen, Desktop Commander for hands on the shell.

I’m watching a Windows desktop where an AI is typing PowerShell commands, downloading installers, and configuring PATH variables. I’m drinking coffee.

The Installation Sequence

Claude worked through the installation methodically:

Core runtimes first:

# Node.js (direct download, not winget - more reliable for PATH)
# Python 3.13.1 via python.org installer
# Git 2.47.1 via winget
# Go 1.23.5 via direct download

Modern tooling second:

# uv/uvx for Python package management
pip install uv

# Claude CLI
npm install -g @anthropic-ai/claude-code

# Wrangler for Cloudflare
npm install -g wrangler

Python packages third:

pip install flask psycopg2-binary pillow requests

The version numbers don’t match the Mac exactly --- Windows got slightly older versions in some cases. That’s fine. The point isn’t identical versions; it’s functional parity.

The PATH Problem (And Why Windows Still Hurts)

Everything was installed. Claude verified each tool with version checks. Python works. Node works. Git works. Ship it.

Except --- Claude Desktop couldn’t find any of them.

The root cause is a Windows classic: GUI applications don’t always inherit the user’s PATH environment variable. The system PATH gets updated by installers, but running applications need to be restarted to pick up changes. And Claude Desktop, running as a GUI app, was using the PATH from when it first launched.

Claude diagnosed this in about 10 seconds. The fix was pragmatic: use full paths in claude_desktop_config.json instead of relying on PATH resolution:

{
  "mcpServers": {
    "desktop-commander": {
      "command": "C:\\Tools\\node-v22.13.1-win-x64\\npx.cmd",
      "args": ["-y", "@wonderwhy-er/desktop-commander@0.2.12"]
    }
  }
}

This is the kind of gotcha that wastes hours when you’re doing it manually. You install everything, it all works in PowerShell, and then your IDE or tool can’t find anything. You start questioning your life choices. Claude just… fixed it.
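The same fix can be scripted: resolve each bare command from a fresh shell (which sees the updated PATH) and write the absolute path back into the config. A sketch, with a made-up config dict standing in for claude_desktop_config.json:

```python
import json
import shutil

def absolutize(config: dict) -> dict:
    """Replace bare MCP server commands with absolute paths where resolvable."""
    for server in config.get("mcpServers", {}).values():
        resolved = shutil.which(server["command"])
        if resolved:
            server["command"] = resolved
    return config

config = {"mcpServers": {"demo": {"command": "node", "args": ["server.js"]}}}
print(json.dumps(absolutize(config), indent=2))
```

Commands that can't be resolved are left untouched, so a broken PATH degrades to the original behavior instead of writing garbage into the config.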


Phase 3: The Cloudflare Discovery --- Finding Remote MCP URLs

The Challenge

Two of my most important MCP servers --- Backchannel (multi-agent coordination) and PAI History Search (conversation memory) --- aren’t local processes. They’re Cloudflare Workers. Remote MCPs accessed via URL.

But what URLs? I knew they existed. I knew they were deployed. I didn’t have the URLs memorized, and they weren’t in any config file on the Mac in an obvious format.

Claude + Wrangler = Full Visibility

Claude’s approach:

  1. Install Wrangler on MKPC (already done in Phase 2)
  2. Authenticate with Cloudflare via OAuth (wrangler login)
  3. List all deployed workers via the Cloudflare API
  4. Find my account subdomain programmatically
  5. Construct the MCP endpoint URLs

The wrangler commands didn’t all work as expected --- some returned errors, some required different flags than documented. Claude adapted on the fly, pivoting to direct API calls when CLI commands failed:

Backchannel: https://backchannel.myronkoch-dev.workers.dev/sse
PAI History Search: https://pai-history-search.myronkoch-dev.workers.dev/sse

These got added to MKPC’s Claude Desktop config. Now the Windows machine has access to the same shared memory and coordination infrastructure as the Mac.
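Once the account subdomain and worker names are known, constructing the endpoints is just string assembly. A sketch (the `/sse` path matches the URLs above; whether every remote MCP uses that path is an assumption):

```python
def mcp_url(worker: str, subdomain: str, path: str = "/sse") -> str:
    """Build a Cloudflare Workers MCP endpoint URL from its parts."""
    return f"https://{worker}.{subdomain}.workers.dev{path}"

for worker in ("backchannel", "pai-history-search"):
    print(mcp_url(worker, "myronkoch-dev"))
```

The discovery work was all in steps 2-4; this last step is trivial by design, because workers.dev URLs are fully determined by account subdomain plus worker name.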


Phase 4: Verification and Edge Cases

What Worked Immediately

The core runtimes (Python, Node, Git, Go), the npx-based MCP servers, and both remote Cloudflare worker connections.

What Didn’t Work (Honest Assessment)

macOS-only tools are macOS-only. AppleScript integration, iMessage MCP, Raycast workflows --- none of these have Windows equivalents. This isn’t a solvable problem; it’s a platform boundary.

VoiceMode setup remained untested. The local voice pipeline (Whisper + Kokoro) depends on MLX, which is Apple Silicon only. Windows would need a different stack --- probably faster-whisper with CUDA. That’s a project for another day.

Some PATH issues required human intervention. Specifically, restarting Claude Desktop after PATH changes. Claude can install software and modify environment variables, but it can’t restart its own host application. Someone still needs to click the X and reopen it.

Windows OOBE still needs a physical keyboard. Until Microsoft supports headless initial setup (or until I set up a PXE boot with an answer file), that 10-minute keyboard session is unavoidable.


The Implications

This Is a New Workflow Category

What happened here isn’t just “I set up a computer.” It’s a pattern:

Discovery --- AI audits the source environment, building a structured inventory that’s more complete and better organized than anything I’d write manually.

Planning --- AI categorizes components by migration difficulty, identifies blockers, and creates an execution plan.

Execution --- AI installs and configures everything, handling platform differences and gotchas in real-time.

Verification --- AI tests each component and reports what works, what doesn’t, and why.

Each phase is conversational and auditable. The transcript IS the documentation. If I need to do this again in six months, I don’t need a wiki page --- I have the conversation.

What This Enables

Reproducible environments. Run the same audit on any machine, get functional parity. New laptop? Same conversation, same result.

Onboarding acceleration. New team member joins and needs a dev environment? Have Claude set up their machine while they read the README.

Disaster recovery. Lost your dev machine? The audit lives in conversation history. Rebuild from memory.

Cross-platform parity. Maintain matching environments across Mac, Windows, and Linux without manually tracking what’s installed where.

The Meta-Learning

At the end of the session, I had Claude save everything to long-term memory --- the software versions, the paths, the URLs, the configurations, the gotchas. Key facts stored in PostgreSQL with vector embeddings.

Why? Because next time I set up a machine, Claude will REMEMBER this session. The learnings compound. Each setup gets faster because the AI accumulates institutional knowledge about my infrastructure.

This is the difference between a tool and an assistant. A tool does what you tell it. An assistant learns from experience.


Conclusion: The Future is Conversational DevOps

I used to spend weekends setting up dev environments. Now I spend conversations.

The stack that made this possible isn’t exotic: Claude Desktop with the Desktop Commander MCP, Microsoft’s Windows App for Remote Desktop, and Wrangler for the Cloudflare side.

That’s it. No custom scripts. No Ansible playbooks. No Docker images. Just a conversation with an AI that understands system administration.

The setup isn’t perfect --- macOS-only tools don’t port, PATH issues still bite, and the initial Windows OOBE needs human hands. But the 90% that CAN be automated? It’s automated. And the conversation that automated it doubles as documentation.

If you’re still spending weekends setting up dev machines: stop. Install Desktop Commander. Let Claude audit your environment. Then point it at the new machine and go get coffee.

Your dev environment is one conversation away.


Appendix: Quick Reference

Tools Used

  Desktop Commander MCP (system control on both machines)
  Microsoft Windows App (Remote Desktop from the Mac)
  Wrangler (Cloudflare worker discovery)
  Claude Desktop + Claude CLI

Key Commands

# Find headless PC on network (from Mac)
arp -a

# Install global npm tools (on Windows)
npm install -g wrangler @anthropic-ai/claude-code

# Query Cloudflare for deployed workers
wrangler whoami
curl -H "Authorization: Bearer $TOKEN" \
  "https://api.cloudflare.com/client/v4/accounts/$ACCT/workers/scripts"

Config File Location (Windows)

C:\Users\{username}\AppData\Roaming\Claude\claude_desktop_config.json