# How to Connect OpenAI Codex to Figma
Connect OpenAI Codex to Figma through AI Bridge. Expose the local Bridge API via tunnel so Codex's cloud sandbox can read designs and tokens.
OpenAI Codex is a cloud-based coding agent. You give it a task, it spins up a sandboxed environment, writes code, runs tests, and opens a PR. No local terminal, no IDE — just autonomous execution.
Connecting Codex to Figma means giving that sandbox access to the AI Bridge API. The challenge: Bridge runs locally on your machine at localhost:8867, and Codex runs in a remote cloud sandbox that cannot reach your localhost.
## How Codex works differently
Claude Code and Cursor run on your machine. They hit http://localhost:8867 directly. No extra setup needed.
Codex runs remotely. Its sandbox has outbound internet access, but it cannot reach services running on your laptop. You need to expose Bridge to the internet so the sandbox can call it.
Two approaches:
- Tunnel — expose your local Bridge via ngrok or cloudflared. Easiest option.
- Cloud deployment — run Bridge in the same cloud environment as Codex. More robust, more setup.
For most workflows, a tunnel is sufficient.
## Setup with a tunnel
Start Bridge as usual. The Figma plugin must be running and connected.
Verify it works locally:
```shell
curl http://localhost:8867/status
```

You should see `connected: true`. Now expose it:
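As a quick sanity check on the response, a minimal sketch of parsing the `/status` body — note that beyond the `connected` field mentioned above, the exact response shape is an assumption:

```python
import json

def bridge_connected(status_body: str) -> bool:
    """Parse a /status response body and report whether the
    Figma plugin is connected to Bridge."""
    status = json.loads(status_body)
    return status.get("connected") is True

# The field name "connected" comes from the article; any other
# fields in the response are not assumed here.
print(bridge_connected('{"connected": true}'))   # True
print(bridge_connected('{"connected": false}'))  # False
```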
```shell
# Option A: ngrok
ngrok http 8867

# Option B: cloudflared
cloudflared tunnel --url http://localhost:8867
```

Both give you a public URL like https://abc123.ngrok-free.app or https://abc123.trycloudflare.com. This URL proxies to your local Bridge.
Test the tunnel:
```shell
curl https://abc123.ngrok-free.app/status
```

Same response as localhost. The full command API works identically — only the host changes.
## Writing the Codex task prompt
Codex works best with precise, self-contained task definitions. Include the tunnel URL, the specific Figma node to read, and the expected output format.
A good task prompt:
```
Use the Figma Bridge API at https://abc123.ngrok-free.app

1. GET /status — verify connection
2. POST /command with:
   {"command": "get-node-props", "params": {"nodeId": "6430:44087"}}
3. POST /command with:
   {"command": "get-bound-variables", "params": {"nodeId": "6430:44087"}}

Read the component structure and design token bindings.
Generate a React component with CSS modules.
Use var(--token-name) references for all design tokens.
Do not hardcode any colors, spacing, or typography values.
```
Key: use `"params"`, not `"args"`, in the request body.

Codex will execute each step, parse the Bridge responses, and generate the component with proper token usage.
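A small sketch of building the `/command` request body with the correct key (the helper name is illustrative, not part of Bridge):

```python
import json

def build_command(command: str, **params) -> str:
    """Build a Bridge /command request body. The top-level key
    must be "params" -- Bridge does not accept "args"."""
    return json.dumps({"command": command, "params": params})

body = build_command("get-node-props", nodeId="6430:44087")
print(body)
# {"command": "get-node-props", "params": {"nodeId": "6430:44087"}}
```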
## What Codex returns
Codex produces a PR-ready diff. For a card component, the output looks like:
```css
/* Card.module.css */
.root {
  padding: var(--spacing-xl);
  border-radius: var(--radius-lg);
  background: var(--color-surface);
  gap: var(--spacing-md);
}

.title {
  color: var(--color-text-primary);
  font-size: var(--size-text-md);
  font-weight: var(--weight-medium);
}
```

```jsx
function Card({ title, children }) {
  return (
    <div className={styles.root}>
      <h3 className={styles.title}>{title}</h3>
      {children}
    </div>
  );
}
```

No hardcoded `#181818`. No magic `16px`. Every value traces back to a design token from Figma.
## Codex vs Claude Code vs Cursor
All three work with AI Bridge. They differ in where they run and how you interact with them.
| Feature | Codex | Claude Code | Cursor |
|---|---|---|---|
| Runs in | Cloud sandbox | Local terminal | Local editor |
| Bridge access | Via tunnel | Direct localhost | Direct localhost |
| Interaction | Async task | Interactive CLI | Inline in editor |
| Output | PR / diff | Files on disk | Files on disk |
| Best for | Autonomous generation | Interactive workflow | Inline editing |
Codex is strongest when you want hands-off execution: define the task, walk away, review the PR. Claude Code is better for iterative work where you inspect intermediate results. Cursor fits best when you are editing existing code and want AI completions in context.
The Bridge API is the same in all cases. The commands, the params, the responses — identical. Only the network path differs.
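To make that concrete, a minimal sketch showing that only the base URL differs between a local and a tunneled setup (the helper is illustrative, not part of Bridge):

```python
def command_url(base_url: str) -> str:
    """The /command endpoint is the same everywhere; only the
    base URL changes with the network path."""
    return base_url.rstrip("/") + "/command"

# Local (Claude Code, Cursor) vs tunneled (Codex) -- same endpoint:
print(command_url("http://localhost:8867"))
# http://localhost:8867/command
print(command_url("https://abc123.ngrok-free.app"))
# https://abc123.ngrok-free.app/command
```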
## Security considerations
Tunnels expose your Bridge to the internet. A few things to keep in mind:
- Use the tunnel only while Codex is running. Shut it down after.
- ngrok and cloudflared generate random URLs, which makes discovery unlikely, but a random URL is obscurity, not authentication.
- Bridge is read-write: anyone with the URL can create and modify Figma nodes. Do not share tunnel URLs publicly.
- For team setups, consider adding an auth proxy in front of the tunnel.
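One piece such a proxy needs is a token check. A minimal sketch, assuming the standard `Authorization: Bearer` convention (Bridge itself is not shown here, and this is not a complete proxy):

```python
import hmac

def is_authorized(headers: dict, expected_token: str) -> bool:
    """Constant-time bearer-token check for a proxy placed in
    front of the tunnel. compare_digest avoids timing leaks."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    return hmac.compare_digest(auth[len("Bearer "):], expected_token)

print(is_authorized({"Authorization": "Bearer s3cret"}, "s3cret"))  # True
print(is_authorized({}, "s3cret"))                                  # False
```

A proxy would run this check on every request before forwarding it to `localhost:8867`.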
## Get started
AI Bridge works with Codex, Claude Code, Cursor, and any model that can make HTTP calls.