Using **native custom tools** in the ChatGPT desktop app is still a bit awkward: it doesn’t expose the full MCP feature set that local AI agents do, and the built-in connectors/plugins all run in the cloud.

However, the desktop app is “just” a frontend for OpenAI’s GPT-5.1 Thinking model (standard vs. extended thinking). If we can connect a local MCP tool server as a **connector** to the ChatGPT app, we can:

- Reuse the compute you already pay for in your ChatGPT subscription,
- Keep Binary Ninja running locally,
- And control it from a nice desktop UI.

This post walks through how to connect **Binary Ninja** to the **ChatGPT desktop app** to build an automated, low-cost (assuming you already have ChatGPT Plus) workflow for AI-assisted reverse engineering.

> These steps were tested on Windows, but the overall MCP / connector flow is the same on macOS and Linux. You mainly need to adjust paths and shell commands.
{: .prompt-info }

---
## Prerequisites

You’ll need:

- **ChatGPT desktop client (Windows/macOS)**
  Version `1.2025.258` or later.

- **Basic familiarity with:**
  - Python virtual environments
  - MCP concepts (tool servers over stdio/HTTP)
  - ngrok or a similar HTTP tunneling tool

> If you already have a preferred virtual-environment tool (e.g., `venv`, Poetry, Conda), you can use that instead of `uv`—just adapt the commands in this guide.
{: .prompt-tip }

---
Open the binary you want to analyze, then click that indicator. It should change to:

> `MCP: Running`

This means the MCP bridge script is active inside Binary Ninja and ready to accept connections.
Inside that folder, open the **`bridge`** subfolder. All commands in the rest of this guide are run from there.

It’s convenient to use **`uv`** (the Rust-based Python package manager) to create an isolated environment:

```shell
uv init
uv add -r .\requirements.txt
```

This will:

* Initialize a new Python project with an isolated environment.
* Install the dependencies listed in `requirements.txt`.

> Keep the `bridge` environment dedicated to this plugin. Mixing unrelated packages into the same environment can make debugging MCP issues much harder later.
{: .prompt-warning }

---
## Step 3 – Convert the bridge to a FastMCP HTTP server

The original bridge script only supports **stdio** as an MCP transport, but the ChatGPT desktop app expects an **HTTP-based** MCP endpoint. To fix that, we’ll switch it to **FastMCP** with the `streamable-http` transport.

From the `bridge` folder, do the following.

### 3.1 – Install `fastmcp`

Instead of relying on the MCP Python library’s built-in FastMCP, install the dedicated `fastmcp` package for better compatibility:

```shell
uv add fastmcp
```
This exposes the MCP server over HTTP on `localhost:8050`.

As of **2025-11-16**, the ChatGPT desktop app runs an internal validation pass (likely using a small model) to decide whether a connector is “safe.” If the connector fails that check, you might see:

> `Connector is not safe`

when trying to add it.

A practical workaround (described in an OpenAI community thread) is to provide very explicit safety instructions in the MCP metadata:

Change:
You should see logs indicating that the MCP server is up and listening on the configured port.

Right now the MCP server is running **locally**. For the ChatGPT **cloud** environment to reach it, we need to expose it via a reverse proxy. Here we’ll use **ngrok**.

1. Sign up for an ngrok account (if you don’t already have one):

We’ll use this URL in the ChatGPT connector configuration.
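If you want to script this step, the ngrok agent’s local inspection API (`http://127.0.0.1:4040/api/tunnels`) reports the public URL it assigned. A small stdlib-only helper that picks out the HTTPS tunnel and appends the `/mcp` path (the sample payload in the usage note mimics ngrok’s response shape):

```python
import json
from urllib.request import urlopen

def mcp_url(tunnels: dict) -> str:
    """Pick the HTTPS tunnel from ngrok's /api/tunnels payload and add /mcp."""
    for t in tunnels["tunnels"]:
        if t["public_url"].startswith("https://"):
            return t["public_url"].rstrip("/") + "/mcp"
    raise RuntimeError("no HTTPS tunnel found -- is ngrok running?")

def current_mcp_url() -> str:
    """Query the local ngrok agent (requires ngrok to be running)."""
    with urlopen("http://127.0.0.1:4040/api/tunnels") as resp:
        return mcp_url(json.load(resp))
```

For example, `mcp_url({"tunnels": [{"public_url": "https://example.ngrok-free.app"}]})` returns `https://example.ngrok-free.app/mcp`, which is exactly the value the connector form needs.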
> When ngrok is running, anything that can reach the public URL can talk to your MCP server. Only expose this from a trusted network, and avoid loading highly sensitive or proprietary binaries while experimenting.
{: .prompt-danger }

---

## Step 5 – Create a custom connector in the ChatGPT desktop app
Open the **ChatGPT desktop app**.

3. Click the **Back** button, then click **Create** in the top-right to create a new connector.

Fill in the fields:

* **Name**: e.g., `Binary Ninja MCP`
* **Description**: e.g., `Use Binary Ninja analysis tools from ChatGPT`
* **Icon**: You can use the Binary Ninja icon from:
* **MCP server URL**:
  Use the HTTPS endpoint from ngrok **plus `/mcp`**. For example:

  ```text
  https://your-random-subdomain.ngrok-free.app/mcp
  ```
Save the connector.

---

## Step 6 – Use Binary Ninja from the ChatGPT desktop app

Back in the ChatGPT desktop app, open a new chat:

1. In the model selector, choose the **Binary Ninja connector** you just created (or pick a GPT model and enable the connector under **Tools**, depending on the UI).
2. Start chatting and issue a request that uses Binary Ninja—for example:

   * “Analyze the current function.”
   * “Summarize cross-references to this address.”
   * “Map out the call graph starting from the current function.”
When ChatGPT calls a tool for the first time in a session, it will ask for permission:

* Approve the tool call.
* Optionally check **“Remember”** to auto-approve that tool for the rest of the session.

At this point, you have Binary Ninja wired into ChatGPT, with the MCP bridge and ngrok tunnel in between.

> If the connector appears but calls fail, double-check:
> – Is the MCP server running in the `bridge` environment?
> – Is ngrok still active and pointing at the correct port?
> – Did you include the `/mcp` suffix in the connector URL?
{: .prompt-tip }
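A quick way to run those checks from the command line is to send a bare JSON-RPC `initialize` request at the endpoint and see whether anything MCP-shaped answers. A stdlib-only sketch (the client name/version are arbitrary, and the protocol version string is an assumption; adjust it to whatever revision your server speaks):

```python
import json
from urllib.request import Request, urlopen

def initialize_request(url: str) -> Request:
    """Build the JSON-RPC 'initialize' call that every MCP client starts with."""
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # assumed; match your server
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.1"},
        },
    }
    return Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # streamable-http servers expect clients to accept both types
            "Accept": "application/json, text/event-stream",
        },
        method="POST",
    )

# Usage, with the tunnel up:
#   resp = urlopen(initialize_request("https://<your-subdomain>.ngrok-free.app/mcp"))
```

Any well-formed JSON-RPC response (even an error) means the server and tunnel are reachable; a connection failure points at ngrok or the bridge process instead.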
---

## Step 7 – Example reverse-engineering prompt
```text
Binary Ninja aids

* Prefer HLIL; drop lower when needed for precision.
```

You can tweak this further—for example, adding rules for specific malware families, internal naming conventions, or your own note-taking style—but it should give ChatGPT enough structure to perform serious, repeatable reverse-engineering passes with Binary Ninja.

---

That’s it—you now have a Binary-Ninja-to-ChatGPT workflow that’s:

* Local where it matters (Binary Ninja, your binaries),
* Cloud where it’s convenient (ChatGPT’s reasoning),
* And glued together with an MCP bridge plus ngrok.