
Commit 30862e3

(docs): support for openai.tools.mcp (#10073)
## Background

As a follow-up to PR #10026.

## Summary

Docs updated to include an example and configuration info.
1 parent f4eb20f commit 30862e3

3 files changed, +106 -0 lines changed


content/cookbook/00-guides/19-openai-responses.mdx

Lines changed: 25 additions & 0 deletions
@@ -130,6 +130,31 @@ console.log(result.text);
console.log(result.sources);
```

### MCP Tool

The Responses API also supports connecting to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers. This allows models to call tools exposed by remote MCP servers or service connectors.

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai.responses('gpt-5-mini'),
  prompt: 'Search the web for the latest NYC mayoral election results',
  tools: {
    mcp: openai.tools.mcp({
      serverLabel: 'web-search',
      serverUrl: 'https://mcp.exa.ai/mcp',
      serverDescription: 'A web-search API for AI agents',
    }),
  },
});

console.log(result.text);
```
For more details on configuring the MCP tool, including authentication, tool filtering, and connector support, see the [OpenAI provider documentation](/providers/ai-sdk-providers/openai#mcp-tool).
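As a rough sketch of what that extra configuration can look like, the snippet below restricts the server to read-only tools and passes an OAuth token; the server URL, filter value, and token are placeholders rather than working credentials:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai.responses('gpt-5-mini'),
  prompt: 'Search the web for the latest NYC mayoral election results',
  tools: {
    mcp: openai.tools.mcp({
      serverLabel: 'web-search',
      // Placeholder URL; point this at your own MCP server.
      serverUrl: 'https://mcp.example.com/mcp',
      // Only expose read-only tools from the server to the model.
      allowedTools: { readOnly: true },
      // Placeholder OAuth access token for servers that require authentication.
      authorization: 'YOUR_OAUTH_ACCESS_TOKEN',
    }),
  },
});

console.log(result.text);
```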
## Using Persistence

With the Responses API, you can persist chat history with OpenAI across requests. This allows you to send just the user's last message and OpenAI can access the entire chat history:

content/docs/03-ai-sdk-core/16-mcp-tools.mdx

Lines changed: 7 additions & 0 deletions
@@ -12,6 +12,13 @@ description: Learn how to connect to Model Context Protocol (MCP) servers and us
The AI SDK supports connecting to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers to access their tools, resources, and prompts.
This enables your AI applications to discover and use capabilities across various services through a standardized interface.

<Note>
  If you're using OpenAI's Responses API, you can also use the built-in `openai.tools.mcp` tool, which provides direct MCP server integration without needing to convert tools. See the [OpenAI provider documentation](/providers/ai-sdk-providers/openai#mcp-tool) for details.
</Note>
## Initializing an MCP Client

We recommend using HTTP transport (like `StreamableHTTPClientTransport`) for production deployments. The stdio transport should only be used for connecting to local servers as it cannot be deployed to production environments.
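As a minimal sketch of that recommendation, assuming the `experimental_createMCPClient` helper exported from the `ai` package and the HTTP transport from the official `@modelcontextprotocol/sdk` (the server URL is a placeholder):

```ts
import { experimental_createMCPClient as createMCPClient } from 'ai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

// Placeholder URL; point this at your deployed MCP server.
const transport = new StreamableHTTPClientTransport(
  new URL('https://my-mcp-server.example.com/mcp'),
);

const mcpClient = await createMCPClient({ transport });

try {
  // Fetch the server's tools as AI SDK tools.
  const tools = await mcpClient.tools();
  console.log(Object.keys(tools));
} finally {
  // Close the client to release the underlying connection.
  await mcpClient.close();
}
```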

content/providers/01-ai-sdk-providers/03-openai.mdx

Lines changed: 74 additions & 0 deletions
@@ -486,6 +486,80 @@ The code interpreter tool can be configured with:
  be customized.
</Note>

#### MCP Tool

The OpenAI responses API supports connecting to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers through the `openai.tools.mcp` tool. This allows models to call tools exposed by remote MCP servers or service connectors.

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-5'),
  prompt: 'Search the web for the latest news about AI developments',
  tools: {
    mcp: openai.tools.mcp({
      serverLabel: 'web-search',
      serverUrl: 'https://mcp.exa.ai/mcp',
      serverDescription: 'A web-search API for AI agents',
    }),
  },
});
```

The MCP tool can be configured with:

- **serverLabel** _string_ (required)

  A label to identify the MCP server. This label is used in tool calls to distinguish between multiple MCP servers.

- **serverUrl** _string_ (required if `connectorId` is not provided)

  The URL for the MCP server. Either `serverUrl` or `connectorId` must be provided.

- **connectorId** _string_ (required if `serverUrl` is not provided)

  Identifier for a service connector. Either `serverUrl` or `connectorId` must be provided.

- **serverDescription** _string_ (optional)

  Optional description of the MCP server that helps the model understand its purpose.

- **allowedTools** _string[] | object_ (optional)

  Controls which tools from the MCP server are available. Can be:

  - An array of tool names: `['tool1', 'tool2']`
  - An object with filters:
    ```ts
    {
      readOnly: true, // Only allow read-only tools
      toolNames: ['tool1', 'tool2'] // Specific tool names
    }
    ```

- **authorization** _string_ (optional)

  OAuth access token for authenticating with the MCP server or connector.

- **headers** _Record&lt;string, string&gt;_ (optional)

  Optional HTTP headers to include in requests to the MCP server.
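For instance, a sketch that combines several of the options above; the server URL, tool names, token, and header value are placeholders, not a real service:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-5'),
  prompt: 'Summarize my three most recent support tickets',
  tools: {
    mcp: openai.tools.mcp({
      serverLabel: 'support-desk',
      // Placeholder MCP server URL.
      serverUrl: 'https://mcp.example.com/mcp',
      serverDescription: 'Read-only access to support tickets',
      // Limit the model to these tool names (placeholders).
      allowedTools: ['list_tickets', 'get_ticket'],
      // Placeholder OAuth access token for the server.
      authorization: 'YOUR_OAUTH_ACCESS_TOKEN',
      // Extra HTTP headers sent with every request to the server.
      headers: { 'x-tenant-id': 'acme' },
    }),
  },
});

console.log(result.text);
```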
<Note>
  Tool calls made by the model when using the OpenAI MCP tool are approved by default. Be sure to connect only to MCP servers that you trust with your data.
</Note>
<Note>
  The OpenAI MCP tool is different from the general MCP client approach documented in [MCP Tools](/docs/ai-sdk-core/mcp-tools). The OpenAI MCP tool is a built-in provider-defined tool that allows OpenAI models to directly connect to MCP servers, while the general MCP client requires you to convert MCP tools to AI SDK tools first.
</Note>
#### Local Shell Tool

The OpenAI responses API supports the local shell tool for Codex models through the `openai.tools.localShell` tool.
