Commit a598103

Python: Migrate to new Google GenAI SDK (#13371)
### Motivation and Context

Closes #12970, #13352

### Description

1. Migrate the Google AI connector to use the new Google GenAI SDK.
2. Mark the Vertex AI connector as deprecated in favor of the Google AI connector. With this update, users who want to connect to a Vertex AI project instead of a Google AI endpoint can do so with the Google AI connector.
3. Add the ability for users to provide a custom client to the Google AI connector (#13352).

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄
Parent commit: 5b30c8b

38 files changed: +4876 −4676 lines
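For downstream users, the migration is mostly an import and constructor change: Vertex AI callers switch to the Google AI connector and opt in with `use_vertexai=True`. A condensed before/after sketch, drawn from the sample updates in this PR:

```diff
-from semantic_kernel.connectors.ai.google.vertex_ai import VertexAIChatCompletion
-
-chat_service = VertexAIChatCompletion(service_id=service_id)
+from semantic_kernel.connectors.ai.google import GoogleAIChatCompletion
+
+chat_service = GoogleAIChatCompletion(service_id=service_id, use_vertexai=True)
```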

python/README.md

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ Highlights
 - Flexible Agent Framework: build, orchestrate, and deploy AI agents and multi-agent systems
 - Multi-Agent Systems: Model workflows and collaboration between AI specialists
 - Plugin Ecosystem: Extend with Python, OpenAPI, Model Context Protocol (MCP), and more
-- LLM Support: OpenAI, Azure OpenAI, Hugging Face, Mistral, Vertex AI, ONNX, Ollama, NVIDIA NIM, and others
+- LLM Support: OpenAI, Azure OpenAI, Hugging Face, Mistral, Google AI, ONNX, Ollama, NVIDIA NIM, and others
 - Vector DB Support: Azure AI Search, Elasticsearch, Chroma, and more
 - Process Framework: Build structured business processes with workflow modeling
 - Multimodal: Text, vision, audio

python/pyproject.toml

Lines changed: 2 additions & 2 deletions

@@ -24,7 +24,7 @@ classifiers = [
 ]
 dependencies = [
     # azure agents
-    "azure-ai-projects >= 1.0.0b12",
+    "azure-ai-projects ~= 1.0.0b12",
     "azure-ai-agents >= 1.2.0b3",
     "aiohttp ~= 3.8",
     "cloudevents ~=1.0",
@@ -93,7 +93,7 @@ faiss = [
 ]
 google = [
     "google-cloud-aiplatform ~= 1.114.0",
-    "google-generativeai ~= 0.8"
+    "google-genai ~= 1.51.0"
 ]
 hugging_face = [
     "transformers[torch] ~= 4.28",

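The dependency changes tighten the pins: `>=` allows any newer release, while `~=` is PEP 440's compatible-release operator, so `google-genai ~= 1.51.0` admits only `1.51.x` patch updates. A minimal stdlib sketch of the `~=` rule for plain numeric versions (pre-release tags such as `1.0.0b12` need a real resolver and are ignored here):

```python
def is_compatible(spec: str, version: str) -> bool:
    """Check PEP 440 compatible-release ('~=') for plain numeric versions.

    '~= X.Y.Z' means '>= X.Y.Z' and '== X.Y.*': only the final component
    of the spec may float upward. Pre-release suffixes are not handled.
    """
    floor = [int(p) for p in spec.split(".")]
    ver = [int(p) for p in version.split(".")]
    # At or above the floor, and matching on every component but the last.
    return ver >= floor and ver[: len(floor) - 1] == floor[:-1]

# '~= 1.51.0' (the new google-genai pin) accepts patch updates only:
print(is_compatible("1.51.0", "1.51.9"))  # True
print(is_compatible("1.51.0", "1.52.0"))  # False
# '~= 0.8' (the old google-generativeai pin) accepted any 0.x >= 0.8:
print(is_compatible("0.8", "0.9"))        # True
print(is_compatible("0.8", "1.0"))        # False
```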
python/samples/concepts/setup/ALL_SETTINGS.md

Lines changed: 3 additions & 3 deletions

@@ -25,9 +25,9 @@
 | Google AI | [GoogleAIChatCompletion](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_chat_completion.py) | gemini_model_id, <br> api_key | GOOGLE_AI_GEMINI_MODEL_ID, <br> GOOGLE_AI_API_KEY | Yes, <br> Yes | [GoogleAISettings](../../../semantic_kernel/connectors/ai/google/google_ai/google_ai_settings.py) |
 | | [GoogleAITextCompletion](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_completion.py) | gemini_model_id, <br> api_key | GOOGLE_AI_GEMINI_MODEL_ID, <br> GOOGLE_AI_API_KEY | Yes, <br> Yes | |
 | | [GoogleAITextEmbedding](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_embedding.py) | embedding_model_id, <br> api_key | GOOGLE_AI_EMBEDDING_MODEL_ID, <br> GOOGLE_AI_API_KEY | Yes, <br> Yes | |
-| Vertex AI | [VertexAIChatCompletion](../../../semantic_kernel/connectors/ai/google/vertex_ai/services/vertex_ai_chat_completion.py) | project_id, <br> region, <br> gemini_model_id | VERTEX_AI_PROJECT_ID, <br> VERTEX_AI_REGION, <br> VERTEX_AI_GEMINI_MODEL_ID | Yes, <br> No, <br> Yes | [VertexAISettings](../../../semantic_kernel/connectors/ai/google/vertex_ai/vertex_ai_settings.py) |
-| | [VertexAITextCompletion](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_completion.py) | project_id, <br> region, <br> gemini_model_id | VERTEX_AI_PROJECT_ID, <br> VERTEX_AI_REGION, <br> VERTEX_AI_GEMINI_MODEL_ID | Yes, <br> No, <br> Yes | |
-| | [VertexAITextEmbedding](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_embedding.py) | project_id, <br> region, <br> embedding_model_id | VERTEX_AI_PROJECT_ID, <br> VERTEX_AI_REGION, <br> VERTEX_AI_EMBEDDING_MODEL_ID | Yes, <br> No, <br> Yes | |
+| Vertex AI | [GoogleAIChatCompletion](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_chat_completion.py) | project_id, <br> region, <br> gemini_model_id | GOOGLE_AI_CLOUD_PROJECT_ID, <br> GOOGLE_AI_CLOUD_REGION, <br> GOOGLE_AI_GEMINI_MODEL_ID, <br> GOOGLE_AI_USE_VERTEXAI | Yes, <br> No, <br> Yes, <br> Yes (must set to true) | [GoogleAISettings](../../../semantic_kernel/connectors/ai/google/google_ai/google_ai_settings.py) |
+| | [GoogleAITextCompletion](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_completion.py) | project_id, <br> region, <br> gemini_model_id | GOOGLE_AI_CLOUD_PROJECT_ID, <br> GOOGLE_AI_CLOUD_REGION, <br> GOOGLE_AI_GEMINI_MODEL_ID, <br> GOOGLE_AI_USE_VERTEXAI | Yes, <br> No, <br> Yes, <br> Yes (must set to true) | |
+| | [GoogleAITextEmbedding](../../../semantic_kernel/connectors/ai/google/google_ai/services/google_ai_text_embedding.py) | project_id, <br> region, <br> embedding_model_id | GOOGLE_AI_CLOUD_PROJECT_ID, <br> GOOGLE_AI_CLOUD_REGION, <br> GOOGLE_AI_EMBEDDING_MODEL_ID, <br> GOOGLE_AI_USE_VERTEXAI | Yes, <br> No, <br> Yes, <br> Yes (must set to true) | |
 | HuggingFace | [HuggingFaceTextCompletion](../../../semantic_kernel/connectors/ai/hugging_face/services/hf_text_completion.py) | ai_model_id | N/A | Yes | |
 | | [HuggingFaceTextEmbedding](../../../semantic_kernel/connectors/ai/hugging_face/services/hf_text_embedding.py) | ai_model_id | N/A | Yes | |
 | NVIDIA NIM | [NvidiaChatCompletion](../../../semantic_kernel/connectors/ai/nvidia/services/nvidia_chat_completion.py) | ai_model_id, <br> api_key, <br> base_url | NVIDIA_CHAT_MODEL_ID, <br> NVIDIA_API_KEY, <br> NVIDIA_BASE_URL | Yes (default: meta/llama-3.1-8b-instruct), <br> Yes, <br> No | [NvidiaAISettings](../../../semantic_kernel/connectors/ai/nvidia/settings/nvidia_settings.py) |

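Per the settings table in this PR, the `GOOGLE_AI_*` variables replace the old `VERTEX_AI_*` ones. A sample `.env` fragment for pointing the Google AI connector at a Vertex AI project (the project and region values below are placeholders):

```
GOOGLE_AI_GEMINI_MODEL_ID=gemini-2.5-flash
GOOGLE_AI_CLOUD_PROJECT_ID=my-project
GOOGLE_AI_CLOUD_REGION=us-central1
GOOGLE_AI_USE_VERTEXAI=true
```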
python/samples/concepts/setup/chat_completion_services.py

Lines changed: 4 additions & 10 deletions

@@ -249,10 +249,7 @@ def get_google_ai_chat_completion_service_and_request_settings() -> tuple[
     Please refer to the Semantic Kernel Python documentation for more information:
     https://learn.microsoft.com/en-us/python/api/semantic-kernel/semantic_kernel?view=semantic-kernel
     """
-    from semantic_kernel.connectors.ai.google.google_ai import (
-        GoogleAIChatCompletion,
-        GoogleAIChatPromptExecutionSettings,
-    )
+    from semantic_kernel.connectors.ai.google import GoogleAIChatCompletion, GoogleAIChatPromptExecutionSettings
 
     chat_service = GoogleAIChatCompletion(service_id=service_id)
     request_settings = GoogleAIChatPromptExecutionSettings(service_id=service_id)
@@ -356,13 +353,10 @@ def get_vertex_ai_chat_completion_service_and_request_settings() -> tuple[
     Please refer to the Semantic Kernel Python documentation for more information:
     https://learn.microsoft.com/en-us/python/api/semantic-kernel/semantic_kernel?view=semantic-kernel
     """
-    from semantic_kernel.connectors.ai.google.vertex_ai import (
-        VertexAIChatCompletion,
-        VertexAIChatPromptExecutionSettings,
-    )
+    from semantic_kernel.connectors.ai.google import GoogleAIChatCompletion, GoogleAIChatPromptExecutionSettings
 
-    chat_service = VertexAIChatCompletion(service_id=service_id)
-    request_settings = VertexAIChatPromptExecutionSettings(service_id=service_id)
+    chat_service = GoogleAIChatCompletion(service_id=service_id, use_vertexai=True)
+    request_settings = GoogleAIChatPromptExecutionSettings(service_id=service_id)
 
     return chat_service, request_settings

python/samples/concepts/setup/text_completion_services.py

Lines changed: 3 additions & 6 deletions

@@ -262,12 +262,9 @@ def get_vertex_ai_text_completion_service_and_request_settings() -> tuple[
     Please refer to the Semantic Kernel Python documentation for more information:
     https://learn.microsoft.com/en-us/python/api/semantic-kernel/semantic_kernel?view=semantic-kernel
     """
-    from semantic_kernel.connectors.ai.google.vertex_ai import (
-        VertexAITextCompletion,
-        VertexAITextPromptExecutionSettings,
-    )
+    from semantic_kernel.connectors.ai.google import GoogleAITextCompletion, GoogleAITextPromptExecutionSettings
 
-    text_service = VertexAITextCompletion()
-    request_settings = VertexAITextPromptExecutionSettings()
+    text_service = GoogleAITextCompletion()
+    request_settings = GoogleAITextPromptExecutionSettings()
 
     return text_service, request_settings

python/samples/concepts/setup/text_embedding_services.py

Lines changed: 3 additions & 6 deletions

@@ -296,13 +296,10 @@ def get_vertex_ai_text_embedding_service_and_request_settings() -> tuple[
     Please refer to the Semantic Kernel Python documentation for more information:
     https://learn.microsoft.com/en-us/python/api/semantic-kernel/semantic_kernel?view=semantic-kernel
     """
-    from semantic_kernel.connectors.ai.google.vertex_ai import (
-        VertexAIEmbeddingPromptExecutionSettings,
-        VertexAITextEmbedding,
-    )
+    from semantic_kernel.connectors.ai.google import GoogleAIEmbeddingPromptExecutionSettings, GoogleAITextEmbedding
 
-    embedding_service = VertexAITextEmbedding()
+    embedding_service = GoogleAITextEmbedding()
     # Note: not all models support specifying the dimensions or there may be constraints on the dimensions
-    request_settings = VertexAIEmbeddingPromptExecutionSettings(output_dimensionality=768)
+    request_settings = GoogleAIEmbeddingPromptExecutionSettings(output_dimensionality=768)
 
     return embedding_service, request_settings

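`output_dimensionality=768` asks the service for 768-dimensional vectors; as the sample's comment notes, not every model supports this. Where it is supported, a shorter embedding is commonly produced by truncating the full vector and re-normalizing. A sketch of that idea (illustrative only; the actual behavior is model- and service-specific):

```python
import math


def shorten_embedding(vec: list[float], dim: int) -> list[float]:
    """Truncate an embedding to `dim` components and re-normalize to unit length.

    Shows why a smaller output_dimensionality can preserve usefulness for
    cosine similarity; not a claim about the Google SDK's internals.
    """
    if dim <= 0 or dim > len(vec):
        raise ValueError("dim must be in 1..len(vec)")
    head = vec[:dim]
    norm = math.sqrt(sum(x * x for x in head))
    # Re-normalize so cosine similarity still works on the shortened vector.
    return [x / norm for x in head] if norm else head


short = shorten_embedding([3.0, 4.0, 12.0], 2)
print(short)       # [0.6, 0.8]
print(len(short))  # 2
```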
python/semantic_kernel/connectors/ai/README.md

Lines changed: 3 additions & 3 deletions

@@ -38,9 +38,9 @@ All base clients inherit from the [`AIServiceClientBase`](../../services/ai_serv
 | [Google AI](./google/README.md) | [`GoogleAIChatCompletion`](./google/google_ai/services/google_ai_chat_completion.py) |
 | | [`GoogleAITextCompletion`](./google/google_ai/services/google_ai_text_completion.py) |
 | | [`GoogleAITextEmbedding`](./google/google_ai/services/google_ai_text_embedding.py) |
-| [Vertex AI](./google/README.md) | [`VertexAIChatCompletion`](./google/vertex_ai/services/vertex_ai_chat_completion.py) |
-| | [`VertexAITextCompletion`](./google/vertex_ai/services/vertex_ai_text_completion.py) |
-| | [`VertexAITextEmbedding`](./google/vertex_ai/services/vertex_ai_text_embedding.py) |
+| [Vertex AI](./google/README.md) | [`GoogleAIChatCompletion`](./google/google_ai/services/google_ai_chat_completion.py) |
+| | [`GoogleAITextCompletion`](./google/google_ai/services/google_ai_text_completion.py) |
+| | [`GoogleAITextEmbedding`](./google/google_ai/services/google_ai_text_embedding.py) |
 | HuggingFace | [`HuggingFaceTextCompletion`](./hugging_face/services/hf_text_completion.py) |
 | | [`HuggingFaceTextEmbedding`](./hugging_face/services/hf_text_embedding.py) |
 | Mistral AI | [`MistralAIChatCompletion`](./mistral_ai/services/mistral_ai_chat_completion.py) |

python/semantic_kernel/connectors/ai/google/README.md

Lines changed: 6 additions & 4 deletions

@@ -14,7 +14,7 @@ Once you have an API key, you can start using Gemini models in SK using the `goo
 kernel = Kernel()
 kernel.add_service(
     GoogleAIChatCompletion(
-        gemini_model_id="gemini-1.5-flash",
+        gemini_model_id="gemini-2.5-flash",
         api_key="...",
     )
 )
@@ -39,9 +39,11 @@ Once you have your project and your environment is set up, you can start using G
 ```Python
 kernel = Kernel()
 kernel.add_service(
-    VertexAIChatCompletion(
+    GoogleAIChatCompletion(
         project_id="...",
-        gemini_model_id="gemini-1.5-flash",
+        region="...",
+        gemini_model_id="gemini-2.5-flash",
+        use_vertexai=True,
     )
 )
 ...
@@ -51,4 +53,4 @@ kernel.add_service(
 
 ## Why is there code that looks almost identical in the implementations on the two connectors
 
-The two connectors have very similar implementations, including the utils files. However, they are fundamentally different as they depend on different packages from Google. Although the namings of many types are identical, they are different types.
+The two connectors have very similar implementations, including the utils files. However, they are fundamentally different as they depend on different packages from Google. Although the namings of many types are identical, they are different types.

(new file)

Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+# Copyright (c) Microsoft. All rights reserved.
+
+from semantic_kernel.connectors.ai.google.google_ai.google_ai_prompt_execution_settings import (
+    GoogleAIChatPromptExecutionSettings,
+    GoogleAIEmbeddingPromptExecutionSettings,
+    GoogleAIPromptExecutionSettings,
+    GoogleAITextPromptExecutionSettings,
+)
+from semantic_kernel.connectors.ai.google.google_ai.services.google_ai_chat_completion import GoogleAIChatCompletion
+from semantic_kernel.connectors.ai.google.google_ai.services.google_ai_text_completion import GoogleAITextCompletion
+from semantic_kernel.connectors.ai.google.google_ai.services.google_ai_text_embedding import GoogleAITextEmbedding
+
+__all__ = [
+    "GoogleAIChatCompletion",
+    "GoogleAIChatPromptExecutionSettings",
+    "GoogleAIEmbeddingPromptExecutionSettings",
+    "GoogleAIPromptExecutionSettings",
+    "GoogleAITextCompletion",
+    "GoogleAITextEmbedding",
+    "GoogleAITextPromptExecutionSettings",
+]

python/semantic_kernel/connectors/ai/google/google_ai/google_ai_prompt_execution_settings.py

Lines changed: 0 additions & 19 deletions

@@ -1,15 +1,9 @@
 # Copyright (c) Microsoft. All rights reserved.
 
-import sys
 from typing import Annotated, Any, Literal
 
 from pydantic import Field
 
-if sys.version_info >= (3, 12):
-    from typing import override  # pragma: no cover
-else:
-    from typing_extensions import override  # pragma: no cover
-
 from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
 
 
@@ -50,19 +44,6 @@ class GoogleAIChatPromptExecutionSettings(GoogleAIPromptExecutionSettings):
         ),
     ] = None
 
-    @override
-    def prepare_settings_dict(self, **kwargs) -> dict[str, Any]:
-        """Prepare the settings as a dictionary for sending to the AI service.
-
-        This method removes the tools and tool_choice keys from the settings dictionary, as
-        the Google AI service mandates these two settings to be sent as separate parameters.
-        """
-        settings_dict = super().prepare_settings_dict(**kwargs)
-        settings_dict.pop("tools", None)
-        settings_dict.pop("tool_config", None)
-
-        return settings_dict
-
 
 class GoogleAIEmbeddingPromptExecutionSettings(PromptExecutionSettings):
     """Google AI Embedding Prompt Execution Settings."""
