v0.0.90
Added
- Added audio filter `KrispVivaFilter` using the Krisp VIVA SDK.
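  A minimal sketch of enabling an input audio filter on a transport; the import path for `KrispVivaFilter` and its constructor arguments are assumptions, since this entry only names the class:

  ```python
  # Assumed import path; check where the release exports KrispVivaFilter.
  from pipecat.audio.filters.krisp_viva_filter import KrispVivaFilter
  from pipecat.transports.base_transport import TransportParams

  # Attach the filter to incoming audio; the real constructor may need
  # extra arguments (e.g. a model path), which are omitted here.
  params = TransportParams(
      audio_in_enabled=True,
      audio_in_filter=KrispVivaFilter(),
  )
  ```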
- Added `--folder` argument to the runner, allowing files saved in that folder to be downloaded from `http://HOST:PORT/file/FILE`.
- Added `GeminiLiveVertexLLMService`, for accessing Gemini Live via Google Vertex AI.
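  A hedged instantiation sketch; the import path and the credential/project parameters are assumptions modeled on the existing Vertex AI services, so verify them against the release:

  ```python
  # Assumed import path and parameter names.
  from pipecat.services.google import GeminiLiveVertexLLMService

  llm = GeminiLiveVertexLLMService(
      credentials_path="/path/to/service-account.json",  # assumed parameter
      project_id="my-gcp-project",                       # assumed parameter
      location="us-central1",                            # assumed parameter
  )
  ```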
- Added some new configuration options to `GeminiLiveLLMService`:

  - `thinking`
  - `enable_affective_dialog`
  - `proactivity`

  Note that these new configuration options require using a newer model than the default, like `"gemini-2.5-flash-native-audio-preview-09-2025"`. The last two require specifying `http_options=HttpOptions(api_version="v1alpha")`.
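  A configuration sketch for these options; the import path and whether they are passed as constructor arguments (rather than via `InputParams`) are assumptions, and the value shapes for `thinking` and `proactivity` are guesses, so check the service's signature:

  ```python
  from google.genai.types import HttpOptions

  # Assumed import path.
  from pipecat.services.google import GeminiLiveLLMService

  llm = GeminiLiveLLMService(
      api_key="...",
      model="gemini-2.5-flash-native-audio-preview-09-2025",
      thinking=True,                  # assumed value shape
      enable_affective_dialog=True,
      proactivity=True,               # assumed value shape
      # Required by enable_affective_dialog and proactivity.
      http_options=HttpOptions(api_version="v1alpha"),
  )
  ```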
- Added `on_pipeline_error` event to `PipelineTask`. This event will get fired when an `ErrorFrame` is pushed (use `FrameProcessor.push_error()`).

  ```python
  @task.event_handler("on_pipeline_error")
  async def on_pipeline_error(task: PipelineTask, frame: ErrorFrame):
      ...
  ```
- Added a `service_tier` `InputParam` to `BaseOpenAILLMService`. This parameter can influence the latency of the response. For example, `"priority"` will result in faster completions in exchange for a higher price.
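  A minimal sketch of setting the tier; passing it through `InputParams` on an OpenAI-backed service is an assumption based on this entry:

  ```python
  from pipecat.services.openai.llm import OpenAILLMService

  # "priority" trades a higher price for lower-latency completions.
  llm = OpenAILLMService(
      model="gpt-4o",
      params=OpenAILLMService.InputParams(service_tier="priority"),
  )
  ```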
Changed
- Updated `GeminiLiveLLMService` to use the `google-genai` library rather than use WebSockets directly.
Deprecated
- `LivekitFrameSerializer` is now deprecated. Use `LiveKitTransport` instead.
- `pipecat.services.openai_realtime` is now deprecated. Use `pipecat.services.openai.realtime` instead, or `pipecat.services.azure.realtime` for Azure Realtime.
- `pipecat.services.aws_nova_sonic` is now deprecated. Use `pipecat.services.aws.nova_sonic` instead.
- `GeminiMultimodalLiveLLMService` is now deprecated. Use `GeminiLiveLLMService` instead.
Fixed
- Fixed a `GoogleVertexLLMService` issue that would generate an error if no token information was returned.
- `GeminiLiveLLMService` will now end gracefully (i.e. after the bot has finished) upon receiving an `EndFrame`.
- `GeminiLiveLLMService` will try to seamlessly reconnect when it loses its connection.