
Commit 403844e

feat: add Semantic Kernel Adapter documentation and usage examples in user guides (#5256)
Partially address #5205 and #5226
1 parent 7020f2a commit 403844e

File tree

3 files changed (+330, -107 lines)


python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb

+95
@@ -327,6 +327,101 @@
 "response = await model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n",
 "print(response)"
 ]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"## Semantic Kernel Adapter\n",
+"\n",
+"The {py:class}`~autogen_ext.models.semantic_kernel.SKChatCompletionAdapter`\n",
+"allows you to use Semantic Kernel model clients as a\n",
+"{py:class}`~autogen_core.models.ChatCompletionClient` by adapting them to the required interface.\n",
+"\n",
+"You need to install the relevant provider extras to use this adapter.\n",
+"\n",
+"The list of extras that can be installed:\n",
+"\n",
+"- `semantic-kernel-anthropic`: Install this extra to use Anthropic models.\n",
+"- `semantic-kernel-google`: Install this extra to use Google Gemini models.\n",
+"- `semantic-kernel-ollama`: Install this extra to use Ollama models.\n",
+"- `semantic-kernel-mistralai`: Install this extra to use MistralAI models.\n",
+"- `semantic-kernel-aws`: Install this extra to use AWS models.\n",
+"- `semantic-kernel-hugging-face`: Install this extra to use Hugging Face models.\n",
+"\n",
+"For example, to use Anthropic models, you need to install `semantic-kernel-anthropic`."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"vscode": {
+"languageId": "shellscript"
+}
+},
+"outputs": [],
+"source": [
+"# pip install \"autogen-ext[semantic-kernel-anthropic]\""
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"To use this adapter, you need to create a Semantic Kernel model client and pass it to the adapter.\n",
+"\n",
+"For example, to use the Anthropic model:"
+]
+},
+{
+"cell_type": "code",
+"execution_count": 1,
+"metadata": {},
+"outputs": [
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"finish_reason='stop' content='The capital of France is Paris. It is also the largest city in France and one of the most populous metropolitan areas in Europe.' usage=RequestUsage(prompt_tokens=0, completion_tokens=0) cached=False logprobs=None\n"
+]
+}
+],
+"source": [
+"import os\n",
+"\n",
+"from autogen_core.models import UserMessage\n",
+"from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter\n",
+"from semantic_kernel import Kernel\n",
+"from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion, AnthropicChatPromptExecutionSettings\n",
+"from semantic_kernel.memory.null_memory import NullMemory\n",
+"\n",
+"sk_client = AnthropicChatCompletion(\n",
+"    ai_model_id=\"claude-3-5-sonnet-20241022\",\n",
+"    api_key=os.environ[\"ANTHROPIC_API_KEY\"],\n",
+"    service_id=\"my-service-id\",  # Optional; for targeting specific services within Semantic Kernel\n",
+")\n",
+"settings = AnthropicChatPromptExecutionSettings(\n",
+"    temperature=0.2,\n",
+")\n",
+"\n",
+"anthropic_model_client = SKChatCompletionAdapter(\n",
+"    sk_client, kernel=Kernel(memory=NullMemory()), prompt_settings=settings\n",
+")\n",
+"\n",
+"# Call the model directly.\n",
+"model_result = await anthropic_model_client.create(\n",
+"    messages=[UserMessage(content=\"What is the capital of France?\", source=\"User\")]\n",
+")\n",
+"print(model_result)"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Read more about the [Semantic Kernel Adapter](../../../reference/python/autogen_ext.models.semantic_kernel.rst)."
+]
 }
 ],
 "metadata": {

python/packages/autogen-core/docs/src/user-guide/core-user-guide/components/model-clients.ipynb

+95
@@ -336,6 +336,101 @@
 "print(response)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"## Semantic Kernel Adapter\n",
+"\n",
+"The {py:class}`~autogen_ext.models.semantic_kernel.SKChatCompletionAdapter`\n",
+"allows you to use Semantic Kernel model clients as a\n",
+"{py:class}`~autogen_core.models.ChatCompletionClient` by adapting them to the required interface.\n",
+"\n",
+"You need to install the relevant provider extras to use this adapter.\n",
+"\n",
+"The list of extras that can be installed:\n",
+"\n",
+"- `semantic-kernel-anthropic`: Install this extra to use Anthropic models.\n",
+"- `semantic-kernel-google`: Install this extra to use Google Gemini models.\n",
+"- `semantic-kernel-ollama`: Install this extra to use Ollama models.\n",
+"- `semantic-kernel-mistralai`: Install this extra to use MistralAI models.\n",
+"- `semantic-kernel-aws`: Install this extra to use AWS models.\n",
+"- `semantic-kernel-hugging-face`: Install this extra to use Hugging Face models.\n",
+"\n",
+"For example, to use Anthropic models, you need to install `semantic-kernel-anthropic`."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"vscode": {
+"languageId": "shellscript"
+}
+},
+"outputs": [],
+"source": [
+"# pip install \"autogen-ext[semantic-kernel-anthropic]\""
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"To use this adapter, you need to create a Semantic Kernel model client and pass it to the adapter.\n",
+"\n",
+"For example, to use the Anthropic model:"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"finish_reason='stop' content='The capital of France is Paris. It is also the largest city in France and one of the most populous metropolitan areas in Europe.' usage=RequestUsage(prompt_tokens=0, completion_tokens=0) cached=False logprobs=None\n"
+]
+}
+],
+"source": [
+"import os\n",
+"\n",
+"from autogen_core.models import UserMessage\n",
+"from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter\n",
+"from semantic_kernel import Kernel\n",
+"from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion, AnthropicChatPromptExecutionSettings\n",
+"from semantic_kernel.memory.null_memory import NullMemory\n",
+"\n",
+"sk_client = AnthropicChatCompletion(\n",
+"    ai_model_id=\"claude-3-5-sonnet-20241022\",\n",
+"    api_key=os.environ[\"ANTHROPIC_API_KEY\"],\n",
+"    service_id=\"my-service-id\",  # Optional; for targeting specific services within Semantic Kernel\n",
+")\n",
+"settings = AnthropicChatPromptExecutionSettings(\n",
+"    temperature=0.2,\n",
+")\n",
+"\n",
+"anthropic_model_client = SKChatCompletionAdapter(\n",
+"    sk_client, kernel=Kernel(memory=NullMemory()), prompt_settings=settings\n",
+")\n",
+"\n",
+"# Call the model directly.\n",
+"model_result = await anthropic_model_client.create(\n",
+"    messages=[UserMessage(content=\"What is the capital of France?\", source=\"User\")]\n",
+")\n",
+"print(model_result)"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Read more about the [Semantic Kernel Adapter](../../../reference/python/autogen_ext.models.semantic_kernel.rst)."
+]
+},
 {
 "cell_type": "markdown",
 "metadata": {},
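The provider-to-extra mapping documented in the added markdown cell is mechanical, so it can be captured in a small lookup. The extra names below come from that list; the `install_command` helper itself is hypothetical, shown only to make the mapping concrete:

```python
# Map each provider named in the docs to its autogen-ext extra.
# The extra names are from the documented list; the helper is illustrative.
EXTRAS = {
    "anthropic": "semantic-kernel-anthropic",
    "google": "semantic-kernel-google",
    "ollama": "semantic-kernel-ollama",
    "mistralai": "semantic-kernel-mistralai",
    "aws": "semantic-kernel-aws",
    "hugging-face": "semantic-kernel-hugging-face",
}


def install_command(provider: str) -> str:
    """Return the pip command that installs the adapter extra for a provider."""
    return f'pip install "autogen-ext[{EXTRAS[provider]}]"'


print(install_command("anthropic"))
# pip install "autogen-ext[semantic-kernel-anthropic]"
```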
