
I am trying to run this simple Python code:

import os
from autogen import AssistantAgent, UserProxyAgent
llm_config = {
    "config_list": [
        {
            "model": "DeepSeek-R1",
            "api_key": "XXX",
            "base_url": "https://aistudioaiservicesXXXX.openai.azure.com/",
            "api_type": "azure",
            "api_version": "2024-05-01-preview"  # Adjust to match your configuration
        }
    ]
}
assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent("user_proxy", code_execution_config=False)
user_proxy.initiate_chat(
    assistant,
    message="Tell me a joke about NVDA and TESLA stock prices.",
)

But I get this error:

InternalServerError: Error code: 500 - {'error': {'code': 'InternalServerError', 'message': 'Backend returned unexpected response. Please contact Microsoft for help.'}}

How can I fix this issue?

  • move to azure.ai.inference Commented Feb 24 at 10:33

1 Answer


Deploy a gpt-35-turbo (or other) model in Azure OpenAI to use AutoGen; follow the instructions provided in this documentation.

Ensure that the following dependencies are installed: pyautogen, autogenstudio, requests, and json. Also ensure that the following services are running: autogenstudio ui --port 8088 and Docker.
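As a quick sanity check before launching AutoGen Studio, you can verify that the required packages are importable. This is a minimal sketch, assuming the package names listed above (pyautogen installs the autogen module):

import importlib.util

# Package names assumed from the list above; adjust if your environment differs.
required = ["autogen", "autogenstudio", "requests"]

missing = [name for name in required if importlib.util.find_spec(name) is None]
if missing:
    raise SystemExit(f"Missing packages: {', '.join(missing)} - install them with pip first.")
print("All required packages are importable.")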


import autogen
import os

AZURE_OPENAI_API_KEY ="Your Azure OpenAI Service API Key"
AZURE_OPENAI_ENDPOINT = "Your Azure OpenAI Service Endpoint"
AZURE_DEPLOYMENT_NAME = "gpt-35-turbo"
AZURE_API_VERSION = "2023-12-01-preview"

if not AZURE_OPENAI_API_KEY or not AZURE_OPENAI_ENDPOINT or not AZURE_DEPLOYMENT_NAME:
    raise ValueError("ERROR: Please set your Azure OpenAI API key, endpoint, and deployment name.")

config_list = [
    {
        "model": AZURE_DEPLOYMENT_NAME,
        "api_key": AZURE_OPENAI_API_KEY,
        "base_url": AZURE_OPENAI_ENDPOINT,
        "api_type": "azure",
        "api_version": AZURE_API_VERSION,
    }
]
assistant = autogen.AssistantAgent(
    name="azure_assistant",
    llm_config={"cache_seed": 42, "config_list": config_list}
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="ALWAYS" 
)
if __name__ == "__main__":
    print("\n Chat with your Azure OpenAI Assistant! Type 'exit' to quit.\n")
    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            print("Exiting chat.")
            break
        response = user_proxy.initiate_chat(assistant, message=user_input)
        print(f"Assistant: {response}")

Output

  • Since the DeepSeek-R1 model has not yet launched in Azure OpenAI, to use DeepSeek-R1 you need an Azure AI project, hub, etc. in Azure AI Studio. Refer to this MSDOC for the prerequisites.


To connect to DeepSeek-R1 we have to use the azure-ai-inference package, so we need to replace autogen with azure-ai-inference. Below is sample code to connect to the DeepSeek-R1 model in an Azure AI project.

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

api_key = 'api_key'
if not api_key:
    raise Exception("A key should be provided to invoke the endpoint")

client = ChatCompletionsClient(
    endpoint='endpoint',
    credential=AzureKeyCredential(api_key)
)

model_info = client.get_model_info()
print("Model name:", model_info.model_name)
print("Model type:", model_info.model_type)
print("Model provider name:", model_info.model_provider_name)

payload = {
  "messages": [
    {
      "role": "user",
      "content": "I am going to Paris, what should I see?"
    },
    {
      "role": "assistant",
      "content": "Paris, the capital of France, is known for its stunning architecture, art museums, historical landmarks, and romantic atmosphere. Here are some of the top attractions to see in Paris:\n\n1. The Eiffel Tower: The iconic Eiffel Tower is one of the most recognizable landmarks in the world and offers breathtaking views of the city.\n2. The Louvre Museum: The Louvre is one of the world's largest and most famous museums, housing an impressive collection of art and artifacts, including the Mona Lisa.\n3. Notre-Dame Cathedral: This beautiful cathedral is one of the most famous landmarks in Paris and is known for its Gothic architecture and stunning stained glass windows.\n\nThese are just a few of the many attractions that Paris has to offer. With so much to see and do, it's no wonder that Paris is one of the most popular tourist destinations in the world."
    },
    {
      "role": "user",
      "content": "What is so great about #1?"
    }
  ],
  "max_tokens": 2048
}
response = client.complete(payload)

print("Response:", response.choices[0].message.content)
print("Model:", response.model)
print("Usage:")
print(" Prompt tokens:", response.usage.prompt_tokens)
print(" Total tokens:", response.usage.total_tokens)
print(" Completion tokens:", response.usage.completion_tokens)

Output
