OpenAI Features New o3-mini Model on Microsoft Azure OpenAI Service

OpenAI has launched the o3-mini model, which is now accessible through the Microsoft Azure OpenAI Service. According to the company, this model represents an advancement in AI technology, featuring improved cost efficiency and enhanced reasoning capabilities compared to the previous o1-mini model released last September.

The o3-mini model is expected to benefit developers and enterprises looking to enhance their AI applications. It offers faster performance and lower latency while effectively handling more complex reasoning tasks. Yina Arenas, Vice President of Product for Core AI at Microsoft, writes in an AI and machine learning blog post:

With faster performance and lower latency, o3-mini is designed to handle complex reasoning workloads while maintaining efficiency.

A notable new aspect of the o3-mini model is the reasoning effort parameter. This feature allows users to adjust the model's cognitive load, enabling low, medium, or high levels of reasoning. For instance, a low level of reasoning might be suitable for simple data processing tasks, while a high level could be used for complex decision-making processes.

An example is given in the Vercel AI SDK documentation:

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Reduce reasoning effort for faster responses
const { text } = await generateText({
  model: openai('o3-mini'),
  prompt: 'Explain quantum entanglement briefly.',
  providerOptions: {
    openai: { reasoningEffort: 'low' },
  },
});

Additionally, the o3-mini model supports structured outputs by incorporating JSON Schema constraints. This feature ensures that the model's outputs are in a format that is easily understandable and usable by other systems, which facilitates automated workflows and makes it more straightforward for organizations to integrate AI into their existing systems. A REST call with structured output could look like this:

curl -X POST "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_MODEL_DEPLOYMENT_NAME/chat/completions?api-version=2025-01-31" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "Extract the event information."},
      {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."}
    ],
    "response_format": {
      "type": "json_schema",
      "json_schema": {
        "name": "CalendarEventResponse",
        "strict": true,
        "schema": {
          "type": "object",
          "properties": {
            "name": {"type": "string"},
            "date": {"type": "string"},
            "participants": {
              "type": "array",
              "items": {"type": "string"}
            }
          },
          "required": ["name", "date", "participants"],
          "additionalProperties": false
        }
      }
    }
  }'

Note: the example targets o3-mini version 2025-01-31.

The model also supports function calling and external tools, like its predecessor, making it suitable for various AI-powered automation tasks. These tasks could include automating customer support responses, managing inventory levels, or even controlling manufacturing processes.
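As a rough sketch of what attaching a tool to a request might look like, the snippet below builds a chat completions request body with a single function definition. The tool name `check_inventory` and its parameter schema are illustrative assumptions, not part of the article or any official API.

```typescript
// Hypothetical tool definition for a chat completions request.
// The function name and parameter schema are illustrative, not official.
type ToolDefinition = {
  type: 'function';
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>;
  };
};

const checkInventoryTool: ToolDefinition = {
  type: 'function',
  function: {
    name: 'check_inventory',
    description: 'Look up the current stock level for a product SKU.',
    parameters: {
      type: 'object',
      properties: {
        sku: { type: 'string', description: 'Product SKU to look up' },
      },
      required: ['sku'],
    },
  },
};

// The tools array is sent alongside the messages in the request body;
// the model can then decide to emit a call to check_inventory.
const requestBody = {
  messages: [
    { role: 'user', content: 'How many units of SKU A-123 are left?' },
  ],
  tools: [checkInventoryTool],
};

console.log(requestBody.tools[0].function.name); // prints "check_inventory"
```

The shape mirrors the tools format used across OpenAI-compatible chat completions endpoints; the application is still responsible for executing the tool and returning its result to the model.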

Another significant change is the introduction of developer messages, which replace the previous system messages. This new approach provides a more structured framework for instruction handling, enabling developers to create more responsive AI applications. Moreover, the Azure OpenAI Service has ensured backward compatibility by mapping old system messages to the new format to assist with transitions.
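The backward-compatibility mapping can be illustrated with a small helper that rewrites legacy "system" messages to the new "developer" role. This is a sketch of the idea, not the service's actual implementation; the types and function name are assumptions.

```typescript
// Message roles: "developer" replaces the legacy "system" role for o3-mini.
type ChatMessage = {
  role: 'system' | 'developer' | 'user' | 'assistant';
  content: string;
};

// Illustrative helper mirroring the backward-compatibility behavior:
// legacy "system" messages are re-labeled as "developer" messages.
function toDeveloperMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.map((m): ChatMessage =>
    m.role === 'system' ? { ...m, role: 'developer' } : m
  );
}

const legacy: ChatMessage[] = [
  { role: 'system', content: 'Answer concisely.' },
  { role: 'user', content: 'What is o3-mini?' },
];

const migrated = toDeveloperMessages(legacy);
console.log(migrated[0].role); // prints "developer"
```

User and assistant messages pass through unchanged; only the instruction-bearing message changes role.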

Continuing its predecessor's capabilities, the o3-mini model improves on key areas such as coding, mathematics, and scientific reasoning. These enhancements are essential for organizations requiring high-performance AI solutions.

ShinChven Zhang concludes in a blog post:

While the o3-mini currently lacks support for image processing, its text-only processing capability, advanced reasoning abilities, and cost-effectiveness make it a compelling choice for various applications. The availability of o3-mini to free users in ChatGPT is a significant step towards democratizing access to powerful AI models, potentially driving innovation in multiple fields, such as coding, STEM research, and AI-powered automation.

Lastly, developers can learn more about OpenAI o3-mini in GitHub Copilot and GitHub Models, and can sign up for Azure AI Foundry to access o3-mini and other advanced AI models.
