OpenAI’s powerful AI models, notably GPT-4 and ChatGPT, are also available on Microsoft Azure. So what does this mean for Azure customers? What is the background to this and what are the benefits for businesses of sourcing AI services from Azure rather than directly from OpenAI? Read this article to learn more.
What is OpenAI?
OpenAI, the company behind the GPT language models and ChatGPT, is one of the world’s leading developers of artificial intelligence (AI). Founded in 2015 as a non-profit organisation funded by donations, OpenAI soon released its first AI solutions. 2016 saw the launch of OpenAI Gym, a toolkit of simulated environments for developing and benchmarking reinforcement learning algorithms. Work on the complex computer game Dota 2 followed: in 2017, an OpenAI agent defeated professional players in a simplified one-on-one version of the game, and the five-agent team OpenAI Five later took on, and eventually beat, human professionals in full matches.
In 2018, the first Generative Pre-trained Transformer (GPT) language models appeared, the type of model now known as large language models (LLMs). With these LLMs, especially GPT-3 in 2020 and ChatGPT in 2022, OpenAI attracted widespread attention and significantly advanced research in generative AI and natural language processing (NLP).
Partnership with Microsoft
Top AI researchers cost money, and training large AI models requires enormous computing power. Although Microsoft and AWS provided at least $800,000 worth of cloud services for free in the early years, donations were not enough to keep OpenAI running: in 2018, OpenAI needed to raise $30 million for cloud computing. In 2019, the company moved away from its purely non-profit structure and accepted a $1 billion investment from Microsoft.
Since then, OpenAI’s systems have been running on high-performance computing infrastructure in Azure. Microsoft exclusively licensed the GPT-3 models in 2020 and a year later launched Azure OpenAI, a service that made these models available to Azure customers. In 2023, Microsoft announced further multi-billion dollar investments in OpenAI and, in return, is integrating OpenAI’s models deeply into its own products.
Within Azure AI, Microsoft’s cloud portfolio for AI applications, this is done with the Azure OpenAI service, which has been generally available since early 2023. This allows Azure customers to use the capabilities of the latest OpenAI models, including ChatGPT and GPT-4, directly through their Azure subscription.
Azure OpenAI Service
Azure OpenAI Service provides Azure customers with the functionality of current OpenAI models, optimised for enterprise use. Models can be accessed via REST APIs, the web-based Azure OpenAI Studio, or via an SDK (software development kit).
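To make SDK access concrete, here is a minimal sketch using the openai Python package (v0.x, as commonly used in 2023) configured for Azure. The resource name, deployment name, API version and key are placeholders you would replace with your own values, not fixed names from the service.

```python
import openai

# Point the openai package at your Azure OpenAI resource instead of openai.com.
# Resource name, key, deployment name and API version are placeholders.
openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "<your-azure-openai-key>"

response = openai.ChatCompletion.create(
    engine="my-gpt35-deployment",  # the name of *your* deployment, not the model itself
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain in one sentence what Azure OpenAI Service is."},
    ],
    temperature=0.7,
)

print(response["choices"][0]["message"]["content"])
```

Note that, unlike the OpenAI API, you address a named deployment of a model rather than the model name directly; the deployment is created in your Azure OpenAI resource beforehand.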
As of August 2023, the following AI model families are available in Azure OpenAI for new deployments (see the model documentation for the latest information):
- GPT-3.5-Turbo: Family of LLMs that can process and generate natural language as well as code (an evolution of GPT-3).
- GPT-4: the successor to GPT-3.5 / GPT-3.5-Turbo, designed to be “multimodal”, i.e. in principle also capable of accepting images as input (image input is not yet available in Azure OpenAI).
- Embeddings: models that convert text into a numerical representation (vectors) for further processing. This helps determine relationships between two pieces of text, such as semantic similarity (see the sketch after this list).
- DALL·E: models that can generate images from natural language instructions (currently DALL·E 2).
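As an illustration of the Embeddings models mentioned above, the following sketch compares two texts via cosine similarity. It again uses the openai Python package (v0.x) against Azure; the deployment name "my-embedding-deployment" is a placeholder for your own deployment of an embeddings model such as text-embedding-ada-002.

```python
import numpy as np
import openai

# Same Azure configuration as in the earlier example; all values are placeholders.
openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "<your-azure-openai-key>"

def embed(text: str) -> np.ndarray:
    # "my-embedding-deployment" stands in for your own embeddings deployment.
    result = openai.Embedding.create(engine="my-embedding-deployment", input=text)
    return np.array(result["data"][0]["embedding"])

a = embed("How do I reset my password?")
b = embed("Instructions for recovering a forgotten password")

# Cosine similarity: values close to 1 indicate semantically similar texts.
similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"Cosine similarity: {similarity:.3f}")
```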
In Azure OpenAI Studio you create and manage your model deployments. Once a model is deployed, you can use it in your own applications through API calls. The Playground in Azure OpenAI Studio also lets you experiment with prompts and settings before writing any code.
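If you prefer to call the REST API directly rather than go through an SDK, the request looks roughly like the sketch below: the deployment name appears in the URL path. Endpoint, deployment name, API version and key are again placeholders, not fixed values.

```python
import requests

# Placeholders: replace with your own resource name, deployment name and key.
resource = "<your-resource-name>"
deployment = "my-gpt35-deployment"
api_version = "2023-05-15"

url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={api_version}"
)

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What can I build with Azure OpenAI Service?"},
    ]
}

# Azure OpenAI authenticates REST calls with an "api-key" header (or Azure AD tokens).
response = requests.post(url, json=payload, headers={"api-key": "<your-azure-openai-key>"})
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```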
It is also possible to optimise models for specific applications, i.e. to fine-tune them with your own data. At present, only existing customers with legacy GPT-3 models can do this, but fine-tuning for newer models is planned. A newer capability, added in June 2023, is the ability to connect your own data to Azure OpenAI and run current models on it without any additional training or fine-tuning.
Advantages of Azure OpenAI over OpenAI
So what are the advantages or disadvantages of using OpenAI models in Azure rather than via the OpenAI API (or Playground or other web interfaces)?
At first glance, the services are very similar: (almost) the same models are available and the APIs offer the same functionality. For example, you can use them to write documents or computer code, answer questions, analyse text, create chatbots, translate texts or provide a voice interface for applications. OpenAI also provides several support channels for paying customers. And the price per 1,000 tokens is the same for both services.
The big advantage of Azure OpenAI is Azure’s strong focus on enterprise customers and the requirements of production environments, such as availability, security, privacy and compliance.
Availability and reliability
Azure OpenAI Service runs on the global Azure infrastructure with more than 60 Azure regions. Azure also offers a range of high availability, monitoring, disaster recovery and backup features, with multiple redundancy options such as availability zones and availability sets.
In theory, this means that Azure also offers high availability and high performance for OpenAI models in the regions where you want to operate. In practice, however, you should check this in advance, as not all models are available in all regions (you can also check the latest information in the model documentation).
Security, privacy & compliance
Azure has many security features, including rights management with role-based access control, strong encryption for your data (both at rest and in transit), layered security for data centres, cloud infrastructure and processes in Azure, and threat intelligence. OpenAI cannot compete with this, even though OpenAI’s systems are fundamentally secure because they are hosted in Azure.
For enterprise use, privacy and compliance are particularly important. With OpenAI’s own web interfaces (e.g. ChatGPT, DALL·E), it cannot be ruled out that the data you submit will be used to train the models. This effectively rules out submitting personal data.
Azure, on the other hand, ensures that no one but the customer has access to customer data, not even Microsoft. Azure OpenAI supports isolated and secured virtual networks and Private Link for secure connections. Microsoft also holds over 100 compliance certifications across a wide range of industries and geographies, including ISO 27001, ISO 27018, SOC 1, 2 and 3, C5, the GDPR and many more.
AI content filtering
Another distinguishing feature of Azure AI is AI content filtering and misuse monitoring, part of Microsoft’s efforts to promote the responsible use of AI. The OpenAI models were trained on large amounts of data from the internet, which inevitably introduced social bias and other undesirable content into the training data, and this can be reflected in the models’ output. Azure AI therefore provides the ability to filter and moderate both inputs and outputs for harmful content, and to monitor for misuse by users. The sketch below shows roughly how this surfaces to an application.
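When a prompt is blocked by the content filter, the service rejects the request, which the openai Python package (v0.x) raises as an InvalidRequestError. The exact error code and response fields depend on the service version, so treat the "content_filter" check below as an assumption to verify against the current documentation.

```python
import openai

# Azure configuration as in the earlier examples (all values are placeholders).
openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "<your-azure-openai-key>"

try:
    response = openai.ChatCompletion.create(
        engine="my-gpt35-deployment",  # placeholder deployment name
        messages=[{"role": "user", "content": "Some user-supplied prompt"}],
    )
    print(response["choices"][0]["message"]["content"])
except openai.error.InvalidRequestError as err:
    # Assumption: a filtered prompt is rejected with the error code "content_filter".
    if getattr(err, "code", None) == "content_filter":
        print("The prompt was blocked by Azure OpenAI content filtering.")
    else:
        raise
```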
Conclusion
If you want to explore the capabilities of the OpenAI models or try out the latest beta versions, you will find many exciting opportunities with OpenAI. But under no circumstances should personal or business-critical data be submitted to OpenAI.
The Azure OpenAI Service is therefore the solution of choice for productive enterprise deployments. Integrated with Microsoft Azure, it provides the availability and security you need, helps customers with privacy and compliance, and makes it easy to integrate OpenAI capabilities into your own applications.
Want to learn more about artificial intelligence, machine learning, AI language models, Azure AI and Azure OpenAI Service?
Find out how to successfully use AI in your organisation!
By attending our SoftwareOne Internal Knowledge Management & AI Workshop, you will gain the knowledge and strategic insight you need to successfully deploy AI in your organisation. Together, we will explore the fundamentals of big data and AI, and provide you with concrete steps and best practices for using AI as a valuable tool to improve business performance.