For many businesses looking for robust, cost-efficient ways to implement cloud-based AI applications, public cloud hosting services offer a wide array of tools and services. When choosing a cloud service provider (CSP) for AI, a multi-cloud approach that plays to the strengths and weaknesses of different providers may seem like the best path. In practice, though, it can lead to unexpected financial and operational costs down the road. A well-chosen single-cloud approach, by contrast, can yield significant performance, security, and budgetary advantages in both the short and long terms.

To help demonstrate the possible advantages of a centralized, single-cloud approach for AI, we built a simple retrieval-augmented generation (RAG) LLM app and hosted it on roughly equivalent AWS and Azure services. In both setups, we used the GPT-4o-mini model deployed on Azure OpenAI within Azure AI Foundry. We then tested each deployment's search time and end-to-end execution performance. We found that running our OpenAI RAG LLM app on Azure reduced end-to-end execution time by 59.7 percent compared to the AWS deployment. In addition, Azure AI Search provided a faster search service layer for the app, reducing search time by up to 88.8 percent compared to Amazon Kendra.
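
To make the setup concrete, here is a minimal sketch of how a RAG query against this kind of stack might be timed, using the Azure AI Search and Azure OpenAI Python SDKs. The endpoint variables, index name, deployment name, and document field below are illustrative placeholders, not the actual configuration we tested.

```python
# Illustrative sketch of a RAG query with a timing harness.
# Endpoints, index name, deployment name, and field names are placeholders.
import os
import time

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

# Retrieval layer: Azure AI Search over a prebuilt document index.
search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="rag-docs",                      # hypothetical index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

# Generation layer: GPT-4o-mini deployed on Azure OpenAI.
llm = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-06-01",
)

def answer(question: str) -> tuple[str, float, float]:
    """Return the answer plus search time and end-to-end time in seconds."""
    start = time.perf_counter()

    # 1. Retrieve the top-matching passages for the question.
    hits = search_client.search(search_text=question, top=3)
    context = "\n".join(doc["content"] for doc in hits)   # assumes a 'content' field
    search_time = time.perf_counter() - start

    # 2. Ask the model to answer using only the retrieved context.
    response = llm.chat.completions.create(
        model="gpt-4o-mini",                    # Azure deployment name (placeholder)
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    total_time = time.perf_counter() - start
    return response.choices[0].message.content, search_time, total_time
```

The AWS deployment followed the same pattern, with the retrieval step calling Amazon Kendra instead of Azure AI Search while the same GPT-4o-mini deployment handled generation.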

We also evaluated two critical areas that decision makers need to keep in mind when choosing a cloud-based AI solution: security and ongoing costs. The centralized Azure environment benefited from the practical security of a single Identity and Access Management system, a single cloud GUI, and a unified interoperability, governance, and compliance environment. When we researched the costs of each implementation, including common ongoing charges such as tokens, number of endpoints, API access, data input/output, and more, we found that the reduced complexity, lower overhead, and lack of duplication in the Azure solution could help companies avoid the higher costs that can come with the operational inefficiencies and increased security risks of multiple providers.

Deploying an OpenAI app on Azure offers performance and cost benefits, along with the potential for better security and greater development velocity, compared to a multi-cloud approach that also uses AWS.

To learn more about the potential advantages of single-cloud AI solutions with Azure, check out the report below.