Microsoft Azure OpenAI lets enterprises feed corporate secrets to ChatGPT

Apparently you're all dying to do this?

Updated Microsoft wants to make it easier for enterprises to feed their proprietary data, along with user queries, into OpenAI's GPT-4 or ChatGPT within Azure and see the results.

This functionality, available as a public preview via the Azure OpenAI Service, eliminates the need for training or fine-tuning your own generative AI models, said Andy Beatman, senior product marketing manager for Azure AI, this week, noting this was a "highly requested customer capability."

We can only assume he means highly requested by actual customers – not Microsoft executives continuing to drive this AI hype.

We're told the system basically works like this: a user fires off a query to Azure; Microsoft's cloud figures out what internal corporate data is needed to complete that request; the question and retrieved data are combined into a new query that is passed to an OpenAI model of choice hosted within Azure; the model predicts an answer; and that result is sent back to the user.
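
In other words, it's the now-familiar retrieval-augmented generation pattern. For the curious, here's our own bare-bones Python sketch of that flow (purely illustrative, not Microsoft's actual pipeline), with a naive keyword match standing in for the real retrieval step and a print() standing in for the call to the hosted GPT model:

    # Purely illustrative retrieval-augmented generation flow, not Microsoft's code.
    corporate_docs = [
        {"title": "Expenses policy", "content": "Staff may claim up to $50 per day for meals."},
        {"title": "Leave policy",    "content": "Employees accrue 20 days of annual leave."},
    ]

    def retrieve(query: str) -> list[dict]:
        # Stand-in for the real retrieval step (eg, an Azure Cognitive Search lookup):
        # keep any document sharing a word with the query.
        words = set(query.lower().split())
        return [d for d in corporate_docs if words & set(d["content"].lower().split())]

    def build_prompt(query: str) -> str:
        # Fold the retrieved snippets into a new, augmented prompt for the model.
        sources = "\n".join(f"- {d['title']}: {d['content']}" for d in retrieve(query))
        return f"Answer using only these sources:\n{sources}\n\nQuestion: {query}"

    # In the real service, this augmented prompt is what goes to the GPT model hosted in Azure.
    print(build_prompt("How many days of annual leave do employees get?"))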

This is allegedly useful. The models are managed by Microsoft in its cloud, and OpenAI doesn't have access to them, nor to customer data, queries, and output, according to Redmond. That information also isn't used to train any other services, nor do customers get access to other customers' models and data. You essentially use your own private instance of an OpenAI GPT model.

"Azure OpenAI on your data, together with Azure Cognitive Search, determines what data to retrieve from the designated data source based on the user input and provided conversation history," Microsoft explained. "This data is then augmented and resubmitted as a prompt to the OpenAI model, with retrieved information being appended to the original prompt."

Turning the focus to proprietary data

Microsoft has sunk more than $10 billion into OpenAI, and is rapidly integrating the upstart's AI models and tools into products and services throughout its broad portfolio.

There is clearly a push for ways to craft tailored models – ones that go beyond their base training and are customized for individual applications and organizations. That way, when a query comes in, a specific answer can be generated rather than a generic one.

This approach has been talked about for a number of years. In recent months, as the pace of generative AI innovation accelerated, vendors started answering the call. Nvidia late last year introduced NeMo, a framework within its larger AI Enterprise platform that helps organizations augment their large language models (LLMs) with proprietary data.

"When we work with enterprise companies, many of them are interested in creating models for their own purposes with their own data," Manuvir Das, Nvidia's vice president of enterprise computing, told journalists during the lead-up to the GPU giant's GTC 2023 show in March.

Two months later, Nvidia teamed up with ServiceNow to enable companies using ServiceNow's cloud platform and Nvidia AI tools to train AI models on their own information.

Redmond's turn

Now comes Microsoft. "With the advanced conversational AI capabilities of ChatGPT and GPT-4, you can streamline communication, enhance customer service, and boost productivity throughout your organization," wrote Beatman. "These models not only leverage their pre-trained knowledge but also access specific data sources, ensuring that responses are based on the latest available information."

Through these latest capabilities in the Azure OpenAI Service, enterprises can simplify such processes as document intake and indexing, software development, HR procedures, self-service data requests, customer service tasks, revenue creation, and interactions with customers and other businesses, we're told.

The service can draw from a customer's corporate data in any selected source and location – whether it's stored locally, in the cloud, or at the edge – and it provides tools for processing and organizing that information. It also can integrate with an enterprise's existing storage through an API and software-development kit (SDK) from Microsoft.
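
Practically speaking, the preview's programmatic route involves pointing a chat completions call at an Azure Cognitive Search index holding your data. The sketch below is our best approximation of that preview REST request; the API version, resource names, and index name are placeholders and assumptions on our part, so consult Microsoft's documentation for the current shape of the call:

    import requests

    # Placeholder names throughout; the api-version tracks the preview and may change.
    resource, deployment = "my-aoai-resource", "my-gpt-deployment"
    url = (f"https://{resource}.openai.azure.com/openai/deployments/{deployment}"
           "/extensions/chat/completions?api-version=2023-06-01-preview")

    body = {
        # Point the model at an existing Azure Cognitive Search index over corporate data.
        "dataSources": [{
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": "https://my-search.search.windows.net",
                "key": "<search-admin-key>",
                "indexName": "corporate-docs",
            },
        }],
        "messages": [
            {"role": "user", "content": "Summarize our travel expenses policy."},
        ],
    }

    resp = requests.post(url, headers={"api-key": "<azure-openai-key>"}, json=body)
    print(resp.json()["choices"][0]["message"]["content"])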

In addition, it includes a sample app to speed up implementation of the service.

Organizations will need an approved Azure OpenAI Service application to do all of the above, and must use either GPT-3.5-Turbo or GPT-4. "Once your data source is connected, you can start asking questions and conversing with the OpenAI models through Azure AI Studio," Beatman wrote. "This enables you to gain valuable insights and make informed business decisions."

There are some caveats. Users should not ask long questions, and should instead break them down into multiple questions. The maximum number of tokens per model response is 1,500 – a budget that covers the user's question, any system messages, the retrieved search documents (known as "chunks"), internal prompts, and the response itself.
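
If you want a rough sense of how much of that budget a question and its retrieved chunks will eat before you send them, OpenAI's tiktoken library gives a ballpark count. Note this is only an estimate on our part: Azure's internal prompts aren't visible to you and won't show up in the tally.

    import tiktoken  # pip install tiktoken

    # Ballpark token count for a prompt; Azure's hidden internal prompts aren't included.
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    question = "Summarize our travel expenses policy and list the per-day limits."
    chunks = ["Staff may claim up to $50 per day for meals.",
              "Claims must be filed within 30 days."]

    total = len(enc.encode(question)) + sum(len(enc.encode(c)) for c in chunks)
    print(f"~{total} tokens before system messages, internal prompts, and the response")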

They should also use the option to limit responses to their data, which "encourages the model to respond using your data only, and is selected by default," Microsoft wrote. ®

Updated to add on June 23

If you're wondering about the privacy and data protection aspects of this, spokespeople for Microsoft have been in touch to say there's more information on that here, with the following points highlighted:

Your prompts (inputs) and completions (outputs), your embeddings, and your training data:

  • are NOT available to other customers.
  • are NOT available to OpenAI.
  • are NOT used to improve OpenAI models.
  • are NOT used to improve any Microsoft or 3rd party products or services.
  • are NOT used for automatically improving Azure OpenAI models for your use in your resource (The models are stateless, unless you explicitly fine-tune models with your training data).

Your fine-tuned Azure OpenAI models are available exclusively for your use. The Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft’s Azure environment and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API).

So there you go.
