Using Copilot AI to call the OpenAI API from Visual Studio 2022
Updated May 05, 2024
Can the advanced artificial intelligence in Visual Studio 2022 turn a complex IDE into a low-code toolkit replacement suitable for non-coders to build business applications?
Low-code tools are aimed at "citizen developers" who can't write much code but can use wizards, drag-and-drop and other techniques to create business applications.
Some argue that advanced AI, such as GitHub Copilot and GitHub Copilot Chat, frees low-code developers from the functionality limitations of those tools and lets them use more robust development environments, such as Visual Studio, to build more capable applications.
I proved this point with one case study in the article "Using Artificial Intelligence to Rapidly Create a Data-Driven WinForms Desktop Application".
Now I'll take it a step further and look at accessing OpenAI API endpoints from Visual Studio 2022 to bring artificial intelligence capabilities into .NET business applications.
To do this, you need to pay OpenAI for access to its API. I have a ChatGPT Plus subscription that lets me generate secret API keys, which I thought would give me access to the OpenAI API endpoints. It doesn't: API access and ChatGPT are completely separate services that are paid for separately. So what is the secret API key for, then? Authentication, once you have paid for API access.
Having funded my API account, I next had to confirm that I could actually reach the OpenAI API endpoints.
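A quick way to check is to hit an endpoint that needs nothing but the secret key. Here's a rough C# sketch of mine (not part of the tooling discussed below; it assumes the key is stored in an OPENAI_API_KEY environment variable) that asks the API to list the models the account can use:

// Minimal sketch: verify API access by listing the available models.
// Assumes the secret key is stored in the OPENAI_API_KEY environment variable.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ApiCheck
{
    static async Task Main()
    {
        var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Authorization", $"Bearer {apiKey}");

        // GET /v1/models lists the models the key is allowed to use.
        var response = await client.GetAsync("https://api.openai.com/v1/models");
        Console.WriteLine($"Status: {(int)response.StatusCode} {response.StatusCode}");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}

A 200 response with a JSON list of models means the key authenticates; a 401 means it doesn't.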
OpenAI's Developer Quickstart shows three ways to access OpenAI API endpoints: curl, Python, or Node.js - no mention of .NET or C#.
I expected Microsoft to have an official SDK, library or something similar ready for working with the OpenAI API from Visual Studio, given that the companies are close partners thanks to Microsoft's investment of more than $10 billion in OpenAI. That turned out not to be the case.
In fact, Visual Studio doesn't seem to be a big deal for OpenAI developers. It barely comes up on the OpenAI developer forum, while VS Code comes up constantly, thanks to its Python extensions.
Microsoft's April 2023 guide, "Getting Started with OpenAI in .NET," explains that OpenAI models such as GPT-4 can be accessed using REST APIs or libraries. For the latter, the company recommends the Azure OpenAI .NET SDK.
However, to use it, you need to apply for access to the Azure OpenAI Service, which I did. Microsoft seems to vet these applications strictly, asking for a Microsoft contact person and so on, so I don't expect to get access anytime soon, if at all.
While I expected official .NET/OpenAI SDKs or libraries from Microsoft to work as out-of-the-box plug-ins for Visual Studio (they don't exist), I found a few third-party offerings that try to fill the gap. I tried one of them, OpenAI.Net, and it looked promising until running its example returned an OpenAI error referencing the deprecated model text-davinci-003.
This reference seems to be baked into the library, as I couldn't find a way to specify a different model. There may be a way to do it by digging into the repo code, but that isn't something a citizen developer should have to worry about.
In fact, many of the tutorials and other guides I studied were outdated, because things change very quickly in the AI field (no surprise there).
After this OpenAI.Net obsolescence problem, I didn't even try the other options, such as OpenAI-API-dotnet, OpenAiNg, Betalgo.OpenAI or OpenAI-DotNet, about which OpenAI says "use them at your own risk" because it doesn't verify their correctness or security.
That left me with the second alternative to Microsoft's recommended but access-gated Azure option: REST APIs.
First, I wanted to make sure I could access the OpenAI API endpoints. I thought my ChatGPT Plus subscription, which can generate API secret keys, would work. But it didn't.
I learned that API access and ChatGPT access are two completely different things, paid for separately. So what are those API secret keys for, then? Authentication, once you have paid for API access.
Anyway, I had to give more money to OpenAI to test their API. So here's the first tip for citizen developers: the ChatGPT Plus API secret key does not give you access to the API.
The OpenAI Developer Quickstart shows three ways to access the API: curl, Python, or Node.js. No mention of .NET or C# (I guess an investment of over $10 billion doesn't stretch that far).
I tried using OpenAI's sample Node.js code to make an API request, which involves creating an openai-test.js file and running it from the command line. I used the exact sample code and the exact command and got the following error: "Cannot use import statement outside a module".
Then I tried the curl option because OpenAI said: "curl is a popular command-line tool used by developers to send HTTP requests to API [sic]. It requires minimal setup time, but has fewer features than full-featured programming languages such as Python or JavaScript."
That sounded great for a citizen developer. But it didn't work either.
I used the exact code provided by OpenAI and tried to run it from the command line, getting a pop-up warning that multi-line input may cause commands to be executed unexpectedly.
I reduced the code to a single line and got an error message saying that no API key was provided, even though the curl command referenced the key as $OPENAI_API_KEY, per OpenAI's instructions, and I had set it as a Windows environment variable.
So I put the secret key directly in the code, which you shouldn't do, and that produced an error message about not using HTTP correctly. Again, this was OpenAI's exact example code.
Supposedly there's a way to use ^ characters at the end of each line so the curl command is treated as a single command, but it didn't work for me when Copilot tried it.
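For reference, the ^-continued form that Copilot was presumably aiming for would look roughly like this in cmd.exe (my reconstruction, not a command from OpenAI's docs; note that cmd expands environment variables as %OPENAI_API_KEY% rather than $OPENAI_API_KEY, which may also explain the earlier "no API key was provided" error):

curl https://api.openai.com/v1/chat/completions ^
  -H "Content-Type: application/json" ^
  -H "Authorization: Bearer %OPENAI_API_KEY%" ^
  -d "{\"model\": \"gpt-3.5-turbo\", \"messages\": [{\"role\": \"user\", \"content\": \"Say this is a test!\"}]}"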
I ended up finding that the Git Bash installation handles the multiline curl command just fine.
Since the Node.js example failed and the curl example doesn't work without installing additional software, I didn't even try OpenAI's Python option. I know a little C#, but no Python. Obviously, C# just isn't a first-class citizen in the OpenAI development space, which is why VS Code and its hugely popular Python extensions come up much more often than Visual Studio.
I wonder what a more-than-$10 billion investment buys these days, if not good support for the investor's flagship programming language.
In any case, after reaching the OpenAI API endpoint and getting chatty responses in return, I tested the REST approach in a Visual Studio 2022 console application.
I used Microsoft Copilot on the web, as well as GitHub Copilot and GitHub Copilot Chat inside the IDE.
I first explained to Microsoft Copilot online what I was doing and asked how to proceed. It provided Program.cs code to get started, and from there it was just a game of Whack-an-Error with the AI hammer. In hindsight, it would have been better to ask Copilot Chat in Visual Studio.
The process turned into repeated cycles of brute-force trial and error: I ran the code, got a new error, and used Copilot Chat, slash commands and # references to fix things.
Sometimes Copilot would just run in circles, fixing one bug while introducing a new one, then fixing that one but reintroducing the old bug, and so on.
I expected Copilot Chat to have thorough knowledge of my code base. It didn't. At one point it said: "I apologize for the confusion. Without seeing your current code, I can only give you a general example."
Why it couldn't "see" my current code is a mystery. I used the # reference mechanism to point Copilot Chat at specific code, so who knows?
On the other hand, the AI surprised me with its ability to look into other repositories and understand their code. When I was trying to use the OpenAI.Net SDK, I asked Copilot Chat about the deprecated OpenAI model and was told that the SDK may not have a direct way to specify the model in the TextCompletion.Get method. I provided the SDK URL and asked it to dig into the SDK to find out, and it said:
I looked into the OpenAI.Net SDK at the link provided on GitHub. Unfortunately, it turns out that this SDK does not currently support specifying a model for text completions. The TextCompletion.Get method has no parameter for the model, and there are no other methods or properties that allow you to specify a model. As a workaround, you can use the OpenAI API directly with the HttpClient class provided by .NET. This will give you full control over the API request and allow you to specify the model.
Of course, that's exactly what I did, in two console applications using HttpClient: one with Newtonsoft.Json and one with System.Text.Json (coincidentally, Microsoft published an article last month titled "Migrate from Newtonsoft.Json to System.Text.Json").
Perhaps due to the use of the Newtonsoft.Json library, the first console application contained much less Program.cs code:
Newtonsoft.Json application code
using System.Text;
using Newtonsoft.Json;

class Program
{
    static async Task Main(string[] args)
    {
        // Get the API key from the environment variables
        var openAiApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
        try
        {
            string prompt = "How long until we reach mars?";
            string model = "gpt-3.5-turbo"; // Specify the model here
            int maxTokens = 60;

            var completionRequest = new
            {
                model = model,
                messages = new[]
                {
                    new { role = "system", content = "You are a helpful assistant." },
                    new { role = "user", content = prompt }
                },
                max_tokens = maxTokens
            };

            var json = JsonConvert.SerializeObject(completionRequest);
            var data = new StringContent(json, Encoding.UTF8, "application/json");

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Add("Authorization", $"Bearer {openAiApiKey}");

            var response = await client.PostAsync("https://api.openai.com/v1/chat/completions", data);
            var result = await response.Content.ReadAsStringAsync();

            Console.WriteLine("Generated text:");
            Console.WriteLine(result);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error: {ex.Message}");
        }
    }
}
than in the System.Text.Json version:
System.Text.Json application code
using System.Net.Http.Json;
using System.Text;
using System.Text.Json;

public class Program
{
    public static async Task Main(string[] args)
    {
        var openAiApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
        if (string.IsNullOrEmpty(openAiApiKey))
        {
            Console.WriteLine("API key not found in environment variables.");
            return;
        }

        var model = "gpt-3.5-turbo"; // Replace with the desired model
        var maxTokens = 50;          // Replace with the desired maximum tokens

        var httpClient = new HttpClient();
        httpClient.DefaultRequestHeaders.Add("Authorization", $"Bearer {openAiApiKey}");

        var chatRequest = new
        {
            model = model,
            messages = new[]
            {
                new { role = "system", content = "Once upon a time" }
            },
            max_tokens = maxTokens
        };

        var content = new StringContent(JsonSerializer.Serialize(chatRequest), Encoding.UTF8, "application/json");
        var response = await httpClient.PostAsync("https://api.openai.com/v1/chat/completions", content);

        if (response.IsSuccessStatusCode)
        {
            var chatResponse = await response.Content.ReadFromJsonAsync<ChatResponse>();
            ProcessChatResponse(chatResponse!); // Use the null-forgiving operator to suppress the warning
        }
        else
        {
            Console.WriteLine($"Error: {response.StatusCode}");
        }
    }

    static void ProcessChatResponse(ChatResponse chatResponse)
    {
        if (chatResponse?.Choices != null && chatResponse.Choices.Any())
        {
            var message = chatResponse.Choices[0].Message;
            if (message != null)
            {
                Console.WriteLine("Generated text:");
                Console.WriteLine(message.Content);
            }
            else
            {
                Console.WriteLine("No message content was found in the response.");
            }
        }
        else
        {
            Console.WriteLine("No choices were found in the response.");
        }
    }
}

public class ChatResponse
{
    public string? Id { get; set; }
    public string? Object { get; set; }
    public int Created { get; set; }
    public string? Model { get; set; }
    public Choice[]? Choices { get; set; }
    public string? SystemFingerprint { get; set; }

    public class Usage
    {
        public int PromptTokens { get; set; }
        public int CompletionTokens { get; set; }
        public int TotalTokens { get; set; }
    }

    public class Choice
    {
        public int Index { get; set; }
        public Message? Message { get; set; }
        public object? Logprobs { get; set; } // Logprobs isn't used, so treat it as an object
        public string? FinishReason { get; set; }
    }

    public class Message
    {
        public string? Role { get; set; }
        public string? Content { get; set; }
    }
}
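One difference worth noting: the Newtonsoft.Json version simply dumps the raw JSON response to the console, while the System.Text.Json version digs the message text out through the ChatResponse classes. If you want the shorter program to print just the generated text as well, a small helper along these lines (my sketch, not code from either program) can pull it out of the raw JSON with Newtonsoft's JObject:

// Hypothetical helper: extract the first choice's message content
// from the raw JSON returned by the chat/completions endpoint.
using Newtonsoft.Json.Linq;

static class ChatResponseHelper
{
    public static string? ExtractContent(string rawJson)
    {
        // The generated text lives at choices[0].message.content
        var obj = JObject.Parse(rawJson);
        return obj["choices"]?[0]?["message"]?["content"]?.ToString();
    }
}

Calling ChatResponseHelper.ExtractContent(result) in place of printing result directly would then output only the completion text.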