A Context-Aware Python Chatbot Using the ChatGPT API: A Complete Tutorial Series
On this page, I outline a series of videos I made that explain, step by step, how to build a chatbot using OpenAI’s ChatGPT API.
Contents
- 1 ChatGPT API in Python Code, Example-based Tutorial
- 2 Your Own ChatBot: Effortless Python Guide to Harnessing the Power of ChatGPT API!
- 3 OpenAI GPT-4 Python ChatBot Code
- 4 Saving Conversations with Python GPT-4 API ChatBot: Conversational AI
- 5 Context-Aware Conversational AI in Python using ChatGPT API
- 6 Chat with your Documents using Python ChatGPT API
ChatGPT API in Python Code, Example-based Tutorial
This video explains, with a simple example, how to use the ChatGPT API in a Python program. It also demonstrates how to access the API, which requires an API key from OpenAI. The key can be obtained by signing up for an account on platform.openai.com and navigating to the API Keys section. There, users can create a new secret key.
The Python program I wrote in the video requires the openai package and Python’s built-in json module. The program sends a request to the API using the openai.ChatCompletion.create function with the gpt-3.5-turbo model. Other models are available as well.
Overall, the video serves as an introduction to using the ChatGPT API from Python programs.
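The request such a call sends boils down to a model name plus a list of role-tagged messages. As a minimal sketch (build_request is a hypothetical helper, not code from the video), the dictionary below mirrors the keyword arguments that openai.ChatCompletion.create receives:

```python
def build_request(prompt):
    """Assemble the keyword arguments for one ChatGPT API call:
    a model name and a list of role-tagged messages."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_request("What is the ChatGPT API?")
print(request["model"])        # the model handling the request
print(request["messages"][0])  # the single user message
```

In the actual program, a dictionary like this is what gets passed to openai.ChatCompletion.create.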
Your Own ChatBot: Effortless Python Guide to Harnessing the Power of ChatGPT API!
The video provides a tutorial on how to use OpenAI’s ChatGPT API to create a simple Python chatbot. It begins by outlining the steps to obtain an API key from OpenAI, emphasizing the affordable cost of API calls for small-scale, educational, and research uses. I then delve into the Python code, explaining the openai and json packages, establishing an infinite loop to facilitate user interaction, and using the input function to capture user queries. I also detail how the code sends user queries to OpenAI via the API and the conditions under which the program terminates.
The video also illustrates how to parse the JSON response from the API, a step application developers would likewise need in a commercial context. The limitation of this version is that the conversation is lost once the program terminates; a later video on this page solves this issue.
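The parsing step can be sketched as follows; the sample response here is a hypothetical, trimmed-down stand-in for the real API output, which nests the reply text under choices[0].message.content:

```python
import json

# Hypothetical, trimmed-down stand-in for the API's JSON response
sample = '''
{
  "choices": [
    {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
  ]
}
'''

data = json.loads(sample)
# The reply text sits in the first choice's message
reply = data["choices"][0]["message"]["content"]
print(reply)
```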
OpenAI GPT-4 Python ChatBot Code
This video focuses on the GPT-4 model rather than the GPT-3.5-Turbo model; switching back and forth between the two is simple with OpenAI’s API. I demonstrated how I accessed GPT-4 through the existing OpenAI API and found its responses to be more detailed and accurate, albeit slower, than those of GPT-3.5. When making the video, I expected image-input capability with the GPT-4 release, but that feature was planned for future updates.
Despite the slower response time with GPT-4, the results are more accurate and engaging.
GPT-4 is OpenAI’s most advanced system, capable of solving difficult problems with greater accuracy due to its broad general knowledge and problem-solving abilities. It is more creative and collaborative, can assist in creative and technical writing tasks, and has advanced reasoning capabilities surpassing the earlier ChatGPT models. It is also designed to be safer and more aligned with human expectations, being 82% less likely to respond to requests for disallowed content and 40% more likely to produce factual responses than GPT-3.5 [reference].
A limitation of this program, as with the previous one, is that the conversation history is lost upon termination; the next video addresses this issue.
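Since switching models amounts to changing one string, it can be sketched like this (chat_kwargs is a hypothetical helper, not code from the video):

```python
def chat_kwargs(messages, use_gpt4=False):
    """Build the keyword arguments for openai.ChatCompletion.create,
    switching between GPT-4 and GPT-3.5-Turbo with a single flag."""
    model = "gpt-4" if use_gpt4 else "gpt-3.5-turbo"  # the only change
    return {"model": model, "messages": messages}

msgs = [{"role": "user", "content": "Hi"}]
print(chat_kwargs(msgs, use_gpt4=True)["model"])  # gpt-4
print(chat_kwargs(msgs)["model"])                 # gpt-3.5-turbo
```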
Saving Conversations with Python GPT-4 API ChatBot: Conversational AI
The video describes enhancements to the Python program written in an earlier video on this page, which uses OpenAI’s GPT-4 model to function as a chatbot (the program could use GPT-3.5 as well). I updated the program to store conversations in a text file. This file is created when the program runs and is named with a timestamp to distinguish it from files generated in other sessions.
In this new version of the program, the user’s input and the AI’s response are both written to the text file every time the AI generates a response, alongside the interaction shown on the terminal. A new file is created each time the program runs, preserving the conversation history of each session.
However, the video also highlights a limitation of this program: it doesn’t retain the context of the conversation from one query to the next. An example is provided where the AI failed to relate a question about whether ‘GAN’ (Generative Adversarial Networks) is a trend to the prior question asking what a GAN is. In contrast, when the same questions were posed to ChatGPT (https://chat.openai.com/), it provided a response that acknowledged the connection between the two questions, demonstrating its understanding of the conversation’s context.
The issue is solved in the next video.
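The timestamped-file logging described above can be sketched as follows; log_exchange is a hypothetical helper name, while the timestamp format matches the one used in the program:

```python
import time

# Each run gets its own log file, named with the current timestamp
timestamp = time.strftime("%Y_%m_%d-%H_%M_%S", time.gmtime())
filename = timestamp + ".txt"

def log_exchange(path, question, answer):
    """Append one user question and one AI response to the session file."""
    with open(path, "a") as f:
        f.write("User: " + question + "\n")
        f.write("AI: " + answer + "\n")

log_exchange(filename, "What is a GAN?", "A generative adversarial network.")
```

Because the file is opened in append mode, every exchange in a session accumulates in the same timestamped file.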
Context-Aware Conversational AI in Python using ChatGPT API
In this video tutorial, we address the issue of contextual memory in our AI ChatBot. While the previous version was adept at answering individual questions, it could not remember the context of conversations, leading to issues when asked subsequent related questions. We demonstrate this issue by conversing with the ChatBot about YouTuber Mr. Beast.
We dig deeper into OpenAI’s ChatCompletion API to resolve the issue. We modify our code to include the context of the conversation in every new question, and we primarily focus our changes around the API. We use a Python list named “discussions” to save all our past questions and answers, with the initial element being a “system” role message that instructs the model that it is a helpful assistant. Each user question and model response are appended to our “discussions” list, which is then used in the ChatCompletion’s “create” method.
To demonstrate the improvements, we revisit the conversation about Mr. Beast. This time, the AI ChatBot successfully remembers the context and answers subsequent questions accurately. The video highlights that the AI even interpreted and responded to somewhat vague queries, such as whether Mr. Beast’s content is “international.”
The video tutorial also addresses a potential issue with this method – the growing length of the “discussions” variable as the conversation progresses. To avoid reaching a token limit, it’s recommended to keep only the most recent questions and answers in the list or to summarize the previous conversation and provide the summary instead.
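The first mitigation, keeping only the most recent questions and answers in the list, can be sketched as follows (trim_history and max_turns are hypothetical names, not part of the program shown below):

```python
def trim_history(discussions, max_turns=6):
    """Keep the leading system message plus only the most recent
    messages, so the history stays under the model's token limit."""
    system = discussions[:1]            # the initial "system" role message
    recent = discussions[1:][-max_turns:]
    return system + recent
```

Calling a helper like this before each ChatCompletion request bounds the prompt size, at the cost of forgetting older turns.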
The code after solving the issue of the forgetful nature of the ChatBot is provided below.
```python
import openai
import json
import time
import os

openai.api_key = "YOUR OPENAI API KEY HERE."

# Name the session log file with a timestamp so every run gets its own file
timestamp = time.strftime("%Y_%m_%d-%H_%M_%S", time.gmtime())
filename = timestamp + ".txt"
if not os.path.exists(filename):
    with open(filename, 'w') as f:
        f.write("User: Welcome to OpenAI chat!\n")

# The conversation history; the system message tells the model its role
discussions = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    p = input("Enter quit to quit, or enter your prompt: ")
    if p == "quit":
        break
    # Append the user's question and send the whole history with each
    # request so the model keeps the context of the conversation
    discussions.append({"role": "user", "content": p})
    completion = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=discussions
    )
    x = json.loads(str(completion))
    response = x["choices"][0]["message"]["content"]
    discussions.append({"role": "assistant", "content": response})
    # Log the exchange to the session file and echo it to the terminal
    with open(filename, 'a') as f:
        f.write("User: " + p + "\n")
        f.write("AI: " + response + "\n")
    print("\nAI says: ", response, "\n")

print("Have a nice day!")
```
Chat with your Documents using Python ChatGPT API
Building a document-conversation system requires more than the approaches we learned on this page. The video immediately above this text details how to chat with documents using the Python ChatGPT API, LangChain, and LlamaIndex. Further details and the code can be found at this link: Talk2Doc: Conversing with Your Documents.