We’ve pretty much reached peak hype on ChatGPT - and maybe we’re already heading into the trough of disillusionment.

Hype Curve

One thing that is really holding us back from reaching the plateau of productivity is the lack of an API for ChatGPT.

However, it’s surprisingly straightforward and easy to make your own AI chatbot using the existing Large Language Models from OpenAI - and these do have an API.

So I thought it might be kind of fun to make a cocktail chatbot.

If you prefer video content, you can watch the full video walkthrough here (it’s pretty short, as this is surprisingly easy to do) - it also covers a few things that I might have missed in this write-up.

You’ll need to create an account on OpenAI and get an API key - you’ll need this later to make API requests. You do this by clicking on your profile picture and then clicking on “Manage API Keys”.
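The Python examples later in this post assume the key lives in an environment variable rather than being pasted into the code - a minimal sketch using the openai Python package (the pre-1.0 interface that was current when this was written):

import os
import openai

# keep the key out of source control - read it from the environment instead
openai.api_key = os.environ["OPENAI_API_KEY"]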

First though, we need to do a bit of “prompt engineering” - yes, I’ve come to accept that this may actually be a real job at some point…

Prompt Engineering

Head over to the playground tab on OpenAI and we’ll create our prompt.

The first thing we’ll do is tell the language model what we want it to do and what it should know about. In our case we want it to be an expert in cocktails and alcoholic beverages.

You are an AI assistant that is an expert in alcoholic beverages.
You know about cocktails, wines, spirits and beers.
You can provide advice on drink menus, cocktail ingredients, how to make cocktails, and anything else related to alcoholic drinks.

The next thing we want to do is to try and keep our conversation as focused as possible. We don’t want the language model to get distracted by other topics. So we’ll tell it to give us a generic answer if we ask it about something it doesn’t know about.

If you are unable to provide an answer to a question, please respond with the phrase "I'm just a simple barman, I can't help with that."

We also want our bot to be helpful and friendly - no one wants to talk to a miserable bar person.

Please aim to be as helpful, creative, and friendly as possible in all of your responses.

I’ve also noticed in experimenting that occasionally the language model will refer to external URLs or blog posts - particularly when you ask it for details about a cocktail. So we’ll try and encourage it not to do that.

Do not use any external URLs in your answers. Do not refer to any blogs in your answers.

And finally, we want it to output lists in a nicely formatted way.

Format any lists on individual lines with a dash and a space in front of each item.

That’s our prompt - you can copy and paste the above lines into the playground.
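It’s also handy to keep a copy in one place for when we move to Python later - here’s the same prompt gathered into a single string constant (the name BASE_PROMPT is just my choice):

# the prompt assembled from the pieces above
BASE_PROMPT = """You are an AI assistant that is an expert in alcoholic beverages.
You know about cocktails, wines, spirits and beers.
You can provide advice on drink menus, cocktail ingredients, how to make cocktails, and anything else related to alcoholic drinks.
If you are unable to provide an answer to a question, please respond with the phrase "I'm just a simple barman, I can't help with that."
Please aim to be as helpful, creative, and friendly as possible in all of your responses.
Do not use any external URLs in your answers. Do not refer to any blogs in your answers.
Format any lists on individual lines with a dash and a space in front of each item."""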

To get our new bot to actually answer questions we need to show it where the human input is and we need to give it a hint on where to start its response.

You can do this by adding:

Human: What are some cocktails I can make at home?
AI:

If you copy and paste this below your prompt and hit submit you’ll get a nice response from the language model.

First Question

One of the really important things for our chatbot is that we want it to use context from previous exchanges. So we can test that by adding a follow-up question.

Paste this below the previous AI response and hit submit again.

Human: What ingredients do I need?
AI:

And you should get something that looks like this. Remember, this is all generated by the language model - so it’s quite likely you’ll get a different response.

Follow Up Question

So that works really nicely - the language model used the context from the previous exchange to answer our follow-up question.

To the right of the playground, you have a set of parameters - it’s quite fun to play with these and see how they affect the output. You can hover the mouse over each one to get a description of what it does. It’s particularly interesting to play with the different language models and see how much better the latest ones are.

Parameters

And you can spend some time fine-tuning the prompt to try and get the best results from your bot.

Once you are happy then you can click the “View Code” button and you’ll get the exact code you need to call the model from your own application.

Before you do this, make sure you’ve deleted any question-and-answer sequences.

View Code

You’ll need to copy the API key from your OpenAI account and paste it into the code.
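For reference, the generated code looks roughly like this - it uses the openai Python package (the pre-1.0 interface), and the model name and parameter values will be whatever you had selected in the playground:

import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-003",  # or whichever model you picked in the playground
    prompt="<your prompt>\nHuman: What are some cocktails I can make at home?\nAI:",
    temperature=0.7,           # these values come from your playground settings
    max_tokens=256,
    top_p=1,
    frequency_penalty=0,
    presence_penalty=0,
    stop=["Human:", "AI:"],    # stop generating when the model starts a new turn
)

print(response["choices"][0]["text"].strip())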

To make things easier, I’ve created a very simple Python command line application that will let you test your bot. You just need to copy the prompt that you’ve created along with any settings into it and you’ll have a fully working chatbot.

You can find the code for this on GitHub.

Follow the instructions in the README to get everything set up - it’s pretty straightforward.

There are a few extra bells and whistles in the code. I’ve added moderation to the user questions - this is a really important thing for any chatbot that takes user input. You don’t want your bot to be used to spread hate speech or other offensive content.

OpenAI offers a nice API for this - which we’re simply plugging into.
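As a rough sketch of how that looks with the pre-1.0 openai package (the helper function name is my own, not from the repo):

import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def is_flagged(text):
    # ask OpenAI's moderation endpoint whether the text breaks the usage policies
    response = openai.Moderation.create(input=text)
    return response["results"][0]["flagged"]

question = input("You: ")
if is_flagged(question):
    print("Sorry, I can't help with that.")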

I know that moderation of user input often seems to trigger people - I can understand that for some people moderation can feel very heavy-handed and can prevent some creativity. But there are some people who seem to feel that any moderation is “wokeness gone mad” and an infringement of their right to free speech. I’m not going to get into that debate, suffice it to say, if you ever want to make your chatbot public, you’ll be glad that you’ve added moderation.

I’ve also added a simple way to keep context between exchanges. This is a really important thing for any chatbot - you want it to remember what you’ve said and use that to answer any follow-up questions.

I’ve done this very simply by just keeping track of the previous questions and answers and then including the most recent 10 in the prompt.
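Here’s a minimal sketch of that idea, reusing the BASE_PROMPT constant from earlier (the other names are mine - the actual implementation is in the repo):

import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

history = []  # (question, answer) pairs from earlier in the conversation

def ask(question):
    # only include the 10 most recent exchanges so the prompt stays a manageable size
    transcript = "".join(f"Human: {q}\nAI: {a}\n" for q, a in history[-10:])
    prompt = f"{BASE_PROMPT}\n{transcript}Human: {question}\nAI:"
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=0.7,
        max_tokens=256,
        stop=["Human:", "AI:"],
    )
    answer = response["choices"][0]["text"].strip()
    history.append((question, answer))
    return answer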

There are many more clever things you can do here - and some of that cleverness is what makes ChatGPT so impressive.

The code is amazingly simple - in total there are around 100 lines of code, and most of that is simply boilerplate API calls to OpenAI.

One last point - as with any of these Large Language Models, the output may look very plausible but could be completely wrong. I won’t be held responsible for any disgusting cocktails you make or hangovers you get.

#API #CHATGPT #COCKTAIL CHATBOT #CONTEXTUAL CONVERSATIONS #LANGUAGE MODELS #OPENAI #PROMPT ENGINEERING #PYTHON
