The power of GPT is not the LLM but the prompt

Ian Gotts @iangotts
6 min read · Nov 25, 2023

You meet your hero. What will you ask?

Imagine you had a brief opportunity to meet your business, sports or celebrity hero, be it Marc Benioff, Taylor Swift, Ryan Reynolds, or any other figure you admire. The key to making the most of such a rare encounter would be asking the right questions.

I started having bass guitar lessons with an awesome bass teacher who was in an all-female Led Zep tribute band and is a sought-after session musician. I said, “I don’t want you to teach me how to play the bass better”. She was slightly taken aback. But then she understood when I added “I want you to teach me to think like you and how you approach bass playing”. That transformed how she has taught me, and I have learned and grown so much faster.

This principle directly applies to using Generative Pre-trained Transformers (GPT) effectively. The power of GPT lies not in its complex algorithms or extensive Large Language Models (LLMs) but in the prompts you provide: the questions you ask and the way you ask them.

Prompting: Ask the smart questions

Think of prompting as the questions you would ask your hero. In the context of GPTs, such as ChatGPT, Claude, Bard, or other LLMs, the questions you pose are critical. It’s like having an intern at your disposal: one without prior knowledge but with immense enthusiasm and speed. The trick is in the detail and length of your prompts. The more detailed (and longer) the prompt, the better and more comprehensive the response.

Ask that intern to “book you a restaurant” and you will get a generic answer or a series of follow-up questions. But say “I want to entertain a client who loves Indian food. It needs to be over lunch, somewhere we can have a private business conversation. He is in town on Tue 3rd,” and you will get a far better result. First time.

However, it is challenging to consistently enter 3–4 (or more) paragraphs of prompt each time to achieve the best results. And then you need to remember which prompts produced the best results so you can constantly improve them.

Some of the prompts the Elements.cloud Product Management team uses internally are 300–400 words long and have been refined over several months. But they give the team 20–50x productivity improvements, and they are a critical part of its day-to-day processes.

Screenshot of Prompts in GeePeeTee.cloud

Prompt Templates: operationalizing use of GPT

Enter the concept of ‘Prompt Templates’. These are standard prompts with different sections that you replace based on your specific requirements before sending them to the GPT. This is what the Elements.cloud Product Management team have developed. And they have a library of prompt templates.
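
To make that concrete, here is a minimal, hypothetical sketch of what a prompt template with replaceable sections might look like. The template text, the {placeholder} names, and the use case are illustrative assumptions, not an actual Elements.cloud template.

```python
# A hypothetical prompt template with replaceable sections.
# Text wrapped in { } marks the parts you swap out before sending the
# prompt to the GPT; everything else stays the same every time.

SUCCESS_STORY_TEMPLATE = """
You are a B2B marketing writer. Write a customer success story of about
600 words for {customer name}, a company in the {industry} sector.

Focus on these outcomes: {add 3 key outcomes}.

Use a confident but factual tone, avoid superlatives, and end with a short
quote attributed to {customer contact and job title}.
"""
```

Everything outside the braces captures the refinement the team has built up over time; only the braced sections change each time the template is used.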

Prompt templates should be stored centrally and shareable with co-workers so that you can “operationalize” your use of GPT inside your organization. We think of them as productivity applications, and we understand the value and investment it has taken to refine the prompt templates to where they are now.

Prompt templates need access controls so that only certain people can edit them. You don’t want months of refinement lost because someone changed a template by mistake when they were simply trying to copy it to use. You wouldn’t give everyone Admin rights in Salesforce, would you? Ideally, you can also give feedback to the editor of each prompt template so that it can be constantly improved, which means templates need to be versioned. And you should be able to see whether they are being used. In that sense, a prompt template should be managed almost like Salesforce configuration.

Custom GPTs and ChatGPT Plus: specialized, tailored GPTs

OpenAI has introduced custom GPTs. These are specialized versions of ChatGPT that you can enhance by adding specific instructions, up to 10 knowledge files, and API links for added information. This means some of that detailed prompting information can be stored as part of the custom GPT, so it doesn’t need to be entered every time. By the way, you need a ChatGPT Plus ($20/month) account to create and use them.

Users with ChatGPT Plus accounts can create these custom GPTs, which can be private or public. If a GPT is made private, it is only accessible to people who have the specific link to it. This option is useful for GPTs containing sensitive or specialized information that the creator wants to restrict to a select audience, e.g. a “Success Story Writer” for co-workers. Public GPTs are discoverable by other users, making them suitable for broader applications or for sharing information and tools that could benefit a wider audience. Think of them as another distribution channel for content creators.

There are thousands of custom GPTs being created, and the quality varies significantly. It depends on the expertise of the creator, the quality of the instructions, the specific knowledge attached, and how much time has been spent fine-tuning the instructions and knowledge.

The best ones have a narrow focus, such as our “Process Mapping Coach”, “Salesforce DevOps” by Vernon Keenan, or our private “Image Creator” that uses our color palette and styles. That specificity allows for more detailed instructions and knowledge, leading to more targeted and useful responses. But what makes the biggest difference is whether the creator has provided good prompts that work well with their custom GPT. When you publish a GPT you can provide four starter prompts, but these are very short and, as we’ve already explained, you need more detailed prompts to get great results.

GeePeeTee.cloud: A hub for Custom GPTs and Prompt Templates

GeePeeTee.cloud is an app that was initially created as a directory of Salesforce-specific GPTs so that the ecosystem could easily find them. But the scope rapidly grew! Now it is also a prompt management app.

Public GPT listings in GeePeeTee.cloud

Note: this is NOT a replacement for Einstein Prompt Studio, which is the app for managing prompt templates inside Salesforce. GeePeeTee.cloud is for GPTs and prompt templates outside of Salesforce, i.e. those used with ChatGPT, Claude, Bard, etc.

Public custom GPTs and prompt templates: It allows creators to list their custom GPT and then add prompts that work well with it. This helps users get the most out of the GPT. The creator, who is the subject matter expert, knows the right questions to ask. Remember, the more detailed the question, the better the answer.

Private custom GPTs and prompt templates: You can list private custom GPTs that you have created or that have been shared with you. This is your personal list. Any GPT listing can be shared by its creator, who can grant view or edit rights. The creator, or those they give edit rights to, can add and manage the GPT listing and the associated prompts.

Standard GPTs and prompt templates: Anyone can use it to store the prompts they use with any LLM. They can create standard GPT listings that are private and then add prompts to them. These listings can also be shared with co-workers. The listings could be for the free version of ChatGPT or any other LLM, such as Claude or Bard.

Prompt template text replacement: The challenge comes when your prompt template has sections that need to be replaced before you use it. GeePeeTee.cloud has a neat solution. If a prompt template has any text surrounded by { }, e.g. {insert website URL} or {add 3 key features}, then when you click the copy icon on that prompt template it pops up a window with a copy of the template, with the text to be replaced highlighted. You make your changes, then click the icon again to copy the revised prompt so you can paste it into the GPT.
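
For readers who want to see the mechanics, here is a minimal sketch in Python of that { } placeholder idea. It is an illustration of the pattern, not GeePeeTee.cloud’s actual code; the function names and example values are hypothetical.

```python
import re

# Minimal sketch of { } placeholder substitution.
# Text wrapped in braces is found, shown to the user for replacement,
# and swapped for their values before the prompt is used.

PLACEHOLDER = re.compile(r"\{([^{}]+)\}")

def find_placeholders(template: str) -> list[str]:
    """Return every {...} section that still needs to be replaced."""
    return PLACEHOLDER.findall(template)

def fill_template(template: str, replacements: dict[str, str]) -> str:
    """Swap each {placeholder} for the value supplied by the user.

    Placeholders with no supplied value are left untouched, so they stay
    visible for the user to deal with.
    """
    def replace(match: re.Match) -> str:
        return replacements.get(match.group(1), match.group(0))
    return PLACEHOLDER.sub(replace, template)

template = "Summarize {insert website URL} and list {add 3 key features}."

print(find_placeholders(template))
# ['insert website URL', 'add 3 key features']

print(fill_template(template, {
    "insert website URL": "https://example.com",
    "add 3 key features": "prompt templates, sharing, version history",
}))
# Summarize https://example.com and list prompt templates, sharing, version history.
```

The same pattern scales from a one-line prompt to a 400-word template: only the braced sections change, and everything else carries the accumulated refinement.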

The final word

The true power of GPT lies in understanding how to communicate with it effectively. The right prompt can unlock a wealth of information and capabilities, much like asking the right question to your hero can yield invaluable insights. In the business world, this means crafting prompts that are detailed, specific, and aligned with your objectives. Whether it’s generating content, analyzing data, or providing customer service, the quality of your prompts directly impacts the output you receive from GPT.
