Director, Learning Design & Development | PMI | Asheville, NC, United States
With Generative AI, iteratively refining and optimizing prompts can lead to better AI-generated results. This may involve adjusting the specificity or clarity of the prompt to increase relevance and accuracy of results.
What examples do you have of how improving a prompt drastically changed the output quality? What specific changes did you make that led to the improvement?
There are frameworks for creating prompts; this is part of the Prompt Design discipline. The ones that have served me well in the initiatives I have been part of are: R-T-F (Role, Task, Format), T-A-G (Task, Action, Goal), B-A-B (Before, After, Bridge), C-A-R-E (Context, Action, Result, Example), and R-I-S-E (Role, Input, Steps, Expectations).
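For instance, here is a minimal sketch of an R-T-F prompt assembled in code; the wording and the commented-out send_prompt helper are illustrative only, not part of any particular tool:

```python
# Illustrative sketch: assembling a Role-Task-Format (R-T-F) prompt.
role = "You are a PMP-certified project manager mentoring a new team lead."
task = ("Review the risk register summary I paste below and identify the three "
        "highest-priority risks, explaining briefly why each one matters.")
fmt = "Respond as a numbered list, one short paragraph per risk."

prompt = f"{role}\n\nTask: {task}\n\nFormat: {fmt}"
print(prompt)
# response = send_prompt(prompt)  # hypothetical helper for whichever GenAI tool you use
```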
I disagree with the idea that frameworks like R-T-F, T-A-G, B-A-B, C-A-R-E, and R-I-S-E are essential for prompt creation. These structures can be restrictive and may stifle creativity and flexibility in designing effective prompts.
...
2 replies by Beverly Johnson and Hakam Madi
Jul 30, 2024 12:42 AM
Hakam Madi
...
I would agree if we are at an exploratory phase, but after the exploratory phase, we should standardise the outputs just like we standardise our forms, templates, and procedures.
Mar 07, 2025 7:46 PM
Beverly Johnson
...
I think that the frameworks are essential in that they exist to help those who are new jump-start the creativity needed for prompt engineering. While some individuals can jump right in, others need guidelines and examples before they become comfortable with the tool.
This morning, one of my LinkedIn contacts complained that GenAI tools don't seem to have the ability to craft decent quality PMP practice exam questions. He had used the public version of ChatGPT. I decided to check the same with PMI Infinity and got better results - seven out of ten questions were acceptable.
My first prompt was "Generate ten different questions about project management which would be similar in style and level of difficulty to what is asked on the PMP exam"
It only gave me the questions but neglected to provide any answers. Realizing that it had likely interpreted my request "as is", I then added: "Generate ten different questions about project management which would be similar in style and level of difficulty to what is asked on the PMP exam".
With that, it was able to provide more useful output.
Kiron
To improve getting answers from GenAI tools, try adding specific instructions to your prompt, such as "Generate ten different questions about project management, similar in style and difficulty to the PMP exam, and provide detailed answers for each." This ensures the tool understands the need for questions and answers in the output.
...
2 replies by Christina Dietrich and Rami Kaibni
Jul 10, 2024 12:21 PM
Rami Kaibni
...
Kiron, I do agree with Booma in that adding more details and very specific instructions, no matter how small they are, can improve the generated outcome!
Feb 06, 2025 3:28 PM
Christina Dietrich
...
Appreciate your explicitly stating that answers were wanted, Booma. I didn't see that in the original modified text and was confused about how one would get an improved response from posting the same request twice.
It is important to remember this: generative AI is just "predictive text on steroids". Obviously, text is not the only possible output, but the important thing is that the answer simply completes your question (prompt) with whatever is most probable as a continuation. You can manage this with some of the parameters, like temperature. So, when creating the prompt, it is very important to make clear the role, the place where that role works or lives, the task the role has to accomplish, and the format of the answer. This is an example of R-T-F. You have to eliminate as much ambiguity as possible; otherwise, hallucinations will happen.
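For example, if you reach the model through an API rather than a chat window, an R-T-F prompt with an explicit temperature might look like the sketch below (assuming the OpenAI Python client; the model name and wording are only illustrative):

```python
# Minimal sketch: an R-T-F prompt sent with a low temperature to reduce variation.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

rtf_prompt = (
    "Role: You are a senior project manager at a construction firm in Toronto.\n"
    "Task: Draft a one-paragraph status update for a project that is two weeks "
    "behind schedule because of a permitting delay.\n"
    "Format: Plain professional English, no more than 120 words."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": rtf_prompt}],
    temperature=0.2,  # lower temperature = more predictable completions
)
print(response.choices[0].message.content)
```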
You make an excellent point. Generative AI, while powerful, functions as predictive text on steroids and needs clear, unambiguous prompts to produce accurate results. Using frameworks like R-T-F can help manage this by specifying the role, context, tasks, and desired format, reducing the risk of ambiguous or erroneous outputs. To get better results from GenAI, provide context-rich, detailed prompts that include specific instructions and desired outcomes. This reduces ambiguity and guides the AI to generate more accurate and relevant responses.
...
1 reply by Janice Timchuck
Jul 04, 2025 11:54 AM
Janice Timchuck
...
Just learning about AI. I appreciate all of the feedback.
I find that talking to our company AI as an assistant (as I heard from a PMI webinar) is helpful, and I treat my sessions as co-writing sessions, as I would have with a coworker in the past. Getting a good tone is a challenge, and I often have to ask it to rewrite something 'less formally but still professional'; otherwise it goes from overly formal to overly casual. I use it most as an enhanced thesaurus, asking it for various ways to say different phrases, or asking it to change certain words within what it drafts.
I formulated a prompt that was something like a mini-program to help stakeholders generate various types of requirements based on the information they input. Refining the prompt over several iterations made the difference between GenAI presenting the options after each response and letting the stakeholder choose one, versus just spitting out responses to all the options at once.
I need to refine it again now: what worked on GPT-4 does not work exactly the same on GPT-4o.
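A rough sketch of that kind of "mini-program" prompt (my own illustrative reconstruction, not the exact prompt described above):

```python
# Illustrative "mini-program" prompt: makes the model pause for the stakeholder's
# choice instead of answering every option at once.
mini_program_prompt = """\
You are a requirements-elicitation assistant.

Step 1: Ask me to describe the business process I want to improve, then wait.
Step 2: After I answer, present these options and STOP:
        (a) functional requirements
        (b) non-functional requirements
        (c) reporting requirements
Step 3: Draft requirements only for the option I pick, then offer the
        remaining options again.

Never move to the next step until I have responded.
"""
print(mini_program_prompt)  # paste this as the first message of the session
```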
I've also found that, for some topics, asking the same question in different ways can give you a more comprehensive answer when you combine the results.
I like to phase my prompts in a series of steps and often map or plan my prompts before engaging with the AI tool. This saves time and means less rework from revising one prompt over and over again.
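As a sketch of what that phased plan can look like in practice (ask_model is a hypothetical stand-in for whichever GenAI tool or API you use):

```python
# Illustrative phased prompting: each step's output feeds the next prompt.
def ask_model(prompt: str) -> str:
    # Placeholder: replace with a call to your GenAI tool or API of choice.
    return f"[model response to: {prompt[:50]}...]"

steps = [
    "List the major deliverables for a small website redesign project.",
    "Using the deliverables below, propose a high-level schedule with milestones:\n\n{prev}",
    "Identify the top five risks in the schedule below and suggest mitigations:\n\n{prev}",
]

previous = ""
for template in steps:
    prompt = template.replace("{prev}", previous)
    previous = ask_model(prompt)
    print(previous)
```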
...
4 replies by Marci Hoover, Moses John Kariuki, and Rami Kaibni
Jul 10, 2024 12:21 PM
Rami Kaibni
...
A smart way to do it, Dominic. I more or less do the same thing!
Mar 21, 2025 1:25 PM
Moses John Kariuki
...
I do the same.
Jul 13, 2025 4:43 PM
Marci Hoover
...
I agree with this approach from a time-saving standpoint, and because fewer prompts are required. Like any good process, you plan it out (as far as you're able) before diving in.
I have received improved output by adding more details to the initial prompt over several iterations, including output examples, any constraints, the type of research you want the model to conduct, etc.
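For example, a prompt that bundles an output example with explicit constraints might look like this (the wording is illustrative, not a quote from this thread):

```python
# Illustrative prompt that includes explicit constraints and a sample of the desired output.
detail_rich_prompt = """\
Draft three subject lines for a weekly status report on a software rollout project.

Constraints:
- Maximum 70 characters each.
- Neutral, factual tone; no exclamation marks.
- Mention the current project phase.

Example of the style I want:
"Rollout Phase 2: UAT complete, go-live on track for 14 June"
"""
print(detail_rich_prompt)  # paste into your GenAI tool or send via its API
```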