At Message Lab Media, we’ve been exploring how healthcare marketers like us can use AI to make our jobs faster and easier. It’s a tall order because healthcare marketing requires nuance, compassion and clarity.
Our writers have been using AI for help with things like:
- Generating ideas for new articles.
- Understanding and explaining complex treatments, conditions and research findings.
- Repurposing content into different formats, like turning a long-form article or referral guide into email copy and social posts.
But we haven’t been using AI for any actual writing because, in our test cases, it’s often inaccurate, heavy on buzzwords and light on substance. Still, we’re constantly asking: Are there any writing tasks we can automate with AI without jeopardizing quality, accuracy or sensitive information?
Right now, we have exactly one project that fits the bill for AI as an author. We built a custom GPT in partnership with Children’s Health, the nation’s 8th largest pediatric health system. Here’s a brief case study of how we found the right project for an AI author and how it works.
First, what is a custom GPT? How is it different from ChatGPT?
- ChatGPT is a general, open model that answers many kinds of questions. The free version uses a basic model and your data may be used to improve the system unless you opt out; paid versions keep your data private within your workspace and separate from model training.
- A custom GPT is a closed, specialized version of ChatGPT that you set up for a specific purpose. It usually requires a paid plan, and any data you add stays private to your workspace and separate from the data used to train public models.
Identifying a writing project a robot can do
Custom GPTs are built on a set of rules they can follow over and over again. In terms of creating content, custom GPTs are good at:
- Following a style guide
- Filling out a template
- Learning what to do based on examples
This means a project with a clear template, lots of examples and content that’s somewhat formulaic can be a good fit for a custom GPT.
Where we don’t want to use the robot
We’re still learning exactly how GPTs and AI use the data we input. We err on the side of caution with what information we enter into a GPT (even a custom or closed one) and avoid entering:
- Any personal information that’s not easy to find online. For example, you can likely find a doctor’s name and where they attended medical school in a quick Google search. But we would not put information like their home address or personal contact information into a GPT.
- New information, such as content about a new treatment or study that a client might not be ready to share with the world.
- Personalized patient data. We might ask the robot to explain how a VP shunt works, but we wouldn’t enter an entire story that could be used to identify an individual who had a VP shunt.
We also don’t want to use AI in places where it would compromise quality. Thus far, we’re not seeing AI-authored content hit the mark in:
- Content we want to compete in search. (Per Google, using AI alone won’t automatically penalize you in search. But AI-generated content often lacks the depth, credible expertise and/or originality to truly compete in search.)
- Feature articles about discoveries and innovations (AI is often inaccurate and creates copy that sounds good at first blush, but upon closer inspection lacks nuance, detail and authentic storylines.)
- Patient and donor stories (These also lack nuance, compassion and compelling storytelling, and we don’t want to put potentially sensitive information into a GPT.)
In short, AI is a great research and writing assistant, but not yet a reliable author. Still, we recently identified one project where it works well as an author: provider bios!
The perfect healthcare marketing task for a custom GPT
Provider bios are a good fit for a custom GPT because:
- They are templated and formulaic.
- They do not include patient information.
- They only include information that’s typically already available online.
- Competing in SEO is not a top priority.
- In this case, we already had a robust library of bios we could use to train the custom GPT.
How we trained the custom GPT to write provider bios:
- Gave it links to dozens of bios and told it to replicate the style and tone.
- Uploaded our provider bio template with clear instructions about what information to include.
- Provided clear instructions about the style guide, format and word count.
- Told it what NOT to include, such as a provider’s undergraduate major.
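For readers building something similar, the steps above roughly translate into an instruction block inside the custom GPT. The wording below is our own illustration of what such a block might look like, not the exact configuration we used:

```text
You write provider bios for a pediatric health system.
- Follow the attached style guide and bio template exactly, including word count.
- Match the tone and structure of the example bios provided.
- Use only the information in the questionnaire the writer pastes in.
- If information for a section is missing, leave that section blank. Never invent details.
- Quote the provider's own words verbatim; do not embellish or rewrite quotes.
- Do not include undergraduate majors or personal contact information.
```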
How we set up the process with providers
- We created a simple form where providers shared their title, medical school, care philosophy and specialty/focus areas. Questions about care philosophy and specialty are crucial in making sure each bio is unique and engaging.
- We asked physicians to submit their CVs. The CV serves as a backup when the questionnaire is missing information, which can save the time of sending follow-up questions to the provider.
How we set up the process with writers
- Writers upload the completed questionnaire into the GPT.
- If the questionnaire is missing info, writers look for that information in the physician’s CV. Important: The writer pulls only the information they need from the CV and does not input the entire CV. This avoids giving the robot more information than it needs.
- The GPT creates a draft in a few seconds.
- The writer edits the draft.
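The key safeguard in the steps above is that writers fill gaps with hand-picked facts from the CV rather than pasting in the whole document. A minimal sketch of that merge step, with illustrative field names and a hypothetical `build_gpt_input` helper (not our actual tooling):

```python
# Illustrative sketch of the writer workflow: combine the questionnaire with
# hand-picked CV facts. The field names here are assumptions for the example.

REQUIRED_FIELDS = ["name", "title", "medical_school", "care_philosophy", "specialty"]

def build_gpt_input(questionnaire: dict, cv_facts: dict) -> tuple[str, list[str]]:
    """Return the text to paste into the custom GPT plus any fields still missing.

    questionnaire: answers from the provider's intake form.
    cv_facts: only the specific facts the writer pulled from the CV --
              never the entire CV, so the GPT gets no more than it needs.
    """
    merged = {}
    missing = []
    for field in REQUIRED_FIELDS:
        value = questionnaire.get(field) or cv_facts.get(field)
        if value:
            merged[field] = value
        else:
            # Leave the gap visible so the GPT doesn't invent an answer.
            missing.append(field)
    lines = [f"{f.replace('_', ' ').title()}: {v}" for f, v in merged.items()]
    return "\n".join(lines), missing
```

Flagging the still-missing fields mirrors the feedback rule below: the GPT is told to leave blanks blank, and the writer knows exactly which follow-up questions to send.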
We also had writers give the GPT feedback on any mistakes it made so it can improve next time. Common mistakes we observed include:
- Adjusting or embellishing quotes so they sound sappy or depart significantly from the provider’s actual words or intended meaning.
- Inventing answers if certain information was not provided. For example, on one bio where the provider did not include information about interests outside of work, the GPT added “in her spare time, she enjoys playing tennis with her daughter” instead of leaving that section blank.
- Straying from the style guide in places where it deviates from AP style, for example the guide’s use of “says” instead of “said” for quotes.
End result: 50-75% time saved per bio
All told, our custom GPT has cut the time it takes to draft each bio by 50-75%. Each bio took about an hour to create with a human author alone, while a human plus the custom GPT averages 15-30 minutes. This means faster turnaround for the client, and it has led them to consider creating bios for providers not currently on their site, like physical therapists and child life specialists.
It also made this task faster and easier for our writers. Moving ahead, we’ll keep testing AI and deploying it on other formulaic projects. This will help us make certain projects more efficient and affordable while freeing up more time for the content AI can’t create.
Learn more
At Message Lab Media, we specialize in helping health systems, science organizations and foundations engage and inspire their key audiences – from patients to doctors to donors. Reach out to us to learn more about our work and how we can help you.
