How Original is ChatGPT Content?

ChatGPT just launched its API, and GPT-4 has entered the building, making it easier than ever to plug unlimited, at-your-fingertips content creation into your workflow. 

Whether you work in-house creating landing pages or at an agency churning out massive amounts of content for your clients, you need fast, reliable content now more than ever. And because Google deprioritizes duplicate content, you want to make sure that you aren’t putting out the same content as all your competitors. 

Is ChatGPT content unique? 

If you missed my last AI content experiment, we covered what it takes to edit a piece of Jasper content to make it agency-ready. This time, I want to know how creative ChatGPT can be. 

Can it spin up truly different content about the same topics using different prompts? Or will using this writing tool make you sound exactly like everyone else? 

Lately, prompt engineering has been the go-to solution for anyone trying to create unique AI content. I want to put that to the test. I’ll use varied (but related) prompts to generate paragraphs. Then, I’ll compare both outputs, looking specifically for shared phrases or structures. 
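If you want to run this kind of comparison yourself, a few lines of Python can surface the exact word sequences two outputs share. This is a minimal sketch of the idea, not the highlighting process I used, and the sample sentences are invented:

```python
def shared_phrases(text_a, text_b, n=4):
    """Return exact n-word sequences that appear in both texts."""
    def ngrams(text):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(text_a) & ngrams(text_b)

# Invented samples, standing in for two ChatGPT outputs
out1 = "Fruits and vegetables are a great source of vitamins and minerals"
out2 = "Leafy greens are a great source of vitamins for any new vegan"
print(shared_phrases(out1, out2))
```

On the invented samples above, it flags overlaps like "a great source of." Bump `n` up to catch only longer, more damning matches.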

Hypothesis: Due to the nature of its training and limitations, ChatGPT outputs will use similar phrases regardless of prompt engineering. 

Humans, for what it’s worth, are the same way. There are only so many different ways to write about what we write about. We fall into routines. Habits. Cliches. Our vocabulary gets monotonous, and our writing suffers for it. 

And, because ChatGPT is trained on human writing, I’m anticipating a similar downfall. 

However, I’m excited to see the results because where one person has their own crutch words and writing tendencies, ChatGPT has everyone’s crutch words and writing tendencies. Maybe it will generate original phrases even with similar prompts. I’d love to be surprised.

Test 1: Where I Ask ChatGPT for Vegan Food Ideas

Say, for instance, you’re writing a blog post about great vegan options for people who have recently made the switch. You’d want something straightforward but informative. This is where ChatGPT excels. 

Prompt 1: tell me about good food options for new vegans

Prompt 2: What are great options for people who just started eating vegan?

Right off the bat, it’s easy to see how similar these outputs turned out: both run about 500 words, both list seven items, and both have introductory and concluding paragraphs. If you’re in the market for a low-lift vegan food article, this is the 90-second rice version. 

So, how original are they? I’ve highlighted shared exact phrases in purple.

Aaaaaand that is a lot of purple. Honestly, it would have been easier to highlight the words that aren’t shared than the ones that are. Let’s break it down into bite-sized pieces so you don’t have to keep squinting at your screen.

At first glance, the opening paragraphs of each look clean and unique. However, an eagle-eyed reader will recognize that verbiage from Prompt 1’s first paragraph has been regurgitated in Prompt 2’s closing paragraph. 

(And yeah, it still counts as plagiarism.) 

You might have also noticed that the seven items on both lists are identical. I’m pretty sure even vegan diets aren’t that predictable. Plus, they’re in the exact same order. Not quite bespoke marketing. 

One last note here. One of the biggest differences between these two pieces boils down to adjective choice. “Popular” is used routinely throughout Prompt 1’s output. In contrast, Prompt 2 never says popular but instead opts for “great.” They don’t switch it up, either. Once they’ve decided on their adjective of choice, they seem content to stick with it. Neither is particularly inspiring, but their repetition is an obvious indication of AI-generated material.
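A quick word-frequency count makes this kind of crutch-word repetition easy to spot. Here's a minimal Python sketch; the sample text is invented for illustration, not pulled from the actual outputs:

```python
import re
from collections import Counter

def top_words(text, k=5):
    """Most frequent words, skipping short filler words like 'a' and 'the'."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if len(w) > 3).most_common(k)

# Invented sample, standing in for Prompt 1's output
sample = ("Tofu is a popular protein. Lentils are another popular choice, "
          "and oats make a popular breakfast.")
print(top_words(sample))  # 'popular' tops the list with 3 uses
```

If one adjective towers over everything else in the count, that's your tell.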

Test 1.5: Vegan Food, One More Time With Feeling

It’s clear that without much prompt differentiation, ChatGPT is going to offer up similar content. To get unique outputs, you need unique inputs. This is where prompt engineering comes in. The more time you spend crafting a detailed prompt, the better the quality should be. 

(Of course, there is no guarantee that someone else won’t use that same input and create an output identical to yours. But more on that later.) 

Before I start fiddling with completely new prompts, I want to ask about vegan food again, but this time pushing for a different tone. We’ll compare the output from Prompt 1 from the previous test to a new output. 

Prompt 1: Tell me about good food options for new vegans

Prompt 2: Write a few paragraphs about vegan food for beginners with an easygoing tone

Here, I’ve highlighted similar phrases in yellow while continuing to highlight duplicate content in purple because — hallelujah! — these two outputs are actually different. 

IMO, the easygoing tone was accomplished pretty well. “Hey there” might be too casual for a typical blog post, but extra fluff like “don’t worry, it’s not as hard as you might think” or “they come in all shapes, sizes, and colors” helps make the output feel personable. 

Barring nuts and seeds, Prompt 2 still highlighted all the same types of foods as Prompt 1. This is somewhat expected, as we get the same heavy hitters: fruits and vegetables, whole grains, legumes, tofu and tempeh, and milk and cheese alternatives. What surprised me here is that the easygoing tone also recommends specific dishes. 

Are black bean tacos or chickpea curry notably more easygoing options? Or has the model read enough Pinterest food blogs to know that laidback tones are often paired with specific recipes? I’m not sure.

Test 2: Nantucket Vacation Homes Or Bust

Prompt 1: Write with authority about vacation homes on Nantucket, Massachusetts for interested buyers

Prompt 2: Write an educational but laidback post about vacation properties on Nantucket, Massachusetts for a buyer already familiar with the island

There are two main differences between these prompts — target audience and tone. I wanted Prompt 1 to sound like it knew Nantucket like the back of its hand and was confident in its expertise. For Prompt 2, while still requesting a knowledgeable speaker, I hoped it would cater specifically to buyers who already knew the ins and outs of the island.

If we’re grading the AI out of 10, I’d give this one a 5/10. 

The prompts themselves were moderately successful in generating different materials, but these swaths of purple and yellow don’t lie. These outputs are still largely similar.

Interestingly, ChatGPT emphasized affordability in both outputs, as if that were the obvious next word after “vacation homes,” even though neither prompt asked for it.

In its defense, I also didn’t specifically request that it write for a high-end luxury audience. However, buyers interested in vacation properties on Nantucket likely don’t have huge financial constraints, so price might not be a barrier. Implying that it should be could turn off lucrative buyers. This is the kind of context a human writer would likely infer from an assignment that AI simply isn’t able to see and implement. 

But this isn’t a quality critique, so let’s take a look at what sets these two outputs apart. 

The biggest difference is in their body paragraphs. Because Prompt 2 was designed with buyers who already know the island but Prompt 1 wasn’t, the outputs managed to cater to slightly different audiences. 

Prompt 1 mentions Sconset (a colloquial term for Siasconset) and Madaket (as in, Madaket Beach). Using shorthand versions of these enclaves makes the output sound familiar with the area. I almost would have expected that from Prompt 2 instead. 

And let’s not brush over the fact that these are called more affordable neighborhoods. Properties listed there are on par with prices throughout the island, multimillion-dollar homes included. 

Prompt 2, on the other hand, highlights zoning laws. Certainly, this is something someone considering buying a home would be interested in, but it’s not necessarily known by everyone familiar with the island. It’s educational but doesn’t dive too deep, which is exactly what the prompt asked for. 

Regardless of the content’s effectiveness, at least here there was variety. ChatGPT proved that it could generate different outputs when nudged in the right direction. 

Test 3: The Same Prompt Over and Over and Over

We’ve tried generating content on different topics, so this time I want to see how ChatGPT performs when asked to create new material while repeatedly using the same prompt. For a refrain like this, we have to go to Music City. 

Prompt: Tell me about things to do in Nashville, Tennessee

I asked ChatGPT five times to tell me about things to do in Nashville, Tennessee. Each time, this prompt generated a list of ten attractions. The first three times, the list featured the same ten things to do in the same order. 

  1. Country Music Hall of Fame and Museum
  2. Ryman Auditorium
  3. Lower Broadway
  4. Grand Ole Opry
  5. Belle Meade Plantation
  6. The food scene
  7. Frist Art Museum
  8. Centennial Park
  9. Johnny Cash Museum
  10. Music Row

These three outputs are scarily similar. It’s the vegan food experiment all over again. Nearly the same two-sentence intro paragraph followed by a listicle. Throughout, there are a few places where they differ — swapping adjectives or sentence structure like we’ve seen in previous tests — but frankly not enough to notice.

By the fourth generation, it started to shake things up. 

ChatGPT swapped the order (moving the Johnny Cash Museum from 9 to 4 on the list) and replaced Music Row with the Tennessee State Capitol. Plus! A conclusion! We’re really begging for scraps here, folks. 

Generation five throws in a Hail Mary — Jack Daniel’s Distillery. This is actually in Lynchburg, a ninety-minute drive from Nashville, but that’s not the point here. At least it tried something new.

General observations and key takeaways

You’re in a hurry. Here’s the short version: 

  • ChatGPT is basically glorified predictive text. It likes patterns. They’re all it knows. 
  • High-quality content is original content, but not all original content is high quality. 
  • Not only does prompt engineering work, but it’s also a necessity. Getting granular with your prompts is the only way to generate varied content through ChatGPT.

So, what happens if your content isn’t original? 

Duplicate content isn’t going to work for your website. Having informative, authoritative, unique content is crucial for SEO, and how AI and SEO interact is still up in the air. In the meantime, there are plenty of AI detection tools out there to help you keep an eye on the content you’re posting.

Is AI content right for you?

At Verblio, we’re constantly experimenting with ways to make your content production as seamless as possible. AI writing tools can be a powerful asset, but it’s clear they still need the human touch to create a finished product. Our Human-Crafted AI cuts costs but keeps the quality you love.

Rachel Ghazel

Rachel is a content marketer by day and moonlights as a children’s author. She's managed millions of words at marketing agencies and written nearly as many. If you don’t see her color-coding her to-do list, she’s probably dusting her dictionary collection or drifting through the library stacks.
