In 2025, Merriam-Webster named “slop” its word of the year — a reflection of growing frustration with the low-quality, mass-produced content filling feeds, inboxes and search results. “AI slop” has become shorthand for something many people feel instinctively: Content is everywhere, but much of it feels repetitive, rushed or hollow.

But is content quality actually declining, or are we reacting to something more nuanced?

To explore that question, we sat down with two mdg-ers who approach the issue from very different angles: Vanessa Page, copy director, and Alex Velinov, VP of AI and Technology. What emerged wasn’t a debate, but a thoughtful conversation about sameness, speed and where human judgment still matters. 

When Everything Starts to Sound the Same 
From Vanessa’s perspective, much of what people label AI slop — at least from a messaging standpoint — comes down to repetition. 

“I think a lot of what people are reacting to is the sameness,” she explained. “That feeling of seeing the same ideas and sentence structures pop up over and over again.” The word “gateway” is a good example, she said. It’s not a word you saw much in marketing a few years ago. Now, it’s a frequent find, along with short, choppy constructions like “It’s not this. It’s that.”

She also pointed to an increase in small but telling mistakes — missing words, awkward phrasing, content that doesn’t quite make sense. “You just think, If an editor looked at this, they wouldn’t have let it pass through,” she said. 

Alex sees the same phenomenon but frames it differently. He calls it normalization — when content begins to follow familiar schemas so closely that it all starts to look the same. “If everyone starts to use the same schema, it’s not a schema anymore,” he said. “Everything is normalized. It looks the same.”

To make the point, he reached outside of marketing entirely. “If you look at the Top 40, probably 80 or 90% of songs have exactly the same structure,” he noted. “That’s normalization — and it’s not related to AI at all.” AI, in other words, didn’t invent sameness. It just made it easier to reproduce at scale.

Scale, Speed and Responsibility 
Both Alex and Vanessa agreed that AI has the potential to bring real advantages to content workflows — particularly scale and speed. It can help teams get past the blank page, generate ideas and produce more with fewer resources. 

But those gains don’t guarantee successful outcomes. And they certainly don’t replace human insight.  

“You are responsible for the output,” Alex said plainly. “It doesn’t matter how it’s done. At the end of the day, you put your name on it.” If content doesn’t meet an organization’s standards, the answer isn’t to blame the tool — it’s to rethink the process around it. “Go back to the drawing board until it’s good enough for you to say, ‘That’s good enough for me.’”

While Vanessa agrees, her caution comes from a different place. She worries about what gets lost when speed becomes the primary goal. “Sometimes it feels like we’re making efficiency our strategy,” she said. “And creativity can get lost in that.” Some of the best ideas, she noted, don’t arrive instantly — they emerge after time, pressure and iteration.

Authenticity in an AI World 
As AI becomes more embedded in everyday work, brands are increasingly eager to emphasize how “human” they are. That tension — using AI while trying not to appear automated — surfaced repeatedly in the conversation. Alex believes this may simply be a transitional phase. “I don’t see why you need to hide it,” he said. What matters, in his view, is the message and the idea behind the content — not whether AI helped produce it. 

AI, he argued, can democratize creativity. “If you have good ideas and a strong concept, you don’t need a Hollywood budget anymore,” he said. The tools can lower barriers — but they don’t replace thinking.

Rules, Experimentation and the Gray Area
When it comes to governance, both agreed that some rules are necessary, particularly around data and risk. But Alex emphasized that waiting for perfect clarity before adopting AI can be its own risk.

“If you wait for the day everything is fixed, when that day comes, you start from zero,” he said. Experimentation, even when messy, builds essential experience.

Vanessa agreed — with a caveat. “It’s really easy to let AI do your thinking for you,” she said. Her concern is not only about maintaining the ability to strategize and generate original ideas today, but also about cultivating tomorrow’s talent. “I don’t know how newer people will develop the skills they need if there’s always a shortcut.”

So … AI Slop or Not? 
The conversation ultimately lands in the middle — intentionally so. AI isn’t inherently making content better or worse. In many ways, it’s amplifying what already exists. Strong ideas scale more easily. Mediocre ones do too. The real challenge for organizations isn’t whether to use AI, but how — and where human judgment remains non-negotiable.  

As Alex put it, “A tool in the hands of someone who knows how to use it is powerful. In the wrong hands, it’s just noise.”
