Trademark Disputes, Spam Policies, and AI Scraping Protections: September SEO Trends You Shouldn’t Ignore

As Q4 kicks off, search and content marketers are navigating another wave of change—this time spanning everything from Google’s evolving spam documentation to new protections against AI scraping. Below, we break down this month’s most pressing developments and what they mean for your SEO strategy moving forward.
WordPress and WP Engine: A Legal Divide Worth Watching
In a surprising twist, WP Engine has been barred from using the WordPress trademark. The core issue? WP Engine runs its commercial hosting platform on top of the open-source WordPress codebase without an official trademark license, and critics argue this amounts to unauthorized trading on WordPress's brand equity.
Why it matters:
For agencies and clients using WP Engine, this could impact branding, trust, and long-term support. While technical functionality remains intact, platform clarity and licensing transparency will matter more than ever in CMS recommendations moving forward.
Google Updates Spam Policy to Address AI, Scraping, and Abuse
Google has quietly updated its spam policy documentation with new guidelines covering AI-generated content, web scraping, and abuse tactics such as content duplication and low-value aggregation.
This documentation update is significant not only for how Google defines “spam” in 2024, but also for how it draws the line between AI-assisted content creation and AI-driven manipulation.
Takeaway:
Review this documentation to better inform conversations with clients. It can help distinguish between responsible AI-assisted content workflows and strategies that might put rankings at risk.
“From Small Businesses” Carousel Now Testing on Mobile
Google is testing a new “From Small Businesses” carousel in mobile results. This initiative is likely tied to the “small business” attribute in Google Business Profiles.
What to watch:
If your clients qualify as small businesses, ensure their GBP listings are updated and optimized. This includes enabling the small-business attribute, gathering reviews, uploading high-quality imagery, and including local keywords in descriptions.
Being featured in this carousel could improve visibility and click-through rates, especially for mobile-first searches with strong local intent.
Cloudflare’s AI Audit Tool Could Change the Scraping Game
Cloudflare is rolling out an AI Audit tool designed to detect and report scraping activity from AI bots. The end goal? Give publishers the ability to charge or restrict AI-powered crawlers.
Why it’s a big deal:
This marks the first real step toward compensating content creators whose work is used to train LLMs or power AI-driven search. If adopted widely, it could introduce a new monetization layer for content while reshaping how data access is regulated across the web.
Google and DeepMind Now Recognize Author Entities
Google, through DeepMind, is now explicitly recognizing content creators as individual entities, particularly in YMYL (Your Money or Your Life) categories like healthcare, finance, and legal.
How to check if your authors are recognized:
Search for a blog article on your site and click “About this source.” If Google identifies the author by name, their expertise is likely being attributed to the content.
Next steps:
- Create robust author profile pages
- Include credentials and experience, especially for regulated industries
- Use structured data to tag authorship (e.g., author and Person schema)
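To illustrate the structured-data step above, here is a minimal JSON-LD sketch of article authorship markup. The author name, job title, and URLs are placeholders for illustration, not values from any real site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/authors/jane-doe/",
    "jobTitle": "Certified Financial Planner",
    "sameAs": ["https://www.linkedin.com/in/janedoe"]
  }
}
</script>
```

Pointing `author.url` at a robust author profile page, and `sameAs` at external profiles, helps search engines connect the byline to a verifiable person entity.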
Establishing authority and trust at the author level is becoming just as important as the domain level—especially in E-E-A-T-critical industries.
Understanding Where Toxic Backlinks Come From
Clients frequently ask, “Why are we getting toxic backlinks?” While many expect a nefarious explanation, the reality is often more mechanical or random. Here are the most common sources of toxic links today:
- Link farms and PBNs created to manipulate rankings
- Spammy directories that list low-quality, unrelated businesses
- Hacked websites that unknowingly auto-link to your domain
- Competitor sabotage through negative SEO tactics
- Automated link-building tools that point to unrelated sites
- Foreign-language or irrelevant domains scraping and linking without context
Why it matters:
Understanding the source of toxic links is key to proper disavowal. It also helps clients recognize that many toxic backlinks are the result of random bots, not necessarily bad PR or shady practices.
If your client is underperforming and a backlink audit reveals quality issues, consider adding a disavow process to your monthly workflow. Tools like Semrush or Ahrefs can help flag these domains before they impact trust signals.
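As a small illustration of that disavow workflow, the sketch below turns a list of flagged domains (as you might export from a backlink audit tool) into Google's disavow-file format. The domain list is hypothetical:

```python
# Sketch: build disavow.txt contents from a list of flagged domains.
# The domains below are placeholders standing in for an audit export.
flagged_domains = [
    "spammy-directory.example",
    "link-farm.example",
    "scraper-site.example",
]

def build_disavow_file(domains):
    """Return disavow-file text: one 'domain:' rule per unique domain."""
    lines = ["# Disavow file generated from backlink audit"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

print(build_disavow_file(flagged_domains))
```

The resulting file is what you would upload through Google's disavow links tool; the `domain:` prefix disavows every link from that domain rather than a single URL.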
Historical URL Auditing with Wayback Machine and Screaming Frog
Preserving SEO value from legacy pages or past site structures can be difficult—unless you know where to look. A practical way to surface missing URLs is by combining Wayback Machine snapshots with Screaming Frog SEO Spider.
Here’s how to do it:
- Select a historical version of the site on archive.org
- Crawl that snapshot in Screaming Frog
- Export and clean the list of historical URLs
- Crawl the current site and compare
- Identify URLs that are missing or significantly altered
- Rebuild or redirect key pages to recover lost equity
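The comparison step above is simple set arithmetic once both crawls are exported. The sketch below assumes you have two URL lists (one from the Screaming Frog crawl of the Wayback snapshot, one from the current site); the paths shown are hypothetical:

```python
# Sketch: find historical URLs missing from the current site crawl.
# Both lists are placeholders for Screaming Frog exports.

def find_missing_urls(historical, current):
    """Return historical URL paths absent from the current crawl, sorted."""
    return sorted(set(historical) - set(current))

historical_urls = [
    "/services/seo-audits/",
    "/blog/link-building-guide/",
    "/about/team/",
]
current_urls = [
    "/services/seo-audits/",
    "/about/team/",
]

# Each path printed is a candidate to rebuild or 301-redirect:
for path in find_missing_urls(historical_urls, current_urls):
    print(path)
```

From there, cross-reference the missing paths against your backlink data to prioritize which pages carry equity worth recovering.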
Bonus tip:
Use this method when onboarding new clients or after major migrations. It often uncovers overlooked legacy content that’s still relevant, link-worthy, or indexed.
Final Thoughts
This month’s search updates reinforce a few critical truths:
- Search engines are prioritizing legitimacy: Whether it’s spam policy clarification, small business promotion, or author verification, trust is at the center of modern SEO.
- Content creators are beginning to take control: Cloudflare’s scraping audit and DeepMind’s author recognition are early signs that publishers will have a larger voice in how their work is used.
- AI is changing how we produce, evaluate, and protect content: Marketers must understand the difference between using AI tools and optimizing for AI systems, and stay on the ethical side of both.
Need help auditing your backlinks, updating author schema, or protecting your content from AI scraping? Our SEO and web strategy teams are here to help.