Here’s a shocking truth: Google’s crawl team is now directly calling out WordPress plugins that waste valuable crawl budget—and it’s causing a stir in the SEO world. But here’s where it gets controversial: while some developers are quick to fix these issues, others are ignoring Google’s warnings altogether. Could this be the start of a bigger clash between Google and plugin creators? Let’s dive in.
During the latest Search Off the Record podcast, Google’s Gary Illyes revealed that his team has been filing bug reports against plugins responsible for excessive crawl waste. One standout example? WooCommerce. Google flagged its add-to-cart URL parameters as a major culprit, and WooCommerce swiftly addressed the issue. And this is the part most people miss: despite Google’s efforts, not all developers are as responsive. For instance, a commercial calendar plugin generating infinite URL paths remains unfixed, even after Google’s outreach.
So, what’s the root of the problem? According to Google’s 2025 year-end crawl report, action parameters—like ?addtocart=true—accounted for nearly 25% of all crawl issues. Combined with faceted navigation (50%), these two categories make up a staggering 75% of crawl problems. Here’s the kicker: these parameters often aren’t intentionally added by site owners but are injected by CMS plugins, creating a bloated URL space that Googlebot struggles to navigate.
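To see why faceted navigation dominates those numbers, it helps to count the URLs a single page can spawn. The sketch below is purely illustrative—the facet names and values are hypothetical, not taken from any real plugin—but it shows how a handful of filters multiplies into dozens of distinct crawlable URLs:

```python
from itertools import product

# Hypothetical filters on one category page ("" means the filter is not applied).
facets = {
    "color": ["red", "blue", "green", ""],
    "size": ["s", "m", "l", ""],
    "sort": ["price", "newest", ""],
}

# Each combination of applied filters produces a distinct URL for Googlebot.
urls = []
for combo in product(*facets.values()):
    params = "&".join(f"{k}={v}" for k, v in zip(facets, combo) if v)
    urls.append("/category/shoes" + ("?" + params if params else ""))

print(len(urls))  # 4 * 4 * 3 = 48 URLs from a single page
```

Add a fourth facet with five values and that one page balloons to 240 URLs—and every product category repeats the multiplication. That is the “bloated URL space” Googlebot struggles to navigate.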
Illyes explained how they identified the WooCommerce issue: ‘We dug into where these action parameters were coming from and traced them back to popular WordPress plugins. Once we confirmed they were open-source, we filed bug reports directly.’ WooCommerce’s quick fix was commendable, but other unnamed plugins remain unaddressed. Bold question: Should Google take a more aggressive approach to force unresponsive developers to act?
This isn’t a new issue. Illyes has been warning about URL parameter problems for years, and Google even formalized guidelines for faceted navigation and updated its URL parameter best practices. Yet, the data shows the problem persists. Why? Because crawl waste is often embedded in the plugin layer, leaving website owners in a tough spot: it’s not their fault, but it’s their responsibility to fix.
Google’s recommended stopgap? Proactively use robots.txt to block problematic parameter URLs before they strain your server. Filing bugs against open-source plugins is a step in the right direction, but it raises questions about accountability and collaboration in the developer community.
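As a rough sketch of what that robots.txt approach looks like—the parameter names below are placeholders you would swap for whatever your own plugins actually append, and any rules should be verified with Search Console’s robots.txt tester before deploying:

```
User-agent: *
# Block hypothetical action-parameter URLs injected by plugins
Disallow: /*?*addtocart=
# Block hypothetical faceted-navigation filter parameters
Disallow: /*?*color=
Disallow: /*?*orderby=
```

Google’s robots.txt parsing supports the `*` wildcard, so these patterns match the parameter wherever it appears in the query string. One caveat: blocking a URL in robots.txt stops crawling, not indexing, so it saves crawl budget but won’t remove already-indexed parameter URLs on its own.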
Controversial thought: What if Google started penalizing sites using plugins that ignore these warnings? Would that spark change, or would it create more friction? Let’s discuss in the comments—do you think Google is doing enough, or should they take a harder line? The full Search Off the Record episode is worth a listen for those who want to dig deeper.