Google is struggling with some of the biggest technical issues in its history, affecting users’ experiences across tools, including spending controls and campaigns on the all-important Google Ads platform. The company says it hopes to provide an update soon explaining the problems and answering questions about when they will be resolved. Other mainline Google products are experiencing significant issues ranging from dramatic fluctuations in SEO and poor SERPs to GA4’s outright failures.
Because of widespread concern, AIMCLEAR gathered its experts to discuss the problems and share how we’re working through them with clients. The group consisted of: Chief Marketing Officer Amanda Farley, Director of Integrated Ad Operations Tim Halloran, SEO Lead Lea Scudamore, Software Engineer Reed Dunbar and Senior Integrated Ad Operations Manager Amber Deedler. Vice President of Integrated Content and Public Relations Rob Karwath moderated the conversation, captured via this edited transcript.
Google bugs & issues: Aimclear’s experts help you sort it all out
Rob: What are we seeing, how serious are these Google troubles, and how are they affecting our clients?
Amanda: Issues are emerging across all facets of Google at the moment, spanning SEO, GA4, Google Ads and beyond. A common sentiment is that something highly unusual is happening. We’ve observed a range of problems, from GA4 overwriting conversions and recording random conversions, to alterations in the primary or secondary conversion source, all of which can significantly disrupt ad performance. Google also appears to be grappling with a significant overspending bug, a matter that was reported yesterday and that we have personally encountered.
It’s somewhat heartening to see the industry voicing its concerns as well, illustrating the magnitude of the issue, particularly within Google Search results pages. Lea can provide a more detailed discussion about that. But typically quite reliable elements are malfunctioning. This ranges from the appearance of spam sites to inaccuracies in local listings. And the issue is pervasive.
Lea: Navigating the local aspects has presented challenges. For instance, a seemingly simple action—me searching for “Chinese food near me” on a Sunday when I opted not to cook—yielded perplexing results. Google recommended a Chinese restaurant in Chicago, despite me being in Duluth, Minn., and presumably triangulated by the nearest three cell towers.
This unexpected search result, a big deviation from how local searches are supposed to function, is not an isolated incident. Another instance involved a client who is highly dependent on local searches calling us to question why Google also was delivering Chicago results to them. They’re in Brooklyn Park, Minn. This isn’t exclusive to my experiences or devices. Clients are informing us of broken search results from their local standpoints, indicating a consistent problem.
Scrutinizing SERPs (Search Engine Results Pages) and site checks on Google, I’ve encountered aberrations, such as it showcasing what I would describe as subpar and spammy sites, reminiscent of results from a decade ago. These are thin-content websites, some of them blatant spam, that under normal circumstances shouldn’t rank highly. Yet they do.
Meanwhile, reliable SEO tools show site rankings dropping for keywords. A live, incognito check, even with client verification, suggests a throttling of organic results, some of which we’ve captured in snapshots. One possibility is that we’re in the middle of a re-index, or that our competitors are being re-indexed. But evidence is emerging of manipulation of organic SERPs to bolster profits on the paid side. I have clients reporting, via analytics, a 64 percent downturn in organic revenue, which contradicts our discussions from just two weeks earlier, when they were trending up by 15 percent and reporting record figures.
These discrepancies aren’t just confounding—they’re misleading. A scenario where analytics reflects a 64 percent drop in revenue, while in reality, they are breaking records compared with any previous month, underscores a critical inconsistency between analytics and real-world performance.
In a recent encounter, an agency friend asked me if analytics was significantly disrupted or if they should terminate their analytics team. I had to confirm the former and make sure they understood this had nothing to do with their team’s proficiency. My apprehension is that many agencies and in-house analytics professionals might wrongfully bear the brunt of, or even lose their positions due to, these inconsistencies.
Ultimately, this all seems fundamentally rooted in Google’s actions, whether it be deliberate obfuscation or another undisclosed reason.
Rob: These outcomes present real-world consequences for clients that rely on search.
Reed: My main takeaway is that there doesn’t seem to be any consistency. A model can break, and something can change, leading us to acknowledge the need for a solution. But when workarounds lead to further workarounds, it creates a seemingly endless cascade. If the situation never resolves and instead keeps evolving, that becomes almost more problematic than simply identifying an issue and fixing it.
Amanda: Another aspect that raises a red flag to me is the frequency of core updates rolled out recently. I’ve never seen them occur at this rate. While updates are common, experiencing so many consecutively is new. We’ve also read reports indicating a decline on Google’s end, attributed in part to some of their top-performing CDN network websites. Much of this seems like risky moves on Google’s part. What’s the rush with rollouts?
Lea: As for GA4, they are under pressure to improve it since Universal is no longer an option. But GA4 is like cake batter that’s been left in an unheated oven. Since it fully replaced Universal last July, it’s hardly been a crowd favorite. And now it’s become a moldy, foul-smelling mess.
After Google launched the initial core update in August, which finished rolling out in early September, we observed a decline in search results quality. The “helpful content” update, also in September, didn’t seem to remedy this. Given the circumstances, I wouldn’t be surprised if Google introduced another core update within the next three months, if not sooner.
Historically, they release core updates semi-annually, around March and September. Now, with an August update and a second update in October, it seems that they’re patching frequently. It’s not uncommon for companies to release products that aren’t fully polished, but they usually refine them over time. The past couple of months have seen a surge in low-quality web traffic. Even if metrics suggest an uptick in brand traffic, its actual value is debatable.
Amanda: On the spam side of things, we validate every lead that comes through on our client account, including those that use Hubspot, Salesforce and other third-party services.
Lea: The results are just not that great. And the traffic coming in isn’t great. It’s just traffic that doesn’t pay bills. So, I think Google is in a patching phase and moving as fast as they can.
Amanda: Fraudulent activity is prevalent in the display network and the search partners’ network. We had already identified quality issues with these networks. But recent audits, especially in the last month, have exposed even more problems. In just two weeks, we observed a significant amount of unreliable data across various industries and clients. Every account showed spam originating from these networks, prompting us to disable them.
Some marketers are not fully aware how to properly validate this traffic. They might receive reports highlighting numerous conversions, not realizing that a significant portion of these are just spam bots filling out forms. This deceptive trend is among the most concerning I’ve encountered.
Rob: What advice are we providing to clients in this chaotic situation?
Amanda: So, on the ad side, we’re adding stricter controls and putting in new processes because it has just been so volatile, such as with the recent client budget issue, where a client spent more than an entire month’s budget in a single day. We’re increasing our triggers. We’re increasing our processes.
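A spend trigger like the one Amanda describes can be sketched in a few lines. The monthly budget figure, the 10 percent daily threshold, and the class name below are hypothetical choices for illustration, not Aimclear’s actual tooling:

```python
from dataclasses import dataclass

@dataclass
class SpendCheck:
    monthly_budget: float      # the client's approved monthly budget
    alert_ratio: float = 0.10  # flag any day that burns >10% of the month

    def daily_alert(self, day_spend: float) -> bool:
        """Return True when a single day's spend trips the trigger."""
        return day_spend > self.monthly_budget * self.alert_ratio

check = SpendCheck(monthly_budget=30_000)
check.daily_alert(1_200)   # normal pacing, no alert
check.daily_alert(31_500)  # more than a month's budget in one day
```

In practice a check like this would run against spend pulled from the ad platform’s reporting on a schedule, with the alert wired to whatever notification channel the team already uses.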
We’re well-positioned to assist. We must continually develop our reporting for those who haven’t adopted this approach. It’s essential to view the data independently and not rely solely on Google, especially when its accuracy is in question.
Tim: From the media-buying perspective, I’ve always emphasized the importance of redundancy for your most important conversion events. When dealing with platforms like Facebook—and recalling our experiences from 2016, 2019 and particularly 2020 with the introduction of iOS 14—it’s crucial to diversify our data sources. Ensure that you’re not only tracking conversions but also effectively utilizing the pixel. If you’ve integrated on the server side, it’s vital to monitor that as well.
Tertiary methods also are available. For instance, aside from the pixel, consider incorporating event-based pixel actions. This offers a three-tiered defense mechanism for conversions on social platforms. The same strategy can be applied with Google. Many advertisers are demoting GA4 from its primary role, opting to use it as a backup instead.
Advertisers are now integrating the Google tag on their sites, reverting to our earlier strategy of creating a specific G-tag conversion for each client account. They use GA4 conversions as a secondary source, and if there’s an issue with the G-tag, GA4 becomes the primary. The key takeaway: Always maintain multiple layers of defense. In the event of a pixel malfunction or similar setback, it’s imperative to have alternative strategies in place.
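One way to sketch the server-side layer of this defense is a backup purchase event built for GA4’s Measurement Protocol. The real endpoint (`/mp/collect`) also requires a `measurement_id` and `api_secret` in the query string; both are omitted here, the IDs below are placeholders, and this sketch only constructs the payload rather than sending it:

```python
import json

def build_mp_payload(client_id: str, transaction_id: str,
                     value: float, currency: str = "USD") -> str:
    """Build a GA4 Measurement Protocol body for a server-side
    purchase event, used as a backup when the browser tag fails."""
    body = {
        "client_id": client_id,          # placeholder client identifier
        "events": [{
            "name": "purchase",
            "params": {
                "transaction_id": transaction_id,
                "value": value,
                "currency": currency,
            },
        }],
    }
    return json.dumps(body)

payload = build_mp_payload("555.123", "T-1001", 49.99)
```

If the browser-side G-tag or GA4 conversion fails, a payload like this fired from the server preserves the conversion record independently.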
Rob: Amber, what are your observations and experiences?
Amber: The core concern for our clients is trust. They want assurance that we’re implementing the right strategies. Both Amanda and I have discussed the necessity of standardizing processes across the board, coupled with a contingency plan. Our clients need experts who can swiftly address any potential issues.
Even as we notice occasional glitches in our tools and rules, our primary objective remains client trust.
Relying solely on automated rules isn’t foolproof, as we’ve seen instances of them failing. It’s essential to have the right personnel and processes in place to ensure that our clients maintain their trust in us and that we can consistently deliver quality work. Thankfully, we’ve implemented a failsafe for billing. This prevents any unexpected withdrawals from client accounts.
Even when other lines of defense falter, our clients can be confident that our billing safeguards remain intact. We’re implementing more consistent checks. These are similar to the regular verifications we used to conduct for campaign-level settings. Our goal is to monitor more closely, especially due to the recent bugs. We’re ensuring that Google isn’t placing us in inappropriate locations or altering our bid amounts in ways that are nonsensical. We’re becoming stricter.
Rob: Human support is crucial to ensure quality.
Amber: Genuine human intervention is essential in technology.
Rob: Lea, how are you addressing these concerns with our clients?
Lea: First, I contacted them via phone calls instead of emails. I spoke directly to CEOs or my relevant contacts to inform them of these odd occurrences. As I onboard clients, I ensure that they understand how to use analytics. They’re analytical thinkers, so we’ve spent significant time reviewing the details collaboratively, guiding them on what to focus on during screen shares. Interestingly, the data I see differs from theirs.
This confirms our doubts about the data’s accuracy. A common question I’ve received is about decision-making, especially when current data seems unreliable. But we always strategize content development with an SEO perspective, planning three to six months ahead, which gives us a clear direction. We’re also leaning on historical reports from before the data discrepancies began to help guide us.
We consistently export our data monthly, which has been incredibly beneficial. For instance, when I analyze August data on GA4, I’m aware that our performance increased from July to August. However, today the GA4 data indicates a decline for the client in August. Within a five-hour span, refreshing the screen showed fluctuating revenue amounts for that month. At one point, the system reported no data available.
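The monthly-export habit Lea describes can be as simple as writing a dated snapshot that later GA4 restatements can be checked against. The metric names and figures below are illustrative, not real client data:

```python
import csv
from datetime import date
from pathlib import Path

def snapshot_metrics(rows: list[dict], out_dir: str = "exports") -> Path:
    """Write this month's metrics to a dated CSV so that later GA4
    restatements can be compared against the originally reported numbers."""
    path = Path(out_dir)
    path.mkdir(exist_ok=True)
    out = path / f"ga4-{date.today():%Y-%m}.csv"
    with out.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["month", "sessions", "revenue"])
        writer.writeheader()
        writer.writerows(rows)
    return out

# Illustrative figures only
file = snapshot_metrics([{"month": "2023-08", "sessions": 41200, "revenue": 96500}])
```

Because the file is stamped with the export date, a revenue number that later “changes” inside GA4 can be diffed against what the platform reported at the time.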
It’s clear clients are seeing different data from what I observe. I’ve communicated to them about the broad issues currently affecting analytics. I’ve provided them with examples and ensured that they receive reports. It’s crucial they understand the industry-wide significance of these discrepancies.
To further educate them, I’ve directed them to content from other trusted sources. I’ve recommended insights from industry experts such as Barry Schwartz from Search Engine Roundtable and other platforms, such as Search Engine Land. This ensures that they know this isn’t an isolated incident.
Reed: Everyone brought up excellent points. My suggestion would be to simplify and align closely with what you can confidently define as truth. For instance, with tracking revenue, perhaps aim to define revenue in terms closest to your CRM. It’s vital to distinguish between ambiguous numbers and those that you know are accurate. While we might label some numbers as ambiguous, they can serve as indicators. But it’s the numbers we’re certain about that should guide our decision-making.
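Reed’s point about anchoring to CRM truth can be sketched as a tolerance check that labels an analytics figure as trusted or ambiguous relative to the CRM number. The 5 percent tolerance is an arbitrary illustration, not a recommended standard:

```python
def reconcile(crm_revenue: float, analytics_revenue: float,
              tolerance: float = 0.05) -> str:
    """Treat CRM revenue as ground truth; classify the analytics
    figure as 'trusted' if within tolerance, else 'ambiguous'."""
    if crm_revenue == 0:
        return "ambiguous"
    drift = abs(analytics_revenue - crm_revenue) / crm_revenue
    return "trusted" if drift <= tolerance else "ambiguous"

reconcile(100_000, 98_500)  # small drift: usable for decisions
reconcile(100_000, 36_000)  # a 64 percent gap: directional at best
```

Figures that come back ambiguous can still serve as directional indicators, as Reed suggests, but decisions rest on the numbers that reconcile.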
Rob: It’s vital to stick with what we know is accurate. I appreciate everyone’s input. This discussion will likely prove valuable for our clients, and we’ll continue to keep them informed. Any concluding thoughts?
Lea: Stay resilient. This isn’t the first time we’ve navigated Google’s roller coaster.
Amanda: Absolutely, we’re all in this together.
Tim: Let’s persevere.
Amanda: United we stand.
Lea: Precisely.