3/1/2023
<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/6625a616-061a-440b-9e8a-1912f121968d/3a2b5cf3-3756-4da3-b022-14f80c926b52/Journal_Icon.png" alt="Journal_Icon.png" width="40px" /> Google kills everything it touches, even its own products; there’s a website dedicated to listing every casualty to date.
</aside>
Google, once the poster child of innovation and ethical business practices, has been experiencing a steady decline in public opinion. It’s due to a series of what I like to call “bad decisions”. The damage became apparent when Alphabet, Google’s parent company, recently announced a 27% drop in net profit, a huge shift for any company, let alone a titan like Google. What’s more striking to me is how anti-consumer Google has become.
But how did everything get to this point? What bad decisions did they make? That’s what I’d like to explore in this “brief” post. From allowing political emails to bypass their own spam filters to abandoning their app store safety checks, Google seems like a different company nowadays. It has fallen from grace in the eyes of its users.
Google announced that Chrome will no longer support adblocking extensions as they work today once Manifest V3 is rolled out sometime in 2023. That date is likely to be delayed, which I’ll touch upon in a bit. Google’s main motivation for implementing this change is to receive more ad revenue, even when it’s at the expense of the user experience. To make matters worse, Google has not provided concrete specifications nor addressed developers’ concerns about the rollout. How can developers prepare for Manifest V3 if Google isn’t sharing the needed information?
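To show what actually changes, here’s a rough sketch (mine, not Google’s; the doubleclick.net filter and the rule ID are placeholders, and the two halves belong to separate extensions) of how an adblocker works under Manifest V2’s webRequest API versus the declarative rules Manifest V3 pushes developers toward:

```typescript
// Manifest V2: a persistent background script registers a blocking listener
// (requires the "webRequest" and "webRequestBlocking" permissions) and can
// run arbitrary logic on every request before deciding its fate.
chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    // Example filter only; real adblockers match against huge, updatable lists.
    const shouldBlock = details.url.includes("doubleclick.net");
    return { cancel: shouldBlock };
  },
  { urls: ["<all_urls>"] },
  ["blocking"] // the "blocking" option is what goes away under Manifest V3
);

// Manifest V3: no persistent background page and no blocking listener.
// The extension can only hand Chrome a capped list of declarative rules
// (declarativeNetRequest) and let the browser do the matching itself.
chrome.declarativeNetRequest.updateDynamicRules({
  addRules: [
    {
      id: 1, // hypothetical rule ID
      priority: 1,
      action: { type: chrome.declarativeNetRequest.RuleActionType.BLOCK },
      condition: {
        urlFilter: "||doubleclick.net^",
        resourceTypes: [chrome.declarativeNetRequest.ResourceType.SCRIPT],
      },
    },
  ],
  removeRuleIds: [],
});
```

Roughly speaking, V3 moves the filtering into the browser itself: rules are declarative, capped in number, and can’t run the extension’s own code per request, which is exactly what large, frequently updated filter lists depend on.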
Manifest V3 gets rid of persistent background scripts in a move Google cited as “security focused.”
“Security focused” sounds good to the layman, right? But nowadays corporations will lie through their teeth if they can justify changes as being about “security” or “for the kids”. It’s a blanket protection, a criticism shield that’s hard to argue against unless you know the field itself. After all, what average person is going to argue against changes that “help the kids”? Many users won’t do the research required to realize the changes often have nothing to do with protecting children; it just makes for easy persuasion.
Red flags go off whenever I see any variation of the reasons above. For example, the UK’s attempt to effectively outlaw end-to-end encryption via the Online Safety Bill (OSB) shows what can go wrong when policymakers either do not understand encryption, or understand it & yet still act in bad faith. Encryption, a baseline security standard at this point, is being scapegoated for “stopping bad people from doing bad things”. To increase the chances of stopping vaguely-defined crimes, the UK is willing to mandate changes that will hurt the majority of users. It’s like a European version of the NSA: spying & data collecting under the guise of helping the public.
From Ars Technica’s “Chrome’s “Manifest V3” plan to limit ad-blocking extensions is delayed”:
Google started this mess in 2018 with a blog post outlining a plan for "Trustworthy Chrome Extensions, by default." As part of the Manifest V3 rollout, Google's official story is that it wanted to reduce "overly-broad access" given to extensions and that a more limited extension platform would "enable more performant" extensions. The fun side-effect of all that is more limited ad blocking, which would conveniently help Google's bottom line. The old timeline would have finally implemented the full Manifest V3 transition six years after this initial blog post, but now it sounds like it will take even longer.

Announced back in 2018, Manifest V3 has been delayed time and time again.
Another decision that contributed to Google’s fall from grace was getting rid of their own app store safety checks, checks Google had already been performing for years. Instead, they copied Apple’s inferior approach of having the developer provide a list of the data they collect. Nothing stops developers from lying. When Google was asked what the punishment for breaking these guidelines would be, they gave a vague “case-by-case” response.
From Wired’s “You Can’t Trust App Developers”:
***“When you land on Twitter’s app page or TikTok’s app page and click on Data Safety, the first thing you see is these companies declaring that they don’t share data with third parties. That’s ridiculous — you immediately know something is off,” says Jen Caltrider, Mozilla’s project lead. “As a privacy researcher, I could tell this information was not going to help people make informed decisions. What’s more, a regular person reading it would most certainly walk away with a false sense of security.”
“If we find that a developer has provided inaccurate information in their Data Safety form and is in violation of the policy, we will require the developer to correct the issue to comply. Apps that aren’t compliant are subject to enforcement actions.”***
The lack of specifics around what qualifies as “inaccurate information” & what those “enforcement actions” actually are doesn’t leave me with much hope. It reads like a spokesperson’s reply, similar to “don’t worry, we’ll take care of it, trust us.” But with changes like these, it’s ironic that you can no longer trust the “data privacy & security” section.

An example of a provided vs. blank section in the new data privacy & security rollout
Google wants increased ad revenue, even when it comes at the cost of the user experience. While writing this, I noticed Google changed how search results are displayed. I recommend reading this TechCrunch article, which explains these deceptive changes better than I can. To summarize: