The web is bloated. It’s bloated with bullshit.
And I don’t mean bullshit content; that’s fine, you should be able to freely express your opinion. I mean “bullshit bytes” sent down the wire, so to speak.
Hold your horses, I hear you.
“Have you heard about Nickelback’s new song?”
But that’s a superficial take, one that doesn’t try to understand the real reasons and context.
So here we go, one year later, with many bytes saved and many flashy ads avoided.
Read on before coming to conclusions, and who knows, perhaps you might want to escape the bullshit web too.
You can then whitelist certain domains and individual scripts, e.g. allow all scripts from a web application you trust (such as ProtonMail’s web application)
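In uBlock Origin, for instance, this kind of setup can be expressed as dynamic filtering rules in the “My rules” pane. A sketch, assuming you block all scripts by default and that protonmail.com is the trusted domain (substitute your own; the exact hostname is an assumption here):

```
* * inline-script block
* * 1p-script block
* * 3p-script block
protonmail.com * inline-script noop
protonmail.com * 1p-script noop
```

With rules like these, scripts stay blocked everywhere except on the noop’d domain; third-party scripts stay blocked even there unless you noop them too (e.g. for a CDN the site depends on).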
Will it work 100% of the time, without any issues ever?
100% not. But most of the time, yes, it works without major drawbacks.
The need comes down to a few factors:
More in depth
Or uBlock Origin.
Or through Brave, if you’re brave enough.
Use Cookie AutoDelete.
Pair that with a Pi-hole or AdGuard Home installation on your network (a Raspberry Pi is enough) and smooth sailing awaits.
In both extensions, you can whitelist certain URLs or cookies you “trust”/need.
Save a lot of bandwidth
It saves a lot of bandwidth; by “a lot” I mean around 90%.
For me it’s a game changer: I live in a small rural town, browse the web with a 4G modem, and have limited bandwidth. Plain HTML is more than enough to read articles and browse the web.
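The ~90% figure is easy to sanity-check with back-of-the-envelope numbers (the page weights below are illustrative assumptions, not measurements):

```python
# Illustrative page weights (assumptions): a typical article page
# loaded with JS, ads, and trackers vs. the same page as HTML/CSS only.
full_page_kb = 2000  # ~2 MB with scripts, ads, trackers
html_only_kb = 200   # ~200 KB of markup, styles, a few images

savings = 1 - html_only_kb / full_page_kb
print(f"Bandwidth saved: {savings:.0%}")  # prints "Bandwidth saved: 90%"
```

The exact ratio obviously varies per site, but a 10x difference between a script-heavy page and its bare HTML is not unusual.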
In that case, you can decide if it’s worth enabling JS or bail out.
I don’t have numbers for the amount of time I saved/lost browsing the web with JS disabled.
I might be biased, but whitelisting some common JS (e.g. a CDN) works most of the time and lets substantial portions of most webpages render.
Fewer cookie banners
That’s one of the most satisfying achievements of this experiment.
No JS? No cookie banners.
Fewer googly/creepy trackers (perceived privacy)
Essentially the above. No JS? Less tracking.
Unexpected pro #1 (paywall)
I was reading an article on a popular Italian news magazine, without issues. I shared it on Twitter, and some people complained about the nasty paywall, since they couldn’t read the article in full. The paywall was a JS overlay, so with JS disabled it simply never appeared for me.
Unexpected pro #2 (degradation)
Quality content is served with HTML fallbacks and graceful degradation in mind.
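As an illustration (a hypothetical page, not taken from any specific site), graceful degradation can be as simple as serving the full text as plain HTML and putting a notice for JS-only extras inside a `<noscript>` element, which browsers render only when scripting is unavailable or blocked:

```html
<!-- Hypothetical article page: the content itself is plain HTML,
     so it renders even with all scripts blocked. -->
<article>
  <h1>An article you can read without JavaScript</h1>
  <p>The full text is served as markup, not injected by a script.</p>
  <noscript>
    <!-- Shown only to visitors with scripts disabled or blocked. -->
    <p>Comments are loaded via JavaScript and won’t appear here.</p>
  </noscript>
</article>
```

Pages built this way lose only the extras for no-JS visitors, instead of rendering a blank screen.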
You’ll need to manually whitelist domains you trust and perform some one-time manual actions.
It’s a small price to pay if you want to save bandwidth and browse the web faster.
I hear someone shouting from afar:
“Why don’t you use Lynx at this point?”
“Or cURL while you’re at it?! Haha.”
I get it. It’s not perfect and it can be cumbersome from time to time.