
The web is bloated. It’s bloated with bullshit.

And I don’t mean bullshit content; that’s fine, you should be able to freely express your opinion.

I mean the “bullshit bytes” sent down the wire, so to speak.

In the past I wrote about an experiment I was conducting (that sounds so scientific and proper), namely browsing the web with JavaScript disabled.

Hold your horses, I hear you.

“Have you heard about Nickelback’s new song?”

But that reaction is quite superficial, without trying to understand the real reasons and context.

So here we are, one year later, with many bytes saved and many flashy ads avoided.

Read on before jumping to conclusions, and who knows, perhaps you might want to escape the bullshit web too.

TL;DR

Disable JavaScript. Use NoScript.

You can then whitelist certain domains and individual scripts, e.g. allow all scripts from a web application you trust (say, ProtonMail’s web application).

Will it work 100% of the time, without any issues ever?

100% not. But most of the time, yes, it works without major drawbacks.

The idea is essentially: Disable JavaScript globally, and enable it where you need it.

The need comes down to a few factors:

  • You need it to read an article or access information on a webpage that won’t render without JavaScript

  • You want to use the interactivity that JavaScript provides on a certain web app (read “JavaScript application”)

More in depth

Use NoScript.

Or uBlock Origin.

Or through Brave, if you’re brave enough.

Use Cookie AutoDelete.

Pair that with a Pi-hole/AdGuard installation on your network/Raspberry Pi and smooth sailing awaits.

In both extensions, you can whitelist certain URLs or cookies you “trust”/need.
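If you go the uBlock Origin route, the “disable globally, enable per site” idea can be written out explicitly in its My rules pane with the no-scripting switch (protonmail.com here is just an example of a domain you might trust):

```
no-scripting: * true
no-scripting: protonmail.com false
```

The first rule blocks JavaScript everywhere; the second re-enables it for that one domain.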

Pros

Save a lot of bandwidth

It saves a lot of bandwidth. By “a lot” I mean around 90% of page weight in my case.

For me it’s a game changer, since I live in a small rural town, browse the web with a 4G modem, and have limited bandwidth.

Simply put: Most of the time, you don’t need JavaScript at all.

Not to read articles and browse the web, anyway.
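To get a sense of what a script blocker spares you, one rough way is to list the external script URLs in a page’s HTML; each one is a request that NoScript simply never fires. A minimal Python sketch (the sample page and URLs are invented for illustration):

```python
# List the external <script src> URLs in a page's HTML.
# Each one is a request a script blocker like NoScript never makes.
from html.parser import HTMLParser


class ScriptSrcParser(HTMLParser):
    """Collects the src attribute of every <script> tag it encounters."""

    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            for name, value in attrs:
                if name == "src" and value:
                    self.srcs.append(value)


# An invented sample page, for illustration only.
sample = """<html><head>
<script src="https://cdn.example.com/framework.js"></script>
<script src="/ads/tracker.js"></script>
<script>console.log("inline, already paid for");</script>
</head><body><p>The actual content.</p></body></html>"""

parser = ScriptSrcParser()
parser.feed(sample)
print(parser.srcs)
# → ['https://cdn.example.com/framework.js', '/ads/tracker.js']
```

Run it against any saved page source and the list tends to be sobering.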

If you’re using a JavaScript web app, you’ll most likely encounter a blank page, or the dreaded <noscript> message (“You need JavaScript to run this app”) will appear.

In that case, you can decide if it’s worth enabling JS or bail out.

Faster browsing

I don’t have numbers for the amount of time I saved/lost browsing the web with JS disabled.

I might be biased, but I feel that whitelisting some common JS (e.g. a CDN) works most of the time and lets the browser render substantial portions of most webpages.

No cookie banners

Awww yessss!

That’s one of the most satisfying achievements of this experiment.

No JS? No cookie banners.

Fewer googly/creepy trackers (perceived privacy)

Essentially the above. No JS? Less tracking.

Unexpected pro #1 (paywall)

I was reading an article on a popular Italian news magazine, without issues.

I shared it on Twitter, and some people complained about the nasty paywall, saying they couldn’t read it fully.

They were using JavaScript to remove content for non-payers, not to add it for paid users; with JS disabled, the full article simply rendered.

Unexpected pro #2 (degradation)

Most of the time, browsing with JavaScript disabled, it just works.

Quality content is served with HTML fallbacks and graceful degradation in mind.
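A hypothetical sketch of what that looks like in markup: the content lives in plain HTML, a <noscript> block covers anything that genuinely needs scripts, and JavaScript only enhances the page when it’s allowed to run (the file name is made up):

```html
<article>
  <h1>The actual content</h1>
  <p>The article text is served in the HTML itself, readable without JS.</p>
</article>

<!-- Shown only when scripts are blocked or unavailable -->
<noscript>
  <p>Comments and live widgets are turned off without JavaScript.</p>
</noscript>

<!-- Progressive enhancement on top; blocking it loses nothing essential -->
<script src="/js/enhancements.js" defer></script>
```

Pages built this way are exactly the ones that “just work” with JS off.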

Cons

You’ll need to manually whitelist domains you trust and perform some one-time manual actions.

It’s a small price to pay if you want to save bandwidth and browse the web faster.

Conclusion

I hear someone shouting from afar:

“Why don’t you use Lynx at this point”
“Or cURL while you’re at it?! hahha”

I get it. It’s not perfect and it can be cumbersome from time to time.
