Very interesting points. Browsing webpages needs gigabytes of RAM today.
I do not know how software development got to be so irresponsible...
I do not remember who it was who intentionally gave slow computers to their
programmers to make sure they wrote efficient code. No such thinking today.
I think it is a market-driven problem: components are cheap, RAM is cheap,
and all the trouble stems from that. I will stop ranting about this now.

I think there are also some bad design choices on the software side, like
relying on other components that each bring their own latency, memory needs,
and problems into any one large software framework (say, Open/LibreOffice).
The dreaded dbus comes to mind. Also, try running two separate Firefox
instances at once under the same UID.

But there are ideas. Controlling resources is a thing, and I think that
"containerizing" execution may help here: appropriate resources can be
allocated per process, with caps on CPU time, memory, I/O, and so on. I do
not know the details off the top of my head, but if there is an OS that
should do this well for you, Linux is its name. Does anyone have a solution
of this kind to offer, so I do not have to do endless browsing for it? Very
interested.
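As far as I understand it, the machinery underneath most container tooling is
cgroups, and with cgroup v2 you can poke it directly through /sys/fs/cgroup
(on a systemd box, something like systemd-run --scope -p MemoryMax=512M
-p CPUQuota=50% firefox should get a similar effect from the command line).
Below is only a rough sketch of the direct approach, assuming cgroup v2 is
mounted at the usual path, you have root, and the cpu and memory controllers
are enabled in the parent's cgroup.subtree_control; the group name "leash"
and the numbers are made-up examples, not recommendations:

    #!/usr/bin/env python3
    # Rough sketch: cap memory and CPU for one running process via cgroup v2.
    # Assumes /sys/fs/cgroup is the cgroup v2 mount, we run as root, and the
    # cpu/memory controllers are enabled in the parent's cgroup.subtree_control.
    # The group name "leash" and the limit values are arbitrary examples.
    import sys
    from pathlib import Path

    CGROUP_ROOT = Path("/sys/fs/cgroup")

    def confine(pid: int, name: str = "leash",
                mem_bytes: int = 512 * 1024 * 1024,
                cpu_quota_us: int = 50_000,
                cpu_period_us: int = 100_000) -> None:
        group = CGROUP_ROOT / name
        group.mkdir(exist_ok=True)
        # memory.max is a hard ceiling; the kernel reclaims, then OOM-kills,
        # anything in the group that tries to go past it.
        (group / "memory.max").write_text(str(mem_bytes))
        # cpu.max is "<quota> <period>" in microseconds;
        # 50000 out of 100000 is roughly half a core.
        (group / "cpu.max").write_text(f"{cpu_quota_us} {cpu_period_us}")
        # Writing the PID into cgroup.procs moves the process (and its future
        # children) under these caps.
        (group / "cgroup.procs").write_text(str(pid))

    if __name__ == "__main__":
        confine(int(sys.argv[1]))

You would run it as root with the PID of the offending browser, but again,
this is just the idea as I understand it; I am very interested in what people
actually use.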

It is hard to force open-source developers to do you the favour of making
their software lean and robust beyond what their test suite covers.
The standard response is: "here is the code, fix what you do not like".