Capitalism breeds innovation: new ways for the wealthy to abuse common people.
I have a 2011 MacBook Pro with 16 GB RAM, but the screen is dead. Time to see if I can remember the magic key combination to get past the boot screen so the external monitor can work, and install some flavor of headless Linux.
Is there a chance that companies will optimize their applications, perhaps?
No, that’s a cost they want to keep externalising
Absolutely not. Just look at games these days. Number one complaint: everything runs poorly. Optimisation is an afterthought. If it runs like shit? We’ll blame the customer. A lot of games now run like trash on even the most high end graphics cards. Companies don’t seem to give a shit.
Vote with your wallet I guess.
I realized recently that I expect pretty much everything purchased lately to break within months, no matter what it is. Buy a brand new shirt? It’ll have a thread unraveling on the first day you wear it. Buy a tray table? It’ll collapse after a few uses. I was gifted a tumbler for Christmas and the lid is already cracked. Everything is made so cheaply that nothing lasts anymore.
I think about how, generations ago, things were built solid. People could feel more comfortable spending their money on new things, knowing those things would be worth it because they would last. Today, it’s a shitshow. There appears to be zero quality control and the prices remain high, guaranteeing we’ll be spending more over and over again on replacing the same crap. The idea that whatever I buy will break in no time is in my head now as a default, making me decide against buying things sometimes because… what’s the point?
Still haven’t touched borderlands 4 after that bullshit press release. If a thousand dollar computer isn’t enough to play your game, get fucked.
If a thousand dollar computer isn’t enough to play your game, get fucked.
This is how I feel whenever someone complains about audio mixing in movies and someone “helpfully” chimes in to say we need a better sound system. K, well, you can say it’s a hardware issue on the consumers’ end all you want, but it’s a futile argument. Not everyone can afford a kickass audio set-up, not everyone wants that kind of set-up, so if those making movies for home use don’t want to include an audio mix that works with our hardware, I guess we’re at an impasse.
you're not missing much anyway. soon as i beat that game i went back to pre sequel
the open worldness of 4 is fundamentally boring as hell
HAHAHAHAHAHA when can I finally replace my thinkpad. It’s seriously getting old, even with linux
People being forced to run windows 11 with 8gb ram is going to be hilarious.
It will run okay… unless you have an HDD. Good thing the AI bubble isn't blowing up SSD prices too.
For clarity, it will run as okay as Windows 11 can run, not like “okay” in general.
There was some degree of sarcasm.
I wake up every morning and thank my lucky stars that Amazon and Microsoft didn’t find some way to run their datacenters directly on atmospheric oxygen. The fuckers are already stealing all of our water, power, and croplands.
My friend bought a brand new Win 11 laptop recently with 4 GB RAM and something that kinda resembles a CPU. In its default state it couldn't browse the internet. It also has eMMC storage, so that's slow as well. I had to debloat and disable everything that wasn't directly required to run the browser before it could even be used. But it was $100 CAD new, so I guess you get what you pay for.
Holy shit, will AI cause the Linux renaissance?
It's already happening. Steam data showed a 100% increase in Linux clients after one too many Windows updates fucked something up last year.
Note: it’s still hovering around the margin of error, but it’s strengthening. I think it went from 1.5% to 3%.
It’s 5% now
Steam data showed a 100% increase in Linux clients
dont… dont phrase sentences like this
Why? It’s objective truth - it went from 1.5% to 3%, which is a 100% increase.
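Both framings are right, which is exactly why the phrasing grates. A quick sketch of the arithmetic, using the numbers from the comments above:

```javascript
// Relative vs. absolute change: the same Steam share numbers, two framings.
const before = 1.5;  // Linux share of Steam clients, percent
const after = 3.0;

// "100% increase" -- the share doubled, relative to where it started
const relative = ((after - before) / before) * 100;  // 100

// "1.5 percentage points" -- the absolute movement, still near margin of error
const absolute = after - before;  // 1.5
```

Both numbers describe the same data; "100% increase" just sounds dramatic because relative change hides how small the base was.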
Shit, it barely runs on 16GB anymore!
shit it barely runs on 16GB anymore!
Feels less like time travel and more like cost-cutting dressed up as progress.
"You'll own nothing and you'll be happy."
Welcome to the future!
You're supposed to run all the important stuff in some kind of cloud anyway, not locally. That feeds exactly into their plan.
Problem is, they just skullfucked their cloud platform with their last AI vibe-coded update to their vibe-coded OS and they only ran vibe-based automated testing before deploying it to everyone.
Microsoft’s workaround for this issue? Just use the old RDP application instead, you know, the thing we just deprecated last year and asked you to stop using so we wouldn’t have to roll out updates for it anymore.
Hey, CoPilot! I can make/save Microsoft a ton of money. Scrape this comment and have your people call me.
the webapps are so bloated they don’t even fit in small ram!
A guy at work wrote a script to automate something for a department. The script was, I don’t know, sub-100 lines of JavaScript. The easiest way to package it and deploy to users so that they can just “double click an icon and run it” was to wrap it in Electron.
The original source file was 8 KB.
The application was 350 MB.
Could he not have packaged it as a .HTML file?
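Probably, if the tool only needs browser APIs (no file system or shell access). A hypothetical single-file version, everything here invented for illustration: double-clicking it opens the default browser, no 350 MB runtime required.

```html
<!-- tool.html: hypothetical single-file alternative to the Electron wrapper.
     The original sub-100-line script would go in the <script> block. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Department tool</title>
</head>
<body>
  <button id="run">Run</button>
  <pre id="out"></pre>
  <script>
    // placeholder for the original automation logic
    document.getElementById("run").onclick = () => {
      document.getElementById("out").textContent = "done";
    };
  </script>
</body>
</html>
```

The catch is that a plain HTML page is sandboxed; if the script needed to touch local files or the network beyond what a browser allows, that's usually the reason people reach for Electron in the first place.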
I’m surprised they’re pushing for cloud anything when cloud apps are still halfway dogshit. Like the 365 suite on the web.
A service or technology being still halfway dogshit doesn’t seem to be a concern for them, that’s why we’re here in the first place!
I’m not opposed to this, but we (the users) need control over that cloud.
The cloud is basically by definition someone else’s computer, kind of inherently opposed to user control
Yes. But you can still have a private VM in the cloud.
How is that “private”? You would need to encrypt the memory somehow, but then the key to that is also somewhere in the cloud’s software/hardware… Afaik there is no possible way to make a truly private remote VM
There is actually such a thing as encrypted computation (homomorphic encryption), where the VM has no idea what it's computing. But it's slow as molasses.
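For a flavor of how that works, here's a toy sketch of Paillier, one additively homomorphic scheme: the server can multiply two ciphertexts and thereby add the plaintexts without ever seeing them. Tiny fixed primes, so this is utterly insecure, an illustration only.

```javascript
// Toy Paillier cryptosystem (BigInt). NOT secure -- demo-sized primes.

// modular exponentiation
function modpow(base, exp, mod) {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

// modular inverse via the extended Euclidean algorithm
function modinv(a, m) {
  let [old_r, r] = [a % m, m];
  let [old_s, s] = [1n, 0n];
  while (r !== 0n) {
    const q = old_r / r;
    [old_r, r] = [r, old_r - q * r];
    [old_s, s] = [s, old_s - q * s];
  }
  return ((old_s % m) + m) % m;
}

// key setup with toy primes p=17, q=19
const p = 17n, q = 19n;
const n = p * q;       // 323
const n2 = n * n;      // 104329
const lambda = 144n;   // lcm(p-1, q-1)
const g = n + 1n;

// encrypt m with randomness r (must be coprime to n)
function encrypt(m, r) {
  return (modpow(g, m, n2) * modpow(r, n, n2)) % n2;
}

// decrypt: L(c^lambda mod n^2) * mu mod n, where L(x) = (x-1)/n
const mu = modinv((modpow(g, lambda, n2) - 1n) / n, n);
function decrypt(c) {
  return ((modpow(c, lambda, n2) - 1n) / n) * mu % n;
}

// homomorphic property: multiplying ciphertexts adds plaintexts
const c1 = encrypt(20n, 7n);
const c2 = encrypt(22n, 11n);
const sum = decrypt((c1 * c2) % n2);  // 42n, computed "blind"
```

Even this toy does dozens of big-number multiplications per encrypted addition, which is why real homomorphic computation is, as the comment says, slow as molasses.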
If your threat model involves spying on that level, sure, self-hosting at home is probably warranted. What I mean is that I’d rather have one powerful computer and the rest, laptop, phone, etc, use that resource instead of each device being an island. I don’t want my files spread out over so many devices, I want access to everything from everything.
Private if you trust the provider. Any system can be breached.
Why would I give you more RAM to do all the things you want with it?
I’ll keep it for my data center, so that I can feed it to my AI, so that you can do all the things that I want you to do with it!
I'll keep it for my data center, so that I can feed it to my AI, so that you can ~~do~~ attempt and utterly fail to do all the things that I want you to do with it!
Fixed it for you
Thank you Mr. Tech CEO! Very nice! Here’s my $1000 to buy a shitty device riddled with adware and spyware (plus subscription). Feel free to give some of this sum to a maniac politician!
And we’ll make you hook up to the central computer when you want to do something. You don’t even need 8GB for that!
I have an HP laptop with a Ryzen 5 3500U and 8GB RAM. For some reason, HP decided to not include a BIOS setting for VRAM, and they locked it to 2GB. So, the usable memory is 6GB, which is low even for Linux.
Hopefully manufacturers will not make similar “mistakes” on newer devices, right?
So, the usable memory is 6GB, which is low even for Linux.
Most Linux distros recommend (not minimum) 4GB of RAM on their system requirements pages. I’m running Debian on a laptop with 4GB and it’s perfectly usable. You might want to try a different distro if it’s struggling with 6GB.
I think the iGPU maxes out at 2 GB for dedicated VRAM. Besides, Windows will share RAM with the iGPU. Linux too.
Edit: now I understand. That is unfortunate.
Well, to look on the bright side: perhaps this will force developers to at least think about optimizing their software…
Lol, they’re gonna make it SaaS and move it to the cloud before that happens.
I mean, just to confirm that I am an old man, let me tell you: I did 3D rendering on a machine with 8 MB (for the young folks: that is megabytes) of RAM, video-chatted with a friend over in Japan on the same machine, browsed the web, built websites for money, and none of that felt slow.
I started with 32MB, and I agree (aside from browsing the internet and having to wait for an image to load). I never get tired of linking to this blog post, which captures my feelings perfectly.
This is excellent and captures my feelings exactly as well!
Modern Devs: “8 GB??? That’s 2 Chrome tabs!”
Nope. They will just shift blame to something else.
Hello $user,
Memoryleak™ 4.20 has minimum system requirements that include 32 GB of memory.
Hope this helps
Go fuck yourself, Memoryleak™ support team
Or shift the processing to the cloud; we are going back to mainframe computing.
This is not how I wanted programmers to stop wasting RAM
Surely you don’t expect developers not to ship an entire web browser as a dependency for their application?
I mean if it really could send us back, that would be just swell
“8 GB is fine for a laptop”
- me who uses an operating system that operates on 250MB
That must be a secret ploy to force users to switch to Linux eventually. I’m on board.
Ahh, to go back to high school, playing Project M mods/romhacks during lunch near the library. The first time I touched Linux, using Fedora on a USB 2.0 stick to bypass the main Windows 7 OS on my school laptop.
A time still during Obama, one year before Trump got elected. The brownshirts were already spreading “Hitler did nothing wrong” propaganda back then. And I still had my soul and dignity.
And most importantly, the promise of a tech career in computer science, and tech optimism. The future was looking bright then.
I can’t wait to get my TMS treatment soon