Possibly related:
I also don’t understand why every chat app needs 1GB of RAM to itself.
Solution: if you only have 4GB ram, nothing can use more than 4GB
I had one stick of 16GB and it was not enough. I was going to get a second stick, but said screw it and got two 32GB (it’s a laptop and only has two slots).
How does that even happen 💀💀 I have 2x8GB, usually have Teams open, Firefox, Telegram, a virtual machine with Windows 10, a few IDEs, and it usually only takes 10-12GB max, mostly because the VM requires a flat 8 gigs.
I dunno but the extra RAM was like a night and day difference.
How does 2 x 32 GB sticks give you 67 GB of RAM? Did you download more RAM?
This is probably down to decimal versus binary unit prefixes. As far as I’m aware, RAM is almost always still sized in powers of two (kibi-, mebi- or gibibytes), unlike more permanent storage, and it often gets labelled with the kilo-, mega- and giga- prefixes regardless.
In other words, if you mix up thousands and 1024s you can get 64×1024×1024×1000 (whoops) which is roughly 67 billion.
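You can sanity-check that mix-up with shell arithmetic; the first line is the “whoops” figure from above, the second is the fully binary one:

```
echo $((64 * 1024 * 1024 * 1000))   # 67108864000, roughly 67 billion
echo $((64 * 1024 * 1024 * 1024))   # 68719476736, i.e. 64 GiB in bytes
```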
This being a laptop, is it possible there’s 4GB soldered plus the 2 DIMM slots? I think I’ve seen something similar on a thinkpad.
Sounds more plausible. Either that or the system is reporting RAM + swap - VRAM reserved memory somehow.
It absolutely will try, it just gets killed by the oom reaper.
Unless you have the vm.overcommit_memory sysctl set to 2, and your overcommit limit set to less than your system memory. Then, when an application requests more memory than you have available, it just gets an error instead of needing to be killed by the OOM killer when it attempts to actually use the memory later.
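Roughly like this, if anyone wants to try it; the ratio is just an example value, not a recommendation:

```
# strict accounting: commit limit = swap + overcommit_ratio% of RAM
sudo sysctl -w vm.overcommit_memory=2
sudo sysctl -w vm.overcommit_ratio=80   # example value, tune for your box
# see the resulting limit and how much is already committed
grep -E 'CommitLimit|Committed_AS' /proc/meminfo
```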
Isn’t there a trade off though?
Yes. Memory allocated, but not written to, still counts toward your limit, unlike in overcommit modes 0 or 1.
The default is to hope that not enough applications actually use the memory they’ve allocated to force the system into OOM. You get more efficient use of memory, but I don’t like this approach.
And as a bonus, if you use overcommit 2, you get access to vm.admin_reserve_kbytes, which allows you to reserve memory only for admin users. Quite nice.
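For example, to keep roughly 128 MiB back for root so you can still get a shell and kill the offender (the size is just a guess at something sensible):

```
sudo sysctl -w vm.admin_reserve_kbytes=131072
```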
I’ve used Linux for years and never in my life have I seen anything crash or get closed by an OOM killer. It’s a myth as far as I’m concerned. Meanwhile, I’m looking at Firefox occupying 6GB of my 8GB of RAM, then opening IntelliJ so RAM fills up completely and swap sits at 3GB.
It only happens when you run out of swap and ram
It’s not a myth at all. If software uses too much RAM, it has to be killed because otherwise the OS crashes. You can read more about it here: https://linux-mm.org/OOM_Killer
Here is the source code: https://codebrowser.dev/linux/linux/mm/oom_kill.c.html
It is just not very well tuned for desktop use, as it will lock up the system and empty every single type of buffer in the kernel before it is actually invoked.
It depends on the app. For some apps it just kills the app and everything is happy
My Firefox has a couple hundred tabs open, one of which had a memory leak. It was getting killed by the OOM killer (on my 64GB of RAM system!) about twice a day. It’s not doing it anymore, though; I must’ve closed the correct tab.
Doesn’t Firefox offload unused tabs by now?
Hey, unused memory is wasted memory
If you got it, flaunt it.
Every chat app might use ~1GB because most of them are Electron apps, which each spawn their own instance of Chromium.
Does it mean 35.1 GB out of the 44.3 GB is actually cached? Then you have quite low actual RAM usage considering you have 67 GB.
Oh good question. Now I’m wondering. 44+35 is bigger than the 67GB I have, but normally I would expect pretty much all the RAM to hold cached data, where some is also marked as free in case a process needs it.
Can someone explain this memory screen, as your question has raised many more for me!
“Cache” means space used for disk caching. It’s free to be used by processes as needed, but until then the system puts otherwise idle RAM to work speeding things up, so it’s technically not “free”, even though it isn’t used by system processes. In Linux, used minus cache gives you the actual consumption by processes.
Thanks, someone else also mentioned this. Cached is considered used in Linux, whereas in Windows it’s considered free, since applications can use it if they need it even though it holds data.
Don’t be confused by cached ram, be confused by the oom killer activating while you have plenty of swap and for some reason it kills the shell you ran Firefox from.
If you want to go on a memory allocation adventure try disabling memory overcommit 🥲
https://www.linuxatemyram.com/
Electron apps are bullshit though.
This site says Linux calls cached RAM “free” but in my screen shot it’s definitely being shown as “used”. I guess this is a choice of this app?
Most likely, try running htop or top (can’t remember which is which) in a terminal.
Well, top currently shows:
MiB Mem : 64076.1 total, 2630.3 free, 51614.1 used, 34046.9 buff/cache
MiB Swap: 4096.0 total, 2.3 free, 4093.7 used. 12462.0 avail Mem
While the “Mission Center” app shows:
Subtract cached and free from total to get actual usage. htop shows it visually, with cache in yellow (or something like that). I think you’re using about 30 GB of RAM.
Honestly, apart from Firefox, what are you running? Does that include VMs? I have 8GiB of RAM (7.1 usable) and it uses like 1.8GB at idle and about 5-6.5GB at my personal highest usage.
Many Linux distros are optimized to use as much available RAM as possible; free RAM is wasted RAM.
Most would still run with a lot less anyway
It’s already been explained elsewhere, but the cache can be free, as needed - that’s how linux works.
There’s 57+ GB available ram, yet.
Yip, got that now. I misunderstood, as it’s different to Windows, which shows cached memory as free since it’s available to apps as needed.
You could probably configure your system monitor to show available memory - that is memory available given that cache can be dropped - rather than free memory that should always be as close to zero as possible.
Well in a turn of events, the stupid photo printing website I was using just kept filling RAM up until it was full then GNOME crashed me back to the login page.
A lot of people in this thread don’t know what they’re talking about. No, used memory does not include cached memory. You can confirm this trivially by running free -m and adding up the numbers (used + cached + free = total). Used memory cannot be reclaimed until the process holding it frees it or dies. Not all cached memory can be reclaimed either, which is why the kernel reports an estimate of available memory. That’s the number that really matters, because aside from some edge cases it’s the number that determines whether you’re out of memory or not.
Anyway, the fact that you can’t run Linux with 16GB is weird and suggests that some software you’re using has a memory leak (a Firefox extension perhaps?). Firefox will use memory if it’s there, but it’s designed to cope with low memory as well; it just unloads tabs quicker, so you have to reload often. There are also extensions that make tab unloading more aggressive, maybe that would help, especially if there’s memory pressure from other processes too.
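If you want to check it on your own machine, free just reads these from /proc/meminfo anyway:

```
free -m
# the same figures straight from the kernel, in kB
grep -E '^(MemTotal|MemFree|MemAvailable|Buffers|Cached)' /proc/meminfo
```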
Yeah, the cache-as-part-of-used-memory theory didn’t stack up. This comment (sorry, Lemmy probably doesn’t handle the link well) showed 54GB in use, 30GB cached, and 13GB available. 54 + 13 ≈ 67GB total, so cached doesn’t seem to be counted as in use, since it should be counted as free (mostly).
In the end, I’m pretty sure it’s a memory hog website. It kept filling up until GNOME crashed and I lost my progress (I was trying to order prints for 1000 photos on a horrible website that made me change settings one photo at a time, and the longer I took the more RAM filled up).
Anyway the fact that you can’t run Linux with 16GB is weird
I mean, it runs fine. It’s more how I’m using it. Firefox 4GB, Element 1GB, Signal 1GB, Beeper 1GB, Steam 2GB, Joplin 1GB. That’s all just open and idle (the chats and Steam don’t even have windows open, just running in the background) and it’s the minimum I would have open at any point. That’s already 10GB. By the time I open a couple of windows in a JetBrains IDE or a particularly demanding website, it’s suddenly suffocating.
I have a memory consumption issue with Ubuntu because I stupidly set the system up with 0 swap. That means under high memory pressure the entire system can suddenly crash.
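If anyone else did the same thing, adding a swap file after the fact only takes a minute (at least on ext4); the 4G size here is just an example:

```
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# make it survive reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```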
To be fair, Windows isn’t a shining beacon either because whenever I attempt something very GPU intensive like running local LLMs the GPU overheats in a split second before the fans have time to spin up and the entire system shuts down.
How does that happen? Shouldn’t the GPU and CPU have thermal throttling so even under intense loads it just slows down to keep temps down?
When I play games on my laptop the integrated graphics are at 100% most of the time but it doesn’t cause the system to crash.
So the system is a gaming laptop, which might explain things. The CPU has liquid metal for cooling and a lower TDP, so it’s fine. The GPU, on the other hand, has a higher TGP and draws something like 120W when run hard. If the GPU fans aren’t already spinning, that quickly overwhelms the GPU thermally.
Weird. If I were going to saturate my GPU, I’d pick an intensive game. Seems odd that a gaming laptop might get overwhelmed and shut off if a game is too intensive? Or is it something special about LLMs that makes them the Achilles’ heel?
I’m not the type to run super intensive games, and even those games have plenty of warm up time in the form of a loading screen.
That being said, I have had instances of my entire system shutting down due to a graphically intensive game, but it’s much rarer than when running a local LLM.
Edit: Found a game that consistently shuts down my system: Ride 5, but only in the menus.