Does the CLI still work? If so, you could download and play all the Windows 7 compatible, DRM-free games in your library just fine. Alternatively, if you already had these games installed, they’ll work fine without launching Steam first.
“Full mirrors of YT”? Yeah…not going to happen.
Sounds cool, I just fail to understand how this takes Cinnamon “out to the real world”.
The feature itself is great. It records the last two hours by default and lets you easily create clips from that, and the editor is right there in the Steam overlay, which is pretty convenient.
I only used it under Linux, and that’s where I’d say it’s still very much a beta experience. I have an AMD Radeon 7800 XT.

Most of the time, Steam picks up on its hardware acceleration; sometimes it doesn’t. When it doesn’t, it falls back to CPU encoding (obviously), which occupies around 3-4 cores on my 7950X3D to record 3440x1440 at the highest quality setting. GPU encodes are H.264 even though the GPU is perfectly capable of encoding AV1. Performance impact ranges from almost zero to as much as 30%, which seems a bit excessive. On some games that have a splash screen (Sea of Thieves, for example), all it will record is said splash screen, even when it’s no longer shown: you get gameplay sounds, but the video is just a static image with mouse cursor artifacts. It didn’t record sound from one of the microphones I tried; after swapping it out for a different one, my voice was recorded fine. And in at least one session, the shortcut for saving a clip just produced an error sound instead of a saved clip.
So it’s a bit disappointing so far. Yeah, Linux shenanigans and a relatively small user base, but Valve of all companies should treat Linux as a first-class platform. Yes, they do a lot for Linux, with Proton and whatnot. But ironically, Steam itself is only in an “okay, it kind of works” state: there are no official packages for anything but apt-based distributions, and Wayland (scaling) support is meh at best.
It did seem to work a lot better on the Steam Deck with very little performance impact in my short testing, so there’s that.
What’s Ullr?
Let me guess without reading: kernel-level anti-cheat?
Sounds about right. There are some valid and good use cases for “AI”, but the majority is just buzzword marketing.
Block it and move on with your life?
The main thing (by far) degrading a battery is charging cycles. After 7 years with, say, 1,500 cycles, most batteries will have degraded far beyond “80%” (which is always just an estimate from the electronics anyway). Yes, you can help a bit by limiting charging rate, heat, and the min/max %, but it’s not going to be a night-and-day difference. After 7 years of daily use, you’re going to want to swap the battery, if not for the capacity loss then for safety reasons.
Technically, wired charging degrades the battery less than wireless charging, mainly because of the excess heat generated by the latter. In the same way, slower wired charging generates less heat than faster charging. Lower and upper charging limits also help (the tighter, the better).
But I personally don’t bother with it. In my experience, battery degradation and longevity mostly come down to the “battery lottery”, comparable to the “silicon lottery” where some CPUs overclock/undervolt better than others. I’ve had phone batteries that were mostly charged with a slow wired charger degrade earlier and further than ones charged almost exclusively wirelessly. No battery is an exact copy of another. Heck, I once had a 2-month-old battery die on me after just ~20 cycles. It happens.
Sure, on average you might get a bit more life out of your batteries, but in my opinion it’s not worth it.
The way I see it with charging limits: sure, your battery might degrade 5% more over the span of 2 years when you always charge it to 100% (all numbers here are just wild estimates and, again, depend on your individual battery). But when you limit charging to 80%, for example, you get 20% less capacity from the get-go. Unless, of course, you know exactly which days you’ll need a 100% charge and plan your charging ahead of time accordingly.
Something I personally could never be bothered with. I want to use my device without having to think about it. If that means having to swap out the battery one year earlier, then so be it.
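To put rough numbers on that trade-off, here’s a minimal back-of-the-envelope sketch in C. Every constant in it (the degradation rates, the 2-year span) is a made-up illustration in the same spirit as the wild estimates above, not measured data.

```c
#include <stdio.h>

int main(void) {
    /* All constants are illustrative guesses, not measurements. */
    const double base_loss      = 0.10; /* assumed capacity loss over 2 years either way */
    const double extra_loss_100 = 0.05; /* assumed extra wear from always charging to 100% */
    const double charge_limit   = 0.80; /* the cap being considered */

    /* Scenario A: always charge to 100% -> more wear, but a full tank every day. */
    double usable_full = (1.0 - base_loss - extra_loss_100) * 1.0;

    /* Scenario B: cap charging at 80% -> less wear, but never a full tank. */
    double usable_capped = (1.0 - base_loss) * charge_limit;

    printf("After 2 years, charging to 100%%: %.0f%% usable per charge\n",
           usable_full * 100.0);   /* prints 85% */
    printf("After 2 years, capped at 80%%:    %.0f%% usable per charge\n",
           usable_capped * 100.0); /* prints 72% */
    return 0;
}
```

Under these (made-up) numbers, the capped battery is healthier on paper but gives you less charge to work with every single day, which is exactly the trade-off described above. Over a long enough span the lines might eventually cross; the point is just that the capacity cost is paid immediately.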
The article links to a piece from March ’24 about the introduction of these devices, which contains this part:
The scanner that Adams and police officials introduced during Thursday’s news conference in a lower Manhattan station came from Evolv, a publicly traded company that has been accused of doctoring the results of software testing to make its scanners appear more effective than they are.
So they could never be trusted but were still allowed to proceed.
Technically no, but if you want to install apps from the App Store, then yes.
Happy cake day!
I’m no expert here, but I’m pretty sure branch prediction logic is not part of the instruction set, so I don’t see how RISC alone would “fix” these types of issues.
I think you have to go back 20-30 years to get CPUs without branch prediction logic. And VSCodium is quite the resource hog (as is the modern web), so good luck with that.
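For what it’s worth, a quick way to see that branch prediction lives in the microarchitecture rather than the instruction set is the classic sorted-vs-unsorted benchmark. This is a hedged sketch, not anything from the article: the 128 threshold and iteration counts are arbitrary, and the same source compiles for x86 and ARM alike, with the branch costing whatever that particular CPU’s predictor makes it cost.

```c
/* Compile with e.g. `cc -O1 demo.c` -- at higher optimization levels the
   compiler may turn the branch into branchless code and hide the effect. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20) /* ~1M values */

static int cmp(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

static void bench(const char *label, const int *data) {
    clock_t start = clock();
    long long sum = 0;
    for (int pass = 0; pass < 100; pass++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 128) /* the branch the predictor must guess */
                sum += data[i];
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("%s: %.2fs (sum=%lld)\n", label, secs, sum);
}

int main(void) {
    static int data[N];
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    bench("unsorted (branch is a coin flip) ", data);
    qsort(data, N, sizeof data[0], cmp);
    bench("sorted   (branch is predictable) ", data);
    return 0;
}
```

On typical hardware the sorted run is several times faster, despite executing the exact same instructions, because the predictor can guess the now-monotonic branch. Nothing in the ISA, RISC or otherwise, dictates that behavior.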
I always hear power efficiency cited as something ARM chips are magically better at, but the Ryzen AI 300 and Intel Core Ultra 200V series seem very competitive with Qualcomm’s offering. It’s hard to compare 1:1, as the same chip in different laptops can be configured very differently in terms of TDP and power curves, and the efficiency “sweet spots” aren’t the same across these different chips. The Core Ultra 200V also awaits more thorough testing, but it seems to be right up there with the Snapdragon.
I honestly found the Snapdragon X very underwhelming after all that marketing about how much better it was than Apple’s M3 and Intel’s and AMD’s offerings. By the time the Snapdragon was actually available in end-user products, AMD’s and Intel’s competing generations were right around the corner, and we’d also seen a vastly improved M4 chip (although only in an iPad so far, so meh). Add to that the issues you’ll encounter because, while Windows’ x86-to-ARM translation layer has certainly improved, it’s nowhere near as seamless as what Apple did.
To me it felt like previous Windows on ARM attempts: promised a lot, released with problems (mainly compatibility this time), then quickly forgotten because x86 chips caught up anyway.
See you in 2-3 years!
Nvidia might be selling the shovels to the customer during this gold rush, but TSMC is making them.
Yes! Can I haz pancakes too?
Pretty calm here for a “panic”.
I have several components in my network that are at least 6 years old. Is that a problem…?