
Agreed! Got one from work, and it's a beast on Fedora 36 with the 11th gen. Even the discrete-ish Iris Xe graphics are surprisingly fast. So cool that we'll actually be able to update the innards in a few years as necessary to keep it feeling fresh.

Edit: A small but nice design feature is the light that comes on to indicate whether the USB-C port is charging properly. Coming from a Mac, which dropped this indicator when USB-C charging was introduced, I find it a huge luxury.



I just played through Mass Effect 2 & 3 on mine (Intel i5) with no problems, using Wine. When I do upgrade my main board, I may get the i7 for better graphics performance.

Another huge plus is setting a battery charge limit with a console command (1). When I’m connected to power at home, I run `ectool fwchargelimit 60` to keep the battery at 60%. If I’m going out, I set it to 100% in the morning and let it charge.

1: https://community.frame.work/t/exploring-the-embedded-contro...


On my thinkpad, I've found it most useful to additionally set a start charge threshold much lower than the charge limit. My use case only sees a handful of minutes without charger per day. With charging starting at 40% and ending at 80%, the battery gets charged sometimes as seldom as once per week.
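For reference, a minimal sketch of setting both thresholds through sysfs. The `charge_control_*` attribute names are assumed from the kernel's power_supply class ABI and can vary by driver; `BAT` defaults to a scratch directory so the sketch runs without root (point it at the real node, e.g. `BAT=/sys/class/power_supply/BAT0`, to apply it for real):

```shell
#!/bin/sh
# Sketch: ThinkPad-style start/stop charge thresholds via sysfs.
# Attribute names assumed from the power_supply class ABI; BAT defaults
# to a scratch directory so this is runnable without root.
BAT="${BAT:-$(mktemp -d)/BAT0}"
mkdir -p "$BAT"
echo 40 > "$BAT/charge_control_start_threshold"   # resume charging below 40%
echo 80 > "$BAT/charge_control_end_threshold"     # stop charging at 80%
cat "$BAT/charge_control_start_threshold" "$BAT/charge_control_end_threshold"
```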


The other way to look at it with regard to the Framework is to just not worry about it. Replacing the battery is trivial and only $60: https://frame.work/products/battery

Once the ecosystem picks up speed and there are multiple vendors, perhaps the price or even capacity will improve. Although this may be possible with the thinkpad as well.


I’m curious: why do you set the max battery percentage?


This doesn't seem useful based on what I know about LiIon batteries: they don't benefit from partial charges, since their capacity loss comes from discharge behaviour.


Source: https://batteryuniversity.com/article/bu-808-how-to-prolong-...

> Exposing the battery to high temperature and dwelling in a full state-of-charge for an extended time can be more stressful than cycling.

> Most Li-ions charge to 4.20V/cell, and every reduction in peak charge voltage of 0.10V/cell is said to double the cycle life. For example, a lithium-ion cell charged to 4.20V/cell typically delivers 300–500 cycles. If charged to only 4.10V/cell, the life can be prolonged to 600–1,000 cycles; 4.0V/cell should deliver 1,200–2,000 and 3.90V/cell should provide 2,400–4,000 cycles.

> On the negative side, a lower peak charge voltage reduces the capacity the battery stores. As a simple guideline, every 70mV reduction in charge voltage lowers the overall capacity by 10 percent. Applying the peak charge voltage on a subsequent charge will restore the full capacity.
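Plugging those quoted rules of thumb into numbers, as a sketch (the 400-cycle baseline at 4.20V/cell is my assumption, taken as the midpoint of the quoted 300–500 range):

```python
# Rules of thumb from the Battery University quote above:
#  - cycle life roughly doubles per 0.10 V drop in peak charge voltage
#  - usable capacity falls ~10% per 70 mV drop
# Baseline of 400 cycles at 4.20 V/cell is an assumed midpoint of the
# quoted 300-500 range, not a figure from the article.
def cycle_life(peak_v, baseline=400.0):
    return baseline * 2 ** ((4.20 - peak_v) / 0.10)

def relative_capacity(peak_v):
    return 1.0 - 0.10 * (4.20 - peak_v) / 0.070

for v in (4.20, 4.10, 4.00, 3.90):
    print(f"{v:.2f} V/cell: ~{cycle_life(v):.0f} cycles, "
          f"{relative_capacity(v):.0%} capacity")
```

This lines up with the quoted ranges: 4.10V/cell lands at ~800 cycles (quoted 600–1,000) at roughly 86% of full capacity.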


LiIon batteries apparently benefit a lot from never sitting at 100% or 0%.

I think there's a reason car manufacturers over-provision (i.e. the better-designed electric vehicles never charge to a true 100%).


>Even the discrete-ish Iris Xe graphics are surprisingly fast. So cool that we'll actually be able to update the innards in a few years as necessary to keep it feeling fresh.

Hasn't Intel graphics always been the best bet for Linux, thanks to their excellent driver support? I'm excited for their discrete GPUs just for the sake of proper Linux support.

I have a 15W Haswell machine in the corner decoding and encoding multiple HD camera feeds from motion on the integrated GPU using intel_vaapi, while the CPU stays free for postgres, redis, and a qemu VM, 24/7.
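For anyone curious what that kind of offload looks like, here is a dry-run sketch of a VA-API transcode with ffmpeg. The device node, input URL, codec, and bitrate are all my assumptions, not details from the setup above; `FFMPEG` defaults to `echo` so the sketch only prints the command (set `FFMPEG=ffmpeg` to actually run it):

```shell
#!/bin/sh
# Dry-run sketch of a hardware-offloaded transcode via VA-API.
# Device node, stream URL, codec, and bitrate are hypothetical.
FFMPEG="${FFMPEG:-echo}"   # set FFMPEG=ffmpeg to actually transcode
$FFMPEG -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
        -hwaccel_output_format vaapi \
        -i rtsp://camera.local/stream \
        -c:v h264_vaapi -b:v 4M camera.mp4
```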


Intel GPUs have been the best bet for Linux laptops for a long time (over 10 years), but for the last two or three, AMD has been just as good. Just avoid dual GPU laptops ("Optimus" or whatever), it's very problematic on Linux and somewhat problematic even on Windows.


I do have one of those early Intel/AMD 6000-series dual-GPU laptops. I remember that setting up the GPU drivers for it (the AMD GPU) used to be troublesome, but nowadays even Ubuntu sets it up by default, and the devices can be switched with just DRI_PRIME.

But of course its performance is awful, and the Intel GPU is better for most tasks; newer AMD GPUs and open-source drivers are likely much better, as you say.
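For the curious, PRIME offload selection is just an environment variable; a sketch (the `glxinfo` tool is from mesa-utils, and the renderer strings it prints are driver-dependent):

```shell
#!/bin/sh
# Sketch of Mesa PRIME GPU offload selection; needs glxinfo (mesa-utils).
command -v glxinfo >/dev/null || { echo "glxinfo not installed"; exit 0; }
glxinfo -B | grep 'OpenGL renderer'                # default (usually iGPU)
DRI_PRIME=1 glxinfo -B | grep 'OpenGL renderer'   # offload to the second GPU
```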


> Even the discrete-ish Iris Xe graphics are surprisingly fast

Anyone have any idea how well this stacks up against RDNA2? I'd love it to be close enough not to have to worry about it, but from what I hear AMD has it significantly better.



