What’s amazing is that in the past I’ve felt the need to upgrade within a few years.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps the architecture of these new Mac chips is just so damn good.
Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
Apple's M1 came at a really interesting point. Intel was still dominating the Windows laptop game, but generational improvements felt pretty lame: a whole lot of money for mediocre performance gains, high heat output, and unimpressive battery life. The laptop ecosystem changed rapidly as not only did the Apple M1 arrive, but AMD also started to gain real prominence in the laptop market after hitting it big in the desktop and data center CPU markets. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile in the meantime; their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
> Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows, of course, but this is a much more modern machine than my MacBook Air (M1), and at times it feels almost 10 years old in comparison, despite being 3-4 years newer.
It's true that Linux may be a bit better in some cases, if you have a system with good Linux support, but I think in most cases it shouldn't make a very substantial difference. On some of the newer Intel laptops there are still missing power management features anyway, so it's hard to compare.
That said, Intel has still yet to catch up to AMD on efficiency, unfortunately; they've improved generationally, but if you look at power-efficiency benchmarks of Intel vs AMD CPUs, AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can confirm that these devices are rarely good showcases for the chipsets inside them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, but I wouldn't compare a MacBook (even a MacBook Air), a laptop, with a Surface Pro, a 2-in-1 device. Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th-gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
I changed the wording to be "booting directly" to clarify that I'm not including VMs. If I have to explain why that matters I guess I can, but I am pretty sure you know.
If the roles were reversed would you still need an explanation? e.g. If I could run macOS inside of a VM on Windows and run things like Final Cut and XCode with sufficient performance, would you think there's no benefit to being able to boot macOS natively?
Booting natively means you need real drivers, which don't exist for Windows on Mac as well as for macOS on PC. It'd be useless. Just use the VM, it's good.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
The question was a hypothetical. What if the macOS VM was perfect? If it was perfect, would it then not matter if you couldn't just boot into macOS?
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability to play any random Windows game in general. I have a Parallels and a VMware Fusion license (well... had, anyway), and I'm a long-time (20 years) Linux user; I promise I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
I agree. I would like to be able to use any hardware to its full potential with any OS, even if the OS is running as a VM inside another OS. That's more difficult to pull off, since both OSes then have to run at once. So at least let me install the OS I want directly on the hardware and legally run any other OS in a VM with as much performance as possible.
There is nothing stopping you, technically or legally, from replacing the OS on a Mac. Apple went out of their way to make it possible (compared to devices with Qualcomm chips, for example) and the platform is reasonably compatible with PC.
The point of this whole thing is that, practically speaking, it matters to the person deciding to buy a computer whether they can feasibly install their OS of choice on it. It stands to reason, then, that a downside of buying a Mac is that you cannot practically run Windows natively on a modern Mac. In practice, it does not matter whose fault this is.
Aside: really, it's a combination of factors. First, Apple uses a bespoke boot chain, interrupt controller, etc., instead of UEFI and the ARM SystemReady standards followed by virtually all other desktop- and server-class ARM machines, and didn't bother with any interoperability. The boot process is designed purely to boot XNU, with tiny escape hatches making it slightly easier to jam another payload into it. On the other hand, by pure coincidence, Windows has apparently statically linked the HAL since Windows 10 version 2004, making a straight port impossible. In any case, Apple Silicon computers are designed to boot macOS, and "went out of their way to make it possible" is an absurd overstatement of what Apple did, which was the absolute minimum to make it possible without doing anything to make it strictly impossible. Going out of their way implies an actual effort; officially, as far as I know, Apple has only ever acknowledged virtual machines.
I think it would be fair to argue that the reverse is true, too: If you choose to buy a PC, you will be stuck with Windows, or an alternative PC operating system. (Of course, usually a Linux distribution, but sometimes a *BSD, or maybe Illumos. Or hell, perhaps Haiku.) That said, objectively speaking Windows has more marketshare and a larger ecosystem, for better or worse, so the number of people who strictly need and strictly want Windows is going to naturally be higher than the comparative numbers for macOS. This doesn't imply one is better than the other, but it still matters if you're talking about what laptop to buy.
> the platform is reasonably compatible with PC.
Not sure what you mean here. The Apple Silicon platform has basically nothing in common with the x64 PC. I guess it has a PCI Express bus, but even that is not attached the same way as in a typical x64 PC.
The Apple Silicon platform is actually substantially similar to the iOS platform.
> compared to devices with Qualcomm chips, for example
Also not sure what this is meant to mean, but with the Snapdragon X Elite platform, Qualcomm engineers have been working on upstream Linux support for a while now. In contrast I don't think Apple has contributed or even publicly acknowledged Asahi Linux or any of the Linux porting efforts to Apple Silicon.
If we're mostly concerned about CPU grunt, it's really hard to question the Ryzen 7040, which, like the M1, is not the newest-generation chip, though it is newer than the M1 by a couple of years. Comparing an M1 MacBook Pro with a Framework 16 on Geekbench, both of these CPUs perform well enough that most users will not need to be concerned at all about compute power. Newer CPUs do better, but it'd be hard to notice day-to-day.
As for other laptop features... that'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for the sake of argument, I think the strongest options have generally been from ASUS. Right now the Zephyrus G16 has been reviewing pretty well, with people mostly just complaining that it's too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
Ultimately it'll be subjective, but the fans don't really spin up on my Framework 16 unless I push things; running a game or compiling on all cores for a while will do the trick. Exact battery life, thermals, and noise are heavily dependent on the laptop: the TDPs of modern laptop CPUs are mostly comparable, so thermals and noise come down largely to thermal design, and battery life to things other than the CPU.
Our ThinkPads are definitely hotter; the fans spin up routinely on the Ryzens but never do on the M1. Battery life is infinitely better on the M1.
The other thing I hate about the ThinkPads is that the build/screen/trackpad quality sucks in comparison to the Apple stuff. And for all the griping about macOS on this site, Windows is way worse; you can tell MS's focus is on Linux in the cloud these days. All the ancillary stuff Apple is good at is underappreciated.
I only do coding & browsing so maybe I'm a weak example but I find this even with my pretty old Intel laptops these days.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
Depends. I used to offload almost all compilation tasks, but now I only really do this if it's especially large. If I want to update my NixOS configuration I don't bother offloading it anymore. (NixOS isn't exactly Gentoo or anything, but I do have some overrides that necessitate a decent amount of compilation, mainly dogfooding my merge requests before they get merged/released.)
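For anyone curious what that offload setup looks like, here's a minimal sketch of a NixOS distributed-build configuration; the host name, SSH user, and job counts are placeholders, not anything from my actual setup:

```nix
# configuration.nix excerpt: offload builds to a beefier remote machine.
# "builder.example.com" and "nixbuilder" are placeholders; the remote
# host needs Nix installed and the local root user's SSH key authorized.
{
  nix.distributedBuilds = true;
  nix.buildMachines = [{
    hostName = "builder.example.com";
    sshUser = "nixbuilder";
    system = "x86_64-linux";
    maxJobs = 8;          # parallel derivations on the builder
    speedFactor = 2;      # prefer this machine over slower ones
    supportedFeatures = [ "big-parallel" ];
  }];
  # Let the builder fetch from binary caches itself instead of
  # round-tripping substitutes through the laptop.
  nix.extraOptions = ''
    builders-use-substitutes = true
  '';
}
```

A one-off rebuild can also be pushed to a remote builder with `nixos-rebuild switch --build-host nixbuilder@builder.example.com`.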
> Laptops in general are just better than they used to be, with modern CPUs and NVMe disks.
I've had my XPS 13 since 2016. Really the only fault I find with it nowadays is that 8GB of RAM is not sufficient to run IntelliJ anymore (hell, sometimes it even bogs down my 16GB MBP).
Now, I've also built an absolute beast of a workstation with a 7800x3d, 64gb ram, 24 gb vram and a fast ssd. Is it faster than both? Yeah. Is my old xps slow enough to annoy me? Not really. Youtube has been sluggish to load / render here lately but I think that's much more that google is making changes to make firefox / ublock a worse experience than any fault of the laptop.
Regarding Youtube, Google is also waging a silent war against Invidious. It's to the point that even running helper scripts to trick Youtube isn't enough (yet). I can't imagine battling active and clever adversaries speeds up Youtube page loads as it runs through its myriad checks that block Invidious.
I am currently on Intel TGL and can't wait for Strix Halo next year. That is truly something else; iGPU-wise, it's nothing we have seen in notebooks before.
I've had a couple of Tiger Lake laptops, a Thinkpad and I believe my Surface Laptop 4. Based on my experience with current AMD mobile chipsets, I can only imagine the Strix Halo will be quite a massive uplift for you even if the generational improvements aren't impressive.
I've owned an M1 MBP base model since 2021 and I just got an M3 Max for work. I was curious to see if it "felt" different and was contemplating an upgrade to M4. You know what? It doesn't really feel different. I think my browser opens about 1 second faster from a cold start. But other than that, no perceptible difference day to day.
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously, if opening a browser is the most taxing thing your machine does, the difference will be minimal. But for video or music editing, compiling applications, and other intensive tasks, the upgrade is PHENOMENAL.
FWIW I think that's more the core count than anything. I have a M1 Max as a personal machine and an M3 Max at work and while the M3 Max is definitely faster, it isn't world-beating.
I think most of that difference is going to be the huge increase in performance core count between the base chip and the Max (from 4 to 12). The RAM certainly doesn't hurt though!
My current work machine is an M1 Max with 64GB and it's the fastest computer I've ever used. Watching Rust code compile makes me laugh out loud, it's so quick. Really curious what the newer ones are like, but tbh I don't feel any pressure to upgrade (could just be blissfully ignorant).
I went from a 12 to a 15 Pro Max, and the difference is significant. I can listen to Spotify while shooting with the camera; on my old iPhone 12 this was not possible.
Test Spotify against YouTube Music (and others) - I personally see no reason for Spotify when I have YouTube Premium, which performs with less overhead.
> I wonder what it will take to make Mac/iOS feel faster
I know: disabling shadows and customisable animation times ;) On a jailbroken phone I could once disable all animation delays, and it felt like a new machine (I must add that the animations are very important and generally great UX design, but most are just a tad too slow).
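For the curious, macOS exposes a tamer version of the same trick without any jailbreak. These `defaults` keys are long-standing but undocumented, so treat them as assumptions that may break in any release:

```shell
# Shorten (not disable) some macOS animations. Undocumented preference
# keys; values here are a matter of taste. Dock changes need a restart
# of the Dock process to take effect.
defaults write com.apple.dock autohide-time-modifier -float 0.15
defaults write com.apple.dock expose-animation-duration -float 0.1
defaults write NSGlobalDomain NSWindowResizeTime -float 0.05
killall Dock
```

Deleting the keys again (`defaults delete com.apple.dock autohide-time-modifier`, etc.) restores the stock timings.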
16 pro has a specialized camera button which is a game changer for street / travel photography. I upgraded from 13 pro and use that. But no other noticeable improvements. Maybe Apple intelligence summarizing wordy emails.
I think the only upgrade now is from a non-Pro to Pro, since a 120Hz screen is noticeably better than a 60Hz screen (and a borderline scam that a 1000 Euro phone does not have 120Hz).
I realize this isn't your particular use case. But with newer iPhones, you can use USB-C directly for audio. I've been using the Audio Technica ATH-M50xSTS for a while now. The audio quality is exceptional. For Slack/Team/Zoom calls, the sidetone feature plays your voice back inside the headphones, with the level being adjustable via a small toggle switch on the left side. That makes all the difference, similar to transparency/adaptive modes on the AirPod Pro 2s (or older cellphones and landlines).
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
> Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
I have custom scripts, ad blocking without VPNs, application firewalls.
I've found compile times on large C++ code bases to be the only thing I really notice improving. I recently upgraded my work machine from a 2017 i7 to a shiny new Ryzen 9 9950x and my clean compile times went from 3.5 minutes to 15 seconds haha. When I compile with an M2 Max, it's about 30s, so decent for a laptop, but also it was 2x the price of my new desktop workstation.
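Just for scale, the speedups in those numbers work out like this (times are the ones quoted above; this is nothing more than the division spelled out):

```python
# Rough speedup arithmetic for the clean-build times quoted above.
old_desktop_s = 3.5 * 60   # 2017 i7: 3.5 minutes
new_desktop_s = 15         # Ryzen 9 9950X
m2_max_s = 30              # M2 Max laptop

desktop_speedup = old_desktop_s / new_desktop_s   # 14x
laptop_speedup = old_desktop_s / m2_max_s         # 7x
print(desktop_speedup, laptop_speedup)
```

So the laptop lands at half the desktop's speedup for twice the price, which matches the "decent for a laptop" verdict.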
Can confirm. I have an M2 Air from work and an M1 Pro for personal use, and tbh both absolutely fly. I haven't had a serious reason to upgrade. The only reason I kind of want to swap out my M1 Pro is that the 13" screen is a wee bit small, but I also use the thing docked more often than not, so it's very hard to justify spending the money.
The biggest difference I’ve seen is iPad Sidecar mode works far more reliably with the M3 Max than the M1 Max.
There have been incremental improvements in speed and nits too, but having Sidecar not randomly crash once a day once on M3 was very nice.
On the other side, as someone doing a lot of work in the GenAI space, I'm simultaneously amazed that I can run Flux [dev] on my laptop and use local LLMs for a variety of tasks, while also wishing that I had more RAM and more processing power, despite having a top of the line M3 max MBP.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, thinking it would nonetheless be fun to play with. After 2 years of daily work with these models, I'm becoming convinced they are going to be genuinely disruptive. No AGI, but they will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
> I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP.
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
> I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
I try to avoid Docker in general during local dev, and luckily that has worked out for me even with microservice architectures. It dramatically reduces CPU and RAM needs and also shortens cycle time.
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
> With your M3 Pro, the HDR label should appear in Chrome and Safari.
Ahh. Not Firefox, of course.
Thanks, I just ran a random nature video in Safari. It was pretty. The commercials before it were extremely annoying though. I don't think it's even legal here to have so many ads per minute of content as Google inserts on youtube.
Hah, I tried skimming through a 2-hour YouTube video in Safari, and every time I fast-forwarded a couple of minutes, Google inserted two ads. Basically, I watched more ads than video.
How can people use anything that doesn't run ublock origin these days?
For me, a faster refresh rate is noticeable on a phone or iPad, where you scroll all the time. On a laptop you don't do that much smooth scrolling, so for me it's a non-issue there; not once have I wished my laptop had a faster refresh rate, while I always notice when switching between the Pro and non-Pro iPad.
I find 60Hz on the non-Pro iPhone obnoxious since switching to 120Hz screens. On the other hand, I do not care much about 60Hz when it comes to computer screens. I think touch interfaces make low refresh rates much more noticeable.
No doomscrolling at all. Even switching between home screens looks like it's dropping frames left and right (it's not, of course, but that's what it looks like coming from 120Hz). A Galaxy A54 that we still have in the house, which was just over 300 Euro, feels much smoother than my old iPhone 15 that cost close to 1000 Euro, because the Galaxy has a 120Hz screen.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
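The difference is easy to quantify: each step in refresh rate shrinks the frame interval, which is exactly what your finger tracks while scrolling a touch screen.

```python
# Frame interval at common refresh rates: the gap between screen
# updates that your eye notices while tracking a scroll.
for hz in (60, 90, 120):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
```

At 60Hz each frame is visible for twice as long as at 120Hz, which is why the jump is so obvious on touch interfaces.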
The Galaxy must be new. In my experience Android phones get extremely laggy [1] as they get old and the 120 Hz refresh won't save you :)
I just noticed that I don't really try to follow the screen when I scroll down HN, for example. Yes it's blurry but I seem not to care.
[1] Source: my Galaxy something phone that I keep on my desk for when I do Android development. It has no personal stuff on it, it's only used to test apps that I work on, and even that isn't my main job (nothing since early spring this year for example). It was very smooth when I bought it, now it takes 5+ seconds to start any application on it and they stutter.
A lot of my work can be easily done with a Celeron - it's editing source, compiling very little, running tests on Python code, running small Docker containers and so on. Could it be faster? Of course! Do I need it to be faster? Not really.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
I still use Ivy Bridge and Haswell workstations (with Linux, SSD and discrete GPU) as my daily drivers and for the things I do they still feel fast. Honestly a new Celeron probably beats them performance wise.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
It's so nice being able to advise a family member who is looking to upgrade their intel Mac to something new, and just tell them to buy whatever is out, not worry about release dates, not worry about things being out of date, and so on.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
My 2019 i9 flagship MBP is just so, so terrible, and my wife's M1 MacBook Air is so, so great. I can't get over how much better her computer is than mine.
I would normally never upgrade so soon after getting an M1 but running local LLMs is extremely cool and useful to the point where I'd want the extra RAM and CPU to run larger models more quickly.
I'm bumping from a still-excellent M1 MAX / 64GB to M4 MAX / 128GB, mostly for local GenAI. It gives me some other uplift and also enables me to sell this system while it's still attractive. I'm able to exhaust local 7B models fairly easily on it.
Yep, the same, M1 Pro from 2021. It's remarkable how snappy it still feels years later, and I still virtually never hear the fan. The M-series of chips is a really remarkable achievement in hardware.
I don't think this has much to do with the hardware. I think we have entered an age where users in general are not upgrading, so software can't demand more and more performance. The M1 came out at a time when hardware innovation had mostly stagnated. Default RAM in a laptop has been 16GB for over 5 years; 2 years ago you couldn't even get more than 16 in most laptops. As a result, software hardware requirements haven't changed, so any modern CPU is going to feel overpowered. This isn't unique to M1s.
That’s because today’s hw is perfectly capable of running tomorrow’s software at reasonable speed. There aren’t huge drivers of new functionality that needs new software. Displays are fantastic, cellular speeds are amazing and can stream video, battery life is excellent, UIs are smooth with no jankiness, and cameras are good enough.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Until everything starts running local inference. A real Siri that can operate your phone for you, and actually do things like process cross-app conditions ("Hey Siri, if I get an email from my wife today, notify me, then block out my calendar for the afternoon.") would use those increased compute and memory resources easily.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
That's a very big maybe. The LLM experience locally is currently very very different from the hosted models most people play with. The future is still very uncertain.
I think regretting Mac upgrades is a real thing, at least for me. I got a 32G Mac mini in January to run local LLMs. While it does so beautifully, there are now smaller LLMs that run fine on my very old 8G M1 MacBook Pro, and these newer smaller models do almost all of what I want for NLP tasks, data transformation, RAG, etc. I feel like I wasted my money.
Which ones in particular? I have an M2 air with 8GB, and doing some RAG development locally would be fantastic. I tried running Ollama with llama3.2 and it predictably bombed.
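As a rule of thumb for what fits on an 8GB machine: weights-only memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime. A hedged back-of-envelope sketch (the 25% overhead factor is a guess, not a measurement):

```python
def est_model_gb(params_billion: float, bits_per_weight: int,
                 overhead: float = 1.25) -> float:
    """Rough RAM needed to run a quantized model (GB).

    overhead is an assumed fudge factor for KV cache and runtime.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A ~3B model at 4-bit quantization leaves plenty of headroom in 8 GB;
# an 8B model at 4-bit is already tight once the OS takes its share.
print(round(est_model_gb(3, 4), 2))
print(round(est_model_gb(8, 4), 2))
```

That's why the newer small (1B-3B) models are the ones that suddenly made an 8GB M1/M2 viable for local RAG experiments.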
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionable big jumps I can think of offhand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course, it's notable that I'm not into PC gaming, where I know there have been incredible advances in performance and display tech.)
I've had Macs before, from work, but there is something about the M1 Pro that feels like a major step up.
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
I agree with you about not needing to upgrade, but it still stands that IMHO Apple is better off when competition creates pressure to upgrade. (Also, it's really good that Macs now have 16GB of RAM by default.) Having had my 14.2" M1 Max, I believe the only reason I would want to upgrade is that the new ones can be configured with 128GB of RAM, which lets you load newer AI models on device.
The MacBook Pro does seem to have some quality-of-life improvements: Thunderbolt 5, a 14-megapixel Center Stage camera (it follows you), three USB-C ports on every model, and battery life claims of 22-24 hours. So if you want a MacBook Pro and don't have one, there is now an argument for not just buying the previous model.
Same. I used to upgrade every 1.5 years or so. But with every Apple Silicon generation so far I have felt that there are really no good reasons to upgrade. I have a MacBook M3 Pro for work, but there are no convincing differences compared to the M1 Pro.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
And they compare to the M1 from 4 years ago instead of the M3 from last year; while a 2x speed improvement in the benchmarks they listed is good, it also shows that the M-series CPUs see incremental improvements, not exponential or revolutionary ones. I get the feeling - but a CPU expert can correct me / say more - that their base design is mostly unchanged since the M1, but the manufacturing process has improved (leading to less power consumption/heat), the number of cores has increased, and they added specialized hardware for AI-related workloads.
That said, they are in a very comfortable position right now, with neither Intel, AMD, nor any other competitor able to produce anything close to the bang-per-watt that Apple is managing. There's little pressure from behind to push for more performance.
Their sales pitch when they released the M1 was that the architecture would scale linearly and so far this appears to be true.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of not being able to sustain that performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
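Taking the clock numbers above and the ~2x benchmark claim from earlier in the thread at face value, you can do a crude decomposition of where the gains come from (real speedups aren't cleanly separable like this, so treat it as a back-of-envelope sketch):

```python
# Crude decomposition of the claimed M1 -> M3 gains (numbers from this thread)
m1_ghz, m3_ghz = 3.2, 4.1     # performance-core base clocks
claimed_speedup = 2.0         # the ~2x benchmark figure mentioned above

clock_gain = m3_ghz / m1_ghz                   # ~1.28x from frequency alone
residual_gain = claimed_speedup / clock_gain   # ~1.56x left for IPC, cores, process

print(f"clocks: {clock_gain:.2f}x, everything else: {residual_gain:.2f}x")
```

So even if the base design is mostly unchanged, a bit over a quarter of the claimed gain would be attributable to clocks alone.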
Same feeling. The jump from all the previous laptops I owned to an M1 was an incredible jump. The thing is fast, has amazing battery life and stays cold.
Never felt the need to upgrade.
Probably the next upgrade wave will come from AI features needing more local memory and compute. The software just isn't there yet for everyday tasks, but it's only a question of time, I guess. Of course there will be pressure to do it in the cloud as usual, but local compute will always remain a market.
And it's probably good that at least one of the big players has a business model that supports driving that forward.
IMO Apple Silicon is just that good; I've played No Man's Sky on a new MB Air 13" at 1080p and mid-to-high settings.
An MB Air with an M3 and no fan out-gamed my old GTX 1080 box, which stuttered on NMS-sized games all the time.
Shows just how poorly Intel has done. That company should be razed to the ground figuratively and the infrastructure given to a new generation of chip makers; the last one is clearly out of their element
I also have an M1 Pro MBP and mostly feel the same. The most tempting thing about the new ones is the space black option. Prior to the M1, I was getting a new laptop every year or two and there was always something wrong with them - butterfly keyboard, Touch Bar etc. This thing is essentially perfect though, it still feels and performs like a brand new computer.
I have an MBP M1 Max and the only time I really feel like I need more oomph is when I'm doing live previews and/or rendering in After Effects. I find myself having to clear the cache constantly.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
> Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
I feel the same about my laptop from 2011, so I guess it's partly age (not feeling the urge to always have the greatest) and partly that computing outside of LLMs and gaming just isn't demanding enough to force us to upgrade.
I think the last decade had an explosion in the amount of resources browsers needed and used (partly workloads moving over, partly moving to more advanced web frameworks, partly electron apps proliferating).
The last few years Chrome seems to have reined in its energy and memory use, which helps most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
But this ad is specifically for you! (Well, and those pesky consumers clinging on to that i7!):
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
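For what it's worth, those two marketing multipliers let you back out a comparison the ad doesn't state directly (assuming both figures come from the same workload, which the ad implies but doesn't guarantee):

```python
# Apple's ad gives two ratios against the same Affinity Photo workload;
# dividing them backs out the implied M1-vs-i7 comparison.
m3_vs_i7 = 7.0   # "up to 7x faster" than the 13-inch Core i7 MBP
m3_vs_m1 = 1.8   # "up to 1.8x faster" than the 13-inch M1 MBP

m1_vs_i7 = m3_vs_i7 / m3_vs_m1   # the M1 was already ~3.9x the i7 here
print(f"implied M1 vs i7: {m1_vs_i7:.1f}x")
```

Which is to say, most of the dramatic-sounding 7x was already delivered by the M1 four years ago.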
I feel the same way about my M1 Macbook Air ... it's such a silly small and powerful machine. I've got money to upgrade, I just have no need. It's more than enough for even demanding Logic sessions and Ollama for most 8b models. I love it.
The only reason I'd want to upgrade my M1 Pro MBP is because I kind of need more RAM and storage. The fact that I'm even considering a new laptop just for things that before could have been a trivial upgrade is quite illuminating.
Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
I have a MacBook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 Hz and Wi-Fi 7 - a pretty reasonable request, I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
So the Intel era wasn't Apple products? The butterfly keyboard isn't an Apple invention?
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resell value.
It's expected Intel-based Macs would lose value quickly considering how much better the M1 models were. This transition was bigger than when they moved from PowerPC to Intel.
One complicating factor in the case of the Intel Macs is that an architectural transition happened after they came out. So they will be able to run less and less new software over the next couple of years, and they lack most AI-enabling hardware acceleration.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
Similar for me. MacBook Air M1 (8 cpu / 8 gpu; 16 GB RAM)...running in or out of clamshell with a 5k monitor, I rarely notice issues. Typically, if I'm working very inefficiently (obnoxious amount of tabs with Safari and Chrome; mostly web apps, Slack, Zoom, Postman, and vscode), I'll notice a minor lag during a video call while screen sharing...even then, it still keeps up.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
> Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
I have exactly the same experience. Usually after 3 years I'm desperate for a new Mac, but right now I genuinely think I'd prefer not to change. I have absolutely no issues with my M1 Pro; battery and performance are still great.
I feel exactly the same. The one thing that would get me to pull the trigger on a newer one is if they start supporting SVE2 instructions, which would be super useful for a specific programming project I’ve been playing with.
100% agree on this. I've had this thing for 3 years and I still appreciate how good it is. Of course the M4 tingles my desire for new cool toys, but I honestly don't think I would notice much difference with my current use.
Same boat—I'm on a lowly M1 MacBook Air, and haven't felt any need to upgrade (SwiftUI development, video editing, you name it), which is wild for a nearly 4 year-old laptop.
I am replacing a Dell laptop because the case is cracking, not because it's too slow (it isn't lightning fast, of course, but it sure is fast enough for casual use).
I’m using the M3 Air 13 in (splurged for 24 GB of RAM, I’m sure 16 is fine) to make iOS apps in Xcode and produce music in Ableton and it’s been more than performant for those tasks
Only downside is the screen. The brightness sort of has to be maxed out to be readable and viewing at a wrong angle makes even that imperfect
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
Tbf, the only thing I miss with my M2 MacBook is the ability to run x86_64 VMs with decent performance locally.
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Yup, honestly the main reason I'd like to upgrade from my M1 MBA is the newer webcams are 1080p instead of 720p, and particularly much better in low light like in the evening.
If you're in the ecosystem, get an iPhone mount - image quality is unreal compared to anything short of a fancy DSLR setup. It takes some setup, but not much, with the magnets in the iPhone.
- RAM: 32 GB is just not enough. I have Firefox open with a mere 60-ish tabs and RSS is already at 21 GB. Add my IDE and a C++ build at -j10, and I start getting OOMs left and right; I have to close my browser whenever I build.
- Graphics: the GPU and compute capabilities are definitely not there compared to Nvidia's mid-range offerings; it's more akin to a laptop 2060.
- It can only drive 3 video outputs at most, while there are enough hardware outputs for 5.
> I have firefox open with a mere 60-ish tabs and RSS is already at 21GB.
Isn't that just Firefox deciding to let things stay in memory because the memory is there? Anyway, Safari seems to run fine with 8 GB and will unload inactive tabs.
> building a c++ app at -j10
Strange you're getting OOM errors instead of more swapping. But why insist on running 10 tasks at the same time then?
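If memory really is the constraint, one option is to size -j from free RAM rather than core count. A rough sketch (the per-job figure here is a pure guess; real C++ translation units vary wildly, and template-heavy ones can use far more):

```python
def safe_jobs(total_ram_gb, cpus, per_job_gb=2.0, already_used_gb=21, reserve_gb=4):
    """Cap parallel build jobs by free memory rather than core count.

    per_job_gb is a rough guess at peak memory per C++ compile job;
    already_used_gb models the browser, reserve_gb leaves headroom for the OS.
    """
    free_gb = total_ram_gb - already_used_gb - reserve_gb
    return max(1, min(cpus, int(free_gb // per_job_gb)))

# 32 GB total, ~21 GB already in the browser, 10 cores:
print(safe_jobs(32, 10))  # 3 jobs fit comfortably, not 10
```

With the browser closed the same formula lets all 10 cores run, which matches the "close Firefox before building" workaround described above.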
When the hardware wait time is the same as the duration of my impulsive decisions, I no longer have a hardware speed problem; I have a software suggestion problem.
I got an MBP M1 with 32 GB of RAM. It'll probably be another 2-3 years or longer before I feel the pressure to upgrade. I've even started gaming again (something I dropped nearly 20 years ago when I switched to Mac) thanks to GeForce Now. I just don't see the reason to upgrade.
Frankly though, if the mac mini was a slightly lower price point I'd definitely create my own mac mini cluster for my AI home lab.
I hate to say it but that's like a boomer saying they never felt the need to buy a computer, because they've never wished their pen and paper goes faster. Or a UNIX greybeard saying they don't need a Mac since they don't think its GUI would make their terminal go any faster. If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade. A Macbook M1 can't run half the stuff posted on Hugging Face these days. Even my 128gb Mac Studio isn't nearly enough.
> If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade.
That's me. I don't give a shit about AI, video editing, modern gaming or Kubernetes. The newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things new since Docker and VSCode have not contributed massively to how I work, and most of the things I do could be done just fine 8-10 years ago.
I think the difference is that AI is a very narrow niche/hobby at the moment. Of course if you're in that niche having more horsepower is critical. But your boomer/greybeard comparisons fall flat because they're generally about age or being set in your ways. I don't think "not being into AI image generation" is (currently) about being stuck in your ways.
To me it's more like 3d printing as a niche/hobby.
Playing with them locally? Yes, of course that's a niche hobby. The people doing stuff with them who aren't either playing around or developing a specific sort of AI product (not just any "AI" product) are just using ChatGPT or some other prepackaged thing that either doesn't run locally, or does, but is sized to fit on ordinary hardware.
A thing with < 1% of all engagement in its category is a niche/hobby, yes.
I get that you're probably joking, but - if I use Claude / ChatGPT o1 in my editor and browser, on an M1 Pro - what exactly am I missing by not running e.g. HF models locally? Am I still the greybeard without realising?
Using the term "bro" assumes that all AI supporters are men. This erases the fact that many women and nonbinary people are also passionate about AI technology and are contributing to its development. By using "AI bro" as an insult, you are essentially saying that women and nonbinary people are not welcome in the AI community and that our contributions don't matter. https://www.reddit.com/r/aiwars/comments/13zhpa7/the_misogyn...
Is there an alternative term you would prefer people to use when referring to a pattern of behavior perceived as a combination of being too excited about AI and being unaware (perhaps willfully) that other people can reasonably be much less interested in the hype? Because that argument could definitely benefit from being immune to deflections based on accusations of sexism.
When I see that someone is excited about something, I believe in encouraging them. If you're looking for a more polite word to disparage people who love and are optimistic about something new, then you're overlooking what that says about your character. Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
> Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
That's thoroughly unconvincing. That kind of talk is exactly what so many people are tired of hearing. Especially if it's coming from technically-minded people who don't have any reason to be talking like PR drones.
What makes you think I care about convincing you? These days every shot caller on earth is scrambling to get a piece of AI, either by investing in it or fighting it. You come across as someone who wants to hate on AI. Haters aren't even players. They're NPCs.
So people who aren't obsessed with AI to your liking are:
- boomer luddites
- primitive single-celled organisms
- NPCs
And even people who are enthusiastic about AI but aren't fanatical about running it locally get scorn from you.
I can understand and forgive some amount of confirmation bias leading you to overestimate the importance and popularity of what you work on, but the steady stream of broad insults at anyone who even slightly disagrees with you is dismaying. That kind of behavior is wildly inappropriate for this forum. Please stop.
That’s interesting because I would’ve thought having strong local compute was the old way of thinking. I run huge jobs that consume very large amounts of compute. But the machines doing the work aren’t even in the same state I’m in. Then again maybe I’m even older as I’m basically on the terminal server / mainframe compute model. :)
I work with AI models all day every day, keep up with everything, love frontier tech, I love and breathe LLMs. And I, like OP, haven't seen the need to upgrade from the M1 MBP because it runs the small 1-7B models just fine, and anything bigger I want on some GPU instance anyway, or I want a frontier model which wouldn't run on the newest and biggest MBP. So it's not just us Boomers hating on new stuff, the M series MacBooks are just really good.
Given that models are only going to get larger, and the sheer amount of compute required, I think the endgame here is dedicated "inference boxes" that actual user-facing devices call into. There are already a couple of home appliances like these - NAS, home automation servers - which have some intersecting requirements (e.g. storage for NAS) - so maybe we just need to resurrect the "home server" category.
I agree, and if you want to have the opportunity to build such a product, then you need a computer whose specs today are what a home server would have in four years. If you want to build the future you have to live in the future. I'm proud to make stuff most people can't even run yet, because I know they'll be able to soon. That buys me time to polish their future and work out all the bugs too.
So every user of a computer that doesn't create their own home-grown ML models is a boomer? This can't possibly be a generational thing. Just about everyone on the planet is at a place in their life where they don't make their own AIs.
Eventually as the tools for doing it become better they'll all want to or need to. By then, most computers will be capable of running those tools too. Which means when that happens, people will come up another way to push the limits of compute.