
I don't think there's a huge use case for local inference yet, if you're happy with the subscription cost and the privacy trade-off. Give it maybe two years and someone will probably invent something that seriously benefits from local inference. I'm anticipating inference appliances for the home (something mac mini form factor that plugs into your router), but that's based on what would make logical sense for consumers, not what consumers will actually go for.

Apple seems to be using LPDDR, but HBM will also likely be a key tech. SK Hynix and Samsung are the most reputable for both.
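The reason memory tech matters so much here is that LLM token generation is largely memory-bandwidth bound, and peak bandwidth is roughly bus width times transfer rate. A quick sketch with ballpark figures (a 512-bit LPDDR5 bus as in Apple's larger M-series parts, and a single HBM3 stack; the exact speeds are illustrative assumptions, not vendor specs):

```python
# Rough peak-bandwidth comparison: bandwidth (GB/s) = (bus bits / 8) * GT/s.
# Figures are ballpark assumptions for illustration, not datasheet values.
def peak_bandwidth_gbs(bus_bits: int, gigatransfers_per_s: float) -> float:
    return bus_bits / 8 * gigatransfers_per_s

# Wide unified LPDDR5 bus (Apple M-series-class): 512-bit at ~6.4 GT/s
lpddr = peak_bandwidth_gbs(512, 6.4)    # ~409.6 GB/s

# One HBM3 stack: 1024-bit interface at ~6.4 GT/s
hbm3 = peak_bandwidth_gbs(1024, 6.4)    # ~819.2 GB/s per stack
```

HBM's edge comes from the very wide stacked interface, which is why it dominates datacenter accelerators, while LPDDR's cost and power profile suits consumer devices.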



Thanks.

>> Apple seems to be using LPDDR, but HBM will also likely be a key tech. SK Hynix and Samsung are the most reputable for both.

So not much Micron? Any US based stocks to invest in? :-)


I forgot about Micron, absolutely. And TSMC fabricates the compute side for all of these platforms, so adding it covers both memory and compute if that's your strategy (the risk being that TSMC's US fabs are overprovisioning capacity based on the pandemic hardware boom).


Thanks!



