
Inference will be dirt cheap for things like coding, but you'll want much more compute for architectural planning, for personal assistants with persistent real-time "thinking"/memory, and for real-time multimedia. I could put 10 M4s to work right now and it still wouldn't be enough for what I've been cooking.