Hacker News: shosca's comments

In my case, with a 6900 XT:

1. sudo pacman -S ollama-rocm

2. ollama serve

3. ollama run deepseek-r1:32b


Does that entire model fit in gpu memory? How's it run?

I tried running a model larger than the GPU's VRAM; it loads some layers onto the GPU but offloads the rest to the CPU. It's faster than CPU-only for me, but not by a lot.
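The "does it fit?" question above comes down to arithmetic. A minimal back-of-envelope sketch, assuming deepseek-r1:32b ships as a ~4-bit quant (roughly 4.5 bits per weight for Q4_K_M) and a 6900 XT with 16 GB of VRAM; the overhead figure for KV cache and runtime buffers is a rough assumption of mine, not something from the thread:

```python
def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead_gb: float = 2.0) -> float:
    """Rough VRAM needed: quantized weights plus KV-cache/runtime overhead."""
    weights_gb = params_billion * bits_per_weight / 8  # bits -> bytes -> GB
    return weights_gb + overhead_gb

VRAM_GB = 16  # Radeon RX 6900 XT

for name, params in [("deepseek-r1:32b", 32), ("deepseek-r1:14b", 14)]:
    need = model_vram_gb(params, bits_per_weight=4.5)  # ~Q4_K_M density
    fits = "fits" if need <= VRAM_GB else "spills to CPU"
    print(f"{name}: ~{need:.1f} GB needed vs {VRAM_GB} GB VRAM -> {fits}")
```

By this estimate the 32b model needs around 20 GB and partially offloads to the CPU, while the 14b fits entirely in VRAM, which matches the behavior described above.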


You're right. I noticed the GPU clocking up and down with the 32b model; the 14b clocks up fully and actually runs faster.


Nice. Last time I tried ROCm on Arch, a few years ago, it was a nightmare. Glad to see it's just one package install away these days, assuming you didn't do any setup beforehand.


I think you do still have to have the ROCm drivers installed, but it's not very hard to do from AMD's website.


Everything is from the Arch repos (well, CachyOS and Arch) :)


Anaconda also has committed engineering resources to assist with the no-GIL transition.



You can also use https://github.com/SimonBrazell/privacy-redirect to auto redirect from twitter to nitter


Oooh this is great, thank you. I've been wanting to set up something like this for quite a while and haven't really spent the time to figure out how I'd do it. Glad to have an option just land on my screen like this! Cheers :)

Also just realized there are one or two other services they could redirect (e.g. Medium -> scribe.rip). Will see if it's feasible for them to easily add...
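At its core this kind of extension is just a hostname-rewrite rule applied before navigation. A minimal sketch of the idea, assuming a small static mapping; the target instances below are illustrative examples, not privacy-redirect's actual configuration:

```python
from urllib.parse import urlsplit, urlunsplit

# Illustrative host -> alternative-frontend mapping (example values).
REDIRECTS = {
    "twitter.com": "nitter.net",
    "medium.com": "scribe.rip",
}

def redirect(url: str) -> str:
    """Rewrite the host if it has a configured alternative frontend."""
    parts = urlsplit(url)
    host = (parts.hostname or "").removeprefix("www.")
    target = REDIRECTS.get(host)
    if target is None:
        return url  # no rule: leave the URL untouched
    return urlunsplit((parts.scheme, target, parts.path,
                       parts.query, parts.fragment))

print(redirect("https://medium.com/some-post"))  # -> https://scribe.rip/some-post
```

The path, query, and fragment pass through unchanged, so deep links keep working on the alternative frontend as long as it mirrors the original URL structure.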



Will this finally motivate Zoom to use the proper PipeWire API so that screen capture works on Wayland KDE?


