…and it went very smoothly. I installed on a spare PC for now, but I could absolutely see this becoming my daily driver. I’m mostly surprised at how snappy and responsive it is, even on 10-year-old hardware!

    • TropicalDingdong@lemmy.world

      I’ll give it a shot, but tbh, it’s been a bit of a slog. I’m on the new Z13, the 128 GB variant.

      I can’t find an “it just works” setup where both Ollama and ROCm play nice with the hardware AND the MediaTek WiFi card works correctly. Either I can self-host full-size LLMs (and do the rest of my ML work), OR I get fully functional WiFi.

      I’ve got the whole install process for Ollama + ROCm + Open WebUI sorted on Ubuntu, but the WiFi card barely manages 20 Mbps. ROCm access, though, is buttery smooth (and I assume PyTorch will be the same), and I can run medium-size models at hundreds of tokens per second locally.
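
      For anyone checking their own setup, here’s a minimal sanity check (a sketch, assuming a ROCm build of torch is installed) that PyTorch actually sees the GPU; ROCm devices surface through the torch.cuda API:

      ```python
      import torch

      # ROCm builds of PyTorch report a HIP version here; it's None on CUDA/CPU builds.
      print("HIP version:", torch.version.hip)

      # ROCm GPUs are exposed through the torch.cuda namespace.
      print("GPU visible:", torch.cuda.is_available())
      if torch.cuda.is_available():
          print("Device:", torch.cuda.get_device_name(0))
      ```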

      When I throw Bazzite on, I’m hitting 350 Mbps down, but it doesn’t seem to have the right ROCm/driver/kernel/Ollama combo, because I’m not even able to get 5 tokens per second.
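
      If it helps anyone compare installs, here’s a rough way to measure tokens/sec through Ollama’s local API (a sketch assuming the default localhost:11434 endpoint and that some model is already pulled; “llama3” below is just a placeholder):

      ```python
      import json
      import urllib.request

      # Time a short generation against a local Ollama instance.
      # "llama3" is a placeholder; substitute whatever model you have pulled.
      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=json.dumps({
              "model": "llama3",
              "prompt": "Why is the sky blue?",
              "stream": False,
          }).encode(),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          body = json.load(resp)

      # eval_count = tokens generated; eval_duration is in nanoseconds.
      tps = body["eval_count"] / body["eval_duration"] * 1e9
      print(f"{tps:.1f} tokens/sec")
      ```

      Running the same script on both the Ubuntu and Bazzite installs should make the ROCm/driver regression obvious without guessing from chat latency.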