• PriorityMotif@lemmy.world
    3 months ago

    You can run it on your own machine. It won’t work on a phone right now, but I guarantee chip manufacturers are already working on a custom SoC that will be able to run a rudimentary local model.

    • TriflingToad@lemmy.world
      3 months ago

      It will run on a phone right now: Llama 3.2 on a Pixel 8.

      The only drawback is that it needs a lot of RAM, so I had to close all other applications, but that could easily be fixed on the next generation of phones. Other than that it was quite fast and only took ~3 GB of storage! Rough sketch of what that looks like below.
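
      For anyone curious what running it yourself looks like, here’s a minimal sketch using llama-cpp-python with a quantized Llama 3.2 3B GGUF file (a 4-bit quant is roughly the ~3 GB footprint mentioned above). The model filename and settings are placeholders, not my exact Pixel 8 setup:

      ```python
      # Minimal sketch: load a quantized Llama 3.2 model with llama-cpp-python
      # and run a single prompt. Filename and parameters are assumptions.
      from llama_cpp import Llama

      llm = Llama(
          model_path="llama-3.2-3b-instruct-q4_k_m.gguf",  # ~2-3 GB quantized file (placeholder name)
          n_ctx=2048,   # small context window to keep RAM usage down
          n_threads=4,  # roughly match the phone's performance cores
      )

      out = llm(
          "Explain what a local LLM is in one sentence.",
          max_tokens=64,
      )
      print(out["choices"][0]["text"])
      ```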