• John Richard@lemmy.world · 1 day ago

    It would cost less if NVIDIA and AMD weren't operating as a duopoly. You can get two XTXs for under $2,000 with 48GB of VRAM between them.

    • vanderbilt@lemmy.world · 11 hours ago

      Unfortunately, getting an AI workload to run on those XTXs, and to run correctly, is another story entirely.

      • John Richard@lemmy.world · 10 hours ago (edited)

        ROCm has improved a lot. $2,000 for 48GB of VRAM makes up for any minor performance hit compared with spending $2,200 or more for 24GB of VRAM on NVIDIA.
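
        For what it's worth, a quick sanity check that a ROCm setup can actually see both cards looks something like this (a minimal sketch, assuming the ROCm build of PyTorch is installed; on ROCm builds the AMD GPUs are still exposed through the torch.cuda API):

        ```python
        # Minimal sanity check for a ROCm PyTorch install (assumes the ROCm
        # wheel of torch is installed; device names and counts will vary).
        import torch

        # On ROCm builds of PyTorch, torch.cuda.* is backed by HIP, so the
        # same calls report AMD GPUs.
        print("GPU available:", torch.cuda.is_available())
        print("GPU count:", torch.cuda.device_count())
        for i in range(torch.cuda.device_count()):
            print(i, torch.cuda.get_device_name(i))

        # Tiny matmul on the first GPU to confirm kernels actually execute.
        if torch.cuda.is_available():
            x = torch.randn(1024, 1024, device="cuda")
            print("matmul ok:", (x @ x).shape)
        ```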

          • vanderbilt@lemmy.world · 6 hours ago

          ROCm has certainly gotten better, but the weird edge cases remain, and merely getting certain models to run is still problematic. I am hoping that RDNA4 is paired with some tooling improvements: no more massive custom container builds, no more versioning nightmares. At my last startup we tried very hard to get AMD GPUs to work, but there were too many issues.
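
          The versioning pain usually shows up as a mismatch between the installed ROCm runtime and whatever HIP version the framework wheel was built against. A rough way to see what you actually have (again just a sketch, assuming a PyTorch install; torch.version.hip is only populated on ROCm builds):

          ```python
          # Print what the installed PyTorch wheel was built against, which is
          # usually the first thing to check when a ROCm setup misbehaves.
          import torch

          print("torch:", torch.__version__)   # ROCm wheels look like "2.x.x+rocmX.Y"
          print("hip:", torch.version.hip)      # None on CUDA or CPU-only builds
          print("gpu visible:", torch.cuda.is_available())
          if torch.cuda.is_available():
              # On ROCm the device properties include the gfx architecture
              # string (e.g. gfx1100 for the 7900 XTX), which is what the
              # ROCm support matrix keys off.
              print(torch.cuda.get_device_properties(0))
          ```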