• AdrianTheFrog@lemmy.world

    I can run a small LLM on my RTX 3060, but most of those models were originally trained on a cluster of A100s (maybe as few as 10, so more like one largish server than one datacenter).

    BitNet came out recently and looks like it will lower these requirements significantly. It essentially trains the model with ternary weights instead of floats to cut memory and compute, and it turns out that doesn't hurt quality all that much. A rough sketch of the quantization idea is below.
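
    For anyone curious, here's a minimal sketch of the core trick, ternary "1.58-bit" weight quantization with an absmean scale, roughly as described for BitNet b1.58. This is just an illustration of the rounding scheme, not the actual training code (which applies this inside the forward pass and trains with quantization-aware tricks), and the dimensions and tolerances here are made up:

    ```python
    import numpy as np

    def ternary_quantize(w: np.ndarray):
        """Quantize a weight tensor to {-1, 0, +1} using an absmean scale.
        Returns the ternary weights plus the scale needed to approximately
        reconstruct the original values as w_q * scale."""
        scale = np.mean(np.abs(w)) + 1e-8            # per-tensor absmean scale
        w_q = np.clip(np.round(w / scale), -1, 1)    # round, then clamp to ternary
        return w_q.astype(np.int8), scale

    # Toy demonstration: quantize a random weight matrix and check the error
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.02, size=(256, 256)).astype(np.float32)
    w_q, s = ternary_quantize(w)
    rel_err = np.mean(np.abs(w - w_q * s)) / np.mean(np.abs(w))
    print(f"relative reconstruction error: {rel_err:.2f}")
    ```

    Since each weight only needs about 1.58 bits (log2 of 3 states) instead of 16 or 32, the memory footprint drops by roughly an order of magnitude, and the matmuls reduce to additions and subtractions.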