• WalnutLum@lemmy.ml
    5 days ago

    This is interesting but I’ll reserve judgement until I see comparable performance past 8 billion params.

    Sub-4-billion-parameter models all seem to have roughly the same performance nowadays regardless of quantization, so it's hard to see much potential in a 3-billion-parameter model.