cyrano@lemmy.dbzer0.com to Lemmy Shitpost@lemmy.world · 2 months ago
AGI achieved 🤖 (lemmy.dbzer0.com)
cross-posted to: fuck_ai@lemmy.world
Zacryon@feddit.org · 2 months ago
I know that words are tokenized in the vanilla transformer. But do GPT and similar LLMs still do that as well? I assumed they also tokenize at the character/symbol level, possibly combined with additional abstraction further down the chain.
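For context on the question above: GPT-family models tokenize at the subword level using byte-pair encoding (BPE), which sits between whole-word and character-level tokenization — training starts from characters/bytes and repeatedly merges the most frequent adjacent pair into a new token. A minimal pure-Python sketch of that merge loop (the helper names here are illustrative, not any real library's API):

```python
# Minimal byte-pair-encoding (BPE) sketch: start from character-level
# tokens and greedily merge the most frequent adjacent pair, so common
# substrings become single subword tokens.
from collections import Counter


def pair_counts(tokens):
    """Count adjacent token pairs in the sequence."""
    return Counter(zip(tokens, tokens[1:]))


def merge_pair(tokens, pair, merged):
    """Replace every occurrence of `pair` with the merged token."""
    out, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            out.append(merged)
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out


def bpe_train(text, num_merges):
    """Learn `num_merges` merge rules from character-level tokens."""
    tokens = list(text)
    merges = []
    for _ in range(num_merges):
        counts = pair_counts(tokens)
        if not counts:
            break
        pair = max(counts, key=counts.get)   # most frequent adjacent pair
        merged = pair[0] + pair[1]
        merges.append(pair)
        tokens = merge_pair(tokens, pair, merged)
    return tokens, merges


tokens, merges = bpe_train("low lower lowest", 4)
print(tokens)   # shared prefixes like "low" end up as single tokens
```

Real tokenizers (e.g. GPT-2's byte-level BPE) operate on UTF-8 bytes rather than characters and learn tens of thousands of merges, but the mechanism is the same — which is why an LLM "sees" subword chunks rather than individual letters.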