EvaByte [Byte-Level LLM]

EvaByte is an efficient byte-level language model with multibyte prediction and EVA attention, developed by the University of Hong Kong and SambaNova Systems.
This Space is an unofficial demo of the instruction-tuned version EvaByte/EvaByte-SFT.
For full details on the architecture, training recipe, and benchmarks, see their blog post and the project repository.
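Because EvaByte is byte-level, it operates directly on raw UTF-8 bytes rather than a learned subword vocabulary. A minimal sketch of what that means for input encoding, using only the Python standard library (this illustrates the general idea of byte-level tokenization, not EvaByte's exact input pipeline):

```python
text = "EvaByte"

# Byte-level "tokenization": each UTF-8 byte becomes one token ID (0-255),
# so no out-of-vocabulary tokens are possible.
byte_ids = list(text.encode("utf-8"))

print(byte_ids)       # one ID per byte
print(len(byte_ids))  # 7 bytes for this ASCII string

# Decoding is just the inverse byte conversion.
decoded = bytes(byte_ids).decode("utf-8")
print(decoded)
```

The trade-off is longer sequences than subword tokenizers produce, which is why EvaByte pairs byte-level inputs with multibyte prediction and efficient EVA attention.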

If you liked this Space, follow me on Twitter: @KantaHayashiAI
