EvaByte is an efficient byte-level language model with multibyte prediction and EVA attention, built by the University of Hong Kong and SambaNova Systems. This Space is an unofficial demo of the instruction-tuned version, EvaByte/EvaByte-SFT. For full details on the architecture, training recipe, and benchmarks, see their blog post and the project repository: