Zyphra AI has introduced ZAYA1-8B, a compact yet powerful language model designed for reasoning tasks. Built on a Mixture-of-Experts (MoE) architecture, it activates only 760 million of its 8.4 billion total parameters per token. Trained entirely on AMD hardware, the model shows remarkable results, often surpassing larger open-weight models on math and reasoning benchmarks.
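The core idea behind MoE is that a router sends each token to only a small subset of expert feed-forward networks, so the compute per token is a fraction of the total parameter count. Below is a minimal, generic sketch of a top-k MoE layer in PyTorch; the class and parameter names are illustrative assumptions, not ZAYA1's actual architecture or code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k Mixture-of-Experts feed-forward layer (illustrative sketch)."""

    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent two-layer MLP.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for per-token routing.
        tokens = x.reshape(-1, x.shape[-1])
        logits = self.router(tokens)                  # (n_tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)          # normalize over the chosen experts
        out = torch.zeros_like(tokens)
        # Dispatch each token only to its top-k experts; all others stay inactive.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(tokens[mask])
        return out.reshape(x.shape)

moe = TopKMoE(d_model=64, d_hidden=256, n_experts=8, top_k=2)
y = moe(torch.randn(2, 16, 64))   # only 2 of 8 experts run per token
print(y.shape)                    # torch.Size([2, 16, 64])
```

With 8 experts and top-2 routing, each token touches roughly a quarter of the expert weights, which is the same principle that lets a model like ZAYA1-8B keep only a small fraction of its total parameters active per token.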










