Tether’s BitNet LoRA framework enables AI model training across smartphones, GPUs, and consumer devices.
The system reduces memory use and boosts performance, with up to 77.8% lower VRAM requirements.
Users can fine-tune models up to 13B parameters on mobile devices, expanding edge AI capabilities.
Tether announced a new AI framework through its QVAC Fabric platform, enabling cross-platform BitNet LoRA training on consumer devices. The update allows billion-parameter models to be fine-tuned and run on smartphones and consumer GPUs. CEO Paolo Ardoino shared the news, pointing to reduced costs and broader access to AI tools.
The QVAC Fabric update introduces cross-platform support for BitNet LoRA fine-tuning, allowing the same AI models to be trained and run across different hardware and operating systems.
Notably, the framework supports GPUs from AMD, Intel, and Apple, including mobile chipsets. It relies on Vulkan and Metal backends to achieve this cross-vendor compatibility.
According to Tether, this is the first time BitNet LoRA works across such a wide range of devices. As a result, users can train models on everyday hardware.
The system reduces memory and compute needs by combining two techniques. BitNet quantizes model weights to ternary values (-1, 0, +1), shrinking their storage footprint, while LoRA freezes the base model and trains only small low-rank adapter matrices, sharply limiting the number of trainable parameters.
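Tether has not published its internal implementation, but the idea behind the combination can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration, not Tether's code: it assumes a QLoRA-style setup in which the base weights are ternarized with BitNet b1.58's absmean rule and frozen, while only the low-rank adapter matrices A and B receive gradients. The class name LoRALinear and the hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

def ternary_quantize(w: torch.Tensor):
    # BitNet b1.58 "absmean" rule: scale by the mean absolute weight,
    # then round and clamp every weight to {-1, 0, +1}.
    scale = w.abs().mean()
    q = torch.clamp(torch.round(w / (scale + 1e-8)), -1, 1)
    return q, scale

class LoRALinear(nn.Module):
    # Frozen ternary base weight plus a trainable low-rank adapter.
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        q, scale = ternary_quantize(base.weight.detach())
        self.register_buffer("w_q", q)        # ternary weights, never updated
        self.register_buffer("scale", scale)  # one scale factor per tensor
        # Only A and B train: rank*(in+out) parameters instead of in*out.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base_out = x @ (self.w_q * self.scale).t()   # frozen quantized path
        lora_out = (x @ self.A.t()) @ self.B.t()     # trainable low-rank update
        return base_out + self.scaling * lora_out
```

Because B starts at zero, the adapter initially contributes nothing, so fine-tuning begins from the quantized model's behavior. In a real deployment the ternary weights would also be bit-packed rather than stored as floats, which is where most of the memory savings come from.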
Together, these methods lower hardware requirements significantly. For example, GPU inference runs two to eleven times faster than CPU inference on mobile devices.
Additionally, memory usage drops sharply compared to full-precision models. Benchmarks show up to 77.8% less VRAM use than comparable systems.
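Tether's 77.8% figure comes from its own benchmarks and covers the full training footprint, but simple back-of-envelope arithmetic shows why the numbers point in that direction. The calculation below is only an illustration of weight storage, ignoring activations, gradients, and optimizer state; it compares FP16 weights with roughly 1.58-bit ternary weights for a 13-billion-parameter model.

```python
# Illustrative weight-memory arithmetic; not Tether's benchmark methodology.
params = 13e9  # a 13B-parameter model, as in Tether's iPhone 16 test

fp16_gb    = params * 16 / 8 / 1e9    # 16 bits per weight   -> 26.0 GB
ternary_gb = params * 1.58 / 8 / 1e9  # ~1.58 bits per weight -> ~2.6 GB

print(f"FP16 weights:    {fp16_gb:.1f} GB")
print(f"Ternary weights: {ternary_gb:.1f} GB")
print(f"Reduction:       {1 - ternary_gb / fp16_gb:.0%}")  # ~90% on weights alone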
Tether also demonstrated fine-tuning on smartphones. Tests showed 125-million-parameter models trained in minutes on devices like the Samsung Galaxy S25.
The framework also enables larger models to run on edge devices. Tether reported successfully fine-tuning models of up to 13 billion parameters on the iPhone 16.
Moreover, the system supports mobile GPUs such as Qualcomm's Adreno and Arm's Mali, as well as the GPUs in Apple's Bionic chips. This expands AI development beyond specialized hardware.
According to Paolo Ardoino, AI development often depends on expensive infrastructure. He said this framework shifts capabilities toward local devices.
Tether added that the system reduces reliance on centralized platforms. It also allows users to train and process data directly on their devices.