Powering Android Smartphones’ On-Device AI With SME2
Google and Arm are transforming mobile AI with SME2, a set of advanced matrix compute instructions integrated into Android through KleidiAI. This enables developers to deliver efficient, real-time AI across billions of devices without rewriting code. From real-time translation to intelligent assistants, SME2 accelerates generative AI experiences directly on CPUs, reducing latency and improving energy efficiency.
Efficient AI inference
SME2 boosts CPU performance for vision, language, and voice AI tasks.
Lower latency
Up to 6x faster AI responses, enabling real-time app experiences.
Developer-Ready
Seamless integration via Arm KleidiAI across Google's XNNPACK, LiteRT, and MediaPipe.
Arm SME2: Accelerating Android Mobile AI Workloads
SME2 is the latest CPU extension in the Arm Lumex CSS platform, the advanced compute subsystem for next-generation devices, designed to accelerate matrix-oriented compute workloads directly on device. It improves performance for AI and ML models, especially those that rely on operations like matrix multiplication, which are common in transformers, convolutional neural networks (CNNs), and large language models (LLMs).

Android integrates SME2 using Arm KleidiAI within XNNPACK, LiteRT, MediaPipe, and other popular frameworks. This allows AI models like Gemma 3 to deliver 6x faster responses and instant summarization directly on CPUs. Developers benefit automatically by using supported libraries and frameworks.