This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Towards intelligent mobile on-device assistants: low-level text-to-actions with GPT LLMs
Citations: 0
Authors: 5
Year: 2026
Abstract
The field of Artificial Intelligence has witnessed remarkable progress in recent years, especially with the emergence of large language models (LLMs) based on the transformer architecture. Cloud-based LLMs, such as OpenAI’s ChatGPT, offer impressive capabilities but raise latency and privacy concerns due to their network dependencies. This article presents an approach to LLM inference that allows LLMs with billions of parameters to be executed directly on mobile devices without network connectivity. The article showcases a fine-tuned GPT LLM with 3 billion parameters that can operate on devices with as little as 4 GB of memory. Through the integration of native code and model quantization techniques, the application not only serves as a general-purpose assistant but also facilitates mobile interactions with our text-to-actions feature. With text-to-actions, the assistant's capabilities are extended beyond text conversation, enabling communication between the on-device LLM and the device’s operating system, so that it can autonomously perform tasks such as making calls, searching the web, or scheduling events. The article provides insights into the training pipeline, implementation details, test results, and future directions of on-device LLM inference. This technology opens up possibilities for empowering users with sophisticated AI capabilities while preserving their privacy and eliminating latency concerns.
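The text-to-actions feature described in the abstract bridges free-form model output and structured operating-system commands. A minimal sketch of what such a bridge's parsing step might look like, assuming a hypothetical `ACTION: name(args)` output format (the article's actual action schema is not given on this page, so the action names and format below are illustrative assumptions):

```python
import re

# Illustrative assumption: the LLM is fine-tuned to emit device actions as
# lines of the form `ACTION: name("arg1", "arg2", ...)`; anything else is
# treated as plain conversational text.
ACTION_PATTERN = re.compile(r'^ACTION:\s*(\w+)\((.*)\)$')

def parse_action(llm_output: str):
    """Parse one line of LLM output into a structured command that a native
    OS layer (e.g. an Android intent dispatcher) could then execute."""
    match = ACTION_PATTERN.match(llm_output.strip())
    if match is None:
        return None  # no device action requested, just chat text
    name, raw_args = match.groups()
    # Split the argument list and strip surrounding quotes and whitespace.
    args = [a.strip().strip('"') for a in raw_args.split(",") if a.strip()]
    return {"action": name, "args": args}

# A "make a call" request parses into a structured action...
print(parse_action('ACTION: call("+1 555 0100")'))
# ...while ordinary assistant text falls through to the chat UI.
print(parse_action("Sure, I can help with that."))
```

In a real implementation, the structured result would be handed to platform code (for instance, an Android intent for calls or web searches) rather than printed; the key point is that the LLM only produces text, and a thin native layer translates it into OS actions.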
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,687 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,591 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 8,114 citations
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
2019 · 6,867 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations