This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Quantum Computing Meets Large Language Models: Insights, Challenges, and Future Directions
Citations: 0
Authors: 10
Year: 2026
Abstract
Large Language Models (LLMs) such as GPT and LLaMA have transformed artificial intelligence, but their rapid growth has brought major challenges in computation, energy use, and scalability. Quantum computing (QC) offers a fundamentally different way to process information, with the potential to accelerate key operations in training, inference, optimization, and representation learning. This survey provides a structured and comprehensive review of how QC can support the future development of LLMs. It examines quantum algorithms, hybrid quantum-classical methods, quantum neural networks, quantum embeddings, and security applications. It also evaluates their feasibility in both the current noisy intermediate-scale quantum (NISQ) era and the future fault-tolerant era. The survey highlights current progress, identifies major technical barriers, and outlines practical research directions needed to build scalable, efficient, and secure quantum-enhanced language models.
Similar Works
Federated Learning: Challenges, Methods, and Future Directions
2020 · 4,373 citations
Deep Learning: Methods and Applications
2014 · 3,302 citations
Mobile Edge Computing: A Survey on Architecture and Computation Offloading
2017 · 2,895 citations
Machine Learning: An Artificial Intelligence Approach
2013 · 2,639 citations
Machine Learning and Deep Learning
2021 · 2,319 citations