OpenAlex · Updated hourly · Last updated: April 1, 2026, 01:44

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

HARNESSING LARGE LANGUAGE MODELS FOR HIGH-PERFORMANCE COMPUTING: OPPORTUNITIES AND CHALLENGES

2025 · 0 Citations · Azerbaijan Journal of High Performance Computing · Open Access
Open full text at publisher

Citations: 0 · Authors: 1 · Year: 2025

Abstract

High-Performance Computing (HPC) is a cornerstone of scientific and engineering advancements, enabling complex computations in areas such as climate modeling, genomics, and artificial intelligence. Concurrently, Large Language Models (LLMs) have emerged as powerful AI-driven tools capable of code optimization, automation, and scientific reasoning. The integration of LLMs into HPC systems presents significant opportunities, including enhanced code generation, improved workload management, and efficient parallel execution. However, this convergence also introduces several challenges, such as high computational costs, scalability issues, memory constraints, security risks, and interpretability concerns. This paper explores the role of LLMs in HPC, discusses existing research and industrial applications, and highlights key challenges and potential solutions. Furthermore, it provides insights into recent advances in AI-powered HPC solutions and presents case studies showcasing real-world implementations. The paper concludes with future research directions, focusing on efficient LLM architectures, integration with emerging HPC technologies, and ethical considerations. The findings emphasize the need for continued innovation to make LLMs more efficient, scalable, and reliable for HPC applications.




Topics

Big Data and Digital Economy · Machine Learning in Materials Science · Artificial Intelligence in Healthcare and Education