OpenAlex · Updated hourly · Last updated: 03.05.2026, 13:08

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

ChatHTN: a consultation model for hypertension

2026 · 0 citations · Scientific Reports · Open Access

0 citations · 4 authors · 2026

Abstract

The rapid development of large language models (LLMs) has greatly advanced natural language processing (NLP). While these models perform remarkably well in tasks such as text generation and translation, they still face challenges in highly specialized domains such as hypertension, where domain expertise and personalization are crucial. To address these limitations, we introduce a generative framework for intelligent hypertension consultation that combines a domain-specific knowledge graph, a multi-task fine-tuning strategy, and a specialized dataset. The knowledge graph enhances the model’s medical knowledge, while the multi-task fine-tuning mechanism optimizes tasks like medical entity recognition and etiology classification to ensure consistency. To further strengthen reasoning ability, we construct HTN-5M, a large-scale Chinese dataset that embeds chain-of-thought (CoT) reasoning in a structured triplet format (input, output, CoT), supporting both question-answer generation and auxiliary learning. Experimental results demonstrate that our approach outperforms strong baselines, achieving a 16.25% improvement over DeepSeek-LLM-67B-base on the CMB benchmark and an approximate 10% gain over HuatuoGPT across three medical datasets.
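The abstract describes HTN-5M as a dataset of structured (input, output, CoT) triplets used for both question-answer generation and auxiliary learning. A minimal sketch of what one such record might look like is shown below; the field names, the example content, and the JSONL serialization are illustrative assumptions, not details taken from the released dataset.

```python
import json

# Hypothetical example of the (input, output, CoT) triplet structure
# described for HTN-5M. Field names and the medical content are
# illustrative assumptions only.
sample = {
    "input": (
        "A 58-year-old patient reports repeated readings of 165/100 mmHg. "
        "What could be the cause?"
    ),
    "cot": (
        "Step 1: Repeated readings above 140/90 mmHg indicate hypertension. "
        "Step 2: At 165/100 mmHg this falls into stage 2. "
        "Step 3: At this age, primary (essential) hypertension is most common, "
        "but secondary causes such as renal disease should be considered."
    ),
    "output": (
        "The readings are consistent with stage 2 hypertension; primary "
        "hypertension is most likely, though secondary causes should be ruled out."
    ),
}

# Serialize one record per line (JSONL), a common layout for
# supervised fine-tuning corpora, then read it back.
line = json.dumps(sample, ensure_ascii=False)
record = json.loads(line)
print(sorted(record.keys()))
```

Embedding the reasoning chain as its own field lets the same record serve two purposes: the (input, output) pair trains direct question answering, while the CoT field supports auxiliary reasoning supervision.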
