Domain Adaptation of Large Language Models for the Telco Industry
Researcher, Orange
Discover a unique dataset designed for fine-tuning large language models (LLMs) in the telco domain, built from a wide variety of data sources. Learn an efficient way to adapt a 7B-parameter LLM to a technical domain using at most three NVIDIA A100 40GB GPUs, through a performance comparison of different domain-adaptive pre-training and instruction-tuning methods. LLMs must be adapted to specific domains, such as telco, to capture intricate lexical, semantic, and concept-specific nuances. This adaptation enables them to proficiently handle technical documents, network modeling, and other critical use cases demanded by telco operators like Orange. Yet, as LLMs grow in size, their adaptation becomes increasingly costly. This presentation will examine the efficacy of various parameter-efficient fine-tuning methods for telco domain adaptation. We'll evaluate the best domain-adaptive pre-training techniques for tailoring a foundation model to a particular domain, using both intrinsic metrics, like perplexity, and domain-specific task evaluations.
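As background for the two themes the abstract names, here is a minimal sketch of the quantities involved: the fraction of weights a LoRA-style adapter actually trains (one reason parameter-efficient methods fit on a few A100s), and perplexity as the intrinsic metric. The layer shape and rank below are illustrative assumptions, not the presenters' actual configuration.

```python
import math

def lora_trainable_fraction(d_in: int, d_out: int, rank: int) -> float:
    """Fraction of a d_out x d_in weight matrix that a LoRA adapter of the
    given rank trains: rank * (d_in + d_out) adapter parameters versus
    d_in * d_out frozen base parameters."""
    return rank * (d_in + d_out) / (d_in * d_out)

def perplexity(token_nlls: list[float]) -> float:
    """Corpus perplexity from per-token negative log-likelihoods (nats):
    exp of the mean NLL. Lower is better on held-out domain text."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Illustrative 7B-class attention projection (4096 x 4096), rank 16:
frac = lora_trainable_fraction(4096, 4096, 16)
print(f"LoRA trains ~{frac:.2%} of that matrix's parameters")  # ~0.78%

# Perplexity of a toy sequence whose mean NLL is 2.0 nats:
print(round(perplexity([2.0, 2.0, 2.0]), 2))  # e^2 ≈ 7.39
```

Comparing held-out perplexity on telco text before and after adaptation is the intrinsic half of the evaluation; domain-specific task benchmarks provide the extrinsic half.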
Event:
Date:
Level:
Topic:
Industry:
NVIDIA technology: Cloud / Data Center GPU, RTX GPU