
Helping fix aircraft – from NLP to Bayes Nets

1 point by hazrmard over 1 year ago

1 comment

hazrmard over 1 year ago
This post describes a research project I did which ended up needing domain-specific natural language processing (NLP).

The project started out with learning Bayesian Networks. The nets were learned to determine probabilities of maintenance actions on machines. But, once we explored the data, we realized extensive pre-processing was needed. Most of the information about actions was in free-form descriptions, and not tabulated numerically.

So, the choice was between using a large language model (LLM), remotely or locally, or a more rustic NLP approach. I chose the latter for:

1. Explainability. Smaller models are easier to decompose and analyze.

2. Security. Using LLMs would likely require cloud-based or off-premise devices. They were an additional security hurdle in our case.

3. Speed. This was a very domain-specific dataset with relatively few training instances. Fine-tuning LLMs on this would take time (data collection + training).

4. Performance. The use-case was not generative text, but text retrieval for Bayesian reasoning. We could get away with NLP models that could only adequately infer similarity between text.
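A minimal sketch of the kind of similarity-based retrieval the comment describes, not the project's actual pipeline: TF-IDF vectors over maintenance text with cosine similarity used to match a free-form log entry to known action categories, producing a normalized distribution that could serve as soft evidence for a Bayesian network node. The action names, log text, and `action_evidence` helper are invented for illustration.

```python
# Hypothetical sketch: map free-form maintenance descriptions to known
# action categories with TF-IDF + cosine similarity. The action list and
# example log entry are made up for illustration.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Known maintenance actions (in practice, drawn from tabulated records).
actions = [
    "replace hydraulic pump",
    "inspect landing gear actuator",
    "recalibrate pitot static sensor",
    "repair fuel line fitting",
]

# Fit TF-IDF on the action descriptions themselves; a real project would
# likely fit on a larger corpus of maintenance logs.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
action_vectors = vectorizer.fit_transform(actions)

def action_evidence(free_text: str) -> dict:
    """Return a normalized similarity distribution over known actions,
    usable as soft evidence for an action node in a Bayesian network."""
    query = vectorizer.transform([free_text])
    sims = cosine_similarity(query, action_vectors).ravel()
    sims = np.clip(sims, 0.0, None)
    total = sims.sum()
    if total == 0:
        probs = np.full(len(actions), 1.0 / len(actions))  # uninformative
    else:
        probs = sims / total
    return dict(zip(actions, probs))

print(action_evidence("tech noted leaking pump, swapped hydraulic unit"))
```

This matches the comment's use-case of text retrieval rather than generation: the small, inspectable model only needs to rank similarity between a new description and known actions, and the resulting distribution can be plugged into whatever Bayesian reasoning sits downstream.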