Paper: Just Tell Me: Prompt Engineering in Business Process Management [8 pages]
Researchers at Kühne Logistics University (among other contributors) are interested in understanding how to get more accurate outputs from LLMs.
Many companies are trying to introduce LLMs such as GPT into their domains. In this paper, the researchers explore how such models can best be applied to Business Process Management (BPM). BPM tasks include, for example, process monitoring and process extraction from text.
In industry, companies have been fine-tuning language models to get good results for a specific domain, but this requires a large amount of high-quality data.
Prompt engineering could be an alternative solution. A highly specific input prompt such as “act like an accountant with the context of business A” or “act like a marketing consultant with design lessons from company X” could be used to extract specific information from a big model. This approach of “freezing” pre-trained models is particularly attractive as model sizes continue to increase.
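As a rough illustration, here is a minimal sketch of such a role-specific prompt for a BPM task (process extraction from text). The role wording, task instruction, and the call_llm() stub are hypothetical placeholders, not the paper's exact prompts or any particular API:

```python
# Hypothetical sketch: a role-specific prompt for extracting activities from a
# process description. The pre-trained model itself stays "frozen"; only the
# prompt carries the domain context.
ROLE = "Act like an accountant with the context of business A."
TASK = (
    "Extract the activities and their order from the following process "
    "description. Return one activity per line."
)

def build_prompt(process_description: str) -> str:
    """Combine the role, task instruction, and input text into one prompt."""
    return f"{ROLE}\n\n{TASK}\n\nProcess description:\n{process_description}"

def call_llm(prompt: str) -> str:
    """Placeholder for whichever LLM API is used (not specified here)."""
    raise NotImplementedError

prompt = build_prompt("After the invoice arrives, the clerk checks it and ...")
# activities = call_llm(prompt)
```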
Here are some advantages of prompt engineering as an approach:
Good training data for certain domains is not always available
Prompt engineering is more compute-efficient since it doesn’t involve retraining
Prompt templates allow inheritance of features across LLMs (see the sketch after this list)
Prompting allows more observability and explainability
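A minimal sketch of the template-inheritance idea, with hypothetical class and field names: shared instructions live in a base template, and a domain- or task-specific template only overrides what differs.

```python
# Hypothetical sketch: prompt templates sharing ("inheriting") common instructions.
class BasePromptTemplate:
    """Shared instructions reused by every BPM prompt."""
    instructions = "Answer concisely and only use information from the input."
    task = "Describe the input."  # overridden by subclasses

    def render(self, text: str) -> str:
        return f"{self.instructions}\n\nTask: {self.task}\n\nInput:\n{text}"

class ProcessExtractionTemplate(BasePromptTemplate):
    """Inherits the shared instructions; only the task wording changes."""
    task = "Extract the activities and their order from the process description."

prompt = ProcessExtractionTemplate().render("The clerk receives the order, then ...")
```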
Here are some challenges:
Prompt length is limited and some representations are inaccessible (the sketch after this list illustrates the length limit)
Choice of model and prompt highly affects end results
Prompts have limited transferability
Not all domains and processes can be easily prompted
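To make the prompt-length limit concrete, here is a small sketch of truncating a long process description to an assumed token budget before prompting. The budget value and helper name are assumptions; tiktoken is used here only as an example tokenizer, and the real limit depends on the model.

```python
# Hypothetical sketch: keep a process description within an assumed prompt budget.
import tiktoken

MAX_PROMPT_TOKENS = 3000  # assumed budget; actual limits are model-dependent

def truncate_to_budget(text: str, budget: int = MAX_PROMPT_TOKENS) -> str:
    """Drop trailing tokens so the text fits the assumed prompt budget."""
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    if len(tokens) <= budget:
        return text
    return enc.decode(tokens[:budget])

process_description = "The purchase order is created, then approved by ..."  # long BPM text
prompt_body = truncate_to_budget(process_description)
```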
So essentially,
"Prompt Engineer may be better than fine tuning because it doesn’t have the compute and memory requirements of fine tuning with a good dataset."