
Zero-shot Text Matching For Automated Auditing Using Sentence Transformers

David Biesner, Maren Pielka, Rajkumar Ramamurthy, Tim Dilmaghani, Bernd Kliem, Rüdiger Loitz, Rafet Sifa. 2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA) – 4 citations

Tags: Efficiency, Few-Shot & Zero-Shot, ICMLA, Supervised, Unsupervised

Natural language processing methods have several applications in automated auditing, including document and passage classification, information retrieval, and question answering. However, training such models requires large amounts of annotated data, which are scarce in industrial settings. At the same time, techniques such as zero-shot and unsupervised learning allow models pre-trained on general-domain data to be applied to unseen domains. In this work, we study the efficiency of unsupervised text matching using Sentence-BERT, a transformer-based model, by applying it to the semantic similarity of financial passages. Experimental results show that the model is robust to documents from both in-domain and out-of-domain data.
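The zero-shot matching setup described in the abstract can be illustrated with the sentence-transformers library: passages are embedded with a pre-trained Sentence-BERT model and ranked by cosine similarity, with no fine-tuning on audit data. The checkpoint name, example passages, and ranking shown below are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of zero-shot passage matching with Sentence-BERT.
# The pretrained checkpoint and example passages are illustrative
# assumptions, not the setup used in the paper.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # general-domain SBERT checkpoint

# Query passage from an audit checklist and candidate passages from a report.
query = "The company discloses its revenue recognition policy."
candidates = [
    "Revenues are recognized when control of the goods passes to the customer.",
    "The board of directors met four times during the fiscal year.",
]

# Encode once, then rank candidates by cosine similarity (no fine-tuning involved).
query_emb = model.encode(query, convert_to_tensor=True)
cand_embs = model.encode(candidates, convert_to_tensor=True)
scores = util.cos_sim(query_emb, cand_embs)[0]

for passage, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {passage}")
```

Because both query and candidates are embedded with a model trained on general-domain data, the same procedure transfers to unseen financial documents without task-specific labels.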

Similar Work