What is BERTScore – Bidirectional Encoder Representations from Transformers Score?
Lois Angelo Dar Juan · 2025-07-03

BERTScore (Bidirectional Encoder Representations from Transformers Score) Cheat Sheet

BERTScore is an evaluation metric that looks beyond surface-level word matching to assess the meaning of generated text. Rather than counting overlapping words like traditional metrics such as BLEU or ROUGE, BERTScore uses pre-trained transformer models (like BERT) to compare the semantic similarity between tokens in the generated output and a reference sentence, computing the cosine similarity between their contextual embeddings. First proposed by Zhang et al. (2020), BERTScore has quickly become a popular choice in natural language processing tasks where [...]
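The greedy-matching computation behind BERTScore can be sketched in a few lines. This is a minimal illustration, not the official `bert-score` package: the toy 2-D vectors below stand in for real contextual BERT embeddings, which would normally come from a transformer encoder. Each candidate token is matched to its most similar reference token (precision), each reference token to its most similar candidate token (recall), and the two are combined into an F1 score.

```python
import numpy as np

def bertscore_f1(cand: np.ndarray, ref: np.ndarray) -> float:
    """Greedy-matching F1 over cosine similarities of token embeddings.

    cand: (num_candidate_tokens, dim) embedding matrix
    ref:  (num_reference_tokens, dim) embedding matrix
    """
    # Normalize rows to unit length so dot products equal cosine similarity.
    cand = cand / np.linalg.norm(cand, axis=1, keepdims=True)
    ref = ref / np.linalg.norm(ref, axis=1, keepdims=True)
    sim = cand @ ref.T  # pairwise cosine similarity matrix

    precision = sim.max(axis=1).mean()  # best reference match per candidate token
    recall = sim.max(axis=0).mean()     # best candidate match per reference token
    return 2 * precision * recall / (precision + recall)

# Toy embeddings standing in for contextual BERT vectors (illustrative only).
ref = np.array([[1.0, 0.0], [0.0, 1.0]])
cand = np.array([[0.9, 0.1], [0.1, 0.9]])
print(round(bertscore_f1(cand, ref), 4))  # close to 1.0: near-identical meaning
```

In the real metric, the embeddings come from a BERT-family model, tokens may additionally be weighted by inverse document frequency, and scores are rescaled against a baseline; the greedy max-matching above is the core idea those refinements build on.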