Reduce and explain BERT: new green AI technique improves the efficiency of language models

A team of researchers at the University of Granada has developed an innovative methodology for compressing BERT-based language models. The approach, called Persistent BERT Compression and Explainability (PBCE), uses persistent homology to identify and prune redundant neurons, shrinking BERT Base by up to 47% and BERT Large by 42% without significantly affecting accuracy on natural language processing (NLP) tasks.
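The article does not detail the PBCE pipeline, but the core idea of using persistent homology to spot redundant neurons can be illustrated with a minimal sketch. Zero-dimensional persistent homology of a point cloud is equivalent to single-linkage clustering: if two neurons' activation vectors (their outputs over a batch of sample inputs) merge into one connected component at a very small distance scale, they behave nearly identically and one of them is a pruning candidate. Everything below is a hypothetical illustration, not the authors' implementation: the function names, the Euclidean distance choice, and the `eps` threshold are all assumptions.

```python
import numpy as np

def zero_dim_persistence_merges(points):
    """Compute 0-dimensional persistence merge events for a point cloud.

    Equivalent to single-linkage clustering: process pairwise distances in
    increasing order with a union-find structure; each time two components
    merge, record (distance, i, j). The distance is the "death" scale of
    the component that disappears.
    """
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    parent = list(range(n))

    def find(a):  # union-find root with path compression
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    edges = sorted((dist[i, j], i, j)
                   for i in range(n) for j in range(i + 1, n))
    merges = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # a component dies at scale w
            parent[rj] = ri
            merges.append((w, i, j))
    return merges

def redundant_neurons(activations, eps):
    """Flag neurons whose activation vectors merge below the scale eps.

    activations: array of shape (num_neurons, num_samples), one row per
    neuron. Neurons that die very early in the 0-dim persistence filtration
    are near-duplicates of a surviving neuron, so we mark them for pruning.
    """
    drop = set()
    for w, _, j in zero_dim_persistence_merges(activations):
        if w < eps:
            drop.add(j)
    return sorted(drop)
```

For example, given four "neurons" where the second is almost a copy of the first, `redundant_neurons` flags only that near-duplicate. A real compression method would operate per layer, re-evaluate accuracy after pruning, and likely use a more robust filtration, but the redundancy signal is the same: short-lived components in the persistence diagram.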