Reducing and explaining BERT: a new green AI technique improves the efficiency of language models

31 March, 2025

  • Researchers at the University of Granada present a key advance in green AI, reducing the computational footprint of BERT without sacrificing performance.
  • The new PBCE method reduces the size of BERT by more than 40%, making artificial intelligence more accessible and sustainable.
  • This innovative approach improves the explainability of deep neural networks.

A team of researchers at the University of Granada has developed an innovative methodology for the compression of BERT-based language models. The approach, called Persistent BERT Compression and Explainability (PBCE), uses persistent homology to identify and eliminate redundant neurons, achieving a reduction in model size of up to 47% in BERT Base and 42% in BERT Large without significantly affecting the accuracy of natural language processing (NLP) tasks.
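The core idea, identifying redundant neurons whose activation patterns merge early in a persistence filtration and keeping one representative per group, can be illustrated with a toy sketch. This is not the authors' PBCE implementation; the function name, the correlation-distance choice, and the merge threshold are all illustrative assumptions.

```python
import numpy as np

def prune_redundant_neurons(activations, threshold=0.3):
    """Toy sketch of persistence-based neuron pruning (NOT the PBCE code).

    activations: (n_samples, n_neurons) matrix of neuron outputs over a
    probe dataset. Neurons whose activation vectors merge early in a
    0-dimensional persistence filtration (single linkage on correlation
    distance) are treated as redundant; one representative per surviving
    component is kept.
    """
    n = activations.shape[1]
    # Correlation distance between neuron activation vectors:
    # near-duplicate neurons have distance close to 0.
    corr = np.corrcoef(activations.T)
    dist = 1.0 - np.abs(corr)

    # Union-find over edges sorted by distance. Processing edges in
    # increasing order reproduces the 0-dimensional persistence pairs
    # (component-merge events) of the filtration.
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    edges = sorted(
        (dist[i, j], i, j) for i in range(n) for j in range(i + 1, n)
    )
    for d, i, j in edges:
        if d > threshold:
            break  # components merging later than this are kept distinct
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[max(ri, rj)] = min(ri, rj)

    # Keep one representative neuron (the component root) per group.
    return sorted({find(i) for i in range(n)})
```

For example, if neuron 1 is a near-copy of neuron 0 while neuron 2 is independent, the sketch keeps only neurons 0 and 2. The real method additionally has to rewire the pruned layers so the compressed model still computes valid BERT outputs.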

Large language models (LLMs) such as BERT have revolutionised the field of artificial intelligence, but their high resource consumption limits their use on lower-capacity devices. Moreover, they are essentially black-box models, which are difficult to explain and interpret. With PBCE, researchers have taken a step towards green AI, an approach that aims to improve the efficiency of models without increasing their carbon footprint.

‘This method not only reduces the size of the model, but also improves the interpretability of the behaviour of the neurons, bringing us closer to more explainable neural networks,’ explains Luis Balderas, one of the authors of the study.

The research, published in the journal Applied Sciences, shows that PBCE manages to maintain high performance on the GLUE Benchmark, a standard set of tests for evaluating language models. In fact, on several key tasks, the compressed model even outperforms the original version of BERT.

This breakthrough opens up new possibilities for implementing AI models on mobile devices and resource-constrained systems, making artificial intelligence more accessible and efficient.

The Andalusian Inter-University Institute in Data Science and Computational Intelligence, known as the DaSCI Institute, is a collaborative entity between the universities of Granada, Jaén and Córdoba. It is dedicated to advanced research and training in the field of Artificial Intelligence, with a particular focus on Data Science and Computational Intelligence. The institute brings together an outstanding group of researchers working on joint projects, promoting the development and application of innovative technologies in various sectors. With the aim of becoming a benchmark in its field, DaSCI promotes the transfer of scientific knowledge to the socio-economic environment, thus contributing to technological progress and the digitisation of industry. https://dasci.es/es/

For more information, see the full study at: DOI 10.3390/app15010390
