
GLassoformer: A Query-Sparse Transformer for Post-Fault Power Grid Voltage Prediction

IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2022
Abstract

We propose GLassoformer, a novel and efficient transformer architecture that leverages group Lasso regularization to reduce the number of queries in the standard self-attention mechanism. Thanks to the sparsified queries, GLassoformer is more computationally efficient than standard transformers. On the power grid post-fault voltage prediction task, GLassoformer achieves remarkably better accuracy and stability than many existing benchmark algorithms.
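The paper's exact formulation is not reproduced here, but the core idea — a group Lasso penalty that zeroes out whole query rows, after which attention is computed only over the surviving queries — can be sketched as follows. This is a minimal NumPy illustration; the function names, the threshold `tol`, and the penalty weight `lam` are all hypothetical, not the authors' implementation.

```python
import numpy as np

def group_lasso_penalty(Q, lam=0.1):
    # Group Lasso over query rows: sum of per-row L2 norms.
    # Each row is a group, so the penalty pushes entire rows to zero.
    return lam * np.linalg.norm(Q, axis=1).sum()

def sparse_query_attention(Q, K, V, tol=1e-6):
    # Keep only queries whose row norm exceeds tol (hypothetical cutoff),
    # and run scaled dot-product attention on that reduced set.
    keep = np.linalg.norm(Q, axis=1) > tol
    Qs = Q[keep]                                  # (m, d) with m <= n queries
    scores = Qs @ K.T / np.sqrt(Q.shape[1])       # (m, n_keys)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    out = np.zeros((Q.shape[0], V.shape[1]))
    out[keep] = weights @ V                        # dropped queries stay zero
    return out
```

With m of n query rows surviving, the attention cost drops from O(n^2) to O(mn), which is the source of the claimed efficiency gain.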
