Paper 2024/1429

Powerformer: Efficient and High-Accuracy Privacy-Preserving Language Model with Homomorphic Encryption

Dongjin Park, Chung-Ang University
Eunsang Lee, Sejong University
Joon-Woo Lee, Chung-Ang University
Abstract

We propose Powerformer, an efficient homomorphic encryption (HE)-based privacy-preserving language model (PPLM) designed to reduce computational overhead while maintaining model performance. Powerformer incorporates three key techniques to optimize encrypted computation: 1) a novel distillation technique that replaces softmax and layer normalization (LN) with computationally efficient power and linear functions, enabling seamless encrypted computation without performance degradation; 2) a pseudo-sign composite approximation method that accurately approximates the GELU and tanh functions at minimal computational cost; and 3) a homomorphic matrix multiplication algorithm optimized specifically for Transformer models. By integrating these techniques, Powerformer built on the BERT-base model achieves a 45% reduction in computation time compared to the state-of-the-art HE-based PPLM without any loss in accuracy.
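
To make the first technique concrete, the sketch below simulates in plaintext NumPy the general shape of replacing softmax with a power function: attention scores are raised to an even power and normalized, so the only remaining non-polynomial operation is a single division, which CKKS-style schemes handle with iterative methods such as Goldschmidt division. The exponent p, the normalization details, and the linear LN stand-in are illustrative assumptions, not the paper's trained substitution.

    import numpy as np

    def softmax_weights(scores):
        # Standard softmax attention weights -- the function being replaced.
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def power_weights(scores, p=4):
        # Hypothetical power substitute: an even power keeps all weights
        # non-negative, and the row sum used for normalization is the one
        # division left for the HE scheme to approximate iteratively.
        w = scores ** p
        return w / w.sum(axis=-1, keepdims=True)

    def linear_norm(x, gamma, beta):
        # Illustrative linear replacement for layer normalization: a fixed
        # affine map whose parameters would be learned during distillation.
        return gamma * x + beta

In an actual PPLM both substitutes would be distilled against the original model so that downstream accuracy is preserved, as the abstract claims.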
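
For the second technique, the abstract names a pseudo-sign composite approximation but does not give its polynomials. The sketch below uses the classic degree-3 sign iteration y <- y*(3 - y^2)/2, which converges to sign(x) on |x| < sqrt(3) and is a standard building block for composite approximations under HE; wiring it into GELU via 0.5*x*(1 + s(x)), and the input scale constant, are assumptions made for illustration only.

    def sign_composite(x, iters=3):
        # Composite sign approximation: each step is a degree-3 polynomial,
        # so the whole composition is directly evaluable on CKKS ciphertexts.
        # Converges to sign(x) for |x| < sqrt(3); inputs are assumed to be
        # normalized into that range beforehand.
        y = x
        for _ in range(iters):
            y = y * (3 - y * y) / 2
        return y

    def gelu_sketch(x, scale=1.6):
        # Illustrative "sign-style" GELU: a smooth step built from the
        # composite sign polynomial stands in for erf(x / sqrt(2)).
        # The scale constant is a made-up knob, not a value from the paper.
        return 0.5 * x * (1 + sign_composite(x / scale))

With few iterations the composition gives a smooth step (GELU-like behavior near zero); as the iteration count grows it sharpens toward 0.5*x*(1 + sign(x)) = ReLU(x), so depth trades multiplicative levels for approximation accuracy.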
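
The third technique, the Transformer-specific homomorphic matrix multiplication, is not spelled out in the abstract, so the sketch below shows only the standard diagonal (Halevi-Shoup) matrix-vector method that such algorithms typically refine: the product is a sum of elementwise products between generalized diagonals of the matrix and rotated copies of the vector, with np.roll standing in for the HE slot rotation.

    import numpy as np

    def diag_matvec(A, v):
        # Plaintext simulation of the Halevi-Shoup diagonal method:
        # A @ v = sum_i diag_i(A) * rot(v, i), where rot is the slot
        # rotation that schemes such as CKKS provide natively.
        n = A.shape[0]
        out = np.zeros(n)
        for i in range(n):
            # i-th generalized diagonal: d[j] = A[j, (j + i) mod n]
            d = np.array([A[j, (j + i) % n] for j in range(n)])
            out += d * np.roll(v, -i)
        return out

Each rotation and elementwise product maps one-to-one onto native CKKS operations, which is why minimizing the number of distinct rotations is the usual optimization target for Transformer-shaped matrix products.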

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint.
Keywords
Privacy-Preserving Machine Learning, Homomorphic Encryption, Transformer, Implementation
Contact author(s)
thrudgelmir@cau.ac.kr
eslee3209@sejong.ac.kr
jwlee2815@cau.ac.kr
History
2025-04-13: revised
2024-09-12: received
Short URL
https://ia.cr/2024/1429
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2024/1429,
      author = {Dongjin Park and Eunsang Lee and Joon-Woo Lee},
      title = {Powerformer: Efficient and High-Accuracy Privacy-Preserving Language Model with Homomorphic Encryption},
      howpublished = {Cryptology {ePrint} Archive, Paper 2024/1429},
      year = {2024},
      url = {https://eprint.iacr.org/2024/1429}
}