TY - GEN
T1 - Polynomial Adaptation of Large-Scale CNNs for Homomorphic Encryption-Based Secure Inference
AU - Baruch, Moran
AU - Drucker, Nir
AU - Ezov, Gilad
AU - Goldberg, Yoav
AU - Kushnir, Eyal
AU - Lerner, Jenny
AU - Soceanu, Omri
AU - Zimerman, Itamar
N1 - Publisher Copyright: © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Enabling secure inference of large-scale CNNs using Homomorphic Encryption (HE) requires a preliminary step that adapts unencrypted pre-trained models to use only polynomial operations. Prior art advocates high-degree polynomials for accurate approximations, which come at the price of extensive computation. We demonstrate that low-degree polynomials can suffice for accurate approximation even in large-scale DNNs. To that end, we introduce a dedicated fine-tuning process on unencrypted data that reduces the input range to the activation functions. The resulting models achieve competitive accuracy, with at most 3.5% degradation relative to the original non-polynomial models, outperforming prior art on tasks such as ImageNet classification with ResNet and ConvNeXt. Upon adaptation, these models can process HE-encrypted samples and are ready for secure inference. Building on these models, we provide optimization insights for activation functions and skip connections that enhance HE evaluation efficiency. We evaluated ResNet-50 through ResNet-152 on encrypted ImageNet samples, an accomplishment not previously reached by polynomial networks, in just 3:13–7:12 min, using commodity hardware under the CKKS scheme with 128-bit security. Compared with prior high-degree polynomial solutions, our low-degree polynomials reduce evaluation latency, for example by 3× for ResNet-50 on CIFAR-10. We further demonstrate the versatility of our approach by adapting the CLIP model for secure zero-shot predictions, highlighting new potential in HE and transfer learning.
UR - http://www.scopus.com/inward/record.url?scp=85214208265&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-76934-4_1
DO - 10.1007/978-3-031-76934-4_1
M3 - Conference contribution
SN - 9783031769337
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 3
EP - 25
BT - Cyber Security, Cryptology, and Machine Learning - 8th International Symposium, CSCML 2024, Proceedings
A2 - Dolev, Shlomi
A2 - Elhadad, Michael
A2 - Kutyłowski, Mirosław
A2 - Persiano, Giuseppe
PB - Springer Science and Business Media Deutschland GmbH
T2 - 8th International Symposium on Cyber Security, Cryptology, and Machine Learning, CSCML 2024
Y2 - 19 December 2024 through 20 December 2024
ER -