TY - GEN
T1 - CryptoRNN - Privacy-Preserving Recurrent Neural Networks Using Homomorphic Encryption
AU - Bakshi, Maya
AU - Last, Mark
N1 - Publisher Copyright: © 2020, Springer Nature Switzerland AG.
PY - 2020/1/1
Y1 - 2020/1/1
AB - Recurrent Neural Networks (RNNs) are used extensively for mining sequential datasets. However, performing inference over an RNN model requires the data owner to expose his or her raw data to the machine learning service provider. Homomorphic encryption allows calculations to be performed on ciphertexts, where the decrypted result is the same as if the calculation had been performed directly on the plaintext. In this research, we propose a privacy-preserving RNN-based inference system using homomorphic encryption. We preserve the functionality of the RNN so that, within the limitations of homomorphic encryption, it makes the same predictions on encrypted sequential data as the same RNN model makes on plaintext. To achieve this goal, we need to address two main issues: first, the noise growth between successive calculations, and second, the inability of homomorphic encryption to handle the most popular neural network activation functions (sigmoid, ReLU, and tanh). In this paper, we propose several methods to address both issues and discuss the trade-offs between them. We use several benchmark datasets to compare the encrypted and unencrypted versions of the same RNN in terms of accuracy, performance, and data traffic.
KW - Data privacy
KW - Encrypted machine learning
KW - Encrypted recurrent neural networks
KW - Homomorphic encryption
KW - Privacy preserving machine learning
KW - Privacy preserving recurrent neural networks
UR - http://www.scopus.com/inward/record.url?scp=85087744487&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-49785-9_16
DO - 10.1007/978-3-030-49785-9_16
M3 - Conference contribution
SN - 9783030497842
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 245
EP - 253
BT - Cyber Security Cryptography and Machine Learning - 4th International Symposium, CSCML 2020, Proceedings
A2 - Dolev, Shlomi
A2 - Weiss, Gera
A2 - Kolesnikov, Vladimir
A2 - Lodha, Sachin
PB - Springer
T2 - 4th International Symposium on Cyber Security Cryptography and Machine Learning, CSCML 2020
Y2 - 2 July 2020 through 3 July 2020
ER -