Analysing the Adversarial Landscape of Binary Stochastic Networks

Yi Xiang Marcus Tan, Yuval Elovici, Alexander Binder

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


We investigate the robustness of stochastic artificial neural networks (SANNs) to adversarial attacks, performing experiments on three well-known datasets. Our experiments reveal that SANNs are similarly susceptible to conventional ANNs when confronted with simple iterative gradient-based attacks in the white-box setting. We observe, however, that in black-box settings SANNs are more robust than conventional ANNs against boundary and surrogate attacks. Consequently, we propose improved attacks against SANNs. First, we show that using stochastic networks as surrogates outperforms deterministic ones when performing surrogate-based black-box attacks. To further boost adversarial success rates, we then propose the novel Variance Mimicking (VM) surrogate training and validate its improved performance.
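The abstract does not spell out the Variance Mimicking objective, so the following is only a minimal, hypothetical sketch of the general idea it suggests: a stochastic target returns different outputs on repeated queries, and a surrogate can be fit to match both the mean and the variance of those outputs. The `query_stochastic_target` stand-in, the moment-matching `vm_loss`, and the weighting `lam` are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def query_stochastic_target(x, n_queries=64):
    # Stand-in for a stochastic ANN: each query returns slightly different
    # class probabilities because of internal sampling (hypothetical model).
    logits = x + rng.normal(0.0, 0.1, size=(n_queries, x.shape[-1]))
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)  # shape: (n_queries, n_classes)

def vm_loss(surrogate_mean, surrogate_var, target_probs, lam=1.0):
    # Illustrative variance-mimicking style loss: penalise the surrogate's
    # mismatch with the first and second moments of the target's outputs.
    mu = target_probs.mean(axis=0)
    var = target_probs.var(axis=0)
    return ((surrogate_mean - mu) ** 2).sum() + lam * ((surrogate_var - var) ** 2).sum()

x = np.array([1.0, -0.5, 0.2])
probs = query_stochastic_target(x)
loss = vm_loss(probs.mean(axis=0), probs.var(axis=0), probs)
print(loss)  # a surrogate that exactly matches both moments incurs zero loss
```

A surrogate trained this way would minimise `vm_loss` over many inputs; intuitively, reproducing the target's output variance (not just its mean prediction) should yield adversarial examples that transfer better to the stochastic target.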

Original language: American English
Title of host publication: Information Science and Applications - Proceedings of ICISA 2020
Editors: Hyuncheol Kim, Kuinam J. Kim, Suhyun Park
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 13
ISBN (Print): 9789813363847
State: Published - 1 Jan 2021
Event: iCatse International Conference on Information Science and Applications, ICISA 2020 - Busan, Korea, Republic of
Duration: 16 Dec 2020 - 18 Dec 2020

Publication series

Name: Lecture Notes in Electrical Engineering
Volume: 739 LNEE


Conference: iCatse International Conference on Information Science and Applications, ICISA 2020
Country/Territory: Korea, Republic of


Keywords

  • Adversarial machine learning
  • Binary neural network
  • Black-box attack
  • Stochastic neural network

All Science Journal Classification (ASJC) codes

  • Industrial and Manufacturing Engineering

