Anagha Athavale

M.Tech.

Roles
  • PreDoc Researcher
Publications (created while at TU Wien)
    2024
    • Distillation based Robustness Verification with PAC Guarantees
      Indri, P., Blohm, P., Athavale, A., Bartocci, E., Weissenbacher, G., Maffei, M., Nickovic, D., Gärtner, T., & Malhotra, S. (2024). Distillation based Robustness Verification with PAC Guarantees. In Proceedings of the 41st International Conference on Machine Learning (ICML 2024), PMLR Volume 235, Vienna, Austria, 21-27 July 2024.
      Abstract
      We present a distillation-based approach to verify the robustness of any Neural Network (NN). Conventional formal verification methods cannot tractably assess the global robustness of real-world NNs. To address this, we take advantage of a gradient-aligned distillation framework to transfer the robustness properties from a larger teacher network to a smaller student network. Given that the student NN can be formally verified for global robustness, we theoretically investigate how this guarantee can be transferred to the teacher NN. We draw from ideas in learning theory to derive a sample complexity for the distillation procedure that enables PAC guarantees on the global robustness of the teacher network.
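      The gradient-aligned distillation step described in the abstract can be pictured with a short sketch. The PyTorch code below is an illustrative assumption, not the paper's implementation: the specific loss (a softened-output KL term plus an input-gradient matching penalty), the weighting `alpha`, and the `temperature` are placeholders chosen only to show the general idea of transferring local, robustness-relevant behavior from a teacher to a student.

      ```python
      # Hedged sketch of gradient-aligned distillation; all names and terms
      # here are illustrative assumptions, not taken from the paper.
      import torch
      import torch.nn.functional as F

      def distillation_loss(teacher, student, x, alpha=0.5, temperature=2.0):
          # Make the inputs a differentiation target for input gradients.
          x = x.clone().requires_grad_(True)

          t_logits = teacher(x)
          s_logits = student(x)

          # Teacher's input gradient: used only as a fixed target, so detach.
          t_grad = torch.autograd.grad(t_logits.sum(), x)[0].detach()

          # Student's input gradient: create_graph=True so the alignment
          # penalty is itself differentiable w.r.t. the student's parameters.
          s_grad = torch.autograd.grad(s_logits.sum(), x, create_graph=True)[0]

          # Standard knowledge-distillation term: KL divergence between
          # softened output distributions (teacher treated as a fixed target).
          kd = F.kl_div(
              F.log_softmax(s_logits / temperature, dim=-1),
              F.softmax(t_logits.detach() / temperature, dim=-1),
              reduction="batchmean",
          ) * temperature ** 2

          # Gradient-alignment term: matching input gradients pushes the
          # student to mimic the teacher's local behavior, which is what
          # makes a robustness certificate for the student informative
          # about the teacher.
          grad_align = F.mse_loss(s_grad, t_grad)

          return kd + alpha * grad_align
      ```

      A hypothetical training loop would call this loss per batch and update only the student's parameters; the teacher stays fixed so that its input gradients serve as stable targets, and the small student is then handed to a formal verifier for the global-robustness check.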