Paper List of Sharpness-Aware Minimization

A curated paper list on sharpness-aware minimization (SAM), starting from the original SAM paper:

  • Sharpness-aware minimization for efficiently improving generalization. ICLR 2021

The papers are categorized into four groups: (1) improving the efficiency of SAM, (2) improving the effectiveness of SAM, (3) theoretical analysis of SAM, and (4) applications of SAM.
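For context, SAM (from the ICLR 2021 paper above) seeks parameters whose entire neighborhood has low loss by solving min_w max_{||ε||₂ ≤ ρ} L(w + ε), approximated in practice by an ascent step to w + ε followed by a descent step using the gradient at the perturbed point. The sketch below is a minimal, hypothetical PyTorch-style implementation of that two-step update; the sam_step helper, the rho default, and the overall structure are illustrative assumptions, not the reference code of any listed paper.

```python
# Minimal sketch of a single SAM update step (assumption: PyTorch,
# a generic model and loss; not an official implementation).
import torch

def sam_step(model, loss_fn, x, y, base_optimizer, rho=0.05):
    # 1) Compute gradients at the current weights w.
    loss = loss_fn(model(x), y)
    loss.backward()

    # 2) Ascent step: e = rho * g / ||g||_2, move the weights to w + e.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                eps.append(None)
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)              # perturb to w + e
            eps.append(e)
    model.zero_grad()

    # 3) Compute gradients at the perturbed point w + e.
    loss_perturbed = loss_fn(model(x), y)
    loss_perturbed.backward()

    # 4) Undo the perturbation, then let the base optimizer update w
    #    using the gradients taken at w + e.
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            if e is not None:
                p.sub_(e)          # back to w
    base_optimizer.step()
    base_optimizer.zero_grad()
    return loss.item()
```

Each call runs two forward-backward passes (one at w, one at w + ε), roughly doubling the per-step cost; reducing that overhead is the focus of the Efficiency papers listed below.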

Efficiency

  • An adaptive policy to employ sharpness-aware minimization. ICLR 2023
  • Towards efficient and scalable sharpness-aware minimization. CVPR 2022
  • Randomized sharpness-aware training for boosting computational efficiency in deep learning. arXiv 2022
  • Efficient sharpness-aware minimization for improved training of neural networks. ICLR 2022

Effectiveness

  • Make sharpness-aware minimization stronger: A sparsified perturbation approach. NeurIPS 2022
  • Random sharpness-aware minimization. NeurIPS 2022
  • Penalizing gradient norm for efficiently improving generalization in deep learning. ICML 2022
  • Fisher SAM: Information geometry and sharpness aware minimisation. ICML 2022
  • Surrogate gap minimization improves sharpness-aware training. ICLR 2022
  • ASAM: Adaptive sharpness-aware minimization for scale-invariant learning of deep neural networks. ICML 2021

Theoretical Analysis

  • How does sharpness-aware minimization minimize sharpness? ICLR 2023
  • Towards understanding sharpness-aware minimization. ICML 2022

Applications

  • Sharpness-aware gradient matching for domain generalization. CVPR 2023
  • Generalized federated learning via sharpness aware minimization. ICML 2022
  • Sharp-MAML: Sharpness-aware model-agnostic meta learning. ICML 2022
  • Sharpness-aware minimization with dynamic reweighting. EMNLP 2022
  • Sharpness-aware minimization improves language model generalization. ACL 2022
