Categories |
OPTIMIZATION
MACHINE LEARNING
DEEP LEARNING
|
About |
This workshop will attempt to shed light on optimization beyond first-order methods in machine learning. Topics of interest include, but are not limited to, second-order methods, adaptive gradient-descent methods, regularization techniques, and techniques based on higher-order derivatives. The workshop will bring machine learning and optimization researchers closer together, in order to facilitate discussion of the open questions underlying these methods.
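To make the contrast with first-order methods concrete, here is a minimal sketch (not from the workshop; all names and step sizes are illustrative) comparing plain gradient descent with a Newton-style second-order update on a toy quadratic, where the second-order step uses the Hessian to reach the minimizer in one iteration:

```python
import numpy as np

# Toy quadratic f(x) = 0.5 * x^T A x - b^T x with positive-definite Hessian A.
# A, b, the step size, and the iteration count are illustrative assumptions.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    """Gradient of the quadratic: A x - b."""
    return A @ x - b

# First-order: gradient descent needs many small steps.
x_gd = np.zeros(2)
for _ in range(100):
    x_gd = x_gd - 0.1 * grad(x_gd)

# Second-order: a single Newton step x - H^{-1} grad solves a quadratic exactly.
x_newton = np.zeros(2) - np.linalg.solve(A, grad(np.zeros(2)))

x_star = np.linalg.solve(A, b)  # exact minimizer for comparison
print(np.allclose(x_newton, x_star))
```

On a quadratic the Newton step is exact, while gradient descent only converges geometrically; this gap is one motivation for the second-order and higher-order techniques the workshop covers.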
|
Call for Papers |
We welcome submissions to the workshop under the general theme of “Beyond First-Order Methods in ML systems”. Topics of interest include, but are not limited to, the areas listed above.
We encourage submissions that are theoretical, empirical, or both. Submissions should be up to 4 pages, excluding references, acknowledgements, and supplementary material, and should follow the ICML format. The CMT-based review process will be double-blind to avoid potential conflicts of interest; submit at (to be announced). Accepted submissions will be presented as posters. |
Credits and Sources |
[1] ICML 2020 : Beyond First Order Methods in Machine Learning |