[NeurIPS 2023] Efficient Solver for ERM with PLQ Loss and Linear Constraints
EnsLoss: Stochastic Calibrated Loss Ensembles for Preventing Overfitting in Classification
A statistical learning toolkit for high-dimensional Hawkes processes in Python
[NeurIPS 2023] Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence
Lecture notes taken in the Quantitative Foundations of Artificial Intelligence class in Fall 2023, taught by Prof. Dr. Ludger Overbeck at Justus Liebig University Giessen.
Perceptron learning algorithm implemented in Python
Reproduction code for paper "Liu and Tong (2026). Metric entropy-free sample complexity bounds for sample average approximation in convex stochastic programming"
Adaptive Boosting of Weak Learners implemented in Python
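Several of the repositories above (e.g. the perceptron and AdaBoost implementations) are classic instances of empirical risk minimization: pick the hypothesis that minimizes average loss on the training sample. As a minimal, self-contained sketch not drawn from any of the listed repos, the perceptron update below drives the empirical misclassification risk to zero on linearly separable data:

```python
import numpy as np

def perceptron(X, y, epochs=100, lr=1.0):
    """Perceptron: reduce empirical 0-1 risk by correcting each mistake.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified (or on boundary)
                w += lr * yi * xi       # rotate hyperplane toward xi
                b += lr * yi
                updated = True
        if not updated:                 # no mistakes left: empirical risk is 0
            break
    return w, b

# Toy linearly separable sample (hypothetical data, for illustration only)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)  # matches y once the loop converges
```

On separable data the loop terminates with zero training mistakes (the perceptron convergence theorem); on non-separable data it would cycle, which is why the `epochs` cap is needed.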