I am an IFML Postdoctoral Fellow at UT Austin, hosted by Adam Klivans and Raghu Meka. I received my PhD in Computer Science from the University of Wisconsin-Madison, where I was advised by Christos Tzamos. Before UW-Madison, I studied Electrical and Computer Engineering at the National Technical University of Athens, where I was advised by Dimitris Fotakis.
I design efficient algorithms with provable guarantees for machine learning problems, with a focus on imperfect data (e.g., classification with noisy labels and statistical inference from biased or censored data). I am also interested in analyzing and providing formal guarantees for popular machine learning algorithms (e.g., diffusion models).
I am on the 2024/25 job market. Here is my CV.
Aug 22, 2024: I am visiting the Simons Institute at Berkeley for the generalization and LLM programs.
Jul 3, 2024: Our work Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension received the Best Paper Award at COLT 2024!
May 1, 2024: New paper on learning mixtures of Gaussians using diffusion models!
Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension 
w/ G. Chandrasekaran, A. Klivans, R. Meka, K. Stavropoulos 
Best Paper Award
COLT 2024
IPAM 2024 Long Talk Video
Active Learning with Simple Questions 
w/ M. Ma, C. Tzamos 
COLT 2024
Agnostically Learning Multi-index Models with Queries 
 w/ I. Diakonikolas, D. Kane, C. Tzamos, N. Zarifis 
 FOCS 2024
Super Non-singular Decompositions of Polynomials and their 
Application to Robustly Learning Low-degree PTFs 
w/ I. Diakonikolas, D. Kane, S. Liu, N. Zarifis
STOC 2024
Efficient Discrepancy Testing for Learning with Distribution Shift 
w/ G. Chandrasekaran, A. Klivans, K. Stavropoulos, A. Vasilyan
NeurIPS 2024
Active Classification with Few Queries under Misspecification 
w/ C. Tzamos, M. Ma
NeurIPS 2024
Learning Noisy Halfspaces with a Margin: Massart is no Harder than Random 
w/ G. Chandrasekaran, K. Stavropoulos, K. Tian
NeurIPS 2024
Optimizing Solution-Samplers for Combinatorial Problems: 
The Landscape of Policy Gradient Methods 
w/ C. Caramanis, D. Fotakis, A. Kalavasis, C. Tzamos 
 Selected for Oral Presentation  
NeurIPS 2023
SLaM: Student-Label Mixing for Distillation with Unlabeled Examples 
w/ F. Iliopoulos, K. Trinh, C. Baykal, G. Menghani, E. Vee
NeurIPS 2023
The Gain from Ordering in Online Learning 
w/ M. Ma, C. Tzamos 
NeurIPS 2023
Efficient Testable Learning of Halfspaces with Adversarial Label Noise 
w/ I. Diakonikolas, D. Kane, S. Liu, N. Zarifis
NeurIPS 2023
Self-Directed Linear Classification
w/ I. Diakonikolas, C. Tzamos, N. Zarifis
COLT 2023
Weighted Distillation with Unlabeled Examples 
w/ F. Iliopoulos, C. Baykal, G. Menghani, K. Trinh, E. Vee
NeurIPS 2022
Linear Label Ranking with Bounded Noise 
w/ D. Fotakis, A. Kalavasis, C. Tzamos 
 Selected for Oral Presentation  
NeurIPS 2022
Learning General Halfspaces with Adversarial Label Noise via Online Gradient Descent 
w/ I. Diakonikolas, C. Tzamos, N. Zarifis
ICML 2022
Learning a Single Neuron with Adversarial Label Noise via Gradient Descent 
w/ I. Diakonikolas, C. Tzamos, N. Zarifis
COLT 2022
Learning General Halfspaces with General Massart Noise under the Gaussian Distribution 
w/ I. Diakonikolas, D. Kane, C. Tzamos, N. Zarifis
STOC 2022
A Statistical Taylor Theorem and Extrapolation of Truncated Densities 
 w/ C. Daskalakis, C. Tzamos, M. Zampetakis 
 COLT 2021
Agnostic Proper Learning of Halfspaces under Gaussian Marginals 
 w/ I. Diakonikolas, D. Kane, C. Tzamos, N. Zarifis 
 COLT 2021
Efficient Algorithms for Learning from Coarse Labels 
 w/ D. Fotakis, A. Kalavasis, C. Tzamos 
 COLT 2021
Learning Online Algorithms with Distributional Advice 
 w/ I. Diakonikolas, C. Tzamos, A. Vakilian, N. Zarifis 
 ICML 2021
A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise
 w/ I. Diakonikolas, D. Kane, C. Tzamos, N. Zarifis 
 STOC 2021
Learning Halfspaces with Tsybakov Noise 
 w/ I. Diakonikolas, C. Tzamos, N. Zarifis 
 STOC 2021 
 Conference version merged with the above paper
Non-Convex SGD Learns Halfspaces with Adversarial Label Noise 
 w/ I. Diakonikolas, C. Tzamos, N. Zarifis 
 NeurIPS 2020
Learning Halfspaces with Massart Noise Under Structured Distributions  
w/ I. Diakonikolas, C. Tzamos, N. Zarifis 
COLT 2020
Algorithms and SQ Lower Bounds for PAC Learning One-Hidden-Layer ReLU Networks 
w/ I. Diakonikolas, D. Kane, N. Zarifis 
COLT 2020
Efficient Truncated Statistics with Unknown Truncation 
w/ C. Tzamos, M. Zampetakis 
FOCS 2019
Removing Bias in Machine Learning via Truncated Statistics
w/ C. Daskalakis,  C. Tzamos, M. Zampetakis 
Manuscript
Opinion Dynamics with Limited Information 
w/ D. Fotakis, V. Kandiros,  S. Skoulakis 
WINE 2018
Learning Powers of Poisson Binomial Distributions 
w/ D. Fotakis, P. Krysta, P. Spirakis 
Manuscript
Program Committees: ITCS 2025
Reviewer: FOCS (2020, 2021, 2023), STOC (2020, 2024), COLT (2023), SODA (2019), NeurIPS (2021, 2023), WINE (2018), ICML (2020, 2021, 2023), EC (2020, 2022), MFCS (2018), TCS (2018, 2021), ALT (2021)
Optimizing Solution-Samplers for Combinatorial Problems, NeurIPS 2023 Oral
SLaM: Student-Label Mixing for Distillation with Unlabeled Examples, NeurIPS 2023
Learning General Halfspaces with General Massart Noise, STOC 2022
A Statistical Taylor Theorem and Extrapolation of Truncated Densities, COLT 2021
Agnostic Proper Learning of Halfspaces under Gaussian Marginals, COLT 2021
Efficient Algorithms for Learning Halfspaces with Tsybakov Noise, STOC 2021
Non-Convex SGD Learns Halfspaces with Adversarial Label Noise, NeurIPS 2020
Learning Halfspaces with Massart Noise Under Structured Distributions, COLT 2020
Efficient Truncated Statistics with Unknown Truncation, FOCS 2019, Video
Learning PBD Powers, ECCO Research Seminar 2017, University of Liverpool