Hi, I am Ezgi. I am a PhD candidate in Electrical and Computer Engineering at New York University, where I am advised by Elza Erkip. I hold an integrated MEng degree in Electrical and Electronic Engineering from Imperial College London. Recent collaborators include Jona Ballé, Aaron B. Wagner, and Deniz Gündüz.
I am a collaborative researcher and enjoy working with people from diverse backgrounds. My current research is driven by a passion for connecting theory and practice in data compression and communication problems, particularly in distributed scenarios. I leverage tools from machine learning, signal processing, compression, and information theory to obtain interpretable results. When I am not busy with research, I enjoy hiking and expanding my food and coffee ☕ palate.
I am always happy to chat and am open to giving invited talks on topics at the intersection of information theory, "deep"/machine learning, and data compression – feel free to drop me an email at ezgi(dot)ozyilkan(at)nyu(dot)edu!
Useful links: Google Scholar | LinkedIn | arXiv | GitHub
March 2025: Honored to be selected as an iREDEFINE 2025 Fellow by the Electrical and Computer Engineering Department Heads Association (ECEDHA)! Here is the poster I presented.
February 2025: Gave an invited talk at the High-Beams Seminar, where I discussed my recent work on learning-based distributed (data) compression. Many thanks to Kaan Aksit for the kind invitation!
February 2025: Excited to be joining Apple as an ML/CV Research Intern this summer in Cupertino!
January 2025: Our workshop proposal titled "Learn to Compress and Compress to Learn" for the 2025 IEEE International Symposium on Information Theory (ISIT) has been accepted! Check out our workshop website here.
December 2024: Honored to be selected as a recipient of the 2024 IEEE Signal Processing Society (SPS) Scholarship!
November 2024: In our recent NeurIPS'24 workshop paper, we discuss a few overarching failure modes of a popular class of neural compressors – in particular, their difficulty in learning discontinuous functions.
October 2024: Our recent preprint titled Learning-Based Compress-and-Forward Schemes for the Relay Channel was accepted to the IEEE Journal on Selected Areas in Communications (JSAC) and will appear, as part of this special issue, in 2025!
September 2024: I presented our work titled Neural Compress-and-Forward for the Relay Channel at SPAWC 2024, in the beautiful Italian city of Lucca! Here is the poster.
July 2024: Our workshop proposal (compression + machine learning) for NeurIPS 2024 has been accepted!
July 2024: Our recent work titled Neural Compress-and-Forward for the Relay Channel was accepted to the IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC) 2024!
April 2024: Our recent work titled "Neural Distributed Compressor Discovers Binning" was accepted to the IEEE Journal on Selected Areas in Information Theory (JSAIT), as part of the special issue dedicated to Toby Berger.
January 2024: The full program for our 'Learn to Compress' workshop @ ISIT'24 (including keynote speakers and call for papers) is out.
July 2023: I presented our work titled "Neural Distributed Compressor Does Binning" at Neural Compression Workshop @ ICML'23. Here are the slides.
July 2023: I was selected as the best reviewer for the Neural Compression Workshop @ ICML'23.
July 2023: Our recent ISIT'23 work was accepted for an oral presentation at the Neural Compression Workshop @ ICML'23.