Hi, I am Ezgi (she/they). I am a third-year PhD candidate in Electrical Engineering at New York University, advised by Professor Elza Erkip as part of the CommIT Group at NYU. I received my (integrated) MEng degree in Electrical Engineering from Imperial College London in 2021.
My current interests are in (unconventional) learned compression models, quantization, and representation learning. My background is in (information-theoretic) deep learning and statistical signal processing. Outside of academia, I enjoy hiking and expanding my food and coffee palate.
I am always happy to chat about topics at the intersection of information theory and deep learning – feel free to drop me an email at ezgi(dot)ozyilkan(at)nyu(dot)edu!
Useful links: Google Scholar | LinkedIn | arXiv | GitHub
May 2024: In our recent preprints ([1] and [2]), we propose neural "compress-and-forward" (CF) schemes for the relay channel that build on my previous work on neural distributed compression. Our proposed neural CF scheme operates close to the maximum achievable rate over a "primitive relay channel" and also yields interpretable results :)
April 2024: Our recent work titled "Neural Distributed Compressor Discovers Binning" got accepted to IEEE Journal on Selected Areas in Information Theory (JSAIT), part of the special issue on Toby Berger.
April 2024: Our recent preprint on robust distributed lossy compression was accepted to the 2024 IEEE International Symposium on Information Theory Workshops (ISIT'24 Wkshps)!
March 2024: In our recent preprint, we extend our neural distributed lossy compression framework to more robust/general compression settings – for example, where side information may be absent. We demonstrate that our learned compressors mimic the theoretical optimum and yield interpretable results :)
February 2024: Our recent survey titled "Distributed Compression in the Era of Machine Learning: A Review of Recent Advances" will appear at the Conference on Information Sciences and Systems (CISS'24) as an invited paper! Preprint is available here.
January 2024: The full program for our 'Learn to Compress' workshop @ ISIT'24 (including keynote speakers and call for papers) is out.
December 2023: "Distributed Deep Joint Source-Channel Coding with Decoder-Only Side Information" was accepted to the inaugural 2024 IEEE International Conference on Machine Learning for Communication and Networking (ICMLCN)! Preprint is available here.
November 2023: Our proposal "Learn to Compress" has been accepted as a workshop at ISIT 2024. The proposal was put forward by Aaron Wagner (Cornell University), Elza Erkip (NYU) and myself. We will release more details about this workshop in December – but meanwhile, feel free to check out our workshop website!
October 2023: A draft of the journal version of our ISIT 2023 paper is available on arXiv! We demonstrate that the neural distributed compressor mimics the theoretical optimum for additional example sources :)
July 2023: I presented our work titled "Neural Distributed Compressor Does Binning" at Neural Compression Workshop @ ICML'23. Here are the slides.
July 2023: I was selected as the best reviewer for the Neural Compression Workshop @ ICML'23.
July 2023: Our recent ISIT'23 work was accepted as an oral presentation to Neural Compression Workshop @ ICML'23.
June 2023: I presented our work titled "Learned Wyner–Ziv Compressors Recover Binning" at International Symposium on Information Theory (ISIT) 2023. Here are the slides!
June 2023: I presented a poster about our upcoming ISIT'23 paper at North American School of Information Theory (NASIT) 2023.
May 2023: I presented a poster titled Neural Distributed Compressor Does Binning at UC Berkeley Simons Institute's workshop on Information-Theoretic Methods for Trustworthy Machine Learning.
April 2023: "Learned Wyner–Ziv Compressors Recover Binning" was accepted to International Symposium on Information Theory (ISIT) 2023. Preprint is available here!
December 2022: "Learned Disentangled Latent Representations for Scalable Image Coding for Humans and Machines" was accepted to Data Compression Conference (DCC) 2023.
August 2022: I presented a poster titled Neural Distributed Source Coding at North American School of Information Theory (NASIT) 2022.
June 2022: I am interning at InterDigital's Emerging Technologies Lab in Los Altos, CA.