Ying Wang

I am a first-year PhD student at the NYU Center for Data Science, advised by Prof. Andrew Wilson and Prof. Mengye Ren in the CILVR Lab. My research focuses on multimodal learning, aiming to build adaptive and trustworthy machine learning systems that integrate diverse modalities such as text, images, and videos.

Prior to starting my PhD, I earned an MS in Data Science at NYU, where I was honored to be featured in the alumni spotlight. I completed my undergraduate studies in Computer Science, Statistics, and Finance at McGill University in Montreal, Canada.

I’ve had the opportunity to contribute to research and engineering teams at Meta, Amazon, and Morgan Stanley. I have also served as a reviewer for CVPR (2024, 2025), AAAI (2025), WACV (2025), and ECCV (2024).

News

Sep 1, 2024 I started my PhD at NYU 💜💜💜
Jun 13, 2024 Our team won third place in the Ego4D EgoSchema challenge.
Jun 3, 2024 Started my summer internship as a research scientist at Meta (video recommendation) in Bellevue.
Oct 20, 2023 Received a Travel Award for NeurIPS 2023! See you in New Orleans ⚜️
Sep 21, 2023 Our work, Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution, has been accepted to NeurIPS 2023.

Selected Publications

2023

  1. LifelongMemory: Leveraging LLMs for Answering Queries in Long-form Egocentric Videos
    Ying Wang, Yanlai Yang, and Mengye Ren
    Preprint, 2023
  2. Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution
    In Advances in Neural Information Processing Systems (NeurIPS), 2023
  3. Adapting Grounded Visual Question Answering Models to Low Resource Languages
    In CVPR Multimodal Learning and Applications Workshop [Oral], Jun 2023