Ying Wang


I am a first-year PhD student at NYU Center for Data Science, advised by Prof. Andrew Wilson and Prof. Mengye Ren in the CILVR Lab. My primary research interest is multimodal learning, aiming to build systems that can emulate human-like perception by seamlessly integrating various modalities such as text, images, videos, and audio.

Prior to moving to New York, I studied Computer Science, Statistics, and Finance at McGill University in Montreal, Canada.

Academic Service: Reviewer for CVPR 2024, ECCV 2024, WACV 2025, and AAAI 2025.

News

Sep 1, 2024 I started my PhD at NYU 💜💜💜
Jun 13, 2024 Our team won third place in the Ego4D EgoSchema challenge.
Jun 3, 2024 Started my summer internship as a research scientist at Meta (video recommendation) in Bellevue.
Oct 20, 2023 Received Travel Award for NeurIPS 2023! See you in New Orleans ⚜️
Sep 21, 2023 Our work, Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution, was accepted to NeurIPS 2023.

Selected Publications

2023

  1. LifelongMemory: Leveraging LLMs for Answering Queries in Long-form Egocentric Videos
    Ying Wang, Yanlai Yang, and Mengye Ren
    Under Review, 2023
  2. Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution
    Advances in Neural Information Processing Systems (NeurIPS), 2023
  3. Adapting Grounded Visual Question Answering Models to Low Resource Languages
    In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Jun 2023