Award Abstract # 2023166
TRIPODS: Institute for Foundations of Data Science

NSF Org: DMS
Division Of Mathematical Sciences
Recipient: UNIVERSITY OF WASHINGTON
Initial Amendment Date: August 31, 2020
Latest Amendment Date: August 28, 2023
Award Number: 2023166
Award Instrument: Continuing Grant
Program Manager: Christopher Stark
  cstark@nsf.gov
  (703) 292-4869
  DMS (Division of Mathematical Sciences)
  MPS (Directorate for Mathematical and Physical Sciences)
Start Date: September 1, 2020
End Date: August 31, 2025 (Estimated)
Total Intended Award Amount: $4,852,999.00
Total Awarded Amount to Date: $3,920,286.00
Funds Obligated to Date:
  FY 2020 = $1,125,915.00
  FY 2021 = $1,058,612.00
  FY 2022 = $825,788.00
  FY 2023 = $909,971.00
History of Investigator:
  • Maryam Fazel (Principal Investigator)
    mfazel@uw.edu
  • Zaid Harchaoui (Co-Principal Investigator)
  • Dmitriy Drusvyatskiy (Co-Principal Investigator)
  • Yin Tat Lee (Co-Principal Investigator)
  • Kevin Jamieson (Co-Principal Investigator)
Recipient Sponsored Research Office: University of Washington
4333 BROOKLYN AVE NE
SEATTLE
WA  US  98195-1016
(206)543-4043
Sponsor Congressional District: 07
Primary Place of Performance: University of Washington
4333 Brooklyn Ave. NE
Seattle
WA  US  98195-2500
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): HD1WMN6945W6
Parent UEI:
NSF Program(s): TRIPODS Transdisciplinary Research in Principles of Data Science,
Algorithmic Foundations
Primary Program Source:
  01002021DB NSF RESEARCH & RELATED ACTIVITIES
  01002122DB NSF RESEARCH & RELATED ACTIVITIES
  01002223DB NSF RESEARCH & RELATED ACTIVITIES
  01002324DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 048Z, 075Z, 079Z
Program Element Code(s): 041Y00, 779600
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.049, 47.070

ABSTRACT

Data science is making an enormous impact on science and society, but its success is uncovering pressing new challenges that stand in the way of further progress. Outcomes and decisions arising from many machine learning processes are not robust to errors and corruption in the data; data science algorithms can yield biased and unfair outcomes, even as concerns about data privacy continue to mount; and machine learning systems suited to dynamic, interactive environments are less well developed than corresponding tools for static problems. Only by an appeal to the foundations of data science can we understand and address challenges such as these. Building on the work of three TRIPODS Phase I institutes, the new Institute for Foundations of Data Science (IFDS) brings together researchers from the Universities of Washington, Wisconsin-Madison, California-Santa Cruz, and Chicago, organized around the goal of tackling these critical issues. Members of IFDS have complementary strengths in the TRIPODS disciplines of mathematics, statistics, and theoretical computer science, and a proven record of collaborating to push theoretical boundaries by synthesizing knowledge and experience from diverse areas. Students and postdoctoral members of IFDS will be trained to be fluent in the languages of several disciplines, and able to bridge these communities and perform transdisciplinary research in the foundations of data science. In concert with its research agenda, IFDS will engage the data science community through workshops, summer schools, and hackathons. Its diverse leadership, committed to equity and inclusion, has proposed extensive plans for outreach to traditionally underrepresented groups. Governance, management, and evaluation of the institute will build on the successful and efficient models developed during Phase I.

To address critical issues at the cutting edge of data science research, IFDS will organize its research around four core themes. The complexity theme will synthesize notions of complexity from multiple disciplines to make breakthroughs in the analysis of optimization and sampling methods, develop tools for assessing the complexity of data models, and seek new methods with better complexity properties, making complexity a more powerful tool for understanding and inventing algorithms in data science. The robustness theme considers data that contains errors or outliers, possibly introduced by an adversary, and will design methods for data analysis and prediction that are robust in the face of these errors. The theme on closed-loop data science tackles the problem of acquiring data in ways that reveal its information content efficiently, using strategic and sequential policies that leverage information already gathered from past data. The theme on ethics and algorithms addresses issues of fairness and bias in machine learning, data privacy, and causality and interpretability. The four themes intersect in many ways, and most IFDS researchers will work in two or more of them. By making concerted progress on these fundamental fronts, IFDS will lower several of the barriers to better understanding of data science methodology and to its improved effectiveness and wider relevance to application areas. Additionally, IFDS will organize and host activities that engage the data science community at all levels of seniority. Annual workshops will focus on the critical issues identified above and others that are sure to arise over the next five years. Comprehensive plans for outreach and education will draw on previous experience of the Phase I institutes and leverage institutional resources at the four sites. Collaborations with domain science researchers in academia, national laboratories, and industry, so important in illuminating issues in the fundamentals of data science, will continue through the many channels available to IFDS members, including those established in the TRIPODS+X program. Relationships with other institutes at each IFDS site will further extend the impact of IFDS on domain sciences and applications.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without a charge during the embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

(Showing: 1 - 10 of 41)
  • Sadeghi, Omid and Fazel, Maryam "Fast First-Order Methods for Monotone Strongly DR-Submodular Maximization" Proceedings of the SIAM Conference on Applied and Computational Discrete Algorithms, 2023
  • Jia, He and Laddha, Aditi and Lee, Yin Tat and Vempala, Santosh "Reducing isotropy and volume to KLS: an O(n^3 ψ^2) volume algorithm" 53rd Annual ACM SIGACT Symposium on Theory of Computing, 2021 https://doi.org/10.1145/3406325.3451018
  • Dong, Sally and Lee, Yin Tat and Ye, Guanghao "A nearly-linear time algorithm for linear programs with small treewidth: a multiscale representation of robust central path" 53rd Annual ACM SIGACT Symposium on Theory of Computing, 2021 https://doi.org/10.1145/3406325.3451056
  • Katz-Samuels, Julian and Zhang, Jifan and Jain, Lalit and Jamieson, Kevin "Improved Algorithms for Agnostic Pool-based Active Classification" Proceedings of Machine Learning Research, v.139, 2021
  • Cutler, Joshua and Drusvyatskiy, Dmitriy and Harchaoui, Zaid "Stochastic Optimization under Distributional Drift" Journal of Machine Learning Research, v.24, 2023
  • Lee, Yin Tat and Shen, Ruoqi and Tian, Kevin "Structured Logconcave Sampling with a Restricted Gaussian Oracle" Proceedings of Machine Learning Research, 2021
  • Maiti, Arnab and Jamieson, Kevin and Ratliff, Lillian J. "Instance-dependent Sample Complexity Bounds for Zero-sum Matrix Games" Proceedings of Machine Learning Research, v.206, 2023
  • Dong, Sally and Gao, Yu and Goranci, Gramoz and Lee, Yin Tat and Peng, Richard and Sachdeva, Sushant and Ye, Guanghao "Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time" Proceedings of the 2022 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2022
  • Lee, Yin Tat and Shen, Ruoqi and Tian, Kevin "Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions" Conference on Neural Information Processing Systems, 2021
  • Gopi, Sivakanth and Lee, Yin Tat and Wutschitz, Lukas "Numerical Composition of Differential Privacy" Conference on Neural Information Processing Systems, 2021
  • Sadeghi, Omid and Raut, Prasanna and Fazel, Maryam "A Single Recipe for Online Submodular Maximization with Adversarial or Stochastic Constraints" Advances in Neural Information Processing Systems, v.33, 2020
