From lfthomps at uw.edu Wed Oct 4 10:39:22 2023 From: lfthomps at uw.edu (Lowell F. Thompson) Date: Thu Mar 7 20:17:34 2024 Subject: [Amath-seminars] Reminder: Boeing Colloquium with Professor Michael Jordan from UC Berkeley Message-ID: Hi all, This is a reminder that our next Boeing Colloquium, presented by Professor Michael Jordan of the University of California, Berkeley (https://www2.eecs.berkeley.edu/Faculty/Homepages/jordan.html), will be held *tomorrow, Thursday, October 5, at 4:00 p.m. in SMI 205*. Title: An Alternative View on AI: Collaborative Learning, Incentives, and Social Welfare Abstract: Artificial intelligence (AI) has focused on a paradigm in which intelligence inheres in a single, autonomous agent. Social issues are entirely secondary in this paradigm. When AI systems are deployed in social contexts, however, the overall design of such systems is often naive: a centralized entity provides services to passive agents and reaps the rewards. Such a paradigm need not be the dominant paradigm for information technology. In a broader framing, agents are active, they are cooperative, and they wish to obtain value from their participation in learning-based systems. Agents may supply data and other resources to the system, but only if it is in their interest to do so. Critically, intelligence inheres as much in the overall system as it does in individual agents, be they humans or computers. This is a perspective familiar in the social sciences, and a key theme in my work is that of bringing economics into contact with foundational issues in computing and the data sciences. I'll emphasize some of the mathematical challenges that arise at this tripartite interface. Please don't hesitate to contact me if you have any questions. Thanks, Lowell
From arahman2 at uw.edu Mon Oct 30 21:53:04 2023 From: arahman2 at uw.edu (Amin Rahman) Date: Thu Mar 7 20:17:34 2024 Subject: [Amath-seminars] Boeing Colloquium Thursday, Nov. 2 Message-ID: Dear Amath and Friends, Our next Boeing Colloquium, presented by Professor Houman Owhadi of Caltech (https://www.cms.caltech.edu/people/owhadi), will be held next Thursday, Nov. 2, 2023, 4-5 p.m. in Smith Hall 205. Title: Computational Hypergraph Discovery, a framework for connecting the dots Abstract: Function approximation can be categorized into three levels of complexity. Type 1: Approximate an unknown function given (possibly noisy) input/output data. Type 2: Consider a collection of variables and functions indexed by the nodes and hyperedges of a hypergraph (a generalization of a graph in which edges can connect more than two vertices). Assume some of the functions to be unknown. Given multiple samples from subsets of the variables of the hypergraph (satisfying the functional dependencies imposed by its structure), approximate all the unobserved variables and unknown functions of the hypergraph. Type 3: Expanding on Type 2, if the hypergraph structure itself is unknown, use partial observations (of subsets of the variables of the hypergraph) to uncover its structure (the hyperedges and potentially missing vertices) and then approximate its unknown functions and unobserved variables. Numerical approximation, Supervised Learning, and Operator Learning can all be formulated as Type 1 problems (with functional input/output spaces). Type 2 problems include solving and learning (possibly stochastic) ordinary or partial differential equations, Deep Learning, dimension reduction, reduced-order modeling, system identification, closure modeling, etc. The scope of Type 3 problems extends well beyond Type 2 problems and includes applications involving model/equation/network discovery and reasoning with raw data.
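[Editor's aside, not part of the announcement: the Type 1 setting described above, approximating an unknown function from noisy input/output data, is classical GP regression. A minimal NumPy-only sketch, with the kernel, length scale, and noise level chosen purely for illustration:]

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of a zero-mean GP conditioned on noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Noisy samples of an unknown function (here sin) -- the Type 1 setting.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 20)
y = np.sin(x) + 0.05 * rng.standard_normal(20)
x_new = np.array([np.pi / 2])
print(gp_posterior_mean(x, y, x_new))  # close to sin(pi/2) = 1
```

With dense, lightly noised data the posterior mean tracks the underlying function closely; the Type 2 and Type 3 problems in the abstract generalize exactly this conditioning step from single functions to functions indexed by a hypergraph.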
While most problems in Computational Sciences and Engineering (CSE) and Scientific Machine Learning (SciML) can be framed as Type 1 and Type 2 challenges, many problems in science can only be categorized as Type 3 problems. Despite their prevalence, these Type 3 challenges have been largely overlooked due to their inherent complexity. Although Gaussian Process (GP) methods are sometimes perceived as a well-founded but old technology limited to curve fitting (Type 1 problems), a generalization of these methods holds the key to developing an interpretable framework for solving Type 2 and Type 3 problems that inherits the simple and transparent theoretical and computational guarantees of kernel/optimal recovery methods. This will be the topic of our presentation. Best regards, Amin -- Aminur (Amin) Rahman (He/him/his) Acting Instructor (postdoc) Department of Applied Mathematics http://faculty.washington.edu/arahman2 From arahman2 at uw.edu Wed Nov 1 15:27:11 2023 From: arahman2 at uw.edu (Amin Rahman) Date: Thu Mar 7 20:17:34 2024 Subject: [Amath-seminars] Reminder: Boeing Distinguished Colloquium Thursday, Nov. 2 Message-ID: Dear Amath and Friends, Our Boeing Distinguished Colloquium, presented by Professor Houman Owhadi of Caltech (https://www.cms.caltech.edu/people/owhadi), will be tomorrow, Thursday, Nov. 2, 2023, 4-5 p.m. in Smith Hall 205. Title: Computational Hypergraph Discovery, a framework for connecting the dots
Best regards, Amin -- Aminur (Amin) Rahman (He/him/his) Acting Instructor (postdoc) Department of Applied Mathematics http://faculty.washington.edu/arahman2