PhD Students’ Seminar, Spring 2020

Organizers: Farid Benmouffok and Steven Gilmore

Wednesday, April 15, SAS 3282, 12:30 – 13:30
Speaker: Steven Gilmore
Title: TBA

Wednesday, April 8, SAS 3282, 12:30 – 13:30
Speaker: Minh Bùi
Title: TBA

Wednesday, April 1, SAS 3282, 12:30 – 13:30
Speaker: Farid Benmouffok
Title: TBA

Wednesday, March 25, SAS 3282, 12:30 – 13:30
Speaker: Sarah Strikwerda
Title: TBA

Wednesday, March 18, SAS 3282, 12:30 – 13:30
Speaker: Minh Bùi
Title: TBA

Wednesday, March 4, SAS 3282, 12:30 – 13:30
Title: Problem Session

Wednesday, February 26, SAS 3282, 12:30 – 13:30
Speaker: Prerona Dutta
Title: Semiconcave functions and Hamilton-Jacobi equations
Abstract: We discuss Hamilton-Jacobi equations and the notion of viscosity solutions, recalling relevant results on semiconcave functions.
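
For background, a model problem (not necessarily the one treated in the talk) is the first-order equation

    \[ u_t + H(x, \nabla u) = 0, \]

where H is the Hamiltonian. Since classical solutions typically break down in finite time, one works with viscosity solutions, defined via smooth test functions touching u from above or below. A function u is semiconcave if x \mapsto u(x) - (C/2)|x|^2 is concave for some C \ge 0; value functions of many associated variational problems enjoy this regularity.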

Wednesday, February 19, SAS 3282, 12:30 – 13:30
Speaker: Farid Benmouffok
Title: Problem Session

Wednesday, February 12, SAS 3282, 12:30 – 13:30
Speaker: Steven Gilmore
Title: Admissible Solutions of 1D Scalar Conservation Laws
Abstract: We present a note by Ambrosio and De Lellis on conservation laws.
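
For background (standard notation, not necessarily the note's), a 1D scalar conservation law reads

    \[ u_t + f(u)_x = 0, \qquad u(0, \cdot) = u_0, \]

where f is the flux. Weak solutions are non-unique, so one imposes an admissibility criterion, e.g. Kruzhkov's entropy condition that \( |u - k|_t + (\mathrm{sgn}(u - k)(f(u) - f(k)))_x \le 0 \) in the sense of distributions for every constant k.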

Wednesday, February 5, SAS 3282, 12:30 – 13:30
Speaker: Zev Woodstock
Title: Rephrasing set constraints with inequalities
Abstract: We analyze instances in which geometric constraints in set optimization problems can be rephrased as inequalities, thereby making these problems approachable with standard optimization techniques. We apply this method to a problem in convex neural codes.
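
As a simple illustration of the idea (a sketch, not necessarily the construction used in the talk): for a nonempty closed convex set C, the constraint x \in C is equivalent to the inequality \( d_C(x) \le 0 \), where \( d_C(x) = \inf_{y \in C} \|x - y\| \) is the distance function. Since d_C is convex, the rephrased problem becomes amenable to standard convex optimization techniques.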

Wednesday, January 29, SAS 3282, 12:30 – 13:30
Title: Problem Session

Wednesday, January 22, SAS 3282, 12:30 – 13:30
Speaker: Minh Bùi
Title: Solving Multivariate Structured Convex Minimization Problems in Hilbert Spaces
Abstract: We propose new flexible weakly/strongly convergent primal-dual parallel splitting methods for solving multivariate structured convex minimization problems in Hilbert spaces. Our methods fully exploit the structure of the problem in the sense that:

1/ the proximity operators are used individually;

2/ the algorithms assign to each proximity operator its own scaling parameter, which is free to vary over the iterations;

3/ Lipschitzian gradients are used explicitly;

4/ we impose no additional assumptions on the functions present in the framework or on the number of variables. Furthermore, the algorithms require neither prior knowledge of the norms of the linear operators involved nor inversion of linear operators.

This is joint work with my advisor, Dr. Combettes.
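
For context, the proximity operator of a proper lower semicontinuous convex function f on a Hilbert space, with scaling parameter \( \gamma > 0 \), is

    \[ \mathrm{prox}_{\gamma f}(x) = \underset{y}{\mathrm{argmin}} \, \Big( f(y) + \frac{1}{2\gamma} \| x - y \|^2 \Big); \]

point 2/ above refers to allowing a different \( \gamma \) for each proximity operator, varying from iteration to iteration.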