On April 3, 2019, we will hold a mini-workshop from 14:00 to 17:40 in Room 534, Engineering Building 14.
The schedule will be as follows:
14:00-14:40 Ting Kei Pong (The Hong Kong Polytechnic University):
Gauge optimization: Duality and polar envelope
14:40-15:20 Bruno F. Lourenço (University of Tokyo):
Generalized subdifferentials of spectral functions
15:20-16:00 Michael Metel (RIKEN-AIP):
Stochastic gradient methods for non-smooth non-convex optimization
16:20-17:00 Masaru Ito (Nihon University):
Adaptive and nearly optimal first-order method under Hölderian error bound condition
17:00-17:40 Takashi Tsuchiya (National Graduate Institute for Policy Studies):
Duality theory of SDP revisited: Another Analysis on Why Positive Duality Gaps Arise in SDP
Abstracts of the talks:
Title: Gauge optimization: Duality and polar envelope
Speaker: Ting Kei Pong
Gauge optimization seeks the element of a convex set that is minimal with respect to a gauge function, and arises naturally in various contemporary applications such as machine learning and signal processing. In this talk, we
explore the gauge duality framework proposed by Freund. We then define the polar envelope and discuss some of its important properties. The polar envelope is a convolution operation specialized to gauges, and is analogous to the Moreau envelope. We will highlight the important roles the polar envelope plays in gauge duality and in the construction of algorithms for gauge optimization. This is joint work with Michael Friedlander and Ives Macêdo.
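As background for readers unfamiliar with gauge duality (standard definitions, not taken from the talk itself): a gauge is a nonnegative, convex, positively homogeneous function with \(\kappa(0)=0\), and the polar gauge that underlies Freund's duality framework is

```latex
% Polar of a gauge \kappa (standard definition):
\[
  \kappa^{\circ}(y) \;=\; \sup_{x}\,\bigl\{\, \langle x, y \rangle \;:\; \kappa(x) \le 1 \,\bigr\}.
\]
```

The gauge dual of \(\min\{\kappa(x) : x \in \mathcal{C}\}\) then minimizes \(\kappa^{\circ}\) over an associated set derived from \(\mathcal{C}\); norms and their dual norms are the most familiar gauge–polar pairs.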
Title: Generalized subdifferentials of spectral functions
Speaker: Bruno F. Lourenço
In this talk, we explain how to compute the regular, approximate and horizon subdifferentials of spectral functions over Jordan algebras. We will also show how the obtained formulae can be used to
compute the exponent of the Kurdyka–Łojasiewicz inequality for spectral functions, which is useful for the analysis of gradient methods in nonsmooth optimization. As an application, we compute the generalized subdifferentials of the k-th largest eigenvalue function. This is a joint work with Akiko Takeda.
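For orientation, one common statement of the Kurdyka–Łojasiewicz inequality with exponent \(\theta\) (a standard form, not quoted from the talk) is:

```latex
% KL inequality at a point \bar{x}, with exponent \theta \in [0,1) and c > 0:
\[
  |f(x) - f(\bar{x})|^{\theta} \;\le\; c \,\operatorname{dist}\bigl(0, \partial f(x)\bigr)
  \qquad \text{for all } x \text{ near } \bar{x}.
\]
```

The exponent governs the convergence rate of gradient-type methods near \(\bar{x}\); for instance, \(\theta = 1/2\) is the case typically associated with linear convergence.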
Title: Stochastic gradient methods for non-smooth non-convex optimization
Speaker: Michael Metel
Title: Adaptive and nearly optimal first-order method under Hölderian error bound condition
Speaker: Masaru Ito
Knowledge of strong convexity, or of its relaxed error bound conditions, of the objective function is important for accelerating the convergence of first-order methods. In this work, we focus on the Hölderian Error Bound (HEB) condition, a generalization of strong convexity relative to the solution set that is parameterized by an exponent. We present an adaptive and nearly optimal first-order method that reduces the requirement of prior knowledge of the HEB exponent.
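To make the abstract concrete, one common parameterization of the HEB condition (a standard formulation; the talk's exact convention may differ) is:

```latex
% HEB condition on a level set, with constant c > 0 and exponent \rho \ge 1:
\[
  \operatorname{dist}(x, X^{*}) \;\le\; c\,\bigl(f(x) - f^{*}\bigr)^{1/\rho},
\]
% where X^{*} is the solution set and f^{*} the optimal value.
```

Strong convexity with modulus \(\mu\) gives \(f(x) - f^{*} \ge (\mu/2)\operatorname{dist}(x, X^{*})^{2}\), i.e., HEB with \(\rho = 2\), which is why HEB can be viewed as the relaxation described in the abstract.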
Title: Duality theory of SDP revisited: Another Analysis on Why Positive Duality Gaps Arise in SDP
Speaker: Takashi Tsuchiya
In this talk, we study the duality structure of semidefinite programs to understand positive duality gaps in SDP. It is shown that there exists a relaxation of the dual problem whose optimal value coincides with the primal optimal value whenever the primal is weakly feasible. The relaxed dual SDP is constructed by removing several linear constraints from the dual, and therefore is likely to have a larger optimal value generically, leading to a positive duality gap. It can also happen that the relaxed dual problem coincides with the dual itself, i.e., no linear constraint is removed in the construction process of the relaxation, in which case strong duality holds. Strong duality under the Slater condition, as well as strong duality in LP, can be understood in this context. The analysis suggests that the existence of a positive duality gap is a rather generic phenomenon when both the primal and dual problems are weakly feasible.
This is a joint work with Bruno F. Lourenço and Masakazu Muramatsu.
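For reference, the standard SDP primal–dual pair discussed in duality analyses of this kind (a textbook formulation, not specific to the talk) is:

```latex
% Primal SDP over symmetric matrices X, with data C, A_1, ..., A_m and b:
\[
  \textbf{(P)}\quad \min_{X}\ \langle C, X \rangle
  \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i,\ \ i = 1, \dots, m,
  \qquad X \succeq 0,
\]
% and its Lagrangian dual:
\[
  \textbf{(D)}\quad \max_{y \in \mathbb{R}^m}\ b^{\mathsf T} y
  \quad \text{s.t.} \quad C - \sum_{i=1}^{m} y_i A_i \succeq 0.
\]
```

Weak duality always holds between (P) and (D); strong duality is guaranteed under the Slater condition, and the abstract concerns precisely the regime where both problems are only weakly feasible and the gap can be positive.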