Regularized online exponentially concave optimization.
- Additional Information
- Abstract:
In this paper, we investigate regularized online exponentially concave (abbr. exp-concave) optimization, in which each loss function consists of a time-varying exp-concave function and a fixed convex regularization. If the whole loss function is exp-concave, a classical method called online Newton step (ONS) enjoys an O(d log T) regret bound, where d is the dimensionality and T is the time horizon. However, in the regularized setting, the sum of an exp-concave function and a convex regularization is not necessarily exp-concave, which implies that ONS is not applicable. To address this problem, we propose the proximal online Newton step (ProxONS) and show that it attains the same O(d log T) regret bound for any convex regularization. The main idea is to first perform an ONS iteration with the exp-concave part of each loss function and then apply a proximal mapping with the regularization part. Furthermore, we demonstrate that, by utilizing the standard online-to-batch conversion, our ProxONS can be extended to solve stochastic optimization with a regularized exp-concave objective, and enjoys an O(d log T / T) convergence rate with high probability. Experimental results on two real datasets verify the effectiveness of our ProxONS. [ABSTRACT FROM AUTHOR]
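The abstract describes the update as an ONS step with the exp-concave part followed by a proximal mapping with the regularizer. Below is a minimal, hypothetical sketch of such an update in Python; the step-size gamma, the A-weighted proximal formulation, the numerical prox solver, and the logistic-loss example are all assumptions for illustration, not the paper's specification.

```python
# Hypothetical ProxONS-style update: ONS step on the exp-concave loss,
# then a proximal mapping with the convex regularizer (assumed formulation).
import numpy as np
from scipy.optimize import minimize


def prox_ons_step(x, grad, A, gamma, reg):
    """One illustrative ProxONS-style iteration.

    x     : current iterate, shape (d,)
    grad  : gradient of the exp-concave loss at x, shape (d,)
    A     : running matrix of accumulated outer products (plus small ridge)
    gamma : step-size parameter tied to the exp-concavity constant (assumed)
    reg   : callable convex regularizer r(x)
    """
    # ONS-style Newton step with the exp-concave part of the loss.
    A = A + np.outer(grad, grad)
    y = x - (1.0 / gamma) * np.linalg.solve(A, grad)

    # Proximal mapping with the regularization part, in the A-weighted norm:
    #   x_next = argmin_z  r(z) + (gamma / 2) * (z - y)^T A (z - y)
    # Solved numerically here since the closed form depends on r.
    obj = lambda z: reg(z) + 0.5 * gamma * (z - y) @ A @ (z - y)
    x_next = minimize(obj, y, method="Powell").x
    return x_next, A


# Toy usage: logistic-style losses with an l1 regularizer (assumed setup).
rng = np.random.default_rng(0)
d, T, lam, gamma = 5, 200, 0.1, 1.0
x = np.zeros(d)
A = 1e-8 * np.eye(d)                      # small ridge keeps A invertible
reg = lambda z: lam * np.sum(np.abs(z))   # fixed convex regularization
for t in range(T):
    a, b = rng.normal(size=d), rng.choice([-1.0, 1.0])
    grad = -b * a / (1.0 + np.exp(b * (a @ x)))   # gradient of log-loss
    x, A = prox_ons_step(x, grad, A, gamma, reg)
```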