[2208.08567] High Probability Bounds for Stochastic Subgradient Methods under Heavy Tailed Noise


View a PDF of the paper titled High Probability Bounds for Stochastic Subgradient Methods under Heavy Tailed Noise, by Daniela A. Parletta and three other authors


Abstract: In this work we study high probability bounds for stochastic subgradient methods under heavy tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to a sub-Gaussian distribution, for which it is known that standard subgradient methods enjoy high probability bounds. We analyze a clipped version of the projected stochastic subgradient method, where subgradient estimates are truncated whenever they have large norms. We show that this clipping strategy leads both to near optimal any-time and finite horizon bounds for many classical averaging schemes. Preliminary experiments are shown to support the validity of the method.
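The clipped projected subgradient method described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the step size, clipping level, test objective, and heavy-tailed noise distribution are all illustrative choices, and only one averaging scheme (the uniform running average) is shown.

```python
import numpy as np

def clipped_projected_subgradient(subgrad, project, x0, steps, lr, clip_level, rng):
    """Projected stochastic subgradient method with gradient clipping.

    At each step the stochastic subgradient estimate g is truncated to norm
    `clip_level` whenever ||g|| exceeds it, then a projected step is taken.
    Returns the uniform running average of the iterates (one classical
    averaging scheme).
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(steps):
        g = subgrad(x, rng)                 # noisy subgradient estimate
        norm = np.linalg.norm(g)
        if norm > clip_level:               # truncate heavy-tailed estimates
            g = g * (clip_level / norm)
        x = project(x - lr * g)             # projected subgradient step
        avg += (x - avg) / (t + 1)          # uniform running average
    return avg

# Illustrative problem: minimize f(x) = ||x||_1 over the unit Euclidean
# ball, with Student-t gradient noise (df = 2.5: finite variance but
# heavy tails, hence not sub-Gaussian).
rng = np.random.default_rng(0)

def subgrad(x, rng):
    return np.sign(x) + rng.standard_t(df=2.5, size=x.shape)

def project(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n         # projection onto the unit ball

x_bar = clipped_projected_subgradient(subgrad, project,
                                      x0=np.ones(5), steps=2000,
                                      lr=0.05, clip_level=5.0, rng=rng)
print(np.linalg.norm(x_bar))  # small: the minimizer is the origin
```

Since every iterate lies in the unit ball, so does the averaged point; clipping keeps occasional huge noise draws from derailing individual steps.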

Submission history

From: Saverio Salzo [view email]
[v1]
Wed, 17 Aug 2022 23:05:05 UTC (135 KB)
[v2]
Sun, 14 Apr 2024 21:29:23 UTC (135 KB)
