Dear all,

We are pleased to announce that the next speaker of the *"I Can't Believe
It's Not Better!" (ICBINB)* virtual seminar series will be *Mariia
Vladimirova* (Inria Grenoble Rhône-Alpes). More details about this
series and the talk are below.

The *"I Can't Believe It's Not Better!" (ICBINB) monthly online seminar
series* seeks to shine a light on the "stuck" phase of research. Speakers
will tell us about their most beautiful ideas that didn't "work", about
when theory didn't match practice, or perhaps just when the going got
tough. These talks will let us peek inside the file drawer of unexpected
results and peer behind the curtain to see the real story of *how real
researchers did real research*.

*When:* October 20th, 2022, at 10am EDT / 4pm CEST / 7am PDT

*Where:* RSVP for the Zoom link here:
https://us02web.zoom.us/meeting/register/tZwtc--vqTsjE9MY5uo86UxMOsK6TLwSdHoU

*Title:* *Heavy tails in Bayesian neural networks: expectation and reality*

*Abstract:* The discovery of the connection between Gaussian processes and
deep Bayesian neural networks in the wide limit increased interest in
research on Bayesian neural networks. On the one hand, it helped to reason
about existing works and their assumptions, such as the Gaussian
activation assumption in the Edge of Chaos analysis, or the tuning of
priors over functions to get closer to a given GP. On the other hand, it
gave a new perspective on Bayesian neural networks that led to the study
of training dynamics through the neural tangent kernel, improvements in
variational inference, uncertainty quantification, and more.

However, empirically, the distance between the distribution of a hidden
unit and a Gaussian process grows with depth for a fixed number of hidden
units per layer. One of the main research directions therefore became the
study of the difference between finite- and infinite-width neural
networks. We showed sub-Weibull and Weibull-tail properties of hidden
units, conjecturing that hidden units become heavier-tailed as the network
gets deeper. This tail description reveals the difference between hidden
units in finite- and infinite-width networks. Parallel works give a full
description of hidden units' distributions through Meijer G-functions,
consistent with our heavy-tailed result. We found theoretically that the
tail parameter increases linearly with depth. However, we could not
observe this theoretical tail parameter empirically, at least not with
that precision. In this talk, I give a retrospective on this line of work
on Bayesian neural networks, and present details and possible explanations
of our empirical results.
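
As a rough, self-contained illustration of the kind of empirical check the
abstract alludes to (the architecture, widths, initialization, and crude
tail estimator below are arbitrary choices for illustration, not the
speaker's setup), one can sample a hidden unit of a random Gaussian-weight
ReLU network at several depths and estimate its tail decay: for a
sub-Weibull variable with tail parameter theta, -log P(|U| > t) grows
roughly like t^(1/theta), so heavier tails at depth show up as a smaller
fitted slope of log(-log P(|U| > t)) against log t.

    # Illustrative sketch only: a random ReLU MLP with Gaussian weights,
    # used to eyeball how the tails of one hidden unit change with depth.
    # The architecture, widths, and two-quantile tail estimator are
    # arbitrary choices, not the setup from the talk.
    import numpy as np

    rng = np.random.default_rng(0)

    def hidden_unit_samples(depth, width=64, n_draws=20000):
        """One hidden unit at the given depth, sampled over random weight draws."""
        x = np.ones(width) / np.sqrt(width)  # fixed input
        samples = np.empty(n_draws)
        for i in range(n_draws):
            h = x
            for _ in range(depth):
                W = rng.normal(0.0, np.sqrt(2.0 / h.size), size=(width, h.size))
                h = np.maximum(W @ h, 0.0)  # ReLU hidden units
            samples[i] = h[0]
        return samples

    for depth in (1, 2, 3):
        u = np.abs(hidden_unit_samples(depth))
        # crude tail estimate: slope of log(-log P(|U| > t)) versus log t,
        # taken between the 99% and 99.9% empirical quantiles
        t = np.quantile(u, [0.99, 0.999])
        p = np.array([0.01, 0.001])
        slope = np.diff(np.log(-np.log(p))) / np.diff(np.log(t))
        print(f"depth {depth}: fitted 1/theta ~ {slope[0]:.2f}")

If the tail parameter indeed grows linearly with depth, the fitted slope
should shrink roughly like 1/depth; with a crude estimator and finitely
many samples it will generally not match the theoretical value precisely,
which is the gap between expectation and reality the talk addresses.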

*Bio:* Mariia Vladimirova is a postdoctoral researcher at Inria Grenoble
Rhône-Alpes
<https://www.inria.fr/fr/centre-inria-universite-grenoble-alpes> in the
Statify <https://team.inria.fr/statify/> team. Her research mostly focuses
on exploring distributional properties of Bayesian neural networks. More
specifically, she is interested in explaining the difference between deep
learning models in wide and shallow regimes in order to improve the
interpretability and efficiency of these models.

Mariia Vladimirova did her graduate studies in the Statify
<https://team.inria.fr/statify/> and Thoth <http://thoth.inrialpes.fr/>
teams under the supervision of Julyan Arbel <https://www.julyanarbel.com/>
and Jakob Verbeek <http://lear.inrialpes.fr/people/verbeek/>. From November
2019 to January 2020, she was visiting Duke University
<https://trinity.duke.edu/>, working on prior predictive distributions
in BNNs under the supervision of David Dunson
<https://scholars.duke.edu/person/dunson>. Prior to that, she obtained her
Bachelor's degree at the Moscow Institute of Physics and Technology (MIPT)
and completed the second year of the Master's program at the Grenoble
Institute of Technology (Grenoble INP, Ensimag
<https://ensimag.grenoble-inp.fr/>).

For more information and for ways to get involved, please visit us at
http://icbinb.cc/, tweet to us @ICBINBWorkshop
<https://twitter.com/ICBINBWorkshop>, or email us at
cant.believe.it.is.not.bet...@gmail.com.


--
Best wishes,
The ICBINB Organizers
_______________________________________________
uai mailing list
uai@engr.orst.edu
https://it.engineering.oregonstate.edu/mailman/listinfo/uai
