The connection between Residual Neural Networks (ResNets) and continuous-time control systems (known as NeurODEs) has led to a mathematical analysis of neural networks which has yielded interesting results of both theoretical and practical significance. However, by construction, NeurODEs have been limited to describing constant-width layers, making them unsuitable for modelling deep learning architectures with layers of variable width. In this paper, we propose a continuous-time Autoencoder, which we call AutoencODE, based on a modification of the controlled field that drives the dynamics. This adaptation enables the extension of the mean-field control framework originally devised for conventional NeurODEs. In this setting, we tackle the case of low Tikhonov regularisation, resulting in potentially non-convex cost landscapes. While the results obtained for high Tikhonov regularisation may not hold globally, we show that many of them can be recovered in regions where the loss function is locally convex. Inspired by our theoretical findings, we develop a training method tailored to this specific type of Autoencoder with residual connections, and we validate our approach through numerical experiments conducted on various examples.
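To illustrate the ResNet/NeurODE correspondence underlying the abstract, here is a minimal sketch: an explicit-Euler discretisation of a controlled dynamics $\dot x = \tanh(W(t)x + b(t))$, where each Euler step is a residual block, and a per-layer mask emulates the variable-width (bottleneck) idea by freezing some coordinates while the ambient dimension stays fixed. The function and parameter names (`euler_neurode`, `mask`) are our own illustrative choices, not notation from the paper.

```python
import numpy as np

def euler_neurode(x0, weights, biases, h=0.1, mask=None):
    """Discretise the NeurODE x' = tanh(W(t) x + b(t)) with explicit Euler.

    Each step x <- x + h * tanh(W x + b) is exactly a ResNet residual
    block.  An optional per-layer `mask` zeroes selected components of
    the driving field, so those coordinates stay frozen -- a toy version
    of narrowing the active width inside a fixed-dimension state space.
    """
    x = np.asarray(x0, dtype=float)
    for k, (W, b) in enumerate(zip(weights, biases)):
        v = np.tanh(W @ x + b)      # controlled vector field at layer k
        if mask is not None:
            v = v * mask[k]         # inactive coordinates do not move
        x = x + h * v               # residual (Euler) update
    return x

# Toy bottleneck: 4 ambient dimensions, middle layers act on only 2.
rng = np.random.default_rng(0)
L, d = 6, 4
Ws = [rng.standard_normal((d, d)) * 0.5 for _ in range(L)]
bs = [np.zeros(d) for _ in range(L)]
masks = [np.ones(d)] * 2 + [np.array([1., 1., 0., 0.])] * 2 + [np.ones(d)] * 2
out = euler_neurode(np.ones(d), Ws, bs, h=0.1, mask=masks)
print(out.shape)  # (4,)
```

Note that, as in the paper's setting, the "width" change is encoded in the vector field itself rather than by changing the state dimension, which is what keeps the mean-field control formulation available.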
We show that a class of higher-dimensional hyperbolic endomorphisms admits absolutely continuous invariant probabilities whose densities are regular and vary differentiably with respect to the dynamical system. The maps we consider are skew-products given by $T(x,y) = (E (x), C(x,y))$, where E is an expanding map of $\mathbb {T}^u$ and C is a contracting map on each fiber. If $\inf |\!\det DT| \inf \| (D_yC)^{-1}\| ^{-2s}>1$ for some ${s<r-(({u+d})/{2}+1)}$, $r \geq 2$, and T satisfies a transversality condition between overlaps of iterates of T (a condition which we prove to be $C^r$-generic under mild assumptions), then the SRB measure $\mu _T$ of T is absolutely continuous and its density $h_T$ belongs to the Sobolev space $H^s({\mathbb {T}}^u\times {\mathbb {R}}^d)$. When $s> {u}/{2}$, the density $h_T$ is moreover differentiable with respect to T. Similar results are proved for thermodynamical quantities for potentials close to the geometric potential.