Results: 131
Bregman Itoh–Abe Methods for Sparse Optimisation
Abstract: In this paper we propose optimisation methods for variational regularisation problems based on discretising the inverse scale space flow with discrete gradient methods. Inverse scale space flow generalises gradient...
Focus U-Net
BACKGROUND: Colonoscopy remains the gold-standard screening for colorectal cancer. However, significant miss rates for polyps have been reported, particularly when there are multiple small adenomas. This presents an opportunity...
An anisotropic interaction model for simulating fingerprints.
Evidence suggests that both the interaction of so-called Merkel cells and the epidermal stress distribution play an important role in the formation of fingerprint patterns during pregnancy. To model the formation of fingerprint...
Variational regularisation for inverse problems with imperfect forward operators and general noise models.
We study variational regularisation methods for inverse problems with imperfect forward operators whose errors can be modelled by order intervals in a partial order of a Banach lattice. We carry out analysis with respect to...
Published by: Inverse problems
Deep learning as optimal control problems
We consider recent work of [18] and [9], where deep learning neural networks have been interpreted as discretisations of an optimal control problem subject to an ordinary differential equation constraint. We review the first...
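The interpretation this abstract refers to identifies a residual layer with one forward-Euler step of an ODE x'(t) = f(x(t), θ(t)). A minimal sketch of that correspondence (the tanh vector field, step size, and all names are illustrative assumptions, not the cited papers' construction):

```python
import numpy as np

def f(x, W, b):
    # layer dynamics: a simple tanh vector field with per-layer parameters
    return np.tanh(W @ x + b)

def resnet_forward(x0, Ws, bs, h=0.1):
    """Forward Euler discretisation of x'(t) = f(x(t), theta(t)):
    x_{k+1} = x_k + h * f(x_k, theta_k), i.e. exactly a ResNet block."""
    x = x0
    for W, b in zip(Ws, bs):
        x = x + h * f(x, W, b)
    return x
```

In the optimal-control reading, the per-layer parameters (W, b) play the role of the control θ(t), and training chooses the control that steers the state x to a desired output.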
Γ-Convergence of an Ambrosio-Tortorelli approximation scheme for image segmentation
Given an image u0, the aim of minimising the Mumford-Shah functional is to find a decomposition of the image domain into sub-domains and a piecewise smooth approximation u of u0 such that u varies smoothly within each...
Accelerating Variance-Reduced Stochastic Gradient Methods
Variance reduction is a crucial tool for improving the slow convergence of stochastic gradient descent. Only a few variance-reduced methods, however, have yet been shown to directly benefit from Nesterov’s acceleration...
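The variance reduction this abstract builds on can be illustrated with a plain (non-accelerated) SVRG-style loop; this is a minimal sketch on a least-squares objective, with all names, step sizes, and the test problem my own assumptions rather than the paper's method:

```python
import numpy as np

def svrg(A, b, x0, step=0.1, epochs=20, inner=None, seed=0):
    """SVRG-style loop for min_x (1/2n)*||A x - b||^2.

    Per-sample gradient: g_i(x) = a_i * (a_i . x - b_i).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    m = inner or n
    x = x0.copy()
    for _ in range(epochs):
        snap = x.copy()
        # full gradient at the snapshot, reused for variance reduction
        full_grad = A.T @ (A @ snap - b) / n
        for _ in range(m):
            i = rng.integers(n)
            gi_x = A[i] * (A[i] @ x - b[i])
            gi_snap = A[i] * (A[i] @ snap - b[i])
            # unbiased gradient estimate whose variance shrinks near the optimum
            x = x - step * (gi_x - gi_snap + full_grad)
    return x
```

The control-variate term `gi_snap - full_grad` is what distinguishes this from plain SGD: the estimate stays unbiased, but its variance vanishes as the iterates approach the minimiser, allowing a constant step size.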
Pattern formation of a nonlocal, anisotropic interaction model
We consider a class of interacting particle models with anisotropic, repulsive-attractive interaction forces whose orientations depend on an underlying tensor field. An example of this class of models is the so-called...
Choose your Path Wisely
We propose an extension of a special form of gradient descent, known in the literature as linearised Bregman iteration, to a larger class of non-convex functions. We replace the classical (squared) two-norm metric in the...
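The classical (convex) linearised Bregman iteration that this abstract starts from is a two-line scheme; a minimal sketch for a sparse least-squares instance follows. The function names, parameters, and test problem are my own assumptions, and this is the standard convex method, not the paper's non-convex extension:

```python
import numpy as np

def shrink(v, mu):
    # soft-thresholding: the proximal map of mu * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def linearised_bregman(A, b, mu=1.0, delta=1.0, iters=5000):
    """Linearised Bregman iteration for sparse recovery.

    For delta * ||A||_2^2 < 2, the iterates converge to the solution of
        min  mu * ||u||_1 + ||u||^2 / (2 * delta)   s.t.  A u = b.
    """
    v = np.zeros(A.shape[1])
    u = np.zeros(A.shape[1])
    for _ in range(iters):
        v = v + A.T @ (b - A @ u)   # gradient step on the residual
        u = delta * shrink(v, mu)   # shrinkage step
    return u
```

Note that the squared two-norm enters through the `||u||^2 / (2*delta)` term; replacing that metric is, per the abstract, the starting point of the paper's generalisation.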
