
Decomposition and Moser's lemma.

David E. Edmunds, Miroslav Krbec (2002)

Revista Matemática Complutense

Using the idea of optimal decomposition developed in recent papers (Edmunds–Krbec, 2000; Cruz-Uribe–Krbec), we study the boundedness of the operator Tg(x) = ∫_x^1 g(u)/u du, x ∈ (0,1), and its logarithmic variant between Lorentz spaces and exponential Orlicz and Lorentz–Orlicz spaces. These operators are naturally linked with Moser's lemma, O'Neil's convolution inequality, and estimates for functions with prescribed rearrangement. We give sufficient conditions for and very simple proofs...
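As a quick illustration (not taken from the abstract): already for the constant function g ≡ 1 the operator produces a logarithm,

```latex
Tg(x) \;=\; \int_x^1 \frac{du}{u} \;=\; \log\frac{1}{x}, \qquad x \in (0,1),
```

and since $e^{\lambda \log(1/x)} = x^{-\lambda}$ is integrable on $(0,1)$ for $\lambda < 1$, the image has exponential integrability rather than boundedness. This suggests why exponential Orlicz spaces are the natural targets for T.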

Deep learning for gradient flows using the Brezis–Ekeland principle

Laura Carini, Max Jensen, Robert Nürnberg (2023)

Archivum Mathematicum

We propose a deep learning method for the numerical solution of partial differential equations that arise as gradient flows. The method relies on the Brezis–Ekeland principle, which naturally defines an objective function to be minimized, and so is ideally suited for a machine learning approach using deep neural networks. We describe our approach in a general framework and illustrate the method with the help of an example implementation for the heat equation in space dimensions two to seven.
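The abstract does not reproduce the objective function; as a hedged sketch, a standard form of the Brezis–Ekeland functional for a gradient flow u′ ∈ −∂φ(u) with u(0) = u₀, minimized over trajectories v satisfying v(0) = u₀, is

```latex
J(v) \;=\; \int_0^T \Bigl[\, \varphi\bigl(v(t)\bigr) + \varphi^*\bigl(-v'(t)\bigr) \Bigr]\, dt
\;+\; \tfrac{1}{2}\,\|v(T)\|_{L^2}^2 \;-\; \tfrac{1}{2}\,\|u_0\|_{L^2}^2,
```

where φ* is the convex conjugate of φ. By the Fenchel–Young inequality J(v) ≥ 0, with J(v) = 0 exactly at the solution of the gradient flow; for the heat equation one takes φ(v) = ½∫|∇v|² dx. Minimizing J over a neural-network ansatz for v is then a natural machine learning formulation.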

Dynamical instability of symmetric vortices.

Luis Almeida, Yan Guo (2001)

Revista Matemática Iberoamericana

Using the Maxwell–Higgs model, we prove that linearly unstable symmetric vortices in the Ginzburg–Landau theory are dynamically unstable in the H¹ norm (which is the natural norm for the problem). In this work we study the dynamic instability of the radial solutions of the Ginzburg–Landau equations in R² (...)
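For context (one common normalization, not quoted from the abstract): the static Ginzburg–Landau / abelian Higgs energy on the plane, in the self-dual case, and the equivariant ansatz defining symmetric vortices of degree n, can be written as

```latex
E(\psi, A) \;=\; \frac{1}{2} \int_{\mathbb{R}^2} \Bigl(\, |\nabla_A \psi|^2 + |F_A|^2 + \tfrac{1}{2}\bigl(1 - |\psi|^2\bigr)^2 \Bigr)\, dx,
\qquad \psi \;=\; f(r)\, e^{in\theta},
```

where A is the gauge potential, F_A its curvature, and ∇_A = ∇ − iA the covariant derivative. The radial solutions studied in the paper are critical points of this type.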
