# Fourier Neural Operator (FNO)

Traditional PDE solvers such as finite element methods (FEM) and finite difference methods (FDM) rely on discretizing space into a mesh. They therefore impose a trade-off on resolution: coarse grids are fast but less accurate, while fine grids are accurate but slow. Solving the complex systems of partial differential equations (PDEs) that model real-life physical phenomena usually requires a very fine discretization, which makes traditional solvers challenging and time-consuming to use. Data-driven methods, on the other hand, can learn the solution map of a family of equations directly from data. As a result, learning-based methods can be orders of magnitude faster than conventional solvers.

Recently, a new line of work proposed learning mesh-free, infinite-dimensional operators with neural networks. ^{1}
The neural operator remedies the mesh-dependent nature of finite-dimensional operator methods by producing
a single set of network parameters that can be used with different discretizations. The neural operator
requires no knowledge of the underlying PDE, only data.
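To make the mesh-free idea concrete, here is a small illustrative NumPy sketch (not the neural operator itself): an operator defined once in Fourier space, here differentiation of a periodic function via the multiplier $2\pi i k$, can be evaluated on a grid of any resolution without changing its definition.

```python
import numpy as np

# Illustration of a mesh-free operator: differentiation of a periodic
# function on [0, 1), defined once in Fourier space and then evaluated
# on grids of different resolutions.

def spectral_derivative(u):
    """d/dx of periodic samples u, via the Fourier multiplier 2*pi*i*k."""
    n = len(u)
    k = np.fft.rfftfreq(n, d=1.0 / n)   # integer wavenumbers 0 .. n/2
    u_hat = np.fft.rfft(u)              # forward FFT
    return np.fft.irfft(2j * np.pi * k * u_hat, n=n)

# The same rule works on coarse and fine grids alike; the exact
# derivative of sin(2*pi*x) is 2*pi*cos(2*pi*x).
for n in (32, 512):
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    du = spectral_derivative(np.sin(2 * np.pi * x))
    print(n, np.max(np.abs(du - 2 * np.pi * np.cos(2 * np.pi * x))))
```

In a neural operator the fixed multiplier above is replaced by learned parameters, but the same resolution independence carries over: the parameters live in Fourier space, not on any particular grid.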

The Fourier Neural Operator (FNO) is a novel, data-driven deep learning architecture able to learn mappings between infinite-dimensional
spaces of functions. The key building block of FNO is the **spectral convolution**: a special operation
that applies an integral kernel in Fourier space.
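The following is a minimal single-channel 1-D sketch of a spectral convolution in NumPy (the class name `SpectralConv1d` and the random weights are illustrative assumptions, not the paper's implementation, which uses a channels-in by channels-out complex weight matrix per mode): transform to Fourier space, multiply the lowest `modes` coefficients by learnable complex weights, truncate the rest, and transform back.

```python
import numpy as np

# Minimal sketch of a 1-D spectral convolution (single channel).
# The "learnable" parameters are complex weights on the lowest `modes`
# Fourier coefficients; higher frequencies are truncated.

class SpectralConv1d:
    def __init__(self, modes, seed=0):
        rng = np.random.default_rng(seed)
        self.modes = modes
        # Placeholder random weights standing in for trained parameters.
        self.weights = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)

    def __call__(self, u):
        n = len(u)
        u_hat = np.fft.rfft(u)                  # 1. forward FFT
        out_hat = np.zeros_like(u_hat)
        out_hat[:self.modes] = self.weights * u_hat[:self.modes]  # 2. weight low modes
        return np.fft.irfft(out_hat, n=n)       # 3. inverse FFT

layer = SpectralConv1d(modes=8)
x = np.linspace(0.0, 1.0, 128, endpoint=False)
y = layer(np.sin(2 * np.pi * 3 * x))
print(y.shape)  # (128,)
```

Because the operation is a pointwise product in Fourier space, it is linear in its input and acts globally on the domain, unlike a local convolution kernel.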

Operator learning can be framed as an image-to-image problem, with the Fourier layer serving as a substitute for the convolution layer. ^{2}

The inputs and outputs of PDEs are continuous functions, so it is more efficient to represent them
in Fourier space and perform a global convolution. The Fourier layer can then be computed with the fast Fourier transform (FFT).
Activation functions are applied in the spatial domain; they help recover the higher-frequency modes and
non-periodic boundary behavior that the Fourier layers leave out. It is therefore necessary to apply the Fourier transform
and its inverse at each layer. ^{2}
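Putting the pieces together, one FNO-style Fourier layer can be sketched as follows (a 1-D, single-channel NumPy toy with random placeholder weights; a real implementation learns multi-channel weights and typically uses `torch.fft`). It shows why each layer needs both an FFT and an inverse FFT: the spectral multiplication happens in Fourier space, while the pointwise linear path and the nonlinearity act on the spatial grid.

```python
import numpy as np

# One FNO-style Fourier layer (illustrative sketch):
#   v_out = activation( spectral_conv(v) + w * v )
# The spectral path runs in Fourier space; the pointwise linear path
# and the ReLU run in the spatial domain, so every layer performs
# a forward and an inverse FFT.

def fourier_layer(v, spectral_weights, w, modes):
    n = len(v)
    v_hat = np.fft.rfft(v)                       # to Fourier space
    out_hat = np.zeros_like(v_hat)
    out_hat[:modes] = spectral_weights * v_hat[:modes]
    spectral_part = np.fft.irfft(out_hat, n=n)   # back to the spatial domain
    linear_part = w * v                          # pointwise linear path
    return np.maximum(spectral_part + linear_part, 0.0)  # ReLU in space

rng = np.random.default_rng(1)
modes = 8
sw = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)  # placeholder weights
x = np.linspace(0.0, 1.0, 128, endpoint=False)
v = np.sin(2 * np.pi * x)
out = fourier_layer(v, sw, w=0.5, modes=modes)
print(out.shape)
```

The spatial-domain ReLU is what reintroduces frequency content beyond the truncated modes, since a nonlinearity applied pointwise in space mixes Fourier modes that the spectral path alone would discard.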