TorchKbNufft Documentation¶
About¶
torchkbnufft implements a non-uniform fast Fourier transform (NUFFT) [1, 2] with Kaiser-Bessel gridding in PyTorch. The implementation is completely in Python, facilitating flexible deployment in readable code with no compilation. Each NUFFT function is wrapped as a torch.autograd.Function, allowing backpropagation through NUFFT operators for training neural networks.
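To illustrate what this buys you, here is a minimal sketch of the transform being approximated: a direct (slow) 1D non-uniform DFT written with plain torch ops. This is not torchkbnufft's implementation, only a reference for the operator, but it shows how a NUFFT built from differentiable operations supports backpropagation:

```python
import math
import torch

def naive_nufft_1d(x, omega):
    """Direct type-2 non-uniform DFT: a slow O(M*N) reference for what the
    NUFFT approximates with Kaiser-Bessel gridding.

    x: complex signal of shape (N,); omega: frequencies in radians, shape (M,).
    Returns the M samples sum_n x[n] * exp(-1j * omega[m] * n).
    """
    n = torch.arange(x.shape[0])
    basis = torch.exp(-1j * omega[:, None] * n[None, :])  # (M, N) DFT matrix
    return basis @ x

# Because the transform is built from differentiable torch ops, gradients
# flow through it, just as they do through torchkbnufft's wrapped operators.
x = torch.randn(16, dtype=torch.complex64, requires_grad=True)
omega = torch.linspace(-math.pi, math.pi, 32)
kdata = naive_nufft_1d(x, omega)
loss = kdata.abs().pow(2).sum()
loss.backward()  # populates x.grad
```

On uniform frequencies omega = 2*pi*k/N this reduces to the ordinary DFT, which is a useful sanity check when experimenting with NUFFT code.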
This package was inspired in large part by the NUFFT implementation in the Michigan Image Reconstruction Toolbox (Matlab).
Installation¶
Simple installation can be done via PyPI:
pip install torchkbnufft
torchkbnufft requires only numpy, scipy, and torch as dependencies.
Operation Modes and Stages¶
The package has three major classes of NUFFT operation mode: table-based NUFFT interpolation, sparse matrix-based NUFFT interpolation, and forward/backward operators with Toeplitz-embedded FFTs [3]. Table interpolation is the standard operation mode, whereas the Toeplitz method is always the fastest for forward/backward NUFFTs. For some problems, sparse matrices may be faster. It is generally best to start with table interpolation and then experiment with the other modes for your problem.
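The Toeplitz trick rests on the fact that the normal operator A^H A of a non-uniform DFT A is a Toeplitz matrix, so applying it is a convolution that can be embedded in a double-sized circulant matrix and evaluated with one zero-padded FFT/IFFT pair, with no interpolation at all once the kernel is precomputed. A hedged 1D sketch of the idea (not the package's implementation, which works on images and precomputed filter kernels):

```python
import torch

def ahb_direct(x, omega):
    """Apply the normal operator A^H A directly, where A is the 1D non-uniform DFT."""
    n = torch.arange(x.shape[0])
    A = torch.exp(-1j * omega[:, None] * n[None, :])  # (M, N)
    return A.conj().T @ (A @ x)

def ahb_toeplitz(x, omega):
    """Apply A^H A via Toeplitz embedding: precompute the kernel t, embed it in
    a 2N-point circulant, and apply it with one zero-padded FFT/IFFT pair."""
    N = x.shape[0]
    m = torch.arange(-(N - 1), N)
    t = torch.exp(1j * omega[:, None] * m[None, :]).sum(dim=0)  # t[m] = sum_k e^{i omega_k m}
    # first column of the 2N circulant that embeds the Toeplitz matrix
    c = torch.zeros(2 * N, dtype=t.dtype)
    c[:N] = t[N - 1:]        # t[0], ..., t[N-1]
    c[N + 1:] = t[:N - 1]    # t[-(N-1)], ..., t[-1], wrapped to the tail
    x_pad = torch.zeros(2 * N, dtype=t.dtype)
    x_pad[:N] = x
    return torch.fft.ifft(torch.fft.fft(c) * torch.fft.fft(x_pad))[:N]

torch.manual_seed(0)
x = torch.randn(8, dtype=torch.complex64)
omega = torch.rand(24) * 6.0 - 3.0  # arbitrary non-uniform frequencies
direct = ahb_direct(x, omega)
fast = ahb_toeplitz(x, omega)  # agrees with `direct` up to float32 round-off
```

This is why the Toeplitz mode applies only to combined forward/backward operation: it computes A^H A x without ever forming the intermediate non-uniform k-space samples.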
Sensitivity maps can be incorporated by passing them into a KbNufft or KbNufftAdjoint object. Auxiliary functions for calculating sparse interpolation matrices, density compensation functions, and Toeplitz filter kernels are also included. For examples, see Basic Usage.
References¶
Fessler, J. A., & Sutton, B. P. (2003). Nonuniform fast Fourier transforms using min-max interpolation. IEEE Transactions on Signal Processing, 51(2), 560-574.
Beatty, P. J., Nishimura, D. G., & Pauly, J. M. (2005). Rapid gridding reconstruction with a minimal oversampling ratio. IEEE Transactions on Medical Imaging, 24(6), 799-808.
Feichtinger, H. G., Gröchenig, K., & Strohmer, T. (1995). Efficient numerical methods in non-uniform sampling theory. Numerische Mathematik, 69(4), 423-440.