Introduction

This library aims to provide XLA/JAX-based Python implementations of various algorithms related to sparse signal recovery and approximation.

The bulk of this library is built using functional programming techniques, which are critical for generating efficient numerical code for CPU and GPU architectures.

Functional Programming

Functional programming is a paradigm in which programs are constructed by applying and composing functions. Functions define a tree of expressions that map values to other values (akin to mathematical functions), rather than a sequence of imperative statements. Well-known functional languages include Haskell and Common Lisp. A key idea in functional programming is the pure function. A pure function has the following properties:

  • The return values are identical for identical arguments.

  • The function has no side-effects (no mutation of local static variables, non-local variables, etc.).
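The distinction can be made concrete with a small sketch (the function names here are illustrative, not part of the library):

```python
# A pure function: its result depends only on its inputs,
# and it never modifies them.
def scale(x, alpha):
    return [alpha * v for v in x]

# An impure variant: it mutates its argument in place (a side effect),
# so calling it twice with the "same" list gives different results.
def scale_inplace(x, alpha):
    for i in range(len(x)):
        x[i] *= alpha

data = [1, 2, 3]
scaled = scale(data, 2)   # data is untouched
scale_inplace(data, 2)    # data itself is now [2, 4, 6]
```

Pure functions are what allow a compiler like XLA to freely reorder, fuse, and cache computations.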

XLA is a domain-specific compiler for linear algebra. XLA uses JIT (just-in-time) compilation techniques to analyze the structure of a numerical algorithm written using it. It then specializes the algorithm for actual runtime dimensions and types of parameters involved, fuses multiple operations together and emits efficient native machine code for devices like CPUs, GPUs and custom accelerators (like Google TPUs).
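As a minimal illustration of this JIT workflow (using only the public jax.jit API; the normalize function is just an example, not part of this library):

```python
import jax
import jax.numpy as jnp

@jax.jit
def normalize(x):
    # The subtraction, mean, std, and division below are traced once
    # and fused by XLA into a single compiled kernel.
    return (x - jnp.mean(x)) / jnp.std(x)

x = jnp.arange(8.0)
# The first call triggers tracing and XLA compilation, specialized
# to this shape (8,) and dtype (float32); later calls with the same
# shape/dtype reuse the compiled code.
y = normalize(x)
```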

JAX is a front-end for XLA and Autograd with a NumPy-inspired API. Unlike NumPy arrays, JAX arrays are always immutable. While x[0] = 10 is perfectly fine in NumPy, since its arrays are mutable, the equivalent functional code in JAX is x = x.at[0].set(10).
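The functional update above returns a new array and leaves the original untouched:

```python
import jax.numpy as jnp

x = jnp.zeros(4)
y = x.at[0].set(10)  # functional update: returns a fresh array

# x is unchanged; y holds the updated copy.
```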

Linear Operators

Efficient linear-operator implementations compute matrix-vector products much faster than direct dense multiplication by exploiting the operator's structure rather than storing a full matrix. PyLops [RV19] is a popular collection of linear operators implemented in Python.

A framework for building and composing linear operators has been provided in cr.sparse.lop. Functionality includes:

  • Basic operators: identity, matrix, diagonal, zero, flipud, sum, pad_zeros, symmetrize, restriction, etc.

  • Signal processing: fourier_basis_1d, dirac_fourier_basis_1d, etc.

  • Random dictionaries: gaussian_dict, rademacher_dict, random_onb_dict, random_orthonormal_rows_dict, etc.

  • Operator algebra: neg, scale, add, subtract, compose, transpose, hermitian, hcat, etc.

  • Additional utilities
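The core idea behind such a framework can be sketched functionally as a pair of forward/adjoint functions. Note this is a hypothetical illustration; the names Operator, diagonal, matrix, and compose below are not the actual cr.sparse.lop API:

```python
from typing import Callable, NamedTuple
import jax.numpy as jnp

class Operator(NamedTuple):
    times: Callable   # x -> A x   (forward application)
    trans: Callable   # y -> A^T y (adjoint application)

def diagonal(d):
    # Matrix-free diagonal operator: O(n) storage, O(n) application.
    return Operator(times=lambda x: d * x, trans=lambda y: d * y)

def matrix(A):
    # Wraps an explicit matrix as an operator.
    return Operator(times=lambda x: A @ x, trans=lambda y: A.T @ y)

def compose(A, B):
    # (A B) x = A (B x), and (A B)^T y = B^T (A^T y).
    return Operator(times=lambda x: A.times(B.times(x)),
                    trans=lambda y: B.trans(A.trans(y)))

D = diagonal(jnp.array([1.0, 2.0, 3.0]))
M = matrix(2.0 * jnp.eye(3))
C = compose(M, D)
```

Because the operators are just pairs of pure functions, composition and algebra (negation, scaling, stacking) reduce to function composition.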

Greedy Sparse Recovery/Approximation Algorithms

JAX-based implementations of the following algorithms are included.

  • Orthogonal Matching Pursuit [PRK93, Tro04]

  • Compressive Sampling Matching Pursuit [NT09]

  • Subspace Pursuit [DM09]

  • Iterative Hard Thresholding [BD09]

  • Hard Thresholding Pursuit [Fou11]
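To give a flavor of this family of algorithms, here is a minimal sketch of Iterative Hard Thresholding [BD09] in plain JAX. This is an illustrative textbook version, not the library's implementation, and it assumes the operator is pre-scaled so that \(\|A\|_2 < 1\):

```python
import jax.numpy as jnp
from jax import random

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries, zero out the rest.
    idx = jnp.argsort(jnp.abs(x))[-s:]
    return jnp.zeros_like(x).at[idx].set(x[idx])

def iht(A, b, s, iters=200):
    # IHT iteration [BD09]: x <- H_s(x + A^T (b - A x)).
    # Stable (residual non-increasing) when ||A||_2 < 1.
    x = jnp.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + A.T @ (b - A @ x), s)
    return x

# Small noiseless demo: recover a 4-sparse vector from 40 measurements.
key = random.PRNGKey(0)
m, n, s = 40, 100, 4
A = random.normal(key, (m, n))
A = A / (jnp.linalg.norm(A, 2) * 1.01)   # enforce ||A||_2 < 1
x_true = jnp.zeros(n).at[jnp.array([5, 17, 42, 77])].set(
    jnp.array([1.0, -2.0, 1.5, -1.0]))
b = A @ x_true
x_est = iht(A, b, s)
```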

Convex Optimization based Recovery Algorithms

Convex optimization [BV04] based methods provide more reliable solutions to sparse recovery problems, although they tend to be computationally more expensive. The first such method, basis pursuit, appeared around 1998 [CDS98].

Alternating directions [BPC+11] based methods provide simple yet efficient iterative solutions for sparse recovery.

[YZ11] describes inexact ADMM based solutions for a variety of \(\ell_1\) minimization problems. The authors provide a MATLAB package yall1 [ZYY10]. A port of yall1 (Your algorithms for \(\ell_1\)) has been provided. It provides alternating directions method of multipliers based solutions for basis pursuit, basis pursuit denoising, basis pursuit with inequality constraints, their non-negative counterparts and other variants.
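As a minimal sketch of the alternating-directions idea, here is ADMM for the lasso form of basis pursuit denoising, following the scheme in [BPC+11]. This is an illustrative example, not the yall1 port, which handles many more problem variants:

```python
import jax.numpy as jnp

def soft_threshold(v, k):
    # Proximal operator of k*||.||_1: element-wise soft thresholding.
    return jnp.sign(v) * jnp.maximum(jnp.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    # ADMM for min_x 0.5*||A x - b||^2 + lam*||x||_1  [BPC+11].
    n = A.shape[1]
    Atb = A.T @ b
    # Cached inverse for the repeated x-update (illustrative only;
    # a cached Cholesky factorization is preferable in practice).
    L = jnp.linalg.inv(A.T @ A + rho * jnp.eye(n))
    x = z = u = jnp.zeros(n)
    for _ in range(iters):
        x = L @ (Atb + rho * (z - u))         # x-update: ridge step
        z = soft_threshold(x + u, lam / rho)  # z-update: l1 prox
        u = u + x - z                         # dual (scaled) update
    return z

# Sanity check: with A = I the lasso solution is exactly soft(b, lam).
A = jnp.eye(4)
b = jnp.array([3.0, -0.2, 1.0, -2.5])
x = admm_lasso(A, b, lam=0.5)
```

Each iteration splits the smooth least-squares term and the non-smooth \(\ell_1\) term into two simple sub-problems, which is what makes ADMM attractive for sparse recovery.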

Evaluation Framework

The library also provides

  • Various simple dictionaries and sensing matrices

  • Sample data generation utilities

  • Framework for evaluation of sparse recovery algorithms

Open Source Credits

Major parts of this library are directly influenced by existing projects. While the implementation in CR-Sparse is fresh (based on JAX), it has been possible thanks to the extensive study of existing implementations. We list here some of the major existing projects which have influenced the implementation in CR-Sparse. Let us know if we missed anything.

  • JAX: The overall project structure is heavily influenced by the conventions followed in JAX. We learned the functional programming techniques applicable to linear algebra work by reading the JAX source code.

  • SciPy: JAX does not yet port all parts of SciPy. Some parts of SciPy have been adapted and rewritten in a functional manner per the needs of CR-Sparse, e.g. cr.sparse.dsp.signals and the [TC98] version of the CWT in cr.sparse.wt.

  • Optax: This project helped in understanding how to use named tuples as states for iterative algorithms. It was also useful in conceptualizing the structure of cr.sparse.lop.

  • PyLops: The cr.sparse.lop library is heavily influenced by it.

  • PyWavelets: The DWT and CWT implementations in cr.sparse.wt are largely derived from it. The filter coefficients for discrete wavelets have been ported from its C implementation to Python.

  • HTP: The original MATLAB implementation of Hard Thresholding Pursuit.

  • WaveLab: This MATLAB package helped a lot in the initial understanding of the DWT implementation.

  • YALL1: This is the original MATLAB implementation of the ADMM based sparse recovery algorithm.

  • L1-LS: The original MATLAB implementation of the truncated Newton interior-point method for solving the \(\ell_1\)-minimization problem.

  • Sparsify: Provides the MATLAB implementations of the IHT, NIHT, and AIHT algorithms.

  • Sparse and Redundant Representations: The code accompanying the book Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing helped a lot in building a basic understanding of sparse representations.

  • aaren/wavelets: A decent CWT implementation following [TC98]; it influenced cr.sparse.wt.

Further Reading

BDDH11

Richard Baraniuk, M Davenport, M Duarte, and Chinmay Hegde. An introduction to compressive sensing. Connexions e-textbook, 2011.

BD09

Thomas Blumensath and Mike E Davies. Iterative hard thresholding for compressed sensing. Applied and Computational Harmonic Analysis, 27(3):265–274, 2009.

BPC+11

Stephen Boyd, Neal Parikh, Eric Chu, Borja Peleato, and Jonathan Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1–122, 2011.

BV04

Stephen Boyd and Lieven Vandenberghe. Convex optimization. Cambridge University Press, 2004.

Candes06

Emmanuel J Candès. Compressive sampling. In Proceedings of the International Congress of Mathematicians, Madrid, August 22-30, 2006: invited lectures, 1433–1452. 2006.

CandesW08

Emmanuel J Candès and Michael B Wakin. An introduction to compressive sampling. IEEE Signal Processing Magazine, 25(2):21–30, 2008.

CDS98

Scott Shaobing Chen, David L Donoho, and Michael A Saunders. Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing, 20(1):33–61, 1998.

DM09

Wei Dai and Olgica Milenkovic. Subspace pursuit for compressive sensing signal reconstruction. IEEE Transactions on Information Theory, 55(5):2230–2249, 2009.

Don06

David L Donoho. Compressed sensing. IEEE Transactions on Information Theory, 52(4):1289–1306, 2006.

Ela10

Michael Elad. Sparse and redundant representations. Springer, 2010.

Fou11

Simon Foucart. Recovering jointly sparse vectors via hard thresholding pursuit. In Proc. Sampling Theory and Applications (SampTA), May 2-6, 2011.

Mal08

Stephane Mallat. A wavelet tour of signal processing: the sparse way. Access Online via Elsevier, 2008.

NT09

Deanna Needell and Joel A Tropp. CoSaMP: iterative signal recovery from incomplete and inaccurate samples. Applied and Computational Harmonic Analysis, 26(3):301–321, 2009.

PRK93

Yagyensh Chandra Pati, Ramin Rezaiifar, and PS Krishnaprasad. Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition. In Conference Record of the Twenty-Seventh Asilomar Conference on Signals, Systems and Computers, 40–44. IEEE, 1993.

RV19

Matteo Ravasi and Ivan Vasconcelos. PyLops - a linear-operator Python library for large scale optimization. arXiv preprint arXiv:1907.12349, 2019.

TC98

Christopher Torrence and Gilbert P Compo. A practical guide to wavelet analysis. Bulletin of the American Meteorological Society, 79(1):61–78, 1998.

Tro04

Joel A Tropp. Greed is good: algorithmic results for sparse approximation. IEEE Transactions on Information Theory, 50(10):2231–2242, 2004.

YZ11

Junfeng Yang and Yin Zhang. Alternating direction algorithms for \(\ell_1\)-problems in compressive sensing. SIAM Journal on Scientific Computing, 33(1):250–278, 2011.

ZYY10

Yin Zhang, Junfeng Yang, and Wotao Yin. User's guide for YALL1: your algorithms for \(\ell_1\) optimization, version 1.0. Technical Report, CAAM Department, Rice University, 2010.

Documentation | Code | Issues | Discussions | Sparse-Plex