Introduction¶
This library aims to provide XLA/JAX-based Python implementations of various algorithms related to linear operators and sparse signal recovery.
The bulk of this library is built using functional programming techniques, which are critical for generating efficient numerical code for CPU and GPU architectures.
Functional Programming¶
Functional programming is a programming paradigm where computer programs are constructed by applying and composing functions. Functions define a tree of expressions that map values to other values (akin to mathematical functions), rather than a sequence of imperative statements. Well-known languages based on functional programming include Haskell and Common Lisp. A key idea in functional programming is the pure function. A pure function has the following properties:
The return values are identical for identical arguments.
The function has no side-effects (no mutation of local static variables, non-local variables, etc.).
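These two properties can be illustrated with a minimal sketch in plain Python (the function names here are illustrative, not part of the library):

```python
# A pure function: its result depends only on its inputs,
# and it leaves its arguments untouched.
def scale(x, alpha):
    return [alpha * v for v in x]

# An impure counterpart: it mutates its argument in place,
# a side effect that a pure function must not have.
def scale_in_place(x, alpha):
    for i in range(len(x)):
        x[i] *= alpha
    return x

data = [1.0, 2.0, 3.0]
scaled = scale(data, 2.0)
print(scaled)   # [2.0, 4.0, 6.0]
print(data)     # unchanged: [1.0, 2.0, 3.0]
```

Because `scale` always returns identical values for identical arguments, a compiler like XLA is free to cache, reorder, or fuse such calls.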
XLA is a domain-specific compiler for linear algebra. XLA uses JIT (just-in-time) compilation techniques to analyze the structure of a numerical algorithm written using it. It then specializes the algorithm for actual runtime dimensions and types of parameters involved, fuses multiple operations together and emits efficient native machine code for devices like CPUs, GPUs and custom accelerators (like Google TPUs).
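As a small illustration of this JIT workflow (the function below is an arbitrary example, not part of the library), `jax.jit` traces a function once per input shape and dtype and lets XLA fuse the elementwise operations into compiled device code:

```python
import jax
import jax.numpy as jnp

# jit traces normalize() on first call for the given shape/dtype;
# XLA can then fuse the subtraction, mean, std and division into
# a single optimized kernel.
@jax.jit
def normalize(x):
    return (x - jnp.mean(x)) / jnp.std(x)

x = jnp.arange(4.0)
print(normalize(x))  # zero-mean, unit-variance version of x
```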
JAX is a front-end for XLA and Autograd with a NumPy-inspired API. Unlike NumPy, JAX arrays are always immutable. While x[0] = 10 is perfectly fine in NumPy, since its arrays are mutable, the equivalent functional code in JAX is x = x.at[0].set(10).
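In code, the functional update returns a new array and leaves the original untouched:

```python
import jax.numpy as jnp

x = jnp.zeros(3)
# Functional update: returns a NEW array; x itself is unchanged.
y = x.at[0].set(10.0)
print(x)  # [0. 0. 0.]
print(y)  # [10.  0.  0.]
```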
Linear Operators¶
Efficient linear operator implementations provide much faster computations compared to direct matrix vector multiplication. PyLops [RV19] is a popular collection of linear operators implemented in Python.
A framework for building and composing linear operators is provided in cr.sparse.lop. Functionality includes:
Basic operators: identity, matrix, diagonal, zero, flipud, sum, pad_zeros, symmetrize, restriction, etc.
Signal processing: fourier_basis_1d, dirac_fourier_basis_1d, etc.
Random dictionaries: gaussian_dict, rademacher_dict, random_onb_dict, random_orthonormal_rows_dict, etc.
Operator algebra: neg, scale, add, subtract, compose, transpose, hermitian, hcat, etc.
Additional utilities
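The idea behind such a framework can be sketched in a few lines of plain NumPy. Note this is a conceptual illustration only, not the cr.sparse.lop API: a linear operator is represented by a pair of pure functions (forward and adjoint), so matrix-free operators compose without ever materializing a dense matrix.

```python
from collections import namedtuple
import numpy as np

# Conceptual sketch (not the library's API): an operator is a pair of
# pure functions, the forward map (times) and its adjoint (trans).
Operator = namedtuple("Operator", ["times", "trans"])

def diagonal(d):
    """Elementwise multiplication by d; self-adjoint for real d."""
    return Operator(times=lambda x: d * x, trans=lambda x: d * x)

def compose(A, B):
    """Composition A @ B: forward applies B then A; adjoint reverses."""
    return Operator(times=lambda x: A.times(B.times(x)),
                    trans=lambda x: B.trans(A.trans(x)))

d = np.array([1.0, 2.0, 3.0])
op = compose(diagonal(d), diagonal(d))  # multiplies by d twice
print(op.times(np.ones(3)))             # [1. 4. 9.]
```

Because every operator is just a pair of pure functions, the composed operator is itself pure and can be JIT-compiled end to end.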
Greedy Sparse Recovery/Approximation Algorithms¶
JAX-based implementations of several greedy algorithms are included.
Convex Optimization based Recovery Algorithms¶
Methods based on convex optimization [BV04] provide more reliable solutions to sparse recovery problems, although they tend to be computationally more complex. The first such method appeared around 1998 as basis pursuit [CDS98].
Alternating directions [BPC+11] based methods provide simple yet efficient iterative solutions for sparse recovery.
[YZ11] describes inexact ADMM-based solutions for a variety of \(\ell_1\) minimization problems; the authors provide a MATLAB package, yall1 [ZYY10]. A port of yall1 (Your ALgorithms for \(\ell_1\)) has been provided. It offers alternating direction method of multipliers based solutions for basis pursuit, basis pursuit denoising, basis pursuit with inequality constraints, their non-negative counterparts, and other variants.
Evaluation Framework¶
The library also provides:
Various simple dictionaries and sensing matrices
Sample data generation utilities
Framework for evaluation of sparse recovery algorithms
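A typical sparse recovery test problem combines these pieces: a random sensing matrix, a synthetic sparse signal, and the resulting measurements. The setup below is a generic illustration (names and conventions are ours, not the library's data generation API):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 20, 50, 4                    # measurements, ambient dim, sparsity

# Gaussian sensing matrix with unit-norm columns in expectation.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# k-sparse signal: k randomly placed nonzero Gaussian entries.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

y = Phi @ x                            # noiseless measurements
print(y.shape)                         # (20,)
```

A recovery algorithm is then evaluated by how well it reconstructs x (and its support) from y and Phi.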
Platform Support¶
CR-Sparse can run on any platform supported by JAX. JAX doesn't run natively on Windows at the moment. The authors have tested CR-Sparse on Mac and Linux platforms.
Installation¶
Basic installation from PYPI:
python -m pip install cr-sparse
Installation from GitHub:
python -m pip install git+https://github.com/carnotresearch/cr-sparse.git
Further Reading¶
- BDDH11
Richard Baraniuk, M Davenport, M Duarte, and Chinmay Hegde. An introduction to compressive sensing. Connexions e-textbook, 2011.
- BD09
Thomas Blumensath and Mike E Davies. Iterative hard thresholding for compressed sensing. Applied and Computational Harmonic Analysis, 27(3):265–274, 2009.
- BPC+11
Stephen Boyd, Neal Parikh, Eric Chu, Borja Peleato, and Jonathan Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1–122, 2011.
- BV04
Stephen Boyd and Lieven Vandenberghe. Convex optimization. Cambridge university press, 2004.
- Candes06
Emmanuel J Candès. Compressive sampling. In Proceedings of the International Congress of Mathematicians: Madrid, August 22-30, 2006: invited lectures, 1433–1452. 2006.
- CandesW08
Emmanuel J Candès and Michael B Wakin. An introduction to compressive sampling. Signal Processing Magazine, IEEE, 25(2):21–30, 2008.
- CDS98
Scott Shaobing Chen, David L Donoho, and Michael A Saunders. Atomic decomposition by basis pursuit. SIAM journal on scientific computing, 20(1):33–61, 1998.
- DM09
Wei Dai and Olgica Milenkovic. Subspace pursuit for compressive sensing signal reconstruction. Information Theory, IEEE Transactions on, 55(5):2230–2249, 2009.
- Don06
David L Donoho. Compressed sensing. Information Theory, IEEE Transactions on, 52(4):1289–1306, 2006.
- Ela10
Michael Elad. Sparse and redundant representations. Springer, 2010.
- Fou11
Simon Foucart. Recovering jointly sparse vectors via hard thresholding pursuit. Proc. Sampling Theory and Applications (SampTA)],(May 2-6 2011), 2011.
- Mal08
Stephane Mallat. A wavelet tour of signal processing: the sparse way. Access Online via Elsevier, 2008.
- NT09
Deanna Needell and Joel A Tropp. CoSaMP: iterative signal recovery from incomplete and inaccurate samples. Applied and Computational Harmonic Analysis, 26(3):301–321, 2009.
- PRK93
Yagyensh Chandra Pati, Ramin Rezaiifar, and PS Krishnaprasad. Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition. In Signals, Systems and Computers, 1993. 1993 Conference Record of The Twenty-Seventh Asilomar Conference on, 40–44. IEEE, 1993.
- RV19
Matteo Ravasi and Ivan Vasconcelos. PyLops: a linear-operator Python library for large-scale optimization. arXiv preprint arXiv:1907.12349, 2019.
- Tro04
Joel A Tropp. Greed is good: algorithmic results for sparse approximation. Information Theory, IEEE Transactions on, 50(10):2231–2242, 2004.
- YZ11
Junfeng Yang and Yin Zhang. Alternating direction algorithms for l_1-problems in compressive sensing. SIAM journal on scientific computing, 33(1):250–278, 2011.
- ZYY10
Yin Zhang, Junfeng Yang, and Wotao Yin. User's guide for yall1: your algorithms for l1 optimization: version 1.0. Technical Report, CAAM Department, Rice University, 2010.