[talks] S Jafarpour general exam

Melissa Lawson mml at CS.Princeton.EDU
Fri Jan 9 14:56:05 EST 2009


Sina Jafarpour will present his research seminar/general exam on Friday January 16
at 2:30 PM in Room 402.  The members of his committee are:  Rob Calderbank (advisor), 
Rob Schapire, and Andrea LaPaugh.  Everyone is invited to attend his talk, and those 
faculty wishing to remain for the oral exam following are welcome to do so.  His 
abstract and reading list follow below.
------------------------------------------- 

The magic of sparsity.

Joint work with Robert Calderbank, Robert Schapire, and Ingrid Daubechies.

Compressed sensing is a novel data acquisition method, suitable when either the
measurement or the compression is costly. The traditional approach to compressed
sensing uses convex optimization methods, which rely on a "near isometry" property of
the sensing matrix. This approach is costly in terms of recovery time as well as
required storage.
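
To make the convex approach concrete: the near isometry (restricted isometry)
property asks that (1-delta)||x||^2 <= ||Ax||^2 <= (1+delta)||x||^2 for every sparse
x, and under it the L1 minimization  min ||x||_1 subject to Ax = y  recovers a sparse
x. The following basis pursuit sketch (a minimal illustration, not the method
presented in the talk; assumes numpy and scipy) casts that minimization as a linear
program. Its dense m x n Gaussian matrix and LP over 2n variables are exactly the
storage and recovery costs in question.

  # Basis pursuit: recover a sparse x from y = A x by solving
  #   min ||x||_1  subject to  A x = y,
  # written as a linear program via the split x = u - v with u, v >= 0.
  import numpy as np
  from scipy.optimize import linprog

  def basis_pursuit(A, y):
      m, n = A.shape
      c = np.ones(2 * n)                  # sum(u) + sum(v) = ||x||_1
      A_eq = np.hstack([A, -A])           # A u - A v = y
      bounds = [(0, None)] * (2 * n)      # u, v >= 0
      res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
      return res.x[:n] - res.x[n:]

  rng = np.random.default_rng(0)
  n, m, k = 200, 80, 5                    # ambient dim, measurements, sparsity
  x = np.zeros(n)
  x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
  A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian: near isometry w.h.p.
  y = A @ x                               # m << n linear measurements
  x_hat = basis_pursuit(A, y)
  print(np.max(np.abs(x_hat - x)))        # near zero once m ~ k log(n/k)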

In this talk, we present two new efficient sensing methods. We first show how
expander graphs are well suited to compressed sensing, providing explicit and
efficient sensing matrices together with a simple and efficient recovery algorithm.
Our second result provides efficient deterministic sensing: we give very simple
conditions under which a matrix satisfies a statistical restricted isometry property.
This leads to deterministic sensing matrices for which the recovery time depends only
on the number of measurements, and which may even bypass the lower bounds for random
sensing.
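
To see why sparse, graph-based matrices help, consider this minimal sketch
(illustrative only: the per-coordinate median decoder below is a count-median-style
estimate, not the recovery algorithm of the talk). The sensing matrix is the 0/1
adjacency matrix of a d-left-regular bipartite graph, an expander with high
probability, so measurement takes O(dn) time and storage rather than O(mn).

  # Sensing with a sparse 0/1 matrix: coordinate j participates in d
  # randomly chosen measurements, i.e. column j of the matrix is the
  # indicator of d rows (a d-left-regular bipartite graph).
  import numpy as np

  rng = np.random.default_rng(1)
  n, m, d, k = 200, 100, 6, 3         # signal dim, measurements, left degree, sparsity
  neighbors = np.array([rng.choice(m, d, replace=False) for _ in range(n)])

  x = np.zeros(n)
  x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

  y = np.zeros(m)                     # y[i] = sum of x[j] over edges (j, i)
  for j in range(n):
      y[neighbors[j]] += x[j]         # O(dn) work versus O(mn) for a dense matrix

  # Most of a coordinate's d measurements see no other support element,
  # so the median of those measurements roughly estimates x[j].
  x_hat = np.array([np.median(y[neighbors[j]]) for j in range(n)])
  print(np.linalg.norm(x_hat - x))    # small; the error comes from rare collisions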

Finally, we show how compressed sensing can be regarded as a universal sparse
dimensionality reduction technique. In particular, we provide tight bounds
demonstrating that the soft-margin SVM classifier in the compressed domain is almost
as accurate as the best classifier in the high-dimensional domain. This is beneficial
from the compressed sensing point of view, by reducing recovery time, and from the
machine learning point of view, by bypassing the curse of dimensionality.
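
A toy experiment in the same spirit (the data model, the dimensions, and the use of
scikit-learn's LinearSVC are assumptions for illustration, not the construction
analyzed in the talk): train a soft-margin linear SVM on m-dimensional random
measurements of n-dimensional sparse data and compare it against the SVM trained on
the raw high-dimensional data.

  # Compressed learning sketch: an SVM trained on measurements A x should be
  # nearly as accurate as one trained on the raw k-sparse data.
  import numpy as np
  from sklearn.model_selection import train_test_split
  from sklearn.svm import LinearSVC

  rng = np.random.default_rng(2)
  n, m, N, k = 1000, 100, 600, 10     # ambient dim, compressed dim, samples, sparsity

  X = np.zeros((N, n))                # k-sparse data points
  for i in range(N):
      X[i, rng.choice(n, k, replace=False)] = rng.standard_normal(k)
  w = rng.standard_normal(n)          # an arbitrary linear separator
  labels = np.sign(X @ w)

  A = rng.standard_normal((m, n)) / np.sqrt(m)  # sensing matrix
  Z = X @ A.T                         # one measurement vector per data point

  Xtr, Xte, Ztr, Zte, ytr, yte = train_test_split(X, Z, labels, random_state=0)
  acc_high = LinearSVC(C=1.0, dual=False).fit(Xtr, ytr).score(Xte, yte)
  acc_low = LinearSVC(C=1.0, dual=False).fit(Ztr, ytr).score(Zte, yte)
  print(acc_high, acc_low)            # compressed accuracy tracks high-dim accuracy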



Books:
======
 
1. S. Russell and P. Norvig. Artificial Intelligence: A Modern Approach.
   Chapters: 
     3.1-3.5
     4.1-4.3
     6
     7.1-7.6
     11.1, 11.5
     8.1-8.3
     13.1-13.6
     14.1-14.5
     15.1-15.6
     16.1-16.3
     17.1-17.4
     18.1-18.5
     20.1-20.3,20.5-20.6
     21.1-21.4

2. Lecture notes for Computer Science 511: Theoretical Machine Learning, by Robert
Schapire, Spring 2008.

3. S. Boucheron, O. Bousquet, and G. Lugosi, Theory of classification: a survey of some
recent advances. ESAIM: Probability and Statistics, 9:323-375, 2005.
http://www.econ.upf.edu/~lugosi/esaimsurvey.pdf


4. S. Boyd and L. Vandenberghe, Convex Optimization.
   Chapters:
     2.1-2.3
     3.1.1-3.2
     4.1-4.4
     5.1-5.2
     5.4-5.5


Papers:
=======
 
1. C. Burges. A tutorial on Support Vector Machines for pattern recognition. Data Mining
and Knowledge Discovery, 2(2):121-167, 1998.
http://www.umiacs.umd.edu/~joseph/support-vector-machines4.pdf

2. S. Shalev-Shwartz and N. Srebro. SVM Optimization: Inverse Dependence on Training Set
Size. ICML, 2008. http://icml2008.cs.helsinki.fi/papers/266.pdf

3. A. Blum. Random Projection, Margins, Kernels, and Feature-Selection. LNCS vol. 3940, pp.
52-68, 2006. http://www.cs.cmu.edu/~avrim/Papers/randomproj.pdf

4. R. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated
predictions. Machine Learning, 37(3):297-336, 1999.
http://www.cs.princeton.edu/~schapire/uncompress-papers.cgi/SchapireSi98.ps

5. R. Baraniuk, Compressive sensing. IEEE Signal Processing Magazine, 24(4), pp. 118-121,
July 2007 http://www.dsp.ece.rice.edu/cs/baraniukCSlecture07.pdf

6. E. Candès and M. Wakin, An introduction to compressive sampling. IEEE Signal Processing
Magazine, 25(2), pp. 21-30, March 2008. http://www.dsp.ece.rice.edu/cs/CSintro.pdf


7. R. Berinde and P. Indyk, Sparse recovery using sparse random matrices. Preprint, 2008.
http://people.csail.mit.edu/indyk/report.pdf

8. R. Baraniuk, M. Davenport, R. DeVore, and M. Wakin, A simple proof of the restricted
isometry property for random matrices using the Johnson-Lindenstrauss lemma. To appear in
Constructive Approximation. http://www.dsp.ece.rice.edu/cs/JL_RIP.pdf


9. V. Chandar, A negative result concerning explicit matrices with the restricted isometry
property. Preprint, 2008. http://www.dsp.ece.rice.edu/cs/Venkat_CS.pdf

10. L. Applebaum, S. Howard, S. Searle, and R. Calderbank, Chirp sensing codes:
Deterministic compressed sensing measurements for fast recovery. Applied and Computational
Harmonic Analysis, September 2008. http://www.dsp.ece.rice.edu/cs/chirpCode.pdf

11. S. D. Howard, A. R. Calderbank, and S. J. Searle, A fast reconstruction algorithm for
deterministic compressive sensing using second order Reed-Muller codes. Conference on
Information Sciences and Systems (CISS), Princeton, New Jersey, March 2008.
http://www.dsp.ece.rice.edu/cs/HCS_CISS_2008.pdf

12. W. Xu and B. Hassibi, Efficient compressive sensing with deterministic guarantees using
expander graphs. http://www.dsp.ece.rice.edu/cs/cs-expander2.pdf
      



