Rashish Tandon (राशीश टंडन)

me 

I am an ML Engineer at Apple, based in Seattle.

Prior to joining Apple, I was a PhD student in the Department of Computer Science at UT Austin, advised by Alex Dimakis and Pradeep Ravikumar (now at CMU). My PhD thesis dealt with machine learning in high-dimensional and distributed settings. Before that, I obtained a B.Tech/M.Tech (Dual Degree) in Computer Science from IIT Kanpur in 2011.

Research

  • PhD Thesis: On Structured and Distributed Learning [pdf]
       R. Tandon

  • Gradient Coding from Cyclic MDS Codes and Expander Graphs [pdf]
       N. Raviv, I. Tamo, R. Tandon, A. Dimakis
       To appear in the IEEE Transactions on Information Theory, 2020
       - Also appeared in the International Conference on Machine Learning (ICML), 2018

  • Gradient Coding: Avoiding Stragglers in Distributed Synchronous Gradient Descent [pdf] [code]
       R. Tandon, Q. Lei, A. Dimakis, N. Karampatziakis
       In the International Conference on Machine Learning (ICML), 2017
       - A shorter version appeared in the ML Systems Workshop (MLSys), NIPS 2016

  • Kernel Ridge Regression via Partitioning [pdf] [code]
       R. Tandon, S. Si, P. Ravikumar, I. Dhillon
       Preprint

  • On the Information Theoretic Limits of Learning Ising Models [pdf]
       R. Tandon, K. Shanmugam, P. Ravikumar, A. Dimakis
       In Advances in Neural Information Processing Systems (NIPS), 2014

  • Learning Graphs with a Few Hubs [pdf] [appendix]
       R. Tandon, P. Ravikumar
       In the International Conference on Machine Learning (ICML), 2014

  • Learning Sparsely Used Overcomplete Dictionaries via Alternating Minimization [pdf]
       A. Agarwal, A. Anandkumar, P. Jain, P. Netrapalli, R. Tandon
       In the Conference on Learning Theory (COLT), 2014

  • On the Difficulty of Learning Power-Law Graphical Models [pdf]
       R. Tandon, P. Ravikumar
       In the IEEE International Symposium on Information Theory (ISIT), 2013