I present some of our group’s recent work on Gaussian processes (GPs), from both theoretical and applied perspectives. I first introduce the Rényi GP, an alternative objective for GPs that can improve generalization by tuning the induced regularization. I then highlight the ability of mini-batch stochastic gradient descent to perform inference in GPs (a correlated setting), scaling them far beyond what was thought possible. From an applied perspective, I introduce predictive GP models for joint event data, weakly supervised settings, and state-of-the-art multi-output regression.