Yesterday, I gave a talk at the Deep Learning London Meetup about my PhD research and my approach to winning the Galaxy Zoo challenge on Kaggle. The slides for my talk are available for download:

The three papers I discussed in the first part of the talk are described here; download links to the PDFs are included. A detailed description of my solution to the Galaxy Challenge is available in an earlier post on this blog. The code for all 17 models in the winning ensemble is available on GitHub.

New Lasagne feature: arbitrary expressions as layer parameters

Lasagne now supports arbitrary Theano expressions as layer parameters, which allows for greater flexibility and easier code reuse. Continue reading
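To illustrate the idea behind this feature (this is a toy sketch in plain NumPy, not Lasagne's actual API: the `ToyDense` class and its `weights` method are invented here for illustration), a layer parameter can be an expression derived from another layer's parameter rather than an independent array — for example, tying a decoder's weight matrix to the transpose of an encoder's:

```python
import numpy as np

# Toy sketch (not Lasagne's API): a "parameter" may be either a raw
# array or a zero-argument callable (an "expression") evaluated on
# demand, so one layer can reuse another layer's weights.
class ToyDense:
    def __init__(self, n_in, n_out, W=None):
        # W is an ndarray, a callable, or None (random initialization)
        self.W = W if W is not None else np.random.randn(n_in, n_out) * 0.01

    def weights(self):
        # Evaluate the expression if the parameter is a callable
        return self.W() if callable(self.W) else self.W

    def forward(self, x):
        return x @ self.weights()

enc = ToyDense(4, 2)
# Tie the decoder's weights to the transpose of the encoder's weights;
# the expression is re-evaluated on each use, so the tie always holds.
dec = ToyDense(2, 4, W=lambda: enc.weights().T)

x = np.ones((1, 4))
out = dec.forward(enc.forward(x))
```

Because the expression is evaluated lazily, updating the encoder's weights automatically updates the decoder's; in Lasagne the same effect is achieved symbolically through Theano's computation graph.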

Paper about my Galaxy Challenge solution

Published on March 25, 2015