Quick Overview of Theano: A Popular New Python Library for Deep Learning

Sam Hillis, Data Scientist
Read time: approx. 3 minutes

In a recent engagement, we were tasked with evaluating sets of time series data at many intervals to determine the health of the system (essentially, classifying each increment of the time series). Being biased towards Python, we built datasets and began testing with most of the standard machine learning classification algorithms in scikit-learn. It quickly became evident that we were not going to achieve the desired level of predictive accuracy; using lagged variables and other standard tricks did not truly capture the correlation between time periods.
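
For context, the lagged-variable approach looks roughly like the sketch below. The classifier choice, feature dimensions, and toy data are illustrative stand-ins, not our actual engagement setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def make_lagged_features(series, labels, n_lags=5):
    """Row t of X holds the previous n_lags values of the series."""
    X = np.array([series[t - n_lags:t] for t in range(n_lags, len(series))])
    y = labels[n_lags:]
    return X, y

# Toy signal standing in for one of the monitored time series.
rng = np.random.RandomState(0)
series = rng.randn(1000)
labels = (np.convolve(series, np.ones(3), mode="same") > 0).astype(int)

X, y = make_lagged_features(series, labels)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```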

After some discussion, we determined that the data could be modeled well using recurrent neural networks. Our first stop was PyBrain, one of the standard modules for building neural networks in Python. The library offered all of the flexibility required to build our custom network, but training such a large network (containing over 100 hidden layers) took far too long.
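
For reference, standing up a recurrent net in PyBrain takes only a few lines; the sketch below uses placeholder dimensions and toy data rather than our actual architecture.

```python
import numpy as np
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SequentialDataSet
from pybrain.supervised.trainers import BackpropTrainer

n_inputs, n_hidden, n_classes = 10, 50, 2  # placeholder dimensions

# recurrent=True adds a recurrent connection on the hidden layer.
net = buildNetwork(n_inputs, n_hidden, n_classes, recurrent=True)

# SequentialDataSet groups time steps into sequences, so recurrent
# state is carried across samples within each sequence.
ds = SequentialDataSet(n_inputs, n_classes)
ds.newSequence()
rng = np.random.RandomState(0)
for _ in range(100):
    x = rng.randn(n_inputs)
    target = np.zeros(n_classes)
    target[int(x.sum() > 0)] = 1  # toy label
    ds.addSample(x, target)

trainer = BackpropTrainer(net, ds)
trainer.train()  # one pass over the dataset; this is the slow part
```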

Ultimately, we turned to a library that is making a lot of noise in the deep learning community: Theano. Theano has a steep learning curve for most Python users, as its framework for declaring variables and building functions differs greatly from base Python. No machine learning algorithms exist “out-of-the-box” in Theano; however, tutorials and examples exist to get new users started in the right direction. If you can get past these hurdles, the performance benefits are incredible. Many great Python libraries (numpy, pandas, etc.) perform operations more quickly than analogous operations in base Python by taking advantage of optimized methods run in C or Fortran. Theano can dynamically generate C code for all functions defined within its framework, allowing it “to attain speeds rivaling hand-crafted C implementations for problems involving large amounts of data”. In addition, Theano code can be executed on a GPU, whose hundreds of cores (versus the small handful on a typical CPU) can increase speed by an order of magnitude through additional parallel processing. As a disclaimer, this was not a trivial task*.
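
To give a feel for that learning curve, here is a minimal sketch of Theano's declare-then-compile workflow (a toy softmax classifier, not our network): symbolic variables are declared, an expression graph is built, and only the call to theano.function triggers compilation.

```python
import numpy as np
import theano
import theano.tensor as T

# Symbolic variables: nothing is computed when these lines run.
X = T.matrix('X')
W = theano.shared(np.random.randn(10, 2).astype(theano.config.floatX), name='W')
b = theano.shared(np.zeros(2, dtype=theano.config.floatX), name='b')

# Build an expression graph (a softmax classifier's predictions).
probs = T.nnet.softmax(T.dot(X, W) + b)

# Compiling the graph is where Theano optimizes it and can emit C code
# (or GPU code, if so configured).
predict = theano.function(inputs=[X], outputs=T.argmax(probs, axis=1))

print(predict(np.random.randn(5, 10).astype(theano.config.floatX)))
```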

It took us about a week to get our custom neural networks up and running on GPUs in Theano. In the end, the extra time invested proved worthwhile: the several hundred models we needed to build for cross-validation and parameter tuning over many time horizons took a little over two days to train; by our estimation, the same models built and executed in PyBrain would have taken over a month. If you are considering building a complex neural network, or just training a standard net on a large dataset, take a look at the Theano tutorial; there is a good chance the initial time investment will pay off.

*Note: if you do not have access to a machine with a GPU that is compatible with Theano, AWS GPU instances can be used (you can even find AMIs with the necessary software for Theano already installed).
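
For what it's worth, once the drivers are in place, pointing Theano at the GPU comes down to a configuration setting; a typical ~/.theanorc looks roughly like this:

```ini
# ~/.theanorc -- tell Theano to run compiled functions on the GPU
[global]
device = gpu
floatX = float32
```

The same flags can also be set per-run through the THEANO_FLAGS environment variable.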