
For many years now I have been interested in applying deep learning to tabular data. In fact, I have been so interested in this topic that I wrote a book about it. In this article, I’ll review why I think this topic is important, why I have new hope that deep learning with tabular data could go mainstream, and why the counterargument, that tabular data is fundamentally unsuitable for deep learning, is still kicking.

Why deep learning with tabular data?


Doing a basic web deployment of a deep learning model is a good way to prototype how your model will be used and to validate assumptions you made during training. While working with Keras, I did simple deployments of several models and published articles describing my experience deploying Keras models with Facebook Messenger and Flask.
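To make the idea of a "basic web deployment" concrete, here is a minimal sketch of serving a model's predictions over HTTP with Flask. The `predict_stub` function, endpoint path, and payload shape are my own illustrative assumptions standing in for a loaded Keras model's `model.predict` call; they are not the article's actual code.

```python
# Minimal Flask sketch: serve predictions from a model behind a POST endpoint.
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict_stub(features):
    # Placeholder for a real call like model.predict(np.array([features]));
    # here it just averages the inputs so the sketch runs without Keras.
    return sum(features) / len(features)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [1.0, 3.0]}
    features = request.get_json()["features"]
    return jsonify({"prediction": predict_stub(features)})
```

In a real deployment, the stub would be replaced by a Keras model loaded once at startup, so each request pays only for inference.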

Since I have spent much of the last year using the other major high-level deep learning framework, fastai, I wanted to get practical experience with simple deployments of fastai models. I decided to adapt the approach I used…

New York’s subway (image: Shutterstock)

In late 2020 I published an article on using GPT-3 to navigate the London Underground. This article described how I created a simple harness in Python that:

  • Defines a GPT-3 object, reads a CSV file containing prompts, and calls the GPT-3 add_example API to make the prompts available to the GPT-3 object
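The read-the-CSV-and-register-examples step can be sketched in a few lines. The real harness used the GPT-3 sandbox's objects and its `add_example` call; the `MiniGPT3Harness` class and the CSV column names (`prompt`, `completion`) below are stand-ins so the flow is runnable without an API key.

```python
# Hypothetical sketch of the priming step: read prompt/completion pairs
# from CSV text and register each one with a harness object.
import csv
import io

class MiniGPT3Harness:
    """Stand-in for the real GPT-3 object; add_example() would prime GPT-3."""
    def __init__(self):
        self.examples = []

    def add_example(self, prompt, completion):
        # In the real harness, this call sends a priming example to GPT-3.
        self.examples.append((prompt, completion))

def load_examples(csv_text, harness):
    # Each CSV row becomes one priming example.
    for row in csv.DictReader(io.StringIO(csv_text)):
        harness.add_example(row["prompt"], row["completion"])

sample = "prompt,completion\nKings Cross to Victoria,Victoria line southbound\n"
harness = MiniGPT3Harness()
load_examples(sample, harness)
```

Keeping the examples in a CSV file rather than hard-coding them makes it easy to swap in a different set of priming prompts without touching the harness.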

Getting full details about fastai curated datasets

If you want to learn about deep learning, you really can’t go wrong with the fastai framework. This framework is at the heart of a set of courses and is now the topic of a book written by fastai’s creator, Jeremy Howard.

Here are some of the benefits of fastai:

  • Ease of entry — even beginners can create high-performing deep learning models in a few lines of code with fastai.
  • Intelligent defaults throughout — if you want to take the happy path, fastai makes good default choices for many settings, allowing you to get a working model with minimal…


Using GPT-3 to navigate the London Underground

I have done a few simple experiments with GPT-3, including comparing its performance to a Rasa chatbot and using it to generate git commands. From this experience, and the dozens of applications of GPT-3 that others have published, I have come to appreciate GPT-3’s remarkable ability to solve a wide range of problems. I had not, however, seen GPT-3 tackle any spatial problems, so I asked myself, how well would GPT-3 be able to tackle a spatial navigation problem?

I thought that navigating a subway system would be a good test case for GPT-3’s spatial abilities because trips on a…

Like git, learning to drive with a manual transmission can be hard

Using GPT-3 to generate git commands from English descriptions of what I want to do

There are some basic skills that are easy if you learn when you are young but can be baffling if you have to master them later in life. For example, driving a car with a manual transmission can seem to be perversely difficult to somebody who learned to drive on an automatic. However, the challenge of “driving stick” pales in comparison to mastering the git command line interface if you didn’t grow up with git. In this article I’ll describe how I harnessed OpenAI’s GPT-3 to generate git commands from English language descriptions.
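The heart of this kind of GPT-3 application is a few-shot prompt: a handful of description/command pairs followed by the new request. The example pairs and the prompt format below are my own sketch, not the article's actual priming data.

```python
# Illustrative few-shot prompt builder for English-to-git translation.
# GPT-3 would be asked to complete the text after the final "Command:".
EXAMPLES = [
    ("create a new branch called fix-typo", "git checkout -b fix-typo"),
    ("show the last three commits", "git log -3"),
]

def build_prompt(request):
    lines = []
    for description, command in EXAMPLES:
        lines.append(f"English: {description}\nCommand: {command}")
    # The unanswered final pair is what GPT-3 is asked to complete.
    lines.append(f"English: {request}\nCommand:")
    return "\n\n".join(lines)

prompt = build_prompt("undo the last commit but keep the changes")
```

The consistent "English:/Command:" framing is what cues the model to answer with a bare git command rather than a prose explanation.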

What’s great (and not so great) about git

Richard Trevithick’s locomotive — the GPT-3 of the early 19th century

Comparing the performance of GPT-3 and a custom-trained Rasa chatbot

In 1829, an event took place that unleashed a technological revolution. At the Rainhill Trials a group of steam locomotives squared off to determine which one could win a series of tests of speed, strength and reliability. The winning machine, Rocket, not only blew away its competition at the trials, it also set the direction for steam locomotive development for the following century.

A wealth of information in the Postgres catalog

Getting the most out of database metadata

Relational databases like Postgres include a set of tables that describe the database itself. This set of metadata tables is called the catalog, and it contains a treasure trove of details about the database. I recently needed to write a program to automatically extract insights from database catalogs, which led me to write a simple Python module to connect to a Postgres database, get information from the catalog, and load that information into a Pandas dataframe for further processing in Python. This article describes the process I followed.
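The core of such a module is a query against the standard `information_schema` views. The query below is a sketch of that step; the commented psycopg2/pandas usage shows the general pattern for loading the result into a dataframe, not the module's exact code.

```python
# Sketch: build the SQL that lists every column in a schema, in table order.
# information_schema.columns is part of the SQL standard and present in Postgres.
def catalog_query(schema="public"):
    """Return SQL listing each column's table, name, type, and nullability."""
    # Note: for untrusted schema names, use a parameterized query instead
    # of string formatting.
    return (
        "SELECT table_name, column_name, data_type, is_nullable "
        "FROM information_schema.columns "
        f"WHERE table_schema = '{schema}' "
        "ORDER BY table_name, ordinal_position"
    )

# With a live connection, loading the result into pandas would look like:
#   conn = psycopg2.connect("dbname=mydb user=me")
#   df = pd.read_sql(catalog_query(), conn)
```

Ordering by `ordinal_position` keeps columns in their declared order, which makes the resulting dataframe easier to compare against the table definitions.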



I recently had the opportunity to deliver a hands-on workshop on training a Keras deep learning model. This workshop was a follow-on to a session I had done for a local meetup that reviewed the content of my upcoming book for Manning Publications, Deep Learning with Structured Data. After the introductory session there was appetite for a hands-on session where participants could work through the process of training one of the deep learning models featured in the book.

Options for the deep learning model training workshop

Cage Match (illustration by author)

Ever since I had my first taste of deep learning I have been interested in applying it to structured, tabular data. I have written several articles on the subject and I am writing a book on Deep Learning with Structured Data for Manning Publications. It would be great to tackle problems with structured tabular data by harnessing deep learning’s flexibility and potential for reduced feature engineering.

The idea of using deep learning on tabular data is not without its critics. A consistent objection I have heard is that non-deep learning approaches, XGBoost in particular, are simpler to code, easier to…

Mark Ryan

Data Science manager at Intact Insurance. Opinions expressed are my own.
