Azure Machine Learning — First Impressions

Mark Ryan
4 min read · Aug 28, 2018


For the last year and a half I have been using Watson Studio and DSX Local as my development environments for exploring machine learning and implementing models. Inspired by Siraj Raval’s video on Azure Machine Learning I decided to take the plunge and check out Microsoft’s ML environment. This posting covers my first impressions, the good and the bad, and contrasts Azure ML with Watson Studio / DSX.

Getting a first taste of Azure ML from a standing start is relatively easy — here’s a quick overview of the preparatory steps:

  1. Review this introduction to get a high-level view of the components of machine learning using Azure.
  2. If you don’t already have access to Azure, you can easily set up a free account. Unlike IBM, Microsoft asks for a credit card before you get useful access.
  3. Install the Azure Machine Learning Workbench.
Azure ML Workbench

As Siraj notes in his video, Microsoft touts a hybrid (on prem + cloud) approach. Before getting into the experience of using Azure ML, I’d like to contrast my experience of “hybrid” Microsoft vs. IBM:

  • Microsoft has a locally installed development environment (Azure Machine Learning Workbench) that you use in conjunction with Azure’s cloud services.
  • IBM packages common ML technology separately as Watson Studio (for public cloud) and DSX Local (for behind the firewall / private cloud and as part of the IIAS appliance).

Full disclosure: I am an IBM employee, but I can see pros and cons to both approaches. One advantage to IBM’s pure public cloud offering (Watson Studio) is that you can evaluate it without having to worry about any local system prerequisites, and particularly without the drama of conflicting requirements — more about that below.

The design of the Azure ML Workbench is intuitive, and having direct access to the local file system is handy. The Workbench is a bit of a diva in terms of its system requirements, though: my primary Windows machine didn’t make the cut. Not the end of the world, I thought; I’d use my second Windows system. The Workbench install was painless on machine #2. However, Docker, the other diva in the stack, was happy to install on machine #1 but not on machine #2. I eventually got both pieces onto the same system after I found an older release of Docker that would deign to be installed on machine #2. This experience of mutually exclusive prerequisites made me nostalgic for the simplicity of Watson Studio’s all-cloud approach.

Easy to illustrate run history

So, with Azure ML Workbench and Docker finally both on the same system, I started through the Workbench introductory tutorial that uses the Iris data set. The tutorial is divided into three parts:

  1. Prepare data — introduces the concept of data preparation packages, a slick and intuitive way to prepare data for the model.
  2. Build model — the notebook environment is pretty standard. I liked the run history features and could see using them extensively in experiments. It is really easy to define new visualizations of results — nicely done! (A rough sketch of this step follows the list.)
  3. Deploy model — and here’s where things went from smooth to rough. As the Ramones logo stated, look out below.
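
To give a feel for the build-model step, here is a minimal sketch of the kind of code part 2 walks through: a scikit-learn logistic regression on the Iris data set, with metrics logged so they show up in the run history. This is not the tutorial’s exact code, and the azureml.logging import reflects my understanding of the Workbench-era logger, so treat it as illustrative rather than definitive.

```python
# Minimal sketch of the "build model" step: train a logistic regression on Iris
# and report accuracy. Not the tutorial's exact code.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=0)

reg = 0.01  # regularization rate; vary this between runs to compare them in run history
clf = LogisticRegression(C=1.0 / reg, solver="liblinear")
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print("Regularization rate:", reg, "Accuracy:", accuracy)

# In the Workbench, each execution is tracked as a run; metrics logged like this
# feed the run history charts. The logger API below is an assumption based on the
# Workbench-era tutorial, so it is wrapped defensively.
try:
    from azureml.logging import get_azureml_logger  # ships with the Workbench environment
    run_logger = get_azureml_logger()
    run_logger.log("Regularization Rate", reg)
    run_logger.log("Accuracy", accuracy)
except ImportError:
    pass  # running outside the Workbench; skip run-history logging
```

Running the same script with different values of reg is what makes the run history charts interesting: each run lands as a point you can compare and visualize.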

While parts 1 and 2 of the tutorial are really slick and show off Azure ML to advantage, part 3 shows that the end-to-end story isn’t quite there yet. Some of the steps in part 3 are one-time command-line gobbledygook, so that’s OK, but I defy anybody who doesn’t literally sit beside a Microsoft engineer working on Azure ML to get the deployment to work “out of the box”.

Deployment not quite working

At the time of writing, despite working through a bunch of intermediate problems, I still haven’t been able to get the real-time web service to score data. Judging by the results of searching on the error above, I am not alone. I know that deployment can be challenging because of the number of layers involved, but my impression so far is that Azure ML does a worse job of hiding this complexity than, say, DSX Local.
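
For context, here is what “score data” means in this tutorial: part 3 is supposed to leave you with a REST endpoint that accepts Iris measurements and returns a predicted class. The sketch below shows roughly what a successful scoring call would look like; the endpoint URL, port, and payload field names are placeholders of my own, not the tutorial’s actual values.

```python
# Rough sketch of calling the real-time scoring web service once it is deployed.
# The URL and payload shape are hypothetical placeholders.
import json
import requests

scoring_url = "http://localhost:32770/score"  # hypothetical local Docker endpoint
headers = {"Content-Type": "application/json"}

# One Iris sample: sepal length/width and petal length/width, in centimeters
payload = json.dumps({"input_df": [{"sepal length": 3.0,
                                    "sepal width": 3.6,
                                    "petal length": 1.3,
                                    "petal width": 0.25}]})

response = requests.post(scoring_url, data=payload, headers=headers)
print(response.status_code, response.text)  # expect 200 and a JSON prediction
```

This is the step that keeps failing for me before the request ever returns a prediction.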

I have not given up on the deployment part of the Azure ML tutorial. Some folks report success deploying with Azure Container Services (ACS) rather than the local Docker setup specified in the tutorial. Assuming I can get deployment with ACS to work, I will post about that experience, along with further adventures with Azure ML.

Written by Mark Ryan

Technical writing manager at Google. Opinions expressed are my own.
