2 November 2015
I still remember that summer in Switzerland. Tucked away in a
remote castle in the Swiss Alps, we had been handed a data
warehouse containing 22 million sensitive documents.
The documents included emails, contracts, and spreadsheets;
there were invoices and PDFs of all sorts. We had to submit our
findings in one week, as the government investigation had already
started. At the time there were only three of us. "Seven million
documents each", we thought aloud. "Just a million per day".
My colleagues and I were forensic accountants. We had plied our
trade around the world from Kazakhstan to Venezuela after the
financial crisis. A typical engagement would have us reviewing
financial statements and advising clients about corruption risk.
But now we were feeling the urgency; after a late afternoon phone
call the previous day, we had each jumped on the first flight from
London to Geneva that morning. We were soon to become an integral
part of a quickly escalating investigation.
Our task in that week was not a standard review of internal
controls. This time we were looking for bribes in one of the
largest companies in Europe. This was a sophisticated operation,
where employees spoke four languages and transacted business in
three currencies. They were experts at getting around systems, and
they had evaded detection for years. To catch them, we had to think
like they did. We were looking for a needle in a haystack: a single
transaction in US dollars, approved directly by a senior executive.
The Thinking Man's Algorithm
Dancing between computers and humans.
So, we began at the beginning, looking through documents as a
normal accountant would: flipping from scanned document to scanned
document, skimming the content. At the end of the first day, we had
each reviewed ten thousand documents and had mapped out a few
possibilities. But there weren't any warm leads, and we had
millions of documents left to review. Short of expanding our team
100-fold, our approach wasn't feasible. We needed to change our
strategy. I spent the next morning attempting to automate our
searching; I knew a bit about coding from my engineering
days. After some trial and error, I finally landed on a method that
could search through folders, metadata tags and text simultaneously. The
first test run scanned ten thousand documents in 25 minutes.
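The post doesn't preserve what those scripts actually looked like, but a minimal sketch of a combined folder, metadata and text search, assuming plain-text extracts on disk and an entirely invented keyword list, might run along these lines:

```python
import os

# Hypothetical search terms; the real investigation keywords are not disclosed.
KEYWORDS = {"usd", "wire transfer", "consulting fee"}

def matches(text):
    """Return True if any keyword appears in the lowercased text."""
    text = text.lower()
    return any(kw in text for kw in KEYWORDS)

def search_documents(root):
    """Walk a folder tree, checking folder paths, file names (a crude
    stand-in for metadata tags) and file contents in a single pass."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if matches(dirpath) or matches(name):
                hits.append(path)
                continue
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if matches(f.read()):
                        hits.append(path)
            except OSError:
                pass  # unreadable file: skip rather than crash mid-run
    return hits

if __name__ == "__main__":
    for hit in search_documents("./warehouse"):
        print(hit)
```

The point is less the matching logic than the single pass: one walk over the tree covers all three dimensions at once, which is what makes ten thousand documents in 25 minutes plausible.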
Soon, I was running four and five algorithms at a time, testing
out different hypotheses for more leads. Sometimes I would set off
too many tasks in parallel; my computer would get hot, and I knew I
was stressing the hardware. Other times the software would freeze;
that meant I was doing too much in one sequence. And so I sat for
hour upon hour, manually pulling the strings to control each
automated task. Balancing on that optimized point between software
and hardware was one of those exciting experiences that later led
me to dive further into analytics.
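None of that hand-tuned orchestration survives in writing; as a rough sketch of the same balancing act, assuming each hypothesis can be expressed as an independent function over a batch of documents, a bounded worker pool caps how much runs at once so the hardware is never oversubscribed. The names and placeholder matching below are invented for illustration:

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_hypothesis(job):
    """Stand-in for one search hypothesis over a batch of documents.
    Returns (hypothesis_name, candidate_leads)."""
    name, batch = job
    leads = [doc for doc in batch if name in doc]  # placeholder matching
    return name, leads

def run_in_parallel(jobs, max_workers=4):
    """Run several hypotheses concurrently, capped at max_workers so
    the machine never overheats or freezes under too many tasks."""
    results = {}
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(run_hypothesis, job) for job in jobs]
        for future in as_completed(futures):
            name, leads = future.result()
            results[name] = leads
    return results

if __name__ == "__main__":
    docs = [f"doc_{i}" for i in range(100)]
    jobs = [("doc_1", docs), ("doc_2", docs), ("doc_3", docs)]
    print(run_in_parallel(jobs))
```

Capping `max_workers` is the programmatic version of sitting there pulling the strings by hand: the queue absorbs the excess instead of the hardware.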
It required me to think like a computer, to function like an
algorithm. It forced me to simulate a stochastic human process with
mini equations, and then stitch them together into a flexible
model. This same area, somewhere between raw calculations and
thoughtful strategy, is where, just five months earlier, the term
Prescriptive Analytics was first being used in an IBM lab six
thousand miles away; but more on that later.
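To make those stitched-together mini equations concrete: a deliberately toy Monte Carlo sketch of a suspicious-transaction model might chain one small probability per human step. Every number here is invented for illustration; nothing about the real model was published:

```python
import random

# Invented illustrative probabilities, one per modelled human step.
P_SKIP_CONTROL = 0.05    # a reviewer skips a control step
P_USD_INVOICE = 0.02     # an invoice is billed in US dollars
P_SENIOR_SIGNOFF = 0.01  # a senior signs off directly

def simulate_invoice():
    """One trial: do all three conditions co-occur on this invoice?
    Each comparison is a 'mini equation' for one stochastic step."""
    return (random.random() < P_SKIP_CONTROL
            and random.random() < P_USD_INVOICE
            and random.random() < P_SENIOR_SIGNOFF)

def expected_hits(n_trials=1_000_000):
    """Monte Carlo estimate of matches per million documents, which
    tells you how tightly the automated filters need to be tuned."""
    return sum(simulate_invoice() for _ in range(n_trials))

if __name__ == "__main__":
    print(expected_hits())  # ~10 per million with these made-up rates
```

Stitching small, independently testable probabilities together like this is what makes the model flexible: swap one step's equation and the rest still stands.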
Ultimately, we found our single transaction that week. With the
aid of some smart algorithms we were able to model uncertain human
behaviour to get a specific outcome. The investigation quickly
scaled up, and soon one transaction turned into a half dozen. Our
team expanded to ten people, and the investigation extended to
several months. By the end, our now-proven ability to model
behaviour got us access to Swiss banking records, where we found
even more transactions. I remember it as the summer that Analytics
got its capital A.
Listen before you create.
These days I work in healthcare, and it's very much the same
investigative mindset. Only today we use Analytics to uncover
complications from treatment or monitor complex patient pathways.
We've built systems that help clinicians audit the quality of care
across thousands of providers, and we've created networks that
share information between doctors, saving patients the significant
headache of re-entering routine data. We query years of medical
records with Descriptive Analytics, and we look for root cause
correlation with Predictive Analytics. But when we need to solve
the most uncertain challenges, it's back to that area between
computers and humans: Prescriptive Analytics.
To master the prescriptive arts, a new skill is required:
listening. We recently built a system that reminds doctors of the
most likely side effects for each treatment option while they are
deciding which course of treatment to prescribe. In this case we
needed to model clinicians' behaviour, so we sat with them and
listened closely to understand how they thought through tough
decisions. We observed their process and noted each minor step. We
searched for the potential bottlenecks and manual interventions. In
the end, it's about listening for randomness, and it's about
capturing it in the mini calculations of your model. Only then can
you turn stochastic, uncertain real-world interactions into
optimized multi-tiered algorithms.
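The clinical model itself isn't shown here; as a hedged illustration of the principle, each observed decision step can be reduced to one small, testable calculation. The treatment names and rates below are placeholders, standing in for probabilities you would only get by sitting with clinicians and their records:

```python
# Placeholder data: in practice these rates come from observed
# outcomes and from listening to clinicians, not a hard-coded dict.
SIDE_EFFECT_RATES = {
    "treatment_a": {"nausea": 0.30, "fatigue": 0.15, "rash": 0.05},
    "treatment_b": {"nausea": 0.10, "rash": 0.25},
}

def remind(treatment, top_n=3):
    """Rank the most likely side effects for a proposed treatment.
    The ranking itself is one deliberately small 'mini calculation'."""
    rates = SIDE_EFFECT_RATES.get(treatment, {})
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(remind("treatment_a"))  # [('nausea', 0.3), ('fatigue', 0.15), ('rash', 0.05)]
```

Keeping each block this granular is what preserves the modular, swap-one-piece-out flexibility described below.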
The iterative process of really listening is at the core of
these intelligent analytics models. Active listening ensures that
we can build technology solutions that directly address the
underlying problem. The more granular our "building blocks" are,
the more options we have as the process develops. This modular
functionality is a theme throughout Prescriptive Analytics. Just
like the system we developed in Switzerland, it's automating the
uncertainty that adds the most value.
Old School. Only Faster.
Don't abandon the old methods; just get really, really good at them.
All of this talk about tiered algorithms and intelligent
analytics doesn't mean we should leave the old methods behind. On
the contrary, progress is only possible if we build off the old
methods. For the BI function in your business, this means driving
to automate simple tasks, and it means investing in your old
Descriptive Analytics capabilities to build a strong arsenal of
dashboards and drill-down menus. Projects like moving data from
system to system should be an opportunity to gain new insights.
It means supplying your Predictive Analytics team with the tools
to data mine efficiently and forecast with confidence. Planning
should be painless, and business cases should be available with one
click. Once your resources are put to efficient use, you'll have
time to examine the really challenging issues. What choices do we
make to maximize value, to minimize errors? How do we keep our
colleagues and stakeholders happy and healthy, productive and
enabled? These are the challenges of Prescriptive Analytics. And
when you integrate the entire analytics spectrum within your BI
function, not only does quality improve at each step, but your BI
team delivers what it was always meant to: business intelligence.
That summer in Switzerland, we didn't notice the powerful shift
in technology coming our way. We couldn't imagine that by 2015
companies would be drafting a dedicated Analytics strategy, never
mind three of them. We were just happy to have stumbled on a few
slick algorithms that helped us track down that needle in our haystack.
By James Mac, Director - Analytics and
Finance, Capita Health Partners
Find out more about how we can help with
these challenges by email to firstname.lastname@example.org.
About the author
James Mac builds software and database solutions specialising in
the UK healthcare market. James is the Director of Analytics &
Finance for Capita Health Advisory, and he holds an MBA in Finance
from the University of Oxford.