Moore’s Law and the rise of cloud computing models that enable collaboration by networks of up to hundreds of thousands of individual CPUs on a single algorithm mean that old algorithms can find new applications and address much larger problems. I believe that this will have as profound an impact on knowledge work over the next two decades as the personal computer did in transforming clerical jobs in the 80’s and 90’s.
Algorithms in the Attic
Michael Schrage identified this trend in a February 2007 piece “Algorithms In The Attic.”
“As computing gets ever faster and cheaper, yesterday’s abstruse equations are becoming platforms for tomorrow’s breakthroughs. Companies in several industries are now dusting off these formulas and putting them in the service of new products and processes.”
Michael Schrage in “Algorithms In The Attic.”
Things we know but have not applied constitute a backlog of potential insights waiting to be harvested. If you can leverage an existing mathematical model or algorithm, it’s like gaining access to low-cost capital: you don’t have to invest time discovering and refining your intellectual infrastructure; you can focus on improving and applying existing “open source” know-how.
Why Now? Moore’s Law and High Bandwidth Networks
“Why should past work, often quite theoretical, be so useful now? Done in the absence of high-speed, low-cost computational capacity, that work put a premium on imaginative quantitative thinking. With today’s high-powered processors and broadband networks, those abstractions can point the way to practical software that leaps over current operational constraints. Disruptive opportunities abound.”
“There are huge hidden assets in the operations-research community,” says the MIT professor Richard Larson, a pioneer in probabilistic modeling techniques. “If you gave an army of 20 grad students the mission to rake through the published literature of the past 30 years, they would find stuff that has untapped business potential worth billions of dollars. There are many clever ideas my students worked on decades ago that in today’s networked environment would not be an academic exercise but a real business opportunity.”
Michael Schrage in “Algorithms In The Attic.”
Operations Research is now often referred to as Management Science, but it is more than 100 years old. The primary applications in its first three decades were operational planning in World War I and World War II. As computers gained speed and began to support complex languages, the discipline (you can also call it applied mathematics, applied statistics, or machine learning) picked up broader and deeper applicability.
Adoption of Algorithms Is A Differentiator in the Market
“A large opportunity may lie in simply getting more managers to use existing quantitative tools in their decision making. There’s no doubt, says the Stanford University professor Sam Savage, that literally millions of business spreadsheets would benefit from the stress-testing of key assumptions with “Monte Carlo” random number generators. To improve the reliability of their individual business plans, Savage observes, managers could even plug into enterprise-wide probability estimates.
Only recently have these academic research tools become part of everyday business practice in fields such as engineering and financial services. The rate at which they are intelligently adopted could be a differentiator in the wider marketplace.”
Michael Schrage in “Algorithms In The Attic.”
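Savage’s point about stress-testing spreadsheet assumptions with “Monte Carlo” random number generators can be sketched in a few lines. The following is a minimal illustration with made-up figures (the volumes, prices, and costs are hypothetical, not from the article): instead of computing profit once from point estimates, you draw the uncertain inputs from distributions many times and look at how often the plan actually loses money.

```python
import random

def simulate_loss_probability(trials=100_000, seed=42):
    """Monte Carlo stress test of a toy business plan.

    The point-estimate plan looks profitable: 10,000 units at a $3
    margin less $25,000 fixed costs yields $5,000. Simulating the
    uncertainty in the inputs shows how often it loses money anyway.
    """
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    losses = 0
    for _ in range(trials):
        units = rng.gauss(10_000, 2_500)   # unit sales: mean 10k, sd 2.5k
        price = rng.uniform(9.0, 11.0)     # selling price per unit
        unit_cost = rng.uniform(6.0, 8.0)  # variable cost per unit
        fixed_costs = 25_000
        profit = units * (price - unit_cost) - fixed_costs
        if profit < 0:
            losses += 1
    return losses / trials

if __name__ == "__main__":
    print(f"Probability of a loss: {simulate_loss_probability():.1%}")
```

The same idea applies to any spreadsheet: replace each key assumption cell with a distribution, recompute the bottom line thousands of times, and report a probability of outcomes rather than a single number.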
This is the transition that knowledge workers face over the next two decades. In the same way that the widespread deployment of personal computers in business transformed clerical labor starting in the mid-80’s, cloud-enabled algorithms are going to have a significant impact on knowledge work. The shift is already well underway in engineering, law, and accounting, and is spreading to medicine, education, and scientific research. Decisions about how large a datacenter to hire in the cloud for an hour or a few days, and which projects to allocate what mix of people and processors, will determine many firms’ competitiveness in winning work and then delivering on the promises made to win it.
It’s Not Just Green ASCII on a Black Screen
“The big-box retailers Wal-Mart and Best Buy, for example, are widely regarded as having superior analytic infrastructures. But they don’t just hire the smartest “quants”; they push to make their mathematical tools accessible to others. They substitute on-screen representations and visualizations of data for complex numerical equations. They’re constantly rethinking when mathematics should automate a decision and when it should simply assist the decision maker.”
Michael Schrage in “Algorithms In The Attic.” [bold added]
Knowledge work will be transformed but not obsoleted. While Stephen King and I can both use Microsoft Word to write our novels, his seem to find a much wider audience. The removal of error is not the same as adding insight: while removing errors may bring a novice up to a journeyman level of performance, a journeyman will need more than that to overtake a master. But in many fields a master who does not make use of the most powerful algorithms on an appropriate computing infrastructure will be outclassed by one who does. The challenge for startups will not just be to implement the algorithms but to do so in a way that intermediate solutions are easily visualized and provide a basis for suggestions and guidance from the human expert.
It also means that human experts will need to learn not only how to teach and delegate to journeyman- or apprentice-level human employees, but also how to delegate the appropriate tasks to algorithms assigned the right amount of computing capacity.
Related Blog Posts
- A Picture is Worth a Thousand CPU Hours, which ended with three questions for the reader:
  - What’s the strangest useful visualization you have seen in the last year?
  - What picture is worth 1,000 CPU hours (about $100 at a dime an hour on Amazon) to you?
  - What models for “Insight as a Service” do you see emerging?
- Hadoop Summit 2009 Quick Impressions
  The show reminded me a lot of INTEROP 88, the year that Interop transitioned from workshop to trade show with a few dozen vendors at the Santa Clara Convention Center. The vendor ecosystem for Hadoop is not yet as diverse, but the focus was clearly on system administration and technology, with the applications discussed in highly technical language. The crowd seemed to be researchers and system programmers for the most part, but the potential business impacts are starting to become a lot clearer.
- Address a Problem an Industry has Promoted by Satisfying a Basic Need
- IEEE/NATEA Event on Cloud Computing July 19 2008 at Stanford
  I have become convinced that “software above the level of the device,” whether it’s called Grid, Farm, Cluster, Multi-Core, ManyCore, or Cloud Computing, represents a significant opportunity for application development by start-ups.
- David Stutz: “Advice to Microsoft regarding commodity software (2003)”
  Useful software written above the level of the single device will command high margins for a long time to come.