For all the hype around recent advancements in artificial intelligence and machine learning, it’s important to remember that, at their core, a majority of these are little more than clever algorithms with a strong feedback loop.
Clever, of course, is an understatement. Convolutional neural networks, one of the techniques behind what we call “deep learning,” have allowed for some impressive advancements in predictive behavior that look and feel an awful lot like intelligence. It’s gotten to the point where I spend inordinate amounts of time honestly questioning whether my brain is merely a biological version of these same mathematical tricks.
The combination of these techniques has resulted in what is sometimes called “emergent intelligence”. The term “emergence” in this context refers to complex behaviors arising from a set of simple rules or functions. The classic example in computer science is Conway’s Game of Life. Played on a grid reminiscent of an ancient “Go” board, the game applies four simple rules to determine what happens to each “cell” on the board. As more pieces are added to the board following those same rules, strange phenomena start appearing: groups of cells appear to walk across the board, or fire shots into the distance.
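Those four rules fit in a few lines of code, which is exactly the point: everything else emerges. Here is a minimal sketch (a standard implementation pattern, not code from any particular library) that represents the board as a set of live-cell coordinates:

```python
# Conway's Game of Life: the board is a set of (x, y) live-cell coordinates.
from itertools import product

def neighbors(cell):
    """The eight cells surrounding a given cell."""
    x, y = cell
    return {(x + dx, y + dy) for dx, dy in product((-1, 0, 1), repeat=2)} - {cell}

def step(live):
    """Apply the rules once: a cell is alive next generation if it has
    exactly 3 live neighbors, or has 2 live neighbors and is alive now."""
    counts = {}
    for cell in live:
        for n in neighbors(cell):
            counts[n] = counts.get(n, 0) + 1
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" -- a horizontal bar of three cells -- oscillates forever:
blinker = {(0, 1), (1, 1), (2, 1)}
# step(blinker) returns the vertical bar {(1, 0), (1, 1), (1, 2)},
# and stepping again returns the original horizontal bar.
```

Nothing in `step` mentions gliders or guns; those behaviors emerge entirely from repeated application of the rules.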
Distributed computing leads to a new emergence
In a previous post, I talked about how we’re finally reaching the long-promised dream of totally decentralized applications, with small snippets of code running across any number of different hosts connected by APIs. While the challenge will be in securing, controlling, and maintaining these hosts, it’s clear that adoption of these techniques will result in an emergence of their own—something I think of as “emergent architecture”.
In traditional web application architecture, you generally started with an N-tier system. At its most basic, this typically consisted of a data layer, an application layer, and a presentation layer. In a decentralized environment, such as a microservices architecture, these exist not as layers but as autonomous micro-applications that can be composed, through web service APIs, into a complete application served in any number of ways on any number of devices.
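To make the contrast concrete, here is a hedged sketch of that composition. The service names and payloads are invented; in practice each callable would be an HTTP request to an independently hosted micro-application:

```python
# Compositing a complete view from autonomous micro-applications.
# Each "service" is just a callable here; in a real deployment each would
# be a separate web service reached over an HTTP API.

def product_page(product_id, catalog_service, review_service):
    """Build one page from two independent services, neither of which
    knows the other exists."""
    return {
        "product": catalog_service(product_id),
        "reviews": review_service(product_id),
    }

# Stand-ins for the remote services (data is invented for illustration):
def catalog(product_id):
    return {"id": product_id, "name": "Widget"}

def reviews(product_id):
    return [{"rating": 5, "text": "Great widget"}]

page = product_page(42, catalog, reviews)
```

The presentation layer becomes a thin compositor; swapping in a partner's catalog service, or a different one per device, changes nothing about the page-building logic.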
This has already led to the beginnings of emergent architecture, as those micro-applications can be hosted, maintained, and served by the application designers themselves, their partners, or any other subject matter expert systems that make their data accessible. Serverless technologies like AWS Lambda expand on this by breaking these micro-applications and services down even further into single-shot functions that only exist when called upon, with little regard to where they actually reside.
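A single-shot function of this kind is strikingly small. The sketch below follows the shape of an AWS Lambda Python handler (an event payload plus a context object supplied by the platform); the payload fields are invented for illustration:

```python
# A Lambda-style single-shot function: it holds no state, exists only for
# the duration of the call, and neither knows nor cares where it runs.

def handler(event, context=None):
    """Respond to one invocation. The platform supplies `event` and
    `context`; locally, it is just a function call."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}"}

# Invoked locally for testing -- in production, the serverless platform
# decides which host actually executes this.
response = handler({"name": "emergent architecture"})
```

The function's indifference to its own location is what lets the platform place, scale, and replace it freely.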
Will your ops team be replaced by an algorithm?
Lambda-style architectures can rely on algorithms to determine where to run the code to optimize performance and reduce latency. In time, these algorithms will do much of the heavy lifting in routing requests, provisioning computational resources, and securing the data as it’s transported across different systems.
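What might such a routing algorithm look like at its simplest? Here is a hypothetical sketch that picks the endpoint with the lowest recent average latency; the endpoint names and measurements are invented, and a production router would weigh far more signals than this:

```python
# Latency-aware routing: track recent response times per endpoint and
# send the next request wherever the running average is lowest.
from collections import defaultdict, deque

class LatencyRouter:
    def __init__(self, endpoints, window=10):
        self.endpoints = list(endpoints)
        # Keep only the most recent `window` samples per endpoint.
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def record(self, endpoint, latency_ms):
        """Feed an observed response time back into the router."""
        self.samples[endpoint].append(latency_ms)

    def pick(self):
        """Choose the endpoint with the lowest observed average latency.
        Unmeasured endpoints score 0.0, so every host gets sampled."""
        def score(endpoint):
            s = self.samples[endpoint]
            return sum(s) / len(s) if s else 0.0
        return min(self.endpoints, key=score)

router = LatencyRouter(["us-east", "eu-west"])
router.record("us-east", 120)
router.record("eu-west", 40)
# router.pick() now favors "eu-west"
```

The feedback loop is the interesting part: each routed request produces a new measurement, which reshapes the next routing decision.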
One potential benefit of this style of architecture is massive failover protection. If these small snippets of code are distributed across many systems, it’s relatively easy to re-route a request when one endpoint fails to respond. But even in this environment, monitoring services and ensuring optimal performance will remain crucial. Visual analytics tools like TIBCO Spotfire and TIBCO Data Science can make it easier for humans to see the big picture of how data flows to these applications, allowing developers to create better algorithms and, potentially, feeding back into a machine learning loop that eventually takes this responsibility off our hands.
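That failover behavior reduces to a simple pattern. In this hedged sketch, the replica callables stand in for real network requests, and a real system would catch specific network errors rather than `Exception`:

```python
# Failover re-routing: try each replica in turn, return the first
# successful response, and give up only when every replica has failed.

def call_with_failover(replicas, request):
    last_error = None
    for replica in replicas:
        try:
            return replica(request)
        except Exception as err:  # real code would catch network errors
            last_error = err      # and report the failure for monitoring
    raise RuntimeError("all replicas failed") from last_error
```

Because each snippet of code runs identically on any host, the caller needs no knowledge of which replica answered, only that one did.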
While I don’t necessarily see hordes of artificially intelligent agents replacing programmers any time soon, I do see developers adopting these tools and architectural styles, relying on such agents as they would human colleagues to improve their coding efficiency and deliver complex, battle-hardened applications to the public in record time.