We need a new approach to AI. I'm not talking about new data sets, machine learning models, or chip designs. It has been a long and strange road from pioneers like Ada Lovelace and Alan Turing to cheap smart speakers that can order you a pizza, tell you the weather, and read you the news. But now - with encroaching surveillance, pervasive adtech, and cybersecurity threats - is it time to consider an alternate path? How do we want to use AI? How should our data be collected and used? And who ultimately stands to benefit in an algorithmic society run by a handful of platforms?
In my view, there is a significant and relatively unexplored difference between public AI and private AI. Much of AI today is public. Services like Alexa, Cortana, Google Assistant and Siri record everything they hear, process it in the Cloud, and then provide a relevant response or trigger an action. Unfortunately, the price of that magical voice capability is exposing your life not only to algorithmic analysis, but also to a vast organization of human employees and contractors tasked with improving the efficiency and accuracy of the system.
Certainly, in late 2014, when Alexa was first introduced, it was inconceivable that a voice-controlled AI could work in any other way. But this is no longer 2014. For many basic applications, whether it be turning on the lights in your house or controlling your TV, edge AI for on-device processing means that uploading your voice to the Cloud is neither necessary nor desirable. But what happens when you have a complex request and still want privacy?
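To make the on-device idea concrete, here is a minimal, purely illustrative sketch: a toy intent handler that resolves simple commands from an on-device transcription without any network call. The command set, action strings, and function names are hypothetical, not any vendor's API; real edge assistants use on-device speech and intent models, but the privacy property is the same - simple requests never leave the device.

```python
# Toy "edge" intent handler (illustrative assumptions throughout):
# simple commands are resolved locally; anything else is flagged so the
# user can decide whether to send it to the Cloud.

INTENTS = {
    "lights on": "smart_home/lights/on",
    "lights off": "smart_home/lights/off",
    "tv on": "smart_home/tv/on",
}

def handle_locally(transcript: str) -> str:
    """Map an on-device transcription to a local action, or flag for the Cloud."""
    text = transcript.lower().strip()
    for phrase, action in INTENTS.items():
        if phrase in text:
            return action       # resolved entirely on-device, nothing uploaded
    return "needs_cloud"        # complex request: escalation is a user choice

print(handle_locally("Turn the lights on, please"))  # smart_home/lights/on
print(handle_locally("Plan my trip to Lisbon"))      # needs_cloud
```

The point of the sketch is the control flow, not the matching: the network boundary becomes an explicit, opt-in step rather than the default.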
In the next few years, I believe we will see the rise of private AI. Private in the sense that rather than using a publicly available voice assistant, you will train an entirely personal one with your own data. The data will be yours and yours alone, as will the AI model itself. Your AI may still live in the Cloud, but your data will exist in a walled-off domain - free from advertisers and third parties that might want to trawl it to build a statistical model of your purchase intent.
How will your AI get smarter without a massive dataset of other users? New machine learning techniques like federated learning and differential privacy will potentially allow your AI to share insights and patterns with other AIs, without compromising you or your information.
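A toy sketch of how those two techniques combine, under loudly stated assumptions: each user's "model" is a single number (say, a preferred thermostat setting), each device shares only a clipped, noised delta (the Gaussian mechanism from differential privacy), and a server averages those deltas (federated averaging). The function names are illustrative, not a real library's API.

```python
import random

def clip(update, bound=1.0):
    # Clipping bounds any one user's influence (the "sensitivity" in DP terms).
    return max(-bound, min(bound, update))

def private_update(local_model, global_model, noise_scale=0.1, bound=1.0):
    # Each device shares only a clipped, noised delta - never its raw data.
    delta = clip(local_model - global_model, bound)
    return delta + random.gauss(0.0, noise_scale)  # Gaussian mechanism

def federated_average(global_model, local_models, noise_scale=0.1):
    # The server sees only noisy deltas; averaging washes the noise out
    # while masking what any individual contributed.
    deltas = [private_update(m, global_model, noise_scale) for m in local_models]
    return global_model + sum(deltas) / len(deltas)

random.seed(0)
global_model = 20.0                       # shared starting point
local_models = [21.0, 22.5, 19.5, 23.0]   # each user's privately trained value
for _ in range(20):
    global_model = federated_average(global_model, local_models)
print(round(global_model, 2))  # drifts toward the group's consensus value
```

Real systems (and real privacy guarantees) involve vectors of weights, careful accounting of the privacy budget, and secure aggregation, but the shape is the same: insights flow between AIs while the underlying data stays put.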
Like the animal dæmons in Philip Pullman's novels, we will grow up with our private AIs as constant companions. Rather than distinct personalities like the AI in Her or Blade Runner 2049, they will be more like digital doppelgängers, and a source of personal advantage. We will bring them to school and university, training them as we are ourselves educated. They will be extensions of ourselves, empowered to act on our behalf and speak for us after we are gone.
Don't get me wrong. This is not the Singularity: they won't actually be us - unless you are Deepak Chopra, perhaps. But they will be a fragment of us or, at least, an embodiment of our personal or collective intentions. Wealthy families will have financial AIs that rival today's quant funds, while estate lawyers will have a field day trying to sort through the validity of smart contracts and algorithmic wills with AI executors.
The rise of private AI is by no means assured. In the early days of the Web, some idealistically imagined that we would have infomediaries as custodians and brokers of our data. But rather than protecting us, the companies that filled that role ended up becoming Amazon, Google and Facebook.
This time our fate will once again depend on the collision of technology and business models. The same forces of cheap computation, massive datasets, and better algorithms that fueled the efficiency of Amazon's retail recommendations and Spotify's playlists also have the potential to enable a world of private AI. Whether that happens hangs on just how much convenience-driven surveillance we are prepared to take before we start pushing back.
As our world grows in complexity and AI-powered diversity, we will increasingly need personalized algorithms and automated systems that can act solely in our interest. If you don't know what an AI is being optimized for, you could well be the target. Or, as the saying goes, if you are not paying for the product - you are the product.
- - -