We don’t deserve AI

9 October 2018


“If we’re lucky, we’ll be in management before the problems catastrophically catch up to us. By then we’ll be the ones denying the next generation the resources to pay back our technical debt, because it always worked just fine before they touched it.”


Software Maximalism

Growing up, my brother was a hardware and software maximalist while I became a minimalist. He’d overclock his gaming rig to see how much it could bench, while I’d underclock my ragtag rigs and strip my OSs to see how little I could use to get the functionality I sought.

This piece was originally written on the 27th birthday of Linux. Reading Software disenchantment by Nikita Prokopov reminded me of how far down the road of maximalism we’ve come. Ten years ago, there was still some realistic hope that efficiency, not resource gluttony, would inherit the earth. In the last decade, I’ve watched the software minimalism movement all but die.

Faint memories of memory

We’ve already bent Moore’s law; if it weren’t for continuously improving hardware performance, we would have hanged ourselves years ago with the mess that is our current software infrastructure. It’s only thanks to the price-fixing and underproduction of RAM (which has driven memory prices up by hundreds of percent) that we even realise what memory hogs our applications have become. Ten years ago, mini-laptops’ OSs and applications were competing over the smallest RAM footprint. Now nobody cares, and in a year or two, once prices normalise, people will care even less.
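For a sense of scale, here’s a minimal sketch of how one might take a census of the memory hogs on a machine, using Python’s third-party psutil library; the 500 MB threshold is an arbitrary illustration, not any kind of standard.

```python
# List every process holding more than an (arbitrary) 500 MB of
# resident memory. Requires the third-party psutil package
# (pip install psutil).
import psutil

THRESHOLD_MB = 500  # arbitrary cut-off for what counts as a memory hog

for proc in psutil.process_iter(["name", "memory_info"]):
    mem = proc.info["memory_info"]
    if mem is None:  # access to this process was denied
        continue
    rss_mb = mem.rss / (1024 ** 2)
    if rss_mb > THRESHOLD_MB:
        print(f"{proc.info['name']}: {rss_mb:.0f} MB resident")
```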


Toxic, Steaming Heaps of Big Data

In most cases, we don’t need big data or big data tools. We need to realise that data is a toxic asset, and that the mantra of “let’s gather everything because it might be useful somewhere down the line” is not a data management strategy; it’s hoarding. It doesn’t let us magically discover hidden insights into our users’ secret desires left and right. It just turns the most important, and most mundane, business data analysis into a PITA that requires a whole data engineering team instead of one or two BI grunts. We don’t need more data, we need less data. We need smarter data. We need data and tools that even the average mouthbreather of a business analyst can work with. And if we’re not delivering that, we’re just another part of the problem.
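As a concrete counterpoint, here is a sketch of the kind of mundane report that too often ends up on a big data stack. The file name and columns (a hypothetical orders.csv with order_date, region and revenue) are placeholders; the point is that plain pandas does the job.

```python
# Monthly revenue by region: the bread-and-butter BI report, in plain
# pandas. File name and column names are hypothetical placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

report = (
    orders.assign(month=orders["order_date"].dt.to_period("M"))
          .groupby(["month", "region"], as_index=False)["revenue"]
          .sum()
          .sort_values(["month", "region"])
)
print(report)
```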


The Next Generation (of the Problem)

“Well, let’s use AI to solve this!” is not the answer. We’ve entered the future running and screaming from the steaming heap of technical debt our predecessors left us, just hoping that our racks and stacks improve fast enough that we can keep ignoring how incredibly wasteful our use of computing resources is. Most of the time, we don’t need complicated models or super-deep neural networks. Nine times out of ten, great training data beats great models. But performing janitorial duties on the data is less glamorous than indulging in months of science to come up with an over-complicated solution to what wasn’t even a problem in the first place.
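To make the janitorial point concrete, here is a hedged sketch assuming a hypothetical training_data.csv with numeric feature columns and a label column: clean the data first, then start from the simplest model that could possibly work.

```python
# Janitorial work first, simple model second. The file and column
# names are hypothetical; pandas and scikit-learn are assumed installed.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("training_data.csv")

# The boring part that usually pays off: drop exact duplicates and
# rows with a missing label before any modelling happens.
df = df.drop_duplicates().dropna(subset=["label"])

X = df.drop(columns=["label"])  # assumes the features are numeric
y = df["label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A plain logistic regression as the baseline any fancier model must beat.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```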


If we’re lucky, we’ll be in management before the problems catastrophically catch up to us. By then we’ll be the ones denying the next generation of developers the resources to pay back that debt, because it always worked just fine before they touched it. Besides, refactoring has zero business value. We’ll just add another server to the cluster; here, have more RAM. Maybe the robots will take care of it.


Image credit
‘Neurons in the brain’ by Dr Jonathan Clarke. CC BY.
