Managing Oneself in the 21st Century

22 October 2018


TL;DR! It’s profoundly ironic that our undoing isn’t the complexity of our time, but our inability to master even one moment of it.

I kept a distraction ticker while writing this post, and the outcome quite frankly frightened me. Writing these 460 words involved over 130 distractions and interruptions, spread over five sessions. Having already rehabilitated myself from ADD once in my life, I know this is not good. It is, however, largely symptomatic of our times and habits. It’s a miracle anyone gets anything done anymore. The jury’s out on whether the next generations will be better at focusing, but from the echoes we hear from teachers, a childhood filled with an endless, addictive stream of super-sensory stimuli doesn’t seem to be doing them any favours either.

Prof. Cal Newport has made numerous painfully sharp observations about our age of digital distractions. One of his main points is the need for what he calls deep work. This is work that happens in a prolonged state of concentration; it takes a long while to get started and can only happen uninterrupted. Newport describes it as the prerequisite of high-quality information work, and the ability to engage in it as one of the key criteria for a successful information worker – and organisation. Newport posits that the value created in a modern organisation is essentially the integral of sustained attention over time (an idea he calls the Attention Capital Theory).
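To make that last claim concrete, here is a back-of-the-envelope rendering of the idea – my own notation, not Newport’s:

```latex
% V: value produced over a working period [0, T]
% a(t): depth of sustained attention at time t; each interruption knocks
%       a(t) back towards zero, and it only ramps up again slowly
V \approx \int_{0}^{T} a(t)\,\mathrm{d}t
```

Under this reading, frequent interruptions don’t just shave a little off the total – they keep a(t) pinned near zero, which is why five fragmented writing sessions produce so much less than one uninterrupted one.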

I didn’t really grasp how important – and refreshing – deep work can be until I started working on projects of real complexity. I worked in an environment rich with geeky introverts, where most tasks could be carried out with very little external input – and even most of that was handled with Slack messages to people sitting within 10 metres.

In this kind of environment, work easily becomes sequential instead of semi-random multitasking. Things happen on schedule. Even interruptions happen mostly on schedule. No one else has permission to make reservations in your calendar. Eventually you stress only constructively, about solving the challenges at hand and wrangling the complexity on schedule. You cease stressing about having a dozen irrelevant yet seemingly important things to do at once. Your productivity skyrockets, your blood pressure drops, and your ability to stretch your capabilities grows immensely.

Instead of TL;DRs, executive summaries, gamification and super-sensory media, I’ve come to believe Newport’s digital minimalism is the antidote to the mental poison of our time. We need to realise that in a world of excess, we’re never going to get anything done if we don’t focus intensely on one thing at a time. And usually, those few things are what can lead us to greatness.

 

If you want to read more, Prof. Cal Newport also has a quirky blog with some pretty cool ideas.

http://calnewport.com/about/#ideas

 


We don’t deserve AI

9 October 2018


“If we’re lucky, we’ll be in management before the problems catastrophically catch up to us. By then we’ll be the ones denying the next generation the resources to pay back our technical debt, because it always worked just fine before they touched it.”

 

Software Maximalism

Growing up, my brother was a hardware and software maximalist while I became a minimalist. He’d overclock his gaming rig to see how much it could bench, while I’d underclock my ragtag rigs and strip down my OSes to see how little I could get by with for the functionality I sought.

This piece was originally written on the 27th birthday of Linux. Reading Software disenchantment by Nikita Prokopov reminded me of how far down the road of maximalism we’ve come. Ten years ago, there was still some realistic hope that efficiency, not resource gluttony, would inherit the land. In the last decade, I’ve witnessed the movement of software minimalism all but die.

Faint memories of memory

We’ve already bent Moore’s law; if it weren’t for continuously improving hardware performance, we would have hanged ourselves years ago with the mess that our current software infrastructure has become. It’s only due to the price-fixing and underproduction of RAM (which has driven memory prices up by hundreds of percent) that we even realise what memory hogs our applications have become. Ten years ago, mini-laptop OSes and applications competed over the smallest RAM footprint. Now nobody cares – and in a year or two, once prices start to normalise, people will care even less.

 

Toxic, Steaming Heaps of Big Data

In most cases, we don’t need big data or big data tools. We need to realise that data is a toxic asset, and that the mantra of “let’s gather everything because it might be useful somewhere down the line” is not a data management strategy – it’s hoarding. It doesn’t enable us to magically discover hidden insights into our users’ secret desires left and right. It just makes the most important – and mundane – business data analysis a PITA that requires a whole data engineering team instead of one or two BI grunts. We don’t need more data, we need less data. We need smarter data. We need data and tools that even the average mouthbreather of a business analyst can work with. And if we’re not delivering that, we’re just another part of the problem.

 

The Next Generation (of the Problem)

“Well, let’s use AI to solve this!” is not the answer. We’ve entered the future running and screaming from the steaming heap of technical debt our predecessors left us, just hoping that our racks and stacks develop fast enough that we can stay indifferent to the incredibly wasteful way we deploy computing resources. Most of the time, we don’t need complicated models or super-deep neural networks. Nine times out of ten, great training data beats great models. However, performing janitorial duties on the data is less glamorous than indulging in months of science to come up with an over-complicated solution to what initially wasn’t even a problem.

 

If we’re lucky, we’ll be in management before the problems catastrophically catch up to us. By then we’ll be the ones denying the next generation of developers the resources to pay back that debt, because it always worked just fine before they touched it. Besides, refactoring has zero business value. We’ll just add another server to the cluster – here, have more RAM. Maybe the robots will take care of it.

 

 

Image credit: ‘Neurons in the brain’ by Dr Jonathan Clarke. CC BY
