Endless study

They say in technology that if you’re not studying something new then you’re already behind. I’ll admit there were times in my career when I did feel like I was behind the technological curve. But never enough that I was willing to do anything about it. Now things are different. I worked while going to college, and it feels like that once again.

This isn’t a nauseating “my employer is so amazing” brag, but I’ll say one thing about Amazon: you are always studying for something. It’s not like you have a choice; it’s built into the job. If you stand still, you are going to get run over. Not only by your peers, most of whom would dismantle the television set in front of you and rebuild it with lasers if there was nothing else to do, but also by your customers.

In my nearly two years with the company, a week hasn’t gone by without some reference material open, studying for this or for that. It’s great for expanding your knowledge of technology, but I’ll admit there are some study nights that take more willpower than others.

Photo by Martin Adams on Unsplash


MapR's failure represents nothing.

The technology business has much in common with the fashion industry. Both follow fads and have a gaggle of posers who flit from seasonal event to seasonal event. They’re hype-driven businesses, and just as a failed attempt at creating a fashion trend tells you little about fashion as a whole, the failure of a batch of companies in one segment of tech is not a warning for all of tech. Far from being an industry-wide cautionary tale, the write-off of the $280M invested into the now defunct startup MapR represents nothing.

In 2016 MapR was valued at $1B. Today it is worth nothing or close to nothing. The public cloud providers forced MapR's largest rivals to consolidate in order to cut costs and gain scale. It is incredibly difficult to compete against hyper-scale companies with near infinite resources so Hortonworks and Cloudera decided it was better to travel together than to travel alone.

MapR, with no allies and no path forward, succumbed to the inevitable when its funding was cut. I've seen arguments made online that the Big Data segment was nothing but hype and this is a warning about hype powered investments into other tech segments. Yes, there was a lot of hype but there is always a lot of hype.

Big data, analytics and AI workload deployments are increasing, but they are increasing without MapR. That was MapR’s problem. Technology, like fashion, has its seasons, fads and assorted posers who jump from one fad to another each season. Following the hype, finding the fads, ignoring the posers and trying to pick a big winner from amongst many contenders has been the tech investment strategy for decades.

Y Combinator, the famed startup accelerator, has a rigorous vetting process and has invested in more than 700 startups. If you want to see the future you start with Y Combinator. Out of those 700+ investments, how many breakthrough companies have hit it big? Three.

The majority of investments failed, some are profitable, but the ludicrously successful investments number in the low single digits. It’s true for Y Combinator and it’s true for the big VC firms.

Go from Y Combinator to the heavy hitters of VC finance and you find VC firms doing their best to identify as many unique companies as they can in a fashionable segment of tech. The hope is that one of them will be the massive success, the next market dominating force in its category. Everything else they invest in will be seen as garbage even if it's a nice little business. No VC wants a piece of a little business. If you can get to profitability without VC money you can be as little a business as you want.

VCs invest in companies they think have a shot at massive returns. When it becomes clear there is no shot they stop investing and move on. MapR's failure is the realisation that against the public cloud providers, and the scaled up Cloudera, MapR had no shot.

There is always a new season and new fad to invest in. With diligence and a bit of luck one of those investments will hit it big and we can then start complaining about how overvalued it is or how high its prices are.


The revenge of the original RISC processor.

Two categories of microprocessor have dominated computing over the past 18 years: performance processors and low power processors. These are now being joined by a third category, minimal cost processors. The emergence of minimal cost processors, led by RISC-V and MIPS Open, will replenish the dwindling supply of system design engineers.

The dominance of the Intel architecture eliminated a generation or two of system design engineers. Years ago there was a cornucopia of processor architectures used in different computing segments; then Intel’s critical mass made standardisation viable across a number of different markets. This move to x86 everywhere made economic sense for many companies, but the result was a homogenisation of design.

Where designs were once expensive and bespoke, processors and boards became a commodity. Intel used economies of scale to extract significant gross profit from those commodities while their fabrication prowess ensured competitors could never execute fast enough to challenge them. When Intel stumbled a competitor rose; when Intel returned to superior execution the competitor fell.

As a host of processor competitors withered and died there was neither space nor need for people working on things that were substantially different to x86. Very quietly the PC and Server wars ended. Across desktops, in appliances and in data centres "Pax Intel" reigned.

For a long time AMD and IBM POWER were the only other two providers left standing among a pile of defunct designs, and their market share was minimal. It took a new class of device, the smartphone, for ARM to propagate across the world. If you are looking for performance, Intel, AMD (and POWER) occupy that category, but the low power category was an opportunity for someone else.

ARM, today’s low power champion, is now under threat from a new category: the “minimal cost” category. Two designs from the distant past have returned, and unhappy ARM licensees are interested in what they have to offer.

The first minimal cost design is RISC-V, the fifth version of the original Reduced Instruction Set Computer processor designed at UC Berkeley. The second is MIPS Open, the successor of the Stanford University spin-out processor that powered Silicon Graphics systems and the Nintendo 64.

These two minimal cost processors offer current ARM licensees a choice: hire their own engineers to create new designs under an open license, or skip all of that and take a license for immutable core designs from ARM. Increasingly, firms are looking at the cost of licensing from ARM and putting their own design teams together instead. System design jobs have entered a new era of expansion, with companies looking at doing their own bespoke implementations again.

RISC-V has a simple base specification of about 50 or so instructions; in contrast, the POWER Instruction Set Architecture I used to keep in a desk drawer runs to three bound books and over 1200 pages. RISC-V’s academic roots are plain to see as there is just enough to get going.

If you want to own a trinket you could drop $60 on a HiFive1 board with a genuine RISC-V processor, but you can also simulate RISC-V in QEMU today and boot Linux on it. CPU simulation performance is acceptable enough to get work done, and QEMU also supports MIPS CPU simulation.
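
If you want a sense of how little setup the QEMU route takes, here is a minimal sketch written as a small Python wrapper. It assumes qemu-system-riscv64 is installed and that you have already built a 64-bit RISC-V Linux kernel (for example with Buildroot); the kernel path is a placeholder, and a root filesystem is left out, so the kernel will only boot as far as looking for one.

```python
import subprocess

# Boot a RISC-V Linux kernel on QEMU's generic "virt" machine.
# KERNEL_IMAGE is a placeholder path to a kernel you have built yourself.
KERNEL_IMAGE = "Image"

subprocess.run(
    [
        "qemu-system-riscv64",
        "-machine", "virt",        # QEMU's generic RISC-V virtual board
        "-nographic",              # serial console on stdout, no display window
        "-m", "1G",                # 1 GiB of guest memory
        "-kernel", KERNEL_IMAGE,   # the RISC-V Linux kernel image to boot
        "-append", "console=ttyS0",
    ],
    check=True,
)
```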

While RISC-V may run Linux, it does not yet have the capability to move into the performance category. David Patterson, one of the original co-designers of RISC (and RAID), co-authored an interesting paper spelling out an ambition for RISC-V to become the ubiquitous processing core anywhere computing is done. A lofty goal, but like Linux it will take billions of dollars in investment from a broad partner ecosystem to move towards performance and take on the established providers there.

The ARM server CPU companies have faded because moving from low power to high performance brings a new set of brutal challenges. One of those challenges is meeting Intel and AMD on ground they know well and outspending them when required. When it mattered, Linux had friends with deep pockets; ARM server processor providers do not.

Unlike RISC-V, MIPS descended from high performance into low power and minimal cost through a number of marketplace defeats at the hands of others. MIPS was considered so strategic in the 90s that it was the second target platform for the first release of Windows NT, but it is now more commonly found in embedded controllers.

With poor leadership and a hostile market MIPS has had a rough two decades, but it continues to be used for embedded applications and appliances. Facing a surge in RISC-V interest, team MIPS had no choice but to open up their intellectual property for fear they would be the first casualty of RISC-V. It was a move showing an intelligence that has been lacking from previous MIPS leadership for years, so there might be room for MIPS Open in the minimal cost processor segment yet.

The question that matters is whether it’s possible for minimal cost processors to jump categories and take on the performance CPU providers. Yes, it is possible. But they’re going to need a lot of friends who have a lot of money, and those friends will really have to want them to succeed.

Same as it was back in the 80s when the original RISC processor designs came out of the Berkeley CS labs and dominated the UNIX business for years after.


Quantum uncertainty

The promise of quantum computing is its ability to derive answers for problems which are currently computationally prohibitive. Of the many open issues surrounding the successful delivery of mass-market quantum computers, two of note are the vagueness of what they will be useful for and whether quantum computers can be scaled in a manner that solves hard problems. It is uncertain whether there are answers to these questions.

By modelling the natural world in a processor, as opposed to performing digital operations as current processors do, the assumption is we will have the ability to quickly simulate the interaction of particles in the real world. Our current computing model, built as it is on zeros and ones, is inadequate for these use cases because in the natural world things are not binary. Traditional computers are built on the bit, which has a state of 0 or 1 and never anything else. The qubit, the building block of quantum computing, is capable of operating in intermediate states. Imagine it as operating anywhere between 0 and 1; a qubit only delivers an output of 0 or 1 when you take a measurement of its current state.

To give an inaccurate accounting of quantum computing that I’ll tell you right now is incorrect but does not require an understanding of quantum mechanics or linear algebra: imagine that when simulating something using bits you sequentially cycle through every outcome until you deliver an answer. For some workloads this can be done quickly; for others, usually involving an examination of the building blocks of reality, the computational time required can be measured in thousands of years. With qubits you can check multiple outcomes in parallel because what you’re using for your computation more accurately reflects the phenomena you are looking to examine. When you take a measurement, you have an answer, and the answer is not derived from a binary simulation of reality painfully stepped through one option at a time but from the effect of computation on reality as it exists.
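
To make the difference between bits and qubits a little more concrete, here is a minimal sketch of how a single qubit is simulated on a traditional computer: two complex amplitudes, a gate that puts the qubit into superposition, and a measurement that comes back as 0 or 1 with the corresponding probabilities. The function names are illustrative, not any vendor’s API.

```python
import math
import random

# A single qubit simulated as two complex amplitudes (alpha, beta):
# the probability of measuring 0 is |alpha|^2 and of measuring 1 is |beta|^2.

def hadamard(alpha, beta):
    """Apply a Hadamard gate, which puts a basis state into equal superposition."""
    s = 1 / math.sqrt(2)
    return s * (alpha + beta), s * (alpha - beta)

def measure(alpha, beta):
    """Sample a measurement: 0 or 1 with the probabilities the amplitudes encode."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Start in the definite state |0>, apply the gate, then measure many times.
alpha, beta = 1 + 0j, 0 + 0j
alpha, beta = hadamard(alpha, beta)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # roughly 0.5: half the measurements read 1
```

Each additional simulated qubit doubles the number of amplitudes a traditional computer has to track, which is why simulations of this kind run out of road quickly and why real qubits are so attractive.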

This isn’t to say that quantum computing will solve all problems; it is not administration rights to the knowledge of the universe, nor will it necessarily solve problems faster than traditional computing. It is expected quantum computing will have applications in physics and mathematical factorisation (for example cryptography and breaking cryptography), but there is still a realm of hard problems expected to be well beyond the capability of quantum computing.

To date we are unsure what quantum computers will be useful for, as the hardware is experimental, small scale and provides results of questionable accuracy. If chip designers can crack the tough problems around the development of quantum processors and qubits, the end goal will be discrete quantum processing units (QPUs) available as accelerator cards the way graphics processors are delivered today. Today, however, quantum computers are big, their qubits requiring isolation so as not to negatively interact with one another, and they require cryogenic cooling to ensure stability.

Right now Intel, IBM and Google have fabricated double-digit qubit chips, but Intel admit these are probably not good enough for operation at scale as the qubits have a high error rate. The fact that the hardware returns too many incorrect answers to be useful for computation has not slowed down the search for new quantum-accelerated algorithms. Lacking production-grade hardware, software developers have turned to simulating qubits on traditional computers. Microsoft has released a preview of their Q# programming language, which comes packaged with their quantum processor simulator, and there are extensions for Java and Python which do the same thing.

As qubits in the real world may not perform as expected, how accurate software simulations running on traditional computers will turn out to be is another question yet to be answered. There may be a discovery or two yet to be made that is not reflected in the software, and when the hardware and software are delivered the systems may simply fail to live up to their promises.

The quantum computing breakthrough has been five years away since the first time you heard the phrase “quantum computing”, and its success is still not inevitable. While the technology industry is like the fashion industry in the sense that it has hype, trends and seasons when it comes to new offerings, it would be unwise to be cynical about a new technology in its formative state. That said, controlling expectations would be prudent until you can rent millions of qubits from your favourite cloud computing provider or add a QPU to your desktop.

Just be sure to keep a can of liquid nitrogen close to hand if you buy your own QPU.


Fracturing the Internet

The battle between the United States government and Huawei is a battle between the United States and China for control over the development and direction of wireless technologies. The Internet may not be forever; political moves in the game between the United States and China guarantee that the Internet will mutate in coming years as a result of this infrastructure conflict and future conflicts at every control point.
 
In telecommunications technology there are five major players of note: Ericsson, Nokia Networks, Huawei, ZTE and Qualcomm. These companies are proxies for the economic ambitions of Europe (Ericsson and Nokia Networks), China (Huawei and ZTE) and the United States (Qualcomm). Holding a key position above the others, the United States, represented by Qualcomm, provides intellectual property which underpins all the other offerings. This has worked out well for Europe, which though jealous of Silicon Valley’s success has always embraced its innovations, but China chafes under the influence Qualcomm’s intellectual property provides the United States, as it allows the US to dictate terms. Making Qualcomm irrelevant is a Chinese strategic objective.
 
In 2018 the United States, having discovered that China’s ZTE had shipped products containing Qualcomm technology to Iran and North Korea, banned all US technology exports to ZTE, with the result that ZTE faced ruin. At the time China was refusing to sign off on Qualcomm’s $39 billion acquisition of European semiconductor provider NXP, so the ZTE ban had the upside of being a potential lever to get the deal moving again. When the US lifted the ban on exports to ZTE it was expected that China would reciprocate by allowing the NXP acquisition to take place. China did not reciprocate, forcing Qualcomm to scrap its acquisition plans, much to the consternation of the United States government.
 
With the extradition of Huawei’s CFO from Canada to the United States in process, again for shipping products containing Qualcomm technology to Iran and North Korea but also for hiding the money trail, we see the political game escalate. But we might ask: should the United States be allowed to decide who gets 5G wireless? China appears to be asking that question, a lot, and if it develops its own answer to Qualcomm what might happen to standards?
 
The Internet was a US phenomenon that spanned the world, and everyone got in line behind technology decisions made in the United States. But would the United States and its Western allies get behind technology decisions made in China? If they would not, could we see the beginning of a fracture in infrastructure which will lead to a split in the Internet? A United States-led alliance facing off against a Chinese-led alliance, the principals of both engaged in battles for control over all layers of infrastructure and at every software control point.
 
Qualcomm is now considered to be such a strategic part of long-term United States objectives that the Department of Defense has begun intervening in domestic investigations of Qualcomm’s business practices. Likewise, China’s commitment to Huawei is clear. Two sides have chosen their champions. As per usual, Europe has no plan to put the wood behind one arrow, but soon enough it would not be surprising to see an Ericsson–Nokia Networks merger slide on through the European Commission without an eyebrow raised by the Commissioner for Competition. Stability is something the EU always prefers.
 
After decades of use it would be a mistake to assume the Internet is stable because it is successful. The fact it is so successful makes it of interest to those looking to further their own political, economic and social goals. If those goals require that the Internet be split into incompatible pieces you should assume governments are working towards that relentlessly.

Two faces of Artificial Intelligence

At its core Artificial Intelligence is about teaching computers to do what humans can do, with the expectation that computers will do those things better. Examining the history of technological progress, it is possible you will not live long enough to see Artificial Intelligence change the world. That is assuming AI is something that will change the world at all, which is in no way assured.

In a best case scenario Artificial General Intelligence (AGI) will be a system capable of tapping the sum of human knowledge to answer questions which are currently beyond us and generate ideas of which we are incapable. AGI relies on a breakthrough yet to be made, so in the near term we can expect that slivers of task-specific Artificial Intelligence will be embedded into software, services and products in the same fashion that databases are now embedded in such things. There was a time when the very idea of a database in your home or in your hand was ridiculous, but now task-specific databases are embedded throughout a multitude of devices people own or interact with.

In 2014 one of the more popular books bought by technology industry executives was The Second Machine Age. In it the authors, Brynjolfsson & McAfee, propose that advances in software design and computational power are doing for thinking what the steam engine did for manual labour. To the authors, true innovation is in combining things that already exist in different ways to create new outcomes. Add AI to a vehicle and you gain the benefits of self-driving vehicles.

This is an optimistic book which takes great pains to point out that quality-of-life improvements will not be distributed evenly and there will be losers from this ongoing cognitive revolution. In the opinion of the authors the greatest gains in productivity, wealth creation, and social good lie ahead, but we must remember to bring everyone along. This is a book where technology not only saves the world but makes it a better place for everyone, and regardless of the strength of the ideas proposed between its covers that is a very appealing vision for people working in the IT industry.

The antithesis of The Second Machine Age would be The Rise and Fall of American Growth. Written by Robert J. Gordon, it proposes that life began improving dramatically for people through a series of great inventions, such as electricity and the networked home. It says something about our focus on Information and Communication Technology that the idea of a networked anything suggests Ethernet, but in this case the networks are those of electricity and indoor plumbing. Electrification brought light and the mechanical automation of repetitive chores into the home, while indoor plumbing provided fresh water for consumption and, as importantly, increased public health through better sanitation.

In Gordon's view the century of unprecedented growth between 1870 and 1970 was an outlier and not something that will be easily repeated. Using the example of the internal combustion engine Gordon proposes that important inventions do not have an immediate impact and must be adapted and disseminated. In the case of the internal combustion engine it took nearly 50 years before tractors replaced horses on farms. The greatest inventions have shown that the process of dissemination is slow but provides steady increases in living standards over a long time.

In Gordon's research the development of the Internet and other related Information and Communication Technologies created a surge in productivity between 1994 and 2004 which then tailed off dramatically. Unlike the inventions in the century of unprecedented growth the dissemination of the Internet did not create a significant increase in living standards. In Gordon's view the algorithm is no match for the assembly line when it comes to making people's lives better. Artificial intelligence may be able to quickly identify what is a cancerous growth in a patient and what is not, but delivering untainted water to where billions of people live and taking away their waste has saved and will continue to save orders of magnitude more people.

This is not to say there is no value in Artificial Intelligence, but Gordon’s view is that we have already exited an unprecedented cycle of intellectual achievement and quality-of-living increases throughout the 20th century, and we are now returning to the incremental increases in living standards we have known throughout human history. Artificial General Intelligence, where computers can truly do things humans can do and do them better, still eludes us. Accepting that any breakthrough in this field would take time to be adapted and disseminated, there will be decades between the creation of the first AGI system and the adoption of AGI systems throughout society.

Artificial Intelligence may change everything and introduce the long boom of the second machine age, or with a lot of the hard work to increase living standards already done in the 20th century it may just provide incremental improvements to our lives by being task specific. But the clock does not start ticking on the societal impact of artificial intelligence until we have a major breakthrough.

Today AI can beat the best DOTA2 players, OpenAI Five playing 180 years’ worth of DOTA2 games every day and using what it has learned to demolish human players in the arena, but if you change the game those simulated decades of experience become worthless. The breakthrough we are looking for may come from game-playing artificial intelligence, but that breakthrough is not artificial intelligence which can only play games. The clock hasn’t started yet, and the decades required for adaptation and dissemination will not begin until it does.

How will we know when artificial intelligence has made a true impact on society? When it starts telling us things we do not like to hear.


Tech waits for its next recession

Until the beginning of December the common wisdom was that the equity markets had another 12 months of growth before the current multi-year bull run draws to a close. That has gone from being informed opinion to a desperate hope in just a few weeks. Could we be facing into a recession starting in 2019? It is probable that is the case.

As defined by the National Bureau of Economic Research, a recession is a significant decline in economic activity spread across the economy, lasting more than a few months, normally visible in real GDP, real income, employment, industrial production, and wholesale-retail sales.

Functioning economies expand and grow as more people (consumers) are born in developing countries while people in developed countries become more productive. A recession is where this tide goes out: people consume less, businesses produce less, and productivity falls as people lose their jobs. Recessions are typically short-lived, the average being a year in duration, but they are such a hard reset that it takes many years to return to the point where the recession began.

McKinsey conduct a regular global survey of Chief Financial Officers and Chief Executive Officers around the world. Their current report shows that the majority of survey respondents hold very unfavourable views of the economies their companies operate in. These executives predict that conditions will deteriorate even further over the next six months, with trade wars and political instability cited as the two main areas of worry. The news plays up riots in Paris and the US versus China, but these ideas have taken deep root in the heads of people who juggle tens or hundreds of billions in assets.

There are many potential measurements that can be used to forecast a recession, but since economics is what happens inside people’s heads it can be a good idea to start there first. In the heads of the Chief Financial Officers of some of the most prestigious companies in the world the outlook is grim. These men and women sign the cheques for technology spending; when they snap their chequebooks shut, that sound is heard everywhere in the IT industry.

It’s not all awful for technology vendors; technology winners tend to use recessionary periods to tighten up processes that have gotten loose and invest for the future. That strategy of spending on development when others are timid can position you well when budgets grow and CFOs start spending again. Much as they might try, no company can cost-cut its way to lasting success. Intelligence, execution and timing are still the major factors, and the first two tend to need all the help they can get from IT vendors.

While executives have a pessimistic view of 2019, consumer spending is holding at the moment. House purchases, which are the largest purchases the general public tends to make, proceed apace and have not shown any signs of slowing yet. That said, between the volatility in the global markets and increasing pessimism of top executives we have two visible warning lights on the dashboard. It will not take much to light up some more of them in 2019.

Merry Christmas!


The AWS economy

For the good of the information technology industry, and those employed in it, AWS must continue to offer an increasingly complex portfolio of services. The more effort it takes an organisation to use AWS effectively, the more jobs it creates for other people. This undue burden on AWS customers is a job creation program for everyone else. AWS is a giant star orbited by partner companies of various sizes and countless individuals; this is the AWS economy.

According to Gartner, worldwide IT spending is $3.7 trillion. That is more than twice the size of the global oil industry at $1.7 trillion, and more than five times the size of the international metals and minerals market at $660 billion. IT spending is not just people buying things; it includes people doing things. The smiles of server and storage vendors have become a rictus of panic as they try to convince everyone their businesses are doing fine when they are not; the real spending is happening elsewhere.

While vendors capture most of the economic value from their intellectual property there is an expansive ecosystem of higher revenue but lower margin services provided around that intellectual property. This is a technological economy and every new service AWS offers increases the overall size of the AWS economy.

Independent software vendors selling products that use or run on AWS, consultants and system integrators who wrangle AWS for organisations, and the developers who deploy code on AWS are all beneficiaries of the AWS economy. They are employed to do things AWS customers cannot or do not wish to do themselves. If AWS were easy, something organisations did not have to think about, these other members of the AWS economy would not exist.

How large is the AWS economy? That is unknown but we can get a sense of how large it might be by looking at a peer. In 2017 Salesforce.com estimated that for every one dollar Salesforce earned the economy that operates around Salesforce made $3.67. For every one turn of the Salesforce crank the connected flywheel spun nearly four times.

By Salesforce’s estimates, between 2016 and 2022 Salesforce will facilitate the creation of 3.3 million jobs and generate $859 billion in new business revenue. Salesforce pitch this as an example of how Salesforce helps companies perform better. But it’s also a lot of consultants and sales people buying plane tickets, booking hotel rooms and going to see Salesforce customers to sell them products which integrate with Salesforce.

Is the AWS economy now measured in the billions? Yes. Hundreds of billions? Well, if it is not there yet it will be soon. Every new service for customers to make sense of and integrate with adds hundreds of millions of dollars to the AWS economy.

Andy Jassy will take the stage this week and will fire off a volley of new features. He may throw another service or five on the pile of ~90. He’ll probably mention something about databases, because Larry Ellison has been living rent free in Andy’s head for a while now.

Focusing on databases is good because if Oracle’s Autonomous Database strategy pays off, with automation doing the things junior DBAs used to do, it will probably cost some junior Oracle DBAs their jobs. Those junior DBAs will look for other places to sell their time, and a lot of them will land on whatever AWS is offering because that is where the future growth is. These will be new members of the AWS economy.

Amongst the noise and endless queues in Las Vegas this week do not forget that the more difficult it becomes to read the eye chart that is AWS’s service offerings, the better it is for members of the AWS economy.

So sit back, relax and listen to what the sages at Amazon have to say. Then know that many people elsewhere will be hired to make sense of it all.


AI adoption cost Diane Greene her job.

New products are adopted slowly if they are adopted at all. The expectation that Google Cloud's AI offerings would generate a monsoon of new revenue, within the five-year horizon Diane Greene referenced frequently, has proven to be incorrect.

This cost Diane her job.

Google Cloud Platform (GCP), under Diane's leadership, has attempted to leapfrog its cloud competition by selling the superiority of GCP's AI offerings. But Google are early and have dramatically overestimated the speed of AI adoption. 

Product adoption can be measured. 3M is one of the most innovative companies in the world, and how they measure the success of that innovation is in how much revenue it returns to their business. This is a good measure of success for any technology company. In Transforming a Legacy Culture at 3M: Teaching an Elephant How to Dance, the New Product Vitality Index (NPVI) is described as 3M's measure of the share of sales generated from products introduced during the past five years.

At 3M's highest-performing point, its NPVI has not exceeded 35%. Out of more than 50,000 products touching different parts of our lives, two-thirds of 3M's revenue comes from products that are more than five years old. For companies not as successfully innovative as 3M, an NPVI of 3%-5% of revenue is common.
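
As a rough illustration of the metric, the calculation behind an NPVI number is nothing more exotic than the sketch below. The product names and figures are made up for the example.

```python
# Hypothetical product records: (name, launch_year, annual_revenue_in_dollars).
products = [
    ("legacy adhesive", 1998, 40_000_000),
    ("respirator, new model", 2021, 9_000_000),
    ("film coating, refresh", 2023, 6_000_000),
]

def npvi(products, current_year, window=5):
    """Share of revenue coming from products introduced within the last `window` years."""
    total = sum(revenue for _, _, revenue in products)
    recent = sum(revenue for _, year, revenue in products if current_year - year < window)
    return recent / total

print(f"NPVI: {npvi(products, current_year=2024):.1%}")  # 27.3% for this toy portfolio
```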

New products are adopted slowly if they are adopted at all, and AI is being adopted slower than Google Cloud needs it to be.

While GCP's financials are opaque in Alphabet's earnings reports, there is no visibly increasing GCP/AI bounce in Google's revenue. Not in the way AWS and Azure have clearly contributed to their parent operations.

AWS and Azure built their leads selling infrastructure and platform as a service offerings, making it understandable that Google would look for a point of differentiation. AI has been the wrong point of differentiation. While AI will diffuse throughout new products over time, being built in the way embedded databases were, this adoption time will be long.

Changing business leaders will not alter Google Cloud's market position because it does not change its point of differentiation. Finding a differentiator with an adoption timeline that works for Google, and works against its competitors, will be what will earn Thomas Kurian his compensation.

Or if he too gets it wrong, it'll earn Thomas a severance package.


IBM's last roll of the dice.

IBM is buying RedHat because IBM has fewer potential customers today than it had in the early 1970s. You can buy companies, but you can’t buy time, and time is IBM’s enemy because every passing second acts against IBM’s business model by killing off IBM's best customers.

According to the US National Bureau of Economic Research, in 1976 there were nearly 5000 publicly listed companies, approximately 23 public firms per million US inhabitants. These companies have long been the bedrock on which IBM’s technology and professional services businesses were built.

When these companies first needed a computer, they bought a mainframe. When the PC and client server computing waves swept through firms IBM was there with the hardware, software and services required to put a PC on every desk and a server in every small data centre. When the internet hit, IBM had an e-business strategy on paper for CIOs to read and an army of IBM Global Services consultants to implement that strategy if the CIO cut them a purchase order.

IBM helped publicly traded firms make sense of technological change. Then, slowly at first, the worst thing happened. Globalisation and the Internet began murdering IBM’s customers.

That company making plumbing fixtures in its own factories in the US went out of business because its customers could order the same fixtures slightly cheaper and in higher volume from Asia with just a few clicks. As time ticked on, hundreds of that company's publicly traded peers joined it on the funeral pyre of companies that are delisted and go out of business.

By 2016 the number of publicly listed companies had declined from nearly 5,000 in 1976 to approximately 3,600. As the US population continued to grow and consumer demand surged, a vast swath of existing and potential IBM customers withered and died. Where there once were 23 companies per million US inhabitants, now there are 11 and falling.
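
The per-million figures fall out of simple division. Assuming US population estimates of roughly 218 million in 1976 and 323 million in 2016 (my assumptions, not the NBER's figures), the arithmetic looks like this:

```python
# Listed-company counts from the text; population figures are approximate assumptions.
listed_firms = {1976: 5_000, 2016: 3_600}
population_millions = {1976: 218, 2016: 323}

for year in (1976, 2016):
    per_million = listed_firms[year] / population_millions[year]
    print(f"{year}: ~{per_million:.0f} publicly listed firms per million US inhabitants")
# 1976: ~23, 2016: ~11
```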

In 1975, 109 firms accounted for half of all the profits booked by publicly traded firms; this year just 30 firms booked half of the profits. More customers are giving more money to a smaller and smaller number of companies. Every one of the massive tech platform companies is in the top ten, and those platform companies are not looking to buy anything from IBM today or tomorrow.

Near term, in the Enterprise IT market you can either become one of the trillion-dollar platform holders or be a company that uses those platforms to provide something of value to customers. RedHat has been moving towards providing something of value. IBM has realised it will never be a platform holder of the size it requires to sustain itself, so the RedHat deal is its attempt to get onto those hyper-scale platforms and provide something of value to customers of any size.

The coming threat, for many IT providers not just IBM, is the next death wave to rip through the existing publicly traded companies. Amongst the financially living there are shambling zombie firms which have been lurching from one cash flow problem to the next under the darkness of creative accounting.

Mortally wounded by the financial crisis and global competition, they shuffle onwards because of the cheap debt sloshing around the global financial system. Now that quantitative easing has tapered off across the globe and the low-interest money is drying up, rising debt interest rates will send these companies to their final death. Bankruptcy follows unsustainable debt payments, and IBM’s potential customer pool will shrink even further as these zombie firms are shown to be flat broke.

RedHat is not a perfect deal, it has its own challenges, but it’s a deal that can be done with the financial resources IBM has today. There is no time for IBM to attempt another organic growth spurt; cognitive computing (Watson) was IBM’s in-house attempt to build a bridge to a prosperous future, and IBM has failed in that effort. The idea was good, but in business the perfect execution of even a mediocre idea results in profits. IBM’s execution was poor, and that’s why cognitive computing has been a bust for them.

You can buy companies, but you can’t buy time. Buying RedHat is an attempt to buy the results of the time RedHat has spent on solutions to get customers up and running on the hyper-scale platforms. IBM’s customers are the names rolling across the stock market tickers, but that list shortens every year. With the RedHat deal it is IBM’s hope that a potential IBM customer is anyone of any size trying to get work done on the hyper-scale platforms. 

It could work.

It could also be the death rattle of an industry pioneer.

But what a way to go out...