Samsung Rising

Another staycation day, another book. This book opens with exploding Galaxy Note 7s and Samsung’s paralysis, evasion, deception, and eventual recognition of a dangerous flaw in their flagship handset. Imperial rulers prefer to tell you what the truth is, and Samsung’s position at the time of the burning handsets was that there was no issue. It was just the pro-Apple press stirring up trouble for them.

Imperial is accurate when describing the management of Samsung. In Samsung Rising the author does a good job of examining the political and dynastic intrigue of an organisation known locally as the Republic of Samsung. It's a republic in name but a monarchy in operation. Corporatism is a cornerstone of South Korea, with members of the leading chaebol (large family-owned business conglomerates that dominate Korean industry) routinely finding themselves dragged through the courts only to have sentences commuted and pardons issued. The brand-name corporations they control are symbols of national pride to ordinary people.

South Korea’s economic success is a product of the chaebol working hand in glove with the government of the day. In Samsung’s case its story goes back to the company’s granite-faced founder, Lee Byung-chul. Having lived through Japanese colonialism in Korea, B.C. took inspiration from the Japanese zaibatsu companies. The post-war economic miracle economies of Japan and Germany were South Korea’s template for industrial development. While everyone else was trying to copy the United States, B.C. ensured that Samsung would be a family-controlled, vertically integrated monopoly pervasive throughout South Korean life. He would import the best ideas from Japan and Germany, then make them Korean.

Betting big on semiconductors, B.C. looked to rapidly scale Samsung Electronics. During the PC revolution of the 1980s Samsung Electronics looked like an also-ran, but then Steve Jobs showed up on its doorstep looking for memory chips for his Dynabook concept (a tablet computer). This was the beginning of the long and tortured relationship between Samsung and Apple that continues to this day, Samsung being both a key supplier for the iPhone and one of its most dangerous rivals with their Galaxy line of handsets.

The development of Samsung’s Galaxy forms the backbone of the book. Samsung's ongoing failure to grasp software development, and the need of "The Tower", the brain of the Samsung octopus, to control every aspect from the top down, are covered in detail. It is a compelling read. There is a horrific sequence halfway through Samsung Rising where the company’s US marketing team are summoned to South Korea for a Samsung global marketing meeting. The small (by Samsung standards) and unruly marketing team from Texas had been sticking it to Apple and carving out the Galaxy brand in the US as the defining line of Android handsets. They expected to be recognised for their work.

Recognised they were. The US team was asked to stand, and their Korean leadership told everyone in the room to clap for the US team as a sign of encouragement because they were the only group present that was failing the company. Every metric said otherwise with customer sentiment and sales of Galaxy handsets soaring in the US, but the message was clear. You do not shine brighter than the imperial court back at the South Korean HQ.

This was a good read about a company whose internal operation I knew nothing about. Samsung has stumbled badly during their last dynastic transition of power from father to son but it would be a mistake to underestimate them.


I’m Thinking of Ending Things

The weather today has been abominable so I finished the latest staycation book much sooner than I had expected. The problem I am now faced with is that "I’m Thinking of Ending Things" is a psychological thriller/horror novel that is built on one twist. If I discuss the twist the book becomes pointless. Even an attempt to discuss it in a tangential way would cause the novel to diffuse into the air. So I will avoid it entirely. 

Like all good psychological thrillers the book begins by making you uncomfortable from the start. There is nothing gruesome here, things just feel distorted and that distortion is unsettling. You keep waiting for the floor to drop out from under the female protagonist and for things to start going wrong. You know it is coming and in a way she does too. It’s just a matter of when and how.

The majority of the book is focused on a girlfriend and boyfriend having conversations during a road trip to the farm of the boyfriend’s parents. It is dark and snowing, the cold outside the car is oppressive and the landscape at night time narrows to a point surrounding their vehicle. The conversations they have are intellectual, but he is a know-it-all and more than once I recoiled the way she does when their conversations take a negative turn.

As a reader we get extra contextual information, we are privy to the female protagonist’s internal monologue. She has been in this relationship for a number of weeks and she is not convinced it is going to work out. She is thinking of ending things. Were I in her position I would have already.

Looking at him as he is driving she considers his flaws. Flaws anyone would discover were they to make a life with another person. She wonders what it would be like to listen to the same person digest food for years to come. After sex she observes his body with detachment, noticing details that make him less attractive to her. The odds are good that this relationship is going nowhere.

Right now the relationship is taking her to his parents' farm, and on the way we learn she has been keeping a secret from him. She's getting phone calls late at night, and the caller only leaves messages. Not abusive messages, but strange ones. The caller starts leaving her messages on the commute and the story moves forward from there.

With any fiction book the bottom line of a review is whether the juice was worth the squeeze. Was it worth the reading time spent? In this case I would say yes. I did not find the denouement to be revelatory, but it was an uncomfortable story that was written in an engaging way.

Would I sit down and read it again?

Probably not.



Wildflower

Drew Barrymore is Harley Quinn if Harley Quinn decided she wanted two children and a house filled with pets. After the slog of yesterday's staycation book it was time to breeze through something, and Barrymore’s memoir “Wildflower” proved to be a breezy and sometimes funny read.

Her high-energy humor aside, there are sharp edges here. She recounts several asshole stunts that will make you cringe (they now make her cringe too), but scattered in between these are examples of how different her situation was. Hollywood having thrown her away, there is a memory of her spying on people at the laundromat so she could learn how to wash and dry her clothes. Ruining them, she drags the sodden, bleach-stained mess back to her dump of an apartment, where she was living alone on a diet of take-out meals and cigarettes. Despondent, she recognises that she is a school dropout who does not know how to do anything and that for all intents and purposes her career as an actress is over. She was fourteen.

There is a sadness in that chapter that drives her forward in many of the others. She slowly and carefully rebuilds her shattered career, assembles a family of friends, starts reading voraciously, and overachievement in cooking and domesticity becomes a driving ambition. I would not be surprised if the bedsheets in the Barrymore house are changed three times a week and any meal that comes out of the microwave for dinner is seen as a personal affront.

There is an insight close to the halfway point that to me explains both why she wrote the book and why it is written in the anecdotal, non-linear fashion that it is. Barrymore decided that more than anything she wanted to be appropriate. Having been a washed-up child actress, a teenager who ended up in rehab, and a tabloid-fodder exhibitionist with a string of male and female lovers, all of that had to go if she wanted to take her life to the next stage. Her vagabond father and incapable mother were not appropriate as people or parents, but she makes the choice that she will be.

The clothes stay on, the film roles become more wholesome and the behavior in public becomes more controlled. Her private life becomes private. This book and its presentation are an exercise in not shaming herself to her daughters while explaining the past to them. When they come asking questions they will get a detailed description of the lessons she has learned, with surface-level descriptions of the situations she learned those lessons from.

The strategy here is to not let her children get away with any of the things she got away with, because she believes she should never have been in those situations in the first place.


Talking to Terrorists: How to End Armed Conflicts

Another staycation day, another book from the unread pile bites the dust. This time it was a deep dive into armed conflict resolution in Talking to Terrorists: How to End Armed Conflicts.

Jonathan Powell, chief of staff to former British Prime Minister Tony Blair, is one of the clearest thinkers on negotiation with armed groups that I’ve read. That said, his writing here is a slog to get through. The book is anecdote-rich and upfront that negotiation is an art rather than a science, but it could have been leaner and more readable if his insights were not buried deep inside the examples.

To Powell there is no conflict so insoluble that it cannot be unlocked through talking. He points out that governments always talk to terrorists, even when they say they do not or will not. Negotiations resulting from these talks might fail, but any progress can be built on incrementally. A failure today, yesterday, and last year does not preclude success tomorrow or in five years.

It was negotiation that brought a peaceful end to the apartheid state of South Africa. Negotiation created a peaceful settlement in Northern Ireland, and it brought about the disbanding of ETA in Spain and FARC in Colombia. The road to these solutions only revealed itself when both sides, no matter how distasteful they found one another, talked in private.

The book opens with a discussion of what we understand to be modern terrorism (with no pride do I mention that he states it was developed by the Irish). It is Powell’s belief that the path to peaceful resolution always starts with private talks.

Over the course of the book he builds a convincing argument that this is the case, engaging with the counter-arguments around never negotiating until terrorists are about to be annihilated. It is Powell’s assertion that time and again a new terrorist threat emerges and it never reaches the point of collapse governments like to think it has. Security, military and technological solutions are deployed to combat the terrorists, and while a lot of people die none of those solutions solve the problem. The terrorism mutates and carries on, because terrorist acts are the most visible symptom of another, underlying set of problems.

This book was published seven years ago, but it is the author’s conclusion that to make progress on the dissolution of any armed group, be their doctrine what the West would call rational or apocalyptic, you have to sit down with them. All terrorist groups begin with unreasonable demands, even the ones that state their ambition is to wipe out a competing ideology entirely. What matters in the end is the terrorists’ ability to park the extreme position and talk about everything else around it. It is there the path forward to peace begins.

Overall, an interesting read but hard work to dig through.


Sumner Redstone: The King of Content

My plans for the current staycation are to put a dent in the unread book pile. The first book out was timely since the subject died this week. 

Recently departed media mogul, Sumner Redstone, was an uncompromising man. The King of Content by Keach Hagey covers Sumner’s early life and business moves but has enough about his tumultuous personal life to be gossipy.

The advantage of the gossip is that this book reads better than a set of business cases, which it could have been as there are remarkable successes and crushing failures in the deals he made throughout his life. Covering Redstone’s early days, we find a mob connected father and a neurotic mother nurturing Sumner’s formidable intellect. This intellect was welded to a ruthless competitive streak and the combination of both proved to be explosive.

They power him through the most prestigious school in Boston and into Harvard in short order. Redstone breaks Japanese codes during World War II; becomes disillusioned with practising law; side-lines his brother to take over the family drive-in movie business; and then wages all-out war against other media companies.

Scooping up Viacom, MTV Networks and CBS, Redstone consolidated several prestigious media assets under his control. Control was Redstone’s internal drive. Business associates and family members whom he cannot control are discarded. Sometimes with regret, but not too much regret. Redstone cries a lot as he’s sticking the knife into a family member or a long-time business associate, but he never cries for too long.

The arc of Redstone’s media empire follows the arc of his life. As he began to physically decline, so did his investments. Mistakes were made. There was an obsession with videogame studio Midway; after investing $800 million in the beleaguered company, it was sold for $100,000 and the investment written off when it became clear Midway could not be turned around by his team.

Viacom passed on acquiring Marvel, with whom their studio Paramount had the original distribution deal for the Marvel cinematic universe. That was a costly mistake when in subsequent years Paramount released box office bombs while Disney made billions from the Avengers.

The decline was not just financial. Redstone was a man of voracious sexual appetite and it is a matter of record that he left sexually explicit voicemails recounting a foursome he had the night before with the legendary Hollywood producer Robert Evans and two women. He would have been 90 at the time and a daily user of Viagra.

Throw in numerous inter-family power struggles for control of the fortune, an old man with a string of young gold-digging girlfriends, everyone getting slapped with lawsuits and you have an accurate synopsis of his final years.

Every good story needs a hero and in this story his daughter, Shari, has been cast in that role. She battles not only her father, his string of girlfriends and his sycophants, but also the Board of Directors at Viacom and CBS. After all the bloodletting she stands victorious atop a pile of corpses and has as much control over the now merged ViacomCBS as her father ever had.

Sumner frequently stated that his ambition was to live forever and never carve up the Empire he had built. It strikes me that his daughter is much more pragmatic and for her corporation, ViacomCBS, to survive it will have to buy other companies or itself be sold to someone larger.

Either way, as in business and in life, control over others or yourself doesn’t last forever.


The Age of ARM

Doing your own software and your own silicon is back in computing style again. Apple have been moving towards releasing ARM-powered Macs running on their in-house designed processors for a while now. Off your desktop and into the cloud, AWS is on the second iteration of its ARM-based Graviton server processor.

The original Graviton processor was a 16nm part consisting of 5 billion transistors. The Graviton 2 is a 7nm part with 30 billion transistors, delivering a performance increase of 7x over the first-generation part. There are 4,000 pins on the back of the processor package, so you will not be slapping one of these into a motherboard you own. Though that is more down to the fact that current consumer motherboards use contact patches and not pins; still, there is a lot on that Graviton 2 processor die.

The economic drivers of doing the whole thing in house might be the same for both companies. The price/performance ratio for AWS Graviton 2 instances running Linux is notable, but in Apple’s case this represents Apple walking away from Intel, 14 years after they walked towards them.

Architecture Plan B’s are not new. Back in 1991 Windows NT was being developed for Intel x86 and MIPS. Dave Cutler, the mercurial leader of the Windows NT project, was obsessed with portability. To Cutler, Windows NT was to survive even if the hardware architecture it was built on withered. Microsoft was all in on x86 and the pull was strong to move forward just with that. Cutler, paranoid that Intel might drive itself into a ditch, ensured everyone on the NT team knew that his dev machine used a MIPS CPU.

Not to support the MIPS version was to stand in opposition to Cutler. Woe to the developer who tried to check some x86-specific customisation into the NT builds. Cutler would come storming out of his office and make a beeline for the developer involved. At the time, the fastest way to get the head of the largest commercial software development effort in the world to your desk was to flag a problem getting something written for the MIPS version.

If you had a design problem, he would try to solve it. If you had a bug, he would test it for you. In the end MIPS was the company that drove itself into a ditch, but the portability lesson was taken to heart by other companies. Companies like NeXT, which started life on the Motorola 68000 in 1989 but in 1993 released an x86 version of NEXTSTEP and a version for SPARC and PA-RISC in 1995.  

Apple are experts at pulling off evacuations from what they feel are losing prospects. They escaped the moribund PowerPC architecture with a quick transition that started in 2006. Expect that the same bag of compiler tricks, performance profilers and verification checks as last time will be used to move from x86 to ARM.

Before the smartphone, the future of computing was the x86 architecture. With the movement towards ARM in Apple notebooks and desktops, as well as the proliferation of cloud-based Graviton instances, we are now in the Age of ARM.



Some people are just hitmakers. When it comes to microprocessors, Jim Keller is a hitmaker. Having worked on the AMD Athlon, he drove a sword into Itanium with the K8 x86-64 architecture before leaving AMD for new opportunities.

There was a route through PA Semi (Apple’s processor acquisition) before a boomerang back to AMD to work on Zen. Zen and its successors are also hits, so much so that AMD is on track to reach the heady height of 20% desktop/notebook CPU market share. Something they last did two decades ago, back in the Athlon days.

Unlike the Athlon, where PC enthusiasts and budget-conscious consumers sought out those processors, the new new thing is the growing AMD server CPU business. Years ago AMD failed to crack the fortress of market development funds that Intel placed around the tin benders of the server business. That crimped AMD's server growth even when they had a winning line of chips.

In the age of hyperscale providers, where offering multiple CPU types is seen as advantageous to consumers, this type of fortress is a remnant of a bygone age.

Intel’s server CPU business has long been a jealously guarded high margin treasure but now AMD have shown up to make off with some of it. AMD started at 1% server CPU market share in 2017 and have grown to 8% in two years. Could they slice off 10 or 20% of the server market? It’s possible.
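Those share figures invite a back-of-envelope extrapolation. The sketch below is purely illustrative arithmetic, not a forecast; real market share never compounds this cleanly.

```python
from math import log

# Back-of-envelope sketch: AMD's server CPU share went from roughly 1%
# (2017) to 8% over two years. Compute the implied annual growth multiple,
# and how long it would take to reach 20% if that pace (improbably) held.
start_share, end_share, years = 1.0, 8.0, 2
annual_multiple = (end_share / start_share) ** (1 / years)

years_to_20 = log(20.0 / end_share) / log(annual_multiple)

print(f"implied annual growth: {annual_multiple:.2f}x")          # ~2.83x
print(f"years from 8% to 20% at that pace: {years_to_20:.1f}")   # ~0.9
```

Of course, incumbents fight back hardest as the challenger scales, which is why "it's possible" is the honest answer rather than a date.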

One minor wrinkle in all this is after bouncing around Tesla for a split second Jim Keller took a new job.

At Intel.

One of the people who dynamited Intel’s most high profile failure is now working to restart their invention engine.

Let's see if a tired old chip maker has one more hit left in them.


MapR's failure represents nothing.

The technology business has much in common with the fashion industry. Both follow fads and have a gaggle of posers who flitter from seasonal event to seasonal event. They're hype-driven businesses, and just as a failed attempt at creating a fashion trend says nothing about fashion as a whole, the failure of a batch of companies in one segment of tech is not a warning for all of tech. Far from being an industry-wide cautionary tale, the write-off of the $280M invested into the now defunct startup MapR represents nothing.

In 2016 MapR was valued at $1B. Today it is worth nothing or close to nothing. The public cloud providers forced MapR's largest rivals to consolidate in order to cut costs and gain scale. It is incredibly difficult to compete against hyper-scale companies with near infinite resources so Hortonworks and Cloudera decided it was better to travel together than to travel alone.

MapR, with no allies and no path forward, succumbed to the inevitable when its funding was cut. I've seen arguments made online that the Big Data segment was nothing but hype and this is a warning about hype powered investments into other tech segments. Yes, there was a lot of hype but there is always a lot of hype.

Big data, analytics and AI workload deployments are increasing, but they are doing so without being enabled by MapR. That was MapR's problem. Following the hype, finding the fads, ignoring the posers and trying to pick a big winner from amongst many contenders has been the tech investment strategy for decades.

Y Combinator, the famed startup accelerator, has a vigorous vetting process and has invested in more than 700 startups. If you want to see the future, you start with Y Combinator. Out of those 700+ investments, how many breakthrough companies has Y Combinator backed that have hit it big? Three.

The majority of investments failed, some are profitable and making money, but the ludicrously successful investments are in the low single digits. It's true for Y Combinator and it's true for the big VC firms.

Go from Y Combinator to the heavy hitters of VC finance and you find VC firms doing their best to identify as many unique companies as they can in a fashionable segment of tech. The hope is that one of them will be the massive success, the next market dominating force in its category. Everything else they invest in will be seen as garbage even if it's a nice little business. No VC wants a piece of a little business. If you can get to profitability without VC money you can be as little a business as you want.

VCs invest in companies they think have a shot at massive returns. When it becomes clear there is no shot they stop investing and move on. MapR's failure is the realisation that against the public cloud providers, and the scaled up Cloudera, MapR had no shot.

There is always a new season and new fad to invest in. With diligence and a bit of luck one of those investments will hit it big and we can then start complaining about how overvalued it is or how high its prices are.

The revenge of the original RISC processor.

Two categories of microprocessor have dominated computing over the past 18 years: performance processors and low power processors. These are now being joined by a third category, minimal cost processors. This emergence of minimal cost processors, led by RISC-V and MIPS Open, will replenish the dwindling supply of system design engineers.

The dominance of the Intel architecture eliminated a generation or two of system design engineers. Years ago there was a cornucopia of processor architectures used in different computing segments; then Intel’s critical mass made standardisation viable across a number of different markets. This move to x86 everywhere made economic sense for many companies, but the result was a homogenisation of design.

Where designs once were expensive and bespoke processors and boards became a commodity. Intel used economies of scale to extract significant gross profit from those commodities while their fabrication prowess ensured competitors could never execute fast enough to challenge them. When Intel stumbled a competitor rose, when Intel returned to superior execution the competitor fell.

As a host of processor competitors withered and died there was neither space nor need for people working on things that were substantially different to x86. Very quietly the PC and Server wars ended. Across desktops, in appliances and in data centres "Pax Intel" reigned.

For a long time AMD and IBM POWER were the only other providers left standing among a pile of defunct designs, and their market share was minimal. It took a new class of device, the smartphone, for ARM to propagate across the world. If you are looking for performance, Intel and AMD (and POWER) occupy that category, but the low power category was an opportunity for someone else.

ARM, today's low power champion, is now under threat from a new category: the "minimal cost" category. Two designs from the distant past have returned, and unhappy ARM licensees are interested in what they have to offer.

The first minimal cost design is RISC-V, the fifth version of the original Reduced Instruction Set Computer processor designed at UC Berkeley. The second is MIPS Open, the successor of the Stanford University spin-out processor that powered Silicon Graphics systems and the Nintendo 64.

These two minimal cost processors offer current ARM licensees a choice: hire their own engineers to create new designs under an open license, or skip all of that and take a license for immutable core designs from ARM. Increasingly, firms are looking at the cost of licensing from ARM and are instead putting their own design teams together. System design jobs have entered a new era of expansion, with companies looking at doing their own bespoke implementations again.

RISC-V has a simple base specification of about 50 instructions; in contrast, the POWER Instruction Set Architecture I used to keep in a desk drawer runs to three bound books and over 1,200 pages. RISC-V’s academic roots are plain to see, as there is just enough to get going.
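To give a feel for how small that base specification is, here is a sketch (in Python, nothing from the RISC-V project itself) of decoding a single RV32I instruction. The fixed bit-field layout is from the published base ISA, and the sample word 0x00500093 encodes `addi x1, x0, 5`:

```python
# A sketch of how little there is to the RV32I base: one I-type instruction
# decoded by slicing fixed bit fields out of a 32-bit word.
def decode_itype(word: int) -> dict:
    imm = word >> 20                    # bits 31..20: 12-bit immediate
    if imm & 0x800:                     # sign-extend negative immediates
        imm -= 0x1000
    return {
        "opcode": word & 0x7F,          # bits 6..0 (0x13 = OP-IMM group)
        "rd":     (word >> 7) & 0x1F,   # bits 11..7: destination register
        "funct3": (word >> 12) & 0x7,   # bits 14..12 (0 = ADDI)
        "rs1":    (word >> 15) & 0x1F,  # bits 19..15: source register
        "imm":    imm,
    }

fields = decode_itype(0x00500093)       # addi x1, x0, 5
assert fields == {"opcode": 0x13, "rd": 1, "funct3": 0, "rs1": 0, "imm": 5}
```

Every RV32I integer instruction fits one of a handful of such fixed formats, which is a large part of why student and hobbyist implementations are feasible at all.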

If you want to own a trinket you could drop $60 on a HiFive1 board with a genuine RISC-V processor, but you can simulate RISC-V in QEMU today and boot Linux on it. CPU simulation performance is acceptable enough to get work done, and QEMU also supports MIPS CPU simulation.

While RISC-V may run Linux, it does not yet have the capability to move into the performance category. David Patterson, one of the original co-designers of RISC (and RAID), co-authored an interesting paper spelling out an ambition for RISC-V to become the ubiquitous processing core anywhere computing is done. A lofty goal, but like Linux it will take billions of dollars in investment from a broad partner ecosystem to move towards performance and take on the established providers there.

The ARM server CPU companies have faded because moving from low power to high performance is a new set of brutal challenges. One of those challenges would be in meeting Intel and AMD on ground they know well and outspending them when it was required. When it mattered Linux had friends with deep pockets, ARM server processor providers do not.

Unlike RISC-V, MIPS descended from high performance into low power and minimal cost through a number of marketplace defeats at the hands of others. MIPS was considered so strategic in the 90s that it was the second target platform for the first version of Windows NT, but it is now more commonly found in embedded controllers.

With poor leadership and a hostile market MIPS has had a rough two decades, but it continues to be used for embedded applications and appliances. Facing a surge in RISC-V interest, team MIPS had no choice but to open up their intellectual property for fear they would be the first casualty of RISC-V. This was a move showing an intelligence that has been lacking from MIPS leadership for years, so there might be room for MIPS Open in the minimal cost processor segment yet.

The question that matters is whether it's possible for minimal cost processors to jump categories and take on the performance CPU providers. Yes, it is possible. But they're going to need a lot of friends who have a lot of money, and those friends will really have to want them to succeed.

Same as it was back in the 80s when the original RISC processor designs came out of the Berkeley CS labs and dominated the UNIX business for years after.

Quantum uncertainty

The promise of quantum computing is its ability to derive answers for problems which are currently computationally prohibitive. Of the many open issues surrounding the successful delivery of mass-market quantum computers, two of note are the vagueness of what they will be useful for and whether quantum computers can be scaled in a manner that solves hard problems. It is uncertain whether there are answers to these questions.

By modelling the natural world in a processor, as opposed to the digital operations of current processors, the assumption is we will have the ability to quickly simulate the interaction of particles in the real world. Our current computing model, built as it is on zeros and ones, is inadequate for these use cases, as in the natural world things are not binary. Traditional computers are built on the bit, which has a state of 0 or 1 and never anything else. The qubit, the building block of quantum computing, is capable of operating in intermediate states. Imagine it as always operating from 0 to 1; qubits only deliver an output of 0 or 1 when you take a measurement of their current state.

To give an inaccurate accounting of quantum computing that I'll tell you right now is incorrect, but which does not require an understanding of quantum mechanics or linear algebra: imagine that when simulating something using bits you sequentially cycle through every outcome until you deliver an answer. For some workloads this can be done quickly; for others, usually involving an examination of the building blocks of reality, the computational time required can be measured in thousands of years. With qubits you can check multiple outcomes in parallel, because what you’re using for your computation more accurately reflects the phenomena you are looking to examine. When you take a measurement you have an answer, and the answer is not derived from a binary simulation of reality painfully stepped through one option at a time, but from the effect of computation on reality as it exists.
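The "operating from 0 to 1" idea can itself be sketched on a traditional computer. The toy below is plain Python, not any real quantum SDK: a qubit is a pair of amplitudes, a gate is a 2x2 matrix, and measurement yields 0 or 1 with probability given by the squared amplitude.

```python
from math import sqrt

# Toy single-qubit simulation: state = (amplitude of 0, amplitude of 1).
def apply_gate(gate, state):
    (a, b), (c, d) = gate
    s0, s1 = state
    return (a * s0 + b * s1, c * s0 + d * s1)

# Hadamard gate: sends a definite |0> state into an equal superposition.
H = ((1 / sqrt(2),  1 / sqrt(2)),
     (1 / sqrt(2), -1 / sqrt(2)))

zero = (1.0, 0.0)                       # the qubit starts as a definite 0
superposed = apply_gate(H, zero)

probs = [abs(amp) ** 2 for amp in superposed]
print(probs)                            # measuring returns 0 or 1, each ~0.5
```

The catch, and the reason real hardware matters, is that simulating n qubits this way needs 2^n amplitudes, which is exactly the exponential cost quantum hardware is meant to escape.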

This isn’t to say that quantum computing will solve all problems, it is not administration rights to the knowledge of the universe, nor might it solve problems faster than traditional computing. It is expected quantum computing will have applications in physics and mathematical factorisation (for example cryptography and breaking cryptography), but there is still a realm of hard problems expected to be well beyond the capability of quantum computing.

To date we are unsure what quantum computers will be useful for, as the hardware is experimental, small scale and provides results of questionable accuracy. If chip designers can crack the tough problems around the development of quantum processors and qubits, the end goal will be discrete quantum processing unit (QPU) cards, available as accelerators the way graphics processors are delivered today. Today, however, quantum computers are big, qubits require isolation so as not to negatively interact with one another, and they need cryogenic cooling to ensure stability.

Right now Intel, IBM and Google have fabricated double-digit qubit chips, but Intel admit these are probably not good enough for operation at scale as the qubits have a high error rate. The fact that the hardware returns too many incorrect answers to be useful for computation has not slowed down the search for new quantum-accelerated algorithms. With a lack of production-grade hardware, software developers have turned to simulating qubits on traditional computers. Microsoft has released a preview of their Q# programming language, which comes packaged with their quantum processor simulator, and there are extensions for Java and Python which do the same thing.

As qubits in the real world may not perform as expected, how accurate software simulations running on traditional computers will turn out to be is also a question yet to be answered. There may be a discovery or two not yet reflected in the software, and when the hardware and software are delivered the systems may simply fail to live up to their promises.

The quantum computing breakthrough has been five years away since the first time you heard the phrase “quantum computing”, and its success is still not inevitable. While the technology industry is like the fashion industry, in the sense that there is hype, trends and seasons when it comes to new offerings, it would be unwise to be cynical about a new technology in its formative state. That said, controlling expectations would be prudent until you can rent millions of qubits from your favourite cloud computing provider or add a QPU to your desktop.

Just be sure to keep a can of liquid nitrogen close to hand if you buy your own QPU.