Soup Tasting

Ray Dalio’s Principles is required reading at Bridgewater, and contains plenty of wisdom that resonates well beyond its original context. Far down on the list, at #139, we find: 

... 139) “Taste the soup.” A good restaurateur constantly tastes the food that is coming out of his kitchen and judges it against his vision of what is excellent. A good manager needs to do the same.

Soup tasting is hard and requires you to pierce comfortable levels of abstraction.  Often where there are bad outcomes, there is a gross lack of soup tasting, both because of inertial unwillingness to take a bite and because of ineffective gustation.

Amazon’s Jeff Bezos is the archetypal soup taster (among many other outstanding talents).  Bezos is renowned for the depth of his knowledge and the clarity of his insights (especially when making snap judgments), but equally important is his ability to get to the crux of seemingly complex matters in five questions or fewer.  It’s easy to forget how many complex decisions Amazon has faced over the years, and the fact that their success is often taken for granted is largely a tribute to Bezos’ ability to ask the right questions so incisively and consistently.

The importance of soup tasting seems intuitive enough, but how you develop the ability to taste soup well is one of the more underrated challenges of leadership for a number of reasons.  To begin with, there is never just one kind of soup.  The metaphor applies equally well to the commercial success of your business and the view from inside.  At the same time, not all soup is equally important, and even the most astute taster’s capacity is limited, so you need a focal point. As Bezos has often described, “We start with the customer and we work backwards.”

More fundamentally, soup tasting is largely about overcoming bias, which is generally a very difficult process.  It needs to be about fearless inquiry, not seeking reassurances. Anyone who has done any actual cooking has probably had the experience of asking someone else if a dish tastes funny, while silently convincing himself that the answer is no. Of course, if it does taste funny, being polite does the aspiring chef no favors.  For soup tasting to have any value as an exercise, you can’t be afraid of what you might discover. 

Soup tasting is as much art as science, and as such it is hard to turn it into a predictable framework.  Still, some basic principles apply:

  • It all starts with probing.  Any time you are presented with an assertion, whether it’s a project plan, forecast, or report, review it tenaciously.  If something isn't clear to you, probe down. If something strikes you as particularly important, probe down deeper. If there are implicit assumptions, challenge them.  Think of the annoying little kid who responds to everything by simply asking “why?” It seems repetitive, but if you proceed from the right starting questions you will quickly get to the heart of the matter.
  • Get closer to the problem.  Something about the soup seems off.  Now you need to taste it some more.  The first step in getting close to the problem is simply a more thorough probing.  If that doesn’t do the trick, you need to go down two or three levels, either by homing in on the most important things in your area of credibility, or by asking someone who is credible.  By the way, assessing who has credibility in what areas, beyond just being aware of their reputations, is its own important form of soup tasting.
  • Measure.  Soup-making, both literal and figurative, requires experimentation, and it’s one of the hallmarks of the Amazon approach.  Bezos places a premium on experiments that are measurable and produce hard data.  As he explained in Fast Company, “The great thing about fact-based decisions is that they overrule the hierarchy. The most junior person in the company can win an argument with the most senior person with a fact-based decision.”  At the same time, as Bezos will quickly tell you, “there’s this whole other set of decisions that you can’t ultimately boil down to a math problem” – hence you need to master the art as well as the science.

It’s also well worth considering what soup tasting is not:

  • It’s not micromanagement.  Micromanagement means telling people how to do something without tasting the soup for yourself, or telling them how to do something in an area where you lack credibility.
  • It’s not distrust.  Distrust is not a productive default position, but neither is blind trust. Real trust is developed by consistent soup tasting – as the old saying goes, “trust, but verify”.  Knowing which issues to escalate as priorities, and how to escalate them as a team, is also an art form, honed through soup tasting interactions.
  • It’s not indefinite, nor is it an end in itself.  You need to find the middle ground between an excessively laissez-faire approach and never-ending inspection.

The more soup you start to taste, the more you'll want to taste, but as with anything, you can overdo it – just as when you proofread for too long, you’re bound to miss something obvious. It is critical to cultivate credible soup tasters throughout the organization, but the transition from soup taster to meta-soup taster is a tough one. It only works if your trust has been validated, and it requires a great deal of intellectual honesty to avoid indulging in wishful thinking, feel-good exercises, or simply shedding responsibility.

In the end, soup tasting is how you know what is true – “overcoming bias” and “intellectual honesty” are really just fancier ways of expressing this.  And the truth matters more than anything else.  In his introduction to Principles, Dalio states,

“I also believe that those principles that are most valuable to each of us come from our own encounters with reality and our reflections on these encounters – not from being taught and simply accepting someone else’s principles…. So, when digesting each principle, please…

…ask yourself: “Is it true?”


All soup tasting, ultimately, is a variation on this one simple yet profound question.

The Nature of Goal Orientation

I think a lot about what specific competencies are needed when starting something, but even more fundamentally, how does someone approach work (and life)? My experience is that there are goal-oriented people and there are process-oriented people.  Finding goal-oriented people is one of the most crucial determinants of startup success - no amount of expertise can substitute for goal orientation.

There is implicit bias in both orientations, but not all biases are created equal.  Goal orientation subordinates process to outcomes.  As a result, there is sometimes a tendency to ignore or undervalue the importance of frameworks, checklists, and details, though in my experience truly goal-oriented people are quite intuitive at abstracting useful and repeatable approaches from their experiences. Planning and process are also not the same thing – done right, planning is simply the division of larger goals into smaller ones.  Even so, goal orientation is a vastly preferable bias.  You can learn organization (and the most effective people are constantly re-learning it), but motivation is much harder.  By the same token, consultants can help to improve your processes, but they can’t define your goals for you.

Process orientation, on the other hand, actually subverts your goals, under the subtle guise of helping you achieve them.  Uncritical acceptance of process creates an alibi for failure.  When things go wrong, a process-oriented person thinks “I did all I could while following the process to the letter, so maybe it just wasn’t meant to be.” Without a healthy institutional skepticism, process easily becomes a goal in itself. To be fair, processes and goals can both be destructive if they are not subject to revision, but process is fundamentally tied to predictability and routine, whereas goals require constant thought and re-examination to remain effective.

The most inventive organizations are more concerned with limiting process than perfecting it. Apple’s revitalization began when they started to re-imagine a hardware problem (personal devices) as a software problem. If process had been the dominant consideration, Apple would have kept refining their old product lines until they faded into irrelevance.  By the same token, many enormous failures affecting society writ large can be attributed in part to relying on process while ignoring the substance (Enron, the subprime collapse, countless failed technology acquisitions).

Everyone claims to be goal-oriented (it’s probably one of the top resume clichés), but the norm is that people want to be told what to do.  Freedom is scary, partly because it is new and unfamiliar, but mostly because the onus will be on you to succeed once the security blanket of process is taken away.  Truly meritocratic and goal-oriented organizations are also quite rare, so it’s easy to mistake boredom and frustration with bureaucracy for real self-determination.  During both Internet bubbles, countless career big-company employees decided they wanted to “join a startup”, without really asking why or realizing that they were trying to be different in the exact same way as everyone else (the word “join” isn’t an accident either).   Ironically, when asked by hiring managers what they would bring to the table, these people would typically deliver lengthy homages to their current company’s processes. 

One of the most interesting things about goal and process orientation is how much is constitutional and how much is cultural.  Some people are natural insurgents, who orient toward and achieve goals so intuitively that they may not even appear disruptive to the casual observer.  Others have been raised in cultures that value conformity and process.  Just as many genes are only expressed when the right stressors are present, a naturally goal-oriented person may not emerge until landing in the right environment. The converse is much less common, however – process-oriented people tend to be exposed fairly quickly in truly goal-oriented environments where there is little concept of playing along.

The conflict between goal and process orientation is exceptionally relevant to planning one’s career.  We’ve all seen picture-perfect, cookie-cutter resumes that are obviously a result of process orientation.  What’s more interesting is when people try to design rules and processes to reverse-engineer a major career shift.  There are plenty of “experts” who will tell you to get experience in the private sector before doing a stint in government (or vice versa), or that you should learn “fundamentals” at a Fortune 500 company before joining an early-stage startup.  With all due respect, these people completely miss the point of having goals.  It should be more obvious with really unorthodox career arcs, but even so, many people are apt to read about Steve Jobs and think “Ok, so I should drop out of college, but take a calligraphy class, and get fired from my own company before making a triumphant comeback.”

Of course, there are plenty of perfectly good environments for process-oriented people.  The problem is when they land in the wrong place and both the person and team suffer.  It really comes down to honestly understanding your strengths and weaknesses, as an individual and as an organization. 

On the Joy of Renting

Ownership is the essence of the American Dream – or is it? The mortgage crisis certainly led many people to rethink the virtues of owning a home, but even in less dramatic markets, it’s a fair question.  There are many assumptions to be challenged and hidden costs to be considered.  Warren Buffett continues to bet heavily on housing, while Yale economist Robert Shiller contends that housing is an investment fad, with no net appreciation in the US market over 100 years.  Of course, as the author of the Shiller post points out, most of us are living in our homes, and the benefit is partly intangible.  But how much does the intangible actually depend on ownership, as opposed to just being there?

Rental has always been a popular alternative for major, long-term-use assets with high maintenance costs.  Traditionally this has meant homes and cars, but they are just the beginning.  The convergence of low-friction technology, on-demand efficiencies, expanding tastes, and shrinking wallets has led to the explosion of the sharing economy, as reported by The Economist.  There are countless examples, each with its own intricacies: Rent The Runway, Amazon Web Services/EC2, ZipCar, Uber, even BlackJet. It’s about deciding not to own something you really don’t need to own yourself (and achieving better financial health as a result).  Increasingly, we have the option to spread out the long-term maintenance cost, which actually exceeds the acquisition cost for more assets than people tend to realize, while maintaining high availability.

The sharing economy ranges from necessities such as housing and transportation to luxuries such as designer dresses and private jets, but necessities quickly become luxuries when acquired carelessly.  This is especially pertinent for government, but it’s not always obvious which costs justify themselves.  Traditionally, the Forest Service, Coast Guard, police, et cetera all maintained their own helicopters, for example.  Even if they were grounded 90% of the time, no one wanted to give up ownership if they had a choice.  Now that states are going broke, sharing is a much more palatable option, but it’s not just about cutting costs – you have to re-examine the incentives.  In government, one of the major drivers of ownership is funding.  It’s easier to get large capital funds for new assets because they are assumed to be investments – and investment has a return.  It’s much harder to get operational funding because that is a cost – and costs are bad, right? (How many times have you heard that renting is throwing money away?)  But what if that helicopter fleet is just a really bad investment? It becomes a lot easier to make that case if you can get a helicopter on short notice, probably based on a retainer and/or hourly use fee (similar to ZipCar).
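To make the funding incentive concrete, here is a minimal rent-vs-own sketch. Every figure is a hypothetical placeholder, not real helicopter economics; the structure of the comparison is the point – once maintenance and idle time are counted, the “investment” framing can fall apart:

```python
# Rent-vs-own break-even sketch. All dollar amounts are invented
# placeholders for illustration only.

def ownership_cost(years, purchase=5_000_000, annual_maintenance=400_000):
    """Total cost of owning: one-time capital outlay plus ongoing maintenance."""
    return purchase + annual_maintenance * years

def rental_cost(years, annual_retainer=150_000, hourly_rate=3_000, hours_per_year=100):
    """Total cost of renting on a retainer-plus-hourly model (ZipCar-style)."""
    return (annual_retainer + hourly_rate * hours_per_year) * years

for years in (5, 10, 20):
    print(f"{years:>2} years: own ${ownership_cost(years):,} vs rent ${rental_cost(years):,}")
```

With these assumed numbers (a fleet that only flies about 100 hours a year), renting stays far cheaper over any horizon – exactly the comparison that the capital-funds-versus-operational-funds framing obscures.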


Setting aside the emotional appeal of ownership (as difficult as that may be), my thesis is that it is generally a bad idea to own an asset unless you have a specific and special competency in owning it.  This holds for everything: housing, cars, servers - and especially software.

Cars are a tricky case, famously depreciating (up to 10%) the minute you drive them off the lot (a phrase so commonplace you probably finished it in your head). Many of us don’t know how to truly maintain our cars beyond the basics.  For occasional drivers, there are lighter alternatives such as ZipCar, but US infrastructure is still designed around individual drivers, and giving up your car can be very difficult if you don’t live in a city.  However, something like Sebastian Thrun's self-driving car work could someday open up a whole new world of on-demand transportation that is more efficient and safer than anything we have now.  Think about it: 97% of the time, your car is sitting around, taking up space, idle.

Servers, beyond the fixed costs, require hardware maintenance, networking, power and cooling.  Many servers require replacement after just a few years.  It’s much easier and lower overhead to simply rent the capacity you need - unless you are Google, Amazon, or the like, and have a special competency that requires you to maintain your own servers.

Software is often perfectly suited to on-demand delivery for predictable use cases, and software-as-a-service (SaaS) certainly qualifies as one of the major technology waves of recent years.  More and more, the prevailing sentiment is “why buy software when you can rent it?”, as reflected in Salesforce’s now-iconic logo. 

Of course, not all software needs can be satisfied by SaaS.  Then the relevant question is whether to build or buy, as opposed to rent or own, but the underlying considerations are similar (if quite a bit more complex).  My guiding principle is that you shouldn’t be building your own software unless you have a particular competency that requires it, or need to develop such a competency.

In keeping with the theme of recognizing our own biases, it’s important to separate the emotional resonance of ownership from the practical reality.  With software, the reality is that code depreciates incredibly fast, not to mention the continuous iteration and improvement required for software to stay relevant. Ownership bias is perhaps most frequent (and outsized) in government, where the idea of “owning” the code base has become hugely and irrationally popular.  In the vast majority of cases, “building” and subsequently owning your own software actually means contracting with private vendors to develop complex, bespoke systems that cost 10, even 100 times as much as an off-the-shelf product. 

There is an attractive yet perniciously false idea that once you build the software, it’s yours, free and clear.  The appeal is simple - people enjoy the feeling of ownership, and are naturally wary of being beholden to outside vendors.  But the reality is that you are paying down accrued technical debt all the time – just as you would maintain a house or car, except that a house or car isn’t expected to fundamentally change in a matter of months.  Furthermore, a bespoke project concentrates that debt with one client instead of amortizing it across all customers the way a productized solution does.  In a very cynical way, bespoke developers are smart to let the government own the source code. Not only does this prevent other customers from re-using the IP (and saving money on development), but it also makes the ongoing maintenance costs easier to justify because now, it’s their baby.
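The amortization point lends itself to a quick back-of-the-envelope comparison. The figures below are invented for illustration; what matters is the structure: the bespoke client carries the full build cost and all ongoing maintenance alone, while a product vendor spreads its maintenance burden across the entire customer base:

```python
# Bespoke vs. productized software: who carries the technical debt?
# All dollar amounts are hypothetical.

def bespoke_annual_cost(build_cost, annual_maintenance, years):
    """One client pays the full build and carries all maintenance alone."""
    return build_cost / years + annual_maintenance

def product_annual_cost(license_fee, vendor_annual_maintenance, customers):
    """Each customer pays a license; vendor maintenance is amortized across all customers."""
    return license_fee + vendor_annual_maintenance / customers

bespoke = bespoke_annual_cost(build_cost=10_000_000, annual_maintenance=2_000_000, years=5)
product = product_annual_cost(license_fee=500_000, vendor_annual_maintenance=2_000_000, customers=100)
print(f"bespoke: ${bespoke:,.0f}/year   product: ${product:,.0f}/year")
```

Under these assumptions the bespoke client pays roughly an order of magnitude more per year for comparable functionality, before even counting the faster depreciation of one-off code.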

The final point is that if you are going to buy, you need to make sure that the seller has a specific competency in software.  It might seem obvious, but more than any other product, you want to buy software from a software company. Rolls-Royce can build world-class cars and jet engines alike, but there is no real analog among the aerospace companies and systems integrators that also attempt to build software.  The product lifecycle, pace of innovation, maintenance considerations, and above all the deltas between good and great all make software unique among industries.

Mo Data, Mo Problems

If you’ve spent any time in high tech over the last few years, you’ve probably heard the term “big data” more than you care to recall.  It’s become a constant refrain, and the subject of plenty of breathless cheerleading, much like “the cloud”, “social media”, and countless other trends that preceded it.  This is not to say that big data is not important, but context and meaning are essential.  Big data has many roles to play, but it’s not an end in itself, as Shira Ovide explains so concisely in her recent Wall Street Journal piece.

“Data for data’s sake” is the first major weakness of the big data obsession cited by Ovide, and it’s probably the most salient.  This is a classic case of valuing inputs over outputs – the idea that if we only collect enough data, good things will happen.  This sort of magical thinking is somewhat reminiscent of past crazes for purely A.I./algorithmic approaches to data science, but at least in those cases there was some concept of outputs and programmatic attempts at sense-making.

Of course, big data also isn’t going anywhere, and many worthy analytical endeavors demand that we address it.  However, it is essential to distinguish between warehousing, searching and indexing, and actual analysis.  Focusing solely on storage and performance creates a sort of computational uncertainty principle, where the more we know, the less we understand.

As Ovide also notes, there is also a critical gap in analytical talent, which big data has done more to expose than mitigate.   Computing power can go a long way towards making big data manageable and facilitating insight – if paired with a sufficient dose of human ingenuity.  Simply put, humans and computers need each other.  "Pattern recognition” is frequently cited as a benefit of a big data approach, but computers can't learn to spot patterns they've never seen.  As a result, the value of the analyst in defining the correct patterns and heuristics becomes all the more important. 

Appropriately enough, the most valuable and elusive elements lurking within big datasets are often human: fast-moving targets such as terrorists, cyber criminals, rogue traders, and disease carriers who tend to slip through the cracks when algorithms are deployed as-is and left unattended.  The old playground retort that it “takes one to know one” actually applies quite well to these types of situations.

Human capital is a key part of the equation, but it’s not enough to acquire the right talent – you need to address the inevitable organizational challenges that come with retooling for a big data future.  Ovide notes that many companies are installing “Chief Analytics Officers”, and while I want to reserve judgment, the cynic in me suspects this reflects the bias of large organizations to centralize power and create new titles as a first line of defense against unfamiliar problems.  A chief analytics officer could be the catalyst to instill readiness and analytical rigor throughout the organization, but whether this reinforces or dilutes the perception that big data is everyone’s concern is a fair question.

More than anything else, I would analogize the challenges of big data to the differences between conventional warfare and counter-insurgency.  In conventional warfare, the targets are distinct and obvious.  In counter-insurgency, the enemy is hiding among the population.  Much as you can occupy an entire country without knowing what’s really going on outside the wire, you can warehouse and perhaps even index massive data stores without producing actionable insights.  Effective big data approaches, like effective counterinsurgency, require the right balance of resources, sheer power, ingenuity, and strong and constant focus on outcomes.  In the long run, the willingness to pursue a population-centric strategy may well prove to be the difference.

1776: The Ultimate Story of Entrepreneurship

David McCullough’s 1776 is, to my mind, the ultimate story of entrepreneurship.  Starting a company is challenging enough - now imagine starting a country!  Although orders of magnitude more complex, America’s founding has much to teach entrepreneurs of all varieties.  And given this heritage, it should also come as no surprise that the United States remains the best place in the world to start something new.

One of the most valuable things 1776 imparts is an appreciation for the incredibly hard fight endured by the Continental army.  If your most recent lesson on the American Revolution came from a high school textbook, you might dimly recall a few triumphant battles and Valley Forge.  1776 paints a vivid picture of the sheer misery and constant trials of the war – trials few could have anticipated.  The Continental Army’s perseverance is even more impressive when you realize that the Treaty of Paris wasn’t signed until 1783.  For the modern reader, it’s a nuanced lesson: on one hand, you need to be realistic about the challenge ahead, but at the same time, you have no way of really knowing.

The parallels between startups and the Continental army are fascinating.  Some quick observations:

  • Chaos: Compared to the British army, the Continental army seemed completely chaotic. There were no well-defined roles and no visible hierarchy among these ragtag, shoeless countrymen who had taken up arms.  Of course, some of this chaos was real and some was perceived.  The relevant point when starting anything is not how to eliminate chaos, but rather which elements of chaos should be tackled in what order.  Do you address real organizational challenges, or just shuffle everyone’s title? This distinction escaped the British, who underestimated the strength and ability of the “rebels” simply because they looked like a mess. 
  • Meritocracy: Nathanael Greene and Henry Knox are two of the better examples.  Greene, a Rhode Island Quaker who had never been in battle before, became Washington's most trusted general due to his exceptional competence and dedication.  Knox was an obese 25-year-old who rose to the rank of Colonel.  He thought up the mission to secure artillery from Ticonderoga, without which the Continental army would have had no such capability.
  • Talent: Despite Washington’s limited experience in the French and Indian War, his principal strength was not military strategy (in fact, his advisors staved off disaster more than once by convincing him not to do something).  His real superpower was his ability to quickly determine who was talented at what.
  • Food: Food was critical to the Continental army.  Certainly there were times where they were on the move and hardly ate for days on end.  While food was always scarce, the fact that the Army was actually able to feed people with some consistency was critical. The modern startup is obviously not directly comparable, but we’ve seen time and again how providing food pays for itself many times over in terms of focus, productivity and commitment.

But more than simple observations and parallels, there are some real takeaways and strategies for anyone who aspires to start something extraordinary:   

Be Ruthless.

I was shocked by how many times during the course of battle the British would halt their movement to rest or make porridge or something completely non-essential.  There were countless occasions where the side with the advantage could have ended the war, had they only pressed on.  Their reasons should sound a cautionary note even now - stop because it is getting dark?  Stop because that was the plan (despite the ground truth)?  Worst of all: stop because we can finish the job more comfortably tomorrow. 

After routing the Americans and forcing them across a bridge, British General Cornwallis decided to rest.  The Americans retreated brilliantly and swiftly into the night. This was not the Continental Army's first such retreat, so it’s hard to imagine how Cornwallis did not realize the significant risk they posed. Why didn't he send out patrols? Most likely, he thought he would win tomorrow regardless, and preferred not to win under uncomfortable circumstances.  After the fact, he said that he would have kept going, whatever the risks, no matter the orders, if he had only known he would have caught Washington.  The lesson:  Be ruthless as a default setting, not just because victory is seemingly at hand.

Don't Get Overconfident.

Nearly every major mistake by either side in the 1776 campaign was a result of overconfidence.  Minor victories would lead commanders to discard their hard-won knowledge, resulting in terrible decisions.  The tendency to let encouraging signs override our better judgment is a fundamental human cognitive bias.  If you’re interested in learning how to recognize and defeat all manner of non-rational thinking, make it a point to read Overcoming Bias.

Don't Waste Time Politicking.

General Charles Lee felt slighted that the less experienced George Washington was given command of the Continental army, and constantly sought to undermine him.  When Washington ordered Lee to bring his forces to New Jersey, Lee dawdled, and was captured by the British while seeking some female companionship in a tavern.  Lee was marched to New York in his nightgown, and soon defected.  Much more devastating, however, was a series of letters to Lee from Washington's close advisor and friend Joseph Reed, detailing Reed’s disappointment with Washington.  Why couldn’t Reed have an honest, face-to-face conversation with his brother in arms to sort through the issues?  In any vital endeavor, there is too much at stake to have closed communications or privately nurse resentments.

It ain't over 'til it's over.

Time after time, each side thought a specific battle was going to be decisive.  In retrospect, it is amazing how incredibly wrong they were, and how often.  So how do you respond? There is a fine line between being jaded and being realistic. Starting something invariably requires commitment in the face of uncertainty.  For this reason, I’d argue that it’s better to be optimistic (even if slightly naïve) than completely cynical, but again, the key is to be aware of our biases.


Business Schools and Employability

According to a recent Wall Street Journal article, business schools are placing increased emphasis on the employability of their students prior to admission.  I won’t speculate to what extent this is motivated by the need to protect their job placement statistics in a grim economy, but it’s worth considering the true consequences of this trend.  As the article notes, business schools have always considered the goals of the applicant – but to what extent are they curating these goals on the front end?  Even if we assume good intentions, the effect is to reinforce the status quo, making business school populations even more risk-averse and less entrepreneurial.

Ironically, this seems to be at least partly motivated by the banking collapse: “when the financial crisis upended the banking sector and sure-thing jobs on Wall Street disappeared, schools began formally tying input (applicants) to output (graduates).”  Why “ironically”?  Regardless of how much blame you want to assign to federal housing and lending policy as opposed to private sector recklessness, the financial crisis wasn’t brought on by entrepreneurial, non-linear thinking. Legions of conventionally smart people who had done everything right, rigorously following twenty year plans including name-brand firms and business schools, managed to get the biggest bets horribly wrong.  This is not meant to be flippant – current market conditions and job statistics are stubborn things that must be acknowledged.  However, if the lesson of the financial crisis is that we should double down on conventional wisdom, regardless of whether anything of value is created, then we’ve indeed learned nothing from the past five years.

As someone who frequently uses the frame of inputs vs. outputs, I took immediate notice of the wording above.  It would be encouraging to see an extremely input-focused sector more concerned with outputs, but I suspect they have confused the two in this case, merely trading one set of inputs for another (the addition of an MBA).  You can also think of this as commoditizing human capital, which calls the entire purpose of an MBA into question.  Is business school meant to help develop leaders, or to serve as the finishing process on a prestigious kind of assembly line?

The article goes on to state that “making employability too weighty a factor in admissions can backfire.”  According to Graham Richmond, a former admissions officer at University of Pennsylvania's Wharton School, “Looking at applicants through a narrow vocational lens may deter schools from accepting riskier candidates, such as entrepreneurs or career-switchers, in favor of more sure things, such as aspiring management consultants.”  The fact that aspiring management consultants are considered “sure things” is evidence of how much MBA culture values process over invention.  Candidates and schools understandably want assurances, especially in the wake of 2008.  The world is a chaotic place, even more so since the financial crisis (though I contend that it has always been so, and that the banking industry simply managed to insulate itself unusually well for as long as it could).  Obviously, you have to adapt to the current reality.  Yet I can’t help but wonder if by focusing on doing obvious, “safe” things, to the exclusion of risk-taking and creativity, the MBA community isn’t just constructing an elaborate playpen in which nothing new ever happens.

Calling All Computer Scientists in Southern Europe

One of the most startling yet largely under-reported facets of the European financial crisis is the rate of youth unemployment, especially in Southern Europe.  If you are a young person in Greece (58%), Spain (56%), Portugal (39%), Italy (37%), or France (27%), you are likely looking elsewhere already. There are certainly nearby places with a shortage of qualified workers (such as Germany), and when jobs of any kind are scarce, it may seem a strange time to be seeking your ideal job.  

Yet, for those of you who studied engineering (especially computer science), that is exactly what I am suggesting.  Palantir is hiring aggressively in Palo Alto, New York, Washington, Los Angeles, London, Australia, New Zealand, Singapore, and beyond.  If you are not only technical, but also passionate about using technology to address the problems that matter most in the world, Palantir (and I personally) would love to hear from you. Why Palantir?

Meritocracy: Silicon Valley has the highest concentration of great computer scientists anywhere in the world.  If you are a gifted young computer scientist, you belong with a Silicon Valley company, if not in the Valley itself.  Of all the great things about Silicon Valley, meritocracy may be the greatest differentiator.  There are no long apprenticeship or trainee programs at Palantir (though we are always learning).  Everyone is equipped to begin working on real problems within weeks. Good ideas don’t have to pass through a massive hierarchy – the best idea wins, regardless of whose idea it is.  

Save The World: Palantir is focused on solving the most important problems for the world’s most important institutions, and we are always exploring new uses for our platforms. Some of our major areas of application include financial oversight, disease control, narco-trafficking, protection of children, cyber security, protection of civil liberties, and most recently, disaster response. In the face of global austerity, we are helping governments get the most out of limited resources, and working with financial regulators to prevent the next financial crisis before it happens.

These are uncertain and volatile times, especially for Europe, yet there has also never been a better time to be part of something extraordinary. 

Apply Here (the path to a better tomorrow): https://www.palantir.com/challenge/

The Soft Underbelly of Technology Services

I spend a lot of time thinking about delivery models for technology, especially in an age of shrinking budgets and growing complexity.  So I was struck to read that Avanade, a joint custom software venture between Accenture and Microsoft, had been sued by a customer for major cost overruns.  The key part:

The lawsuit said a software project estimated to cost $17 million and take 11 months instead mushroomed to $37 million over three years, and ScanSource said it still doesn’t have a Dynamics software up and running. Accenture has estimated it will cost $29 million more to complete the ERP project, according to ScanSource’s lawsuit.

What can be learned from this? Quite a few things.  The cynics among us might point out that an overrun of $20 million and 2+ years is considered a bargain in some areas of government.  That is of course an outrage, but the important takeaway goes beyond the numbers, to the fundamental nature of the delivery model.  Let’s assume for this conversation that all the actors here are acting in good faith and are very competent.  I think that even so, the model leads to these sorts of outcomes.

Not surprisingly, Avanade turns out to be in the business of renting labor.  Services is the exact wrong model – a catastrophically incorrect model, the more you think about it. These sorts of incidents are really a lagging indicator of the weakness in the model, but its decline is taking a whole lot of innocent (and some not-so-innocent) bystanders with it.  More on them in a bit.

There are many shortcomings to the services model, but most fundamentally it’s the wrong incentive structure.  When you’re renting labor and other nebulous inputs, it’s almost a truism to point out that the longer a project takes, the more the company prospers, and the bigger the project, the more room for abuse.  A contractor doing a bathroom remodel might employ a similar cost structure, but could never get away with overruns on a tenth the scale of those alleged in the Avanade lawsuit.  Of course, even if you have reliable cost safeguards in place, custom software development is inefficient, as I’ve often railed about in these pages.  It takes an army of consultants to deliver, and another army of consultants to maintain.  

It’s not all the services company’s fault, though – not even primarily.  In a sense everyone is complicit, from the services company, to the customer who doesn’t demand something better or structure payment as a premium contingent on success, to the tech giants who aren’t working to productize services.  Of course, if product companies dared to do so, the services companies of the world would throw a fit, and professional courtesy runs deeper than you might think in a theoretically competitive marketplace.  

When the world changes, you don’t always get a say in the matter, and evolution has a funny way of sneaking up on those who get too comfortable.  The first indications may just be bubbling to the surface, but two things are clear: services companies are under tremendous pressure, and product companies need to productize services. 

The first point makes sense from a valuation standpoint.  Mature tech companies such as Oracle and Microsoft have market caps of ~5-6x annual revenue, while the multiple is often less than 2x for services firms, even the upper tier.  Yet it’s still not obvious to all that services companies are living in the past (partly because many services companies are so good at convincing people they’re really technology companies).  Mostly, though, it’s because services companies still generate a lot of money.  It’s a dying model that’s still making people rich, so it’s easy to ignore the warning signs even if you see them.  And for an exponential trend, by the time you are 1% there, it is almost done.  You could almost analogize it to the SUV craze: consumers couldn’t get enough gas-guzzling SUVs, and American auto makers happily served them up for several years.  Suddenly (but not all that surprisingly), $3-4/gallon gasoline was a fact of life and those same automakers were all teetering on bankruptcy for giving the customers exactly what they wanted.

In terms of multiplying complexity and data problems, we’re entering an era of $10/gallon gas.  Even if you’re in the product business, if you’re not increasing your productivity per person, you are dying – in some cases more quickly and dramatically than the services dinosaurs.  And for this reason, product companies can’t just deliver products any more – they need to productize services on a continuous basis.  In short, they need to deliver outcomes.  Mere capabilities only work against well-understood problems.  They won’t be sufficient for the types of challenges that grow appreciably bigger in the time it takes to read this blog post.  

If that sounds smug, it needs to be acknowledged that building a business based on outcome delivery, as opposed to a static product, is still extraordinarily hard.  Not only are the prevailing incentive and cost structures far behind, but technically speaking it’s a very rugged frontier.  This is perhaps best illustrated by software, where performance at scale, processing, security, stability, and interoperability are often much bigger challenges than achieving the desired functionality.  On the other hand, though, successful technology has always productized services of some kind, dating back as far as the cotton gin or even the wheel.  The entropy of the present and future data landscape adds an enormous degree of difficulty, but along with Moore’s Law, the single biggest lever of the knowledge economy is the ability to repackage experience and lessons learned into a better, more responsive product.  It may take years or even decades, and it’s entirely possible that the first mover will end up being a sacrificial lamb.  Sooner or later, though, the company that gets productization right will eat the legacy companies’ lunch.

Inputs vs Outputs

A defining difference between Silicon Valley and the Old World is that Silicon Valley is intensely focused on outputs as opposed to inputs.  While the shift to an outcome-based economy remains a work in progress, the high-tech world tends to focus on tangible results, not ingredients.  It’s not just about a different way of thinking about business – it’s a matter of different societies and what they value.

One of the original inputs is ancestry.  No one in Silicon Valley will ask you who your parents were or what they did, whereas people absolutely will in the Old World and East Coast.  At some point in American history, having ancestors who came over on the Mayflower became an indicator of New England aristocracy – funny when you consider that the Pilgrims themselves were people of no social standing, building something from scratch.

Input bias is easy to observe in classically process-oriented companies (and societies).  Fixation on research and development is a prime example: the value of the final product is judged by the input (“it cost us $500 million to develop this”) more so than the results. Spending in general is frequently touted as an absolute good or evil ipso facto, but it’s actually one of the least relevant data points on its own.  When we talk about confusing cost with value, we’re really talking about confusing inputs with outputs.

Wall Street is extremely focused on inputs, even though their efforts are ostensibly measured by outputs, and fairly straightforward ones at that.  On Wall Street, input doesn’t just refer to assets under management – it’s about name-brand firm experience, having an MBA from the right school, who designed your suit, even your business cards.  Ironically, Goldman Sachs, the biggest name on Wall Street, transformed itself from a struggling, undistinguished firm to the world’s top investment bank under the leadership of Sidney Weinberg, a junior high school dropout. Weinberg was originally hired at Goldman as a janitor’s assistant making three dollars a week – an anonymous and menial job, certainly, but a job at the firm judged solely on output.  

Where you went to school is an obvious input, but outputs matter for the endurance and success of the school itself, especially young schools.  How did Stanford, which opened its doors in 1891, achieve equal footing with the Ivies?  Money certainly helped, but intermingling with Silicon Valley and entrepreneurial culture played a much greater role than simply having wealthy donors.  From legendary engineering dean Frederick Terman, who mentored (and invested in) Hewlett and Packard, to the founding of Yahoo! and Google by Stanford grad students, to Peter Thiel’s recent course on entrepreneurship, Stanford and Silicon Valley have enjoyed a unique symbiosis.  In terms of clear outputs, a recent study found that companies founded by Stanford alumni generate $2.7 trillion in annual revenue.   Beyond pure productivity, Stanford arguably introduced the concept of great entrepreneurs as a tangible output of a university, mentioned in the same sentence as Nobel laureates and world leaders.  The willingness of many of these great entrepreneurs to reinvest not only their money but also their wisdom and mentorship into the university is one of the great virtuous cycles in education.

Perhaps the ultimate input is age, and when a society values something simply for being old, it speaks volumes – especially when that something is itself.  The output that matters is enduring impact and relevance.  For the Old World, the danger is that reverence for the merely old is so deeply ingrained that by the time a society realizes it’s stagnating, it is exponentially harder to reverse the tide – witness the number of once-great empires of Europe struggling to stay afloat.  The United States is an obvious counterpoint (not that we can take that for granted), and I’ve often reflected that Silicon Valley values are really American values writ large, but there are new revolutions happening all the time, even in very old societies.  China and India were home to ancient and storied cultures, though neither was a world power as recently as the mid-20th century.  Today, in a post-imperial, post-Soviet world, they are major players, buoyed by high-tech explosions that would have been unimaginable fifty years ago.  Yet I would argue that such transformation only became possible when China and India collectively decided that only outputs, not the systems that produce them, are truly sacred.  


Focus on the First Derivative

In a fast-growing company, everyone by definition has less experience than their role demands.  This remains true as the company scales, with roles changing in fundamental ways every 3-6 months, especially when the company continues to defy expectations for months and years.  Ultimately, that’s all irrelevant.  In Silicon Valley, we like to talk about visionary leaders making momentous decisions amid great uncertainty, but what really matters is the first derivative of understanding: how are you and your team learning from the experience as it unfolds?  There are many considerations nested in this question – here are some of the most important:

How quickly are you learning? When you are operating within a tornado, speed counts for a great deal.  It’s often been said that even the right decision is wrong when taken too late, and this begins with learning. If the second and third order effects of your original challenge are already on an irreversible course by the time you’ve grasped the nature of the challenge, it’s no longer the same challenge.

Are people taking the same things away from failures? In an ideal world, everyone would not only draw the same conclusions from the experience, but they would also be the correct ones.  More often, the process is a lot messier, but that’s just reality – you learn together through give and take, not some mystical collective unconscious.  The key is that you are unified about your next move.  

Are you making meaningful abstractions, or just reacting to your immediate circumstances? Even when execution is everything, there is such a thing as being too tactical, and morale plummets when people can’t make abstractions (or they aren’t taken seriously).  It’s a delicate line, because your abstractions have to be actionable and part of a continuous cycle of learning and responding.

When dissent occurs, is it productive? Just because you eventually arrive at the same takeaways doesn’t mean there is no room for disagreement.  The question is whether it’s healthy and constructive, or pointed and personal.  The “team of rivals” concept has gained many adherents in recent years, but it’s important to remember that it’s above all a team.  Ideally, iron sharpens iron.

Two strikes and you’re out.  In certain areas, such as distribution, you don’t get many chances to course-correct when one approach fails, so extracting the right lessons from the first failure is paramount.  This is not to say that you should impose needless anxiety on these kinds of decisions, but be aware of what the stakes are.

Can you reform your model? Models can be extremely useful and necessary to consolidate your understanding of a complex world and plan accordingly.  However, they can also be an especially insidious kind of blindfold.  Adjusting your model, or abandoning it when necessary, can be incredibly difficult, because it requires you to first recognize and confront your inherent biases, and resist the tendency to rationalize away the model’s shortcomings.

In a hyper-growth environment, you will never have enough information, experience, or foresight.  The first derivative will be the only thing that matters.  We became the ultimate learning animals through many unforgiving eons of natural selection.  This new evolutionary challenge of warp-speed learning and adaptation may feel significantly more abstract, but once again, it all comes down to survival.