“Europe today is a powder keg and the leaders are like men smoking in an arsenal. A single spark will set off an explosion that will consume us all. I cannot tell you when that explosion will occur, but I can tell you where. Some damned foolish thing in the Balkans will set it off.” – Otto von Bismarck, 1878
One hundred years ago this past November, the Armistice of 11 November 1918 was signed, ending the First World War and its associated hostilities. The cost of this war in lives was beyond anything that can be conceived of today. In the United States, for example, the Vietnam War is correctly regarded as a military disaster. In just under twenty years of combat operations, the United States alone suffered an appalling 58,318 dead. At the First Battle of the Marne in 1914, by comparison, the Allies suffered almost four times that number in five days.
It is tempting to explain the war by saying that there was no way its horrors could have been predicted. The difficulty with this explanation is that at least some of those involved in bringing it about clearly did understand the consequences. The same day that he argued for war in Parliament, the British Foreign Secretary Edward Grey reportedly said, “The lamps are going out all over Europe. We shall not see them lit again in our lifetime.”
One of the questions historians have struggled to answer in the years since, therefore, is how – if the stakes were understood – the series of linked events known as the July Crisis was allowed to escalate out of control, to the point that war was the only possible outcome.
The answer, while enormously complicated in the details, is simple on the surface. Given the landscape and power structures at the time, none of the various combatants felt that they had an alternative. One of the most frightening things about studying the causes of the war, in fact, is that if you consider the political realities of the time, it’s actually easy to understand the respective nation-state justifications.
It’s frightening how easy it is, in other words, to accept that war was inevitable.
Three years ago in June, a venture firm gathered a small group of media, vendor representatives and analysts, including RedMonk, to discuss its thesis on the importance of open source as a mechanism for adoption in commercial settings. After presenting its own model, a partner from the firm moderated a panel of senior executives from its commercial open source portfolio companies. Each described in detail how open source had enabled and transformed its adoption cycle within the enterprise at the expense of proprietary alternatives.
The idea that the adoption of open source by developers within enterprises at scale had transformed the nature of procurement was consistent with RedMonk’s own views, of course. To some degree, it has been a core belief all along, and has been surfaced explicitly over the years with pieces such as this one from 2011 entitled “Bottom Up Adoption: The End of Procurement as We’ve Known It.” What was interesting about the proposed model wasn’t what it told us about the present, however, but rather what it failed to tell us about the future.
Conspicuously unmentioned at this event was the cloud. The cited competition for both investor and commercial OSS supplier was proprietary software; no special attention or even explicit mention was made of Amazon or other hyperscale cloud providers. A question on the subject was brushed off, politely.
Which was interesting, because RedMonk had by that point been judging commercial open source leadership teams based on their answer to the simple question of “who is your competition?” If the answer was a proprietary incumbent, this suggested that the company was looking backwards at the market. If the answer was instead the cloud, it was safe to assume they were more forward-looking.
Between then and now, as is evident, the market has caught up to this thinking. In the last twelve to eighteen months, in fact, a switch has been flipped. Companies have gone from regarding cloud providers like Amazon, Google or Microsoft as not even worth mentioning as competition to treating them as a dreadful, existential threat. The fear of these cloud providers has become so overpowering, in fact, that commercial open source vendors have chosen – against counsel, in many cases – to walk down strategic paths that violate open source cultural norms, trigger massive and sustained negative PR and jeopardize relationships with developers, partners and customers. Specifically, commercial open source providers have increasingly turned to models that blur the lines between open source and proprietary software in an attempt to access the strengths of both, with the higher probability outcome of ending up with their weaknesses instead.
That commercial open source providers took these actions having been advised of these and other risks in advance says everything about how these businesses view their prospects in a world increasingly dominated by massive providers of cloud infrastructure and an expanding array of services that sit on top of that. The strategic decisions inarguably have major, unavoidable negative consequences, but commercial open source providers – or their investors, at least – believe that a lack of action would be even more damaging.
It will be interesting to see whether that is still the belief after Amazon Web Services’ announcement this week. A rough summary of the events that led up to this week includes the following:
- 2010: Originally written by Shay Banon just shy of a decade ago, Elasticsearch is a permissively-licensed open source search engine. It proved popular enough that a commercial entity was eventually formed around it. Elastic NV – née Elasticsearch BV – took rounds totaling better than a hundred million dollars, went public in October of last year and is currently valued at just under $6B.
- 2015: Five years after the project was founded – presumably in response to customer demand – Amazon announced a cloud-based implementation called the Amazon Elasticsearch Service, based on that permissively licensed open source code. It competed directly with the commercial offerings, both on-premise and off, sold by Elastic NV.
- 2018: In part due to this and other cloud-based competition, Elastic began to blur the lines between its open source offerings and their proprietary licensed complements – its X-Pack offerings, specifically. Notably, Elastic did not follow in the footsteps of some of its commercial open source counterparts and attempt to solve this with hybrid licenses. Instead, the company began to intermingle open source and source-available but proprietary code in the same repository, and its default builds included this non-open source software.
- 2019: This week, in response to that response, Amazon did several things. Most obviously, with the support of Expedia and Netflix, it introduced what it regards as a distribution of Elasticsearch. The assumption here, however, is that given the relationship between the two projects, it will for all intents and purposes function as a fork. Second, the project includes new open source contributions, which roughly approximate the features that Elastic NV had not made available as open source and instead charged for. Third, as it had with the original Elasticsearch-based AWS service, it leveraged the Elasticsearch name for this project.
Given that previously latent friction has now escalated into open conflict, there are many questions being asked: how did it come to this? Was this inevitable? And, most obviously, who is to blame?
One of those questions, at least, is easy to answer. This move has been expected for some time. This is from September, for example, when looking at the Commons Clause:
It seems improbable, certainly, that if suddenly confronted with Commons Clause licensed software, cloud providers would universally pivot and license the software from the commercial open source providers that wrote it. It’s possible, in fact, that the application of the Commons Clause could backfire, making it more likely that cloud providers will try to poach key contributors or committers from companies that employ the license, or fork projects either publicly or privately, as a lower cost means of gaining the necessary control over the software assets.
The Amazon and Elastic controversy is the product of a collision of models. To Banon and Elastic’s credit, Elasticsearch proved to be an enormously popular piece of software. The permissive license the software was released under enabled that popularity.
Permissive licenses also enable, however, usage such as Amazon’s. For Elasticsearch and similar projects that are popular and have high visibility, cloud providers – in an effort to be responsive (and to not leave revenue on the table) – will try to find a way to meet the demand for these projects with native services. There are a few means of doing so:
- Licensing is not realistic. In spite of what some investors at least apparently believe, the addition of commercial terms to previously open source assets was never going to get hyperscale cloud providers to engage on a product basis. No company operating at that scale wants a major service to be beholden – whether in product development or pricing – to a third party they don’t control.
- Acquisition is another means of meeting the demand, but it scales poorly. Even well capitalized cloud suppliers don’t want to pay the acquisition premium for each new service entry, particularly when there’s a cheaper and lower friction alternative – which there is.
- Forking has historically been regarded as something of a nuclear option within open source communities, but from a public relations perspective it becomes more palatable when the commercial open source supplier has jeopardized its own standing by embracing tactics and methods that run counter to open source community norms. In such a scenario, even large third parties can attempt to occupy the moral high ground while serving their own interests at the same time.
Faced with those options, the logical response for a cloud supplier to the addition of adverse licensing terms will be, in at least some cases, a fork. Which is why a move like Amazon’s this week was expected and inevitable. It is also why the question of blame is difficult for non-partisans to assign. Both parties are essentially acting as might be expected given their respective outlooks, capabilities and legal rights.
It is probable, in fact, that Amazon’s move here will be the first but not the last. As it and other cloud providers attempt to reconcile customer demand with the lack of legal restrictions against creating projects like the “Open Distro for Elasticsearch” they are likely to come to the inescapable conclusion that their business interests lie in doing so. Just as inescapable, apparently, is the conclusion on the part of commercial open source providers that the cloud represents such a threat that the boundaries of open source must be pushed.
The only real question, in fact, is whether this experience will have a deterrent effect – whether other commercial open source vendors will look at Elastic’s situation, which finds the company now competing with Amazon not just in product but in open source, and determine that the returns from some of the more controversial licensing approaches simply don’t justify the costs.
More likely, however, is that the status quo persists. The incentives and motivations of both parties are clear, understandable and logical within the context of their respective models – models which are, and will continue to be, intrinsically at odds even as they’re inextricably linked.
One hundred years ago, the leaders of dozens of countries decided to march into a conflict that they knew would be costly, horrifically damaging and unlikely to produce a real winner. They did so because they didn’t see any other choice.
The technology industry doesn’t seem to either.
Disclosure: Amazon and Elastic are RedMonk customers, as are Google and Microsoft. Expedia and Netflix are not RedMonk customers.