Is There Life After Blockchain? (Part 2)

This is the second part of a two-part article. The first part is available here.

Dilbert on workflow

Inequality

UN organizations have often been criticized for misusing funds, specifically for allocating the larger share to their own maintenance rather than to institutional projects.

Building Blocks marks a sharp turn toward maximizing the effectiveness of allocation and expenditure of WFP funding, primarily through a major reduction of costs and overhead. Also, WFP’s Building Blocks shows that blockchain is not just “essentially a safe way to exchange pretty much anything on the Internet with someone you don’t know”, much less a tool to “empower end users to go P2P with linguists”. Finally, WFP’s Building Blocks shows that trustworthiness is a fundamental and indispensable prerequisite in every transaction, at least in the real world of everyday life. In other words, refugees can buy goods on credit because WFP guarantees payment. And the inequality of privileging larger stores over smaller retailers was a major issue even before the introduction of blockchain, indeed possibly a bigger one.

Out of curiosity, no WFP exists in the translation industry, does it? So, who would be just as reliable as WFP to guarantee all transactions? Certainly not an ordinary LSP, especially one with so little capital of its own that it needs an ICO.

Also, although reports of TM’s death are clearly exaggerated, they have long been appearing almost daily. Curiously, many visionaries share a convinced support for TM marketplaces. And for blockchain, of course.

Tertium non datur: either you keep convincingly collecting language data (most of it, of course, in some TM format of sorts) or you might as well stop trumpeting the centrality of data. And if they have just not been buried yet, TMs are going to stink, very soon. Actually, given the condition most of them are most likely in, they already do, although perfectly alive. So, what exactly is supposed to be exchanged in a TM marketplace, hopefully using blockchain?

Apparently, a translation marketplace must be some kind of El Dorado, judging at least by the many repeated attempts to establish a working one, even though practically no one has succeeded so far. So, it is very hard to see how yet another marketplace for trading some dead objects, using an obscure mechanism based on an even more obscure technology and some highly volatile cryptocurrency, could do any better. And it is even harder to see how the necessary flocks of translators and LSPs would run to crowd it, not to mention customers. And with no WFP-like organization backing it.

Among the reasons why marketplaces have not worked is that, like directories, they attract the same kind of people, and that LSPs mostly share and employ the same people. Also, all LSPs focus almost exclusively on sales, hence on BDMs, while neglecting their vendor base: seldom do they develop and pursue any plan to expand it, improve it, and refresh it. When they do, it is only to find someone cheaper. This, in turn, leads to Gresham’s Law, thus inexorably to a shortfall of qualified language specialists, and finally to the so-called “Bodo dilemma”.

Incidentally, this is very reminiscent of the case of agricultural produce and large-scale retail, where huge quantities are required in very short times at ludicrous prices, leaving very little margin to farmers. As an example of price pressure, in the 1980s a kilo of oranges was worth five times as much as it is today.

Large-scale retail is just a side effect of consolidation, which, in the view of laissez-faire, neoliberalism, and economic liberalism, is a pillar of growth, together with the notion that inequality helps growth. The final outcome is rampant recklessness and a great and growing inequality, witnessed by the many cases of insolvency and bankruptcy among LSPs as a result of challenging financial situations, the other side of the consolidation coin.

These effects also apply to the ‘minor’ industry of translation and localization, with those at the bottom having less chance than ever to climb the ladder.

Stuck in the Middle

On the other hand, in today’s economy, economies of scale are imperative, even where the output is believed to be a service rather than a product. Economies of scale require a ‘critical mass’, though, hence the quest for consolidation.

Anyway, without advanced and flexible production processes and support systems, size and territorial distribution do not allow for economies of scale; they only generate overhead and low productivity.

If one really thinks of pulverization as the main problem of the industry, then the question should be where it comes from and why. Pulverization is the ultimate effect of the typical attitude of bigger customers, with their long-standing procurement practice of never relying on only one or two suppliers. This practice has led a multitude of dogs to crowd around one bone. The prevalence of the wilder dogs has led a dozen companies to divide up a quarter of the market, leaving the remains to a myriad of small and very small players, often more than happy with the little role they can play, perhaps at the request of some big shot.

The recent acquisitions of the world’s largest LSPs show that there is no longer any room for organic growth for the big ones, despite any statements to the contrary. Not surprisingly, they keep boasting roaring growth, but without providing any figures on profitability. Demand for services and technology may even grow, but costs grow too, while prices will continue to stagnate if not plummet, and compensation with them, thus pushing the best players out of the playing field for good.

The Data Deception

The pulverization of the industry has given rise to further issues. First of all, no single player is really big and vertical enough to deliver a truly large, current, and fully functional amount of data, especially language data, to use in any effective manner. And when and where this data exists, it is hardly put to use, possibly because no one knows how.

It comes as no surprise, then, that, with more than 99.99 percent of translation coming from tech giants’ machine translation systems, the data feeding them comes from sources other than the usual traditional industry players.

This should offer many an industry player the chance to focus on developing and curating business-purpose data, which would, however, be highly volatile as well.

Finally, another major issue is that of information asymmetry. To overcome it, beyond a fair proficiency in data management, deep linguistic competence and subject-field expertise are needed on both sides, to be sure that the dataset is relevant, reliable, and up to date. And even when data comes from web crawling, the challenge of selecting and using the right data sources persists.

All this brings us back to the inequality question: These tasks require significant resources that are not usually available to SMEs.

Therefore, if someone positively thinks that a marketplace is absolutely necessary and cannot be postponed any further, especially one specifically geared to data exchange, good luck: the money to invest in this endeavor must necessarily be their own, not anyone else’s, especially if it is collected without offering any guarantee.

Looking Forward

Disintermediation requires marketplaces where customers can find their ideal vendor by querying a database of profiles and offerings and dispatching jobs while the platforms provide the infrastructure for managing projects, billing, invoicing, and reporting. The same infrastructure might allow users on both sides of the fence to pick and use several MT engines simultaneously and linguists to exploit advanced features in translation tools like auto-complete, automatic correction, confidence scores, etc. Yes, all this is yet to come together.
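
Below is a minimal, purely illustrative sketch of the kind of vendor-matching and job-dispatching logic such a platform would need. Every name in it (VendorProfile, Job, find_vendors, dispatch) is hypothetical and does not refer to any existing marketplace’s API.

```python
from dataclasses import dataclass

# Hypothetical data model for a disintermediated marketplace: customers
# query vendor profiles by language pair and subject field, then dispatch
# a job to the best match. All names are illustrative only.

@dataclass
class VendorProfile:
    name: str
    language_pairs: set       # e.g. {("en", "it"), ("en", "de")}
    subject_fields: set       # e.g. {"legal", "medical"}
    rate_per_word: float      # in the platform's settlement currency
    rating: float             # aggregated customer feedback, 0..5

@dataclass
class Job:
    source_lang: str
    target_lang: str
    subject_field: str
    word_count: int
    assigned_to: str = ""     # filled in once the job is dispatched

def find_vendors(vendors, job, max_rate):
    """Return vendors matching the job, cheapest and best rated first."""
    matches = [
        v for v in vendors
        if (job.source_lang, job.target_lang) in v.language_pairs
        and job.subject_field in v.subject_fields
        and v.rate_per_word <= max_rate
    ]
    return sorted(matches, key=lambda v: (v.rate_per_word, -v.rating))

def dispatch(job, vendors, max_rate=0.12):
    """Assign the job to the top-ranked matching vendor, if any."""
    candidates = find_vendors(vendors, job, max_rate)
    if candidates:
        job.assigned_to = candidates[0].name
    return job
```

The same profile store could, in principle, also record which MT engines and tool features each vendor works with, which is what would let both sides pick and combine them as described above.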

Blockchain is still immature, complex, and expensive and hardly brings any enhancement to human experience, but it might help secure agreements and their execution through smart contracts. In the translation business this would not require any underlying cryptocurrency and could be of help especially in contexts involving microtransactions, possibly in conjunction with comprehensive SLAs.
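
As a rough illustration of how agreements and their execution could be recorded in a tamper-evident way without any cryptocurrency, here is a hash-chained ledger sketch. It only mimics the idea behind smart contracts; all names, fields, and values are invented for the example, and payment itself would still happen off-chain.

```python
import hashlib
import json
import time

# Illustrative only: a tamper-evident record of a translation agreement
# and its execution, hash-chained like a blockchain but with no
# cryptocurrency involved.

class AgreementLedger:
    def __init__(self):
        self.entries = []

    def _hash(self, entry):
        data = json.dumps(
            {k: entry[k] for k in ("timestamp", "payload", "prev")},
            sort_keys=True,
        )
        return hashlib.sha256(data.encode("utf-8")).hexdigest()

    def append(self, payload):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"timestamp": time.time(), "payload": payload, "prev": prev_hash}
        entry["hash"] = self._hash(entry)
        self.entries.append(entry)
        return entry["hash"]

    def verify(self):
        """Check that no entry has been altered after the fact."""
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev"] != prev or entry["hash"] != self._hash(entry):
                return False
            prev = entry["hash"]
        return True

# Record the agreement, the delivery, and the SLA check for one micro-job.
ledger = AgreementLedger()
ledger.append({"event": "agreement", "job": "J-001", "words": 1200, "rate": 0.08})
ledger.append({"event": "delivery", "job": "J-001", "sla_met": True})
assert ledger.verify()
```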

It is quite odd that, with less clunky alternative technologies emerging, this option has not been explored, in favor of a belated and misfocused project with very uncertain prospects.

Supporting Agile

The average size of localization and translation projects has been getting smaller and smaller for at least a decade. This has shifted costs from the actual linguist component to the project management side, sharply reducing profit margins. This trend is not going to stop; indeed, it is most likely to intensify due to the ever wider application of the agile methodology to software development, despite growing criticism.
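
A back-of-the-envelope calculation, with entirely invented numbers, shows why this shift hurts: a roughly fixed project-management cost per job weighs more and more as projects shrink.

```python
# Back-of-the-envelope illustration, with invented numbers: a fixed
# per-project management cost weighs more and more as projects shrink.

RATE_PER_WORD = 0.10       # revenue per word (assumed)
LINGUIST_SHARE = 0.60      # share of revenue paid to linguists (assumed)
PM_COST_PER_PROJECT = 40   # fixed project-management cost per job (assumed)

def margin(words):
    revenue = words * RATE_PER_WORD
    linguist_cost = revenue * LINGUIST_SHARE
    profit = revenue - linguist_cost - PM_COST_PER_PROJECT
    return profit / revenue

for words in (10_000, 2_000, 500):
    print(f"{words:>6} words -> margin {margin(words):6.1%}")

# Output:
#  10000 words -> margin  36.0%
#   2000 words -> margin  20.0%
#    500 words -> margin -40.0%
```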

Agile’s major outcome is the quickening, squeezing, and iteration of the software release life cycle. The intensity of iterations leaves wide room for errors that have to be fixed in later iterations.

While the software development community developed its own tools, like RAD environments, and design frameworks, like UML, to deal with the agile methodology, the localization and translation community just went in for some small talk at industry events, trailing behind major IT customers. Agile has thus become yet another fad and nothing more.

Only recently has something started to move, almost thirty years after the introduction of the methodology and long after continuous delivery and continuous deployment became standard practice. One or two web localization platforms have some mechanism in place to automatically collect strings and deliver them to localization project managers and localizers, but there is still a long, long way to Tipperary.
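
A minimal sketch of the kind of mechanism those platforms appear to implement, collecting only the strings that are new or have changed since the last hand-off, might look like the following. The file names and the dispatch step are assumptions, not any specific product’s workflow.

```python
import json
from pathlib import Path

# Minimal sketch of continuous string collection: diff the latest resource
# file against the last snapshot and hand only new or changed strings over
# to localization. File names and the dispatch step are assumptions.

RESOURCE_FILE = Path("en.json")       # latest developer strings (assumed)
SNAPSHOT_FILE = Path("en.last.json")  # strings already sent for translation

def collect_new_strings():
    current = json.loads(RESOURCE_FILE.read_text(encoding="utf-8"))
    try:
        previous = json.loads(SNAPSHOT_FILE.read_text(encoding="utf-8"))
    except FileNotFoundError:
        previous = {}
    # Keep only keys that appeared or whose text changed since the snapshot.
    return {k: v for k, v in current.items() if previous.get(k) != v}

def dispatch_to_localization(strings):
    # Placeholder for the real hand-off (TMS API call, queue, email, ...).
    for key, text in strings.items():
        print(f"queued for translation: {key} -> {text!r}")

if __name__ == "__main__":
    new_strings = collect_new_strings()
    if new_strings:
        dispatch_to_localization(new_strings)
        # Update the snapshot so the same strings are not sent twice.
        SNAPSHOT_FILE.write_text(RESOURCE_FILE.read_text(encoding="utf-8"),
                                 encoding="utf-8")
```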