Everyone in business supposedly knows about big data by now, and about its impact on the world we live in.
Big data is such a big thing that translation technology and service vendors are trying hard to adapt their selling proposition to it.
The problem with translation data, even more than with language data in general, lies in its amount and depth.
The US retailing giant Wal-Mart handles over 1 million customer transactions an hour and imports them into databases estimated to contain more than 2.5 petabytes of data.
At the other end of the trade, Rolls-Royce uses thousands of sensors to identify issues before they arise, charges customers based on engine usage time, and repairs engines and replaces parts when a problem occurs. This dynamic servicing option now accounts for 70% of the civil-aircraft engine division’s annual revenue.
The first ten LSPs in Common Sense Advisory’s Top 100 list barely represent a 20% share of the whole translation service market. On the basis of the estimated value of the language service industry, approximately USD 40B, and the recently assessed market average rate of USD 0.21 per word, the Top 100 LSPs process approximately 3.8 billion words per year.
This number roughly corresponds to 1.9% of the words that Google Translate processes every day, to 0.005% of its volume on a yearly basis, or to 19% of the monthly volume SDL claims its machine translation engines now translate.
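A quick back-of-envelope sketch, using only the figures quoted above, shows the volumes those percentages imply. The Google and SDL numbers below are derived from the article's own ratios, not independently sourced:

```python
# All input figures come from the text; derived volumes are implications, not sourced data.
TOP100_WORDS_PER_YEAR = 3.8e9  # words/year processed by the Top 100 LSPs

# The text says this equals ~1.9% of what Google Translate handles per DAY...
google_words_per_day = TOP100_WORDS_PER_YEAR / 0.019   # ≈ 2.0e11 words/day

# ...and ~19% of SDL's claimed MONTHLY machine-translation volume.
sdl_words_per_month = TOP100_WORDS_PER_YEAR / 0.19     # ≈ 2.0e10 words/month

# Yearly share of Google's implied volume:
yearly_share = TOP100_WORDS_PER_YEAR / (google_words_per_day * 365)

print(f"Implied Google volume: {google_words_per_day:.1e} words/day")
print(f"Implied SDL volume:    {sdl_words_per_month:.1e} words/month")
print(f"Yearly share of Google: {yearly_share:.4%}")   # ≈ 0.0052%, the ~0.005% quoted
```

The derived yearly share of roughly 0.005% is consistent with the figure in the text, which is the point: the entire output of the Top 100 LSPs is a rounding error next to machine translation volumes.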
This means that, when invoked in connection with translation, big data mostly serves as marketing fanfaronade, possibly to satisfy the ambition to leave a mark.
Moreover, translation data — i.e. project data — has a limited lifespan and, at some point in time, it becomes outdated, possibly inaccurate, and definitely irrelevant.
Most importantly, translation companies traditionally show little understanding of business data, and research showing that businesses use, on average, less than 5% of the information available to them when making decisions suggests the picture in translation may be even bleaker.