Why Did Alexa Lose The AI Race? Here's What Former Amazon Machine Learning Scientist Has To Say
According to former Amazon Alexa machine learning scientist Mihail Eric, Alexa AI was riddled with technical and bureaucratic problems. Here is his account of what went wrong.
Amazon's AI assistant Alexa entered the market well before most other AI products had made a name for themselves. But despite that huge head start, it failed to maintain its momentum. Somewhere along the way Amazon lost track of the game, and other companies such as OpenAI, Microsoft and Google came along to conquer the landscape. Many stories are going around about why Amazon lost a battle it once seemed set to win, but one account comes from a former senior machine learning scientist at Alexa AI.
That scientist, Mihail Eric, posted on X his account of why Alexa lost.
How Alexa dropped the ball on being the top conversational system on the planet
A few weeks ago OpenAI released GPT-4o ushering in a new standard for multimodal, conversational experiences with sophisticated reasoning capabilities.
Several days later, my good friends at PolyAI…
— Mihail Eric (@mihail_eric) June 11, 2024
Why Alexa Lost: Mislabelling Of Data
According to Eric, Alexa AI was riddled with technical and bureaucratic problems. He said that Alexa placed a huge emphasis on protecting customer data, with guardrails in place to prevent it from being leaked or accessed improperly. That was a crucial practice, he added, but it had one consequence: the internal infrastructure available to developers was agonisingly painful to work with.
According to him, it would take weeks to get access to any internal data for analysis or experiments. Not only was the data poorly annotated, the documentation was either nonexistent or stale. He further said that experiments had to be run in resource-limited computing environments, giving the example of training a transformer model when all you can get hold of is CPUs.
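To put that constraint in perspective, here is a minimal, purely illustrative PyTorch sketch (not Amazon's internal code; the model size, sequence length and batch size are made up) of what training even a small transformer encoder on CPU looks like. At realistic model sizes, each step takes orders of magnitude longer than on GPUs.

import time
import torch
import torch.nn as nn

# Illustrative sketch only: forcing training onto CPU because no GPUs are available.
device = torch.device("cpu")

encoder_layer = nn.TransformerEncoderLayer(
    d_model=256, nhead=8, dim_feedforward=1024, batch_first=True
)
model = nn.TransformerEncoder(encoder_layer, num_layers=6).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Dummy batch: 32 sequences of length 128 with 256-dimensional embeddings (hypothetical sizes).
x = torch.randn(32, 128, 256, device=device)

start = time.time()
for step in range(10):
    optimizer.zero_grad()
    out = model(x)
    loss = out.pow(2).mean()  # placeholder objective, just to drive backprop
    loss.backward()
    optimizer.step()
print(f"10 CPU training steps took {time.time() - start:.1f}s")

Even this toy model crawls on CPU; scaling it to production-sized data and parameters is what Eric describes as agonising.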
He went on to share the story of when his team ran an analysis demonstrating that the annotation scheme for a subset of utterance data was completely wrong. His team's point was that it was producing incorrect data labels, which meant that for months the internal annotation team had been mislabelling thousands of data points every single day. When he attempted to get the annotation taxonomy changed, he discovered that modifying even the tiniest part of it would require far more effort than he had thought.
He had to get the Product Manager on board, then their manager's buy-in, then submit a preliminary change request, then get that approved (a multi-month-long process end-to-end).
The reason it got stuck, as per Eric, was that there was no incentive for the Product Manager to fix it: there was no promotion story here. The only reason for the Product Manager to act was that “it’s scientifically the right thing to do and could lead to better models for some other team.” With no incentive, no action was taken.
Why Alexa Lost: Fragmented Organisational Structure
He then talked about how Alexa's organisational structure was decentralised by design, meaning there were multiple small teams working on sometimes identical problems across geographic locales. He said teams scrambled to get their work done to avoid being reorganised and subsumed into a competing team.
The consequence, as per the scientist, was an organisation plagued by antagonistic middle managers who had little interest in collaborating for the greater good of Alexa and only wanted to preserve their own areas of operation.
He narrated the story of a time when he, along with other teams, was coordinating a project to scale out large transformer model training. Done correctly, it could have been the genesis of an Amazon ChatGPT (well before ChatGPT was released).
He said, "Our Alexa team met with an internal cloud team which independently was initiating similar undertakings. While the goal was to find a way to collaborate on this training infrastructure, over the course of several weeks there were many half-baked promises made which never came to fruition. At the end of it, our team did our own thing and the sister team did their own thing. Duplicated efforts due to no shared common ground. With no data, infrastructure, or lesson sharing, this inevitably hurt the quality of produced models."
Why Alexa Lost: Product-Science Misalignment
In his tweet, he wrote, "Alexa was viciously customer-focused which I believe is admirable and a principle every company should practice. Within Alexa, this meant that every engineering and science effort had to be aligned to some downstream product. That did introduce tension for our team because we were supposed to be taking experimental bets for the platform’s future. These bets couldn’t be baked into product without hacks or shortcuts in the typical quarter as was the expectation. So we had to constantly justify our existence to senior leadership and massage our projects with metrics that could be seen as more customer-facing."
He then gave an example and said, "For example, in one of our projects to build an open-domain chat system, the success metric (i.e. a single integer value representing overall conversational quality) imposed by senior leadership had no scientific grounding and was borderline impossible to achieve. This introduced product/science conflict in every weekly meeting to track the project’s progress leading to manager churn every few months and an eventual sunsetting of the effort."