Sixty employees of Belo, a Spain-based company, were cut off from Anthropic's services, paralyzing the entire workflow that relied on Claude assistants.
“Never put all your eggs in one basket,” Patricio Molina, CEO of the Spain-based financial consulting company Belo, wrote on X last weekend, after Anthropic cut off his company's access to its services without any explanation.
“Claude AI wiped out our entire organization, more than 60 accounts belonging to a legitimate company, without giving a reason or any explanation. The only way to appeal the decision is to fill out a Google Form? Terrible user experience and customer service,” Molina posted on X later, alongside the automated letter sent by Anthropic. In the letter, Anthropic did not specify which rules had been violated or how.
Logo of Anthropic's Claude AI application on a phone. Image: Luu Quy
On social media, criticism poured in against Anthropic, with users saying the company disregarded its customers. Some said they were facing the same situation and had filed appeals through the Google Form months ago without any resolution.
Commenters also noted that the Belo incident is a lesson for any company that depends too heavily on platforms such as Anthropic's Claude or OpenAI's GPT. Some users suggested that Molina run his own models locally, or use multiple AI models simultaneously rather than relying on a single platform, even if that means duplicating some or all of the work.
To date, Anthropic has not commented. Belo's access was restored after 15 hours of disruption, but it is unclear whether the company intervened or whether the reversal was prompted by the public backlash.
Anthropic, headquartered in the US, was founded in 2021 by a group of former OpenAI employees, including siblings Dario and Daniela Amodei. The startup focuses on developing safe, reliable, and controllable AI models, competing directly with OpenAI. Its flagship product line is Claude, launched in 2023 and designed for natural-language processing, content writing, data analysis, and question answering, serving both military and civilian uses. However, at the end of February, the Pentagon classified Anthropic as a “threat to the supply chain,” and the US government has asked federal agencies to stop using its AI.