OpenAI Bot Disrupts Small Business Website Like a DDoS Attack

A small e-commerce site faced major downtime due to OpenAI’s bot overwhelming its servers with requests

Kyiv: Triplegangers, a small e-commerce company, was recently knocked offline by an OpenAI bot. CEO Oleksandr Tomchuk discovered on a Saturday that the company’s site was down, and at first it looked like a DDoS attack.

In reality, the traffic came from an OpenAI bot attempting to scrape the entire site. Triplegangers lists more than 65,000 products, each with its own page and multiple photos, and the bot was firing off tens of thousands of requests to download all of it.

Tomchuk said the bot used around 600 different IP addresses, and the company is still combing through its logs to work out the full extent of the damage. The site was effectively crushed under the weight of the requests.

The website is Triplegangers’ entire business. The company has spent years building what it calls the largest database of “human digital doubles” on the web, selling 3D files and photos to artists and game developers who need realistic human features.

Triplegangers’ terms of service prohibit bots from taking its images, but that alone offered little protection. What the company actually needed was a properly configured robots.txt file telling OpenAI’s bot to stay away; without one, the bot treats the site as fair game to scrape.

Worse, OpenAI’s bots can take up to 24 hours to recognize changes to that file, so the scraping carries on in the meantime. Tomchuk had to scramble to get the configuration right after the bot had already caused chaos.
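For reference, blocking OpenAI’s crawler through robots.txt usually comes down to adding a rule for its published user agent token; GPTBot is the token OpenAI documents for its web crawler. A minimal sketch might look like this:

    # Tell OpenAI's crawler (user agent token: GPTBot) to stay off the whole site.
    # Other crawlers need their own User-agent entries.
    User-agent: GPTBot
    Disallow: /

Note that robots.txt is purely advisory; it only works if a crawler chooses to honor it, which is one reason site owners also turn to firewall- or CDN-level blocking.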

The crawl also hit during U.S. business hours, and Tomchuk is now bracing for a noticeably higher AWS bill because of all the extra server activity the bot generated.

Even after putting a proper robots.txt file in place and configuring Cloudflare to block the bot, Tomchuk has no way of knowing what data was already taken, and no way to reach OpenAI to ask. That is especially frustrating for a company whose products are digital doubles of real people and that therefore handles sensitive rights issues.

Triplegangers’ site is a goldmine for AI crawlers because it is packed with detailed, carefully tagged images. Ironically, had the OpenAI bot been less aggressive, the company might never have realized how exposed it was.

Tomchuk is warning other small businesses to watch for these bots. Many site owners don’t even know they have been scraped until it’s too late, and the problem is getting worse, with a sharp spike in invalid traffic caused by AI crawlers this year.
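For site owners who want to check, the web server’s access logs are the natural place to look, since crawlers such as GPTBot normally identify themselves in the request’s user agent field. The following is a minimal illustrative sketch, not anything Triplegangers describes using: it tallies requests per user agent from a standard combined-format access log, and the "access.log" path is a placeholder to adapt to your own setup.

    import re
    from collections import Counter

    # Matches the last double-quoted field on each line, which is the user
    # agent string in the common "combined" access log format.
    UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

    def count_user_agents(log_path):
        """Tally requests per user agent string in an access log."""
        counts = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = UA_PATTERN.search(line)
                if match:
                    counts[match.group(1)] += 1
        return counts

    if __name__ == "__main__":
        # "access.log" is a placeholder; point this at your server's log file.
        for agent, hits in count_user_agents("access.log").most_common(10):
            print(f"{hits:8d}  {agent}")

An unusually large request count for a single bot user agent is exactly the kind of pattern Triplegangers found in its own logs.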

Tomchuk likens it to a mafia shakedown: the bots will take what they want unless you have some kind of protection in place. He believes AI companies should be asking for permission rather than simply taking the data.
