sublinear 9 hours ago

> "It's actually quite similar to some of the supply chain attacks we've seen before [...] you're trying to trick somebody who's doing some vibe coding into using the wrong API."

I have renewed faith in the universe. It totally makes sense that vibe coding would be poisoned into useless oblivion so early in this game.

  • cryptoegorophy 3 hours ago

I confess to vibe coding, especially with API work. And it has gotten so bad that I actually have to send API PDFs and links to the API documentation myself.

  • TZubiri 4 hours ago

    I feel a bit icky, but whenever I see people work with such a disregard for quality, I'm actually rooting for their products to break and get hacked.

    In 2015 it was copy and pasting code from stackoverflow, in 2020 it was npm install left-pad, in 2025 it's vibecoding.

    I refuse to join them, and I patiently await the day of rapture.

    • andrei_says_ 3 hours ago

It may already be here, but in large moneyed organizations no one wants to take responsibility, and speaking against management’s orders to lean in on AI may be political suicide.

So a lot of people are quietly watching lots of ships slowly taking on water.

  • labrador 5 hours ago

    A cynic. I like it.

Kesseki 9 hours ago

This is, in turn, making the world of comment and forum spam much worse. Site operators could tag all user-submitted links as "nofollow," making their sites useless for SEO spammers. But spammers have learned that most LLM content scraper bots don't care about "nofollow," so they're back to spamming everywhere.
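The "nofollow" tagging described above is just a `rel` attribute added to each user-submitted anchor. A minimal Python sketch of the idea (the regex rewrite and the spam URL are illustrative; a production site would use a proper HTML sanitizer rather than a regex):

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every anchor tag in user-submitted HTML.

    Illustrative only: regexes are fragile against arbitrary markup,
    so a real site should use an HTML sanitizer instead.
    """
    def tag(match: re.Match) -> str:
        attrs = match.group(1) or ""
        if "rel=" in attrs:
            return match.group(0)  # don't clobber an existing rel attribute
        return f'<a{attrs} rel="nofollow">'

    return re.sub(r"<a(\s[^>]*)?>", tag, html)

# Hypothetical spammy comment link:
print(add_nofollow('<a href="https://spam.example">cheap pills</a>'))
# → <a href="https://spam.example" rel="nofollow">cheap pills</a>
```

The catch, as noted above, is that the attribute is only a hint: a well-behaved search crawler honors it, but nothing forces an LLM scraper to.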

  • labrador 5 hours ago

It reminds me of low-background steel, the kind you can only get from ships sunk before the atomic bomb. Someday, we’ll be scavenging for clean data the same way: pre-AI, uncontaminated by the explosion of AI junk.

  • mananaysiempre 8 hours ago

I’m not sure whether, even for traditional search engines, “nofollow” means that the scraper doesn’t follow the link at all, or that it merely excludes the link from the PageRank graph (or whatever they use) while still using it to discover new pages. (Of course, LLMs are far too impenetrable for such a middle ground to exist.)

ttoinou 6 hours ago

LLMs don't seem to hallucinate my niche products and company (sometimes they get the name of the company behind the product wrong, but not the product name or the company URL), and according to Cloudflare Radar my domain only ranks somewhere between the top 200,000 and 500,000: https://radar.cloudflare.com/scan

flufluflufluffy 3 hours ago

> “Crims have cottoned on to a new way to lead you astray”

Was - was the article written by AI?

boleary-gl 9 hours ago

I wonder if Cloudflare's new plan for blocking AI from scraping the "real" sites...

  • zahlman 8 hours ago

    You wonder if the plan is (or does) what?

9283409232 9 hours ago

I've been using Phind lately and I think they do a really good job avoiding this problem. I don't think I've ever run into a fake URL using it. As a search engine, I think I still prefer Kagi, but Phind is great if you want a free option.

kristopolous 9 hours ago

It's wild how many of the links are hallucinations.

Maybe the error rate is consistent with everything else, and we just eisegete our way into thinking it's not.

  • whatsgonewrongg 3 hours ago

I’ve had Claude hallucinate options for a lot of things, most recently DOM method arguments and Wrangler configuration. They're very reasonable things you might expect to exist. But they don’t.

    I must be holding it wrong. How are people working with these tools to produce quality software?

    • DanAtC 2 hours ago

      In case you're not being sarcastic: they're not.