• 0 Posts
  • 11 Comments
Joined 2 years ago
Cake day: December 21st, 2023

  • Yeah, but when it’s a total crapshoot whether its summary is accurate, you can’t trust it. I adblocked those summaries cause they’re useless.

    At least some of the competing AIs show their work. Perplexity cites its sources, and even ChatGPT recently added that ability as well. I won’t use an LLM unless it does, cause you can easily check the sources it used and see if the slop it spit out has even a grain of truth to it. With Gemini, there’s no easy way to verify anything it said beyond just doing the googling yourself, and that defeats the point.
  • Gemini once told me to “please wait” while it did “further research”. I responded with, “that’s not how this works; you don’t follow up like that unless I give you another prompt first”, and it was basically like, “you’re right but just give me a minute bro”. 🤦

    Out of all the LLMs I’ve tried, Gemini has got to be the most broken. And sadly, it’s the one LLM the average person is exposed to the most, because it’s in nearly every Google search.
  • This wouldn’t have happened if the dumbasses at the hospital in Bangkok had actually done their jobs and had a doctor check for vital signs, instead of merely taking the guy at his word, going with the assumption that she was dead, and rejecting her.

    Secondly, did nobody at the hospital find it even slightly odd that a dude just rolls in with a body and says, “she’s dead. Cremation, plz”? Instead they respond with, “produce a death certificate or fuck off”. What the actual fuck is going on over there in Thailand, that basic due diligence doesn’t even cross their minds? I would have been asking, “isn’t that your job, assholes?”