I agree with the other user that it sounds like user error. Or perhaps you’ve not really used them at all, and just have joined the AI hate bandwagon.
Cry about it. Crypto bros make the same excuses to this day. Prove your bullshit works before you start shoving it in my face. And yes, LLMs are really unhelpful. There’s extremely little value you can get out of them (outside of generating text that looks like a human wrote it, which is what they are designed to do) unless you are a proper moron.
You sound like an old man yelling about the TV. LLMs are NOT unhelpful. You’d know this if you actually used them.
I’ve used them and have yet to get a fully correct result on anything I’ve asked beyond the absolute basics. I always have to go in and correct some aspect of whatever they shit out. Scraping every bit of data they can get their hands on is only making the problem worse.
To say you’ve never gotten a fully correct result on anything has to be hyperbole. These things are tested. We know their hallucination rate, and it’s not 100%.
Please read the entire comment. Of course it can answer simple stuff. So can a Google search. It’s overkill for simple shit.
In all of your replies, however, you fail to provide a single example. Are they writing code for you, or creating shitty art for you?
I have used them in a large variety of ways, from general knowledge seeking to specific knowledge seeking, writing code, generating audio, images, and video. I use them most days, if not every day. What examples would you like me to provide? Tell me and I will provide them.