Holy shit, you’re too far gone. Yes, I’ve run my own local LLM front end for personal use, and I’ve frequently run models at a low level with no front end at all, tweaking the sampling parameters directly. I’ve modified models too, and I have a server built to run them. It’s insane that you think any of that applies to this topic (billionaires lying to you). You’re just throwing out random terminology to make yourself seem smart and to reinforce your stance, but all that does is make you look insecure about your own knowledge. It achieves the opposite of what you intend.
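Since you apparently think "low level with no front end" is just terminology: it looks something like this. A minimal sketch assuming llama-cpp-python and a local GGUF file; the model path, prompt, and parameter values here are illustrative, not anything specific I've run.

```python
# Running a local model with no front end: just the inference library,
# a GGUF file on disk, and the sampling parameters set by hand.
from llama_cpp import Llama

# Constructor controls how the model is loaded (context size, GPU offload).
llm = Llama(
    model_path="./models/model.gguf",  # hypothetical local path
    n_ctx=4096,                        # context window
    n_gpu_layers=-1,                   # offload all layers to GPU if available
)

# Every sampling knob is exposed directly -- no UI in between.
out = llm(
    "Q: Why do LLMs hallucinate?\nA:",
    max_tokens=128,
    temperature=0.7,     # flatter or sharper next-token distribution
    top_p=0.9,           # nucleus sampling cutoff
    top_k=40,            # restrict to the k most likely tokens
    repeat_penalty=1.1,  # discourage loops
)
print(out["choices"][0]["text"])
```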
All LLMs do is hallucinate; sometimes the hallucination happens to match reality, and we call that a correct answer. This is why it’s impossible to get rid of hallucinations: producing a wrong answer and a right one is the exact same operation, predicting a plausible next token, so there is no separate failure mode to patch out. They do not think, they do not have goals of their own (a refusal can be baked in, but that’s no different from programming guardrails into anything else), and they certainly will not suddenly become sentient. LLMs cannot do that, by design. Perhaps some other architecture could, but LLMs are not it.
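To be concrete about why there's no failure mode to patch: the entire generation loop is softmax over logits, then sample. Nothing in it checks a claim against reality. A minimal sketch assuming Hugging Face transformers, with gpt2 standing in for any causal LLM (model choice, prompt, and temperature are illustrative):

```python
# One step of "generation": score every vocabulary token, reshape the
# distribution with temperature, sample. No fact-check happens anywhere.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tok("The capital of Australia is", return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(ids).logits[0, -1]        # a score for every token in the vocab

probs = torch.softmax(logits / 0.8, dim=-1)  # temperature reshapes the distribution
next_id = torch.multinomial(probs, 1)        # sample one token from it
print(tok.decode(next_id))                   # plausible continuation, true or not
```

A "correct" completion and a "hallucinated" one come out of this identical loop; the only difference is whether the sampled tokens happen to describe the world.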
You really had to go to insults for this? I think you need to touch grass and stop believing billionaire marketing. LLMs are not technologically capable of doing what you have been brainwashed to believe. The bubble will crash the economy when it bursts, because MBAs and billionaires have convinced you, and rich VCs, that LLMs can do more than they actually can.