

The problem is that not everyone who wants to move can. Picking up everything and moving to New England or the West Coast is not viable for people making minimum wage.
And sharks are older than the rings of Saturn.
Do you need the space? If not, who cares.
Personally, I run a media service for friends and family. I’m about to bring another 100 TB online because we are running low on storage. Am I hoarding, or just running a rack of servers in my basement?
Hell, five nines is doable with EKS, a single engineer, and thinking through your changes before pushing them to prod. Ask me how I know…
Full disclosure - my background is in operations (think IT), not AI research, so some of this might be wrong.
What’s marketed as AI is usually something called a large language model (LLM). The distinction matters because “AI” implies intelligence, whereas an LLM is something else. At a high level, LLMs break natural language apart into elements called “tokens” that a machine can work with, then recombine those tokens to “create” something new. When an LLM is generating output it does not know what it is saying; it only knows which token is statistically likely to come after the token(s) it has generated already.
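To make that concrete, here’s a toy sketch of the idea in Python, heavily simplified: a “model” that only knows which word statistically follows which, learned by counting pairs in a tiny made-up corpus. Real LLMs do the same kind of next-token prediction at vastly larger scale, over sub-word tokens and with neural networks instead of raw counts; the corpus and word-level tokens here are just for illustration.

```python
from collections import Counter, defaultdict

# Tiny made-up "training data", split into word-level tokens.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(token, length=8):
    """Greedy decoding: always emit the statistically most common
    continuation of the last token. No meaning, just counts."""
    out = [token]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

# Produces fluent-looking word soup; the "model" has no idea what it said.
print(generate("the"))
```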
So, to answer your question: an LLM can hallucinate because it does not know the answer. It’s using advanced math to know that the period goes at the end of the sentence and not in the middle.
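Continuing the sketch above, you can dump the toy model’s entire “knowledge” about the word “the” as a probability table. If the training data happens to make a wrong continuation the most frequent one, the model will produce it just as confidently as a right one; scaled up enormously, that is roughly what a hallucination is.

```python
# Reusing the `follows` table from the sketch above: everything the
# model "knows" about the word "the" is this probability table.
total = sum(follows["the"].values())
for word, count in follows["the"].most_common():
    print(f"P({word!r} | 'the') = {count / total:.2f}")
# The most probable continuation wins, whether or not it is true.
```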