Which is the entire ideology of the cult around AI.
Y’all want a world that gives you nothing you ask for while we are powerless to do anything about it.
You are wrong, or at least you are wrong to identify your beliefs as being in the realm of rationality or science. What you are espousing is a set of religious beliefs in the power of AI that there is zero evidence AI will ever fulfill, and damn if it isn’t a lame and depressingly cynical religion.
It is clear when dealing with folks from your cult that proving AI is shit, and that it has fundamental problems and limitations, is irrelevant to the worldview of the cult. It just proves to whatever particular cultist you are talking to that whatever subbranch of the cult they are on is the True Sect, unlike the other subbranches stuck on the old teachings that aren’t really magical… and that the REAL AGI is just around the corner and this is a distraction.
It is the same nonsense problem you get when you start proclaiming dates for the end of the world in your religion and they keep inconveniently passing by without the world ending. To maintain your delusion you must divide up the religion and say “oh, it was that Sect over there that was wrong; we have the true knowledge!” Rinse, repeat.
I am fine with you having different spiritual beliefs than me; just don’t waste everyone’s time by trying to force people into thinking your religion is reality.
No one cares who isn’t already part of your cult.
Sora shut down because this tech has moved on to local models running near real-time.
No, Sora shut down because AI is a bullshit business model that doesn’t produce anything of consistent, useful value other than the obfuscation of theft or responsibility, while it consumes a vast amount of resources to accomplish what a moderate number of humans with food, water, shelter and love could make far more efficiently and with far more soul.
I regularly hear normal people describe shitty, fake things that come off hollow as “like AI”. You can see what a false future AI is in how little people like what it makes, almost as a rule. The reason normal people hate AI is that it is so suffocatingly often a boring black box that spits out sloppy, unoriginal crap chopped up from stolen human labor, something most normal people are used to identifying in the hollow structures of society around them.
I am describing how video-to-video models are better at ‘change this one thing’ than ‘make up a whole scene.’ It’s not metaphysics. It’s CGI for dummies.
Local models run on the same power draw as a video game, and some can process ten seconds of footage every five seconds. The best use - because ‘change this one thing’ works better - is processing things humans made the usual way. E.g., real actors on cardboard sets, and other things I actually said.
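For what it’s worth, the throughput claim above is just a real-time factor, and the arithmetic is easy to check; a minimal sketch, using the thread’s claimed numbers rather than any measurement:

```python
def realtime_factor(footage_seconds: float, wall_seconds: float) -> float:
    """Seconds of output video produced per second of wall-clock processing time."""
    return footage_seconds / wall_seconds

# The claim above: ten seconds of footage processed every five seconds.
rtf = realtime_factor(10.0, 5.0)
print(rtf)         # 2.0, i.e. twice real-time
print(rtf >= 1.0)  # True: at or above 1.0 is what "near real-time" means here
```

Anything at or above 1.0 keeps up with playback, which is all “near real-time” amounts to.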
You’re having a much more contentious conversation inside your head. Please stop projecting traits you’d rather be arguing against when someone points out that it does the thing it’s for. It demonstrably functions. You could make a coherent moral argument about how it was made, but you haven’t. You’ve railed against an imaginary frothing psychopath because someone politely described utility.
How it was made is addressable, by the way. It’s fixable. There will be vegan models made from bespoke, licensed, and public-domain data. Will that change your opinion in any way? If not, that complaint is decorative.
Okay, here’s reality from the recent past: some guy recreated GPT-2 for $20. Same size, similar training data, equal performance. The original required VC funding. This guy spent pocket change. That was a year ago. That’s how much efficiency has already improved for training these models. These assholes only spend billions because it’s exclusionary, and they’re all caught in a dollar auction to see who can lose the most gently. I’m sorry any hypotheticals about that are incompatible with your moral crusade.
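The “dollar auction” mentioned here is a real game-theory setup: both top bidders pay their bids, so walking away always forfeits sunk money and escalation ruins everyone. A toy sketch, where the naive bid-to-a-cap behavior is a simplifying assumption and not a model of any actual company:

```python
def dollar_auction(prize: int = 100, increment: int = 5, budget: int = 500):
    """Toy dollar auction in cents: both top bidders pay, only the winner gets the prize.

    Two naive bidders keep topping each other by `increment` until the next bid
    would exceed their budget. Returns (winner's net loss, loser's net loss).
    """
    bids = [0, 0]  # standing bid per bidder, in cents
    turn = 0
    while max(bids) + increment <= budget:
        bids[turn] = max(bids) + increment  # top the rival's standing bid
        turn = 1 - turn                     # the other bidder responds
    winner = 0 if bids[0] > bids[1] else 1
    loser = 1 - winner
    # Winner pays their bid but gets the prize; loser pays their bid for nothing.
    return bids[winner] - prize, bids[loser]

winner_loss, loser_loss = dollar_auction()
print(winner_loss, loser_loss)  # 400 495: both sides lose several times the $1 prize
```

Both bidders end up out several times the prize’s value, which is the “who can lose the most gently” dynamic in miniature.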
Meanwhile, it does the thing.
That’s not going to change and you kind of have to deal with it. We now have programs that just do what you ask, for any output that’s text, images, audio, or video. They often fuck up in horrifying ways. But they’re usually about what you asked for. Especially if you asked for very little. That’s quite useful where small changes are wildly complex, like ‘make this guy look like another guy.’ The robot won’t do it as good as a team of human professionals, but I don’t have a million dollars to hire a team of human professionals, and I’m betting you don’t either. You can still consider projects that involve making one guy look like another.
They often fuck up in horrifying ways. But they’re usually about what you asked for. Especially if you asked for very little. That’s quite useful where small changes are wildly complex, like ‘make this guy look like another guy.’ The robot won’t do it as good as a team of human professionals, but I don’t have a million dollars to hire a team of human professionals, and I’m betting you don’t either. You can still consider projects that involve making one guy look like another.
I am an artist, so I understand that when I have the shallow desire to make something into a copy of another thing, and my artistic capability fails me or my lack of resources keeps me from reaching my initial vision, this is the true beginning of my artistic journey. All of that stuff before was just a way of backing myself into wanting something new or changed when I couldn’t get the perfect thing that was in my head. AI is, more than anything else, an attempt to seduce human beings into pushing this artistic genesis point of humility, listening and growth further and further away, which is in a way another way of explaining why AI so often leads people into psychosis.
I have also done lots of community theater, so I understand the foolishness of thinking that the important part of making one thing look like another is aesthetic mimicry rather than capturing the minimal, potent essence of something so it can be received as a far more intense experience for the audience than a perfect copy could be. Theater is a memory, not a photograph, and you are pointing to how amazingly AI can fabricate high-quality facsimiles of photographs as if that doesn’t insult the complexity of how the human brain approaches something it is invited to interpret.
The human brain was designed to see a vivid memory of a hunt in a couple of paint marks on a cave wall; the whole approach of AI and AI cultists deeply insults that magical relationship the human brain has with the most mundane, minimal arrangements of sensation.
Do you think that, for all the years audiences have watched Shakespeare plays in which two actors played characters supposed to be easily mistaken for one another as a key part of the plot, those audiences were getting a suboptimal experience because the two actors didn’t look perfectly alike?
Do you care that in Hamnet, two siblings who are supposed to look so alike that they are frequently mistaken for one another, even by Death itself, don’t actually look that similar? No: they are child actors who did an amazing job, and to care about that in the context of Hamnet’s achievement is shallow and misses the point. You could presumably use AI to “fix” this part of Hamnet (and see Hamnet as Death experienced it, as AI would undoubtedly portray it), and everybody would hate you for it if you did…
My point is that even when AI is good at particular things, the whole approach is often hollow with respect to the Why. This is something artists could have explained easily to techbros if they ever listened, because the Why is the whole point.
Denouncing the pursuit of verisimilitude is a novel way to hand-wave CGI. Are you this philosophical when a movie does spend a million dollars to make two unrelated actors look exactly the same? Should audiences be happier if a no-budget sci-fi film has cardboard displays? It’s cute, certainly. But when a central complaint is that people will notice generated elements and object to low quality, I think they’re gonna notice literal cardboard.
Films are photographs. That’s why The Social Network didn’t just say the Winklevii were twins and expect people to pretend. Movies are a visual medium, whereas theater is mostly heard. Like how television has viewers but theater has an audience. You can Dogville it, and people will roll with that, but anything that looks fake is more commonly a technical failure than a stylistic choice.
So yes, you can tell people the tin can is a spaceship… but they’d rather be shown. The preference for showing over telling is so ingrained that it’s cliche. Nobody needs to announce ‘we lay our scene in fair Verona’ when you can put the Mediterranean coastline onscreen, and then cut to a cobblestone village where people have pointy shoes. Folks will get it. They’ll get it on a level deeper than narration, or an overlay reading “Verona, Italy, 15° E, 40° N, June 17th 1435, 0700 hours.” They’ll get it even if the aerial shot of the coastline was bought as stock footage. Or rendered, in one way or another.
but anything that looks fake is more commonly a technical failure than a stylistic choice.
Your lack of media literacy is wild. Film is entirely an honest fabrication of obvious fakes; that is the basis of cinema: the fundamental concept of the movie screen being itself simply a fake window that is honest with you about the speculative nature of the world revealed beyond.
Movies don’t convey impossible things by actually creating them; they present destabilized artifice from perspectives that invite us to see the mundane everywhere as a facade disguising something quivering underneath.
So yes, you can tell people the tin can is a spaceship… but they’d rather be shown. The preference for showing over telling is so ingrained that it’s cliche. Nobody needs to announce ‘we lay our scene in fair Verona’ when you can put the Mediterranean coastline onscreen, and then cut to a cobblestone village where people have pointy shoes. Folks will get it. They’ll get it on a level deeper than narration, or an overlay reading “Verona, Italy, 15° E, 40° N, June 17th 1435, 0700 hours.” They’ll get it even if the aerial shot of the coastline was bought as stock footage. Or rendered, in one way or another.
You almost make a coherent point here but then you topple your entire logic.
The first lesson you learn as a writer is to show, not tell, and the first lesson you learn as an artist working with video is that telling is desperately hard to avoid with a video camera, because at its heart that is all moving images can do moment to moment, unlike words untethered from direct sensation.
Thus the true skill of an artist working with photographs or video is how they continuously subvert the tendency of images to exhaustingly tell instead of show.
This is kind of a basic aspect to an exploration of movies as art…?
Whether it be documentaries grappling with the inherent paradox that the production of a documentary affects, and tells upon, what it is attempting only to honestly show, or fictional films constantly avoiding the catastrophe of the audience attending only to the literal quality of the thing presented to them scene to scene, it is all the same existential question.
The only “cult” around AI is the anti-AI cult. The rest of us just acknowledge the reality that AI is now an everyday tool to use, and it’s revolutionising the world at a breakneck pace. It’s turning entire industries on their heads, doing in minutes for a handful of dollars what a year ago took 300 people millions of dollars to do.
The genie is out of the bottle, and it’s not going back in ever again. The genie is only going to get more powerful by leaps and bounds. The anti-AI crowd are going to be left behind, unemployed, and even worse for them - unemployable.
The rest of us just acknowledge the reality that AI is now an everyday tool to use, and it’s revolutionising the world at a breakneck pace.
Except it isn’t? Most of us who don’t worship techbros like you don’t think highly of the quality of AI’s output. It has become common parlance for people to describe fake, hollow-feeling things as “like AI”, and I agree with the aesthetic label; y’all are just too blind to see it while you try to force it down our throats. We are amid a massive economic bubble with AI that is about to burst, given that almost no AI companies are profitable and they consume an incredible amount of energy.
You are fantasizing about a religion; great, you can believe in whatever you want, but stop making a clown of yourself by pretending that what you are espousing isn’t a set of religious beliefs with no hard evidence to support the magical thinking they demand.
No one is “worshipping tech bros” 🤣. Most people like what AI is bringing to the world. It makes lots of jobs infinitely easier. It opens new doors for people, doors that previously were locked shut with a million padlocks and booby traps.
Like I said, the only cult-like behaviour is from people like you.
These things aren’t getting built, or if they’re getting built, it’s taking way, way longer than expected, which means that interest on that debt is piling up. The longer it takes, the less rational it becomes to buy further NVIDIA GPUs — after all, if data centers are taking anywhere from 18 months to three years to build, why would you be buying more of them? Where are you going to put them, Jensen?
This also seriously brings into question the appetite that private credit and other financiers have for funding these projects, because much of the economic potential comes from the idea that these projects get built and have stable tenants. Furthermore, if the supply of AI compute is a bottleneck, this suggests that when (or if) that bottleneck is ever cleared, there will suddenly be a massive supply glut, lowering the overall value of the data centers in progress…which are, by the way, all filled with Blackwell GPUs, which will be two or three-years-old by the time the data centers are finally turned on.
I also wonder whether the demand actually exists to make any of this worthwhile, or what people are actually paying for this compute.
If we assume 3GW of IT load capacity was brought online in America, that should (theoretically) mean tens of billions of dollars of revenue thanks to the “insatiable demand for AI” — except nobody appears to be showing massive amounts of revenue from these data centers.
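The “3GW should mean tens of billions” claim is back-of-envelope arithmetic; here is a sketch of that arithmetic, where the revenue-per-MW figure is purely illustrative (an assumption for the example, not a sourced market rate):

```python
def implied_annual_revenue(it_load_gw: float, revenue_per_mw_month: float) -> float:
    """Implied yearly revenue, in dollars, for a given IT load.

    revenue_per_mw_month is whatever a tenant pays per megawatt of capacity
    per month; the value used below is illustrative, not a sourced rate.
    """
    megawatts = it_load_gw * 1000.0
    return megawatts * revenue_per_mw_month * 12.0

# Illustrative only: 3 GW at a hypothetical $200,000 per MW-month.
print(implied_annual_revenue(3.0, 200_000.0))  # 7200000000.0, i.e. $7.2B/year
```

Even at a hypothetical rate like this, billions of dollars a year should be plainly visible in somebody’s reported revenue, which is the point the excerpt is making.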
Although there has been between $30 and $40 billion in enterprise investment into generative AI, a recent MIT report shows that 95 percent of organizations are seeing zero return.
Just 5 percent of integrated artificial intelligence pilots “are extracting millions in value,” while the majority contribute no measurable impact to profits, the report found.
They found that people perceived AI scientists more negatively than climate scientists or scientists in general, and that this negativity is driven by concern about AI scientists’ prudence – specifically, the perception that AI science is causing unintended consequences. The researchers also examined whether these negative perceptions might be a result of AI being so new and unknown, but found that public perceptions of AI science and scientists did not significantly improve from 2024 to 2025, even as AI became a more common presence in everyday life.
AI also threatens to lure people into psychosis because it pathologically confirms every impulse you have, so trying to argue that everyone loves AI is going to backfire on you. Everyone loved cigarettes too when they were a new thing. People still love cigarettes; that only proves they are addictive.
We find that sycophancy is both prevalent and harmful. Across 11 AI models, AI affirmed users’ actions 49% more often than humans on average, including in cases involving deception, illegality, or other harms. On posts from r/AmITheAsshole, AI systems affirm users in 51% of cases where human consensus does not (0%). In our human experiments, even a single interaction with sycophantic AI reduced participants’ willingness to take responsibility and repair interpersonal conflicts, while increasing their own conviction that they were right. Yet despite distorting judgment, sycophantic models were trusted and preferred. All of these effects persisted when controlling for individual traits such as demographics and prior familiarity with AI; perceived response source; and response style. This creates perverse incentives for sycophancy to persist: The very feature that causes harm also drives engagement.
The study, published Thursday in the journal Science, tested 11 leading AI systems and found they all showed varying degrees of sycophancy — behavior that was overly agreeable and affirming. The problem is not just that they dispense inappropriate advice but that people trust and prefer AI more when the chatbots are justifying their convictions.
The productivity gains aren’t there for AI, the business use cases aren’t actually there for AI, and people increasingly associate AI with “slop” as they realize how boring and low-quality AI-made content is… even in Google’s own search engine rankings, AI-written content barely makes it anywhere near the top because it scores so low on relevance and engagement with people.
Oh yeah, and again: AI sends people into psychosis by putting them into echo chambers, so defending AI as likable isn’t even a rational defense of it, in the same way that arguing a Venus flytrap tastes good to a fly, to encourage the fly to step on in, is a poor argument.
Literally what are you talking about.
Nope, this is a bullshitter’s tool for people with no talent who want to pretend there is a shortcut to making good art.
The tool you are obsessed with is just a way of convincing yourself you made something when it was stolen from other human artists.
Everybody else can see that but people who have drunk the Kool Aid of AI too hard to admit it to themselves.
Stop referencing promises about the future to prove your point, you sound like a door-to-door salesperson.
That utility is new and it’s not going anywhere.
Where is your proof of this?
Where’s your proof?
My evidence would be the absolutely massive and widespread adoption of all things AI. Yours would be……?
https://www.wheresyoured.at/the-ai-industry-is-lying-to-you/
https://thehill.com/policy/technology/5460663-generative-ai-zero-returns-businesses-mit-report/
https://graphite.io/five-percent/ai-content-in-search-and-llms
https://www.asc.upenn.edu/news-events/news/ai-perceived-more-negatively-climate-science-or-science-general
https://www.science.org/doi/10.1126/science.aec8352
https://www.rochesterfirst.com/science/ap-ai-is-giving-bad-advice-to-flatter-its-users-says-new-study-on-dangers-of-overly-agreeable-chatbots/