• peanuts4life@beehaw.org
    11 months ago

    Imo, the true fallacy of using AI for journalism or general text lies not so much in generative AI’s fundamental unreliability as in its existence as an affordable service.

    Why would I want to parse through AI-generated text on times.com when, for free, I could speak to some of the most advanced AI on bing.com, OpenAI’s ChatGPT, Google Bard, or a Meta product? These, after all, are the back ends that most journalistic or general written-content websites are using to generate text.

    To be clear, I’m asking: why not cut out the middleman if they’re just serving me AI content?

    I use AI products frequently, and I think they have quite a bit of value. However, when I want new accurate information on current developments, or really anything more reliable or deeper than a Wikipedia article, I turn exclusively to human sources.

    The only justification a service has for serving me AI-generated text is perhaps the promise that they have a custom-trained model with highly specific training data. I can imagine, for example, weather.com developing specialized AI models that tie into an in-house LLM and provide me with up-to-date, accurate weather information. My question in that case would be: why am I reading an article rather than just being given access to the LLM for a nominal fee? At some point, they are no longer a regular website; they are a vendor for an in-house AI.

    • FaceDeer@kbin.social
      11 months ago

      This was already true years ago, after search engines became a thing. The main answers that come to mind for your question are:

      • providing novel information that wasn’t online before.
      • providing information to you that you wouldn’t have thought to ask for on your own.

      Both of these remain valid and useful reasons to visit a website even if that website’s content is AI generated.

      There’s also the matter that “AI generated” is a very broad term. Did someone merely turn an AI loose with a vague instruction to generate some pap to fill a page out with? Or did someone actually provide it with a subject and some information to write about and give the resulting article a read-through to ensure it was good? Did they write a rough draft and just have the AI do the polishing? There’s lots of approaches here, some of them much better than others.

    • jarfil@beehaw.org
      11 months ago

      why not cut out the middleman if they’re just serving me AI content.

      When you have a workflow like:

      1. human
      2. AI extend
      3. AI summarize
      4. you

      …the reason is that AI middlemen would rather rake in the benefits from providing both AI services, instead of getting cut out.
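      The round trip above can be sketched in a few lines. This is a hypothetical illustration, not any real API: `ai_extend` and `ai_summarize` are stand-ins for the two LLM services, mocked with trivial string operations, since the point is the extend-then-summarize loop rather than the models themselves.

      ```python
      # Hypothetical sketch of the four-step workflow: both "AI" stages are
      # mocked stand-ins, not calls to a real model.

      def ai_extend(bullet_points):
          """Stand-in for an LLM that pads terse human notes into prose (step 2)."""
          return " ".join(f"It is worth noting that {p}." for p in bullet_points)

      def ai_summarize(prose):
          """Stand-in for an LLM that compresses prose back into bullets (step 3)."""
          return [s.strip() for s in prose.split(".") if s.strip()]

      human_notes = ["rain expected Tuesday", "highs near 20 C"]  # step 1: human
      article = ai_extend(human_notes)                            # step 2: AI extend
      what_you_read = ai_summarize(article)                       # step 3: AI summarize
      print(what_you_read)                                        # step 4: you
      ```

      You end up with roughly the notes the human started with, minus whatever each stage lost in translation, while both middleman stages get paid.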

      There is a secondary benefit in that “AI-extended” human input is more suitable for third-party AI readers… so arguably the web is becoming more AI-friendly (you can thank us later, future AI overlords).

      PS: GPT-4 compatible version: “y n0t 🗑️👥 if AI📺? wf: 1.👤 2.AI+ 3.AI- 4.👁️ cuz AI👥💰4AI+&AI-. AI+👤👍4AI👁️… web👉AI👌 (🙏🏻AI👑)”