• FauxPseudo @lemmy.world · 2 days ago

    Seems like there should be a third exception for occasions where the article is about LLM-generated text. Editors should be able to quote it when it’s appropriate for an article.

    • Zagorath@quokk.au · 2 days ago

      That is a reasonable exception to no-AI policies in research papers and newspaper articles, but not for Wikipedia. As a tertiary source, Wikipedia has a strict “no original research” policy. Using AI to generate examples of AI output would itself be original research, and should not be done.

      By the same logic, though, quoting AI output that has already appeared in primary and secondary sources should be allowed.

      • ricecake@sh.itjust.works · 21 hours ago

        Eh, that’s not quite original research. There are plenty of examples of images and sound files created specifically for Wikipedia. A representative example isn’t research; it just shows what something is.

        The Wikipedia articles on AI slop and generative AI include a few instances of representative content used to illustrate a sourced statement, as opposed to serving as evidence for a claim.

        It’s similar to the various charts and animations.