Regarding AI-generated content (part 1 of 2)

As a writer, translator, reader and reviewer, I am against the use of AI-generated content in any literary work.

But this is a complex issue. Therefore, in this article, I am outlining some of the recent trends regarding the use of AI-generated content. In the next article, I will discuss how to detect AI-generated content.

Firstly, as the Australian Society of Authors points out:

“New technologies ought to serve our community and unlock new opportunities for our creative industries. If regulated appropriately, AI represents a chance to support Australian authors, artists and publishers rather than displace our creators to the detriment of our nation’s unique cultural landscape.”

Meanwhile, the Authors Guild, the oldest and largest professional organisation for writers in the United States, recently introduced this clause:

“Author shall not be required to use generative AI or to work from AI-generated text. Authors shall disclose to Publisher if any AI-generated text is included in the submitted manuscript, and may not include more than 5% AI-generated text.”

Indeed, AI is supposed to be a tool that supports creators. As with every tool invented throughout human history, responsible use is the key.

The Authors Guild further advised that AI technologies can be used to assist in the creation of a literary work, as long as (1) the work substantially comprises human creation and (2) a human artist has control over, and reviews and approves, each word in the work.

So it comes down to a writer’s choice. On those occasions when generative AI is used, will that writer be responsible and transparent about it?

Dave Malone, a poet and screenwriter based in Missouri, recently published his AACC framework – AI Attribution and Creative Content – as a “transparency framework for creators”. The framework’s focus is to distinguish between “AI-assisted” and “AI-generated”.

In Malone’s view, “AI-assisted” means a work of art should be originated by a human artist, who will make “all major creative decisions and is responsible for the final work”. In this process, AI can contribute by helping to generate, modify or enhance the work’s content.

In this case, the human artist should provide a “transparency statement” that the work is created by [Creator’s Name] and AI-assisted by [AI Name].

On the other hand, “AI-generated” means the role of a human artist remains “conceptual, curatorial, and editorial”, while AI “creates the primary content from the creator’s prompts and direction”.

In this case, the human artist should provide a “transparency statement” that the work is AI-generated by [AI Name] with concept by [Creator’s Name].

Would every human writer out there be responsible and transparent about their use of AI technologies in creating their literary works, as suggested by the framework cited here? I doubt it. But I think Malone has made an excellent point:

“AI is a tool, and, like any tool, it reveals the skill of the person using it. Label your work. Help audiences understand your process. Be honest about how you’re working… Be transparent. It’s that simple.”

Note: This article was originally titled “Regarding AI-generated content – Part 1 of 2” and was published under the title “Regarding AI – Part 1” by Ranges Trader Star Mail, January 27, 2026, p. 26.
