Australia's main research funder has barred peer reviewers from using artificial intelligence chatbots to produce feedback, following allegations that this had been happening.
The Australian Research Council published new guidance on the issue after applicants for grants of up to A$500,000 (£262,000) awarded under the Discovery Projects scheme reported spotting the "tell-tale" signs of ChatGPT in assessors' comments.
The applicants said the reports were a "generic regurgitation" of their applications with little evidence of critique, insight or assessment, and that one reviewer had even forgotten to remove the "regenerate response" prompt that appears at the bottom of all ChatGPT-created text.
The new guidance, published on 7 July, says peer reviewers "are required to preserve the principles of confidentiality".
"Release of material into generative AI tools constitutes a breach of confidentiality and peer reviewers…must not use generative AI as part of their assessment activities," the policy says.
It adds that reviewers "are asked to provide detailed, high-quality, constructive assessments that assist the selection advisory committees to assess the merits of an application. The use of generative AI may compromise the integrity of the ARC's peer review process by, for example, producing text that contains inappropriate content, such as generic comments and restatements of the application."
The policy says that, where the ARC suspects that reports are AI-generated, they will be removed from the review process, and that the ARC "may impose consequential actions in addition to any imposed by the employing institution".
Australian researchers had suggested that the use of ChatGPT to write feedback was a symptom of the time pressure that academics in the country are under.
As for grant applicants, the ARC guidance says that, while AI "presents an opportunity to assist researchers in the crafting of grant proposals", this "may raise issues around authorship and intellectual property including copyright. Content produced by generative AI may be based on the intellectual property of others or may also be factually incorrect."
As such, the ARC "advises applicants to use caution in relation to the use of generative AI tools in developing their grant applications" and notes that universities are required to certify that all applicants "are responsible for the authorship and intellectual content of the application".