The University of California (UC) Presidential Working Group on Artificial Intelligence (AI) recently released the Responsible Artificial Intelligence report, which sheds light on how the institution currently uses AI and includes recommendations on incorporating eight guiding ethical principles for UC’s use of AI in its services.

The working group gathered information to better understand the AI landscape at UC, focusing on areas where AI is "most likely to affect individual rights in university settings," the report noted.

Based on the information it gathered, the working group offered recommendations for AI usage at UC, addressing how AI could affect admissions and financial aid, student success, mental health, grading, and remote proctoring. It also recommended that the institution use AI-powered software to inform human decision-making or in areas where the technology could improve student life on campus.

Additionally, in response to growing concerns over the use and consequences of AI, various sets of AI principles and guidelines have been developed to guide the private and public sectors' responsible development and use of AI. The working group drew upon these principles and guidelines to formulate its own, and urged UC to adopt the following responsible AI principles to guide its procurement, development, implementation, and monitoring of AI within its provision of services:

  • Appropriateness
  • Transparency
  • Accuracy, Reliability, and Safety
  • Fairness and Non-Discrimination
  • Privacy and Security
  • Human Values
  • Shared Benefit and Prosperity
  • Accountability

Following the report's release, UC administrators said they will follow the report's recommendations on risks and opportunities in academics, health, human resources, and policing. UC also announced plans to launch a public database and to assess how the institution uses AI technologies.

Lisbeth Perez
Lisbeth Perez is a MeriTalk State and Local Staff Reporter covering the intersection of government and technology.