Frequently Asked Questions

High-Risk Automated Decision Systems Data Collection

Background

Q: Why is the California Department of Technology (CDT) requiring departments to submit a list of their High-Risk Automated Decision Systems?

A: GC § 11546.45.5 requires the California Department of Technology (CDT), on or before January 1, 2025, and annually thereafter, to submit a report of the comprehensive inventory of high-risk automated decision systems described in subdivision (b) of GC § 11546.45.5 to the Assembly Committee on Privacy and Consumer Protection and the Senate Committee on Governmental Organization.

This comprehensive inventory includes all high-risk automated decision systems that have been proposed for use, development, or procurement by, or are being used, developed, or procured by any “State agency”.

 

Q: What is a “State agency”?

A: For the purposes of ADS reporting under GC § 11546.45.5, a “State agency” means any of the following:

  1. Any state office, department, division, or bureau.
  2. The California State University.
  3. The Board of Parole Hearings.
  4. Any board or other professional licensing and regulatory body under the administration or oversight of the Department of Consumer Affairs.

“State agency” does not include the University of California, the Legislature, the judicial branch, or any board, except as provided above.

“Board” means any administrative or regulatory board, commission, committee, council, association, or authority consisting of more than one person whose members are appointed by the Governor, the Legislature, or both.

 

Q: What is an Automated Decision System?

A: “Automated decision system” means a computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified output, including a score, classification, or recommendation, that is used to assist or replace human discretionary decisionmaking and materially impacts natural persons. “Automated decision system” does not include a spam email filter, firewall, antivirus software, identity and access management tools, calculator, database, dataset, or other compilation of data.

 

Q: What is a High-Risk Automated Decision System (ADS)?

A: A high-risk automated decision system (ADS) means an automated decision system that is used to assist or replace human discretionary decisions that have a legal or similarly significant effect, including decisions that materially impact access to, or approval for, housing or accommodations, education, employment, credit, health care, and criminal justice.

 

Q: Who should I contact if I need assistance in interpreting GC § 11546.45.5?

A: Please contact your legal counsel.

Data Collection

Q: What if my “State agency” does not use a High-Risk ADS? Do I still need to submit a data collection?

A: Yes. This will ensure CDT complies with the comprehensive inventory requirement of GC § 11546.45.5.

 

Q: What if my “State agency” has more than one High-Risk ADS, in use or planned to be in use?

A: CDT requires a separate data collection submission for each High-Risk ADS that has been proposed for use, development, or procurement, or is being used, developed, or procured. 

 

Q: Why is it essential for me to supply my work email?

A: A copy of your High-Risk ADS submission will be emailed to the address you supply. If the address is incorrect or invalid, you will not receive the confirmation message containing your submitted response.

 

Q: How do I know my High-Risk ADS data collection survey was received by CDT?

A: Upon submitting the data collection, you will receive an automated email response entitled “Legislatively Mandated High-Risk ADS Data Collection GC § 11546.45.5” from noreply+automations@airtableemail.com acknowledging that your response was submitted successfully.

If you are unable to locate your automated email response, please check your Microsoft Outlook Junk Email folder.

 

Q: Will my data be shared outside of CDT?

A: While data collection submissions of High-Risk ADS are generally considered confidential, certain information is required to be submitted to the Legislature in the annual report. Additionally, to the extent that information in data collection submissions is not considered confidential or otherwise prohibited from disclosure under applicable law, such information must be shared in response to a lawful request for information (Public Records Request, Subpoena, Court Order, etc.).

 

Q: What if my High-Risk ADS uses Generative AI? Do I need to submit a response?

A: Yes. Generative AI is a type of Artificial Intelligence.

 

Q: Since I already provided CDT with a SIMM 5305-F for my High-Risk Generative AI, do I need to submit a response?

A: Yes. The definition of a High-Risk ADS is different from that of a High-Risk GenAI.

 

Q: What if I submitted a form in error?

A: Please send an email to highriskadsreporting@state.ca.gov to correct your submission.

 

Q: What should I do if I have a question regarding the data collection submission process for my High-Risk ADS?

A: Please contact CDT at highriskadsreporting@state.ca.gov for further clarification on the reporting requirements of GC § 11546.45.5.

 

Q: How can I contact CDT if I have further questions?

A: Please send your question to highriskadsreporting@state.ca.gov.

High-Risk ADS Examples

Below are some possible examples of High-Risk ADS. These examples do not imply that such systems are being used by State agencies subject to GC § 11546.45.5.

Example 1: A system uses artificial intelligence to assist state employees in approving or denying benefits. The system provides a simplified output in the form of an approval/denial recommendation which the state employees rely upon to make benefits decisions.

Because the system uses AI as a computational process to issue a simplified output in the form of a recommendation, materially impacts a natural person, is used to assist human discretionary decisionmaking, and has a legal or similarly significant effect on access to, or approval for, benefits, it is a High-Risk Automated Decision System. This should be reported as a High-Risk ADS.

Example 2: A system uses Generative Artificial Intelligence to evaluate and rank résumés and job applications for alignment with job requirements and issues a ranking score. A state employee relies upon the ranking score to either move applicants forward to the next phase of recruitment or reject them.

Because the system uses Generative Artificial Intelligence, a type of AI, as a computational process to issue a simplified output in the form of a ranking score, materially impacts a natural person, is used to assist human discretionary decisionmaking, and has a legal or similarly significant effect on a natural person’s employment eligibility, it is a High-Risk Automated Decision System. This should be reported as a High-Risk ADS.

Example 3: A system uses a statistical model to assist law enforcement in identifying persons who have committed a crime. Law enforcement relies upon these positive identifications to obtain arrest warrants.

Because the system uses a statistical model as a computational process to issue a simplified output in the form of a natural person’s identity match, materially impacts a natural person, is used to assist human discretionary decisionmaking, and has a legal or similarly significant effect by identifying a natural person as a suspect, it is a High-Risk Automated Decision System. This should be reported as a High-Risk ADS.