MANAGEMENT UPDATE.
INCREASING SAFETY IN AI PROCUREMENT
Multiple gubernatorial executive orders issued over the past three years have established new policies to ensure the responsible, safe, transparent and bias-free use of AI within state governments. On March 30, 2026, a new executive order from California Gov. Gavin Newsom will begin to push the state’s contractors to clearly communicate their internal AI policies and practices as well.
As part of this newly announced initiative, the California Department of General Services (DGS) and the California Department of Technology (CDT) have been ordered to submit recommendations within 120 days “for new certifications that may be incorporated into state contracting processes.”
This means that companies that seek to provide AI services will see increased attention given to “their policies and safeguards to protect public safety while preventing the misuse of their technologies.”

Addressing the new executive order, Gov. Newsom said, “We’re going to use every tool we have to ensure companies protect people’s rights, not exploit them or put them in harm’s way.”
In addition to a new approach to certifications, both DGS and CDT will also be submitting recommendations by the end of July for “reforms to contractor responsibility provisions.” These departments, along with Government Operations, the Office of Data and Innovation and the California Department of Human Resources also have been ordered to:
Facilitate employee access to vetted GenAI tools “for general use cases with appropriate privacy and cybersecurity safeguards.”
“Leverage the State Technology Council and the AI Community of Practice to share best practices on responsible AI procurement and adoption while protecting public safety, civil liberties and privacy.”
The executive order also provides other AI safeguards, including efforts to strengthen AI transparency and accountability, expand training and promote AI best practices. In addition, CDT and Government Operations will also be issuing industry best practice guidance for watermarking “AI generated or significantly manipulated images or video” within 120 days.
While this executive order stands out for directly addressing safety needs in procurement, “vendor risk” has been a growing concern for cities, counties and states, which may not fully understand the AI models that vendors are using, the exact nature of the AI that’s being sold, or the internal protections a company has in place to ensure the safety and privacy of government data.
#StateLocalGovernmentGenerativeAIPolicyandManagement #StateandLocalAI #StateandLocalArtificialIntelligence #StateandLocalProcurement #StateandLocalProcurementManagement #StateandLocalProcurementPerformance #StateTechnologyManagement #GovernorExecutiveOrders #CaliforniaExecutiveOrder #CaliforniaArtificialIntelligenceExecutiveOrder #CaliforniaArtificialIntelligenceContracting #IncreasingSafetyForAIProcurement #CaliforniaArtificialIntelligenceProcurement #StateContractorSafeguardsForAI #StateandLocalGovernmentManagement #StateandLocalManagement #StateandLocalTechnologyManagement #ArtificialIntelligenceVendorRisk #CaliforniaDepartmentOfTechnology #SafeguardsForStateAI #ArtificialIntelligenceHazardsAndGuardrails #StateandLocalArtificialIntelligenceHazardsAndGuardrails #StateandLocalAIGovernance #ContractorArtificialIntelligencePolicyAndPractices #StateandLocalArtificialIntelligenceNews #StateandLocalManagementNews #BarrettandGreeneInc