The recent “OCEG Use of AI for GRC” research report found that 82 percent of respondents agree or strongly agree they must adopt generative AI, yet only 12 percent said their organization has a documented AI Governance plan. Generative AI represents the next generational wave of technology; however, for many businesses, adopting and managing its usage is uncharted waters. That uncertainty is causing companies to delay implementing AI tools, which hinders efficiency and innovation. To help brands better understand the opportunities and potential consequences of adopting AI, LogicGate, the holistic GRC experts delivering leading GRC solutions, is proudly producing a continuous stream of educational resources to help customers leverage and govern the technology safely and effectively to enhance business performance.
AI Governance creates an advantageous opportunity for GRC leaders to reimagine the traditional ways of managing risk and gain a more strategic seat at the table. During the recent LogicGate webinar, “Navigating AI Governance: Opportunities, Outcomes, and Consequences,” guest speaker Forrester Senior Analyst Cody Scott and LogicGate President of Technology Jay Jamison emphasized the importance of holistic AI governance and how to safely leverage the technology’s power for operational efficiencies and growth. According to Cody,
“The important thing here is that speed of innovation is a systemic risk. That means that companies are struggling to gain the competitive advantage that comes with something like the value proposition of AI. And because the aperture for the value of AI is so wide, it requires us to have a much more holistic approach when it comes to just managing it and really being able to understand what the value is and the opportunity that's associated with it.”
Throughout the webinar, Cody references industry research and elaborates on the four pillars of AI Governance success:
- Purpose - Specifying the business outcomes that the AI system should achieve. What is the “why” - why are you taking on this initiative, or why are you trying to mitigate a particular risk? What is the value?
- Culture - Embedding AI governance into your organization’s practices and employee roles. This is really important because people, at the end of the day, are responsible and accountable for how AI is used; it can’t just be one person responsible for everything.
- Action - Enabling and automating AI stewardship tasks and processes. Test the model by gathering data and evidence about how it’s used, and track performance metrics to confirm the AI model is operating as expected and delivering the desired value.
- Assessment - Having the ability to observe, measure, and communicate the impact of AI. Leverage risk assessments, residual risk assessments, control assessments, and monitoring metrics to make sure the AI model is not creating a business or ethical dilemma.
Cody and Jay also discussed stakeholder engagement and how to identify which teams are accountable for the opportunities and potential consequences of AI usage.
“This has to be a c-suite level discussion because so many people are involved. So many parts of the business are using AI for so many things, whether that's employee productivity, employee experience, operational efficiencies versus customer experience, or even revenue-generating opportunities,” said Cody Scott.
While AI needs to be championed by the CEO to help drive a culture of safe and effective AI usage, balancing technical expertise with a strategic view of the business is a broader, more comprehensive responsibility of senior GRC leaders. According to Cody:
“The question here is not - is there any single stakeholder who needs responsibility? It's really about getting a team together, and you can make that as formal or informal as need be, but the important thing is that you've got that senior level and tactical makeup of stakeholders who are involved in the discussion on a regular basis.”
The AI Governance webinar underscores the need for ongoing guidance and education to help companies build and maintain effective AI Governance programs. AI is rapidly evolving, which requires governance tools and solutions that can evolve with it.
“Companies need to prioritize responsible use of AI and we provide a partnership approach so they can not only build an effective AI governance program but also cultivate a culture of risk to drive awareness and accountability,” said Jay Jamison. “AI will continuously evolve and as GRC leaders, we have a responsibility to provide ongoing support and guidance to our customers. We take that seriously at LogicGate.”
About LogicGate
LogicGate® is a global, market-leading SaaS company empowering customers to effectively manage and scale their cyber risk and controls, third-party risk management, compliance, enterprise risk, and operational resilience programs. Recognized as an industry-leading platform, Risk Cloud® is built with usability in mind, including a no-code interface and graph-database management, making the technology flexible, agile, and scalable to support various levels of GRC maturity and bolster business outcomes. With an unwavering commitment to fostering business resilience in dynamic landscapes, LogicGate empowers customers to quantify risk, strengthen their security posture, and gain visibility into information to create strategic advantages and support business objectives. Learn more about our solutions by visiting www.logicgate.com and join us on LinkedIn.
Contact
Jade Trombetta
Senior Director, Communications for LogicGate