
Assurance of Emerging Technology

Over the past year, ICAEW has undertaken a project to review the business risks associated with using developing and emerging technologies – such as blockchain, VR, or, as the main case study of our research, AI and cognitive automation. We have been seeking to better understand the key risks that these powerful and valuable technologies carry, and how organisations seeking to benefit from their strengths can work to reduce and contain those risks. This has led us to describe a combined response to the risks of emerging technologies, based on design principles followed at the point of creation, controls put in place during operation, and a role for internal or external assurance providers to help check that the risks are properly controlled and the technology is working as intended.

For AI, which is usually developed from data using machine learning or similar approaches, a key risk is that the system can become incomprehensible to a human reviewer. This not only makes it harder to spot-check the system's decisions – or to explain them to the subject of an automated decision – but also means that any errors or biases in the system may be difficult or impossible to fix. Organisations should therefore consider choosing learning models that produce more interrogable and explainable decisions where possible.
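As an illustration of what a more interrogable model offers, the minimal sketch below assumes scikit-learn and an invented loan-approval dataset (the feature names and data are hypothetical, not from the report). It fits a logistic regression, a model whose per-feature contributions to an individual decision can be read off directly – exactly the kind of spot-check a human reviewer or assurance provider needs.

```python
# A minimal sketch, assuming scikit-learn and hypothetical loan-approval
# features ("income", "years_employed", "existing_debt"). It favours an
# interrogable model: a logistic regression whose contribution of each
# feature to a single decision can be inspected directly.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

feature_names = ["income", "years_employed", "existing_debt"]

# In practice X and y come from the organisation's own data; random values
# stand in here so the sketch runs on its own.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + rng.normal(size=500) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Explain one automated decision: each feature's contribution to the
# log-odds is simply coefficient * scaled feature value.
applicant = X[0].reshape(1, -1)
scaled = model.named_steps["standardscaler"].transform(applicant)[0]
coefs = model.named_steps["logisticregression"].coef_[0]
for name, contribution in zip(feature_names, coefs * scaled):
    print(f"{name:15s} contributes {contribution:+.3f} to the log-odds")
print("decision:", "approve" if model.predict(applicant)[0] else "decline")
```

A deep neural network trained on the same data might score marginally better, but it would not yield this kind of line-by-line account of an individual decision.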

A related risk is that of bias. Datasets are inevitably shaped by the world they measure and by the people and systems measuring them, and so systems trained on data – any data – will naturally acquire biases and errors that can reinforce and amplify existing issues. This is most notable with racial, gender and other socially important biases, but it can happen with more subtle qualities too. Beyond the cost of poor decisions, organisations also have to worry about reputational and litigation risks if this happens to them. There is no simple recipe for reducing bias in machine learning-based models, but it can be mitigated by, for example, considering the omissions and biases inherent in the training data, removing sensitive fields and proxies for those fields from the data, and testing the model's outputs for signs of unequal treatment, as the sketch below illustrates.
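One simple way to test a model's outputs for unequal treatment is to compare outcome rates across the groups defined by a sensitive attribute that was held out of the training features. The sketch below assumes pandas and an invented scored dataset; the column names and the 0.8 ratio threshold are illustrative assumptions, not figures from the report or any regulatory standard.

```python
# A minimal sketch, assuming pandas and a hypothetical scored dataset in
# which "approved" is the model's output and "gender" is a sensitive
# attribute excluded from the training features. It compares approval
# rates across groups (a demographic-parity style check).
import pandas as pd

scored = pd.DataFrame({
    "gender":   ["F", "M", "F", "M", "F", "M", "F", "M"],
    "approved": [1,    1,   0,   1,   0,   1,   1,   1],
})

rates = scored.groupby("gender")["approved"].mean()
print(rates)

# Flag the model for review if any group's approval rate falls well below
# the best-treated group's (the 0.8 threshold here is illustrative only).
disparity = rates.min() / rates.max()
if disparity < 0.8:
    print(f"Warning: approval-rate ratio {disparity:.2f} suggests possible unequal treatment")
```

A check like this does not prove a model is fair, but it gives reviewers an objective signal that a model's outputs warrant closer investigation.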

Chartered accountants can help their own organisations to think through these issues and more, advising on best practice and the design of controls to make sure that any AI, data analytics or robotic process automation projects reach their full potential without introducing additional risks. Those in practice might already be considering how to design an assurance engagement that looks at an algorithm or machine learning project. In any case, look out for the full report on risk and assurance for emerging technologies from ICAEW in the coming weeks.

Join the ICAEW at The Alternative Accountancy Strategic IT Conference.

www.alternativeaccountancyit.com

The Institute of Chartered Accountants in England and Wales

ICAEW
ICAEW is a world-leading professional membership organisation that promotes, develops and supports over 181,500 chartered accountants and students worldwide. ICAEW provides qualifications and professional development, shares knowledge, insight and technical expertise, and protects the quality and integrity of the accountancy and finance profession. As leaders in accountancy, finance and business, ICAEW members have the knowledge, skills and commitment to maintain the highest professional standards and integrity. Together they contribute to the success of individuals, organisations, communities and economies around the world.