BACKGROUND:
Women and people who are BAME (Black and minority ethnic) are disproportionately likely to be victims of cybercrime, and are more likely to suffer financially as a result, finds new research from Malwarebytes.
Why? One reason is that the modern technologies designed to identify, verify and therefore protect us all are themselves biased.
<p>Human-computer interfaces are almost never built with minority communities in mind. Bias and inequality in biometric technologies stem from a lack of diverse demographic data, bugs, and inconsistencies in the algorithms. If we continue on this course, software companies will unintentionally fail to adequately protect minority groups. </p>
<p>We must be ready to do whatever it takes to ensure equal access for everyone. Organisations have a responsibility to recognise bias and to adapt their models to acknowledge the differences that make us who we are. This could involve diversifying the types of biometric technologies used to identify users, retraining systems that misgender people, changing the way systems classify by gender and, most importantly, listening to customers.</p>