Thousands of UK consumers were charged twice for debit card payments after a glitch occurred in card terminals run by Cardnet, a joint venture between Lloyds Bank and First Data.
This is only the latest IT glitch in a very long list: TSB, M&S, Gatwick and recurring NHS outages, to name but a few of the software failures affecting customers, travellers and patients in the past three months.
CAST, the software intelligence company, helps financial services organisations such as Fannie Mae, Telefonica, Credit Suisse and ING build reliable and resilient software. Experts at CAST are dedicated to improving software quality, resilience and security.
Lev Lesokhin, EVP of Strategy and Analytics at CAST, said:
“Glitches such as the one that left consumers scrambling for reimbursements on double-charged purchases are commonly caused by poor interfaces or APIs between layers of complex IT systems; in this case, the systems designed to process debit card transactions. Designing software to run these functions within a single organization is complicated enough, but building trustworthy systems is an even higher bar.
The financial services industry is saddled with old, complex applications that are difficult to modernize without risking service disruption. Given that the glitch occurred in a system touched by multiple organizations, from Cardnet to Lloyds Bank to First Data, it is likely that a change was made without considering its architectural ramifications. For example, Lloyds might have issued a new software release to improve the interface for online banking services without realizing how it would affect the interface to connected Cardnet terminals, leading to unintentional double-charging of debit transactions.
The emerging practice of Software Intelligence, which derives insight into software trustworthiness from a solid understanding of software structure, was designed to help organizations deal with such complexity. Especially in API-driven development, preventing cross-layer glitches requires teams across organizations to collaborate on a centralized understanding of the software architecture and of how enhancements will affect application performance in the field.”
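The kind of cross-team architectural check described in the quote can be illustrated with a toy contract comparison: before a release ships, the provider's new response schema is diffed against the fields a downstream consumer depends on. This is a minimal sketch under invented assumptions; the field names and schema format are illustrative and do not represent CAST's, Lloyds', or Cardnet's actual tooling.

```python
def breaking_changes(consumer_contract, new_release):
    """Report fields that a new release removes or retypes, relative to
    the contract a downstream consumer (e.g. a card terminal) depends on.
    Contracts are modelled as {field_name: type_name} dicts for brevity."""
    issues = []
    for field, ftype in consumer_contract.items():
        if field not in new_release:
            issues.append(f"removed: {field}")
        elif new_release[field] != ftype:
            issues.append(f"retyped: {field} ({ftype} -> {new_release[field]})")
    return issues

# Hypothetical example: a release retypes one field and drops another.
depended_on = {"transaction_id": "str", "amount": "int", "currency": "str"}
proposed = {"transaction_id": "str", "amount": "str"}
for issue in breaking_changes(depended_on, proposed):
    print(issue)
```

A check like this would flag the incompatibility at release time, before a connected terminal ever sees the changed interface in production.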
Cybercriminals will continue to try to take advantage of the isolated remote worker as the world gets used to ‘not another day at the office’. The attack techniques that have increased over the past year (phishing, email scams, social engineering) will persist while regular communication channels remain disrupted. Without the ability to easily double-check that an email really is from the finance department or the boss, employees risk simply clicking the link or entering their details because it is the path of least resistance.

However, organizations will respond by strengthening their defences. Remote access solutions adopted in haste at the start of the pandemic will be risk-assessed and improved into secure remote access solutions. Zero Trust, the idea that by default those accessing your network cannot be trusted, has long been discussed in the security community but will now become the norm. The traditional model of ‘connect then authenticate’ will shift to ‘authenticate then connect’.

Context (where an employee is, what device they are using, on what day and at what time) will also play an increasingly important role in authentication alongside traditional identity checks. In fact, with the move to the cloud, the combination of identity and context will effectively become the new perimeter as the traditional enterprise firewall grows less and less relevant. Because of this more fluid perimeter, user and entity behaviour analytics will also grow in importance, since identifying patterns outside the norm will be vital for enterprises trying to spot potentially harmful activity.
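The ‘identity plus context’ idea can be sketched as a simple risk score that feeds an allow / step-up / deny decision. Everything in this sketch is an assumption made for illustration: the signals, weights and thresholds are invented and do not reflect any particular vendor's policy engine.

```python
from datetime import time

def context_risk(device_known, country, login_time, usual_countries):
    """Toy risk score combining contextual signals with an already
    verified identity. Weights are illustrative assumptions."""
    score = 0
    if not device_known:
        score += 2  # unrecognized device
    if country not in usual_countries:
        score += 2  # login from an unusual location
    if not time(8) <= login_time <= time(18):
        score += 1  # outside typical working hours
    return score

def decide(score):
    """Map the risk score to an access decision."""
    if score == 0:
        return "allow"
    if score <= 2:
        return "step-up"  # e.g. require a second factor
    return "deny"

print(decide(context_risk(True, "GB", time(10, 0), {"GB"})))   # familiar context
print(decide(context_risk(False, "GB", time(10, 0), {"GB"})))  # new device
print(decide(context_risk(False, "XX", time(3, 0), {"GB"})))   # everything unusual
```

In an ‘authenticate then connect’ model, a decision like this would run before any network connection is granted, rather than after the user is already inside the perimeter.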