What’s the big deal: Big data in the financial services sector

Disruptive digital technologies have changed the way customers want to interact with businesses in the financial services (FS) sector.

Customers, myself included, are used to the customer-centric technologies experienced in other ‘digitized’ sectors like publishing or music, which had to adapt to the digital revolution early. Customers now want that experience replicated in the FS sector. Banks, for example, are beginning to take notice, investing significant amounts of money in IT upgrades: Australia’s Commonwealth Bank invested over AUD $1.1 billion in an end-to-end IT transformation project to replace its aging core banking system, and Barclays has been promoting customer-centric technologies like its mobile payments app PingIt for many years.

Clearly, to meet the challenge of such disruptive technologies and maintain (and improve) the share of the customer’s digital wallet, the established FS businesses must continue to improve their customer engagement.  This is particularly important given the rise of challenger banks, like Atom and Metro Bank, and digital start-ups, like Zopa in the peer-to-peer lending space or comparethemarket.com in the insurance space. These newcomers are vying for other parts of the financial services value chain and they are leveraging technology as a key differentiator to cosy up to the customer.

Personalization of data

One way to improve customer loyalty and rise to the challenge of the digital revolution in the FS sector is through the development of personalized products for customers based on analysis of their previous behaviors (see ‘Big Data in the Financial Services Sector’ section below).  This requires the collection and analysis of large volumes of existing or historic customer data held by FS companies to predict future customer likes and dislikes.

This data collection and analysis is often referred to by the shorthand ‘Big Data’. Along with mobile, ‘the Internet of Things’ and cloud computing, Big Data is the fourth pillar driving change in customer engagement. In many ways it is the product of the other pillars: the growth of mobile apps and connected devices has increased the amount of information available to companies on consumer behavior, whilst the use of cloud computing technologies has made it cheaper to use powerful Big Data analytics tools to collect and analyze customer data to predict trends. Big Data has big potential: according to the ‘Data Equity’ report prepared by the Centre for Economics and Business Research for SAS, Big Data was worth £25 billion to UK businesses in 2011 via efficiencies, innovation and business creation, and may reach an annual value of £41 billion by 2017!

What is Big Data?

Part of the problem with understanding what Big Data is lies in the fact that there is no single accepted definition for it. However, it is often described as the collection and analysis of large volumes of structured and unstructured data, often of unknown reliability and potentially in real time, to create value for companies.

Let’s break that statement down into the ‘five Vs’ commonly associated with Big Data:

  1. Volume: in today’s connected world, huge amounts of data are created every second, from tweets to video clips and photos to emails. This data comprises large data sets that cannot be reviewed by conventional software tools within acceptable time-frames.
  2. Variety: the data can range from structured (eg numeric data in fixed fields such as spreadsheets) to unstructured (eg rich-media information in videos or images that cannot be neatly placed into spreadsheets). Big Data technology allows users to analyze not only the structured data we find in the FS sector (eg financial data), but also the increasingly prevalent unstructured data, in order to reach new conclusions and findings.
  3. Veracity: the reliability of the data may not always be known, especially if it is obtained from third-party, publicly available sources (eg ‘open source data’).
  4. Velocity: the data is frequently updated and can be analyzed as it is collected or generated, without first being loaded into a database (ie analyzed in real time; see the sketch after this list).
  5. Value: by predicting new trends based on the analysis of data, banks and insurers can create value for customers by offering them new services (see ‘Big data in the financial services sector’ below).
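To make the velocity point concrete, here is a minimal, hypothetical Python sketch of analyzing a stream of events as they arrive, without first loading them into a database. The simulated transaction feed, window size and outlier threshold are illustrative assumptions, not any particular vendor’s tooling.

```python
import random
from collections import deque
from statistics import mean

def transaction_stream(n=1000):
    """Simulate a live feed of card transaction amounts (illustrative only)."""
    for _ in range(n):
        yield round(random.expovariate(1 / 50.0), 2)  # mean spend of ~50

# Rolling window over the most recent events -- no database involved
window = deque(maxlen=100)

for amount in transaction_stream():
    window.append(amount)
    rolling_avg = mean(window)
    # Flag unusually large transactions the moment they arrive
    if len(window) > 10 and amount > 3 * rolling_avg:
        print(f"Possible outlier: {amount:.2f} (rolling average {rolling_avg:.2f})")
```

In a real deployment the events would come from a payments feed or message queue rather than a simulator, but the principle is the same: the analysis keeps pace with the data instead of waiting for it to land in a database.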


Legal and commercial issues

All this fast-paced development does bring with it certain legal and commercial challenges.

To make the most of the Big Data revolution, FS companies will need to partner with Big Data software specialists whose business intelligence and analytics (BIA) software allows them to undertake the level of analysis required to maintain competitive advantage. They will also, in the long term, need to look at upgrading their legacy IT infrastructure to handle the new high-volume, high-velocity, high-variety data that is becoming available, and to integrate it with pre-existing company and customer data to create value.

Procuring new BIA software tools and/or undertaking large-scale technology refresh projects comes with a host of issues, which I consider below.

Software procurement

Many of the issues to consider are similar to the contract issues associated with licensing software in other sectors.

  • Scope of license: be clear on who will be granted the license to use the software to collect and analyze the data.  For example, will it just be the customer contracting entity or will any group company want to take the benefit of the services?  If it is the latter, the license terms will need to be drafted accordingly.  In addition, the licensee may wish outsourced service providers or contractors to be able to use the software to deliver services to it, in which case the scope of the license will need to cover these third parties.
  • Purpose of license: make sure the license aligns with the business purpose! It is a common oversight by software licensees to be unclear on what they intend to use the software for. If the software will be used for internal analysis then a limited license may be sufficient. However, if the licensee intends to use the software to analyze data that will then form the basis of personalized products sold to its customers, this should be clearly articulated in the license terms.

  • Software implementation timetable: software implementations are prone to delays. It is important that the contract sets out the timetable for implementation of the software, including precise milestones with clear acceptance criteria so the parties can identify whether each milestone has been passed. Payment terms should be linked to the sign-off of milestones, and liquidated damages should be considered in the event of delays to the timetable, save where they were directly caused by the customer’s acts or omissions.

  • Acceptance testing: the parties should be clear on what the process is for acceptance testing of the software to make sure both parties can clearly identify whether the software performs in accordance with the agreed specification and is free from defects.

Data licensing

Given the sheer volume and variety of data being created, it is not unusual for FS companies to want to combine data owned by third parties with their own data.  Again, the issues to consider should not be unfamiliar to many FS companies as they come across similar concerns in the context of licensing market data.

  • Purpose of license: as with software procurement, the scope of the data license needs to be aligned to the purposes for which the business wants to use the data.  For example, if the data licensee wants to combine the licensed data with in-house data to create new data sets it needs to be granted the relevant rights.  In addition, if it wishes to use the data not just for internal purposes, but also for use by a third party software licensor (eg the BIA software licensor) then the data license must grant the necessary rights to such third parties.
  • Derived data: the contract should also be very clear on who owns the intellectual property rights (IPR) in any new data created from the data analysis (for example, when data owned by third parties is combined with the customer-owned data to create something new and original which is not a copy, extract or modified version of the data sets but a new work based on analysis of the data; see the sketch after this list). If the licensee does not own the IPR in the derived data then careful attention should be paid to the scope of the license granted: for example, does the contract permit the data licensee to use the derived data contained in reports post-termination?
  • Warranties: the data licensee should be clear on the warranties in the contract around the accuracy and reliability of any third-party data it relies on to craft personalized customer products and services. Often, publicly available data (‘open data’) comes with limited warranties as to accuracy. In such circumstances, it will be up to the data licensee to undertake its own due diligence to satisfy itself that the data is error-free before using it.
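To illustrate the derived data point above, here is a short, hypothetical Python sketch of combining licensed third-party data with in-house customer data, including a basic accuracy check of the kind the due diligence point contemplates. The file names, column names and checks are illustrative assumptions only, not a definitive implementation.

```python
import pandas as pd

# Hypothetical in-house and licensed third-party data sets
in_house = pd.read_csv("customers.csv")        # eg customer_id, postcode, product
licensed = pd.read_csv("regional_stats.csv")   # eg postcode, median_income

# Basic due diligence on the third-party data before relying on it
assert licensed["postcode"].notna().all(), "missing postcodes in licensed data"
assert (licensed["median_income"] > 0).all(), "implausible income figures"

# The merged, enriched table is a new work based on both sources -- the
# kind of 'derived data' whose ownership the contract should address
derived = in_house.merge(licensed, on="postcode", how="left")
derived["affluence_band"] = pd.qcut(
    derived["median_income"], 3, labels=["low", "mid", "high"]
)
```

Who owns the IPR in the `derived` table, and whether it may still be used after the data license terminates, is precisely what the derived-data clause needs to settle.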

Sector specific issues

Upgrading legacy IT infrastructure to meet the Big Data challenge is not a simple task and may often involve the outsourcing of activities to third-party specialists. But this is an issue the industry appears to be aware of: in a self-assessment survey of 300 senior financial services executives conducted by Opinium, almost half revealed that their IT infrastructure could not move fast enough to enable the business to make better use of data.

When undertaking IT outsourcing projects, FS companies will need to be aware of the regulatory issues.  The rules relating to outsourcing are dotted throughout the FCA and PRA Handbooks.  For example, regulated firms must notify their regulators prior to outsourcing any regulated activity and ensure the relevant service provider has obtained Part IV permission.

Regulated firms looking to undertake critical outsourcing projects need to take particular note of the SYSC rules in the FCA handbook.  A critical outsource is the outsourcing of an activity that is so important to the regulated firm that the outsourced service provider’s failure to perform such services would materially impact the regulated firm’s ability to comply with its requirements under the FCA Handbook.  The rules apply as guidelines to regulated firms but as mandatory requirements to ‘common platform firms’ like banks, building societies and investment firms.  SYSC 8.1.8 is a useful starting point in terms of analyzing the types of requirements that all outsourcing contracts should include. (I recommend it is used as a checklist for terms to include in contracts when considering an IT outsource, whether or not you are a common platform firm.)  It covers principles such as audit, service levels and termination rights and is anchored around the principle that a regulated firm must have in place the necessary mechanisms to ensure continuity of service and no loss of operational control when outsourcing its activities.

Conclusion

Big Data will become more and more prevalent as the amount of data generated in the FS sector increases. It is said that every day we create 2.5 quintillion bytes of data, and that 90% of the data in the world today was created in the last two years alone (IBM 2015). Social media applications alone are said to account for approximately 27% of the Big Data used by the banking and financial markets. In addition, following the financial crisis of 2007-08, there has been an exponential increase in the amount and granularity of data that banks are required to report and disclose to central banks and regulators. This greater record-keeping requirement is part of a global move by regulators to increase the transparency of financial markets and prevent a recurrence of the crisis, when complex over-the-counter derivatives were poorly understood and tracked.

FS companies need to leverage the potential benefits of the vast volume of data they collect to provide better services to their customers in an increasingly competitive digital world.  Failure to do so will lead to loss of market share as customers move to more digitally accessible newcomers.  Just watch out for the legal and commercial potholes along the Big Data highway!
