I’ve covered two of the five “V’s” of big data in previous posts — volume and variety. Today, I’m looking at velocity, in terms of both how fast data comes in and how fast it’s now expected to come out in usable forms of information (i.e., in real-time).
Did you know that the New York Stock Exchange receives 1 terabyte of data each day? Or that by the end of 2016, there will be an estimated 18.9 billion network connections — nearly 2.5 for every person on the planet?
With so much volume — and so many internet connections to capture it — the transmission of data is faster than ever. Data flows between machines, networks and humans, through mobile devices, social media channels and business processes, at speeds that were unimaginable just a few years ago.
Think of how quickly we’re able to stream videos and music, or complete credit card transactions at retailers, or examine analytics in real-time for corporate websites. This flow of data is ongoing and massive, and (combined with the other four “V’s”) can be overwhelming if the necessary infrastructure to manage it isn’t in place.
This is especially important for financial institutions. Consider the common uses of big data for banks and credit unions:
- Better manage risk and compliance
- Forecast sales
- Increase account holder engagement and retention
- Improve the customer experience
- Create more strategic marketing campaigns
The above uses are all time-sensitive and need real-time data to deliver optimal results.
Factor in increased pressure from fintechs, P2P lenders and other tech-driven competitors, and the stakes are even higher to get it right.
The good news is that today’s technologies allow for data to be analyzed in real-time, as it’s being generated, without having to be put into databases first. These technologies create a “feedback loop” that allows businesses to deliver data quickly where it’s needed while also generating usage insight from customers and other end-users.
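To make this concrete, here is a minimal sketch of what stream-style analysis looks like: each event is examined the moment it arrives, running statistics are updated on the fly, and anything unusual is flagged immediately — no database load step in between. The event source, field names and alert threshold below are illustrative assumptions, not any particular vendor's API.

```python
# Sketch of real-time stream analysis: process events as they arrive,
# without first loading them into a database. All names and values
# here are hypothetical placeholders.

def transaction_stream():
    """Stand-in for a live feed of card transactions."""
    events = [
        {"account": "A1", "amount": 42.50},
        {"account": "A2", "amount": 980.00},
        {"account": "A1", "amount": 12.99},
    ]
    for event in events:
        yield event  # in production this would block on a live source

def monitor(stream, alert_threshold=500.0):
    """Update running statistics per event and flag outliers immediately."""
    count, total = 0, 0.0
    alerts = []
    for event in stream:
        count += 1
        total += event["amount"]
        if event["amount"] > alert_threshold:
            # The "feedback loop": insight is generated and acted on
            # while the data is still in motion.
            alerts.append(event["account"])
    return {"events": count, "average": total / count, "alerts": alerts}

summary = monitor(transaction_stream())
print(summary)
```

The key design point is that the monitor never waits for a batch to accumulate; each transaction is analyzed in the same moment it is generated, which is what makes uses like fraud alerts and real-time engagement possible.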
Many financial institutions are looking to the cloud in the form of software-as-a-service (SaaS) applications to help them meet the velocity challenge (as well as the other four "V" challenges). SaaS allows financial institutions to collect data through a remote service and avoid overloading their existing infrastructure.
Open source software is another way that banks and credit unions can address the velocity challenge of big data. All they have to do is plug their algorithms and policies into the system and let it handle the increasingly demanding tasks of processing and data analysis.
By now you may have noticed a pattern in my blog posts about the five “V’s” of big data. The answer to the challenges can be found in overlapping solutions: cloud computing and open source software.
Check back in the coming weeks as I explore whether these technologies can also help with the remaining two “V’s”—veracity (validity) and value.
Until then, learn more about big data and its challenges and benefits for financial institutions in the Analytics section of our website.