Sometimes, one good thing leads to another. This is what we experienced as we began developing applications for the cloud. To better take advantage of what this new platform had to offer, we also started moving towards a microservice-based architecture. Right away we noticed how well the microservices pattern fit our agile project approach.
What's your data worth? A few years ago I would hear this question come up fairly regularly in the financial services industry. Over the last few years, though, the topic of "valuing data" has been relatively quiet, as many executives have been more focused on the pressures that new regulations and evolving market risks have placed on their data programs.
Earlier this year the Harvard Business Review published an article titled "Do You Know What Your Company's Data Is Worth?" The authors introduce the concept of the enterprise value of data (EvD) and raise a variety of points, predominantly about how you can value data as an intangible asset or think about the impact data has on the company. They even take it to the extreme of assessing what the business would lose if it didn't have access to its data at all.
A couple of weeks ago, I spent a few days with 500 of my fellow data professionals at the FIMA Europe conference in London. I’ve been going to FIMA events for more than 10 years and it’s been exciting to see this industry evolve. As Peter Serenita, Group CDO of HSBC put it, we can honestly say we’re a profession now.
So what did this gathering of data professionals have to say? For those who missed it, here are three of my top observations, from my data and technology executive point of view.
The EU General Data Protection Regulation (GDPR) was one of the most discussed topics at a recent financial services data conference (FIMA) in London. The discussions mirrored a recent poll that found more than 60 percent of respondents were either still in the planning stage or hadn't started preparing for GDPR as of October 2016. GDPR is such a high-profile conversation and concern because of its impact. Like some regulations before it, such as BCBS 239 or Solvency II, GDPR requires a significant change in how we think about and execute the management of data, and not just data in the EU, but globally.
It is a simple truth: there is more information vying for our time today than ever before, and it will continue to increase every year. In fact, some estimates put the amount of media consumed in the US last year at around 15.5 hours per person per day, or about 96% of our waking hours. Of course, information overload is nothing new to any of us. We are all familiar with the ever-growing torrent of emails, videos, social media, and yes, even blogs, constantly warring for our attention. So how does this concept of information overload relate to the bleeding edge of web development? Let's consider how the web has changed in recent years.
In our time spent with executives throughout the financial services industry, we've uncovered four categories of activity that consume leaders' time and budget for data quality. With such a focus on legal entity data in particular, we are seeing firms significantly sharpen their spend and data governance efforts around these categories of data quality management. Firms that identify data quality issues early and move quickly to fix problems typically achieve greater success with more advanced data management requests, and face fewer questions from regulators. Take a look at the four categories and download the whitepaper for more information on entity data quality.
In a prior post I defined microservices and the advantages they provide over more traditional monolithic application architectures.
At Kingland we see the cloud as a key enabler of the microservice architecture. A microservice architecture lets organizations benefit from an environment that automatically scales, allows services to communicate with one another, and replaces a faulty service without impacting the services connected to it. Three of the most important features of the cloud, as it relates to running microservices, are rapid provisioning, service discovery, and detailed monitoring.
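To make the service discovery and monitoring ideas above concrete, here is a minimal sketch of an in-memory service registry. All of the names (`ServiceRegistry`, `register`, `lookup`) and the instance URLs are illustrative assumptions, not a real platform API; in practice a managed discovery service would play this role.

```python
# Minimal sketch of service discovery: instances register themselves,
# monitoring marks faulty ones unhealthy, and clients look up only
# healthy instances, so a failure doesn't impact dependent services.

class ServiceRegistry:
    def __init__(self):
        # service name -> {instance url: healthy flag}
        self._instances = {}

    def register(self, service, url):
        self._instances.setdefault(service, {})[url] = True

    def mark_unhealthy(self, service, url):
        # Detailed monitoring would flip this flag automatically.
        if url in self._instances.get(service, {}):
            self._instances[service][url] = False

    def lookup(self, service):
        # Callers receive only healthy instances and route around failures.
        return [u for u, ok in self._instances.get(service, {}).items() if ok]

registry = ServiceRegistry()
registry.register("pricing", "http://10.0.0.5:8080")
registry.register("pricing", "http://10.0.0.6:8080")
registry.mark_unhealthy("pricing", "http://10.0.0.5:8080")
print(registry.lookup("pricing"))  # only the healthy instance remains
```

Rapid provisioning fits the same picture: when a new instance is spun up, its first act is to `register` itself so traffic can reach it immediately.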
We've seen continued focus on cognitive, or artificial intelligence (AI) technologies from some of the larger tech companies in the world. This wave of technology is truly revolutionary, but it can be a bit confusing. With new definitions popping up like a frenetic game of buzzword bingo, I prefer this simple explanation used by New York Times reporter Quentin Hardy:
"Cloaked inside terms like deep learning and machine intelligence, AI is essentially a series of advanced statistics-based exercises that review the past to indicate the likely future."
As I've said before, I think everyone should be taking on a cognitive project. As you're working on your plans, here are six tips for understanding this space and narrowing your priorities.
You're at a party, striking up a conversation with your friends and colleagues, and what do you talk about? Sports. Politics. Business. Hierarchy data? While hierarchy data may not always be the first topic discussed, I've been to a few events with chief data officers where it does come up. If it comes up at your next cocktail party, I want you to be ready to contribute to the conversation. And if I’m in attendance, I’ll join you in the conversation.
Joking aside, for data professionals, hierarchy data is growing in importance. Sometimes referred to as relationship data, family tree data, or legal or corporate hierarchy data, this topic is about the relationships between legal entities that indicate ownership, control, or influence of one entity over another.
My passion for hierarchy data started around 2003, solving global hierarchy data problems related to issuers of securities across 140 countries for public accounting firms. As 2008 rolled around and issues hit the financial markets, many banking and capital markets institutions and insurance companies started to realize the importance of hierarchy data for risk purposes. Then, as regulations emerged, relationship data became a must-have for regulatory reporting, risk aggregation, capital adequacy, and many other use cases. Now, we're seeing many global companies look at the importance of hierarchies for understanding supplier business relationships, analyzing revenue and pricing strategies, and assessing cross-border client relationships.
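A minimal sketch of what hierarchy data looks like in practice: ownership relationships stored as child-to-parent links, with a walk up the chain to find the ultimate parent, a common need in risk aggregation and regulatory reporting. The entity names here are invented for illustration.

```python
# Ownership edges stored as child -> parent (invented entities).
parent_of = {
    "Acme Securities Ltd": "Acme Holdings BV",
    "Acme Holdings BV": "Acme Group PLC",
    "Acme Insurance Inc": "Acme Group PLC",
}

def ultimate_parent(entity):
    """Walk child -> parent links until an entity with no parent is reached."""
    seen = set()
    while entity in parent_of:
        if entity in seen:  # guard against cyclic ownership data
            raise ValueError("ownership cycle detected at " + entity)
        seen.add(entity)
        entity = parent_of[entity]
    return entity

print(ultimate_parent("Acme Securities Ltd"))  # → Acme Group PLC
```

Real hierarchy data is messier, with percentage ownership, control versus influence, and effective dates, but even this simple child-to-parent model is enough to aggregate exposures up to a common parent.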
Use these five steps to reduce your costs associated with data quality.
Take Advantage of New Technology
By using new technology, firms can scan their data and identify problem areas to gain a quick overview of the state of their entity data. Firms can upload data records and run hundreds of quality scans covering data completeness, consistency, duplication, and more. You can even examine data attribute by attribute, and assess dozens of the dimensions that define quality.
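As a hedged sketch of two of the scans described above, completeness (required attributes populated) and duplication (the same legal name appearing more than once), here is a simple pass over entity records. The field names, required-attribute list, and records are illustrative assumptions only.

```python
# Toy data quality scan over entity records: flag missing required
# attributes and duplicate legal names. Fields are illustrative.

REQUIRED = ("legal_name", "country", "lei")

def scan(records):
    issues = []
    seen_names = set()
    for i, rec in enumerate(records):
        for field in REQUIRED:
            if not rec.get(field):          # completeness check
                issues.append((i, "missing " + field))
        name = rec.get("legal_name")
        if name in seen_names:              # duplication check
            issues.append((i, "duplicate legal_name"))
        seen_names.add(name)
    return issues

records = [
    {"legal_name": "Acme Ltd", "country": "GB", "lei": "529900EXAMPLE000LEI0"},
    {"legal_name": "Acme Ltd", "country": "GB", "lei": ""},  # dup, no LEI
]
print(scan(records))  # → [(1, 'missing lei'), (1, 'duplicate legal_name')]
```

Production scans cover many more dimensions (validity, timeliness, consistency across sources), but each one follows this same pattern of a rule applied record by record.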
Today, technology can read information from hundreds of sources and identify names, addresses, relationships, and other information, just like a human. Think about what could happen if your technology can't readily match the right name with the right address.
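One simple way such matching works is fuzzy string similarity. Here is a minimal sketch using the standard library's `difflib.SequenceMatcher`; the 0.85 threshold and the entity names are arbitrary assumptions for illustration, and real entity matching would also weigh addresses, identifiers, and context.

```python
# Fuzzy entity-name matching with the stdlib; threshold is an assumption.
from difflib import SequenceMatcher

def same_entity(name_a, name_b, threshold=0.85):
    """Treat two entity names as a likely match above a similarity threshold."""
    ratio = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return ratio >= threshold

print(same_entity("Acme Holdings B.V.", "ACME Holdings BV"))    # → True
print(same_entity("Acme Holdings B.V.", "Zenith Capital LLC"))  # → False
```

Getting this matching wrong in either direction, merging two distinct entities or failing to link the same entity across sources, is exactly the risk the paragraph above warns about.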