Company Blog


Why Assets Require Maintenance: A Case for Proactive Data Monitoring

Posted by Alex Olson on 6/29/17 10:50 AM

Many of us are homeowners. We can easily remember the first house we purchased and the day we "closed" on it: all the paperwork we signed, the new debt we took on, and so forth. Most of us treated that house as one of our most important physical assets because we spent more money on it than on any prior purchase. We painted rooms, cleaned the carpets, and kept the appliances in working order. For a moment, imagine you had taken a different approach and done nothing. At first you wouldn't be able to tell the difference, but over time the walls would reveal where the paint had chipped away, the carpet would show dark spots, the doorbell would stop working, and the garage would smell like old garbage. You get the picture. The same decay can happen to your data if it is not properly maintained.

Read More

Topics: Data Quality, Text Analytics

A Simple Explanation of CAT Complexity

Posted by Tony Brownlee on 5/16/17 6:30 AM

The Consolidated Audit Trail (CAT) is coming. With specifications coming out this summer and compliance dates approaching quickly in 2018, broker-dealers are now starting to put plans in place to report and comply with CAT. Some in the industry are calling the CAT "revolutionary," particularly due to the requirements for customer and account information reporting. Put simply, if a trade occurs in the US equity or options markets, the customer and account information related to that trade must be reported. This is the first time that this breadth of customer and account data will be reported across the industry: nearly 1,800 broker-dealers, more than 100 million accounts, and an estimated 50 million individual and institutional customers. Why? To help regulators monitor, understand, and mitigate market-manipulating events across customers, accounts, instruments, and venues.

Read More

Topics: Data Quality, Enterprise Data

Customer and Account Challenges Ahead After CAT Announcement

Posted by Tony Brownlee on 3/7/17 6:30 AM

Like many regulations, the Consolidated Audit Trail (CAT) requirements have been "baking" since the SEC first published them nearly six years ago. As journalists such as WatersTechnology's Dan DeFrancesco are covering, there is a lot of work ahead for the industry, as 2017 marks the year that CAT becomes real. I've personally been working with regulators and industry members on CAT since 2013, and I think we can help firms understand some of the basics about CAT, as well as some challenges that are likely to emerge.

Read More

Topics: Data Quality, Enterprise Data

Four Categories of Data Quality Management

Posted by Tony Brownlee on 10/11/16 6:30 AM

In our time spent with executives throughout the financial services industry, we've uncovered four categories of activity that consume leaders' time and budget for data quality. In particular, given the industry's focus on legal entity data, we are seeing significant effort to improve spend and data governance around these categories of data quality management. Firms that identify data quality issues early and move quickly to fix problems typically achieve greater success with more advanced data management efforts and face fewer questions from regulators. Take a look at the four categories and download the whitepaper for more information on entity data quality.

Read More

Topics: Data Quality, Enterprise Data

Over 90% of Hierarchy Data Problems Fall Into These Categories

Posted by Tony Brownlee on 9/8/16 11:30 AM

You're at a party, striking up a conversation with your friends and colleagues, and what do you talk about?  Sports. Politics. Business. Hierarchy data?  While hierarchy data may not always be the first topic discussed, I've been to a few events with chief data officers where it does come up.  If it comes up at your next cocktail party, I want you to be ready to contribute to the conversation. And if I’m in attendance, I’ll join you in the conversation.

Joking aside, for data professionals, hierarchy data is growing in importance. Sometimes referred to as relationship data, family tree data, or legal or corporate hierarchy, this topic concerns the relationships between legal entities that indicate ownership, control, or influence of one entity over another.

My passion for hierarchy data started around 2003, solving global hierarchy data problems related to issuers of securities across 140 countries for public accounting firms. When the 2008 financial crisis hit, many banking and capital markets institutions and insurance companies started to realize the importance of hierarchy data for risk purposes. Then, as regulations emerged, relationship data became a must-have for regulatory reporting, risk aggregation, capital adequacy, and many other use cases. Now, we're seeing many global companies look to hierarchies for understanding supplier business relationships, analyzing revenue and pricing strategies, and assessing cross-border client relationships.
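To make the idea concrete, here is a minimal sketch of how ownership relationships between legal entities might be modeled and walked up to an ultimate parent. The entity names, the ownership table, and the greater-than-50% control threshold are all illustrative assumptions, not real data or any particular vendor's model:

```python
# Hypothetical ownership links: subsidiary -> (parent, ownership percent).
OWNERSHIP = {
    "Acme Capital LLC": ("Acme Holdings Inc", 100),
    "Acme Holdings Inc": ("Acme Global PLC", 80),
    "Acme Insurance Ltd": ("Acme Global PLC", 55),
}

def ultimate_parent(entity, min_control=50):
    """Follow controlling (> min_control%) ownership links to the top entity."""
    seen = {entity}
    while entity in OWNERSHIP:
        parent, pct = OWNERSHIP[entity]
        if pct <= min_control or parent in seen:  # stop at non-control or cycles
            break
        seen.add(parent)
        entity = parent
    return entity

print(ultimate_parent("Acme Capital LLC"))  # Acme Global PLC
```

Real hierarchy data is far messier (partial ownership, joint ventures, circular stakes), but the core question is the same: who ultimately controls whom?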

Read More

Topics: Data Quality, Enterprise Data

5 Essential Steps to Measuring Entity Data Quality

Posted by Tony Brownlee on 8/15/16 6:30 AM

Use these five steps and reduce your costs associated with data quality.

Take Advantage of New Technology

By using new technology, firms can scan their data and identify problem areas to gain a quick overview of the state of their entity data. New technology can upload data records and perform hundreds of quality scans, covering data completeness, consistency, duplication, and more. You can even profile data attribute by attribute, assessing dozens of dimensions that define quality.

Today, technology can read information from hundreds of sources and, just like a human, identify names, addresses, relationships, and other information. Think about what could happen if your technology can't readily match the right name with the right address.
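The scans described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual checks; the record fields (name, country, LEI) and the rules are assumptions for demonstration:

```python
# Toy entity records with deliberate quality problems:
# one exact duplicate, one missing LEI, one missing name.
records = [
    {"name": "Globex Corp", "country": "US", "lei": "5493001KJTIIGC8Y1R12"},
    {"name": "Globex Corp", "country": "US", "lei": "5493001KJTIIGC8Y1R12"},
    {"name": "Initech Ltd", "country": "UK", "lei": ""},
    {"name": "",            "country": "DE", "lei": "529900T8BM49AURSDO55"},
]

def completeness(recs, field):
    """Share of records with a non-empty value for `field`."""
    return sum(1 for r in recs if r.get(field)) / len(recs)

def duplicates(recs):
    """Records repeating an already-seen (name, lei) key."""
    seen, dupes = set(), []
    for r in recs:
        key = (r["name"], r["lei"])
        if key in seen:
            dupes.append(r)
        seen.add(key)
    return dupes

print(f"name completeness: {completeness(records, 'name'):.0%}")  # 75%
print(f"lei completeness:  {completeness(records, 'lei'):.0%}")   # 75%
print(f"duplicates found:  {len(duplicates(records))}")           # 1
```

Production tools run hundreds of such scans, with fuzzy matching rather than exact keys, but the principle of measuring completeness, consistency, and duplication is the same.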

Read More

Topics: Data Quality, Enterprise Data

Data Quality: Are you a brick inspector?

Posted by Alex Olson on 5/31/16 7:30 AM

In May 1980, a Barnard College student died in New York City due to a piece of terra cotta falling from a building. This terrible situation caused Local Law 10 to be passed in New York City, which was further strengthened by Local Law 11 in 1998 after a similar tragedy. These laws require the façade of each building to be inspected to ensure that the bricks are firmly connected to the structure.

Why are we talking about bricks? As I was walking to one of my client meetings in New York, I observed this process in action. Scaffolding had been erected, individuals were examining individual bricks, and small fixes were being made. Pedestrians on the sidewalks were inconvenienced by the scaffolding. I am not a building engineer, but I asked myself, "Is there a better way?" My mind then turned to my client meeting and data quality. Were the clients like the brick inspector? Were they rebuilding the scaffolding, looking at the façade instead of the root causes? Plodding through records manually instead of using tools to find the problem?

Read More

Topics: Data Quality

Data Quality Requires its Own Strategy

Posted by Jeff Gorball on 5/20/16 8:55 AM

What's your data management strategy (DMS)? Most organizations that have a data management or data governance program recognize that they need one. How about your data quality strategy (DQS)? Surprisingly, many organizations miss this need, or fail to define a DQS, and as a consequence fail to meet their data quality needs.

A DMS defines your data management program: its vision, scope, objectives, and how it will be structured and resourced, among other things. But it omits expectations and guidance related to data quality. That's where the DQS comes in. It should define your strategy for achieving and maintaining the quality levels necessary to ensure that the data under management will, in fact, meet the business needs.

Read More

Topics: Data Quality

Broken Windows and Data Quality

Posted by Tony Brownlee on 4/13/16 5:13 AM

In this age of increased regulatory reporting, many institutions have taken the clear stance that data quality initiatives should be prioritized based upon the data required by regulatory reports and models. While this has helped prioritize data quality initiatives, it also means that many data quality problems are simply deprioritized and left behind.

All of this makes me think of the Broken Windows theory from sociology, which holds that addressing small, visible signs of disorder can change behavior. As James Q. Wilson and George L. Kelling described it in their 1982 article…

Read More

Topics: Data Quality, Enterprise Data
