Today’s A:360 discusses a few suggested ways to measure the return on investment (ROI) of your analytics initiatives. A common question I receive is, “How do we determine the effectiveness of our analytics efforts?” This podcast presents a few possible ways to answer that very question.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about some ways that you can measure return on investment for your data and analytics initiatives.

As analytics initiatives become much more commonplace in even the smallest organizations, there will always be the question: “How are we evaluating whether or not we are getting a strong analytics ROI – a strong return on our investment, for both time and resources, in our analytics initiatives?”

We’re going to cover a few different ways to measure this. These are good starting points; by no means is this going to be a comprehensive or exhaustive conversation about analytics ROI. However, it should get the conversation started while giving you a few ideas to apply to your own analytics initiatives and ROI calculations.

My starting point for anything related to analytics ROI is going to be, “How can I reduce the time it takes my staff to be able to produce reports or some data-driven analysis?”

Reduce Manual Reporting Efforts

Let’s look at this in the context of a credit union or a bank. I find that for a financial institution with a billion dollars in assets, there are between 4,000 and 6,000 hours (minimum) of work that can be automated through the use of improved reporting and analytics. This would most likely be through the integration of data and/or the automated extraction of data from applications that don’t allow for easy data extraction.

This is what our previous podcasts and articles about data inventories and report inventories are getting at: how much time does it take, on a weekly or monthly basis, to produce those reports?

Once you have that time calculation, you can back into the opportunity cost of your staff manually producing these reports. Take the percentage of their time spent (in a month, let’s say) producing what could be automated reports and multiply it by their compensation and benefits expense. That gives you a number telling you how much it costs to have your staff continue to manually gather data and produce reports. For organizations in the first 18-24 months of their analytics initiatives, this is where you’re probably going to get the biggest bang for your buck and the biggest return on investment. Frankly, you’ll also make a lot of friends in the process, as you’re going to give people a lot of time back per week or per month.
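To make that math concrete, here’s a minimal sketch of the opportunity-cost calculation. All of the figures below (hours, salary) are hypothetical placeholders, not numbers from the podcast:

```python
# Hypothetical opportunity-cost estimate for manual reporting.
# Every figure here is an illustrative assumption, not actual client data.

hours_per_month_on_manual_reports = 40   # time spent on reports that could be automated
total_work_hours_per_month = 160         # roughly four 40-hour weeks
annual_comp_and_benefits = 65_000        # fully loaded compensation for that employee

# Percentage of time spent on automatable work, times compensation expense
share_of_time = hours_per_month_on_manual_reports / total_work_hours_per_month
annual_opportunity_cost = share_of_time * annual_comp_and_benefits

print(f"{share_of_time:.0%} of time -> ${annual_opportunity_cost:,.0f} per year")
# -> 25% of time -> $16,250 per year
```

Sum that figure across every employee with automatable reporting work to get an organization-wide opportunity cost.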

Measuring Access to Data and Analytics

The next way that I would recommend you start looking at analytics ROI is by measuring overall access to data and analytics throughout the whole organization. What we tend to see is that there are pockets of information – silos – where data may or may not be shared with the rest of the organization. This creates very limited insight into the organization as a whole, whether operationally or in the context of analytics.

A way to quickly identify the spread of analytics usage is to measure not only how many individuals can access essential BI portals (usually the front end of your analytics platform) but also how many people are accessing them on a regular basis – especially those in roles where having access to, and consistently using, data is a critical component of success in a more data-driven organization.

This is a point that isn’t necessarily financially-driven. (How many CFOs listening are saying to themselves that they only like financially-based ROI calculations?!) But, as we start to talk about overall utilization of any product that we acquire or implement, we need to consider how well it’s actually being used throughout the organization. If two people are using it out of an organization of five hundred, our analytics product penetration is very low. While not necessarily financially-driven, it is a way to measure the overall impact of your analytics platform and initiatives.
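If you want to put numbers on that penetration point, a simple sketch might look like the following; the user counts are purely illustrative:

```python
# Hypothetical BI-portal penetration metrics; all counts are illustrative.
total_employees = 500
users_with_portal_access = 120
active_users_last_30_days = 60   # logged in at least once in the past month

# Three complementary views of adoption
access_rate = users_with_portal_access / total_employees
active_rate = active_users_last_30_days / total_employees
adoption_of_licensed = active_users_last_30_days / users_with_portal_access

print(f"access: {access_rate:.0%}, active: {active_rate:.0%}, "
      f"active among licensed: {adoption_of_licensed:.0%}")
# -> access: 24%, active: 12%, active among licensed: 50%
```

Tracking these percentages over time shows whether your analytics platform is actually spreading beyond its initial pockets.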

Measures of Self-Service

You’ve heard me say this before if you’ve listened to previous podcasts or read our articles: your analytics program should be centrally-driven and broadly distributed. What that does NOT mean is that your analytics team becomes a series of report writers where they’re, essentially, order takers from business users that need data.

So, another way to measure analytics ROI is to analyze how many reports or dashboard visualizations in your BI portal were created by the analytics team and how many were created by the business users themselves. This becomes a measurement of the self-service capability of your analytics platform. Again, this is not necessarily financially-based from an ROI perspective, but as we look at overall utilization, you really want a platform that enables business users to get data on their own. If your analytics team is required to create every new report and analysis, you’re not going to be able to scale as data needs and analytics requests rise. Therefore, this self-service piece is an integral component of the success of your analytics program and initiatives. This metric directly measures progress toward that objective.
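As a rough sketch of that metric (the report counts below are hypothetical):

```python
# Hypothetical self-service metric: share of BI-portal content authored by
# business users rather than the central analytics team. Counts are illustrative.
reports_by_analytics_team = 180
reports_by_business_users = 60

total_reports = reports_by_analytics_team + reports_by_business_users
self_service_ratio = reports_by_business_users / total_reports

print(f"{self_service_ratio:.0%} of portal content is self-service")
# -> 25% of portal content is self-service
```

A rising self-service ratio over time is a good sign that the platform is scaling beyond the central team.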

Benchmarking Analytics Lift

The last point that I’ll make about measuring analytics ROI – again, this is by no means an exhaustive list but just a starting point for the conversation – is that you can measure analytics ROI and determine the impact of analytics through the benchmarking process.

For example, let’s look at lending. Suppose, before you were able to dive into your underwriting and origination data, you had a 5.5% average yield on your consumer loan portfolio. Now, after investing in analytics and integrating your loan origination and loan servicing data, you find there are opportunities to underwrite loans to lower-credit-quality borrowers while maintaining your same delinquency and charge-off ratios. In this example, you are able to make more money with an equivalent- or reduced-risk portfolio. Instead of a 5.5% yield, you might be at a 6% yield. All of a sudden, you’re making an extra 50 basis points on your loans – directly contributing to the bottom line.
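The arithmetic behind that example, sketched out below; the $200M portfolio balance is an assumed figure for illustration, while the yields come from the example above:

```python
# Yield-lift calculation for the lending example.
# The portfolio balance is an illustrative assumption.
portfolio_balance = 200_000_000
yield_before = 0.055   # 5.5% average consumer-loan yield before analytics
yield_after = 0.060    # 6.0% yield after data-informed underwriting changes

lift_bps = (yield_after - yield_before) * 10_000          # basis points of lift
extra_income = (yield_after - yield_before) * portfolio_balance

print(f"{lift_bps:.0f} bps lift -> ${extra_income:,.0f} additional annual interest income")
# -> 50 bps lift -> $1,000,000 additional annual interest income
```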

That benchmarking comparison of “What did we do before analytics?” versus “What did we do after analytics?” is just one way you can start to show the value. There are going to be a million scenarios where benchmarking applies. Look at your credit card portfolio and the number of transactions. Perhaps a dive into the data helps you develop a gamification-based marketing campaign that emphasizes signature-based debit card transactions over PIN-based transactions, leading to more interchange income on each swipe. Benchmark the “before” (i.e., your control group) against the “after”. Explore the different ways you can use benchmarking to determine the return on investment or the impact of analytics on your operations.

As I’ve said, this is by no means an exhaustive list for measuring analytics ROI, rather it is just a way to get some ideas flowing about how you can measure the impact and return on investment for analytics.

We talked about measuring analytics ROI through:

  • Reducing employee time to create reports, dashboards, and other data-related tasks
  • Measuring access and utilization of analytics and the BI portal
  • Measuring the percentage of reports and analytical efforts created by the business users vs. the analytics team directly
  • Benchmarking the “before analytics” period versus the “after analytics” period

That’s it for today. Thanks again for listening to today’s A:360.


Subscribe to have new content sent directly to your email!

[mc4wp_form]

Photo Credit

Today’s A:360 discusses why it is highly recommended that most organizations take an iterative, phased approach to developing their analytics solution.


Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about the importance of taking an iterative approach to building out your data and analytics program.

By an iterative approach to data and analytics, I really mean your organization should make a focused effort not to bite off everything all at once. Essentially, approach business intelligence or analytics in phases.

I hear a lot of talk about big data and getting to this point of unstructured data where technologies like Hadoop, Hive, and MongoDB are thrown around. Before worrying about those more advanced technologies, take it one step at a time. Crawl, walk, run.

Focus first on building out a data warehouse and identifying those high priority data sources (like a core or other important third-party data sources) that you need to integrate first. In this podcast we’re going to talk about the importance of taking that iterative approach and some of the dangers of going all in and trying to do too much at once as well.

Identifying ROI is something that a lot of executives are rightfully concerned about when it comes to analytics. Let’s assume you start by building out a data warehouse with a core and a few major third-party applications (these could be your loan origination system or a CRM system, for example).

Without investing all of the resources to build out your analytics program up front, a phased approach lets you show incremental value to the more skeptical individuals within your organization. That’s not to say that organizations going all in are necessarily taking a bad approach, but some organizations have to take a step back and prove the ROI at each step of the development process. By taking an iterative, step-by-step approach, you can build up scale and ROI and incrementally show value without having to rely on one big initial burst after months of development and a larger up-front cost.

Taking a phased approach allows you to build out the skills of not only your BI team but the rest of the organization as well.

If you are going for a spoked-wheel model [of analytics] – where you have subject matter experts or power users within each department who are going to be responsible for some of the analytics in that area – you’re going to need time to build up their skills and train them.

Analytics is certainly a learning process. By taking a phased approach to analytics, we can learn from our mistakes during each phase.

If we try to boil everything down to a single phase – especially for those organizations trying to build their data warehouse/analytics platform in-house (I would urge you to take a step back and reconsider whether that’s the best approach) – you’ll likely wish you had adopted an iterative development approach. If you make a critical mistake in the first phase and continue to make that same mistake (because you haven’t broken the project down into phases where you could learn from it), you’re going to create a very difficult rat’s nest to unravel when you discover the mistake later on. As a result, you’ll have to go back through potentially every single phase and make changes.

By taking an iterative approach, you might make a mistake in phase one, but you can correct that mistake within only that phase’s work. Then, when you go on to the next phase, you’ll have learned from the prior phase and will be able to avoid making the same mistake(s).

Analytics is best handled with an iterative, phased approach. Break it down into phases and don’t try to bite off too much at once. This approach allows you to show incremental value, allows you to properly develop and cultivate the necessary skills, and it allows you to correct mistakes that may arise with only minimal issues or rework.

That’s it for today. Thanks again for listening to today’s A:360.



2016 was a great year for analytics throughout many industries. The financial industry seemed to especially embrace analytics and take the necessary first steps towards becoming a more data-driven industry.

Back in the beginning of 2016, I published an article titled “5 Reasons to Invest in Data and Analytics in 2016”. Most (if not all) of the points mentioned are just as true today as they were then. However, in the spirit of continuous improvement, I’ve updated the list for 2017 to reflect the changing industry and analytics landscape.

Without further ado, here are your top five reasons to invest in data analytics in 2017…

1. And you thought you had a lot of data sources last year…!

The volume of data is always worth considering. But what many fail to consider – until it’s too late! – is how disparate your data is becoming. In 2016, you likely added several new data sources to your organization’s data inventory.

How many of these data sources are siloed from the rest of the organization’s data?

As you bring on a new CRM system, LOS, digital banking platform, or whatever new software your organization acquires, consider how that data will be integrated with the rest of the business’s data. Make that a priority during the vendor/software evaluation process instead of an afterthought post-implementation.

2. Your data strategy affects WAY more than just your data.

An organization’s data strategy is:

A comprehensive and actionable foundation for an organization’s ability to harness and leverage data.

Your data strategy, at a bare minimum, must include:

  • A strategy defining your data analytics goals
  • A tactical roadmap describing how you will accomplish the analytics goals outlined above
  • Plans, tactics, and processes to develop analytics skills and create a data-driven culture

But your data strategy doesn’t exist on an island of its own. In fact, your success (or failure) with data strategy will impact so much more than just data.

Is improving your efficiency ratio on your strategic plan for 2017? Better data (read: information) can help with that.

Is developing a digital strategy on your strategic plan for 2017? Don’t you think a strong analytics platform to understand your members and their banking habits would support a digital marketing initiative?

If you take away one thing and one thing only from this article, make it this:

Analytics and a strong data strategy will enable greater success in every single one of your other strategic initiatives.

3. The cost of inaction is (truly) greater than the cost of action.

I wrote a post last year talking about how analytics would soon become a competitive necessity for the financial industry. This is as true today as it was then.

There are still many competitive advantages that can be gained in most markets/regions through a greater maturity with analytics. However, as the analytics maturity of the industry rises, the value of the competitive advantage shrinks.

We are nearing a state where the cost of inaction with analytics is greater than the cost of action.

This is a good news, bad news situation. The bad news? Failure to act quickly will limit the potential competitive advantage you could achieve within your market.

The good news? Vendor competition and growing expertise in the industry have made high-quality analytics platforms drastically more affordable.

Thought you needed to spend millions to get a strong analytics platform? Think again.

4. Efficiency, efficiency, efficiency

I’m still surprised at how this is the best-kept secret about analytics:

A strong data strategy and analytics platform can drastically improve your organization’s efficiency.

What do I mean? Think of how many hundreds (more than likely, thousands) of hours your staff spends manually merging data from various sources, importing into a spreadsheet, producing charts and graphs, etc.

The average financial institution we worked with last year (for scale comparison: approx. 200-400 employees and 10-20 branches) had well over 3 FTEs’ worth of work that could be automated through the proposed analytics solution. At the $500M+ peer group average of roughly $77,000 in compensation and benefits expense per FTE, this automation amounts to, at a minimum, a $231,000-per-year opportunity cost savings.
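That arithmetic, spelled out:

```python
# Reproducing the FTE savings arithmetic from the paragraph above.
automatable_ftes = 3        # full-time-equivalents of automatable manual work
cost_per_fte = 77_000       # peer-group comp & benefits expense per FTE

annual_savings = automatable_ftes * cost_per_fte
print(f"${annual_savings:,} per year in opportunity cost savings")
# -> $231,000 per year in opportunity cost savings
```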

And that’s just scratching the surface on the easy stuff that we could identify during a few days onsite visiting with a client.

Want to improve your organization’s efficiency? Ask how data analytics can help.

5. The industry has some experience with analytics now

Let’s be honest, financial institutions tend to be pretty risk averse. Especially when it comes to new technologies or ideas, the “wait and see” approach is more often than not the method taken. By now, however, there have been enough organizations that have taken the leap into the world of analytics that this isn’t a “new” idea. There are some great use cases from financial institutions across all asset classes on how they’ve had success.

There are also some stories of how analytics hasn’t worked.

This isn’t all bad.

Treat those stories as a blueprint for what to avoid when rolling out your own analytics programs and data strategies. Every experience – whether it ended in success or failure – is an opportunity for the industry to continue to grow its analytics competency and maturity.

If you belong to a more risk-averse organization, I’d suggest reaching out to colleagues at other organizations that have started to leverage analytics. You might be surprised to find out that it isn’t quite as scary as some think.

Still uncertain?

Take a look at an article we wrote last September titled “What’s Holding You Back from Being Data-Driven?”. In that post, I explored some of the common misconceptions around data and analytics that tend to restrain organizations. It’s worth a look to see if some simple clarification could resolve lingering concerns.

What’s in store for 2017?

2017 is going to be a big year for the financial industry. From digital marketing innovations to improved data analytics maturity, this should be a fun year.

Want to stay updated on any articles and new content we publish? (Don’t worry, you’ll only get 1 or 2 emails with new content per week at most!)

Fill out the form below to subscribe!

[mc4wp_form]


In my last post, I talked about some of the defining traits of an organization at the lowest, first, stage of the “Analytics Maturity Curve”. The second stage of the “Analytics Maturity Curve” is what Tom Davenport refers to as the “Localized Analytics” stage.

Common Traits of This Stage

The “Localized Analytics” stage is one that, based on my estimates, roughly 1 in 4 financial institutions have reached. In this stage, any data analytics efforts are siloed within a small number of departments or subject areas. For banks and credit unions, your marketing department – especially a digital marketing area – or lending might be driving your organization to this second stage of the “Analytics Maturity Curve”.

Questions that are asked by these pockets of analytics include:

  • Why were last month’s results what they were?
  • How can we use data analytics to improve [EFFORT/INITIATIVE/PROCESS]?
  • What data can we use to better understand this [EFFORT/INITIATIVE/PROCESS]?

Notice that each question is not phrased as a simple backwards-looking assessment. Instead, the questions ask how data can improve, or build a better understanding of, some process or effort. The questions naturally become multi-dimensional this way and generally start to grow the data-driven mindset.

Unfortunately, organizations at this stage do not have complete buy-in on data analytics from management or the enterprise as a whole. These pockets of analytics show great interest and open up exciting new opportunities for the organization. Stop and ask yourself this:

Is your department a pocket of analytics, or is it a detractor from the organization’s analytics maturity? Either way, why?

Technology Used at This Stage

In the second stage, the “Localized Analytics” stage of the “Analytics Maturity Curve”, a data warehouse or any enterprise analytics platform is non-existent. The most advanced of the stage two organizations may have an independent data mart in place to support local analytics. However, all technology is siloed and not shared across department lines.

How many reporting tools or applications does your organization have that only have users from a single department?

If you rely on separate, application-specific reporting solutions for your lending area, your core, your marketing department, or any other area, you are certainly no further than stage two of the “Analytics Maturity Curve”.

Where’s the Good News?

There is plenty of good news for organizations at this stage of the “Analytics Maturity Curve”. First, you are clearly progressing up from the “Analytics Impaired” stage. Progress is being made and the right steps are being taken. While this isn’t a “full steam ahead” approach to analytics, organizations at stage two of the curve are slowly building up experience with and pockets of analytics.

The lessons each pocket of localized analytics learns can be applied to growing your organization’s analytics maturity as you progress up the curve.

Is your organization at stage two? Here are some parting comments worth remembering:

Prove the value of analytics – even if it is limited to a handful of areas.

Educate your senior management on the successes and/or failures your area has had with localized analytics. Lessons learned – from both the good and the bad – will prove beneficial.

Plant seeds by asking the right question. If someone talks about last month’s numbers, ask them WHY the numbers are what they are. Challenge your colleagues to think deeper about data.

Stay tuned next week for a discussion of organizations at the third stage of the “Analytics Maturity Curve”!



Where does your organization lie on the analytics maturity curve?

This question is functionally equivalent to:

  • What is the state of data analytics at your organization?
  • How do we compare to others in the areas of business intelligence, data, and analytics?
  • With regards to analytics, what the heck are we doing right and wrong?

If Google, Facebook, Uber, Amazon, and Netflix are at the top of the “analytics maturity curve”, where does your organization lie?

Over the next several posts, I will be describing the various stages of the analytics maturity curve. I’ll discuss the distinctive features of the stage, the technology employed at the stage, the types of questions asked, and the general skills required at each stage of analytics maturity.

Arguably the seminal work when it comes to discussing an organization’s analytics maturity, Tom Davenport’s Competing on Analytics: The New Science of Winning provides the foundation for much of the content in the coming posts. I highly recommend Davenport’s book as a fantastic resource for all things analytics.

Stage 1: Analytically Impaired

Stage one. The floor of the analytics maturity curve. You’d better act quickly if you expect your organization to survive long at this level of analytics maturity.

Why? Because you are driving with a blindfold on at this stage.

Enterprise analytics – even quality operational reporting – is borderline non-existent. Excel is your greatest analytics tool and you struggle with gaining anything beyond limited insight into your operations.

At best, you might be asking vague questions like “what happened last month?” None of your questions align with the organization’s KPIs – in fact, you might not even have any defined KPIs! Answering even simple questions about the state of your business is a challenge littered with inconsistency and uncertainty. If I asked “how many customers/members do you have?” to several different areas of the organization, I would, undoubtedly, get multiple answers back.

Your primary source of analytics is pre-built, standard reports from application-specific reporting solutions. Data integration is likely only a figment of your imagination, making it nearly impossible to gain a 360-degree view of your business, its customers/members, and their interactions.

At this stage, your organization’s “analytics” hinges on how many times your VLOOKUPs break in your Excel files. Skills in SQL, ETL development, data architecture, or data visualization are non-existent or simply unutilized.

The good news? There is nowhere to go but up. And, to all you CFOs reading this, an organization at this stage has one of the strongest opportunities to yield a near-immediate ROI. Why? Analytics, especially for Stage 1 organizations, represents an untapped oil field of new opportunities and efficiencies (just read a previous post “5 Reasons to Invest in Data and Analytics in 2016” to learn more about these opportunities).

How can you start to move out of the “Analytically Impaired” stage and into the second stage of the analytics maturity curve? I’d strongly urge you to take a step back and honestly assess the current state of analytics at your organization. Create a data strategy and develop a plan for how you can navigate your way up the analytics maturity curve.

Stay tuned for our next post to learn about what analytics looks like in an organization at the second stage of the analytics maturity curve.



Today’s A:360 discusses some of the common pushbacks I often hear surrounding data, analytics, and becoming a data-driven organization. In this podcast, I’ll dispel some of these common pushbacks and explain how you can overcome these misconceptions and challenges.


Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about the common reasons and common misconceptions that are holding you and your organization back from becoming data-driven.

One of the most common misconceptions I hear about becoming data-driven is that it costs too much money. I hear all the time that building a data warehouse costs millions and millions of dollars. But that’s simply not true for the community banks and credit unions that are in that mid-market in terms of business size.

If you’re a large company (Fortune 500, Fortune 1000, or somewhere in that scope or scale), then it may well cost you that much. However, for mid-market organizations (most credit unions and community banks), you’re not coming anywhere near that number for the initial buildout. It’s simply not true that becoming data-driven costs too much money. Especially for a data warehouse, there are companies out there (one that we partner with for credit unions is called OnApproach) that have pre-built data warehouse solutions. With them, you’re not building from scratch, and that can save you a tremendous amount of money in the long run.

The next common misconception that I hear is that data warehouses and analytics projects are way too big of a project. And that’s true – if you try to do everything at once.

Would you implement a new loan origination system, a new core, a new online banking platform and merge with another organization all at the same time? Probably not. And, if you did, those would be extremely large projects to tackle at once and would probably be unbearable. Data warehouses are unbearable if you do too much at once. However, if you break it down into phases and break it down to an iterative process, building a data warehouse or building an analytics platform is not too big of a project. It simply has to be approached the right way.

Another common misconception is when people tell me there isn’t enough time to focus on analytics. They’re too busy. They can’t do it.

The average billion-dollar credit union that we work with has between 45 and 65 third-party applications. And, because of the disparate nature of this data, what that means is that there are thousands and thousands of man-hours that could be automated with a better data and analytics platform.

So, if we could get you back 5,000 hours (which carries a significant time cost), would there be enough time then? Or think about the value of having those people get that time back. Whenever I hear that there isn’t enough time to focus on analytics, that is either a misconception or [a sign that analytics is] a lower priority.
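To put a rough dollar figure on those hours, here’s a quick sketch; the loaded hourly cost is an assumed placeholder, not a figure from the podcast:

```python
# Hypothetical value of reclaiming automatable hours.
# The hourly cost below is an illustrative assumption.
automatable_hours_per_year = 5_000
loaded_hourly_cost = 35      # assumed compensation + benefits per staff hour

annual_time_cost = automatable_hours_per_year * loaded_hourly_cost
print(f"${annual_time_cost:,} per year tied up in automatable work")
# -> $175,000 per year tied up in automatable work
```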

There is enough time. In fact, you will get more time if you invest in these projects.

This next one is probably my least favorite excuse. When discussing data and analytics, people often say, “The way that we’ve always done it has worked, so why would we need to change? Why bother? Why should I invest in this data and analytics stuff? We’ve been doing just fine.”

In the past, that logic might have prevailed. But look how much the financial industry’s landscape has changed in just the past five years. You have peer-to-peer lenders. You have peer-to-peer payment apps like Venmo. You have 100% digital banks. You have a very different landscape, and the way those non-traditional competitors succeed is through an investment in analytics. And they have a lot less data about your customers than you do!

Saying “The way we have always done it works” is not going to hold true in a constantly changing environment. Being innovative and adaptive requires data and analytics.

The last common misconception or excuse that ends up holding organizations back from becoming data-driven is when they say, “We don’t have the right culture. We don’t have a culture for using data, so why would we invest?”

That’s sort of a tautology, isn’t it? If you don’t have a data-driven culture, then you don’t have a data-driven culture. Right? It’s obvious. But like anything else, you have to develop that competency. You don’t just step into a car and automatically know how to drive. You have to learn. Therefore, as part of your analytics project, you have to take the right time to promote and grow a culture that supports, trusts, believes in and uses data and analytics. (We have a couple posts and podcasts that talk about that.)

Regardless, that cultural shift is key. Using the statement “We don’t have the right culture for data and analytics” as an excuse not to invest in analytics presents a natural paradox: you have to grow that culture in order to make use of analytics, and every organization goes through this challenge. Nobody has an intrinsically data-driven organization unless they built it from the ground up when starting the business (see Uber, Netflix, etc.). Growing the culture is part of your analytics project, and you too can build the proper culture to successfully deploy and leverage analytics.

Again, these are the common misconceptions or excuses that I hear for why organizations are not data-driven and what’s holding them back:

  • They say that becoming data-driven costs too much money. False.
  • They say that data warehouses are way too big of a project. False.
  • I hear that there isn’t enough time to spend on analytics. Surprise…also false!
  • My least favorite: “The way we’ve always done it works.” Well, in the past, yes. In the future, probably not.
  • They will say, “We don’t have the right culture”. You can have the right culture. You just have to build it. It is not going to inherently exist.

That’s it for today. Thanks again for tuning in to today’s A:360.

Subscribe to have new content sent directly to your email!



For starters, yes, the title is a terrible play on “when life gives you lemons, make lemonade”.

Bad jokes aside, I hear too frequently how organizations need more and more data. I’m a data guy – I’m all about data. But there is a subtle difference between having more data and having more information.

Below is one of my favorite quotes about data:

Without data, you’re just another person with an opinion. – W. Edwards Deming

Opinions are the foundation of subjectivity, and subjectivity is fundamentally devoid of data as support. Decisions driven by opinion without data are counter to everything The Knowlton Group (and I personally) stand for. Our primary mission is to enable every organization to become data-driven.

But as much as I like Deming’s quote, I also love this quote from a 2015 Forbes article:

“Without an opinion, you’re just another person with data” – Milo Jones and Philippe Silberzahn

The converse of Deming’s quote is equally accurate. Having data without an opinion or interpretation of that data is as bad as forming opinions without any data to back you up. With all of this data out there, it is foolish to believe that data will tell us what we need to do. All data requires interpretation and opinions to be formed before it can be practically applied.

The Proper Way to Use Data

Though this section is a gross oversimplification, it boils down the proper way to use data into a few simple steps. Following this process will enable you to turn data into information.

1. Ask a Question

Every actionable use of data must start with a question. These questions can be relatively simple, like “How many members do we have?” (unless there is no consistent definition of a member!).

The questions you ask can also be more complicated, like “Do we need a new branch?” Regardless of what the question is, properly leveraging data and analytics requires asking a question to which you hope to discover an answer.

2. Form a Hypothesis

Like the null hypothesis in statistics, I strongly believe that you must start with a hypothesis. This is where your opinion and subjectivity can come into play. Use this hypothesis as the benchmark against which you test your analysis. Be sure that confirmation bias doesn’t creep into your analysis. If your hypothesis turns out to be incorrect, who cares!?

“I have not failed. I’ve just found 10,000 ways that won’t work” – Thomas Edison

3. Test your Hypothesis

With a question asked and a hypothesis formed, now you can begin to discover an answer to your question and test your hypothesis. This is where we can start to dig into the data (and the fun really begins). Simpler questions may require you to gather data from a single source. More complicated questions can require analyzing data from multiple data sources. For the more complicated questions, a data warehouse really starts to prove its worth!

Avoid gathering more data than you need and overcomplicating the process. Paralysis by analysis is a real thing. Gather and analyze only the data you need to test your hypothesis. Here are some of my favorite quotes/sayings on avoiding unnecessary complexity to motivate you:

“KISS – Keep it Simple Stupid” – Kelly Johnson, Late Lead Engineer for Lockheed Skunk Works

“Simplicity is the ultimate sophistication” – Leonardo Da Vinci

“Make simple tasks simple!” – Bjarne Stroustrup
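To make steps 1 through 3 concrete, here is a minimal sketch in Python. Suppose the question is “Do e-statement members carry higher deposit balances?” and the hypothesis is that they do. All of the balance figures and the decision threshold below are hypothetical, just enough data to test the hypothesis and nothing more:

```python
from statistics import mean, stdev

# Hypothetical balance samples (in dollars) for the two groups --
# gather only the data you need to test the hypothesis.
estatement_balances = [5200, 6100, 4800, 7300, 5900, 6400]
paper_balances = [3100, 4200, 2800, 3900, 3500, 4100]

def t_statistic(a, b):
    """Welch's t-statistic for two independent samples."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

t = t_statistic(estatement_balances, paper_balances)

# Crude decision rule: |t| > 2 suggests the difference is unlikely to be
# noise. If the hypothesis had failed, that would be a finding too!
verdict = "supported" if abs(t) > 2 else "not supported"
print(f"t = {t:.2f} -> hypothesis {verdict}")
```

The specific test matters far less than the discipline: the hypothesis was written down first, the data gathered was minimal, and the result stands regardless of which answer we were rooting for.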

4. Analyze Findings and Provide an Explanation

This is one of the most overlooked steps of the process. Asking the right question of your data is crucial, but you must provide an explanation or recommendation based on your analysis. Too often I see fancy documents put together with many pages and loads of charts and spreadsheets. In these instances, you’ve written a lot but said little.

Be concise, be clear, and provide an answer (for simple business questions) or a recommendation (for more complicated business questions).

Wrapping Up

Properly using data and analytics is becoming invaluable and, ultimately, necessary for most organizations. To begin developing a data-driven culture, have your staff read this article to learn how to properly turn data into information.

There is A LOT of data out there. The most successful companies (and people) know how to turn that data into information.


Need help turning data into information? Contact us by sending us an email!



I’ve thought more and more about this question in the past few months: can traditional financial institutions (retail banks and credit unions, specifically) become (or, at minimum, act like) technology companies?

A critique of this article might well be, “Who cares whether traditional FIs can be classified as technology companies?” Admittedly, the debate is predicated on an entirely semantic argument, but the question goes much deeper than a superficial definition. Regardless, I’m going to give it a shot.

We first have to come up with a definition of a technology company – no small feat – and determine whether or not banks and credit unions could ever fulfill that definition. Once you’ve finished reading my take on this question, comment and let us know where you stand.

The Definition of a Technology Company

Unsurprisingly, the definition of a technology company is a bit of an enigma. In a recent Inc.com article, four respected technology executives, analysts, and venture capitalists gave their takes:

“You are a technology company if you are in the business of selling technology–if you make money by selling applied scientific knowledge that solves a concrete problem.”

Alex Payne, Co-Founder, Simple

“It’s generally a company whose primary business is selling tech or tech services. A more nuanced definition is a company with tech or tech services as a key part of its business. It’s a hard question.”

Todd Berkowitz, VP of Research, Gartner

“A tech company uses technology to create an unfair advantage in terms of product uniqueness or scale or improved margins. Ask the question: Could this company exist without technology? If the answer is no, it has to be a tech company.”

Greg Bettinelli, Partner, Upfront Ventures

“I think there’s a false dichotomy in the idea that a company either is or is not a tech company. I think it’s possible for a company to be a hybrid if tech is giving it an edge over incumbents.”

Hayley Barna, Venture Partner, First Round Capital

None of these responses fully answers the question. Payne’s response aligns more with a scientific approach to solving problems, a common theme among companies that proclaim they are technology companies. Berkowitz’s more nuanced definition is an interesting thought but is difficult to rationalize unless “technology” is equated to “software” in the context of his definition.

The point is that nearly every definition of a “technology company” is deficient in some way.

Let’s try a different approach where we dig deeper into leading technology companies instead of superficially trying to define them. I’d equate this approach to answering the question of “Who is John Smith” not by superficially replying “he is a 5’11” male who works at ABC corp…” but rather by saying “John Smith is a father of two who enjoys spending time with his family while balancing work and life without sacrificing what is most important to him”. It’s a very different way of approaching the same problem, whereby the essence of John Smith is more descriptive than how he appears.

Instead of trying to apply a simple definition to a clearly more complicated classification, let’s look at the common characteristics of companies we tend to mostly agree are “technology companies”. Perhaps we can create a definition driven around the essence of what makes a technology company successful.

Apple

Apple is one of the most likely candidates for the company that is front-of-mind when you hear “technology company”. What defines Apple is their ability to innovate and challenge the status quo. This innovation is their competitive advantage. Whether they are designing the iPhone or challenging the music industry with iTunes, Apple has shown time and time again that they can effectively change the course of industries.

Creating a bigger iPhone is not innovation – innovation is taking phone, email, messaging, calculators, alarm clocks, and dozens of other applications and technologies and putting them in the palm of your hand with a “smart” phone. Whether it be the personal computer, the smartphone, wearables, or a variety of other ventures Apple invests in, their innovation, not their products, is what makes them so successful.

Alphabet (Google)

Google has become a generic trademark for search engines. How often do you say “I’m going to search for something online using a search engine?” Never. You say “I’ll Google it”.

It’s no secret that Google is much more than just a search engine. Back in October of 2015, Larry Page and Sergey Brin reorganized Google into Alphabet. This holding company structure better enables each business line to grow independently. From the “traditional” Google products and services to YouTube to Google X, Nest, and the other subsidiaries of Alphabet, the freedom to grow and innovate each business line independently is critical to Alphabet’s master plan.

At the heart of each product and service is innovation. Susan Wojcicki, current CEO of YouTube and Google’s 16th employee, penned a great article several years back (pre-Alphabet) titled “The Eight Pillars of Innovation”. She discusses the various ways Google (now Alphabet) continues to be an innovative company despite sprawling operations that span tens of thousands of employees, multiple continents, and an ever-increasing number of products and services. In hindsight, Susan’s article could certainly have foreshadowed the impending reorganization.

Microsoft

Bill Gates is arguably one of the top business people of the 20th century. It takes quite a leader to set out to put “a computer on every desk and in every home,” as Gates did when he co-founded Microsoft back in April of 1975. He has certainly contributed to making that goal a reality.

Considering that the large “computers” of 1975 were primarily used by bigger companies, the thought that he could put one on a desk in every home was arguably crazy. It takes an innovative genius to drive a vision that could have been considered science fiction at the time. While the mid-2000s might have taken away some of Microsoft’s “innovation points”, the rise of Satya Nadella to the CEO position after Ballmer’s departure is bringing innovation back to the forefront of Microsoft’s business model. Azure, HoloLens, the opening up of Microsoft’s development tools, and a variety of other new products, services, and ventures have placed Microsoft back into the conversation of leading innovative companies.

Innovation is what got them to where they were; Nadella is betting on it to bring them back to the top of the technology world.

What Does This Have to Do with Financial Institutions?

Are you noticing the not-so-subtle theme? Every leading technology company is defined not by its software but by its innovation. No one denies that these original technology companies developed great software and technology, but the innovation that drove the technology is what defined them. It’s no surprise that Walter Isaacson’s fantastic book, “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution”, emphasizes the roles of Steve Jobs, Larry Page, and Bill Gates (Apple, Google (Alphabet), and Microsoft) in making the modern, digital world.

Let’s establish the unstated though, hopefully, obvious point I am trying to make: a “technology company” is defined by its innovation, not by its products and services. The products and services (read: technology) are there simply to support its innovative endeavors.

So how does this apply to banks and credit unions?

The most innovative advances in banking and finance are being developed by companies that are not traditional financial institutions. Companies like Square, SoFi, LendingClub, and Stripe are challenging services that have traditionally been provided by banks and credit unions.

Why are they successful? Because they are challenging the way that things have always been done. In my work helping financial institutions design and implement data strategies, I get to interview dozens of staff from the CEO to the customer and member-facing staff. I cringe when I ask why a current tool is used or process still exists and the response I get is “because we’ve always done it this way”.

The intersection of business and technology has blurred the line of the archaic definition of a technology company. So, to answer the question asked in the first paragraph, “Can banks and credit unions become technology companies?”, my answer is a resounding yes.

The answer is yes, but not by the definition investors use to classify tech companies on the stock market. I answer yes based on the observation that leading technology companies are as successful as they are because of their innovation, not their technology. By my definition, companies like Tesla, Bayer, and Marriott are also technology companies (see this list here for others that would certainly satisfy this definition).

Be Steve Jobs and challenge the status quo. Be Bill Gates and realize a vision so grand that people think you are crazy. Be Sergey Brin and Larry Page and revolutionize the world.

Traditional financial institutions can compete with financial technology startups by being as innovative as they are. The software and technology are but a byproduct of that innovation; embrace an innovative culture within your bank or credit union and you too can change the industry.



In our previous post, we introduced you to the data and analytics management methodology, “centrally driven, broadly distributed”. Having discussed the reasons why a “centrally driven” approach to data and analytics is optimal, this post will dive into the second half of the quote, “broadly distributed”.

“Broadly Distributed” Analytics Takes Down Silos

In many organizations without a robust data and analytics program, analytics is typically managed within silos as discussed in part 1. The reporting and analysis is then rarely distributed to the rest of the organization. Rather, it is kept “close to the chest” of the department that originated the analysis. This creates a “broadly driven, centrally distributed” approach whereby analytics originates in departmental silos and never leaves those silos.

Organizational transparency is critical to a strong data and analytics program. Marketing can no longer operate in a vacuum, without support from and communication with IT, Lending, or Operations (to name a few in a banking context). These interdepartmental efforts create the need to view reporting and analytics across department reporting lines as well.

A “broadly distributed” approach to data and analytics ensures that data can be consumed by all departments in the organization. Of course, security measures must be ensured for particularly sensitive data, but operationally data transparency is of critical importance. If a new lending promotion is being planned, Lending shouldn’t have to jump through hoops to get Marketing data. Similarly, Marketing shouldn’t have to jump through hoops to get data from the Lending team.

Silos are bad.

From “I” to “We”

In organizations where analytics is not front-of-mind, there is no enterprise-wide view of processes, goals, and operations. When data and analytics are “broadly distributed”, staff start to see how their department’s efforts contribute to the organization’s strategic goals.

Enterprise-wide distribution of reporting and analytics enables inefficient processes to be identified early and their negative impacts reduced. For example, if a new lending product is not interfacing with the core properly, a staff member of Lending, IT, Marketing, Operations, or any other department could assist in identifying this process flaw.

More often than not, we tend to find that staff are willing and able to adopt an enterprise level view of their efforts. The challenge often lies in giving them the tools and access to fully embrace a view outside of their own department’s efforts.

A “broadly distributed” data and analytics program enables your employees to no longer think only about their department’s operations but about the enterprise’s operations with the big picture in mind.

Accurate Across Reporting Lines

In part 1 of this post, we described a situation where various members of different departments are asked the same business question, “how many members do we have?”. In a “broadly driven” (instead of “centrally driven”) data and analytics model, each staff member would go to their own data sources to retrieve the “correct” answer. Rarely does each staff member produce the same answer.

A “broadly distributed” approach in conjunction with a “centrally driven” model ensures that each employee retrieves data and analytics from the same single source of truth. By distributing the “centrally driven” repository of data and analytics throughout the organization, accurate and consistent reporting proliferates throughout the company.

Wrapping Up…

Combined with a “centrally driven” solution, a “broadly distributed” model is unquestionably the optimal approach to data and analytics. Eliminating silos, reducing inconsistencies and inaccuracies, and ensuring an organizational mindset are but a few of the positive outcomes of this design. Coupled with the benefits described in our post on a “centrally driven” solution, our hope is that you and your internal teams see the value in approaching enterprise analytics the proper way.


If your team needs any form of assistance in formulating your data and analytics strategy, give us a call at 860-593-7842 or send an email to brewster@knowlton-group.com!

Subscribe below to have our new content delivered directly to your inbox!




In Thomas Davenport’s “Competing on Analytics: The New Science of Winning”, one of the first chapters of the book defines the common attributes of analytically-driven organizations. Davenport discusses that one of the critical aspects to success in analytics revolves around taking an enterprise-level approach to managing data and analytics. He then quotes Harrah’s Entertainment’s management approach to enterprise data and analytics, calling the approach “centrally driven, broadly distributed”.

Focusing on “Centrally Driven”

In this article – part 1 of a two-part post – we will discuss the first half of that quote. Having data and analytics be “centrally driven” is of the utmost importance to the success and sustainability of your analytics program.

Inconsistencies Abound

If you are an executive in any industry, I know for certain you have experienced the following scenario: you ask the same data and analytics question of different departments and you get several different responses.

If you are in the banking industry, ask Marketing, Lending, IT, Finance, Commercial, and Retail Operations how many customers/members you have.

How many different answers do you think you will get?

In most organizations, each department would independently retrieve the information that you asked of them. Marketing might go to their MCIF, IT might go to the core, Finance might look at the GL application – rarely will each department go to a centralized, single source of truth to retrieve the “correct” answer. Since each application is managed separately with different data contained within, there is no way to ensure consistency and accuracy without a central data warehouse or repository.

Without a centralized, enterprise-level platform of data and analytics, it is nearly impossible to ensure consistent, accurate reporting.

Spreadsheet Errors

I love Excel as much as the next guy. Quite a few organizations across nearly every industry live and die by their Excel spreadsheets (investment banking first year associates know this all too well). But how many Excel spreadsheets are 100% correct and error-free? Some studies estimate that nearly 9 in 10 Excel files have errors!

Especially in the banking industry, I see organizations littered with complicated spreadsheets. Analysts will extract Excel files from different data sources and then hope they’ve defined their VLOOKUPs accurately to be able to consolidate the data sets. I’ve personally witnessed high-level, executive reports with wildly inaccurate formulas that effectively rendered the spreadsheets useless. This is, sadly, not an anomaly.

How do you resolve this issue? By taking a centrally driven approach to managing data, users will not be required to consolidate data from various sources or manually define complicated, error-prone formulas. Data will be consolidated and validated, reducing or eliminating the vast majority of these overly complex Excel spreadsheets. Data warehouses combined with BI and data visualization tools (like Tableau, Information Builders, Qlik, and many others) provide an enterprise-level, industry-leading platform for data and analytics.
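To see why scripted, centralized consolidation beats hand-built lookups, consider this minimal Python sketch. The two extracts (a core-system file and an MCIF file, keyed by member number) are entirely hypothetical; the join below does the job a fragile VLOOKUP would otherwise do, but it is explicit about unmatched records instead of silently misaligning them:

```python
# Hypothetical extracts from two separate systems, keyed by member number.
core = {
    1001: {"name": "A. Smith", "balance": 12500},
    1002: {"name": "B. Jones", "balance": 430},
    1003: {"name": "C. Lee", "balance": 9800},
}
mcif = {
    1001: {"segment": "Platinum"},
    1003: {"segment": "Standard"},
    # Member 1002 is missing from the MCIF extract.
}

# A scripted join flags unmatched keys explicitly, where a mistyped
# VLOOKUP range would return #N/A (or, worse, the wrong row) unnoticed.
consolidated = {
    member_id: {**record,
                "segment": mcif.get(member_id, {}).get("segment", "UNMATCHED")}
    for member_id, record in core.items()
}

print(consolidated[1002]["segment"])  # prints "UNMATCHED" -- flagged, not lost
```

Once this logic lives in one validated, central pipeline, every analyst inherits the same consolidation instead of rebuilding (and re-breaking) it in a personal spreadsheet.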

Definitions and Data Governance

Whenever I get the opportunity to speak with a credit union, I always eventually ask the same question: “What is your definition of a member? Does everyone have the same definition as you?” This seems similar to the inconsistencies idea brought up at the beginning of the article, but even with the same data source, there is no assurance that everyone will define a key business term the same way.

Marketing might only focus on members that have not been placed on a “do not contact” list. What about the same individual (i.e., the same unique SSN) who has multiple accounts and member/customer numbers? How are they counted for aggregation purposes? These subtleties are critical to a strong data and analytics program. A cross-departmental team must agree on key definitions that are used throughout the organization. This centrally driven approach to defining key business terms helps ensure accountability and consistency.
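A few lines of Python make the point concrete. The member records below are entirely hypothetical, but the same table yields three different, equally defensible “member counts” depending on which definition you apply:

```python
# Hypothetical member records: one person (same SSN) may hold multiple
# member numbers, and some people sit on a do-not-contact list.
records = [
    {"member_no": 1, "ssn": "111-11-1111", "do_not_contact": False},
    {"member_no": 2, "ssn": "111-11-1111", "do_not_contact": False},  # same person as #1
    {"member_no": 3, "ssn": "222-22-2222", "do_not_contact": True},
    {"member_no": 4, "ssn": "333-33-3333", "do_not_contact": False},
]

# Definition 1: every member number counts.
by_member_number = len(records)

# Definition 2: one count per unique individual (SSN).
by_unique_ssn = len({r["ssn"] for r in records})

# Definition 3: Marketing's view -- contactable individuals only.
marketing_view = len({r["ssn"] for r in records if not r["do_not_contact"]})

# Three answers to the same question, "how many members do we have?"
print(by_member_number, by_unique_ssn, marketing_view)  # prints: 4 3 2
```

None of the three numbers is wrong; they simply answer different questions. That is exactly why a cross-departmental team has to pick one definition and publish it.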

Wrapping up

A centrally driven approach to data and analytics ensures consistently accurate reporting with key business definitions universally understood by everyone in the organization.

In our second part of this post, we talk about why having a “broadly distributed” data and analytics solution is vital.

Check back soon or subscribe to our blog for updates!
