A few weeks back, at the 2017 Credit Union Analytics Summit hosted by First Tech Federal Credit Union in Redmond, Washington, Brewster Knowlton moderated a panel of five fantastic industry leaders in the analytics space. The panelists included:

  • Naveen Jain, VP Digital Analytics at First Tech Federal Credit Union
  • Clay Yearsley, SVP Data Analytics at Texas Trust Credit Union
  • Harsh Tiwari, CDO at CUNA Mutual
  • Ben Morales, CEO at Q-Cash Financial and CTO at Washington State Employees’ Credit Union
  • Matt Duke, Director of Data Analytics and Product Management at Cisco

If you are an organization (in any industry!) wondering how to get started with analytics, this panel discussion is worth watching. We discuss several topics including:

  • How do you get started with data?
  • How can you identify quick wins and gain momentum for analytics?
  • What are the most crucial points to consider when getting started with data?
  • How do you address and manage the cultural transformation component of analytics initiatives?

Either watch on YouTube or press play below!

Subscribe to have new content sent directly to your email!


Today’s A:360 discusses why it is critical to boil analytics down to well-defined questions. A question is the fundamental building block of analytics. Well-defined questions can shape and simplify the delivery of analytics to an organization. For those business users who aren’t quite sure what data they are looking for, helping them shape a question can be an excellent starting point.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about why success with analytics needs to start with a well-defined question.

I’ve seen a lot of instances where individuals go up to their analytics staff and they’ll ask some very general questions like, “I want to see more data” or “show me some analytics”. That’s like going up to someone and saying, “I want dinner”. Well, what do you want? There’s a lot of choices. The same thing goes for analytics. There has to be that specificity.

The best way to define that specificity for analytics and define what you’re really going for is to frame every analytics idea or every analytics objective in the form of a question.

Ask: What are you trying to accomplish?

If you can get the business users who are requesting information and analytics from you to ask a question, that helps change the context of conversation that you have with them. It’s not about simply producing a report, it becomes about helping them answer a question. This changes the parameters and context in which you gather information and present it. But, in order to do that, you must start with a question.

Oftentimes, people will ask for data and/or reports just so they can try to figure out what they’re looking for. They have an idea, but they don’t quite know how to articulate it in the form of a question. This is where a strong analytics team can really show its strength. It’s not so much in the technology; it’s in helping the business translate what they’re trying to figure out into a well-defined question, and then determining how to go about answering that question.

This may sound overly simplified, but this really is the fundamental starting point for analytics. There’s an article that I wrote that’s called, “When Life Gives You Data, Make Information”. It talks about the distinct difference between data and information. At its core, it really comes down to asking a question.

The difference between data and information is that data is just raw numbers. Information is the actionable intelligence built off of that underlying data. Let’s look at an example from business users in lending. Suppose one of your business users comes to you and says that they need a report of all loan applications in a pending status. They’re really trying to ask a couple of things. For instance, they may be trying to figure out how to improve their close rate, or to answer the question, “Why are so many loan applications falling off before being approved?” They may be trying to figure out how to increase throughput or productivity. My point is, they haven’t really defined a question, and as a result, they’re grasping at straws. They’re looking at all of this data and trying to make sense of it. Helping these users frame a question at the very beginning of the analytics process, not at the middle or the end, helps them target exactly what they’re looking for and allows you, as the analytics individual within your organization, to better provide what they really need.

Again, this seems oversimplified. It’s funny because I’m sitting here recording a four-minute podcast about why it’s important to ask a question with analytics. But it really is something that falls by the wayside, especially as we get inundated with requests. People will just say, “I want data. I want data. I want data”. My suggestion is to take a momentary step back, and, as simple, insignificant, and superficial as it may seem, just ask: “What question are you trying to answer?” That question alone will spark a conversation that I can assure you will improve the process by which you deliver analytics to your organization.

That’s it for today. Thanks again for listening to today’s A:360.

Subscribe to have new content sent directly to your email!


Photo Credit

Have you ever wondered what it takes to enable your organization to become data-driven? For a limited time, I am giving away my “Credit Union Executive’s Guide to Data and Analytics” eBook FOR FREE! To get the eBook, complete the form below and press “Sign Up”:

[mc4wp_form id=”1960″]

eBook Overview

It’s no secret that credit unions are becoming more data-driven organizations. As the number of digital channels grows and the usage of those channels increases, so too does the amount of data available and accessible to those organizations. Throughout this eBook, I will discuss how credit unions can become data-driven organizations, what that really means, and some simple ways to get started on your analytics journey.

What Will I Read About?

First, I’ll help you understand what it means to be a data-driven credit union. I don’t want the term “data-driven” to become just another buzzword. I’ll address what it really means for an organization to use data properly and how organizations can take advantage of data and analytics.

Second, I will address some key definitions and common misconceptions surrounding business intelligence, data, and analytics. For example, big data, my least favorite buzzword, is drastically misused on a regular basis. I will discuss what big data really means, and what people tend to mean when they say the term “big data”.

Next, I’ll discuss the importance of a data strategy. I’m a firm believer that strategy without execution will fail and that execution without strategy will also fail. Business intelligence and analytics require a top-down approach. This means that an organization should start with a strategy and then figure out how to execute that strategy. I will address this concept, and the details associated with a data strategy, further when we get to that section.

Next, I will teach you how to start building your credit union’s data strategy, answering the question: what can you start doing today to begin the discovery process and gather all the relevant information necessary to formulate your credit union’s data strategy?

I will then go over how to gain greater clarity around the technical aspects of building a data and analytics platform. By downloading this eBook, I do not assume you have a technical background. In fact, when I discuss the technology that’s going to be used, I do so from the perspective of someone who does not have much, if any, technical expertise. At the same time, we do need to have a discussion about the technology that your credit union could or should be using in order to maximize your data and analytics knowledge. This will help you become aware and understand what is involved for analytics. It will also better prepare you for conversations with both technical and business people.

Next, I will go over a common question that I often get about analytics: “How do you address data quality?” Data quality issues are something you can never get rid of. There will always be bad data somewhere. The important thing is not to try to have perfect data, but to have a plan for addressing data quality issues when they inevitably arise. Data quality issues will always present themselves as long as there is the possibility of human error (and there always is)! Addressing those issues as they arise is essential so they do not snowball into bigger problems and compound going forward.

Next, I will address how you can build and evolve a data-driven culture. Not properly developing a data-driven culture is one of the most common reasons that data and analytics initiatives fail within credit unions. As a result, I will address what constitutes a data-driven culture, how you can create a data-driven culture, and some techniques that can help you out along the way.

Lastly, I’ll go over the importance of “quick wins” along with something known as the “Flywheel Effect”. For those of you who have read Jim Collins’ book, Good to Great, this “Flywheel Effect” concept should be familiar to you. It has an interesting application when it comes to analytics. Learning how to gain early momentum is critical in implementing a successful and sustainable data strategy and a data and analytics platform.

Complete the form below to have a copy of “The Credit Union Executive’s Guide to Data and Analytics” sent to your inbox!

[mc4wp_form id=”1960″]

More and more financial institutions are investing in developing their own analytics teams. Data warehousing and other modern analytics platforms are becoming the norm and not the exception. As these organizations start to develop their data strategy and implementation roadmap, some of them find that their data is being held hostage.

What do I mean by that?

Let’s assume that you are running a CRM system, for example, that is on-premise. More likely than not, the data for that CRM system is being held in a SQL database. Getting data out of a SQL database is easy in the world of data warehousing and analytics.

Now, let’s assume you are running a loan origination system (LOS) that is a hosted, third-party application. With a handful of exceptions, you will not be able to directly access a SQL database housing this data. However, your analytics team needs to get this data out of the hosted environment. You will likely call up the vendor, and they will give you a quote for how much they will charge to provide you with your data.

Let me repeat that. They will give you a quote for how much they will charge you for YOUR data.

Avoiding Data Hostage Situations

Access to data is typically an afterthought in the product evaluation process for new software acquisition. As more organizations take steps towards becoming data-driven, the need to have easy access to their data will become even more critical than it already is.

Data access, then, should become part of the software evaluation process – a forethought instead of an afterthought.

Most vendors have the means to provide the data to you in a number of different ways. By waiting until after implementation, however, data access becomes an add-on product and/or service as opposed to an included feature of the software acquisition. This is typically where you receive a quote for how much it will cost to have data delivered to your organization.

Some of you may be reading this saying, “but I can access all the data I need from a web portal they’ve provided to me.” In that situation, reports must be manually opened and downloaded if you want to do anything with that data. Your analytics team will need data automatically downloaded or transferred to a specific location on a regular (usually nightly) basis. Access to a reporting portal that requires manual downloading of reports and data is insufficient for a data-driven organization.

Key Point: Negotiate access to raw data at the beginning of the software acquisition process, not after it has been implemented.

Ways Data Can Be Delivered

Most vendors have several ways of automatically delivering data to your organization:

  1. SFTP (Secure File Transfer Protocol) – a secure way to send files to and from vendors. For those vendors that cannot allow direct access to a SQL database, this is usually the most common delivery method.
  2. SQL Replication – some applications (shoutout to MortgageBot) will set up a replicated SQL instance on your network. Put simply, they are putting a copy of the production database on your system for reporting purposes. This is a dream come true for analytics teams that need access to raw data.
  3. Physical Copies of Database Backups – some vendors are able to send you a physical copy (i.e., an encrypted external hard drive) that contains a copy of a SQL database up to a certain point in time. Then, they can SFTP over backups and/or log files that update the database. A hybrid of the first two options, this solution involves a bit more work, but it is still a viable option.
  4. API – as credit unions and community banks start to build their own development teams, APIs are becoming more commonplace. Think of this as the language through which a development team and an application could communicate. Depending on how open the API is, this may be a sufficient option to gather the raw data required by your analytics team.
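For the SFTP route, the nightly pull is usually just a scheduled script. As a rough sketch (the host, paths, and file-naming convention below are hypothetical — every vendor names its extracts differently), here is how a batch `sftp` pull for one day’s extract might be assembled in Python:

```python
from datetime import date

def nightly_sftp_pull(host: str, user: str, remote_dir: str, local_dir: str,
                      file_prefix: str, run_date: date) -> tuple[list[str], str]:
    """Build the OpenSSH sftp invocation and batch script that pull one
    day's vendor extract. Assumes key-based authentication is set up."""
    # Hypothetical naming convention: <prefix>_YYYYMMDD.csv
    remote_file = f"{remote_dir}/{file_prefix}_{run_date:%Y%m%d}.csv"
    # "-b -" tells sftp to read batch commands from stdin; in practice you
    # would pass `batch` as the `input` to subprocess.run in a cron job.
    command = ["sftp", "-b", "-", f"{user}@{host}"]
    batch = f"get {remote_file} {local_dir}/"
    return command, batch
```

A scheduler (cron, Windows Task Scheduler, or your ETL tool) would run this each night and then hand the downloaded file to the load process.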

There are a few other delivery methods, but, for the most part, they are derivations of the methods already mentioned.

As you are negotiating or re-negotiating with your vendors, make the conversation about data access and delivery a priority. Some of the most successful financial institutions are achieving their success through their increased use of data analytics.

Avoid having your data held hostage and make data access a priority in all software evaluation processes.

Subscribe to have new content sent directly to your email!


Photo Credit

Today’s A:360 discusses a few suggested ways to measure the return on investment (ROI) for your analytics initiatives. A common question I receive is “how do we determine the effectiveness of our analytics efforts?”. This podcast’s intent is to present a few possible ways to answer that very question about measuring analytics ROI.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about some ways that you can measure return on investment for your data and analytics initiatives.

As analytics initiatives become much more commonplace in even the smallest organizations, there will always be the question: “How are we evaluating whether or not we are getting a strong analytics ROI – a strong return on our investment, for both time and resources, in our analytics initiatives?”

We’re going to cover a few different ways to measure this. These are good starting points; by no means is this going to be a comprehensive or exhaustive conversation about analytics ROI. However, it should get the conversation started while giving you a few ideas to apply to your own analytics initiatives and ROI calculations.

My starting point for anything related to analytics ROI is going to be, “How can I reduce the time it takes my staff to be able to produce reports or some data-driven analysis?”

Reduce Manual Reporting Efforts

I find that (we will look at this in the context of a credit union or a bank) for a financial institution with a billion dollars in assets, there are between 4,000 and 6,000 hours (minimum) that can be automated through the use of improved reporting and analytics. This would most likely be through the integration of data and/or the automated extraction of data from applications that don’t allow for easy data extraction.

This is what our previous podcasts and articles talking about data inventories or report inventories are getting at. That is, how much time is it taking on a monthly or weekly basis to produce those reports?

Once you have that time calculation, you can back into the opportunity cost of your staff manually producing these reports. You can do this by taking the percentage of their time spent (in a month, let’s say) producing what could be automated reports, multiplied by their compensation and benefits expense. Then, you get a number telling you how much it costs for your staff to continue manually gathering data and producing reports. For organizations in the first 18-24 months of their analytics initiatives, this is where you’re probably going to get the biggest bang for your buck and the biggest return on investment. Frankly, you’ll also make a lot of friends in the process, as you’re going to give people a lot of time back per week or per month.
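That back-of-the-envelope calculation can be sketched in a few lines of Python (the hours and compensation figures below are purely illustrative — substitute your own report-inventory and payroll numbers):

```python
def reporting_opportunity_cost(manual_hours_per_month: float,
                               working_hours_per_month: float,
                               annual_comp_and_benefits: float) -> float:
    """Annual opportunity cost of one employee's manual report production:
    share of time spent on automatable reporting x total compensation."""
    share_of_time = manual_hours_per_month / working_hours_per_month
    return share_of_time * annual_comp_and_benefits

# e.g. 40 hours/month of manual reporting out of ~160 working hours, for an
# employee with $80,000 in annual compensation and benefits expense
cost = reporting_opportunity_cost(40, 160, 80_000)  # → 20,000.0 per year
```

Summing this across every employee identified in your report inventory gives the organization-wide figure.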

Measuring Access to Data and Analytics

The next way that I would recommend you start looking at analytics ROI is by measuring the overall access to data and analytics throughout the whole organization. What we tend to see is that there are pockets of information – silos – throughout an organization where data may or may not be shared with the rest of the organization. This creates very limited insight into the organization as a whole, whether operationally or in the context of analytics.

A way to quickly identify the spread of analytics usage is to measure not only how many individuals can access essential BI portals (usually the front end of your analytics platform) but also how many people are accessing them on a regular basis – especially those in roles where having access to, and consistently using, data is a critical component of success in a more data-driven organization.

This is a point that isn’t necessarily financially-driven. (How many CFOs listening are saying to themselves that you only like financially-based ROI calculations?!) But, as we start to talk about overall utilization of any product that we acquire or implement, we need to consider how well it’s actually being used throughout the organization. If two people are using it out of an organization of five hundred, our analytics product penetration is very low. While not necessarily financially-driven, it is a way to measure the overall impact of your analytics platform and initiatives.
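Those two utilization measures can be sketched simply (the counts would come from your BI tool’s usage logs, and the definition of “active” – say, logged in within the last 30 days – is up to you; the 500-person organization is just the example above):

```python
def portal_utilization(active_users: int, licensed_users: int,
                       headcount: int) -> dict:
    """Two non-financial ROI measures for a BI portal: how broadly access
    has been granted, and how much of that access is actually used."""
    return {
        "adoption": licensed_users / headcount,       # who *can* access it
        "engagement": active_users / licensed_users,  # who actually does
    }

# e.g. 2 regular users out of 100 licensed seats in a 500-person organization
metrics = portal_utilization(active_users=2, licensed_users=100, headcount=500)
```

Tracking both numbers over time shows whether analytics is actually spreading beyond the original silos.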

Measures of Self-Service

You’ve heard me say this before if you’ve listened to previous podcasts or read our articles: your analytics program should be centrally-driven and broadly distributed. What that does NOT mean is that your analytics team becomes a series of report writers where they’re, essentially, order takers from business users that need data.

So, another way to measure analytics ROI is to analyze how many reports or dashboard visualizations in your BI portal have been created by the analytics team and how many have been created by the business users themselves. This becomes a measurement of the self-service capability of your analytics platform. Again, this is not necessarily financially-based from an ROI perspective, but as we look at overall utilization, you really want a platform that enables the business users to get data on their own. If your analytics team is required to create all the new reports and analysis, you’re not going to be able to scale as data needs and analytics requests rise. Therefore, this self-service piece is an integral component of the success of your analytics program, and this metric directly measures progress toward that objective.
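Measuring that split is straightforward once your BI portal exposes report authorship metadata (most do); a minimal sketch:

```python
def self_service_ratio(authored_by_business: int,
                       authored_by_analytics: int) -> float:
    """Share of BI content created by business users rather than the central
    analytics team. A rising ratio suggests self-service is taking hold."""
    total = authored_by_business + authored_by_analytics
    return authored_by_business / total if total else 0.0

# e.g. 30 of 100 reports in the portal were built by business users
ratio = self_service_ratio(authored_by_business=30, authored_by_analytics=70)
```

There is no universal target, but watching the trend quarter over quarter tells you whether the “centrally-driven, broadly distributed” model is working.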

Benchmarking Analytics Lift

The last point that I’ll make about measuring analytics ROI – again, this is by no means an exhaustive list but just a starting point for the conversation – is that you can measure analytics ROI and determine the impact of analytics through the benchmarking process.

For example, let’s look at lending. Suppose, before you were able to dive into your underwriting and origination data, you had a 5.5% average yield on your consumer loan portfolio. Now, after investing in analytics and integrating your loan origination data and loan servicing data, you found there were opportunities to underwrite loans to lower credit quality borrowers while maintaining your same delinquency and chargeoff ratios. In this example, you are able to make more money with an equivalent-risk or reduced-risk portfolio. After all this, instead of a 5.5% yield, you might be at a 6% yield. All of a sudden, you’re making an extra 50 basis points on loans – directly contributing to the bottom line.

That benchmarking comparison of “What did we do before analytics?” versus “What did we do after analytics?” is just one way that you can start to show the value. There are going to be a million scenarios where benchmarking applies. Look at your credit card portfolio and the number of transactions. Perhaps a dive into the data helps you develop a gamification-based marketing campaign that emphasizes signature-based debit card transactions as opposed to PIN-based transactions. This would lead to more interchange income on each swipe. Benchmark the “before” (i.e., your control group) to the “after”. Try to explore the different ways that you can use benchmarking to determine the return on investment or the impact of analytics on your operations.
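Translating a benchmarked yield lift into dollars is simple arithmetic (the $200M portfolio balance below is hypothetical — plug in your own):

```python
def yield_lift_income(portfolio_balance: float,
                      yield_before: float,
                      yield_after: float) -> float:
    """Additional annual interest income attributable to a yield lift,
    assuming credit quality (delinquency/charge-offs) is held constant."""
    return portfolio_balance * (yield_after - yield_before)

# 50 basis points of lift (5.5% -> 6.0%) on a $200M consumer loan portfolio
extra_income = yield_lift_income(200_000_000, 0.055, 0.060)  # ≈ $1,000,000/year
```

That dollar figure, set against the cost of the analytics investment that produced the lift, is a directly financial ROI number a CFO will recognize.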

As I’ve said, this is by no means an exhaustive list for measuring analytics ROI, rather it is just a way to get some ideas flowing about how you can measure the impact and return on investment for analytics.

We talked about measuring analytics ROI through:

  • Reducing employee time to create reports, dashboards, and other data-related tasks
  • Measuring access and utilization of analytics and the BI portal
  • Measuring the percentage of reports and analytical efforts created by the business users vs. the analytics team directly
  • Benchmarking the “before analytics” period versus the “after analytics” period

That’s it for today. Thanks again for listening to today’s A:360.

Subscribe to have new content sent directly to your email!


Photo Credit

Today’s A:360 discusses why it is highly recommended that most organizations take an iterative, phased approach to developing their analytics solution.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about the importance of taking an iterative approach to building out your data and analytics program.

By an iterative approach to data and analytics, I really mean your organization should make a focused effort not to try to tackle and bite off everything all at once. Essentially, approach business intelligence or analytics in phases.

I hear a lot of talk about big data and getting to this point of unstructured data where technologies like Hadoop, Hive, and MongoDB are thrown around. Before worrying about those more advanced technologies, take it one step at a time. Crawl, walk, run.

Focus first on building out a data warehouse and identifying those high priority data sources (like a core or other important third-party data sources) that you need to integrate first. In this podcast we’re going to talk about the importance of taking that iterative approach and some of the dangers of going all in and trying to do too much at once as well.

Identifying ROI is something that a lot of executives are rightfully concerned about when it comes to analytics. Let’s assume you start by building out a data warehouse with a core and a few major third-party applications (these could be your loan origination system or a CRM system, for example).

Without investing all of the resources building out your analytics program up front, you can show incremental value to the more skeptical individuals within your organization with a phased approach. That’s not to say that those organizations that are going all in are taking a bad approach necessarily, but there are some organizations that have to take a step back and prove that ROI at each step of the development process. By taking an iterative, step by step approach to this, you can actually start to build up that scale, build up the ROI and start to incrementally show value without having to rely on one big initial burst after months of development and a larger up-front cost.

Taking a phased approach allows you to build out the skills of not only your BI team but the rest of the organization as well.

If you are going for a spoked-wheel model [of analytics] – where you have subject matter experts or power users within each department who are going to be responsible for some of the analytics in that area – you’re going to need time to build up their skills and train them.

Analytics is certainly a learning process. By taking a phased approach to analytics, we can learn from our mistakes during each phase.

If you try to boil everything down to a single phase – especially if you are trying to build your data warehouse/analytics platform in-house (I would probably urge you to take a step back and reconsider whether that’s the best approach) – you’ll likely wish you had adopted an iterative development approach. If you make a critical mistake in the first phase and continue to make that same mistake (because you haven’t broken the project down into phases where you can learn from it), you’re going to create a very difficult rat’s nest to unravel when you discover the mistake later on. As a result, you’ll have to go back through potentially every single phase and make changes.

By taking an iterative approach, you might make a mistake in phase one, but you can correct that mistake for only that phase’s work. Then, when you go onto the next phase, you’ll have learned from the prior phase and will be able to avoid making the same mistake(s).

Analytics is best handled with an iterative, phased approach. Break it down into phases and don’t try to bite off too much at once. This approach allows you to show incremental value, allows you to properly develop and cultivate the necessary skills, and it allows you to correct mistakes that may arise with only minimal issues or rework.

That’s it for today. Thanks again for listening to today’s A:360.

Subscribe to have new content sent directly to your email!


Photo Credit

I had the great pleasure of joining John Best of the Best Innovation Group for a discussion on the BigCast podcast.

During this conversation we cover a few major topics:

  • The six characteristics of a data-driven organization
  • Using data warehouses to improve the question-and-answer process while analyzing data
  • Why data warehouses can sometimes be difficult
  • The future of data warehousing
  • Prescriptive data warehouses
  • Why “purpose” is the key to successful data integrations

Click here to listen to the BigCast episode and enjoy! Like what you heard? Subscribe to BigCast on iTunes (and don’t forget to subscribe to The Knowlton Group’s A:360 podcast while you’re there!)

Want to get our new podcasts, articles, and content sent directly to you? Fill out the form below to subscribe!

Fill out my online form.

Photo Credit

2016 was a great year for analytics throughout many industries. The financial industry seemed to especially embrace analytics and take the necessary first steps towards becoming a more data-driven industry.

Back in the beginning of 2016, I published an article titled “5 Reasons to Invest in Data and Analytics in 2016”. Most (if not all) of the points mentioned are just as true today as they were then. However, in the spirit of continuous improvement, I’ve updated the list for 2017 to reflect the changing industry and analytics landscape.

Without further ado, here are your top five reasons to invest in data analytics in 2017…

1. And you thought you had a lot of data sources last year…!

The volume of data is always worth considering. But what many fail to consider – until it’s too late! – is how disparate your data is becoming. In 2016, you likely added several new data sources to your organization’s data inventory.

How many of these data sources are siloed from the rest of the organization’s data?

As you bring on a new CRM system, LOS, digital banking platform, or whatever new software your organization acquires, consider how that data will be integrated with the rest of the business’ data. Make that a priority during the vendor/software evaluation process instead of an afterthought post-implementation.

2. Your data strategy affects WAY more than just your data.

An organization’s data strategy is:

A comprehensive and actionable foundation for an organization’s ability to harness and leverage data.

Your data strategy, at a bare minimum, must include:

  • A strategy defining your data analytics goals
  • A tactical roadmap describing how you will accomplish the analytics goals outlined above
  • Plans, tactics, and processes to develop analytics skills and create a data-driven culture

But your data strategy doesn’t exist on an island of its own. In fact, your success (or failure) with data strategy will impact so much more than just data.

Is improving your efficiency ratio on your strategic plan for 2017? Better data (read: information) can help with that.

Is developing a digital strategy on your strategic plan for 2017? Don’t you think a strong analytics platform to understand your members and their banking habits would support a digital marketing initiative?

If you take away one thing and one thing only from this article, make it this:

Analytics and a strong data strategy will enable greater success in every single one of your other strategic initiatives.

3. The cost of inaction is (truly) greater than the cost of action.

I wrote a post last year talking about how analytics would soon become a competitive necessity for the financial industry. This is as true today as it was then.

There are still many competitive advantages that can be gained in most markets/regions through a greater maturity with analytics. However, as the analytics maturity of the industry rises, the value of the competitive advantage shrinks.

We are nearing a state where the cost of inaction with analytics is greater than the cost of action.

This is a good news – bad news situation. The bad news? Failure to act quickly will limit the potential competitive advantage spread you could achieve within your market.

The good news? Vendor competition and growing expertise in the industry have made high-quality analytics platforms drastically more affordable.

Thought you needed to spend millions to get a strong analytics platform? Think again.

4. Efficiency, efficiency, efficiency

I’m still surprised at how this is the best-kept secret about analytics:

A strong data strategy and analytics platform can drastically improve your organization’s efficiency.

What do I mean? Think of how many hundreds (more than likely, thousands) of hours your staff spends manually merging data from various sources, importing into a spreadsheet, producing charts and graphs, etc.

The average financial institution we worked with last year (for scale comparison: approx. 200-400 employees and 10-20 branches) had well over 3 FTEs of work that could be automated through the proposed analytics solution. At the $500M+ peer group average of roughly $77,000 in compensation and benefits expense per FTE, this automation amounts to a minimum of $231,000 per year in opportunity cost savings.
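The back-of-the-envelope math above can be sketched in a few lines. This is illustrative only: the figures (3 automatable FTEs, $77,000 per FTE) are the peer-group averages cited in this article, not universal constants, and your own inputs will differ.

```python
# Illustrative sketch of the opportunity-cost savings estimate above.
# Both inputs are the article's peer-group figures, not universal values.

automatable_ftes = 3          # FTEs of manual, automatable work identified
cost_per_fte = 77_000         # avg. compensation + benefits per FTE ($500M+ peer group)

annual_savings = automatable_ftes * cost_per_fte
print(f"Minimum annual opportunity cost savings: ${annual_savings:,}")
# → Minimum annual opportunity cost savings: $231,000
```

Swap in your own FTE count and compensation figures to estimate the floor of your institution's automation opportunity.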

And that’s just scratching the surface on the easy stuff that we could identify during a few days onsite visiting with a client.

Want to improve your organization’s efficiency? Ask how data analytics can help.

5. The industry has some experience with analytics now

Let’s be honest, financial institutions tend to be pretty risk averse. Especially when it comes to new technologies or ideas, the “wait and see” approach is more often than not the one taken. By now, however, enough organizations have taken the leap into the world of analytics that this isn’t a “new” idea. There are some great use cases from financial institutions across all asset classes on how they’ve had success.

There are also some stories of how analytics hasn’t worked.

This isn’t all bad.

Treat those stories as a blueprint for what to avoid when rolling out your own analytics programs and data strategies. Every experience – whether it ended in success or failure – is an opportunity for the industry to continue to grow its analytics competency and maturity.

If you belong to a more risk-averse organization, I’d suggest reaching out to colleagues at other organizations that have started to leverage analytics. You might be surprised to find out that it isn’t quite as scary as some think.

Still uncertain?

Take a look at an article we wrote last September titled “What’s Holding You Back from Being Data-Driven?”. In that post, I explored some of the common misconceptions around data and analytics that tend to restrain organizations. It’s worth a look to see if some simple clarification could resolve lingering concerns.

What’s in store for 2017?

2017 is going to be a big year for the financial industry. From digital marketing innovations to improved data analytics maturity, this should be a fun year.

Want to stay updated on any articles and new content we publish? (Don’t worry, you’ll only get 1 or 2 emails with new content per week at most!)

Fill out the form below to subscribe!



In my last post, I talked about some of the defining traits of an organization at the lowest (first) stage of the “Analytics Maturity Curve”. The second stage of the “Analytics Maturity Curve” is what Tom Davenport refers to as the “Localized Analytics” stage.

Common Traits of This Stage

The “Localized Analytics” stage is a stage that, based on my estimates, roughly 1 in 4 financial institutions have reached. In this stage, any data analytics efforts are siloed to a small number of departments or subject areas. For banks and credit unions, your marketing department – especially a digital marketing area – or lending might be driving your organization to this second stage of the “Analytics Maturity Curve”.

Questions that are asked by these pockets of analytics include:

  • Why were last month’s results the way they were?
  • How can we use data analytics to improve [EFFORT/INITIATIVE/PROCESS]?
  • What data can we use to understand this [EFFORT/INITIATIVE/PROCESS] better?

Notice that each question is not phrased as a simple backwards-looking assessment. Instead, the questions ask how data can improve or better understand some process or effort. The questions naturally become multi-dimensional this way and generally start to grow the data-driven mindset.

Unfortunately, organizations at this stage do not yet have complete buy-in on data analytics from management and the enterprise as a whole. These pockets of analytics show great interest and open up exciting, innovative opportunities for the organization. Stop and ask yourself this:

Is your department a pocket of analytics, or is it a detractor from the organization’s analytics maturity? Either way, why?

Technology Used at This Stage

In the second stage of the “Analytics Maturity Curve” – the “Localized Analytics” stage – a data warehouse or any other enterprise analytics platform is non-existent. The most advanced of the stage-two organizations may have an independent data mart in place to support local analytics. However, all technology is siloed and not shared across department lines.

How many reporting tools or applications does your organization have that only have users from a single department?

If you rely on separate reporting solutions for your lending area, your core, your marketing department, or any other area, you are certainly no further than stage two of the “Analytics Maturity Curve”.

Where’s the Good News?

There is plenty of good news for organizations at this stage of the “Analytics Maturity Curve”. First, you are clearly progressing up from the “Analytics Impaired” stage. Progress is being made and the right steps are being taken. While this isn’t a “full steam ahead” approach to analytics, organizations at stage two of the curve are slowly building up experience with and pockets of analytics.

The lessons each pocket of localized analytics learns can be applied to growing your organization’s analytics maturity as you progress up the curve.

Is your organization at stage two? Here are some parting comments worth remembering:

Prove the value of analytics – even if it is limited to a handful of areas.

Educate your senior management on the successes and/or failures your area has had with localized analytics. Lessons learned – from both the good and the bad – will prove beneficial.

Plant seeds by asking the right question. If someone talks about last month’s numbers, ask them WHY the numbers are what they are. Challenge your colleagues to think deeper about data.

Stay tuned next week for a discussion of organizations at the third stage of the “Analytics Maturity Curve”!




Where does your organization lie on the analytics maturity curve?

This question is functionally equivalent to:

  • What is the state of data analytics at your organization?
  • How do we compare to others in the areas of business intelligence, data, and analytics?
  • With regards to analytics, what the heck are we doing right and wrong?

If Google, Facebook, Uber, Amazon, and Netflix are at the top of the “analytics maturity curve”, where does your organization lie?

Over the next several posts, I will be describing the various stages of the analytics maturity curve. I’ll discuss each stage’s distinctive features, the technology employed, the types of questions asked, and the general skills required.

Arguably the seminal work when it comes to discussing an organization’s analytics maturity, Tom Davenport’s Competing on Analytics: The New Science of Winning provides the foundation for much of the content in the coming posts. I highly recommend Davenport’s book as a fantastic resource for all things analytics.

Stage 1: Analytically Impaired

Stage one. The floor of the analytics maturity curve. You’d better act quickly if you expect your organization to survive long at this level of analytics maturity.

Why? Because you are driving with a blindfold on at this stage.

Enterprise analytics – even quality operational reporting – is borderline non-existent. Excel is your greatest analytics tool and you struggle with gaining anything beyond limited insight into your operations.

At best, you might be asking vague questions like “what happened last month?” None of your questions align with the organization’s KPIs – in fact, you might not even have any defined KPIs! Answering even simple questions about the state of your business is a challenge littered with inconsistency and uncertainty. If I asked “how many customers/members do you have?” to several different areas of the organization, I would, undoubtedly, get multiple answers back.

Your primary source of analytics is pre-built, standard reports from application-specific reporting solutions. Data integration is likely only a figment of your imagination, making it nearly impossible to gain a 360-degree view of your business, its customers/members, and their interactions.

At this stage, your organization’s “analytics” hinges on how many times your VLOOKUPs break in your Excel files. Skills in areas like SQL, ETL development, data architecture, or data visualization are non-existent or simply unutilized.

The good news? There is nowhere to go but up. And, to all you CFOs reading this, an organization at this stage has one of the strongest opportunities to yield a near-immediate ROI. Why? Analytics, especially for Stage 1 organizations, represents an untapped oil field of new opportunities and efficiencies (just read a previous post “5 Reasons to Invest in Data and Analytics in 2016” to learn more about these opportunities).

How can you start to move out of the “Analytically Impaired” stage and into the second stage of the analytics maturity curve? I’d strongly urge you to take a step back and honestly assess the current state of analytics at your organization. Create a data strategy and develop a plan for how you can navigate your way up the analytics maturity curve.

Stay tuned for our next post to learn about what analytics looks like in an organization at the second stage of the analytics maturity curve.


