On February 23, 2021, The Knowlton Group was fortunate enough to present with our great client partner, Logix Federal Credit Union, at the Callahan & Associates hosted Credit Union Tableau User Group.

Nicole Lopez, the Manager of Business Intelligence at Logix, highlighted some of the fantastic ways that her team has leveraged the VeriCU Data Platform to deploy actionable, analytics-driven output to critical lines of business and executives. The video above is a replay of the webinar if you weren’t able to join us live. I’d highly recommend that any credit union (with or without an existing analytics team) watch this video to learn about some great, practical analytics use cases.

Today’s A:360 discusses the importance of the “crawl, walk, run” progression when getting started with analytics. Feel free to read the summary of the podcast below or scroll towards the bottom of the page to watch or listen!

As with most major projects and strategies, realistic goals and timelines need to be set and adhered to for the greatest chance of success. Many organizations want to jump straight into “big data” and “data science” before they have even tackled the basics of traditional business intelligence. This is where the crawl, walk, run mentality comes into the picture.

The “crawl” phase is what I would consider traditional business intelligence. This is the phase where the organization stops living and dying by its Excel VLOOKUPs and starts to use relational databases and common reporting tools. Visualization tools (like WebFOCUS, PowerBI, or Tableau) are implemented, allowing the organization to consume information in a fashion other than a spreadsheet.

The “walk” phase of analytics is where “real” analytics can begin. Data governance has been put in place so that key definitions and terms are understood by everyone in the organization. Data is no longer stored and reported on in silos. Data transparency and data integration allow the organization to see a 360-degree view of the member. At this phase, your staff no longer need to go to ten different places to get data for a report. Near real-time or real-time analytics can be developed. The questions asked are no longer “what did we do last month?” but “what will we do next month?”.
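As a hedged sketch of what the “walk” phase enables, here is a 360-degree member view built as one query instead of ten manual pulls. The connection string and the members, loans, and card_activity tables are assumptions standing in for your own integrated warehouse:

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical connection to the integrated reporting database.
    engine = create_engine("mssql+pyodbc://warehouse_dsn")

    # One query spans data that previously lived in separate silos.
    member_360 = pd.read_sql(
        """
        SELECT m.member_id,
               COUNT(DISTINCT l.loan_id) AS open_loans,
               COALESCE(SUM(c.monthly_swipes), 0) AS card_swipes
        FROM members m
        LEFT JOIN loans l ON l.member_id = m.member_id
        LEFT JOIN card_activity c ON c.member_id = m.member_id
        GROUP BY m.member_id
        """,
        engine,
    )
    print(member_360.head())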

The “run” phase of analytics is where data science and statistical models become fully realized and leveraged. This is the point where your organization may employ or work with outsourced data scientists. Statistical models are developed on top of your underlying data structures to do any number of things (a brief sketch follows the list), including:

  • Advanced Member Attrition Analysis – who is likely to leave the credit union (or no longer use us as their PFI) and when?
  • Refined Risk Modeling – is using credit score really the best way to manage risk and maximize NIM? Can we layer in other attributes in the origination process to underwrite traditionally “riskier” loans without impacting our risk profile?
  • Advanced analysis of card transaction data to identify opportunities for improved interchange income and greater utilization (i.e. top of wallet) by the member.
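To make the “run” phase concrete, here is a minimal sketch of an attrition model along the lines of the first bullet, assuming scikit-learn and synthetic stand-in data; the features and coefficients are illustrative assumptions, not a production model:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic stand-ins for per-member features pulled from a warehouse.
    rng = np.random.default_rng(0)
    n = 500
    tenure_years = rng.uniform(0, 20, n)
    products_held = rng.integers(1, 6, n).astype(float)
    monthly_logins = rng.poisson(10, n).astype(float)
    X = np.column_stack([tenure_years, products_held, monthly_logins])

    # Assumed relationship: less engaged members attrite more often.
    logit = 4.0 - 0.4 * products_held - 0.15 * monthly_logins - 0.1 * tenure_years
    attrited = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_train, X_test, y_train, y_test = train_test_split(X, attrited, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))

    # Scoring members by predicted attrition risk produces an outreach list
    # of those most likely to leave or stop using you as their PFI.
    risk_scores = model.predict_proba(X_test)[:, 1]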

There are about a hundred other examples of “run” phase analytics that could be leveraged. The idea, however, is that in the “run” phase the data is working for us in every way imaginable. At this phase, the organization has the structure, the culture, and the skills to make full use of the data’s potential.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes


A few years back, at the 2017 Credit Union Analytics Summit hosted by First Tech Federal Credit Union in Redmond, Washington, Brewster Knowlton moderated a panel with five fantastic industry leaders in the analytics space. The panelists included:

  • Naveen Jain, VP Digital Analytics at First Tech Federal Credit Union
  • Clay Yearsley, SVP Data Analytics at Texas Trust Credit Union
  • Harsh Tiwari, CDO at CUNA Mutual
  • Ben Morales, CEO at Q-Cash Financial and CTO at Washington State Employees’ Credit Union
  • Matt Duke, Director of Data Analytics and Product Management at Cisco

If you are an organization (in any industry!) wondering how to get started with analytics, this panel discussion is worth watching. We discuss several topics including:

  • How do you get started with data?
  • How can you identify quick wins and gain momentum for analytics?
  • What are the most crucial points to consider when getting started with data?
  • How do you address and manage the cultural transformation component of analytics initiatives?

Either watch on YouTube or press play below!



In my last post, I described the characteristics of organizations at the third, “Analytical Aspirations” stage of the analytics maturity model. In this post, we will dive into what an organization at the fourth stage, the “Analytical Enterprise” stage, of the analytics maturity model looks like.

These posts on analytics maturity stages are heavily influenced by the great work Competing on Analytics by Thomas Davenport. I’d highly recommend picking up a copy for you or your BI department.

Common Traits at this Stage

At this stage of analytics maturity, organizations have a well-developed analytics platform. Data centralization, integration, cleanliness, and governance all are mature in their manifestation. The challenges faced in the previous three stages have all been resolved. Questions asked shift from being reactive to proactive in nature – from “what happened last month” to “what will happen next month”.

The question that most frequently enters the analytics lexicon at this stage is “why”. The question of “what” (i.e. descriptive) shifts to a more prescriptive question. It is the medical equivalent of going from gathering a list of symptoms to attempting to understand why those symptoms are present.

How does this apply to analytics? Organizations at the “Analytical Enterprise” stage of analytics maturity start to ask why things are happening. The questions force a deeper dive into process analysis and how analytics can be embedded within the processes themselves. From “what is our membership doing” to “why is our membership doing what they are doing”.

Internally, analytics begins to drive performance. Clearly defined KPIs are measurable and transparent throughout the organization. Benchmarking analyses have been put in place by organizations at this stage to highlight the value yielded by analytics efforts. “Guesstimates” are nowhere in sight.

Cultural and Change Management Required at this Stage

There are only a few technological differences between this stage and the third stage of analytics. The analytics platform is a bit more developed and skills in the areas of statistical modeling and data science have been acquired.

The single greatest shift from stage three to stage four of analytics maturity is the organization’s cultural adaptation driven by data analytics. Analytics is no longer a “want to have” but a “need to have”. It is embedded in every discussion at nearly every level of the organization. Conversations occur about how analytics can drive innovation, thereby making analytics success the platform for innovative success. It should come as no surprise to readers of this post that the most innovative and digitally transformative organizations are also highly successful with analytics. Analytics is, therefore, not an effect but the cause. Analytics propels innovative strategies and digital transformation efforts. It provides the foundation for what the vast majority of organizations seek to achieve over the next 3-7 years.

The next and final stage of the analytics maturity curve will describe those organizations whose business models are entirely driven by success with analytics. We’ll describe that in more detail in the next post!


Today’s A:360 discusses why developing and nurturing a data-driven culture is as crucial to your analytics success as your technology implementation. Many fail to realize that analytics is as much an exercise in change management as it is in development and programming. In this podcast, I’ll give my thoughts on why creating a strong data culture is so critical and some tips to develop one as you start out on the analytics journey.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Hey everyone, welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about why developing a data-driven culture is often more challenging than any of the technical implementation projects that you’ll have to face in your analytics journey.

People are often surprised when I say that what they assume to be predominantly a technology- or technical-based implementation is actually harder from a change management or cultural perspective. Analytics in general, and the deployment of analytics, is by no means an “if you build it, they will come” solution. Overcoming the cultural and change management hurdles is often more crucial to the success of your analytics initiative than any technological aspect of the implementation.

Why is that?

An initial component of deploying an analytics solution and, in fact, one of the best ways to get a return on your investment in the first 12-18 months, is automating redundant, manual reporting tasks. I’m sure your staff spend thousands of hours producing the same reports month after month, living and dying by their VLOOKUPs and manually pulling, merging, and extracting data. It’s inefficient.

But even when the most well-intentioned employee hears that twenty hours of their work week is going to be automated, the initial reaction is fear and concern about whether they are going to lose their job. It’s their livelihood. It makes sense. One of the first challenges of developing a data-driven culture is creating a structure in which people feel comfortable having parts of their job automated. The automation process should not be framed as, “Hey, we’re going to take away a part of your job”. It should be framed as a way to enable people to do more of what they were truly hired to do. Analysts should not be spending 90% of their time merging and extracting data. They should be spending 90% of their time analyzing data. In that context, help your analysts understand that, “No, we’re not taking your job away from you, we’re simply enabling you to actually fulfill the job function that you were hired to perform”.

I have a few tips that can assist in the change management process and in developing a data-driven culture. The first is to provide “the why”. Why are we going through this analytics journey? Why are we developing analytics and making it a core competency of our organization? Be transparent about that process and the initiative. It’s not that you’re trying to take jobs away from people; it’s that you’re looking to enable people to perform their jobs even better, to provide a better member or customer experience, and to deliver on the mission statement and strategic goals of your organization. People will buy into that if it is properly described and properly communicated. Providing “the why” and explaining how the effort will actually benefit them is crucial. Do not create a sense of opacity where there is some black box of analytics somewhere in a back room. That creates fear and concern, and it will ultimately cause people to be more resistant to change. Again, transparency and providing “the why” is a crucial first step in beginning to create a data-driven culture.

My next point is actually one that I have previously discussed in other podcasts on the Analytics Flywheel Effect. It’s worth reiterating here, in this context, because the value of quick wins cannot be overstated when it comes to developing a data-driven culture. Everyone loves when things are made easier for them. If you can create enough of those quick wins and make tasks easier for someone, it changes the mentality from “someone is taking my job away” to “somebody is making my life easier”. That mentality shift is crucial to gathering support. It’s a very simple way to get employees aligned with the analytics objectives. I’ve talked about this before in a previous podcast, so I don’t want to spend too much time on it. However, developing quick wins not only helps generate ROI and momentum for your analytics initiative, but it’s also one of the strongest ways to develop a data-driven culture.

The next point is by far one of the most overlooked aspects of any analytics initiative – training. The importance of training the business users – the consumers of your analytics efforts – on how to access and leverage the dashboards, reports, and analytics cannot be overstated. If you’re rolling out a new core, you’d never deploy it to frontline staff without properly training them on how to use it. The same goes for a new LOS or CRM system. The same has to be true for your analytics platform.

Ask yourself this: are you properly teaching, training and supporting your customers – the business users – how to consume, leverage and take advantage of the information and the analytics that you provide?

Train, train and train some more. And, when you think you’ve done enough training and led enough focus groups, support, and communication with your users, do a little bit more. This not only creates line-of-sight and transparency between the business users and the analytics or BI team, but it also helps the users feel more comfortable with the analytics solution. Eventually, they will create their own reports and their own analytics. This also helps with the scaling and growth of your analytics initiatives.

Let’s throw a cliché out there.

Change is hard.

We intuitively understand that change is difficult. People say that they want to change (and I do believe that most genuinely want to change), but, when you get into the process of actually changing, some challenges become clear. It’s hard. There is fear. People have been used to doing something a certain way for a long time (in some cases, for upwards of twenty years!). When you try to take that away from them, or there is the perception that something of theirs is being taken away, these challenges become particularly apparent. It’s imperative to acknowledge and manage the fear and loss of control that come with the change management process. The technological challenges and making the proper analytics platform choice are, of course, significant and important. But give as much emphasis to how you’re going to manage the change process. Leverage the knowledge and skills of your HR team. Perhaps they can assist you with the implementation and deployment plan from an organizational development perspective. Leverage the resources and assets that you have in your organization, each with their own skills, to best improve your chances of success with analytics.

That’s it for today. Thanks again for tuning in to today’s A:360.



Today’s A:360 discusses why it is critical to boil analytics down to well-defined questions. A question is the fundamental building block of analytics. Well-defined questions can shape and simplify the delivery of analytics to an organization. For those business users who aren’t quite sure what data they are looking for, helping them shape a question can be an excellent starting point.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about why success with analytics needs to start with a well-defined question.

I’ve seen a lot of instances where individuals go up to their analytics staff and ask very general questions like, “I want to see more data” or “show me some analytics”. That’s like going up to someone and saying, “I want dinner”. Well, what do you want? There are a lot of choices. The same goes for analytics. There has to be that specificity.

The best way to define that specificity for analytics and define what you’re really going for is to frame every analytics idea or every analytics objective in the form of a question.

Ask: what are you trying to accomplish?

If you can get the business users who are requesting information and analytics from you to ask a question, that helps change the context of the conversation that you have with them. It’s not about simply producing a report; it becomes about helping them answer a question. This changes the parameters and context in which you gather information and present it. But, in order to do that, you must start with a question.

Oftentimes, people will ask for data and/or reports just so they can try to figure out what they’re looking for. They have an idea, but they don’t quite know how to articulate what they are looking for in the form of a question. This is where a strong analytics team can really show its strength. It’s not so much in the technology; it’s in helping the business translate what they’re trying to figure out into a well-defined question, and then figuring out how we can go about answering that question.

This may sound overly simplified, but this really is the fundamental starting point for analytics. There’s an article I wrote called “When Life Gives You Data, Make Information”. It talks about the distinct difference between data and information. At its core, it really comes down to asking a question.

The difference between data and information is that data is just raw numbers. Information is the actionable intelligence built off of that underlying data. Let’s look at an example from business users in lending. Suppose one of your business users comes to you and says that they need a report of all loan applications in a pending status. They’re really trying to ask a couple of things. For instance, they may be trying to figure out how to improve their close rate, or to answer the question, “why are so many loan applications falling off before being approved?”. They may be trying to figure out how to increase throughput or productivity. My point is, they haven’t really defined a question, and as a result, they’re grasping at straws. They’re looking at all of this data and trying to make sense of it. Helping these users frame a question at the very beginning – not at the middle or the end of the analytics gathering process – can help them target exactly what they’re looking for and may allow you, as the analytics individual within your organization, to better provide what they really need.

Again, this seems oversimplified. It’s funny because I’m sitting here doing a four-minute-plus podcast about why it’s important to ask a question with analytics. But it really is something that falls by the wayside, especially as we get inundated with requests. People will just say, “I want data. I want data. I want data”. My suggestion is to take a momentary step back and, as simple, insignificant, and superficial as it may seem, just ask: “What question are you trying to answer?” That question alone will help spark a conversation that I can assure you will improve the process by which you deliver analytics to your organization.

That’s it for today. Thanks again for listening to today’s A:360.



Have you ever wondered what it takes to enable your organization to become data-driven? For a limited time, I am giving away my “Credit Union Executive’s Guide to Data and Analytics” eBook FOR FREE! To get the eBook, complete the form below and press “Sign Up”:

[mc4wp_form id="1960"]

eBook Overview

It’s no secret that credit unions are becoming more data-driven organizations. As the number of digital channels grows and the usage of those channels increases, so too does the amount of data available and accessible to those organizations. Throughout this eBook, I will discuss how credit unions can become data-driven organizations, what that really means, and some simple ways to get started on the analytics journey.

What Will I Read About?

First, I’ll help you understand what it means to be a data-driven credit union. I don’t want the term “data-driven” to become just another buzzword. I’ll address what it really means for an organization to use data properly and how organizations can take advantage of data and analytics.

Second, I will address some key definitions and common misconceptions surrounding business intelligence, data, and analytics. For example, big data, my least favorite buzzword, is drastically misused on a regular basis. I will discuss what big data really means, and what people tend to mean when they say the term “big data”.

Next, I’ll discuss the importance of a data strategy. I’m a firm believer that strategy without execution will fail and that execution without strategy will also fail. Business intelligence and analytics require a top-down approach. This means that an organization should start with a strategy and then figure out how to execute that strategy. I will address this concept, and the details associated with a data strategy, further in that section.

Next, I will teach you how to start building your credit union’s data strategy, answering the question: what can you start doing today to begin the discovery process and gather all the relevant information necessary to begin formulating your credit union’s data strategy?

I will then go over how to gain greater clarity around the technical aspects of building a data and analytics platform. This eBook does not assume you have a technical background. In fact, when I discuss the technology that’s going to be used, I do so from the perspective of someone who does not have much, if any, technical expertise. At the same time, we do need to have a discussion about the technology that your credit union could or should be using in order to maximize your data and analytics knowledge. This will help you understand what is involved in analytics. It will also better prepare you for conversations with both technical and business people.

Next, I will go over a common question that I often get about analytics: “How do you address data quality?” Data quality issues are something you can never fully get rid of. There will always be bad data somewhere. The important thing is not to have perfect data but to have a plan for addressing data quality issues when they inevitably arise. Data quality issues will always present themselves as long as there is the possibility of human error (which is always)! Addressing those issues as they arise is essential so they do not snowball into bigger problems and compound going forward.

Next, I will address how you can build and evolve a data-driven culture. Not properly developing a data-driven culture is one of the most common reasons that data and analytics initiatives fail within credit unions. As a result, I will address what constitutes a data-driven culture, how you can create a data-driven culture, and some techniques that can help you out along the way.

Lastly, I’ll go over the importance of “quick wins” along with something known as the “Flywheel Effect”. For those of you who have read Jim Collins’ book, Good to Great, this “Flywheel Effect” concept should be familiar to you. It has an interesting application when it comes to analytics. Learning how to gain early momentum is critical in implementing a successful and sustainable data strategy and a data and analytics platform.


Complete the form below to have a copy of “The Credit Union Executive’s Guide to Data and Analytics” sent to your inbox!

[mc4wp_form id="1960"]

More and more financial institutions are investing in developing their own analytics teams. Data warehousing and other modern analytics platforms are becoming the norm and not the exception. As these organizations start to develop their data strategy and implementation roadmap, some of them find that their data is being held hostage.

What do I mean by that?

Let’s assume, for example, that you are running an on-premise CRM system. More likely than not, the data for that CRM system is held in a SQL database. Getting data out of a SQL database is easy in the world of data warehousing and analytics.
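As a hedged sketch of how simple that extraction can be, here is a nightly pull from an on-premise SQL database using pyodbc; the DSN, schema, and table name are assumptions:

    import csv
    import pyodbc

    # Hypothetical DSN for the on-premise CRM database.
    conn = pyodbc.connect("DSN=crm_server;Trusted_Connection=yes")
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM crm.contacts")  # hypothetical CRM table

    # Land the extract where the warehouse ETL process picks it up.
    with open("crm_contacts.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])
        writer.writerows(cursor.fetchall())
    conn.close()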

Now, let’s assume you are running a loan origination system (LOS) that is a hosted, third-party application. With a handful of exceptions, you will not be able to directly access the SQL database housing this data. However, your analytics team needs to get this data out of the hosted environment. You will likely call up the vendor, and they will give you a quote for how much they will charge to provide you with your data.

Let me repeat that. They will give you a quote for how much they will charge you for YOUR data.

Avoiding Data Hostage Situations

Access to data is typically an afterthought in the product evaluation process for new software acquisition. As more organizations take steps towards becoming data-driven, the need to have easy access to their data will become even more critical than it already is.

Data access, then, should become part of the software evaluation process – a forethought instead of an afterthought.

Most vendors have the means to provide the data to you in a number of different ways. By waiting until after implementation, however, data access becomes an added product and/or service charge as opposed to an existing feature of the software acquisition. This is typically where you receive a quote for how much it will cost to have data delivered to your organization.

Some of you may be reading this saying, “but I can access all the data I need from a web portal they’ve provided to me.” In that situation, reports must be manually opened and downloaded if you want to do anything with that data. Your analytics team will need data automatically downloaded or transferred to a specific location on a regular (usually nightly) basis. Access to a reporting portal that requires manual downloading of reports and data is insufficient for a data-driven organization.

Key Point: Negotiate access to raw data at the beginning of the software acquisition process, not after it has been implemented.

Ways Data Can Be Delivered

Most vendors have several ways of automatically delivering data to your organization:

  1. SFTP (Secure File Transfer Protocol) – a secure way to send files to and from vendors. For vendors that cannot allow direct access to a SQL database, this is usually the most common delivery method (a sketch of an automated SFTP pull follows this list).
  2. SQL Replication – some applications (shoutout to MortgageBot) will set up a replicated SQL instance on your network. Put simply, they put a copy of the production database on your system for reporting purposes. This is a dream come true for analytics teams that need access to raw data.
  3. Physical Copies of Database Backups – some vendors are able to send you a physical copy (e.g., an encrypted external hard drive) of a SQL database up to a certain point in time. Then, they can SFTP over backups and/or log files that update the database. A hybrid of the first two options, this solution involves a bit more work, but it is still viable.
  4. API – as credit unions and community banks start to build their own development teams, APIs are becoming more commonplace. Think of this as the language through which a development team and an application could communicate. Depending on how open the API is, this may be a sufficient option to gather the raw data required by your analytics team.
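As a hedged sketch of the first (and most common) method, here is an automated SFTP pull using the paramiko library; the host, credentials, and file paths are placeholder assumptions you would replace with your vendor’s values:

    import paramiko

    # Placeholder host and credentials; substitute your vendor's values.
    transport = paramiko.Transport(("sftp.vendor.example.com", 22))
    transport.connect(username="your_user", password="your_password")
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Pull last night's extract to the location your ETL process watches.
    sftp.get("/outbound/loan_extract.csv", "loan_extract.csv")

    sftp.close()
    transport.close()

Scheduled nightly (via cron or Task Scheduler), a script like this turns a manual portal download into the automatic delivery a data-driven organization needs.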

There are a few other delivery methods, but, for the most part, they are derivations of the methods already mentioned.

As you are negotiating or re-negotiating with your vendors, make the conversation about data access and delivery a priority. Some of the most successful financial institutions are achieving their success through their increased use of data analytics.

Avoid having your data held hostage and make data access a priority in all software evaluation processes.



Today’s A:360 discusses a few suggested ways to measure the return on investment (ROI) for your analytics initiatives. A common question I receive is “how do we determine the effectiveness of our analytics efforts?”. This podcast’s intent is to present a few possible ways to answer that very question.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about some ways that you can measure return on investment for your data and analytics initiatives.

As analytics initiatives become much more commonplace in even the smallest organizations, there will always be the question: “How are we evaluating whether or not we are getting a strong analytics ROI – a strong return on our investment, for both time and resources, in our analytics initiatives?”

We’re going to cover a few different ways to measure this. These are good starting points; by no means is this going to be a comprehensive or exhaustive conversation about analytics ROI. However, it will get the conversation started while giving you a few ideas for your own analytics initiatives and ROI calculations.

My starting point for anything related to analytics ROI is going to be, “How can I reduce the time it takes my staff to be able to produce reports or some data-driven analysis?”

Reduce Manual Reporting Efforts

Looking at this in the context of a credit union or a bank, I find that for a financial institution with a billion dollars in assets, there are between 4,000 and 6,000 hours (minimum) that can be automated through the use of improved reporting and analytics. This would most likely be through the integration of data and/or the automated extraction of data from applications that don’t allow for easy data extraction.

This is what our previous podcasts and articles on data inventories and report inventories are getting at. That is, how much time does it take on a monthly or weekly basis to produce those reports?

Once you have that time calculation, you can back into the opportunity cost of your staff manually producing these reports. You can do this by taking the percentage of their time spent (in a month, let’s say) producing what could be automated reports, multiplied by their compensation and benefits expense. That gives you a number telling you how much it costs to have your staff continue to manually gather data and produce reports. For organizations in the first 18-24 months of their analytics initiatives, this is probably where you’re going to get the biggest bang for your buck and the biggest return on investment. Frankly, you’ll also make a lot of friends in the process, as you’re going to give people a lot of time back each week or month.
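Here is that back-of-the-envelope calculation as a hedged sketch; the figures are illustrative assumptions (including treating the 4,000-6,000 hour range above as annual), not benchmarks:

    # Hypothetical inputs from a report inventory.
    automatable_hours_per_year = 5000   # within the 4,000-6,000 range cited above
    working_hours_per_year = 2080       # one full-time employee
    avg_comp_and_benefits = 65000       # assumed fully loaded cost per FTE

    # Opportunity cost = staff time spent on manual reports x cost of that time.
    fte_equivalent = automatable_hours_per_year / working_hours_per_year
    opportunity_cost = fte_equivalent * avg_comp_and_benefits
    print(f"~{fte_equivalent:.1f} FTEs, ${opportunity_cost:,.0f} per year")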

Measuring Access to Data and Analytics

The next way that I would recommend you start looking at analytics ROI is by measuring the overall access to data and analytics throughout the whole organization. What we tend to see is that there are pockets of information – silos – throughout an organization where data may or may not be shared with the rest of the organization. This creates very limited insight into the organization as a whole, whether operationally or in the context of analytics.

A way to quickly gauge the spread of analytics usage is to measure not only how many individuals can access essential BI portals (usually the front end to your analytics platform) but also how many people are accessing them on a regular basis – especially those in roles where having access to and consistently using data is critical to success in a more data-driven organization.
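As a hedged sketch, this adoption metric can be computed from an export of the BI portal’s login audit log; the log format and headcount below are assumptions:

    from datetime import date

    # Hypothetical audit-log export: (user_id, login_date) pairs.
    logins = [
        ("ann", date(2018, 5, 2)), ("ann", date(2018, 5, 9)),
        ("raj", date(2018, 5, 3)), ("kim", date(2018, 4, 2)),
    ]

    month_start = date(2018, 5, 1)
    monthly_active = {user for user, day in logins if day >= month_start}

    total_staff = 500  # assumed headcount
    print(f"{len(monthly_active)} active users, "
          f"penetration: {len(monthly_active) / total_staff:.1%}")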

This is a point that isn’t necessarily financially driven. (How many CFOs listening are saying to themselves that they only like financially based ROI calculations?!) But, as we start to talk about overall utilization of any product that we acquire or implement, we need to consider how well it’s actually being used throughout the organization. If two people are using it out of an organization of five hundred, our analytics product penetration is very low. While not necessarily financially driven, it is a way to measure the overall impact of your analytics platform and initiatives.

Measures of Self-Service

You’ve heard me say this before if you’ve listened to previous podcasts or read our articles: your analytics program should be centrally driven and broadly distributed. What that does NOT mean is that your analytics team becomes a series of report writers – essentially, order takers for business users that need data.

So, another way to measure analytics ROI is to analyze how many reports or dashboard visualizations in your BI portal have been created by the analytics team versus by the business users themselves. This becomes a measurement of the self-service capability of your analytics platform. Again, this is not necessarily financially based from an ROI perspective, but as we look at overall utilization, you really want a platform that enables the business users to get data on their own. If your analytics team is required to create every new report and analysis, you’re not going to be able to scale as data needs and analytics requests rise. Therefore, this self-service piece is an integral component of the success of your analytics program, and this metric directly measures progress toward that objective.
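A hedged sketch of that self-service metric, assuming your BI platform can export report metadata with an author field (the catalog below is hypothetical):

    # Hypothetical report catalog pulled from the BI platform's metadata.
    reports = [
        {"name": "Branch Scorecard", "author": "bi_team"},
        {"name": "Pending Loan Apps", "author": "lending_user"},
        {"name": "Card Usage Trends", "author": "marketing_user"},
        {"name": "Delinquency Watch", "author": "bi_team"},
    ]

    analytics_team = {"bi_team"}
    self_service = [r for r in reports if r["author"] not in analytics_team]
    print(f"self-service share: {len(self_service) / len(reports):.0%}")  # 50% here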

Benchmarking Analytics Lift

The last point that I’ll make about measuring analytics ROI – again, this is by no means an exhaustive list, but just a starting point for the conversation – is that you can measure analytics ROI and determine the impact of analytics through the benchmarking process.

Let’s look at a lending example. Suppose, before you were able to dive into your underwriting and origination data, you had a 5.5% average yield on your consumer loan portfolio. Now, after investing in analytics and integrating your loan origination and loan servicing data, you find there are opportunities to underwrite loans to lower-credit-quality borrowers while maintaining your same delinquency and chargeoff ratios. In this example, you are able to make more money with a reduced-risk or equivalent-risk portfolio. After all this, instead of a 5.5% yield, you might be at a 6% yield. All of a sudden, you’re making an extra 50 basis points on loans – directly contributing to the bottom line.
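Worked through as a hedged sketch, with an assumed portfolio size, that lift looks like this:

    portfolio_balance = 100_000_000   # assumed consumer loan portfolio
    yield_before = 0.055              # 5.5% average yield pre-analytics
    yield_after = 0.060               # 6.0% yield after data-driven underwriting

    # 50 basis points of lift drops straight to the bottom line.
    annual_lift = portfolio_balance * (yield_after - yield_before)
    print(f"extra annual income: ${annual_lift:,.0f}")  # $500,000 here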

That benchmarking comparison of “What did we do before analytics?” versus “What did we do after analytics?” is just one way that you can start to show the value. There are going to be a million scenarios where benchmarking applies. Look at your credit card portfolio and the number of transactions. Perhaps a dive into the data helps you develop a gamification-based marketing campaign that emphasizes signature-based debit card transactions as opposed to PIN-based transactions, leading to more interchange income on each swipe. Benchmark the “before” (i.e. your control group) against the “after”. Explore the different ways that you can use benchmarking to determine the return on investment or the impact of analytics on your operations.

As I’ve said, this is by no means an exhaustive list for measuring analytics ROI, rather it is just a way to get some ideas flowing about how you can measure the impact and return on investment for analytics.

We talked about measuring analytics ROI through:

  • Reducing employee time to create reports, dashboards, and other data-related tasks
  • Measuring access and utilization of analytics and the BI portal
  • Measuring the percentage of reports and analytical efforts created by the business users vs. the analytics team directly
  • Benchmarking performance in the “before analytics” period versus the “after analytics” period

That’s it for today. Thanks again for listening to today’s A:360.



Today’s A:360 discusses why it is highly recommended that most organizations take an iterative, phased approach to developing their analytics solution.

Watch and Listen

Click to Watch on YouTube.

Listen to the Podcast

Click to Listen on SoundCloud
Click to Listen on iTunes

Read the Transcribed Audio

Hey everyone. Welcome to today’s A:360. My name is Brewster Knowlton, and today we’re going to be talking about the importance of taking an iterative approach to building out your data and analytics program.

By an iterative approach to data and analytics, I really mean that your organization should make a focused effort not to bite off everything all at once. Essentially, approach business intelligence or analytics in phases.

I hear a lot of talk about big data and getting to this point of unstructured data where technologies like Hadoop, Hive, and MongoDB are thrown around. Before worrying about those more advanced technologies, take it one step at a time. Crawl, walk, run.

Focus first on building out a data warehouse and identifying those high priority data sources (like a core or other important third-party data sources) that you need to integrate first. In this podcast we’re going to talk about the importance of taking that iterative approach and some of the dangers of going all in and trying to do too much at once as well.

Identifying ROI is something that a lot of executives are rightfully concerned about when it comes to analytics. Let’s assume you start by building out a data warehouse with a core and a few major third-party applications (these could be your loan origination system or a CRM system, for example).

Without investing all of the resources to build out your analytics program up front, you can show incremental value to the more skeptical individuals within your organization through a phased approach. That’s not to say that organizations that go all in are necessarily taking a bad approach, but some organizations have to take a step back and prove ROI at each step of the development process. By taking an iterative, step-by-step approach, you can start to build up scale, build up the ROI, and incrementally show value without having to rely on one big initial burst after months of development and a larger up-front cost.

Taking a phased approach allows you to build out the skills of not only your BI team but the rest of the organization as well.

If you are going for a spoked-wheel model [of analytics] – where you have subject matter experts or power users within each department who are going to be responsible for some of the analytics in that area – you’re going to need time to build up their skills and train them.

Analytics is certainly a learning process. By taking a phased approach to analytics, we can learn from our mistakes during each phase.

If we try to boil everything down to a single phase – especially for those organizations that are trying to build their data warehouse/analytics platform in-house (I would probably urge you to take a step back and reconsider whether that’s the best approach) – you’ll likely wish you had adopted an iterative development approach. If you make a critical mistake in the first phase and continue to make that same mistake (because you haven’t broken the project down into phases where you could learn from it), you’re going to create a very, very difficult rat’s nest to unravel when you realize the mistake later on. As a result, you’ll have to go back through potentially every single phase and make changes.

By taking an iterative approach, you might make a mistake in phase one, but you can correct that mistake for only that phase’s work. Then, when you go onto the next phase, you’ll have learned from the prior phase and will be able to avoid making the same mistake(s).

Analytics is best handled with an iterative, phased approach. Break it down into phases and don’t try to bite off too much at once. This approach allows you to show incremental value, allows you to properly develop and cultivate the necessary skills, and it allows you to correct mistakes that may arise with only minimal issues or rework.

That’s it for today. Thanks again for listening to today’s A:360.

