Can opening spend analytics data to decision makers reduce costs? Here's why!

Using Analytics for Better Decision-Making

Spend analytics information has traditionally been closely guarded by procurement. But this approach is arguably a lost opportunity for both procurement and the wider business.

Spending decisions are regularly made by other functions, so for a business to derive the best value and outcome from those decisions, spend analytics information must be used — not just viewed — by all stakeholders.

Before that data can be used effectively, there has to be cohesion and agreement about several key issues. The first issue: There must be a single version of the truth, with everyone working from the same data and analytics. Without a unified approach, different functions will make different decisions based on their own information, and the objective of improving value in spend decisions will be lost.

The second key point: There must be agreement on what is “good enough” for the purposes of the organization. That will vary from company to company. While some might want close to 100 percent accuracy of their spend analytics data, it’s important to understand that achieving close to perfect data will be costly, and it’s unlikely the company will get adequate return on investment. A “good enough” estimate of 80 percent data accuracy will give you enough insight to enable your people to make good spending decisions. It should also be noted that to achieve a single version of the truth and the required level of data accuracy, an organization should work with a specialist that has the data skills to create reliable spend analytics information.


Simple access

Other key considerations: ensuring that the data is easy for everyone to access and easy to interpret so the right spend decisions can be made. The level of access and the depth of information must also be adjusted for the individual. For example, the chief executive will have different requirements than a category manager or a buyer in a department. Equally, the information must be targeted to the skill level of the individual. Those with strong analytical skills should have access to more in-depth spend analytics data, while others might simply want an easy-to-understand dashboard or application.

The tools users can access are also important. For example, a voice recognition tool that allows users to ask the application a simple question such as, “How much do I spend with suppliers in manufacturing?” might be most useful for some individuals. If the tool is designed properly, it can provide the user with a chart or report with the required information.
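As a minimal sketch of the idea, the question-answering step of such a tool can be reduced to keyword matching over spend records followed by an aggregation. Everything here is hypothetical: the record layout, the `SPEND_RECORDS` sample data, and the `spend_by_category` helper are illustrative assumptions, and a real tool would sit on top of a speech-to-text service and the organization's spend database.

```python
from collections import defaultdict

# Hypothetical spend records; in a real tool these would come from the
# organization's spend analytics database, not a hard-coded list.
SPEND_RECORDS = [
    {"supplier": "Acme Corp", "category": "manufacturing", "amount": 120_000},
    {"supplier": "Bolt Ltd", "category": "manufacturing", "amount": 80_000},
    {"supplier": "Ink & Co", "category": "office supplies", "amount": 15_000},
]

def spend_by_category(question: str) -> dict:
    """Answer a question like 'How much do I spend with suppliers in
    manufacturing?' by finding which known category the question
    mentions and totalling spend per supplier in that category."""
    categories = {r["category"] for r in SPEND_RECORDS}
    asked = next((c for c in categories if c in question.lower()), None)
    totals: dict = defaultdict(int)
    for record in SPEND_RECORDS:
        if record["category"] == asked:
            totals[record["supplier"]] += record["amount"]
    return dict(totals)

print(spend_by_category("How much do I spend with suppliers in manufacturing?"))
```

The returned totals are exactly what a chart or report layer would render for the user; the natural-language part of a production tool would of course need far more robust parsing than a substring match.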

If everyone involved in spend decisions has the ability to freely and easily access spend analytics data, companies can start to drive cost savings and achieve greater efficiencies.

Lowering costs and increasing value can be crucial for life sciences organizations where indirect spend is widely distributed across different functions. Without access to information that’s accurate and trustworthy, they can’t make the right buying decisions and won’t adhere to corporate procurement standards or processes.

In fact, most procurement managers are all too accustomed to different departments spending with suppliers that are not part of a purchase order agreement. The procurement managers can save themselves — and their organizations — a lot of pain by making information easily accessible, reliable and user friendly.

Beyond owning: From conventional to unconventional analytics data


The scale of big data, the data deluge, the 4 Vs of data, and everything in between... we have all heard countless adjectives attached to "data". The many reports and articles on the subject have taken the vocabulary and interpretation of data to a whole new level. As a result, the marketplace is split into exaggerators, implementers, and disruptors. Which one are you?

Picture this! A telecom giant decides to invest in opening 200 physical stores in 2017. How do they go about it? How do they decide the optimal locations? Which neighbourhoods will garner maximum footfall and conversion?

And then there is a leading CPG player trying to figure out where to deploy their ice cream trikes. Mind you, we are talking about impulse purchases of perishable goods. How do they decide how many trikes to deploy and where? Which flavours will work best in each region?

In both examples, if the enterprises were to make decisions based only on the analytics data already available to them (read: owned data), they would make the same mistake day in and day out: using past analytics data to make present decisions and future investments. The effect stares you in the face: your view of true market potential remains skewed, your understanding of customer sentiment becomes obsolete, and your ROI seldom goes beyond your baseline estimates. You become vulnerable to competition, and your calculated risks become too calculated to change the game.

Disruption in current times requires enterprises to undergo a paradigm shift: from owning data to seeking it. This transition requires a conscious set-up:

Power of unconstrained thinking


As adults, we are usually too constrained by what we know. We get the jitters when it comes to stepping out of our comfort zones, which keeps us from venturing into the wild. The real learning, though – in life, analytics, or any other field – happens in the wild. To capitalize on it, individuals and enterprises need to cultivate an almost child-like, inhibition-free culture of "unconstrained thinking".

Each time you are confronted with an unconventional business problem, pause and ask yourself: if I had unconstrained access to all the data in the world, how would my solution design change? What data (imagined or real) would I need to execute the new design?

Power of approximate reality


There is a lot we don’t know and will never know with 100% accuracy. However, this has never stopped the doers from disrupting the world. Unconstrained thinking needs to meet approximate reality to bear tangible outcomes.

The question to ask here is: what are the nearest available approximations of all the data streams I dreamt of in my unconstrained ideation?

You will be amazed at the outcomes: for example, using Yelp to identify the hyperlocal affluence of a catchment population (resident as well as moving), or estimating the footfall in competitor stores by analysing imagery captured from several thousand feet in the air.

This is the power of combining unconstrained thinking and approximate reality. The possibilities are limitless.

Filter to differentiate signal from noise – Data Triangulation


Remember, you are no longer only as smart as the data you own; you are as smart as the data you earn and seek. But at a time when analytics data is abundant and streaming, the bigger decision while seeking data is identifying the "data of relevance". An ability to filter signal from noise is critical here, and in the absence of on-ground validation, triangulation is the way to go.

The data "purists" among us will debate this approach of triangulation. But welcome to the world of data you don't own: here, some conventions will need to be broken and mindsets will need to shift. We at Anteelo have found data triangulation to be one of the most reliable ways to validate the veracity of unfamiliar and un-vouched data sources.
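The mechanics of triangulation can be as simple as cross-checking independent estimates of the same quantity and flagging the sources that disagree with the consensus. The sketch below is a minimal illustration, not Anteelo's actual method: the source names, the 25 percent tolerance, and the median-as-consensus rule are all illustrative assumptions.

```python
from statistics import median

def triangulate(estimates: dict, tolerance: float = 0.25):
    """Cross-check independent estimates of the same quantity.
    Returns (consensus, outliers): the median of the estimates and the
    sources deviating from that median by more than `tolerance`."""
    consensus = median(estimates.values())
    outliers = [src for src, value in estimates.items()
                if abs(value - consensus) > tolerance * consensus]
    return consensus, outliers

# Hypothetical weekly footfall estimates for one store location, from
# three independent, un-vouched data sources.
estimates = {
    "satellite_imagery": 1200,
    "yelp_checkins": 1100,
    "mobile_pings": 2600,
}
consensus, outliers = triangulate(estimates)
print(consensus, outliers)
```

When two sources agree and a third is far off, the disagreeing source is the one to investigate before trusting its data, which is exactly the kind of validation on-ground checks would otherwise provide.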

Ability to tame the wild data


Unfortunately, old wine in a new bottle will not taste good here. When you explore data in the wild – beyond the enterprise firewall – conventional wisdom and experience will not suffice. Your data science teams need unique capabilities and the technological know-how to harness data from unconventional sources. In the two examples above – the telecom giant and the CPG player – our data science team capitalized on freely available hyperlocal data residing in Google Maps, Yelp, and satellite imagery to build a strong location-optimization solution.
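Once the wild data sources are harnessed, the location-optimization step often reduces to scoring candidate sites on a handful of proxy signals. The following is a hedged sketch of that idea only; the signal names, weights, and candidate neighbourhoods are invented for illustration and do not describe the actual solution built for these clients.

```python
def score_location(signals: dict, weights: dict) -> float:
    """Weighted score for a candidate store location, combining proxy
    signals assumed to be normalized to the 0..1 range."""
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

# Hypothetical normalized signals per candidate neighbourhood, derived
# from sources like maps, review platforms, and satellite imagery.
candidates = {
    "midtown":  {"footfall": 0.9, "affluence": 0.6, "competitor_density": 0.8},
    "suburb_a": {"footfall": 0.5, "affluence": 0.8, "competitor_density": 0.2},
}
# Competitor density lowers a site's attractiveness, hence the negative weight.
weights = {"footfall": 0.5, "affluence": 0.3, "competitor_density": -0.2}

best = max(candidates, key=lambda name: score_location(candidates[name], weights))
print(best)
```

The interesting work, of course, is upstream: turning raw maps, review, and imagery data into trustworthy normalized signals is where the unconventional-data capabilities earn their keep.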

Having worked with multiple clients across industries, we have come to realize the power of this approach – of owning and seeking data – with no compromise on data integrity, security, and governance. After all, game changers and disruptors are seldom followers; they pave their own path and, when needed, choose to find the needle in the haystack as well!
