Blog Archive

4/17/15

The Changing Role of the CFO – The More Things Change, The More They Stay the Same

The subject of many recent research papers and articles has been the “changing role of the CFO.” That is at best a misnomer. At worst, it is a dangerous misrepresentation of what is really happening. There can be little dispute that all positions in business are affected by change – including the CFO’s. For decades the role of the CFO has been relatively consistent. During that period, there have been a lot of changes in the way the business of finance has been conducted. The role of the CFO has had to respond to those changes over the decades, but the overall role has remained constant.

Every so often, an article or study looks at the nature of the CFO role and why it has changed. These articles and surveys often look at the way the role has changed and focus on the reasons for that change. The latest look at this topic, from the ACCA (the Association of Chartered Certified Accountants) and the IMA (the Institute of Management Accountants), does a great job of examining five influences they believe are shaping, and will continue to shape, the CFO function. Implicit in all these pieces, including this one, is the notion that trends in business inevitably change the role of the CFO.

We would argue that this isn’t the case – that in fact the role of the CFO has remained relatively stable for several decades, perhaps since its inception. As trends have shaped the face of business over the years, the way the CFO conducts his or her job has changed, but the role itself has remained essentially the same. This may seem like we’re splitting hairs a bit. In reality, this semantic distinction is important. CFOs are, and should be perceived to be, the steadiest of all influencers on the executive management team. Both internal and external stakeholders see the CFO as the source of stability in most companies. CEO roles may be higher profile, but CFO statements are often more closely parsed. Ask an analyst whether, all things being equal (most notably longevity), they would rather see a CEO or a CFO resign, and see what they say. For this reason, we think it is important that CFOs maintain consistency in their role even as they may be “pivoting” internally to keep up with business and technology trends.

It is usually the case that the things noted as “changing the role of the CFO” are the very things that are influencing business. Today, factors as diverse as increasing risk, volatility, and brand influence are often cited as elements which may “change the role.” Most concede that these are external factors and trends. The closest thing we’ve seen to an internal factor is the increasing strategic influence of the role, such that the CFO has direct impact with the CEO on corporate decisions.

We think the external factors are merely trends CFOs have to get on top of and respond to as part of their role. Not unlike governance regulations in the ’90s and 2000s (things such as SOX compliance) or the advent of ERP software systems in the ’70s, even major external influences do not change the role of the CFO. CFOs still manage tax and treasury, still conduct budgeting and planning, still sit on earnings calls, and still influence corporate direction.

Today risk, volatility, brand management, analytics, and the like impact a lot of those functions of the CFO. But the functions themselves still exist as they always have. There can be no question that CFOs have to be ready to respond to changing conditions, just as they have for decades. To say that these changes mean the role of the CFO has changed is inaccurate, and it implies that the role of the CFO is a shifting target. That is neither accurate nor desirable. In an environment rife with volatility (as is often cited), it is more important than ever that the CFO role be positioned in the same, consistent way. The message to stakeholders, both internal and external, should be clear: “the more things change, the more they remain the same.”


12/30/14

Ten Signs Your Data Analysis Approach Is Inefficient

If your job includes analyzing data, chances are high that you regularly perform some or all of the following steps. There are many vendors who claim to have different approaches. Tableau preaches data visualization as the solution. Domo espouses interactive dashboards. Excel, Spotfire, Lumira, Qlikview, Birst, and many other platforms all claim to have comprehensive approaches to data.

The problem is that all of these products use the same basic metaphor to perform analysis – there really isn’t that much difference in their approaches. No matter what the product, analysts still perform the same tasks. We put together a list of ten tasks we used to do. If one or more of the following seems familiar, you are wasting time:

1. Manually building charts by selecting data, measure and value combinations, chart types and then fiddling around with various combinations in order to build ONE chart.

2. Creating multiple charts by repeating the process above over and over for each chart.

3. Going through the forensic process of diagnosing an answer to a strategic question by trying to visualize data on a single chart in a trial and error process.

4. Comparing the implications of two different data sets or scenarios by going through the steps above twice, building two different sets of charts, and then toggling back and forth between files or sheets.

5. Reusing a set of charts which have already been created for another set of data by pasting in the charts, removing the links to the existing data set and finding the correct data to use for each chart.

6. Reusing a set of charts which have already been created for another set of data by copying the sheet or workbook (or using the “save as” function), then carefully pasting in new data over the old data, then correcting any broken links in any of the charts.

7. Creating a presentation deck by copying and pasting analytics from Excel into a PowerPoint file, paying special attention to the graphic format used to create links or avoid large file sizes.

8. Updating graphics in the presentation deck with more recent charts reflecting refreshed data.

9. Going back through old presentations to locate the chart type and composition that best reflected a certain type of data.

10. Building a data set by pulling data from multiple data sources into a single Excel workbook, then performing one or more of the above steps on the consolidated data.
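For those who already script parts of this workflow, here is a minimal sketch (an illustration only; the file names and column names are hypothetical placeholders) of how steps 1, 2, 5, and 6 above collapse into one reusable routine:

```python
# A minimal sketch, not a prescription: regenerate the same set of charts
# against any data extract instead of rebuilding them by hand.
# File names and column names below are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt

def build_charts(csv_path, group_col, measures, out_prefix):
    """Build one bar chart per measure, grouped by group_col."""
    df = pd.read_csv(csv_path)
    for measure in measures:
        summary = df.groupby(group_col)[measure].sum()
        ax = summary.plot(kind="bar", title=f"{measure} by {group_col}")
        ax.figure.tight_layout()
        ax.figure.savefig(f"{out_prefix}_{measure}.png")
        plt.close(ax.figure)

# Re-running the identical charts on a refreshed extract replaces the
# copy-paste-and-relink cycle described in steps 5 and 6.
build_charts("q3_actuals.csv", "region", ["revenue", "margin"], "q3")
build_charts("q4_forecast.csv", "region", ["revenue", "margin"], "q4")
```

The particular library doesn’t matter; the point is that the chart definitions live in one place and the data can change underneath them.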

It’s time to look for another metaphor for data analysis – one which does not require any of the ten steps listed above: an entirely new metaphor in which analysts look at their data the way they want to, from whatever perspective they want. We have created this metaphor in our Agylytyx Generator. Send us a quick email for a no-obligation demonstration and see how this different approach will save you a lot of time and help you make better business decisions.


12/3/14

Spreadsheets and Human Error – Solvable or Not

Recently a high-profile incident blamed on a “spreadsheet error” was widely reported. Apparently, the valuation model Goldman Sachs built for TIBCO contained a double-counting of the restricted stock. The impact was significantly more than a rounding error – 2.33% if our math is correct ($100 million off a $4.3 billion price tag). The acquisition is still going through, but many have already begun to speculate on potential litigation by TIBCO shareholders.
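For what it’s worth, the arithmetic checks out. A quick back-of-the-envelope check using the reported figures:

```python
# Back-of-the-envelope check of the figures reported above.
error = 100e6        # reported size of the double-counted restricted stock
deal_value = 4.3e9   # reported price tag
print(f"{error / deal_value:.2%}")  # prints 2.33%
```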

Advocates of self-proclaimed alternatives to spreadsheets have already been pointing to the entire incident as evidence that those nasty spreadsheets are the real culprit. In an “I told you so” kind of moment, vendors have seized (and no doubt will continue to seize) upon the magnitude of the amount involved and will use that amount as the basis for ROI calculations. We envision email blasts warning of greater and greater similar calamities looming if we don’t ditch the spreadsheets completely.

As we have written before, that is not realistic, and it is not likely to happen. Does that mean incidents like this one are inevitable? What about the ones that we don’t catch? Is it possible there are significant errors like this happening all the time? Even if it is not realistic to think that we should ditch spreadsheets, should we be looking at alternatives?

Our view on this issue is pretty straightforward: spreadsheets have their place in the organization. There are places where better tools exist, places where they do not, and places where users should look for applications which make the most out of our use of spreadsheets. In fact, we published an extensive analysis of these situations several months ago in our previous blog series, “Spreadsheets, What Are They Good For?”

An even simpler and more compelling point is this: no application can stop a person or group from making this type of mistake. We don’t see any way that any of the “alternatives” to spreadsheets could have prevented the kind of error reported here. No doubt many sets of eyes went over this model, but nobody caught the double-counting. No “alternative,” no matter how much it enables collaboration, can protect humans from themselves. The onus should be on vendors who want to use this incident to show exactly how their solution would have prevented it from occurring.


11/5/14

Thoughts from Day Three at AFP

The AFP (Association for Financial Professionals) 2014 Conference opened Sunday in our nation’s capital. Almost 7,000 finance executives from all over the country, in fact from all around the globe, are gathered here. Today was day three of the conference. Throughout the course of the day, we continued to talk with a wide sampling of folks in the FP&A community and heard several presentations. After all, we have four people on the ground here. Comparing notes, here are our collective thoughts from the third day.

1. Platforms need to be self service.

The increasing need for responsiveness and adaptation makes working with IT on every revised query undesirable. IT generally doesn’t like it either, and rarely has satisfactory turn times. The solution is a self-service platform that IT can implement once, and from which finance users can pull data from systems of record and quickly and easily build their own analytics.

2. Don’t bet against the US economy, especially over the long term.

Several forces combine to make the US economy a safe bet for global leadership. In addition to being the most adaptive to change, the US is poised to become the largest oil producer, has an efficient economic regulatory system (comparatively), and has a strong currency. In fact, as the world economy becomes more interdependent, it will become increasingly important for the U.S. to maintain a strong position.

3. When it comes to analytics, simple is better.

The analytics we saw in demos and in presentations ran the full gamut from very simple output to very intricate diagrams. Some of the more complex analytics clearly packed the most possible information into a single view. A few people like these, and those people are most likely to present only one or two such analytics in a single meeting in order to leave enough time to explain them. However, when given two contrasting choices in a side-by-side format, we found a consistent preference for the simpler analytic presentations. Simpler, more easily digestible graphics tend to provide better overall context for management decision making.

4. The FP&A Community is Still Too Insular at AFP

We wrote yesterday about the maturity of FP&A as a track at AFP. Those close to the community tend to be very understanding about the importance of FP&A, and it is clear that the AFP board is supportive of the certification approach. We explained that the level of content, coupled with the certification, puts FP&A on par with other disciplines at AFP (Tax and Treasury, for example). We continue to believe the FP&A community at AFP needs to think and act cohesively, and not talk about ourselves as “new” anymore. The track, its content, and its participants make FP&A look great within our community; externally, though, we may still have work to do. For example, a treasury professional leaned over to us at one party and asked, “what is FP&A?” It is incumbent on us to be able to answer that question with unanimity.

5. Don’t just do your FP&A job.

Thomas Friedman made this point best, but in fairness we heard it a lot. Because of the dynamic environment in which we live, it will be necessary for us to think like “artisans” and “entrepreneurs” in our jobs, always looking for ways to improve FP&A at our companies.

6. Bring in data from non-finance sources.

To help analytics continue to impact company strategy, or at least a company’s ability to execute, the data used in analytics needs, by definition, to include data from other sources like CRM and Salesforce databases.


11/4/14

Thoughts from Day Two at AFP

The AFP (Association for Financial Professionals) 2014 Conference opened Sunday in our nation’s capital. Almost 7,000 finance executives from all over the country, in fact from all around the globe, are gathered here. Today was day two of the conference. Throughout the course of the day, we continued to talk with a wide sampling of folks in the FP&A community and heard several presentations. After all, we have four people on the ground here. Comparing notes, here are our collective thoughts from the second day.

1. Analytics is a hot topic.

It seems like there are more presentations that mention the word analytics than ever before. It is on vendor signs and in hallway chats. We see this going one of two ways. The concept may become so prevalent that it comes to have a fairly standard meaning and becomes part of our vernacular, like “spreadsheet.” On the other hand, it might be the bright shiny new object that basically exists as a fad and falls into disuse, only to resurface occasionally, like “balanced scorecard.” Until we have grappled with this one for a while, it may be too early to predict which course the term will take.

2. Attempts at standardization are already taking place.

Continuing this thought from above, some (to their credit) have already tried to create some standardized meanings. One speaker drew what seemed to be relatively artificial distinctions between business intelligence, business analytics, and analytics for corporate performance management (CPM). By linking the value of analytics to corporate performance management, this speaker may have inadvertently contributed to the “buzzwordization” of our community. As if to further reinforce this idea, this same speaker stated that the term “predictive analytics” would eventually be replaced with “prescriptive analytics.”

3. For any finance systems, look beyond finance for requirements.

We have long known that finance constituencies are a lot broader than just finance. Even in forecasting or planning, for example, there are business inputs and executives reviewing the output. Very rarely does a finance system exist in a vacuum (meaning the application and its output will only be used by finance). Even if the application is used exclusively by finance, it’s a safe bet that the output will not be. The need to understand the user’s perspective is paramount. In order to best ensure the success and buy-in of something (whether it is an application or a process, for example), it is important to understand the requirements of the community outside finance. That is why the FP&A exam has a communication component, it is why so many finance teams lead strategy at their respective companies, and it is why several speakers pointed out the need for finance to coordinate their activities with sales and marketing (e.g., only doing value-creating activities).

4. FP&A at AFP is growing up early.

By this we mean that FP&A seems to be maturing before anyone thought it would. The FP&A track at AFP is already starting to show signs that it belongs. There is an interesting dichotomy at work here. There is a temptation to look at track attendance (especially for the late afternoon sessions) and conclude that there just isn’t interest in FP&A. We don’t think that is true at all. In fact, the fact that FP&A now has as many courses as other tracks at AFP, coupled with the fact that many sessions in other tracks are also sparsely attended at the end of the day, says quite the opposite. The party events (like tonight’s FP&A reception) were very well attended. In fact, with the rise in vendors, sessions, speakers, and a certification, FP&A actually looks a lot more like the other tracks at AFP. If anything, we need to stop talking about FP&A as a fledgling activity at AFP. Maybe it was three years ago, but it is already starting to feel mature.

5. The FP&A Certification Helps.

We noticed quite a few attendees with the red “FP&A” stripe on their AFP badge. Like the CTP designation, this means that the wearer has taken and passed the FP&A exam and met the experience requirements. We heard from panelists and non-panelists alike that the content on the FP&A exam was 1) common to FP&A and relevant to their jobs, 2) significant, and 3) difficult. Still, worldwide demand for the certification is high according to AFP, and it will help drive the maturity of FP&A as even more people add the red stripe to their badges.


11/3/14

Thoughts from the AFP Opening

The AFP (Association for Financial Professionals) 2014 Conference opened yesterday in our nation’s capital. Almost 7,000 finance executives from all over the country, in fact from all around the globe, are gathered here. Last night was the kickoff. The leadership of AFP spoke, followed by the Pinnacle Awards, then a session from Ben Bernanke (preceded by one of the nation’s top marching bands). The exhibit hall opened for a bit, and the opening reception took place as an impressive private function at the Museum of American History, complete with live music and (very convincing) impersonators of Abraham Lincoln and Uncle Sam. As usual, it was an auspicious start to what promises to be another substantive conference.

Throughout the course of the opening, we did have a chance to talk with a wide sampling of folks, from Fortune 50 FP&A executives to cash management leaders at brand-new companies to public sector banking officials. In addition, we overheard a lot of conversations. After all, we have four people on the ground here. Comparing notes, here are our collective thoughts after the first day.

1. The TARP program was a success.

Okay, Ben Bernanke made this point in a not-so-subtle way. In fact he called the program the most effective and least popular government program ever. The thing is, people tend to agree with him. The consensus reaction seemed to be “crisis averted, thank you very much.”

2. We are more stable today.

Yes, Ben Bernanke made this point too, and we didn’t get a measured response. But we also didn’t hear any objection, and we tend to agree. He pointed to Basel III, Dodd-Frank, and bank stress testing as things we didn’t have in 2006 that make us safer this time around.

3. The U.S. economy is poised for leadership long-term.

This point was a bit more controversial in that not everyone agreed on the reasons, but most people we talked with agreed on the premise that the U.S. economy is poised to lead again. For the most existential, it was an “if not us, then who?” perspective. For most (including Bernanke), it was fueled by confidence in the strength of future generations of American economic thought leaders coupled with a resurgence in economic diversification.

4. The diversification of AFP is underway.

The AFP leadership, conference speakers, tracks, and exhibitors all point to one thing: AFP is not just tax and treasury anymore. The tent is widening to include content and attendees from other disciplines within finance, most notably FP&A. A lot has been made over the last two years about the rapid rise of FP&A. The consolidation of certain pre-conference sessions suggests that this trend has slowed a bit from a very strong growth trajectory, but the commitment to content by AFP and the evolution of the membership base lead us to believe that the trend is still very strong and is likely to continue. We estimate that ultimately fully 20% of AFP members will identify themselves with FP&A.

Looking ahead, the agenda promises more content to come. We will keep you updated.


10/7/14

Scenario Planning in the Context of Long Range Planning – Creating Confusion or Clarity? Part 4 – Comments That Show Scenarios Aren't Helping with LRP Allocation Decisions (Decoding the Diplomatic Speech)

Our last post in this series ("Top 10 Reasons Stakeholders Give for Not Buying into LRP Scenarios") proved to be quite popular. We promised to focus on decision-makers and the way they make budget allocations during LRP time, and we want to keep that promise. Our goal in this series is always to help finance departments manage the LRP process. Our final post or posts in this series will focus on how finance departments can head off some of the problems we have identified and eliminate them as they occur. In keeping with the theme of lists, we have organized our thoughts for this post around some of the things decision makers say when they get scenario output. Because these decision makers are usually diplomatic, the probable interpretation of each statement is included.

The Top Things Decision Makers Say about LRP Scenarios and What They Might Mean.

1. "Where did you get this data?"

The probable meaning: "I have already heard through the grapevine that many people didn't provide complete data for these scenarios, so I don't trust them."

2. "Are these outcomes real?"

The probable meaning: "I don't see how we can predict such different outcomes from slight changes in affordable spend, so I have no confidence in the output."

3. "This analysis is too complicated."

The probable meaning: "I don't see an appreciable difference in the scenario outcomes anyway."

4. "What I really want to see is X."

The probable meaning: "I am not bought into these scenarios as they are crafted."

5. "What are the assumptions behind the data?"

The probable meaning: "If I have to ask for these details, I am probably too confused to make decisions on them."

6. "We can't afford to X right now."

The probable meaning: "We can't ask the organization to embrace this much change."

Of course, there is such a thing as reading too much into some statements. Also, each person and organization is different, so it's best to put these comments or questions in their proper context. One thing is certain, however: these are fairly common statements which indicate a lack of confidence in the scenario planning approach, and any of them could put the entire LRP process in jeopardy. Ignore them at your own risk!

In the series finale we will talk more about how to head off these types of comments, and what to do if you hear them.



9/19/14

Scenario Planning in the Context of Long Range Planning – Creating Confusion or Clarity? Part 3 – Top 10 Reasons Stakeholders Give for Not Buying In to LRP Scenarios

In Part I of this series we looked at some of the reasons that scenario planning fails so often during the annual budgeting process (which for the purposes of this series we have called LRP). In Part II we looked at the perspective of the "stakeholder," the executive tasked with providing the input and holding the budget. Next, we planned to look at the perspective of the executive decision maker, usually the CXO who makes the ultimate budget allocation decisions. Then our plan is to turn our attention to the finance organization administering the LRP process, standing between the stakeholders and the executive decision makers, collecting information from one and providing analysis to the other. Finally, we plan to wrap the series up by attempting to identify ways to make scenario planning more effective in the LRP process.

Before we look at the executives who are the constituents of LRP output, we think it is worth one more post considering those stakeholders who are tasked with providing input to the LRP process, and their relationship to scenario development. In the spirit of the top 10 lists that have become so popular, and since many companies are right in the midst of (or about to kick off) their data-gathering activities, we thought it would be a good idea to publish our view of the top 10 reasons stakeholders give for "opting out" of scenario planning inputs during LRP time. In a future post, we'll talk a little about how to head these off or address them directly. Many of them do not reflect particularly good logic; nevertheless, they are frequently used. We have heard them all. It is worth noting that they are in no particular order and are often not mutually exclusive.

1. The scenario involves reduced affordability.

The thinking: "why should I provide any information for a scenario which might provide decision makers with a reason to reduce my budget?"

2. The scenario is irrelevant to stakeholders.

The thinking: "why should I provide data for a scenario that (for example) affects markets which we don't even address?"

3. The scenario is needlessly complex.

The thinking: "This scenario is so complicated that no one will ever be able to understand it, much less act on it."

4. Stakeholders don't feel comfortable with the answer.

The thinking: "I am not sure how this scenario will impact our group."

5. Stakeholders think creating the answer isn't worth the effort.

The thinking: "Our people can probably invest the time to figure out the answers, but the impact to the business is too great."

6. Too many data requests already.

The thinking: "They are asking too much of us if they expect us to provide data to support all these different possible scenarios."

7. No confidence/lack of visibility into the way the data will be used.

The thinking: "We are not going to have any input into how the data for this scenario is consolidated or presented."

8. Concern over how others will complete the data for the scenario.

The thinking: "Others may use this scenario to their advantage by providing data for this scenario that paints an overly optimistic picture."

9. Impossible to use the existing templates as designed.

The thinking: "my data response doesn't fit cleanly into the existing LRP template. I sure wish we could also provide X data for the proper context, but there really isn't a way to do that."

10. Lack of strategic interest.

The thinking: "there is no way to represent the strategic importance of this initiative in quantitative terms."

All of these objections can be either headed off, answered, or both, but they all represent real threats to the development of a realistic scenario approach in long range planning. In the next post, we will turn our attention to some common problems facing those presenting the strategic options in LRP scenarios, and we will do so through the eyes of a decision-maker. In the final post (or possibly two) we will talk specifically about how to head off and address these objections, as well as ones frequently heard at decision-making time.


8/28/14

Scenario Planning in the Context of Long Range Planning – Creating Confusion or Clarity? Part 2 – Stakeholder Complications for Scenario Planning in LRP

When it comes to resource allocation, stakeholders get sensitive. They get extra-sensitive when it is annual budget time and their teams and projects are on the line and being scrutinized. Different organizations handle their planning processes in different ways and call them different things. For the sake of this post, we're going to refer to the annual budgeting process as the long range planning process (LRP).

In LRP some companies attempt to incorporate scenario planning into the data collection process. Most frequently, companies look for an "upside" and "downside" scenario. In many cases, the most common guidance is "what could you accomplish if you had X% more" and "what would your results be if you had X% less." In more cases than we'd probably all like to admit, we've seen institutional "pushback" in the form of business constituents who conspire to create unrealistic cases on either side, or even refuse to submit a particular case.

The other type of scenario planning which frequently occurs in the LRP process is the kind of analysis of the output that leads to the creation of "what-if" scenarios e.g. "what happens to the portfolio if we were to spend our money slightly more geared toward the development of new offerings?"

Each of these types of scenarios requires educated guesswork and assumptions on the part of the business constituent and/or the planning analyst. A recent response from a finance executive to our previous blog post about scenario planning indicated that the reason scenario planning usually fails is the lack of consensus building in the process. We would agree, and note that this generally occurs because there is no consensus around the assumptions to begin with, and therefore building confidence around scenarios becomes impossible.

This is why scenario planning within the LRP process is hard. There really isn't a good way to do it. In theory, understanding the implications of allocating affordability along different vectors is a very desirable thing to do. Understanding the implications of spending in different ways gives executives the visibility and control they desire. In practice, though, creating these scenarios within the context of an LRP process is difficult.

When so much is on the line, one of the easiest things for a stakeholder to do is to cast doubt on the scenario planning part of the LRP process, or even refuse to participate. Successful scenario planning in the LRP process starts with confidence building and consensus buy-in. As a finance executive recently noted (prompted by our previous blog post), most scenario planning processes fail because of a lack of consensus building.

In this post, we have focused on the need for consensus building around the assumptions behind scenarios in the LRP process. In the next post, we'll comment on something to which we allude in this post: the lack of consensus that can develop around the scenarios themselves. Finally, we'll turn our attention to the consensus-building problem around the output of the scenarios.


7/10/14 – GUEST POST

Where We Are Headed – Using Analytics to Manage Risk and Improve Global Performance

Agylytyx (www.agylytyx.com) is on the right track – creating a powerful suite of planning and analysis tools that give senior financial management professionals the ability to instantly access global ERP data and quickly "drill down" using analytics which develop actionable management insights and improve business performance. Taking hundreds of thousands of real-time data points and quickly distilling them down to create easy-to-understand, actionable reports and charts is a powerful management capability. I have agreed to serve as a Senior Advisor to the Agylytyx team, and I share their vision on the FP offerings and the powerful new benefits offered to users.

What I personally find most compelling are the risk and opportunity analytic benefits to Fortune 500 users. Most analytics today focus on internal management, drawing upon worldwide ERP and other info to gauge and optimize business performance. External market and risk assessment analytics are often less structured, more "one-off," and as mentioned in recent FP blog posts, challenging metrics around which to build management consensus.

Companies typically address external market and risk assessment today using proven, traditional tools and methodologies taught to business school students and used in most Fortune 500 companies. Let's discuss three of these.

First, Porter's Five Forces Model, which provides insights on market direction and attractiveness. Simply stated, if a market is highly competitive, may be impacted by substitute technologies, and is being "attacked" by new entrants seeking to capture market share, one would expect it to remain highly competitive with declining profit margins, and the market is therefore considered "less attractive." By contrast, if the market is more stable, competition is minimal, and the impact of these market forces is less severe, one would consider the market "attractive," primarily because such a market is capable of supporting higher profitability as demand develops. While developed more than three decades ago (Michael Porter, 1979), the model has relevance today, and in some ways, given external market uncertainties, even more so. It is worth noting that the term "market" can be defined as an overall market (e.g., generic pharmaceuticals) or a targeted product market (e.g., topical antiseptics). This distinction has relevance given Agylytyx scenario and "drill down" analytics, as we will see shortly.

A second such model often used is STEEP analysis – a traditional, structured strategy management tool addressing the external market environment. STEEP analysis addresses the following representative issues:

Social – demographics, wealth distribution trends, social and cultural trends that may impact the business

Technological – current and emerging technology, product and process innovations, R&D efforts

Economic – GDP, interest rates, fiscal and monetary policy, raw material sourcing, exchange rates

Ecological – 'greening' impact, environmental regulations current and future, renewable energy impact

Political – legal and regulatory issues, stakeholders influencing the business

Yet a third model involves competitive benchmarking – identifying the top five competitors for each "Portfolio of Interest" or "POI," and defining and tracking the 3 to 5 key benchmark metrics for each (e.g., growth of same-store sales, number of users, etc.).

Each of these models (and other similar approaches) is frequently used to assist in long term strategic planning. This is often an ad-hoc, "give and take" process which may involve many staff, resources, and time. While this process works, it is inefficient and cannot possibly address all risks and opportunities. The classic example was Monsanto's inability to foresee all the market and regulatory issues related to offering genetically modified ("GM") products within EU markets – a major setback to Monsanto's global GM business at the time. Securing real-time external market analytics, and understanding, with precision, how these insights impact risk and opportunity in global Portfolios of Interest, is a powerful strategic and competitive capability.

Coupling Agylytyx real time data analytics and scenario generators with predictive analytics to address external market risks and opportunities provides an exciting portfolio of new analytic capabilities that complement and streamline today's traditional strategy management process. The ability to:

Draw upon hundreds of thousands of value chain data points

Instantly generate alternative "what-if" business scenarios and evaluate their impact

Track and analyze real-time unstructured data sources (e.g., news feeds, periodicals, market reports, etc.) and integrate these inputs to refine and enhance a POI's external market outlook.

These are some of the many capabilities available which provide management with a way to integrate both internal and external data and assessments for all global Portfolios of Interest.

Of course, risk and opportunity are complementary. If, for example, analysis identifies a risk to an existing business (e.g., a new regulation in Germany imposing new conditions on financial services), that same risk may really be an opportunity to create a new Portfolio of Interest. Risk and opportunity analysis go together, and scenario analysis plays a key role here to test "what-ifs" – a task that has traditionally been hard to accomplish in a strategic planning environment.

Proven predictive analytics coupled with the Agylytyx Generator's scenario generator and analytics is now making this all possible. You can expect to hear more about these exciting new capabilities in coming months.

This guest post was provided by Paul B. Silverman. Paul B. Silverman is an Advisor, Executive, Speaker, and Educator, and has more than 35 years of senior corporate management, management consulting, and entrepreneurial experience. He serves as Managing Partner of the Gemini Business Group, LLC, a new venture development firm, and is former Executive Chairman and CEO of InferX Corporation, a predictive analytics company. He also serves as Adjunct Professor in the R.H. Smith School of Business at the University of Maryland. He can be reached at paul@paulbsilverman.com or on his blog at www.paulbsilverman.com/blog.


6/19/14

Scenario Planning in the Context of Long Range Planning – Creating Confusion or Clarity? Part 1

When it comes to resource allocation, stakeholders get sensitive. They get extra-sensitive when it is annual budget time and their teams and projects are on the line and being scrutinized. Different organizations handle their planning processes in different ways and call them different things. For the sake of this post, we're going to refer to the annual budgeting process as the long range planning process (LRP).

In LRP some companies attempt to incorporate scenario planning into the data collection process. Most frequently, companies look for an "upside" and "downside" scenario. In many cases, the most common guidance is "what could you accomplish if you had X% more" and "what would your results be if you had X% less." In more cases than we'd probably all like to admit, we've seen institutional "pushback" in the form of business constituents who conspire to create unrealistic cases on either side, or even refuse to submit a particular case.

The other type of scenario planning which frequently occurs in the LRP process is the kind of analysis of the output that leads to the creation of "what-if" scenarios e.g. "what happens to the portfolio if we were to spend our money slightly more geared toward the development of new offerings?"

Each of these types of scenarios requires educated guesswork and assumptions on the part of the business constituent and/or the planning analyst. A recent response from a finance executive to our previous blog post about scenario planning indicated that the reason scenario planning usually fails is the lack of consensus building in the process. We would agree, and note that this generally occurs because there is no consensus around the assumptions to begin with, and therefore building confidence around scenarios becomes impossible.

This is why scenario planning within the LRP process is hard. There really isn't a good way to do it. In theory, understanding the implications of allocating affordability along different vectors is a very desirable thing to do. Understanding the implications of spending in different ways gives executives the visibility and control they desire. In practice, though, creating these scenarios within the context of an LRP process is difficult.

When so much is on the line, one of the easiest things for a stakeholder to do is to cast doubt on the scenario planning part of the LRP process, or even refuse to participate. Successful scenario planning in the LRP process starts with confidence building and consensus buy-in. As a finance executive recently noted (prompted by our previous blog post), most scenario planning processes fail because of a lack of consensus building.

In this post, we have focused on the need for consensus building around the assumptions behind scenarios in the LRP process. In the next post, we'll comment on something to which we allude in this post: the lack of consensus that can develop around the scenarios themselves. Finally, we'll turn our attention to the consensus-building problem around the output of the scenarios.


6/12/14

Scenario Planning – Forward Movement or an Exercise in Futility?

Most of us have been involved in a lot of scenario exercises in our day. Most commonly, scenarios in finance involve potential budget reallocations and are frequently associated with the planning exercise. Sometimes there is a lot of available planning data on which to base scenarios, often the data for the scenarios is based on assumptions, and frequently the scenarios themselves are purely hypothetical.

Then there is always the question of what to use as a baseline for the scenario during the planning process. Usually it is the existing forecast (what is frequently dubbed the "business as usual" scenario). And then there are the tools chosen to support this exercise. Usually this takes one of two forms: 1) putting a "square peg in a round hole" by using a scenario feature or function that exists within a current planning data source, or 2) pulling data into Microsoft Excel and using features like Goal Seek.
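For anyone who hasn't leaned on it, Goal Seek simply searches for the input value that makes a model hit a target output. A minimal sketch of that idea, using an entirely hypothetical toy model and numbers, looks like this:

```python
# A minimal sketch of what a "goal seek" does: search for the input that makes
# a model hit a target output. The model and figures below are hypothetical.
def operating_income(spend):
    # Toy model: revenue responds to spend with diminishing returns.
    revenue = 500.0 * (spend ** 0.5)
    return revenue - spend

def goal_seek(f, target, lo, hi, tol=1e-6):
    """Bisection search for x in [lo, hi] where f(x) hits target (f increasing on the range)."""
    mid = (lo + hi) / 2.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if tol > abs(f(mid) - target):
            break
        if target > f(mid):
            lo = mid
        else:
            hi = mid
    return mid

# What spend level yields $40,000 of operating income in this toy model?
print(round(goal_seek(operating_income, 40_000, 1.0, 60_000.0), 2))  # -> 10000.0
```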

One way or another, scenarios are often created in the planning process. A common problem is understanding what these scenarios mean in terms of their impact on the budget. Making the scenarios "actionable" by translating their effect into budget terms is a step that is often omitted. In these frequent cases, there may be a moment of rational self-clarity when an executive team realizes that the scenarios they are examining are in fact not actionable for the company. More commonly, the lack of confidence associated with a company's inability to execute on a scenario leads an executive team to revert to the "original" scenario, which is in fact one which changes the current course and speed very little.

This is the unfortunate reality of many planning processes. It isn't because the scenarios under consideration aren't good ideas. In most cases, an executive team would not be considering them if they weren't.

Two major problems exist which make scenario planning today such an exercise in futility: 1) scenarios generally aren't actionable (meaning they aren't linked to budgets effectively) and 2) scenarios can't be easily evaluated, meaning the implications of the choices between the options can't be easily compared.

It is a rare case in which the tools supporting a process are the reason the process breaks down, but that may actually be the case here. What is missing in scenario planning isn't the ability to make scenarios; it is the ability to translate their impact to budgets and to compare the implications of the scenarios reliably and quickly. Getting forward movement on scenarios requires a tool that is built specifically for the purpose of translating scenarios into something actionable and that allows for quick and easy side-by-side comparison of the scenarios under consideration.


5/22/14

Strategic Acquisitions – Truth or Oxymoron

Acquisitions are typically viewed as strategic to a company. Often they are done in the name of corporate growth – entering a new market or a new geography for example. Sometimes they are done in order to remove a large competitor. Occasionally they are done in order to supplement a firm's capabilities – a technology acquisition for example. In all these cases, the rationale that is messaged to external investors is a strategic one – management creates a compelling case as to why the company was absorbed.

These explanations sound great and often make sense to everyone who hears them. After all, management teams and the consultants who advise them in such transactions are usually knowledgeable, respected persons.

So why do so many acquisitions fail? The numbers are pretty bleak – a staggering percentage of acquisitions not only fail to live up to their business cases, they are often not even accretive to shareholder value. With some pretty hefty price tags for a lot of these acquisitions, the results for a company can be devastating.

The reason acquisitions usually fail has nothing to do with the soundness of the strategy – it is the inability to integrate the acquisition into a company's business processes. Desperate to make acquisitions accretive, companies have tried all kinds of tactics – from creating ad hoc teams to help integrate a company, to making integration a centralized service, to leaving an acquisition unintegrated. In some cases, all approaches have been tried.

Ultimately, integrating acquisitions usually fails not because the acquired company does not fit in with corporate strategy in an explainable way. Acquisitions do not typically fail to achieve desired results because they cannot find a way to use a company's existing website, ordering systems, financial applications, etc.

Acquisitions most often fail for the same reason large companies have a hard time executing their own strategy – they cannot budget according to their strategy. Thinking that an acquisition can survive and thrive in an environment where budgeting doesn't reflect corporate strategy is a mistake. Trying to integrate a strategic acquisition into an already difficult planning process is the surest way to make sure acquisitions are not, in fact, strategic. A strategic acquisition may be the intention, but the truth is it is usually tantamount to throwing good money after bad.


5/8/14

Consultants, Systems, Queries, and Reports – The Finance Treadmill

Sometimes it seems like it never ends. The same reports should be easy to produce each quarter. But they aren't. Reorgs occur, SKUs and PIDs get rebundled, departments get renamed, partners merge, companies get acquired, and on and on and on. Data hierarchies struggle to keep up. In the midst of all this change, systems go down for maintenance or upgrades or some such thing (was it planned or not, who can say?). The constant changes have even spawned a cruel yet somehow sensible name for running comparisons in this kind of environment – "as is" and "as was" reporting.

All of these circumstances make the consultants who evolve ERP systems a seemingly permanent fixture in most large corporations. The cloud was supposed to alleviate much of the need for long system integrations. But regardless of where a tool is located, the regular changes require people to keep up with them, and those people are usually consultants who have knowledge of the system(s) in question. Be very wary of any company that says its reporting engine can be implemented quickly and easily.

In the midst of this regular finance treadmill, there is still the nagging question of support for business strategy. How can an organization possibly hope to answer big, important strategic questions about the business when it can barely manage to keep numerical reporting supported from quarter to quarter?

It's no wonder then, that finance organizations pull data from these multiple systems into a spreadsheet in order to support ad hoc strategic queries. There has to be a better way to perform visual analysis, and there is. We know because we built the product to deal with producing both regular and ad hoc graphical analysis and reporting in such a difficult environment. The life of financial reporting may be a treadmill, but stepping off the treadmill to help support the important decisions is now possible.


4/25/14

The Appropriate Role of Spreadsheets in Finance –
What Are They Good For? Part Five – Series Recap

In this series on spreadsheets in finance, we have looked at where spreadsheets cause nothing but trouble, where they need to be augmented, and where they tend to do just fine as they are. If you are keeping score, here is a recap of our recommendations, along with specific vendor suggestions:

Creating Templates
Consider: Microsoft Excel is as flexible a template design tool as you can get.
You may want to look at: Look no further if you are comfortable using Excel.

Consolidating Information
Consider: Consolidating information from templates is generally a manual exercise. At best, you are wasting your team's time. At worst, you will end up with errors. Collect information with templates, but look for a solution that gracefully handles the import of information from spreadsheets and automatically consolidates them.
You may want to look at: Planview has a great solution. Their application is naturally suited to importing templates which have already been created and issued. Best of all, the information is all gathered into a ready-to-use rollup mechanism.

Aligning Groups
Consider: You are definitely wasting your team's time trying to manage revisions and coordinate requests, especially in a matrixed organization. Look for an application which supports multiple simultaneous users and has self-service access controls built in.
You may want to look at: Again, here we recommend Planview as the best alternative. Because the company comes from the project and portfolio management space, their application is robust enough to handle multiple simultaneous users, and innately understands self-service access controls, hierarchy definitions and changes, etc.

Pulling Data
Consider: We generally love to put information from systems of record into spreadsheets. Look for a data solution that produces data "cubes" that you can easily query using Excel.
You may want to look at: Essbase will do the trick here.

Building Models
Consider: Excel is generally great for building models. If you do want to supplement it, we have some recommendations here:
You may want to look at: Anaplan – great for building flexible and predictive models from existing financial data. SmartOrg – great for building custom models from financial data which help evaluate strategic options. Crystal Ball – a powerful optimization tool with a very technical interface.

Comparing Scenarios
Consider: Spreadsheets are notoriously bad for comparing scenarios they have modeled. In fact, none of the tools mentioned above are particularly effective at scenario comparison. There just aren't many applications out there which enable this comparison.
You may want to look at: Agylytyx Generator – with its side-by-side comparison function and its ability to analyze scenarios from multiple perspectives, the Agylytyx Generator is an ideal tool for this task.

Making Charts
Consider: Excel is very flexible at making charts. The downside: the amount of time invested in creating a single chart can be daunting.
You may want to look at: Agylytyx Generator – the Agylytyx Generator has all the possible measure and attribute combinations built into the product already, so users never build charts; they make and apply entire analytic templates so multiple charts are created immediately.

Reporting
Consider: Excel is good at designing reports, but prone to error. These errors become more likely the more data is involved. And no one wants to be the one preparing these reports each quarter or month.
You may want to look at: Unfortunately, as we saw in this series, there is no good reporting alternative which we are comfortable enough with to recommend, so it looks like we are stuck with Excel for now.

Of course, our vendor recommendations come from experience. We have used a lot of applications in this space. To be fair, we haven't used or seen everything out there, but we do feel comfortable enough in our knowledge to make these recommendations. Every firm's situation is a little different, so we recommend using a third party to help evaluate and select applications suitable for your specific situations.

Above all, do not believe everything you read about spreadsheets in finance. We are especially wary of vendors who bash spreadsheets too much, or who make bold claims about displacing spreadsheets. If nothing else, this series should reveal our belief that spreadsheets are tools of our trade, and we need to look for ways to get the most out of them, save time, and prevent errors, not try to displace them.


4/10/14

The Appropriate Role of Spreadsheets in Finance –
What Are They Good For? Part Four

In part II of this series, we took an extensive look at eight situations where spreadsheets are commonly used in corporate finance today. In part III, we looked at where a company should continue the use of spreadsheets either on a standalone basis (keep) or with a supplemental tool (supplement), and where it should probably look at replacing them altogether (ditch). In part III, we also promised a quick peek at some of the other tools worth looking at in order to supplement or replace spreadsheets. This part IV isn't a comprehensive guide to vendors. Still, this post should provide a pretty good place to start, and we will be only too glad to supplement our analysis if we are contacted.

One of the places we looked at as a potential supplement to the spreadsheet is the consolidation of information. We noted that spreadsheets generally work when consolidating information from various sources. But we also noted that consolidating information from multiple systems should not really be necessary. There are a few vendors out there that claim to be able to handle all the data required and actually perform the consolidation themselves. When these systems are broadly implemented, consolidation and rollup can and should happen within the system, eliminating the potential for error for which spreadsheets have become infamous. The traditional ERP vendors like SAP and Oracle have acquired and integrated consolidation modules. Recent cloud-based vendors like NetSuite/Adaptive Insights and Host Analytics are also advertising single-source-of-truth consolidation points. We rated this "supplement (then possibly ditch)" for a reason – truly implementing one of these systems so that spreadsheet consolidation can be completely avoided is a time-consuming exercise, especially in large corporate finance departments.

The notion of using spreadsheets to align groups is an interesting one. In part II we indicted the use of spreadsheets to perform this task. In Part III we provided more specific analysis as to why spreadsheets were inappropriate for aligning budgets from various business units and functions. We mentioned that there is no clear winner when it comes to alternatives. Here we recommend looking outside the finance arena for an application which handles multiple versions, multiple simultaneous users, various access controls, and workflows graciously. Specifically, Planview has an excellent application that, as it turns out, is well-suited to the complex task of aligning budgets in large corporate environments. It turns out to be surprisingly flexible and easy to implement as well.

We did rate the capability to pull data into a spreadsheet as a reason to "supplement" the spreadsheet. In doing so, we recommended that we all recognize the fact that Excel is often used to query databases, and that we should therefore align ourselves with tools that lend themselves well to such queries. Some vendors produce applications which make it quite difficult to use Excel to get at information. Other vendors handle such queries well.
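To illustrate the pattern we mean (a sketch only, with a hypothetical database, table, and file standing in for a real ERP or EPM data mart), pulling a query result straight into a workbook can be as simple as:

```python
# A minimal sketch of the "query a system of record, land it in Excel" pattern.
# The database, table, and file names are hypothetical placeholders.
import sqlite3
import pandas as pd

conn = sqlite3.connect("finance_mart.db")  # stand-in for the real source system
query = """
    SELECT region, account, SUM(amount) AS actuals
    FROM gl_detail
    WHERE fiscal_quarter = '2014-Q3'
    GROUP BY region, account
"""
df = pd.read_sql_query(query, conn)
df.to_excel("q3_actuals_pull.xlsx", index=False)  # ready for analysts to slice in Excel
conn.close()
```

The same pattern works whether the landing spot is a workbook, a cube, or a visualization tool; the query, rather than the copy-and-paste, becomes the thing you maintain.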

When it comes to modeling scenarios, we looked at length at the power of Excel. We were equally pessimistic about spreadsheets when it comes to comparing scenarios which have been created. Until recently, there simply hasn't been a good way to evaluate the strategic implications of scenarios in a side-by-side way. This is one of the reasons we created the product called the Agylytyx Generator. By enabling quick and easy side-by-side analytic displays of scenarios which have been modeled, the Agylytyx Generator is an application uniquely suited for this purpose.

Excel is pretty good at making charts. The visualization controls in Excel, its flexibility, and its proximity to the data all combine to make it a great choice for making individual charts. There are several product categories which have built tools to emulate chart creation capability. These "classes" of product include:

"Better mousetrap" applications. Either because they are closer to the data or because they enable faster "measure and attribute" use, products like Lumira from SAP have an inherent advantage over Excel.

"Prettier picture" applications. There are some applications like Tableau that present an impressive array of built-in visual charts, and which can be pointed at a particular data set to produce renderings.

"Technical expert" applications. There are some applications like Tibco's Spotfire, which automatically create impressive visuals based on data, but require a deep understanding of the way in which data effects the pictures.

"Report writer/dashboard creator" applications. These are usually part of a larger package like Business Objects within SAP. These applications are ideal for creating canned reports from existing data streams, but are relatively inflexible and harder to edit on the fly.

All of these approaches, including the Excel approach, have two things in common – they all use a "trial and error" approach to uncovering business insights and they all require users to tweak the built-in charts one at a time. We call that a "stumble upon" approach. This was another reason we built the Agylytyx Generator. Agylytyx Generator users never build charts. They create and apply entire templates drawn from thousands of built-in, pre-canned business metrics. This is an entirely different metaphor for chart building, and it makes Excel (and the other product categories mentioned above) obsolete.
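To make the template metaphor concrete, here is a minimal, hypothetical sketch in Python using pandas. A "template" is treated as a named collection of metric definitions that can be applied wholesale to any dataset with the right fields, instead of charts being built one at a time. The dataset, field names, and metrics are all invented for illustration; this is not the Agylytyx Generator itself, only the general idea.

    import pandas as pd

    # Hypothetical plan data: one row per business unit.
    plan = pd.DataFrame({
        "business_unit": ["BU-A", "BU-B", "BU-C"],
        "revenue":       [120.0, 95.0, 60.0],
        "opex":          [70.0, 55.0, 40.0],
        "margin":        [30.0, 22.0, 9.0],
    })

    # A "template" here is just a named collection of metric definitions
    # that can be applied to any dataset containing the same fields.
    cfo_template = {
        "Revenue mix (%)": lambda d: 100 * d["revenue"] / d["revenue"].sum(),
        "OPEX ratio (%)":  lambda d: 100 * d["opex"] / d["revenue"],
        "Margin rate (%)": lambda d: 100 * d["margin"] / d["revenue"],
    }

    def apply_template(template, dataset):
        # Apply every metric in the template to the dataset in one pass,
        # instead of building each view one at a time.
        out = dataset[["business_unit"]].copy()
        for name, metric in template.items():
            out[name] = metric(dataset).round(1)
        return out

    print(apply_template(cfo_template, plan))

The design point is that the template, not the chart, is the reusable unit: the same collection of metrics can be pointed at a plan, a forecast, or a scenario without rebuilding anything.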

Reporting is a discrete task often done in Microsoft Excel, and that should never be the case. A corporate financial analyst at a Fortune 10 company told us recently that "reporting tools suck," and that is a painfully true statement. In our opinion there is room in the marketplace for a purpose-built report writer, so that folks shouldn't have to use spreadsheets to do this job. We are at a loss for a good recommendation here, though.

Those who have read the entire series, or even the last two parts, should be left with the distinct impression that we understand that spreadsheets will not be, and should not be, displaced in corporate finance. They should, however, be applied judiciously and to the tasks for which they are best suited in a corporation. In most corporations, the Agylytyx Generator represents a way to make the most of spreadsheets and significantly supplement their analytical capabilities. The goal of the Agylytyx Generator is to significantly reduce the reliance on spreadsheets, freeing up analysts' time to do more of the "A" in "FP&A" and spend less time in the spreadsheet doing the "FP" work.


3/25/14

The Appropriate Role of Spreadsheets in Finance –
What Are They Good For? Part Three

In part II of this series, we took an extensive look at the use of spreadsheets in corporate finance today. We looked at eight situations where they are commonly used today:

Creating Templates

Consolidating Information

Aligning Groups

Pulling Data

Building Models

Comparing Scenarios

Making Charts

Reporting

This is not to say that there are not other uses of spreadsheets. They are frequently used for other purposes, like databases. But these are the most common uses in the finance community of which we are aware.

In this part III, we'll look at where we should continue the use of spreadsheets either on a standalone basis (keep) or with a supplemental tool (supplement), and where we should probably look at replacing them altogether (ditch). Our conclusions might surprise you, and may even be somewhat controversial. Nevertheless, our experience tells us that:

Spreadsheets tend to be good places for creating templates. If we just look at template creation, and not at actually implementing the template, it is hard to beat the design flexibility of a spreadsheet. True, a more IT-centric type might choose an HTML tool or a Visual Studio application, but for a finance user, creating a template from a spreadsheet works just fine. We vote "keep" on this one.

Spreadsheets tend to work just fine at consolidating information, whenever that information comes from disparate systems. When information comes from a single system, even if there are different sources for the information, consolidating data in a spreadsheet becomes a duplicative task and subjects the consolidation to unnecessary error. For these reasons we say the best choice is "supplement." Then maybe eventually "ditch."

Spreadsheets can be used for aligning groups in the budgeting process. We have certainly seen it happen, even in very large company environments. It typically requires a lot of manual effort and brute force, and even then the alignment tends to be error prone simply because there is so much data flying everywhere. Using spreadsheets to align groups results in bad business decisions at best, and often the result is thrown out due to a lack of confidence in it anyway. For these reasons, we vote to "ditch" spreadsheets in this instance.

Pulling Data can be very easy using spreadsheets as long as the data source supports queries from a tool like Microsoft Excel. The advantages are obvious – data pulled directly into a spreadsheet comes from a system, so it is verified, but then has the advantage of flexibility. Unfortunately, not all systems are compatible with such SQL pulls, so care is warranted: the "right" tools must be selected for spreadsheet compatibility, or one will end up with a two-step import or even a copy-and-paste process. This is why we say the best choice is to "supplement" here.
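As a rough illustration of what a clean, one-step pull looks like when the source does support queries, here is a minimal sketch in Python using pandas. The database, table, column, and file names are all invented; an in-memory SQLite database stands in for whatever warehouse or ERP reporting source a company actually uses.

    import sqlite3
    import pandas as pd

    # An in-memory SQLite database stands in for the corporate warehouse
    # (Oracle, SQL Server, an ERP reporting schema, etc.).
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE bookings_fact (
            region TEXT, product_line TEXT, fiscal_quarter TEXT, bookings REAL);
        INSERT INTO bookings_fact VALUES
            ('Americas', 'Widgets', 'FY14Q1', 1200),
            ('EMEA',     'Widgets', 'FY14Q1',  800),
            ('Americas', 'Gadgets', 'FY14Q1',  450);
    """)

    query = """
        SELECT region, product_line, SUM(bookings) AS bookings
        FROM bookings_fact
        WHERE fiscal_quarter = 'FY14Q1'
        GROUP BY region, product_line
    """

    # One-step pull: the data arrives verified from the source system and
    # lands directly in a workbook for further spreadsheet analysis,
    # avoiding the two-step import or copy-and-paste described above.
    df = pd.read_sql_query(query, conn)
    df.to_excel("q1_bookings_pull.xlsx", index=False)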

Spreadsheets are generally quite good at Building Models. If one doesn't know how to build models in Excel, the problem isn't the tool, it's the user. It is very rare that something more powerful than Excel is needed in order to build models. If it is, call a PhD in Statistics, get a mainframe, and use a product from SAS Institute. The only real option for a savvy finance user is "keep" here.

While spreadsheets may be good at building models, they are generally quite inefficient when it comes to comparing scenarios. Here we are not talking about changing a variable or using goal seek in order to find sensitivity. We are talking about comparing the strategic implications of two different scenarios. The only way we know to do this is to build charts and toggle back and forth between tabs. Even placing charts side by side in a spreadsheet tab and linking them to the scenarios is cumbersome, difficult, and error prone. Here the spreadsheet has finally outlived its usefulness and we need to vote "ditch" in favor of another tool (even if it is – ugh – PowerPoint).
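For a sense of what "side by side" can mean outside the spreadsheet, here is a minimal sketch in Python with pandas and matplotlib: the same chart rendered once per scenario on shared axes, so the comparison is visual rather than tab-to-tab. The scenario names and figures are made up purely for illustration.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical scenario outputs: revenue by business unit under a
    # "Business as Usual" plan and an "Emerging Markets" alternative.
    scenarios = {
        "Business as Usual": pd.Series({"BU-A": 120, "BU-B": 95, "BU-C": 60}),
        "Emerging Markets":  pd.Series({"BU-A": 105, "BU-B": 90, "BU-C": 85}),
    }

    # One panel per scenario, shared axes: the same chart rendered for each
    # scenario so the comparison is genuinely side by side, not tab-to-tab.
    fig, axes = plt.subplots(1, len(scenarios), sharey=True, figsize=(8, 3))
    for ax, (name, revenue) in zip(axes, scenarios.items()):
        revenue.plot(kind="bar", ax=ax, title=name)
        ax.set_ylabel("Revenue ($M)")
    fig.tight_layout()
    fig.savefig("scenario_comparison.png")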

Spreadsheets are good up to a certain point for making charts. They are very flexible, and can create multiple chart sets from static data. On the other hand, using spreadsheets to analyze information that is in pivot tables and using PowerPivot to create charts is an iterative process – charts become fleeting when they are not saved, and when they are saved they often break or become difficult to regenerate. A way to make charts more easily and automatically is needed. That is why we feel spreadsheets really need to be "supplemented," especially if something more than static charts is required.

Often spreadsheets are used for reporting, simply because (as a Fortune 10 FP&A team member put it to us recently) "reporting tools suck." Purpose-built tools that fit finance reporting needs usually don't exist. Standard reporting formats can be created in just about any tool. The problem is that reporting requirements in a large corporate finance environment change so frequently that it almost feels quarterly. Perhaps those frequent small changes are why we like the flexibility of spreadsheets so much for our reporting tasks. Of course, reports from other tools can be evolved as well, which is why we reluctantly assess this one a "supplement" as well.

If you are keeping score at home, the checklist below summarizes the places we would keep, supplement, or completely ditch spreadsheets in finance. In part IV of this series, we will look at some of the other tools available for the "supplement" and the "ditch" strategies. As a little preview, we will need to look at different tools for different situations.

Spreadsheet Design Checklist
Creating Templates: Keep
Consolidating Information: Supplement
Aligning Groups: Ditch
Pulling Data: Supplement
Building Models: Keep
Comparing Scenarios: Ditch
Making Charts: Supplement
Reporting: Supplement


3/13/14

The Appropriate Role of Spreadsheets in Finance –
What Are They Good For? Part Two

Spreadsheets are the "Swiss Army Knife" of the finance world. In part one of this series, we looked at how ingrained spreadsheets have become in our lives. Much like a "Swiss Army Knife" spreadsheets have many uses and we are so used to using them that we even often use them when there are better tools for the job. Usually it is faster and easier and very effective to use the spreadsheet anyway. Let's take a little closer look at some of those situations.

Creating Templates
We often use spreadsheets to create templates to be used in collecting information. Most of us have developed standard conventions, like using color-coded fonts to designate the information to be contributed and locking down cells which we don't want others to overwrite. Using spreadsheets for this purpose usually takes us a few hours at most, and we don't have to get IT involved.
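Those same conventions can also be scripted. Here is a minimal sketch, using Python and the openpyxl library, that builds a hypothetical budget input template with shaded, unlocked input cells and a protected sheet. The file name, sheet name, colors, and cell ranges are invented for illustration.

    from openpyxl import Workbook
    from openpyxl.styles import Font, PatternFill, Protection

    wb = Workbook()
    ws = wb.active
    ws.title = "Budget Input"

    ws["A1"] = "Line item"
    ws["B1"] = "FY15 request ($K)"

    # Convention 1: color-code the cells contributors are expected to fill in.
    input_fill = PatternFill(fill_type="solid", start_color="FFF2CC", end_color="FFF2CC")
    for row in range(2, 12):
        cell = ws.cell(row=row, column=2)
        cell.fill = input_fill
        cell.font = Font(color="0000FF")
        # Convention 2: unlock only the input cells...
        cell.protection = Protection(locked=False)

    # ...then protect the sheet so every other cell cannot be overwritten.
    ws.protection.sheet = True

    wb.save("budget_template.xlsx")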

Consolidating Information
Many times a template is disseminated to various groups. The groups fill in the requested information and return the completed template. Then finance consolidates the data into a single spreadsheet. In some of the more sophisticated cases, we have created a rollup sheet or workbook with consolidation formulae already in place.
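A minimal sketch of that consolidation and rollup step, written in Python with pandas, looks something like the following. The business unit names, line items, and figures are invented; in practice each frame would come from reading a returned workbook rather than being typed in.

    import pandas as pd

    # Hypothetical stand-ins for three returned copies of the same template;
    # in practice each frame would come from pd.read_excel() on a returned workbook.
    returned = {
        "BU Alpha": pd.DataFrame({"Line item": ["Headcount", "Program"], "FY15 ($K)": [400, 150]}),
        "BU Beta":  pd.DataFrame({"Line item": ["Headcount", "Program"], "FY15 ($K)": [320, 90]}),
        "BU Gamma": pd.DataFrame({"Line item": ["Headcount", "Program"], "FY15 ($K)": [210, 60]}),
    }

    # Tag each row with its source, then stack everything into one table.
    frames = [df.assign(business_unit=unit) for unit, df in returned.items()]
    consolidated = pd.concat(frames, ignore_index=True)

    # The rollup takes the place of a sheet full of consolidation formulae.
    rollup = consolidated.groupby("Line item", as_index=False)["FY15 ($K)"].sum()

    with pd.ExcelWriter("consolidated_budget.xlsx") as writer:
        consolidated.to_excel(writer, sheet_name="Detail", index=False)
        rollup.to_excel(writer, sheet_name="Rollup", index=False)

Because the rollup is recomputed from the detail every time it runs, there are no consolidation formulae to break when a business unit adds or reorders rows.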

Aligning Groups
Sometimes this is a successor step to Consolidating Information. Sometimes it is a successor step to Pulling Data. Specifically in budgeting or planning cases, these may describe either a "bottoms up" or "top down" process. Because all groups typically cannot see all information from other groups, we usually "parse" the data we have consolidated or pulled into parts which represent the business, then issue that information out to various groups.

Pulling Data
Information we want to get at for analysis purposes often isn't readily available in spreadsheet form. In these cases, spreadsheets are actually used to pull data from various sources. In fact, entire queries are often written in spreadsheets and used to pull data from various systems. In the "easiest" of these cases, an icon exists within the spreadsheet toolbar for updating data directly from a "cube." In the more difficult of these cases, SQL queries have to be executed much more manually, requiring a finance person to have IT-level knowledge.

Building Models
Creating analytical models takes on many forms, but at their essence they have some basic similarities – data needs to be analyzed, and the best way to do that is to explore the relationships and sensitivities within the data. Using spreadsheets is a quick, easy, powerful way to do that, using features like Goal Seek and Scenario Manager. Many of us even create assumptions panels, updateable data tables, and report output.
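For readers who prefer code to cells, here is a minimal sketch of the Goal Seek idea in Python using scipy: solve for the input value that drives a model output to a target. The volumes, costs, and target margin are all made-up assumptions.

    from scipy.optimize import brentq

    # Hypothetical model: operating margin as a function of unit price,
    # with made-up assumptions for volume and costs.
    units = 50_000
    variable_cost = 42.0        # per unit ($)
    fixed_cost = 1_200_000.0    # per year ($)

    def operating_margin(price):
        revenue = price * units
        profit = revenue - variable_cost * units - fixed_cost
        return profit / revenue

    # The "Goal Seek" step: find the price at which margin hits the 20% target.
    target = 0.20
    price_needed = brentq(lambda p: operating_margin(p) - target, 50, 500)
    print(f"Price needed for a {target:.0%} operating margin: ${price_needed:,.2f}")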

Comparing Scenarios
Often, but not always, this naturally flows from building models. Sometimes scenarios are provided to us. Sometimes we even use other products to develop the models and still use spreadsheets to compare them. There is not always a uniform way to do this in a spreadsheet, so we handle this task in different ways. We might set up parallel tabs in a worksheet with the same reporting output so we can toggle back and forth. We might create side-by-side chart sets one at a time and link each chart to the right scenario to develop our comparison. Whatever the method, using the spreadsheet to compare scenarios requires a fair amount of spreadsheet manipulation and a lot of toggling.

Making Charts
Creating a few charts in a worksheet has become second nature to most of us, and many of us have gotten pretty good at it. Even so, choosing the right data elements, the best chart type, and the right configuration of data can still be time consuming. Building a set of these charts one at a time is even more time consuming. Still, many of us prefer this method because the flexibility which makes the process time consuming also gives us the most power and control. Linking these charts to data becomes especially vexing for us if the data is dynamic. These charts usually still need to be pasted into presentation software, which is a formatting exercise of its own.
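As a rough illustration of the pivot-then-chart workflow regenerated in one pass, here is a minimal sketch in Python using pandas and matplotlib. The quarters, regions, and revenue figures are invented; the point is only that the chart is rebuilt from the pivot whenever the underlying data changes, rather than breaking.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical transaction-level data of the kind usually summarized
    # with a pivot table before charting.
    data = pd.DataFrame({
        "quarter": ["Q1", "Q1", "Q2", "Q2", "Q3", "Q3"],
        "region":  ["Americas", "EMEA"] * 3,
        "revenue": [120, 80, 135, 78, 150, 90],
    })

    # The pivot step...
    pivot = data.pivot_table(index="quarter", columns="region",
                             values="revenue", aggfunc="sum")

    # ...and the chart step, regenerated from the pivot in one pass so the
    # chart can simply be rebuilt whenever the underlying data is refreshed.
    ax = pivot.plot(kind="bar", figsize=(6, 3), title="Revenue by region")
    ax.set_ylabel("Revenue ($M)")
    plt.tight_layout()
    plt.savefig("revenue_by_region.png")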

Reporting
Many of us use spreadsheets for official reporting purposes. Often our reports need to take on a certain "look and feel" which can best be designed, controlled and updated using spreadsheets. They can also handle "corner cases" more easily. The need for a quick turnaround coupled with the need for flexibility often leads finance persons to design reports using a spreadsheet.

We have looked at eight common ways spreadsheets are used in finance. Entire papers could be written about each of these, and there are probably more uses not even covered here. Since spreadsheets are used for all these purposes, it is little wonder that they have become the "Swiss Army Knife" of finance – an indispensable tool woven into our DNA. In part three of this series we will not look at how to "Ditch the Spreadsheet" – that is obviously not a feasible or desirable option. Instead, we will start to look at how to make the most out of the role of spreadsheets in your organization.


3/6/14

The Appropriate Role of Spreadsheets in Finance –
What Are They Good For? Part One

It seems as if bashing spreadsheets is in vogue. In the last two years, articles have appeared in major publications with titles like "88% of Spreadsheets have errors" and "Spreadsheet blunders costing billions." The discourse has not been limited to articles. One of the most popular threads in a recent LinkedIn FP&A group discussion was entirely dedicated to the use of spreadsheets in planning – with the authors almost uniformly denigrating their use.

We all know that spreadsheets are a common tool of the trade in finance. We all use them. Many of us use them daily. Some of the most common functions used in finance tend to be macros, pivot tables, and VLOOKUPs. Many of us use some of the more esoteric features, like goal seek and named ranges. A lot of us pull data from various sources and consolidate that data using spreadsheets to analyze it and make charts. Still, probably because of this prevalence, many of us have developed a love-hate relationship with spreadsheets – loving them when they work, hating them when they don't.

Maybe that's why emails with subject lines like "Ditch the Spreadsheet" have become so common. A lot of vendors have developed very strong replacements for the use of a spreadsheet in a given situation, and in doing so they have developed "tunnel vision." These vendors seem to have convinced themselves that, just because they have come up with a better approach than spreadsheets offer in a specific situation, we can all just "throw the baby out with the bathwater" and give up spreadsheets completely.

We all know that this is fiction. Spreadsheets have become second nature to us for a reason – they are infinitely flexible and powerful. Ever since VisiCalc introduced the first spreadsheet program (often cited as the driving force behind early adoption of the PC), these nifty little programs were so useful that they were destined to become part of our existence. Then Microsoft spent untold billions over twenty-plus years developing its flagship Microsoft Excel into a powerful juggernaut, further entrenching the spreadsheet in our existence.

Still, there is some truth to the notion that spreadsheets do have their limitations. They have probably been overextended to the point that they have been used in areas for which they were either never intended, or at least for which they are ill-suited and for which there are now better applications. In this series of posts, we will take a look at what spreadsheets are good for, and where there are probably better choices.

Specifically, this series will take an objective look at using spreadsheets for things like: creating templates, collecting information, aligning groups, pulling data, building models, comparing scenarios, making charts, consolidating, reporting, and planning. As we do so, we will point out the usefulness and shortcomings of spreadsheets in each of these situations – we will also take a look at vendors who might have better solutions in the marketplace for certain of these applications.


2/4/14

Cloud Software for Mission Critical Finance Applications?
Don't Believe the Hype

We have noticed a couple of interesting trends in online discussions about cloud-based finance software. Many enthusiastically claim that finance systems are all moving to the cloud and that on-premise folks will (as one of our Advisory Council members put it) be "dinosaurs" who die out in the finance community. We disagree.

One writer in an online thread said he agreed with an earlier post from another author, and then added: "I feel that in a decade or two it will be like asking questions in the 1950s like 'refrigerator or ice box?' or 'telephone or telegraph?' or 'electric light bulb or candle?'" Interestingly, that author completely ignored the last sentence of the first writer's post, which said, "I think the move to cloud will stop, although there will always be those who prefer on-premise solutions."

While we are generally advocates of the cloud especially for non-mission-critical applications, there are some interesting points to consider in the online discourse:

1. The loudest voices claiming the trend to the cloud usually come from vendors who offer cloud based solutions or members of the finance community at smaller companies. We would be interested to hear from a member of the finance community, especially one at a Fortune 500 company, with their success story about a cloud-based ERP/CPM/Planning solution.

2. According to a benchmarking survey published by Gartner and FERF (the research arm of Financial Executives International), a very important survey which has been running annually for 15 years, in the U.S. last year, across all company sizes, 94% of finance departments said they are using an on-premise solution, while only 6% said they used SaaS. That is up from the year before, but the trend, if you can call it that, has a long way to go. Every reputable survey out there has similar numbers. If you trace the origins of the thread referenced above back to the original article, there is a picture of an ostrich with its head buried in the sand. The implication is that on-premise folks are acting that way. In fact, the opposite is usually the case.

3. Those who argue for a SaaS trend that will one day be ubiquitous ignore the fact that, as one author pointed out, these were offered as time-sharing arrangements back in the 70's and 80's. They were still offered by Application Service Providers (ASPs) in the 90's when NetSuite started (we believe with Oracle/Larry Ellison backing). Why didn't the trend catch on already? Why are we only at 6% market penetration in 2014 if this "trend" is inevitable?

We have worked in many Fortune 100 environments for technology companies that would never, ever allow their sensitive data outside the corporate firewall. In our opinion, saying the cloud option will eventually bury the on-premise option for finance systems is very much like the prediction that the network computer would bury the on-premise device. Just because something makes better financial sense doesn't mean it will be adopted. There are other considerations too.


1/28/14

The politics of getting the right data to analyze

The scenario: the team has a deadline to provide an answer to a strategic question, one that everyone acknowledges will shape the direction of the company, at least until the next important strategic question comes along. Fortunately, the company is composed of rational actors, so the loudest voice in the meeting doesn't necessarily win, but the one with the best answer usually does. The team is responsible for preparing the best answer by 1) acquiring the right data, 2) analyzing it, and 3) formulating a cogent argument based on carefully documented facts.

So how does the team look like a hero in this case? The team has to determine the data which will be required to inform their process. Then, just getting the right data will be a challenge in itself. Once the team acquires the best available data, there is the challenge of interpreting the strategic picture. Then there is the case to be prepared, and questions to anticipate. Adding to the pressure is the timing – the team needs to prepare a response within three weeks.

A real world example will help. Assume a company's leadership asks the question "what is likely to happen to the company if we decide to focus a little more of our affordable spend on emerging markets and a little less of it on developed markets?" There are no forecasts for this kind of scenario lying around, so thought has to be given to the proxies that can be used to answer the question and to the approach: create a storyboard, and from that formulate a data request.

The team can then go to each part of the company, every business unit, every function, and ask for the required data. Along the way, the team gets schooled in data availability. For example, the services function responds "we have what you are asking for by services function by product family but not by product line, or you can have it by theater, but you can't have both." Sales finance says "we actually don't have that data, but I think sales ops does." One of the product BU's has data available, but the query has to go to IT and that has a five-day turnaround. And so choices are made according to the available data, and deals are made to obtain that data from the requisite groups across the company. Worse still, each discussion and data pull leaves less and less time to effectively analyze data and create a strategic response in time.

Obtaining data is largely out of the team's control. Analyzing the data and preparing a response is in their control. Compressing the amount of time spent analyzing data leaves more time to prepare a response. A model which predicts the likely outcome of an Emerging Market emphasis using proxy data can be prepared and refined during the data gathering stage, so that the scenario can be completed shortly after obtaining the data. For optimal analysis, the strategic impact of spending more of the affordable spend on Emerging Markets and less on Developed Markets needs to be compared to the strategic impact of the "Business as Usual" scenario in order for the cogent argument to be developed.

If a team is using a template creation tool like the Agylytyx Generator, templates full of analytics for Emerging Market theaters and Developed Market theaters would have already been created. Within a few minutes of obtaining the final data, the team has produced, side by side, the same group of charts for each scenario. The team can then compare the strategic charts to identify and assess risk, uncertainty, timing, upside, revenue impact, margin impact, market share impact, and more, for the scenarios.

Having all the charts side by side, the team can identify issues and start to prepare a response right away.

This scenario is real. Click here to see for yourself the impact of comparing scenarios side by side.

The power of creating templates without ever having to write a query or build a chart means your team can effectively compress the amount of time it takes to analyze data, maximizing the time spent preparing a response that will win the day and make the team look heroic.


11/5/13

Using the Agylytyx Generator to Answer Strategic Questions

In parts I and II of this blog post series, we queued up a very common business problem, then looked at the fact that traditional applications do not address that situation today. To recap, in part I we examined what turns out to be a common problem: many executives asking a team critical strategic questions, to which they expect timely analytical answers. In part II, we looked at the fact that there are not good alternatives to help teams respond in a timely fashion. In part III, we looked at the ways this situation is traditionally treated today. In this part IV, we will talk about the solution.

The traditional approach
A quick look back at part III will help contrast the solution approach. The traditional approach of all vendors in the marketplace – be they BI suites, Planning applications, ERP or CPM suites, an Enterprise Data Mart, or even a spreadsheet – uses the same paradigm depicted on the right. Even when they come prebundled with "templates" or use the latest data visualization techniques, they all follow the same strategy – drill into data until you find an answer, collect all the data on the subject, put your analytics in a presentation, and move on to the next question. We also explored in detail why this approach is wholly insufficient to really address important strategic questions in a timely and robust way.

A new paradigm for business intelligence
Creating a solution that lets users get to analysis (the "A" in FP&A) quickly and easily requires a completely different approach. In this approach, users do NOT write queries or make charts. In this approach, users can create their own templates on a self-service basis, templates or "lenses" which represent the strategic "care-abouts" of each of the executives pictured above. These templates would contain all the analytics necessary to address these leaders' perspectives, and since they are built by the finance user, they can also be edited at any time. This new solution would allow end users to save these templates for use at any time. This solution would also allow end users to apply any lens to any data set so that different sets of analytics representing those different perspectives on different parts of the company could be built immediately. This solution would mean any plan, forecast, actual, budget, scenario, data slice, etc. could be immediately viewed using any or all of the "lenses."

This solution would effectively create the analytic chart deck for the intersection of the "lens" and the "data set" within a couple of clicks – meaning the chart deck is built FIRST, and all the analyst time can then be spent on the real high-value activity – adding analysis to the slides.

This solution is real. Customers of the Agylytyx Generator do not write queries. They don't build charts. They quickly and easily build Frameworks or "Lenses" by choosing the types of things that each user cares about. They apply these Frameworks to Datasets in order to build chart decks FIRST. That frees up an analyst's time to actually do analysis. The result is better, faster business decisions across the enterprise.


10/29/13

Using Traditional Analytic Approaches to Support Strategic Questions

In parts I and II of this blog post series, we queued up a very common business problem, then looked at the fact that traditional applications do not address that situation today. To recap, in part I we examined what turns out to be a common problem: many executives asking a team critical strategic questions, to which they expect timely analytical answers. In part II, we looked at the fact that there are not good alternatives to help teams respond in a timely fashion. In this part III, we will look at the ways this situation is traditionally treated today.

How do teams handle the process of responding to strategic questions today? The available tools differ from company to company, but the process is pretty much the same: 1) drill into data 2) find the issue 3) manually create a chart deck 4) repeat the process. Advancements in available toolsets are improving the ability to execute these steps. Still, all these tools fit within the traditional approach to answering these questions. In part IV, we will look at a much more appropriate alternative to this traditional approach.

Step One: Drilling Into Data
All BI tools, EPM suites, planning vendors, CPM suites – all vendors in the market – require end users to start somewhere. "Data Discovery" is the latest trend here – building a chart visually and drilling into that chart. The problem with this approach is that it requires knowing where to look in the first place – knowing where to drill down. Consider, for example, the VP of Sales mentioned in part I who wants to know why sales are soft this quarter. In seeking an answer, users might wonder: is the soft sales trend related to a region? A product line? A business unit? A channel of distribution?

Step Two: Find the Issue Through Trial and Error
Without knowing where to point a tool, users essentially look for issues at random until they stumble upon them. Some users may have a gut instinct about where to start looking, so the investigation may not be completely random. Still, understanding that an answer lies in a general area (for example, "emerging markets are probably responsible") may provide a jumping-off point. But the answer is usually much more granular or multivariate (for example, certain product families in certain specific countries through certain channels may be the actual answer). The "hunch" or "jumping off" place may also be completely incorrect. "Data Discovery" only helps an end user to flail around faster as they point and visualize until they find the answer area.
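To see why a single drill dimension only narrows the search, here is a minimal sketch in Python with pandas, using an invented bookings table for the soft-sales example. Grouping by the intersection of product family, country, and channel pinpoints where the miss is concentrated, in a way that any one dimension alone only hints at. All names and numbers are hypothetical.

    import pandas as pd

    # Hypothetical bookings data for the soft-sales example. The miss is
    # concentrated in one product family, in one country, through one channel.
    bookings = pd.DataFrame({
        "product_family": ["Routers", "Routers", "Routers", "Switches", "Switches", "Switches"],
        "country":        ["Brazil",  "Germany", "USA",     "Brazil",   "Germany",  "USA"],
        "channel":        ["Partner", "Direct",  "Direct",  "Direct",   "Partner",  "Partner"],
        "plan":           [100, 80, 90, 70, 60, 85],
        "actual":         [55, 85, 92, 72, 58, 88],
    })
    bookings["variance"] = bookings["actual"] - bookings["plan"]

    # Drilling on any single dimension narrows the search; the intersection of
    # dimensions is where the answer actually lives.
    by_intersection = (bookings
                       .groupby(["product_family", "country", "channel"])["variance"]
                       .sum()
                       .sort_values())
    print(by_intersection.head(3))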

Step Three: Manually Create Chart Decks
Once an issue is uncovered, it usually needs to be examined from different perspectives to get complete context. If a question can be answered with a single chart, it is most often not significant enough to be strategically important. Strategic questions need to be answered with a presentation deck. In the sales example, a complete answer would require an analysis of the impact of the soft sales on the product families in question, the countries in question, the channels in question, and the relevant intersections of those. Furthermore, considerations like the impact on business units, on corporate goals, even margin growth, market trends, competitive trends, the sales pipeline, etc. might be relevant. In short, a good, comprehensive answer to the seemingly simple question asked by the Sales VP will usually require an extensive analytical response.

Faced with this situation, most end users have to pull data from various sources by writing queries. Even when a single source of truth actually is available, there are unlikely to be templates or built-in charts which can even come close to creating the necessary analytics. That is exactly the reason why so many people use a spreadsheet to consolidate data and build charts, despite having several tools in place from well-established or reputable vendors.

Step Four: Repeat Process
Most often, end users barely have enough time to write queries, pull charts, and manually create a chart deck before they have to move on to the next strategic question. Ideally, to be a strong business partner, end users would find the reasons why sales were soft this quarter, explain the root causes, determine whether the issue will be repeated or sustained and for how long, quantify the implications, and help understand possible solutions. After creating this type of deck, end users could move on to the next strategic query.

Unfortunately, given the time pressures that build up from the lack of purpose built applications, end users rarely, if ever, add this kind of value before they have to move on to the next strategic query. To start answering that query, the process starts all over again.

Current tools, no matter how robust, visual, discovery-oriented, template-rich, etc. are NOT designed to support the process of answering strategic questions. In part IV of this blog post, we will examine and contrast what can be accomplished with an application built specifically to support this situation.


10/24/13

Responding to Strategic Questions

In Part One of this series, we looked at how Finance Executives and their teams can easily get overrun by important, high-level, strategic questions. We looked at the importance of the constituents they support, and at the need to turn around quick, insightful answers before moving on to the next important strategic question.

Traditionally, finance teams have tools at their disposal to get at data. Some of the options may include:

Querying Essbase cubes

Building dashboards and reports

Pulling data from Oracle, SAP or TM-1 into spreadsheets

Drilling into data using CPM suites or BI tools

... or often some combination of the above.

All of these solutions have been around for years. The major players in the industry employ fancy language to make their improvements seem like a great leap forward. Terms like "data discovery" and "data visualization" are used in a way to suggest a revolution in their approach, but they are really evolutionary changes along the same path. A lot of planning vendors, CPM Suites, BI tools, practically everyone close to the space, uses the language of improving business strategy and decision making.

Still, despite the presence of all these tools making all these claims, CFO's consistently say that the #1 thing in their organization which needs technology attention is "better analysis and decision making." Because this has been the prevailing attitude of CFO's for several years running, the Gartner analyst who runs this survey explained in his presentation this year that this problem is obviously not being addressed by current technology vendors. In essence, all this improved technology has only increased the amount of financial data analysts use to conjure up relevant analytics – it hasn't improved the ability of the finance executive to facilitate better analysis and decision making at all.

It is clear that only a contrasting approach can solve this problem. In part III of this series, we will look at the reason traditional approaches to answering key strategic questions are inadequate, and we will look at a completely different approach for a software application that really helps a finance team quickly and meaningfully respond to strategic questions and add value across the enterprise.


10/22/13

Supporting Strategic Questions

In large enterprises Corporate Finance executives frequently support multiple, high-level business constituents. Because finance is the group with "the data," everyone from the CEO, CFO, Senior Vice Presidents, General Managers, Sales Managers, Supply Chain Leaders, Channel Executives, Theater Leads, Corporate Strategy Heads, and more, all seem to want support for key decisions.

Typically, when one of these leaders asks a key strategic question relating to the business, that question matters… so they really want the answer quickly. Often, they ask four people at once. If sufficient evidence is not offered to answer a question, a leader often makes a gut call and moves on to another question, and the opportunity will have been missed.

It is little wonder that this happens so frequently in large enterprises. Often, the strategic questions which are asked are not ones which lend themselves to traditional data exploration. For example, for a business leader to ask the question "should we focus more on emerging markets" implies a strategic answer as much as a data-driven response. Even when strategic questions are more basic such as "why are my sales soft this quarter?" – the path of discovery frequently points to a complex set of interdependent variables that help to define the business challenge and point to strategic insights.

Unfortunately, strategic questions from business constituents rarely come one at a time – they usually come in clusters. It is usually at this point that Finance Executives discover that they are religious people – praying that their teams will develop the right analytical conclusions and prepare something more than a set of charts to answer the important questions of strategy.

The fact that so many important strategic questions may be asked by so many different people may be unfortunate. It is also unfortunate that a team will quickly establish a reputation for answering the questions either well or poorly.

However, supporting strategic questions means more than providing a "fact pack" back to the person who asked it. It means being a trusted business partner who responds quickly and with analysis to all the important strategic questions. It means making a difference in better decision-making, across the enterprise.

In the second of this three part blog series, we will explore why traditional approaches fall short at responding to these critical strategic questions.


9/25/13

Sensitive Financial Data In The Cloud

When did "cloud" become a verb? The Wall Street Journal, Wired, even a blogger for SAP have all used the term that way in the last year – in fact all have specifically asked the rhetorical question "to cloud or not to cloud" (basically bastardizing Shakespeare's Hamlet who asked the question of existence "to be or not to be?").

These and many other articles tend to focus on the larger question of whether or not the use of the cloud makes economic sense. For most applications the answer is yes – this is why most companies "outsourced" their payroll to providers like ADP years ago. This is also why most companies have adopted cloud-based voicemail and voice over IP (applications like RingCentral), and a lot of their sales/CRM capabilities (in favor of applications like Salesforce.com).

Sure, there are lots of hoops through which many public companies have to jump. All companies have some type of performance requirements they place on cloud-based vendors. Many companies require SAS 70 certifications. Some have industry-specific requirements, like health care companies and HIPAA.

However when it comes to financial forecast data, enterprise companies have a whole different set of requirements. Payroll data and voice messages may be sensitive, but they are not subject to the same level of SOX, GAAP and FASB regulations.

Our customers and advisors tell us that the world is moving financial data to the cloud. They have directly stated that those executives who resist this trend, choosing instead to keep data inside their firewalls, are a dying breed. But corporate hands may be bound by forces beyond the control of any individual, despite their predilection for taking advantage of the cloud.

When it comes to financial forecast information, it may take significant legal changes to allow large enterprises to use cloud as a verb.


9/19/13

Assessing the ROI of Solutions that Address the Strategy–Execution Gap

Sometimes it can be difficult to figure out the return on investment for an enterprise software application. Calculating efficiency gains is often much easier than trying to calculate the potential benefit from improved decision making. For this reason we tend to focus a lot of our efforts and energies on calculating productivity improvements, when a much greater value usually results from higher quality decision making.

The value from better decisions has been quantified by the same McKinsey research study we looked at earlier. This research piece found that those few companies who actually "put their money where their strategy is" become 40% more valuable, as measured by return to shareholders, than those who do not reallocate budgets to meet their strategy.

The Agylytyx approach helps link strategy and execution resulting in better decision making. A VP of FP&A at a NASDAQ-listed company said recently that the Agylytyx approach “helped us gain actionable insight into our strategic plan and helped us bridge our long-range plan with our operating budget.”

An interactive tool is available on our website which will help illustrate how a specific ROI can be calculated for the adoption of the Agylytyx methodology in any organization. Let's look briefly at an example ROI that has been built using that tool.

This is a typical Return on Investment case generated by the ROI tool available online. In this case, the user has populated the assumptions panel with their estimated productivity increases, annual operating margin, and estimated solution costs. Based on their responses, this user can conservatively expect an immediate payback from the project, with a mounting investment return of $13-$18 per dollar invested thereafter. That amounts to several million dollars in total return over the course of the investment.
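For readers who want to see the general shape of the arithmetic behind a payback case like this, here is a minimal sketch in Python. Every figure in it is an invented assumption, not output from our online tool or from the McKinsey research; the point is only how productivity gains, decision value, and solution costs combine into a payback period and a return per dollar invested.

    # Hypothetical assumptions, standing in for an assumptions panel.
    analysts = 6                    # FP&A headcount affected
    fully_loaded_cost = 150_000     # per analyst, per year ($)
    productivity_gain = 0.20        # share of analyst time freed up
    decision_value = 900_000        # estimated annual value of better decisions ($)
    annual_solution_cost = 60_000   # subscription cost per year ($)
    implementation_cost = 25_000    # one-time cost ($)

    annual_benefit = analysts * fully_loaded_cost * productivity_gain + decision_value
    annual_net = annual_benefit - annual_solution_cost

    # Simple payback and return per dollar over a three-year horizon.
    payback_months = 12 * implementation_cost / annual_net
    three_year_return = 3 * annual_net - implementation_cost
    three_year_cost = 3 * annual_solution_cost + implementation_cost
    return_per_dollar = three_year_return / three_year_cost

    print(f"Payback: {payback_months:.1f} months")
    print(f"Return per dollar invested over three years: ${return_per_dollar:.2f}")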

Of course, every situation is different and we understand that. The ROI calculations use the McKinsey findings for some of their assumptions of value, but do not presume that the Agylytyx approach actually closes the strategy-execution gap. They only assume that a bidirectional translation engine can form a bridge between strategy and execution and allow companies to start addressing the gap.

Our clients are finding that using Agylytyx to form that bridge can make a big difference.


9/10/13

Solving the Strategy–Execution Gap Requires Applications Built for that Purpose

"Putting a square peg in a round hole" is an expression with which most of us are familiar. We all hate spending money on adopting new applications - especially when our existing applications promise us greater insight. This confusion is understandable - lots of vendors are making noise about helping make better strategic decisions. It is a natural human tendency to believe these directions – to rely on our ERP systems like SAP, Oracle Financials, and TM-1. There is a reason these vendors have made strategic investments or acquisitions like Jaspersoft, Hyperion, or Cognos. Even BI focused vendors like Qlik, Tableau, and DOMO claim to assist in strategic decision making at the highest levels of a company.

We have already written extensively about the limitations of the current BI approaches in the marketplace. We have also written about the limitations of the dashboard and scorecards they produce. We haven't talked extensively yet about a better approach, and the benefits to using a solution specifically designed to address the strategy-execution gap. In this post, we will go long and deep on what that solution looks like and why it's different. In next week's post, we will look at how one goes about calculating an ROI on the use of an approach like that.

The Agylytyx Generator has been built entirely to accomplish this bi-directional translation purpose. It is the only product we know of in the marketplace which is specifically designed to address the strategy-execution gap. This unique purpose is manifested in a unique design.

The Agylytyx Generator is not a dashboarding or scorecarding product. It is not a modeling tool. Those are hallmarks of business intelligence products, planning applications, corporate performance management suites, ERP applications, and the like. While these products have their purposes, we have looked in depth at how they may improve analytics but do nothing to address the strategy-execution gap.

In formulating a product designed specifically to address this gap, it occurred to us that the product would have to have unique capabilities. In order for our customers to save lots of different chart sets which would represent a "lens" or perspective for each business leader, asking our customers to build all those charts, no matter how simple we made it to build them, was out of the question. We wanted our customers to be able to choose the chart sets from thousands of pre-constructed charts so that they could apply that set of charts to any data without any additional querying or chart manipulation.

The result is two major differences from any other product or approach out there. One major difference is the content, and the other is the technology.

First let's consider the content that is included. No other product includes a library of charts with such extensive guidance. Our customers don't write queries, they choose charts, and they do so using a rich library that leverages our expertise and the specific needs of the customer. Each chart represents a financial construct or a portfolio management visual. For each chart we describe it, talk about how it's calculated, and help the user evaluate when to apply or not apply that chart within a "lens."

Second, consider the technology. Because our customers don't write queries, we specialize in knowing what data is needed to populate the chart sets chosen, and in drawing those visuals. Our formula expression parser knows what data is required, in what order, and in what position, for each chart in our library. This means that, when a chart set or lens is applied to a data set, the formula expression parser knows where the data is located in the data model, and goes to retrieve it and put it in the right place.

If you are familiar with data cubes, an analogy that we often use is a data cube of charts. But imagine that our users can choose and store infinite combinations of sets of charts, and that our users also benefit from our guidance and search filters to create those chart sets.
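For readers who think in code, here is a deliberately simplified sketch, in Python with pandas, of the general idea: each chart definition declares the fields it needs, so applying a "lens" to a dataset can resolve and arrange the required data automatically. The construct names, fields, and figures are invented, and this is not the Agylytyx Generator's actual parser or data model – only an illustration of the concept.

    import pandas as pd

    # A toy dataset standing in for a plan or forecast "data set."
    dataset = pd.DataFrame({
        "business_unit": ["BU-A", "BU-B", "BU-C"],
        "revenue":       [120.0, 95.0, 60.0],
        "opex":          [70.0, 55.0, 40.0],
    })

    # Each "construct" declares the fields it needs and how its series is built;
    # a "lens" is simply an ordered collection of constructs.
    constructs = {
        "Revenue by BU": {
            "requires": ["business_unit", "revenue"],
            "series": lambda d: d.set_index("business_unit")["revenue"],
        },
        "OPEX ratio by BU": {
            "requires": ["business_unit", "revenue", "opex"],
            "series": lambda d: d.set_index("business_unit").eval("opex / revenue"),
        },
    }
    cfo_lens = ["Revenue by BU", "OPEX ratio by BU"]

    def apply_lens(lens, data):
        # Resolve each construct's declared data requirements against the dataset,
        # then build its series; a real product would draw a chart from each one.
        results = {}
        for name in lens:
            spec = constructs[name]
            missing = [f for f in spec["requires"] if f not in data.columns]
            if missing:
                raise ValueError(f"{name}: dataset is missing {missing}")
            results[name] = spec["series"](data)
        return results

    for title, series in apply_lens(cfo_lens, dataset).items():
        print(title, series.round(2).to_dict(), sep="\n")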

The result is the only product built from the ground up to bridge the gap between finance and strategy. The only way to avoid trying to force a square peg into a round hole is to use a round one. Next week, we will look at the specific ROI that results from using the right application to address the strategy-execution gap.


9/5/13

Why Existing BI Tools and CPM Suites Widen the Strategy–Execution Gap

Have you ever thought you had addressed a problem with a particular approach or solution, only to find out later that you had not really addressed the problem at all? Usually, when the problem does resurface it emerges as a bigger issue because it went unaddressed for so long. Using existing applications like BI Applications and CPM (corporate performance management) applications to try to address the strategy-execution gap is exactly like that. They give the appearance of addressing the problem, they may even say they can help solve the problem. In reality, they don't really address the strategy-execution gap directly, so that problem will continue to fester in a corporate environment.

There is no shortage of BI and CPM tools which talk a lot about strategic analytics. Understanding what these applications really do will also help us understand why they don't address the strategy-execution gap. Referring back to the diagram we have been using to illustrate the strategy-execution gap makes this point clear.

Business Intelligence tools, Corporate Performance Management suites, and Planning tools have been doing a better and better job of providing financial context for corporate strategists. Proper use of these analytic tools will help finance build models and build charts. As these applications improve, they provide better and better financial context (illustrated by the top arrow in the diagram). These vendors all have another thing in common – they do not help to connect the dots between the corporate long range strategy and the corporate targets and budget, so they leave the strategy-execution gap untouched. Ultimately, a new and improved set of analytics still leaves the corporate FP&A leader wondering how to bridge the budget with the strategy.

Bridging this gap requires more than just better analytics. It requires a systemic approach to bridge the gap between longer term strategic planning and the need to set budgets for the following year. If this type of system were in place, life would be a lot easier, and the strategy-execution gap would be reduced.

The only real solution is an organic, systemic approach that is purpose-built to bridge financial targets with long range strategy. What we see here is a physical manifestation of an organic approach to bridging the strategy-execution gap. In this case, a lens (called a Framework in this application) named Corporate Goal is being applied to two data sets. In this way, it can be determined what the effects on attaining corporate goals will be under the two versions of the budget being considered.

An example from the Agylytyx Generator is instructive. In this case, the link between the budget and corporate goals is being actively evaluated under two different budgets. While the first two charts being displayed speak to the OPEX link to corporate goals, the scroll bar on the right accesses additional Constructs, which may include things like a contribution margin comparison by corporate goal or the timing of revenues by corporate goal, for example. Since these "lenses" are all user-defined, the number of Constructs that can be used to compare the two LRP scenarios' effect on corporate goals is immense.

Of course, there are a lot of other considerations besides corporate goals which should be considered when comparing the effects of these two scenarios. In that case, the user would simply use another lens, like the CFO lens or the VP of Sales lens. Even these lenses are ways to bridge the strategy-execution gap because they represent visualizations of important strategy considerations like risk and timing using financial data.

Business Intelligence (BI) tools, CPM suites, ERP applications, Planning tools, etc. are all ways to create better financial context in the form of charts, dashboards, reports, KPIs, etc. Reliance on these tools to solve the strategy-execution gap will ultimately result in persistence and growth of the problem. A real solution to the strategy-execution gap is one built specifically for the purpose – an application like the Agylytyx Generator.


8/26/13

How Budget End-Runs Can Widen the Strategy–Execution Gap

We have all seen them. Just when a budget is set, a powerful executive constituent who doesn't get as much as he or she wants goes straight to the CEO and ultimately secures all or part of what they asked for anyway. For a finance executive, this can feel frustrating since it seems to make the entire budget process irrelevant, and risks undermining the credibility of future budgeting exercises anyway.

In and of itself, this kind of end-run may or may not widen the strategy-execution gap. It all depends on what the executive asks for, and how granting the request impacts the rest of the organization. An outcome that potentially links strategy a little closer to execution is the case where an executive secures additional spend from the CEO's contingency fund in order to invest more fully in strategic alignment. Unfortunately, that is rarely the case. Usually, additional investments made in this ad-hoc manner will be used to preserve the continuity of the existing resources within that executive's group and will require resource trade-offs from other parts of the organization, effectively widening the strategy-execution gap.

But consider the case when strategies are linked to budgets – something different happens. Corporate leadership already knows the strategic impact of the decisions that are made. These kinds of end-runs become pretty meaningless, because it's obvious that the strategy which was already agreed to is being undercut.

In the case illustrated here, if a business leader wants to make a case for circumvention, they are in fact asking for a change in strategy. They may have a valid point, but if they do they should be able to render that point visually using the same data.

Here, for example, the VP of Sales is quickly and easily illustrating the impact of their projection on an important initiative within a specific business unit – the initiative that represents the organization's concern with their margin relief request. In this case, the VP of Sales is showing, with visual data, that the forecast will not affect the ability of the business unit called Smith to execute the growth initiative.

Unfortunately, there are a lot of tools out there that make it seem like they are addressing problems like this, when in reality, they are only incremental improvements along the same vector.


8/15/13

How Ambiguous Data Can Widen the Strategy–Execution Gap

In many companies, the strategy-execution gap is exacerbated by an extensive reliance on uncertain data. This is a problem that scorecards or dashboards frequently fall into. Trying to put a budget together based on input other than actual data and forecasts often results in misalignment with corporate goals, since execution itself becomes an uncertain moving target.

For example, in some companies a scoring process is being used to assess the budget allocation vis-à-vis corporate goals. In these situations, some attempt is being made to correlate the two. Unfortunately, we have a tendency to confuse quantitative approaches with objectivity.

Finance may ask business leaders to score the link between their initiatives and corporate goals. The various business leaders have very different perceptions of what the scoring actually means. So when a finance leader creates a consolidated picture, the executive leadership team already knows the relative link between various initiatives and corporate goals is a dubious one.

Many times we think we can address issues with guidance or ambiguity by layering in another tool or system. However, if the tool is only capturing the quantitative information provided by humans in the scoring process, the tool may be lulling us into a false sense of security. We may think that our scorecarding is working because the dashboard comes from a tool, but actually the dashboard is masking the uncertainty in the forecast even though the underlying source of the uncertainty still exists.

In this case the uncertainty of information being abstracted upward may result in strategic decisions which have little bearing on reality. Since the strategy is based on uncertain information, the execution of those strategic directions is, by definition, uncertain as well.

In the next blog post on the strategy-execution gap, we will look at how budget end-runs can widen the gap and how to guard against them.


8/13/13

How Guidance Can Widen the Strategy–Execution Gap

Any solution to strategic execution will need to overcome some of the common issues which contribute to the strategy-execution gap.

In this next series of posts, we will talk about four of the common issues facing finance which contribute to the gap between strategy and execution. As we look at each issue, we'll look at an approach which might help address it. In the section after this one, we will talk about a systemic solution to the gap itself.

First, let's look at some of the common obstacles facing finance organizations in our efforts to execute strategy.

Unfortunately, narrow guidance is often a problem in large corporations. Here we are not talking about external guidance, but the kind of guidance that companies issue internally when setting budget parameters. In most companies, a built-in momentum is created by existing business units and/or functions. That momentum is dictated by the unit's existing profile – such as its headcount, its program dollars, its revenue, etc.

Most large corporations follow a pretty common planning process. Usually the constituents include a strategy leadership team, a finance team led by an executive who has analysts or teams who interface with business unit leaders, and business unit leaders who in turn receive input from their teams in the field.

Usually, finance asks all business unit leaders to adhere to the same type of small forecast range. In this case, business teams will often feel that they are essentially engaging in an exercise in futility, since the guidance is so narrow as to be self-fulfilling. Ultimately business strategy leadership wants to know how a business forecast and budget will support corporate goals. Of course, it by definition cannot, because the guidance was not formulated with the corporate strategy in mind.

The Agylytyx Approach represents a way to easily evaluate budget scenarios, so guidance doesn't have to be so limiting. If we want to find out what is strategically possible to achieve, we should encourage our business leaders to envision strategic alternatives in their resource requests. Portrayed here is a comparison function within the Agylytyx Generator application. In this case, the OPEX distributions under two different possible Long Range Planning scenarios are being compared.

When a system is in place which allows extensive resource allocation comparisons instantaneously, it is much easier to envision the effects of resource shifts. In this environment, limited guidance can naturally give way to bounded ranges, since there is no need to issue narrow guidance anymore.
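As an illustration only (the figures and scenario names below are invented, and this is not the Agylytyx Generator's actual interface or code), a simple side-by-side comparison of OPEX allocations under two Long Range Planning scenarios might look like this:

# Hypothetical OPEX allocations (in $M) under two long range planning scenarios.
scenario_a = {"R&D": 120, "Sales": 90, "Marketing": 60, "G&A": 30}
scenario_b = {"R&D": 150, "Sales": 80, "Marketing": 45, "G&A": 25}

# Show the two distributions side by side, along with the implied resource shift.
print(f"{'Bucket':<12}{'Scenario A':>12}{'Scenario B':>12}{'Shift':>8}")
for bucket in scenario_a:
    shift = scenario_b[bucket] - scenario_a[bucket]
    print(f"{bucket:<12}{scenario_a[bucket]:>12}{scenario_b[bucket]:>12}{shift:>+8}")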

What a welcome change for business leaders! Instead of the narrow guidance they are used to receiving, they will actually be encouraged to be more strategic in their resource requests. Instead of wondering what the point of the budget exercise is, business leaders actually feel empowered by finance, and feel that their input on the business is greater.

In the next blog post on the strategy-execution gap, we will look at the consequences of excessive ambiguity and how to avoid them.

BACK TO TOP

8/5/13

How the Strategy–Execution Gap Affects Corporate Finance Departments

Let's start out with a basic diagram that seeks to represent a high level view of how corporate strategy relates to corporate finance in a typical enterprise. Most large companies have a dedicated corporate strategy group, and many business leaders such as CEO's, CFO's, and business unit leaders are involved as well. The "output" of corporate strategy (represented here by the blue arrow labelled "Strategic Context") may be available in different forms, including a presentation deck, but it is typically a 3-5 year strategic plan for the company. While this strategic context is delivered to Finance (in this case the green box labelled "FP&A"), there is also a financial imperative which takes precedence over the long range strategic plan: the financial pressures, usually generated by the CFO, to meet corporate targets (represented in this case by the blue arrow labelled "Corporate Targets"). The yellow portion of the diagram represents the "gap" between these two crucial datasets.

Consider a slightly more detailed version of this diagram, which shows the existing inputs to Corporate Strategy, the outputs of FP&A, and the way companies attempt to tie them together today.

Most business strategy decisions are made with inputs such as those represented in the diagram by the blue boxes on the left hand side. These high level strategic inputs usually include factors like overall market demand, competition, and perceived risk factors. Of course, one of the key inputs for strategists is financial context derived from actual numbers and forecasts, often provided by finance. This key link is represented by the arrow from finance to the strategic input box (the blue box labelled "Financial Context"). Usually, finance outputs like tracking, reporting, guidance, variances, and scenarios all serve as important financial context for strategists to consider.

When a corporate strategy is completed, the output is usually a long range set of objectives to which a company aspires. The strategy is typically expressed in qualitative terms, and may also include some quantitative factors such as a high level business case, total addressable market (TAM) information, and shared goals. What the strategy almost never includes is an actual budget for the various departments within a business – that is a job for finance.

Typically, finance is the recipient of the strategy along with some overall corporate targets for the next year. Usually these two (strategy and corporate targets) are not connected. Finance is then tasked with taking the corporate targets and translating them into a department by department budget using the corporate strategy only as an aspirational context or backdrop for the actual budget formulation. Using these corporate targets, finance will typically create business unit targets and budgets. Sometimes these business unit budgets are broken down further into buckets such as headcount, program dollars, CAPEX, etc.
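As a purely illustrative sketch (the target, units, and bucket mix below are invented), that translation often amounts to little more than pro-rating the corporate target across business units in proportion to what they spent last year:

# Hypothetical: translate a corporate expense target into business unit budgets
# in proportion to last year's spend, then split each budget into standard buckets.
corporate_target = 500.0  # $M expense target handed to finance

last_year_spend = {"BU-1": 240.0, "BU-2": 180.0, "BU-3": 120.0}   # $M, prior-year actuals
bucket_mix = {"headcount": 0.6, "program": 0.3, "capex": 0.1}     # assumed split per unit

total_prior = sum(last_year_spend.values())
for bu, prior in last_year_spend.items():
    bu_budget = corporate_target * prior / total_prior
    buckets = {name: round(bu_budget * share, 1) for name, share in bucket_mix.items()}
    print(bu, round(bu_budget, 1), buckets)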

The problem then arises because corporate targets are typically decided more with continuity and inertia in mind than with a corporation's strategic objectives. The targets are derived from a process that rarely changes within large enterprises: the collection of internal forecasts (bubbled up from senior business unit leaders and others), which are then used to develop forward guidance. These forecasts, usually produced by finance on behalf of business units, sales, and other functions, reflect what is already happening in the corporate environment. Because finance is typically using corporate targets to formulate budgets while strategy serves only as an aspirational backdrop, there is a major disconnect between what is actually achieved (as defined by corporate budgets) and what a company wishes to achieve (as defined by corporate strategic objectives).

In the next few parts of this series, we will look at a few specific ways finance is often caught in the middle of a planning process that fails to marry corporate aspirations with corporate targets. After that, we will turn our attention to an organic solution to the strategy-execution gap which can reconcile strategic aspirations with corporate targets.

BACK TO TOP

7/30/13

An Introduction to the Strategy–Execution Gap

We decided to write a series of blog posts on what we see as probably the greatest problem facing large enterprises today – it is most commonly referred to as the "strategy-execution" gap. This series will probably span several parts.

In this series we will:

Define what is meant by the strategy-execution gap;

Look at how this problem affects finance and strategy departments;

Examine four key contributors to the strategy-execution gap;

Examine how finance can address these contributing factors by adopting an improved approach;

Discuss the impact on productivity and ROI of adopting this approach.

The existence of this strategy-execution gap has been well documented; for years, the literature has recorded that large companies often have a hard time executing their strategies. In June 2008, a Harvard Business Review article titled "The Secrets to Successful Strategy Execution" focused on organizational factors. More recent literature has focused on financial factors as the keys to the strategy-execution gap.

It is quite common for large enterprises to spend considerable sums on outside consultants for strategic advice, supplementing the significant amount of executive time and energy devoted to formulating strategy. Out of this commitment come some very well thought out and designed corporate strategies. Unfortunately, the capability to actually implement the processes, methodologies, and culture changes required to execute the strategy successfully is often lacking.

McKinsey Research published a study last year called "How to Put Your Money Where Your Strategy Is." In the abstract to the article, the authors noted that most of the companies they looked at allocate their budgets roughly the same way each year. They noted in the study that the longer this goes on, the greater the gap becomes between what companies want to achieve strategically, and what they actually can and do achieve.

As an example, a company may define as a strategic goal a desire to achieve a 10% compound annual growth rate in revenues over the long term. Strategically, on paper, this may be a sound goal given the market's evolution and the company's competitive positioning. However, this goal may be impossible to achieve, since taking advantage of the evolving market opportunity will often require the company to re-allocate resources in its budget beyond what it is actually able to do, given the built-in momentum and overall inertia of existing budgets.
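The arithmetic behind such a goal is simple, which is part of why it looks sound on paper. A quick sketch with purely illustrative figures:

# A 10% compound annual growth rate sounds modest, but compounds quickly.
base_revenue = 1_000.0   # $M, illustrative starting revenue
cagr = 0.10
years = 5

target_revenue = base_revenue * (1 + cagr) ** years
print(round(target_revenue, 1))  # ~1610.5, i.e. roughly 61% above the starting base

# The math is the easy part; capturing that growth usually requires shifting budget
# toward the opportunity, and built-in budget momentum limits how much can actually move.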

The same McKinsey Research study alluded to earlier actually quantifies the potential impact on returns to shareholders, and details how common the strategy-execution gap really is. In part two of this series we will start to look at the gap in more detail, and understand exactly why and how it occurs.

BACK TO TOP

7/25/13

Focus on Pharmaceuticals: Turning Market Disruptions into Market Leadership

With traditional business models in the Pharmaceutical business undergoing serious disruption from a variety of sources, this industry faces unique pressures. Pharmaceutical CEO's and senior Strategy, Planning, and Finance executives are being forced to confront important big picture decisions about R&D spending, selling models, distribution channels, reimbursement decisions, pricing, advertising, therapeutic specialties, generics, patient models, cost structures, regulatory pressures, market fragmentation, etc.

Complicating these decisions is the fact that the choices have both short and long term implications. In the short term, some decisions are likely to result in budget and organizational turbulence, and may even put short term earnings at risk. In the long term, some of the same decisions have important implications for the company's pipeline and its ability to compete. Faced with these decisions, negotiating a financially and organizationally viable transition is difficult. In a very real sense, making good decisions will have a large impact on the success, profitability, size, and ultimately, even the survival of their businesses.

This current industry environment is unique in the sheer magnitude of the potential disruptive and competitive pressures facing the industry, and represents both an opportunity and a threat. The short and long term perspectives are nothing new. There have always been "thread the needle" timing issues in this industry. What is unique is the convergence of the issues themselves with the need to balance short and long term interests. For decades, large Pharmaceutical companies enjoyed stable protected revenue streams and strong control over pricing power. That has changed, and the considerations outlined above are, quite literally, game changer types of issues. The choices made now will determine the successful participants in the industry in the decades to come.

Faced with this environment, visibility into the effects and implications of decisions will be vital. Effective communication between strategy and finance organizations is the best way to create this visibility. Strategic choices need to be made with an eye toward what is both financially and organizationally viable. Budget and organizational decisions have to be made with an eye toward implementing corporate strategy. Companies that lead the industry in the decades to come will make sound long term strategic decisions and will be able to operationalize those strategies in their budgets and organizations. And over time the marketplace will reward those who are better at making good decisions with higher share multiples.

For the Pharmaceutical Industry, risks are rising, and creating a strong link between corporate strategy decisions and finance teams (through budgeting, forecasting, scenario planning, modeling, targeting, etc.) is more important now than ever before. Firmly establishing this link will help these companies manage and mitigate these risks while obtaining deeper analytical insight into how best to take advantage of market disruptions. Creating a way to map strategic plans to actual and planned scenarios through the visualization of metrics and analysis into revenue, costs, risk assessment, market sectors, distribution channels, therapeutic areas, etc. will be essential for making accurate, well informed, and successful data driven decisions.

BACK TO TOP

6/28/13

The Question of Portfolio Complexity

When it comes to tough strategic decisions, there are probably as many questions out there as there are companies – questions facing companies of different sizes, different levels of maturity, different financial trends, different employee counts, different industries, different corporate cultures, etc.

But when it comes to strategic questions which can be largely addressed by unlocking corporate data, the range of questions starts to narrow a bit. When corporate data can help strategic decision makers understand how a company might perform under various circumstances, unlocking the secrets of that data becomes a very important job. Unfortunately, our experience has been that the more this data matters to strategic direction, the harder it is to decipher.

This is why most companies make their strategies in a relative vacuum, deciding to focus on understandable business issues like total addressable market (TAM), competition and share trends, and some big picture financial trends.

It is little wonder that some companies often run into trouble when they try to execute the strategy that they have built in this vacuum. These companies usually find that, come budget time, they have engineered sufficient complexity in their strategy that they cannot translate their strategy into a corporate budget – at least not an affordable one that meets corporate targets.

The knee-jerk conclusion is this: the portfolio complexity problem, and the tough strategic questions that go with it, apply only to large corporations. While that view is generally true, we have found that it is not necessarily true for the reasons one would expect.

Some smaller companies have a big problem matching their strategy with their execution because their data does not support this translation. A few big companies do not have this problem – their data structures are simple enough to support execution of their business strategy.

So what factors contribute to an environment in which the strategy-execution gap becomes prominent? Among our clients, we have noticed this phenomenon in an environment we have come to call portfolio complexity. Portfolio complexity refers to what we call "multiples" in attribute consideration. "Multiples" refers not to financial multiples, but to the sheer numbers involved in various parts of a company's portfolio – by definition the things it has to consider.

For example, if a company has multiple business units, multiple products, multiple channels of distribution, multiple offerings, and multiple regions of operation, it has portfolio complexity. As such, there is a good chance that its corporate strategy is being made independent of its data trends and financial considerations. As a consequence, when budgets are created, a company with sufficient portfolio complexity is unlikely to be able to execute according to its strategy.
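To get a rough feel for how quickly these "multiples" compound, consider a sketch with invented counts for each dimension:

from math import prod

# Hypothetical counts for each portfolio dimension ("multiples").
dimensions = {
    "business units": 4,
    "products": 25,
    "channels": 3,
    "regions": 5,
}

# Every combination is a distinct slice someone may need to plan, budget, or report on.
print(prod(dimensions.values()))  # 4 * 25 * 3 * 5 = 1,500 portfolio slices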

A company does not have to be big to have portfolio complexity. We have seen companies with $300 million in revenue which battle with complexity. A big company may not have portfolio complexity. We know of companies that have had the same set of product offerings for 100 years, and their decisions are almost all operational.

Of course, generally speaking, the bigger a company is, the more complex its portfolio usually is. But when it comes to thorny business decisions, portfolio complexity increases the urgency to ensure that strategy can actually translate into executable results.

BACK TO TOP

5/30/13

CFO Technology Trends – What Supporting Better Decision Making Means

We recently attended a read out of the Gartner/Financial Executives Research Foundation (FERF) annual presentation on CFO technology trends. For many years running, the number one issue CFO's say needs technology improvement (out of 11 choices) has been "facilitating analysis and decision making." John Van Decker, the Gartner analyst, noted that since the issue continues to surface, most people feel they haven't solved this one yet. In big letters the footnote on the slide stated "Current needs can mostly be addressed through BI, Analytics."

Taken together, the comment and the slide bullet raised a critical question about whether current approaches were adequate to solve the problem at all. It stands to reason that, if a satisfactory approach were available, it would have seen widespread adoption – certainly sufficient to bump “facilitating analysis and decision making” out of the number one slot that CFO’s say needs technology improvement.

There is another possibility. It may not be technologically feasible to address this problem. It may be that no technology can take the place of human business insight. There is a strong argument to be made that most business decisions come down to gut instinct anyway. Still, CFO's seem to be left with a nagging feeling that they could have lent more objectivity to the process by better informing decision makers, so they could make more educated "gut instinct" calls.

At the same conference, an analyst from Ventana Research urged that we constantly be thinking about what the next decision-making constructs would be. As we talked about the idea of facilitating analysis and decision making using “strategy visualized,” it became obvious why the business intelligence approach was not getting traction.

For use in a finance environment, there are two basic limiting factors with traditional BI approaches: 1) they all assume one knows what one is looking for, and 2) they don't support mass creation of charts, because they require each chart to be built one at a time.

Supporting better decision-making through better analysis requires analysts to quickly and easily build and assemble multiple charts which represent multiple perspectives on the fly. There is no time to click and drag objects to create one chart at a time.

It is little wonder the issue of facilitating analysis and decision-making keeps topping the charts for CFO technology needs. In order to finally bump this issue off the charts, a new paradigm for visualizing strategy is required.
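To make the contrast concrete, here is a minimal sketch (not any particular vendor's API; the data and chart definitions are invented) of what mass chart creation looks like: chart definitions are written once as a library, and an entire set of views is generated from a dataset in one pass rather than one chart at a time.

import matplotlib.pyplot as plt

# Hypothetical dataset: operating spend by business unit ($M).
data = {"BU-1": 240, "BU-2": 180, "BU-3": 120}

# A "library" of chart definitions, written once and reusable against any dataset.
chart_library = {
    "Spend by unit (bar)": lambda ax, d: ax.bar(list(d.keys()), list(d.values())),
    "Spend mix (pie)": lambda ax, d: ax.pie(list(d.values()), labels=list(d.keys())),
}

# Generate every view in the library in one pass instead of building charts one at a time.
fig, axes = plt.subplots(1, len(chart_library), figsize=(10, 4))
for ax, (title, draw) in zip(axes, chart_library.items()):
    draw(ax, data)
    ax.set_title(title)
fig.savefig("portfolio_views.png")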

BACK TO TOP

4/29/13

Welcome to the FP&A World: Writing Queries, Making Charts, and Doing Analysis

The Scenario

Let's say a business partner within the organization asks for FP&A support in understanding their portfolio of products and associated investments better. In order to support that business partner, here are the typical steps FP&A goes through. These steps may actually be oversimplifications, but they roughly describe a key challenge facing typical FP&A managers in an enterprise. To that extent these steps may sound familiar, if not painful.

Step One: Pull Data

All over the world, financial analysts log into systems and execute queries they have written which consolidate data and build reports. Then, almost invariably, the reports need some massaging. Some additional fields need to be added to the query; some fields may need to be deleted. Sometimes the data from one system needs to be manually reformatted or combined with other data in a spreadsheet. In any case, each time a specific request like the one above is received, it results in a different query being created in order to support the different point of view of that constituent.

Because there are so many manual processes involved, the consolidated data needs to be reconciled against original source totals. If the numbers don't tie, it is incumbent on the financial analyst producing the report to either 1) continue to tweak the reports until the numbers do tie or 2) create a clear, concise explanation for the variance.
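A minimal sketch of that tie-out step, using invented totals and a deliberately simple tolerance check:

# Hypothetical tie-out: compare consolidated report totals against source system totals.
source_totals = {"revenue": 1204.6, "opex": 873.2, "capex": 96.4}   # system of record
report_totals = {"revenue": 1204.6, "opex": 871.9, "capex": 96.4}   # after manual massaging

tolerance = 0.1  # $M; anything larger must be reworked or explained as a variance
for line_item, source in source_totals.items():
    diff = report_totals[line_item] - source
    status = "ties" if abs(diff) <= tolerance else f"variance of {diff:+.1f}"
    print(f"{line_item}: {status}")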

Step Two: Make Charts

Accomplished financial analysts know that visual data is easier to interpret than a table full of numbers. Business partners across the company usually want something more than tables – they want pictures (charts and graphs) as well. So the business of FP&A is to turn the data pulls – financial results, plans, forecasts, budgets, variances, etc. – into graphics. This usually involves creating the charts manually from the data which has been prepared in step one above. Whatever tool of choice is used to create charts, the charts are created manually. If a spreadsheet is the tool of choice, charts are created by forming links to the requisite data elements, which are manually selected in that spreadsheet.

If a modeling or business intelligence tool is used, it means that the manually formatted data has successfully been imported. When that occurs, charts are still created through individual queries. Sometimes the user interface is a graphical "business intelligence" tool. Sometimes it is even "drag and drop." In these cases, charts are still created through the manual choosing and placement of fields for each individual chart or dashboard. Choosing an incorrect field, overlapping data by putting fields too close together, or forgetting to include a field altogether are common problems. Compounding the problem is the fact that attempts at automation often fail and don't generate an adequate audit trail to explain what "broke" the chart. Through a trial and error process, the dashboard or chart can be revised until the desired "look and feel" is achieved. Then, because there are manual steps involved, the data has to be vetted again.

Complicating this process is the choice of the best way to render data, and even the right charts to use in order to express business issues. Different chart formats consider the same data differently, and a new format often yields a new business insight or perspective. There is always a nagging feeling that something is missing.

Step Three: Wash, Rinse, Repeat

Rarely is one chart enough to explain a financial picture to a business partner. Partners want to see different combinations of data portrayed in a way that helps them understand their portfolio. To truly make a business partner happy requires actionable analysis: it usually takes a set of charts with specific and meaningful data analysis to help them understand what actions to take and why. That means going through the process described above in Step Two several times.

Here is the crux of the problem: Most global FP&A teams don't have one business partner – they have several. Furthermore, the way that each business partner wants to see data, their point of view, changes from time to time. From a business perspective, it should. (If it does not, that is a reason for real concern). That means there are multiple business partners with evolving perspectives who want to view different data slices different ways.

The FP&A manager ends up chasing their tail, constantly writing queries and building charts, and spending precious little time helping business partners look for key drivers and develop critical business insights. In at least one Fortune 100 company, FP&A analysts with Ivy League degrees were wasting their brainpower consolidating data and building charts.

A better way

This is why dashboards and reports are moving targets when it comes to business strategy. It is also why charts frequently break, why errors commonly show up in them, and why business partners get frustrated at the timing and level of analysis provided by finance.

A solution to this problem requires a radical approach, one that somehow automates Step One and Step Two. FP&A analysts are usually in the best position in a company to analyze data, and the solution needs to free them up to harness their brainpower. A real solution is one that enables FP&A managers to spend their time focused on analyzing charts rather than making them.

The radical approach involves a library of charts that already have the queries written into them. It also involves assistance in creating the sets of charts that represent the different and evolving perspectives of end users. Here's the thing: this radical approach already exists. No exaggeration. It's called the Agylytyx Generator, and we developed it because a bunch of us from different companies got tired of dealing with this problem. You can find out more about this solution on this website.

BACK TO TOP

4/8/13

Bounding Strategy to the Attainable

The Strategy Person

A quick search for business strategy yields several similar definitions. One common component in almost all the definitions is the phrase "long-term." It is little wonder, then, that the type of person typically involved in strategy within a corporate environment is most comfortable thinking about things as they should be rather than how they are.

Because the future is always uncertain (arguably now more than ever before), companies need people who worry about strategy. In thinking about the future, strategy persons concern themselves with critical questions such as "what markets do we want to be in?" and "how do we want to position ourselves?"

To help them answer these questions, strategy personnel look for data points like total addressable market (TAM), and often create scenario models. They base the assumptions for their long term projections about business models on things like merger and acquisition activity and market share.

The Finance Person

The finance person, by contrast, usually thinks about the "long term" once a year, in long range planning. To the finance person, the world as it exists today and will exist next quarter is a lot more relevant than the way things should be.

Because getting a detailed picture of a company's performance is so important, companies need finance, planning, and analysis. In consolidating a company's past performance and administering the budget, finance concerns itself with critical questions like "how did we do?" and "how will we do next quarter?" and "what happens if we tweak the budget this way?"

To help them answer these questions, finance personnel look at data on root causes, variance explanations, and trends. They base their assumptions on the plan data that has been created as well as the actual track record of performance.

A Symbiotic Relationship

As different as these two worlds are, in most companies the strategy and finance teams have developed something best described as a symbiotic relationship. Strategy teams impose on finance teams when they need data points that will inform their outlooks. In rare cases, strategy teams may even work with finance resources to create models that perform long range scenario analysis; this is more common where modeling skills are resident in finance. In equally rare cases, finance teams may consult with strategy teams to develop guidance or variance explanations; this is more common where strategy persons are especially informed about the state of the business. Under these circumstances, finance and strategy teams often develop a healthy, if arm's length, relationship.

Embracing the Unattainable

Every strategist seeks to avoid setting out a set of long term objectives which depend on unattainable assumptions. For example, an excellent long-term business outcome may be based on an assumption of a 20% market share position. However, if a company has never exceeded a 10% market share, this may be an unattainable objective. In an attempt to avoid situations where strategy is built on unrealistic assumptions, many large companies have developed a strategic planning process which runs through the year. This process is most often depicted as a funnel, with the large end of the funnel representing the beginning of the fiscal year, where strategy casts a "wide lens" – envisioning many possible outcomes for the company. Through the year this funnel narrows until a strategy is created (at the narrow end). This narrow end of the funnel ideally corresponds to the annual budget planning process run by finance, so finance's role is to create an annual budget which corresponds to the Strategic Planning strategy.
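In code-sketch form (the share figures and the allowable stretch are purely illustrative), the bounding idea is simply a sanity check of strategic assumptions against what the company has actually demonstrated it can do:

# Hypothetical sanity check: flag strategic assumptions outside historically attained ranges.
historical_max_share = 0.10   # the company has never exceeded 10% market share
assumed_share = 0.20          # market share assumed by the long-term business case
allowed_stretch = 1.25        # how far beyond demonstrated performance a plan may stretch

if assumed_share > historical_max_share * allowed_stretch:
    print("Assumption exceeds attainable bounds; revisit the strategy or the resource plan.")
else:
    print("Assumption is within a defensible range of past performance.")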

"Bounding" Strategic Planning Through Broader Finance Support

Finance has a distinctly helpful, and often overlooked, role in supporting the Strategic Planning process. It is well known that Corporate Strategic Planning often develops plans which are not executed. In fact, McKinsey Research documented and quantified the strategy-execution gap. Their findings were that this problem exists in most companies, and it usually has a dramatic impact on shareholder value. It is commonly assumed that the reason for the problem is that budgets are not reallocated in a way that makes the strategy actionable.

Those of us most familiar with this problem tend to put the emphasis on making budgets reflect strategy more. While this is undoubtedly true, and probably responsible for most of the problem, there is another explanation which deserves consideration and emphasis. The truth is, reallocating budgets according to strategy is hard for a reason – sometimes strategies are so far afield from what is possible to achieve that budgets simply cannot be altered to reflect strategy without a complete restructuring of the company.

In order to make actionable budgets, a company must have an actionable strategy. Enter a strategic role for finance. As strategy departments go through their annual exercises of strategic planning, it is incumbent on Finance, Planning and Analysis departments to create actionable scenarios for the next year which create "boundaries" for strategic consideration. Strategy has to be encouraged to think broadly about what is possible for a company to achieve. This type of visionary thinking is what drives successful companies to keep achieving extraordinary results, and it is most effective when bounded by what is actually possible to achieve. To do otherwise risks widening the strategy-execution gap, with negative consequences for shareholders.

BACK TO TOP
