

You are more than a builder of budgets
Step out from behind those spreadsheets and into a more visible role. Earn your Certified Management Accountant certification and you’ll master the 12 most critical practice areas of management accounting. Giving you the skills it takes to be seen. And recognized. Learn more at cmacertification.org.
It makes all the difference.

COVER STORY The increased uncertainty and volatility in business today create greater challenges in forecasting and planning. More effective demand planning can help lead to improved financial performance.

BY BRIGITTE DE GRAAFF, CMA, CSCA, AND PAUL E. JURAS, PH.D., CMA, CSCA, CPA
A DATA ANALYTICS MINDSET WITH CRISP-DM
Applying the six phases of the CRISP-DM methodology can equip accounting and finance professionals to tackle complex data analytics implementations.
BY RICHARD O’HARA, CMA, CFA; LISA S. HAYLON, CPA; AND DOUGLAS M. BOYLE, DBA, CMA, CPA
BY SETH ELLIOTT
IMA CPEdge Express™ qualifies for CPE credit under NASBA QAS certification. For more information, go to bit.ly/3eDEv8G




EDITOR-IN-CHIEF
CHRISTOPHER DOWSETT, CAE cdowsett@imanet.org
SENIOR EDITOR ELIZABETH KENNEDY ekennedy@imanet.org
SENIOR EDITOR NANCY FASS nfass@imanet.org
FINANCE EDITOR DANIEL BUTCHER daniel.butcher@imanet.org
STAFF WRITER/EDITOR LORI PARKS lori.parks@imanet.org
SENIOR DESIGNER JAMIE BARKER jamie.barker@imanet.org
EDITORIAL ADVISORY BOARD
Bruce R. Neumann, Ph.D. Academic Editor
Ann Dzuranin, Ph.D., CPA Associate Academic Editor
William R. Koprowski, Ph.D., CMA, CFM, CFE, CIA Associate Academic Editor
For more information on the role of the Editorial Advisory Board and a complete list of reviewers, visit sfmag.link/reviewers
PUBLISHED SINCE 1919
FOR REPRINT INFORMATION, CONTACT: sfmag@imanet.org
FOR PERMISSION TO MAKE 1-50 COPIES OF ARTICLES, CONTACT: Copyright Clearance Center, www.copyright.com
Strategic Finance® is indexed in the Accounting and Tax Index by ProQuest at www.il.proquest.com
Except as otherwise noted, the copyright has been transferred to IMA® for all items appearing in this magazine. For those items for which the copyright has not been transferred, permission to reproduce must be obtained directly from the author or from the person or organization given at the end of the article.
Views expressed herein are authors’ and do not represent IMA policy unless so stated. Publication of paid advertising and new product and service information does not constitute an endorsement by IMA of the advertiser or the product or service.
Strategic Finance® (ISSN 1524-833X/USPS 327-160) Vol. 104, No. 8, February 2023. Copyright © 2023 by IMA. Published monthly by the Institute of Management Accountants, 10 Paragon Drive, Suite 1, Montvale, NJ 07645. Phone: (201) 573-9000. Email: sfmag@imanet.org
MEMBER SUBSCRIPTION PRICE: $48 (included in dues, nondeductible); student members, $25 (included in dues, nondeductible).





A Historic Understanding
BY GWEN VAN BERNE, CMA
FUNDAMENTAL TO THE CORE VALUES OF IMA® is helping to create, nurture, and advocate for diversity, equity, and inclusion (DE&I) within our membership community, organizational workplaces, and broader accounting and finance profession. The vestiges of inequality, however, are difficult to dismantle. It takes creative, purposeful, and decisive action to combat biases so that there’s a culture of mutual respect and sense of belonging among all individuals.
The challenges of DE&I have been at the forefront of my term as IMA Chair. It’s a topic I care deeply about. Recently, I had the privilege of meeting with Guylaine Saint Juste, CEO and president, and Herschel Frierson, chairman, of NABA Inc. (formerly National Association of Black Accountants), the world’s largest association representing the interests of Black accounting, finance, and related business professionals.
NABA was started in December 1969, when nine African American financial leaders met in New York City to discuss the unique challenges and limited opportunities they faced in the accounting profession. At that time, there were only 136 African American Certified Public Accountants (CPAs) out of a total of 100,000 CPAs in the United States. This group wanted to establish an organization to address the concerns of Black professionals entering the accounting field. Today, according to the National Society of Black CPAs, fewer than 1% of CPAs in the U.S. are Black, which shows how much of a journey we still have ahead of us to achieve greater equality.
This past fall, IMA signed a memorandum of understanding (MOU) with NABA to outline our goals for mutual collaboration between our two organizations. The alliance was forged following a groundbreaking DE&I research series that culminated in a capstone report, Diversifying Global Accounting Talent: Actionable Solutions for Progress (bit.ly/3iaHas4), in which IMA was a sponsor and NABA a research partner.
Under the new alliance, IMA and NABA will work together to support each other’s mission and reduce inequalities for Black people in the accounting pipeline by cross-promoting membership opportunities and the benefits of both organizations to professionals and students. Students will have the ability to join both organizations for free, and professional members of each organization will be able to join the partner organization at a reduced rate. IMA also will sponsor a select number of students and NABA professional members to participate in a complimentary CMA® (Certified Management Accountant) self-study cohort and pursue the CMA.
IMA and NABA will aim to enhance opportunities for Black students and professionals through joint webinars, podcasts, speaking engagements, published articles, and cohort meetings. Through this MOU, both IMA and NABA are helping to achieve progress toward three key United Nations Sustainable Development Goals: Quality Education, Decent Work and Economic Growth, and Reduced Inequalities.
For me, this alliance exemplifies what DE&I is all about: understanding needs, removing barriers, and consistently providing opportunities for development and growth.
Are there opportunities in your own life, within your organization, or in your wider sphere of influence where you can make a difference toward DE&I progress? I challenge IMA members to seek out those opportunities and be a positive force for change. SF
«I challenge IMA members to SEEK OUT OPPORTUNITIES to be a POSITIVE FORCE FOR CHANGE.»
Gwen van Berne, CMA, is director of finance and risk at Oikocredit and Chair of the IMA Global Board of Directors. She’s also a member of IMA’s Amsterdam Chapter. You can reach Gwen at gwen.vanberne@imanet.org or follow her on LinkedIn at bit.ly/3LVeRGM
IMA/
SHARE YOUR IMA STORY
IMA® members share a strong spirit of camaraderie. What have your experiences as an IMA member been like? Do you have a story about them you’d like to share? If so, please consider writing an IMA Life column that will be published in Strategic Finance. You can be a student member, a young professional, in the midst of your career, or retired. If you’re interested, please email Lori Parks at lori.parks@imanet.org

THE STATS
67% of integral functions aren’t aligned with corporate strategy.
Source: Jackie Wiles, “The 5 Pillars of Strategy Execution,” bit.ly/3WIw3Ws
See “How CFOs Can Better Manage Strategy Execution” on p. 46.
NEWS/
WHISTLEBLOWER AWARDED MORE THAN $5 MILLION

The U.S. Securities & Exchange Commission (SEC) announced in January 2023 an award of more than $5 million to a whistleblower whose information led to a successful SEC enforcement action. According to the press release from the SEC, the whistleblower provided a tip and additional information that helped SEC staff shape their investigative strategy, identify witnesses, and draft document and information requests. The whistleblower also internally reported concerns prior to submitting information to the SEC.
Creola Kelly, chief of the SEC’s Office of the Whistleblower, said, “The whistleblower in this case provided helpful information and substantial ongoing assistance, saving the SEC time and resources during its investigation.”
Whistleblower awards can range from 10% to 30% of the money collected for monetary sanctions that exceed $1 million. As set forth in the Dodd-Frank Wall Street Reform and Consumer Protection Act, the confidentiality of whistleblowers is protected, and the SEC doesn’t disclose any information that could reveal their identity.
Visit www.sec.gov/whistleblower for more information about the whistleblower program and how to report a tip. —Christopher Dowsett, CAE
IMA PARTNERS WITH ACCOUNTING ORGANIZATIONS TO ENCOURAGE DIVERSITY

A new pilot partnership program between IMA® (Institute of Management Accountants), the American Accounting Association (AAA), NABA Inc. (formerly the National Association of Black Accountants), the California Society of Certified Public Accountants (CalCPA), the Center for Audit Quality (CAQ), and Gleim Exam Prep aims to encourage Black students in California to pursue accounting as a profession and Black professionals to consider sharing their experiences in the accounting classroom.
Under the partnership agreement, the six organizations will combine their resources for outreach to Black students and to professionals for teaching roles in California. Pending the success of this pilot program, it can eventually expand to other demographic groups and geographies.
The program is composed of four exclusive tracks: three student tracks—the CMA® (Certified Management Accountant) track, which will map out a career in management accounting and ready participants to become CMAs; the CPA (Certified Public Accountant) track, which will map out a career in public accounting and ready participants to become CPAs; and the leadership track to help participants become business leaders—and one faculty track, which will include faculty development from AAA and assist in the practitioner-to-faculty transition. Information on applying can be found at bit.ly/3D1gEbT
The program implements actionable solutions from the April 2022 report Diversifying Global Accounting Talent: Actionable Solutions for Progress (bit .ly/3iUhRuD) published by IMA, CalCPA, and the International Federation of Accountants (IFAC), and in which all other organizations in the pilot partnership program were involved. In addition to implementing specific actionable solutions, the program responds to that report’s key call to action of advancing the United Nations’ Sustainable Development Goal 17: Partnerships for the Goals. —Lori Parks
For more information on the CMA, go to www.imanet.org/cma-certification. And visit www.imanet.org/csca-credential to learn about the CSCA.

Bulletin
LEADERS CAN CARE AND STILL WIN
Far from being disciplinarians who bark orders, the most effective leaders are caring, relatable, empathetic, and humble.
The global pandemic turned out to be a critical leadership stress test, as leaders had to shoulder increased responsibilities while forging the new normal of working from home or a hybrid workplace. Many management accounting and finance professionals are finding that the pandemic has brought new and different demands on them as leaders. In his book Lead. Care. Win.: How to Become a Leader Who Matters, Dan Pontefract shares nine practical leadership lessons that can help finance leaders inspire employees and drive desired action, even during a time of severe disruption.

Pontefract stresses that leaders are in the relationship business— they’re tasked with developing sustainable professional relationships among stakeholders. The focus of the book is making workplace exchanges more meaningful and mutually productive.
The first lesson centers on being empathetic in cultivating relationships. Zoom founder/CEO Eric Yuan is a multibillionaire who built his reputation by demonstrating that strong leaders are relatable, caring, and humble. Zoom came under public scrutiny for several privacy and security issues as organizations switched to online meetings and virtual learning during the pandemic. Yuan quickly admitted the company’s mistakes, apologized, and committed to fixing the issues. In a nutshell, he demonstrated the three steps of the book’s title: lead, care, and then you’ll likely win.
It’s a significant takeaway for me as a university professor. I was dependent on Zoom for my virtual classes early in the pandemic and made the mistake of trying to convey a persona of perfection, with a perfectly placed backdrop and a quiet, orderly room with no interruptions. I didn’t connect with my students because their lives were anything but perfect. I dropped the image, swapping the fake background with an uninteresting view of my closet doors, repeating myself when I forgot to turn on the recording, pausing when my landscapers were mowing right outside my window, and so on. We connected more as a class when I shared my own struggles.
Each of the book’s lessons is a bite-size bit of wisdom illustrated with real-world stories and examples and ends with a summary of the main takeaways. Other lessons include lead with a sense of purpose, stay present and attentive to the needs of others, lead off with learning, share knowledge to build a wise organization, embrace change, think and act with clarity, commit to balance and inclusivity, and champion others. I recommend the book for anyone seeking practical advice on how to lead more effectively so that people will want to follow you. —Lisa Book, CMA, CSCA, CFM
SURVEY/
CFO PRIORITIES
Gartner surveyed CFOs in November 2022 to identify their top priorities for 2023. “CFOs will be stretched thinly across many activities in 2023,” said Marko Horvat, vice president, research in the Gartner Finance practice. “The survey revealed a wide range of actions CFOs plan to either lead or be significantly involved with.”

The top priorities include:
Evaluating functional strategy, scope, and design
Planning and sequencing finance transformation activities
Communicating and engaging with the board
Setting finance’s technology strategy and road map
Developing a planning, budgeting, and forecasting strategy
Visit bit.ly/3wfBjoX to learn more about the results.
USING ALGORITHMS TO ROOT OUT FRAUD
Finance professionals can use algorithms to pinpoint and fight fraud.
BY DANIEL BUTCHER
MANAGEMENT ACCOUNTANTS CAN HELP THEIR ORGANIZATION TO identify, mitigate, and eradicate fraud by urging leadership to invest in technology powered by AI and machine learning. Such an investment can bolster the organization’s finance, IT, compliance, and risk management functions’ efforts to pinpoint misconduct and bad actors early on. Ultimately, that’s likely to enhance the reputation of the organization’s personnel for conducting themselves ethically and protecting stakeholders from fraud.

Julian Shun, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology (MIT) and a lead investigator in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is the lead instructor of MIT Professional Education’s Graph Algorithms and Machine Learning course (bit.ly/3X7C5j3). His research focuses on the theory and practice of parallel algorithms and programming, with particular emphasis on designing algorithms and frameworks for large-scale graph processing and spatial data analysis.
Graphs are a way to model relationships in data. A graph has vertices and edges, where vertices model objects of interest and edges model relationships between these objects. Shun says that graphs can be used to model social networks, where the vertices are people and edges represent relationships or connections between them, or financial transaction networks, where vertices are buyers and sellers and edges connect individuals who made a transaction with each other.
“Graph algorithms are programs that one runs to find patterns or anomalies in graphs—they can be used to find communities of people with similar interests in social networks or detect anomalous behavior in financial transaction networks,” Shun says. “Data analytics, AI, and machine learning are all related to graph algorithms.”
To use graph algorithms in fraud controls, finance professionals need to first take the data set and determine whether there’s an appropriate graph representation of the data such as an adjacency list, matrix, or set (bit.ly/3GI9nA1). Then, they would need to convert their data into a graph format.
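To make that conversion step concrete, here is a minimal Python sketch, not taken from the article, that turns a flat list of payment records into the kind of directed adjacency-list graph Shun describes; the record layout, account names, and amounts are invented for illustration.

```python
from collections import defaultdict

# Hypothetical transaction records: (payer, payee, amount).
transactions = [
    ("A", "B", 5_000),
    ("B", "C", 4_900),
    ("C", "A", 4_800),
    ("A", "D", 120),
]

def build_transaction_graph(records):
    """Convert flat payment records into a directed adjacency list.

    Vertices are account holders; a directed edge payer -> payee exists
    for every payment and carries the list of amounts between that pair.
    """
    graph = defaultdict(lambda: defaultdict(list))
    for payer, payee, amount in records:
        graph[payer][payee].append(amount)
    return graph

graph = build_transaction_graph(transactions)
print(dict(graph["A"]))  # {'B': [5000], 'D': [120]}
```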
Graph algorithms are frequently used in machine learning pipelines, Shun says. They can be used to cluster data for classification tasks or to identify anomalous structures in a graph to detect fraud or spam, as well as build and train graph neural networks, which can then be used to predict a graph’s vertex attributes, missing edges, or global properties.
DETECTING FINANCIAL FRAUD
Finance professionals can use a graph to model financial transactions. Edges in this graph are directed—if person A made a payment to person B, then the edge would point from person A to person B.
“Oftentimes, in money laundering, some person X sends money to other people, but the money eventually ends up back at person X,” Shun says. “By using graph algorithms, one can find a sequence of vertices, connected by directed edges, that start at person X and eventually end up at the same person X.”
This is called a directed cycle, which is a red flag. It can be a candidate for further investigation into potential money laundering (although not all instances of this pattern indicate money laundering).
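A rough sketch of that cycle check might look like the following; the search routine and the tiny payment graph are illustrative assumptions only, and real transaction networks would call for far more scalable tooling than a hand-rolled depth-first search.

```python
def find_cycle_from(graph, start, max_depth=6):
    """Depth-first search for a directed cycle that begins and ends at `start`.

    `graph` maps each vertex to an iterable of vertices it points to.
    Returns one cycle as a list of vertices, or None if no cycle is found
    within `max_depth` hops.
    """
    stack = [(start, [start])]
    while stack:
        vertex, path = stack.pop()
        if len(path) > max_depth:
            continue
        for neighbor in graph.get(vertex, ()):
            if neighbor == start and len(path) > 1:
                return path + [start]          # money came back to the origin
            if neighbor not in path:           # avoid revisiting within this path
                stack.append((neighbor, path + [neighbor]))
    return None

# Hypothetical payment graph: X pays A, A pays B, B pays X (and C).
payments = {"X": ["A"], "A": ["B"], "B": ["X", "C"], "C": []}
print(find_cycle_from(payments, "X"))  # ['X', 'A', 'B', 'X']
```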
“Graphs are usually large enough that humans can’t manually check for such cycles, so graph algorithms can significantly narrow down the number of cases that financial analysts need to focus on, thereby increasing productivity,” Shun says.
Graph algorithms can also help detect insurance fraud. One can model the customers in an insurance company using a graph, where edges connect vertices of people involved in the same claim.
“If one finds a group of people that are always involved in the same claims across multiple lines of businesses in the company, then there could potentially be insurance fraud,” Shun says. “Using graph algorithms in this scenario will significantly increase productivity, as these patterns are difficult to detect manually.”
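One simplified way to surface that shared-claims pattern, sketched here with made-up claim data and an arbitrary threshold, is to count how often each pair of customers appears on the same claim and flag pairs that co-occur unusually often.

```python
from collections import Counter
from itertools import combinations

# Hypothetical claims: claim ID -> customers named on the claim.
claims = {
    "C1": ["P1", "P2", "P3"],
    "C2": ["P1", "P2"],
    "C3": ["P1", "P2", "P4"],
    "C4": ["P5", "P6"],
}

def flag_frequent_pairs(claims, min_shared=3):
    """Count co-occurrences of customer pairs across claims and flag
    pairs that appear together at least `min_shared` times."""
    pair_counts = Counter()
    for people in claims.values():
        for pair in combinations(sorted(people), 2):
            pair_counts[pair] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_shared}

print(flag_frequent_pairs(claims))  # {('P1', 'P2'): 3}
```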
ETHICAL CONSIDERATIONS
Data privacy is an important concern when graphs are used to represent people and their relationships. Using algorithms to find patterns or anomalies in graphs often requires a global view of the data, because many algorithms take into account both local and global features, Shun says. Therefore, analysts studying these graphs will have access to a lot of confidential data, and it’s important that they abide by data privacy rules.
Choosing which data set to analyze must be done by a human, and algorithms could be biased by what data sets the human believes may have relevant information. Choosing which graph algorithms to use on a data set and setting their parameters must again be determined by a human, and, according to Shun, such a decision could be biased by what they think they should be finding in the data set.
IMA ETHICS HELPLINE
For clarification of how the IMA Statement of Ethical Professional Practice applies to your ethical dilemma, contact the IMA Ethics Helpline.
In the U.S. or Canada, dial (800) 245-1383. In other countries, dial the AT&T USA Direct Access Number from www.business.att.com/collateral/access.html, then the above number.
The IMA Helpline is designed to provide clarification of provisions in the IMA Statement of Ethical Professional Practice, which contains suggestions on how to resolve ethical conflicts. The helpline cannot be considered a hotline to report specific suspected ethical violations.
The output of an algorithm will eventually lead to some decision being taken, and the outcome may not benefit everyone—and may potentially harm some people. Management accountants can help to gauge the potential impact on each group of stakeholders.
“Recently, there has been a whole lot of research on how to design ethical AI and machine learning algorithms and frameworks, and this body of work is important to look at when deploying these solutions in the real world,” Shun says.
Training in the space of graph algorithms would help to improve the productivity of finance professionals who analyze data to find patterns or anomalies, he says. It would also be beneficial to managers of various teams, as they will learn about new ways to analyze data and be able to make better decisions about what kind of data, algorithms, and tools would be the most effective for their team to use.
Financial organizations are now collecting and managing more data than ever before, and that volume is only going to keep increasing each year. Graph algorithms can help management accountants position themselves and their organizations to create value from that data, better serve their customers, and maintain a competitive edge. SF
Daniel Butcher is the finance editor at IMA and staff liaison to IMA’s Committee on Ethics. You can reach him at daniel.butcher@imanet.org
TAXES
CARRIED INTEREST
As impending legislation and episodic deliberations once again raise the possibility of changing the rules for carried interest, it’s a good time to review and understand the concept.
BY KATHRYN SIMMS, PH.D., CPA, CFE, AND VILSON DUSHI, CPA
THERE HAVE BEEN PERIODS OF DEBATE and deliberation surrounding carried interest since the concept first emerged in 2007. The Inflation Reduction Act (IRA) of 2022 (Public Law 117-169) marks the most recent chapter of possible revisions, with early versions of the bill including a provision impacting the carried interest rules. Although the IRA ultimately didn’t change the taxation of carried interest, there remains a high level of commitment to increasing taxes on carried interest.
Key senators such as Sen. Kyrsten Sinema (I-Ariz.) and Sen. Mark Warner (D-Va.) have agreed to work together on future revisions of the current rules (Andrew Duehren, “Sen. Kyrsten Sinema Wins Tax Changes to Democrats’ Climate Bill,” The Wall Street Journal, August 4, 2022, bit.ly/3XFuPLR). With the possibility of change still in the air, it’s a prime time to review what carried interest is, how it’s currently taxed, and the policy viewpoints on the taxation of carried interest.

WHAT IS CARRIED INTEREST?
Private equity, venture capital, and hedge fund investments are commonly organized as limited partnerships. These investments are significant because (1) they have an estimated net asset value of roughly $14 trillion (bit.ly/3XoBJFf), and (2) they foster the development of innovative businesses. Limited partners contribute the bulk of investment capital to the partnership and don’t actively participate in management. Their objective is to profit from appreciation in the partnership’s investments.
General partners actively manage the business interests of the partnership. They’re compensated annually via management fees, usually 2% of assets. They also earn a percentage of the appreciation in the value of the partnership’s investments, which is often 20%, if the appreciation exceeds an agreed-upon rate of return. This payment is called carried interest.
Although general partners frequently contribute a portion of the partnership’s investment capital, much of the appreciation in their portion of the investment (i.e., carried interest) is typically derived from “sweat equity,” or management services.
Consider an example. Hedge Fund A sells its investments annually. It’s organized as a limited partnership. The limited partners contribute $100 million to the partnership. The general partners are entitled to 20% of the appreciation in the investment returns over 8% and a 2% annual management fee. At the end of Year 1, the investment has earned a 25% return, or $25 million. The general partners earn a 2% management fee, or $2 million ($100 million ✕ 0.02), and carried interest of $3.4 million ($100 million ✕ (0.25 ‒ 0.08) ✕ 0.20). The limited partners earn $21.6 million ($25 million ‒ $3.4 million).
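The arithmetic in the example can be reproduced with a short Python sketch; the figures are the article’s, while the function itself is only illustrative.

```python
def fund_economics(capital, total_return, hurdle=0.08, carry=0.20, mgmt_fee=0.02):
    """Split one year's results among the management fee, carried interest,
    and the limited partners, following the Hedge Fund A example above."""
    gain = capital * total_return                               # $25 million
    fee = capital * mgmt_fee                                    # $2 million
    carried = capital * max(total_return - hurdle, 0) * carry   # $3.4 million
    to_limited_partners = gain - carried                        # $21.6 million
    return fee, carried, to_limited_partners

print([round(x) for x in fund_economics(100_000_000, 0.25)])
# [2000000, 3400000, 21600000]
```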
TAXING CARRIED INTEREST
Carried interest is currently taxed under Internal Revenue Code §1061, established under the Tax Cuts and Jobs Act of 2017. Carried interest held more than three years is taxed as a long-term capital gain, whereas carried interest held for three years or less is taxed as a short-term capital gain. The maximum long-term capital gains tax rate is 20%; short-term capital gains income is taxed as ordinary income at a maximum rate of 37%. A 3.8% Medicare surtax will also likely apply.
For example, Hedge Fund A held its carried interest for one year so that its general partners will pay taxes of $1,387,200 ($3.4 million ✕ 0.408). Private Equity Fund X earns the same carried interest, but the holding period is three years and a day. The tax due will be $809,200 ($3.4 million ✕ 0.238), a 42% decrease relative to Hedge Fund A. (Any management fees will be taxed as ordinary income in both cases.)
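The holding-period comparison lends itself to the same treatment; the rates and amounts below are those in the example, and the sketch assumes the 3.8% Medicare surtax applies in both cases.

```python
def carried_interest_tax(carried, held_over_three_years, surtax=0.038):
    """Tax on carried interest as described above: the long-term capital
    gains rate (20%) if held more than three years, otherwise the top
    ordinary income rate (37%), plus the Medicare surtax."""
    rate = (0.20 if held_over_three_years else 0.37) + surtax
    return carried * rate

short_hold = carried_interest_tax(3_400_000, held_over_three_years=False)
long_hold = carried_interest_tax(3_400_000, held_over_three_years=True)
print(round(short_hold), round(long_hold))   # 1387200 809200
print(round(1 - long_hold / short_hold, 2))  # 0.42, the ~42% decrease
```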
POLICY VIEWPOINTS
Most policy viewpoints on how carried interest should be taxed hinge on the character of carried interest income: Is it truly capital income, is it ordinary income, or is it a combination of the two?
Supporters of the current approach of taxing carried interest often view general partners as having made an investment in the limited partnership and thus being entitled to capital gains treatment on the liquidation of their interests, much like the sale of stock.
Many opponents of this viewpoint suggest that carried interest compensates the general partners for managing the partnership, and, thus, the liquidation of their interests should be taxed at ordinary income tax rates, like wage income.
Others see carried interest as hybrid income. From this perspective, general partners have both a capital interest and a service interest in the investment partnership. The capital interest arises from the general partner’s monetary contributions to the partnership (if any). It should be taxed as capital income. The service interest arises from the services the general partner contributes to the partnership. Under traditional tax theory, the service interest should be taxed as ordinary income.
MOST POLICY VIEWPOINTS ON HOW CARRIED INTEREST SHOULD BE TAXED HINGE ON THE CHARACTER OF CARRIED INTEREST INCOME.
Other policy positions for modifying the current approach to the taxation of carried interest include:
■ The current provision is rarely applicable to most private equity firms because the holding periods for most private equity investments exceed three years (Joseph Ferrone, “The Taxation of Carried Interest and Its Effects upon Cities,” Fordham Urban Law Journal, April 2020, pp. 717-745), so that the three-year holding period requirement has few meaningful consequences.
■ General partners commonly earn high incomes. Some say it isn’t fair to give these partners a tax break on carried interest when average individual taxpayers can’t earn this tax break on their own compensation.
Common policy positions for continuing the current tax treatment include:
■ Investment partnerships shepherd the foundation of new companies that bring important innovations to the marketplace, greatly benefiting the U.S. government and U.S. consumers. A tax concession could be more than worth the value of these innovations to the U.S. economy.
■ The Congressional Budget Office has estimated that additional tax revenues generated from taxing carried interest as ordinary income would be $14 billion across 10 years (bit.ly/3iAOayz). Some argue that this amount of savings isn’t significant compared to the U.S. national debt of more than $31 trillion.
Carried interest is likely to continue to be a prominent topic in taxation. Consequently, it’s important for financial professionals to understand the taxation of carried interest and the policy issues surrounding it. Ideally, these professionals will be prepared to react nimbly and counsel others in the advent of new legislation in this area. SF
Kathryn Simms, Ph.D., CPA, CFE, is an assistant professor of accounting at Radford University. She can be reached at ksimms1@radford.edu
Vilson Dushi, CPA, is an instructor of accounting and director of the Governmental and Nonprofit Assistance Center at Radford University. He can be reached at vdushi@radford.edu
© 2023 A.P. Curatola
DO INTERNAL CONTROLS STIFLE CORPORATE INNOVATION?

Internal controls are often thought of as bureaucratic hurdles that stifle creativity, but better control systems just might be a key to greater innovation.
BY BRIAN MILLER, PH.D.; AMY SHENEMAN, PH.D.; AND BRIAN WILLIAMS, PH.D.
INTERNAL CONTROL STRUCTURES CAN HELP organizations prevent fraud, monitor risk, and improve the quality of their financial information. But can internal controls also help organizations increase their innovative output? A recent research study we conducted suggests that the improved information environment from higher-quality internal controls can help companies identify and patent their most innovative ideas. In fact, we found that when companies have higher-quality internal controls, they have more patents, and those patents relate to more valuable ideas for companies.
At first glance, it isn’t obvious that internal control quality would influence innovative output. On one hand, internal control systems could create an overly bureaucratic environment that hinders innovation and creativity. On the other hand, internal controls could allow for systematic documentation of projects and better information flows within organizations. Having this increased amount of information could help managers identify the most promising projects to patent.
OUR STUDY
We examined data on the internal control quality of U.S. public companies along with data on patent filings from the U.S. Patent and Trademark Office. We started our analysis by examining the relation between ineffective controls and innovation. (For the complete study, see “The Impact of Control Systems on Corporate Innovation,” Contemporary Accounting Research, Summer 2022.)
Because internal operational controls of public companies are largely unobservable to outsiders, we relied upon data about financial controls, which are publicly disclosed through various financial reporting filings with the U.S. Securities & Exchange Commission. The quality of financial controls can help us better understand operational controls because financial and operational controls are largely interdependent.
To measure the quality of internal controls, we used a three-pronged approach. First, we gathered data on financial control weaknesses, which are publicly disclosed on an annual basis as required by the Sarbanes-Oxley Act of 2002.
Second, we identified instances where public companies had to restate their financial statements. Companies restate financial statements to correct errors in their original filings, which suggests these companies likely had ineffective controls when they filed their original set of financial statements. Third, we inferred ineffective controls using a statistical prediction model from academic research.
We measured innovation as either the stock market’s value of patents or the number of patents. While the stock market’s value of patents provides an indication of patent value based on the market response to the patent, the number of patents provides a measure of the quantity of innovation over a given period. Regardless of which of these measures we examined, we consistently found that companies with worse internal control quality also had lower innovative output.
At the start of our research, we conjectured that higher-quality controls can help managers best identify patents that position the company well in terms of market and technology changes. In other words, managers can better survey their in-process research projects and convert them into valuable patents. To explore this possibility, we examined a mechanism for this effect—the ability of companies to convert research and development (R&D) dollars into patents.
POOR INTERNAL CONTROLS HINDER INNOVATION
Ineffective control systems are likely to reduce the quality of data, as well as the quality of project tracking and identification within organizations. If poor internal controls reduce managers’ ability to receive accurate and timely information, managers may be unable to identify the internal projects with the most valuable intellectual property within their organizations. In other words, managers may be unable to allocate appropriate levels of staffing and financial resources to projects that can maximize their company’s R&D investments.
Consistent with our conjecture, we found that poor control quality hinders companies’ ability to convert R&D dollars into patents. We interpret these results as suggesting that when the quality of internal controls is low, managers are less likely to get the most out of their investments in R&D.
The process of converting R&D into projects that can be patented can be a lengthy one. As such, it’s unclear whether the investment in internal control structures pays off immediately or in future periods. Yet our evidence suggests that internal control quality influences innovative patent output both in the year of the patent filing and in the subsequent year. This evidence suggests that investments in internal controls have benefits beyond the current year.
LIMITATIONS AND IMPLICATIONS FOR PRACTICE
Combined, our evidence suggests that managers who care about innovation should consider the benefits of stronger control systems. Our evidence also suggests that control systems can lead to an increase in the quantity and value of patents as measured by the stock market’s reaction to patent filings. On average for the companies in our sample, higher internal control quality is associated with increased innovative output. In particular, these control systems appear to help managers convert R&D into valuable patents.
We found these results when comparing companies with higher-quality controls to those that don’t have higher-quality controls as well as when looking within the same company, examining changes in innovative output as the company’s own internal control quality changes.
Our findings shouldn’t be seen as advice that improving control systems will result in immediate economic payoffs. There are certainly some companies for which innovation is less important, and these companies are likely to see less benefit on the innovation front from an investment in improved controls. Furthermore, it’s important to remember that while we found results on average, the results may not hold for all companies, even within those for which innovation is important.
Despite these limitations, we believe our study helps us better understand the innovative process within companies. We showed that ineffective control systems are associated with lower levels of innovative output. Overall, our evidence suggests that higher-quality control systems improve the quality of information within companies, enabling management to identify and patent projects with higher innovative potential. Broadly speaking, our study suggests effective control systems can benefit companies when it comes to innovation, which is an important strategic priority for corporate executives. SF
Brian Miller, Ph.D., is the Sam Frumer Professor of Accounting at the Kelley School of Business at Indiana University. Brian can be reached at bpm@indiana.edu
Amy Sheneman, Ph.D., is an assistant professor of accounting at the Fisher College of Business at Ohio State University. Amy can be reached at sheneman.2@osu.edu
Brian Williams, Ph.D., is an associate professor of accounting at the Kelley School of Business at Indiana University. Brian can be reached at bw63@indiana.edu
COMPANIES WITH WORSE INTERNAL CONTROL QUALITY ALSO HAD LOWER INNOVATIVE OUTPUT.
THE EVOLVING FOCUS ON CRYPTO ASSETS
Recent front-page fraud allegations have brought renewed attention to proposals to modify how entities report digital assets in financial statements.
BY SHARI LITTAN, J.D., CPA
THE TERM "DIGITAL ASSETS" REFERS TO THE digital records made by using cryptography for verification and security on a distributed ledger called a blockchain (see Strategic Finance, “From the Mainframe to the Blockchain,” bit.ly/3lRmlh7). This decentralized ledger, the blockchain, securely records information that’s duplicated across user systems. A crypto key allows users to access a platform that connects into the blockchain. A user creates, or mints, a digital asset by adding new information to the blockchain, but existing entries remain unaltered. Through these entries, users can exchange newly created or existing digital assets.

An entity may use digital assets, depending on their underlying arrangement, for a variety of purposes, including as a means of exchange for goods or services or for financing. The variety of arrangements makes it important to consider the specific terms and risks in determining how to account for them.
Fast-paced market trends with respect to digital assets are creating challenges in developing and delivering timely and meaningful accounting guidance. Moreover, the diversity in the types of digital assets makes it a complex proposition for regulators such as the U.S. Securities & Exchange Commission (SEC) and standard setters such as the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) to set accounting rules that are meaningful. A useful criterion, however, is whether the digital assets are exchangeable. While some can be used to conduct transactions, others, such as NFTs (nonfungible tokens), allow the user access to an underlying, unique digital item such as collectible artworks.
ESTABLISHING CONSENSUS
The FASB’s Emerging Issues Task Force (EITF), the American Institute of Certified Public Accountants’ Digital Assets Working Group, and similar professional groups have been examining the various types of digital assets to consider the accounting issues. These assessments have resulted in a consensus that digital assets, falling outside the definition of cash and cash equivalents, financial instruments, and inventory, are indefinite-lived intangible assets.
Following this consensus, digital assets are accounted for under U.S. Generally Accepted Accounting Principles (GAAP) under Accounting Standards Codification (ASC) Topic 350, Goodwill and Other Intangibles. But this raises a problem. Although the value fluctuates, as intangible assets, they’re measured at historical cost. The holder doesn’t reflect changing value unless the recorded amount is considered impaired, meaning permanently unrecoverable. After recognizing an impairment, subsequent recoveries in value aren’t reflected in the accounts.
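A simplified sketch of that cost-less-impairment model, illustrative only and not authoritative GAAP guidance, shows why many holders find it unsatisfying: the carrying amount can only ratchet downward.

```python
def carrying_amount_under_asc350(cost, observed_fair_values):
    """Illustrative cost-less-impairment tracking for a crypto asset held
    as an indefinite-lived intangible: write the asset down when value
    falls below the carrying amount, never write it back up on recovery.
    (Actual impairment testing is more involved; this is a simplification.)"""
    carrying = cost
    history = []
    for fair_value in observed_fair_values:
        if fair_value < carrying:          # simplified impairment trigger
            carrying = fair_value          # recognize an impairment loss
        history.append(carrying)           # later recoveries are ignored
    return history

# Bought at 100; price swings to 80, recovers to 150, dips to 90.
print(carrying_amount_under_asc350(100, [80, 150, 90]))  # [80, 80, 80]
```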
Many consider this accounting inadequate and not reflective of the economic conditions of holding these assets. “It’s been an interesting journey since I first saw these instruments come out,” remarked FASB Chair Rich Jones at the December 2022 FASB board meeting. “Initially I was told that they would be used generally to facilitate exchanges, or possibly hedges against inflation.” But, he noted, at the end of the day, these assets are seemingly a “speculative investment,” and users will benefit from a “true reflection of volatility.”
Stakeholder outreach accelerated the FASB’s attention to the area. The FASB received requests from certain U.S. congressional members and feedback from users in the financial services sector. Certain FASB stakeholders also raised this as a priority (FASB, 2021 Agenda Consultation, bit.ly/3vF69Xx).
After staff research and recommendations in late 2022, the FASB decided to issue an exposure draft in early 2023 of proposed guidance that covers a narrow group of crypto assets that:
1. Meet the U.S. GAAP definition of an intangible asset;
2. Do not provide the asset holder with enforceable rights to, or claims on, underlying goods, services, or other assets;
3. Reside or are created on a distributed ledger (i.e., blockchain);
4. Are secured through cryptography; and
5. Are fungible (Scott Muir, “FASB addresses crypto asset presentation and disclosure,” KPMG, bit.ly/3X12uj8).
Under the FASB’s pending guidelines, if adopted, holders would measure assets that satisfy these criteria at fair value, with unrealized gains and losses recognized in net income. The entity would determine fair value by following ASC Topic 820, Fair Value Measurement, and other related standards.
If adopted, an entity would also present the aggregate amount of its crypto assets separately from other intangibles; it would similarly present gains and losses separately from other income statement items related to intangibles, such as amortization or impairment losses. On the statement of cash flows, an entity effectively would report the receipt of noncash crypto assets that it converts to cash almost immediately as operating cash flows (forgoing treatment as noncash items). This accounting would apply to all entities; there would be no carveout or a phased-in approach for nonpublic entities.
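By contrast, the proposed fair value approach could be sketched as follows; this is purely illustrative, and actual measurement would follow ASC Topic 820 and whatever final standard the FASB issues.

```python
def fair_value_remeasurement(cost, period_end_fair_values):
    """Illustrative fair value model for crypto assets in scope of the
    proposed guidance: remeasure each period and take the unrealized
    gain or loss to net income."""
    carrying = cost
    gains_and_losses = []
    for fair_value in period_end_fair_values:
        gains_and_losses.append(fair_value - carrying)  # unrealized gain/(loss)
        carrying = fair_value                           # new carrying amount
    return carrying, gains_and_losses

# Same price path as before: 100 cost, then 80, 150, 90 at period ends.
print(fair_value_remeasurement(100, [80, 150, 90]))  # (90, [-20, 70, -60])
```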
WAIT AND SEE
The IASB, unlike the FASB, has taken more of a wait-and-see approach. For the moment, the International Financial Reporting Standards Interpretations Committee has determined that reporting entities should apply International Accounting Standards (IAS) 38, Intangible Assets, to their accounting for crypto assets, unless they hold these digital assets for sale in the ordinary course of business. In the latter case, the entity reports this under IAS 2, Inventory. The IASB, thus far, has decided not to add a project on cryptocurrencies and related transactions. As IASB Chair Andreas Barckow said during a joint session with the FASB in September 2022, “Has the use of digital assets become pervasive on a global scale? Not yet, but we are monitoring.”
The general view by the IASB is that the area is complex and changing, and standard setting may be premature, particularly as policy makers address broader issues than just accounting. The debate remains on whether digital assets, and the narrower category of crypto assets, should be within the scope of a standard that generally addresses intangibles. The IASB has a broad project regarding intangible assets in its research pipeline, and it may consider the most meaningful way to address cryptocurrencies as it addresses the broader issues.
TRANSPARENCY CONCERNS
In the United States, the SEC is bringing attention and resources to the varied issues around digital assets. For example, operating a crypto asset platform creates verification and security obligations (that is, potential liabilities), and the SEC is concerned about transparency by entities with such safeguard responsibilities (SEC, Staff Accounting Bulletin No. 121, bit.ly/3jUgxs2). Among other issues, the SEC is concerned about the use of crypto assets to raise capital (SEC, “Spotlight on Initial Coin Offerings (ICOs),” bit.ly/3GFLMjs) and lending transactions of crypto assets. Due to the need for specialized attention, the SEC announced in September 2022 the formation of a new Office of Crypto Assets within the Division of Corporation Finance’s Disclosure Review Program.
The acceleration of technology-based transactions highlights the need for regulators, standard setters, corporate accountants, and other stakeholders to respond with agility to risks and provide transparency. SF
Shari Littan, J.D., CPA, is director of corporate reporting research and policy at IMA. She can be reached at shari.littan@imanet.org
SMALL BUSINESSES AND THE RETENTION ADVANTAGE
Small businesses can leverage their agile adaptivity to create retention strategies that minimize recruitment costs.
BY YVONNE BARBER
THE PANDEMIC REDEFINED THE WORKPLACE. Employees dubbed this time the Great Awakening, a chance to reset priorities and to make decisions about their careers. Employers, on the other hand, called it the Great Resignation, an unprecedented attrition phenomenon that caused them to reevaluate their strategies for managing their workforce.
Whatever the angle, an effective strategy for retention requires an understanding of the motivators that fueled the surge of resignations. According to Strengthening Workplace Culture: A Tool for Retaining and Empowering Employees Globally, a 2022 workplace culture report from the Society for Human Resource Management, “Most workers who have thought about leaving their current organization work in organizations with poor cultures. Nine out of 10 workers (90%) who rate their culture as poor have thought about quitting, compared with 72% of workers who rate their organizational culture as average and 32% who rate their culture as good” (bit.ly/3GGTlGp).
For small businesses concerned that employees might be lured away by higher salaries, this suggests that there are other motivators that budget-conscious managers can address without entering a bidding war they can’t win. By examining some of these motivators and building a culture that encourages retention and inclusion, small businesses can remain competitive.

LACK OF TRAINING AND TRANSPARENCY
In small companies, loyal employees who excel are often rewarded with promotions, though they sometimes lack the leadership skills needed to perform in their new role. There are many economical career development training options available through professional organizations and local universities that can address any deficiencies. When considered as part of a comprehensive employee retention strategy, this can be a relatively inexpensive means to boost retention.
In addition to improving communication skills, regular and intentional communication is also important. Transparent and consistent communication feels inclusive and helps employees collaborate while also ensuring that everyone is strategically aligned and working together toward common goals.
BURNOUT
In small business, employees are often called upon to fill several roles. This can lead to employees feeling as though they’re constantly stretched too thin, without an acknowledgment or reward for this additional effort. When there is frequent turnover or other types of short-staffing issues, it can seem as though there’s never a light at the end of the tunnel and employees begin to feel exhausted and resentful.
The solution within small businesses may look very different than it might in a large company, which may have an entire department dedicated to finding creative ways to promote employee well-being and resources to manage the workload when there’s a vacancy or absence.
Within smaller organizations, cross-training is essential to help balance workloads and create support for coworkers who are away from the workplace. This may require that any segregation of duties that’s in place be put on hold in the interest of maintaining operations during an absence or vacancy.
This may raise a potential risk temporarily, but small companies are often forced to choose between internal controls and continuing operations. Balancing these risks and providing appropriate cross-training is essential to small business continuity. As a bonus, cross-training presents opportunities for employees to develop new skills and identify undiscovered interests and talent.
LACK OF CAREER GROWTH OPPORTUNITIES
Sometimes it’s difficult for managers to envision career pathways and growth opportunities within a small company because the business is focused on the immediate tasks that need to be completed and managers may have little energy left for anything else.
According to human resources expert Sheree Knowles, “Most companies think about career growth in a traditional sense. They believe that the only way to achieve career growth is through promoting employees to a higher position. In small companies, there aren’t many opportunities for promotions.”
Knowles advises small companies to shift from a traditional “career ladder” point of view to a “career lattice” perspective where employees can grow their careers by taking lateral positions, shadowing others, and receiving stretch assignments.
ESTABLISHING PATHWAYS
Small businesses have a superpower that large companies don’t have: They’re agile and nimble. They can move quickly and adapt to this new competitive landscape. There are several solutions that small companies can implement that don’t require a big budget.
Set clear performance expectations by providing regular and consistent performance reviews that include areas of opportunity for training and development. Schedule at least one separate career pathway conversation with employees to help them consider what may be available within your company. Remember, it doesn’t always have to result in a promotion.
Develop compensation and benefit packages to serve a multicultural and multigenerational workforce. Small companies that can’t compete on salary alone can compete based upon their knowledge of their employees by customizing an overall package that resonates with their employees. For some employees, flexibility to balance personal commitments can be an important benefit that inspires retention. For others, it might mean active mentorship for younger professionals who value recognition and encouragement.
The important thing to understand is that there isn’t a one-size-fits-all approach to this, and one of the benefits a small company can bring to its employees is a customized package that considers their needs rather than offering a plethora of benefits that aren’t needed or valued by its employees. Keep this simple. Ask them what’s important to them and look for ways to offer benefits they’ll value.
RECOGNITION AND INCLUSION
A strong culture of employee recognition can go a long way toward helping companies retain employees, according to a new survey of more than 7,600 U.S. adults who were employed full-time or part-time (Gallup and Workhuman, Unleashing the Human Element at Work: Transforming Workplaces Through Recognition, 2022, bit.ly/3GgdG4a). Leaders who take the time to understand what motivates their employees and how they prefer to receive feedback do well with this. Leadership training can provide tools that will help.
Diversity, equity, belonging, and inclusion are important for companies of all sizes, but the impact within a small company can be even greater. Diversity of thought inspires innovation, which brings a motivating energy to the environment. Small businesses have demonstrated tremendous resiliency in recent years, but the uncertainty we continue to face will require continued innovation and a commitment to a sustainable strategy. SF
Yvonne Barber is a fractional CFO and small business specialist. She is a member of IMA’s Atlanta Chapter and chair of the IMA Small Business Committee. She can be reached at vonnie.barber@gmail.com


THROUGH
BUSINESS RESILIENCE DEMAND PLANNING

Finance professionals can help their organizations weather uncertain market conditions by overcoming demand planning challenges.
BY BILL KOEFOED
The 2023 outlook looks stormy: Economic downturn, continued inflation, volatile foreign exchange rates, and a sustained higher level of uncertainty are all on the horizon. The ongoing pandemic, especially in China, coupled with geopolitical conflicts and trends in deglobalization, has struck global supply chains, bringing instability and higher costs to many economies. Supply chain disruptions are putting long-standing operational models at risk, from offshoring to Just-in-Time, with potentially unforeseen consequences. This scenario hasn’t been seen by business leaders in many years.
Amid this environment, a meaningful way to improve financial performance is through a more effective demand planning process. But accurately planning demand under these volatile market conditions is tough, making it challenging to positively impact financial results. It doesn’t have to be this way. There are several best practices companies can follow to improve demand planning and drive better financial results.
Developing the most accurate forecast for the future demand of a product or service is critical to optimizing performance and business results. Nevertheless, most organizations struggle to accurately forecast and plan for demand under current circumstances. And the consequences of those complexities are forcing organizations to face higher inventory levels and costs, unsatisfied customers, and missed market opportunities. More effective demand planning can help to unlock improved financial performance. But organizations must first understand the inhibitors to accurate demand planning in the current environment. (See Table 1 for more on evaluating your demand planning process.)
Understanding Demand Management
Companies perform demand planning to ensure that they have the right materials to produce the right products, in the right quantities, delivered to the right location, at the right time in order to meet customers’ expected service levels. The duration of this cycle is often referred to as the supply chain lead time (or sourcing lead time, production lead time, or distribution lead time). Companies can’t wait for customer orders before they start planning; they must use demand planning to anticipate future demand and then plan their supply chain activity accordingly.
Demand planning is part of the demand management process that has been in use since the 1950s. In recent years, demand planning has been gaining sophistication as Big Data, analytics, AI, and machine learning technologies enable greater refinement in demand modeling and lower the entry barrier for more organizations. Demand planning has a strong forecasting component but involves much more than just statistics. When done well, demand planning considers the inputs from various sources and experts to build the right assumptions to be used in forecasting models. (See Figure 1 on p. 28 for an overview of the methods involved.)
Planning Demand Today
Even in normal times, gaining an understanding of how the market will react to new products and services, promotions, or price changes is difficult—and it’s much more difficult under current market conditions. For many business leaders, figuring out price elasticity and its impact on demand is becoming an obsessive pursuit for a simple reason: to help with decision making. For example, a premium beverage manufacturer would need to know how sensitive consumers are to a price increase in order to prevent them from shifting to lower-priced drink options. But the manufacturer also needs to protect margins from surges in ingredient and energy prices.
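To illustrate the elasticity math behind that trade-off, here is a small sketch with entirely invented numbers and a constant-elasticity assumption; it isn’t a claim about any real beverage market.

```python
def demand_after_price_change(base_volume, price_change_pct, elasticity):
    """Constant-elasticity approximation: a price change of p% shifts
    volume by roughly elasticity * p%. Elasticity is negative for
    normal goods (higher price, lower demand)."""
    return base_volume * (1 + elasticity * price_change_pct)

# Hypothetical premium beverage: 1,000,000 units, elasticity of -1.5.
base_volume, base_price, unit_cost = 1_000_000, 3.00, 2.00
new_price = base_price * 1.05                      # 5% price increase
new_volume = demand_after_price_change(base_volume, 0.05, elasticity=-1.5)

old_margin = base_volume * (base_price - unit_cost)
new_margin = new_volume * (new_price - unit_cost)
print(round(new_volume), round(old_margin), round(new_margin))
# 925000 1000000 1063750
```

With these assumed inputs the margin gain outweighs the lost volume, but a higher elasticity would flip the answer, which is exactly why the estimate matters to decision makers.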
Yet striking a balance between supply and demand, at the right price point, with this level of variability is hard. High forecasting error rates can have a significant impact on a company’s profitability. When the demand plan is inaccurate, either excess or obsolete inventory piles up or, on the contrary, insufficient inventory causes stock-outs, resulting in lost revenue and reduced customer loyalty.
Given all of that, why is effective demand planning one of the top priorities for business leaders looking to persevere in current market conditions? Because accurate demand planning directly impacts and strongly correlates with financial results. Here are some examples:
■ Profit and loss (P&L): increased revenue and market share, enhanced margins from effective assortment planning and marketing, and reduced reactionary spending on emergency actions (premium expedited freight, labor overtime, etc.)
■ Balance sheet: improved inventory and working capital efficiency; asset and resource optimization by aligning purchasing, production, and distribution plans with demand planning
■ Cash flow: improved capital allocation to high-demand, high-margin items; improved days inventory outstanding (DIO) resulting in lower interest expense
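Two numbers anchor most discussions of demand plan quality: forecast bias and forecast error. The sketch below uses generic formulas and hypothetical monthly unit figures; neither comes from the article.

```python
def forecast_metrics(actuals, forecasts):
    """Two common demand planning metrics:
    bias - mean of (forecast - actual); positive means over-forecasting.
    MAPE - mean absolute percentage error versus actual demand."""
    errors = [f - a for a, f in zip(actuals, forecasts)]
    bias = sum(errors) / len(errors)
    mape = sum(abs(e) / a for a, e in zip(actuals, errors)) / len(errors)
    return bias, mape

# Hypothetical monthly unit demand vs. plan.
actuals = [1200, 1100, 900, 1500]
forecasts = [1300, 1000, 1000, 1400]
bias, mape = forecast_metrics(actuals, forecasts)
print(round(bias, 1), f"{mape:.1%}")  # 0.0 8.8%
```

The sample plan is deliberately unbiased yet still about 8.8% off on average, which is why planners watch both measures.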
Helping demand planners to tame forecast bias and suppress error levels is critical to optimizing business results. Why, then, is demand planning so hard?
The Difficulty of Demand Planning
Let's examine three of the top challenges of demand planning. The first challenge is planning demand under high-variability conditions like those that exist today. A simple way to understand this is by looking at the Organisation for Economic Co-operation and Development (OECD) passenger car registration figures for the United States for the last few years (see Figure 2 on p. 29). For 2020—the year the COVID-19 pandemic emerged—the chart shows a drastic decline from February to April 2020 followed by a rapid recovery in the subsequent months. If this decline isn't considered in the model as an outlier—and thus excluded—quantitative methods would yield less accurate results in the annual forecasts for future periods.
Therefore, to improve prediction quality, businesses must continually review and adapt their demand planning to constantly changing market conditions. The challenge is that there are many statistical models available and a broad range of quantitative and qualitative methods that need to be combined and used depending on the circumstances and specific characteristics of what's being analyzed. The most complex statistical method doesn't always provide superior results. Rather, the quality of the results ultimately depends on the frequency with which data is being ingested, the ability to effectively use various internal and external business drivers to enrich the training of these models, the product characteristics (e.g., an upgrade vs. a totally new product), and more. With higher market volatility, the forecasting and planning error rate grows and trust in the demand planner's results diminishes.

FIGURE 1: DEMAND PLANNING BASICS
Demand planning is part of the demand management process and is a collaborative and ongoing effort intended to determine the level of demand that can be satisfied with the available resources in each period. Demand planning starts with the forecast, a statistical analysis enriched with inputs from marketing and sales, product development, and the business strategy. Demand planning may use both quantitative and qualitative methods.

TABLE 1: CHECKS TO EVALUATE YOUR DEMAND PLANNING PROCESS

Check: Demand management is an established ongoing process in which demand planning kick-starts to support long-range and annual plans, and it supports demand generation activities continuously.
Rationale: Demand management consists of demand planning, demand sensing, demand shaping, and demand prioritization. Assuming that demand just "happens" spontaneously and can't be managed is erroneous: An organization must follow up a demand plan by sensing what the actual demand is, trigger actions (marketing campaigns, promotions, new product launches, etc.) to shape the demand, and have a strong governance to prioritize demands when resources are limited.

Check: Demand management is a key process in the organization in that it's actively supported by top (C-suite) leadership.
Rationale: Determining demand is key for an organization's business performance; it will determine the revenue potential and can help to minimize cost and protect cash.

Check: Demand planning is communicated and leveraged into wider planning processes such as financial planning and analysis, sales and operations planning, and/or integrated business planning.
Rationale: Communicating demand is key to the organization's success. Communication and collaboration among the different demand management stakeholders is essential to ensure the company meets its objectives.

Check: Core stakeholder functions of demand planning should include but not be limited to product development, marketing and sales, supply chain, finance planning and control, workforce, and IT staff.
Rationale: All these functions should have a say in the demand plan and should leverage the plan for their responsibilities. In some industries, customers, suppliers, and partners can also participate in demand planning.

Check: Collaboration among stakeholders is effective and aims at reaching the consensus demand plan.
Rationale: The consensus demand plan is one that has been agreed to by all stakeholders and has been reviewed against drivers and constraints that impact the business.

Check: Demand planners spend most of their time in analysis and value-add activity vs. gathering data.
Rationale: Normally, demand planners are wrestling with disparate sources. They want to minimize inaccurate, incomplete, and otherwise low-quality inputs and biased assumptions.

Check: The demand plan is expressed in ways that are understood by both finance and operational organizations.
Rationale: Each stakeholder views the same information through a different lens. At the core, volume (unit) measures must also be expressed in value (currency) measures to ensure an effective demand planning process.

Check: Demand planning runs at a frequency (e.g., monthly, weekly, or daily) to keep up with changing market conditions and supports the supply chain and delivery process.
Rationale: Speed matters. The time elapsed between a disruptive event and a decision taken and executed is key to capturing or keeping business value.


Second, demand planning is hindered by organizational disarray. Demand planning is a key step in sales and operations planning and integrated business planning. These processes are underpinned by cross-functional collaboration between leadership, finance, sales, marketing, supply chain, operations, and other functions. Yet in most organizations, these business units operate in isolation from one another and occasionally act on contradictory assumptions and conflicting targets. There needs to be further unification of these functions.
Even if demand planners generate solid forecasts, those forecasts are neither followed nor trusted. Why? Because the demand forecasts aren’t supporting the planning assumptions of every business unit. And that results in each function using its own self-generated forecast, creating different views of a shared reality across functions.
Another organizational concern is the placement of demand planning within the overall organization, both functionally and hierarchically. There's no straightforward answer to the question of where demand planning should be placed, and the options are diverse. Logistics, marketing, sales, finance, strategic planning, supply chain operations, or an independent team—any of these functions can own the demand planning mission. Not to mention, demand planning that's untrustworthy or excluded from the C-level agenda also tends to get pushed down in the hierarchy.

FIGURE 2: OECD PASSENGER CAR REGISTRATION IN THE U.S.
(Percentage change 2015 = 100)
Source: "Passenger car registrations," OECD Data, 2022, bit.ly/3YBPCkz
Finally, many demand planners are working with obsolete technology. Using legacy technology may sound paradoxical in the Digital Age, when technology is ubiquitous and data is abundant, but it happens. In today's world, a wide variety of planning tools are available alongside abundant expertise and insights to input into the demand planning activity. While these factors should raise the accuracy of demand planning, the results are often quite the opposite. Why is that? Because demand planners often need to handle a mix of different data sources and tools. This mix includes spreadsheets, paper-based reports, and flat files that must be uploaded manually to the planning tools.
Sometimes there’s a direct integration to a source system, but data management and transformation may be required when preparing data to feed into statistical models—and that takes time. If the planning and modeling tools aren’t suited to blend all this data intelligently, more manual manipulation will be required by the demand planner or technical expert. That added manipulation will then create a bottleneck and create more risk for potential errors.
Consolidating all these disparate inputs is what consumes most of the time and effort of demand planners,
leaving little time for modeling, analyzing, and interpreting the results. Therefore, demand planners are forced into a suboptimal compromise: discard some input sources and simplify the analysis, or miss critical deadlines. If the demand planning process is overly cumbersome or time-consuming, it's unlikely the forecast will be refreshed as often as it should be.
When businesses don’t refresh their technology to capitalize on the rich and abundant data sources available, demand plans are delayed, erratic, incomplete, or lacking the required level of detail. And whatever the effect, it diminishes the value added to the overall business planning process.
Modern, cloud-based demand planning technologies must therefore be leveraged to handle the growing number of internal and external business drivers, factors, and data volumes. Now is the time for businesses to elevate demand planning as a critical activity to conquer complexity, uncertainty, and risk.
A Better Way
Organizational and technological issues in demand planning aren't new. The current market conditions, however, highlight the impact these issues are having on business performance. Understanding the issues hindering demand planning and the importance of making the process more effective is critical for businesses seeking to optimize financial performance.

CONSIDERATIONS THAT ENABLE BUSINESS TRANSFORMATION JOURNEYS
Top leadership must be engaged in demand planning and consume its outputs. The key objective for demand planning is to depict what the demand is going to be for the products and services, and this should be a key driver for strategic goal setting. So it's essential that the C-level is fully committed to the process.
Stakeholders must collaborate. Team collaboration starts with a clearly communicated vision, expectations set from the start, and KPIs that are aligned to strategic objectives and financial attainment.
Finance needs to support demand management with visibility on how a demand plan (or a change in the plan) affects profit and loss, the balance sheet, and more. This is the best way the finance organization can provide value to the demand management process and ensure every action taken by marketing, sales, and supply chain operations supports the business goals and financial objectives.
Marketing or an independent function should own demand management. To avoid biased forecasts and plans, demand planning should be done by an independent team or be placed within marketing. If, for example, demand planning is owned by supply chain, it may be biased by the resources and supply constraints and won't reflect the true demand picture in the market. Notwithstanding, demand management must be closely connected to key commercial and supply chain activities, such as promotions, inventory, purchasing, and more.
Demand management processes should be challenged by removing deviations, reducing variations, taking away steps and complexity, redefining measurements, and ultimately automating tasks. When a single view of the plan is instilled and teams are collaborating, the need for data checks and for file and data reconciliation is eliminated. When a single solution is implemented, the data verification and certification can be signed off on easily. All of this contributes to simplification, and it unleashes productivity that can be redirected to higher-value activities for the demand planners.
Invest in technology to drive higher demand planning accuracy and support faster decision making. Each product family reacts differently to demand variability, so equip demand planners with the best tools. Demand planning technology today leverages machine learning and multiple data sources to increase forecasting accuracy, accelerate the run frequency, make simulations, and show the same information in ways that satisfy the needs of all stakeholders.
Fortunately, as organizations deal with increasing turmoil, technology is making steady progress. Big Data analytics, AI, and machine learning offer good examples, and these technologies are now making an impact in demand planning and forecasting.
AI and machine learning can help drive more effective demand forecasting by ingesting higher volumes of varied data, determining the relationships between drivers (features and events) and their impact on the forecast, and executing thousands of forecasting models at a fraction of the time required with prior methods. The time saved frees up
the demand planner, allowing for more trials with different forecasting methods in a shorter time frame. The benefit? AI solutions can execute far more simulations and recommend to the planner which statistical method drives the most accurate prediction for a given product and business context.
What, then, can organizations do to improve the effectiveness of demand planning? A preliminary step is to identify the root causes of demand planning not yielding accurate results. Organizations can then rank the causes by severity and the effort required to resolve them. Through this process, organizations may gain valuable insights for building the business case for change.
A few important considerations will allow businesses to shape their own transformation journeys and serve as guardrails along the way once an organization is determined to revamp demand planning.
■ Unify all planning activity and identify demand planning as a pivotal piece. Planning doesn’t work well when done in silos. Why? Because doing so creates redundancy, ring-fences the impact of the plans, and blurs the vision. Unifying planning across business lines and functions instead offers a single view of the business that removes organizational bias and provides agility in return. In an ideal world, demand forecasting should be one of the first steps in overall business planning.
■ Seek sponsorship from finance leadership. Under a unified approach to planning, the CFO and the finance organization can better support demand prediction and align it with other planning activities. Financial planning sets the organizational targets, but how realistic is a revenue target when the demand forecast is falling short? Demand planning acts as a check-and-balance step against the financial targets by providing business context to the financial ambitions of an organization. Therefore, the CFO should have a vested interest in supporting demand planning.
■ Align demand planning key performance indicators (KPIs) with financial objectives. Higher demand forecasting accuracy is essential to helping optimize replenishment and safety stocks, but how does that affect financial performance? What if demand scenario modeling could be checked right away against the P&L (income statement), balance sheet, and cash flow statement? Providing that visibility to the business would accelerate the assessment and selection of the operational planning scenario that optimizes both business and financial performance. The demand forecast must be expressed both in value (dollars) and volume (number of units) to serve as a connector between finance and operations.
■ Make use of the best data sources available. Modern demand planning technology allows planners to plug in different data types with high frequency—structured and unstructured; internal and external; financial, operational, and transactional—to inform the forecasts. Adding more sources helps incorporate market trends, seasonality, and cyclicity where applicable. But there’s a caveat: Sometimes access to new data sets is difficult or restricted. For example, having access to point-of-sale data is key for the consumer goods industry. Yet most of that data is proprietary to retailers and wholesalers, who may not be eager to share it. In this case, using weather data to forecast demand of certain consumer products can help improve demand forecast accuracy.
■ Incorporate more demand forecast modeling capabilities. A one-model-fits-all type of system for demand forecasting has never been a good approach—and certainly not when the demand plan must cover a variety of product types and categories, disparate markets, and demand segments. Thus, a good demand planning practice is to constantly look for alternative models to plan for the same product and expand the toolbox with techniques and technologies that broaden the spectrum of modeling methods.
A good example is a supermarket chain changing its mix of dairy products. Rather than relying on historical forecasting models, the chain should test other models, especially when new products are placed on the shelves, new data sets become available, new promotions are run, or factors that influence consumer behaviors change. That doesn't mean statistical models have an expiration date. Rather, the more models tested on a specific demand set (product, price, data, etc.), the better the chances that errors can be contained or reduced as market conditions change (a simple sketch of this kind of model comparison follows this list).
■ Treat one version of the numbers as nonnegotiable. Having one version of the numbers is hard, no doubt, because organizations are broken down into multiple functions and business lines, and the systems at hand often produce only limited and incomplete plans. But it shouldn't be this way. Software solutions exist today that can unify all the planning and forecasting activities within one platform and with one data model. Utilizing a data-first approach to planning should be the first step to consider for an organization looking to reach consensus demand agreements and reduce bias.
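To make that model-testing advice concrete, here is a minimal, hypothetical sketch in Python (the article doesn't prescribe a specific tool or library). It compares a naive forecast, a moving average, and simple exponential smoothing on an assumed two-year demand history and keeps the method with the lowest mean absolute percentage error (MAPE) on a holdout period; the demand figures and parameters are illustrative only.

```python
# Hypothetical sketch: compare simple forecasting methods on a demand series
# and keep the one with the lowest holdout error. Illustrative numbers only.

demand = [120, 132, 101, 134, 90, 230, 210, 266, 245, 255, 270, 299,   # year 1
          115, 125, 105, 140, 95, 240, 220, 275, 250, 262, 280, 310]   # year 2

train, holdout = demand[:-6], demand[-6:]

def naive(history, horizon):
    # Repeat the last observed value.
    return [history[-1]] * horizon

def moving_average(history, horizon, window=3):
    # Repeat the average of the last `window` observations.
    avg = sum(history[-window:]) / window
    return [avg] * horizon

def exp_smoothing(history, horizon, alpha=0.3):
    # Simple exponential smoothing; the last smoothed level is the forecast.
    level = history[0]
    for actual in history[1:]:
        level = alpha * actual + (1 - alpha) * level
    return [level] * horizon

def mape(actuals, forecasts):
    # Mean absolute percentage error over the holdout period.
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

methods = {"naive": naive, "moving average": moving_average, "exp. smoothing": exp_smoothing}
errors = {name: mape(holdout, fn(train, len(holdout))) for name, fn in methods.items()}

for name, err in sorted(errors.items(), key=lambda kv: kv[1]):
    print(f"{name:>15}: MAPE {err:5.1f}%")
print("best method:", min(errors, key=errors.get))
```

In practice, planners would use a dedicated forecasting library and a rolling backtest, but the principle is the same: keep testing candidate models against held-out demand and let the error metric decide.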
Where to Start
Improving demand planning is a tactical starting point for leaders who are interested in unifying financial and operational planning, and a specific way to deliver a positive impact on their business. Instead of marginalizing such planning because current market dynamics may increase errors, organizations should view demand planning as a key mechanism for optimizing business performance. Getting leadership sponsorship is critical. And pursuing a data-driven approach that can produce demand planning models connected to financial objectives can considerably improve results.
A data-first approach provides a unique and unquestionable view of the numbers. In turn, that view will serve as the glue to connect and keep all functions and business lines rowing in the same direction. And the benefit comes in the form of saving time in costly organizational design and culture changes that sometimes don’t last very long.
Business and finance leaders who want to take advantage of new opportunities in uncertain market conditions should consider effective demand planning as a critical capability to improve financial performance. A simple question helps clarify the benefits: If demand planning errors are reduced, how much would that imply in terms of the following?
■ $M savings from lower inventory or obsolete stock write-offs.
■ $M cost avoidance by reducing last-minute decisions in purchasing, freight, and manufacturing.
■ % market share gained due to timely and complete product availability.
The way forward to realizing those benefits starts with CFOs and finance teams partnering with the demand planning organization. SF

INTEGRATED THINKING FOR SUSTAINABLE BUSINESS MANAGEMENT

BY BRIGITTE DE GRAAFF, CMA, CSCA, AND PAUL E. JURAS, PH.D., CMA, CSCA, CPA
The way in which organizations create value has evolved. Historically, the goal was to create wealth for investors. But that focus is expanding to include value for people, society, and the environment. The increased focus on sustainability and environmental, social, and governance (ESG) issues shows that society is starting to question the basic reason for a business's existence.
The growing emphasis on and interest in ESG reporting standards by such organizations as the International Sustainability Standards Board and the U.S. Securities & Exchange Commission (SEC) mean that corporate reporting characterized by a focus on financial performance and a lack of information on corporate strategy and nonfinancial performance is becoming less fit for the purpose of adequately informing stakeholders. We’re seeing growing pressure and regulation to require more and different information to improve transparency. With this movement, the holistic perspective of the organization’s relationship to the external environment is becoming increasingly important.
From a governance perspective, a company’s board of directors isn’t representing shareholders if it fails to consider the impact of social and environmental factors as well as the (economic) tangibles and intangibles that contribute to the ability to sustain the enterprise. A board that relies on financial data alone will be missing key elements needed to carry out its responsibility. Likewise, the management accountant who fails to identify the factors contributing to the sustainability of the organization isn’t providing management with a full picture of both the organization’s value and the breadth of risks that need to be addressed in maintaining and enhancing that value. Applying integrated thinking can help in both these instances, helping to move an organization to sustainable business management.
Moving Beyond an External Focus
For many companies, the movement toward greater transparency around ESG issues and sustainable business management begins with integrated reporting. Integrated reporting is the presentation of an organization’s performance that integrates financial and other information related to sustainable value creation. A benefit of integrated reporting is connecting the company’s mission; corporate governance; and its financial, social, and environmental performance to help internal and external users make better decisions and focus on long-term value creation, all while providing greater transparency to external users so they can better evaluate the actual operation and performance of the organization in the short, middle, and long term.
Consider what the company Intel has done (bit.ly/3vPcNe3). Beginning with fiscal 2018, Intel directly incorporated a discussion of the six capitals of the Integrated Reporting <IR> Framework—financial, manufactured, natural, human, intellectual, and social and relationship—into its Form 10-K. The company devotes a page to each capital, discussing its strategic importance and use.
Integrated reporting doesn’t have to follow the <IR> Framework. The lack of standardization can be confusing and challenging. According to The Reporting Exchange, there are more than 2,200 ESG reporting provisions by regulators across more than 70 countries, more than 1,400 key ESG indicators, and more than 1,100 organizations involved in the development of ESG frameworks and initiatives. A company might use multiple frameworks and standards and assemble multiple reports.
While the reporting frameworks and metrics vary, what most have in common is an external focus—both in reporting externally and in addressing external users of information. A common underlying issue among these frameworks is that they ignore the internal focus.
Yet incorporating an internal focus is important because ESG gets to the heart of why an organization is in business, its impact on the world, how it aligns its business model with the needs of society, what is reported, and how it engages with its people and its stakeholders in general. Simply being reactive to the ESG transformation creates the risk that the organization adapts old value-creation models that can’t meet the concerns of its stakeholders and the organization’s long-term needs. It’s also likely the organization will fail to identify and manage material risks and find itself out of step with its stakeholders.
So, while compliance and controls are important issues, there must be other reasons that integrated reporting and the movement to report ESG factors alongside and connected with traditional financial measures is growing. Taking the discussion out of the external reporting sphere, what’s in it for business? What are the benefits to the company for implementing processes that meet these new reporting expectations? And more importantly, how does it impact organizations?
Ultimately, these questions point to sustainable business management. IMA® (Institute of Management Accountants) describes sustainable business management as “operating in a way that recognizes that resources are limited and valuable.” It requires managing resources “in a way that sustains and builds value for all stakeholders that contribute to an organization” (bit.ly/2ZrlL4n).
A Bridge between Reporting and Management
Sustainable value creation takes more than reimagining the reporting function. It also means potentially reenvisioning strategy and a resultant business transformation. That brings us to the need for integrated thinking. Integrated thinking is about identifying, executing, and monitoring business decisions and strategies for long-term value creation (see “What Is Integrated Thinking?”).
In order to move integrated thinking forward within an organization, it’s important to improve the visibility of the challenges and opportunities by linking different stakeholders across the company into the conversation. One of the most common pitfalls that can trip up even the most sophisticated and well-prepared organizations is when important dialogue occurs in silos. True leaders and strategic management teams understand that involving as many participants from different departments as possible creates a more diverse team, better solutions, and more robust
use cases. By embracing integrated thinking, these siloed pitfalls can be avoided and ultimately bridge integrated reporting to sustainable business management.
This also means that blindly following reporting guidelines isn't necessarily in line with the principles of ESG, sustainable business management, or integrated reporting. As Gerald Ratigan explained in "The Ethics of ESG" (Strategic Finance, April 2022, bit.ly/3VXlTzS), there's also a responsibility to create credibility when reporting on ESG issues. This means taking a step back and viewing the organization holistically to determine not only how the information should be reported (including the elements related to its reliability) but also what should be reported. And since we're broadening the scope of stakeholders to include internal consumers (management) of the information, it will be important to be able to connect the various silos that might exist into one integrated view.
A keyword describing the process taking place with integrated thinking is “connectivity,” which was one of the guiding principles of the original <IR> Framework. Connectivity is what allows silos to be broken down. This increased connectivity between the different silos shows that value creation, by definition, is multidimensional and thus requires integrated thinking. This means that it affects decision making by management. Table 1 contains several examples of integrated thinking in action. Although each of these case companies shows a different path or focus to integrated thinking, the core message is the same: One can’t focus on just the financial impact anymore, nor can an organization consider the different capitals separately. The interconnectivity between the capitals, their trade-offs, and the overall value creation of the organization is undeniably linked. As these use cases show, integrated thinking connects integrated reporting, which fulfills the external stakeholders’ needs, and sustainable business management, which has management as one of its most important stakeholders.
Integrated thinking can be accomplished by developing accounting and reporting practices that go deep into the organization’s operations to identify the points of integration that occur as the business model unfolds, but it requires understanding, measuring, and connecting a comprehensive set of relationships and performances to the organization’s purpose or mission. It’s the need to understand the connections across the multiple drivers of the value chain that puts management accounting professionals at the heart of value-creation efforts and affords them an opportunity to exert greater influence.
Help Steer the Ship
Sustainable business management requires relevant and reliable data. Digital transformation involves ever-larger amounts of information and data flowing in and out of organizations. In such an environment, it can be argued that information and the ability to effectively leverage it form the core of new competitive advantages in the global business landscape. The challenge is for management teams and boards to identify issues that impact the sustainable value of the company and then develop metrics that will lead to better risk management and performance.
This creates an opportunity for management accountants. Consider the data used when sailing a ship. Directional readings come from a compass, while latitude or longitude data comes from a sextant. It's data from tools. But that data alone isn't enough to steer a ship to its intended destination. The data must be interpreted and combined with knowledge about the winds, currents, and predicted weather. As management accountants develop their skills and build their repertoire of strategic business competencies, they can share their holistic view of business and the interrelation between financial and strategic business decisions to communicate the importance of creating and establishing a value-creating, data-driven organization. In other words, they can help steer the ship.

WHAT IS INTEGRATED THINKING?
Integrated thinking starts at the core of the organization—its business model—and builds on the need to reconcile competitiveness and sustainable growth within that context to take advantage of the opportunities and face the challenges of the market. Integrated thinking means:
■ Identifying opportunities
■ Identifying and quantifying the resources needed
■ Evaluating options
■ Making decisions with an integrated, holistic perspective
■ Creating a plan once an option is selected
■ Executing the plan
■ Measuring the results
■ Being adaptive
Integrated thinking can be accomplished by developing accounting and reporting practices that go deep into the organization's operations to identify the points of integration that occur as the business model unfolds. But it also requires understanding, measuring, and connecting a comprehensive set of relationships and performances to the organization's purpose or mission. This need to understand the connections across the multiple drivers of the value chain puts management accounting professionals at the heart of value-creation efforts.

TABLE 1: USE CASES OF INTEGRATED THINKING

Solvay
Company leaders of the chemical company used to think value created for customers created value for shareholders through discounted cash flow and return on invested capital. Yet they noticed that the value perceived by customers didn't align with that of the shareholders because of the negative impacts that the industry created elsewhere in the value chain.
Integrated thinking broadened leaders' idea of value creation from their own, narrow perspective to a broader, integrated perspective in which impacts throughout the whole value chain are incorporated. Incorporating these broader issues in management practices changed their strategy and key performance indicators, as well as the way relationships between certain financial and nonfinancial factors were measured.

Schiphol
Value for an airport is multidimensional. Schiphol, a Dutch airport, has the strategic goals of delivering quality of life, quality of network destinations, and quality of services. The trade-offs are natural to this business, where, for example, more air-traffic movements and a higher density of the direct destination network may act as a key driver to the Dutch economy through the transport links created but conflict with the environment (e.g., pollution) and immediate community (e.g., noise).
The result of integrated thinking is the explicit incorporation of eight performance indicators into its strategy, decision making, and financial remuneration of staff. Notably, only one metric is focused on shareholders (and financial value).

BASF
Management of this chemical company makes long-term investments in production sites, which operate for 30 years. A sustainable business management perspective requires the company to think in the long term, taking into account different factors such as climate risks, sociological changes, and environmental challenges.
Following an integrated thinking approach, investment decisions are based not only on financial business cases but include multicapital aspects such as the increased risk of severe weather events, potential fluctuations in water availability, and changing demographics.

ING
Through integrated thinking, financial services company ING found a way to better deploy its strategy. Shifting the focus of integrated reporting beyond solely external stakeholders helped ING embark on an integrated thinking journey to better understand its value-creation process. Through that process, ING uncovered that one of its biggest assets to drive its strategy is its people.
ING has many stakeholders that are equally important, but certain trade-offs in decision making may give the impression that some stakeholders are more important than others. Employees now integrate the corporate purpose into their work and stakeholder engagement, leading to better collaboration between departments that historically didn't connect. Employees are more able to deal with trade-offs and better understand the impact that these trade-offs and other decisions have on stakeholders. Therefore, the value of each stakeholder has become much more visible in the total value creation of the organization.

Standard Bank
Management of this large South African financial institution expanded performance evaluation beyond financial measures to also consider social, economic, and environmental value drivers. Internal reporting using these value drivers became the standard for remuneration reports as well as performance tracking for all employees across the bank. Only the combination of the value creation of the different capitals makes it possible to see where value can be created and when value is being destroyed.

GETTING STARTED
Management accountants can begin to implement integrated thinking into their daily practices. This includes best practices such as:
■ State what sustainable business management means for your company and how it enhances the potential of value creation.
■ Make sure top management is on board, as key sustainable business management decisions also need to be made by the board of directors.
■ Identify the relevant ESG drivers that can be impacted by the company.
■ Integrate these key ESG drivers into the company's strategy.
■ Communicate clearly about the value and importance of ESG drivers for the organization.
■ Implement key ESG drivers in objectives, targets, and goals on multiple levels in the organization.
■ Create ownership throughout the organization for the performance of ESG drivers.
■ Incorporate ESG drivers in the standard planning and control system.
■ Merge ESG drivers into decision-making processes.
■ Include ESG performance in regular external and internal reporting.
In August 2022, the International Financial Reporting Standards Foundation released version 1.0 of its Integrated Thinking Principles (bit.ly/3ilbTmz). Among the items addressed within these principles are the need for risk and opportunity assessment, performance measurement, governance, assurance, and ethical behavior. These concepts also map directly to the IMA Management Accounting Competency Framework, indicating that management accountants can lead the search for comprehensive performance by suggesting pragmatic solutions to monitor, improve, and communicate the ways in which such an inclusive business purpose may be converted into added value for stakeholders (bit.ly/3GT6M6z).
Within the finance organization, management accountants can act as designers and enablers of an integrated process of thinking, measuring, and reporting that facilitates conversations and fosters the development of innovative solutions that are characterized by multiple backgrounds and points of view. See “Getting Started” for some examples of the actions that management accountants can take.
Be an Integrated Thinker
Value-creation reporting is now here. While most corporations aren’t yet required to provide sustainability reporting, the information is still worth providing. It’s critical to help inform decisions for a wide range of stakeholders, ranging from employees to policy makers, and from customers to
investors—and it must be provided in a way that’s useful to consumers of the information. Regular communication with all functional areas of an organization, combined with keen understanding of business processes, information systems, and IT governance, puts accounting and finance professionals in a unique position to advise about these issues.
To be successful, however, you must have more than awareness of the concepts needed to become an integrated thinker. To be a bridge builder to help your organization move from integrated reporting to sustainable business management, you first need to build the critical skills and knowledge to contribute to the business processes, understand their underlying information systems, and be well-versed in identifying and evaluating risks and controls. SF
Brigitte de Graaff, CMA, CSCA, is a Ph.D. candidate and lecturer at the Vrije Universiteit Amsterdam. She's also chair of the IMA Sustainable Business Management Global Task Force and a member of the IMA Global Board of Directors. She can be reached at b.c.de.graaff@vu.nl.
Paul E. Juras, Ph.D., CMA, CSCA, CPA, is the Jefferson Vander Wolk Chair of Management Accounting and Operational Performance at Babson College. He’s a member of the IMA Sustainable Business Management Global Task Force and a former Chair of the IMA Global Board of Directors. Follow him on LinkedIn, bit.ly/2YHHDo1, or Twitter, @pauljuras

According to a 2020 survey of 600 senior executives conducted by Harvard Business Review, 55% of organizations agreed that data analytics for decision making is extremely important and 92% asserted data analytics for decision making will be even more important in two years (bit.ly/3PBRENs). Organizations that strategically deploy tools across their finance and accounting functions have an opportunity to better structure manual processes into more stable, accurate, repeatable, and readily auditable procedures (see Gregory Kogan, Nathan Myers, Daniel J. Gaydon, and Douglas M. Boyle, “Advancing Digital Transformation,” Strategic Finance, December 2021, bit.ly/3V45v0h). Thus, financial decision makers are increasingly expected to engage data analytics to enhance decision making and create more efficient processes.
An introduction to the basics of data analytics is the first step for those wanting to deploy specific technologies, create efficiencies in business decision making, and stay current with evolving technologies. There are many analytical methods (e.g., clustering, classification, and regression). While each has specific data requirements, they all share some common activities, such as understanding the business problem being analyzed, identifying data sources, and understanding basic features about the data. Learning the fundamental context of data analytics leads to success in the implementation of specific technologies.

On the other hand, a lack of understanding of data analytics and its value can result in organizational resistance to change and cause organizations to fall behind the evolving market. In "Will We Ever Give Up Our Beloved Excel?" (Management Accounting Quarterly, Winter 2020, bit.ly/3tRIAL7), Jennifer Riley, Kimberly Swanson Church, and Pamela J. Schmidt addressed Microsoft Excel's inability to hold the magnitude of data now accessible through data analytics and examined the resistance to changing technologies. The results of their study confirmed that respondents recognize the value of switching to data analytics tools regardless of the costs. Additionally, they found that the costs of switching increase resistance to change, while perceived value decreases resistance to such a change. Educating professionals on the basic concepts of data analytics could help them understand the value and decrease the barriers to implementation.
This is where understanding the phases of Cross-Industry Standard Process for Data Mining (CRISP-DM) can assist. Developed by a team of experienced data mining engineers in the late 1990s, CRISP-DM has been the most widely used data analytics method for more than 20 years. Its development was driven by the desire to establish a universally accepted data mining methodology, and it provides a fundamental understanding of data analytics. With this foundation, management accounting and finance professionals will be better able to approach more complex data analytics implementations and better understand the latest developments in specific technologies.
Data analytics projects work best when a systematic and repeatable process is followed for transforming raw data into actionable information. Following the CRISP-DM framework (see Figure 1) is like using checklists to ensure all required activities have been performed. CRISP-DM is made up of six iterative phases (see Table 1):
1. Business understanding
2. Data understanding
3. Data preparation
4. Modeling
5. Evaluation
6. Deployment/communication
It’s very important to assign responsibilities to team members (e.g., business analysts establish requirements, IT specialists gather required data, and data scientists develop and test advanced modeling techniques) with the requisite skills that vary among the phases. Let’s take a look at each phase.
FIGURE 1: THE CRISP-DM MODEL
TABLE 1: CRISP-DM PHASES

Business understanding
Responsibility: Business analysts; key stakeholders
Activities: Determine business objectives; assess situation; determine data analysis goals; produce project plan

Data understanding
Responsibility: Business analysts; data analysts
Activities: Collect data (internal and external); perform exploratory data analysis on the data; create documentation (e.g., data dictionary); verify data quality

Data preparation
Responsibility: Data analysts; data scientists
Activities: Select data fields; edit data (identify and document errors); clean data; create new data columns if needed (e.g., return on equity from net income and equity balances); integrate data from the various data sources; format data consistently (e.g., all dates in MM/DD/YYYY format)

Modeling
Responsibility: Data scientists
Activities: Select modeling techniques (e.g., regression and classification); create separate data sets for model training, testing, and validation; build models; evaluate each model

Evaluation
Responsibility: Data scientists; business analysts
Activities: Evaluate all models; select best model

Deployment/communication
Responsibility: Business analysts; operations/IT team
Activities: Plan deployment; produce final report, dashboard, etc.; review project and document key findings

Source: Adapted from Rüdiger Wirth and Jochen Hipp, "CRISP-DM: Towards a Standard Process Model for Data Mining," January 2000.
Phase 1: Business Understanding
The first phase of CRISP-DM is developing a solid understanding of the business problem. During this phase, business analysts meet with the client to outline the project objectives and requirements in terms that a broad audience of business users can understand. Examples of problems that data analytics can solve include determining the adequacy of loan loss reserve or effectively assessing the conformance to internal control policies.
Once the broad outline of objectives and requirements is agreed upon, the analysts translate the objective and requirements into project goals. Examples of goals include “reduce loan losses by 10% in three years by attracting more credit-worthy customers” or “reduce purchasing
fraud by 5% within one year by developing a model that can detect potentially fraudulent transactions in real time.” Goals should be tailored to organizational needs and mirror the organizational objective. After the objectives and goals are determined, the analyst develops an initial plan for achieving the goals.
Phase 2: Data Understanding
During the second phase of CRISP-DM, business and data analysts obtain a detailed understanding of the project’s data needs. This phase begins by reviewing the business questions identified in phase 1 and documenting the data requirements. Data can come from internal sources such as the enterprise resource planning (ERP) system or general ledger or from external sources like industry benchmarking statistics.
TABLE 2: COMMON DATA PREPARATION TASKS

Collecting the data
■ Where will we get the data? From internal or external sources?
■ Who is responsible for obtaining the data?

Describing the data
■ How large is the data (e.g., how many columns and rows does the file contain)?
■ What is the format of the data (structured, semi-structured, or unstructured)?
■ What types of data are in each column? Are they qualitative (nominal or ordinal) or quantitative (interval or ratio)?

Exploring the data
■ What are the "typical" values in each column? For instance, quantitative variables can use average, minimum, or maximum values. Qualitative variables look at the frequency of occurrence of a value.
■ Are there anomalies or outliers?
■ Are there missing data points? If so, is it possible to make adjustments to continue (e.g., exclude the entire record or impute a value based on values in other columns)?

Verifying the data quality
■ Is the data accurate?
■ Is the data valid (i.e., conforms to business rules; e.g., invoice date must be on or after the sales invoice date)?
Data understanding has three main objectives: determining the data format, determining the data type, and profiling the data. Typical tasks in the second phase of CRISP-DM include collecting the data; exploring the data; describing the data; and ensuring the data quality, accuracy, and validity. It's important to ask many questions about the data during each of the tasks in data understanding in order to obtain a full view of the information and to allow for better interpretation. Some sample questions that can help you better understand the data include (a short profiling sketch in code follows the list):
Collecting the data:
■ Where will we get the data? From internal or external sources?
■ Who is responsible for obtaining the data?
Describing the data:
■ How large is the data (e.g., how many columns and rows does the file contain)?
■ What is the format of the data (structured, semi-structured, or unstructured)?
■ What types of data are in each column? Are they qualitative (nominal or ordinal) or quantitative (interval or ratio)?
Exploring the data:
■ What are the “typical” values in each column? (For instance, quantitative variables can use average, minimum, or maximum values. Qualitative variables look at the frequency of occurrence of a value.)
■ Are there anomalies or outliers?
■ Are there missing data points? If so, is it possible to make adjustments to continue (e.g., exclude the entire record or impute a value based on values in other columns)?
Verifying the data quality:
■ Is the data accurate?
■ Is the data valid (i.e., conforms to business rules; e.g., invoice date must be on or after the sales invoice date)?
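As a hedged illustration of how these questions might be answered programmatically, the sketch below profiles a hypothetical invoice extract with the pandas library; the file name, column names, and business rule are assumptions, not details from the article.

```python
import pandas as pd

# Hypothetical example: profile an invoice extract to answer the data
# understanding questions above. File and column names are assumed.
df = pd.read_csv("invoices.csv", parse_dates=["invoice_date", "sales_order_date"])

# Describing the data: size and column types
print(df.shape)        # how many rows and columns?
print(df.dtypes)       # quantitative vs. qualitative columns

# Exploring the data: typical values, outliers, and missing points
print(df["amount"].describe())               # mean, min, max for a numeric column
print(df["customer_region"].value_counts())  # frequencies for a categorical column
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)]
print(len(outliers), "potential outliers")
print(df.isna().sum())                       # missing data points per column

# Verifying data quality: a simple, assumed business-rule check
invalid = df[df["invoice_date"] < df["sales_order_date"]]
print(len(invalid), "invoices dated before their sales order")
```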
One important area to consider is data structure. Data comes in several different formats, including structured, semi-structured, and unstructured data. Structured data is highly organized with well-defined data types, such as a database table. Semi-structured data has some organization but isn’t fully organized and thus isn’t ready to be inserted into a relational database. This data is unformatted or loosely formatted numbers or characters inside a field with little or no structure within the field. An example of this could be a social media post. Finally, unstructured data is data that has no uniform structure and typically isn’t text-based, for example, image or sound files. This data can be difficult to manage because it might be voluminous, difficult to catalog or index, and problematic to store.
Another crucial area of understanding is the data type. There are a number of different types, including nominal, ordinal, interval, and ratio. The type of data collected and used determines both the types of analytics that can be performed (e.g., regression for numeric data or clustering for nominal data) as well as the types of graphs that can be used to communicate results (e.g., line charts for ratio data or bar charts for nominal data).
Nominal data includes names, labels, or categories. This data can't be ranked or ordered. Ordinal data is categorical data that can be ranked or ordered regarding preference or in relation to another. Examples include hot vs. cold or good vs. bad. These two opposites can have rankings in between the two extremes, but the rankings may not be equally spaced between increments. Interval data can measure distances and spans, but there's no zero reference. Ratio data has an absolute zero in relation to the data type, and the measurement and intervals between numbers are meaningful.
In a data analytics problem, there can be a combination of different types of data (qualitative or quantitative). For instance, a survey can gather categorical information such as gender and quantitative information such as salary. Collecting diversified data allows analysts to solve different questions and provides a broader basis to analyze.
At the completion of the data understanding phase, it's very helpful to document the data sources and data descriptions. A data lineage report identifies each data element and the primary source for that data element. A data dictionary lists key items relevant to each data field (a brief illustrative example follows the list below). Some of the items that can be defined include:
■ Table name: the name of the database table/spreadsheet tab that holds the data
■ Column name: a quick identification or title for the data collected for each variable
■ Description: a short description of the data held in this column
■ Data type: identifies how the data is measured, i.e., whether it involves qualitative (nominal or ordinal) or quantitative (interval or ratio) scales
■ Unit of measurement: for numeric data, identifies the unit of measurement (e.g., U.S. dollars, euros)
■ Allowable range of values: describes the range of values as dictated by business rules (e.g., purchase order date is before date of receiving report)
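One possible, illustrative way to capture such a data dictionary in code is sketched below; the table, columns, and business rules shown are hypothetical.

```python
# Hypothetical data dictionary entries following the items above.
data_dictionary = [
    {
        "table_name": "purchase_orders",
        "column_name": "po_amount",
        "description": "Total purchase order amount before tax",
        "data_type": "quantitative (ratio)",
        "unit_of_measurement": "U.S. dollars",
        "allowable_range": "greater than 0 and at or below the approved PO limit",
    },
    {
        "table_name": "purchase_orders",
        "column_name": "po_date",
        "description": "Date the purchase order was issued",
        "data_type": "quantitative (interval)",
        "unit_of_measurement": "date (MM/DD/YYYY)",
        "allowable_range": "on or before the receiving report date",
    },
]

for entry in data_dictionary:
    print(f"{entry['table_name']}.{entry['column_name']}: {entry['description']}")
```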
Phase 3: Data Preparation
Most data sets are imperfect and need to be revised to ensure the data models are fed with consistent, high-quality data. In phase 3 data analysts and data scientists transform raw data from transactional data sources and store the data in an analytics data warehouse that will be the primary data source of the data models. Phase 3 includes selecting, cleaning, constructing, integrating (merging), and formatting the data (see Table 2).
To accomplish each of these objectives, it’s helpful to review the extract, transform, and load (ETL) process, which transforms raw data into a consistent format and loads it into a centralized data repository (e.g., data warehouse). Data scientists use the data from this data repository when they develop, test, and evaluate the various analytic models being considered (e.g., a clustering model that identifies customers who are likely to default on their loans).
FIGURE 2: ETL DIAGRAM
Figure 2 illustrates the ETL process. The extract step involves extracting the data from a transactional data source, such as ERP systems or company budgets on spreadsheets, and then storing a copy of the raw data in a data staging area. In the transform step, the data is appropriately formatted in accordance with the analytical data warehouse specifications, such as data cleaning (remove duplicates, eliminate noisy data, etc.), data transformation (normalize data, create discrete categories for numeric data, and create new data columns), and data reduction (sampling, eliminating nonessential columns from a spreadsheet, and so forth). Finally, in the load step, the transformed data is loaded from the staging area into the analytical data warehouse.
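A minimal sketch of this ETL pattern follows, using pandas and an SQLite file as a stand-in for the analytics data warehouse; the source file, column names, and cleaning rules are assumptions made for illustration.

```python
import sqlite3
import pandas as pd

# Extract: pull raw transactions from a source extract into a staging copy.
raw = pd.read_csv("erp_sales_extract.csv")    # hypothetical source file
raw.to_csv("staging_sales.csv", index=False)  # staging area copy

# Transform: clean, standardize, and enrich the staged data.
staged = pd.read_csv("staging_sales.csv")
staged = staged.drop_duplicates()                                  # data cleaning
staged["order_date"] = pd.to_datetime(staged["order_date"])        # consistent formats
staged["revenue"] = staged["quantity"] * staged["unit_price"]      # new derived column
staged = staged.drop(columns=["internal_note"], errors="ignore")   # data reduction

# Load: write the transformed data into the analytical data warehouse.
with sqlite3.connect("analytics_warehouse.db") as conn:
    staged.to_sql("sales_fact", conn, if_exists="replace", index=False)
```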
TABLE 3: DATA ANALYSIS ALGORITHMS

Anomaly detection
Description: Supervised or unsupervised models for detecting outliers
Common algorithms: Distance-based; density-based

Association analysis
Description: An unsupervised model that looks for relationships between items in a set
Common algorithms: Apriori; FP growth

Classification
Description: A supervised model that estimates a categorical variable
Common algorithms: Decision trees; k-nearest neighbors; logistic regression; naive Bayes; neural networks; support vector machines

Clustering
Description: An unsupervised model that looks for meaningful groups within a data set
Common algorithms: K-means; DBSCAN; self-organizing maps

Regression
Description: A supervised model that predicts a numerical variable
Common algorithms: Simple linear regression; multiple regression

Time series forecasting
Description: A supervised model that predicts a future value of a variable
Common algorithms: ARMA; ARIMA; exponential smoothing
All analysis performed (e.g., statistical modeling or ad hoc analysis) uses data from the data warehouse as opposed to taking it directly from transaction sources. This reduces the chance that the transaction-source databases become corrupted.
To be sure that the process is complete, it’s important to look at the five dimensions of data quality as outlined by Lorraine Fellows and Mike Fleckenstein in their 2018 book, Modern Data Strategies: accuracy, completeness, consistency, latency, and reasonableness. Accuracy is the correctness of data. Completeness ensures the data set includes all necessary elements. Consistency refers to the representation and interpretation of the data. Latency refers to the timeliness of data availability. Finally, reasonableness is the credibility, quality, and accuracy of the data set.
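A hedged sketch of how a few of these dimensions could be checked on the loaded table from the earlier ETL sketch follows; the thresholds and column names are illustrative assumptions rather than prescriptions from the book or the article.

```python
import sqlite3
import pandas as pd

# Hypothetical checks against some of the data quality dimensions, run on the
# warehouse table loaded in the ETL sketch above. Accuracy is omitted because
# it usually requires comparison against an authoritative external source.
with sqlite3.connect("analytics_warehouse.db") as conn:
    df = pd.read_sql("SELECT * FROM sales_fact", conn, parse_dates=["order_date"])

checks = {
    # Completeness: no required field should be missing.
    "completeness": df[["customer_id", "order_date", "revenue"]].notna().all().all(),
    # Consistency: revenue should equal quantity times unit price.
    "consistency": (df["revenue"] - df["quantity"] * df["unit_price"]).abs().lt(0.01).all(),
    # Reasonableness: quantities should fall within a credible range.
    "reasonableness": df["quantity"].between(1, 10_000).all(),
    # Latency: the newest record should be recent enough to act on.
    "latency": (pd.Timestamp.today() - df["order_date"].max()).days <= 7,
}

for dimension, passed in checks.items():
    print(f"{dimension:>15}: {'OK' if passed else 'REVIEW'}")
```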
Phase 4: Modeling
Phase 4 involves experimenting with many analytics models, such as regression or decision trees, to identify interesting patterns in the data. This phase includes selecting and applying appropriate modeling techniques to generate training and test data sets, building various models, and assessing the model. There are four data analysis types:
1. Descriptive analysis provides information on what happened. An example would be "What have been our historical trends in lost customers?"
2. Diagnostic analysis provides information on why something happened, answering questions like “Why have customer losses been increasing?”
3. Predictive analysis provides information on what’s likely to happen, such as “Which customers are likely to leave?”
4. Prescriptive analysis provides information on how processes and systems can be optimized. It answers questions like “Which customer should we contact to increase the chance of retaining them?”
The most appropriate type of data analysis will be determined by the business problem the company is aiming to solve. Table 3 outlines the most common data analysis tasks and algorithms used to accomplish the data analysis task.
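As a concrete but hypothetical illustration of the modeling phase, the sketch below builds two of the classification algorithms listed in Table 3 on synthetic data using scikit-learn (an assumed tool choice; the article doesn't name a library) and holds out a separate test set, as the phase prescribes.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a business data set, e.g., customer churn flags.
X, y = make_classification(n_samples=1_000, n_features=8, random_state=42)

# Create separate data sets for training and testing, per the modeling phase.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Build candidate models drawn from the classification row of Table 3.
candidates = {
    "logistic regression": LogisticRegression(max_iter=1_000),
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=42),
}

for name, model in candidates.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy {model.score(X_test, y_test):.3f}")
```

Comparing these candidates and deciding which, if any, to deploy is the work of phase 5.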
Phase 5: Evaluation
During the fifth phase, the data scientists and business analysts compare the results of the different models developed during phase 4. The typical tasks for phase 5 include evaluating model results, reviewing the process, and determining the next steps. When evaluating a model, the following questions need to be considered (a short baseline-comparison sketch follows the list):
■ How well does the model perform? Are the results better than random guessing?
■ Do the results of the model make sense in the context of the business problem being analyzed?
■ Is this the simplest model?
■ Can the results be easily interpreted and explained?
■ Is the model cost-effective?
■ Do the results of the model make sense in the context of the problem domain?
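Continuing the hypothetical modeling example, one quick way to answer the "better than random guessing" question is to compare a candidate model against a trivial baseline; the sketch below uses scikit-learn's DummyClassifier for that purpose, an illustrative choice rather than anything named in the article.

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Rebuild the synthetic data set from the modeling sketch so this runs on its own.
X, y = make_classification(n_samples=1_000, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# "Better than random guessing?" Compare against a most-frequent-class baseline.
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)
model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

print(f"baseline accuracy: {baseline.score(X_test, y_test):.3f}")
print(f"model accuracy:    {model.score(X_test, y_test):.3f}")
print("keep for review" if model.score(X_test, y_test) > baseline.score(X_test, y_test) else "reject")
```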
10 TIPS FOR USING DATA ANALYTICS
1. Establish a tone at the top that data is a strategic asset and that data quality is paramount to decision making.
2. Don’t recreate the wheel—establish a repeatable and systematic process for analyzing data.
3. Create an analytics data warehouse. Don’t perform analytics on source data.
4. Start your new data analytics skills on small projects.
5. Document the process, including business requirements, data dictionary, and data lineage.
6. When learning a new analytical method, focus first on what the method is doing and less on how it works.
7. Ensure the model used is appropriate for your data types, e.g., regression is used to predict numeric values, while classification is used to assign nominal values.
8. Practice telling a story that complements the analytical findings.
9. Get out of your comfort zone to learn new tools (e.g., Alteryx or Tableau).
10. Embrace the wide variety of skill sets required to complete a data analytics project.
standing complex data. According to the IMA® (Institute of Management Accountants) report Data Visualization (bit .ly/3rrrC5s), data visualization can be used to provide insights in a memorable fashion. It is effective when it can convey a story, and memorable stories will make it easier for the audience to connect and remember the information being conveyed. Relatable stories lead to emotional coupling. Both the storyteller and the audience can relate to the same experience. Research shows that storytelling can engage parts of the brain that lead to action. Good storytelling allows individuals to properly review the project and understand how the data analytics process has substantially addressed the business problem.
A Solid Foundation
Using data analytics effectively starts with knowing how to ask good questions and having a strong understanding of the fundamentals of the data analytics process. CRISP-DM is the most widely used framework; it addresses business problems systematically, in a step-by-step process that runs from business problem to business solution: understanding the business problem, understanding the data, preparing the data, modeling, evaluation, and deployment.
Accounting and finance professionals who are educated in the basics of data analytics are better equipped to approach complex data analytics implementations in their organizations, better able to understand the latest developments in specific technologies, and better positioned to become leaders on the data analytics frontier of their organization. When they’re able to change the way they think about a problem, they can also change the way they approach the solution. Thus, it’s important that those responsible for these projects have a sound understanding of the phases of the data analytics process.
The more informed individuals are about data analytics, the more likely it is to be used effectively and efficiently. Individuals with a data analytics mindset can lead their organization’s data analytics projects, allowing the organization to implement effective change and to foster growth both in the business and in its application of data analytics. SF
Richard O’Hara, CMA, CFA, is a faculty specialist in the Kania School of Management at the University of Scranton. He can be reached at richard.ohara@scranton.edu.
Lisa S. Haylon, CPA, is an assistant professor in the School of Business at Southern Connecticut State University. She can be reached at haylonl1@southernct.edu.
Douglas M. Boyle, DBA, CMA, CPA, is a professor and department chair in accounting in the Kania School of Management at the University of Scranton. Doug also serves as director of the Ph.D. in accounting program. He’s a member of IMA’s Northeast Pennsylvania Chapter. You can contact Doug at (570) 941-5436 or douglas.boyle@scranton .edu
HOW
BY SETH ELLIOTT
Few would likely disagree that the position most tied to strategy execution within an organization is the CFO. While the CEO, chief strategy officer (CSO), and chief operating officer (COO) are close contenders, the CFO’s role is uniquely linked to execution across all divisions and initiatives. As a result, CFOs are more in tune with business fluctuations and better equipped to find ways to bridge the strategy execution gap.



But with the ever-evolving and critical role of the CFO in strategy execution, are there better ways for CFOs to juggle these responsibilities? How might CFOs optimize their approaches to strategy execution to avoid micromanaging yet ensure sufficient oversight?
It’s now well understood that the role of the CFO has evolved. To better articulate these changes, Deloitte generated a framework (bit.ly/3Q9tJoM) that identifies the four faces of the CFO: catalyst, strategist, steward, and operator (see Figure 1).
The CFO must now effectively fulfill the historic role of controller while simultaneously acting as the catalyst of progress. Strategy execution therefore becomes one of the key considerations of the CFO, with responsibilities including:
■ Greenlighting new initiatives and purchases, including digital transformation;

■ Monitoring progress and returns across initiatives and purchases;
■ Partnering across divisions to bridge finance and operational targets;
■ Budgeting and dynamically allocating capital to the highest value initiatives;





■ Preserving control over execution, solving problems, and protecting stakeholders’ interests;
■ Detecting and resolving threats before they escalate; and




■ Collaborating in the optimization of strategy execution to increase efficiency and reduce costs.
Balancing Control and Progress
With all those responsibilities, CFOs face a key challenge in strategy execution: the balancing act between control and progress. If sufficient consideration hasn’t been placed on how to better balance these two priorities, CFOs may find themselves on either end of the spectrum. On one hand, too much oversight and micromanagement can impede change and business improvement. On the other hand, too little can put the entire business at risk.
Too much oversight. Acting in their historic role as steward and protector of stakeholders’ interests, CFOs naturally bring a more cautionary approach to strategy execution than their executive peers. Internal changemakers and department leaders are often focused on improving the business through specific, functional objectives; given the opportunity, they can display a greater risk appetite and a near-limitless demand for resources. The CFO, in contrast, must both manage risk and deploy resources appropriately across all functions and initiatives. This balancing act is one factor that can lead to too much oversight in strategy execution.
FIGURE 2: THE QUARTERLY OKR CYCLE (before the quarter, during the quarter, and before the end of the quarter). Source: Quantive
The unique capabilities of the CFO can also lead to micromanagement. The CFO is the person most grounded in the critical metrics of the organization. As business partner for each function leader, the CFO maintains a company-wide perspective. This means that the CFO has broad and deep insight regarding the inner workings of the organization, in addition to how everything comes together to create the bigger picture. Along with addressing the historic priority as controller, the capability of the CFO is often needed when it comes to aspects of execution such as decision making and problem solving.
Too little oversight. With ever-increasing responsibilities to manage, CFOs can find themselves with less oversight than desired. This may result from a conscious decision to loosen their control function to enable risk taking, change, speed, and innovation; we saw this most recently as businesses were required to change and adapt quickly to new technologies and ways of working due to the COVID-19 pandemic. Too little oversight can also occur organically. Juggling multiple responsibilities and delivering insights of both breadth and depth across the organization means problems can easily slip through the cracks. Data leaks, financial scandals, and an increased chance of stagnating or failed initiatives can result.
Striking the balance. The balance between too much or too little oversight can be achieved through adopting the best practices of strategy execution. For example, tracking progress across multiple initiatives can be better managed with a connected data stack and the adoption of goal management methods, such as the objectives and key results (OKR) framework. A culture of transparency and trust, as opposed to control and compliance, can empower team members on the ground to solve
problems before they escalate. And better alignment across the organization, both vertically and horizontally, reduces wasted efforts from duplicated or unaligned work.
Greater Alignment through OKR
When thinking about strategy execution, it’s typical to think in terms of specific approaches. It’s clear, however, that there’s a need to rethink the “engine” that powers strategy execution—the operating model. Factors such as the lingering strategy execution gap, increased business velocity and disruption, and the wealth of data available suggest that a “modern operating model” is required to optimize strategy execution—specifically, an operating model that enables achievement of goals and outcomes faster and more effectively, in addition to navigating the threats and opportunities of a rapidly changing world.
OKR is a goal management method adopted by leading companies such as Google and Adobe. The method uses objectives as a better way to define and organize goals. Objectives are qualitative in nature—they must describe what you want to achieve, be inspirational, stretch your team’s capabilities, and have a deadline, among other things. Key results are individual elements that “metrically” quantify whether you have achieved your objectives and serve to track progress along the way. Here’s a quick OKR example:
■ Objective: Build an all-star finance team.
■ Key result 1: Hire five high-performing team members.
■ Key result 2: Every finance team member completes 30 hours of professional development.
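One way to picture OKR tracking as data is sketched below, continuing the example above; the structure, team size, and progress figures are assumptions for illustration, not a prescribed format.

from dataclasses import dataclass

@dataclass
class KeyResult:
    description: str
    target: float
    current: float

    def progress(self) -> float:
        # Progress toward the key result, capped at 100%.
        return min(self.current / self.target, 1.0)

objective = "Build an all-star finance team"
key_results = [
    KeyResult("Hire five high-performing team members", target=5, current=3),
    KeyResult("Complete 30 hours of professional development per team member",
              target=30 * 8, current=140),  # assumes a team of eight
]

# A simple roll-up: objective progress as the average of its key results.
overall = sum(kr.progress() for kr in key_results) / len(key_results)
print(f"{objective}: {overall:.0%} complete")
for kr in key_results:
    print(f"  - {kr.description}: {kr.progress():.0%}")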
FIGURE 3: THE FOUR STEPS OF OBSERVABILITY




OKR typically functions around quarterly cycles, but annual and alternative cycles (such as six-week) may be used depending on the needs of a team or organization. OKR cycles are a core part of the methodology, as emphasis is placed on learning feedback loops and adjusting course based on collected data. Figure 2 provides an overview of what a quarterly OKR cycle looks like.
OKR is meant to be applied across an entire organization, with the overall goal to better connect strategic imperatives to execution realities. There are numerous benefits to OKR, but the most important is greater alignment. Often, the executive level generates excellent strategies that fail due to incorrect, ineffective, or unaligned implementation. In fact, faltering strategy execution is the top risk identified by CFOs, according to Deloitte (bit.ly/3VJyczP).
There are many contributing factors, but one of the most critical is often lack of alignment. Research from Gartner highlights that as much as 67% of integral functions aren’t aligned with corporate strategy (Jackie Wiles, “The 5 Pillars of Strategy Execution,” bit.ly/3WIw3Ws).
So how do OKR and greater alignment benefit the CFO’s priorities?

Vertical alignment. Top-down, or hierarchical, alignment describes how well the strategic thinking of the executive level is communicated and executed throughout the organization. This alignment should ideally extend to the far edges of the organization—right down to the most junior employees.

When applied correctly, OKR enables vertical alignment through greater clarity and transparency of strategic objectives. This ensures that the entire organization works toward mission-critical objectives, which reduces wasted work and the costs from misguided initiatives. It also increases efficiency gains at the local level by acting as a north star for decision making—for instance, when deciding which technology purchases are necessary.
Horizontal alignment. Alignment across the organization can be described as horizontal—how different functions, departments, networks, and even the broader company ecosystem work together toward company objectives. This allows for the interdependencies, capabilities, and bandwidth of the organization to be optimized in a collective, evolving way. Horizontal alignment leads to less duplication of work, better collaboration toward strategic objectives, and a more efficient organization overall.
Together, better vertical and horizontal alignment means less oversight is needed from the CFO in strategy execution. As all members of the organization can effectively connect their work to each other and the broader objectives, there’s less concern that work, initiatives, and purchases will go off track.
Board alignment. A unique benefit of OKR for the CFO is better board alignment. Short-term thinking is the standard in our modern business climate, which undoubtedly has an impact on strategy execution. Long-term initiatives are often hindered by short-term priorities. To combat this, OKR can be used in addition to other connected data to bolster support for long-term initiatives by telling a more complete story of organizational progress. In turn, there should be reduced pressure to fulfill only short-term priorities as positive business momentum with a holistic view can be presented to the board.
Detecting and Resolving Threats
Detecting and resolving threats before they escalate is a key aspect of protecting the business and stakeholders’ interests. This can be achieved through investing in the practice and systems of business observability. This primarily involves monitoring the key performance indicators (KPIs) of the business to ensure business continuity, but it also aids decision making for process improvement, adaptability, and innovation. Observability can be summarized broadly in four steps:
1. Collecting data about the business from an integrated technology stack;
2. Detecting anomalies through the monitoring of the data;
3. Inspecting and analyzing the factors that are creating the unexpected change; and
4. Responding quickly to resolve a threat. (See Figure 3.)
Obvious examples of threats would be critical IT infrastructure going down or a cybersecurity breach. But threats can extend well beyond business systems, depending on the connectedness of your data stack and observability approach. KPIs from all aspects of the business can be collected and analyzed to detect vulnerabilities across the board. For instance, data can be analyzed regarding employee retention through software that collects engagement metrics.
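As a rough illustration of the first two observability steps (collecting KPI data and detecting anomalies), the sketch below flags unusual readings in a hypothetical weekly engagement score using a simple z-score rule; the figures and threshold are invented.

from statistics import mean, stdev

# Hypothetical KPI feed: weekly employee engagement scores.
weekly_engagement = [72, 74, 71, 73, 75, 74, 58, 73]

avg = mean(weekly_engagement)
sd = stdev(weekly_engagement)

for week, score in enumerate(weekly_engagement, start=1):
    z = (score - avg) / sd
    if abs(z) > 2:  # crude threshold; real systems tune this per KPI
        print(f"Week {week}: score {score} is anomalous (z = {z:.1f}) "
              "-- inspect and respond before it escalates")

In practice, an observability platform applies this kind of monitoring continuously and across many KPIs at once, rather than one series at a time.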
Although the benefits of resolving threats are clear to the CFO, a concentrated effort to master the practice and systems of observability can greatly improve the results. For many businesses, so much data is produced each day that it’s impossible to collect and process it manually, which makes consistently identifying and resolving threats in real time impractical without automation. Speed is also a factor—a critical threat may be identified by a team member only after the damage has already been done.
Modern technologies and approaches to observability solve these problems. AI-driven prescriptive analytics and connecting data from across the organization create a way for anyone in the organization to better respond to threats—ideally well before they escalate. In turn, the goal of protecting the business without too much oversight is achieved.
Combining OKR and KPIs
A modern operating model is data-driven, which makes tracking the progress of OKR and the observability of KPIs critical components. Working together, this enables the creation of a more robust picture of what’s happening in the organization. This unlocks many benefits including:
■ Empowered decision making using the abundance of data available. This leads to better certainty in execution regarding adjustments or opportunities to innovate.
■ Monitoring and dynamically allocating resources to the most promising initiatives (see Ariel Babcock, Sarah
Keohane Williamson, and Tim Koller, “How executives can help sustain value creation for the long term,” McKinsey & Co., July 22, 2021, mck.co/3i7jchm). This, in turn, reduces sunk costs in flailing or failed initiatives.
■ Visibility and management of multiple initiatives. Modern organizations have a lot to contend with simultaneously. Environmental, social, and corporate governance compliance; digital transformation; and becoming more competitive are but a few examples. A data-driven system is required to better manage and optimize alignment and resources across divisions and initiatives.
Creating a Resilient Organization
An ideal end state for the CFO’s priorities in strategy execution would be for the company to evolve into a resilient organization. This refers to the ability of an organization to withstand shocks and crises. Taken one step further, an organization with peak resilience may become antifragile, a concept developed by Nassim Nicholas Taleb that describes the ability not only to endure shocks and crises but to become stronger in response. Crises can be internal,
such as the loss of critical personnel, or external, such as supply chain disruptions. Crises may also vary in timelines, from abrupt events to lingering challenges like a shrinking pipeline of opportunities.
An organization with a strong market position will naturally have a level of resilience due to its position. Large cash reserves, brand equity, and strong relationships will always play a role in responding to crises. But a strong market position alone isn’t enough to be resilient, as evidenced by major retailers filing for bankruptcy during the recent pandemic (bit.ly/3ZknByv). To truly become resilient requires making fundamental changes to how the organization works. This means looking at components of the operating model such as company structure, culture, and the overall guiding principles of strategy execution.
Embracing constant transformation. As resilience occurs in response to changes, adaptiveness and evolution must be a feature of the operating model. Change should be baked into the DNA of the company, as opposed to reacting solely to specific events or challenges.
One way to enable this is through creating a culture of organizational learning. The creation of knowledge, insights, and wisdom shouldn’t be limited to only academic institutions. Businesses increasingly need to be at the forefront of knowledge, both in their domain and the broader world, in order to adapt and grow.
The starting point to actualize this requires collecting raw data and information. KPIs from a connected data stack and the information provided from OKR tracking and retrospectives are great mechanisms for this purpose. Thereafter a combination of AI and human analysis can produce knowledge, insights, and wisdom to change, improve, and adapt the organization. The embedded system and culture of learning contribute to a constantly evolving organization that can better survive a fast-changing world.
To better enable change, revisiting organizational structure is also key. It’s time to reevaluate traditional approaches to work to foster greater flexibility and responsiveness to change. Teams and individuals shouldn’t be confined to one project, initiative, or even department, but instead should be dynamically deployed across the organization as needed. New ways of working, including hybrid work and the flexible workforce (gig workers and freelancers), can also better support constant transformation.
Empower the workforce. Shocks, crises, and a rapidly changing world mean there’s no longer time to go up and down the chain of command for decision making. An organization can never truly be resilient unless the individuals within are empowered to act to resolve situations before they escalate to the point of peril. In addition, they should have the capacity to innovate and incrementally improve the organization, even at the furthest edge of the network. One possible route to enact this is through adopting an operating model that allows a distributed (as opposed to hierarchical) power structure.
This may not have been possible in the past, but the abundance of data now available creates an opportunity to empower the teams and individuals closer to the challenges at hand to make decisions. This enables greater speed since there’s no longer a need to deal with layers of bureaucracy. More accurate and reliable decisions can also be made as people armed with the details of execution are more likely to have the right answer to issues. Combined with greater alignment to strategic objectives, employees will be in a strong position to act autonomously to improve the organization without extra oversight.
Of course, the approach described here relies on a completely new paradigm of leadership—from control and compliance to trust and empowerment. Transitioning to this new way of thinking can be difficult, but the benefits of doing so mean greater resilience, optimization across the organization, and less oversight needed from the CFO.
Overall, the role of the CFO in today’s business world requires adopting best practices for strategy execution. The key challenge of balancing the traditional role of control with the emerging role of catalyst of progress is best managed through approaches such as building alignment, investing in observability, and creating a resilient organization. Through modern approaches to strategy execution, CFOs can reduce the oversight needed across all parts of the business—all the while ensuring risk is managed, progress toward objectives is made, and the organization is continuously improving. SF


MAX MINUS MIN IN A PIVOT TABLE

Using an Excel pivot table, it’s easy to find the average, maximum, or minimum price for each product. But what if you wanted to calculate the maximum minus the minimum for each product? With a typical pivot table, the calculation fails because Excel calculates MAX minus MIN on a row-by-row basis in the original data.
BY BILL JELEN
Here’s a very simple example of why this calculation fails. Let’s say you’re tracking sales of fruit. Your data has four columns showing product, quantity, price for each item, and the total sale. Currently, there are exactly two rows of data. In row 2, you sold 1 case of bananas for $20. In row 3, you sold 100 cases of bananas for $8 each. The MIN price is $8, the MAX price is $20, and the delta between MIN and MAX is $12. But a regular pivot table would calculate the delta for row 2 as $20 ‒ $20, or $0. It would then calculate the delta for row 3 as $8 ‒ $8. The total delta for the data set will be $0.
There’s a powerful but obscure alternate formula language for pivot tables called DAX. To use it, you must choose to base your pivot table on the Data Model while creating the pivot table. The advantage of DAX is that the calculation happens once for each row in the final pivot table. Using DAX, Excel will calculate the MIN for all banana rows as $8, then calculate the MAX for all banana rows as $20. After those calculations are complete, DAX will calculate $20 ‒ $8 and come up with the accurate delta of $12. (Currently, the Data Model is only available in Excel for Windows. It isn’t yet working in Excel Online, Excel for Android, Excel for iOS, or Excel for Mac.)
USING THE DATA MODEL
To use the Data Model for your pivot table, select one cell in your detailed data. From the Insert tab, choose Pivot Table, From Table or Range to open the Pivot Table dialog box. At the bottom of that dialog is a checkbox for “Add this data to the Data Model.” Choose this box and then click OK.
Build the pivot table as normal. Drag Product to the Rows area. Drag the Each price to the Values area twice. Double-click the heading cell for the first Sum of Each to open the Value Field Settings dialog. In that dialog, change the “Summarize Value Field By” from Sum to Min. Click OK. Double-click the heading cell for the Sum of Each2 column. Change the calculation to Max. Change the custom name from “Max of Each2” to “Max of Each.”
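For readers who also work outside Excel, here is a rough pandas sketch of the per-product MIN, MAX, and delta that the DAX measure will produce; the two banana rows mirror the example above, and this is only an illustration, not part of the article’s Excel steps.

import pandas as pd

# The two-row banana example: product, quantity, price each, total sale.
sales = pd.DataFrame({
    "Product": ["Banana", "Banana"],
    "Qty":     [1, 100],
    "Each":    [20.0, 8.0],
    "Total":   [20.0, 800.0],
})

summary = sales.groupby("Product")["Each"].agg(["min", "max"])
summary["Delta"] = summary["max"] - summary["min"]  # 20 - 8 = 12 per product
print(summary)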
ADDING A DAX CALCULATION
Figure 1 shows the PivotTable Fields pane when the pivot table is based on the Data Model. The original data set has four fields—Product, Qty, Each, and Total—that appear indented in the Fields pane under the word “Range.” The key to creating a DAX calculated field is to right-click the word Range and then choose Add Measure.
Note that the SQL Server developers who created the DAX language use the term “Measure” instead of “Calculated Field.” For a few early years, the DAX formulas were called “Calculated Field” in Excel, but this was too confusing because legacy pivot tables also have a “Calculated Field” feature. To differentiate the better DAX formulas from the legacy formulas, Microsoft chose to use the term “Measure,” even though it isn’t meaningful for most Excel users.
The Measure dialog box starts with a Table Name of “Range.” You get to fill in the Measure name. In Figure 2, I’ve used “Delta.” The Formula box begins with an equal sign. Click after the equal sign and type a left square bracket ([), and a selection box appears. You’ll see all of the fields in the original data set plus one calculated field for each item in the Values area. Double-click [Max of Each] to insert it into the formula. Click after =[Max of Each]. Type a minus sign and type another left square bracket to open the selection box. This time, choose [Min of Each] and press Tab to insert it into the formula.
You also have the option to specify the number format as Currency with 0 decimal places. The dialog also contains a button labeled Check DAX Formula. You can click that to make sure the formula is valid.
Once you’re ready, click OK to add the new calculation to the PivotTable Fields list. It will appear with a script “fx” logo and the field name of Delta.
You will have to select the checkbox next to fx Delta in order to add the calculation to the pivot table.

For the pivot table in Figure 2, the Delta formula is only used eight times. Being able to perform a calculation on the summary numbers shown in the pivot table adds a lot of functionality that goes beyond regular pivot tables.
There are many other scenarios where a calculation has to wait until the summary numbers in the pivot table have been computed. In many of these situations, adding the data to the Data Model and then using a DAX measure will allow you to perform those calculations inside the pivot table instead of trying to perform them outside of it. SF

Bill Jelen is the host of MrExcel.com and the author of 67 books about Excel. He helped create IMA’s Excel courses on data analytics (bit.ly/2Ru2nvY) and the IMA Excel 365: Tips in Ten series of microlearning courses (bit.ly/2qDKYXV). Send questions for future articles to IMA@MrExcel.com
Figure 1
Figure 2
Microsoft touts the Data Model for merging data from multiple tables. But choosing the Data Model also unlocks many extra features in pivot tables.
TECH PRACTICES
DESIGN THINKING FOR INNOVATION
BY KRISTINE BRANDS, CMA
The speed of technological change is disrupting management accounting. Addressing this challenge calls for innovative and creative solutions to reimagine business systems and processes to create value for a competitive edge. Design thinking is a tool and a mindset that can help achieve these objectives.
The concept of design thinking problem solving has been around for decades. It’s only in the last 15 years that it’s seen widespread adoption. Tim Brown, chair and co-CEO of IDEO, a global design and consulting firm specializing in business transformation, put the methodology on the map in 2008. Brown says, “Design thinking is a human-centered approach to innovation that draws from the designer’s toolkit to integrate the needs of people, the possibilities of technology, and the requirements for business success.”
While traditional applications of design thinking focus on increasing customer value and new product development, Gary C. Biddle, professor of accounting at the University of Melbourne, views design thinking as an effective tool to support the management accountant’s role in creating organizational value. He argues that sometimes value-creation initiatives focus on fine-tuning a business model or process that never should have been done in the first place because management accountants got involved too late. Instead, he believes they should ask questions about opportunities for value creation and choose the best opportunities using design thinking.
DESIGN THINKING STEPS
Let’s apply design thinking to developing a new system that leverages technology using its collaborative tools. Design thinking follows a five-step process: empathize, define, ideate, prototype, and test (see Figure 1). The first step, empathize, focuses on
connecting to the users experiencing the problem. It requires thoroughly understanding a 360° view of the problem, process, or system needing change.
For example, a company’s basic cost accounting system isn’t integrated with its general ledger, requiring the preparation of manual entries to post monthly inventory and cost of sales transactions. At month’s end, monthly account reconciliations are prepared to balance the two systems. The system hasn’t been updated since implementation a decade ago. It isn’t integrated with other departments’ functions such as supply chain management, sales, quality, and logistics. Data and information requests from other departments are processed manually because the reporting module doesn’t interface with the general ledger. The inefficiency of the system causes errors, confusion, and frustration for users because it isn’t robust. Employee turnover is high; morale is low.
The design thinking team begins by interviewing and observing the cost accounting team’s and other departments’ interfaces to understand current processes. The emphasis is placed on the team’s stories to identify important details (i.e., inventory, quality control, and production scheduling), the current system’s issues and inefficiency (such as the lack of supply chain vendor interface), and how the current system affects the team’s workday and morale. The designers must put themselves in the users’ shoes and should never assume they understand the system based on their previous experience. The designers must start with a clean whiteboard approach.
Management accountants can apply this tool and mindset to achieve a competitive edge.
Figure 1: Design Thinking Steps (Empathize, Define, Ideate, Prototype, Test)
Define is the second step. Once an understanding of the environment and the issues has been developed in the first step, attention turns to refining a definition of the problem. Care must be taken to follow a human-centric approach. Using the previous example, it’s tempting to jump to the conclusion that new software is required. That is simply a technical solution to the problem. There may be other processes and features that need to be examined before a decision is made. As noted, other internal users such as the production and sales departments need to be consulted to identify their requirements to ensure a comprehensive system is designed. The company may also have budget limits constraining the scope of the project, requiring prioritization and project phasing.
Ideate, the third step, focuses on team brainstorming to identify creative solutions. This is the engine of design thinking, to push toward innovation and business transformation. In addition to the accounting department, all stakeholders such as production, quality, sales, marketing, and the design team need to participate. The IT department needs to be at the table to identify tools and processes that leverage the system with technology.
For example, advanced cost accounting techniques such as Just-in-Time, demand planning, and vendor supply chain interfaces are value-creation opportunities. The design team needs to research those opportunities and be prepared to demonstrate how they will create value. Business transformation must be part of the discussion because it will accelerate value creation. All team members must be open-minded about innovative and outside-the-box solutions. Using an online visualization tool (think virtual sticky notes) is an effective way to organize ideas in real time and allows greater participation for virtual teams.
Stormboard’s design thinking templates (stormboard.com) facilitate structuring and capturing ideas generated during these sessions. Once the ideas are collected, they’re organized thematically (affinity clustering). This step is iterative. Multiple brainstorming rounds are recommended to allow the team to build upon strengths and mitigate weaknesses in proposed solutions.
PROTOTYPE AND TEST
After the ideate step, a testable prototype model of the system or process is developed.
Because the design is still a work in progress, the initial model doesn’t need to be complicated. Simple works. A slide show, storyboard, outline, business process diagram, or presentation describing the system will suffice. The objective is to solicit more feedback, identify and address opportunities and concerns, and refine the model to ensure it meets organizational and users’ needs. The more effort spent tweaking the prototype, the better the design. Like the ideate step, this is an iterative process that benefits from multiple reviews and refinements.
The final step is testing the system. This can range from a working prototype to a full-blown test system of the new design. The system’s stakeholders should participate to reaffirm that their requirements are met. Their participation simulates use of the new system, provides a hands-on experience, offers a closer representation of the final system, and affords another opportunity to fill gaps in the design.
Many organizations delegate system design away from users. Management accountants live and breathe their systems and have invaluable insight and experience that should be captured in the process. A design thinking approach to system design empowers their participation and is a major step toward value creation. SF
The views expressed in this article are those of the author and don’t necessarily reflect the official policy or position of the Air Force, the Department of Defense, or the U.S. Government. Distribution A: Approved for Public Release, Distribution Unlimited. USAFA-DF-2022-908.
Kristine Brands, CMA, is an assistant professor of management at the U.S. Air Force Academy in Colorado Springs, Colo. She’s a member of IMA’s Technology Solutions and Practices Committee and IMA’s Denver Centennial Chapter. You can reach her at kmbrands@yahoo.com
The Best Investment
BY EKATERINA EMELIANOVA, CMA, CSCA, CFM
LIFE HAS A WAY OF TEACHING US LESSONS, and I hope some of the lessons I’ve learned can help others avoid the same pitfalls.
The year was 2002: I was living in Russia, working at the representative office of a large Japanese multinational. I had earned my master’s degree two years before and was considering what to do next. Thinking that more education was a wise choice, I started a Ph.D. program in finance, but I quickly discovered it wasn’t for me. It was too theoretical—I wanted something more practical. What I thought was the answer soon became obvious. Most of my friends were working at Big Four firms, which sponsored Association of Chartered Certified Accountants and CPA (Certified Public Accountant) qualifications for their employees, so I decided I also needed the CPA.
Thus began what I now call “the year of CPA.” I invested considerable time and effort into studying and made a huge financial investment as well: I personally covered the costs of travel to the United States and all my preparation materials, which totaled about half the cost of my one-bedroom apartment in Moscow. After I got my CPA exam results (I passed on the first attempt), I found myself both happy and sad. On the one hand, I was gratified that my efforts had paid off, but on the other, I knew that I didn’t want to be an auditor and that controlling was always closer to my heart.
I truly believe things in our lives don’t happen by accident—that certain events that appear to be random are just a part of your journey. So it came to be that two years after becoming a CPA, I was at work discussing how to better measure results and address complaints from some of our business line executives who didn’t think it was fair that they had to pay a larger share of HQ expenses just because of their strong performance. Here was a problem in front of me, and I lacked the knowledge of how to deal with it.
Soon thereafter I was talking to a consulting partner from a Big Four firm in the U.S. and complaining that I had been disappointed by my high hopes for the CPA. He asked me, “Have you ever heard about the CMA? Maybe that’s what you need, as it’s very practical, close to the controlling world, and helps people from finance support business decisions and management of a company.” I had never heard of it, but that conversation would prove to be a game changer for me. The next day, after doing my research on the IMA® website, I bought my online course and books (if you ask me, this was the best value for money investment for my education). I read the CMA® (Certified Management Accountant) review materials like thrillers, as I found them filled with fascinating concepts and ideas that I was keen to start trying in my daily job.
That was 18 years ago, and I can honestly say I’ve used my CMA skills every day since. In 2019, I also earned my CSCA® (Certified in Strategy and Competitive Analysis) to further develop my strategy skills. My journey to the CMA took a while, but I’m glad I learned my lesson and am proud to call myself a CMA. SF
Ekaterina Emelianova, CMA, CSCA, CFM, is CFO at TradeXBank AG and president of IMA’s Switzerland Chapter. You can reach her at emelianova.eka@gmail.com
Predictive Analytics in Forecasting: The Basics
Predict what’s ahead in uncertain markets. Discover how to make the forecasting process more efficient and reduce uncertainty with predictive analytics. Through interactive video tutorials, you’ll learn techniques to improve forecasting and apply your knowledge with hands-on exercises.
Improve your predictive analytics skills with this e-learning course today.

HR for businesses that want to thrive – not just survive
Struggling to keep your business compliant and competitive in changing times? Navigate business challenges with the assurance that comes from Insperity’s unmatched HR service and support.
Whether you need help keeping up with evolving labor laws, attracting and retaining diverse talent in a tight market, leading remote teams or overcoming some other HR obstacle, you can move forward with Insperity behind you.
Learn more at insperity.com/ima | alliance@insperity.com.
