Process Tempo Insights

Assembling Critical Information for Decision Making

In last week’s blog, we highlighted the need for organizations to use their data to become more agile and adaptive. We discussed how these characteristics are critical to enabling adaptive recovery, an effective approach to business continuity when dealing with uncertain timelines and conditions. We clarified how the data needs to be contextual and relevant, accessible to many, and enable transparency for the organization.

Now, we move on to the next stage of business continuity planning: assembling critical information for decision-making.


Today, business leaders are facing tough decisions: which areas should be deemed essential or non-essential? Which projects can be cut or boosted? Should we attempt to reopen? Should we reconfigure existing operations, or create alternative ones?

Understanding these risks and making decisions in business continuity planning was once a simpler task. Our current climate has transformed this process into a significant undertaking for two reasons:

First, leaders must factor in the interdependent chains of large and complex scenarios extending beyond a single event, supplier, or service. Second, leaders must complete the impossible task of predicting an unknown duration of risk.

In business continuity planning, risks typically assume a known duration. In an ordinary instance, a business would calculate the cost of lost sales for a downed production line over a certain amount of time: two hours, one week, even a month. Our new normal provides no known duration, and the increased interconnectedness of datasets and systems means these timelines can shift, in real time, even as we attempt to forecast them.

This makes accurate planning more difficult, and it’s why having data that can react and adapt is so crucial.
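The standard lost-sales calculation described above can be sketched as a simple scenario comparison across assumed downtime durations. The hourly revenue figure and scenario names below are invented purely for illustration:

```python
# Hypothetical scenario comparison: lost sales under different downtime durations.
# The hourly revenue figure and durations are invented for illustration only.

hourly_sales = 2500.0  # assumed average revenue per hour for one production line

# Candidate downtime durations, expressed in hours
scenarios = {
    "two_hours": 2,
    "one_week": 7 * 24,
    "one_month": 30 * 24,
}

# Cost of lost sales for each assumed duration
lost_sales = {name: hours * hourly_sales for name, hours in scenarios.items()}

for name, loss in sorted(lost_sales.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${loss:,.0f} in lost sales")
```

The difficulty the post describes is that no single row of this table can be trusted: with an unknown duration, the plan has to hold up across the whole range of scenarios, not just one.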


After considering these elements and acknowledging their relevancy to operations, an organization can move on to determining what information and data are critical to executing the plan.

Let’s look at a topic that many organizations are currently dealing with as an example: the impact of already escalating insurance costs for employees. Think of the datasets needed to answer the questions related to this: errors and omissions, liabilities, employee benefits, healthcare, unemployment, and workers' compensation, to name a few. There are more, and an organization’s data should be able to illustrate these relevant areas.

Here are some other topics worth considering or reexamining:

  • How the business operates in smaller slices, reviewed over longer-than-anticipated time frames

  • Supplier and full supply chain capabilities, and how assumptions are impacting the ability to recover operations

  • Economic trends and the financial implications of the new economies arising from new norms

  • Retail and Web Presence: their capacity and capability as sales outlets, and how to adapt to “social distancing” and increased online shopping by consumers

  • Shipping Logistics: considering shipping times, costs, warehousing space, sanitation of the system, and container availability

  • Forecasting and planning for mobility restrictions placed on people, whether due to governmental mandates, health quarantines, or fear

  • Forecasting how people across the entire supply and demand chain (employees, customers, suppliers) react to restrictions. Will they seek alternatives for the goods they need? Will they start to build their own or source locally?

To add to the complexity, internal datasets now have to rub shoulders with external datasets such as COVID-19 cases, reactions by health organizations and governments, and the willingness of people to comply with recommendations and/or mandates. Previously unrelated datasets are now connected.
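One way such previously unrelated datasets get connected in practice is a join on shared dimensions such as region and time period. The dataset names, columns, and values below are entirely hypothetical, a minimal sketch of the idea rather than any particular organization's data:

```python
import pandas as pd

# Hypothetical internal dataset: weekly shipments by region
internal = pd.DataFrame({
    "region": ["Northeast", "Midwest", "Northeast"],
    "week": ["2020-04-06", "2020-04-06", "2020-04-13"],
    "shipments": [120, 95, 80],
})

# Hypothetical external dataset: reported cases and mobility restrictions
external = pd.DataFrame({
    "region": ["Northeast", "Midwest", "Northeast"],
    "week": ["2020-04-06", "2020-04-06", "2020-04-13"],
    "new_cases": [5400, 2100, 6800],
    "restrictions": ["stay-at-home", "advisory", "stay-at-home"],
})

# Join the previously unrelated datasets on shared region/week keys
combined = internal.merge(external, on=["region", "week"], how="left")

# Flag weeks where restrictions are likely to suppress operations
combined["at_risk"] = combined["restrictions"] == "stay-at-home"
print(combined[["region", "week", "shipments", "at_risk"]])
```

The join itself is trivial; the hard part the post points to is deciding which external feeds are relevant and keeping them refreshed as conditions shift.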


In all of these scenarios, it’s important not to get bogged down in analysis paralysis, and instead to narrow down the specific information needed. Considering all of the aforementioned elements, organizations will arrive at their own understanding of what information is needed to make concrete decisions.

While the information required will differ, there are some standard best practices that all organizations should follow:

  • Guard against bias by involving others in the process

  • Define objectives, make value judgments, and ask the right questions

  • Identify sources containing relevant information and how it can be retrieved

  • Pull data, observe constraints, and implement workarounds
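The last two practices above can be sketched as a lightweight retrieval workflow: enumerate sources, note each one's access constraints, and implement a workaround when the preferred path isn't available. Every source name, field, and access method here is illustrative, not a prescribed implementation:

```python
# Hypothetical sketch of the retrieval workflow described above.
# Source names, fields, and access methods are illustrative only.

sources = {
    "hr_system": {"fields": ["headcount", "absences"], "access": "api"},
    "erp": {"fields": ["inventory", "open_orders"], "access": "export"},
    "public_health_feed": {"fields": ["new_cases"], "access": "export"},
}

def retrieve(source_name, spec):
    """Pull data from a source, observing its access constraints."""
    if spec["access"] == "api":
        return {field: f"{source_name}.{field} via API" for field in spec["fields"]}
    # Workaround: fall back to batch exports when no API is available
    return {field: f"{source_name}.{field} via batch export" for field in spec["fields"]}

# Assemble the critical information from every identified source
dataset = {name: retrieve(name, spec) for name, spec in sources.items()}
```

The value of writing the workflow down, even at this level, is that constraints and workarounds become explicit and reviewable rather than living in one analyst's head.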

Next, organizations need to manage and use this data effectively. There is a real need here to identify what the data “should” be doing and how it should interact with the people and processes of the organization.

The retrieved data needs to be assembled in a way that allows information to be flexible. It should pivot rapidly to produce relevant and sustainable forecasts and calculate potential outcomes. It also needs to be presented in a way that ensures actionable insight, making it easy for business leaders to set measurable goals.

Data analytics tools should empower business owners and analysts to come to good conclusions quickly through the use of a streamlined process.

When tasked with answering questions using data, individuals tend to create their own logic, potentially creating more long-term problems than the short-term ones they solve. The process should work to cut down on conflicting opinions or outcomes, and avoid the path of least resistance.

Data analysis by teams also requires valuable skills from individual team members: a strong capacity to collect, normalize, and analyze data, and to make good recommendations from it.

Business leaders, in turn, need to do the same: listen to and understand what the insights are saying, and communicate how triaged efforts were focused. Partnerships between the board of directors, the C-suite, and the analytics team depend upon transparency during this critical time.

Ultimately, not using insights steeped in the very nature of the business will make it more difficult to evaluate, democratize, and rally a culture of data-driven decision making. Now more than ever, the ability to identify necessary data and manage it effectively will be significant in determining whether a company survives.


Jim Szczygiel has been working as an information technologist since the early nineties. Most recently, Jim has held roles as a consultant, product manager, data analyst, salesperson, and solutions architect, along with working with agile and extreme programming teams. Jim has provided services to hundreds of Fortune 500 clients in the chemical and natural resources, finance, insurance, and manufacturing sectors.
