Currently, several machine learning (ML) approaches use data to drive business decisions, including time series forecasting. Time series forecasting is based on time-dependent data but can be impacted by external factors, such as changes in demand, supply chain disruptions, new product releases, political events, and natural disasters. These external factors can be modeled as simulated scenarios.
The Monte Carlo method suggests that increasing the number of simulated scenarios improves the precision of your predicted outcome. Applied to time series data, this means you can obtain more nuanced results by simulating more scenarios. But how can you identify the factors that should drive those scenarios? The answer is [scenario analysis](https://www.netsuite.com/portal/resource/articles/financial-management/scenario-analysis.shtml).
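To make the Monte Carlo idea concrete, here's a minimal sketch with made-up demand numbers: the more scenarios you simulate, the smaller the standard error of your estimate becomes.

```python
import numpy as np

rng = np.random.default_rng(42)

def estimate_annual_demand(n_scenarios):
    """Simulate n_scenarios years of daily demand and return the
    estimated mean daily demand plus its standard error."""
    # Each scenario is one simulated year of daily demand (made-up numbers)
    scenarios = rng.normal(loc=100, scale=20, size=(n_scenarios, 365))
    scenario_means = scenarios.mean(axis=1)
    return scenario_means.mean(), scenario_means.std(ddof=1) / np.sqrt(n_scenarios)

for n in (10, 100, 10_000):
    estimate, stderr = estimate_annual_demand(n)
    print(f"{n:>6} scenarios: estimate = {estimate:6.2f}, standard error = {stderr:.4f}")
```

Running this, the standard error shrinks roughly with the square root of the number of scenarios, which is why adding scenarios sharpens the prediction.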
Scenario analysis is a method used to evaluate the potential outcomes of a decision or future event by considering various possible scenarios. Together, scenario analysis and time series forecasting can be used to evaluate the potential impact of different scenarios on a given time series, such as sales data or stock prices, and can even help businesses restructure operations to plan for possible revenue loss.
In this article, you'll learn more about scenario analysis with time series forecasting and how you can implement it with Ikigai Labs, an AI apps platform.
As previously stated, combining scenario analysis with time series forecasting is a powerful way for businesses to make more informed decisions. Following are a few ways this combo can positively impact your business:
One way you can use scenario analysis and time series forecasting is in payroll planning. By utilizing historical data and forecasting techniques, companies can make informed decisions about budgeting and identify potential risk factors and opportunities.
This approach enables companies to perform what-if analysis to see how potential changes in head count, wage increases, or benefits may impact payroll expenses in the future, which can help with proactive decision-making. Ultimately, this can lead to more effective payroll planning and better financial outcomes for the company.
Investors may find scenario analysis and time series forecasting useful for evaluating the potential performance of different investments across multiple scenarios. By utilizing historical data and forecasting techniques, investors can create a forecast of a stock's future performance and then use scenario analysis to evaluate how different situations, such as economic downturns or competitor actions, may impact the predicted stock performance. This provides the investor with a view of a range of possible outcomes and the probability of each scenario occurring, which can help investors make informed long-term decisions.
Businesses can utilize scenario analysis and time series forecasting to aid inventory planning by examining the potential effects of different scenarios on inventory levels and costs. Historical data and forecasting techniques can be used to create a prediction of the upcoming quarter's inventory levels, and then scenario analysis can be used to evaluate how changes, such as fluctuations in demand or supply chain disruptions, can impact these inventory predictions. This helps businesses make informed decisions when it comes to inventory and helps meet demand while still minimizing costs.
These are just a few examples of how scenario analysis with time series forecasting can have a huge impact on your business. Now let's take a look at how you can implement it so that these benefits can start helping your business.
In this section, you'll follow a series of steps to implement scenario analysis with Python and Ikigai Labs. Go ahead and download the data set that will be used here, which is the sales data for a superstore.
In addition to downloading the data set, there are a few other prerequisites. You'll need Python (version 3.8 or newer) installed on your machine, and you should also install Jupyter Notebook to follow along with the code.
If you don't already have an Ikigai Labs account, set one up by filling out the required details. Once you're logged in, you'll see something like this:
Click on the **+** button to the right of **Projects** to create a new project. Give the project a name (e.g., "Superstore_Sales_Project") and click **Create Project**, which will open a window that looks like this:
Once you've created a new project, you're ready to implement scenario analysis, so let's get started!
Before you begin any scenario analysis project, you need to understand your time series data and the key factors that can impact it. This can include researching economic indicators, monitoring news and social media for relevant events, and consulting experts in relevant fields.
Once you fully understand the data and any key factors, you can use Python or Ikigai Labs to gather and analyze the relevant data. This can include collecting historical data, downloading economic indicators, and importing relevant data on any external factors that you identified.
If you want to use Python data analysis libraries, such as pandas, NumPy, and Matplotlib, to perform data cleaning, manipulation, and visualization, you need to start with loading the required dependencies and data:
```python
# load dependencies
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

# load data
data = pd.read_csv('superstore_sales.csv')

# show the first few lines of superstore sales data
print(data.head())
```
However, this approach requires some coding knowledge to gather and manage the data. Although the previous code looks simple, for other use cases you would need to carefully check the data set's file type (e.g., CSV or XLSX), file size, correctness, and file path, which can get complex in Python.
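For example, a small loader that performs the checks just described might look like the following sketch. The helper function and the formats it supports are illustrative, not part of the article's original code:

```python
from pathlib import Path

import pandas as pd

def load_dataset(path_str):
    """Load a tabular data set after a few basic sanity checks.
    (Illustrative helper; extend the checks for your own use case.)"""
    path = Path(path_str)
    if not path.exists():
        raise FileNotFoundError(f"No file found at {path}")
    suffix = path.suffix.lower()
    if suffix == ".csv":
        return pd.read_csv(path)
    if suffix in (".xlsx", ".xls"):
        return pd.read_excel(path)
    raise ValueError(f"Unsupported file type: {suffix}")

# Usage (assuming the superstore CSV is in the working directory):
# data = load_dataset("superstore_sales.csv")
```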
Instead, you can use a tool like Ikigai Labs to load the data with a single click. You just need to select the **+** button next to **Dataset** and load the required data set:
Ikigai Labs helps ensure that you have the most accurate and up-to-date data by providing access to real-time data sources and automated data collection and analysis capabilities.
Identifying and factoring in external forces are important steps when it comes to scenario analysis and time series forecasting because they help you make more informed decisions. As previously stated, these forces can include economic indicators, political stability, social and demographic trends, technological advancements, environmental changes, and legal and regulatory changes. This information can be used to develop more realistic and accurate scenarios that can aid you in making more informed decisions.
Ikigai's AI apps make it easy to implement all aspects of scenario analysis, including identifying and factoring in external forces. To do so, navigate to the **Marketplace** from the left-side panel and search for **Value-Add Calculation-Scenario Analysis** and **Time Series Forecast**. You'll also find other facets that you can use depending on your specific scenario:
These facets help you complete the necessary analysis of the loaded data. Starting from data cleaning and data processing, to making predictions and doing scenario analysis, everything can be done with just a few clicks. More importantly, you don't have to choose the algorithms and test them as you would in Python; instead, the facets will choose the best algorithms for you to use.
Prioritizing uncertainties in scenario analysis involves identifying and ranking the most significant sources of uncertainty that may impact the outcome of the scenario being analyzed. This process is important because it allows you to focus on the uncertainties that would most likely impact the scenario and to allocate resources and effort accordingly.
For example, in regard to the sales data you've been working with here, you may want to analyze the sales by category and segment under two scenarios: one where you increase the discount on all products and another where you decrease the price on all products (a scenario that was identified via decision tree analysis). To do this, you need to create two new columns in the DataFrame—one for the discounted price and another for the new price:
```python
# Scenario 1: increase the discount by 10%, so the price drops to 90%
data['discounted_price'] = data['Sales'] * 0.9

# Scenario 2: decrease the price by 5%
# (a different factor, so the two scenarios stay distinguishable)
data['new_price'] = data['Sales'] * 0.95
```
Implementing these scenarios with Python can be complicated because you need to perform different operations and process steps directly on the data set. This requires you to be thoroughly familiar with the Python language and its various libraries and methods.
In contrast, with Ikigai's [no-code AI tool](https://www.ikigailabs.io/time-series-forecasting), you just need to drag your data set to the canvas (where you build your entire pipeline) and then apply the conditions on the data sets by specifying them in the **expression** field. This is far easier than writing a bunch of code in Python.
Once loaded, you need to click on **Transform** and look for the **Formula** facet:
Drag this facet to the canvas and fill out the following details for calculating the `discounted_price` from the sales column. Similarly, you can calculate the `new_price` with the `Sales` column by adding a new facet.
Once you've prioritized uncertainties, you need to process the data to get it into the required shape and size for forecasting. This includes performing different data preparation operations, such as merging, stitching, advanced matching, de-duping, and handling null values. Usually, these operations can be classified into three different stages:
- Data cleaning: cleans and preprocesses the data to remove missing or duplicate values, handle outliers, and ensure consistency in units of measurement.
- Data transformation: transforms the data into a format suitable for time series analysis, such as taking logarithmic transformations or differences of the values to make the data stationary.
- Feature engineering: creates new variables or features that may significantly impact the target variable.
> **Please note:** For this data set, the preprocessing needed is minimal, but it may be valuable to explore the functions provided by Ikigai, such as filling in missing values, text splitting, and adding a primary key.
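If you do script these three stages in Python, they might look like the following sketch. The column names, synthetic values, and outlier threshold are all illustrative, not taken from the superstore data set:

```python
import numpy as np
import pandas as pd

# Illustrative daily sales series (column names are hypothetical)
df = pd.DataFrame({
    "date": pd.date_range("2023-01-01", periods=8, freq="D"),
    "sales": [100.0, 110.0, None, 120.0, 120.0, 5000.0, 130.0, 125.0],
})

# 1. Data cleaning: fill missing values, drop duplicate dates, cap outliers
df["sales"] = df["sales"].ffill()
df = df.drop_duplicates(subset="date")
df["sales"] = df["sales"].clip(upper=df["sales"].quantile(0.95))

# 2. Data transformation: log-transform and difference toward stationarity
df["log_sales"] = np.log(df["sales"])
df["log_diff"] = df["log_sales"].diff()

# 3. Feature engineering: calendar features a model can learn from
df["day_of_week"] = df["date"].dt.dayofweek

print(df)
```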
To combine and transform your data in Python, you would need to manually code and perform all these data preprocessing stages, which can be time-consuming and confusing. For instance, for this data set, you need to group the data by `category` and `segment` and calculate the total sales for each scenario, as seen here:
```python
# Total sales by category and segment with discounted price
discounted_sales = data.groupby(['Category', 'Segment'])['discounted_price'].sum()

# Total sales by category and segment with new price
new_sales = data.groupby(['Category', 'Segment'])['new_price'].sum()
```
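To compare the scenarios side by side, the grouped totals can be assembled into one table. The snippet below is self-contained, with a tiny synthetic stand-in for the superstore data; the 0.9 and 0.95 adjustment factors are assumptions for illustration:

```python
import pandas as pd

# Small synthetic stand-in for the superstore data
data = pd.DataFrame({
    "Category": ["Furniture", "Furniture", "Technology", "Technology"],
    "Segment": ["Consumer", "Corporate", "Consumer", "Corporate"],
    "Sales": [200.0, 150.0, 400.0, 350.0],
})

# Hypothetical scenario adjustments (factors chosen for illustration)
data["discounted_price"] = data["Sales"] * 0.9
data["new_price"] = data["Sales"] * 0.95

# Grouped totals for the baseline and both scenarios, side by side
summary = pd.concat(
    {
        "baseline": data.groupby(["Category", "Segment"])["Sales"].sum(),
        "discount": data.groupby(["Category", "Segment"])["discounted_price"].sum(),
        "price_cut": data.groupby(["Category", "Segment"])["new_price"].sum(),
    },
    axis=1,
)
print(summary)
```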
In Python, you would need to maintain this process flow on your own, but with Ikigai, you can do this with a few connections. You just need to drag the Python facet and add the previous conditions to the code, and it will automatically apply those to your data.
> **Note:** It's recommended that you create two different flows for the `discounted_price` and the `new_price` for better analysis. In addition, while a Python facet has been used here, there are other prebuilt functions available (e.g., reshape, split, and merge).
For the discounted price, your component may look like this:
You can also do the same for `new_price` in a new flow. You can find a lot of custom facets, such as DeepMatch, data transformation, DM unsupervised, and column filter, for data cleaning and transformation. All these transformations can be done in a no-code manner where you just need to pass the output of one facet to another to operate.
Developing scenarios in scenario analysis involves creating different possible future scenarios based on the underlying assumptions and drivers of a particular system or market. The objective is to explore the range of possible outcomes and their implications, and to identify key uncertainties and risks.
To develop a scenario, you need to follow these steps:
1. **Identify drivers:** Identify the key drivers or variables that will impact the future the most. For example, a discount or change in supply can affect sales analysis significantly.
2. **Define assumptions:** Establish a set of assumptions about the values of the drivers and how they will change over time. For example, a company might assume that sales volumes will remain constant over the next year and that any price increase will be absorbed by customers without a significant impact on demand.
3. **Develop scenarios:** Use your set of assumptions to create scenarios representing a range of possible futures. The scenarios should be distinct, plausible, and internally consistent. For example, if there is a disruption in the supply chain, would this change the sales?
4. **Evaluate the scenarios:** Analyze the scenarios to assess their potential impact on the target system or market and to identify risks and uncertainties.
5. **Select scenarios:** Choose the most relevant and useful scenarios to focus on for further analysis and decision-making.
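One way to keep scenarios distinct and internally consistent is to encode each one as a named set of driver assumptions and apply them programmatically. The driver names, factors, and sales figures below are hypothetical:

```python
import pandas as pd

# Hypothetical scenario definitions: each maps drivers to adjustment factors
scenarios = {
    "baseline": {"price_factor": 1.00, "demand_factor": 1.00},
    "deep_discount": {"price_factor": 0.85, "demand_factor": 1.10},
    "supply_disruption": {"price_factor": 1.05, "demand_factor": 0.80},
}

# Made-up monthly sales to project forward
monthly_sales = pd.Series([1000.0, 1100.0, 1050.0], name="sales")

for name, drivers in scenarios.items():
    projected = monthly_sales * drivers["price_factor"] * drivers["demand_factor"]
    print(f"{name}: projected total = {projected.sum():,.0f}")
```

Keeping assumptions in one structure like this makes each scenario explicit and easy to evaluate against the others.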
To develop scenarios in Python, you would also need to check the impact of each scenario, for example by plotting the total sales by category and segment under the increased discount as a bar chart:

```python
# Create a bar chart for the discounted sales
discounted_sales.plot(kind='bar', title='Total discounted sales by category and segment')
plt.tight_layout()
plt.show()
```
In comparison, in Ikigai Labs, to obtain the same scenario, you need to add a final facet (e.g., **Exported**), which would look like this:
Once done, select **Run** at the top of your screen, and it will run your entire pipeline to produce the results. Once the run is complete, the output data set will be stored in the data set section. To produce a new chart, you just need to select the **+** button next to **Charts**. Fill out the required details and column names that you want to graph and generate the graph:
While the results are similar, they're much easier to develop and generate with Ikigai.
As a reminder, this is just one example of how to perform scenario analysis with Python and Ikigai using this sales data. You can modify the analysis to suit your specific needs and requirements.
All the stages in this tutorial have been preparing you for one thing: creating a time series forecasting system that can make accurate predictions. And it's time to do just that. Automating and operationalizing time series forecasting with Python involves the following steps:
- **Data collection:** includes automating the collection of time series data from various sources using Python libraries, such as pandas, NumPy, or requests. Then you store the data in a database or a file for easy access.
- **Data preprocessing:** uses Python libraries to clean and preprocess the data and to perform operations such as normalization, aggregation, or imputation.
- **Model selection:** selects a suitable time series forecasting model, such as [autoregressive integrated moving average (ARIMA)](https://www.capitalone.com/tech/machine-learning/understanding-arima-models/), [seasonal autoregressive integrated moving average (SARIMA)](https://machinelearningmastery.com/sarima-for-time-series-forecasting-in-python/), or [Prophet](https://facebook.github.io/prophet/), and uses Python libraries, such as [statsmodels](https://www.statsmodels.org/stable/index.html), [scikit-learn](https://scikit-learn.org/stable/), or Prophet, to implement the model.
- **Model training:** trains the model on the preprocessed data using Python libraries, such as NumPy or [TensorFlow](https://www.tensorflow.org/), and uses metrics, such as [mean absolute error (MAE)](https://en.wikipedia.org/wiki/Mean_absolute_error) or [root-mean-square error (RMSE)](https://c3.ai/glossary/data-science/root-mean-square-error-rmse/), to evaluate the performance of the model.
- **Model deployment:** deploys the trained model as a web service or an API using Python libraries, such as [Flask](https://flask.palletsprojects.com/en/2.2.x/) or [Django](https://www.djangoproject.com/).
- **Model monitoring:** monitors the performance of the deployed model using Python libraries, such as TensorFlow or [Keras](https://keras.io/), and detects any changes in the data that may affect the accuracy of the forecasts.
- **Model retraining:** retrains the model regularly using updated data to ensure that it remains accurate and up-to-date.
- **Forecast generation:** automates the process of generating forecasts using Python libraries, such as NumPy or TensorFlow, and makes the forecasts available to stakeholders through an API, a dashboard, or a report.
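The train-and-evaluate portion of the pipeline above can be sketched end to end. The example below fits a hand-rolled AR(1) model on synthetic data as a lightweight stand-in for statsmodels' ARIMA, then scores one-step-ahead forecasts with MAE:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic AR(1) series: x_t = 0.8 * x_{t-1} + noise
n = 200
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()

# Hold out the last 20 points for evaluation
train, test = x[:180], x[180:]

# Fit the AR(1) coefficient by least squares
# (a hand-rolled stand-in for statsmodels' ARIMA)
phi = np.dot(train[:-1], train[1:]) / np.dot(train[:-1], train[:-1])

# One-step-ahead forecasts over the test window, scored with MAE
preds = phi * x[179:-1]
mae = np.mean(np.abs(test - preds))
print(f"estimated phi = {phi:.2f}, test MAE = {mae:.2f}")
```

In practice you would swap the least-squares fit for a library model and wrap these steps in a scheduled job, but the split-fit-forecast-score loop is the same.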
By automating and operationalizing time series forecasting with Python, organizations can leverage the power and versatility of the Python ecosystem to implement, deploy, and maintain accurate and reliable time series forecasting models. However, it takes a lot of time and expertise to automate this pipeline. This is where Ikigai Labs comes to the rescue again.
With Ikigai, the entire process is automated, starting with data collection, all the way to running what-if analysis. All you need to do is drag and drop the required facets for the time series forecasting and scenario analysis.
In addition to automating every step, Ikigai also provides different business intelligence solutions, such as supply chain, strategic planning, demand forecasting, accounting and finance, policy optimization, and customer and revenue analytics. For instance, Ikigai can help businesses optimize their supply chain operations by identifying inefficiencies, improving forecasting, optimizing inventory management, enhancing supplier management, and streamlining logistics. This reduces costs, improves efficiency, and increases customer satisfaction. Moreover, Ikigai Labs can trigger automated actions, such as sending an email, generating reports, or collaborating with your team—eliminating manual work.
In this article, you learned that scenario analysis, in combination with time series forecasting, is a powerful tool that organizations can use to make informed decisions with their data. It can help you take into account the potential impact of various factors to better predict future trends.
However, utilizing this data to its full potential takes a lot of manual work. Thankfully, there are tools like [Ikigai Labs](https://www.ikigailabs.io/) that can help.
The Ikigai platform is a powerful no-code AI tool that enables organizations to make data-driven decisions. With its intuitive interface and user-friendly design, the platform provides a comprehensive solution for data analysis, visualization, and reporting. Whether you are a small business owner or a data analyst in a large corporation, Ikigai BI has everything you need to unlock the full potential of your data. With Ikigai AI apps, you can transform your data into actionable insights that drive business growth and success.
Book a demo today with Ikigai's most advanced no-code AI tool to improve your business forecast with just a few clicks.