Listed below are economic forecasting tools. Explain each one in a short paragraph AND give a real-life example of how it could be used.
Delphi Forecasting
Panel Consensus
Market Research
Visionary or Scenario Forecasts
Historical Analogy
Simple Trend Analysis
Moving Averages
Exponential Smoothing
Simple Regression Model
Input/Output Model
Learning Curves
1. The Delphi method is a forecasting framework based on the results of multiple rounds of questionnaires sent to a panel of experts. After each round, the anonymous responses are aggregated and shared with the group, and the experts may adjust their answers in subsequent rounds based on how they interpret the "group response" provided to them. Because questions are asked repeatedly and the panel is told what the group thinks as a whole, the Delphi method seeks to reach the correct response through consensus. For example, a central bank might poll a panel of economists over several anonymous rounds to converge on an inflation forecast for the coming year.
2. Panel consensus is a judgmental forecasting technique by which a committee, sales force, or group of experts arrives at a sales estimate (compare the Delphi method). It is based on the idea that pooling data from distributed sources can give a better forecast than any single source. A panel may collect data from all levels of employees in an organization, or from various states, cities, or strata of the population, and the data from the different panels is checked for consistency across levels. For example, data from lower-level employees may be biased if they are intimidated by top management, so it is weighed against the other panels' estimates.
3. Market research is the process of determining the viability of a new service or product through research conducted directly with potential customers. Market research allows a company to discover the target market and get opinions and other feedback from consumers about their interest in the product or service.
This type of research can be conducted in-house by the company itself, or by a third-party firm that specializes in market research. It can be done through surveys, product testing, and focus groups; test subjects are usually compensated with product samples and/or a small stipend for their time.
A business must engage in a variety of tasks to complete the market research process: it gathers information on the market sector being examined, then analyzes and interprets the resulting data to find any patterns or relevant data points it can use in decision-making. For example, a beverage company might run taste-test focus groups in several cities before deciding whether to launch a new flavor nationwide.
4. Scenario analysis is a process of analyzing possible future events by considering alternative outcomes. As one of the main forms of projection, it does not try to show one exact picture of the future; instead, it presents several alternative future developments, making visible both a range of possible outcomes and the development paths leading to them. In contrast to prognoses, scenario analysis is not based on extrapolating the past or extending past trends: it does not rely on historical data and does not expect past observations to remain valid in the future. Instead, it tries to consider possible developments and turning points that may be only loosely connected to the past.
In short, several scenarios are fleshed out to show possible future outcomes, each normally combining optimistic, pessimistic, and more and less probable developments, and all aspects of a scenario should be plausible. Experience has shown that around three scenarios are most appropriate for further discussion and selection; more scenarios risk making the analysis overly complicated. For example, an oil company might build "high-price," "base," and "low-price" crude oil scenarios to test whether a proposed refinery would be profitable under each.
5. Historical analogy is the method used for forecasting when no past data is available, which happens frequently with new market entries or product launches. In these cases the forecast is made on the basis of similar instances in the past. For example, the forecast for a new shampoo's sales may be made from past soap sales data; the idea is to use the analogy between the two products to develop a good forecast.
6. Trend analysis quantifies and explains trends and patterns in "noisy" data over time. A "trend" is an upward or downward shift in a data set over time.
In economics, "trend analysis" usually refers to the analysis of past trends in market trading; it allows you to predict what might happen to the market in the future. It might, for instance, be used to predict a trend such as a bull market run.
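A simple trend analysis can be sketched in a few lines: fit a straight line to the series by ordinary least squares and read the trend's direction and strength off the slope. This is a minimal illustration, and the quarterly sales figures are invented for the example.

```python
# Minimal sketch of simple trend analysis: fit a straight line
# y = slope*t + intercept to a time series by ordinary least squares
# and read the trend direction off the slope.

def linear_trend(values):
    """Return (slope, intercept) of the least-squares line through
    the points (0, values[0]), (1, values[1]), ..."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    return slope, y_mean - slope * t_mean

# hypothetical quarterly sales figures
sales = [100, 104, 109, 113, 118, 121]
slope, intercept = linear_trend(sales)
print(f"trend: {slope:+.2f} units per quarter")  # positive slope = upward trend
```

Extrapolating the fitted line (slope * t + intercept) to future values of t gives a naive forecast, which is exactly what makes trend analysis sensitive to the noise the paragraph above mentions.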
7. A moving average (MA) is a widely used indicator in technical analysis that helps smooth out price action by filtering out the “noise” from random short-term price fluctuations. It is a trend-following, or lagging, indicator because it is based on past prices.
The two basic and commonly used moving averages are the simple moving average (SMA), which is the unweighted average of a security's price over a defined number of time periods, and the exponential moving average (EMA), which gives greater weight to more recent prices. The most common applications of moving averages are to identify the trend direction and to determine support and resistance levels. While moving averages are useful on their own, they also form the basis for other technical indicators such as the Moving Average Convergence Divergence (MACD). For example, a trader might buy a stock when its 50-day moving average crosses above its 200-day moving average.
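The two averages above can be sketched in a few lines of code. This is a minimal illustration with a made-up closing-price series; charting packages compute these for you in practice.

```python
def sma(prices, n):
    """Simple moving average: unweighted mean of each window of n prices."""
    return [sum(prices[i - n + 1:i + 1]) / n for i in range(n - 1, len(prices))]

def ema(prices, n):
    """Exponential moving average with the usual weight k = 2 / (n + 1),
    seeded with the first price."""
    k = 2 / (n + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

closes = [10, 11, 12, 13, 14]  # made-up closing prices
print(sma(closes, 3))  # [11.0, 12.0, 13.0]
print(ema(closes, 3))  # recent prices pull this average up faster
```

Comparing the two outputs shows why the EMA is preferred for fast-moving markets: it reacts to the latest price immediately, while the SMA waits for the old prices to drop out of the window.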
8. Exponential smoothing is a rule-of-thumb technique for smoothing time series data using the exponential window function. Whereas the simple moving average weights past observations equally, exponential smoothing assigns exponentially decreasing weights over time. It is an easily learned and easily applied procedure that can incorporate prior assumptions by the user, such as seasonality, and it is often used for the analysis of time-series data.
Exponential smoothing is one of many window functions commonly applied to smooth data in signal processing, acting as low-pass filters to remove high frequency noise.
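The simplest form of the technique is the recursion s[t] = alpha * x[t] + (1 - alpha) * s[t-1], whose one-step-ahead forecast is just the last smoothed value. A minimal sketch, using an invented monthly demand series:

```python
def exp_smooth(series, alpha):
    """Simple exponential smoothing: s[t] = alpha*x[t] + (1-alpha)*s[t-1],
    seeded with the first observation. alpha in (0, 1]; a larger alpha
    weights recent observations more heavily."""
    s = [series[0]]
    for x in series[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

demand = [20, 22, 21, 25, 24]  # invented monthly demand
smoothed = exp_smooth(demand, 0.5)
forecast = smoothed[-1]  # one-step-ahead forecast is the last smoothed value
print(smoothed, forecast)
```

Unrolling the recursion shows the exponentially decreasing weights the paragraph describes: x[t] gets weight alpha, x[t-1] gets alpha*(1-alpha), x[t-2] gets alpha*(1-alpha)**2, and so on.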
9. Linear regression models are used to show or predict the relationship between two variables or factors. The factor that is being predicted (the factor that the equation solves for) is called the dependent variable. The factors that are used to predict the value of the dependent variable are called the independent variables.
Good data does not always tell the complete story. Regression analysis is commonly used in research as it establishes that a correlation exists between variables. But correlation is not the same as causation. Even a line in a simple linear regression that fits the data points well may not say something definitive about a cause-and-effect relationship.
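A simple linear regression can be fitted with the closed-form least-squares formulas. Here is a minimal sketch in which both the advertising-spend and sales figures are hypothetical:

```python
def fit_simple_regression(x, y):
    """Least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    b = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
         / sum((xi - xm) ** 2 for xi in x))
    return ym - b * xm, b

# hypothetical data: monthly ad spend (thousands) vs units sold
spend = [1, 2, 3, 4, 5]
units = [52, 55, 61, 64, 70]
a, b = fit_simple_regression(spend, units)
print(f"units = {a:.1f} + {b:.1f} * spend")  # units = 46.9 + 4.5 * spend
```

As the paragraph above cautions, a good fit here only establishes correlation: the model predicts sales from spend, but it cannot by itself show that the spend caused the sales.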
10. Input-output analysis is a technique first developed by Professor Wassily W. Leontief in the 1930s. It is used to analyse inter-industry relationships in order to understand the inter-dependencies and complexities of the economy, and thus the conditions for maintaining equilibrium between supply and demand.
An input is something a firm obtains, while an output is something it produces. Inputs therefore represent the expenditure of the firm and outputs its receipts: the sum of the money values of a firm's inputs is its total cost, and the sum of the money values of its outputs is its total revenue.
The input-output analysis tells us that there are industrial interrelationships and inter-dependencies in the economic system as a whole. The inputs of one industry are the outputs of another industry and vice versa, so that ultimately their mutual relationships lead to equilibrium between supply and demand in the economy as a whole.
Coal is an input for the steel industry and steel is an input for the coal industry, though both are the outputs of their respective industries. A major part of economic activity consists of producing intermediate goods (inputs) for further use in producing final goods.
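The coal-and-steel example can be turned into a tiny two-sector Leontief model. If A[i][j] is the value of sector i's output required per unit of sector j's output and d is final demand, gross output x must satisfy x = Ax + d, so x = (I - A)^(-1) d. The coefficients below are invented purely for illustration:

```python
def leontief_2x2(A, d):
    """Solve (I - A) x = d for gross outputs x in a two-sector economy,
    using the closed-form inverse of a 2x2 matrix."""
    m = [[1 - A[0][0], -A[0][1]],
         [-A[1][0], 1 - A[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (m[1][1] * d[0] - m[0][1] * d[1]) / det
    x1 = (m[0][0] * d[1] - m[1][0] * d[0]) / det
    return [x0, x1]

A = [[0.2, 0.3],   # coal needed per unit of coal output, steel output
     [0.4, 0.1]]   # steel needed per unit of coal output, steel output
final_demand = [100, 200]  # final demand for coal and steel
print(leontief_2x2(A, final_demand))  # gross outputs: coal = 250, steel ~ 333.3
```

Note that gross outputs exceed final demand: the extra 150 units of coal and roughly 133 units of steel are the intermediate goods the two industries sell to each other, which is exactly the inter-dependence the paragraphs above describe.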
11. A learning curve is a concept that graphically depicts the relationship between the cost and output over a defined period of time, normally to represent the repetitive task of an employee or worker. The learning curve was first described by psychologist Hermann Ebbinghaus in 1885 and is used as a way to measure production efficiency and to forecast costs.
In the visual representation of a learning curve, a steeper slope indicates that initial learning translates into large cost savings, while subsequent learning yields progressively slower and more difficult savings. For example, an aircraft manufacturer might use a learning curve to forecast how assembly hours per plane will fall as cumulative production grows.
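One common formalisation of this idea (often attributed to Wright) models the cost of the x-th unit as c(x) = a * x**b with b = log2(r), where r is the learning rate. The numbers below are illustrative, not taken from the text:

```python
import math

def unit_cost(first_unit_cost, x, learning_rate):
    """Cost of the x-th unit under Wright's learning-curve model:
    a * x**b with b = log2(learning_rate). A learning rate of 0.8 means
    each doubling of cumulative output cuts unit cost to 80% of its level."""
    b = math.log(learning_rate, 2)
    return first_unit_cost * x ** b

# illustrative numbers: the 1st unit costs 100, on an 80% learning curve
print(round(unit_cost(100.0, 1, 0.8), 6))  # 100.0
print(round(unit_cost(100.0, 2, 0.8), 6))  # 80.0
print(round(unit_cost(100.0, 4, 0.8), 6))  # 64.0
```

The shrinking gap between successive doublings (100 to 80, then 80 to 64) is the flattening slope the paragraph above describes, and it is what makes the model useful for forecasting costs at higher production volumes.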