Question



OPENING VIGNETTE: DECISION MODELING AT HP USING SPREADSHEETS
HP is a manufacturer of computers, printers, and many industrial products. Its vast product line leads to many decision problems. Olavson and Fry have worked on many spreadsheet models for assisting decision makers at HP and have identified several lessons from both their successes and their failures when it comes to constructing and applying spreadsheet-based tools.
They define a tool as "a reusable, analytical solution designed to be handed off to nontechnical end users to assist them in solving a repeated business problem."
When trying to solve a problem, HP developers consider three phases in developing a model. The first phase is problem framing, where they consider the following questions in order to develop the best solution for the problem:
• Will analytics solve the problem?
• Can an existing solution be leveraged?
• Is a tool needed?
The first question is important because the problem may not be of an analytic nature, and therefore a spreadsheet tool may not be of much help in the long run without fixing the nonanalytical part of the problem first. For example, many inventory-related issues arise because of the inherent differences between the goals of marketing and supply chain groups. Marketing likes to have the maximum variety in the product line, whereas supply chain management focuses on reducing the inventory costs. This difference is partially outside the scope of any model. Coming up with nonmodeling solutions is important as well. If the problem arises due to "misalignment" of incentives or unclear lines of authority or plans, no model can help. Thus, it is important to identify the root issue.
The second question is important because an existing tool may already solve the problem, saving time and money. In other cases, modifying an existing tool is enough, which again saves time and money; only when neither option works is a custom tool necessary. Exploring reuse first is clearly worthwhile.
The third question is important because sometimes a new computer-based system is not required to solve the problem. The developers have found that they often use analytically derived decision guidelines instead of a tool. This approach requires less time for development and training, has lower maintenance requirements, and also provides simpler and more intuitive results. That is, after exploring the problem more deeply, the developers may determine that it is better to present decision rules that can be easily implemented as guidelines for decision making rather than asking the managers to run some type of computer model. This results in easier training, better understanding of the rules being proposed, and increased acceptance. It also typically leads to lower development costs and reduced time for deployment.
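As a hypothetical illustration of what an analytically derived decision guideline might look like (the vignette gives no concrete example; the rule, function name, and numbers below are invented), analysts could distill an inventory study into a one-line reorder rule that managers apply directly instead of running a model:

```python
# Hypothetical example of an analytically derived decision guideline
# (invented for illustration; not from the HP vignette).
# Instead of handing managers a model to run, the analysts distill the
# analysis into a rule they can apply with numbers they already track.

def reorder_point(avg_daily_demand: float, lead_time_days: float,
                  safety_stock: float) -> float:
    """Guideline: reorder when on-hand inventory falls to
    (average daily demand x lead time) + safety stock."""
    return avg_daily_demand * lead_time_days + safety_stock

if __name__ == "__main__":
    rop = reorder_point(avg_daily_demand=120, lead_time_days=5,
                        safety_stock=200)
    print(f"Reorder when inventory drops to {rop:.0f} units")
```

A guideline of this kind needs no software handoff at all; it can live on a planning worksheet, which is exactly the lower-maintenance, easier-to-train outcome the developers describe.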
If a model has to be built, the developers move on to the second phase - the actual design and development of the tools. Adhering to five guidelines tends to increase the probability that the new tool will be successful. The first guideline is to develop a prototype as quickly as possible. This allows the developers to test the designs, demonstrate various features and ideas for the new tools, get early feedback from the end users to see what works for them and what needs to be changed, and test adoption. Developing a prototype also prevents the developers from overbuilding the tool and yet allows them to construct more scalable and standardized software applications later. Additionally, by developing a prototype, developers can stop the process once the tool is "good enough," rather than building a standardized solution that would take longer to build and be more expensive.
The second guideline is to "build insight, not black boxes." The HP spreadsheet model developers believe this is important because often just entering some data and receiving a calculated output is not enough. Users need to be able to think through alternative scenarios, and the tool does not support this if it is a "black box" that provides only a single recommendation. They argue that a tool works best when it provides information that helps users make and support decisions rather than simply giving the answers. They also believe that an interactive tool helps users understand the problem better, leading to more informed decisions.
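As a minimal sketch of the difference between a "black box" and an insight-building tool (hypothetical; the scenario model, names, and numbers below are invented and not taken from the HP vignette), an interactive tool might let the user define scenarios and compare outcomes side by side rather than returning a single recommendation:

```python
# Hypothetical what-if sketch (not from the HP vignette): instead of a
# single "answer," the tool shows how a simple cost model responds to
# the assumptions the user chooses to vary.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    demand: float        # units per month
    unit_cost: float     # purchase cost per unit
    holding_rate: float  # monthly holding cost as a fraction of unit cost

def monthly_cost(s: Scenario, order_qty: float,
                 fixed_order_cost: float = 500.0) -> float:
    """Simple total-cost model: ordering cost plus average holding cost."""
    orders_per_month = s.demand / order_qty
    avg_inventory = order_qty / 2
    return (orders_per_month * fixed_order_cost
            + avg_inventory * s.unit_cost * s.holding_rate)

scenarios = [
    Scenario("Base case", demand=1000, unit_cost=40, holding_rate=0.02),
    Scenario("High demand", demand=1500, unit_cost=40, holding_rate=0.02),
    Scenario("Cheaper supplier", demand=1000, unit_cost=35, holding_rate=0.02),
]

# The user sees how the cost picture shifts across scenarios,
# rather than receiving one opaque recommendation.
for s in scenarios:
    print(f"{s.name:16s} order 250: {monthly_cost(s, 250):8.0f}   "
          f"order 500: {monthly_cost(s, 500):8.0f}")
```

Seeing how the results move as assumptions change is what builds the user's understanding of the problem, rather than just delivering one number.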
The third guideline is to "remove unneeded complexity before handoff." This is important, because as a tool becomes more complex it requires more training and expertise, more data, and more recalibrations. The risk of bugs and misuse also increases. Sometimes it is best to study the problem, begin modeling and analysis, and then start shaping the program into a simple-to-use tool for the end user.
The fourth guideline is to "partner with end users in discovery and design." By working with the end users, the developers get a better feel for the problem and a clearer idea of what the end users want, and the end users gain a better understanding of the problem and of how the new tool solves it. Including the end users in the development process also builds their ability to use analytic tools and enhances the decision makers' analytical knowledge and capabilities. By working together, the two groups' knowledge and skills complement each other in the final solution.
The fifth guideline is to "develop an Operations Research (OR) champion." By involving end users in the development process, the developers create champions for the new tools who go back to their departments or companies and encourage their coworkers to accept and use them. The champions become the resident experts on the tools in their areas and can help others who are being introduced to them. Having champions increases the likelihood that the tools will be successfully adopted into the businesses.
The final phase is the handoff, when the finished tools that deliver the complete solution are given to the businesses. When planning the handoff, it is important to answer the following questions:
• Who will use the tool?
• Who owns the decisions that the tool will support?
• Who else must be involved?
• Who is responsible for maintenance and enhancement of the tool?
• When will the tool be used?
• How will the use of the tool fit in with other processes?
• Does it change the processes?
• Does it generate input into those processes?
• How will the tool impact business performance?
• Are the existing metrics sufficient to reward this aspect of performance?
• How should the metrics and incentives be changed to maximize impact to the business from the tool and process?
By keeping these lessons in mind, developers and proponents of computerized decision support in general and spreadsheet-based models in particular are likely to enjoy greater success.
Questions for the Opening Vignette
1. What are some of the key questions to be asked in supporting decision making through DSS?
2. What guidelines can be learned from this vignette about developing DSS?
3. What lessons should be kept in mind for successful model implementation?
What We Can Learn from This Vignette
This vignette relates to providing decision support in a large organization:
• Before building a model, decision makers should develop a good understanding of the problem that needs to be addressed.
• A model may not be necessary to address the problem.
• Before developing a new tool, decision makers should explore reuse of existing tools.
• The goal of model building is to gain better insight into the problem, not just to generate more numbers.
• Implementation plans should be developed along with the model.
Source: Based on T. Olavson and C. Fry, "Spreadsheet Decision-Support Tools: Lessons Learned at Hewlett-Packard," Interfaces, Vol. 38, No. 4, July/August 2008, pp. 300-310.

Solutions

Expert Solution

1. What are some of the key questions to be asked in supporting decision making through DSS?

Some of the key questions to be asked in supporting decision making through DSS are:

  • What are the root issues underlying the decision situation? Do we understand the problem sufficiently to support it?  
  • How structured is the decision? Is it unstructured, semi-structured, or structured?
  • Does the decision involve judgment? To what extent?
  • What data is needed to solve the problem?
  • Can an existing tool be leveraged or reused?
  • Is a tool needed?
  • What is the implementation plan?

2. What guidelines can be learned from this vignette about developing DSS?

  • Before building a model, decision makers should develop a good understanding of the problem that needs to be addressed.
  • Coming up with nonmodeling solutions is important because if the problem is due to conflicting priorities, misaligned incentives, or unclear lines of authority or plans, no DSS can help support the decision.
  • A model may not be necessary to address the problem.
  • Before developing a new tool, decision makers should explore reuse of existing tools.
  • The goal of model building is to gain better insight into the problem, not just to generate more numbers.

3. What lessons should be kept in mind for successful model implementation?

  • Implementation plans should be developed along with the model. Successful implementation results in solving the real problem.
  • Including the end users in the development process enhances the decision makers' analytical knowledge and capabilities. By working together, their knowledge and skills complement each other in the final solution and the success of the implementation.
