TJ’s Cheese Cake Factory, Inc. sells original cheese cake for $16 each. The company provided the following units and total cost data concerning its cake sales for each month during 2011:
Month | Cost ($) | Units |
January | 55000 | 2500 |
February | 59000 | 2800 |
March | 60000 | 3000 |
April | 64000 | 4200 |
May | 67000 | 4500 |
June | 71000 | 5500 |
July | 74000 | 6500 |
August | 77000 | 7500 |
September | 75000 | 7000 |
October | 68000 | 4500 |
November | 62000 | 3100 |
December | 73000 | 6500 |
a. Use the linear regression method to estimate fixed and variable costs. Excel has a function that you can use (I have posted these data in Excel on Blackboard for your convenience). Print out the regression output and attach it to this test. (A Python sketch of this regression appears right after part f below.)
b. Interpret and evaluate your regression model and results. Write out the cost formula.
c. Estimate total costs in a month when 6,000 cakes are produced and sold.
d. Estimate total profit in a month when 6,000 cakes are produced and sold.
e. You are working on the budget for October 2012 and expect 10,000 cakes will be produced and sold. Estimate total costs in a month when 10,000 cakes are produced and sold. Will you use the estimated cost in your budget? Why?
f. How does linear regression differ from the high-low method in estimating fixed and variable costs? Discuss the pros and cons of each.
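For reference, here is a minimal Python sketch of part (a) using the twelve observations from the table above; the test asks for Excel's regression output, so treat this only as a cross-check. The slope is the estimated variable cost per cake, the intercept is the estimated monthly fixed cost, and the last lines cover parts (c) and (d) at 6,000 cakes.

```python
# Sketch only -- OLS fit of cost on units for the 12 months in the table above.
# The exam asks for Excel's regression output; this is just a cross-check.
import numpy as np

units = np.array([2500, 2800, 3000, 4200, 4500, 5500,
                  6500, 7500, 7000, 4500, 3100, 6500], dtype=float)
cost = np.array([55000, 59000, 60000, 64000, 67000, 71000,
                 74000, 77000, 75000, 68000, 62000, 73000], dtype=float)

# Degree-1 fit: slope = variable cost per cake, intercept = fixed cost per month.
slope, intercept = np.polyfit(units, cost, 1)

# R-squared as a quick evaluation of the model (part b).
pred = intercept + slope * units
r_squared = 1 - np.sum((cost - pred) ** 2) / np.sum((cost - cost.mean()) ** 2)

print(f"Cost formula: total cost = {intercept:,.0f} + {slope:.2f} * units")
print(f"R-squared: {r_squared:.3f}")

# Parts c and d: cost and profit when 6,000 cakes are produced and sold at $16 each.
cost_6000 = intercept + slope * 6000
print(f"Estimated cost at 6,000 cakes:   ${cost_6000:,.0f}")
print(f"Estimated profit at 6,000 cakes: ${16 * 6000 - cost_6000:,.0f}")
```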
e) Linear regression works well when the relationship between the covariates and the response variable is known to be linear. This shifts the focus away from statistical modeling and toward data analysis and preprocessing, and it is a good way to learn to work with data without worrying about the intricate details of the model.
A clear disadvantage is that linear regression oversimplifies many real-world problems. More often than not, covariates and response variables do not exhibit a purely linear relationship, so fitting a regression line using OLS will leave a high residual sum of squares (RSS) even on the training data.
In summary, linear regression is great for learning about the data analysis process. However, it is not recommended for many practical applications because it oversimplifies real-world problems.
More importantly, 10,000 cakes is well above the highest activity level observed in 2011 (7,500 cakes in August), so the cost formula would be extrapolating outside the relevant range over which it was estimated. So, no, I will not use the estimated cost in my budget.
f) The high-low method uses only a small amount of data to separate fixed and variable costs: it takes the highest and lowest activity levels and compares their total costs. Linear regression, on the other hand, fits a line to all of the observations, modeling how the dependent variable (cost) changes with the independent variable (units).
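As a concrete illustration of the difference, here is a sketch of the high-low computation applied to the data above (August is the highest-activity month at 7,500 units and $77,000; January is the lowest at 2,500 units and $55,000):

```python
# High-low method on the table above: highest month (August) vs lowest (January).
high_units, high_cost = 7500, 77000   # August
low_units, low_cost = 2500, 55000     # January

variable_per_cake = (high_cost - low_cost) / (high_units - low_units)  # 22,000 / 5,000 = 4.40
fixed_cost = high_cost - variable_per_cake * high_units                # 77,000 - 33,000 = 44,000

print(f"Variable cost: ${variable_per_cake:.2f} per cake")
print(f"Fixed cost:    ${fixed_cost:,.0f} per month")
# High-low cost formula: total cost = 44,000 + 4.40 * units
```

Only two of the twelve observations are used here, which is exactly the weakness described in the cons below.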
Pros of high-low method:
1) Easy to use
The high-low method requires only the cost and unit information at the highest and lowest activity levels. Managers can apply this technique easily since it does not require any special tools.
2) High accuracy with stable costs
The high-low method can be relatively accurate if the highest and lowest activity levels are representative of the company's overall cost behavior. However, if the two extreme activity levels are systematically different from the rest, the high-low method will produce inaccurate results.
Cons of high-low method:
1) Unreliable
The method does not represent all of the data provided, since it relies on just two extreme activity levels. Those two points may be outliers, with costs higher or lower than what the organization incurs at its other activity levels.
2) Does not account for inflation
The high-low method excludes the effects of inflation when estimating costs.
Pros of linear regression:
1) Space complexity is very low: the model only needs to store its weights after training, and prediction is fast, so it is a low-latency algorithm.
2) It is very simple to understand
3) Good interpretability
4) Feature importance falls out of model building: with the regularization hyperparameter lambda (as in Lasso), you can perform feature selection and thereby achieve dimensionality reduction (see the sketch at the end of this answer)
Cons of linear regression:
1) The algorithm assumes the errors are normally distributed, which real data often are not.
2) Multicollinearity among the predictors should be dealt with before building the model.
3) It is sensitive to outliers.
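As an aside on point 4 of the pros above, here is a small, purely illustrative sketch (synthetic data, not the cake data) of how the regularization strength, called alpha in scikit-learn and lambda in most textbooks, drives some Lasso coefficients to exactly zero and thereby performs feature selection:

```python
# Illustrative only: Lasso shrinks some coefficients to exactly zero as alpha grows.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Only the first two of the five features actually drive the response here.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)

for alpha in (0.01, 0.1, 1.0):
    coefs = Lasso(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha}: coefficients = {np.round(coefs, 2)}")
# Larger alpha -> more coefficients forced to zero -> fewer features retained.
```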