Using OLS, the study of Morelli and Smith (2015) uses cross-sectional data on 2,490 cars for the year 2013 to estimate the factors affecting automobile prices in the state of California. The results of regressing the price variable on a set of explanatory variables are shown in Model (1), where the numbers in parentheses are the robust standard errors of the coefficients.
price = 5647.02 + 5.77 weight + 23.64 mpg + 3573.09 foreign   (1)
           (1042.20)    (1.50)        (13.74)       (1230)
Adj. R^2 = 0.65, n = 2490
where price is in U.S. dollars, weight is in pounds, mpg is the number of miles per gallon, and foreign is a dummy variable that takes the value 1 if the ith car is foreign and 0 if it is domestic.
price = 5524.02 + 6.54 weight + 22.73 mpg + 3568.11 foreign − 93.48 length   (2)
           (1033.10)    (4.85)        (13.68)       (1232)        (32.87)
Adj. R^2 = 0.92, n = 2490
If the F-statistic for the coefficients of the four included variables in Model (2) equals 54.32, does the inclusion of the variable length in Model (2) create an econometric problem? Explain in detail.
price = 5631.24 + 4.95 weight + 25.99 mpg + 3650.22 foreign + 88.31 trunk   (3)
           (1144.67)    (1.62)        (13.54)    (1285.29)       (44.38)
Adj. R^2 = 0.75, n = 2490
Suppose that the correlations between the variable trunk and the variables price, weight, mpg, and foreign are equal to 0.25, 0.49, −0.38, and −0.36, respectively. Based on these correlations, refer to Model (1) and discuss the direction of the bias of each coefficient of the three included variables. What is your opinion about including the variable trunk as an additional regressor in Model (3)? Does the inclusion of the variable trunk violate any of the OLS assumptions? Explain in detail.
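For the bias discussion, the standard omitted-variable bias result (a textbook formula supplied here as an aid; it is not stated in the problem) is the natural starting point. In the one-regressor illustration, with trunk as the omitted variable:

\[
E[\hat{\beta}_1] = \beta_1 + \gamma\,\delta_1,
\qquad
\delta_1 = \frac{\operatorname{Cov}(\mathit{weight},\,\mathit{trunk})}{\operatorname{Var}(\mathit{weight})},
\]

where γ is the effect of trunk on price. The sign of the bias is therefore sign(γ) times the sign of the correlation between trunk and the included regressor. With several included regressors this logic is only a guide, since each δ_j comes from a regression of trunk on all included variables.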
In: Economics
QUESTION 6
Master Limited has the following items in its statement of profit or loss and other comprehensive income for the year ended 30 June 2017:
| Revenue | FC*130,000 |
| Cost of goods sold | FC45,000 |
| Other expenses | FC14,000 |
| Income tax expense | FC12,000 |
*FC = Foreign Currency.
All items were earned and incurred evenly across the year. The following exchange rates applied:
End of reporting period FC1 = $1.44
Average rate for year FC1 = $1.42
The net profit after tax translated into the presentation currency is:
- $41,549.
- $40,972.
- $83,780.
- $84,940.
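A quick arithmetic check (a sketch only, not the required working): AASB 121/IAS 21 translates income and expenses at the rates ruling at the transaction dates, and an average rate is an acceptable approximation when items arise evenly over the year.

% Income statement items arose evenly, so translate NPAT at the average rate
revenue = 130000; cogs = 45000; other = 14000; tax = 12000;  % amounts in FC
avg_rate = 1.42;                                             % FC1 = $1.42
npat_fc = revenue - cogs - other - tax;                      % FC59,000
npat = npat_fc * avg_rate;                                   % $83,780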
QUESTION 7
Banjo Ltd acquired 100% of Wellington Ltd on 1 July 2018. The balance sheet of Wellington Ltd on that date was as follows:
Balance sheet at 1 July 2018

| | NZ$ | | NZ$ |
| Machinery at cost | 560,000 | Share capital | 400,000 |
| Investment property | 400,000 | General reserve | 200,000 |
| Receivables | 100,000 | Retained earnings | 600,000 |
| Cash | 140,000 | | |
| | 1,200,000 | | 1,200,000 |
The balance sheet of Wellington Ltd as at 30 June 2019 is as follows:

Balance sheet as at 30 June 2019

| | NZ$ | | NZ$ |
| Machinery (carrying value) | 300,000 | Share capital | 400,000 |
| Investment property | 400,000 | General reserve | 200,000 |
| Receivables | 500,000 | Retained earnings | 1,000,000 |
| Cash | 600,000 | Accounts payable | 170,000 |
| | | Income tax payable | 30,000 |
| | 1,800,000 | | 1,800,000 |
Relevant exchange rates are as follows:
| | NZ$ | | A$ |
| 1 July 2018 | 1.00 | = | 0.95 |
| 30 June 2019 | 1.00 | = | 0.85 |
| Average 2018-19 | 1.00 | = | 0.90 |
If the functional currency of Wellington Ltd is New Zealand dollars and the presentation currency is Australian dollars, the total assets of NZ$1,800,000 would translate into Australian dollars as:
- $1,560,000
- $1,710,000
- $1,620,000
- $1,530,000
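Again as a sketch of the expected arithmetic: when the functional currency (NZ$) differs from the presentation currency (A$), AASB 121 translates assets and liabilities at the closing rate at the reporting date.

% Assets and liabilities translate at the closing (reporting-date) rate
total_assets_nz = 1800000;
closing_rate = 0.85;                               % NZ$1.00 = A$0.85 at 30 June 2019
total_assets_a = total_assets_nz * closing_rate;   % A$1,530,000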
QUESTION 8
Alpine Limited has the following items in its statement of profit or loss and other comprehensive income:
| | NZ$ |
| Revenue | 140,000 |
| Cost of goods sold | 85,000 |
| Interest expense | 14,000 |
| Income tax expense | 12,000 |
All items arose evenly across the year. The following exchange
rates applied:
End of reporting period NZ$1.00 = A$0.90
Average rate for year NZ$1.00 = A$0.85
The net profit after tax translated into the presentation currency
of A$ is:
- $34,118.
- $46,750.
- $24,650.
- $26,100.
QUESTION 9
Which of the following statements is incorrect?
- Movements in the foreign currency translation reserve must be disclosed.
- Exchange differences included in profit or loss must be disclosed.
- There is no need to disclose if the presentation currency is different from the functional currency.
- AASB 121/IAS 21 The Effects of Changes in Foreign Exchange Rates requires disclosures about the translation of financial statements into other currencies.
QUESTION 10
Under AASB 121 The Effects of Changes in Foreign Exchange Rates,
an entity must disclose which of the following items in
particular?
I. The amount of exchange differences included in profit or loss of
the period.
II. The amount of the exchange difference included directly in
share capital during the period.
III. Whether a change in the functional currency has
occurred.
IV. The reason for using a presentation currency that is different
from the functional currency.
- II and III only.
- I, II, III and IV.
- I, III and IV only.
- I and IV only.
In: Finance
To properly treat patients, drugs prescribed by physicians must have a potency that is accurately defined. Consequently, not only must the distribution of potency values for shipments of a drug have a mean value as specified on the drug's container, but also the variation in potency must be small. Otherwise, pharmacists would be distributing drug prescriptions that could be harmfully potent or have a low potency and be ineffective. A drug manufacturer claims that its drug is marketed with a potency of 5 ± 0.1 milligram per cubic centimetre (mg/cc). A random sample of four containers gave potency readings equal to 4.93, 5.08, 5.03, and 4.89 mg/cc.
(a) Do the data present sufficient evidence to indicate that the
mean potency differs from 5 mg/cc? (Use α = 0.05. Round
your answers to three decimal places.)
1-2. Null and alternative hypotheses:
- H0: μ = 5 versus Ha: μ < 5
- H0: μ ≠ 5 versus Ha: μ = 5
- H0: μ = 5 versus Ha: μ > 5
- H0: μ < 5 versus Ha: μ > 5
- H0: μ = 5 versus Ha: μ ≠ 5
3. Test statistic: t = ______
4. Rejection region: If the test is one-tailed, enter NONE for the
unused region.
t > ______
t < ______
5. Conclusion:
- H0 is not rejected. There is insufficient evidence to indicate that the mean potency differs from 5 mg/cc.
- H0 is not rejected. There is sufficient evidence to indicate that the mean potency differs from 5 mg/cc.
- H0 is rejected. There is sufficient evidence to indicate that the mean potency differs from 5 mg/cc.
- H0 is rejected. There is insufficient evidence to indicate that the mean potency differs from 5 mg/cc.
(b) Do the data present sufficient evidence to indicate that the variation in potency differs from the error limits specified by the manufacturer? (HINT: It is sometimes difficult to determine exactly what is meant by limits on potency as specified by a manufacturer. Since it implies that the potency values will fall into the interval 5.0 ± 0.1 mg/cc with very high probability (the implication is 'almost always'), let us assume that the range 0.2, or (4.9 to 5.1), represents 6σ, as suggested by the Empirical Rule. Note that letting the range equal 6σ rather than 4σ places a stringent interpretation on the manufacturer's claim: we want the potency to fall into the interval 5.0 ± 0.1 with very high probability.) (Use α = 0.05. Round your answers to three decimal places.)
1-2. Null and alternative hypotheses:
- H0: σ2 = 0.0011 versus Ha: σ2 < 0.0011
- H0: σ2 > 0.0011 versus Ha: σ2 < 0.0011
- H0: σ2 = 0.2 versus Ha: σ2 ≠ 0.2
- H0: σ2 = 0.0011 versus Ha: σ2 > 0.0011
- H0: σ2 = 0.2 versus Ha: σ2 > 0.2
3. Test statistic: χ2 = ______
4. Rejection region: If the test is one-tailed, enter NONE for the
unused region.
χ2 > ______
χ2 < ______
5. Conclusion:
- H0 is rejected. There is insufficient evidence to indicate that the variation in potency differs from the specified error limits.
- H0 is rejected. There is sufficient evidence to indicate that the variation in potency differs from the specified error limits.
- H0 is not rejected. There is insufficient evidence to indicate that the variation in potency differs from the specified error limits.
- H0 is not rejected. There is sufficient evidence to indicate that the variation in potency differs from the specified error limits.
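A minimal MATLAB sketch of both computations, as a check on hand work (the critical values are standard table values for df = 3; the two-tailed form matches the "differs from" wording, and a comment notes the one-tailed alternative):

% Potency readings (mg/cc)
x = [4.93 5.08 5.03 4.89];
n = numel(x);
xbar = mean(x);                    % 4.9825
s2 = var(x);                       % sample variance, about 0.00769

% (a) t-test of H0: mu = 5 versus Ha: mu ~= 5
t = (xbar - 5) / sqrt(s2/n);       % about -0.399
t_crit = 3.182;                    % t_{0.025,3} from tables
reject_a = abs(t) > t_crit;        % false: H0 is not rejected

% (b) chi-square test of H0: sigma^2 = sigma0^2, where 6*sigma = 0.2
sigma0_sq = (0.2/6)^2;             % about 0.0011
chi2 = (n-1)*s2 / sigma0_sq;       % about 20.77
reject_b = chi2 > 9.348 || chi2 < 0.216;   % chi^2_{0.025,3} and chi^2_{0.975,3}
% For the one-tailed alternative Ha: sigma^2 > 0.0011, the cutoff is
% chi^2_{0.05,3} = 7.815; either way 20.77 lies in the rejection region.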
In: Statistics and Probability
Matt’s Landscaping Pty Ltd is registered for GST purposes. It accounts for GST on the accruals basis and submits its Business Activity Statements monthly. Assume that all amounts in the question include GST when applicable.
The business operates from Brisbane’s north-side and its main business is to design and establish new gardens at shopping centres and office parks, and to provide ongoing garden maintenance services to clients.
During October 2018, Matt’s Landscaping Pty Ltd was involved in the following transactions:
* Designing and establishing a new garden at the refurbished Westfield shopping centre at North Lakes. The invoice issued on 25 October 2018 totalled $27,600.
* Ongoing garden maintenance services for clients. Invoices issued in October 2018 totalled $55,200.
* Matt’s Landscaping Pty Ltd prepared plans for the elaborate gardens of a new hotel being built in Dubai in the United Arab Emirates. The plans were sent to Dubai by airfreight on 1 October 2018. Matt’s Landscaping Pty Ltd issued an invoice for $17,500 on 15 October 2018.
* Purchases of plants, fertiliser and decorative stones from Brunnings Warehouse, one of Australia’s largest household hardware chains, totalled $9,900.
* The company purchased 100 square meters of Buffalo lawn from a retired school teacher who decided to dig up the lawn of his acreage in Caboolture to put down asphalt so that he no longer had to mow his lawn. The school teacher charged Matt’s Landscaping Pty Ltd $660 on 19 October 2018.
* The company purchased a second-hand lawnmower from a large supplier in Perth for $990 on 1 October 2018. The lawnmower was delivered a day later by a nationwide courier company that charged $110 for this service.
* Salaries and wages paid to staff in October 2018 totalled $12,000.
* Fuel costs for the company’s vehicles, lawnmowers and hedge trimmers totalled $1,980 during October 2018.
* The business operated from rented offices in North Lakes owned by a listed property group. Rent for October 2018 totalled $1,000.
* The October invoice that Matt’s Landscaping Pty Ltd received from the electricity and gas supplier for the rental property totalled $275.
* Purchases of milk, sugar, tea bags and coffee powder from a grocery store totalled $100 in October 2018. These were placed in a tea room in the rented offices for the exclusive use of staff.
* On 1 October 2018, Matt’s Landscaping Pty Ltd purchased a new one and a half tonne utility vehicle for $82,500.
* The company purchased specialised pruning shears from a company in France during October 2018. The shears cost $1,500 and international freight and insurance cost $100.
* Interest paid to the Commonwealth Bank of Australia in October 2018 on an overdraft facility totalled $660.
You are required to:
Calculate the GST payable or GST refundable for October 2018. Show all your calculations and provide reasons for your answers.
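Since Australian GST is 10%, the GST component of a GST-inclusive amount is 1/11 of that amount. The sketch below shows only that mechanic; classifying each transaction (taxable supply, GST-free export, input taxed supply, wages outside the system, and so on) is the substance the question asks you to reason through, so the amounts used here are labelled as assumptions, not answers.

% 1/11 rule: GST component of a GST-inclusive amount A is A/11
gst = @(A) A / 11;

% Illustration only (assumed classifications, not asserted answers):
gst_on_sales = gst(27600) + gst(55200);   % if both are taxable supplies
credit_on_plants = gst(9900);             % if a creditable acquisition
% Net GST = GST on taxable supplies minus input tax credits on
% creditable acquisitions; a negative result is a refund.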
In: Accounting
REM (rapid eye movement) sleep is sleep during which most dreams occur. Each night a person has both REM and non-REM sleep. However, it is thought that children have more REM sleep than adults. Assume that REM sleep time is normally distributed for both children and adults. A random sample of n1 = 10 children (9 years old) showed that they had an average REM sleep time of x̄1 = 2.6 hours per night. From previous studies, it is known that σ1 = 0.8 hour. Another random sample of n2 = 10 adults showed that they had an average REM sleep time of x̄2 = 1.80 hours per night. Previous studies show that σ2 = 0.7 hour. Do these data indicate that, on average, children tend to have more REM sleep than adults? Use a 1% level of significance.
(a) What is the level of significance?
State the null and alternate hypotheses.
H0: μ1 = μ2; H1: μ1 > μ2
H0: μ1 = μ2; H1: μ1 < μ2
H0: μ1 < μ2; H1: μ1 = μ2
H0: μ1 = μ2; H1: μ1 ≠ μ2
(b) What sampling distribution will you use? What assumptions are you making?
The Student's t. We assume that both population distributions are approximately normal with known standard deviations.
The standard normal. We assume that both population distributions are approximately normal with unknown standard deviations.
The Student's t. We assume that both population distributions are approximately normal with unknown standard deviations.
The standard normal. We assume that both population distributions are approximately normal with known standard deviations.
What is the value of the sample test statistic? (Test the difference μ1 − μ2. Round your answer to two decimal places.)
(c) Find (or estimate) the P-value. (Round your answer to four decimal places.)
(d) Based on your answers in parts (a) to (c), will you reject or fail to reject the null hypothesis? Are the data statistically significant at level α?
At the α = 0.01 level, we fail to reject the null hypothesis and conclude the data are not statistically significant.
At the α = 0.01 level, we reject the null hypothesis and conclude the data are not statistically significant.
At the α = 0.01 level, we reject the null hypothesis and conclude the data are statistically significant.
At the α = 0.01 level, we fail to reject the null hypothesis and conclude the data are statistically significant.
(e) Interpret your conclusion in the context of the application.
Fail to reject the null hypothesis, there is insufficient evidence that the mean REM sleep time for children is more than for adults.
Reject the null hypothesis, there is sufficient evidence that the mean REM sleep time for children is more than for adults.
Reject the null hypothesis, there is insufficient evidence that the mean REM sleep time for children is more than for adults.
Fail to reject the null hypothesis, there is sufficient evidence that the mean REM sleep time for children is more than for adults.
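A minimal sketch of the two-sample z computation (the population standard deviations are known, so the standard normal sampling distribution applies; base MATLAB's erfc gives the upper-tail probability):

% Two-sample z test of H0: mu1 = mu2 versus H1: mu1 > mu2
xbar1 = 2.6;  sigma1 = 0.8;  n1 = 10;    % children
xbar2 = 1.8;  sigma2 = 0.7;  n2 = 10;    % adults
z = (xbar1 - xbar2) / sqrt(sigma1^2/n1 + sigma2^2/n2);  % about 2.38
p = 0.5 * erfc(z / sqrt(2));   % upper-tail P(Z > z), about 0.0087
reject = p < 0.01;             % p < alpha = 0.01, so H0 is rejected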
In: Statistics and Probability
Have you ever wondered what it means to click the "offset carbon emissions" button when you book a flight or train trip? It adds a small cost to your ticket, but how does this reduce emissions? The money is typically used to fund projects that reduce carbon emissions. One such project type is the introduction of more efficient cooking stoves into communities. Much of the world uses inefficient charcoal or wood stoves that result in excessive indoor air pollution, deforestation, and carbon emissions. Switching millions of families to more efficient stoves can result in a significant reduction in carbon emissions. In order for a project to claim carbon credits, it must provide accurate estimates of how much carbon the project is saving. An important parameter for cook-stove projects is the reduction in fuel that results from switching to the more efficient stove. Statisticians are needed to design the experiments; it is expensive to do these tests, so figuring out how big the sample size should be in order to get sufficiently accurate estimates, or to detect significant differences between the stove types, is important.

The EXCEL file, stove.xlsx, for this lab contains data from a pilot study using 19 randomly selected cooks. The numbers refer to the weight of firewood (in kg) used to cook a regular meal. Each row in the spreadsheet corresponds to the same cook cooking the same meal. Use these data to answer the following questions. You may assume the conditions to carry out a hypothesis test are satisfied. You can assume (based on many similar studies) that the population standard deviation of the reduction in firewood used is 0.7 kg. Try to store as many decimal places as possible in intermediate steps.

| Old Stove | Improved Stove |
| 3.9 | 1.8 |
| 3.8 | 2.65 |
| 3.65 | 1.5 |
| 3.2 | 2.2 |
| 2.6 | 1.25 |
| 2.4 | 1.65 |
| 2.3 | 1.4 |
| 2.25 | 1.7 |
| 2.2 | 2.15 |
| 2.1 | 1.8 |
| 2 | 1.4 |
| 2 | 1.05 |
| 1.9 | 0.8 |
| 1.9 | 1.75 |
| 1.8 | 0.55 |
| 1.55 | 0.9 |
| 1.4 | 1.3 |
| 1.4 | 1.1 |
| 1.15 | 0.75 |

5. Find the 90% CI for the true mean reduction in firewood used. You may assume the population standard deviation of the reduction in firewood used is 0.7. What is the margin of error (round to 4 decimal places)?
6. For a project to qualify for carbon credits, the required precision for estimates of the amount of wood saved per new stove adopted is 90/10, i.e. the 90% confidence interval must have a margin of error no greater than 10% of the value of the estimate. Will the data from the pilot study enable the project to qualify for carbon credits?
7. What is the minimum sample size required to meet the 90/10 precision requirement?
8. We want to know if the weight of wood used with the improved stove is significantly less than the weight of wood used with the old stove. State the null and alternative hypotheses for such a test.
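A MATLAB sketch for questions 5-7 (z = 1.645 for a 90% interval is hard-coded; treat the commented values as a check on your own work):

% Reduction in firewood for each cook: old minus improved stove
old = [3.9 3.8 3.65 3.2 2.6 2.4 2.3 2.25 2.2 2.1 ...
       2 2 1.9 1.9 1.8 1.55 1.4 1.4 1.15];
imp = [1.8 2.65 1.5 2.2 1.25 1.65 1.4 1.7 2.15 1.8 ...
       1.4 1.05 0.8 1.75 0.55 0.9 1.3 1.1 0.75];
d = old - imp;
n = numel(d);  sigma = 0.7;  z = 1.645;       % sigma assumed known
me = z * sigma / sqrt(n);                     % margin of error, about 0.2642
ci = [mean(d) - me, mean(d) + me];            % 90% CI for mean reduction
% 90/10 rule: me must not exceed 10% of the estimate (0.1*mean(d), about
% 0.083), which 0.2642 exceeds, so the pilot alone does not qualify.
n_min = ceil((z*sigma / (0.1*mean(d)))^2);    % about 192 cooks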
In: Statistics and Probability
Tombro Industries is in the process of automating one of its plants and developing a flexible manufacturing system. The company is finding it necessary to make many changes in operating procedures. Progress has been slow, particularly in trying to develop new performance measures for the factory.
In an effort to evaluate performance and determine where improvements can be made, management has gathered the following data relating to activities over the last four months:
| | Month 1 | Month 2 | Month 3 | Month 4 |
| Quality control measures: | | | | |
| Number of defects | 188 | 166 | 127 | 90 |
| Number of warranty claims | 49 | 42 | 33 | 30 |
| Number of customer complaints | 105 | 99 | 82 | 61 |
| Material control measures: | | | | |
| Purchase order lead time | 10 days | 9 days | 7 days | 5 days |
| Scrap as a percent of total cost | 1% | 1% | 2% | 3% |
| Machine performance measures: | | | | |
| Machine downtime as a percentage of availability | 4% | 5% | 5% | 8% |
| Use as a percentage of availability | 95% | 92% | 89% | 85% |
| Setup time (hours) | 10 | 12 | 13 | 14 |
| Delivery performance measures: | | | | |
| Throughput time | ? | ? | ? | ? |
| Manufacturing cycle efficiency (MCE) | ? | ? | ? | ? |
| Delivery cycle time | ? | ? | ? | ? |
| Percentage of on-time deliveries | 96% | 95% | 92% | 89% |
The president has read in industry journals that throughput time, MCE, and delivery cycle time are important measures of performance, but no one is sure how they are computed. You have been asked to assist the company, and you have gathered the following data relating to these measures:
Average per Month (in days)

| | Month 1 | Month 2 | Month 3 | Month 4 |
| Wait time per order before start of production | 9.0 | 11.2 | 12.0 | 14.0 |
| Inspection time per unit | 0.7 | 0.6 | 0.6 | 0.6 |
| Process time per unit | 2.8 | 2.4 | 1.8 | 1.0 |
| Queue time per unit | 3.1 | 4.2 | 6.0 | 7.6 |
| Move time per unit | 0.4 | 0.6 | 0.6 | 0.8 |
Required:
1-a. Compute the throughput time for each month.
1-b. Compute the manufacturing cycle efficiency (MCE) for each month.
1-c. Compute the delivery cycle time for each month.
3-a. Refer to the inspection time, process time, and so forth, given for month 4. Assume that in month 5 the inspection time, process time, and so forth, are the same as for month 4, except that the company is able to completely eliminate the queue time during production using Lean Production. Compute the new throughput time and MCE.
3-b. Refer to the inspection time, process time, and so forth, given for month 4. Assume that in month 6 the inspection time, process time, and so forth, are the same as in month 4, except that the company is able to eliminate both the queue time during production and the inspection time using Lean Production. Compute the new throughput time and MCE.
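The standard definitions make requirement 1 pure arithmetic: throughput time = process + inspection + move + queue time; MCE = value-added (process) time divided by throughput time; delivery cycle time = wait time + throughput time. A sketch:

% Per-unit times in days for months 1-4, from the table above
wait    = [9.0 11.2 12.0 14.0];
inspect = [0.7  0.6  0.6  0.6];
process = [2.8  2.4  1.8  1.0];
queue   = [3.1  4.2  6.0  7.6];
move    = [0.4  0.6  0.6  0.8];

throughput = process + inspect + move + queue;  % month 1: 7.0 days
mce        = process ./ throughput;             % month 1: 2.8/7.0 = 0.40
delivery   = wait + throughput;                 % month 1: 16.0 days

% 3-a: month 5 repeats month 4 with queue time eliminated
tp5  = 1.0 + 0.6 + 0.8;       % 2.4 days
mce5 = 1.0 / tp5;             % about 0.417
% 3-b: month 6 also eliminates inspection time
tp6  = 1.0 + 0.8;             % 1.8 days; MCE = 1.0/1.8, about 0.556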
In: Accounting
Accurate and Efficient Reporting
Case Study- 1: Delta Lloyd Group Ensures Accuracy and Efficiency in
Financial Reporting
Case Study- 2: Flood of Paper Ends at FEMA
Learning Objectives:
• How to improve accuracy in reporting
• Compliance with industry regulations
• How to improve efficiency in reporting
Case Study- 1: Delta Lloyd Group Ensures Accuracy and Efficiency in
Financial Reporting
Delta Lloyd Group is a financial services provider based in the Netherlands. It offers insurance, pensions, investing, and banking services to its private and corporate clients through its three strong brands: Delta Lloyd, OHRA, and ABN AMRO Insurance. Since its founding in 1807, the company has grown in the Netherlands, Germany, and Belgium, and now employs around 5,400 permanent staff. Its 2011 full-year financial reports show €5.5 billion in gross written premiums, with shareholders' funds amounting to €3.9 billion and investments under management worth nearly €74 billion.
Challenges:
Since Delta Lloyd Group is publicly listed on the NYSE Euronext Amsterdam, it is obliged to produce annual and half-year reports. Various subsidiaries in Delta Lloyd Group must also produce reports to fulfill local legal requirements: for example, banking and insurance reports are obligatory in the Netherlands. In addition, Delta Lloyd Group must provide reports to meet international requirements, such as IFRS (International Financial Reporting Standards) for accounting and the EU Solvency I Directive for insurance companies. The data for these reports is gathered by the group's finance department, which is divided into small teams in several locations, and then converted into XML so that it can be published on the corporate Web site.
Importance for Accuracy:
The most challenging part of the reporting process is the "last mile": the stage at which the consolidated figures are cited, formatted, and described to form the final text of the report. Delta Lloyd Group was using Microsoft Excel for the last-mile stage of the reporting process. To minimize the risk of errors, the finance team needed to manually check all the data in its reports for accuracy. These manual checks were very time-consuming. Arnold Honig, team leader for reporting at Delta Lloyd Group, comments: "Accuracy is essential in financial reporting, since errors could lead to penalties, reputational damage, and even a negative impact on the company's stock price. We needed a new solution that would automate some of the last-mile processes and reduce the risk of manual error."
Solution
The group decided to implement IBM Cognos Financial Statement
Reporting (FSR). The implementation of the software was completed
in just 6 weeks during the late summer. This rapid implementation
gave the finance department enough time to prepare a trial draft of
the annual report in FSR, based on figures from the third financial
quarter. The successful creation of this draft gave Delta Lloyd
Group enough confidence to use Cognos FSR for the final version of
the annual report, which was published shortly after the end of the
year.
Results
Employees are delighted with the IBM Cognos FSR solution. Delta
Lloyd Group has divided the annual report into chapters, and each
member of the reporting team is responsible for one chapter. Arnold
Honig says, "Since employees can work on documents simultaneously,
they can share the huge workload involved in report generation.
Before, the reporting process was inefficient, because only one
person could work on the report at a time."
Since the workload can be divided up, staff can complete the report
with less overtime. Arnold Honig comments, "Previously, employees
were putting in 2 weeks of overtime during the 8 weeks required to
generate a report. This year, the 10 members of staff involved in
the report generation process worked 25 percent less overtime, even
though they were still getting used to the new software. This is a
big win for Delta Lloyd Group and its staff." The group is
expecting further reductions in employee overtime in the future as
staff becomes more familiar with the software.
Accurate Reports
The IBM Cognos FSR solution automates key stages in the
report-writing process by populating the final report with
accurate, up-to-date financial data. Wherever the text of the
report needs to mention a specific financial figure, the finance
team simply inserts a "variable", a tag that is linked to an
underlying data source. Wherever the variable appears in the
document, FSR will pull the figure through from the source into the
report. If the value of the figure needs to be changed, the team
can simply update it in the source, and the new value will
automatically flow through into the text, maintaining accuracy and
consistency of data throughout the report.
Arnold Honig comments, "The ability to update figures automatically across the whole report reduces the scope for manual error inherent in spreadsheet-based processes and activities. Since we have full control of our reporting processes, we can produce better quality reports more efficiently and reduce our business risk." IBM Cognos FSR also provides a comparison feature, which highlights any changes made to reports. This feature makes it quicker and easier for users to review new versions of documents and ensure the accuracy of their reports.
Adhering to Industry Regulations
In the future, Delta Lloyd Group is planning to extend its use of
IBM Cognos FSR to generate internal management reports. It will
also help Delta Lloyd Group to meet industry regulatory standards,
which are becoming stricter. Arnold Honig comments, "The EU Solvency II Directive will come into effect soon, and our Solvency II reports will need to be tagged with eXtensible Business Reporting Language [XBRL]. By implementing IBM Cognos FSR, which fully supports XBRL tagging, we have equipped ourselves to meet both current and future regulatory requirements."
Case Study- 2: Flood of Paper Ends at FEMA
Staff at the Federal Emergency Management Agency (FEMA), a U.S.
federal agency that coordinates disaster response when the
President declares a national disaster, always got two floods at
once. First, water covered the land. Next, a flood of paper,
required to administer the National Flood Insurance Program (NFIP),
covered their desks: pallets and pallets of green-striped reports
poured off a mainframe printer and into their offices. Individual
reports were sometimes 18 inches thick, with a nugget of
information about insurance claims, premiums, or payments buried in
them somewhere. Bill Barton and Mike Miles don't claim to be able
to do anything about the weather, but the project manager and
computer scientist, respectively,
from Computer Sciences Corporation (CSC) have used WebFOCUS
software from Information
Builders to turn back the flood of paper generated by the NFIP. The
program allows the government to work together with national
insurance companies to collect flood insurance premiums and pay
claims for flooding in communities that adopt flood control
measures. As a result of CSC's work, FEMA staff no longer leaf
through paper reports to find the data they need. Instead, they
browse insurance data posted on NFIP's BureauNet intranet site,
select just
the information they want to see, and get an onscreen report or
download the data as a spreadsheet.
And that is only the start of the savings that WebFOCUS has
provided. The number of times that
NFIP staff ask CSC for special reports has dropped by half,
because NFIP staff can generate many of the special reports they
need without calling on a programmer to develop them. Then there is
the cost of creating BureauNet in the first place. Barton estimates
that using conventional Web and database software to export data
from FEMA's mainframe,
store it in a new database, and link that to a Web server would have cost about 100 times as much (more than $500,000) and taken about two years to complete, compared with the few months Miles spent on the WebFOCUS solution.
When Tropical Storm Allison, a huge slug of sodden, swirling
clouds, moved out of the Gulf of
Mexico onto the Texas and Louisiana coastline in June 2001, it
killed 34 people, most from drowning; damaged or destroyed 16,000
homes and businesses; and displaced more than 10,000 families.
President George W. Bush declared 28 Texas counties disaster areas,
and FEMA moved in to help. This was the first serious test for
BureauNet, and it delivered. This first comprehensive use of
BureauNet resulted in FEMA field staff readily accessing what they
needed and when they needed it, and asking for many new types of
reports. Fortunately, Miles and WebFOCUS were up to the task. In
some cases, Barton says, "FEMA would ask for
a new type of report one day, and Miles would have it on BureauNet
the next day, thanks to the speed with which he could create new
reports in WebFOCUS.”
The sudden demand on the system had little impact on its performance, notes Barton. "It handled the demand just fine," he says. "We had no problems with it at all. And it made a huge difference to FEMA and the job they had to do. They had never had that level of access before, never had been able to just click on their desktop and generate such detailed and specific reports."
Simulation
Case Study- 1: Agent-based simulation helps Analyze Spread of a
Pandemic Outbreak
Learning Objectives:
• Know the concepts behind and applications of agent-based simulation
Agent-based simulation helps Analyze Spread of a Pandemic
Outbreak
Knowledge about the spread of a disease plays an important role in
both preparing for and responding to a pandemic outbreak. Previous
models for such analyses are mostly homogenous and make use of
simplistic assumptions about transmission and the infection rates.
These models assume that each individual in the population is
identical and typically has the same number of potential contacts
with an infected individual in the same time period.
Each infected individual is also assumed to have the same probability of transmitting the disease. Using these models, it becomes extremely difficult under limited resources to implement mitigation strategies such as vaccinating susceptible individuals and treating infected individuals.
In order to effectively choose and implement a mitigation strategy,
modeling of the disease spread has to be done across the specific
set of individuals, which enables researchers to prioritize the
selection of individuals to be treated first and also gauge the
effectiveness of mitigation strategy.
Although nonhomogenous models for spread of a disease can be built
based on individual characteristics using the interactions in a
contact network, such individual levels of infectivity and
vulnerability require complex mathematics to obtain the information
needed for such models.
Simulation techniques can be used to generate hypothetical outcomes
of disease spread by simulating events on the basis of hourly,
daily, or other periods and tallying the outcomes throughout the
simulation. A nonhomogenous agent-based simulation approach allows
each member of the population to be simulated individually,
considering the unique individual characteristics that affect the
transmission and infection probabilities. Furthermore, individual
behaviors that affect the type and length of contact between
individuals, and the possibility of infected individuals recovering
and becoming immune, can also be simulated via agent-based
models.
One such simulation model, built for the Ontario Agency for Health
Protection and Promotion
(OAHPP) following the global outbreak of severe acute respiratory
syndrome (SARS) in 2002-2003, simulated the spread of disease by
applying various mitigation strategies. The simulation models each
state of an individual in each time unit, based on the individual
probabilities to transition from susceptible state to infected
stage and then to recovered state and back to susceptible state.
The simulation model also uses an individual's duration of contact
with infected individuals. The model also accounts for the rate of
disease transmission per time unit based on the type of contact
between individuals and for behavioral changes of individuals in a
disease progression (being quarantined or treated or recovered). It
is flexible enough to consider several factors affecting the
mitigation strategy, such as an individual's age, residence, level
of general interaction with other members of population, number of
individuals in each household, distribution of households, and
behavioral aspects involving daily commutes, attendance at schools,
and the asymptomatic time period of the disease.
The simulation model was tested to measure the effectiveness of a
mitigation strategy involving an advertising campaign that urged
individuals who have symptoms of disease to stay at home rather
than commute to work or school. The model was based on a pandemic
influenza outbreak in the greater Toronto area. Each individual
agent, generated from the population, was sequentially assigned to
households. Individuals were also assigned to different ages based
on census age distribution; all other pertinent demographic and
behavioral attributes were assigned to the individuals.
The model considered two types of contact: close contact, which involved members of the same household or commuters on public transport; and casual contact, which involved random individuals within the same census tract. Influenza pandemic records provided past disease transmission data, including transmission rates and contact times for both close and casual contacts. The effect of public transportation was simplified with the assumption that every individual of working age used the nearest subway line to travel.
An initial outbreak of infection was fed into the model. A total of
1,000 such simulations were conducted.
The results from the simulation indicated that there was a
significant decrease in the levels of infected and deceased persons
as an increasing number of infected individuals followed the
mitigation strategy of staying at home. The results were also
analyzed by answering questions that sought to verify issues such
as the impact of 20 percent of infected individuals staying at home
versus 10 percent staying at home. The results from each of the simulation outputs were fed into geographic information system software, ESRI ArcGIS, and detailed shaded maps of the greater Toronto area were produced, showing the spread of disease based on the average number of cumulative infected individuals. This helped to determine the effectiveness of a particular mitigation strategy. This agent-based simulation model provides a what-if analysis tool that can be used to compare relative outcomes of different disease scenarios and mitigation strategies and to help in choosing the most effective mitigation strategy.
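The case describes the model in prose only. As a concrete illustration, here is a deliberately homogeneous toy of the core agent-based loop (all parameters are invented for illustration; the whole point of the OAHPP model is that real agents carry individual attributes, contact patterns, and behaviours that this toy omits):

% Toy agent-based S-I-R loop (illustrative parameters only)
N = 1000;                        % number of agents
state = zeros(1, N);             % 0 = susceptible, 1 = infected, 2 = recovered
state(randi(N, 1, 5)) = 1;       % seed a few initial infections
p_transmit = 0.08;               % per-contact transmission probability (assumed)
p_recover  = 0.10;               % per-day recovery probability (assumed)
for day = 1:120
    contact = randi(N, 1, N);                % each agent meets one random agent
    meets_inf = state(contact) == 1;         % was the contact infectious?
    new_inf = (state == 0) & meets_inf & (rand(1, N) < p_transmit);
    new_rec = (state == 1) & (rand(1, N) < p_recover);
    state(new_inf) = 1;                      % new infections
    state(new_rec) = 2;                      % recoveries
end
fprintf('susceptible %d, infected %d, recovered %d\n', ...
    sum(state == 0), sum(state == 1), sum(state == 2));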
Case Study- 1: Delta Lloyd Group Ensures Accuracy and
Efficiency in Financial Reporting
Case Study- 2: Flood of Paper Ends at FEMA
Activity
Based on the above case studies discussion answer the following
questions:
1. How did Delta Lloyd Group improve accuracy and efficiency in
financial reporting?
2. What were the challenges, the proposed solution, and the
obtained results?
3. Why is it important for any organization to comply with industry
regulations?
4. What are the main challenges that FEMA faces?
5. How did FEMA improve its inefficient reporting practices?
Simulation
Case Study- 1: Agent-based simulation helps Analyze Spread of a
Pandemic Outbreak
Learning Objectives:
• Know the concepts behind and applications of agent-based simulation
___________________________________________________
Activity
Based on the above case studies discussion answer the following
questions:
1. What are the characteristics of an agent-based simulation
model?
2. List the various factors that were fed into the agent based
simulation model described in the case.
3. Elaborate on the benefits of using agent-based simulation
models.
4. Besides disease prevention, in which other situations could
agent-based simulation be employed?
Please give short answers; she does not want copy-and-paste, she wants them in my own style.
Please answer the case studies and the questions underneath; the assignment closes at ten.
In: Computer Science
Write MATLAB programs implementing the algorithms based on the bisection, Newton, and secant methods for the numerical solution of scalar nonlinear equations. Use these programs to compute approximations to real roots of the
following equations:
exp(x) − 3x^2 = 0, (1)
x^3 = x^2 + x + 1, (2)
exp(x) = 1/(0.1 + x^2), (3)
and
x = 1 + 0.3 cos(x). (4)
Use an error tolerance tol = 10^(−12). Display the obtained approximations to the roots of these equations, and compare the numbers of iterations, starting with the same initial value x0 for the bisection and Newton methods, and with x0 and x1 for the secant method, where x1 is computed by bisection starting from x0.
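A minimal sketch of one of the three requested programs (bisection; it assumes the caller supplies a bracket [a, b] with a sign change, and stops when the bracket half-width falls below tol, which is one reasonable reading of the tolerance):

% Bisection method: root of f in [a, b], assuming sign(f(a)) ~= sign(f(b))
function [x, iters] = bisect(f, a, b, tol)
    fa = f(a);
    iters = 0;
    while (b - a)/2 > tol
        x = (a + b)/2;  fx = f(x);
        iters = iters + 1;
        if fx == 0
            return                  % landed exactly on the root
        elseif sign(fx) == sign(fa)
            a = x;  fa = fx;        % root lies in [x, b]
        else
            b = x;                  % root lies in [a, x]
        end
    end
    x = (a + b)/2;
end

% Example call for equation (1), which changes sign on [-1, 0]:
%   [x, k] = bisect(@(x) exp(x) - 3*x.^2, -1, 0, 1e-12)

The Newton and secant solvers follow the same shell, replacing the midpoint step with x = x - f(x)/fprime(x) and with the secant update through the last two iterates, respectively, and stopping on the step size instead of the bracket width.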
In: Advanced Math
In: Chemistry