In: Economics
Read the essays on Google and summarize in no more than 300 words
Financial econometrics: Past developments and future challenges
Tim Bollerslev
Department of Economics, Duke University, Durham, NC 27708, USA
NBER, Cambridge, MA 02138, USA
Abstract
The field of financial econometrics has had a glamorous run during the life span of the Journal of Econometrics. This note provides a selective summary of the most important developments in the field over the past two decades, notably ARCH and GMM, along with a discussion of promising avenues for future research. © 2001 Elsevier Science S.A. All rights reserved.
JEL classification: C1; G1
Keywords: ARCH; GMM; High-frequency data; Long memory; Continuous-time modeling; Risk-neutral distributions
Summary:
The field of financial econometrics constitutes one of the most active areas of research in econometrics today; it is rare that an issue of the Journal of Econometrics does not contain at least a couple of articles in the area. Until about 20 years ago, most empirical finance papers relied on fairly simplistic data-analysis tools, by statistical and econometric standards. The rapid acceleration in computing power, the increased availability of high-quality data for a range of financial instruments, and the development and adoption of more sophisticated econometric techniques have dramatically changed the field. These advances have not been restricted to academia; they have also profoundly influenced the modern-day practice of finance and investment management. Let us now summarize the past developments.
Among the developments in financial econometrics over the past two decades, time-varying volatility models, in the form of ARCH and stochastic volatility formulations, and robust method-of-moments estimation procedures such as GMM stand out as milestones. Let us now discuss past and future research around these two originally independent conceptions.
Time-varying volatility: Uncertainty plays a central role in financial economics. Although volatility clustering had long been recognized informally, it was only with the advent of ARCH and GARCH models that financial econometricians started to model this phenomenon seriously. As a burgeoning empirical literature has developed, we now have a much better understanding of the salient distributional features of daily and lower-frequency speculative returns. Empirical findings include strongly persistent volatility dependencies, as well as spillovers and linkages across different assets and markets. The past decade also produced important empirical and theoretical results regarding the statistical properties of most univariate ARCH and stochastic volatility models. Notwithstanding these developments, several challenging questions related to the proper modeling of ultra-high-frequency data, longer-run dependencies, and large-dimensional systems remain.
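The volatility clustering described above can be made concrete with a small simulation. The Python sketch below (parameter values are illustrative choices, not taken from the paper) generates a GARCH(1,1) process and checks for positive autocorrelation in squared returns, the statistical signature of clustering:

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate returns r_t = sigma_t * eps_t with GARCH(1,1) variance:
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
    for t in range(n):
        if t > 0:
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

r, sigma2 = simulate_garch11(5000)
# Volatility clustering: squared returns are positively autocorrelated
# even though the returns themselves are serially uncorrelated.
sq = r ** 2
acf1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(acf1 > 0)
```

With alpha + beta close to one, as here, the simulated volatility is strongly persistent, mirroring the "strongly persistent volatility dependencies" found in daily return data.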
Flexible estimation procedures: The notion of instrumental-variables estimation dates back at least to Sargan. The GMM procedure in Hansen offers the first distribution-free estimation framework for the type of multi-period nonlinear moment conditions that often arise from partial-equilibrium restrictions, and it moreover readily accommodates the pronounced volatility clustering documented in the extant ARCH literature. Along with the stochastic discount factor approach, GMM has served as a cornerstone in the empirical asset-pricing literature over the past two decades. From an econometric point of view, subsequent developments include algorithms for practical estimation of the weighting matrix, guidance on the optimal choice of instruments, and extensions of the theory to situations in which the moments must be computed by numerical simulation techniques. Such estimation procedures are likely to play a pivotal role in some of the future research directions.
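As a minimal sketch of the GMM idea, the code below estimates a mean and variance from three over-identifying moment conditions using the standard two-step procedure: a first pass with an identity weighting matrix, then a second pass weighting by the inverse of the estimated moment covariance. The moment conditions and data here are invented for illustration and are not from the paper:

```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, x, W):
    """Quadratic form gbar' W gbar for the (illustrative) moment conditions
    E[x - mu] = 0, E[(x - mu)^2 - s2] = 0, E[(x - mu)^3] = 0."""
    mu, s2 = theta
    g = np.column_stack([x - mu, (x - mu) ** 2 - s2, (x - mu) ** 3])
    gbar = g.mean(axis=0)
    return gbar @ W @ gbar

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.5, size=4000)  # true mu = 2.0, true s2 = 2.25

# Step 1: consistent but inefficient estimate with identity weighting.
res1 = minimize(gmm_objective, x0=[0.0, 1.0], args=(x, np.eye(3)))
mu1, s21 = res1.x

# Step 2: efficient weighting W = S^{-1}, with S the moment covariance
# evaluated at the first-step estimates.
g = np.column_stack([x - mu1, (x - mu1) ** 2 - s21, (x - mu1) ** 3])
S = np.cov(g.T)
res2 = minimize(gmm_objective, x0=res1.x, args=(x, np.linalg.inv(S)))
mu2, s22 = res2.x
print(mu2, s22)
```

The appeal emphasized in the summary is exactly what shows up here: no distributional assumption is needed beyond the moment conditions themselves, and the same recipe applies to nonlinear, multi-period conditions.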
DIRECTIONS AND CHALLENGES FOR FUTURE RESEARCH:
1. Time-varying volatility: High-frequency data have recently become available for a range of different instruments and markets. While such data contain very useful information about a host of market-microstructure issues, it has become clear that they may also hold important information about longer-run interdaily phenomena. The mere size of the databases, often involving millions of observations, presents a number of practical problems related to data verification, storage, and numerical manipulation. With high-frequency financial time series, the distance between observations varies importantly through time: sometimes the market is very active and prices change rapidly, while at other times there are large gaps between successive observations. This suggests the use of marked point processes, or of continuous-time methods in which the sampling theory is determined by some notion of time deformation, as in the mixture-of-distributions hypothesis. Financial markets also exhibit strong periodic dependencies across the trading day, and even the actual prices and bid-ask spreads tend to cluster at discrete support points.
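One concrete way the interdaily information in high-frequency data is exploited is through realized variance: summing squared intraday returns over a trading day. The sketch below is a minimal illustration; the 5-minute grid and the U-shaped intraday volatility pattern (higher activity near the open and close, echoing the periodic dependencies mentioned above) are assumptions made for the example:

```python
import numpy as np

def realized_variance(intraday_returns):
    """Realized variance: the sum of squared intraday returns within a day,
    a model-free estimate of that day's return variance."""
    return np.sum(np.asarray(intraday_returns) ** 2)

# Simulate one trading day of 5-minute log returns (78 intervals in a
# 6.5-hour session) with a deterministic U-shaped volatility pattern.
rng = np.random.default_rng(7)
m = 78
u = np.linspace(0.0, 1.0, m)
intraday_sigma = 0.001 * (1.0 + 0.5 * np.cos(2 * np.pi * u))  # high at open/close
returns = intraday_sigma * rng.standard_normal(m)

rv = realized_variance(returns)
print(rv > 0)
```

Computed day by day over millions of raw observations, series like `rv` compress an unwieldy tick database into a daily volatility measure, which is one answer to the storage and manipulation problems the paragraph raises.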
2. Continuous-time models and risk-neutral pricing: Continuous-time methods and no-arbitrage arguments figure prominently in the theoretical asset-pricing literature. Some of the most influential contributions to date have been derived under very restrictive, and arguably unrealistic, assumptions about the process for the underlying state variables. A number of recent studies have proposed more realistic continuous-time processes, allowing for time-varying volatility in the state variables. Recent research on the link between the probability distributions of actual asset prices and the corresponding risk-neutral probabilities implied by derivative prices has just started to deliver important new insights into the way in which the market prices risk. A lot remains to be done. For instance, the existence of multiple simultaneous pricing errors, such as implied volatility smiles and smirks, is difficult to reconcile within the complete-market framework. Another problem arises in the empirical analysis of no-arbitrage-based models for the term structure of interest rates. This raises important econometric, as well as theoretical, questions concerning the proper treatment of the panel data structure in options and interest-rate data.
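The link between market prices and risk-neutral distributions is usually studied through implied volatilities. The sketch below, a minimal illustration not taken from the paper, inverts the Black-Scholes call-price formula by bisection; computing such implied volatilities across strikes is what traces out the smiles and smirks that a single complete-market model struggles to explain:

```python
import math
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    N = NormalDist().cdf
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=3.0, tol=1e-8):
    """Invert bs_call in sigma by bisection (the call price is
    strictly increasing in sigma, so bisection is safe)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round-trip check: price an option at sigma = 0.25, then recover sigma.
price = bs_call(100.0, 100.0, 0.5, 0.02, 0.25)
iv = implied_vol(price, 100.0, 100.0, 0.5, 0.02)
print(round(iv, 4))  # → 0.25
```

In the empirical work the summary alludes to, `price` would be a market quote rather than a model value, and the failure of `iv` to be constant across strikes and maturities is precisely the pricing-error evidence at issue.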
Conclusion: The field of financial econometrics has had a glamorous run during the life span of the Journal of Econometrics. Out-of-sample forecasting is always marred with difficulties, and simply extrapolating the future vitality of the field from past observations does not necessarily yield optimal predictions. Nonetheless, the multitude of interesting and challenging research questions sets the stage for an equally exciting future.