The economics of data: Using simple model-free volatility in a high-frequency world

John Garvey, Liam A. Gallagher

Research output: Contribution to journal › Article › peer-review

Abstract

This paper examines the practical implications of using high-frequency data in a fast and frugal manner. It recognises the continued widespread application of model-free approaches within many trading and risk management functions. Our analysis of the relative characteristics of four model-free volatility estimates is framed around their relative long-memory effects as measured by the feasible exact local Whittle estimator. For a cross-section of sixteen FTSE-100 stocks over the period 1997-2007, we show that 5-min realized volatility exhibits a higher level of volatility persistence than approaches that use data in a sparse way (close-to-close volatility, high-low volatility and Yang & Zhang volatility). This observation is a useful decision tool for trading and risk management decisions undertaken in a time-constrained task environment. It suggests that traders relying on sparse data (open, high, low and closing price observations) must use intuition and judgement to build long-memory effects into their pricing.
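The three sparse-data estimators named in the abstract all use only daily open, high, low and close (OHLC) observations. A minimal sketch of their standard textbook forms is below (the function names, the synthetic data and the Yang & Zhang weighting constant k = 0.34/(1.34 + (n+1)/(n-1)) follow common conventions and are illustrative assumptions, not the paper's own implementation):

```python
import numpy as np

def close_to_close_vol(close):
    """Classic estimator: sample standard deviation of daily log returns."""
    r = np.diff(np.log(close))
    return np.std(r, ddof=1)

def parkinson_vol(high, low):
    """High-low (Parkinson) range estimator."""
    hl = np.log(high / low)
    return np.sqrt(np.mean(hl ** 2) / (4.0 * np.log(2.0)))

def yang_zhang_vol(open_, high, low, close):
    """Yang & Zhang estimator: combines overnight, open-to-close and
    Rogers-Satchell components; drift-independent by construction."""
    n = len(close) - 1
    o = np.log(open_[1:] / close[:-1])   # overnight log returns
    c = np.log(close[1:] / open_[1:])    # open-to-close log returns
    ho = np.log(high[1:] / open_[1:])
    lo = np.log(low[1:] / open_[1:])
    co = np.log(close[1:] / open_[1:])
    sigma_o2 = np.var(o, ddof=1)                          # overnight variance
    sigma_c2 = np.var(c, ddof=1)                          # open-to-close variance
    sigma_rs2 = np.mean(ho * (ho - co) + lo * (lo - co))  # Rogers-Satchell term
    k = 0.34 / (1.34 + (n + 1) / (n - 1))                 # conventional weight
    return np.sqrt(sigma_o2 + k * sigma_c2 + (1.0 - k) * sigma_rs2)

# Illustrative usage on synthetic intraday paths (~78 five-minute bars/day).
rng = np.random.default_rng(0)
n_days, n_bars = 250, 78
paths = np.exp(np.cumsum(
    rng.normal(0.0, 0.01 / np.sqrt(n_bars), size=(n_days, n_bars)), axis=1))
open_, close = paths[:, 0], paths[:, -1]
high, low = paths.max(axis=1), paths.min(axis=1)

print(close_to_close_vol(close))
print(parkinson_vol(high, low))
print(yang_zhang_vol(open_, high, low, close))
```

By contrast, 5-min realized volatility would sum squared intraday returns over all 78 bars in each day, which is what makes it data-hungry relative to these OHLC-only estimators.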

Original language: English
Pages (from-to): 370-379
Number of pages: 10
Journal: North American Journal of Economics and Finance
Volume: 26
DOIs
Publication status: Published - Dec 2013

Keywords

  • Economics of information
  • High frequency data
  • Long memory effects
  • Model free volatility
