database compiled by the Bureau of Economic Analysis and describing interactions between sectors of the US economy.
We show that long memory can naturally arise when a large number of simple linear homogeneous economic subsystems with short memory are interconnected to form a network. The long-memory behavior is then largely determined by the geometry of the network, while being relatively insensitive to the specific behavior of individual subsystems. Under weak regularity conditions, the power spectrum of the network's response to exogenous short-memory noise exhibits the same signature as that of a fractionally integrated process, with the memory parameter related to the scaling properties of the network (its spectral dimension). This work not only provides a plausible structural model for the generation of fractionally integrated long-memory processes, but also demonstrates that long memory is possible without nonlinearity, heterogeneity, unit roots or near unit roots, learning, or structural breaks (although these mechanisms can obviously play a role as well). The proposed approach also makes a direct connection between the literatures on long-memory processes, economic networks, and diffusion on fractals. It also suggests that the spectral dimension would be a very useful descriptor to add to the list of commonly used summary statistics (e.g., de Paula (2016)) that characterize networks (degree distribution, centrality, betweenness, etc.).
Models including realized measures in the GARCH equation (i.e. GARCH-X) were introduced by Engle (2002) and further studied by Visser (2010). Hansen et al. (2012) completed GARCH-X models with a measurement equation for the realized measure, leading to the class of Realized GARCH models. Later, Hansen and Huang (2012) introduced the Realized EGARCH to account for leverage effects, and Hansen et al. (2014b) introduced the multivariate Realized Beta GARCH. Competing models include the multiplicative error model (MEM) of Engle and Gallo (2006) and the HEAVY model of Shephard and Sheppard (2010). Further models were constructed to directly forecast the realized measures instead of the conditional variance of returns and include ARFIMA models (Andersen et al. (2003)), long-memory factor models (Luciani and Veredas (2015)) and the well-known HAR-RV models (Corsi (2009)). They are of particular interest in this paper as they all accommodate long-range dependence in realized measures (see Andersen et al. (2003)) and will be part of the set of competing models in the section devoted to forecasting. Further high-dimensional semi-parametric approaches include Barigozzi et al. (2014).
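The HAR-RV model mentioned above reduces to a simple OLS regression of today's realized variance on its lagged daily value and its weekly and monthly averages. The following sketch illustrates the cascade structure; the function name `har_rv_fit` is our own illustration, not from any of the cited papers:

```python
import numpy as np

def har_rv_fit(rv):
    """Fit the HAR-RV cascade regression of Corsi (2009) by OLS:
    RV_t = b0 + b_d*RV_{t-1} + b_w*mean(RV_{t-5..t-1}) + b_m*mean(RV_{t-22..t-1}) + e_t.
    Returns the coefficient vector (b0, b_d, b_w, b_m)."""
    rv = np.asarray(rv, dtype=float)
    y, X = [], []
    for t in range(22, len(rv)):
        daily = rv[t - 1]               # yesterday's realized variance
        weekly = rv[t - 5:t].mean()     # average over the past week
        monthly = rv[t - 22:t].mean()   # average over the past month
        X.append([1.0, daily, weekly, monthly])
        y.append(rv[t])
    X, y = np.array(X), np.array(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

Despite containing only three lags of averaged RV, this simple structure is known to mimic the slowly decaying autocorrelations that genuinely long-memory models produce.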
The organization of the paper is as follows. Section 2 develops the RSV model with general Gegenbauer long memory and discusses the differences from the model with seasonal long memory. Section 3 explains the estimation method based on the Whittle likelihood under predetermined Gegenbauer frequencies, and presents the approach of Hidalgo and Soulier (2004) for estimating and selecting the Gegenbauer frequencies. Section 3 also provides the finite sample properties of these estimators and the likelihood ratio statistic for testing seasonal long memory against general long memory. Section 4 presents empirical results using the daily returns and realized volatility measures of three stock indices, namely the Standard & Poor's 500, FTSE 100, and Nikkei 225. Section 5 provides some concluding remarks.
Caporale, G.M. and Gil-Alana, L.A. (2008), Modelling the US, UK and Japanese unemployment rates: fractional integration and structural breaks, Computational Statistics and Data Analysis 52(11), 4998-5013.
Crato, N. and Rothman, P. (1996), Measuring hysteresis in unemployment rates with long memory models, Working Paper, East Carolina University, Department of Economics.
Dahlhaus, R. (1989), Efficient parameter estimation for self-similar processes, Annals of Statistics 17, 1749-1766.
Souza (2006) and Diongue et al. (2009) stress the long-memory properties of electricity
time series and suggest that Gegenbauer models are useful for analyzing these datasets, because they allow for different degrees of long memory at arbitrary periodic frequencies. An unresolved issue, however, is how to select the number of cyclical components that have to be modeled. This is why we propose a model selection procedure that consistently estimates the required model order and demonstrate how it can be applied to the analysis of electricity load data.
The paper develops a novel realized stochastic volatility model of asset returns and realized volatility that incorporates general asymmetry and long memory (hereafter the RSV-GALM model). The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988), especially for specifying causal effects from returns to future volatility. This paper discusses asymptotic results for a Whittle likelihood estimator of the RSV-GALM model and a test for general asymmetry, and analyses their finite sample properties. The paper also develops an approach to obtain volatility estimates and out-of-sample forecasts. Using high frequency data for three US financial assets, the new model is estimated and evaluated. The paper compares the forecasting performance of the new model with that of a realized conditional volatility model.
Post-crisis (2010-2016) 0.443 0.387 0.490
*: non-rejection cases of the I(1) hypothesis at the 5% level.
Fear is one of the strongest emotions in financial markets; it can cause market crashes as well as price bubbles. However, its features are still relatively unexplored. We investigate its persistence properties by applying two different long-memory approaches (R/S analysis and fractional integration) to daily and monthly series for the VIX Index (the most commonly used quantitative measure of market fear) for the period from 2004
fear gauge’ that reaches higher levels during periods of market turmoil. We use two different long-memory approaches (R/S analysis with the Hurst exponent method and fractional integration) to analyse the persistence of the VIX over the sample period 2004-2016, as well as some sub-periods (pre-crisis, crisis and post-crisis), to see whether it varies over time depending on market conditions.
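As an illustration of the first of these approaches, classical rescaled-range analysis estimates the Hurst exponent from the slope of log(R/S) against log(window size). A minimal sketch follows; the helper name `hurst_rs` and the window-size scheme are our own choices, not the procedure used in the paper:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent H by classical rescaled-range analysis:
    for each window size n, average the R/S statistic over non-overlapping
    windows, then fit log(R/S) = const + H*log(n) by least squares."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    sizes = np.unique(np.floor(N / np.arange(1, N // min_chunk + 1)).astype(int))
    sizes = sizes[sizes >= min_chunk]
    log_n, log_rs = [], []
    for n in sizes:
        rs_vals = []
        for start in range(0, N - n + 1, n):
            seg = x[start:start + n]
            z = np.cumsum(seg - seg.mean())     # cumulative deviations
            r = z.max() - z.min()               # range of cumulative deviations
            s = seg.std(ddof=1)                 # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]      # slope = estimated H
```

For a short-memory series H should be close to 0.5, while persistent long-memory series yield H above 0.5; in small samples the classical R/S statistic is known to be biased, which motivates refinements such as Lo's (1991) modified R/S.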
hemodynamic activities. The proposed model also implies that the FI process is a more appropriate model for resting-state BOLD signals with long memory than the FGN process. This linear long-memory model of the hemodynamic response can be extended to the nonlinear case based on the Volterra series expansion. Theory suggests that the long-memory phenomenon is hardly influenced by the nonlinearity of the hemodynamic system. Note that the nonlinearity of hemodynamics may have an effect on functional connectivity, and thus would increase the difference in statistical properties between neuronal activities and BOLD signals. This hemodynamic response model has an important implication, namely that the HRF is not static over time and may be subject to the current and past states of neuronal activity. The dependence of the hemodynamic response on the history of neuronal activities has been named history-dependent excitability (HDE). The physical and biological mechanism that links the dynamic change of the HRF with the history of neuronal activities remains unknown. Nevertheless, it can be concluded that the hemodynamic activity is in the critical state generating its fractal behavior when the input neuronal activity satisfies the short-memory condition and the corresponding BOLD signal has long memory.
5.4 Results III: FIEGARCH-Jump
The main results of the Monte Carlo experiment are summarized in tables in Appendix G. In the first two tables, the MLE, FWE and MODWT-based WWE are compared in terms of individual parameter estimation performance; results for the DWT-based WWE are not included due to limited space. Concerning the comparison of these two estimators, the overall performance of the MODWT-WWE is better than that of the DWT-WWE both in terms of bias and RMSE, and considering also the sample-size limitation, the MODWT-WWE is strictly preferred. This is also supported by the forecasting results presented in the next tables. Next, focusing on the relative performance of the MLE, FWE and MODWT-WWE in terms of RMSE for jumps and d = 0.25, the MLE, despite being affected by the residual jump effects, remains the best, followed by the two Whittle estimators, which perform comparably, with the FWE working slightly better in most cases. Yet, the bias of the MLE is significant, and we would prefer the FWE considering both the bias and the RMSE; in the case of longer time series, the WWE seems to be the best option due to the faster bias decay. Next, for d = 0.45, the MLE performance is very poor and the use of the FWE is preferable. As expected, the bias and RMSE of the individual parameter estimates, as well as the mean absolute deviation and RMSE of the out-of-sample forecasts, decline, and the overall in-sample fit improves, as the sample size increases and the long memory weakens. Finally, the constant term estimation performance is worth mentioning, since it is very poor in the case of the MLE under strong long memory, and therefore an ex ante estimation, as in the case of the FWE and WWE, is appropriate.
The standard CUSUM test has the shortcoming that it is developed for independent or serially correlated series, but not for long-range dependent data. Long-memory time series and series with shifts in mean are easily confused, since they can show similar characteristics such as hyperbolically decaying autocorrelations or a pole in the periodogram at Fourier frequencies close to zero (see for example Diebold and Inoue (2001) or Granger and Hyung (2004)). Therefore,
Source: Authors’ calculations
Notes: ** denotes significance at the 5% level and rejection of the null of short memory. For real GDP, the null of I(0) stationarity is rejected on the basis of Lo's (1991) modified R/S and the Giraitis et al. (2003) V/S test. The Robinson and Lobato (1998) test does not lead to a rejection of the null of short memory. No evidence of long memory is found for real GDP growth. Overall, the non-parametric test results provide evidence against the unit root hypothesis and in favour of long memory (fractional integration) in UK real GDP, with the estimated order of integration (d) ranging from 0.068 to 0.484. For the unemployment rate series, there is evidence of short-memory behaviour, in line with Gil-Alana (2001), Gil-Alana et al. (2003) and Caporale and Gil-Alana (2008b). The same holds for inflation.
The FIGARCH equation was introduced by Baillie et al. (1996) to capture the long-memory effect in volatility. They argued that a stationary FIGARCH process, if it exists, has long memory. However, the existence of a non-trivial FIGARCH process with finite mean has never been shown. See Giraitis et al. (2000a), Kazakevicius and Leipus (2003), Mikosch and Starica (2000, 2003) and Davidson (2004) for a discussion of the controversies surrounding the FIGARCH. Several papers (Giraitis et al. (2000a, 2002), Kazakevicius and Leipus (2003)) claim that the FIGARCH equation has no stationary solution with finite mean besides the trivial
Clark, P.K. (1973). A subordinated stochastic process model with ﬁxed variance for speculative prices. Econometrica, 41, 135–156.
Deo, R.S., Hurvich, C.M. (2001). On the log periodogram regression estimator of the memory parameter in long memory stochastic volatility models, Econometric Theory, 17, 686-710.
Dissanayake, G., Peiris, S., Proietti, T. (2016). State space modeling of Gegenbauer processes
In this paper we report an analysis of several data series generated by a particular set of choices which are often believed to be strongly affected by habit: the viewing of television programmes. We present evidence that, consistent with a cross-sectional aggregation result which we discuss below, the temporal aggregate of individual choices exhibits long memory as represented by a fractional differencing parameter, d, greater than zero. We also investigate the proposition that temporal aggregation of a fractionally integrated series leaves the value of d unchanged, an implication of the self-similar behaviour of long-memory processes.
The phenomenon of long memory has been known for years in fields like hydrology and physics. The hydrologist Hurst (1951) was the first to formally study how long periods of dryness of the Nile river were followed by long periods of floods. A formal theory of long-memory processes was subsequently formulated by Mandelbrot (1975), who introduced the fractional Brownian motion and studied the so-called self-similarity property. The introduction of fractional integration in economics and econometrics dates back to Granger (1980) and Granger and Joyeux (1980), who defined the autoregressive fractionally integrated moving average (ARFIMA henceforth) model. Similarly to hydrological and climatological time series, many economic and financial time series show evidence of being neither integrated of order zero (I(0) henceforth) nor integrated of order one (I(1) henceforth). In these circumstances the use of ARFIMA models becomes necessary. Nowadays, a broad range of applications in finance and macroeconomics shows that long-memory models are relevant; see among others Diebold et al. (1991) for exchange rate data, Andersen et al. (2001a) and Andersen et al. (2001b) for financial volatility series, and Baillie et al. (1996) for inflation data. Early papers on the estimation of long-memory models are due to Fox and Taqqu (1986), Dahlhaus (1989), Sowell (1992) and Robinson (1995).
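The fractional difference operator (1-L)^d at the heart of the ARFIMA model has the binomial expansion with weights w_0 = 1 and w_k = w_{k-1}(k-1-d)/k, and an ARFIMA(0,d,0) path can be simulated by applying the inverse filter (1-L)^{-d} to white noise. The sketch below illustrates this via a truncated MA representation; function names and the truncation scheme are our own illustrative choices:

```python
import numpy as np

def fracdiff_weights(d, n):
    """First n binomial weights of (1-L)^d: w_0 = 1, w_k = w_{k-1}*(k-1-d)/k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def simulate_arfima0d0(d, n, rng):
    """Simulate n observations of ARFIMA(0,d,0) by applying the
    truncated MA(inf) filter (1-L)^{-d} to Gaussian white noise."""
    psi = fracdiff_weights(-d, n)            # MA weights of (1-L)^{-d}
    eps = rng.standard_normal(2 * n)         # extra noise for burn-in
    return np.convolve(eps, psi)[n:2 * n]    # drop the burn-in segment
```

For 0 < d < 0.5 the resulting series is stationary with autocorrelations decaying hyperbolically rather than geometrically, which is the defining long-memory feature exploited throughout this literature.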
We estimate the degree of fractional long memory for inflation and industrial production in 11 EMU countries for the period from January 1999 to June 2019, where we distinguish between core and periphery countries. Our results suggest breaks in the persistence of both inflation and industrial production around the beginning of the financial crisis in 2007-08 for a majority of countries in our sample. Inflation is generally found to be more persistent in the crisis period, with higher persistence estimated for periphery countries compared to core countries. This implies that inflation rates were more integrated in the EMU during the pre-crisis period, while the higher persistence during the crisis period carries the danger of diverging processes in case of asymmetric inflation shocks. By contrast, industrial production is found to be more persistent in the core countries of the sample. Again, most countries also experienced higher persistence in industrial production during the crisis, but Belgium, Finland and Greece show the opposite pattern, with lower persistence after 2008.
September 28, 2020
This paper considers estimation and testing of multiple breaks that occur at unknown dates in multivariate long-memory time series. We propose a likelihood-ratio-based approach for estimating breaks in the mean and the covariance of a system of long-memory time series. The limiting distribution of these estimates as well as the consistency of the estimators are derived. A testing procedure to determine the unknown number of break points is given, based on iterative testing on the regression residuals. A Monte Carlo exercise shows the finite sample performance of our method. An empirical application to inflation series illustrates the usefulness of our procedures.
parameters. This was found to cause both unstable parameter estimates and poor-quality standard errors of the parameter estimates. However, robust Wald test statistics of the 19 parameter restrictions which reduce the unrestricted ARFIMA(22, d, 0) model to the RARFIMA(22, d, 0) are presented in Table 6. The Wald tests generally reject the restrictions that are consistent with a HAR model. From Table 6 it can be seen that the HAR model restrictions cannot be rejected for Canada, Japan or the S&P 500, although there is considerable variation among the short-memory HAR parameters, with several not being significant. The MLE of the long-memory parameter d is around 0.30 for four of the RV series and is not significantly different from zero for the Euro or the S&P 500 RV series. In general these results suggest that the HAR model provides a useful representation of some of the low-order dynamics of RV, but that long memory also plays an important role in describing higher-order dynamics.
Gourieroux and Jasiak (2001) show that breaks in the mean of a time series can also lead to biased estimates of the covariance structure of the process, again falsely indicating long memory.
Beyond the aforementioned papers, Künsch (1986), Parke (1999) and Diebold and Inoue (2001) construct diverse processes which share the autocorrelation structure of a long-memory time series, at least in finite samples. Künsch (1986) considers monotonic deterministic trends. Parke (1999) constructs a random level-shift process where the shift probability is derived from a heavy-tailed distribution. A similar process is considered in Mikosch et al. (2002). This process has the same autocorrelation function as a long-memory process. However, Davidson and Sibbertsen (2005) show that its partial sums converge to a Lévy motion with independent increments, not exhibiting long memory. Only cross-sectional aggregation of these processes generates long memory, asymptotically.
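A minimal sketch of such a random level-shift mechanism, in the spirit of Diebold and Inoue (2001), is shown below: a genuinely short-memory process whose sample autocorrelations nonetheless decay very slowly, mimicking long memory. All function names and parameter values are illustrative, not taken from the cited papers:

```python
import numpy as np

def random_level_shift(n, p=0.005, rng=None):
    """Short-memory process with rare mean shifts: with probability p the
    local mean jumps to a fresh N(0,1) draw; each observation is the
    current mean plus N(0, 0.25) noise."""
    if rng is None:
        rng = np.random.default_rng()
    mu = 0.0
    out = np.empty(n)
    for t in range(n):
        if rng.random() < p:                 # rare regime change
            mu = rng.standard_normal()
        out[t] = mu + 0.5 * rng.standard_normal()
    return out

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)
```

With an expected regime duration of 1/p = 200 observations, the sample autocorrelation remains visibly positive even at large lags, which is exactly the feature that makes such processes easy to confuse with fractional integration in finite samples.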