Multiple-period market risk prediction under long memory: when VaR is higher than expected

2014 ◽  
Vol 15 (1) ◽  
pp. 4-32 ◽  
Author(s):  
Harald Kinateder ◽  
Niklas Wagner

Purpose – The paper aims to model multiple-period market risk forecasts under long memory persistence in market volatility. Design/methodology/approach – The paper proposes volatility forecasts based on a combination of the GARCH(1,1) model with potentially fat-tailed and skewed innovations and a long memory specification of the slowly declining influence of past volatility shocks. As the square-root-of-time rule is known to be mis-specified, the GARCH setting of Drost and Nijman is used as a benchmark model. The empirical study of equity market risk is based on daily returns during the period January 1975 to December 2010. The out-of-sample accuracy of VaR predictions is studied for 5, 10, 20 and 60 trading days. Findings – The long memory scaling approach markedly improves VaR forecasts for the longer horizons. This result is only in part due to higher predicted risk levels. Ex post calibration to equal unconditional VaR levels illustrates that the approach also enhances efficiency in allocating VaR capital through time. Practical implications – The improved VaR forecasts show that one should account for long memory when calibrating risk models. Originality/value – The paper models single-period returns rather than choosing the simpler approach of modeling lower-frequency multiple-period returns for long-run volatility forecasting. The approach considers long memory in volatility and has two main advantages: it yields a consistent set of volatility predictions for various horizons, and VaR forecasting accuracy is improved.
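As an illustrative sketch only (not the authors' exact specification, and with hypothetical parameter values throughout, including the Hurst-style scaling exponent), the gap between square-root-of-time scaling and long-memory power-law scaling of a one-day GARCH(1,1) volatility forecast looks like this:

```python
from math import sqrt
from statistics import NormalDist

# Illustrative sketch: the paper's long-memory specification and the
# Drost-Nijman benchmark are more involved. All values are hypothetical.
omega, alpha, beta = 1e-05, 0.09, 0.90   # GARCH(1,1) parameters
ret, sigma2 = -0.012, 0.0001             # yesterday's return and variance

# one-day-ahead GARCH(1,1) variance forecast
sigma_next = sqrt(omega + alpha * ret**2 + beta * sigma2)

z = NormalDist().inv_cdf(0.01)           # 1% quantile, normal innovations

def var_sqrt_time(sigma, h, z=z):
    """Square-root-of-time rule: h-day VaR scales with sqrt(h)."""
    return -z * sigma * sqrt(h)

def var_long_memory(sigma, h, hurst=0.6, z=z):
    """Power-law scaling h**H; with H > 0.5 under long memory, multi-period
    VaR is higher than the square-root-of-time rule suggests."""
    return -z * sigma * h**hurst

for h in (5, 10, 20, 60):
    print(h, round(var_sqrt_time(sigma_next, h), 4),
          round(var_long_memory(sigma_next, h), 4))
```

For any horizon beyond one day, the power-law scaling with an exponent above 0.5 produces the higher VaR, which is the sense in which "VaR is higher than expected" under long memory.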

2013 ◽  
Vol 107 (3) ◽  
pp. 537-556 ◽  
Author(s):  
Wayne Ferson ◽  
Suresh Nallareddy ◽  
Biqin Xie


2014 ◽  
Vol 6 (2) ◽  
pp. 152-178 ◽  
Author(s):  
Paul Simshauser ◽  
Jude Ariyaratnam

Purpose – This paper aims to present a multi-period dynamic power project financing model that produces pragmatic estimates of benchmark wholesale power prices based on the principles of normal profit. This, in turn, can guide policymakers as to whether price spikes or bidding above marginal cost in wholesale electricity markets warrants any investigation at all. One of the seemingly complex areas associated with energy-only wholesale electricity pools is determining at what point market power abuse is present on the supply side. It should not be this way. If a theoretically robust measure of normal profit exists, identification of potential market power abuse is straightforward. Such a definition readily exists and can be traced back to the ground-breaking work of financial economists in the 1960s. Design/methodology/approach – Using a multi-period dynamic power project model, the authors produce pragmatic and theoretically robust measures of normal profit for project-financed plant and plant financed on balance sheet. These model results are then integrated into a static partial equilibrium model of a power system and used to guide policymaking on generator bidding in energy-only power markets. Findings – Under conditions of perfect plant availability and divisibility with no transmission constraints, energy-only markets produce clearing prices that are not economically viable in the long run. Bidding must, therefore, deviate from strict short-run marginal cost at some stage. To distinguish between quasi-contributions to substantial sunk costs and market power abuse, a pragmatic and robust measure of normal profit is required. 
Originality/value – This article finds that policymakers can be guided by an ex-post analysis of base energy prices against pragmatic estimates of the long-run marginal cost of base plant, and an ex-ante analysis of call option prices along the forward curve against pragmatic estimates of the carrying cost of peaking plant.
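The normal-profit benchmark described above can be caricatured as a levelised-cost calculation; the figures below are hypothetical, and the annuity treatment is a drastic simplification of the paper's multi-period project financing model:

```python
# All figures hypothetical; the paper's dynamic financing model also handles
# taxes, debt amortisation and refinancing, which this sketch ignores.
capex = 1500.0      # $/kW installed
life, wacc = 30, 0.08
output = 8.0        # MWh generated per kW per year (~91% capacity factor)
srmc = 30.0         # $/MWh fuel plus variable O&M

# capital recovery factor: the annuity that repays capex at the cost of capital
crf = wacc / (1 - (1 + wacc) ** -life)
lrmc = srmc + capex * crf / output   # breakeven "normal profit" price, $/MWh

print(f"normal-profit benchmark price: {lrmc:.1f} $/MWh")
```

A time-weighted base price persistently below this benchmark implies that strict short-run marginal cost bidding cannot recover sunk capital, which is the sense in which departures from SRMC need not signal market power abuse.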


2018 ◽  
Vol 44 (2) ◽  
pp. 127-141 ◽  
Author(s):  
Vaibhav Lalwani ◽  
Madhumita Chakraborty

Purpose The purpose of this paper is to explore whether stock selection strategies based on four fundamental quality indicators can generate superior returns compared to the overall market. Design/methodology/approach The sample of stocks comprises the constituents of the BSE-500 index, a broad-based index consisting of highly liquid stocks from all 20 major industries of the Indian economy. Portfolios are constructed on the basis of quality indicator rankings of companies, and the returns of these portfolios are compared with the overall market. Excess returns on quality-based portfolios are also determined using OLS regressions of quality portfolio returns on market, size, value and momentum factor returns. Findings The results suggest that two of the four quality strategies, namely the Grantham Quality indicator and Gross Profitability, have generated superior returns after controlling for market returns as well as common anomalies such as size, value and momentum. Combining value strategies with quality strategies does not yield any significant gains relative to quality-only strategies. Practical implications For investors looking to invest in the Indian stock market for the long term, this study provides evidence on the performance of some fundamental indicators that can help predict long-run stock performance. The findings suggest that investors can distinguish between high-performing and low-performing stocks based on stock quality indicators. Originality/value This is the first such study to look into the performance of quality investing in the Indian stock market. As most quality investing studies have focussed on developed economies, this paper provides out-of-sample evidence for quality investing in the context of an emerging market.
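A minimal sketch of the portfolio construction and factor regression described above, using synthetic data and a single market factor in place of the paper's four-factor specification (all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for illustration: monthly returns for 100 stocks and one
# hypothetical quality score per stock (e.g. a gross profitability rank).
n_stocks, n_months = 100, 60
quality = rng.normal(size=n_stocks)
returns = 0.005 + 0.002 * quality[:, None] + rng.normal(0, 0.05, (n_stocks, n_months))

# sort stocks into quintiles on quality; quintile 4 holds the top fifth
quintile = np.digitize(quality, np.quantile(quality, [0.2, 0.4, 0.6, 0.8]))
top = returns[quintile == 4].mean(axis=0)   # equal-weighted top-quality portfolio
market = returns.mean(axis=0)               # crude proxy for the overall market

# regress portfolio returns on market returns; the intercept is the alpha
# (the paper additionally controls for size, value and momentum factors)
X = np.column_stack([np.ones(n_months), market])
alpha, beta = np.linalg.lstsq(X, top, rcond=None)[0]
print(f"alpha={alpha:.4f} beta={beta:.2f}")
```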


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Arnab Kundu ◽  
Tripti Bej

Purpose The coronavirus (COVID-19) pandemic led education institutions to move all face-to-face (F2F) courses online. The situation is unique in that teachers and students can directly compare their courses before (F2F) and after COVID-19 (online). This study aims to analyze teachers’ viewpoints on this unprecedented change. Design/methodology/approach The study followed a mixed-method approach within an ex post facto survey research design. Research tools were distributed among 200 Indian secondary school teachers following a heterogeneous purposive sampling technique. As the study was conducted against the pandemic backdrop, the researchers used Google forms and telephonic interviews to collect data. Findings Teachers viewed the shift from F2F to online teaching-learning (OTL) positively. They showed an overall moderate level of online teaching efficacy, and where efficacy was high, concern for infrastructure was minimal, an attitude focused less on “what is not there” and more on “what they can do with what they have.” A statistically significant effect of teacher efficacy on perceived OTL infrastructure supports this strong conviction among some teachers. Statistical analysis revealed that for every 1 standard unit increase in self-efficacy, perceived OTL infrastructure increased by 0.997 standard units, consistent with the strong correlation between the two cognitive variables (r = 0.8). Moreover, teachers were not a homogeneous group with respect to their reported readiness for online teaching; different subgroups exist that may require different approaches to support and counseling. Originality/value The paper reports an original empirical survey conducted in India, and the write-up is based strictly on the survey findings. It offers an exclusive analysis of teachers’ views of their efficacy and perceived OTL infrastructure, and breaks new ground in analyzing the relationship between the two variables, which will help improve the culture, practice and understanding of digital pedagogy, securing quality OTL in the long run.
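The standardised regression reported above can be sketched on simulated data as follows (note that with a single predictor, the standardised slope coincides with Pearson's r; the coefficient and sample size below are illustrative, not the survey's):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated survey scores, not the study's data: regress perceived OTL
# infrastructure on teacher self-efficacy after standardising both.
n = 200
efficacy = rng.normal(size=n)
infrastructure = 0.8 * efficacy + rng.normal(0, 0.6, n)

z = lambda x: (x - x.mean()) / x.std()          # standardise to mean 0, SD 1
beta_std = np.polyfit(z(efficacy), z(infrastructure), 1)[0]
r = np.corrcoef(efficacy, infrastructure)[0, 1]

# with a single predictor, the standardised slope equals Pearson's r
print(round(beta_std, 3), round(r, 3))
```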


2014 ◽  
Vol 22 (3) ◽  
pp. 271-284 ◽  
Author(s):  
Lukasz Prorokowski ◽  
Hubert Prorokowski

Purpose – This paper, based on case studies with five universal banks from Europe and North America, aims to investigate which types of comprehensive risk measure (CRM) models are being used in the industry, the challenges faced in implementation and how they are currently being rectified. Undoubtedly, CRM remains the most challenging and ambiguous measure applied to the correlation trading book. The turmoil surrounding the new regulatory framework boils down to the Basel Committee implementing a range of capital charges for market risk to promote “safer” banking in times of financial crisis. This report discusses current issues faced by global banks when complying with the complex set of financial rules imposed by Basel 2.5. Design/methodology/approach – The research is based on in-depth, semi-structured interviews with five universal banks to explore the strides major banks are taking to introduce CRM modelling while complying with the new regulatory requirements. Findings – Three measures were introduced by the Basel Committee to serve as capital charges for market risk: the incremental risk charge, stressed value at risk and CRM. All of these regulatory-driven measures have met with strong criticism for their cumbersome nature and extremely high capital charges. Furthermore, with banks facing imminent implementation deadlines, all challenges surrounding CRM must be rectified. This paper provides some practical insights into how banks are finalising the new methodologies to comply with Basel 2.5. Originality/value – The introduction of CRM and the regulatory approval of new internal market risk models under Basel 2.5 have exerted strong pressure on global banks. The issues and computational challenges surrounding the implementation of CRM methodologies are fiercely debated among the affected banks. With little guidance from regulators, it remains unclear how to implement, calculate and validate CRM in practice. 
To this end, a need emerged for a study that sheds light on the practices for developing and computing CRM. On submitting this paper to the journal, we received news that JP Morgan is to pay four regulators $920 million as a result of a CRM-related scandal.
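One of the Basel 2.5 charges mentioned above, stressed VaR, can be sketched as a historical-simulation loss quantile computed over a stress window rather than an ordinary one; the return series below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily returns: an ordinary one-year window, and a window chosen
# to cover a crisis period (here simply three times the volatility).
calm = rng.normal(0, 0.01, 250)
stress = rng.normal(0, 0.03, 250)

def hist_var(returns, level=0.99):
    """Historical-simulation VaR: the loss at the (1 - level) quantile."""
    return -np.quantile(returns, 1 - level)

print(f"VaR={hist_var(calm):.4f}  stressed VaR={hist_var(stress):.4f}")
```

The stressed window produces the larger charge by construction, which is one source of the "extremely high capital charges" criticism noted above.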


2016 ◽  
Vol 33 (4) ◽  
pp. 466-487 ◽  
Author(s):  
Rainer Masera ◽  
Giancarlo Mazzoni

Purpose The paper aims to investigate whether the value of banks is affected by their financing policies. Higher capital requirements have been invoked by appealing to a renewed edition of the Modigliani–Miller (M&M) theorem. This paper shows the limits of this claim by highlighting that the general statement that “bank equity is not expensive” can be misleading. The authors argue that market prices should play an important role in bank supervision, since expectations of future profits embedded in prices supply timely information on the viability of a bank. Design/methodology/approach The authors use the Merton model to show the inapplicability of the M&M theorem to banks. The long-run viability of a bank is analyzed with a dividend discount model, which allows a bank’s long-term profitability to be compared with its overall cost of capital implicit in market prices. Findings The authors show that the M&M framework cannot be applied to banks either ex-ante or ex-post. Ex-ante, the authors focus on government guarantees; ex-post, they emphasize the risk-shifting phenomena that may increase the overall risk of the bank. The authors show that a bank’s stability cannot be achieved if the market expectations of its future profits stay below the cost of funding. Research limitations/implications The authors use simple analytical models. In a future study, some key peculiarities of banks, such as the monetary nature of deposits, should be analytically modelled. Practical implications The paper contributes to the debate on capital regulation, on the level of capital requirements and on the instruments to assess the viability/stability of banks. Originality/value This paper uses simple models to assess analytically the key issues in the debate on banks’ capital regulation.
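A minimal sketch of the viability check described above: back out the cost of equity implicit in market prices from a Gordon-growth dividend discount model and compare it with expected long-run profitability. All figures are hypothetical:

```python
# Hypothetical figures throughout; a sketch of the viability test, not the
# paper's full model.
price = 20.0          # market price per share
dividend_next = 1.2   # expected next-period dividend per share
growth = 0.02         # long-run dividend growth rate

# Gordon growth: P = D1 / (r - g)  =>  market-implied cost of equity
cost_of_equity = dividend_next / price + growth

expected_profitability = 0.05   # expected long-run return on equity

# stability fails when expected profits stay below the cost of funding
viable = expected_profitability >= cost_of_equity
print(f"implied cost of equity = {cost_of_equity:.3f}, viable: {viable}")
```

In this example the implied cost of equity is 8%, above the 5% expected profitability, so the market is signalling non-viability, the kind of timely information the authors argue supervisors should use.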


2019 ◽  
Vol 12 (1) ◽  
pp. 97-120
Author(s):  
Augustine Chuck Arize ◽  
Ebere Ume Kalu ◽  
Chinwe Okoyeuzu ◽  
John Malindretos

Purpose This study aims to make a comparative study of the applicability of purchasing power parity (PPP) in selected less developed countries (LDCs) on the one hand and European countries on the other. Design/methodology/approach The research design is empirical and ex post facto. This study uses an assortment of co-integration tests and error correction representations. The chosen approach allows for the consideration of long-run elasticities and the dynamics of the short-run adjustment of exchange rates to changes in domestic and foreign prices. Monthly data are used for the period 1980:1 through 2015:12 (i.e. 432 observations). Findings Results from long-run co-integration analysis, short-run error correction models and persistence profile analysis overwhelmingly confirm the validity of PPP in these two sets of countries regardless of the disparity in their relative exchange rate and price characteristics. Research limitations/implications Curiously, several of these empirical studies, and still many more, have focused on the experiences of industrialized countries, with few investigations devoted to LDCs. The evidence is even scarcer in Africa. Clearly, the acceptance of any hypothesis as a credible explanation of economic reality hinges on the robustness of the hypothesis across countries with different economic and institutional frameworks. Practical implications Knowledge of the extent to which exchange rates and relative prices are linked in the long run is important for the management of inflation and the implementation of monetary policy. For instance, policy actions aimed at stabilizing the domestic economy can obtain results that are, at best, uncertain in the absence of a correct characterization of the PPP dynamics. Moreover, structural and macroeconomic adjustment programs implemented in these countries to achieve economic growth and external competitiveness could be unsuccessful if flawed estimates of PPP exchange rates are retained. Originality/value Several empirical studies have been done to prove the validity or otherwise of PPP. Unlike prior authors, this study compares the applicability of PPP in selected LDCs on the one hand and European countries on the other.
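A stylised sketch of the PPP mechanics tested above: a simulated log exchange rate tied to the price differential through an error-correction term, with the long-run elasticity recovered from the cointegrating regression (the data are simulated, not the study's; the adjustment speed of 0.1 is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated PPP illustration: the log exchange rate s follows the
# domestic-foreign log price differential (p - p*) in the long run, while
# an error-correction term closes short-run deviations.
T = 432                                   # monthly, 1980:1-2015:12 as in the study
price_diff = np.cumsum(rng.normal(0.002, 0.01, T))   # random-walk price gap

s = np.zeros(T)
for t in range(1, T):
    deviation = s[t - 1] - price_diff[t - 1]          # last period's PPP deviation
    s[t] = s[t - 1] - 0.1 * deviation + rng.normal(0, 0.01)

# long-run elasticity from the cointegrating regression s = a + b*(p - p*);
# PPP implies b close to 1
X = np.column_stack([np.ones(T), price_diff])
a, b = np.linalg.lstsq(X, s, rcond=None)[0]
print(f"long-run elasticity b = {b:.2f}")
```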

