Portfolio optimization by using MeanSharp-βVaR and Multi Objective MeanSharp-βVaR models

Filomat ◽  
2018 ◽  
Vol 32 (3) ◽  
pp. 815-823
Author(s):  
Shokoofeh Banihashemi ◽  
Sarah Navidi

The purpose of this study is to develop portfolio optimization and asset allocation using our proposed models. Three steps are considered. In the first step, stock companies are screened by their financial data. In the second step, inputs and outputs are needed for solving Data Envelopment Analysis (DEA) models. Conventional DEA models assume non-negative data for inputs and outputs; however, many of these data take negative values. We therefore propose the MeanSharp-βVaR (MShβV) model and the Multi Objective MeanSharp-βVaR (MOMShβV) model based on the Range Directional Measure (RDM), which can accommodate both positive and negative values. We also consider one of the downside risk measures, namely Value at Risk (VaR), and try to decrease it. After applying our proposed models, the efficient stock companies are selected to form the portfolio. In the third step, a Multi Objective Decision Making (MODM) model is used to specify the capital allocation to the stock companies selected for the portfolio. Finally, a numerical example of the proposed method on Iranian stock companies is presented.
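
To make the risk measure concrete, the following minimal sketch, written for this summary rather than taken from the paper, computes each stock's mean return and historical β-level VaR on simulated data and ranks candidates by a naive mean/VaR score. The data, the β level, and the score are invented; the paper's actual screening relies on the RDM-based MShβV and MOMShβV DEA models, not this simplification.

```python
import numpy as np

def historical_var(returns: np.ndarray, beta: float = 0.95) -> float:
    """beta-level historical VaR: the loss exceeded with probability 1 - beta."""
    return -np.quantile(returns, 1.0 - beta)

rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(250, 5))   # 250 days x 5 hypothetical stocks

for j in range(returns.shape[1]):
    mu = returns[:, j].mean()
    var = historical_var(returns[:, j])
    print(f"stock {j}: mean={mu:.4f}, VaR(95%)={var:.4f}, mean/VaR={mu / var:.4f}")
```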

Author(s):  
Jhuma Ray ◽  
Siddhartha Bhattacharyya ◽  
N. Bhupendro Singh

Over the past few decades, extensive research on multi-objective decision making and combinatorial optimization of real-world financial transactions has taken place. The portfolio optimization problem of modern capital market theory is a multi-objective problem that aims to maximize the expected return of the portfolio while minimizing portfolio risk. The conditional value-at-risk (CVaR) is a widely used measure for quantifying the risk of a portfolio in volatile market conditions. This paper reports a heuristic approach to the portfolio optimization problem using the ant colony optimization (ACO) technique, centered on optimizing the CVaR measure in different market conditions under several objectives and constraints. The proposed ACO approach proves reliable on a collection of real-life financial instruments compared with its value-at-risk (VaR) counterpart. The results obtained show encouraging avenues in determining optimal portfolio returns.
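
As a hedged sketch of the objective such a heuristic evaluates, the code below computes the CVaR of one candidate portfolio from simulated scenario returns. The scenarios, weights, and confidence level are invented, and the paper's ACO construction and pheromone-update rules are not reproduced; in an ACO loop, each ant would propose a weight vector and this function would score it.

```python
import numpy as np

def cvar(portfolio_returns: np.ndarray, alpha: float = 0.95) -> float:
    """Conditional value-at-risk: mean loss in the worst (1 - alpha) tail."""
    losses = -portfolio_returns
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

rng = np.random.default_rng(1)
scenarios = rng.normal(0.0005, 0.015, size=(1000, 4))  # 1000 scenarios x 4 assets
weights = np.array([0.25, 0.25, 0.25, 0.25])           # one candidate solution
print("CVaR(95%):", cvar(scenarios @ weights))
```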


2021 ◽  
Vol 14 (5) ◽  
pp. 201
Author(s):  
Yuan Hu ◽  
W. Brent Lindquist ◽  
Svetlozar T. Rachev

This paper investigates performance attribution measures as a basis for constraining portfolio optimization. We employ optimizations that minimize conditional value-at-risk and investigate two performance attributes, asset allocation (AA) and the selection effect (SE), as constraints on asset weights. The test portfolio consists of stocks from the Dow Jones Industrial Average index. Values for the performance attributes are established relative to two benchmarks, equi-weighted and price-weighted portfolios of the same stocks. Performance of the optimized portfolios is judged using comparisons of cumulative price and the risk measures: maximum drawdown, Sharpe ratio, Sortino–Satchell ratio and Rachev ratio. The results suggest that achieving SE performance thresholds requires larger turnover values than those required for achieving comparable AA thresholds. The results also suggest a positive role in price and risk-measure performance for the imposition of constraints on AA and SE.
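
For readers who want to see the optimization step, here is a minimal sketch of the Rockafellar-Uryasev linear program that minimum-CVaR portfolio constructions of this kind typically solve, on synthetic scenarios. The paper's AA and SE attribution constraints are omitted, so this is only the unconstrained core.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
S, n, alpha = 500, 6, 0.95
R = rng.normal(0.0004, 0.012, size=(S, n))      # synthetic scenario returns

# Variables: x = [w (n), t (1), u (S)]; minimize t + sum(u) / ((1 - alpha) S)
c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1 - alpha) * S))])
# Tail constraints: -R w - t - u <= 0  (u_s picks up the excess loss over t)
A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])
b_ub = np.zeros(S)
# Full investment: weights sum to one
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * S

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
print("min CVaR:", res.fun, "weights:", np.round(res.x[:n], 3))
```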


2009 ◽  
Vol 39 (2) ◽  
pp. 591-613 ◽  
Author(s):  
Andreas Kull

Abstract: We revisit the relative retention problem originally introduced by de Finetti, using concepts recently developed in risk theory and quantitative risk management. Instead of using the variance as a risk measure, we consider the Expected Shortfall (Tail-Value-at-Risk), include capital costs, and take constraints on risk capital into account. Starting from a risk-based capital allocation, the paper presents an optimization scheme for sharing risk in a multi-risk-class environment. Risk sharing takes place between two portfolios, and the pricing of risk transfer reflects both portfolio structures. This allows us to shed more light on the question of how optimal risk sharing is characterized in a situation where risk transfer takes place between parties employing similar risk and performance measures. Recent developments in the regulatory domain ('risk-based supervision') pushing for common, insurance-industry-wide risk measures underline the importance of this question. The paper includes a simple non-life insurance example illustrating optimal risk transfer in terms of retentions of common reinsurance structures.
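
A small simulation, assuming lognormal losses and an arbitrary retention of 3.0 (neither is from the paper), illustrates the risk measure in play: the Expected Shortfall of gross losses versus losses retained under a simple excess-of-loss structure.

```python
import numpy as np

def expected_shortfall(losses: np.ndarray, alpha: float = 0.99) -> float:
    """Mean loss beyond the alpha-quantile (Tail-Value-at-Risk)."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

rng = np.random.default_rng(3)
gross = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
retention = 3.0
retained = np.minimum(gross, retention)   # cedent keeps losses up to the retention

print("ES gross   :", expected_shortfall(gross))
print("ES retained:", expected_shortfall(retained))
```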


2017 ◽  
Vol 24 (4) ◽  
pp. 1052-1064 ◽  
Author(s):  
Yong Joo Lee ◽  
Seong-Jong Joo ◽  
Hong Gyun Park

Purpose: The purpose of this paper is to measure the comparative efficiency of 18 Korean commercial banks under the presence of negative observations and to examine performance differences among them by grouping them according to their market conditions.
Design/methodology/approach: The authors employ two data envelopment analysis (DEA) models that can handle negative data: a Banker, Charnes, and Cooper (BCC) model and a modified slacks-based measure of efficiency (MSBM) model. The BCC model is proven to be translation invariant for inputs or outputs depending on output or input orientation, while the MSBM model is unit invariant in addition to translation invariant. The authors compare results from both models and choose one for interpreting the results.
Findings: Most Korean banks recovered from their worst performance in 2011 and showed similar performance in recent years. Among the three groups (national banks, regional banks, and special banks), the special banks demonstrated superior performance across models and years. In particular, the performance difference between the special banks and the regional banks was statistically significant. The authors conclude that the high performance of the special banks was due to their nationwide market access and ownership type.
Practical implications: This study demonstrates how to analyze and measure the efficiency of entities when variables contain negative observations, using a data set for Korean banks. The authors tried two major DEA models that are able to handle negative data and propose a practical direction for future studies.
Originality/value: Although there are research papers measuring the performance of banks in Korea, all of them have studied efficiency or productivity using positive data sets. However, variables such as net income and growth rates frequently include negative observations in bank data sets. This is the first paper to investigate the efficiency of bank operations in the presence of negative data in Korea.
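
As a sketch of the first of these model families, the code below solves an input-oriented BCC envelopment LP per DMU on fabricated data (five DMUs, two inputs, one output). The translation invariance noted above is what allows shifted (formerly negative) data to be evaluated; the paper's MSBM variant is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0, 6.0],     # inputs, shape (m, n_dmus)
              [1.0, 2.0, 1.5, 3.0, 2.5]])
Y = np.array([[1.0, 2.0, 2.5, 3.0, 3.5]])    # outputs, shape (s, n_dmus)
m, n = X.shape

for o in range(n):
    # Variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.concatenate([[1.0], np.zeros(n)])
    # Inputs:  X @ lam <= theta * x_o   ->   -x_o * theta + X @ lam <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Outputs: Y @ lam >= y_o           ->   -Y @ lam <= -y_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)   # convexity (VRS)
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    print(f"DMU {o}: BCC efficiency = {res.fun:.3f}")
```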


2006 ◽  
Vol 36 (2) ◽  
pp. 375-413
Author(s):  
Gary G. Venter ◽  
John A. Major ◽  
Rodney E. Kreps

The marginal approach to risk and return analysis compares the marginal return from a business decision to the marginal risk imposed. Allocation distributes the total company risk to business units and compares the profit/risk ratio of the units. These approaches coincide when the allocation actually assigns the marginal risk to each business unit, i.e., when the marginal impacts add up to the total risk measure. This is possible for one class of risk measures (scalable measures) under the assumption of homogeneous growth, and by a subclass (transformed probability measures) otherwise. For homogeneous growth, the allocation of scalable measures can be accomplished by the directional derivative. The first well-known additive marginal allocations were the Myers-Read method from Myers and Read (2001) and co-Tail Value at Risk, discussed in Tasche (2000). Now we see that there are many others, which allows the choice of risk measure to be based on economic meaning rather than the availability of an allocation method. We prefer the term "decomposition" to "allocation" here because of the use of the method of co-measures, which quantifies the component composition of a risk measure rather than allocating it proportionally to something.

Risk-adjusted profitability calculations that do not rely on capital allocation may still involve decomposition of risk measures. Such a case is discussed. Calculation issues for directional derivatives are also explored.
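
A brief simulated example of the co-measure idea: allocating each business unit its average loss in the scenarios where the total loss breaches the VaR threshold yields a co-TVaR decomposition that is additive by construction. The three-unit loss model below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
unit_losses = rng.lognormal(0.0, 0.8, size=(100_000, 3))   # 3 business units
total = unit_losses.sum(axis=1)

alpha = 0.99
tail = total >= np.quantile(total, alpha)       # scenarios breaching the VaR
co_tvar = unit_losses[tail].mean(axis=0)        # additive per-unit allocation

print("co-TVaR by unit:", np.round(co_tvar, 3))
print("sum of pieces vs total TVaR:", co_tvar.sum(), total[tail].mean())
```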


Author(s):  
Somayeh Khezri ◽  
Akram Dehnokhalaji ◽  
Farhad Hosseinzadeh Lotfi

One of the interesting subjects in Data Envelopment Analysis (DEA) is the estimation of congestion of Decision Making Units (DMUs). Congestion is evidenced when decreases (increases) in some inputs result in increases (decreases) in some outputs without worsening (improving) any other input/output. Most of the existing methods for measuring the congestion of DMUs utilize the traditional definition of congestion and assume that inputs and outputs change in the same proportion. Therefore, an important question arises: will congestion occur if the decision maker (DM) increases or decreases the inputs disproportionally? This means that the traditional definition of congestion in DEA may be unable to measure the congestion of units with multiple inputs and outputs. This paper focuses on directional congestion and proposes methods for recognizing it using DEA models. To do this, we consider two different scenarios: (i) only the input direction is available; (ii) neither the input nor the output direction is available. For each scenario, we propose a method consisting of systems of inequalities or linear programming problems for estimating the directional congestion. The validity of the proposed methods is demonstrated using two numerical examples.
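
For intuition only, the toy scan below flags the traditional evidence of congestion (one DMU using less input yet producing more output than another) on invented single-input, single-output data; the paper's directional, multi-input/multi-output tests are far more general than this check.

```python
# Invented data: output rises with input up to a point, then falls
inputs  = [1.0, 2.0, 3.0, 4.0, 5.0]
outputs = [1.0, 3.0, 4.0, 3.0, 2.0]

for j in range(len(inputs)):
    # Traditional evidence: some DMU k uses less input yet produces more output
    congested = any(inputs[k] < inputs[j] and outputs[k] > outputs[j]
                    for k in range(len(inputs)))
    if congested:
        print(f"DMU {j}: evidence of congestion "
              f"(input {inputs[j]}, output {outputs[j]})")
```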


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Meilin Wen ◽  
Linhan Guo ◽  
Rui Kang ◽  
Yi Yang

Data envelopment analysis (DEA), as a useful management and decision tool, has been widely used since it was first proposed by Charnes et al. in 1978. On the one hand, DEA models need accurate input and output data. On the other hand, in many situations inputs and outputs are volatile and complex, making them difficult to measure accurately. This conflict has led to research on uncertain DEA models. This paper considers DEA in an uncertain environment, producing a new model based on uncertain measure. Due to the complexity of the new uncertain DEA model, an equivalent deterministic model is presented. Finally, a numerical example is presented to illustrate the effectiveness of the uncertain DEA model.
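
As a loosely related sketch of turning uncertain data deterministic, the code below replaces linear uncertain variables L(a, b) by values of their inverse uncertainty distributions at a chosen belief degree, then scores a single-input, single-output example by a normalized ratio. The data and the 0.9 level are invented, and this only gives a flavor of the kind of reduction involved, not the paper's equivalent deterministic model.

```python
import numpy as np

def linear_inverse(a: float, b: float, alpha: float) -> float:
    """Inverse uncertainty distribution of a linear uncertain variable L(a, b)."""
    return (1 - alpha) * a + alpha * b

alpha = 0.9
X_unc = [[(2, 3), (3, 4), (4, 6)]]   # one input, three DMUs, (a, b) pairs
Y_unc = [[(1, 2), (2, 3), (2, 4)]]   # one output, three DMUs

# Pessimistic reduction: inputs at the high quantile, outputs at the low one
X = np.array([[linear_inverse(a, b, alpha) for a, b in row] for row in X_unc])
Y = np.array([[linear_inverse(a, b, 1 - alpha) for a, b in row] for row in Y_unc])

# With a single input and output, CCR efficiency reduces to a normalized ratio
ratio = Y[0] / X[0]
print("deterministic efficiencies:", np.round(ratio / ratio.max(), 3))
```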


2018 ◽  
Vol 19 (2) ◽  
pp. 127-136 ◽  
Author(s):  
Stavros Stavroyiannis

Purpose: The purpose of this paper is to examine the value-at-risk and related measures for Bitcoin and to compare the findings with the Standard & Poor's 500 (S&P 500) index and the gold spot price time series.
Design/methodology/approach: A GJR-GARCH model is implemented in which the residuals follow the standardized Pearson type-IV distribution. A large variety of value-at-risk measures and backtesting criteria are implemented.
Findings: Bitcoin is a highly volatile currency that violates the value-at-risk measures more often than the other assets. With respect to the Basel Committee on Banking Supervision Accords, a Bitcoin investor is subject to higher capital requirements and a higher capital allocation ratio.
Practical implications: The risk of an investor holding Bitcoin is measured and quantified via the regulatory framework practices.
Originality/value: This paper is the first comprehensive approach to the risk properties of Bitcoin.
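
A hedged sketch of the modeling step using the arch Python package: a GJR-GARCH(1,1) fit with a Student's t distribution standing in for the Pearson type-IV law (which arch does not ship), followed by a one-day 99% VaR on simulated returns.

```python
import numpy as np
from arch import arch_model
from scipy.stats import t as student_t

rng = np.random.default_rng(5)
returns = rng.standard_t(df=5, size=2000)          # pseudo daily % returns

am = arch_model(returns, p=1, o=1, q=1, dist="t")  # o=1 adds the GJR leverage term
res = am.fit(disp="off")

sigma = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
nu = res.params["nu"]
q = student_t.ppf(0.01, nu) * np.sqrt((nu - 2) / nu)   # standardized t quantile
print("1-day 99% VaR (% of value):", -(res.params["mu"] + sigma * q))
```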


2015 ◽  
Vol 08 (03) ◽  
pp. 1550034 ◽  
Author(s):  
Sohrab Kordrostami ◽  
Alireza Amirteimoori ◽  
Monireh Jahani Sayyad Noveiri

In standard data envelopment analysis (DEA) models, inefficient decision-making units (DMUs) may change their inputs and outputs arbitrarily to reach the efficient frontier. However, in many real applications of DEA, limitations in resources and in DMUs' abilities mean these variations cannot be made arbitrarily. Moreover, in some situations, undesirable factors with different disposability (strong or weak) are present. In this paper, a DEA-based model is proposed to determine the relative efficiency of DMUs in such a restricted environment and in the presence of undesirable factors. Specifically, variation levels of inputs and outputs are pre-defined and are taken into account when evaluating the performance of DMUs. Numerical examples are utilized to demonstrate the approach.


2021 ◽  
pp. 1-19
Author(s):  
Zinoviy Landsman ◽  
Tomer Shushi

Abstract: In finance and actuarial science, the multivariate elliptical family of distributions is a famous and well-used model for continuous risks. However, it has an essential shortcoming: all its univariate marginal distributions are the same, up to location and scale transformations. For example, all marginals of the multivariate Student's t-distribution, an important member of the elliptical family, have the same number of degrees of freedom. We introduce a new approach to generate a multivariate distribution whose marginals are elliptical random variables, while, in general, each of the risks has a different elliptical distribution, which is important when dealing with insurance and financial data. The proposal is an alternative to the elliptical copula distribution, for which, in many cases, it is very difficult to calculate the risk measures and risk capital allocation. We study the main characteristics of the proposed model: characteristic and density functions, expectations, covariance matrices and expectation of the linear regression vector. We calculate important risk measures for the introduced distributions, such as the value at risk and tail value at risk, and the risk capital allocation of the aggregated risks.
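
As a numerical aside on the risk measures mentioned, the snippet below evaluates the closed-form VaR and tail value-at-risk of a Student's t marginal and checks them by simulation; the parameters are illustrative and the paper's multivariate construction is not implemented.

```python
import numpy as np
from scipy.stats import t as student_t

mu, sigma, nu, alpha = 0.0, 1.0, 4.0, 0.975

q = student_t.ppf(alpha, nu)                       # standard-t quantile
var = mu + sigma * q                               # value at risk
# Closed-form expected shortfall (TVaR) of the t distribution beyond q
tvar = mu + sigma * student_t.pdf(q, nu) * (nu + q**2) / ((nu - 1) * (1 - alpha))

sims = mu + sigma * student_t.rvs(nu, size=500_000, random_state=0)
print("VaR :", round(var, 4), "vs simulated", round(float(np.quantile(sims, alpha)), 4))
print("TVaR:", round(tvar, 4), "vs simulated", round(float(sims[sims >= var].mean()), 4))
```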

