Research Article
Asma Hamzeh; Faezeh Banimostafaarab; Fatemeh Atatalab
Abstract
The rating of insurance companies is one of the necessary and operational policies for regulating and evaluating the performance of the insurance industry. It informs shareholders, customers, insurers, and even regulatory authorities, as well as formal and informal support bodies, about the current performance of insurance companies and their capabilities and prospects for the future. Rating insurance companies on regulatory indicators, and deciding on and implementing administrative measures for each company based on its regulatory rating, is one of the needs of the regulatory body. Doing this properly requires selecting indicators in the principal areas, weighting them according to their importance, and finally implementing the model. For this reason, in this study, the effective indicators for the regulatory rating of insurance companies were first identified through documentary studies and the relevant literature, and the initial indicators were refined and completed using the results of a questionnaire. Then, prioritization and weighting of the indicators and implementation of the regulatory rating model were performed for 2019. The indicators are weighted by the Shannon entropy method, and the rating of insurance companies is implemented under three different scenarios with the TOPSIS model and the weighted average method.
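As a sketch of how the two stages fit together, the following minimal Python example computes Shannon-entropy weights and then ranks the alternatives with TOPSIS. The 4×3 decision matrix is invented for illustration; it is not the paper's indicator data.

```python
import numpy as np

# Hypothetical decision matrix: 4 insurance companies (rows) scored on
# 3 regulatory indicators (columns); all indicators treated as benefit-type.
X = np.array([[0.60, 0.75, 0.40],
              [0.80, 0.55, 0.70],
              [0.50, 0.90, 0.60],
              [0.70, 0.65, 0.85]])

def entropy_weights(X):
    """Shannon-entropy weights: more discriminating indicators get more weight."""
    P = X / X.sum(axis=0)                       # each column as a distribution
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)        # entropy per indicator, in [0, 1]
    d = 1.0 - E                                 # degree of diversification
    return d / d.sum()

def topsis(X, w):
    """TOPSIS scores: relative closeness to the ideal vs. anti-ideal solution."""
    R = X / np.sqrt((X ** 2).sum(axis=0))       # vector-normalized matrix
    V = R * w
    ideal, anti = V.max(axis=0), V.min(axis=0)  # benefit criteria assumed
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)

w = entropy_weights(X)
scores = topsis(X, w)
ranking = np.argsort(-scores)                   # best company first
```

In the paper's weighted-average scenario, the same entropy weights would instead be applied directly to normalized indicator scores.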
Research Article
Hadi Bagherzadeh Valami; Zeinab Sinaei Nasab
Abstract
In evaluating Decision Making Units (DMUs), two factors can be used: efficiency and production size. When the production size of a unit is not optimal, its Returns To Scale (RTS) determines in which direction its resources should change to enhance its productivity. In most previous research, RTS is classified as increasing or decreasing, and frontier analysis is used to determine it. The concept of RTS in network Data Envelopment Analysis (DEA) is of particular interest. In this paper, a multi-step method based on the Most Productive Scale Size (MPSS) is developed that, in addition to determining the RTS of each unit in a directional manner, also yields the shortest change in resources needed to reach the right size for network production. This approach avoids both the computational complexity and the ambiguity in determining the RTS of units.
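As an illustration of the envelopment side of DEA that MPSS-based methods build on, here is a minimal input-oriented CCR model solved with `scipy.optimize.linprog`. The four single-input, single-output DMUs are invented for the sketch; the paper's network and directional machinery is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 DMUs, 1 input, 1 output.
x = np.array([[2.0], [4.0], [3.0], [5.0]])   # inputs, shape (n, m)
y = np.array([[1.0], [2.0], [3.0], [2.0]])   # outputs, shape (n, s)

def ccr_input_efficiency(x, y, o):
    """Input-oriented CCR efficiency of DMU o (envelopment form, CRS frontier)."""
    n = x.shape[0]
    # decision vector z = (theta, lambda_1..lambda_n); minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j * x_j <= theta * x_o   ->   -theta*x_o + X^T lambda <= 0
    A_in = np.hstack([-x[o].reshape(-1, 1), x.T])
    b_in = np.zeros(x.shape[1])
    # sum_j lambda_j * y_j >= y_o           ->   -Y^T lambda <= -y_o
    A_out = np.hstack([np.zeros((y.shape[1], 1)), -y.T])
    b_out = -y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

eff = [ccr_input_efficiency(x, y, o) for o in range(4)]   # DMU 2 is efficient
```

A unit with efficiency 1 on the CRS frontier is a candidate for operating at MPSS; inefficient units' optimal lambdas indicate the peers defining their scale adjustment.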
Research Article
Saman Vahabi; Amir Teimour Payandeh Najafabadi
Abstract
In this paper, we design a pure-endowment insurance contract and obtain the optimal strategy and consumption for a policyholder with a CRRA utility function. In this contract, premiums are received from the policyholder at specified times. The insurer undertakes to accumulate the premiums at a certain guaranteed rate and, in addition, to share investment profits by investing in a portfolio of risky and risk-free assets. We used the Variance Gamma process as a representative of infinite-activity jump models, and the sensitivity to the jump parameters in an uncertain financial market has been studied. We also compared the results under two forces of mortality.
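The Variance Gamma process mentioned above can be simulated as Brownian motion with drift evaluated at a gamma time change. The parameter values below (sigma, theta, nu) are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_gamma_paths(T, n_steps, n_paths, sigma=0.2, theta=-0.1, nu=0.3):
    """Simulate Variance Gamma paths: Brownian motion with drift theta and
    volatility sigma, run on a gamma-distributed random clock."""
    dt = T / n_steps
    # gamma subordinator increments: mean dt, variance nu*dt
    dG = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_steps))
    dX = theta * dG + sigma * np.sqrt(dG) * rng.standard_normal((n_paths, n_steps))
    return np.cumsum(dX, axis=1)

X = variance_gamma_paths(T=1.0, n_steps=252, n_paths=5000)
mean_T = X[:, -1].mean()    # E[X_T] = theta * T, here -0.1
```

Because the gamma clock produces infinitely many small jumps per unit time, this construction captures the infinite-activity behavior the abstract refers to.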
Research Article
Mahboubeh Aalaei
Abstract
In this paper, fuzzy set theory is implemented to model the internal rate of return for calculating the price of life settlements. Deterministic, probabilistic, and stochastic approaches are used to price life settlements in the secondary market for the Iranian insurance industry. Research findings are presented and analyzed for whole life insurance policies using the interest rates announced in the supplement to Regulation No. 68 and the Iranian life table, which was recently issued for use by insurance companies. The results of the three approaches were also compared with the surrender value, which shows that the surrender value is lower than the fuzzy price calculated under the probabilistic and stochastic approaches and higher than the price calculated under the deterministic approach. Therefore, selling life settlements in the secondary market in Iran at a fuzzy price calculated under the probabilistic and stochastic approaches will benefit the policyholder. Moreover, the price is obtained as an interval using fuzzy set theory, and the investor can decide, based on financial knowledge, which price is suitable for the policy. Furthermore, to show the validity of the proposed fuzzy method, the findings are compared with the results of using a random internal rate of return.
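To make the interval idea concrete, here is a small sketch of pricing with a triangular fuzzy internal rate of return. The face amount, mortality rates, and IRR support are invented for illustration; they are not the Regulation No. 68 figures or the Iranian life table.

```python
import numpy as np

face = 100.0
q = np.array([0.02, 0.03, 0.04, 0.05, 0.06])    # P(death in year t | alive), assumed
irr_low, irr_mode, irr_high = 0.12, 0.15, 0.18  # triangular fuzzy IRR, assumed

def settlement_price(rate, face, q):
    """Expected discounted death benefit at a crisp internal rate of return."""
    surv = np.cumprod(np.r_[1.0, 1.0 - q[:-1]])  # survival to start of year t
    death_prob = surv * q                        # P(death in year t)
    t = np.arange(1, len(q) + 1)
    return float(np.sum(face * death_prob / (1.0 + rate) ** t))

def fuzzy_price(alpha):
    """alpha-cut of the fuzzy price: the price falls as the IRR rises, so the
    interval endpoints come from the upper and lower IRR cut points."""
    lo_rate = irr_low + alpha * (irr_mode - irr_low)
    hi_rate = irr_high - alpha * (irr_high - irr_mode)
    return settlement_price(hi_rate, face, q), settlement_price(lo_rate, face, q)

p_min, p_max = fuzzy_price(alpha=0.0)   # widest price interval
p_core, _ = fuzzy_price(alpha=1.0)      # crisp price at the modal IRR
```

Sweeping alpha from 0 to 1 traces out the nested family of intervals from which the investor picks a price consistent with their confidence level.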
Research Article
Robabeh Hosseinpour Samim Mamaghani; Farzad Eskandari
Abstract
In this paper, we consider a Bayesian hierarchical method using the hyper product inverse moment prior in the ultrahigh-dimensional generalized linear model (UDGLM), which is useful in Bayesian variable selection. We show that the posterior probability of the true model converges to 1 as the sample size increases. For computing the posterior probabilities, we implement the Laplace approximation. The Simplified Shotgun Stochastic Search with Screening (S5) procedure for the generalized linear model is suggested for exploring the posterior space. Simulation studies and real data analysis using the Bayesian ultrahigh-dimensional generalized linear model indicate that the proposed method performs better than previous models.
Keywords: Ultrahigh dimensional; Nonlocal prior; Optimal
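The Laplace approximation used for the posterior model probabilities replaces an intractable integral with a Gaussian one around the posterior mode. Here is a minimal generic sketch, checked on a case where the approximation is exact; it is not the paper's UDGLM implementation.

```python
import numpy as np

def laplace_approx(log_f, theta_hat, hess):
    """Laplace approximation to the integral of exp(log_f) over R^d:
    exp(log_f(mode)) * (2*pi)^(d/2) / sqrt(det(-Hessian at the mode))."""
    d = len(theta_hat)
    sign, logdet = np.linalg.slogdet(-hess)
    return np.exp(log_f(theta_hat) + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet)

# Sanity check on an unnormalized Gaussian exp(-0.5 * theta' A theta),
# whose integral is (2*pi)^(d/2) / sqrt(det A) -- here the method is exact.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
log_f = lambda th: -0.5 * th @ A @ th
theta_hat = np.zeros(2)          # mode of log_f
hess = -A                        # Hessian of log_f at the mode
approx = laplace_approx(log_f, theta_hat, hess)
exact = (2 * np.pi) / np.sqrt(np.linalg.det(A))
```

For a GLM, `log_f` would be the log-likelihood plus log-prior of a candidate model, and the resulting value feeds into its posterior model probability.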
Research Article
Sajad Nezamdoust; Farzad Eskandari
Abstract
This paper considers the problem of estimating the parameters of finite mixture models and proposes a new method for doing so. Traditionally, parameter estimation in finite mixture models is performed from a likelihood point of view by exploiting the expectation-maximization (EM) method and the least-squares principle. Ridge regression is an alternative to ordinary least squares when multicollinearity is present among the regressor variables in multiple linear regression analysis. Accordingly, we propose a new shrinkage ridge estimation approach. Based on this principle, we propose an iterative algorithm called Ridge Iterative Weighted Least Squares (RIWLS) to estimate the parameters. Monte Carlo simulation studies are conducted to appraise the performance of our method. The results show that the proposed estimator performs better than the IWLS method.
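The ridge step at the heart of the RIWLS idea can be sketched in isolation: under near-collinearity the ridge estimator stays stable where ordinary least squares becomes ill-conditioned. The design below is synthetic, and the full iteratively weighted mixture loop is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def ridge(X, y, lam):
    """Ridge estimator (X'X + lam*I)^{-1} X'y; lam=0 gives ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Nearly collinear design: the second column is the first plus small noise.
n = 200
x1 = rng.standard_normal(n)
X = np.column_stack([x1, x1 + 0.01 * rng.standard_normal(n)])
beta_true = np.array([1.0, 1.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_ols = ridge(X, y, lam=0.0)     # ill-conditioned normal equations
beta_ridge = ridge(X, y, lam=1.0)   # shrunk, much better conditioned
```

In RIWLS-style algorithms, a weighted version of this solve (with observation weights from the current mixture responsibilities) would replace the plain least-squares update at each iteration.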
Research Article
Emad Koosha; Mohsen Seighaly; Ebrahim Abbasi
Abstract
The purpose of the present research is to use machine learning models to predict the price of Bitcoin, representing the cryptocurrency market. The price prediction model can be considered the most important component of algorithmic trading. Given the nature of price behavior in financial markets, the performance of machine learning models has been reported to be good in previous studies. In this respect, measuring and comparing the accuracy and precision of random forest (RF), long short-term memory (LSTM), and recurrent neural network (RNN) models in predicting the tops and bottoms of Bitcoin prices are the main objectives of the present study. Predicting top and bottom prices with machine learning models can be considered the innovative aspect of this research, while many studies seek to predict prices as time series or as simple or logarithmic price returns. Price top and bottom data as target variables and technical analysis indicators as feature variables, in the 1-hour time frame from 1/1/2018 to 6/31/2022, served as input to the mentioned models for training, validation, and testing: 70% of the data are used for training, 20% for validation, and the remaining 10% for testing. The results of this research show over 80% accuracy in predicting the tops and bottoms of the Bitcoin price, and the random forest model's predictions are more accurate than those of the LSTM and RNN models.
Research Article
Samaneh Mohammadi Jarchelou; Kianoush Fathi Vajargah; Parvin Azhdari
Abstract
Investment is the selection of assets to hold and earn more profit for greater prosperity in the future. Portfolio selection based on the theory of constraints uses classical data envelopment analysis for the evaluation and ranking of sample functions. The investment process concerns how investors decide on the types of tradable securities to invest in, and in what amounts and at what times. Various methods have been proposed for the investment process, but the lack of fast computational methods for determining investment policies in securities analysis makes performance appraisal a long-term challenge. One approach divides the investment process into two major parts: securities analysis and portfolio management. Securities analysis involves estimating the benefits of each investment, while portfolio management involves analyzing the composition of investments and managing and maintaining a set of investments. Classical data envelopment analysis (DEA) models are recognized as accurate for rating and measuring efficient sample performance. In the theory-of-constraints setting, however, the problem of efficient sample selection using DEA models to test portfolio efficiency (PE) has not been solved since 2011, because the efficient frontier is discontinuous and non-concave. To solve this problem, we recommend a DEA method divided into business units based on the Markowitz model. A search algorithm is used to construct the business units and prove their validity. Within any business unit, the boundary is continuous and concave, so DEA models can be applied for PE evaluation. To this end, 25 companies listed on the Tehran Stock Exchange over the period 1394 to 1399 were selected as the statistical sample for the data analysis. After classification, the data were analyzed and the calculations performed in MATLAB; the simulation results show that the performance evaluation based on the theory of constraints with the DEA approach and the Markowitz model presented in this paper is efficient and feasible for evaluating the portfolio.
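The Markowitz side of the proposed hybrid can be illustrated with the closed-form global minimum-variance portfolio. The 3-asset covariance matrix is invented for the sketch, not estimated from the Tehran Stock Exchange sample, and the DEA/business-unit layer is not reproduced.

```python
import numpy as np

# Illustrative annualized covariance matrix for 3 assets (assumed values).
Sigma = np.array([[0.040, 0.006, 0.010],
                  [0.006, 0.090, 0.012],
                  [0.010, 0.012, 0.060]])

def min_variance_weights(Sigma):
    """Global minimum-variance weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)."""
    ones = np.ones(Sigma.shape[0])
    w = np.linalg.solve(Sigma, ones)
    return w / w.sum()

w = min_variance_weights(Sigma)
port_var = float(w @ Sigma @ w)   # below every single-asset variance
```

Portfolios like this, computed per business unit, would then serve as the decision-making units that the DEA model rates against the (locally concave) frontier.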
Research Article
Asma Khadimallah; Fathi Abid
Abstract
This paper has potential implications for bank management. We examine a bank capital structure with contingent convertible (CoCo) debt to improve financial stability. This type of debt converts to equity when the bank faces financial difficulties and a conversion trigger occurs. We use a leverage ratio, introduced in Basel III, to trigger conversion instead of traditional capital ratios. We formulate an optimization problem in which a bank chooses an asset allocation strategy to maximize the expected utility of its asset value. Our study presents an application of stochastic optimal control theory to a banking portfolio choice problem. By applying the dynamic programming principle to derive the HJB equation, we define and solve the optimization problem in the power utility case. The numerical results show that the evolution of the optimal asset allocation strategy is strongly affected by the realization of the stochastic variables characterizing the economy. We carried out a sensitivity analysis with respect to risk aversion, time, and volatility. We also show that the optimal asset allocation strategy is relatively sensitive to risk aversion, and that the allocation to CoCos and equity decreases as the investment horizon increases. Finally, the sensitivity analysis highlights the importance of dynamic considerations in optimal asset allocation based on the stochastic characteristics of investment opportunities.
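As a point of reference for the power-utility result, the classical Merton problem (one risky asset, constant coefficients, CRRA utility) has a closed-form optimal risky fraction, which also exhibits the sensitivity to risk aversion the abstract reports. The parameters below are illustrative, and the CoCo conversion feature of the paper is not modeled.

```python
# Merton benchmark: with power (CRRA) utility and constant drift mu, risk-free
# rate r, and volatility sigma, the optimal fraction of wealth in the risky
# asset is (mu - r) / (gamma * sigma^2), constant in time and wealth.
mu, r, sigma = 0.08, 0.02, 0.20   # assumed market parameters

def merton_fraction(mu, r, sigma, gamma):
    """Optimal constant risky-asset fraction under CRRA risk aversion gamma."""
    return (mu - r) / (gamma * sigma ** 2)

pi_low_aversion = merton_fraction(mu, r, sigma, gamma=1.5)   # 1.0: fully invested
pi_high_aversion = merton_fraction(mu, r, sigma, gamma=5.0)  # 0.3: mostly risk-free
```

In the paper's setting the coefficients are stochastic and the CoCo adds a conversion boundary, so the allocation is state-dependent rather than constant, but the inverse dependence on gamma carries over.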
Research Article
Azadeh Ghasemifard; Seddigheh Banihashemi; Afshin Babaei
Abstract
The aim of this paper is to numerically price the European double barrier option by solving the governing fractional Black-Scholes equation in illiquid markets. Incorporating the price impact into the underlying asset dynamics, meaning that trading strategies affect the underlying price, we consider markets with finite liquidity. We examine both the first-order feedback and the full feedback cases. The asset evolution satisfies a stochastic differential equation with fractional noise, which is more realistic in markets with statistical dependence. Moreover, the Sinc collocation method is used to price the option. Numerical experiments show that the results correspond closely to our expectations of illiquid markets.
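The fractional noise mentioned above is commonly modeled by fractional Brownian motion. A small Cholesky-based simulator (exact in distribution, O(n^3), so suited to short grids) is sketched below with an assumed Hurst parameter H = 0.7; it is not the paper's Sinc-collocation solver.

```python
import numpy as np

def fbm_paths(H, T, n_steps, n_paths, seed=0):
    """Fractional Brownian motion via Cholesky factorization of its covariance
    Cov(B_s, B_t) = 0.5*(s^{2H} + t^{2H} - |t-s|^{2H}); H = 0.5 recovers
    standard Brownian motion, H > 0.5 gives persistent (dependent) increments."""
    t = np.linspace(T / n_steps, T, n_steps)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov)
    z = np.random.default_rng(seed).standard_normal((n_paths, n_steps))
    return z @ L.T

B = fbm_paths(H=0.7, T=1.0, n_steps=50, n_paths=2000)
var_T = B[:, -1].var()   # should be close to T^{2H} = 1.0
```

Such paths would drive the underlying in a Monte Carlo cross-check of the PDE prices, with the barrier condition enforced by discarding paths that exit the corridor.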
Research Article
Hamid Abbaskhani; Asgar Pakmaram; Nader Rezaei; Jamal Bahri Sales
Abstract
Despite the growing need for research on the going concern and bankruptcy of companies, most studies have used quantitative data to predict them; on the other hand, such quantitative data can be managed by company managers. As a result, there appears to be a need to examine alternative methods for predicting going concern and bankruptcy based on qualitative data from the auditor's report. The purpose of this research is to determine the ability of quantitative and qualitative data to predict the going concern of companies. The study period was from 2011 to 2021, with a sample of 54 companies listed on the Tehran Stock Exchange. The test of the first hypothesis shows that the coefficient of determination of the text-mining model's prediction in the presence of a life-cycle variable is greater than that in the presence of a company-size variable. The test of the second hypothesis shows that the difference in the incremental explanatory power of the first model over the second is significant for the listed companies.
Research Article
Shokouh Shahbeyk
Abstract
In this paper, we discuss some concepts of robustness for uncertain multi-objective optimization problems. An important factor in multi-objective optimization problems is uncertainty, which may arise from the estimation of parameters in the model, computational error, the structure of the problem, and so on. Indeed, some parameters are often unknown at the outset of solving a multi-objective optimization problem. One of the most important and popular approaches for dealing with uncertainty is robust optimization. Markowitz's portfolio optimization problem is strongly sensitive to perturbations of the input parameters. We consider Markowitz's portfolio optimization problem with an ellipsoidal uncertainty set and apply set-based minmax and lower robust efficiency to this problem. These concepts of robust efficiency are applied to the real stock market and compared to each other. Finally, the effects of increasing and decreasing the uncertainty-set parameters on these robust efficient solutions are examined.
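The effect of the uncertainty set on expected return has a simple closed form in the spherical special case: for mu in {mu_hat + d : ||d|| <= kappa}, the worst case of w'mu is w'mu_hat - kappa*||w||. A sketch with invented estimates (not the paper's stock-market data, and with a spherical rather than general ellipsoidal set):

```python
import numpy as np

# Assumed estimated mean returns and uncertainty radius.
mu_hat = np.array([0.10, 0.07, 0.05])
kappa = 0.02

def worst_case_return(w, mu_hat, kappa):
    """min over {mu_hat + d : ||d|| <= kappa} of w'mu; the minimizing
    perturbation d points directly against w, giving w'mu_hat - kappa*||w||."""
    return float(w @ mu_hat - kappa * np.linalg.norm(w))

w = np.array([0.5, 0.3, 0.2])
nominal = float(w @ mu_hat)
robust = worst_case_return(w, mu_hat, kappa)   # strictly below the nominal return
```

A general ellipsoid {mu_hat + A d : ||d|| <= kappa} gives w'mu_hat - kappa*||A'w|| instead, and increasing kappa widens the gap between nominal and robust returns, which is the parameter effect the paper examines.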