Zakariya Yahya Algamal
International Journal Of Nonlinear Analysis And Applications, Volume: 13, Issue: 1, Winter-Spring 2022, pp. 2127-2135
In reducing the effects of collinearity, the ridge estimator (RE) has consistently been demonstrated to be an attractive shrinkage method. In applications where the response variable is binary, the logistic regression model (LRM) is a well-known model. However, it is known that collinearity inflates the variance of the maximum likelihood estimator (MLE) of the LRM. To address this problem, a logistic ridge estimator (LRE) has been proposed by several authors. In this work, a jackknifed logistic ridge estimator (NJLRE) is proposed and derived. The Monte Carlo simulation results suggest that the NJLRE can bring significant improvement relative to other existing estimators. Furthermore, the real application results demonstrate that the NJLRE outperforms both the LRE and the MLE in terms of predictive performance.
Keywords: Collinearity, Jackknife estimator, ridge estimator, logistic regression model, Monte Carlo simulation
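To make the shrinkage step described in the abstract above concrete, the following is a minimal sketch of the standard logistic ridge form, beta_k = (X'WX + kI)^{-1} X'WX beta_MLE, where W is the diagonal weight matrix evaluated at the maximum likelihood fit. The simulated data, the fixed choice of k, and the use of statsmodels are illustrative assumptions; this is not the paper's exact NJLRE, which additionally applies a jackknife correction.

```python
# Minimal sketch (assumed setup): standard logistic ridge shrinkage of the MLE.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)        # induce collinearity
beta_true = np.array([1.0, -1.0, 0.5, 0.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

mle = sm.Logit(y, X).fit(disp=0)
beta_mle = mle.params
mu = mle.predict(X)                                   # fitted probabilities
W = np.diag(mu * (1.0 - mu))                          # IRLS weight matrix at the MLE

k = 0.5                                               # illustrative ridge parameter
XtWX = X.T @ W @ X
beta_ridge = np.linalg.solve(XtWX + k * np.eye(p), XtWX @ beta_mle)
print("MLE:", beta_mle, "\nRidge:", beta_ridge)
```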
International Journal Of Nonlinear Analysis And Applications, Volume: 13, Issue: 1, Winter-Spring 2022, pp. 2675-2684
The Liu estimator has consistently been demonstrated to be an attractive shrinkage method for reducing the effects of inter-correlated explanatory variables (multicollinearity). The negative binomial regression model is a well-known model in applications where the response variable consists of non-negative integers or counts. However, it is known that multicollinearity inflates the variance of the maximum likelihood estimator of the negative binomial coefficients. To overcome this problem, a negative binomial Liu estimator has been proposed by numerous researchers. In this paper, a jackknifed Liu-type negative binomial estimator (JNBLTE) is proposed and derived. The idea behind the JNBLTE is to decrease the shrinkage parameter so that the resulting estimator performs better with a small amount of bias. Our Monte Carlo simulation results suggest that the JNBLTE can bring significant improvement relative to other existing estimators. In addition, the real application results demonstrate that the JNBLTE outperforms both the negative binomial Liu estimator and the maximum likelihood estimator in terms of predictive performance.
Keywords: Multicollinearity, Liu estimator, negative binomial regression model, shrinkage, Monte Carlo simulation
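As a rough illustration of the Liu shrinkage step underlying the JNBLTE, the sketch below applies the generic negative binomial Liu form beta_d = (X'WX + I)^{-1}(X'WX + dI) beta_MLE, using assumed NB2 working weights W = diag(mu_i / (1 + alpha * mu_i)). The data, the fixed d, and the weight construction are assumptions for illustration, not the paper's derivation or its jackknifed estimator.

```python
# Minimal sketch (assumed setup): generic negative binomial Liu shrinkage of the MLE.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, p = 300, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=n)         # collinear columns
mu = np.exp(X @ np.array([0.4, 0.3, 0.2]))
y = rng.negative_binomial(2, 2.0 / (2.0 + mu))        # counts with mean mu

fit = sm.NegativeBinomial(y, X).fit(disp=0)
beta_mle, alpha = fit.params[:p], fit.params[-1]      # last parameter is the dispersion
mu_hat = np.exp(X @ beta_mle)
W = np.diag(mu_hat / (1.0 + alpha * mu_hat))          # assumed NB2 working weights

d = 0.7                                               # illustrative Liu parameter
XtWX = X.T @ W @ X
beta_liu = np.linalg.solve(XtWX + np.eye(p), (XtWX + d * np.eye(p)) @ beta_mle)
print("MLE:", beta_mle, "\nLiu:", beta_liu)
```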
International Journal Of Nonlinear Analysis And Applications, Volume: 13, Issue: 1, Winter-Spring 2022, pp. 2455-2465
It is known that when multicollinearity exists in the gamma regression model, the variance of the maximum likelihood estimator becomes unstable and inflated. In this article, a new Liu-type estimator based on the (r-(k-d)) class estimator is proposed for the gamma regression model. The performance of the proposed estimator is studied and compared with that of existing estimators. Based on the simulation and real-data results, in terms of mean squared error, the proposed estimator is superior to the other estimators.
Keywords: Liu-type estimator, gamma regression model, (r-(k-d)) class estimator, (r-d) class estimator, (r-k) class estimator, (k-d) class estimator
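The (r-(k-d)) class estimator combines a principal-component restriction (the r part) with Liu-type (k, d) shrinkage. The sketch below illustrates only the (k, d) shrinkage step applied to a gamma GLM fit, beta(k, d) = (S + kI)^{-1}(S - dI) beta_MLE with S the estimated information matrix; the principal-component restriction, the data, and the parameter values are omitted or assumed for brevity.

```python
# Minimal sketch (assumed setup): Liu-type (k, d) shrinkage of a gamma GLM fit;
# the principal-component (r) restriction of the (r-(k-d)) class is not shown.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, p = 250, 3
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)          # collinearity
mu = np.exp(X @ np.array([0.5, 0.3, -0.2]))
y = rng.gamma(shape=2.0, scale=mu / 2.0)               # gamma responses with mean mu

fit = sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log())).fit()
beta_mle = fit.params
S = np.linalg.inv(fit.cov_params())                    # estimated information (up to scale)

k, d = 0.5, 0.1                                        # illustrative shrinkage parameters
beta_kd = np.linalg.solve(S + k * np.eye(p), (S - d * np.eye(p)) @ beta_mle)
print("MLE:", beta_mle, "\nLiu-type:", beta_kd)
```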
International Journal Of Nonlinear Analysis And Applications, Volume: 13, Issue: 1, Winter-Spring 2022, pp. 3441-3450
The support vector regression (SVR) technique is considered one of the most promising and widely used methods for prediction, and the predictive power and generalization ability of this technique depend heavily on tuning its hyperparameters. Nature-inspired algorithms are an important and effective tool for optimizing, or tuning, the hyperparameters of SVR models. In this research, a nature-inspired algorithm, the black hole algorithm (BHA), is adapted to optimize the hyperparameters of SVR. The experimental results obtained on two data sets show that the proposed algorithm finds a better combination of hyperparameters than the grid search (GS) algorithm in terms of both prediction and running time. This demonstrates the ability of the BHA to find the best combination of hyperparameters.
Keywords: Support Vector Regression (SVR), Black Hole Algorithm (BHA), Hyperparameters
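A simplified reading of how a black-hole-style search can tune SVR hyperparameters is sketched below: candidate solutions (stars) are pulled toward the current best solution (the black hole), and stars that fall inside a crude event-horizon radius are re-initialized at random. The search space, fitness function, population size, and horizon rule are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch (assumed setup): a black-hole-style search over SVR hyperparameters.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=3)

# assumed search space: log10(C), log10(gamma), epsilon
lower, upper = np.array([-1.0, -3.0, 0.01]), np.array([3.0, 1.0, 1.0])

def fitness(s):
    model = SVR(C=10 ** s[0], gamma=10 ** s[1], epsilon=s[2])
    return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()

stars = rng.uniform(lower, upper, size=(15, 3))        # initial population of candidates
scores = np.array([fitness(s) for s in stars])
for _ in range(20):
    bh = stars[scores.argmax()].copy()                 # best candidate acts as the black hole
    stars += rng.uniform(size=(15, 1)) * (bh - stars)  # pull every star toward it
    stars = np.clip(stars, lower, upper)
    scores = np.array([fitness(s) for s in stars])
    radius = abs(scores.max()) / np.abs(scores).sum()  # crude event-horizon rule (assumption)
    too_close = np.linalg.norm(stars - bh, axis=1) < radius
    too_close[scores.argmax()] = False                 # never replace the current best
    stars[too_close] = rng.uniform(lower, upper, size=(too_close.sum(), 3))
    scores[too_close] = [fitness(s) for s in stars[too_close]]

best = stars[scores.argmax()]
print("best [log10(C), log10(gamma), epsilon]:", best, "CV MSE:", -scores.max())
```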
International Journal Of Nonlinear Analysis And Applications, Volume: 13, Issue: 1, Winter-Spring 2022, pp. 3153-3168
Modelling count data has been of great interest to researchers. In practice, however, count data often exhibit overdispersion or underdispersion. The Conway-Maxwell-Poisson regression model (CMPRE) has proven powerful for modelling count data with a wide range of dispersion. In regression modelling, it is known that multicollinearity inflates the variance of the maximum likelihood estimator. To address this problem, shrinkage estimators such as the Liu and Liu-type estimators have consistently been verified to be attractive for reducing the effects of multicollinearity. However, these shrinkage estimators are biased. In this study, the jackknife approach and a modified version of it are proposed for modelling count data with the CMPRE. These two estimators are proposed to reduce the effects of multicollinearity and the bias of the Liu-type estimator simultaneously. The Monte Carlo simulation and real-data results suggest that the proposed estimators bring significant improvement relative to competing estimators in terms of absolute bias and mean squared error, with the modified jackknifed Liu-type estimator performing best.
Keywords: Multicollinearity, Liu-type estimator, Conway-Maxwell-Poisson regression model, Jackknife estimator, Monte Carlo simulation
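The jackknife step referred to above can be sketched generically as the delete-one bias correction beta_jack = n * beta_full - (n - 1) * mean(beta_(-i)) applied on top of a Liu-type fit. Since statsmodels has no Conway-Maxwell-Poisson family, a Poisson GLM stands in for the CMPRE below; the data, the (k, d) values, and this substitution are assumptions, and the paper's modified jackknifed estimator is not reproduced here.

```python
# Minimal sketch (assumed setup): delete-one jackknife correction of a Liu-type fit;
# a Poisson GLM stands in for the Conway-Maxwell-Poisson model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n, p = 120, 3
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)          # collinearity
y = rng.poisson(np.exp(X @ np.array([0.3, 0.2, -0.1])))

def liu_type(Xs, ys, k=0.5, d=0.1):
    """Generic Liu-type shrinkage of a GLM fit (illustrative k and d)."""
    fit = sm.GLM(ys, Xs, family=sm.families.Poisson()).fit()
    S = np.linalg.inv(fit.cov_params())                # estimated information matrix
    I_p = np.eye(Xs.shape[1])
    return np.linalg.solve(S + k * I_p, (S - d * I_p) @ fit.params)

beta_full = liu_type(X, y)
leave_one_out = np.array([liu_type(np.delete(X, i, axis=0), np.delete(y, i))
                          for i in range(n)])
beta_jack = n * beta_full - (n - 1) * leave_one_out.mean(axis=0)
print("Liu-type:", beta_full, "\nJackknifed:", beta_jack)
```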
International Journal Of Nonlinear Analysis And Applications, Volume: 12, Issue: 1, Winter-Spring 2021, pp. 2093-2104
Modelling the relationship between a response variable and several explanatory variables is challenging in real applications when collinearity exists. Traditionally, several shrinkage estimators have been proposed to avoid this issue, among them the Kibria and Lukman (K-L) estimator. In this study, a jackknifed version of the K-L estimator is proposed for the generalized linear model, combining the jackknife procedure with the K-L estimator to reduce bias. Our Monte Carlo simulation results and the real-data application to the inverse Gaussian regression model suggest that the proposed estimator can bring significant improvement relative to other competing estimators in terms of absolute bias and mean squared error.
Keywords: Collinearity, K-L estimator, Inverse Gaussian regression model, Jackknife estimator, Monte Carlo simulation
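For reference, the Kibria-Lukman (K-L) form beta_KL = (S + kI)^{-1}(S - kI) beta_MLE can be sketched in an inverse Gaussian GLM as below, with S the estimated information matrix; the paper's proposal additionally applies a jackknife correction on top of this step, in the spirit of the previous sketch. The simulated data, the log link, and the fixed k are illustrative assumptions.

```python
# Minimal sketch (assumed setup): the K-L shrinkage form in an inverse Gaussian GLM.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, p = 200, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=n)          # collinearity
mu = np.exp(X @ np.array([0.4, 0.2, -0.3]))
y = rng.wald(mean=mu, scale=5.0)                       # inverse Gaussian responses

fam = sm.families.InverseGaussian(link=sm.families.links.Log())
fit = sm.GLM(y, X, family=fam).fit()
S = np.linalg.inv(fit.cov_params())                    # estimated information (up to scale)

k = 0.3                                                # illustrative shrinkage parameter
beta_kl = np.linalg.solve(S + k * np.eye(p), (S - k * np.eye(p)) @ fit.params)
print("MLE:", fit.params, "\nK-L:", beta_kl)
```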