
Variable Selection and Parameter Estimation with M-Estimation Method for Ultra-high Dimensional Linear Model

應用數學 (Applied Mathematics), 2021, No. 4

ZHU Yanling (朱艷玲), WANG Kai (汪凱), ZHAO Mingtao (趙明濤)

(School of Statistics and Applied Mathematics,Anhui University of Finance and Economics,Bengbu 233000,China)

Abstract: In this paper, we consider variable selection and parameter estimation for the linear regression model in the ultra-high dimensional case. By unifying least squares, least absolute deviation, quantile regression and Huber regression into a general framework, the proposed penalized likelihood M-estimator is proved to have good large-sample properties. In the numerical simulations, variable selection and parameter estimation perform best when forward regression screening is combined with the local linear approximation method. In the ultra-high dimensional case, the proposed general method is robust and effective for variable selection and parameter estimation.

Key words: Ultra-high dimensionality; M-estimation; Penalized likelihood; Variable selection

1.Introduction

Consider the classical linear regression model Y = Xβ + ε, where Y = (y_1, y_2, ..., y_n)^T is the response vector, X = (X_1, X_2, ..., X_{p_n}) = (x_1, x_2, ..., x_n)^T = (x_{ij})_{n×p_n} is an n × p_n design matrix, and ε = (ε_1, ε_2, ..., ε_n)^T is a random error vector. When the dimension p_n is high, it is often assumed that only a small number of predictors contribute to the response, which amounts to assuming that the parameter vector β is sparse. To exploit this sparsity, variable selection can improve the accuracy of estimation by effectively identifying the subset of important predictors, and it also enhances the interpretability of the model.
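To fix ideas, the following is a minimal sketch of how data from such a sparse ultra-high dimensional linear model can be generated; the sample size n = 100, the dimension p_n = 1000, the sparsity level and the nonzero coefficient values are hypothetical choices for illustration, not settings taken from the simulation studies below.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 100, 1000          # hypothetical sample size and dimension, p >> n
s = 5                     # hypothetical number of truly important predictors

# Sparse coefficient vector: only the first s entries are nonzero (hypothetical values).
beta = np.zeros(p)
beta[:s] = np.array([3.0, 1.5, 2.0, -2.5, 1.0])

# Design matrix with i.i.d. standard normal entries and the model Y = X beta + eps.
X = rng.standard_normal((n, p))
eps = rng.standard_normal(n)          # Example 1 below uses eps ~ N(0, 1)
Y = X @ beta + eps
```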

In this paper, we assume that the function ρ is convex, hence the objective function is still convex and the obtained local minimizer is a global minimizer. In Section 2, we discuss some theoretical properties of the LLA estimator. In Section 3, we supply a simple and efficient algorithm and report numerical simulation results. The proofs are given in Section 4.
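For concreteness, the four convex losses mentioned in the abstract (least squares, least absolute deviation, quantile and Huber) can all play the role of ρ; the following sketch writes them out, where the quantile level tau and the Huber threshold delta are illustrative tuning constants rather than values prescribed by the paper.

```python
import numpy as np

def rho_ls(r):
    """Least squares loss: rho(r) = r^2 / 2."""
    return 0.5 * r ** 2

def rho_lad(r):
    """Least absolute deviation loss: rho(r) = |r|."""
    return np.abs(r)

def rho_quantile(r, tau=0.5):
    """Quantile (check) loss: rho_tau(r) = r (tau - I(r < 0)); tau = 0.5 gives LAD up to a factor 1/2."""
    return r * (tau - (r < 0))

def rho_huber(r, delta=1.345):
    """Huber loss: quadratic for |r| <= delta, linear beyond; delta is an illustrative tuning constant."""
    return np.where(np.abs(r) <= delta, 0.5 * r ** 2, delta * (np.abs(r) - 0.5 * delta))

# Each rho above is convex in r, so an objective of the form
#   sum_i rho(y_i - x_i' beta) + penalty(beta)
# stays convex whenever the penalty (or its convex LLA surrogate) is convex.
```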

2.Main Results

3.Simulation Results

In this section we evaluate the performance of the M-estimator proposed in(1.1)by simulation studies.

For the RSIS+LLA method, the first screening step tends to be more robust, and the method performs well on the two indicators of estimation error and prediction error; however, it loses some power to identify important variables, which makes it prone to omitting important variables and can lead to large errors. In fact, robustness of the selection and estimation can be guaranteed by using the LAD loss function in the second-step variable selection.
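To illustrate how such a second step could be carried out, the sketch below solves a LAD loss with a weighted L1 penalty, the convex surrogate that a single LLA step produces, by recasting it as a linear program; the SCAD-derivative weights, the constants lam and a = 3.7, and the use of scipy.optimize.linprog are illustrative assumptions, not necessarily the authors' algorithm.

```python
import numpy as np
from scipy.optimize import linprog

def scad_derivative(t, lam, a=3.7):
    """Derivative of the SCAD penalty; one LLA step uses weights w_j = p'_lam(|beta_j^(0)|)."""
    t = np.abs(t)
    return lam * ((t <= lam) + np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam) * (t > lam))

def lad_weighted_l1(X, y, w):
    """Minimize sum_i |y_i - x_i' b| + sum_j w_j |b_j| via a linear program in (b+, b-, u)."""
    n, p = X.shape
    # Objective: w'(b+ + b-) + 1'u, with u_i bounding the absolute residuals.
    c = np.concatenate([w, w, np.ones(n)])
    # Constraints |y - X(b+ - b-)| <= u, written as two blocks of linear inequalities.
    A_ub = np.block([[-X, X, -np.eye(n)],
                     [ X, -X, -np.eye(n)]])
    b_ub = np.concatenate([-y, y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")  # all variables are >= 0 by default
    z = res.x
    return z[:p] - z[p:2 * p]

# One LLA step: weights come from an initial estimate beta0 (e.g. the screening-stage fit),
# and lam is a hypothetical tuning parameter.
# w = scad_derivative(beta0, lam)
# beta_hat = lad_weighted_l1(X_screened, Y, w)
```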

For the HOLP+LLA method, its performance on the three indicators of prediction error, correct exclusion of unimportant variables, and misidentification of important variables is almost the same as that of FR+LLA, but it is slightly worse on the first indicator, the estimation error.
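The four indicators discussed here (estimation error, prediction error, correct exclusion of unimportant variables, and misidentification or omission of important variables) might be computed as in the following sketch; since the exact definitions used in Tabs. 1-3 are not reproduced here, the L2 estimation error, test-set prediction error and simple support counts below are only one plausible choice.

```python
import numpy as np

def evaluate_fit(beta_hat, beta_true, X_test, y_test):
    """Four illustrative performance indicators for a fitted sparse estimate."""
    true_support = beta_true != 0
    est_support = beta_hat != 0

    est_error = np.linalg.norm(beta_hat - beta_true)           # ||beta_hat - beta||_2
    pred_error = np.mean((y_test - X_test @ beta_hat) ** 2)    # prediction error on a test set
    correct_exclusions = np.sum(~est_support & ~true_support)  # unimportant variables excluded
    missed_important = np.sum(~est_support & true_support)     # important variables omitted
    return est_error, pred_error, correct_exclusions, missed_important
```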

Tab.1 Numerical Results for ε ~ N(0,1)

Example 2 We consider the case of the LAD loss function and the error term ε ~ t_5. The estimated results are shown in Tab.2.

Tab.2 Numerical Results for ε ~ t_5

When the error term follows a heavy-tailed distribution, all six methods in Tab.2 perform better than under the standard normal error on the first indicator (estimation error) and the third indicator (correct exclusion of unimportant variables). The second indicator, the prediction error, is slightly worse, and the fourth indicator is essentially unchanged. The overall conclusion is consistent with Example 1, i.e., the FR+LLA method performs slightly better.

Example 3 We consider the case of the LAD loss function and the error term ε ~ 0.9N(0,1) + 0.1N(0,9). The estimated results are shown in Tab.3.

Tab.3 Numerical Results for ε ~ 0.9N(0,1) + 0.1N(0,9)
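The three error distributions used in Examples 1 to 3 (the standard normal, the t_5, and the contaminated normal mixture 0.9N(0,1) + 0.1N(0,9)) can be simulated as in the following sketch; here N(0,9) is read as a normal distribution with variance 9, which is our reading of the notation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # hypothetical sample size

eps_normal = rng.standard_normal(n)        # Example 1: eps ~ N(0, 1)
eps_t5 = rng.standard_t(df=5, size=n)      # Example 2: eps ~ t_5 (heavy-tailed)

# Example 3: contaminated normal 0.9 N(0,1) + 0.1 N(0,9); 9 is taken to be the variance.
contaminated = rng.random(n) < 0.1
eps_mix = np.where(contaminated, rng.normal(0.0, 3.0, n), rng.standard_normal(n))
```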

Synthesizing the simulation results of Examples 1 to 3, it can be seen that, when the number of explanatory variables is larger than the sample size, the proposed two-step scheme, in which the forward regression (FR) method is used for variable screening in the first step and the proposed local linear approximation (LLA) penalized method is used in the second step, performs quite well on all four indicators. This also shows that, for ultra-high dimensional data models, the FR+LLA procedure we provide is feasible and effective, and can be applied to a broader range of data sets with satisfactory results.
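As a final illustration of the first-stage screening, the sketch below implements a simple greedy forward-selection screener that grows the active set by the largest reduction in residual sum of squares up to a fixed size d; treating FR as this kind of forward-selection screening, and the choice d = 20, are assumptions made for illustration only.

```python
import numpy as np

def fr_screen(X, y, d):
    """Greedy forward-selection screening: grow the active set to size d by RSS reduction.
    Written for clarity rather than efficiency."""
    n, p = X.shape
    active = []
    for _ in range(d):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in active:
                continue
            cols = active + [j]
            # Least squares fit on the candidate active set.
            coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ coef) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        active.append(best_j)
    return active

# Usage: screen down to a hypothetical d = 20 predictors, then run the second-stage
# penalized M-estimation (e.g. the LAD + weighted L1 step sketched earlier) on X[:, selected].
# selected = fr_screen(X, Y, d=20)
```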

4.Proofs of Main Results
