
The General Conjugate Gradient Method for Solving the Linear Equations

Mathematical Theory and Applications (數學理論與應用), 2019, No. 2

Fang Ximing, Ling Feng, Fu Shouzhong

(School of Mathematics and Statistics, Zhaoqing University, Zhaoqing 526000, China)

Abstract In this paper, by introducing a nonsingular symmetric parameter matrix, we present the general conjugate gradient (GCG) method based on the conjugate direction method, which extends the conjugate gradient (CG) method and differs from the preconditioned conjugate gradient (PCG) method. Numerical experiments are performed to show the effectiveness of this method.

Key words Symmetric positive definite matrix; Conjugate gradient method; Iterative method

1 Introduction

Consider a real system of linear equations

Ax = b,

(1.1)

where A ∈ R^{n×n} is a symmetric positive definite matrix and b ∈ R^n. When A is a large and sparse matrix, we often apply iterative methods to solve it, see [1-7]. Some popular iterative methods, for instance, the Jacobi method, the Gauss-Seidel (GS) method, the successive overrelaxation (SOR) method, the generalized minimal residual (GMRES) method, the conjugate gradient (CG) method and the preconditioned conjugate gradient (PCG) method, as well as other iterative methods, can be found in [7-12] and the references therein.

The CG method is a very popular iterative method for solving (1.1). In theory, the CG method can find the true solution in n steps. In practice, we usually carry out fewer than n steps and obtain an approximate solution, especially for large and ill-posed problems. In the solving process, the conjugate vector p_k is needed in each step. In order to produce p_k, both the residual r_k and the prior conjugate vector p_{k-1} are indispensable. In this paper, we use general conjugate vectors of A to construct the solution of (1.1). By introducing a nonsingular symmetric parameter matrix P, we apply p_1, p_{k-1} and P r_{k-1} to generate the next conjugate vector p_k of A. Then the approximate solution can be represented by the former approximate solution and the conjugate vectors p_k. In this way, we present the general conjugate gradient (GCG) method, which is different from the CG method and the PCG method.
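For reference, the classical CG recurrence described above can be sketched as follows in Python with NumPy (the stopping tolerance tol and the function name cg are our own choices for illustration):

import numpy as np

def cg(A, b, x0, tol=1e-10, max_steps=None):
    # Classical conjugate gradient: each step uses the residual r_k and the
    # previous conjugate direction p_{k-1} to build the next direction p_k.
    n = b.size
    max_steps = n if max_steps is None else max_steps
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for _ in range(max_steps):
        if np.linalg.norm(r) <= tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap                      # residual update without recomputing b - A x
        p = r_new + (r_new @ r_new) / (r @ r) * p   # new conjugate direction
        r = r_new
    return x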

This paper is organized as follows. We introduce some preliminary knowledge about the conjugate direction method for solving the linear equations in Section 2. We then propose the general conjugate gradient method in Section 3 and its algorithm in Section 4, respectively. Some numerical experiments are given and discussed in Section 5. Finally, we end this paper with some concluding remarks in Section 6.

2 Preliminaries

In this section, we give some preliminary knowledge about the conjugate direction method for solving (1.1).

Let x* = A^{-1} b be the true solution of (1.1), which satisfies

x* - x_0 = α_1 p_1 + α_2 p_2 + … + α_n p_n,

(2.1)

where x_0 is an initial vector and p_1, p_2, …, p_n are nonzero A-conjugate vectors, that is, p_i^T A p_j = 0 for i ≠ j. Multiplying (2.1) by p_k^T A and using the conjugacy gives

α_k = p_k^T A (x* - x_0) / (p_k^T A p_k), k = 1, 2, …, n.

(2.2)

According to the uniqueness of the coordinates, combining with (2.1), if we set

x_1 - x_0 = α_1 p_1 ∈ span{p_1}, x_2 - x_1 = α_2 p_2 ∈ span{p_2}, …, x_k - x_{k-1} = α_k p_k ∈ span{p_k}, …,

(2.3)

that is, x_k approaches the true solution x* of (1.1) gradually through the sequence

x_1 - x_0, x_2 - x_0, …, x_k - x_0, …, x_{n-1} - x_0, x_n - x_0 = x* - x_0,

(2.4)

then we have

x_{k-1} - x_0 = α_1 p_1 + α_2 p_2 + … + α_{k-1} p_{k-1} ∈ span{p_1, p_2, …, p_{k-1}}.

(2.5)

With the residual

r_{k-1} = b - A x_{k-1},

(2.6)

another calculation formula of α_k can be obtained as follows. Since x_{k-1} - x_0 ∈ span{p_1, …, p_{k-1}} and the p_i are A-conjugate,

p_k^T r_{k-1} = p_k^T A (x* - x_{k-1}) = p_k^T A (x* - x_0).

(2.7)

Thus, from (2.2) and (2.7) we have

α_k = p_k^T r_{k-1} / (p_k^T A p_k).

(2.8)

So, once we obtain the approximate solution x_{k-1} and the A-conjugate vector p_k, we can calculate α_k from (2.8) and then construct the next approximate solution x_k of (1.1) by (2.3), for k = 2, 3, ….
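As a small illustration of this update, here is a one-step sketch in Python/NumPy, taking the step length of (2.8) to be α_k = p_k^T r_{k-1} / (p_k^T A p_k) as derived above (the function name is our own):

import numpy as np

def conjugate_direction_step(A, b, x_prev, p_k):
    # One conjugate-direction update: alpha_k as in (2.8), x_k as in (2.3).
    r_prev = b - A @ x_prev                       # r_{k-1} = b - A x_{k-1}
    alpha_k = (p_k @ r_prev) / (p_k @ (A @ p_k))  # alpha_k = p_k^T r_{k-1} / (p_k^T A p_k)
    return x_prev + alpha_k * p_k                 # x_k = x_{k-1} + alpha_k p_k

# Tiny usage example on a 2 x 2 symmetric positive definite system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x1 = conjugate_direction_step(A, b, np.zeros(2), np.array([1.0, 0.0]))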

3 The general conjugate gradient method

For a symmetric positive definite matrix A, it is difficult to obtain all n conjugate vectors at the same time. We construct the A-conjugate vectors gradually by adding a new independent vector one by one. Set an arbitrary vector p_1 ≠ 0 to be the first A-conjugate vector, and suppose that we have the A-conjugate vectors p_1, p_2, …, p_{k-1}; then the follow-up A-conjugate vector p_k should satisfy the following relation

p_k^T A p_i = 0, i = 1, 2, …, k-1.

(3.1)

Suppose

p_k = γ_{k-1} P r_{k-1} + β_1 p_1 + β_2 p_2 + … + β_{k-1} p_{k-1}, k = 2, 3, …,

(3.2)

and p_1 ≠ 0. Here P is a given nonsingular symmetric matrix and γ_{k-1} ≠ 0 is a real undetermined parameter.

Substituting (3.2) into (3.1) and using the conjugacy of p_1, p_2, …, p_{k-1} yields, for i = 1, 2, …, k-1,

γ_{k-1} (P r_{k-1})^T A p_i + β_i p_i^T A p_i = 0.

(3.3)

So

β_i = -γ_{k-1} (P r_{k-1})^T A p_i / (p_i^T A p_i), i = 1, 2, …, k-1.

(3.4)

In addition, from (2.5) we can obtain, for s = 1, 2, …, k-1,

p_s^T A (x* - x_{k-1}) = p_s^T A (x* - x_0) - α_s p_s^T A p_s = 0,

(3.5)

that is, since r_{k-1} = A (x* - x_{k-1}),

r_{k-1}^T p_s = 0, s = 1, 2, …, k-1.

(3.6)

From (2.3) and (3.2), applying mathematical induction, it is easy to prove the following relation:

span{p_1, p_2, …, p_k} = span{p_1, P r_1, …, P r_{k-1}}, k = 2, 3, ….

(3.7)

So, from (3.6) and (3.7), we have

r_{k-1} ⊥ span{p_1, p_2, …, p_s} = span{p_1, P r_1, …, P r_{s-1}}, s ≤ k-1, k ≥ 2.

(3.8)
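This orthogonality is easy to check numerically. The following small sketch (our own check, taking P = I so that the directions are the classical CG directions, a special case of the construction above) runs a few steps on a random symmetric positive definite system and verifies that the current residual is orthogonal to all previous directions:

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)        # a random symmetric positive definite matrix
b = rng.standard_normal(50)

x = np.zeros(50)
r = b - A @ x
p = r.copy()
directions = []
for _ in range(10):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    directions.append(p.copy())
    p = r_new + (r_new @ r_new) / (r @ r) * p
    r = r_new

# Maximum of |r_k^T p_s| over the stored directions; close to machine precision.
print(max(abs(r @ q) for q in directions))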

Similarly, from (2.3), we have the following three expressions:

x_s - x_{s-1} = α_s p_s ∈ span{p_s}, s = 1, 2, …,

r_s - r_{s-1} = -α_s A p_s, s = 1, 2, …,

(3.9)

(3.10)

Then, we obtain a very simple calculating formula for p_k in (3.2) when α_s ≠ 0, that is

(3.11)

From (3.10), we can find that β_k and γ_k differ only by a multiplicative factor. Then we can let γ_k be a nonzero constant, such as γ_k ≡ 1, for convenience.

In a word, if we select a real nonsingular symmetric matrix P and an initial vector x_0 with an arbitrary conjugate vector p_1 ≠ 0, then, using (2.7), the quantities α_k, β_k, p_k and x_k can be generated gradually from (3.10), (3.11), (2.3), (2.8) and (3.4).

All of the above can be summarized as follows:

x_{k-1} = x_{k-2} + α_{k-1} p_{k-1},

r_{k-1} = b - A x_{k-1},

(3.12)

Remark 3.1 The above derivations are based on the situation r_k ≠ 0, k = 1, 2, …. If some residual r_{s-1} = 0, it is apparent that the true solution is x* = x_{s-1}. Then from (3.10) and (3.12), we have p_s = 0, which is no longer an A-conjugate vector. If p_s = 0 while r_{s-1} ≠ 0, the iteration can be restarted with a new initial vector and a new first conjugate vector, as is done in Algorithm 1.

Now, we obtain a method for solving (1.1), namely (3.12), which is called the general conjugate gradient (GCG) method and belongs to the class of conjugate direction methods in nature. The main characteristic of this method is that the residual vectors are applied and a real nonsingular symmetric matrix is introduced. The conjugate vectors can be obtained in a simple way. For the GCG method, the following conclusions can be proved easily from the above construction process of p_k and the derivation process of x_k.

Theorem 3.1 Suppose that for a given initial vector x_0, the general conjugate gradient method (3.12) carries out m (m < n) steps with nonzero conjugate vectors. Then, for 2 ≤ k ≤ m, the following conclusions hold:

1) r_{k-1}^T p_s = 0, s = 1, 2, …, k-1;

2) r_{k-1} ⊥ span{p_1, p_2, …, p_s} = span{p_1, P r_1, …, P r_{s-1}}, s ≤ k-1;

3) p_k^T A p_i = 0, i = 1, 2, …, k-1;

(3.13)

4) span{p_1, p_2, …, p_k} = span{p_1, P r_1, …, P r_{k-1}}.

Proof From (3.6), the conclusion 1) holds; from (3.8), the conclusion 2) holds; from (3.1), the conclusion 3) holds; and from (3.7), the conclusion 4) holds.

4 Algorithm

In this section, we present the algorithm of the general conjugate gradient (GCG) method (3.12) as follows.

Algorithm 1 The GCG method
Input: A, b, P, x_0, p_1, σ, k = 3, n = length(b) > 3.
Output: x_k, δ, k.
1: Compute α_1, x_1, r_1, β_1, δ = norm(r_1) based on (3.12).
2: Compute p_2.
3: while k < n and δ > σ do
4:   if p_{k-1} ≠ 0 then
5:     Compute α_{k-1}, x_{k-1}, r_{k-1}, β_1, β_{k-1}.
6:     if α_{k-1} ≠ 0 then
7:       Compute p_k based on the first formula in (3.12).
8:     else
9:       Compute p_k based on the second formula in (3.12).
10:    Compute x_k, r_k, δ = norm(r_k).
11:    k = k + 1.
12:  else
13:    if k = 3 then
14:      Set x_0 = new, p_1 = new and compute α_1, x_1, r_1, β_1, p_2.
15:    else
16:      Set x_0 = x_{k-1}, p_1 = new and compute α_1, x_1, r_1, β_1, p_2.
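As a rough illustration only, the iteration can be sketched in Python/NumPy as follows. Since the displayed formulas of (3.2)-(3.12) are referred to above, this sketch simply builds each new direction by A-orthogonalizing P r_{k-1} against the first direction p_1 and the previous direction p_{k-1}, which is one way to realize the construction of Section 3 (p_k generated from p_1, p_{k-1} and P r_{k-1}, with γ_{k-1} ≡ 1); the function name gcg, the tolerance handling and the breakdown check are our own choices, not the authors' exact implementation:

import numpy as np

def gcg(A, b, P, x0, p1, sigma=1e-10, max_steps=None):
    # Sketch of a GCG-type iteration: step length as in (2.8), residual
    # recurrence as in (3.9), and the new direction built from P @ r,
    # A-orthogonalized against p_1 and p_{k-1}.
    n = b.size
    max_steps = n if max_steps is None else max_steps
    x = x0.copy()
    r = b - A @ x
    p_first = p1.copy()
    p = p1.copy()
    for k in range(1, max_steps + 1):
        if np.linalg.norm(r) <= sigma or np.linalg.norm(p) == 0.0:
            break                                   # converged, or breakdown (p_k = 0)
        Ap = A @ p
        alpha = (p @ r) / (p @ Ap)                  # alpha_k as in (2.8)
        x = x + alpha * p
        r = r - alpha * Ap                          # residual recurrence (3.9)
        z = P @ r                                   # gamma_{k-1} = 1 for convenience
        beta_prev = -(z @ Ap) / (p @ Ap)
        if k == 1:
            p = z + beta_prev * p                   # only p_1 to orthogonalize against
        else:
            Ap1 = A @ p_first
            beta_first = -(z @ Ap1) / (p_first @ Ap1)
            p = z + beta_first * p_first + beta_prev * p
    return x

With P = I and p_1 = r_0 = b - A x_0, this sketch reduces, in exact arithmetic, to the classical CG iteration.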

5 Numerical experiments

In this section, we compare the general conjugate gradient (GCG) method with the conjugate gradient (CG) method on numerical examples.

Example 1 Let the coefficient matrix A ∈ R^{n×n} in (1.1) be constructed from the tridiagonal matrices

S = tridiag(-1, 4, -1) ∈ R^{m×m}

and

B = tridiag(1, 0, 1) ∈ R^{n×n}.
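For concreteness, the two tridiagonal building blocks can be generated as follows (a sketch using scipy.sparse; the sizes shown are only illustrative, with n = 900 as in Table 1 and m chosen arbitrarily):

from scipy.sparse import diags

m, n = 30, 900                                    # illustrative sizes; n = 900 appears in Table 1
S = diags([-1, 4, -1], [-1, 0, 1], shape=(m, m))  # S = tridiag(-1, 4, -1) in R^{m x m}
B = diags([1, 0, 1], [-1, 0, 1], shape=(n, n))    # B = tridiag(1, 0, 1) in R^{n x n}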

Table 1 The comparison between GCG and CG

From Table 1, we can find that the parameter matrix P has some influence on the GCG method. If we select an appropriate P and combine it with a fast algorithm for the multiplication of a matrix and a vector, it is possible to decrease the iteration steps while decreasing the running time. Compared with the CG method, when n = 900, the GCG method with a symmetric positive definite Toeplitz matrix P has no superiority, though the iteration steps are smaller under the same stopping criterion. However, when n = 2500, the result is different: the GCG method is better than the CG method in both the iteration steps and the running time.
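For a symmetric Toeplitz parameter matrix P, the fast matrix-vector multiplication mentioned above can be realized by circulant embedding and the FFT. The following sketch (our own illustration, not taken from the paper) applies P to a vector in O(n log n) operations, where t is the first column of P:

import numpy as np

def toeplitz_matvec(t, x):
    # Multiply the symmetric Toeplitz matrix with first column t by the vector x,
    # by embedding it into a circulant matrix of order 2n and using the FFT.
    n = t.size
    c = np.concatenate([t, [0.0], t[:0:-1]])      # first column of the circulant embedding
    fx = np.fft.rfft(np.concatenate([x, np.zeros(n)]))
    y = np.fft.irfft(np.fft.rfft(c) * fx, 2 * n)
    return y[:n]

In the GCG iteration, the product P r_{k-1} can then be computed by toeplitz_matvec(t, r) instead of forming P explicitly.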

Example 2 Let the coefficient matrix A in (1.1) be A = inv(P)*inv(P)^T, where P is just the nonsingular symmetric parameter matrix in the GCG method. Set the true solution x*, the way b, x_0 and p_1 are generated, the value of γ_{k-1}, the stopping criterion σ and the way the error is calculated to be the same as those in Example 1. The numerical results are shown in Table 2, where the symbol Toeplitz(α) denotes the symmetric Toeplitz matrix with α being its first column.
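For instance, with a symmetric Toeplitz parameter matrix P as in Table 2, the coefficient matrix of Example 2 could be set up as follows (a sketch; the first column alpha below is a placeholder, not the actual values used in the paper):

import numpy as np
from scipy.linalg import toeplitz

alpha = np.array([4.0, 1.0, 0.5, 0.0, 0.0])  # hypothetical first column of P
P = toeplitz(alpha)                          # symmetric Toeplitz matrix Toeplitz(alpha)
P_inv = np.linalg.inv(P)
A = P_inv @ P_inv.T                          # A = inv(P)*inv(P)^T; since P is symmetric, A = P^{-2}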

Table 2 The comparison between GCG and CG

From Table 2, we can find that the superiority of the GCG method is more obvious. Both the iteration steps and the running time are decreased greatly compared with the CG method under the same iteration conditions.

6 Concluding remarks

In this paper, we present the general conjugate gradient (GCG) method, which is based on the conjugate direction method, by introducing a nonsingular symmetric matrix and applying the residual vector, together with the former conjugate vector and the first conjugate vector, to construct the next conjugate vector. The numerical results show that the GCG method is effective in some cases.
