# LU Decomposition

Chapter 2

Linear Equations

One of the problems encountered most frequently in scientific computation is the solution of systems of simultaneous linear equations. This chapter covers the solution of linear systems by Gaussian elimination and the sensitivity of the solution to errors in the data and roundoff errors in the computation.

2.1  Solving Linear Systems

With matrix notation, a system of simultaneous linear equations is written

    Ax = b
In the most frequent case, when there are as many equations as unknowns, A is a given square matrix of order n, b is a given column vector of n components, and x is an unknown column vector of n components. Students of linear algebra learn that the solution to Ax = b can be written

    x = A⁻¹b

where A⁻¹ is the inverse of A. However, in the vast majority of practical computational problems, it is unnecessary and inadvisable to actually compute A⁻¹. As an extreme but illustrative example, consider a system consisting of just one equation, such as

    7x = 21

The best way to solve such a system is by division:

    x = 21/7 = 3
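As a quick sanity check (in Python here, an editorial choice since this passage shows no code), the six-digit arithmetic above can be reproduced:

```python
# round(1/7, 6) mimics the six-digit value 0.142857 discussed in the text.
x_division = 21 / 7                 # solve 7x = 21 by division
x_inverse  = round(1/7, 6) * 21     # "inverse" route with six-digit rounding
print(x_division, x_inverse)
```

The division gives 3 exactly, while the rounded-inverse route gives roughly 2.999997, illustrating the loss of accuracy.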

Use of the matrix inverse would lead to

    x = 7⁻¹ × 21 = 0.142857 × 21 = 2.999997

The inverse requires more arithmetic, a division and a multiplication instead of just a division, and produces a less accurate answer. Similar considerations apply to systems of more than one equation. This is even true in the common situation where there are several systems of equations with the same matrix A but different right-hand sides b. Consequently, we shall concentrate on the direct solution of systems of equations rather than the computation of the inverse.
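The same comparison can be sketched for a matrix system; the following is a minimal illustration in Python/NumPy (a stand-in for MATLAB, and the 2-by-2 system is made up for this note, not taken from the text):

```python
import numpy as np

# Compare solving Ax = b directly against forming the inverse first.
A = np.array([[4., 3.],
              [6., 3.]])
b = np.array([10., 12.])

x_direct  = np.linalg.solve(A, b)    # direct solution via LU factorization
x_inverse = np.linalg.inv(A) @ b     # inverse first, then a multiply

print(x_direct, x_inverse)           # both approximately [1, 2]
```

Here both routes agree to working precision, but `solve` does less arithmetic than `inv` followed by a multiplication, and in ill-conditioned cases it is also more accurate.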

2.2  The MATLAB Backslash Operator

To emphasize the distinction between solving linear equations and computing inverses, MATLAB has introduced nonstandard notation using backward slash and forward slash operators, "\" and "/". If A is a matrix of any size and shape and B is a matrix with as many rows as A, then the solution to the system of simultaneous equations

    AX = B

is denoted by

    X = A\B

Think of this as dividing both sides of the equation by the coefficient matrix A. Because matrix multiplication is not commutative and A occurs on the left in the original equation, this is left division. Similarly, the solution to a system with A on the right and B with as many columns as A,

    XA = B

is obtained by right division,

    X = B/A

This notation applies even if A is not square, so that the number of equations is not the same as the number of unknowns. However, in this chapter, we limit ourselves to systems with square coefficient matrices.
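For readers working outside MATLAB, both operators have direct analogues; the sketch below uses NumPy (an assumption of this note, not part of the text), exploiting the fact that XA = B is the transposed system AᵀXᵀ = Bᵀ:

```python
import numpy as np

# NumPy stand-ins for MATLAB's left and right division operators.
A = np.array([[2., 1.],
              [4., 3.]])
B = np.array([[5.],
              [13.]])

X_left = np.linalg.solve(A, B)          # MATLAB: X = A\B, solves A*X = B

C = np.array([[5., 13.]])
X_right = np.linalg.solve(A.T, C.T).T   # MATLAB: X = C/A, solves X*A = C

print(X_left.ravel(), X_right.ravel())
```

Note that left and right division generally give different answers: they solve different equations, related only by transposition.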

2.3  A 3-by-3 Example

To illustrate the general linear equation solution algorithm, consider an example of order three:

    [ 10  -7   0 ] [ x1 ]   [ 7 ]
    [ -3   2   6 ] [ x2 ] = [ 4 ]
    [  5  -1   5 ] [ x3 ]   [ 6 ]

This, of course, represents the three simultaneous equations

    10x1 - 7x2       = 7
    -3x1 + 2x2 + 6x3 = 4
     5x1 -  x2 + 5x3 = 6

The first step of the solution algorithm uses the first equation to eliminate x1 from the other equations. This is accomplished by adding 0.3 times the first equation

to the second equation and subtracting 0.5 times the first equation from the third equation. The coefficient 10 of x1 in the first equation is called the first pivot, and the quantities -0.3 and 0.5, obtained by dividing the coefficients of x1 in the other equations by the pivot, are called the multipliers. The first step changes the equations to

    [ 10   -7    0 ] [ x1 ]   [ 7   ]
    [  0  -0.1   6 ] [ x2 ] = [ 6.1 ]
    [  0   2.5   5 ] [ x3 ]   [ 2.5 ]

The second step might use the second equation to eliminate x2 from the third equation. However, the second pivot, which is the coefficient of x2 in the second equation, would be -0.1, which is smaller than the other coefficients. Consequently, the last two equations are interchanged. This is called pivoting. It is not actually necessary in this example because there are no roundoff errors, but it is crucial in general.

    [ 10   -7    0 ] [ x1 ]   [ 7   ]
    [  0   2.5   5 ] [ x2 ] = [ 2.5 ]
    [  0  -0.1   6 ] [ x3 ]   [ 6.1 ]
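The elimination step and the row interchange described above can be sketched in code; the following is a NumPy illustration (not the book's own code) working on the augmented matrix [A | b]:

```python
import numpy as np

# First elimination step and pivoting interchange for the 3-by-3 example.
Ab = np.array([[10., -7., 0., 7.],
               [-3.,  2., 6., 4.],
               [ 5., -1., 5., 6.]])

pivot = Ab[0, 0]            # first pivot: 10
m1 = Ab[1, 0] / pivot       # multiplier -0.3
m2 = Ab[2, 0] / pivot       # multiplier  0.5
Ab[1] -= m1 * Ab[0]         # eliminate x1 from the second equation
Ab[2] -= m2 * Ab[0]         # eliminate x1 from the third equation

# Pivoting: |2.5| > |-0.1|, so interchange the last two rows.
Ab[[1, 2]] = Ab[[2, 1]]
print(Ab)

# Completing the solution with NumPy's solver gives approximately [0, -1, 1].
x = np.linalg.solve(Ab[:, :3], Ab[:, 3])
print(x)
```

The intermediate matrix printed here matches the hand computation in the text, and substituting x1 = 0, x2 = -1, x3 = 1 back into the original three equations confirms the solution.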