Parallel implementation of Conjugate Gradient Linear System Solver 1.0

Hello,

Parallel implementation of Conjugate Gradient Linear System Solver 1.0

Description:

The Parallel implementation of the Conjugate Gradient Linear System Solver
that I programmed here is designed to solve large sparse systems of linear
equations, where direct methods can exceed available machine memory and/or be
extremely time-consuming. For example, the direct Gauss algorithm takes O(n^2)
operations in the back-substitution step and is dominated by the O(n^3)
forward-elimination step. That means that if, for example, an operation takes
10^-9 second and we have 1000 equations, the elimination step of the Gauss
algorithm will take about 0.7 second; but if we have 10000 equations in the
system, the elimination step will take about 11 minutes! This is why I have
developed for you this Parallel implementation of the Conjugate Gradient
Linear System Solver in Object Pascal, which is very fast.

For comparison, the serial complexity of the Jacobi method is O(N^2), while
the serial complexity of the conjugate gradient method is O(N^(3/2)).
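As a sanity check on the timing figures above, a short Python calculation (illustration only; it uses the standard estimate of about 2*n^3/3 floating-point operations for Gaussian elimination) reproduces the quoted times:

```python
# Rough cost of Gaussian elimination: about 2*n^3/3 floating-point operations,
# at an assumed 10^-9 second per operation (as in the example above).

def elimination_time(n, sec_per_op=1e-9):
    return (2.0 * n ** 3 / 3.0) * sec_per_op

t_1000 = elimination_time(1000)     # seconds
t_10000 = elimination_time(10000)   # seconds
print(round(t_1000, 2))             # ~0.67 s, i.e. roughly 0.7 second
print(round(t_10000 / 60.0, 1))     # ~11.1 minutes
```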

You can download Parallel implementation of Conjugate Gradient Linear
System Solver 1.0 from:

http://pages.videotron.com/aminer/

Please look at the test.pas example inside the zip file, compile and
execute it...

Language: FPC Pascal v2.2.0+ / Delphi 7+: http://www.freepascal.org/

Operating systems: Windows, Linux and Mac (x86).

Note: to port it to Linux and Mac OS X you have to compile the
dynamic libraries...

Required FPC switches: -O3 -Sd -dFPC -dWin32 -dFreePascal

-Sd is for Delphi mode...

-dUnix for Linux, Mac OS X, etc.

Required Delphi switches: -DMSWINDOWS -$H+ -DDelphi

And inside defines.inc you have two defines:

{$DEFINE CPU32} for 32-bit systems
{$DEFINE CPU64} for 64-bit systems

Thank you.
Amine Moulay Ramdane.


Hello,

I have also ported it to 64-bit systems; just open defines.inc,
change {$DEFINE CPU32} to {$DEFINE CPU64} and compile it.

Also, I have got over 3x scalability on a quad core.

Thank you.
Amine Moulay Ramdane.

Hello,

You have only one method to use, which is Solve():

function TParallelConjugateGradient.Solve(var A: arrarrext; var B, X: VECT;
  var RSQ: DOUBLE; nbr_iter: integer; show_iter: boolean): boolean;

The system: A*x = b

The important variables in the Solve() method are:

A is the matrix, B is the b vector, X is the initial vector x,

nbr_iter is the number of iterations that you want,

and show_iter shows the iteration number on the screen.
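As an illustration of what Solve() computes, here is a minimal conjugate gradient sketch in Python. This is not the library's code; the names A, B, X, RSQ, nbr_iter and show_iter just mirror the Pascal parameters above:

```python
# Minimal conjugate gradient sketch mirroring the Solve() parameters above.
# Illustration in Python, not the Object Pascal library code.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def solve(A, B, X, nbr_iter, show_iter=False, tol=1e-12):
    """Iteratively solve A*x = B starting from X; returns (success, x, RSQ)."""
    x = X[:]
    r = [bi - ax for bi, ax in zip(B, matvec(A, x))]   # residual B - A*x
    p = r[:]                                           # first search direction
    RSQ = dot(r, r)                                    # squared residual norm
    for it in range(nbr_iter):
        if RSQ < tol:
            return True, x, RSQ
        Ap = matvec(A, p)
        alpha = RSQ / dot(p, Ap)                       # optimal step along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        RSQ_new = dot(r, r)
        p = [ri + (RSQ_new / RSQ) * pi for ri, pi in zip(r, p)]  # next conjugate direction
        RSQ = RSQ_new
        if show_iter:
            print("iteration", it + 1)
    return RSQ < tol, x, RSQ

# A small symmetric positive-definite system:
ok, x, rsq = solve([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0], nbr_iter=10)
print(ok, x, rsq)   # exact solution is x = [1/11, 7/11]
```

For a symmetric positive-definite N x N system, exact arithmetic would converge in at most N iterations, which is why a small nbr_iter suffices here.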

Thank you.

Amine Moulay Ramdane.

Hello,

And RSQ is the sum of the squares of the components of the residual vector
A.x - b.
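In other words, RSQ is the squared 2-norm of the residual; a tiny Python check (illustration only, using a small sample system) shows it vanishing at the exact solution:

```python
# RSQ is the sum of the squares of the components of the residual A.x - b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = [1.0 / 11.0, 7.0 / 11.0]   # exact solution of A.x = b for this A and b
residual = [sum(a * xi for a, xi in zip(row, x)) - bi for row, bi in zip(A, b)]
RSQ = sum(r * r for r in residual)
print(RSQ)   # essentially 0 at the exact solution
```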

Thank you.
Amine Moulay Ramdane


"The Conjugate Gradient Method is the most prominent iterative method for solving sparse systems of linear equations. Unfortunately, many textbook treatments of the topic are written with neither illustrations nor intuition, and their victims can be found to this day babbling senselessly in the corners of dusty libraries. For this reason, a deep, geometric understanding of the method has been reserved for the elite brilliant few who have painstakingly decoded the mumblings of their forebears. Conjugate gradient is the most popular iterative method for solving large systems of linear equations. CG is effective for systems of the form A.x = b,
where x is an unknown vector, b is a known vector, and A is a known, square, symmetric, positive-definite
(or positive-indefinite) matrix. These systems arise in many important settings, such as finite difference and finite element methods for solving partial differential equations, structural analysis, circuit analysis, and math homework."

Thank you.

Amine Moulay Ramdane.

"The Conjugate gradient method can also be applied to non-linear
problems, but with much less success, since non-linear functions
have multiple minima. The Conjugate gradient method will indeed
find a minimum of such a nonlinear function, but it is in no way guaranteed
to be a global minimum, or the minimum that is desired. But the
conjugate gradient method is a great iterative method for solving large,
sparse linear systems with a symmetric, positive-definite matrix."

Thank you,
Amine Moulay Ramdane

Hello,

"In the method of conjugate gradients the residuals are not used
as search directions, as in the steepest descent method, because searching
can require a large number of iterations as the residuals zig-zag towards
the minimum value for ill-conditioned matrices. Instead, the conjugate
gradient method uses the residuals as a basis to form conjugate search
directions. In this manner, the conjugated gradients (residuals) form a
basis of search directions to minimize the quadratic function f(x) and
to achieve convergence in dim(N) steps."
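The A-conjugacy of the search directions described in this quote can be checked numerically. The following Python sketch (illustration only, on a small sample system) runs two explicit conjugate gradient steps and verifies that the two directions p0 and p1 satisfy p0^T.A.p1 = 0:

```python
# Two explicit conjugate gradient steps on a 2x2 SPD system, keeping the
# search directions p0 and p1 to verify that they are A-conjugate.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = [0.0, 0.0]

r0 = [bi - ax for bi, ax in zip(b, matvec(A, x))]   # initial residual
p0 = r0[:]                                          # first direction = residual
alpha0 = dot(r0, r0) / dot(p0, matvec(A, p0))
x = [xi + alpha0 * pi for xi, pi in zip(x, p0)]

r1 = [bi - ax for bi, ax in zip(b, matvec(A, x))]   # new residual
beta = dot(r1, r1) / dot(r0, r0)
p1 = [ri + beta * pi for ri, pi in zip(r1, p0)]     # conjugated direction

conjugacy = dot(p0, matvec(A, p1))                  # should be ~0: p0^T.A.p1
print(conjugacy)
```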

Thank you.
Amine Moulay Ramdane.

Hello,

Parallel implementation of Jacobi with relaxation Linear System Solver

and

Parallel implementation of Conjugate Gradient Linear System Solver

were updated to version 1.01.

You can download them from:

http://pages.videotron.com/aminer/

Thank you.
Amine Moulay Ramdane.
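The Jacobi solver mentioned above uses relaxation. Here is a minimal sketch of the weighted (relaxed) Jacobi iteration in Python, assuming a diagonally dominant matrix and a hypothetical relaxation factor omega (illustration only, not the library's code):

```python
# Minimal weighted (relaxed) Jacobi iteration:
#   x_new[i] = (1 - omega) * x[i] + omega * (b[i] - sum_offdiag) / A[i][i]
# for a diagonally dominant matrix A.

def jacobi_relaxed(A, b, x, nbr_iter=200, omega=1.0, tol=1e-12):
    n = len(b)
    rsq = float("inf")
    for _ in range(nbr_iter):
        x_new = []
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)  # off-diagonal part
            x_new.append((1.0 - omega) * x[i] + omega * (b[i] - s) / A[i][i])
        x = x_new
        rsq = sum((sum(A[i][j] * x[j] for j in range(n)) - b[i]) ** 2
                  for i in range(n))                             # squared residual
        if rsq < tol:
            break
    return x, rsq

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x, rsq = jacobi_relaxed(A, b, [0.0, 0.0], omega=0.9)
print(x, rsq)   # approaches the exact solution [1/11, 7/11]
```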
