Modern C++ Software which uses OpenMP to parallelise the three classic Algebraic Iterative Methods: Jacobi, Gauss-Seidel and Successive Over Relaxation (SOR) for solving Systems of the form Ax = b.
Note: this software focuses on measuring the convergence of the solvers, not on solving a large algebraic system of equations; that is left for another time.
Introduction
Parallelising the Gauss-Seidel & SOR methods is problematic because they are inherently sequential: each update depends on the latest values computed within the same iteration.
However, we can parallelise them partially using a Red-Black Ordering Scheme.
This scheme is most appropriate for structured, sparse systems, such as those arising from grid-based discretisations.
For more general sparsity patterns, Graph Colouring (AKA Multi-Colouring) generalises the red-black idea.
If the matrix is dense, true parallel forms of the methods are tricky, and so the algorithm logic has to be altered.
Note: range-based for loops are used in places, so a compiler supporting C++20 features is required. However, OpenMP worksharing loops must still be written in the traditional index-based form.
The Iterative Solvers
Here is a brief overview of the methods.
The Jacobi Method
Idea: Updates each component of the solution vector independently using the values from the previous iteration.
Convergence: Guaranteed if the coefficient matrix A is strictly diagonally dominant or symmetric positive definite.
Pros: Simple and parallelizable.
Cons: Slow convergence compared to the Gauss-Seidel Method.
The Gauss-Seidel Method
Idea: Like the Jacobi Method, but uses newly updated values as soon as they are available.
Convergence: Often faster than the Jacobi Method. Likewise guaranteed if the coefficient matrix A is strictly diagonally dominant or symmetric positive definite.
Pros: Improved convergence over the Jacobi Method.
Cons: Less parallelizable due to data dependencies. To parallelise, you need to apply, for example, the Red-Black Ordering Scheme.
The Successive Over Relaxation (SOR) Method
Idea: An extension of the Gauss-Seidel Method that uses a Relaxation Factor w to potentially accelerate convergence.
Parameter range: 0 < w < 2 (required for convergence).
w < 1: Under-relaxation (can stabilise an otherwise non-convergent iteration).
w = 1: Reduces to the Gauss-Seidel Method.
w > 1: Over-relaxation (usually speeds up convergence).
Pros: Can converge much faster with optimal w.
Cons: Requires tuning of w, which is not always easy; values are usually chosen experimentally. Less parallelizable due to data dependencies; as with Gauss-Seidel, you need to apply, for example, the Red-Black Ordering Scheme.
Red-Black Ordering Scheme: What is it?
This is a technique to organise a structured grid (like a 2D matrix) into two independent groups of points, typically labelled "red" and "black" in a chessboard pattern.
Points of one colour can be updated in parallel because none of them directly depend on each other, only on the other colour's points.
You alternate updating all the red points and then all the black points in each iteration.
Requirements
Compiler: g++ 13.1.0.
OS: Ubuntu 20.04.
OpenMP. For parallelising the iterative methods: Jacobi, Gauss-Seidel and SOR.
CMake. For building the software.
Python for plotting results. You'll need Matplotlib for this.
The .csv files are saved in a folder called Results (within the build directory).
Results
A Python script called plotter.py is provided to plot the results. This can be found in the Results folder.
A sample of the results is provided; go to the Results directory. You'll find a collated data set and a data set for each algorithm.
At present, the script plots the collated data set.