# Parallel Computing (Spring 2025)

Welcome to my project portfolio for Parallel Computing. This repository showcases my exploration of parallel programming techniques, optimizations, and performance analysis, developed as part of my self-study at Lipscomb University.


## About the Project

This portfolio is a collection of assignments, experiments, and optimizations crafted to deepen my understanding of parallel computing concepts.

### Learning Objectives

- Learn parallel programming paradigms (e.g., OpenMP, SIMD).
- Optimize algorithms for speed and resource efficiency.
- Analyze performance bottlenecks (cache, memory, CPU utilization).
- Apply vectorization techniques using modern instruction sets (e.g., AVX2).
- Benchmark and compare sequential vs. parallel implementations.

## Featured Project: Gaussian Blur with AVX2

### Overview

A high-performance Gaussian blur implementation in C, achieving a 500x speedup over a naive box blur (kernel size 60) using AVX2 SIMD instructions. It leverages a separable two-pass approach with sliding windows, processing 8 pixels per SIMD operation.

### Build & Run

```sh
mkdir build
cd build
cmake ..
make
./blur_all <input_image> <output_image> <kernel_size>
```

Example:

```sh
./blur_all ../images/input.png ../images/output.png 10
```

See the Gaussian Blur README for full details.

## Other Projects

### Matrix Multiplication with OpenMP

Parallelized a matrix multiplication, achieving a maximum 6.11% speedup with 11 threads.
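The project's own code is in the repository; a minimal sketch of the general approach, assuming a row-parallel OpenMP loop over a square row-major matrix (names and loop order are illustrative, not the repository's):

```c
#include <omp.h>

/* C += A * B for n x n row-major matrices; C must be zero-initialized.
 * The outer row loop is split across threads; the i-k-j loop order
 * streams B and C row-wise for better cache behavior. */
void matmul_omp(const double *A, const double *B, double *C, int n) {
    #pragma omp parallel for
    for (int i = 0; i < n; ++i)
        for (int k = 0; k < n; ++k) {
            double a = A[i * n + k];
            for (int j = 0; j < n; ++j)
                C[i * n + j] += a * B[k * n + j];
        }
}
```

Compiled with `-fopenmp`; without it the pragma is ignored and the code runs sequentially, which makes before/after benchmarking straightforward.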

## Tools and Technology

- Languages: C (primary), Python (analysis)
- Parallel Frameworks: OpenMP for multi-threading, AVX2 for SIMD.
- Build System: CMake for cross-platform compatibility, Make.
- External Libraries: STB Image for image I/O.
- Hardware: 6-core AMD Ryzen 5 3600 with AVX2 support, 16 GiB DDR4 RAM.

## How to Explore

Clone the repo:

```sh
git clone <repository-url>
cd Parallel
```

## Acknowledgments

Dr. Dwayne Towell, for inspiring this deep dive into parallelism.
