Sunday, May 10, 2026

Pacman Net

 


A distributed Pacman game implementation with a Qt/C++ client, a C++ server, RabbitMQ message queuing, and Google Protocol Buffers for event serialization.

🎮 Built a distributed Pacman game from scratch — C++ server, Qt OpenGL client, RabbitMQ + Protocol Buffers over the network.

The idea started simple: take a classic OpenGL Pacman game and split it into a headless game server and a remote viewer. The server runs all the ghost AI and physics at 60fps, serializes the full game state using Google Protocol Buffers, and publishes it to a RabbitMQ fanout exchange. Any client that connects gets a live stream of the game and can send keystrokes back.

Stack:

→ C++ server (no Qt, no display)

→ Qt 6.4 + QOpenGLWidget client with legacy fixed-function OpenGL

→ RabbitMQ (Docker) as the message broker

→ Google Protobuf v3.21 for binary serialization

→ Custom blocking TCP handler for AMQP-CPP (no Boost/libuv)

→ CMake + MinGW on Windows


The build journey was its own adventure — MinGW ABI mismatches, protobuf abseil dependency hell, Winsock header ordering, and PATH conflicts between three different gcc installations. Hard lessons learned about keeping compilers consistent across all dependencies.


Code on GitHub (link below)

#CPlusPlus #OpenGL #Qt #RabbitMQ #Protobuf #GameDev #DistributedSystems #ComputerGraphics



- `client/` runs the Pacman game UI and publishes game events and user input to the server over RabbitMQ.
- `server/` logs events from message queues and can process game-related messages.
- `proto/` contains the Protocol Buffers schema and generated C++ protobuf sources.
- `shared/` contains shared game constants and logic used by both client and server.
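For illustration, the game-state message in `proto/` might look something like this — the message and field names here are hypothetical, not the actual schema:

```proto
syntax = "proto3";

// Hypothetical sketch of a full-game-state snapshot published each frame.
message Entity {
  float x = 1;
  float y = 2;
  int32 direction = 3;
}

message GameState {
  uint64 frame = 1;
  Entity pacman = 2;
  repeated Entity ghosts = 3;
  int32 score = 4;
}
```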

This is the game running:


GitHub link: LINK

Note: some paths are hardcoded, so it won't clone-and-run immediately, but the architecture and all source files are there.


Tuesday, March 31, 2026

Vertex Shader vs Fragment Shader Explained (With Simple Example)

Introduction

Vertex and fragment shaders are essential, programmable GPU stages in modern 3D graphics pipelines. Vertex shaders manipulate geometry (position, deformation), while fragment shaders determine pixel color, lighting, and textures. They are necessary for creating custom visual effects, dynamic lighting, and efficient, GPU-accelerated rendering.

Vertex Shaders (Geometry Manipulation)
  • Necessity: Transform 3D model vertices from object space to screen space (clip space) for rendering.
  • Applications: Modifying geometry, such as vertex animation (e.g., cloth simulation, waving water), transforming models (rotation, scaling), and passing data to the fragment shader.
  • Example: A shader that makes a mesh wave by altering vertex positions based on a sine function.
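That sine-wave idea looks something like this in GLSL ES — the attribute/uniform names `aPosition`, `uTime`, and `uMvp` are assumed for the sketch:

```glsl
attribute vec4 aPosition;
uniform float uTime;   // elapsed time, fed in from the app each frame
uniform mat4 uMvp;     // model-view-projection matrix

void main() {
    vec4 p = aPosition;
    // Displace each vertex vertically with a sine wave over time.
    p.y += 0.1 * sin(p.x * 4.0 + uTime);
    gl_Position = uMvp * p;
}
```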
Fragment Shaders (Per-Pixel Coloring)
  • Necessity: Determine the final color (pixel value) of every rendered object pixel, handling lighting and textures.
  • Applications: Lighting calculations (Phong/Blinn-Phong), applying textures (UV mapping), transparency, and special effects like bloom or sepia tones.
  • Example: Calculating per-pixel lighting to make a surface appear smooth rather than faceted.
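A per-pixel diffuse (Lambert) sketch of that example, assuming a `vNormal` varying passed down from the vertex shader and a normalized `uLightDir` uniform:

```glsl
precision mediump float;

varying vec3 vNormal;    // interpolated per pixel across the face
uniform vec3 uLightDir;  // normalized direction toward the light

void main() {
    // Diffuse term computed per pixel, so shading varies smoothly
    // across a face instead of looking faceted.
    float diff = max(dot(normalize(vNormal), uLightDir), 0.0);
    gl_FragColor = vec4(vec3(diff), 1.0);
}
```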
Why Both Are Necessary
  • Pipeline Flow: The vertex shader prepares raw shape geometry, while the fragment shader colors those shapes.
  • Efficiency: Vertex shaders run per-vertex (few), while fragment shaders run per-pixel (many), allowing for complex visual effects without stalling the GPU.
  • Data Passing: The vertex shader can calculate data (like vertex colors or normals) and pass them to the fragment shader, which interpolates this data for smooth gradients or precise lighting.

Shaders are where most beginners get stuck in OpenGL.

You see words like:

  • Vertex Shader
  • Fragment Shader

…and it feels confusing.

Let’s break it down in the simplest way possible.

Core Idea

Vertex Shader = decides POSITION
Fragment Shader = decides COLOR

That’s it.

Visual Explanation

Think of a triangle:

  • Vertex Shader → places the 3 corners
  • Fragment Shader → fills the color inside

Vertex Shader Example

attribute vec4 vPosition;

void main() {
    gl_Position = vPosition;
}

👉 It controls where things appear on screen


Fragment Shader Example

precision mediump float;

void main() {
    gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}

👉 It controls what color they are


Real Analogy

Think of painting:

  • Vertex Shader = drawing outline
  • Fragment Shader = filling color

Common Mistakes

1. Mixing Responsibilities

Trying to set color in the vertex shader → confusion. gl_FragColor exists only in the fragment shader; a vertex shader can only pass color data along via a varying.

2. Precision Missing

precision mediump float;

👉 Required in fragment shaders on OpenGL ES / WebGL (GLSL ES); desktop GLSL doesn't need it


Next Step

Try:

  • Changing color dynamically
  • Passing color from vertex → fragment
  • Creating gradients
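For the second one, a minimal sketch of passing color from vertex → fragment with a varying (the `aColor` attribute is an assumed name):

```glsl
// Vertex shader: pass each vertex's color downstream.
attribute vec4 vPosition;
attribute vec3 aColor;
varying vec3 vColor;

void main() {
    vColor = aColor;          // interpolated across the triangle
    gl_Position = vPosition;
}

// Fragment shader: receive the interpolated color -> smooth gradient.
precision mediump float;
varying vec3 vColor;

void main() {
    gl_FragColor = vec4(vColor, 1.0);
}
```

Give each corner a different color and the interpolation does the gradient for you — that's the "Data Passing" point from earlier in action.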

Final Insight

Shaders are not complex. You just need to see them as:
👉 Position + Color pipeline

Once that clicks, graphics becomes much easier.

If you want to explore further I have another post where I draw a triangle.

https://cglabprojects.blogspot.com/2018/07/drawing-triangle-in-android-app-using.html