This project is an iOS app intended to help people with visual deficiencies see more of the world around them. It is a ground-up rewrite of the GLVideoFilter project using the Swift programming language and the Metal graphics API. These technologies enabled a significantly more flexible processing pipeline, capable of applying multiple sets of filters (e.g., color vision deficiency simulation followed by edge detection) to a real-time video feed.
It implements multiple edge detectors (Sobel, double-threshold Canny), color vision deficiency simulation (Protanopia, Deuteranopia, and Tritanopia) and correction (Daltonization), as well as a linear-sampled Gaussian blur for noise reduction.
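As a rough illustration of the edge-detection stage, the sketch below applies the 3x3 Sobel operator to a grayscale image on the CPU. The app itself runs its filters as Metal shaders, so the image type and function here are purely illustrative.

```cpp
#include <cmath>
#include <vector>

// Minimal grayscale image: row-major, one float per pixel (illustrative type).
struct GrayImage {
    int width = 0, height = 0;
    std::vector<float> pixels;                       // size == width * height
    float at(int x, int y) const { return pixels[y * width + x]; }
};

// CPU sketch of the Sobel operator: per-pixel gradient magnitude.
// The real filter runs on the GPU; this only shows the math involved.
GrayImage sobel(const GrayImage& src) {
    GrayImage dst{src.width, src.height, std::vector<float>(src.pixels.size(), 0.0f)};
    for (int y = 1; y < src.height - 1; ++y) {
        for (int x = 1; x < src.width - 1; ++x) {
            // Horizontal (gx) and vertical (gy) 3x3 Sobel kernels.
            float gx = -src.at(x - 1, y - 1) + src.at(x + 1, y - 1)
                       - 2.0f * src.at(x - 1, y) + 2.0f * src.at(x + 1, y)
                       - src.at(x - 1, y + 1) + src.at(x + 1, y + 1);
            float gy = -src.at(x - 1, y - 1) - 2.0f * src.at(x, y - 1) - src.at(x + 1, y - 1)
                       + src.at(x - 1, y + 1) + 2.0f * src.at(x, y + 1) + src.at(x + 1, y + 1);
            dst.pixels[y * src.width + x] = std::sqrt(gx * gx + gy * gy);
        }
    }
    return dst;
}
```

The Canny detector builds on the same gradients with non-maximum suppression and the double-threshold hysteresis step, while the Gaussian blur runs beforehand to suppress noise.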
Source code is available on GitHub.
HopalongVR is a WebVR port of Iacopo Sassarini's iconic visualizer of Barry Martin's Hopalong attractor.
A live demo can be found at hopalongvr.dghost.net.
More information on WebVR, including supported browsers, can be found at WebVR.info.
While working for the Laboratory for Atmospheric and Space Physics, I took over as the primary developer for a satellite visualization tool called LASPview. Intended to function as an always-on kiosk for visitors, LASPview aggregates data from multiple sources and presents a near real-time representation of the solar system and multiple NASA satellite missions.
Although LASPview is a fairly modest tool, my efforts brought it OpenGL Core Profile support, true solar-system-scale rendering using a logarithmic depth buffer, astronomically correct positioning and illumination of planetary bodies, and physically based lighting using the Cook-Torrance, Oren-Nayar, and Lommel-Seeliger models.
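The logarithmic depth buffer is what lets a single depth buffer cover everything from a nearby spacecraft to planets enormously far away without z-fighting. A minimal sketch of the remapping is below, written as plain C++ rather than the renderer's actual vertex-shader code; the constants and names are illustrative.

```cpp
#include <algorithm>
#include <cmath>

// Remap clip-space depth onto a logarithmic scale so that very near and very
// far objects can share one depth buffer without z-fighting.  In a real
// renderer this runs in the vertex shader; 'farPlane' and the small clamp
// value are illustrative.
float logarithmicDepth(float clipW, float farPlane) {
    const float fCoef = 2.0f / std::log2(farPlane + 1.0f);
    // NDC depth in [-1, 1], spread logarithmically instead of hyperbolically.
    float ndcZ = std::log2(std::max(1e-6f, clipW + 1.0f)) * fCoef - 1.0f;
    // A vertex shader would write ndcZ * clipW to gl_Position.z so that the
    // perspective divide by w restores ndcZ.
    return ndcZ * clipW;
}
```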
An application intended to demonstrate and help quantify visual aberrations introduced by pre-distorting images for the Oculus Rift.
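For context, Rift-style pre-distortion warps the rendered image radially so the headset's lenses undo the warp, and the resampling involved can introduce the aberrations the app examines. The sketch below is a generic radial distortion function of that kind, with placeholder coefficients rather than the values the app actually uses.

```cpp
#include <array>

// Generic radial "barrel" pre-distortion of a texture coordinate around the
// lens center, of the kind applied before a VR headset's lens un-warps the
// image.  The polynomial coefficients are placeholders, not tuned values.
std::array<float, 2> preDistort(float u, float v, float cx, float cy) {
    const float k0 = 1.0f, k1 = 0.22f, k2 = 0.24f;   // distortion polynomial
    float dx = u - cx, dy = v - cy;
    float r2 = dx * dx + dy * dy;
    float scale = k0 + k1 * r2 + k2 * r2 * r2;        // grows with radius
    return {cx + dx * scale, cy + dy * scale};
}
```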
Source code is available on GitHub.
After a bout of nostalgia involving the Quake II railgun, I decided to implement support for the Oculus Rift in Quake II for fun. Originally forked from KMQuake II, Quake II VR has developed into a highly modernized implementation of Quake II featuring low-latency input and rendering, OpenAL audio support complete with HRTFs, sRGB color profile output, post-processing effects, and gamepad support. It has served as a testbed for research into simulation sickness and incorporates features such as HUD counter-rotation to minimize its impact on users.
Quake II VR natively supports the Oculus Rift and any SteamVR-compatible HMD, and will be updated to support future HMDs as they become available. Source code and download links are available on GitHub.
See also Polygon's writeup: Free mod turns Quake 2 into a VR masterpiece
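One way to read the HUD counter-rotation mentioned above: rather than locking the HUD rigidly to the head, its yaw only chases the view slowly, so a quick head turn sweeps across a briefly world-stable HUD instead of dragging it along. The sketch below is a rough illustration of that idea, not the engine's actual implementation, and the follow rate is a made-up value.

```cpp
#include <algorithm>

// Rough illustration of a HUD that lags behind head yaw instead of being
// rigidly head-locked.  'followRate' (per second) is a made-up tuning value.
float counterRotateHudYaw(float hudYaw, float viewYaw, float dt) {
    const float followRate = 4.0f;
    float delta = viewYaw - hudYaw;
    // Wrap the shortest way around the circle (angles in degrees).
    while (delta > 180.0f)  delta -= 360.0f;
    while (delta < -180.0f) delta += 360.0f;
    return hudYaw + delta * std::min(1.0f, followRate * dt);
}
```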
A tech demo featuring deferred lighting, shadow maps, bump mapping, stereoscopic rendering, per-eye post-processing effects, and Oculus Rift support.
Real-time image processing of a live video stream using OpenGL ES on iOS devices.
The purpose of this app was to evaluate the feasibility of edge detection techniques as an accessibility tool. While the screen size and form factor of iOS devices are not conducive to real-world use, this was intended as an early prototype for generating an accessible video stream from a cell phone camera. As a result, the selection of filters is somewhat narrow in scope, centered on providing meaningful ways of either enhancing a live video stream (by highlighting object edges) or transforming it entirely into a format that is easier for persons with reduced vision to see.
The source code is available on GitHub.
An update to the earlier iPad-based OpenGL ES 2.0 rendering engine, focused on improving the lighting techniques used. It features transparent objects, shadow mapping for multiple dynamic lights, dynamic light colors, bump mapping, and mesh deformation using procedurally generated Perlin noise.
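The mesh deformation boils down to pushing each vertex along its normal by a noise value; a rough CPU-side sketch of that step is below. The vertex layout, noise sampler, and parameters here are illustrative rather than the engine's actual code.

```cpp
#include <functional>
#include <vector>

// Illustrative vertex layout: position plus normal.
struct Vertex { float px, py, pz; float nx, ny, nz; };

// Push each vertex along its normal by a noise amount.  'noise3D' stands in
// for a Perlin-noise sampler; 'amplitude' and 'frequency' are tuning knobs.
void deformMesh(std::vector<Vertex>& mesh,
                const std::function<float(float, float, float)>& noise3D,
                float time, float amplitude, float frequency) {
    for (Vertex& v : mesh) {
        // Animate the noise field over time by sliding along one axis.
        float n = noise3D(v.px * frequency, v.py * frequency, v.pz * frequency + time);
        v.px += v.nx * n * amplitude;
        v.py += v.ny * n * amplitude;
        v.pz += v.nz * n * amplitude;
    }
}
```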
While working with the Correll Lab at the University of Colorado, my senior project team was responsible for developing an interactive front-end for the lab's swarm robotics simulation library. My contribution was the core architecture and renderer for the front-end, built with Qt and the OpenGL Core Profile. The resulting client interacts with the simulation library asynchronously and can visualize thousands of objects while remaining interactive.
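The asynchronous interaction amounts to the simulation publishing state snapshots that the renderer picks up once per frame, so neither side blocks the other for long. The sketch below illustrates one such handoff; the names and types are hypothetical, not the project's actual API.

```cpp
#include <memory>
#include <mutex>
#include <utility>
#include <vector>

// Hypothetical per-robot state and an immutable snapshot of the whole swarm.
struct RobotState { float x, y, heading; };
using Snapshot = std::shared_ptr<const std::vector<RobotState>>;

// Single-slot exchange: the simulation thread publishes snapshots, the render
// thread always draws the latest one it can see.
class SnapshotExchange {
public:
    void publish(Snapshot s) {                 // called by the simulation thread
        std::lock_guard<std::mutex> lock(mutex_);
        latest_ = std::move(s);
    }
    Snapshot latest() const {                  // called by the render thread per frame
        std::lock_guard<std::mutex> lock(mutex_);
        return latest_;
    }
private:
    mutable std::mutex mutex_;
    Snapshot latest_;
};
```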
More information on the project can be found at the project homepage.
A Qt/OpenGL-based cross-platform procedural terrain generator. It features per-pixel lighting, mesh deformation and tessellation, dynamic lighting effects such as Fresnel reflections and caustics, and dynamically generated assets.
The source code is available on GitHub.
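As an aside on the Fresnel effect mentioned above: the usual shortcut is Schlick's approximation, in which reflectance climbs toward 1 at grazing angles. The sketch below is a generic form of it, not the project's shader code; the water reflectance value is a common textbook figure.

```cpp
#include <algorithm>
#include <cmath>

// Schlick's approximation of the Fresnel term.  'f0' is the reflectance at
// normal incidence (roughly 0.02 for water); reflectance rises toward 1 as
// the view direction grazes the surface.
float fresnelSchlick(float cosTheta, float f0 = 0.02f) {
    cosTheta = std::clamp(cosTheta, 0.0f, 1.0f);
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}
```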
WebGL demonstration showing per-pixel lighting, bump mapping, and hardware vertex deformation. A live demonstration can be seen at ocean.dghost.net.
The source code is available on GitHub.
Mistakes. Everyone makes them. One of the things that makes graphics programming fun is that these mistakes often wind up being quite spectacular. I like to keep screenshots of some of those "that ain't right" moments as a reminder of not only how much I've learned as a programmer, but also of how much I still have to learn.