About the Author

Michael is a contract developer at Secret Atomics who has spent nine years developing iOS applications, data explorers, interactive installations, and games, and more than twenty years helping launch new products with venture-backed startups and companies like Google, Samsung, and Apple.

Comparing VR Rendering Models

For the past several months I’ve been working on building environments for the Oculus Quest. While deploying those environments to the device, I learned a few things about what makes lighting in VR look good, chief among them that SM5 shows (much) better results than Vulkan, which means tethering via USB is sometimes advised. Running on a PC is always going to beat mobile, but the gap between SM5 on PC and Vulkan on device is significant.

Rendering from Unreal Engine 5 to LED Panels over a wireless connection

I connected Unreal Engine to a Raspberry Pi using C++ support classes that implement a custom buffer: frame data is read from a scene capture component’s texture memory, published across an MQTT broker, and picked up by a wireless subscriber that drives the LED panels with a custom C++ driver. At 128 x 64 pixels with 8-bit color depth, the connection across a local router exceeds 90 Hz.
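As a back-of-the-envelope check on that rate (assuming one byte per pixel at 8-bit color depth), each frame is 128 x 64 = 8,192 bytes, so 90 frames per second comes to roughly 740 KB/s, or about 6 Mbit/s; even at three bytes per pixel the stream stays under about 18 Mbit/s, well within what a local router and an MQTT broker can carry.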

Creating a YouTube VR video in Unreal

I have been setting up an environment that I can render into YouTube VR from Unreal. To get something representative of a high-fidelity environment, I first took time to create a landscape, paint it with blended layers, and add assets and textures downloaded from Quixel.

TL;DR:

I made this landscape for the Oculus and YouTube in 4K and VR:

Rebuilding a VR Landscape

I took the steps below to set up a landscape environment for VR:

  • Importing the heightmap for the landscape
  • Adding a VR hand tracking pawn
  • Compiling for the Oculus
  • Adding hand tracking to the pawn
  • Adding teleportation to the controllers
  • Adding a VR movement sequence
  • Setting up the landscape with Quixel materials
  • Adding blueprint brushes to the landscape
  • Painting the landscape
  • Adding an ocean component
  • Adding a sky sphere blueprint
  • Adding volumetric clouds (and then removing them because of Oculus rendering issues)
  • Adding post processing for auto exposure
  • Adding exponential height fog
  • Adding areas for foliage
  • Creating procedural foliage from Quixel models
  • Painting foliage

Toy Photo Gallery Walkthrough

The ToyPhotoGallery code base is an example of native iOS development techniques that satisfy common needs of cloud-backed, resource-heavy user experiences. The application fetches a manifest of resource locations from Parse; retrieves thumbnail and optimized preview image assets from an S3 bucket; presents the thumbnails in a fluidly scrolling, auto-refreshing collection view; and animates into a child view controller designed to scroll across gesture-backed transformations of high-fidelity preview images.
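The project’s own types aren’t reproduced here, but a minimal sketch of the cell-reuse guard that keeps remote thumbnails from landing in the wrong cell during fast scrolling might look like the following (ThumbnailCellModel, ThumbnailCell, and the plain URLSession fetch are illustrative assumptions, not the actual ToyPhotoGallery code):

    import UIKit

    // Hypothetical cell model: in the real app the thumbnail URL comes out of the
    // Parse manifest and points at an asset in the S3 bucket.
    struct ThumbnailCellModel {
        let thumbnailURL: URL
    }

    final class ThumbnailCell: UICollectionViewCell {
        let imageView = UIImageView()
        private var currentURL: URL?

        override init(frame: CGRect) {
            super.init(frame: frame)
            imageView.frame = contentView.bounds
            imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            contentView.addSubview(imageView)
        }

        required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

        override func prepareForReuse() {
            super.prepareForReuse()
            imageView.image = nil
            currentURL = nil
        }

        func configure(with model: ThumbnailCellModel, session: URLSession = .shared) {
            currentURL = model.thumbnailURL
            session.dataTask(with: model.thumbnailURL) { [weak self] data, _, _ in
                guard let self = self, let data = data, let image = UIImage(data: data) else { return }
                DispatchQueue.main.async {
                    // Guard against reuse: apply the image only if this cell still
                    // represents the URL the task was started for.
                    guard self.currentURL == model.thumbnailURL else { return }
                    self.imageView.image = image
                }
            }.resume()
        }
    }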

Eliminating Collection View Tearing with Xcode's Time Profiler Instrument

TL;DR: Using the Time Profiler to guide a refactor of collection view cell model image fetching, and to pull Parse’s return call off the main thread, enables smooth scrolling.
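A rough sketch of that refactor, with a placeholder Parse class name and key (the project’s real schema and model types are not shown): the manifest objects get mapped into URLs on a background queue, and only the final reload touches UIKit on the main queue.

    import UIKit
    import Parse

    final class GalleryViewController: UICollectionViewController {
        private var thumbnailURLs: [URL] = []

        func fetchManifest() {
            // "GalleryResource" and "thumbnailURL" are placeholder names for this sketch.
            let query = PFQuery(className: "GalleryResource")
            query.findObjectsInBackground { [weak self] objects, error in
                guard let self = self, let objects = objects, error == nil else { return }
                // Parse delivers these results on the main thread by default, so hop to
                // a background queue before doing per-object work that could stall scrolling.
                DispatchQueue.global(qos: .userInitiated).async {
                    let urls = objects.compactMap { object -> URL? in
                        guard let urlString = object["thumbnailURL"] as? String else { return nil }
                        return URL(string: urlString)
                    }
                    // Only the reload runs on the main queue, once the models are ready.
                    DispatchQueue.main.async {
                        self.thumbnailURLs = urls
                        self.collectionView.reloadData()
                    }
                }
            }
        }
    }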

Delivering Interactive Graphics Wirelessly to an LED Matrix

Driving interactive graphics from the framebuffer of a mobile device to a matrix of LEDs has involved optimizing steps in the pipeline on a case-by-case basis rather than trying to get everything working at once.
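As one example of optimizing an individual step, here is a sketch of on-device downsampling that can happen before a frame ever touches the network, assuming the matrix wants a plain 128 x 64, three-bytes-per-pixel RGB payload (the dimensions and wire format are assumptions; the real driver may expect something different):

    import UIKit

    // Downsample a captured frame to a 128 x 64 RGB payload for the matrix.
    func matrixPayload(from image: UIImage, width: Int = 128, height: Int = 64) -> Data? {
        let bytesPerPixel = 4                     // RGBA while drawing, trimmed to RGB below
        let bytesPerRow = bytesPerPixel * width
        var rgba = [UInt8](repeating: 0, count: bytesPerRow * height)

        guard let cgImage = image.cgImage,
              let context = CGContext(data: &rgba,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }

        // CoreGraphics does the scaling while it draws into the small bitmap.
        context.interpolationQuality = .low
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

        // Strip the alpha channel so each pixel costs three bytes on the wire.
        var rgb = Data(capacity: width * height * 3)
        for pixelStart in stride(from: 0, to: rgba.count, by: bytesPerPixel) {
            rgb.append(contentsOf: rgba[pixelStart..<(pixelStart + 3)])
        }
        return rgb
    }

Keeping the payload small and constant on the sending side means the broker, the wireless hop, and the LED driver each see the same fixed-size frame regardless of what the source graphics are doing.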

©2018 - 2022, Michael Edgcumbe