cross-posted from: https://programming.dev/post/15448730
To learn how to program holograms, I'd like to gather some sources in this post.
The linked paper describes an already-optimised way of rendering holograms. I'd like to find a naive implementation of a hologram, e.g. in ShaderToy, using interferometric processing of stored interference patterns, the way (I assume) it works in a physical hologram. I also want this post to be a resource for learning how laser holograms work in real life.
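For reference, here is my current understanding of the physics we'd be simulating (standard textbook holography; please correct me if I've got it wrong):

```latex
% Recording: the film stores the interference intensity of the
% object wave O and a reference wave R at every film point x
I(x) = \lvert O(x) + R(x) \rvert^{2}
     = \lvert O \rvert^{2} + \lvert R \rvert^{2} + O\,\bar{R} + \bar{O}\,R

% Reconstruction: re-illuminating the developed film with R transmits
R(x)\,I(x) = \bigl(\lvert O \rvert^{2} + \lvert R \rvert^{2}\bigr)\,R
           + \lvert R \rvert^{2}\,O + R^{2}\,\bar{O}
% The |R|^2 O term is a copy of the original object wave (the virtual
% image the eye sees); the R^2 conj(O) term is the conjugate "twin image".
```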
To create an introductory project on holographic rendering, these steps will be required:
- Store the interference patterns of a sphere or a cube in a texture. This should be a model of our physically correct hologram (see the recording sketch after this list). Note: if this step requires saving thousands of textures, we should limit the available viewing angles (if that's what helps)
- Load the rendered patterns as a texture or an array of textures into a WebGL program
- Create a shader that will do the interferometric magic to render the sphere/cube from the hologram model (see the reconstruction sketch after this list)
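To make the steps concrete, here's a minimal sketch of the recording step as a ShaderToy buffer. Everything in it is my own assumption, not from the paper: the object is a sphere approximated by point scatterers, the film is the z = 0 plane, and the wavelength is unphysically large so the fringes survive texture resolution (a real ~500 nm hologram has fringes far finer than any pixel grid):

```glsl
// Buffer A -- "expose the film". One pixel = one film sample on the
// z = 0 plane. Records I = |O + R|^2 for point scatterers on a sphere.
// All constants are arbitrary illustrative choices.

const float LAMBDA = 0.01;                   // wavelength in scene units (unphysically large)
const float K = 6.28318530718 / LAMBDA;      // wavenumber k = 2*pi/lambda
const int   N_POINTS = 64;                   // scatterers sampling the sphere

// Deterministic point on the sphere (golden-angle spiral);
// sphere of radius 0.5, centered 2 units behind the film
vec3 spherePoint(int i) {
    float t = (float(i) + 0.5) / float(N_POINTS);
    float phi = 2.39996323 * float(i);
    float z = 1.0 - 2.0 * t;
    float r = sqrt(max(0.0, 1.0 - z * z));
    return 0.5 * vec3(r * cos(phi), r * sin(phi), z) + vec3(0.0, 0.0, 2.0);
}

void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    // Film point: the film spans [-1,1]^2 at z = 0
    vec2 uv = 2.0 * fragCoord / iResolution.xy - 1.0;
    vec3 x = vec3(uv, 0.0);

    // Object wave: sum of spherical waves exp(i*k*r)/r from each
    // scatterer, kept as (real, imag) since GLSL has no complex type
    vec2 obj = vec2(0.0);
    for (int i = 0; i < N_POINTS; i++) {
        float r = distance(x, spherePoint(i));
        obj += vec2(cos(K * r), sin(K * r)) / r;
    }
    obj /= float(N_POINTS);

    // Reference wave: unit-amplitude plane wave, slightly tilted in x
    vec3 refDir = normalize(vec3(0.2, 0.0, 1.0));
    float phase = K * dot(refDir, x);
    vec2 ref = vec2(cos(phase), sin(phase));

    // Recorded film transmittance ~ intensity I = |O + R|^2
    vec2 total = obj + ref;
    fragColor = vec4(vec3(dot(total, total)), 1.0);
}
```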
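And a sketch of the viewing step: re-illuminate the recorded pattern with the same reference wave and propagate the transmitted field to an image plane with a brute-force Huygens sum over film samples (no FFT tricks, so it's extremely slow, which is fine here). `iChannel0` is assumed to hold the buffer above:

```glsl
// Image -- "view the hologram". For every image-plane point y, sum the
// transmitted field H(x) * R(x) * exp(i*k*r)/r over a grid of film
// samples x (brute-force Huygens sum -- seconds per frame).
// Constants must match Buffer A; iChannel0 is Buffer A.

const float LAMBDA = 0.01;
const float K = 6.28318530718 / LAMBDA;
const int   STEPS = 128;   // film samples per axis => STEPS^2 terms per pixel

void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    // Reconstruct onto a virtual plane at the sphere's depth z = 2
    vec2 uv = 2.0 * fragCoord / iResolution.xy - 1.0;
    vec3 y = vec3(uv, 2.0);

    vec3 refDir = normalize(vec3(0.2, 0.0, 1.0)); // same as Buffer A
    vec2 field = vec2(0.0);

    for (int i = 0; i < STEPS; i++) {
        for (int j = 0; j < STEPS; j++) {
            // Film sample: texture coord s in [0,1]^2, film point at z = 0
            vec2 s = (vec2(i, j) + 0.5) / float(STEPS);
            vec3 x = vec3(2.0 * s - 1.0, 0.0);

            // Recorded transmittance, read back from Buffer A
            float H = texture(iChannel0, s).r;

            // Illuminate with the reference wave, propagate to y
            float r = distance(y, x);
            float phase = K * dot(refDir, x) + K * r;
            field += H * vec2(cos(phase), sin(phase)) / r;
        }
    }
    field /= float(STEPS * STEPS);

    // Show the reconstructed intensity (crude tone map). Expect the
    // object points plus real hologram artifacts: the undiffracted
    // beam and the conjugate "twin image", plus aliasing if STEPS is low.
    float I = dot(field, field);
    fragColor = vec4(vec3(1.0 - exp(-50.0 * I)), 1.0);
}
```

A nice side effect of this brute-force formulation: moving `y` around is literally moving your eye, and the restricted viewing angles from the note in step 1 fall out naturally, since the fringes can only encode diffraction angles up to what the texture resolution allows.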
The performance of the solution is irrelevant. Even if it takes an hour to generate the data and a minute to render one frame at low resolution, that's fine.
Note: the goal is not to create something that merely looks like a cool hologram, nor to render 3D objects with volume as with SDFs or volume rendering. It's all about creating a basic physical simulation of viewing a real hologram.
I just learned about Gaussian splats, and they were the first thing I thought of that might be useful for generating a 3D model that could be converted to a hologram, because the technique specifically uses distance data from reflections to build the model and to describe how that model fills a volume.
I’m wondering if that type of data is abstract enough, follows similar principles and contains enough information to be used as a source to generate interference patterns.