I have been wanting to do this for a loooong time. I originally wanted to try it in Maxwell Render many years ago, but decided against it, since the first proton in the Universe would have decayed before anything useful came out of Maxwell.
(Sorry Next Limit...Great program, but CPUs bad...GPUs good!)
Anyway, I wanted to do lens flare effects inside an unbiased renderer. The only way to do that, without code such as that designed by Hullin, Eisemann, Seidel and Lee ("Physically-Based Real-Time Lens Flare Rendering"), is to build an actual compound lens in 3D and place the Octane camera inside it, looking backward at the focal plane. The code mentioned above has been around for more than a year, I think, but it does not appear that anyone has bothered to incorporate it into their programs yet. Possibly its creators are asking too much money, maybe they're sitting on it for now...Who knows.
Modeling the virtual camera for this experiment was very easy and straightforward...The hard part was the time spent searching the Internet for accurate engineering drawings or patents of various lenses. I eventually found a bunch of lens patents via http://www.patentlens.net/. Of key importance is that all of the lens elements' radii of curvature and the distances between elements are noted, as well as the IOR of each element (compound lenses do NOT use the same IOR for all elements).
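To give a sense of what those patent tables contain, here is a minimal sketch of how a prescription could be represented before placing the elements in 3D. The radii, gaps, and IOR values below are illustrative placeholders, not any real patented design:

```python
from typing import List, Tuple

# Hypothetical prescription, one entry per surface:
# (radius_of_curvature_mm, gap_to_next_surface_mm, ior_of_medium_after_surface)
# These numbers are made up for illustration -- NOT a real patent table.
PRESCRIPTION: List[Tuple[float, float, float]] = [
    ( 40.0,  6.0, 1.62),   # front surface of element 1 (glass after it)
    (-120.0, 1.5, 1.00),   # rear surface of element 1, then an air gap
    ( 25.0,  4.5, 1.72),   # front surface of element 2 (denser glass)
    ( 80.0, 30.0, 1.00),   # rear surface of element 2, air to focal plane
]

def surface_positions(prescription):
    """Accumulate the gaps to get each surface's z position on the optical axis."""
    z, positions = 0.0, []
    for radius, gap, ior in prescription:
        positions.append(z)
        z += gap
    return positions

positions = surface_positions(PRESCRIPTION)
# → [0.0, 6.0, 7.5, 12.0]
```

With the usual sign convention (light traveling toward +z, positive radius meaning the center of curvature lies beyond the surface), each surface at position z can then be modeled as a section of a sphere of radius |R| centered at z + R.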
So here are the results. The image on the left is done completely within Octane (no post in Photoshop). The image on the right has post work, since Octane does not yet handle the Fraunhofer diffraction that produces the starburst patterns found around bright light sources. These star patterns are caused by the diaphragm within the lens system and change according to the size of the aperture and the number of blades that form its sides. Hopefully, Octane will handle apertures in the future!
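For anyone wanting to fake those starbursts in post with something more principled than a brush: in the Fraunhofer (far-field) regime, the diffraction pattern is simply the squared magnitude of the Fourier transform of the aperture shape. A minimal sketch with NumPy, using hypothetical helper names (this is not an Octane feature):

```python
import numpy as np

def aperture_mask(n_blades=6, size=256, radius=0.4):
    """Binary mask of a regular polygonal iris with n_blades straight blades."""
    y, x = np.mgrid[-0.5:0.5:size * 1j, -0.5:0.5:size * 1j]
    theta = np.arctan2(y, x)
    r = np.hypot(x, y)
    sector = np.pi / n_blades
    # Distance from the centre to the polygon's edge, as a function of angle:
    # apothem (radius * cos(sector)) divided by the cosine of the angle
    # measured from the nearest edge's midpoint.
    edge = radius * np.cos(sector) / np.cos((theta % (2 * sector)) - sector)
    return (r <= edge).astype(float)

def starburst(mask):
    """Fraunhofer far-field intensity: |Fourier transform of the aperture|^2."""
    field = np.fft.fftshift(np.fft.fft2(mask))
    return np.abs(field) ** 2

psf = starburst(aperture_mask(n_blades=6))
# Each straight blade edge diffracts light into a pair of opposite spikes,
# which is why an even blade count yields N spikes (pairs overlap) while an
# odd count yields 2N.
```

Normalizing `psf` on a log scale and screening it over the bright light source gives a reasonable stand-in for the in-lens effect.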
Curiously, I found an odd issue in Octane when setting up the light sources. There are two sources...the HDR background, and an emitter ball just above the fire extinguisher bottle that casts the flare effects. Whenever I changed the intensity of the emitter ball, the HDR would change in intensity too, as if it were just another surface lit by the ball! Clearly, if the ball is reduced to zero intensity, the HDR should remain at full brightness. This was not the case. I thought the ball (being as bright as a little Sun) might be flooding the lens with light, but if that were true, then removing the ball entirely should still not affect the HDR in any way. I found that I had to orchestrate the intensities of the HDR and the ball to get things to work. Now, I may be doing something stupid so that it only SEEMS the ball is coupled to the HDR, but I don't really know for sure. Below are two examples: one with the ball at 1000W and one at 10W.
One other note...I found that unless I increased PMC Max Depth and Exploration Strength, and lowered Direct Light Importance, not all of the lens reflections would render. If I understand it correctly, Octane simply moves on to other rays at some cutoff point. If that is true, then the drawback of doing all of this is that a good portion of the render time is spent calculating reflections and light effects inside the virtual camera, on top of having to render the scene itself! So I have no idea what would happen if, for instance, you tried to render a scene whose main subject was another lens seen head-on, with all of the reflections you would see in it.
Overall, I did not get the impression that the virtual camera overhead was all that bad. I will know more when I try a complex scene.