This was originally posted in the Unity Forums. Please respond to the thread here with any feedback, thank you 🙂
I’m in the (increasingly drawn out) process of adding VR support to my asset Fantastic Glass.
I have the grab pass working, I've dealt with (most of) the inconsistencies caused by Single-Pass rendering, and everything is synchronised to avoid any ghosting.
The only issue I seem to be left with is getting correct depth values in my shader from depth textures rendered by other cameras.
The depth textures, as received by the material, appear to be per-eye and include part of the dividing black border. They aren't requested in any special way; this is just how Unity renders and passes them along:
Here is an example in VR of the depth issue causing an offset edge in the distortion:
Here’s an example with VR support disabled:
Here’s another screenshot showing that each eye receives a different offset in VR, and that the border even shows up in Single-Pass:
Here’s an example in VR with the distortion effects disabled (the sphere is unlit with a red albedo to make the edges clearer):
Grab-pass and normal (e.g. _MainTex) UVs work fine. However, if I use a similar UV, or Unity’s various new stereo screen-space functions, with the depth textures, the results are even worse than in these examples.
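For reference, the sampling I’ve been attempting looks roughly like this. This is a simplified sketch, not my actual shader: `_ExternalDepthTex` is a placeholder name for the depth texture rendered by the other camera and assigned to the material from script; the stereo macros and `LinearEyeDepth` are the standard UnityCG.cginc helpers:

```hlsl
#include "UnityCG.cginc"

// Placeholder for the depth texture rendered by the other camera
// and assigned to the material from script.
sampler2D _ExternalDepthTex;

struct v2f
{
    float4 pos       : SV_POSITION;
    float4 screenPos : TEXCOORD0;
    UNITY_VERTEX_OUTPUT_STEREO
};

v2f vert (appdata_base v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
    o.pos = UnityObjectToClipPos(v.vertex);
    o.screenPos = ComputeScreenPos(o.pos);
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);

    // Perspective divide, then the single-pass stereo remap --
    // this works fine for the grab pass texture...
    float2 uv = i.screenPos.xy / i.screenPos.w;
    uv = UnityStereoTransformScreenSpaceTex(uv);

    // ...but applied to the external depth texture it produces
    // the offset edges shown in the screenshots.
    float rawDepth = SAMPLE_DEPTH_TEXTURE(_ExternalDepthTex, uv);
    float eyeDepth = LinearEyeDepth(rawDepth);

    // Fog/extinction intensity would be derived from eyeDepth here.
    return fixed4(eyeDepth.xxx, 1);
}
```

Using the raw `uv` without the stereo remap, or using `UNITY_PROJ_COORD` with `tex2Dproj`, gives similarly wrong offsets against these externally rendered depth textures.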
Here’s an example in VR of fogging and extinction showing the same edge – their intensity is derived only from the front, back, and other (non-glass) depth values:
Does anyone have experience getting something like this to work, or an idea of what I should be doing to get the right values?
I can’t find an explanation of how Unity recommends integrating their VR support and would massively appreciate any help 🙂