{{short description|Computer graphics shading and rendering technique}}
[[File:AmbientOcclusion German.jpg|thumb|The ambient occlusion map (middle image) for this scene darkens only the innermost angles of corners.]]

In [[3D computer graphics]], [[3D modeling|modeling]], and [[Computer animation|animation]], '''ambient occlusion''' is a [[shading]] and [[Rendering (computer graphics)|rendering]] technique used to calculate how exposed each point in a scene is to [[Shading#Ambient lighting|ambient lighting]]. For example, the interior of a tube is typically more occluded (and hence darker) than the exposed outer surfaces, and becomes darker the deeper inside the tube one goes.

Ambient occlusion can be seen as an accessibility value that is calculated for each surface point.<ref>{{cite book| author=Miller, Gavin| chapter=Efficient algorithms for local and global accessibility shading| title=Proceedings of the 21st annual conference on Computer graphics and interactive techniques| year=1994| pages=319–326}}</ref> In scenes with open sky, this is done by estimating the amount of visible sky for each point, while in indoor environments, only objects within a certain radius are taken into account and the walls are assumed to be the origin of the ambient light. The result is a [[Diffuse reflection|diffuse]], non-directional shading effect that casts no clear shadows, but that darkens enclosed and sheltered areas and can affect the rendered image's overall tone. It is often used as a [[Image editing|post-processing]] effect.

Unlike local methods such as [[Phong shading]], ambient occlusion is a global method, meaning that the illumination at each point is a function of other geometry in the scene. However, it is a very crude approximation to full [[global illumination]]. The appearance achieved by ambient occlusion alone is similar to the way an object might appear on an [[overcast]] day.

The first method that allowed simulating ambient occlusion in real time was developed by the research and development department of [[Crytek]] ([[CryEngine|CryEngine 2]]).<ref>{{cite web |url=https://vr.arvilab.com/blog/ambient-occlusion |title=Ambient Occlusion: An Extensive Guide on Its Algorithms and Use in VR |publisher=ARVIlab |access-date=2018-11-26}}</ref> With the release of hardware capable of real-time ray tracing ([[GeForce 20 series]]) by [[Nvidia]] in 2018, [[Ray tracing (graphics)|ray-traced]] ambient occlusion (RTAO) became possible in games and other real-time applications.<ref>{{cite AV media|url=https://www.youtube.com/watch?v=yag6e2Npw4M |archive-url=https://ghostarchive.org/varchive/youtube/20211212/yag6e2Npw4M |archive-date=2021-12-12 |url-status=live |publisher=Nvidia |title=Ray Traced Ambient Occlusion}}{{cbignore}}</ref> This feature was added to the [[Unreal Engine]] with version 4.22.<ref>{{cite news|url=https://www.extremetech.com/computing/285701-unreal-engine-adds-support-for-dx12-raytracing|title=Unreal Engine Adds Support for DX12 Raytracing|work=ExtremeTech}}</ref>

==Implementation==
[[File:Efecto de la oclusión ambiental.ogg|thumbnail|3D animation of ambient occlusion]]
In the absence of hardware-assisted [[Ray tracing (graphics)|ray-traced]] ambient occlusion, [[real-time computer graphics|real-time]] applications such as computer games can use [[screen space ambient occlusion]] (SSAO) techniques such as [[horizon-based ambient occlusion]] (HBAO) and [[ground-truth ambient occlusion]] (GTAO) as a faster approximation of true ambient occlusion, using [[Z-buffering|per-pixel depth]], rather than scene geometry, to form an ambient occlusion [[Texture mapping|map]].
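
The following is a minimal sketch of the screen-space idea described above, not any particular engine's implementation: occlusion is estimated purely from a depth buffer by checking how many nearby pixels lie closer to the camera than the pixel being shaded. The function name, the pixel-space <code>radius</code>, the number of <code>samples</code> and the depth <code>bias</code> are illustrative choices, not standard values.

<syntaxhighlight lang="python">
import numpy as np

def ssao_from_depth(depth, radius=4, samples=16, bias=0.02, rng=None):
    """Crude screen-space ambient occlusion from a depth buffer.

    `depth` is a 2-D float array of normalized depths (0 = near, 1 = far).
    For each pixel, the fraction of randomly chosen neighbours (within
    `radius` pixels) whose depth is closer to the camera by more than
    `bias` is used as the occlusion estimate.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    occluded = np.zeros_like(depth)
    # Random pixel offsets, shared by every pixel; a production shader
    # would jitter them per pixel to turn banding into noise.
    offsets = rng.integers(-radius, radius + 1, size=(samples, 2))
    for dy, dx in offsets:
        # Shift the whole depth buffer and compare it with the centre depth.
        # (np.roll wraps around at the borders; a real implementation
        # would clamp instead.)
        neighbour = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        occluded += (neighbour < depth - bias).astype(depth.dtype)
    return 1.0 - occluded / samples  # 1 = fully exposed, 0 = fully occluded
</syntaxhighlight>

The resulting per-pixel map is usually blurred and then multiplied into the ambient term of the lighting calculation rather than displayed directly.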


Ambient occlusion is related to accessibility shading, which determines appearance based on how easy it is for a surface to be touched by various elements (e.g., dirt, light, etc.). It has been popularized in production animation due to its relative simplicity and efficiency.


The ambient occlusion shading model offers a better perception of the 3D shape of the displayed objects. This was shown in a paper where the authors report the results of perceptual experiments showing that depth discrimination under diffuse uniform sky lighting is superior to that predicted by a direct lighting model.<ref>{{cite journal|doi=10.1068/p3060|title=Depth discrimination from shading under diffuse lighting|first=M.S.|last=Langer|author2=H. H. Buelthoff|journal=Perception|volume=29|issue=6|pages=649–660|year=2000|pmid=11040949|citeseerx=10.1.1.69.6103|s2cid=11700764 }}</ref>


The occlusion <math>A_{\bar p}</math> at a point <math>\bar p</math> on a surface with normal <math>\hat n</math> can be computed by integrating the visibility function over the hemisphere <math>\Omega</math> with respect to projected solid angle:

{{center|
<math>
A_{\bar p} = \frac{1}{\pi} \int_{\Omega} V_{\bar p,\hat\omega} (\hat n \cdot \hat\omega ) \, \operatorname{d}\omega
</math>
}}


where <math>V_{\bar p,\hat\omega}</math> is the visibility function at <math>\bar p</math>, defined to be zero if <math>\bar p</math> is occluded in the direction <math>\hat\omega</math> and one otherwise, and <math>\operatorname{d}\omega</math> is the infinitesimal [[solid angle]] step of the integration variable <math>\hat\omega</math>. A variety of techniques are used to approximate this integral in practice: perhaps the most straightforward way is to use the [[Monte Carlo method]] by casting rays from the point <math>\bar p</math> and testing for intersection with other scene geometry (i.e., [[ray casting]]). Another approach (more suited to hardware acceleration) is to render the view from <math>\bar p</math> by [[Rasterisation|rasterizing]] black geometry against a white background and taking the (cosine-weighted) average of rasterized fragments. This approach is an example of a "gathering" or "inside-out" approach, whereas other algorithms (such as depth-map ambient occlusion) employ "scattering" or "outside-in" techniques.
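
A minimal Monte Carlo estimator for the integral above might look as follows. The ray-intersection test is the expensive part and is left as a user-supplied callback: <code>scene_occluded(origin, direction)</code> is a hypothetical function assumed to return <code>True</code> when a ray cast from <code>origin</code> along <code>direction</code> hits other scene geometry. Because the sample directions are drawn with cosine-weighted probability, the <math>(\hat n \cdot \hat\omega)/\pi</math> factor is folded into the sampling, and the estimator reduces to the fraction of unoccluded rays.

<syntaxhighlight lang="python">
import math
import random

def normalize(v):
    length = math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
    return (v[0] / length, v[1] / length, v[2] / length)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sample_cosine_hemisphere(n):
    """Random direction in the hemisphere around unit normal `n`,
    distributed proportionally to the cosine of the angle to `n`."""
    r1, r2 = random.random(), random.random()
    phi = 2.0 * math.pi * r1
    x, y = math.cos(phi) * math.sqrt(r2), math.sin(phi) * math.sqrt(r2)
    z = math.sqrt(1.0 - r2)
    # Build an orthonormal basis (t, b, n) around the normal.
    t = normalize((0.0, -n[2], n[1]) if abs(n[0]) < 0.9 else (-n[2], 0.0, n[0]))
    b = cross(n, t)
    return tuple(x * t[i] + y * b[i] + z * n[i] for i in range(3))

def ambient_occlusion(p, n, scene_occluded, samples=64, eps=1e-4):
    """Estimate A_p at point `p` with unit normal `n` by casting `samples`
    cosine-weighted rays; returns 1 for fully exposed, 0 for fully occluded."""
    origin = tuple(p[i] + eps * n[i] for i in range(3))  # offset to avoid self-hits
    unoccluded = sum(
        0 if scene_occluded(origin, sample_cosine_hemisphere(n)) else 1
        for _ in range(samples)
    )
    return unoccluded / samples
</syntaxhighlight>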


In addition to the ambient occlusion value, a "bent normal" vector <math>\hat{n}_b</math> is often generated, which points in the average direction of unoccluded samples. The bent normal can be used to look up incident [[radiance]] from an [[environment map]] to approximate [[image-based lighting]]. However, there are some situations in which the direction of the bent normal is a misrepresentation of the dominant direction of illumination, for example:


[[Image:Aocclude bentnormal.png|thumb|center|400px|In this example the bent normal N<sub>b</sub> points directly at an occluded surface, so it misrepresents the dominant direction of the incoming light.]]


In this example, light may reach the point p only from the left or right sides, but the bent normal points to the average of those two sources, which is directly toward the obstruction.
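
As an illustrative extension of the Monte Carlo sketch above (reusing its hypothetical <code>scene_occluded</code> callback and its <code>sample_cosine_hemisphere</code> and <code>normalize</code> helpers), a bent normal can be accumulated as the normalized average of the unoccluded sample directions:

<syntaxhighlight lang="python">
def bent_normal(p, n, scene_occluded, samples=64, eps=1e-4):
    """Average direction of the unoccluded samples around point `p`.
    Falls back to the surface normal if every sample is blocked."""
    origin = tuple(p[i] + eps * n[i] for i in range(3))
    total = [0.0, 0.0, 0.0]
    for _ in range(samples):
        d = sample_cosine_hemisphere(n)
        if not scene_occluded(origin, d):
            for i in range(3):
                total[i] += d[i]
    if total == [0.0, 0.0, 0.0]:
        return n
    return normalize(tuple(total))
</syntaxhighlight>

In the configuration shown in the figure, this average points toward the obstruction even though each contributing sample direction is individually unoccluded, which is exactly the failure case described above.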


===Variants===
* [[Screen space ambient occlusion]] (SSAO)
* [[Screen space directional occlusion]] (SSDO)
* [[Ray-traced ambient occlusion]] (RTAO)
* High Definition Ambient Occlusion (HDAO)
* Horizon Based Ambient Occlusion+ (HBAO+)
* Alchemy Ambient Occlusion (AAO)
* Angle Based Ambient Occlusion (ABAO)
* Pre Baked Ambient Occlusion (PBAO)
* Voxel Accelerated Ambient Occlusion (VXAO)
* Ground Truth based Ambient Occlusion (GTAO)<ref>{{cite web|title=Practical Realtime Strategies for Accurate Indirect Occlusion|url=http://iryoku.com/downloads/Practical-Realtime-Strategies-for-Accurate-Indirect-Occlusion.pdf}}</ref>


==Recognition==
In 2010, Hayden Landis, Ken McGaugh and Hilmar Koch were awarded a Scientific and Technical Academy Award for their work on ambient occlusion rendering.<ref>Oscar 2010: Scientific and Technical Awards, Alt Film Guide, Jan 7, 2010.</ref>

==See also==
* [[Radiosity (3D computer graphics)|Radiosity]]
* [[Ray tracing (graphics)|Ray tracing]]
* [[High-dynamic-range rendering]]


==References==
<references/>

{{Texture mapping techniques}}


[[Category:Shading]]
