<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://vrarwiki.com/index.php?action=history&amp;feed=atom&amp;title=Focal_surface_display</id>
	<title>Focal surface display - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://vrarwiki.com/index.php?action=history&amp;feed=atom&amp;title=Focal_surface_display"/>
	<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;action=history"/>
	<updated>2026-04-19T02:52:49Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0</generator>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=36197&amp;oldid=prev</id>
		<title>RealEditor: formatting and link</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=36197&amp;oldid=prev"/>
		<updated>2025-07-02T20:13:41Z</updated>

		<summary type="html">&lt;p&gt;formatting and link&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 20:13, 2 July 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;==Introduction==&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display prototype.png|thumb|Figure 1. Focal surface display prototype. (Image: Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display prototype.png|thumb|Figure 1. Focal surface display prototype. (Image: Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display spatial light modulator.png|thumb|Figure 2. In a focal surface display, a spatial light modulator is placed between the screen and eyepiece of a VR headset. (Image: roadtovr.com)]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display spatial light modulator.png|thumb|Figure 2. In a focal surface display, a spatial light modulator is placed between the screen and eyepiece of a VR headset. (Image: roadtovr.com)]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l9&quot;&gt;Line 9:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 8:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: the inability to display scene content at correct focal depths. These HMDs have a fixed focal distance determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content at a virtual distance from the viewer different from the fixed focal distance of the headset’s screen will lead to a [[vergence-accommodation conflict]] - binocular disparity cues (vergence) conflicting with focus cues (accommodation). The vergence-accommodation conflict prevents VR scene content from appearing sharply in focus and may contribute to user fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&amp;#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: the inability to display scene content at correct focal depths. These HMDs have a fixed focal distance determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content at a virtual distance from the viewer different from the fixed focal distance of the headset’s screen will lead to a [[vergence-accommodation conflict]] - binocular disparity cues (vergence) conflicting with focus cues (accommodation). The vergence-accommodation conflict prevents VR scene content from appearing sharply in focus and may contribute to user fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&amp;#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display takes a new approach to avoiding the vergence-accommodation conflict by changing the way light enters the display, using spatial light &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;modulators &lt;/del&gt;(Figure 2) to bend the HMD’s focus around 3D objects. This results in increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display takes a new approach to avoiding the vergence-accommodation conflict by changing the way light enters the display, using &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;spatial light &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;modulator]]s &lt;/ins&gt;(Figure 2) to bend the HMD’s focus around 3D objects. This results in increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The vergence-accommodation conflict has motivated many proposals for VR technology that delivers near-correct accommodation cues. The focal surface display technology could help future VR headsets by improving image sharpness and depth of focus, resulting in an experience that approaches how the eyes normally function, thereby reducing discomfort while improving the user’s [[immersion]] in virtual reality. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt;Matsuda, N., Fix, A. and Lanman, D. (2017). Focal surface displays. ACM Transactions on Graphics, 36(4)&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;Halfacree, G. (2017). Oculus VR outs focal surface display technology. Retrieved from https://www.bit-tech.net/news/tech/peripherals/oculus-vr-focal-surface-display/1/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The vergence-accommodation conflict has motivated many proposals for VR technology that delivers near-correct accommodation cues. The focal surface display technology could help future VR headsets by improving image sharpness and depth of focus, resulting in an experience that approaches how the eyes normally function, thereby reducing discomfort while improving the user’s [[immersion]] in virtual reality. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt;Matsuda, N., Fix, A. and Lanman, D. (2017). Focal surface displays. ACM Transactions on Graphics, 36(4)&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;Halfacree, G. (2017). Oculus VR outs focal surface display technology. Retrieved from https://www.bit-tech.net/news/tech/peripherals/oculus-vr-focal-surface-display/1/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>RealEditor</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=35428&amp;oldid=prev</id>
		<title>Xinreality at 21:22, 7 May 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=35428&amp;oldid=prev"/>
		<updated>2025-05-07T21:22:30Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 21:22, 7 May 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l33&quot;&gt;Line 33:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 33:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Different HMD architectures have been proposed to solve this problem and depict correct or near-correct retinal blur (Figure 3). Focal surface displays augment regular HMDs with a spatial light modulator that “acts as a dynamic freeform lens, shaping synthesized focal surfaces to conform to the virtual scene geometry.” Furthermore, Oculus Research has introduced “a framework to decompose target focal stacks and depth maps into one or more pairs of piecewise smooth focal surfaces and underlying display images,” building on “recent developments in &amp;quot;optimized blending&amp;quot; to implement a multifocal display that allows the accurate depiction of occluding, semi-transparent, and reflective objects.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Different HMD architectures have been proposed to solve this problem and depict correct or near-correct retinal blur (Figure 3). Focal surface displays augment regular HMDs with a spatial light modulator that “acts as a dynamic freeform lens, shaping synthesized focal surfaces to conform to the virtual scene geometry.” Furthermore, Oculus Research has introduced “a framework to decompose target focal stacks and depth maps into one or more pairs of piecewise smooth focal surfaces and underlying display images,” building on “recent developments in &amp;quot;optimized blending&amp;quot; to implement a multifocal display that allows the accurate depiction of occluding, semi-transparent, and reflective objects.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Contrary to multifocal displays with fixed focal surfaces, the phase modulator shapes focal surfaces to conform to the scene geometry. A set of color images is produced and mapped onto a corresponding focal surface (Figure 4), with the visual appearance rendered by “tracing rays from the eye through the optics, and accumulating the color values for each focal surface.” Furthermore, Matsuda &#039;&#039;et al&#039;&#039;. (2017) explain that their “algorithm sequentially solves for first the focal surfaces, given the target depth map, and then the color &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;images—full &lt;/del&gt;joint optimization is left for future work. Focal surfaces are adapted by nonlinear least squares optimization, minimizing the distance between the nearest depicted surface and the scene geometry. The color images, paired with each surface, are determined by linear least squares methods.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Contrary to multifocal displays with fixed focal surfaces, the phase modulator shapes focal surfaces to conform to the scene geometry. A set of color images is produced and mapped onto a corresponding focal surface (Figure 4), with the visual appearance rendered by “tracing rays from the eye through the optics, and accumulating the color values for each focal surface.” Furthermore, Matsuda &#039;&#039;et al&#039;&#039;. (2017) explain that their “algorithm sequentially solves for first the focal surfaces, given the target depth map, and then the color &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;images, full &lt;/ins&gt;joint optimization is left for future work. Focal surfaces are adapted by nonlinear least squares optimization, minimizing the distance between the nearest depicted surface and the scene geometry. The color images, paired with each surface, are determined by linear least squares methods.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The focal surface display research team demonstrated that the technology depicts more accurate retinal blur using fewer multiplexed images, while maintaining high resolution throughout the user’s accommodative range. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The focal surface display research team demonstrated that the technology depicts more accurate retinal blur using fewer multiplexed images, while maintaining high resolution throughout the user’s accommodative range. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=35427&amp;oldid=prev</id>
		<title>Xinreality at 21:22, 7 May 2025</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=35427&amp;oldid=prev"/>
		<updated>2025-05-07T21:22:03Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 21:22, 7 May 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l20&quot;&gt;Line 20:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 20:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Development and announcement of the focal surface display==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Development and announcement of the focal surface display==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The Oculus focal surface display project was a long time in development. According to a research scientist at Oculus Research, “manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;‘right’ way—solid &lt;/del&gt;engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked together prototype that didn’t have any rigorous explanation of why it worked.” &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The Oculus focal surface display project was a long time in development. According to a research scientist at Oculus Research, “manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&quot;right&quot; way: solid &lt;/ins&gt;engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked together prototype that didn’t have any rigorous explanation of why it worked.” &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;In May 2017, Oculus Research - the VR and AR R&amp;amp;D division of Oculus - announced the new display technology. During the same period, they published a research paper about their focal surface display, authored by Oculus scientists Nathan Matsuda, Alexander Fix, and [[Douglas Lanman]]. The research was also presented at the SIGGRAPH conference in July 2017. &amp;lt;ref name=”7”&amp;gt;Lang, B. (2017). Oculus Research reveals “groundbreaking” focal surface display. Retrieved from https://www.roadtovr.com/oculus-research-demonstrate-groundbreaking-focal-surface-display/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;In May 2017, Oculus Research - the VR and AR R&amp;amp;D division of Oculus - announced the new display technology. During the same period, they published a research paper about their focal surface display, authored by Oculus scientists Nathan Matsuda, Alexander Fix, and [[Douglas Lanman]]. The research was also presented at the SIGGRAPH conference in July 2017. &amp;lt;ref name=”7”&amp;gt;Lang, B. (2017). Oculus Research reveals “groundbreaking” focal surface display. Retrieved from https://www.roadtovr.com/oculus-research-demonstrate-groundbreaking-focal-surface-display/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30088&amp;oldid=prev</id>
		<title>Acro: /* Introduction */ Change &quot;currently&quot; to 2017 past tense</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30088&amp;oldid=prev"/>
		<updated>2024-04-17T06:06:18Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Introduction: &lt;/span&gt; Change &amp;quot;currently&amp;quot; to 2017 past tense&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 06:06, 17 April 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l17&quot;&gt;Line 17:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 17:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;There had been previous attempts to solve the vergence-accommodation conflict, such as using integral imaging techniques to synthesize [[light field]]s from scene content or displaying multiple focal planes, but these suffered from problems such as low-fidelity accommodation cues, low resolution, and a narrow field of view. The focal surface display is expected to generate high-fidelity accommodation cues using off-the-shelf optical components. The spatial light modulator - placed between the display screen and eyepiece - produces variable focus across the display field of view. &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;There had been previous attempts to solve the vergence-accommodation conflict, such as using integral imaging techniques to synthesize [[light field]]s from scene content or displaying multiple focal planes, but these suffered from problems such as low-fidelity accommodation cues, low resolution, and a narrow field of view. The focal surface display is expected to generate high-fidelity accommodation cues using off-the-shelf optical components. The spatial light modulator - placed between the display screen and eyepiece - produces variable focus across the display field of view. &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Currently&lt;/del&gt;, there &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;is &lt;/del&gt;no planned commercial release for &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;the &lt;/del&gt;focal surface display technology. &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;As of 2017&lt;/ins&gt;, there &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;was &lt;/ins&gt;no &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;known &lt;/ins&gt;planned commercial release for focal surface display technology. &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Development and announcement of the focal surface display==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Development and announcement of the focal surface display==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Acro</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30085&amp;oldid=prev</id>
		<title>Acro: /* Introduction */ Link light field and VAC in separate places</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30085&amp;oldid=prev"/>
		<updated>2024-04-17T06:04:53Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Introduction: &lt;/span&gt; Link light field and VAC in separate places&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 06:04, 17 April 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l7&quot;&gt;Line 7:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 7:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Focal surface display is a technology developed by [[Oculus]] Research that improves focus on images generated by a [[virtual reality]] (VR) [[head-mounted display]] (HMD) by simulating the way the eyes naturally focus at real object of varying depths (Figure 1). &amp;lt;ref name=”1”&amp;gt;Oculus VR (2017). Oculus Research to present focal surface display discovery at SIGGRAPH. Retrieved from https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Focal surface display is a technology developed by [[Oculus]] Research that improves focus on images generated by a [[virtual reality]] (VR) [[head-mounted display]] (HMD) by simulating the way the eyes naturally focus at real object of varying depths (Figure 1). &amp;lt;ref name=”1”&amp;gt;Oculus VR (2017). Oculus Research to present focal surface display discovery at SIGGRAPH. Retrieved from https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: not being able to display scene content at correct focal depths. These HMDs have a fixed-focus accommodation determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content with a virtual distance from the viewer different than the fixed focal distance of the headset’s screen will lead to a vergence-accommodation conflict - arising from binocular disparity cues (vergence) in conflict with focus cues (accommodation). The vergence-accommodation conflict prevents the VR content scenes from appearing sharply in focus and may contribute to user’s fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: not being able to display scene content at correct focal depths. These HMDs have a fixed-focus accommodation determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content with a virtual distance from the viewer different than the fixed focal distance of the headset’s screen will lead to a &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;vergence-accommodation conflict&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]] &lt;/ins&gt;- arising from binocular disparity cues (vergence) in conflict with focus cues (accommodation). The vergence-accommodation conflict prevents the VR content scenes from appearing sharply in focus and may contribute to user’s fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display has a new approach to avoid the vergence-accommodation conflict by changing the way light enters the display using spatial light modulators (Figure 2) to bend the HMD’s focus around 3D objects. This results in an increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display has a new approach to avoid the vergence-accommodation conflict by changing the way light enters the display using spatial light modulators (Figure 2) to bend the HMD’s focus around 3D objects. This results in an increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l15&quot;&gt;Line 15:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 15:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The development of the Oculus focal surface display was an interdisciplinary task, “combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR.” This technology could even allow people who wear corrective lenses use a VR HMD without glasses. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The development of the Oculus focal surface display was an interdisciplinary task, “combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR.” This technology could even allow people who wear corrective lenses use a VR HMD without glasses. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;There had been previous attempts to solve the vergence-accommodation conflict such as using integral imaging techniques to synthesize light &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;fields &lt;/del&gt;from scene content or displaying multiple focal planes, but these suffered from such problems as low fidelity accommodation cues, low resolution, and low field of view. The focal surface display is expected to generate high fidelity accommodation cues using off-the-shelf optical components. The spatial light modulator - placed between the display screen and eyepiece - produces variable focus along the display field of view. &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;There had been previous attempts to solve the vergence-accommodation conflict such as using integral imaging techniques to synthesize &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;light &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;field]]s &lt;/ins&gt;from scene content or displaying multiple focal planes, but these suffered from such problems as low fidelity accommodation cues, low resolution, and low field of view. The focal surface display is expected to generate high fidelity accommodation cues using off-the-shelf optical components. The spatial light modulator - placed between the display screen and eyepiece - produces variable focus along the display field of view. &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Currently, there is no planned commercial release for the focal surface display technology. &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Currently, there is no planned commercial release for the focal surface display technology. &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Acro</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30083&amp;oldid=prev</id>
		<title>Acro: Acro moved page Focal Surface Display to Focal surface display: Decapitalize non proper noun\</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30083&amp;oldid=prev"/>
		<updated>2024-04-17T06:03:19Z</updated>

		<summary type="html">&lt;p&gt;Acro moved page &lt;a href=&quot;/wiki/Focal_Surface_Display&quot; class=&quot;mw-redirect&quot; title=&quot;Focal Surface Display&quot;&gt;Focal Surface Display&lt;/a&gt; to &lt;a href=&quot;/wiki/Focal_surface_display&quot; title=&quot;Focal surface display&quot;&gt;Focal surface display&lt;/a&gt;: Decapitalize non proper noun\&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 06:03, 17 April 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;en&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Acro</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30081&amp;oldid=prev</id>
		<title>Acro: /* Development and announcement of the focal surface display */</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=30081&amp;oldid=prev"/>
		<updated>2024-04-17T06:00:49Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Development and announcement of the focal surface display&lt;/span&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 06:00, 17 April 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l22&quot;&gt;Line 22:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 22:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The Oculus focal surface display project was a long time in development. According to a research scientist at Oculus Research, “manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the ‘right’ way—solid engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked together prototype that didn’t have any rigorous explanation of why it worked.” &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The Oculus focal surface display project was a long time in development. According to a research scientist at Oculus Research, “manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the ‘right’ way—solid engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked together prototype that didn’t have any rigorous explanation of why it worked.” &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;On May, 2017, the VR and AR R&amp;amp;D division of Oculus - Oculus Research - announced the new display technology. During the same period, they published a research paper about their focal surface display, authored by Oculus scientists Nathan Matsuda, Alexander Fix, and Douglas Lanman. The research was also presented at the SIGGRAPH conference in July, 2017. &amp;lt;ref name=”7”&amp;gt;Lang, B. (2017). Oculus Research reveals “groundbreaking” focal surface display. Retrieved from https://www.roadtovr.com/oculus-research-demonstrate-groundbreaking-focal-surface-display/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;On May, 2017, the VR and AR R&amp;amp;D division of Oculus - Oculus Research - announced the new display technology. During the same period, they published a research paper about their focal surface display, authored by Oculus scientists Nathan Matsuda, Alexander Fix, and &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;Douglas Lanman&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]]&lt;/ins&gt;. The research was also presented at the SIGGRAPH conference in July, 2017. &amp;lt;ref name=”7”&amp;gt;Lang, B. (2017). Oculus Research reveals “groundbreaking” focal surface display. Retrieved from https://www.roadtovr.com/oculus-research-demonstrate-groundbreaking-focal-surface-display/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Focal Surface display technology==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Focal Surface display technology==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Acro</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=24872&amp;oldid=prev</id>
		<title>Xinreality at 08:08, 13 December 2017</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=24872&amp;oldid=prev"/>
		<updated>2017-12-13T08:08:14Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 08:08, 13 December 2017&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l45&quot;&gt;Line 45:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 45:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==References==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==References==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&amp;lt;references /&amp;gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[Category:Terms]] [[Category:Technical Terms]]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Xinreality</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=23698&amp;oldid=prev</id>
		<title>Paulo Pacheco at 17:46, 26 October 2017</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=23698&amp;oldid=prev"/>
		<updated>2017-10-26T17:46:56Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 17:46, 26 October 2017&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l5&quot;&gt;Line 5:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 5:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display technology.png|thumb|Figure 4. Focal surface display technology. (Image:Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;[[File:Focal surface display technology.png|thumb|Figure 4. Focal surface display technology. (Image:Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Focal surface display is a technology developed by Oculus Research that improves focus on images generated by a virtual reality (VR) head-mounted display (HMD) by simulating the way the eyes naturally focus at real object of varying depths (Figure 1). &amp;lt;ref name=”1”&amp;gt;Oculus VR (2017). Oculus Research to present focal surface display discovery at SIGGRAPH. Retrieved from https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Focal surface display is a technology developed by &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;Oculus&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]] &lt;/ins&gt;Research that improves focus on images generated by a &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;virtual reality&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]] &lt;/ins&gt;(VR) &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;head-mounted display&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]] &lt;/ins&gt;(HMD) by simulating the way the eyes naturally focus at real object of varying depths (Figure 1). &amp;lt;ref name=”1”&amp;gt;Oculus VR (2017). Oculus Research to present focal surface display discovery at SIGGRAPH. Retrieved from https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: not being able to display scene content at correct focal depths. These HMDs have a fixed-focus accommodation determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content with a virtual distance from the viewer different than the fixed focal distance of the headset’s screen will lead to a vergence-accommodation conflict - arising from binocular disparity cues (vergence) in conflict with focus cues (accommodation). The vergence-accommodation conflict prevents the VR content scenes from appearing sharply in focus and may contribute to user’s fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&amp;#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: not being able to display scene content at correct focal depths. These HMDs have a fixed-focus accommodation determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content with a virtual distance from the viewer different than the fixed focal distance of the headset’s screen will lead to a vergence-accommodation conflict - arising from binocular disparity cues (vergence) in conflict with focus cues (accommodation). The vergence-accommodation conflict prevents the VR content scenes from appearing sharply in focus and may contribute to user’s fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&amp;#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l11&quot;&gt;Line 11:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 11:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display has a new approach to avoid the vergence-accommodation conflict by changing the way light enters the display using spatial light modulators (Figure 2) to bend the HMD’s focus around 3D objects. This results in an increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;According to Oculus Research, the focal surface display has a new approach to avoid the vergence-accommodation conflict by changing the way light enters the display using spatial light modulators (Figure 2) to bend the HMD’s focus around 3D objects. This results in an increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The vergence-accommodation conflict has been a motivation for plentiful of proposals for VR technology that delivers near-correct accommodation cues. The focal surface display technology could help future VR headsets, improving image sharpness and depth of focus, resulting in an experience that approaches how the eyes normally function, thereby reducing discomfort while improving user’s immersion in the virtual reality. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt;Matsuda, N., Fix, A. and Lanman, D. (2017). Focal surface displays.ACM Transactions on Graphics, 36(4)&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;Halfacree, G. (2017). Oculus VR outs focal surface display technology. Retrieved from https://www.bit-tech.net/news/tech/peripherals/oculus-vr-focal-surface-display/1/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The vergence-accommodation conflict has been a motivation for plentiful of proposals for VR technology that delivers near-correct accommodation cues. The focal surface display technology could help future VR headsets, improving image sharpness and depth of focus, resulting in an experience that approaches how the eyes normally function, thereby reducing discomfort while improving user’s &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;immersion&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]] &lt;/ins&gt;in the virtual reality. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt;Matsuda, N., Fix, A. and Lanman, D. (2017). Focal surface displays.ACM Transactions on Graphics, 36(4)&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;Halfacree, G. (2017). Oculus VR outs focal surface display technology. Retrieved from https://www.bit-tech.net/news/tech/peripherals/oculus-vr-focal-surface-display/1/&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The development of the Oculus focal surface display was an interdisciplinary task, “combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR.” This technology could even allow people who wear corrective lenses use a VR HMD without glasses. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The development of the Oculus focal surface display was an interdisciplinary task, “combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR.” This technology could even allow people who wear corrective lenses use a VR HMD without glasses. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l40&quot;&gt;Line 40:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 40:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Oculus focal surface display is still not a perfect solution for the vergence-accommodation conflict. It is instead a middle stage between current VR display technology and a future one with ideal properties to completely solve the problems with vergence and accommodation. &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Oculus focal surface display is still not a perfect solution for the vergence-accommodation conflict. It is instead a middle stage between current VR display technology and a future one with ideal properties to completely solve the problems with vergence and accommodation. &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A consumer-ready product is still a long way out. Nevertheless, the research into this technology will benefit the VR an AR (augmented reality) industry, providing a new direction for future research to delve into. Indeed, the focal surface display requires eye-tracking - a technique that is also not a completely solved issue - and is difficult to implement with a wide field of view. There is also the possibility of focal surface displays being applied to AR devices. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A consumer-ready product is still a long way out. Nevertheless, the research into this technology will benefit the VR an AR (&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;[[&lt;/ins&gt;augmented reality&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;]]&lt;/ins&gt;) industry, providing a new direction for future research to delve into. Indeed, the focal surface display requires eye-tracking - a technique that is also not a completely solved issue - and is difficult to implement with a wide field of view. There is also the possibility of focal surface displays being applied to AR devices. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While the technology still needs to mature, there is the desire to move beyond fixed-focus headsets which will, inevitably, produce results that will translate into VR products that can be offered to the consumer, in the future. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;While the technology still needs to mature, there is the desire to move beyond fixed-focus headsets which will, inevitably, produce results that will translate into VR products that can be offered to the consumer, in the future. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==References==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==References==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Paulo Pacheco</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=23697&amp;oldid=prev</id>
		<title>Paulo Pacheco: Created page with &quot;==Introduction== Figure 1. Focus surface display prototype. (Image: Matsuda &#039;&#039;et al&#039;&#039;., 2017) File:Focal surface display s...&quot;</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Focal_surface_display&amp;diff=23697&amp;oldid=prev"/>
		<updated>2017-10-26T15:13:14Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;==Introduction== &lt;a href=&quot;/wiki/File:Focal_surface_display_prototype.png&quot; title=&quot;File:Focal surface display prototype.png&quot;&gt;thumb|Figure 1. Focus surface display prototype. (Image: Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)&lt;/a&gt; File:Focal surface display s...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;==Introduction==&lt;br /&gt;
[[File:Focal surface display prototype.png|thumb|Figure 1. Focus surface display prototype. (Image: Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;br /&gt;
[[File:Focal surface display spatial light modulator.png|thumb|Figure 2. In a focal surface display, a spatial light simulator is placed between the screen and eyepiece of a VR headset. (Image: roadtovr.com)]]&lt;br /&gt;
[[File:Focal surface display varifocal display comparison.png|thumb|Figure 3. Different solutions for the vergence-accommodation conflict. (Image: roadtovr.com)]]&lt;br /&gt;
[[File:Focal surface display technology.png|thumb|Figure 4. Focal surface display technology. (Image:Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;., 2017)]]&lt;br /&gt;
&lt;br /&gt;
Focal surface display is a technology developed by Oculus Research that improves focus on images generated by a virtual reality (VR) head-mounted display (HMD) by simulating the way the eyes naturally focus at real object of varying depths (Figure 1). &amp;lt;ref name=”1”&amp;gt;Oculus VR (2017). Oculus Research to present focal surface display discovery at SIGGRAPH. Retrieved from https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While modern VR experiences are superior to what they were just a few years ago, the Oculus focal surface display addresses a perceptual limitation of current HMDs: not being able to display scene content at correct focal depths. These HMDs have a fixed-focus accommodation determined by the headset’s eyepiece focal length. Although they give the illusion of depth from the stereo images, the images are essentially flat, at a fixed perceived distance from the face and with a focus selected by the software instead of the eyes. Scene content with a virtual distance from the viewer different than the fixed focal distance of the headset’s screen will lead to a vergence-accommodation conflict - arising from binocular disparity cues (vergence) in conflict with focus cues (accommodation). The vergence-accommodation conflict prevents the VR content scenes from appearing sharply in focus and may contribute to user’s fatigue and discomfort. &amp;lt;ref name=”2”&amp;gt;Comp Photo Lab. Focal surface displays. Retrieved from http://compphotolab.northwestern.edu/project/focal-surface-displays/&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;Miller, P. (2017). Oculus Research&amp;#039;s focal surface display could make VR much more comfortable for our eyeballs. Retrieved from https://www.theverge.com/circuitbreaker/2017/5/19/15667172/oculus-research-focal-surface-display-vr-comfort-eye-tracking&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;Coppock, M. (2017). Oculus developing ‘focal surface display’ for better VR image clarity. Retrieved from https://www.digitaltrends.com/computing/oculus-working-on-focal-surface-display-technology-for-improved-visual-clarity&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
According to Oculus Research, the focal surface display has a new approach to avoid the vergence-accommodation conflict by changing the way light enters the display using spatial light modulators (Figure 2) to bend the HMD’s focus around 3D objects. This results in an increased depth and maximizes the amount of space represented. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The vergence-accommodation conflict has been a motivation for plentiful of proposals for VR technology that delivers near-correct accommodation cues. The focal surface display technology could help future VR headsets, improving image sharpness and depth of focus, resulting in an experience that approaches how the eyes normally function, thereby reducing discomfort while improving user’s immersion in the virtual reality. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt;Matsuda, N., Fix, A. and Lanman, D. (2017). Focal surface displays.ACM Transactions on Graphics, 36(4)&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;Halfacree, G. (2017). Oculus VR outs focal surface display technology. Retrieved from https://www.bit-tech.net/news/tech/peripherals/oculus-vr-focal-surface-display/1/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The development of the Oculus focal surface display was an interdisciplinary task, “combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR.” This technology could even allow people who wear corrective lenses use a VR HMD without glasses. &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There had been previous attempts to solve the vergence-accommodation conflict such as using integral imaging techniques to synthesize light fields from scene content or displaying multiple focal planes, but these suffered from such problems as low fidelity accommodation cues, low resolution, and low field of view. The focal surface display is expected to generate high fidelity accommodation cues using off-the-shelf optical components. The spatial light modulator - placed between the display screen and eyepiece - produces variable focus along the display field of view. &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Currently, there is no planned commercial release for the focal surface display technology. &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Development and announcement of the focal surface display==&lt;br /&gt;
The Oculus focal surface display project was a long time in development. According to a research scientist at Oculus Research, “manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the ‘right’ way—solid engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked together prototype that didn’t have any rigorous explanation of why it worked.” &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
On May, 2017, the VR and AR R&amp;amp;D division of Oculus - Oculus Research - announced the new display technology. During the same period, they published a research paper about their focal surface display, authored by Oculus scientists Nathan Matsuda, Alexander Fix, and Douglas Lanman. The research was also presented at the SIGGRAPH conference in July, 2017. &amp;lt;ref name=”7”&amp;gt;Lang, B. (2017). Oculus Research reveals “groundbreaking” focal surface display. Retrieved from https://www.roadtovr.com/oculus-research-demonstrate-groundbreaking-focal-surface-display/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Focal surface display technology==&lt;br /&gt;
In current VR HMDs, the viewing optics - a magnifying lens - are placed between the user’s eyes and a display screen, a configuration that is mirrored for binocular stereo: one set of optics and one display, or a portion of it, is dedicated to each eye. Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;. (2017) write that “a binocular HMD depicts stereoscopic imagery such that the user perceives virtual objects with correct retinal disparity, which is the critical stimulus to vergence (the degree to which the eyes are converged or diverged to fixate a point).” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
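The vergence described in the quote above follows from simple geometry: for a point fixated straight ahead, the vergence angle is set by the inter-pupillary distance (IPD) and the fixation distance. A minimal sketch of that relationship (the function name and the 63 mm IPD are illustrative assumptions, not values from the paper):

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Vergence angle (degrees) for eyes with inter-pupillary distance
    ipd_m fixating a point distance_m straight ahead."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# A nearer fixation point demands a larger vergence angle.
near = vergence_angle_deg(0.063, 0.5)   # object 0.5 m away
far = vergence_angle_deg(0.063, 5.0)    # object 5 m away
print(f"{near:.2f} deg at 0.5 m, {far:.2f} deg at 5 m")
```

Stereoscopic rendering controls this angle through retinal disparity, which is why a binocular HMD can drive vergence correctly even while focus stays fixed.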
&lt;br /&gt;
Conventional VR HMDs therefore have two main optical components - the eyepiece and the display - which together deliver a single, fixed focal surface. With the Oculus focal surface technology, a third optical element is introduced: a phase-modifying spatial light modulator placed between the other two components. The spatial light modulator functions as a programmable lens with varying focal length, allowing the virtual image to be formed at different depths. This technology expands on the concept of an adaptive multifocal display. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
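Matsuda et al. do not publish their phase patterns, but the idea of a phase modulator acting as a programmable lens can be sketched with the standard paraxial lens phase, wrapped to the modulator’s 2π range. All parameter values below are illustrative assumptions, not specifications of the prototype:

```python
import numpy as np

def lens_phase(n=256, pitch=8e-6, wavelength=532e-9, focal_length=0.25):
    """Phase pattern (wrapped to [0, 2*pi)) that makes a phase-only SLM
    act as a thin lens with the given focal length (metres).

    Uses the paraxial lens phase phi(x, y) = -pi (x^2 + y^2) / (lambda f).
    """
    coords = (np.arange(n) - n / 2) * pitch          # pixel positions (m)
    x, y = np.meshgrid(coords, coords)
    phi = -np.pi * (x**2 + y**2) / (wavelength * focal_length)
    return np.mod(phi, 2.0 * np.pi)                  # wrap into SLM range

# Reprogramming focal_length moves the virtual image depth without
# moving any physical optics - the core of the "programmable lens" idea.
pattern_near = lens_phase(focal_length=0.20)
pattern_far = lens_phase(focal_length=0.50)
```

Varying the phase profile spatially, rather than using one global lens function, is what lets the focal length differ across the field of view.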
&lt;br /&gt;
Current VR HMD technology does not correctly represent retinal blur - a critical stimulus to accommodation. This leads to the vergence-accommodation conflict, which is responsible for visual discomfort - such as eye strain and blurred vision - and headaches. The vergence-accommodation conflict has also been associated with perceptual consequences, influencing eye movements and the ability to resolve depth. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
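The size of the conflict is usually quoted in diopters (1/metres): the fixed optics hold accommodation at one focal distance while disparity drives vergence to the virtual object’s distance. A small illustrative calculation (the 2 m focal distance is an assumed value for a typical fixed-focus HMD, not a figure from the paper):

```python
def vac_mismatch_diopters(vergence_dist_m: float, focal_dist_m: float) -> float:
    """Vergence-accommodation mismatch in diopters (1/m)."""
    return abs(1.0 / vergence_dist_m - 1.0 / focal_dist_m)

# Virtual object rendered 0.5 m away, optics focused at a fixed 2 m:
mismatch = vac_mismatch_diopters(0.5, 2.0)
print(f"{mismatch:.1f} D of conflict")
```

Mismatches well under a diopter are often cited as tolerable, which is why distant virtual content is comfortable on fixed-focus headsets while near content is not.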
&lt;br /&gt;
Different HMD architectures have been proposed to solve this problem and depict correct or near-correct retinal blur (Figure 3). Focal surface displays augment regular HMDs with a spatial light modulator that “acts as a dynamic freeform lens, shaping synthesized focal surfaces to conform to the virtual scene geometry.” Furthermore, Oculus Research has introduced “a framework to decompose target focal stacks and depth maps into one or more pairs of piecewise smooth focal surfaces and underlying display images,” building on “recent developments in &amp;quot;optimized blending&amp;quot; to implement a multifocal display that allows the accurate depiction of occluding, semi-transparent, and reflective objects.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Contrary to multifocal displays with fixed focal surfaces, the phase modulator shapes focal surfaces to conform to the scene geometry. A set of color images is produced, each mapped onto a corresponding focal surface (Figure 4), with the visual appearance rendered by “tracing rays from the eye through the optics, and accumulating the color values for each focal surface.” Furthermore, Matsuda &amp;#039;&amp;#039;et al&amp;#039;&amp;#039;. (2017) explain that their “algorithm sequentially solves for first the focal surfaces, given the target depth map, and then the color images—full joint optimization is left for future work. Focal surfaces are adapted by nonlinear least squares optimization, minimizing the distance between the nearest depicted surface and the scene geometry. The color images, paired with each surface, are determined by linear least squares methods.” &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
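The two-stage structure described in the quote - first fit smooth focal surfaces to the depth map, then solve a linear system for the color images - can be sketched on a 1-D toy problem. This is an illustrative simplification (a single quadratic surface and a Gaussian defocus model), not the paper’s actual formulation, which fits piecewise-smooth surfaces by nonlinear least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "scene": a per-pixel target depth map (metres) and target image.
n = 64
xs = np.linspace(-1.0, 1.0, n)
depth = 1.0 + 0.5 * xs + 0.3 * xs**2 + 0.02 * rng.standard_normal(n)
target = 0.5 + 0.4 * np.sin(3.0 * xs)

# Stage 1: fit a smooth focal surface (here a single quadratic) to the
# depth map by least squares, minimizing the distance between the
# depicted surface and the scene geometry.
A = np.stack([np.ones(n), xs, xs**2], axis=1)
coeffs, *_ = np.linalg.lstsq(A, depth, rcond=None)
surface = A @ coeffs

# Stage 2: with the surface fixed, solve a linear least-squares problem
# for the display image whose defocus-blurred appearance matches the
# target; blur width grows with the residual focus error at each pixel.
sigma = 0.5 + 5.0 * np.abs(depth - surface)          # pixels (toy model)
idx = np.arange(n)
B = np.exp(-0.5 * ((idx[None, :] - idx[:, None]) / sigma[:, None]) ** 2)
B /= B.sum(axis=1, keepdims=True)                    # row-normalized blur
display, *_ = np.linalg.lstsq(B, target, rcond=None)
```

In the actual system, several such surfaces are solved for together and the per-surface images are combined following the optimized-blending approach cited above.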
&lt;br /&gt;
The focal surface display research team demonstrated that the technology depicts more accurate retinal blur with fewer multiplexed images, while maintaining high resolution throughout the user’s accommodative range. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Future work for focal surface displays==&lt;br /&gt;
The Oculus focal surface display is not yet a complete solution to the vergence-accommodation conflict. It is instead an intermediate stage between current VR display technology and a future one with the ideal properties to fully solve the problems of vergence and accommodation. &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A consumer-ready product is still a long way off. Nevertheless, research into this technology will benefit the VR and AR (augmented reality) industry, providing a new direction for future work to explore. Indeed, the focal surface display requires eye tracking - itself not a completely solved problem - and is difficult to implement with a wide field of view. There is also the possibility of focal surface displays being applied to AR devices. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While the technology still needs to mature, the push to move beyond fixed-focus headsets will inevitably produce results that translate into consumer VR products in the future. &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;/div&gt;</summary>
		<author><name>Paulo Pacheco</name></author>
	</entry>
</feed>