<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://vrarwiki.com/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Shadowdawn</id>
	<title>VR &amp; AR Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://vrarwiki.com/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Shadowdawn"/>
	<link rel="alternate" type="text/html" href="https://vrarwiki.com/wiki/Special:Contributions/Shadowdawn"/>
	<updated>2026-04-14T18:40:01Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.43.0</generator>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Atheer_AiR&amp;diff=19190</id>
		<title>Atheer AiR</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Atheer_AiR&amp;diff=19190"/>
		<updated>2017-01-05T17:07:43Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=[[File:atheer air1.jpg|400px]]&lt;br /&gt;
|VR/AR=AR&lt;br /&gt;
|Type=[[Smart Glasses]]&lt;br /&gt;
|Subtype=&lt;br /&gt;
|Platform=&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=&lt;br /&gt;
|Manufacturer=[[Atheer]]&lt;br /&gt;
|Operating System=[[Android]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|Predecessor=&lt;br /&gt;
|Successor=&lt;br /&gt;
|CPU=NVIDIA® Tegra® K1: quad-core ARM-based CPU with Kepler GPU&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=Up to 128GB flash storage&lt;br /&gt;
|Display=720p (1280 x 720) 60fps, 50 degree field-of-view&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Pixel Density=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Optics=&lt;br /&gt;
|Tracking=&lt;br /&gt;
|Rotational Tracking=&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Tracking Volume=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Connectivity=&lt;br /&gt;
|Power=Built-in 3100mAh lithium-ion battery + extensible battery pack&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Cable Length=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=$3,950.00&lt;br /&gt;
|Website=[http://www.atheerair.com/air-glasses/ Atheer AiR]&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Atheer AiR is marketed as the world&#039;s most interactive pair of 3D smart glasses for deskless professionals. It targets workers across many fields and presents critical information directly in their field of view, from image annotations and workflow charts to video calls and web data.&amp;lt;ref&amp;gt;https://www.indiegogo.com/projects/atheer-the-world-s-most-interactive-smart-glasses#/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Features==&lt;br /&gt;
Atheer AiR Glasses leverage existing Android applications, enabling professionals such as doctors, technicians, and engineers to enhance their productivity with rich information and distraction-free communication. &lt;br /&gt;
&lt;br /&gt;
The glasses themselves run AiR OS, a custom-built Android-based operating system. Businesses can take advantage of the collaboration-centric AiR Suite for Enterprise, which allows easy access to critical information. The AiR SDK allows developers to create new applications that utilize the many sensors built into the AiR Glasses.&amp;lt;ref&amp;gt;http://www.prnewswire.com/news-releases/augmented-reality-leader-atheer-unveils-air-glasses-and-air-enterprise-suite-to-transform-the-way-deskless-professionals-work-and-collaborate-300182246.html&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
The AiR glasses feature two 720p displays capable of 60 frames per second. Each display provides a 50-degree field of view. &lt;br /&gt;
&lt;br /&gt;
A 3D depth camera enables gesture interaction and works in conjunction with a 9-axis inertial measurement unit (accelerometer, gyroscope, and magnetometer), dual RGB cameras, and a directional microphone.&lt;br /&gt;
&lt;br /&gt;
The processor is housed in a separate unit that connects to the glasses with a cable. This unit houses an NVIDIA Tegra K1 chip, 2GB of RAM, up to 128GB of flash storage, and a 3100mAh lithium-ion battery. The battery is hot-swappable for extended use in the field. The unit also supports Bluetooth 4.1, Wi-Fi 802.11 a/b/g/n/ac, USB-C, and HDMI out.&amp;lt;ref&amp;gt;http://venturebeat.com/2015/11/19/you-can-now-reserve-atheers-high-end-augmented-reality-glasses-for-the-workplace/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Technical Specifications==&lt;br /&gt;
* &#039;&#039;&#039;Display&#039;&#039;&#039;: 720p (1280 x 720) 60fps, 50 degree field-of-view&lt;br /&gt;
* &#039;&#039;&#039;CPU&#039;&#039;&#039;: NVIDIA® Tegra® K1: quad-core ARM-based CPU with Kepler GPU&lt;br /&gt;
* &#039;&#039;&#039;RAM&#039;&#039;&#039;: 2GB RAM&lt;br /&gt;
* &#039;&#039;&#039;Storage&#039;&#039;&#039;: Up to 128GB flash storage&lt;br /&gt;
* &#039;&#039;&#039;Battery&#039;&#039;&#039;: Built-in 3100mAh lithium-ion battery + extensible battery pack&lt;br /&gt;
* &#039;&#039;&#039;Connectivity&#039;&#039;&#039;: Bluetooth 4.1, WiFi 802.11 a/b/g/n/ac, USB-C, HDMI Out, optional 4G/LTE &amp;amp; GPS module&lt;br /&gt;
* &#039;&#039;&#039;Sensors&#039;&#039;&#039;: 9-axis IMU, ToF 3D depth sensor for gesture control, 2x 4MP cameras&lt;br /&gt;
==Software==&lt;br /&gt;
The AiR Suite for Enterprise consists of three applications: AiR Hub, AiR Flow, and AiR Designer. Together, they improve collaboration, management, and rapid task flow deployment. &lt;br /&gt;
&lt;br /&gt;
AiR Hub is a cloud-based collaboration and management console that provides AiR Flow users with remote expert guidance through live video, photo, and text. It supports multi-device &amp;amp; user profile management and integrates with leading enterprise databases &amp;amp; applications.&lt;br /&gt;
&lt;br /&gt;
AiR Designer is a task flow creation tool that lets users author procedures, checklists, and task flows with a simple drag-and-drop interface. &lt;br /&gt;
&lt;br /&gt;
AiR Flow is a customizable task flow and collaboration application that offers remote expert and multimedia collaboration with AiR Hub and easy interaction for task flow compliance.&amp;lt;ref&amp;gt;http://atheerair.com/air-suite/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
The Atheer AiR is designed to work with all existing Android applications. &lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
Developers who would like to take full advantage of everything Atheer AiR has to offer can use the AiR SDK. It&#039;s built on the Android API to allow easy design of new solutions and natural integration of the rich sensor data provided by the AiR Glasses. The platform also supports common third-party toolkits such as Vuforia and the Unity 3D engine.&amp;lt;ref&amp;gt;http://atheerair.com/platform/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
==Accessories==&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
The company behind AiR (Augmented interactive Reality), Atheer, is based in Mountain View, CA and is backed by prominent investors including Bobby Yazdani, Co-Founder of Saba Software, and Farzad Naimi, Co-Founder &amp;amp; Managing Partner of RONA Holdings. Their wearable glasses are designed to enhance the productivity and safety of deskless professionals at Fortune 1000 companies.&amp;lt;ref&amp;gt;http://www.prnewswire.com/news-releases/augmented-reality-leader-atheer-unveils-air-glasses-and-air-enterprise-suite-to-transform-the-way-deskless-professionals-work-and-collaborate-300182246.html&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
February 18, 2014 - End of [https://www.indiegogo.com/projects/atheer-the-world-s-most-interactive-smart-glasses#/updates Indiegogo campaign] &lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Augmented Reality Devices]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Template:App_Infobox&amp;diff=10738</id>
		<title>Template:App Infobox</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Template:App_Infobox&amp;diff=10738"/>
		<updated>2016-09-13T11:28:41Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;infobox&amp;quot; style=&amp;quot;font-size:89%; width:21em; -moz-border-radius: .2em; -webkit-border-radius: .2em;&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; style=&amp;quot;background-color: #254b72; color: #FFFFFF; font-size:120%; padding:0.5em;&amp;quot; | {{{name|{{PAGENAME}}}}}&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; |{{#if:{{{image|}}}|{{{image|}}}}}&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; style=&amp;quot;background-color: #254b72; color: #FFFFFF; padding:0em;&amp;quot; | Information&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{VR/AR|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;VR/AR&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{VR/AR|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Developer|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Developer&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Developer|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Publisher|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Publisher&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Publisher|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Director|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Director&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Director|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Producer|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Producer&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Producer|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Platform|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Platform&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Platform|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Device|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Device&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Device|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Operating System|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Operating System&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Operating System|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Type|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Type&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Type|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Genre|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Genre&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Genre|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Input Device|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Input Device&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Input Device|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Play Area|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Play Area&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Play Area|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Game Mode|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Game Mode&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Game Mode|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Comfort Level|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Comfort Level&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Comfort Level|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Language|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Language&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Language|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Version|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Version&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Version|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Rating|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Rating&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Rating|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Review|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Review&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Review|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Downloads|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Downloads&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Downloads|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Release Date|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Release Date&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Release Date|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Price|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Price&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Price|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{App Store|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;App Store&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{App Store|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Website|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Website&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Website|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Infobox Updated|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Infobox Updated&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Infobox Updated|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
|}&amp;lt;includeonly&amp;gt;[[Category:Apps|{{PAGENAME}}]] {{#set:VR/AR={{{VR/AR|}}} |+sep}} {{#set:Developer={{{Developer|}}} |+sep}} {{#set:Publisher={{{Publisher|}}} |+sep}} {{#set:Director={{{Director|}}} |+sep}} {{#set:Producer={{{Producer|}}} |+sep}} {{#set:Platform={{{Platform|}}} |+sep}} {{#set:Device={{{Device|}}} |+sep}} {{#set:Operating System={{{Operating System|}}} |+sep}} {{#set:Type={{{Type|}}} |+sep}} {{#set:Genre={{{Genre|}}} |+sep}} {{#set:Input Device={{{Input Device|}}} |+sep}} {{#set:Play Area={{{Play Area|}}} |+sep}} {{#set:Game Mode={{{Game Mode|}}} |+sep}} {{#set:Comfort Level={{{Comfort Level|}}} }} {{#set:Language={{{Language|}}} }}  {{#set:Version={{{Version|}}} }} {{#set:Rating={{{Rating|}}} }} {{#set:Review={{{Review|}}} }} {{#set:Downloads={{{Downloads|}}} }} {{#set:Release Date={{{Release Date|}}} }} {{#set:Price={{{Price|}}} }} {{#set:App Store={{{App Store|}}} |+sep}} {{#set:Website={{{Website|}}} }} {{#set:Infobox Updated={{{Infobox Updated|}}} }}&amp;lt;/includeonly&amp;gt;&lt;br /&gt;
&amp;lt;noinclude&amp;gt;&lt;br /&gt;
Designed for use on [[App]] pages.&lt;br /&gt;
&lt;br /&gt;
==Usage==&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|VR/AR=&lt;br /&gt;
|Developer=&lt;br /&gt;
|Publisher=&lt;br /&gt;
|Director=&lt;br /&gt;
|Producer=&lt;br /&gt;
|Platform=&lt;br /&gt;
|Device=&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=&lt;br /&gt;
|Genre=&lt;br /&gt;
|Input Device=&lt;br /&gt;
|Play Area=&lt;br /&gt;
|Game Mode=&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Language=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Review=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|App Store=&lt;br /&gt;
|Website=&lt;br /&gt;
|Infobox Updated=&lt;br /&gt;
}}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Parameters==&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|image=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Image of instance in normal &amp;lt;nowiki&amp;gt;[[file:name|size]]&amp;lt;/nowiki&amp;gt; format.&lt;br /&gt;
: E.g. [[]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|VR/AR=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: [[Virtual Reality]], [[Augmented Reality]] and/or [[Mixed Reality]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Developer=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Developer of the app&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Publisher=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Publisher of the app&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Director=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Director of the [[experience]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Producer=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Producer of the [[experience]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Platform=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: The main Platform(s) for the App. For example: [[Oculus Rift]], [[SteamVR]], [[Project Morpheus]], [[Google Glass]], [[OSVR]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Device=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Devices the app runs on. For example: [[DK1]], [[DK2]], [[HTC Vive]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Operating System=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Operating System(s) the app runs on. For example: [[Windows]], [[Mac]], [[Linux]], [[iOS]], [[Android]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Type=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Type for the app. Types include [[Game]], [[Experience]], [[Demo]], [[Pre-Release]], [[Mod]].&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Genre=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Genre of the app. Genres include [[360 Video]], [[Action/Adventure]], [[Casual]], [[Educational]], [[Exploration]], [[Fighting]], [[Horror]], [[Puzzle]], [[Racing]], [[RPG]], [[Shooter]], [[Simulation]], [[Sports]], [[Strategy]].&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Input Device=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Input Device(s) or controllers&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Play Area=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Play Area of the App: [[Seated]], [[Standing]], [[Roomscale]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Game Mode=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Game Mode of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Comfort Level=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Comfort Level of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Language=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Language&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Version=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Version&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Rating=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Average Rating of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Review=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: How well the App is received by critics and players&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Downloads=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Number of Downloads&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Release Date=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Release Date of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Price=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Price of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|App Store=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Store where the app is sold or distributed, e.g. [[Steam]], [[Oculus Store]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Website=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Website&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Infobox Updated=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Date the information in the infobox was last updated&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Category:Templates]]&lt;br /&gt;
&amp;lt;/noinclude&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Oculus_Share&amp;diff=10737</id>
		<title>Oculus Share</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Oculus_Share&amp;diff=10737"/>
		<updated>2016-09-13T11:23:40Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{App Store Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Platforms=[[Oculus Rift (Platform)]]&lt;br /&gt;
|Devices=[[DK1]], [[DK2]], [[CV1]]&lt;br /&gt;
|Operating Systems=[[Windows]], [[Mac]], [[Linux]]&lt;br /&gt;
|Accessible=[[Desktop]], [[Mobile]]&lt;br /&gt;
|Developer=[[Oculus VR]]&lt;br /&gt;
|Notable Personnel=&lt;br /&gt;
|Apps=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Website=https://share.oculus.com/&lt;br /&gt;
}}&lt;br /&gt;
{{see also|Oculus Store}}&lt;br /&gt;
&#039;&#039;&#039;Oculus Share was shut down in May 2016.&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Oculus Share is the official [[App Store|app distribution platform]] and database for the [[Oculus Rift (Platform)]]. Created by [[Oculus VR]], Oculus Share allows users to share and discover Rift apps, demos and other experiences. Oculus Share can be accessed on [[desktop]] and [[mobile]] devices. The [[App Store]] version of Oculus Share is called [[Oculus Store]], which is accessible on [[Oculus Rift#Devices|Rift devices]].&lt;br /&gt;
&lt;br /&gt;
==General Information==&lt;br /&gt;
The app files can be hosted by Oculus Share or a third-party developer.&lt;br /&gt;
&lt;br /&gt;
Screenshots, icons and [[YouTube]] videos of the app can be uploaded.&lt;br /&gt;
&lt;br /&gt;
Operating Systems include [[Windows]], [[Mac]] and [[Linux]].&lt;br /&gt;
&lt;br /&gt;
[[Input Devices]] include [[Keyboard]], [[Mouse]], [[Gamepad]], [[Hydra]] and [[Other]].&lt;br /&gt;
&lt;br /&gt;
[[Game Modes]] include [[Single Player]], [[Multiplayer]] and [[Co-Op]]&lt;br /&gt;
&lt;br /&gt;
[[Oculus Rift#Devices|Rift Devices]] include [[DK1]], [[DK2]], [[CV1]]&lt;br /&gt;
&lt;br /&gt;
Age Rating: None, 13+, 18+&lt;br /&gt;
==Categories==&lt;br /&gt;
Oculus Share features 5 categories: [[Full Game]], [[Experience]], [[Tech Demo]], [[Pre-Release]], [[Official Mod]].&lt;br /&gt;
&lt;br /&gt;
Developers may choose only one category for each app.&lt;br /&gt;
&lt;br /&gt;
==Genres==&lt;br /&gt;
Oculus Share features 9 genres: [[360 Video]], [[Action/Adventure]], [[Casual]], [[Exploration]], [[Puzzle]], [[Racing]], [[Simulation]], [[Sports]], [[Strategy]].&lt;br /&gt;
&lt;br /&gt;
Developers can choose as many genres as needed for their apps. The list of genres may change over time.&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Template:App_Infobox&amp;diff=10736</id>
		<title>Template:App Infobox</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Template:App_Infobox&amp;diff=10736"/>
		<updated>2016-09-13T10:59:35Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;infobox&amp;quot; style=&amp;quot;font-size:89%; width:21em; -moz-border-radius: .2em; -webkit-border-radius: .2em;&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; style=&amp;quot;background-color: #254b72; color: #FFFFFF; font-size:120%; padding:0.5em;&amp;quot; | {{{name|{{PAGENAME}}}}}&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; |{{#if:{{{image|}}}|{{{image|}}}}}&lt;br /&gt;
|-&lt;br /&gt;
! colspan=&amp;quot;2&amp;quot; style=&amp;quot;background-color: #254b72; color: #FFFFFF; padding:0em;&amp;quot; | Information&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{VR/AR|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;VR/AR&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{VR/AR|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Developer|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Developer&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Developer|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Publisher|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Publisher&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Publisher|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Director|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Director&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Director|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Producer|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Producer&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Producer|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Platform|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Platform&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Platform|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Device|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Device&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Device|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Operating System|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Operating System&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Operating System|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Type|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Type&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Type|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Genre|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Genre&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Genre|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Input Device|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Input Device&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Input Device|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Play Area|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Play Area&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Play Area|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Game Mode|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Game Mode&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Game Mode|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Comfort Level|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Comfort Level&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Comfort Level|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Language|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Language&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Language|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Version|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Version&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Version|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Rating|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Rating&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Rating|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Review|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Review&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Review|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Downloads|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Downloads&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Downloads|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Release Date|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Release Date&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Release Date|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Price|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Price&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Price|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Website|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Website&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Website|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
{{#if: {{{Infobox Updated|}}}|&lt;br /&gt;
{{!}} &#039;&#039;&#039;Infobox Updated&#039;&#039;&#039;&lt;br /&gt;
{{!}} {{{Infobox Updated|}}} }}&lt;br /&gt;
|-&lt;br /&gt;
|}&amp;lt;includeonly&amp;gt;[[Category:Apps|{{PAGENAME}}]] {{#set:VR/AR={{{VR/AR|}}} |+sep}} {{#set:Developer={{{Developer|}}} |+sep}} {{#set:Publisher={{{Publisher|}}} |+sep}} {{#set:Director={{{Director|}}} |+sep}} {{#set:Producer={{{Producer|}}} |+sep}} {{#set:Platform={{{Platform|}}} |+sep}} {{#set:Device={{{Device|}}} |+sep}} {{#set:Operating System={{{Operating System|}}} |+sep}} {{#set:Type={{{Type|}}} |+sep}} {{#set:Genre={{{Genre|}}} |+sep}} {{#set:Input Device={{{Input Device|}}} |+sep}} {{#set:Play Area={{{Play Area|}}} |+sep}} {{#set:Game Mode={{{Game Mode|}}} |+sep}} {{#set:Comfort Level={{{Comfort Level|}}} }} {{#set:Language={{{Language|}}} }}  {{#set:Version={{{Version|}}} }} {{#set:Rating={{{Rating|}}} }} {{#set:Review={{{Review|}}} }} {{#set:Downloads={{{Downloads|}}} }} {{#set:Release Date={{{Release Date|}}} }} {{#set:Price={{{Price|}}} }} {{#set:Website={{{Website|}}} }} {{#set:Infobox Updated={{{Infobox Updated|}}} }}&amp;lt;/includeonly&amp;gt;&lt;br /&gt;
&amp;lt;noinclude&amp;gt;&lt;br /&gt;
Designed for use on [[App]] pages.&lt;br /&gt;
&lt;br /&gt;
==Usage==&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|VR/AR=&lt;br /&gt;
|Developer=&lt;br /&gt;
|Publisher=&lt;br /&gt;
|Director=&lt;br /&gt;
|Producer=&lt;br /&gt;
|Platform=&lt;br /&gt;
|Device=&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=&lt;br /&gt;
|Genre=&lt;br /&gt;
|Input Device=&lt;br /&gt;
|Play Area=&lt;br /&gt;
|Game Mode=&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Language=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Review=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
|Infobox Updated=&lt;br /&gt;
}}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Parameters==&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|image=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Image of instance in normal &amp;lt;nowiki&amp;gt;[[file:name|size]]&amp;lt;/nowiki&amp;gt; format.&lt;br /&gt;
: E.g. [[]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|VR/AR=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: [[Virtual Reality]], [[Augmented Reality]] and/or [[Mixed Reality]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Developer=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Developer of the app&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Publisher=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Publisher of the app&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Director=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Director of the [[experience]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Producer=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Producer of the [[experience]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Platform=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: The main Platform(s) for the App. For example: [[Oculus Rift]], [[SteamVR]], [[Project Morpheus]], [[Google Glass]], [[OSVR]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Device=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Devices the app runs on. For example: [[DK1]], [[DK2]], [[HTC Vive]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Operating System=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Operating System(s) the app runs on. For example: [[Windows]], [[Mac]], [[Linux]], [[iOS]], [[Android]] etc.&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Type=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Type for the app. Types include [[Game]], [[Experience]], [[Demo]], [[Pre-Release]], [[Mod]].&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Genre=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Genre of the app. Genres include [[360 Video]], [[Action/Adventure]], [[Casual]], [[Educational]], [[Exploration]], [[Fighting]], [[Horror]], [[Puzzle]], [[Racing]], [[RPG]], [[Shooter]], [[Simulation]], [[Sports]], [[Strategy]].&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Input Device=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Input Device(s) or controllers&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Play Area=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Play Area of the App: [[Seated]], [[Standing]], [[Roomscale]]&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Game Mode=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Game Mode of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Comfort Level=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Comfort Level of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Language=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Language&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Version=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Version&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Rating=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Average Rating of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Review=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: How well the App is received by critics and players&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Downloads=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Number of Downloads&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Release Date=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Release Date of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Price=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Price of the App&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Website=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: App Website&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;nowiki&amp;gt;|Infobox Updated=&amp;lt;/nowiki&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
: Date the information in the infobox was last updated&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Category:Templates]]&lt;br /&gt;
&amp;lt;/noinclude&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10735</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10735"/>
		<updated>2016-09-13T01:04:25Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Introduction==&lt;br /&gt;
[[File:hololens clicker concept1.png|350px|thumb|right|Figure 1]]&lt;br /&gt;
The HoloLens Clicker is a peripheral device for the Microsoft HoloLens (figure 1) &amp;lt;ref name=”1”&amp;gt; Greenwald, W. (2016). Microsoft HoloLens Development Edition. Retrieved from www.pcmag.com/review/347119/microsoft-hololens-development-edition&amp;lt;/ref&amp;gt;. It was designed exclusively for that augmented reality (AR) device, providing another way to control and interact with the holograms displayed by the AR headset &amp;lt;ref name=”2”&amp;gt; Microsoft (2016). Use the HoloLens Clicker. Retrieved from support.microsoft.com/pt-pt/help/12646/hololens-use-the-hololens-clicker&amp;lt;/ref&amp;gt;. It is a miniature controller that lets the user click on whatever he or she is looking at, since a small dot in the center of the user’s view functions as a cursor. Hand motion is used for the drag-and-drop command, since the Clicker has motion sensors on board &amp;lt;ref name=”5”&amp;gt; Rubino, D. (2016). My first 24 hours with Microsoft HoloLens and awesome things I learned. Retrieved from www.windowscentral.com/my-first-24-hours-microsoft-hololens&amp;lt;/ref&amp;gt;. The user clicks and holds with the Clicker, then slowly moves the hand to drag objects. This function is mainly used for resizing windows and holograms, as well as for scrolling up and down documents in Edge &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Wilhelm, P. (2016). Microsoft HoloLens Bluetooth Clicker is our first glimpse at a possible controller. Retrieved from www.techradar.com/news/wearables/microsoft-s-hololens-bluetooth-clicker-is-our-first-glimpse-at-any-controller-1315623&amp;lt;/ref&amp;gt;. It also makes typing easier: the user can look at the letters on the keyboard and click the device to select them. The HoloLens Clicker can be used in addition to, or instead of, the “air-tap” gesture &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
[[File:hololens clicker concept2.png|350px|thumb|right|Figure 2]]&lt;br /&gt;
The Clicker attaches to the user’s finger via an elastic strap (figure 2). It uses Bluetooth, and its design aims for simplicity &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Petty, J. (2016). I used the Microsoft HoloLens and I can’t stop thinking about it. Retrieved from www.ign.com/articles/2016/09/06/i-used-the-hololens-and-i-canat-stop-thinking-about-it&amp;lt;/ref&amp;gt;. It has only one button and charges over Micro USB &amp;lt;ref name=”7”&amp;gt; Warren, T. (2016). Microsoft’s HoloLens Start menu detailed in leaked video. Retrieved from www.theverge.com/2016/2/23/11098332/microsoft-hololens-start-menu-bluetooth-clicker&amp;lt;/ref&amp;gt;. The first details of the device emerged in February 2016, when a leaked quick-start guide was posted by Twitter user WalkingCat &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The HoloLens interaction model has, in general, three key elements: Gaze, Gesture, and Voice. Gaze relates to what the user is looking at (via head tracking), Gesture is the “air-tap” movement that the HoloLens recognizes for selecting items, and Voice allows for voice commands. Even though the gestures work well on HoloLens and the voice commands are reliable, there have been reports that repeating them many times within a short period can become inconvenient for some users. Indeed, Jared Petty of IGN reports that even after practicing the “air-tap” gesture, he preferred the Clicker, since its tactile feedback made interactions with the holograms feel more precise &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The HoloLens Clicker is, therefore, a way to address this and give users another option for controlling the holograms on the AR device &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Pradeep (2016). Microsoft planning to release HoloLens Clicker accessory to improve interactions with holograms. Retrieved from mspoweruser.com/microsoft-planning-to-release-hololens-clicker-accessory-to-improve-interactions-with-holograms&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Compared to the peripherals of other headsets, such as the HTC Vive, Oculus Rift, and PlayStation VR, the HoloLens Clicker is the simplest and smallest controller. Due to its small design, it is intended primarily for navigation rather than for precise applications such as gaming. For now, it is the only controller that has been unveiled for the HoloLens; a more complex controller may be in the works &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
A Development Edition of the Microsoft HoloLens has been released, which includes the Clicker peripheral &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=File:Hololens_clicker_concept1.png&amp;diff=10734</id>
		<title>File:Hololens clicker concept1.png</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=File:Hololens_clicker_concept1.png&amp;diff=10734"/>
		<updated>2016-09-13T01:03:58Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=File:Hololens_clicker_concept2.png&amp;diff=10733</id>
		<title>File:Hololens clicker concept2.png</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=File:Hololens_clicker_concept2.png&amp;diff=10733"/>
		<updated>2016-09-13T01:03:58Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10732</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10732"/>
		<updated>2016-09-13T01:03:37Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Introduction==&lt;br /&gt;
The HoloLens Clicker is a peripheral device for the Microsoft HoloLens (figure 1) &amp;lt;ref name=”1”&amp;gt; Greenwald, W. (2016). Microsoft HoloLens Development Edition. Retrieved from www.pcmag.com/review/347119/microsoft-hololens-development-edition&amp;lt;/ref&amp;gt;. It was designed exclusively for that augmented reality (AR) device, providing another way to control and interact with the holograms displayed by the AR headset &amp;lt;ref name=”2”&amp;gt; Microsoft (2016). Use the HoloLens Clicker. Retrieved from support.microsoft.com/pt-pt/help/12646/hololens-use-the-hololens-clicker&amp;lt;/ref&amp;gt;. It is a miniature controller that lets the user click on whatever he or she is looking at, since a small dot in the center of the user’s view functions as a cursor. Hand motion is used for the drag-and-drop command, since the Clicker has motion sensors on board &amp;lt;ref name=”5”&amp;gt; Rubino, D. (2016). My first 24 hours with Microsoft HoloLens and awesome things I learned. Retrieved from www.windowscentral.com/my-first-24-hours-microsoft-hololens&amp;lt;/ref&amp;gt;. The user clicks and holds with the Clicker, then slowly moves the hand to drag objects. This function is mainly used for resizing windows and holograms, as well as for scrolling up and down documents in Edge &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Wilhelm, P. (2016). Microsoft HoloLens Bluetooth Clicker is our first glimpse at a possible controller. Retrieved from www.techradar.com/news/wearables/microsoft-s-hololens-bluetooth-clicker-is-our-first-glimpse-at-any-controller-1315623&amp;lt;/ref&amp;gt;. It also makes typing easier: the user can look at the letters on the keyboard and click the device to select them. The HoloLens Clicker can be used in addition to, or instead of, the “air-tap” gesture &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
The Clicker attaches to the user’s finger via an elastic strap (figure 2). It uses Bluetooth, and its design aims for simplicity &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Petty, J. (2016). I used the Microsoft HoloLens and I can’t stop thinking about it. Retrieved from www.ign.com/articles/2016/09/06/i-used-the-hololens-and-i-canat-stop-thinking-about-it&amp;lt;/ref&amp;gt;. It has only one button and charges over Micro USB &amp;lt;ref name=”7”&amp;gt; Warren, T. (2016). Microsoft’s HoloLens Start menu detailed in leaked video. Retrieved from www.theverge.com/2016/2/23/11098332/microsoft-hololens-start-menu-bluetooth-clicker&amp;lt;/ref&amp;gt;. The first details of the device emerged in February 2016, when a leaked quick-start guide was posted by Twitter user WalkingCat &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The HoloLens interaction model has, in general, three key elements: Gaze, Gesture, and Voice. Gaze relates to what the user is looking at (via head tracking), Gesture is the “air-tap” movement that the HoloLens recognizes for selecting items, and Voice allows for voice commands. Even though the gestures work well on HoloLens and the voice commands are reliable, there have been reports that repeating them many times within a short period can become inconvenient for some users. Indeed, Jared Petty of IGN reports that even after practicing the “air-tap” gesture, he preferred the Clicker, since its tactile feedback made interactions with the holograms feel more precise &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The HoloLens Clicker is, therefore, a way to address this and give users another option for controlling the holograms on the AR device &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Pradeep (2016). Microsoft planning to release HoloLens Clicker accessory to improve interactions with holograms. Retrieved from mspoweruser.com/microsoft-planning-to-release-hololens-clicker-accessory-to-improve-interactions-with-holograms&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Compared to the peripherals of other headsets, such as the HTC Vive, Oculus Rift, and PlayStation VR, the HoloLens Clicker is the simplest and smallest controller. Due to its small design, it is intended primarily for navigation rather than for precise applications such as gaming. For now, it is the only controller that has been unveiled for the HoloLens; a more complex controller may be in the works &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
A Development Edition of the Microsoft HoloLens has been released, which includes the Clicker peripheral &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10731</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10731"/>
		<updated>2016-09-13T01:00:59Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Introduction==&lt;br /&gt;
The HoloLens Clicker is a peripheral device for the Microsoft HoloLens (figure 1) &amp;lt;ref name=”1”&amp;gt; Greenwald, W. (2016). Microsoft HoloLens Development Edition. Retrieved from www.pcmag.com/review/347119/microsoft-hololens-development-edition&amp;lt;/ref&amp;gt;. It was designed exclusively for that augmented reality (AR) device, providing another way to control and interact with the holograms displayed by the AR headset &amp;lt;ref name=”2”&amp;gt; Microsoft (2016). Use the HoloLens Clicker. Retrieved from support.microsoft.com/pt-pt/help/12646/hololens-use-the-hololens-clicker&amp;lt;/ref&amp;gt;. It is a miniature controller that lets the user click on whatever he or she is looking at, since a small dot in the center of the user’s view functions as a cursor. Hand motion is used for the drag-and-drop command, since the Clicker has motion sensors on board &amp;lt;ref name=”5”&amp;gt; Rubino, D. (2016). My first 24 hours with Microsoft HoloLens and awesome things I learned. Retrieved from www.windowscentral.com/my-first-24-hours-microsoft-hololens&amp;lt;/ref&amp;gt;. The user clicks and holds with the Clicker, then slowly moves the hand to drag objects. This function is mainly used for resizing windows and holograms, as well as for scrolling up and down documents in Edge &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Wilhelm, P. (2016). Microsoft HoloLens Bluetooth Clicker is our first glimpse at a possible controller. Retrieved from www.techradar.com/news/wearables/microsoft-s-hololens-bluetooth-clicker-is-our-first-glimpse-at-any-controller-1315623&amp;lt;/ref&amp;gt;. It also makes typing easier: the user can look at the letters on the keyboard and click the device to select them. The HoloLens Clicker can be used in addition to, or instead of, the “air-tap” gesture &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
The Clicker attaches to the user’s finger via an elastic strap (figure 2). It uses Bluetooth, and its design aims for simplicity &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Petty, J. (2016). I used the Microsoft HoloLens and I can’t stop thinking about it. Retrieved from www.ign.com/articles/2016/09/06/i-used-the-hololens-and-i-canat-stop-thinking-about-it&amp;lt;/ref&amp;gt;. It has only one button and charges over Micro USB &amp;lt;ref name=”7”&amp;gt; Warren, T. (2016). Microsoft’s HoloLens Start menu detailed in leaked video. Retrieved from www.theverge.com/2016/2/23/11098332/microsoft-hololens-start-menu-bluetooth-clicker&amp;lt;/ref&amp;gt;. The first details of the device emerged in February 2016, when a leaked quick-start guide was posted by Twitter user WalkingCat &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The HoloLens interaction model has, in general, three key elements: Gaze, Gesture, and Voice. Gaze relates to what the user is looking at (via head tracking), Gesture is the “air-tap” movement that the HoloLens recognizes for selecting items, and Voice allows for voice commands. Even though the gestures work well on HoloLens and the voice commands are reliable, there have been reports that repeating them many times within a short period can become inconvenient for some users. Indeed, Jared Petty of IGN reports that even after practicing the “air-tap” gesture, he preferred the Clicker, since its tactile feedback made interactions with the holograms feel more precise &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The HoloLens Clicker is, therefore, a way to address this and give users another option for controlling the holograms on the AR device &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Pradeep (2016). Microsoft planning to release HoloLens Clicker accessory to improve interactions with holograms. Retrieved from mspoweruser.com/microsoft-planning-to-release-hololens-clicker-accessory-to-improve-interactions-with-holograms&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Compared to the peripherals of other headsets, such as the HTC Vive, Oculus Rift, and PlayStation VR, the HoloLens Clicker is the simplest and smallest controller. Due to its small design, it is intended primarily for navigation rather than for precise applications such as gaming. For now, it is the only controller that has been unveiled for the HoloLens; a more complex controller may be in the works &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
A Development Edition of the Microsoft HoloLens has been released, which includes the Clicker peripheral &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10730</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10730"/>
		<updated>2016-09-13T00:38:57Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Introduction==&lt;br /&gt;
The HoloLens Clicker is a peripheral device for the Microsoft HoloLens (figure 1) &amp;lt;ref name=”1”&amp;gt; Greenwald, W. (2016). Microsoft HoloLens Development Edition. Retrieved from www.pcmag.com/review/347119/microsoft-hololens-development-edition&amp;lt;/ref&amp;gt;. It was designed exclusively for that augmented reality (AR) device, providing another way to control and interact with the holograms displayed by the AR headset &amp;lt;ref name=”2”&amp;gt; Microsoft (2016). Use the HoloLens Clicker. Retrieved from support.microsoft.com/pt-pt/help/12646/hololens-use-the-hololens-clicker&amp;lt;/ref&amp;gt;. It is a miniature controller that lets the user click on whatever he or she is looking at, since a small dot in the center of the user’s view functions as a cursor. Hand motion is used for the drag-and-drop command, since the Clicker has motion sensors on board &amp;lt;ref name=”5”&amp;gt; Rubino, D. (2016). My first 24 hours with Microsoft HoloLens and awesome things I learned. Retrieved from www.windowscentral.com/my-first-24-hours-microsoft-hololens&amp;lt;/ref&amp;gt;. The user clicks and holds with the Clicker, then slowly moves the hand to drag objects. This function is mainly used for resizing windows and holograms, as well as for scrolling up and down documents in Edge &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Wilhelm, P. (2016). Microsoft HoloLens Bluetooth Clicker is our first glimpse at a possible controller. Retrieved from www.techradar.com/news/wearables/microsoft-s-hololens-bluetooth-clicker-is-our-first-glimpse-at-any-controller-1315623&amp;lt;/ref&amp;gt;. It also makes typing easier: the user can look at the letters on the keyboard and click the device to select them. The HoloLens Clicker can be used in addition to, or instead of, the “air-tap” gesture &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
The Clicker attaches to the user’s finger via an elastic strap (figure 2). It uses Bluetooth, and its design aims for simplicity &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Petty, J. (2016). I used the Microsoft HoloLens and I can’t stop thinking about it. Retrieved from www.ign.com/articles/2016/09/06/i-used-the-hololens-and-i-canat-stop-thinking-about-it&amp;lt;/ref&amp;gt;. It has only one button and charges over Micro USB &amp;lt;ref name=”7”&amp;gt; Warren, T. (2016). Microsoft’s HoloLens Start menu detailed in leaked video. Retrieved from www.theverge.com/2016/2/23/11098332/microsoft-hololens-start-menu-bluetooth-clicker&amp;lt;/ref&amp;gt;. The first details of the device emerged in February 2016, when a leaked quick-start guide was posted by Twitter user WalkingCat &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The HoloLens interaction model has, in general, three key elements: Gaze, Gesture, and Voice. Gaze relates to what the user is looking at (via head tracking), Gesture is an “air-tap” movement that the HoloLens recognizes to allow selection of items, and Voice allows for voice commands. Even though the gestures work well on HoloLens and the voice commands are reliable, there have been reports that using them over and over again in a short period of time can become inconvenient for some users. Indeed, Jared Petty from IGN reports that even after practicing the “air-tap” gesture, he preferred the Clicker, since it provided tactile feedback that made interactions with the holograms feel more precise &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. The HoloLens Clicker is, therefore, a way to solve this problem and give users another option for controlling the holograms on the AR device &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Pradeep (2016). Microsoft planning to release HoloLens Clicker accessory to improve interactions with holograms. Retrieved from mspoweruser.com/microsoft-planning-to-release-hololens-clicker-accessory-to-improve-interactions-with-holograms&amp;lt;/ref&amp;gt;.&lt;br /&gt;
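The gaze-and-commit pattern described above can be sketched in code. This is an illustrative sketch only, not the actual HoloLens API: the function names are hypothetical, hologram bounds are approximated as spheres, and the numbers are made up.

```python
# Illustrative sketch of gaze-and-commit selection (hypothetical, not the
# real HoloLens API): cast a ray from the head pose along the gaze
# direction; a Clicker press commits whatever hologram the ray hits.

def ray_hits_sphere(origin, direction, center, radius):
    """True if a ray (unit-length direction) passes within radius of center."""
    to_center = [c - o for o, c in zip(origin, center)]
    # Distance along the ray to the closest approach (clamped to 0 if behind).
    t = max(0.0, sum(a * b for a, b in zip(to_center, direction)))
    closest = [o + t * d for o, d in zip(origin, direction)]
    dist_sq = sum((c - p) ** 2 for c, p in zip(center, closest))
    return radius * radius >= dist_sq

def gaze_select(head_pos, gaze_dir, holograms, clicker_pressed):
    """Return the name of the gazed-at hologram when the Clicker is pressed."""
    if not clicker_pressed:
        return None
    for name, center, radius in holograms:
        if ray_hits_sphere(head_pos, gaze_dir, center, radius):
            return name
    return None
```

For example, with the head at the origin gazing along the z-axis, a Clicker press would select a hologram centered two meters straight ahead.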
&lt;br /&gt;
Compared with the controllers of other headsets, like the HTC Vive, Oculus Rift, and PlayStation VR, the HoloLens Clicker is the simplest and smallest of the peripherals. Due to its small design, it is intended primarily for navigation rather than for precise applications such as gaming. For now, it is the only controller that has been unveiled for the HoloLens; a more complex controller may be in the works &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
A Development Edition of the Microsoft HoloLens has been released, which includes the Clicker peripheral &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10729</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10729"/>
		<updated>2016-09-13T00:37:53Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Introduction==&lt;br /&gt;
The HoloLens Clicker is a peripheral device for Microsoft HoloLens (figure 1) &amp;lt;ref name="1"&amp;gt; Greenwald, W. (2016). Microsoft HoloLens Development Edition. Retrieved from www.pcmag.com/review/347119/microsoft-hololens-development-edition&amp;lt;/ref&amp;gt;. It was designed exclusively for that augmented reality (AR) device, providing another way to control and interact with the holograms displayed by the AR headset &amp;lt;ref name="2"&amp;gt; Microsoft (2016). Use the HoloLens Clicker. Retrieved from support.microsoft.com/pt-pt/help/12646/hololens-use-the-hololens-clicker&amp;lt;/ref&amp;gt;. It is a miniature controller that lets the user click on whatever they are looking at, since a small dot in the center of the user’s view functions as a cursor. Hand motion is used for the drag-and-drop command, since the Clicker has motion sensors on board &amp;lt;ref name="5"&amp;gt; Rubino, D. (2016). My first 24 hours with Microsoft HoloLens and awesome things I learned. Retrieved from www.windowscentral.com/my-first-24-hours-microsoft-hololens&amp;lt;/ref&amp;gt;. The user clicks and holds with the Clicker, then slowly moves the hand to drag objects. This function is mainly used for resizing windows and holograms, as well as for scrolling up and down documents in Edge &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt; Wilhelm, P. (2016). Microsoft HoloLens Bluetooth Clicker is our first glimpse at a possible controller. Retrieved from www.techradar.com/news/wearables/microsoft-s-hololens-bluetooth-clicker-is-our-first-glimpse-at-any-controller-1315623&amp;lt;/ref&amp;gt;. It also makes typing easier: the user can look at the letters on the keyboard and click on the device to select them. The HoloLens Clicker can be used in addition to, or instead of, the “air-tap” gesture &amp;lt;ref name="5"&amp;gt;&amp;lt;/ref&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10728</id>
		<title>HoloLens Clicker</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HoloLens_Clicker&amp;diff=10728"/>
		<updated>2016-09-13T00:36:29Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|Type=[[Input Device]], [[Motion Tracker]]&lt;br /&gt;
|Subtype=[[Hands/Fingers Tracking]]&lt;br /&gt;
|Platform=[[Microsoft HoloLens]]&lt;br /&gt;
|Creator=&lt;br /&gt;
|Developer=[[Microsoft]]&lt;br /&gt;
|Manufacturer=&lt;br /&gt;
|Operating System=[[Windows 10]]&lt;br /&gt;
|Versions=&lt;br /&gt;
|Requires=&lt;br /&gt;
|CPU=&lt;br /&gt;
|GPU=&lt;br /&gt;
|HPU=&lt;br /&gt;
|Memory=&lt;br /&gt;
|Storage=&lt;br /&gt;
|Display=&lt;br /&gt;
|Resolution=&lt;br /&gt;
|Refresh Rate=&lt;br /&gt;
|Persistence=&lt;br /&gt;
|Precision=&lt;br /&gt;
|Field of View=&lt;br /&gt;
|Tracking=3DOF&lt;br /&gt;
|Rotational Tracking=IMUs&lt;br /&gt;
|Positional Tracking=&lt;br /&gt;
|Update Rate=&lt;br /&gt;
|Latency=&lt;br /&gt;
|Audio=&lt;br /&gt;
|Camera=&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=&lt;br /&gt;
|Power=&lt;br /&gt;
|Weight=&lt;br /&gt;
|Size=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=&lt;br /&gt;
}}&lt;br /&gt;
{{stub}}&lt;br /&gt;
[[HoloLens Clicker]] is the [[Input Device]] for the [[Microsoft HoloLens]]. Developed by [[Microsoft]], the small remote has a single button along with [[rotational tracking]]. It allows a user to click and scroll with minimal hand motion as a replacement for the air-tap gesture.&lt;br /&gt;
==Clicker Gestures==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Select&#039;&#039;&#039; - To select a hologram, button, or other element, gaze at it, then click.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Click and hold&#039;&#039;&#039; - Click and hold your thumb down on the button to do some of the same things you would with tap and hold, like move or resize a hologram.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Scroll&#039;&#039;&#039; - On the app bar, select the Scroll Tool. Click and hold, then rotate the clicker up, down, left, or right. To scroll faster, move your hand farther from the center of the scroll tool.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Zoom&#039;&#039;&#039; - On the app bar, select the Zoom Tool. Click and hold, then rotate the clicker up to zoom in, or down to zoom out.&lt;br /&gt;
&lt;br /&gt;
==Indicator Lights==&lt;br /&gt;
&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking white&#039;&#039;&#039; - The clicker is in pairing mode.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fast-blinking white&#039;&#039;&#039; - Pairing was successful.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid white&#039;&#039;&#039; - The clicker is charging.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Blinking amber&#039;&#039;&#039; - The battery is low.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Solid amber&#039;&#039;&#039; - The clicker ran into an error and you&#039;ll need to restart it. While pressing the pairing button, click and hold for 15 seconds.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Input Devices]] [[Category:Motion Trackers]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Redirected_touching&amp;diff=10651</id>
		<title>Redirected touching</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Redirected_touching&amp;diff=10651"/>
		<updated>2016-08-15T20:22:41Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Redirected touching is a technique developed by Luv Kohli, a computer scientist, in which virtual space is warped to map several virtual objects to one real object that serves as a passive haptic prop. The real hand motion is therefore mapped differently from the virtual hand motion, introducing discrepancies that compensate for the differences between the real and virtual objects. Like redirected walking, this technique uses errors in human perception to introduce discrepancies between the virtual environment (VE) and the real one. The technique would not be of use if the differential mapping were noticeable to the user &amp;lt;ref name="1"&amp;gt; Kohli, L. (2013). Warping Virtual Space for Low-Cost Haptic Feedback. I3D ’13 Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pp. 195-195&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt; Kohli, L., Whitton, M. C. and Brooks Jr., F. P. (2013). Redirected Touching: Training and Adaptation in Warped Virtual Spaces. Proc IEEE Symp 3D User Interfaces, pp. 79–86&amp;lt;/ref&amp;gt;. For example, if a user moved her hand in a VE and there was a 10cm difference between touching the virtual object and sensing the real one, the user might be disconcerted and the immersion broken &amp;lt;ref name="3"&amp;gt; Kohli, L. (2013). Redirected Touching. PhD dissertation, University of North Carolina at Chapel Hill&amp;lt;/ref&amp;gt;. Previous experimental results have shown that the mapping can be made predictably unnoticeable, with little effect on task performance &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Virtual environments and haptic feedback==&lt;br /&gt;
&lt;br /&gt;
A problem with virtual environments is that virtual objects cannot be felt. This lack of haptic feedback makes interaction with the VE feel unnatural, possibly reducing the sense of presence &amp;lt;ref name="4"&amp;gt; Kohli, L. (2009). Exploiting Perceptual Illusions to Enhance Passive Haptics. Proc. of IEEE VR Workshop on Perceptual Illusions in Virtual Environments (PIVE), pp. 22-24&amp;lt;/ref&amp;gt;. Passive haptics (physical props) are a way to provide compelling touch feedback for virtual objects and are commonly used in VEs. The virtual objects are mapped to the real props, generally in a one-to-one relation &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;. Providing haptic feedback significantly enhances a VE user’s experience, resulting in a more compelling experience because the user touches a real object &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. However, passive haptic displays are inflexible: when changing an object in the VE, it is necessary to change the associated real object &amp;lt;ref name="5"&amp;gt; Kohli, L., Whitton, M. C. and Brooks Jr., F. P. (2012). Redirected Touching: The Effect of Warping Space on Task Performance. Proc IEEE Symp 3D User Interfaces, pp. 105-112&amp;lt;/ref&amp;gt;. Evidence suggests that passive haptics are generally preferred over absent touch feedback, and that they also improve performance in precision tasks as well as spatial knowledge.&lt;br /&gt;
&lt;br /&gt;
[[File:1passivehaptics.png|thumb|Figure 1. Passive haptics with one-to-one relation. (Image taken from Kohli, 2013)]]&lt;br /&gt;
&lt;br /&gt;
Although, traditionally, objects in the VE and the real world have been mapped one-to-one (as in the example provided in figure 1), this mapping is not strictly required. With redirected touching, a single physical object can provide haptic feedback for several virtual objects in the VE (figure 2) by warping the virtual space &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. This introduces a discrepancy between the real and virtual hand motions. As an example, a flat physical table could provide haptic feedback for a sloped virtual table by showing the virtual hand moving along a slope while the user’s hand moves on the flat table &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;. This technique exploits the dominance of vision over the other senses so that a single real object can provide haptic feedback for many differently shaped virtual objects &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[File:2differentVOs.png|thumb|Figure 2. Mapping different virtual objects onto one physical object (Image taken from Kohli, 2013)]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Visual dominance&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
As with redirected walking, redirected touching takes advantage of errors in human perception, in this case concerning the senses of touch and vision. According to Kohli et al. (2013), “vision usually dominates proprioception when the two conflict; people tend to believe their hand is where they see it, rather than where they feel it.” A person wearing distorting glasses who moves a hand along a straight surface will feel it as curved. As another example, when subjects held an object through a cloth while viewing the same object through distorting lenses, they believed that the object they held was more similar to the shape they saw than to the shape they felt. In another study, subjects pushed a piston mounted on a passive isometric input device with their thumb while they were shown a virtual spring that compressed as they applied force to the real piston. Although the real piston did not physically move, their perception of the spring’s stiffness was influenced by viewing the virtual spring &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. Finally, it has been demonstrated that it is possible to influence the perceived elasticity of a physical elastic deformable surface by altering the amount of deformation the user’s hand produces on a virtual object &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Visual dominance is generally not complete. When exposed to real haptic and virtual cube-shaped objects with discrepant edge curvatures, users perceive the curvature of the object to be intermediate between the two. Human sensory signals are weighted by their reliability: more reliable signals are given more weight &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. Redirected touching leverages this visual dominance to let discrepancies between real and virtual objects go unnoticed. As described above, this process is analogous to redirected walking, where a discrepancy is introduced between the user’s physical head rotation and the virtual head rotation, allowing the exploration of large VEs from within a smaller, limited physical location &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Achieving redirected touching==&lt;br /&gt;
&lt;br /&gt;
Kohli (2010) describes an implementation of redirected touching that uses low-cost, quick-to-set-up passive haptics to provide haptic feedback to a user experiencing a VE. As explained above, the virtual space around an object is warped, exploiting the user’s visual dominance. The user’s virtual hand moves in virtual directions that differ from the physical motion, so that the real and virtual hands reach their respective objects simultaneously. The system currently focuses on finger-based interactions with VEs &amp;lt;ref name="6"&amp;gt; Kohli, L. (2010). Redirected Touching: Warping Space to Remap Passive Haptics. 3D User Interfaces (3DUI), 2010 IEEE Symposium, pp. 129-130&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[File:3RTequipment.png|thumb|Equipment used to achieve redirected touching. (Image taken from Kohli, 2010)]]&lt;br /&gt;
&lt;br /&gt;
The specific setup used in Kohli (2010) requires head tracking (3rdTech HiBall-3000 tracking system), finger tracking (PhaseSpace IMPULSE motion tracking system), and a system to communicate with the trackers (Virtual Reality Peripheral Network). The virtual environment is rendered using the Gamebryo game engine running on a dual quad-core 2.3GHz Intel Xeon machine with 8GB of RAM and an NVIDIA GeForce GTX 280 GPU. The VE is presented on a head mounted display (NVIS nVisor SX HMD). Haptic feedback was provided by a 20”x30” low-cost foam board with two faces (figure 3). To achieve redirected touching, the system needs the user’s fingertip location, a representation of the physical object’s geometry, and a technique to map virtual objects onto the physical object. The physical geometry is determined by the user pointing with the tracked finger at each corner of the real object; the system then interpolates the points to generate vertices for the physical geometry. Note that this technique works for physical objects consisting of planar facets. After capturing the physical geometry, “correspondences between points on its surface and predetermined points on the virtual surface are passed to the space warping system.” &amp;lt;ref name="6"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
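The corner-pointing capture step can be illustrated with a short sketch: given the four tracked corner points of a planar board, a grid of vertices is interpolated between them. The bilinear interpolation below is an assumption about how such vertices might be generated, not the system's actual code, and the grid resolution is arbitrary.

```python
# Hypothetical sketch of generating facet vertices from four tracked
# corner points by bilinear interpolation (not the system's actual code).

def lerp(a, b, t):
    """Linearly interpolate between two 3D points."""
    return [x + (y - x) * t for x, y in zip(a, b)]

def board_vertices(c00, c10, c01, c11, n=3):
    """Bilinearly interpolate an n-by-n vertex grid from 4 corner points.

    Corners are ordered c00 (origin), c10 (along u), c01 (along v), c11.
    """
    grid = []
    for i in range(n):
        u = i / (n - 1)
        # Points at parameter u along the two opposite edges of the board.
        top = lerp(c00, c10, u)
        bottom = lerp(c01, c11, u)
        for j in range(n):
            v = j / (n - 1)
            grid.append(lerp(top, bottom, v))
    return grid
```

For a unit-square board this yields, at n=3, a 3x3 grid whose middle vertex sits at the board's center.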
&lt;br /&gt;
According to Kohli et al. (2012), to warp the virtual space “the surface of the real geometry must be mapped to the surface of the virtual geometry, while smoothly and minimally warping the rest of the space. Our system warps space using the well-known thin-plate spline technique commonly used in medical image analysis. A thin plate spline is a 2D interpolation method for passing a smooth and minimally bent surface through a set of points. The concept extends to higher dimensions; we use the 3D version.” &amp;lt;ref name="5"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
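The idea of such a warp can be sketched with a deliberately simplified stand-in: given real-to-virtual correspondence points, displace any tracked point by a distance-weighted blend of the correspondence offsets. This inverse-distance scheme is an illustrative assumption standing in for the 3D thin-plate spline the system actually uses; it shares only the key property that correspondence points map exactly and the space in between deforms smoothly.

```python
# Much-simplified, hypothetical stand-in for the 3D thin-plate-spline warp:
# displace a real-space point by an inverse-distance-weighted blend of the
# (real point to virtual point) correspondence offsets. A real
# implementation would solve the thin-plate-spline system instead.

def warp_point(p, correspondences, eps=1e-9):
    """Map a real-space point p into virtual space.

    correspondences: list of (real_point, virtual_point) pairs.
    """
    weights = []
    offsets = []
    for real_pt, virt_pt in correspondences:
        d_sq = sum((a - b) ** 2 for a, b in zip(p, real_pt))
        if eps > d_sq:
            # Exactly on a correspondence point: map to its virtual partner.
            return list(virt_pt)
        weights.append(1.0 / d_sq)
        offsets.append([v - r for r, v in zip(real_pt, virt_pt)])
    total = sum(weights)
    # Weighted average of the per-correspondence displacements.
    return [p[i] + sum(w * off[i] for w, off in zip(weights, offsets)) / total
            for i in range(len(p))]
```

With correspondences lifting two table corners upward, a fingertip halfway between them is lifted by the blended amount, so the virtual hand appears to move along the sloped virtual surface while the real hand stays on the flat prop.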
&lt;br /&gt;
==Research questions==&lt;br /&gt;
&lt;br /&gt;
The development of redirected touching has led to several questions: Is discrepancy detectable? What kinds and amounts of introduced discrepancy would go unnoticed by users? Does discrepancy hurt task performance? Can users perform tasks with discrepant objects as well as they can with one-to-one objects? How does performance change as discrepancy increases? &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. These have been researched mainly by Luv Kohli. Some of the research done so far indicates that, for certain tasks, performance is not affected by whether a virtual object is warped. The detection of discrepancy between what is seen and felt has also been evaluated, suggesting that there is indeed a certain amount of discrepancy that is undetectable by users &amp;lt;ref name="5"&amp;gt;&amp;lt;/ref&amp;gt;. Besides this, tests evaluating training and adaptation on a rapid aiming task in a real environment, an unwarped VE, and a warped VE concluded that training for the real task occurred in all conditions. Although training in the real condition was more effective, the results suggest that training with redirected touching transferred to the real world &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Applications==&lt;br /&gt;
&lt;br /&gt;
There are a number of potential applications for redirected touching, such as virtual prototyping, redirected walking, entertainment, art, and training &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected touching can be applied to the training of military aircraft pilots and maintenance crews, who need to learn to perform cockpit procedures, such as sequences of buttons and switches. The real aircraft and full simulators available for training the necessary skills are very costly and are not available in deployed settings. Redirected touching can therefore “enable a single quickly set-up physical mockup to represent many virtual cockpits, eliminating the need to change the mockup for each new aircraft.” &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Terms]] [[Category:Technical Terms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10646</id>
		<title>Redirected walking</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10646"/>
		<updated>2016-08-14T00:18:41Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Virtual Environments and Redirected Walking==&lt;br /&gt;
&lt;br /&gt;
[[Virtual reality]] technology (VR), allied with the development of immersive [[virtual environment]]s (VE), holds the promise of a myriad of uses, such as exploring buildings, cities, [[tourism]]-oriented [[virtual spaces]], [[training]], [[education]], or [[entertainment]] such as [[games|video games]], with [[HMD|head mounted displays]] (HMD) &amp;lt;ref name="1"&amp;gt; Zhang, S. (2015). You can’t walk in a straight line – and that’s great for VR. Retrieved from www.wired.com/2015/08/cant-walk-straight-lineand-thats-great-vr&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt; Steinicke, F., Bruder, G., Ropinski, T. and Hinrichs, K. (2008). Moving Towards Generally Applicable Redirected Walking. Proceedings of the Virtual Reality International Conference (VRIC), pages 15-24&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt; Hodgson, E., Bachmann, E. and Waller, D. (2011). Redirected Walking to Explore Virtual Environments: Assessing the Potential for Spatial Interference. ACM Transactions on Applied Perception, (8)4&amp;lt;/ref&amp;gt;. Traditionally, the problem with exploring these VEs has been that, in many existing VR systems, the user navigates the virtual world with hand-based input devices that control the direction, speed, acceleration, and deceleration of movements, which decreases the sense of immersion. Other devices, such as [[omnidirectional treadmills|treadmills]], allow users to walk through VEs, but even these do not allow for a great sense of immersion, since the user still has to change direction manually. Various prototypes have been developed that try to improve walking as input for exploring virtual spaces, such as [[omni-directional treadmills]], motion footpads, robot tiles, and motion carpets. 
These systems, despite being technological achievements, have the disadvantage of being costly and hardly scalable (they support only one walking user), and as such are not good candidates for advancement beyond the prototype stage &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt; Steinicke, F., Bruder, G., Jerald, J., Frenz, H. and Lappe, M. (2010). Estimation of Detection Thresholds for Redirected Walking Techniques. IEEE Trans Vis Comput Graph., 16(1): 17-27&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The problem of how the user moves around in VR has still not been solved in a totally satisfactory, immersion-maximizing manner &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;. Real walking is more presence-enhancing than the other techniques described above and, as such, presents itself as a possible solution &amp;lt;ref name="5"&amp;gt; Steinicke, F., Bruder, G., Hinrichs, K. and Steed, A. (2009). Presence-Enhancing Real Walking User Interface for First-Person Video Games. Proceeding of the 2009 ACM SIGGRAPH Symposium on Video Games, pages 111-118&amp;lt;/ref&amp;gt;. [[Presence]] can be defined as the subjective feeling of being in the virtual environment, and it is important for VE applications to further engage the user in a credible virtual place &amp;lt;ref name="5"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="6"&amp;gt; Razzaque, S., Swapp, D., Slater, M., Whitton, M. C. and Steed, A. (2002). Redirected Walking in Place. EGVE &#039;02 Proceedings of the workshop on Virtual environments, pages 123-130&amp;lt;/ref&amp;gt;. By utilizing the user’s [[positional tracking|position]] and [[rotational tracking|orientation tracking]] within a certain area, immersive virtual environments that use HMDs allow users to navigate through the virtual world in a more natural manner. The position and orientation of the person are constantly updated, and the view in the HMD is adjusted correspondingly. However, it has been difficult to develop compelling large-scale VEs due to the limitations of tracking technology (e.g. range) and access only to relatively small physical spaces in which users can walk about &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="7"&amp;gt; Hodgson, E. and Bachmann, E. (2013). Comparing Four Approaches to Generalized Redirected Walking: Simulation and Live User Data. IEEE Trans Vis Comput Graph., 19(4):634-43&amp;lt;/ref&amp;gt;. 
This leads to the need for a system that allows the user to walk over large distances in the virtual world while physically remaining constrained to a relatively small place &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;. As an example, first-person video games in virtual reality would benefit from such technology by allowing gamers to experience the game immersively, not only because their [[field-of-view]] is that of the virtual character but also because their movements would be tracked in-game, allowing players to cover long distances in virtual reality while staying in a small physical area &amp;lt;ref name="5"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking is a possible solution to the problem of mapping physical distances to virtual ones. This approach takes advantage of “people’s inability to detect small discrepancies between visual and proprioceptive sensory information during navigation” &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;, and it allows the user to turn and walk in the VE using the body instead of a joystick, while reducing the amount of physical space needed relative to the virtual space &amp;lt;ref name="6"&amp;gt;&amp;lt;/ref&amp;gt;. According to Steinicke et al. (2009), when humans can use only vision to judge their motion through a virtual scene, they can successfully estimate their momentary direction of self-motion but are not as good at perceiving their paths of travel. By creating the right mismatches between the physical movement of the user and its visual consequence in the VE, the user can be steered towards the center of the tracking space, away from the edges of the room &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. People do not notice an increase or decrease in the virtual distance they have to walk, or that the virtual room has been shifted so that they perceive their path as straight when, in fact, the real path is curved. Even when users turn their heads, a virtual rotation up to 49 percent greater or 20 percent smaller than the physical one will go unnoticed. As long as a movement is seen and sensed, the magnitude of that movement does not have to be precise &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;. It is these limitations in the human perception of position, orientation, and movement that are exploited by the algorithms of redirected walking &amp;lt;ref name="6"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==How redirected walking is achieved==&lt;br /&gt;
&lt;br /&gt;
The basic technique for redirected walking is to rotate the virtual scene around a vertical axis centered on the user’s head. To keep walking in what appears to be a straight line, the user must physically turn to compensate for the injected rotation. By inserting small rotations over time, the user is steered back toward the center of the physical area without realizing it. In this way, redirected walking increases the amount of virtual space that can be simulated and traversed while the user remains confined to a small physical area, and it also prevents users from colliding with the walls of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. For the technique to be successful, the rotation of the visual scene must not increase users’ [[simulator sickness]] &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
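Per frame, the injected rotation is capped at a rate small enough to stay below perception. A minimal sketch of one such steering step, assuming a 2D room and an illustrative (not empirically derived) rate limit:

```python
import math

# Illustrative per-frame redirection step: a small extra yaw rotation is
# injected into the virtual scene so that the user, while walking
# "straight" in the VE, physically curves back toward the room center.
MAX_INJECTED_DEG_PER_SEC = 2.5   # assumed imperceptible steering rate

def injected_rotation(user_pos, center, heading_deg, dt):
    """Yaw (degrees) to add to the virtual scene this frame, nudging the
    user's physical path toward `center` of the tracked area."""
    to_center = math.degrees(math.atan2(center[1] - user_pos[1],
                                        center[0] - user_pos[0]))
    # signed angular error between current heading and direction to center,
    # wrapped into [-180, 180)
    error = (to_center - heading_deg + 180.0) % 360.0 - 180.0
    max_step = MAX_INJECTED_DEG_PER_SEC * dt
    return max(-max_step, min(max_step, error))
```

Called once per rendered frame with the frame time `dt`, this never rotates the scene faster than the assumed rate limit, regardless of how far the user's heading deviates from the direction to the center.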
&lt;br /&gt;
Redirected walking, with its correspondence between real and virtual movements that makes the body function as an [[Input Devices|input device]], provides users with rich spatial-sensory feedback, resulting in a greater sense of presence and less chance of becoming disoriented in the VE. Indeed, according to Hodgson and Bachmann (2013), “virtual walking produces the same proprioceptive, inertial, and somatosensory cues that users experience while navigating in the real world.”&lt;br /&gt;
&lt;br /&gt;
The biological basis for redirected walking can be seen in the well-known phenomenon of a person lost in the woods walking in circles without realizing it, even while trying to walk in a straight line &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. Souman et al. (2009) demonstrated this experimentally: participants walked in circles when they could not see the sun, and even when the sun was visible (providing some sense of orientation) they sometimes veered off a straight course, although in that case they did not walk in circles &amp;lt;ref&amp;gt; Souman, J. L., Frissen, I., Sreenivasa, M. N. and Ernst, M. O. (2009). Walking Straight into Circles. Current Biology, 19: 1538-1542&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The space required and the maximum rates of imperceptible steering for effective redirected walking depend not only on the limits of human perception but also on other relevant factors that may receive less attention: the specific attentional demands of the user’s task in the VE, perceptual adaptation (which depends on session duration and the number of repeated sessions), the nature of the VE (the proximity of objects and the amount of optic flow), individual differences between users, and the walking algorithms used. These algorithms are responsible for the imperceptible rotation of the virtual scene and the scaling of movements that guide users away from the boundaries of the tracking area, permitting them to explore large virtual worlds while walking naturally in a limited physical space. Hodgson and Bachmann (2013) tested four algorithms: Steer-to-Center, Steer-to-Orbit, Steer-to-Multiple-Targets, and Steer-to-Multiple+Center (Figure 1), and concluded that Steer-to-Center tended to outperform the others at keeping users within the smallest possible area &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
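As a rough illustration of how such steering strategies differ, the sketch below contrasts Steer-to-Center (always aim at a fixed room center) with a simple geometric reading of Steer-to-Orbit (aim along a tangent to a circle around the center, so the user circulates rather than crossing the middle). This is only a hedged interpretation of the algorithm names; it is not the exact implementations tested in the paper, and the radius value is illustrative.

```python
import math

def steer_to_center_target(center):
    """Steer-to-Center: the steering target is simply the room center."""
    return center

def steer_to_orbit_target(user_pos, center, orbit_radius):
    """Steer-to-Orbit (one plausible geometry): aim at the tangent point
    on a circle of `orbit_radius` around the center."""
    dx, dy = center[0] - user_pos[0], center[1] - user_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= orbit_radius:
        return center  # already inside the orbit: fall back to the center
    # rotate the to-center direction by the tangent angle and step out to
    # the tangent point, which lies sqrt(dist^2 - r^2) away from the user
    tangent = math.asin(orbit_radius / dist)
    reach = math.sqrt(dist * dist - orbit_radius * orbit_radius)
    base = math.atan2(dy, dx)
    return (user_pos[0] + reach * math.cos(base + tangent),
            user_pos[1] + reach * math.sin(base + tangent))
```

The returned target would feed the per-frame steering step: the injected rotations nudge the user's physical heading toward it.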
&lt;br /&gt;
[[File:RDW.png|thumb|Figure 1. Four different algorithms used in Hodgson and Bachmann (2013) for redirected walking]]&lt;br /&gt;
&lt;br /&gt;
==Limitations of redirected walking==&lt;br /&gt;
&lt;br /&gt;
For redirected walking to become a widely used tool in VR, it must be imperceptible: users must not be distracted by it, it must not increase the incidence of simulator sickness, and it must not interfere with spatial learning and memory. Studies that have examined how much redirection can be applied to a scene without being noticed suggest that, within specific thresholds, there is no higher incidence of simulator sickness. A more significant limitation is the minimum physical space required for effective redirected walking: Hodgson et al. (2011) suggest that a tracking area from at least 30 m to more than 44 m in diameter is necessary to simulate infinitely large VEs &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. This impedes a quick transition of the technology into the living room, where not only is the available space more limited, but it also needs to be empty (free of furniture, for example). In addition, the technology is still expensive for the average consumer, since it requires very good position tracking over a large space. If the technology behind redirected walking keeps developing, it might become a reality for consumers, as VR headsets have &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Terms]] [[Category:Technical Terms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Field-of-view&amp;diff=10645</id>
		<title>Field-of-view</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Field-of-view&amp;diff=10645"/>
		<updated>2016-08-14T00:17:18Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Redirected page to Field of view&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Field of view]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10644</id>
		<title>Redirected walking</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10644"/>
		<updated>2016-08-14T00:16:54Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Virtual Environments and Redirected Walking==&lt;br /&gt;
&lt;br /&gt;
[[Virtual reality]] technology (VR), allied with the development of immersive [[virtual environment]]s (VE), holds the promise of a myriad of uses such as exploring buildings, cities, [[tourism]]-oriented [[virtual spaces]], [[training]], [[education]], or [[entertainment]] such as [[games|video games]], with [[HMD|head mounted displays]] (HMD) &amp;lt;ref name=”1”&amp;gt; Zhang, S. (2015). You can’t walk in a straight line – and that’s great for VR. Retrieved from www.wired.com/2015/08/cant-walk-straight-lineand-thats-great-vr&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt; Steinicke, F., Bruder, G., Ropinski, T. and Hinrichs, K. (2008). Moving Towards Generally Applicable Redirected Walking. Proceedings of the Virtual Reality International Conference (VRIC), pages 15-24&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Hodgson, E., Bachmann, E. and Waller, D. (2011). Redirected Walking to Explore Virtual Environments: Assessing the Potential for Spatial Interference. ACM Transactions on Applied Perception, 8(4)&amp;lt;/ref&amp;gt;. Traditionally, the problem with exploring these VEs has been that, in many existing VR systems, the user navigates the virtual world with hand-based input devices that control the direction, speed, acceleration and deceleration of movements, which decreases the sense of immersion. Other devices, such as [[omnidirectional treadmills|treadmills]], allow users to walk through VEs, but even these do not allow for a great sense of immersion, since the user still has to change direction manually. Various prototypes have been developed to improve walking as an input for exploring virtual spaces, such as [[omni-directional treadmills]], motion footpads, robot tiles, and motion carpets.
These systems, despite being technological achievements, have the disadvantage of being costly and hardly scalable (they support only one walking user), and as such are not good candidates for advancement beyond the prototype stage &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Steinicke, F., Bruder, G., Jerald, J., Frenz, H. and Lappe, M. (2010). Estimation of Detection Thresholds for Redirected Walking Techniques. IEEE Trans Vis Comput Graph., 16(1): 17-27&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The problem of how the user moves around in VR has not yet been solved in a fully satisfactory manner that maximizes immersion &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Real walking is more presence-enhancing than the other techniques described above and, as such, presents itself as a possible solution &amp;lt;ref name=”5”&amp;gt; Steinicke, F., Bruder, G., Hinrichs, K. and Steed, A. (2009). Presence-Enhancing Real Walking User Interface for First-Person Video Games. Proceedings of the 2009 ACM SIGGRAPH Symposium on Video Games, pages 111-118&amp;lt;/ref&amp;gt;. [[Presence]] can be defined as the subjective feeling of being in the virtual environment, and it is important for VE applications to further engage the user in a credible virtual place &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt; Razzaque, S., Swapp, D., Slater, M., Whitton, M. C. and Steed, A. (2002). Redirected Walking in Place. EGVE &#039;02 Proceedings of the workshop on Virtual environments, pages 123-130&amp;lt;/ref&amp;gt;. By tracking the user’s [[position tracking|position]] and [[orientation tracking|orientation]] within a certain area, immersive virtual environments that use HMDs allow users to navigate through virtual reality in a more natural manner: the person’s position and orientation are constantly updated, and the view in the HMD is adjusted correspondingly. However, it has been difficult to develop compelling large-scale VEs due to the limitations of tracking technology (e.g. range) and access only to relatively small physical spaces in which users can walk about &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt; Hodgson, E. and Bachmann, E. (2013). Comparing Four Approaches to Generalized Redirected Walking: Simulation and Live User Data. IEEE Trans Vis Comput Graph., 19(4): 634-43&amp;lt;/ref&amp;gt;.
This creates the need for a system that allows the user to walk over large distances in the virtual world while physically remaining within a relatively small space &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. As an example, first-person video games in virtual reality would benefit from such technology by allowing gamers to experience the game immersively: not only would their [[field-of-view]] be that of the virtual character, but their movements would also be tracked in-game, allowing players to cover long distances in virtual reality while staying in a small physical area &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking is a possible solution to the problem of tracking physical distances in relation to virtual ones. This approach takes advantage of “people’s inability to detect small discrepancies between visual and proprioceptive sensory information during navigation” &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;, and it allows the user to turn and walk in the VE using the body instead of a joystick while reducing the amount of physical space needed relative to the virtual space &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. According to Steinicke et al. (2009), when humans can use only vision to judge their motion through a virtual scene, they can successfully estimate their momentary direction of self-motion but are not as good at perceiving their paths of travel. By creating the right mismatches between the user’s physical movement and its visual consequence in the VE, the user can be steered towards the center of the tracking space, away from the edges of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. People do not notice a moderate increase or decrease in the virtual distance they have to walk, or a shift of the virtual room that makes them perceive their path as straight when the real path is in fact curved. Even when users turn their heads, a virtual turn 49 percent greater or 20 percent smaller than the physical one will go unnoticed. As long as a movement is seen and sensed, the magnitude of that movement does not have to be precise &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. It is these limitations of human perception in sensing position, orientation and movement that are exploited by the algorithms of redirected walking &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==How redirected walking is achieved==&lt;br /&gt;
&lt;br /&gt;
The basic technique for redirected walking is to rotate the virtual scene around a vertical axis centered on the user’s head. To keep walking in what appears to be a straight line, the user must physically turn to compensate for the injected rotation. By inserting small rotations over time, the user is steered back toward the center of the physical area without realizing it. In this way, redirected walking increases the amount of virtual space that can be simulated and traversed while the user remains confined to a small physical area, and it also prevents users from colliding with the walls of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. For the technique to be successful, the rotation of the visual scene must not increase users’ [[simulator sickness]] &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking, with its correspondence between real and virtual movements that makes the body function as an [[Input Devices|input device]], provides users with rich spatial-sensory feedback, resulting in a greater sense of presence and less chance of becoming disoriented in the VE. Indeed, according to Hodgson and Bachmann (2013), “virtual walking produces the same proprioceptive, inertial, and somatosensory cues that users experience while navigating in the real world.”&lt;br /&gt;
&lt;br /&gt;
The biological basis for redirected walking can be seen in the well-known phenomenon of a person lost in the woods walking in circles without realizing it, even while trying to walk in a straight line &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. Souman et al. (2009) demonstrated this experimentally: participants walked in circles when they could not see the sun, and even when the sun was visible (providing some sense of orientation) they sometimes veered off a straight course, although in that case they did not walk in circles &amp;lt;ref&amp;gt; Souman, J. L., Frissen, I., Sreenivasa, M. N. and Ernst, M. O. (2009). Walking Straight into Circles. Current Biology, 19: 1538-1542&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The space required and the maximum rates of imperceptible steering for effective redirected walking depend not only on the limits of human perception but also on other relevant factors that may receive less attention: the specific attentional demands of the user’s task in the VE, perceptual adaptation (which depends on session duration and the number of repeated sessions), the nature of the VE (the proximity of objects and the amount of optic flow), individual differences between users, and the walking algorithms used. These algorithms are responsible for the imperceptible rotation of the virtual scene and the scaling of movements that guide users away from the boundaries of the tracking area, permitting them to explore large virtual worlds while walking naturally in a limited physical space. Hodgson and Bachmann (2013) tested four algorithms: Steer-to-Center, Steer-to-Orbit, Steer-to-Multiple-Targets, and Steer-to-Multiple+Center (Figure 1), and concluded that Steer-to-Center tended to outperform the others at keeping users within the smallest possible area &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[File:RDW.png|thumb|Figure 1. Four different algorithms used in Hodgson and Bachmann (2013) for redirected walking]]&lt;br /&gt;
&lt;br /&gt;
==Limitations of redirected walking==&lt;br /&gt;
&lt;br /&gt;
For redirected walking to become a widely used tool in VR, it must be imperceptible: users must not be distracted by it, it must not increase the incidence of simulator sickness, and it must not interfere with spatial learning and memory. Studies that have examined how much redirection can be applied to a scene without being noticed suggest that, within specific thresholds, there is no higher incidence of simulator sickness. A more significant limitation is the minimum physical space required for effective redirected walking: Hodgson et al. (2011) suggest that a tracking area from at least 30 m to more than 44 m in diameter is necessary to simulate infinitely large VEs &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. This impedes a quick transition of the technology into the living room, where not only is the available space more limited, but it also needs to be empty (free of furniture, for example). In addition, the technology is still expensive for the average consumer, since it requires very good position tracking over a large space. If the technology behind redirected walking keeps developing, it might become a reality for consumers, as VR headsets have &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Terms]] [[Category:Technical Terms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10643</id>
		<title>Redirected walking</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10643"/>
		<updated>2016-08-14T00:11:00Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Virtual Environments and Redirected Walking==&lt;br /&gt;
&lt;br /&gt;
[[Virtual reality]] technology (VR), allied with the development of immersive [[virtual environment]]s (VE), holds the promise of a myriad of uses such as exploring buildings, cities, [[tourism]]-oriented [[virtual spaces]], [[training]], [[education]], or [[entertainment]] such as [[games|video games]], with [[HMD|head mounted displays]] (HMD) &amp;lt;ref name=”1”&amp;gt; Zhang, S. (2015). You can’t walk in a straight line – and that’s great for VR. Retrieved from www.wired.com/2015/08/cant-walk-straight-lineand-thats-great-vr&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt; Steinicke, F., Bruder, G., Ropinski, T. and Hinrichs, K. (2008). Moving Towards Generally Applicable Redirected Walking. Proceedings of the Virtual Reality International Conference (VRIC), pages 15-24&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Hodgson, E., Bachmann, E. and Waller, D. (2011). Redirected Walking to Explore Virtual Environments: Assessing the Potential for Spatial Interference. ACM Transactions on Applied Perception, 8(4)&amp;lt;/ref&amp;gt;. Traditionally, the problem with exploring these VEs has been that, in many existing VR systems, the user navigates the virtual world with hand-based input devices that control the direction, speed, acceleration and deceleration of movements, which decreases the sense of immersion. Other devices, such as [[omnidirectional treadmills|treadmills]], allow users to walk through VEs, but even these do not allow for a great sense of immersion, since the user still has to change direction manually. Various prototypes have been developed to improve walking as an input for exploring virtual spaces, such as [[omni-directional treadmills]], motion footpads, robot tiles, and motion carpets.
These systems, despite being technological achievements, have the disadvantage of being costly and hardly scalable (they support only one walking user), and as such are not good candidates for advancement beyond the prototype stage &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Steinicke, F., Bruder, G., Jerald, J., Frenz, H. and Lappe, M. (2010). Estimation of Detection Thresholds for Redirected Walking Techniques. IEEE Trans Vis Comput Graph., 16(1): 17-27&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The problem of how the user moves around in VR has not yet been solved in a fully satisfactory manner that maximizes immersion &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Real walking is more presence-enhancing than the other techniques described above and, as such, presents itself as a possible solution &amp;lt;ref name=”5”&amp;gt; Steinicke, F., Bruder, G., Hinrichs, K. and Steed, A. (2009). Presence-Enhancing Real Walking User Interface for First-Person Video Games. Proceedings of the 2009 ACM SIGGRAPH Symposium on Video Games, pages 111-118&amp;lt;/ref&amp;gt;. Presence can be defined as the subjective feeling of being in the virtual environment, and it is important for VE applications to further engage the user in a credible virtual place &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt; Razzaque, S., Swapp, D., Slater, M., Whitton, M. C. and Steed, A. (2002). Redirected Walking in Place. EGVE &#039;02 Proceedings of the workshop on Virtual environments, pages 123-130&amp;lt;/ref&amp;gt;. By tracking the user’s position and orientation within a certain area, immersive virtual environments that use HMDs allow users to navigate through virtual reality in a more natural manner: the person’s position and orientation are constantly updated, and the view in the HMD is adjusted correspondingly. However, it has been difficult to develop compelling large-scale VEs due to the limitations of tracking technology (e.g. range) and access only to relatively small physical spaces in which users can walk about &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt; Hodgson, E. and Bachmann, E. (2013). Comparing Four Approaches to Generalized Redirected Walking: Simulation and Live User Data. IEEE Trans Vis Comput Graph., 19(4): 634-43&amp;lt;/ref&amp;gt;.
This creates the need for a system that allows the user to walk over large distances in the virtual world while physically remaining within a relatively small space &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. As an example, first-person video games in virtual reality would benefit from such technology by allowing gamers to experience the game immersively: not only would their field-of-view be that of the virtual character, but their movements would also be tracked in-game, allowing players to cover long distances in virtual reality while staying in a small physical area &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking is a possible solution to the problem of tracking physical distances in relation to virtual ones. This approach takes advantage of “people’s inability to detect small discrepancies between visual and proprioceptive sensory information during navigation” &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;, and it allows the user to turn and walk in the VE using the body instead of a joystick while reducing the amount of physical space needed relative to the virtual space &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. According to Steinicke et al. (2009), when humans can use only vision to judge their motion through a virtual scene, they can successfully estimate their momentary direction of self-motion but are not as good at perceiving their paths of travel. By creating the right mismatches between the user’s physical movement and its visual consequence in the VE, the user can be steered towards the center of the tracking space, away from the edges of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. People do not notice a moderate increase or decrease in the virtual distance they have to walk, or a shift of the virtual room that makes them perceive their path as straight when the real path is in fact curved. Even when users turn their heads, a virtual turn 49 percent greater or 20 percent smaller than the physical one will go unnoticed. As long as a movement is seen and sensed, the magnitude of that movement does not have to be precise &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. It is these limitations of human perception in sensing position, orientation and movement that are exploited by the algorithms of redirected walking &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==How redirected walking is achieved==&lt;br /&gt;
&lt;br /&gt;
The basic technique for redirected walking is to rotate the virtual scene around a vertical axis centered on the user’s head. To keep walking in what appears to be a straight line, the user must physically turn to compensate for the injected rotation. By inserting small rotations over time, the user is steered back toward the center of the physical area without realizing it. In this way, redirected walking increases the amount of virtual space that can be simulated and traversed while the user remains confined to a small physical area, and it also prevents users from colliding with the walls of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. For the technique to be successful, the rotation of the visual scene must not increase users’ simulator sickness &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking, with its correspondence between real and virtual movements that makes the body function as an input device, provides users with rich spatial-sensory feedback, resulting in a greater sense of presence and less chance of becoming disoriented in the VE. Indeed, according to Hodgson and Bachmann (2013), “virtual walking produces the same proprioceptive, inertial, and somatosensory cues that users experience while navigating in the real world.”&lt;br /&gt;
&lt;br /&gt;
The biological basis for redirected walking can be seen in the well-known phenomenon of a person lost in the woods walking in circles without realizing it, even while trying to walk in a straight line &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. Souman et al. (2009) demonstrated this experimentally: participants walked in circles when they could not see the sun, and even when the sun was visible (providing some sense of orientation) they sometimes veered off a straight course, although in that case they did not walk in circles &amp;lt;ref&amp;gt; Souman, J. L., Frissen, I., Sreenivasa, M. N. and Ernst, M. O. (2009). Walking Straight into Circles. Current Biology, 19: 1538-1542&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The space required and the maximum rates of imperceptible steering for effective redirected walking depend not only on the limits of human perception but also on other relevant factors that may receive less attention: the specific attentional demands of the user’s task in the VE, perceptual adaptation (which depends on session duration and the number of repeated sessions), the nature of the VE (the proximity of objects and the amount of optic flow), individual differences between users, and the walking algorithms used. These algorithms are responsible for the imperceptible rotation of the virtual scene and the scaling of movements that guide users away from the boundaries of the tracking area, permitting them to explore large virtual worlds while walking naturally in a limited physical space. Hodgson and Bachmann (2013) tested four algorithms: Steer-to-Center, Steer-to-Orbit, Steer-to-Multiple-Targets, and Steer-to-Multiple+Center (Figure 1), and concluded that Steer-to-Center tended to outperform the others at keeping users within the smallest possible area &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[File:RDW.png|thumb|Figure 1. Four different algorithms used in Hodgson and Bachmann (2013) for redirected walking]]&lt;br /&gt;
&lt;br /&gt;
==Limitations of redirected walking==&lt;br /&gt;
&lt;br /&gt;
For redirected walking to become a widely used tool in VR, it must be imperceptible: users must not be distracted by it, it must not increase the incidence of simulator sickness, and it must not interfere with spatial learning and memory. Studies that have examined how much redirection can be applied to a scene without being noticed suggest that, within specific thresholds, there is no higher incidence of simulator sickness. A more significant limitation is the minimum physical space required for effective redirected walking: Hodgson et al. (2011) suggest that a tracking area from at least 30 m to more than 44 m in diameter is necessary to simulate infinitely large VEs &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. This impedes a quick transition of the technology into the living room, where not only is the available space more limited, but it also needs to be empty (free of furniture, for example). In addition, the technology is still expensive for the average consumer, since it requires very good position tracking over a large space. If the technology behind redirected walking keeps developing, it might become a reality for consumers, as VR headsets have &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Terms]] [[Category:Technical Terms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10642</id>
		<title>Redirected walking</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Redirected_walking&amp;diff=10642"/>
		<updated>2016-08-14T00:08:26Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Virtual Environments and Redirected Walking==&lt;br /&gt;
&lt;br /&gt;
[[Virtual reality]] technology (VR), allied with the development of immersive virtual environments (VE), holds the promise of a myriad of uses such as exploring buildings, cities, tourism-oriented virtual spaces, training, education, or entertainment such as video games, with head mounted displays (HMD) &amp;lt;ref name=”1”&amp;gt; Zhang, S. (2015). You can’t walk in a straight line – and that’s great for VR. Retrieved from www.wired.com/2015/08/cant-walk-straight-lineand-thats-great-vr&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt; Steinicke, F., Bruder, G., Ropinski, T. and Hinrichs, K. (2008). Moving Towards Generally Applicable Redirected Walking. Proceedings of the Virtual Reality International Conference (VRIC), pages 15-24&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Hodgson, E., Bachmann, E. and Waller, D. (2011). Redirected Walking to Explore Virtual Environments: Assessing the Potential for Spatial Interference. ACM Transactions on Applied Perception, 8(4)&amp;lt;/ref&amp;gt;. Traditionally, the problem with exploring these VEs has been that, in many existing VR systems, the user navigates the virtual world with hand-based input devices that control the direction, speed, acceleration and deceleration of movements, which decreases the sense of immersion. Other devices, such as treadmills, allow users to walk through VEs, but even these do not allow for a great sense of immersion, since the user still has to change direction manually. Various prototypes have been developed to improve walking as an input for exploring virtual spaces, such as omni-directional treadmills, motion footpads, robot tiles, and motion carpets.
These systems, despite being technological achievements, have the disadvantage of being costly and hardly scalable (they support only one walking user), and as such are not good candidates for advancement beyond the prototype stage &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt; Steinicke, F., Bruder, G., Jerald, J., Frenz, H. and Lappe, M. (2010). Estimation of Detection Thresholds for Redirected Walking Techniques. IEEE Trans Vis Comput Graph., 16(1): 17-27&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The problem of how the user moves around in VR has yet to be solved in a fully satisfactory manner that maximizes immersion &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Real walking is more presence-enhancing than the other techniques described above and, as such, presents itself as a possible solution &amp;lt;ref name=”5”&amp;gt; Steinicke, F., Bruder, G., Hinrichs, K. and Steed, A. (2009). Presence-Enhancing Real Walking User Interface for First-Person Video Games. Proceedings of the 2009 ACM SIGGRAPH Symposium on Video Games, pages 111-118&amp;lt;/ref&amp;gt;. Presence can be defined as the subjective feeling of being in the virtual environment, and it is important for VE applications to further engage the user in a credible virtual place &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt; Razzaque, S., Swapp, D., Slater, M., Whitton, M. C. and Steed, A. (2002). Redirected Walking in Place. EGVE &#039;02 Proceedings of the Workshop on Virtual Environments, pages 123-130&amp;lt;/ref&amp;gt;. By tracking the user’s position and orientation within a certain area, immersive virtual environments that use HMDs allow users to navigate through virtual reality in a more natural manner. The person’s position and orientation are constantly updated, and the view in the HMD is adjusted correspondingly. However, it has been difficult to develop compelling large-scale VEs due to the limitations of tracking technology (e.g. range) and access only to relatively small physical spaces in which users can walk about &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”7”&amp;gt; Hodgson, E. and Bachmann, E. (2013). Comparing Four Approaches to Generalized Redirected Walking: Simulation and Live User Data. IEEE Trans Vis Comput Graph., 19(4): 634-43&amp;lt;/ref&amp;gt;. 
This creates the need for a system that allows the user to walk over large distances in the virtual world while physically remaining confined to a relatively small space &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. As an example, first-person video games in virtual reality would benefit from such technology by allowing gamers to experience the game immersively, not only because their field of view is that of the virtual character but also because their movements would be tracked in-game, allowing players to cover long distances in virtual reality while staying in a small physical area &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Redirected walking is a possible solution to the problem of tracking physical distances in relation to virtual ones. This approach takes advantage of “people’s inability to detect small discrepancies between visual and proprioceptive sensory information during navigation” &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;, and it allows the user to turn and walk in the VE using the body instead of a joystick while reducing the amount of physical space needed relative to the virtual one &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. According to Steinicke et al. (2009), when humans can use only vision to judge their motion through a virtual scene, they can successfully estimate their momentary direction of self-motion but are much less accurate at perceiving their paths of travel. By creating the right mismatches between the physical movement of the user and its visual consequence in the VE, the user can be steered towards the center of the tracking space, away from the edges of the room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. People do not notice an increase or decrease in the virtual distance they have to walk, or that the virtual room has been shifted so that they perceive their path as straight when, in fact, the real path is curved. Even when users turn their heads, a turn in virtual space up to 49 percent greater or 20 percent smaller than the physical turn will also go unnoticed. As long as a movement is seen and sensed, the magnitude of that movement does not have to be precise &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. It is these limitations of human perception in sensing position, orientation and movement that are exploited by redirected walking algorithms &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
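The rotation-gain thresholds described above can be sketched in code. The following Python sketch is purely illustrative and not taken from any of the cited papers; the function name and structure are hypothetical, and only the 0.80 to 1.49 gain range comes from the figures above.

```python
# Illustrative sketch (not from the cited papers): clamp a desired
# rotation gain to the range users reportedly cannot detect, where a
# virtual head turn may be up to 49% greater (gain 1.49) or 20%
# smaller (gain 0.80) than the physical turn.
GAIN_MIN, GAIN_MAX = 0.80, 1.49

def apply_rotation_gain(physical_turn_deg, desired_gain):
    """Return the virtual turn produced by a physical head turn,
    with the applied gain clamped to the imperceptible range."""
    gain = max(GAIN_MIN, min(GAIN_MAX, desired_gain))
    return physical_turn_deg * gain
```

A system built this way can request any gain it likes; the clamp guarantees the visual turn never strays outside the range that, per the studies cited above, goes unnoticed.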
&lt;br /&gt;
==How redirected walking is achieved==&lt;br /&gt;
&lt;br /&gt;
The basic technique for redirected walking is to rotate the visual virtual scene around a vertical axis centered on the user’s head. When the user wants to walk in a straight line, he or she has to turn physically to compensate and reach the goal. By inserting small rotations over time, the user is induced to return to the center of the physical area without realizing it. In this way, redirected walking allows for an increase in the amount of virtual space that can be simulated and traversed while the user is confined to a small physical area. This technique also prevents users from colliding with the walls of the physical room &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. For the technique to be successful, the rotation of the visual scene must also not increase users’ simulator sickness &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
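The per-frame injection of small scene rotations can be illustrated with a short sketch. This is a hypothetical Python example; the curvature value is an assumption chosen for illustration, not a figure from the cited studies.

```python
# Hypothetical sketch of the basic technique: every frame, a small yaw
# rotation is added to the virtual scene in proportion to the distance
# walked, so the user physically curves back toward the room center
# while perceiving a straight virtual path. The curvature value below
# is illustrative only.
def redirect_step(scene_yaw_deg, walking_speed_mps, dt,
                  curvature_deg_per_m=2.5):
    """Advance the injected scene rotation by one frame."""
    return scene_yaw_deg + curvature_deg_per_m * walking_speed_mps * dt

# Example: walking at 1.4 m/s for 10 s (1000 frames of 10 ms) at an
# assumed 2.5 deg/m accumulates a 35-degree bend in the physical path.
yaw = 0.0
for _ in range(1000):
    yaw = redirect_step(yaw, 1.4, 0.01)
```

Because each per-frame increment is tiny (here 0.035 degrees per 10 ms frame), the rotation stays below perceptual thresholds while its cumulative effect bends the physical path substantially.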
&lt;br /&gt;
By matching real and virtual movements so that the body functions as an input device, redirected walking provides users with rich spatial-sensory feedback, resulting in a greater sense of presence and less chance of becoming disoriented in the VE. Indeed, according to Hodgson and Bachmann (2013), “virtual walking produces the same proprioceptive, inertial, and somatosensory cues that users experience while navigating in the real world.”&lt;br /&gt;
&lt;br /&gt;
The biological basis for redirected walking can be seen in the phenomenon in which someone lost in the woods, for example, walks in circles without realizing it, even when trying to walk in a straight line &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. Souman et al. (2009) demonstrated this phenomenon by showing that tested participants walked in circles when they could not see the sun; even when the sun was visible (providing some sense of orientation), participants sometimes veered from a straight course, although they did not walk in circles in that case &amp;lt;ref&amp;gt; Souman, J. L., Frissen, I., Sreenivasa, M. N. and Ernst, M. O. (2009). Walking Straight into Circles. Current Biology, 19: 1538-1542&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The space required and the maximum rates of imperceptible steering for effective redirected walking depend not only on the limits of human perception but also on other relevant factors that may receive less attention. These include the specific attention demands of the user’s task in the VE, perceptual adaptation (which depends on the duration of sessions and the number of repeated sessions), the nature of the VE (the proximity of objects and the amount of optic flow), individual differences between users, and the walking algorithms used. These algorithms are responsible for the imperceptible rotation of the virtual scene and the scaling of movements that guide users away from the tracking-area boundaries, permitting them to explore large virtual worlds while walking naturally in a limited physical space. In Hodgson and Bachmann (2013), four algorithms were tested: Steer-to-Center, Steer-to-Orbit, Steer-to-Multiple-Targets, and Steer-to-Multiple+Center (Figure 1). They concluded that Steer-to-Center tended to outperform the other algorithms at keeping users within the smallest possible area &amp;lt;ref name=”7”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
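The core of the Steer-to-Center idea can be illustrated with a minimal sketch. The Python code below is hypothetical and greatly simplified relative to the algorithms tested by Hodgson and Bachmann (2013): it only decides the direction in which rotation should be injected, not its magnitude.

```python
import math

# Minimal, hypothetical Steer-to-Center sketch: compute the signed
# angular error between the user's heading and the direction to the
# tracking-space center (the origin), and steer in the direction that
# reduces it.
def steer_to_center(pos_x, pos_y, heading_rad):
    """Return +1.0 to inject counter-clockwise rotation, -1.0 for
    clockwise, or 0.0 if the user is already heading at the center."""
    to_center = math.atan2(-pos_y, -pos_x)        # direction to (0, 0)
    diff = to_center - heading_rad
    error = math.atan2(math.sin(diff), math.cos(diff))  # wrap to [-pi, pi]
    if abs(error) < 1e-9:
        return 0.0
    return math.copysign(1.0, error)
```

The other strategies in the study differ mainly in the target they steer toward (an orbit around the center, or one of several fixed targets) rather than in this basic error-reduction loop.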
&lt;br /&gt;
[[File:RDW.png|thumb|Figure 1. Four different algorithms used in Hodgson and Bachmann (2013) for redirected walking]]&lt;br /&gt;
&lt;br /&gt;
==Limitations of redirected walking==&lt;br /&gt;
&lt;br /&gt;
For redirected walking to become a widely used tool in VR, it needs to be unnoticeable, and users must not be distracted by it. It also must not increase the incidence of simulator sickness or interfere with spatial learning and memory. Studies that have examined the magnitude of redirection that can be applied to a scene without being noticeable suggest that, within specific thresholds, there is no higher incidence of simulator sickness. A more relevant limitation is the minimum physical space required for effective redirected walking. Hodgson et al. (2011) suggest that a tracking area with a diameter of at least 30 m, and up to more than 44 m, is necessary to simulate infinitely large VEs &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. This problem impedes a quick transition of the technology into the living room: not only is the available space more limited, but the space must also be empty (free of furniture, for example). Besides this, it is still an expensive technology for the average consumer, since it requires very accurate position tracking over a large space. If the technology involved in redirected walking keeps developing, it might become a consumer reality, as VR headsets have &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Terms]] [[Category:Technical Terms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=HTC_Vive&amp;diff=10627</id>
		<title>HTC Vive</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=HTC_Vive&amp;diff=10627"/>
		<updated>2016-08-13T15:07:10Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: /* Apps */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=[[File:htv vive cv1 hmd1.jpg|350px]]&lt;br /&gt;
|VR/AR=[[Virtual Reality]]&lt;br /&gt;
|Type=[[Head-mounted display]]&lt;br /&gt;
|Subtype=[[Discrete HMD]]&lt;br /&gt;
|Platform=[[SteamVR]]&lt;br /&gt;
|Developer=[[HTC]], [[Valve]]&lt;br /&gt;
|Operating System=[[Windows]]&lt;br /&gt;
|Requires=PC&lt;br /&gt;
|Predecessor=[[HTC Vive Pre]]&lt;br /&gt;
|Successor=[[HTC Vive CV2]]&lt;br /&gt;
|Display=Dual Panel&lt;br /&gt;
|Resolution=2160 x 1200 (1080 x 1200 per eye)&lt;br /&gt;
|Pixel Density=455.63 PPI per eye&lt;br /&gt;
|Refresh Rate=90 Hz&lt;br /&gt;
|Field of View=110° (diagonal)&lt;br /&gt;
|Optics=Fresnel Lenses&lt;br /&gt;
|Tracking=6DOF&lt;br /&gt;
|Rotational Tracking=[[Gyroscope]], [[Accelerometer]], Laser Position Sensor&lt;br /&gt;
|Positional Tracking=[[Base Stations]]&lt;br /&gt;
|Update Rate=Rotational: 1000Hz, Positional: 60Hz&lt;br /&gt;
|Tracking Volume=120°H x 120°V (over 21 feet range)&lt;br /&gt;
|Latency=??&lt;br /&gt;
|Audio=Built-in headphones, external headphones&lt;br /&gt;
|Camera=Pass-through camera&lt;br /&gt;
|Sensors=&lt;br /&gt;
|Input=Controllers in both hands&lt;br /&gt;
|Connectivity=2 HDMI ports, 2 USB ports, 1 headphone jack&lt;br /&gt;
|Cable Length=5+ meters&lt;br /&gt;
|Release Date=April 5, 2016&lt;br /&gt;
|Price=$799, €899, £689, Business Edition: $1200&lt;br /&gt;
|Website=http://www.htcvr.com/&lt;br /&gt;
}}&lt;br /&gt;
{{see also|HTC Vive Developer Editions}}&lt;br /&gt;
[[HTC Vive]], also known as &#039;&#039;&#039;HTC Vive CV1&#039;&#039;&#039; or simply the &#039;&#039;&#039;Vive&#039;&#039;&#039;, is the first consumer version of the [[HTC Vive (Platform)]] [[Virtual Reality Devices|Virtual Reality HMD]] developed by [[HTC]]. It is part of the [[SteamVR]] ecosystem created by [[Valve]].&lt;br /&gt;
&lt;br /&gt;
Pre-orders for the Vive began on February 29, 2016 at $799, and the Vive was released on April 5, 2016. The HTC Vive CV1 comes with the [[head-mounted display]], 2 [[SteamVR Controllers|wireless, motion-tracked controllers]] and 2 [[lighthouse]] [[Base Stations]] positional sensors that enable [[room-scale VR]]. &lt;br /&gt;
&lt;br /&gt;
On June 9, 2016, HTC announced a Business Edition of the Vive, called Vive BE, for $1200. While the only physical difference between the Vive and the Vive BE is 2 extra face cushions, the Business Edition includes a 12-month limited warranty and a dedicated Vive BE customer support line.&lt;br /&gt;
==Features==&lt;br /&gt;
*[[Room-scale VR]] - Move around freely in a 12-by-12-foot space. Both the HMD and the 2 controllers are accurately tracked within that space.&lt;br /&gt;
&lt;br /&gt;
*[[Wireless SteamVR Controllers]] - Input with 2 motion-tracked controllers, 1 held in each hand. &lt;br /&gt;
&lt;br /&gt;
*[[Chaperone]] - Prevents the user from bumping into real life walls and other obstacles.&lt;br /&gt;
&lt;br /&gt;
*Front-facing camera - Allows the user to see the real-life environment in front of them while wearing the HMD.&lt;br /&gt;
&lt;br /&gt;
*[[#Connecting to Your Phone|Smartphone connectivity]] - Connect the HMD to your smartphone via Bluetooth, allowing the user to receive calls, messages and reminders, and to return calls.&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
===Review===&lt;br /&gt;
&#039;&#039;&#039;Design and Ergonomics&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
The all-black HMD is sleek and sturdy. It is secured to the user&#039;s head with a harness-like series of straps. The user&#039;s face contacts the HMD through a soft and comfortable facial interface. The foam gasket part of the interface can be removed and replaced by lifting it from the velcro. The Vive comes with 2 foam gaskets: the larger &amp;quot;Wide Face&amp;quot; and the smaller &amp;quot;Thin Face&amp;quot;. You can change the distance between the [[lenses]] to fit your [[IPD]] with the dial on the right side of the HMD. [[Eye relief]], the distance between the lenses and your eyes, can be adjusted with 2 gray knobs that you can pull out and rotate to extend or retract the sides of the HMD. While this function allows the Vive to accommodate most glasses, be aware that increasing the eye relief negatively affects the [[FOV]]. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Display and Optics&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
Vive features dual OLED panel displays of 1080 x 1200 per eye. The colors are vibrant, the [[resolution]] is adequate and the [[screen door effect]] is minimal. The only glaring flaw of Vive&#039;s display and optics system is the [[god rays]]. The god rays are caused by the ridges of the [[Fresnel lenses]], which scatter light. They look similar to lens flares and are noticeable whenever there are high-contrast elements on the screen, e.g. white text on a black background. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Front facing camera&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
A camera is located at the front bottom of the HMD. The camera can bring up a wide-[[FOV]] view of the environment in front of you. Players can activate the camera by double-tapping the SteamVR controller&#039;s &amp;quot;System&amp;quot; button or set it to activate automatically when they wander too close to the [[Chaperone]] boundaries. Working in conjunction with Chaperone, the camera creates another layer of safety for players moving around while wearing the HMD. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Tracking&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
[[Tracking]] in Vive has no visible [[latency]]. The [[Tracking#Systems|tracking system]] employed by HTC Vive is called [[Lighthouse]]. While [[rotational tracking]] is achieved with [[IMUs]], [[positional tracking]] is accomplished with 2 IR base stations called [[Base Stations]]. The Base Stations constantly flood the room with IR signals that are detected by sensors on the HMD and [[SteamVR Controllers]]. The HMD and controllers then work out where they are in relation to the Base Stations ([[inside-out tracking]]). Vive&#039;s tracking is designed for [[Seated VR|seated]], [[standing VR|standing]] and, most importantly, [[Room-scale VR]] experiences. &lt;br /&gt;
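A rough illustration of how timed sweeps can yield angles is sketched below. This is a hypothetical Python example, not HTC's actual implementation; the 60 Hz rotor speed is an assumption made for the example.

```python
# Hypothetical sketch of Lighthouse-style angle recovery (not HTC's
# actual code): a base station emits a sync flash, then sweeps a laser
# plane across the room with a rotor assumed here to spin at 60 Hz.
# The angle to a photodiode on the HMD or controller follows from the
# delay between the sync flash and the moment the laser hits it.
SWEEP_HZ = 60.0  # assumed rotor speed

def sweep_angle_deg(t_sync_s, t_hit_s):
    """Angle swept between the sync flash and the laser hitting the
    sensor, in degrees."""
    return (t_hit_s - t_sync_s) * SWEEP_HZ * 360.0
```

Given one such angle per sweep axis from each base station, plus the known positions of the sensors on the device, the headset can solve for its position and orientation relative to the stations.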
&lt;br /&gt;
Vive is designed to accurately track the position and orientation of the HMD and [[SteamVR Controllers|controllers]] within a 12-by-12-foot space.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Voice and Audio&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
Vive has a built-in mic but does not have built-in headphones. It comes with a set of earbuds that can be plugged into a headphone jack extending from the HMD.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Cables and Ports&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
===In The Box===&lt;br /&gt;
[[File:htc vive cv1 in the box1.jpg|400px|right]]&lt;br /&gt;
A. 2 [[Base Stations]]&lt;br /&gt;
&lt;br /&gt;
B. Sync cable - optional&lt;br /&gt;
&lt;br /&gt;
C. 2 Base station power adapters&lt;br /&gt;
&lt;br /&gt;
D. Mount kit&lt;br /&gt;
&lt;br /&gt;
E. Link box&lt;br /&gt;
&lt;br /&gt;
F. Link box mounting pad&lt;br /&gt;
&lt;br /&gt;
G. Link box power adapter&lt;br /&gt;
&lt;br /&gt;
H. HDMI cable&lt;br /&gt;
&lt;br /&gt;
I. USB cable&lt;br /&gt;
&lt;br /&gt;
J. Earbuds&lt;br /&gt;
&lt;br /&gt;
K. Alternate face cushion (narrow)&lt;br /&gt;
&lt;br /&gt;
L. Cleaning cloth&lt;br /&gt;
&lt;br /&gt;
M. Documentation&lt;br /&gt;
&lt;br /&gt;
N. Headset with 3-in-1 cable and audio cable&lt;br /&gt;
&lt;br /&gt;
O. 2 Controllers (with lanyard)&lt;br /&gt;
&lt;br /&gt;
P. 2 Micro-USB chargers&lt;br /&gt;
&lt;br /&gt;
===Specifications===&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
!Part&lt;br /&gt;
!Spec&lt;br /&gt;
|-&lt;br /&gt;
|Display || Dual OLED Panels&lt;br /&gt;
|-&lt;br /&gt;
|Resolution || 2160 x 1200 (1080 x 1200 per eye)&lt;br /&gt;
|-&lt;br /&gt;
|Pixel density || 455.63 PPI per eye&lt;br /&gt;
|-&lt;br /&gt;
|Refresh rate || 90 Hz&lt;br /&gt;
|-&lt;br /&gt;
|Persistence || Low&lt;br /&gt;
|-&lt;br /&gt;
|Field of View || 110° (diagonal)&lt;br /&gt;
|-&lt;br /&gt;
|Optics || [[Fresnel lenses]]&lt;br /&gt;
|-&lt;br /&gt;
|[[IPD]] || 60.2-74.5mm &lt;br /&gt;
|-&lt;br /&gt;
|[[Tracking]] || 6 degrees of freedom&lt;br /&gt;
|-&lt;br /&gt;
|[[Rotational tracking]] || [[Gyroscope]], [[Accelerometer]], [[Magnetometer]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Positional tracking]] || [[Base Stations]]&lt;br /&gt;
|-&lt;br /&gt;
|Update Rate || Rotational: 1000Hz, Positional: 60Hz&lt;br /&gt;
|-&lt;br /&gt;
|[[#Tracking volume|Tracking Volume]] || 120°H x 120°V (&amp;gt;21 feet range)&lt;br /&gt;
|-&lt;br /&gt;
|Latency || &lt;br /&gt;
|-&lt;br /&gt;
|Connectivity || Multi-part cable with HDMI, USB and power that is connected to a junction box&lt;br /&gt;
|-&lt;br /&gt;
|Weight || 555 grams (1.2 pounds)&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==System Requirements==&lt;br /&gt;
===Recommended===&lt;br /&gt;
*&#039;&#039;&#039;GPU&#039;&#039;&#039;: NVIDIA® GeForce® GTX 970, AMD Radeon™ R9 290 equivalent or better&lt;br /&gt;
*&#039;&#039;&#039;CPU&#039;&#039;&#039;: Intel® i5-4590 / AMD FX 8350 equivalent or better&lt;br /&gt;
*&#039;&#039;&#039;RAM&#039;&#039;&#039;: 4 GB or more&lt;br /&gt;
*&#039;&#039;&#039;Video Output&#039;&#039;&#039;: HDMI 1.4, DisplayPort 1.2 or newer&lt;br /&gt;
*&#039;&#039;&#039;USB Port&#039;&#039;&#039;: 1x USB 2.0 or better port&lt;br /&gt;
*&#039;&#039;&#039;Operating System&#039;&#039;&#039;: Windows 7 SP1, Windows 8.1 or later, Windows 10&lt;br /&gt;
&lt;br /&gt;
===SteamVR Performance Test===&lt;br /&gt;
[http://store.steampowered.com/app/323910 SteamVR Performance Test] is a benchmark that checks whether your system is ready for the Vive. It checks your system&#039;s OS, GPU and CPU to see whether it is capable of running VR at 90 FPS and whether VR content can tune the visual fidelity up to the recommended level.&lt;br /&gt;
&lt;br /&gt;
==Play Area Requirements==&lt;br /&gt;
{{see also|#Play Area Setup}}&lt;br /&gt;
The play area sets the virtual boundaries of Vive. Your interactions with VR objects happen within the play area. &lt;br /&gt;
*Make sure your play area is free of furniture and other obstacles. &lt;br /&gt;
*Place your PC next to the play area because the cable connecting your HMD to your PC is 5 meters long. &lt;br /&gt;
*Make sure your base stations are mounted near power outlets.&lt;br /&gt;
===Room-scale VR Requirements===&lt;br /&gt;
For [[Room-scale VR]], you need a space where you can move freely.&lt;br /&gt;
*Minimum room size: 2 m x 1.5 m (6 feet 6 inches x 5 feet)&lt;br /&gt;
*Maximum distance between base stations: 5 m (16 feet); room size: 3.5 m x 3.5 m (11 feet 7 inches x 11 feet 7 inches)&lt;br /&gt;
**Base stations can track further, but the headset cable is 5 meters long.&lt;br /&gt;
&lt;br /&gt;
===Standing/Seated VR Requirements===&lt;br /&gt;
*No minimum space requirements&lt;br /&gt;
&lt;br /&gt;
==Setup Tutorial==&lt;br /&gt;
===HMD Setup===&lt;br /&gt;
&#039;&#039;&#039;Putting the HMD on&#039;&#039;&#039;:&lt;br /&gt;
#Pull the HMD down over your eyes.&lt;br /&gt;
#Slide the straps around the back of your head, and adjust them so that the headset fits.&lt;br /&gt;
#Make sure that the 3-in-1 cables pass through the sleeve at the back of the headset, and are positioned straight down your back.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Adjusting the Head Straps to Fit Perfectly&#039;&#039;&#039;:&lt;br /&gt;
#Walk towards your chaperone boundaries with the Vive on. Get really close to them.&lt;br /&gt;
#Adjust the Vive left and right until the vertical lines don&#039;t have &amp;quot;god rays&amp;quot;. They should become solid and not look smudgy. Once you find a good placement, tighten down the side straps.&lt;br /&gt;
#Adjust the Vive up and down until the horizontal chaperone lines don&#039;t look blurry/smudged and don&#039;t have god rays. Tighten the top strap until it&#039;s a nice snug fit.&lt;br /&gt;
#Turn the IPD all the way up; you&#039;ll probably notice the chaperone lines getting blurry. Once it&#039;s at max, bring it back down and they&#039;ll come into focus. Once they start getting blurry again, open the IPD back up a bit, like focusing binoculars.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Adjust IPD&#039;&#039;&#039;: [[IPD]] is the distance between the centers of the pupils of your eyes. You can adjust the distance between the [[lenses]] to match the distance between your pupils by turning the knob on the right side of the HMD.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Adjust Eye relief&#039;&#039;&#039;: [[Eye-relief]] is the distance between the lenses and your eyes. Keep the lenses as close as possible because increasing the eye-relief lowers the [[field of view]]. Only increase it if you really need to, such as fitting eye glasses. You can adjust your eye-relief with the 2 large, circular knobs on the sides of HMD. Pull the knobs out then rotate them to increase or decrease the distance. Press them back in to lock the HMD in place.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Replace face cushion&#039;&#039;&#039;: If the face cushion is too wide, replace it with the thin cushion. You can exchange the cushions by peeling the velcro off the two ends.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Replace nose rest&#039;&#039;&#039;: Remove the nose rest by peeling off its flaps. Reattach the nose rest by pressing the tabs into their slots, making sure that the flaps are behind the face cushion.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Unplugging the 3-in-1 cable&#039;&#039;&#039;: Slide open the small compartment on top to reveal the cables. Pull the tag to unplug the HDMI cable. Unplug the other cables (USB and power). Slide the cover back on to close the compartment.&lt;br /&gt;
&lt;br /&gt;
===Link Box Setup===&lt;br /&gt;
#Connect the 3-in-1 cable from the HMD to the orange trim side of the link box.&lt;br /&gt;
#Connect the power adapter cable to its corresponding port on the link box, then plug the other side of the cable to an electrical outlet.&lt;br /&gt;
#Insert the HDMI cable to the HDMI port on the link box, and then insert the other end to the HDMI port on your computer&#039;s graphic card.&lt;br /&gt;
#Insert the USB cable on the USB port on the link box, and then insert the opposite end on your computer&#039;s USB port.&lt;br /&gt;
#You can permanently secure the link box to an area by using the adhesive pad.&lt;br /&gt;
&lt;br /&gt;
===Controllers Setup===&lt;br /&gt;
&#039;&#039;&#039;Charging&#039;&#039;&#039;: Charge the Controller with the microUSB cable and/or power adapter. You can check the battery level of the controllers when no apps are running, or when the System Dashboard is up.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;On/Off&#039;&#039;&#039;: To turn the Controller on, press the System button until you hear a beep. To turn the Controller off, press and hold the System button until you hear a beep. The Controller turns off automatically when it has been idle for a long time or when you exit SteamVR.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Pairing with HMD&#039;&#039;&#039;: The Controller pairs with the HMD automatically when it is on. To pair manually, launch the SteamVR app, tap Down, and then select Devices, then Pair Controller. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Updating Firmware&#039;&#039;&#039;: When SteamVR notifies you about outdated firmware, connect the Controller to your PC with the microUSB cable to update automatically.&lt;br /&gt;
&lt;br /&gt;
===Base Stations Setup===&lt;br /&gt;
&#039;&#039;&#039;Placing the base stations&#039;&#039;&#039;: &lt;br /&gt;
#Mount the base stations diagonally at the opposite corners of your [[#Play Area Requirements|play area]], ideally more than 2m (6 ft 6in) above ground.&lt;br /&gt;
#Use mounting kits, tripods, bookshelves, poles or light stands to mount the base stations. Find stable places and secure them so they cannot be easily moved or jostled. &lt;br /&gt;
#Make sure the front of each base station faces the center of the play area. Each base station has a 120-degree [[FOV]]; tilt them down about 30 to 45 degrees to fully cover the play area.&lt;br /&gt;
#Connect the base stations to power outlets with power cables. &lt;br /&gt;
#Connect the base stations and set channels:&lt;br /&gt;
##Without Sync Cable: Press the Channel buttons at the back of the base stations so that one base station is set to channel “b”, while the other is set to channel “c”.&lt;br /&gt;
##With Sync Cable: Press the Channel buttons at the back of the base stations so that one base station is set to channel “A”, while the other is set to channel “b”.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using the mounting kits&#039;&#039;&#039;:&lt;br /&gt;
#Mark where you want to install each of the mounts on your wall, and then screw the mounts in. &lt;br /&gt;
#Rotate the base station to screw it onto the threaded ball joint. Do not screw the base station all the way in, only enough to be stable and oriented correctly.&lt;br /&gt;
#Tighten the wingnut to the base station to secure it in place.&lt;br /&gt;
#To adjust the angle of the base station, loosen the clamping ring while carefully holding the base station to prevent it from falling.&lt;br /&gt;
#Tilt the base station toward the play area.&lt;br /&gt;
#Connect the power cables to each base station and its power outlet.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Update base station firmware&#039;&#039;&#039;: When SteamVR indicates that base station firmware is out of date, unplug and unmount the base stations. Connect the base stations to the PC with a microUSB cable, one at a time. While pressing the Channel button at the back of the base station, plug in the base station’s power adapter. Update should start automatically once SteamVR detects the base station.&lt;br /&gt;
&lt;br /&gt;
===Play Area Setup===&lt;br /&gt;
{{see also|#Play Area Requirements}}&lt;br /&gt;
*The minimum space for a room-scale VR experience is 2m x 1.5m (6ft 6in x 5ft). Standing and seated VR experiences do not require much space. &lt;br /&gt;
&lt;br /&gt;
*Clear the play area of furniture and other obstacles. &lt;br /&gt;
&lt;br /&gt;
*Place the PC near the play area. The HMD cable is 5m (16ft 4in) long.&lt;br /&gt;
&lt;br /&gt;
*Place the base stations diagonal to each other, on the opposite corners of the play area space. Make sure there are power outlets near the base stations. Use 12V extension cords as needed.&lt;br /&gt;
&lt;br /&gt;
*Do not leave your HMD in direct sunlight, as the display can be damaged.&lt;br /&gt;
&lt;br /&gt;
===Software Setup===&lt;br /&gt;
#Download the setup from http://www.htcvive.com/SETUP&lt;br /&gt;
#Install Vive software&lt;br /&gt;
#Install Steam software&lt;br /&gt;
#Install SteamVR&lt;br /&gt;
#Launch SteamVR&lt;br /&gt;
#Pair HMD and Controllers from the SteamVR menu.&lt;br /&gt;
&lt;br /&gt;
===Setup Tips and Tricks===&lt;br /&gt;
*During Play Area Setup, the system asks you to point your controller at your computer monitor. This establishes the forward position in VR as roughly 180 degrees from the monitor: the system assumes your computer is in the direction of the monitor, and facing away from it lets the cables run smoothly from the back of your head to the computer.&lt;br /&gt;
&lt;br /&gt;
*During Play Area Setup, if you have trouble setting up the floor area, try placing your controllers upside down. This can give more accurate readings.&lt;br /&gt;
&lt;br /&gt;
*Be sure to turn on &#039;&#039;Enable Bluetooth communication&#039;&#039; in the General tab of the Settings. It not only allows you to connect your HMD to your phone but also makes your Base Stations &amp;quot;smarter&amp;quot;: when SteamVR is not running, the Base Stations power down into standby mode, so you no longer hear the humming noise created by the spinning rotors inside them.&lt;br /&gt;
&lt;br /&gt;
*Take a look at the Audio tab in the Settings. You can do things such as mirror your audio from VR in the speakers.&lt;br /&gt;
&lt;br /&gt;
*In Settings, you can enable the front facing camera in your HMD.&lt;br /&gt;
&lt;br /&gt;
*In VR, the bottom of each controller shows not only a battery indicator but also a left or right hand sign indicating which hand that controller is for.&lt;br /&gt;
&lt;br /&gt;
*[[SteamVR Desktop Theater Mode]], which you can activate only in VR, allows you to play non-VR Steam games on a big screen in VR.&lt;br /&gt;
&lt;br /&gt;
==Input Devices==&lt;br /&gt;
[[SteamVR Controllers]] - 1 held in each hand, these controllers are tracked by the same system as the HMD ([[Lighthouse]]). &lt;br /&gt;
&lt;br /&gt;
Other input devices compatible with [[Steam]] can also be used.&lt;br /&gt;
&lt;br /&gt;
==Accessories==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot; style= &amp;quot;text-align: center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Accessory&lt;br /&gt;
! Cost When&amp;lt;br&amp;gt;Purchased Separately&lt;br /&gt;
|-&lt;br /&gt;
|[[SteamVR Base Stations|Base Station]] with AC || $135&lt;br /&gt;
|-&lt;br /&gt;
|[[SteamVR Controllers|Controller]] with AC || $130&lt;br /&gt;
|-&lt;br /&gt;
|HDMI 3-in-1 Cable || $40&lt;br /&gt;
|-&lt;br /&gt;
|Control Box (no AC) || $30&lt;br /&gt;
|-&lt;br /&gt;
|USB 2.0 Cable (AA) 4.5mm || $10&lt;br /&gt;
|-&lt;br /&gt;
|Face Cushion Set of 2 (Narrow) || $25 &lt;br /&gt;
|-&lt;br /&gt;
|Face Cushion Set of 2 (Wide) || $25 &lt;br /&gt;
|-&lt;br /&gt;
|Nose Rest Set of 3 (Narrow) || $13 &lt;br /&gt;
|-&lt;br /&gt;
|Nose Rest Set of 3 (Wide) || $13 &lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
Second Bundle (August 12, 2016 and onward): HTC Vive CV1 is shipped with 3 free games: [[Tilt Brush]] by [[Google]], [[The Gallery - Episode 1: Call of the Starseed]] by [[Cloudhead Games]] and [[Zombie Training Simulator]] by [[Acceleroto]].&lt;br /&gt;
&lt;br /&gt;
First Bundle (no longer available): HTC Vive CV1 is shipped with 3 free games: [[Job Simulator: The 2050 Archives]] by [[Owlchemy Labs]], [[Fantastic Contraption]] by [[Northway Games]] and [[Tilt Brush]] by [[Google]].&lt;br /&gt;
&lt;br /&gt;
[[The Lab]] by [[Valve]] is available for free to all Vive users.&lt;br /&gt;
&lt;br /&gt;
==Connecting to Your Phone==&lt;br /&gt;
Users can install the Vive mobile app on their phone from the App Store&amp;lt;ref&amp;gt;https://itunes.apple.com/us/app/htc-vive/id1091173853?mt=8&amp;lt;/ref&amp;gt; or Google Play&amp;lt;ref&amp;gt;https://play.google.com/store/apps/details?id=com.htc.vivephoneservice&amp;amp;hl=en&amp;lt;/ref&amp;gt;. The app enables your HTC Vive HMD to connect to your mobile phone through Bluetooth, allowing your Vive HMD to receive calls, texts and calendar reminders while you are in [[VR]].&lt;br /&gt;
&lt;br /&gt;
To make phone calls, users need to download the Vive software package &amp;lt;ref&amp;gt;https://www.htcvive.com/us/setup/&amp;lt;/ref&amp;gt; for their PC. When you receive a call or text, you&#039;ll then be able to call the individual back through the Vive menu.&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==Tracking volume==&lt;br /&gt;
{{see also|Tracking volume}}&lt;br /&gt;
[[File:htc vive tracking volume1.png|400px]]&lt;br /&gt;
&lt;br /&gt;
120°H x 120°V (&amp;gt;21 feet range)&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&#039;&#039;&#039;February 29, 2016&#039;&#039;&#039; - Pre-orders for HTC Vive CV1 begin.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;April 5, 2016&#039;&#039;&#039; - HTC Vive CV1 is officially released.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;April 26, 2016&#039;&#039;&#039; - [[Vive X]], a $100 million [[HTC Vive]] accelerator program, was announced. Vive X will be involved in Seed and Series A investments and is located in Beijing, Taipei and San Francisco.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;June 9, 2016&#039;&#039;&#039; - Business Edition of Vive is announced.&lt;br /&gt;
&lt;br /&gt;
==Images==&lt;br /&gt;
[[File:htc vive cv1 set1.jpg|300px]] [[File:htc vive cv1 hmd2.jpg|300px]] [[File:htc vive cv1 controllers1.jpg|300px]]&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Devices]] [[Category:Virtual Reality Devices]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Elite_Dangerous&amp;diff=10588</id>
		<title>Elite Dangerous</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Elite_Dangerous&amp;diff=10588"/>
		<updated>2016-08-03T14:47:24Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|dwvjElmFCfE|350}}&lt;br /&gt;
|Developer=[[Frontier Developments]]&lt;br /&gt;
|Publisher=[[Frontier Developments]]&lt;br /&gt;
|Director=[[David Braben]]&lt;br /&gt;
|Producer=[[Michael Brookes]]&lt;br /&gt;
|Platform=[[Oculus Rift]], [[SteamVR]]&lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=[[Windows 7]], [[Windows 8]]&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Genre=[[Action]], [[Adventure]], [[RPG]], [[Simulation]], [[Strategy]]&lt;br /&gt;
|Input Device=[[Gamepad]], Keyboard/Mouse&lt;br /&gt;
|Game Mode=[[Single Player]], [[Multiplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=Update 2.1.05&lt;br /&gt;
|Rating=80/100&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=April 2, 2015&lt;br /&gt;
|Price= $30.00 + $34.00 (expansion)&lt;br /&gt;
|Website=[https://www.elitedangerous.com/ Elite Dangerous]&lt;br /&gt;
|Infobox Updated=8/3/2016&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Review==&lt;br /&gt;
&#039;&#039;Elite Dangerous has been out for a rather long time to relatively mixed user reviews – with consumer versions of VR headsets now out in the wild, will the game be revitalized?&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Frontier Developments’ Elite Dangerous is a rather difficult game to review, especially for someone with fond memories of older multiplayer spaceship-based games, such as Freelancer. &lt;br /&gt;
&lt;br /&gt;
A lot of people may bounce off due to its complexity and the rather perplexing lack of direction. While the former is perfectly fine, I feel more could have been done to help players choose the direction they want to take their game in. Indeed, there are a lot of things to do, but little in the way of helping you discover them. That is not to say that Elite Dangerous should hold its players&#039; hands at all times, but even a series such as Dark Souls, touted for its difficulty and hardcore nature, gives you more hints and direction than Elite sometimes does. &lt;br /&gt;
&lt;br /&gt;
Being told you can go anywhere you want and do whatever you want can be very liberating, but, ironically, it can also feel restrictive and the game certainly suffers as a result. &lt;br /&gt;
&lt;br /&gt;
What is more – the 1.0 release back in December 2014 was plainly unfinished. A lot of promised mechanics were missing, and the ones that were there often did not work properly. A promised offline mode was scrapped (you can play solo, but you must always be online), and much-hyped features such as the galactic economy and evolving politics were broken. None of this was helped by a learning curve more akin to a “learning cliff”; a lot of people were frustrated, and understandably so. &lt;br /&gt;
&lt;br /&gt;
Frontier Developments has, however, taken the game a long way since then, and is apparently planning much more. One of the biggest problems was indeed the lack of meaningful player interaction. Things such as multi-crew ships and proper alliances are exactly what people have been asking for. While one shouldn’t expect Elite to suddenly turn into an EVE-like affair, it remains a step in the right direction. &lt;br /&gt;
And yet, Elite Dangerous remains a fascinating experience. Yes, an experience. Wherever the game and its mechanics might fall flat, the experience of actually flying a starship through space holds up beautifully. It would be hard to think of another game that sells this experience quite so well. From the visuals to the sound assets, everything feels authentic. &lt;br /&gt;
&lt;br /&gt;
What about VR headsets then? This is certainly one of the games people dream up when they think of VR. Being in a cockpit bypasses any of the usual problems of seated VR – after all, your character is also seated. If you combine the headset with a HOTAS Flight Stick and some voice commands, Elite Dangerous turns from an interesting but flawed space game into ”Oh my god I am flying my own spaceship!”.&lt;br /&gt;
&lt;br /&gt;
The sense of scale of the space stations, the vast emptiness of space, the stars and planetary bodies that you fly by and land on, the way your character’s hands on the flight stick mimic your own moves, all of these things were already impressive before – with an Oculus or Vive, it becomes the dream of anyone who ever imagined something like this back when the original Elite was out in 1984. &lt;br /&gt;
  &lt;br /&gt;
Elite Dangerous has occasionally been described as “Euro Truck Simulator in space” – and it is true. You could be a peaceful trader, carrying cargo across the stars, avoiding any of that dogfighting nonsense that some people seem to be into, and it would be a very similar experience. &lt;br /&gt;
&lt;br /&gt;
There is a lot that remains unsaid here. None of Elite Dangerous’ mechanics are terribly deep, but the range of things you can do is quite impressive. From the way power management works, to how you smuggle illegal goods into stations, ship choices, outfitting, and so forth – not to mention the more recent Horizons expansion, which introduced the ability to land on planetary bodies. &lt;br /&gt;
&lt;br /&gt;
Overall, Elite Dangerous suffers from the same problems its original 1984 predecessor did. If you are not the kind of person who likes setting their own goals, your experience will suffer. If that does not deter you though, or even if you are simply looking for trucking in space, Elite Dangerous, especially with VR, is the only game that can offer such an experience. At least, the only fully released one.&lt;br /&gt;
&lt;br /&gt;
Elite Dangerous is out now on Steam. It will set you back $30 or your regional equivalent. The expansion will cost an extra $34.&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Games]] [[Category:VR Apps]] [[Category:RPG]] [[Category:Strategy]] [[Category:Simulation]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Test_2&amp;diff=10578</id>
		<title>Test 2</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Test_2&amp;diff=10578"/>
		<updated>2016-08-03T12:43:09Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Created page with &amp;quot;AR - [http://arstechnica.com/gadgets/2016/01/2016-google-tracker-everything-google-is-working-on-for-the-new-year/5/ reference] - scroll down  Different types of Storytell...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[AR]] - [http://arstechnica.com/gadgets/2016/01/2016-google-tracker-everything-google-is-working-on-for-the-new-year/5/ reference] - scroll down&lt;br /&gt;
&lt;br /&gt;
Different types of Storytelling in VR - [http://www.roadtovr.com/four-different-types-stories-vr/ reference 1]&lt;br /&gt;
&lt;br /&gt;
[[Advertising]] - [[Retinad]] - [[Privacy]]&lt;br /&gt;
&lt;br /&gt;
[[Facial tracking]] - [[Hao Li]], [[Martin Breidt]]&lt;br /&gt;
&lt;br /&gt;
[[Magic Leap]] - [https://cdn1.vox-cdn.com/uploads/chorus_asset/file/4296725/61845907_1_.0.pdf reference 1]&lt;br /&gt;
&lt;br /&gt;
==App Stores==&lt;br /&gt;
SteamVR / Steam Store: http://store.steampowered.com/search/?#sort_by=_ASC&amp;amp;vrsupport=101%2C102&amp;amp;page=1&lt;br /&gt;
&lt;br /&gt;
Cardboard / Google Play: https://store.google.com/category/virtual_reality&lt;br /&gt;
&lt;br /&gt;
==Terms Needed==&lt;br /&gt;
[[SpatialOS]] - [http://www.wired.co.uk/news/archive/2016-01/12/simulating-the-real-world-online reference 1] [https://www.youtube.com/watch?v=xJFwaY30rvM reference 2] - distributed operating system that enables developers to build massive, detailed simulations across thousands of machines in the cloud.&lt;br /&gt;
&lt;br /&gt;
[[A-Frame]] - [https://aframe.io/blog/2015/12/16/introducing-aframe/ reference 1], [https://aframe.io/docs/guide/ reference 2] - created by MozVR, A-Frame is an open source library for creating WebVR without having to know WebGL.&lt;br /&gt;
&lt;br /&gt;
==Technical Articles==&lt;br /&gt;
Rotational tracking using [[IMU]]s - https://developer.oculus.com/blog/sensor-fusion-keeping-it-simple/&lt;br /&gt;
&lt;br /&gt;
VR/AR developer&#039;s blog - http://doc-ok.org/, such as [http://doc-ok.org/?p=1478 lighthouse tracking examined]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10572</id>
		<title>Out of Ammo</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10572"/>
		<updated>2016-08-01T14:55:06Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|3-vpLpgOEQM|350}}&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Developer=[[RocketWerkz]]&lt;br /&gt;
|Publisher=&lt;br /&gt;
|Platform=[[Oculus Rift (Platform)]], [[SteamVR]], &lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Strategy]], [[Shooter]]&lt;br /&gt;
|Input Device=[[Oculus Touch]], [[SteamVR Controllers]]&lt;br /&gt;
|Game Mode=[[Singleplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=[http://store.steampowered.com/app/451840/ Out of Ammo on Steam]&lt;br /&gt;
|Infobox Updated=8/1/2016&lt;br /&gt;
}}&lt;br /&gt;
[[Out of Ammo]] is an intense [[virtual reality]] [[RTS]] and [[FPS]] [[game]] that puts the player in the role of a military commander against waves of incoming enemy soldiers &amp;lt;ref name=”1”&amp;gt; TheTwiit (2016). Out of Ammo – VR strategy game review. Retrieved from www.thetwitt.com/reviews/game/out-of-ammo-vr-strategy-game-review&amp;lt;/ref&amp;gt;. It was released on April 15, 2016, as a [[Steam Early Access]] game exclusive to the HTC Vive. The VR game is developed by RocketWerkz studio (based in New Zealand), founded in 2014 by Dean Hall who also created [[DayZ]]. It is an attempt to create something smaller and simpler than DayZ, since this is their first attempt in virtual reality &amp;lt;ref name=”2”&amp;gt; McWhertor, Michael (2016). DayZ creator unveils new shooter strategy game Out of Ammo. Retrieved from www.polygon.com/2016/4/15/11439936/dean-hall-out-of-ammo-rocketwerkz-htc-vive-steam-early-access&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Moddb (2016). DayZ developer Dean Hall launches new VR shooter Out of Ammo. Retrieved from www.moddb.com/games/out-of-ammo&amp;lt;/ref&amp;gt;. The developer studio mentioned that the reasoning behind launching the game in Steam Early Access was “to refine the experience and better explore what virtual reality can offer for the game.” &amp;lt;ref name=”4”&amp;gt; Steam. Out of Ammo. Retrieved from store.steampowered.com/app/451840&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The VR game has a mixture of [[strategy]] with [[first-person shooter]] mechanics, and aesthetically it is reminiscent of [[Minecraft]] due to the blocky nature of the graphics &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. The player can manipulate objects and soldiers in order to build defenses, all with gestures integrated with the [[SteamVR Controllers]] for the [[Vive]] &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Players then face incoming waves of enemy soldiers of increasing difficulty, and have to construct defenses, issue orders to the troops, and possess units to assume direct control. When a player takes direct control of a unit, the game transitions into a first-person shooter view &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Sykes, Tom (2016). Dean Hall’s RocketWerkz releases Vive-exclusive Out of Ammo. Retrieved from www.pcgamer.com/dean-halls-rocketwerkz-releases-vive-exclusive-out-of-ammo&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Currently, there are eight unique environments and five playable units in the early access build, and according to RocketWerkz more will be added in the future &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Out of Ammo has been generally well received, although some have suggested that it still requires development and an increase in complexity &amp;lt;ref&amp;gt; Meer, Alec (2016). Impressions: DayZ blokey Dean Hall’s Out of Ammo. Retrieved from www.rockpapershotgun.com/2016/04/27/out-of-ammo-early-access-review&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Features==&lt;br /&gt;
*Eight unique freeplay environments to try your skill on&lt;br /&gt;
*Two specialist mission modes&lt;br /&gt;
*Cooperative multiplayer in freeplay for up to four players&lt;br /&gt;
*Construct defenses such as sandbags and watchtowers&lt;br /&gt;
*Issue orders to your soldiers&lt;br /&gt;
*Five different kinds of units each with special abilities&lt;br /&gt;
*Possess any units directly to control their engagement&lt;br /&gt;
*Call in artillery, airstrikes, and sniper targets&lt;br /&gt;
*Fixed machine guns, grenades, flares, and more&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Games]] [[Category:VR Apps]] [[Category:Shooter]] [[Category:Strategy]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10571</id>
		<title>Out of Ammo</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10571"/>
		<updated>2016-08-01T14:54:50Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|3-vpLpgOEQM|350}}&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Developer=[[RocketWerkz]]&lt;br /&gt;
|Publisher=&lt;br /&gt;
|Platform=[[Oculus Rift (Platform)]], [[SteamVR]], &lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Strategy]], [[Shooter]]&lt;br /&gt;
|Input Device=[[Oculus Touch]], [[SteamVR Controllers]]&lt;br /&gt;
|Game Mode=[[Singleplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=[http://store.steampowered.com/app/451840/ Out of Ammo on Steam]&lt;br /&gt;
|Infobox Updated=8/1/2016&lt;br /&gt;
}}&lt;br /&gt;
[[Out of Ammo]] is an intense [[virtual reality]] [[RTS]] and [[FPS]] [[game]] that puts the player in the role of a military commander against waves of incoming enemy soldiers &amp;lt;ref name=”1”&amp;gt; TheTwiit (2016). Out of Ammo – VR strategy game review. Retrieved from www.thetwitt.com/reviews/game/out-of-ammo-vr-strategy-game-review&amp;lt;/ref&amp;gt;. It was released on April 15, 2016, as a [[Steam Early Access]] game exclusive to the HTC Vive. The VR game is developed by RocketWerkz studio (based in New Zealand), founded in 2014 by Dean Hall who also created [[DayZ]]. It is an attempt to create something smaller and simpler than DayZ, since this is their first attempt in virtual reality &amp;lt;ref name=”2”&amp;gt; McWhertor, Michael (2016). DayZ creator unveils new shooter strategy game Out of Ammo. Retrieved from www.polygon.com/2016/4/15/11439936/dean-hall-out-of-ammo-rocketwerkz-htc-vive-steam-early-access&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Moddb (2016). DayZ developer Dean Hall launches new VR shooter Out of Ammo. Retrieved from www.moddb.com/games/out-of-ammo&amp;lt;/ref&amp;gt;. The developer studio mentioned that the reasoning behind launching the game in Steam Early Access was “to refine the experience and better explore what virtual reality can offer for the game.” &amp;lt;ref name=”4”&amp;gt; Steam. Out of Ammo. Retrieved from store.steampowered.com/app/451840&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The VR game has a mixture of [[strategy]] with [[first-person shooter]] mechanics, and aesthetically it is reminiscent of [[Minecraft]] due to the blocky nature of the graphics &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. The player can manipulate objects and soldiers in order to build defenses, all with gestures integrated with the [[SteamVR Controllers]] for the [[Vive]] &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Players then face incoming waves of enemy soldiers of increasing difficulty, and have to construct defenses, issue orders to the troops, and possess units to assume direct control. When a player takes direct control of a unit, the game transitions into a first-person shooter view &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Sykes, Tom (2016). Dean Hall’s RocketWerkz releases Vive-exclusive Out of Ammo. Retrieved from www.pcgamer.com/dean-halls-rocketwerkz-releases-vive-exclusive-out-of-ammo&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Currently, there are eight unique environments and five playable units in the early access build, and according to RocketWerkz more will be added in the future &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Out of Ammo has been generally well received, although some have suggested that it still requires development and an increase in complexity &amp;lt;ref&amp;gt; Meer, Alec (2016). Impressions: DayZ blokey Dean Hall’s Out of Ammo. Retrieved from www.rockpapershotgun.com/2016/04/27/out-of-ammo-early-access-review&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Features==&lt;br /&gt;
*Eight unique freeplay environments to try your skill on&lt;br /&gt;
*Two specialist mission modes&lt;br /&gt;
*Cooperative multiplayer in freeplay for up to four players&lt;br /&gt;
*Construct defenses such as sandbags and watchtowers&lt;br /&gt;
*Issue orders to your soldiers&lt;br /&gt;
*Five different kinds of units each with special abilities&lt;br /&gt;
*Possess any units directly to control their engagement&lt;br /&gt;
*Call in artillery, airstrikes, and sniper targets&lt;br /&gt;
*Fixed machine guns, grenades, flares, and more&lt;br /&gt;
&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Games]] [[Category:VR Apps]] [[Category:Shooter]] [[Category:Strategy]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10570</id>
		<title>Out of Ammo</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10570"/>
		<updated>2016-08-01T14:53:16Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|3-vpLpgOEQM|350}}&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Developer=[[RocketWerkz]]&lt;br /&gt;
|Publisher=&lt;br /&gt;
|Platform=[[Oculus Rift (Platform)]], [[SteamVR]], &lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Strategy]], [[Shooter]]&lt;br /&gt;
|Input Device=[[Oculus Touch]], [[SteamVR Controllers]]&lt;br /&gt;
|Game Mode=[[Singleplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=[http://store.steampowered.com/app/451840/ Out of Ammo on Steam]&lt;br /&gt;
|Infobox Updated=8/1/2016&lt;br /&gt;
}}&lt;br /&gt;
[[Out of Ammo]] is an intense [[virtual reality]] [[RTS]] and [[FPS]] [[game]] that puts the player in the role of a military commander against waves of incoming enemy soldiers &amp;lt;ref name=”1”&amp;gt; TheTwiit (2016). Out of Ammo – VR strategy game review. Retrieved from www.thetwitt.com/reviews/game/out-of-ammo-vr-strategy-game-review&amp;lt;/ref&amp;gt;. It was released on April 15, 2016, as a Steam Early Access game exclusive to the HTC Vive. The game was developed by RocketWerkz, a New Zealand-based studio founded in 2014 by Dean Hall, the creator of [[DayZ]]. It is an attempt to create something smaller and simpler than DayZ, as it is the studio’s first virtual reality title &amp;lt;ref name=”2”&amp;gt; McWhertor, Michael (2016). DayZ creator unveils new shooter strategy game Out of Ammo. Retrieved from www.polygon.com/2016/4/15/11439936/dean-hall-out-of-ammo-rocketwerkz-htc-vive-steam-early-access&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Moddb (2016). DayZ developer Dean Hall launches new VR shooter Out of Ammo. Retrieved from www.moddb.com/games/out-of-ammo&amp;lt;/ref&amp;gt;. The studio said it launched the game in Steam Early Access “to refine the experience and better explore what virtual reality can offer for the game.” &amp;lt;ref name=”4”&amp;gt; Steam. Out of Ammo. Retrieved from store.steampowered.com/app/451840&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The game mixes strategy with first-person shooter mechanics, and its blocky graphics are aesthetically reminiscent of Minecraft &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. The player can manipulate objects and soldiers to build defenses, using gestures with the hand controllers for the Vive &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Players then face incoming waves of enemy soldiers of increasing difficulty, and must construct defenses, issue orders to the troops, and possess units to assume direct control. When a player takes direct control of a unit, the game transitions into a first-person shooter view &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Sykes, Tom (2016). Dean Hall’s RocketWerkz releases Vive-exclusive Out of Ammo. Retrieved from www.pcgamer.com/dean-halls-rocketwerkz-releases-vive-exclusive-out-of-ammo&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Currently, there are eight unique environments and five playable units in the early access build, and according to RocketWerkz more will be added in the future &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Out of Ammo has been generally well received, although some have suggested that it still requires development and an increase in complexity &amp;lt;ref&amp;gt; Meer, Alec (2016). Impressions: DayZ blokey Dean Hall’s Out of Ammo. Retrieved from www.rockpapershotgun.com/2016/04/27/out-of-ammo-early-access-review&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Features==&lt;br /&gt;
*Eight unique freeplay environments to try your skill on&lt;br /&gt;
*Two specialist mission modes&lt;br /&gt;
*Cooperative multiplayer in freeplay for up to four players&lt;br /&gt;
*Construct defenses such as sandbags and watchtowers&lt;br /&gt;
*Issue orders to your soldiers&lt;br /&gt;
*Five different kinds of units each with special abilities&lt;br /&gt;
*Possess any units directly to control their engagement&lt;br /&gt;
*Call in artillery, airstrikes, and sniper targets&lt;br /&gt;
*Fixed machine guns, grenades, flares, and more&lt;br /&gt;
&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Games]] [[Category:VR Apps]] [[Category:Shooter]] [[Category:Strategy]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10568</id>
		<title>Out of Ammo</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10568"/>
		<updated>2016-08-01T14:44:57Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Out of Ammo is an intense virtual reality game that puts the player in the role of a military commander against waves of incoming enemy soldiers &amp;lt;ref name=”1”&amp;gt; TheTwiit (2016). Out of Ammo – VR strategy game review. Retrieved from www.thetwitt.com/reviews/game/out-of-ammo-vr-strategy-game-review&amp;lt;/ref&amp;gt;. It was released on April 15, 2016, as a Steam Early Access game exclusive to the HTC Vive. The game was developed by RocketWerkz, a New Zealand-based studio founded in 2014 by Dean Hall, the creator of DayZ. It is an attempt to create something smaller and simpler than DayZ, as it is the studio’s first virtual reality title &amp;lt;ref name=”2”&amp;gt; McWhertor, Michael (2016). DayZ creator unveils new shooter strategy game Out of Ammo. Retrieved from www.polygon.com/2016/4/15/11439936/dean-hall-out-of-ammo-rocketwerkz-htc-vive-steam-early-access&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Moddb (2016). DayZ developer Dean Hall launches new VR shooter Out of Ammo. Retrieved from www.moddb.com/games/out-of-ammo&amp;lt;/ref&amp;gt;. The studio said it launched the game in Steam Early Access “to refine the experience and better explore what virtual reality can offer for the game.” &amp;lt;ref name=”4”&amp;gt; Steam. Out of Ammo. Retrieved from store.steampowered.com/app/451840&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The game mixes strategy with first-person shooter mechanics, and its blocky graphics are aesthetically reminiscent of Minecraft &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. The player can manipulate objects and soldiers to build defenses, using gestures with the hand controllers for the Vive &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Players then face incoming waves of enemy soldiers of increasing difficulty, and must construct defenses, issue orders to the troops, and possess units to assume direct control. When a player takes direct control of a unit, the game transitions into a first-person shooter view &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Sykes, Tom (2016). Dean Hall’s RocketWerkz releases Vive-exclusive Out of Ammo. Retrieved from www.pcgamer.com/dean-halls-rocketwerkz-releases-vive-exclusive-out-of-ammo&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Currently, there are eight unique environments and five playable units in the early access build, and according to RocketWerkz more will be added in the future &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Out of Ammo has been generally well received, although some have suggested that it still requires development and an increase in complexity &amp;lt;ref&amp;gt; Meer, Alec (2016). Impressions: DayZ blokey Dean Hall’s Out of Ammo. Retrieved from www.rockpapershotgun.com/2016/04/27/out-of-ammo-early-access-review&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Features==&lt;br /&gt;
*Eight unique freeplay environments to try your skill on&lt;br /&gt;
*Two specialist mission modes&lt;br /&gt;
*Cooperative multiplayer in freeplay for up to four players&lt;br /&gt;
*Construct defenses such as sandbags and watchtowers&lt;br /&gt;
*Issue orders to your soldiers&lt;br /&gt;
*Five different kinds of units each with special abilities&lt;br /&gt;
*Possess any units directly to control their engagement&lt;br /&gt;
*Call in artillery, airstrikes, and sniper targets&lt;br /&gt;
*Fixed machine guns, grenades, flares, and more&lt;br /&gt;
&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10567</id>
		<title>Out of Ammo</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Out_of_Ammo&amp;diff=10567"/>
		<updated>2016-08-01T14:44:27Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Out of Ammo is an intense virtual reality game that puts the player in the role of a military commander against waves of incoming enemy soldiers &amp;lt;ref name=”1”&amp;gt; TheTwiit (2016). Out of Ammo – VR strategy game review. Retrieved from www.thetwitt.com/reviews/game/out-of-ammo-vr-strategy-game-review&amp;lt;/ref&amp;gt;. It was released on April 15, 2016, as a Steam Early Access game exclusive to the HTC Vive. The game was developed by RocketWerkz, a New Zealand-based studio founded in 2014 by Dean Hall, the creator of DayZ. It is an attempt to create something smaller and simpler than DayZ, as it is the studio’s first virtual reality title &amp;lt;ref name=”2”&amp;gt; McWhertor, Michael (2016). DayZ creator unveils new shooter strategy game Out of Ammo. Retrieved from www.polygon.com/2016/4/15/11439936/dean-hall-out-of-ammo-rocketwerkz-htc-vive-steam-early-access&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt; Moddb (2016). DayZ developer Dean Hall launches new VR shooter Out of Ammo. Retrieved from www.moddb.com/games/out-of-ammo&amp;lt;/ref&amp;gt;. The studio said it launched the game in Steam Early Access “to refine the experience and better explore what virtual reality can offer for the game.” &amp;lt;ref name=”4”&amp;gt; Steam. Out of Ammo. Retrieved from store.steampowered.com/app/451840&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The game mixes strategy with first-person shooter mechanics, and its blocky graphics are aesthetically reminiscent of Minecraft &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. The player can manipulate objects and soldiers to build defenses, using gestures with the hand controllers for the Vive &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;. Players then face incoming waves of enemy soldiers of increasing difficulty, and must construct defenses, issue orders to the troops, and possess units to assume direct control. When a player takes direct control of a unit, the game transitions into a first-person shooter view &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; Sykes, Tom (2016). Dean Hall’s RocketWerkz releases Vive-exclusive Out of Ammo. Retrieved from www.pcgamer.com/dean-halls-rocketwerkz-releases-vive-exclusive-out-of-ammo&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Currently, there are eight unique environments and five playable units in the early access build, and according to RocketWerkz more will be added in the future &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Out of Ammo has been generally well received, although some have suggested that it still requires development and an increase in complexity &amp;lt;ref&amp;gt; Meer, Alec (2016). Impressions: DayZ blokey Dean Hall’s Out of Ammo. Retrieved from www.rockpapershotgun.com/2016/04/27/out-of-ammo-early-access-review&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Features==&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;&lt;br /&gt;
*Eight unique freeplay environments to try your skill on&lt;br /&gt;
*Two specialist mission modes&lt;br /&gt;
*Cooperative multiplayer in freeplay for up to four players&lt;br /&gt;
*Construct defenses such as sandbags and watchtowers&lt;br /&gt;
*Issue orders to your soldiers&lt;br /&gt;
*Five different kinds of units each with special abilities&lt;br /&gt;
*Possess any units directly to control their engagement&lt;br /&gt;
*Call in artillery, airstrikes, and sniper targets&lt;br /&gt;
*Fixed machine guns, grenades, flares, and more&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references/&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10565</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10565"/>
		<updated>2016-08-01T14:11:14Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Shadowdawn moved page Werewolves within to Werewolves Within&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|vWBb1LgtITc|350}}&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Developer=[[Red Storm Entertainment]]&lt;br /&gt;
|Publisher=[[Ubisoft]]&lt;br /&gt;
|Platform=[[Oculus Rift (Platform)]], [[SteamVR]], &lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Social VR]]&lt;br /&gt;
|Input Device=[[Oculus Touch]], [[SteamVR Controllers]]&lt;br /&gt;
|Game Mode=[[Multiplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=Fall 2016&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=[https://www.ubisoft.com/en-US/game/werewolves-within Werewolves Within]&lt;br /&gt;
|Infobox Updated=8/1/2016&lt;br /&gt;
}}&lt;br /&gt;
==Introduction==&lt;br /&gt;
Werewolves Within is a [[Social VR]] [[game]] developed by Red Storm Entertainment, a [[Ubisoft]] studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out which of them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling suspicion of a specific player &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Werewolves Within has been demoed on PCs with the [[Oculus Rift]] headset, and is planned to release on all major [[VR platforms]] in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. This game is not the only investment that Ubisoft is going to make in VR during 2016. Besides Werewolves Within, the studio will also publish a game called [[Eagle Flight]], which is an aerial open world game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that originated in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took on other forms, such as Werewolf. In all of its different forms, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, the creative director for virtual reality at Red Storm, mentioned that Mafia came up while the team was brainstorming ideas for VR games, and they began working on bringing the social-play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and they provide special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side of them, they will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately beside them &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are also available through the controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think the werewolf is. To win a round, you have to correctly identify the werewolf, survive as a werewolf, or convince the others that you are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Apps]] [[Category:VR Games]] [[Category:Social VR]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10564</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10564"/>
		<updated>2016-08-01T14:10:41Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|vWBb1LgtITc|350}}&lt;br /&gt;
|Type=[[Virtual Reality]]&lt;br /&gt;
|Developer=[[Red Storm Entertainment]]&lt;br /&gt;
|Publisher=[[Ubisoft]]&lt;br /&gt;
|Platform=[[Oculus Rift (Platform)]], [[SteamVR]], &lt;br /&gt;
|Device=[[Oculus Rift CV1]], [[HTC Vive]]&lt;br /&gt;
|Operating System=&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Social VR]]&lt;br /&gt;
|Input Device=[[Oculus Touch]], [[SteamVR Controllers]]&lt;br /&gt;
|Game Mode=[[Multiplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=Fall 2016&lt;br /&gt;
|Price=&lt;br /&gt;
|Website=[https://www.ubisoft.com/en-US/game/werewolves-within Werewolves Within]&lt;br /&gt;
|Infobox Updated=8/1/2016&lt;br /&gt;
}}&lt;br /&gt;
==Introduction==&lt;br /&gt;
Werewolves Within is a [[Social VR]] [[game]] developed by Red Storm Entertainment, a [[Ubisoft]] studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out which of them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling suspicion of a specific player &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Werewolves Within has been demoed on PCs with the [[Oculus Rift]] headset, and is planned to release on all major [[VR platforms]] in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. This game is not the only investment that Ubisoft is going to make in VR during 2016. Besides Werewolves Within, the studio will also publish a game called [[Eagle Flight]], which is an aerial open world game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that originated in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took on other forms, such as Werewolf. In all of its different forms, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, the creative director for virtual reality at Red Storm, mentioned that Mafia came up while the team was brainstorming ideas for VR games, and they began working on bringing the social-play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and they provide special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side of them, they will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately beside them &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are also available through the controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think the werewolf is. To win a round, you have to correctly identify the werewolf, survive as a werewolf, or convince the others that you are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:VR Apps]] [[Category:VR Games]] [[Category:Social VR]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=VR_platforms&amp;diff=10563</id>
		<title>VR platforms</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=VR_platforms&amp;diff=10563"/>
		<updated>2016-08-01T13:59:57Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Redirected page to Virtual Reality#Platforms&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Virtual Reality#Platforms]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10562</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10562"/>
		<updated>2016-08-01T13:58:34Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Werewolves Within is a [[Social VR]] [[game]] developed by Red Storm Entertainment, a [[Ubisoft]] studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out which of them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling suspicion of a specific player &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Werewolves Within has been demoed on PCs with the [[Oculus Rift]] headset, and is planned to release on all major [[VR platforms]] in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. This game is not the only investment that Ubisoft is going to make in VR during 2016. Besides Werewolves Within, the studio will also publish a game called [[Eagle Flight]], which is an aerial open world game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that originated in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took on other forms, such as Werewolf. In all of its different forms, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, the creative director for virtual reality at Red Storm, mentioned that Mafia came up while the team was brainstorming ideas for VR games, and they began working on bringing the social-play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and they provide special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side of them, they will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately beside them &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are also available through the controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think is the werewolf. To win a round, a player has to correctly identify the werewolf, survive as a werewolf, or convince the others that they are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10561</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10561"/>
		<updated>2016-08-01T13:58:14Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Werewolves Within is a [[Social VR]] [[game]] developed by Red Storm Entertainment, an [[Ubisoft]] studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out who among them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling that they suspect a specific player is a werewolf &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Werewolves Within has been demoed on PCs with the [[Oculus Rift]] headset and is planned for release on all major [[VR platforms]] in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. It is not the only investment Ubisoft is making in VR during 2016: besides Werewolves Within, the studio will also publish Eagle Flight, an open-world aerial game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that had its origins in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took other forms, such as Werewolf. In all of the forms the game has taken, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, creative director for virtual reality at the Red Storm studio, mentioned that while the team was brainstorming ideas for VR games, Mafia came up, and they began working on bringing the social play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and each role provides special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or that of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side, he or she will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately at his or her side &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are available with the use of a controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think is the werewolf. To win a round, a player has to correctly identify the werewolf, survive as a werewolf, or convince the others that they are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10560</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10560"/>
		<updated>2016-08-01T13:57:42Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Werewolves Within is a [[Social VR]] [[game]] developed by Red Storm Entertainment, an [[Ubisoft]] studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out who among them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling that they suspect a specific player is a werewolf &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Werewolves Within has been demoed on PCs with the Oculus Rift headset and is planned for release on all major VR platforms in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. It is not the only investment Ubisoft is making in VR during 2016: besides Werewolves Within, the studio will also publish Eagle Flight, an open-world aerial game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that had its origins in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took other forms, such as Werewolf. In all of the forms the game has taken, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, creative director for virtual reality at the Red Storm studio, mentioned that while the team was brainstorming ideas for VR games, Mafia came up, and they began working on bringing the social play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and each role provides special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or that of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side, he or she will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately at his or her side &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are available with the use of a controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think is the werewolf. To win a round, a player has to correctly identify the werewolf, survive as a werewolf, or convince the others that they are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10559</id>
		<title>Werewolves Within</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Werewolves_Within&amp;diff=10559"/>
		<updated>2016-08-01T13:56:27Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Werewolves Within is a new virtual reality (VR) game developed by Red Storm Entertainment, an Ubisoft studio based in North Carolina &amp;lt;ref&amp;gt; Hayden, Scott (2016). Hands-on: Ubisoft’s First Social Game “Werewolves Within”. Retrieved from www.roadtovr.com/hands-ubisofts-first-social-vr-game-werewolves-within-launching-fall&amp;lt;/ref&amp;gt;. It is a multiplayer “social deduction game” in which players try to figure out who among them is the killer werewolf by arguing among themselves and using the game’s mechanics, such as signaling that they suspect a specific player is a werewolf &amp;lt;ref name=”2”&amp;gt; Chalk, Andy (2016). Werewolves Within is a VR “social deduction” party game coming this fall. Retrieved from www.pcgamer.com/werewolves-within-is-a-vr-social-deduction-party-game-coming-this-fall&amp;lt;/ref&amp;gt;. The game is played online, and users interact with each other in the virtual village of Gallowston &amp;lt;ref name=”3”&amp;gt; Kraft, Courtney (2016). Ubisoft brings party games to VR with “Werewolves Within”. Retrieved from geekandsundry.com/paranoia-gets-virtual-with-ubisofts-werewolves-within&amp;lt;/ref&amp;gt;.&lt;br /&gt;
Werewolves Within has been demoed on PCs with the Oculus Rift headset and is planned for release on all major VR platforms in the fall of 2016 &amp;lt;ref name=”4”&amp;gt; Wilson, Jason (2016). Werewolves Within is Ultimate Werewolf in virtual reality – and it’s social, too (update). Retrieved from venturebeat.com/2016/03/15/werewolves-within-preview&amp;lt;/ref&amp;gt; &amp;lt;ref name=”5”&amp;gt; Reparaz, Mikel (2016). Werewolves Within brings multiplayer deception and deduction to VR. Retrieved from blog.ubi.com/werewolves-within-brings-multiplayer-deception-and-deduction-to-vr&amp;lt;/ref&amp;gt;. It is not the only investment Ubisoft is making in VR during 2016: besides Werewolves Within, the studio will also publish Eagle Flight, an open-world aerial game about eagles &amp;lt;ref name=”5”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Inspiration for the Game==&lt;br /&gt;
Werewolves Within is a VR adaptation of a popular social party game that had its origins in the 1980s. The first iteration, called Mafia, started in Russia; in the 1990s the game was commercialized and took other forms, such as Werewolf. In all of the forms the game has taken, a group of friends take on different roles and try to figure out who among them is the outsider trying to kill them. David Votypka, creative director for virtual reality at the Red Storm studio, mentioned that while the team was brainstorming ideas for VR games, Mafia came up, and they began working on bringing the social play aspect of the game into VR &amp;lt;ref name=”6”&amp;gt; Crecente, Brian (2016). Ubisoft brings Werewolf party game to virtual reality. Retrieved from www.polygon.com/2016/3/15/11231148/werewolves-within-ubisoft-werewolf-virtual-reality&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Gameplay==&lt;br /&gt;
In Ubisoft’s VR game, five to eight players interact online &amp;lt;ref&amp;gt; Ubisoft. Werewolves Within. Retrieved from www.ubisoft.com/en-US/game/werewolves-within&amp;lt;/ref&amp;gt;. Roles are assigned randomly at the beginning of the match, and each role provides special abilities that can help players figure out who the werewolf is &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;. For example, a player can be assigned the role of a village tracker or that of a turncloak. In the first case, if the player leans in either direction and a werewolf is on that side, he or she will hear growling; in the second, the player can only win if the werewolves win, and knows who they are &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;. Players sit on two sides, and positional tracking enables the leaning mechanic, which also lets a player have a private chat with the person immediately at his or her side &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The in-game avatars move according to the volume and inflection of the player’s voice, and some hand gestures are available with the use of a controller &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
After five minutes of arguing, each player casts a vote on who they think is the werewolf. To win a round, a player has to correctly identify the werewolf, survive as a werewolf, or convince the others that they are the werewolf (when assigned the role of the “deviant”) &amp;lt;ref name=”6”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Augmented_reality_use_cases&amp;diff=10553</id>
		<title>Augmented reality use cases</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Augmented_reality_use_cases&amp;diff=10553"/>
		<updated>2016-07-21T08:36:51Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Use Case&lt;br /&gt;
! Examples&lt;br /&gt;
|-&lt;br /&gt;
|[[3D modeling]] ||&lt;br /&gt;
|-&lt;br /&gt;
|[[Architecture]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Automotive design]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Big data visualisation]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Cinema]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Cognitive training]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Courtroom]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Desktop]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Ecommerce]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Education]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Finance]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Flying drones]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Food manipulation]] || [https://www.youtube.com/watch?v=i7zpDGN5B2A Takuji Narumi of Cyber Interface Lab in Tokyo University]&lt;br /&gt;
|-&lt;br /&gt;
|[[VR Apps|Gaming]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Handheld 3D imaging]] || [https://www.youtube.com/watch?v=FxyppRJPntg Phi3D by Dot Product]&lt;br /&gt;
|-&lt;br /&gt;
|[[Industrial training]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Journalism]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Magic Show]] || [[Augmented Magic]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Manufacturing]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Marketing]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Mental health]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Meditation]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Pain relief]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Psychedelics]] ||&lt;br /&gt;
|-&lt;br /&gt;
|[[Recruitment]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Simulation]] ||&lt;br /&gt;
|-&lt;br /&gt;
|[[Sports spectating]] ||&lt;br /&gt;
|- &lt;br /&gt;
|[[Sports training]] || &lt;br /&gt;
|-&lt;br /&gt;
|[[Social networking]] || &lt;br /&gt;
|- &lt;br /&gt;
|[[Surgery training]] || [[System for Telementoring with Augmented Reality]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Travel]] ||&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Headsets&amp;diff=10551</id>
		<title>Headsets</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Headsets&amp;diff=10551"/>
		<updated>2016-07-20T23:26:51Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Redirected page to Head-mounted display&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Head-mounted display]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10550</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10550"/>
		<updated>2016-07-20T23:26:38Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;[[Neuromancer]]: A Foreshadow of Things Still to Come&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel of the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of [[Virtual Reality]] is finally around the corner (even if it still hasn’t reached the level explored in the novel); aspects of the world Gibson created have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damien Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they would interface with technology.&lt;br /&gt;
&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech and the counter-culture worlds – his value was mainly as an observer &amp;lt;ref name=”4”&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. The novel follows Case, once a “cyberspace cowboy” who could hack into corporate databases. A job gone wrong leaves Case crippled and unable to access cyberspace. He is then recruited by an underworld group, whose members promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer had a great impact in foreseeing the technologies that would follow its publication, and its prescience is still praised; the author has been called a prophet of the digital age. Some technologies the book foreshadowed have arrived, while others are still a bit far off &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. We may not have reached the bleak aesthetics of the novel in the real world, but the intersecting paths between fiction and reality are eerily similar.&lt;br /&gt;
&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had already connected various systems of servers through a telecom link – but not on the global scale that the novel described. The internet as we know it today was still a decade away, and it may have seemed like wild speculation at the time. Jack Womack suggested, in the afterword of the 2000 re-release of the book, that the novel could even have influenced the way the Web developed by providing a sort of blueprint, a guide, to the developers who read and grew up with it &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (or the matrix, as it is also called) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that in the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may be just a matter of time &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. Virtual Reality seems to be finally at the cusp of penetrating our world and becoming the norm with the [[Oculus Rift]] and other types of [[headsets]].&lt;br /&gt;
The book ultimately reflects the increasing presence of technology in our lives, having at its core the direct integration of man and computer. Indeed, development in this direction has already started &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. [[VR HMDs|VR headsets]] are getting better and providing greater immersion into their virtual realms. Direct brain-to-brain communication between human subjects has been achieved – a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the use of the internet to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was achieved as early as 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10549</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10549"/>
		<updated>2016-07-20T23:25:34Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;[[Neuromancer]]: A Foreshadow of Things Still to Come&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel of the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of [[Virtual Reality]] is finally around the corner (even if it still hasn’t reached the level explored in the novel); aspects of the world Gibson created have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damien Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they would interface with technology.&lt;br /&gt;
&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech and the counter-culture worlds – his value was mainly as an observer &amp;lt;ref name=”4”&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows Case, a former “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer foresaw much of the technology that followed its publication, and its prescience is still praised; the author has even been named a prophet of the digital age. Some of the technologies the book foreshadowed have already arrived; others are still a bit far off &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. The real world may not have reached the bleak aesthetic of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had connected various server systems through telecom links – but not on the global scale the novel described. The internet as we know it today was still a decade away, and at the time the idea may have seemed like wild speculation. Jack Womack has suggested, in the afterword to the 2000 re-release of the book, that the novel may even have influenced the way the Web developed, providing a sort of blueprint for the developers who read and grew up with it &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (also called the matrix) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that of the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may just be a matter of time &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
Ultimately, the book reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Development in this direction has already started &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better, providing greater immersion in their virtual realms. Direct brain-to-brain communication between human subjects – a sort of technological telepathy – has been achieved with the aid of electrodes attached to the scalp and the internet to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated as far back as 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10548</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10548"/>
		<updated>2016-07-20T23:24:31Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;[[Neuromancer]]: A Foreshadow of Things Still to Come&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the debut novel of writer William Gibson, published on July 1, 1984 &amp;lt;ref name="1"&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What It Got Right, What It Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name="2"&amp;gt; Cumming, Ed (2014). William Gibson: The Man Who Saw Tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name="3"&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experimenting with that novel technology. He focused on observing their behaviors, addictions and obsessions, and on how they interfaced with technology.&lt;br /&gt;
&lt;br /&gt;
Another influence on the novel was the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and in the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both worlds – tech and counter-culture – his role was mainly that of an observer &amp;lt;ref name="4"&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows Case, a former “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer foresaw much of the technology that followed its publication, and its prescience is still praised; the author has even been named a prophet of the digital age. Some of the technologies the book foreshadowed have already arrived; others are still a bit far off &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. The real world may not have reached the bleak aesthetic of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had connected various server systems through telecom links – but not on the global scale the novel described. The internet as we know it today was still a decade away, and at the time the idea may have seemed like wild speculation. Jack Womack has suggested, in the afterword to the 2000 re-release of the book, that the novel may even have influenced the way the Web developed, providing a sort of blueprint for the developers who read and grew up with it &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (also called the matrix) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that of the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may just be a matter of time &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
Ultimately, the book reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Development in this direction has already started &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better, providing greater immersion in their virtual realms. Direct brain-to-brain communication between human subjects – a sort of technological telepathy – has been achieved with the aid of electrodes attached to the scalp and the internet to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated as far back as 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10547</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10547"/>
		<updated>2016-07-20T23:24:07Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;[[Neuromancer]]: A Foreshadow of Things Still to Come&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the debut novel of writer William Gibson, published on July 1, 1984 &amp;lt;ref name="1"&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What It Got Right, What It Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name="2"&amp;gt; Cumming, Ed (2014). William Gibson: The Man Who Saw Tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name="3"&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experimenting with that novel technology. He focused on observing their behaviors, addictions and obsessions, and on how they interfaced with technology.&lt;br /&gt;
Another influence on the novel was the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and in the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both worlds – tech and counter-culture – his role was mainly that of an observer &amp;lt;ref name="4"&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows Case, a former “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer foresaw much of the technology that followed its publication, and its prescience is still praised; the author has even been named a prophet of the digital age. Some of the technologies the book foreshadowed have already arrived; others are still a bit far off &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. The real world may not have reached the bleak aesthetic of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had connected various server systems through telecom links – but not on the global scale the novel described. The internet as we know it today was still a decade away, and at the time the idea may have seemed like wild speculation. Jack Womack has suggested, in the afterword to the 2000 re-release of the book, that the novel may even have influenced the way the Web developed, providing a sort of blueprint for the developers who read and grew up with it &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (also called the matrix) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that of the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may just be a matter of time &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
Ultimately, the book reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Development in this direction has already started &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better, providing greater immersion in their virtual realms. Direct brain-to-brain communication between human subjects – a sort of technological telepathy – has been achieved with the aid of electrodes attached to the scalp and the internet to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated as far back as 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10546</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10546"/>
		<updated>2016-07-20T23:23:51Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;[[Neuromancer]]: a foreshadow of things still to come&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the debut novel of writer William Gibson, published on July 1, 1984 &amp;lt;ref name="1"&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What It Got Right, What It Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name="2"&amp;gt; Cumming, Ed (2014). William Gibson: The Man Who Saw Tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name="3"&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experimenting with that novel technology. He focused on observing their behaviors, addictions and obsessions, and on how they interfaced with technology.&lt;br /&gt;
Another influence on the novel was the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and in the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both worlds – tech and counter-culture – his role was mainly that of an observer &amp;lt;ref name="4"&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows Case, a former “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer foresaw much of the technology that followed its publication, and its prescience is still praised; the author has even been named a prophet of the digital age. Some of the technologies the book foreshadowed have already arrived; others are still a bit far off &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="2"&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name="3"&amp;gt;&amp;lt;/ref&amp;gt;. The real world may not have reached the bleak aesthetic of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had connected various server systems through telecom links – but not on the global scale the novel described. The internet as we know it today was still a decade away, and at the time the idea may have seemed like wild speculation. Jack Womack has suggested, in the afterword to the 2000 re-release of the book, that the novel may even have influenced the way the Web developed, providing a sort of blueprint for the developers who read and grew up with it &amp;lt;ref name="1"&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (also called the matrix) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that of the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may just be a matter of time &amp;lt;ref name="4"&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
The book reflects, ultimately, the increasing presence of technology in our lives, having in its core the direct integration of man and computer. Indeed, development in this direction has already started &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. The VR headsets are getting better and providing a greater immersion into their virtual realms. Direct brain-to-brain communication between human subjects has been achieved - a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the use of the internet to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was already done back in 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There’s a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines that was envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10545</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10545"/>
		<updated>2016-07-20T23:22:56Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Neuromancer: a foreshadow of things still to come&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel by the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they interfaced with technology.&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was embedded in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech world and the counter-culture – his value was mainly as an observer &amp;lt;ref name=”4”&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows a character called Case, once a “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group, who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer had a great impact in foreseeing the technologies that would follow its publication, and its level of prescience is still praised, with the author being named a prophet of the digital age. Some of the technologies the book foreshadowed have arrived, but others are still a bit far off &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. We may not have reached, in the real world, the bleak aesthetics of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had already connected various server systems through telecom links – but not on the global scale that the novel described. The internet as we know it today was still a decade away, and the idea may have seemed like wild speculation at the time. Jack Womack has suggested, in the afterword of the 2000 re-release of the book, that it could even have influenced the way the Web developed by providing a sort of blueprint, a guide, to the developers who read and grew up with the novel &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (or the matrix, as it is also called) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that in the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may be just a matter of time &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
The book ultimately reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Indeed, development in this direction has already started &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better and providing greater immersion into their virtual realms. Direct brain-to-brain communication between human subjects has been achieved – a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the internet used to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated back in 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines that was envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10544</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10544"/>
		<updated>2016-07-20T23:19:48Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Neuromancer: a foreshadow of things still to come&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel by the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they interfaced with technology.&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was embedded in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech world and the counter-culture – his value was mainly as an observer &amp;lt;ref name=”4”&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows a character called Case, once a “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group, who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer had a great impact in foreseeing the technologies that would follow its publication, and its level of prescience is still praised, with the author being named a prophet of the digital age. Some of the technologies the book foreshadowed have arrived, but others are still a bit far off &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. We may not have reached, in the real world, the bleak aesthetics of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had already connected various server systems through telecom links – but not on the global scale that the novel described. The internet as we know it today was still a decade away, and the idea may have seemed like wild speculation at the time. Jack Womack has suggested, in the afterword of the 2000 re-release of the book, that it could even have influenced the way the Web developed by providing a sort of blueprint, a guide, to the developers who read and grew up with the novel &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (or the matrix, as it is also called) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that in the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may be just a matter of time &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
The book ultimately reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Indeed, development in this direction has already started &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better and providing greater immersion into their virtual realms. Direct brain-to-brain communication between human subjects has been achieved – a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the internet used to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated back in 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines that was envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10543</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10543"/>
		<updated>2016-07-20T23:19:07Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Neuromancer: a foreshadow of things still to come&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel by the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they interfaced with technology.&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was embedded in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech world and the counter-culture – his value was mainly as an observer &amp;lt;ref name=”4”&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The story is set in a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. It follows a character called Case, once a “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group, who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer had a great impact in foreseeing the technologies that would follow its publication, and its level of prescience is still praised, with the author being named a prophet of the digital age. Some of the technologies the book foreshadowed have arrived, but others are still a bit far off &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt;&amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt;. We may not have reached, in the real world, the bleak aesthetics of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had already connected various server systems through telecom links – but not on the global scale that the novel described. The internet as we know it today was still a decade away, and the idea may have seemed like wild speculation at the time. Jack Womack has suggested, in the afterword of the 2000 re-release of the book, that it could even have influenced the way the Web developed by providing a sort of blueprint, a guide, to the developers who read and grew up with the novel &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (or the matrix, as it is also called) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current Virtual Reality technology of our world may not be as advanced as that in the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may be just a matter of time &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. With the Oculus Rift and other headsets, Virtual Reality finally seems to be on the cusp of entering our world and becoming the norm.&lt;br /&gt;
The book ultimately reflects the increasing presence of technology in our lives, with the direct integration of man and computer at its core. Indeed, development in this direction has already started &amp;lt;ref name=”1”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”2”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”3”&amp;gt;&amp;lt;/ref&amp;gt; &amp;lt;ref name=”4”&amp;gt;&amp;lt;/ref&amp;gt;. VR headsets are getting better and providing greater immersion into their virtual realms. Direct brain-to-brain communication between human subjects has been achieved – a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the internet used to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated back in 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There is a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines that was envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10542</id>
		<title>Neuromancer</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Neuromancer&amp;diff=10542"/>
		<updated>2016-07-20T23:18:08Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: Created page with &amp;quot;Neuromancer: a foreshadow of things still to come By Paulo Pacheco on July 20, 2016  ==Introduction== Neuromancer is the first novel of the writer William Gibson, and it was p...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Neuromancer: a foreshadow of things still to come&lt;br /&gt;
By Paulo Pacheco on July 20, 2016&lt;br /&gt;
&lt;br /&gt;
==Introduction==&lt;br /&gt;
Neuromancer is the first novel by the writer William Gibson, published on July 1, 1984 &amp;lt;ref name=”1”&amp;gt; Sullivan, Mark (2009). Neuromancer Turns 25: What it Got Right, What it Got Wrong. Retrieved from www.macworld.com/article/1141500/neuromancer_25.html&amp;lt;/ref&amp;gt;. It has sold more than 6 million copies, and in the year after its launch it received the three biggest awards in science fiction writing: the Nebula, Philip K. Dick and Hugo awards &amp;lt;ref name=”2”&amp;gt; Cumming, Ed (2014). William Gibson: the man who saw tomorrow. Retrieved from www.theguardian.com/books/2014/jul/28/william-gibson-neuromancer-cyberpunk-books&amp;lt;/ref&amp;gt;. It defined an aesthetic – Cyberpunk – and left a mark on tech and digital culture by envisioning the concepts of cyberspace and virtual reality, both integrated with and extending the physical world &amp;lt;ref name=”3”&amp;gt; DSMLF (2015). Neuromancer: William Gibson’s Virtual Reality Masterpiece. Retrieved from dsmlf.info/neuromancer-william-gibsons-virtual-reality-masterpiece&amp;lt;/ref&amp;gt;. Today we have the World Wide Web, and the explosion of Virtual Reality is finally around the corner (even if it still has not reached the level explored in the novel) – reminders of aspects of the world Gibson created that have crept into our reality.&lt;br /&gt;
&lt;br /&gt;
==Influences for the Story==&lt;br /&gt;
William Gibson was not a “techie” by nature. He was aware of the new technologies around him, but according to Gareth Damian Martin, “he never had even touched a PC when he wrote Neuromancer.” His exposure to computers came as he met and conversed with science fiction writers and people who were experiencing that novel technology. He focused on observing their behaviors, addictions, obsessions and how they interfaced with technology.&lt;br /&gt;
Another influence on the novel came from the counter-culture of the 1960s. The author was immersed in its excesses, in the drug culture and the exploration of altered states of consciousness. This influence can easily be seen in the main character and in the criminal underworld described in the story. In both of these cases – in the tech world and in the counter-culture – his value was mainly as an observer &amp;lt;ref name=&amp;quot;4&amp;quot;&amp;gt; Martin, Gareth Damian. Re-reading William Gibson at the Advent of Virtual Reality. Retrieved from versions.killscreen.com/re-reading-william-gibson-at-the-advent-of-virtual-reality&amp;lt;/ref&amp;gt;. Other influences on William Gibson’s work came from movies (e.g. Escape From New York and 1940s film noir), music and pop culture &amp;lt;ref&amp;gt; McCaffery, Larry (1991). An Interview With William Gibson. Retrieved from project.cyberpunk.ru/idb/gibson_interview.html&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Summary of the Story of Neuromancer==&lt;br /&gt;
The setting of the story is a “post-apocalyptic, not-too-distant future in which ‘human’ has transformed into ‘post-human’ and ecological systems have been supplanted by technological constructs” &amp;lt;ref&amp;gt; Leaver, Tama (1997). Post-Humanism and Ecocide in William Gibson’s Neuromancer and Ridley Scott’s Blade Runner. Retrieved from cyberpunk.asia/cp_project.php?txt=180&amp;amp;lng=fr&amp;lt;/ref&amp;gt;. It is a future where media, technology, pop culture and market imperatives have spun out of control &amp;lt;ref&amp;gt; Walker, Douglas (1989). Douglas Walker Interviews Science Fiction Author William Gibson. Retrieved from www.douglaswalker.ca/press/gibson.pdf&amp;lt;/ref&amp;gt;. The story follows a character called Case, once a “cyberspace cowboy” who could hack into corporate databases. After a job gone wrong, Case is left crippled and unable to access cyberspace. He is then recruited by an underworld group, who promise to heal his nervous system if he helps them infiltrate an AI (Artificial Intelligence) called Wintermute &amp;lt;ref name=&amp;quot;1&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;4&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==Cyberspace, Virtual Realities and the Fusion of Technology with Wetware==&lt;br /&gt;
There is no doubt that Neuromancer had a great impact in foreseeing the technologies that would follow its publication, and its prescience is still praised, with the author being named a prophet of the digital age. Yet while some technologies the book foreshadowed have arrived, others are still a bit far off &amp;lt;ref name=&amp;quot;1&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;2&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;3&amp;quot; /&amp;gt;. We may not have reached, in the real world, the bleak aesthetics of the novel, but there are intersecting paths between fiction and reality that are eerily similar.&lt;br /&gt;
One of those is the idea of a World Wide Web: a global network of millions of computers. The concept of linking computers to each other already existed when the book launched – universities had already connected various systems of servers through telecom links – but not on the global scale that the novel described. The internet as we know it today was still a decade away, and it may have seemed wild speculation at the time. Jack Womack has suggested, in the afterword to the 2000 re-release of the book, that it could even have influenced the way the Web developed, by providing a sort of blueprint, a guide, to the developers who read and grew up with the novel &amp;lt;ref name=&amp;quot;1&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
It also defined cyberspace (or the matrix, as it is also called) as “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights receding…&amp;quot; &amp;lt;ref&amp;gt; Myers, Tony (2001). The Postmodern Imaginary in William Gibson’s Neuromancer. MFS Modern Fiction Studies, 47(4)&amp;lt;/ref&amp;gt;. The current virtual reality technology of our world may not be as advanced as that in the book, where people interact with the network directly through their nervous systems with full sensory stimulation, but that may be just a matter of time &amp;lt;ref name=&amp;quot;4&amp;quot; /&amp;gt;. Virtual reality finally seems to be at the cusp of penetrating our world and becoming the norm, with the Oculus Rift and other types of headsets.&lt;br /&gt;
The book ultimately reflects the increasing presence of technology in our lives, with the direct integration of human and computer at its core. Indeed, development in this direction has already started &amp;lt;ref name=&amp;quot;1&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;2&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;3&amp;quot; /&amp;gt; &amp;lt;ref name=&amp;quot;4&amp;quot; /&amp;gt;. VR headsets are getting better and providing greater immersion in their virtual realms. Direct brain-to-brain communication between human subjects has been achieved – a sort of technological telepathy – with the aid of electrodes attached to a person’s scalp and the internet used to transmit the information &amp;lt;ref&amp;gt; ScienceDaily (2014). Direct Brain-to-Brain Communication Demonstrated in Human Subjects. Retrieved from www.sciencedaily.com/releases/2014/09/140903105646.htm&amp;lt;/ref&amp;gt;. Real-time brain control of a computer cursor was demonstrated back in 2002 &amp;lt;ref&amp;gt; ScienceDaily (2002). Researchers Demonstrate Direct, Real-Time Brain Control of Computer Cursor. Retrieved from www.sciencedaily.com/releases/2002/03/020314080832.htm&amp;lt;/ref&amp;gt;. There’s a real tendency to merge computers, the Internet and our own wetware &amp;lt;ref&amp;gt; Wikipedia. Wetware (brain). Retrieved from en.wikipedia.org/wiki/Wetware_(brain)&amp;lt;/ref&amp;gt; that is evocative of the world William Gibson created.&lt;br /&gt;
&lt;br /&gt;
With all these developments there is always the risk of abuse, addiction and escapism – subjects also dealt with in the book. Either way, our connection with the technology we use is already affecting us &amp;lt;ref&amp;gt; ScienceDaily (2009). Is Technology Producing a Decline in Critical Thinking? Retrieved from www.sciencedaily.com/releases/2009/01/090128092341.htm&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; ScienceDaily (2016). Kids Who Text and Watch TV Simultaneously Likely to Underperform at School. Retrieved from www.sciencedaily.com/releases/2016/05/160518102746.htm&amp;lt;/ref&amp;gt;, and only time will tell whether we will achieve the full integration with machines that was envisioned in Neuromancer.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Books]] [[Category:Media]] [[Category:VR Books]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=OSVR_HDK2&amp;diff=10541</id>
		<title>OSVR HDK2</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=OSVR_HDK2&amp;diff=10541"/>
		<updated>2016-07-20T22:05:26Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Device Infobox&lt;br /&gt;
|image=&lt;br /&gt;
|VR/AR=[[Virtual Reality]]&lt;br /&gt;
|Type=[[Head-mounted display]]&lt;br /&gt;
|Subtype=[[Discrete HMD]], [[DIY HMD]]&lt;br /&gt;
|Platform=[[OSVR]]&lt;br /&gt;
|Developer=[[Razer]], [[Sensics]]&lt;br /&gt;
|Requires=PC&lt;br /&gt;
|Predecessor=[[OSVR HDK1]]&lt;br /&gt;
|Successor=&lt;br /&gt;
|Display=5.5 inch? OLED&lt;br /&gt;
|Resolution=2160 x 1200 (1080 x 1200 per eye)&lt;br /&gt;
|Pixel Density=441 PPI&lt;br /&gt;
|Refresh Rate=90 Hz&lt;br /&gt;
|Persistence=Low persistence&lt;br /&gt;
|Field of View=???&lt;br /&gt;
|Optics=???&lt;br /&gt;
|Tracking=3DOF, 6DOF&lt;br /&gt;
|Rotational Tracking=[[Gyroscope]], [[Accelerometer]], [[Magnetometer]]&lt;br /&gt;
|Positional Tracking=???&lt;br /&gt;
|Update Rate=Rotational: 400Hz, Positional: 100 Hz&lt;br /&gt;
|Latency=??&lt;br /&gt;
|Connectivity=???&lt;br /&gt;
|Input=???&lt;br /&gt;
|Weight=&lt;br /&gt;
|Release Date=July 29, 2016&lt;br /&gt;
|Price=$399&lt;br /&gt;
|Website=[http://www.osvr.org/hdk2.html OSVR HDK2]&lt;br /&gt;
}}&lt;br /&gt;
[[OSVR HDK2]] or &#039;&#039;&#039;Hacker Dev Kit 2&#039;&#039;&#039; is a [[Virtual Reality]] [[head-mounted display]] created by [[Razer]]. The HDK2 is [[OSVR]]&#039;s second device, announced at E3 2016. It has an open-source, modular design that not only allows the user to replace and upgrade its hardware components but also to build one from scratch.&lt;br /&gt;
&lt;br /&gt;
OSVR HDK2 went on sale on July 29, 2016 at the price of $399.&lt;br /&gt;
==Hardware==&lt;br /&gt;
&lt;br /&gt;
===Specifications===&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Part&lt;br /&gt;
! Spec &lt;br /&gt;
|-&lt;br /&gt;
|Display || 5.5 inch?? OLED&amp;lt;br&amp;gt; 441 PPI at 90 FPS&lt;br /&gt;
|-&lt;br /&gt;
|Resolution || 2160 x 1200 (1080 x 1200 per eye)&lt;br /&gt;
|-&lt;br /&gt;
|Refresh Rate || 90 Hz&lt;br /&gt;
|-&lt;br /&gt;
|Field of View || ????&lt;br /&gt;
|-&lt;br /&gt;
|Optics || ???&lt;br /&gt;
|-&lt;br /&gt;
|Interaxial Distance || Adjustable&lt;br /&gt;
|-&lt;br /&gt;
|Tracking || 6 degrees of freedom&lt;br /&gt;
|-&lt;br /&gt;
|Rotational Tracking ||  [[Gyroscope]], [[Accelerometer]], [[Magnetometer]]&lt;br /&gt;
|-&lt;br /&gt;
|Positional Tracking || IR-LED faceplate and External Infrared Camera&lt;br /&gt;
|-&lt;br /&gt;
|Update Rate || Rotational: 400Hz &amp;lt;br&amp;gt; Positional: 100 Hz&lt;br /&gt;
|-&lt;br /&gt;
|Latency || ??&lt;br /&gt;
|-&lt;br /&gt;
|Connectivity || 1 external and 2 internal USB 3.0 ports &lt;br /&gt;
|-&lt;br /&gt;
|Weight || ??&lt;br /&gt;
|-&lt;br /&gt;
|Facemask || bamboo charcoal microfiber foam layer&lt;br /&gt;
|-&lt;br /&gt;
|Input || Various, such as Mouse and Keyboard, Gamepads, [[Leap Motion]], [[Nod]]&lt;br /&gt;
|-&lt;br /&gt;
| Cost || $399.99&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Components===&lt;br /&gt;
&lt;br /&gt;
===Add-ons===&lt;br /&gt;
&lt;br /&gt;
==Setup Tutorial==&lt;br /&gt;
&lt;br /&gt;
==Apps==&lt;br /&gt;
&lt;br /&gt;
==Developer==&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&#039;&#039;&#039;June 13, 2016&#039;&#039;&#039;: OSVR HDK2 is announced at E3 2016.&lt;br /&gt;
&lt;br /&gt;
[[Category:Virtual Reality Devices]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Template:Stub&amp;diff=10540</id>
		<title>Template:Stub</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Template:Stub&amp;diff=10540"/>
		<updated>2016-07-19T04:55:24Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;div class=&amp;quot;stub&amp;quot; style=&amp;quot;border: 1px solid #254b72; color: #000000; font-size: 110%; padding: 3px 3px 3px 3px;&amp;quot;&amp;gt;[[File:Information_icon1.png]] &#039;&#039;This page is a stub, please [http://xinreality.com/mediawiki/index.php?title=Special:CreateAccount expand it] if you have more information.&#039;&#039;&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;includeonly&amp;gt;[[Category:Stubs]]&amp;lt;/includeonly&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;noinclude&amp;gt;[[Category:Templates|Rewrite]]&amp;lt;/noinclude&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Template:Rewrite&amp;diff=10539</id>
		<title>Template:Rewrite</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Template:Rewrite&amp;diff=10539"/>
		<updated>2016-07-19T04:54:22Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;div class=&amp;quot;stub&amp;quot; style=&amp;quot;border: 1px solid #254b72; color: #000000; font-size: 110%; padding: 3px 3px 3px 3px;&amp;quot;&amp;gt;[[File:rewrite_icon1.png|40px]] &#039;&#039;This article needs to be updated and cleaned up, [http://xinreality.com/mediawiki/index.php?title=Special:CreateAccount please help] if you can.&#039;&#039;&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;includeonly&amp;gt;[[Category:Rewrite]]&amp;lt;/includeonly&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;noinclude&amp;gt;[[Category:Templates|Rewrite]]&amp;lt;/noinclude&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Template:Rewrite&amp;diff=10538</id>
		<title>Template:Rewrite</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Template:Rewrite&amp;diff=10538"/>
		<updated>2016-07-19T04:53:52Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;div class=&amp;quot;stub&amp;quot; style=&amp;quot;border: 1px solid #254b72; color: #000000; font-size: 110%; padding: 3px 3px 3px 3px;&amp;quot;&amp;gt;[[File:rewrite_icon1.png|40px]] &#039;&#039;This article needs to be updated and cleaned up, [{{SERVER}}{{localurl:{{NAMESPACE}}:{{PAGENAME}}|action=edit}} please help] if you can.&#039;&#039;&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;includeonly&amp;gt;[[Category:Rewrite]]&amp;lt;/includeonly&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;noinclude&amp;gt;[[Category:Templates|Rewrite]]&amp;lt;/noinclude&amp;gt;&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
	<entry>
		<id>https://vrarwiki.com/index.php?title=Pokemon_Go&amp;diff=10537</id>
		<title>Pokemon Go</title>
		<link rel="alternate" type="text/html" href="https://vrarwiki.com/index.php?title=Pokemon_Go&amp;diff=10537"/>
		<updated>2016-07-16T19:43:34Z</updated>

		<summary type="html">&lt;p&gt;Shadowdawn: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{stub}}&lt;br /&gt;
{{App Infobox&lt;br /&gt;
|image={{#ev:youtube|GQgbXJub-IQ|350}}&lt;br /&gt;
|Developer=[[Niantic Labs]]&lt;br /&gt;
|Publisher=[[The Pokémon Company]]&lt;br /&gt;
|Platform=&lt;br /&gt;
|Device=All iOS and Android Devices&lt;br /&gt;
|Operating System=[[iOS]], [[Android]]&lt;br /&gt;
|Type=[[Full Game]]&lt;br /&gt;
|Genre=[[Action/Adventure]]&lt;br /&gt;
|Input Device=&lt;br /&gt;
|Game Mode=[[Single-Player]], [[Multiplayer]]&lt;br /&gt;
|Comfort Level=&lt;br /&gt;
|Version=&lt;br /&gt;
|Rating=&lt;br /&gt;
|Downloads=&lt;br /&gt;
|Release Date=July 6, 2016&lt;br /&gt;
|Price=Free with microtransactions&lt;br /&gt;
|Website=http://www.pokemongo.com/&lt;br /&gt;
|Infobox Updated=7/14/2016&lt;br /&gt;
}}&lt;br /&gt;
[[Pokemon Go]] is a location-based [[augmented reality]] [[mobile game]] developed by [[Niantic Labs]] and published by [[The Pokemon Company]]. This [http://pkmngotrading.com/wiki/Pokemon Pokemon] game was released for all [[iOS]] and [[Android]] [[Devices]] on July 6, 2016.&lt;br /&gt;
==Review==&lt;br /&gt;
&#039;&#039;&#039;A Catch of Success with &amp;quot;Pokémon Go&amp;quot;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
By Paulo Pacheco on July 14, 2016&lt;br /&gt;
&lt;br /&gt;
[http://pkmngotrading.com/wiki/Pokemon_Go_Wiki Pokémon GO] has undoubtedly been a success since its launch on July 6, 2016. It is a phenomenon that has yet to see a worldwide release, but has nevertheless already captured the interest of millions of people. From the beginnings of the franchise created by Satoshi Tajiri – inspired by his childhood hobby of insect collecting – in the early 1990s, to its recent incarnation on smartphones, there seems to be no stopping this longtime series, even if there have been a few bumps in the road for the latest game app.&lt;br /&gt;
&lt;br /&gt;
Described on the official Pokémon GO website as a Real World Adventure, the [[augmented reality]] [[Augmented Reality Games|game]] was originally launched in three countries: the USA, Australia and New Zealand. It quickly grew in popularity, going viral. It is the fastest mobile game ever to reach No. 1 &amp;lt;ref name=&amp;quot;venturebeat&amp;quot;&amp;gt; http://venturebeat.com/2016/07/11/pokemon-go-outpaces-clash-royale-as-the-fastest-game-ever-to-no-1-on-the-mobile-revenue-charts/&amp;lt;/ref&amp;gt;, and it has become the biggest mobile game in US history, attracting just under 21 million daily active users. If this trend continues, it could even surpass the daily active users of [[Snapchat]] and [[Google Maps]] on [[Android]]&amp;lt;ref name=&amp;quot;surveymonkey&amp;quot;&amp;gt;https://www.surveymonkey.com/business/intelligence/pokemon-go-biggest-mobile-game-ever/&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The massive success inevitably brought an increase in [[Nintendo]]’s share value&amp;lt;ref name=&amp;quot;bloomberg&amp;quot;&amp;gt;http://www.bloomberg.com/quote/7974:JP&amp;lt;/ref&amp;gt;. Viral means good business, and App Annie communications boss Fabien Pierre-Nicolas has estimated that Pokémon GO could generate over $1 billion of net revenue for [[Niantic Labs]], the game’s developer &amp;lt;ref name=&amp;quot;venturebeat&amp;quot;&amp;gt;http://venturebeat.com/2016/07/11/pokemon-go-outpaces-clash-royale-as-the-fastest-game-ever-to-no-1-on-the-mobile-revenue-charts/&amp;lt;/ref&amp;gt;. All of this with an official release in only three countries. A phased roll-out has begun in Europe, with the release of the app in Germany on July 13 and in the United Kingdom on July 14. Other countries are expected to follow in the coming days or weeks &amp;lt;ref name=&amp;quot;twitter&amp;quot;&amp;gt;https://twitter.com/PokemonGoApp&amp;lt;/ref&amp;gt;&amp;lt;ref&amp;gt;http://www.pocket-lint.com/news/138196-pokemon-go-available-in-the-uk-at-last-get-it-on-itunes-and-google-play&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Even though a lot of the focus has been on Nintendo, with the boost in the company’s share value, we must not forget that this is a joint venture between The Pokémon Company, Nintendo and Niantic (with Google also in the mix, since Niantic was founded as an internal Google startup &amp;lt;ref&amp;gt;http://fortune.com/2016/07/12/google-pokemon-go/&amp;lt;/ref&amp;gt;). But even if Nintendo has only a minority stake in Pokémon GO, the success of the game app means exposure for the Japanese video game company, something much needed since many have viewed Nintendo as being in decline after the success of the Wii &amp;lt;ref&amp;gt;http://www.forbes.com/sites/erikkain/2016/07/11/will-pokemon-go-be-the-nintendo-cash-cow-investors-are-hoping-for/#2e2b3d765926&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The game builds on the foundations of another Niantic creation: [[Ingress]]. It blends real-world exploration with a digital overlay, using the smartphone’s geo-localization and camera functions to superimpose images of [http://pkmngotrading.com/wiki/Pokemon Pokémon] to be captured. Its success can be attributed to this blend of the virtual and the real, the geo-location and, of course, the massive appeal of the Pokémon brand. The allure of hunting down and collecting Pokémon is still high &amp;lt;ref&amp;gt;http://theconversation.com/whats-made-poke-mon-go-such-a-viral-success-62420&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
People are also moving, gathering, and exploring outdoors because of the app. There have been anecdotal reports that the game is helping people with [[Mental health|depression, anxiety and agoraphobia]] leave the house, by providing the motivation needed to confront their conditions &amp;lt;ref&amp;gt;http://www.themarysue.com/pokemon-go-mental-health/&amp;lt;/ref&amp;gt;. It’s not a cure and, as stated, these health benefits are only anecdotal, but it’s an example of how powerful game design can be in providing a system of motivation and rewards. The fact is that walking and spending more time outdoors are good for you &amp;lt;ref&amp;gt;http://www.heart.org/HEARTORG/HealthyLiving/PhysicalActivity/Walking/Walk-Dont-Run-Your-Way-to-a-Healthy-Heart_UCM_452926_Article.jsp&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt; http://www.health.harvard.edu/press_releases/spending-time-outdoors-is-good-for-you&amp;lt;/ref&amp;gt;, and Pokémon GO seems to be making a lot of people do both.&lt;br /&gt;
&lt;br /&gt;
There have also been problems since the game’s release: servers going down under the overflow of players (which even delayed the worldwide release of the app), bugs, and a myriad of strange occurrences, such as a teenager discovering a dead body while playing the game &amp;lt;ref&amp;gt;http://www.forbes.com/sites/davidthier/2016/07/07/pokemon-go-servers-seem-to-be-struggling/#64880df14958&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt;http://www.inverse.com/article/18130-a-short-history-of-the-police-s-weird-relationship-with-pokemon-go&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt;http://tek.sapo.pt/mobile/apps/artigo/pokemon_go_ja_deu_aso_a_uma_mao_cheia_de_situacoes_bizarras-48106umv.html&amp;lt;/ref&amp;gt;. Recently, there have also been concerns over privacy. Democratic senator Al Franken has written a letter to Niantic Labs expressing his worries about the company’s collection, use and sharing of users’ personal information &amp;lt;ref&amp;gt;http://www.i4u.com/2016/07/113286/pokemon-go-success-has-alarmed-us-senator-al-franken&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt;http://money.cnn.com/2016/07/13/technology/pokemon-go-al-franken/&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
These troubles don’t seem to be affecting interest in the game, although questions remain as to whether it will keep its momentum or fade away like so many other apps. An example closer to Nintendo is Miitomo, which had early success but could not sustain it &amp;lt;ref name=&amp;quot;surveymonkey&amp;quot;&amp;gt;https://www.surveymonkey.com/business/intelligence/pokemon-go-biggest-mobile-game-ever/&amp;lt;/ref&amp;gt; &amp;lt;ref&amp;gt;https://www.surveymonkey.com/business/intelligence/rise-fall-nintendos-miitomo-downloads-arent-enough/&amp;lt;/ref&amp;gt;. With the full worldwide release of Pokémon GO, we will see whether its success is just novelty or whether the game is indeed well designed and capable of holding the attention and dedication of players for a long time.&lt;br /&gt;
&lt;br /&gt;
A game originally inspired by natural fauna – by being outdoors and exploring a world filled with novel and wonderful creatures – has now come full circle, inviting people to explore their surroundings and their wonders by merging the real world with the digital creation of the pocket monsters. Whatever the future holds, the impact of Pokémon on gaming culture is undeniable.&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Apps]] [[Category:AR Apps]] [[Category:Games]] [[Category:Augmented Reality Games]] [[Category:AR Games]] [[Category:iOS Apps]] [[Category:Android Apps]]&lt;/div&gt;</summary>
		<author><name>Shadowdawn</name></author>
	</entry>
</feed>