I was originally going to write this up myself, but I found that Unity's official site already covers it in detail, so I'm posting part of the official article here directly.
Demo introduction: http://unity3d.com/cn/learn/tutorials/topics/virtual-reality/movement-vr?playlist=22946
Along with not achieving the target frame rate - covered in the Overview article - movement in VR is one of the primary causes of VR sickness in users, and as such it requires careful consideration before implementing a solution in your project. This is best considered very early on in development - ideally during the concept phase - as it can heavily impact your project if your chosen movement solution is discovered to induce nausea.
The feeling of nausea - also known as VR Sickness - is partly due to the real-world user’s body being stationary while their virtual point-of-view moves around a virtual environment. Vection is also closely related, and should be avoided wherever possible.
In general, a good rule is to avoid moving the camera unless it's copying user movement, as doing otherwise can cause problems with the Vestibular system. Put simply, Vection is the effect of confusing the user's brain with conflicting signals from the eyes and ears; the body can assume it has been poisoned, and thus a similar response is evoked - a rejection in the form of sickness. As with everything, there are exceptions, so do experiment and test with as wide an audience as possible to see what works with your game.
Note that the issue of Vection causing nausea is something which is less apparent in VR systems which use spatial tracking, such as the room-scale HTC Vive - but this still relies on you, as a developer, not moving the user around the environment and thereby confusing the ear / eye balance.
Currently the most comfortable VR experiences are stationary, usually with the user being seated. As a result, turret-style games are very popular in VR: See our Shooter 180 (Target Gallery) and Shooter 360 (Target Arena) scenes for examples of this style of game.
Traditional first-person character control with mouse and WASD or a gamepad frequently induces nausea, so it is best avoided. If you do decide to use first-person control, test your movement with as many users as possible, and definitely disable head-bobbing. You might also want to look into disabling smooth yaw and replacing it with snapping rotation, possibly combined with a very quick camera fade. Take a look at the CameraOrbit component below for more information on quick fades and snapping rotation.
If you do need to implement movement, whether for technical reasons (see the Maze example below) or otherwise, then consider giving the user a constant static visual reference - such as a cockpit, cabin, the interior of a car, or similar. To see examples of cockpits being used in Unity VR games, take a look at Radial-G, Lunar Flight, and Titans of Space.
A popular method for moving between positions in a virtual environment is to implement fade to black transitions - a very quick fade to black, move the camera to the desired position, and then fade back up. An alternative, slightly more complex option would be to implement blink transitions, in which the user blinking is emulated with two black planes. Tom Forsyth at Oculus covers this in his video at Oculus Connect 2014.
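As a rough illustration, here is a minimal sketch of such a fade-move-fade transition. It is not the implementation used in the VR Samples project; the fadeImage field (a full-screen black UI Image in front of the camera), the cameraRig reference, and the durations are all assumptions for this example.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical helper: fades a full-screen black Image, teleports the camera rig
// while the screen is black, then fades back up.
public class FadeTeleporter : MonoBehaviour
{
    [SerializeField] private Image fadeImage;             // full-screen black Image on a canvas in front of the camera
    [SerializeField] private Transform cameraRig;         // parent of the VR camera - movement is applied to this rig
    [SerializeField] private float fadeDuration = 0.15f;  // keep each fade very short

    public void TeleportTo(Vector3 position, Quaternion rotation)
    {
        StartCoroutine(FadeMoveFade(position, rotation));
    }

    private IEnumerator FadeMoveFade(Vector3 position, Quaternion rotation)
    {
        yield return Fade(0f, 1f);                                      // fade to black
        cameraRig.SetPositionAndRotation(position, rotation);           // move while the screen is black
        yield return Fade(1f, 0f);                                      // fade back up
    }

    private IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            SetAlpha(Mathf.Lerp(from, to, t / fadeDuration));
            yield return null;
        }
        SetAlpha(to);
    }

    private void SetAlpha(float alpha)
    {
        Color c = fadeImage.color;
        c.a = alpha;
        fadeImage.color = c;
    }
}
```

Keeping the fades very short preserves the "blink" feel described above without leaving the user in the dark for a noticeable amount of time.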
As mentioned in our Getting Started with VR Development article, you cannot move the camera directly in Unity. Instead, the camera must be a child of another GameObject, and changes to the position and rotation must be applied to the parent’s Transform.
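As a tiny illustration of that rule, assuming a hierarchy such as a "CameraRig" parent with the VR camera as its child (the names here are placeholders), scripted movement targets the parent:

```csharp
using UnityEngine;

// Illustration of the parenting rule above. The HMD writes its tracked pose into
// the child camera's local transform every frame, so scripted position and rotation
// changes go on this parent object instead.
public class CameraRig : MonoBehaviour
{
    public void MoveTo(Vector3 position, Quaternion rotation)
    {
        // Correct: reposition the parent; the tracked camera follows as a child.
        transform.SetPositionAndRotation(position, rotation);

        // Incorrect: writing to the camera's own transform would fight with
        // head tracking each frame.
        // Camera.main.transform.position = position;
    }
}
```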
We wanted to make sure that users had a comfortable VR experience in our VR Samples project, so most scenes (Menu, the Target games) contain no movement of the camera. However, both Flyer and Maze contain movement: let's take a look at what movement they have, and why we chose it. Note that we marked those scenes as 'Comfortable for some' instead of 'most' in the Menu user interface; while we don't want to deter users at this early stage of the VR medium, we feel it's only fair to set expectations to ensure a positive impression.
The camera in our Flyer sample scene moves directly forward at a constant rate, and is already moving when the camera fades in from the introduction UI, so that we do not experience acceleration from a static shot - something which commonly causes nausea.
The trails behind the ship and the particles moving around the edges of the environment help give an impression of speed and of racing into the distance. They also provide a frame of reference as you look around, which helps reduce nausea.
A similar effect could be achieved by keeping the camera stationary, and moving everything else relative to the camera.
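For instance, a minimal sketch of that alternative might look like the following, assuming the moving scenery is grouped under a single parent object (the class name, speed value, and grouping are assumptions, not how Flyer is actually built). Because the speed is constant and the object is already in motion when enabled, there is no acceleration from a standstill.

```csharp
using UnityEngine;

// Sketch of the alternative mentioned above: the camera stays still and the
// environment scrolls past it at a constant rate, giving a similar sense of
// forward motion. Attach to a parent object containing all moving scenery.
public class ScrollingEnvironment : MonoBehaviour
{
    [SerializeField] private float speed = 10f;   // constant speed - no acceleration from a static shot

    private void Update()
    {
        // Move the environment towards (and past) the stationary camera rig.
        transform.Translate(Vector3.back * speed * Time.deltaTime, Space.World);
    }
}
```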
The Maze scene is an example of a top-down “boardgame”-style game, with the camera appearing to be in a stationary position, and the maze rotating via swipes.
However, as we use lightmapping and a NavMesh for the character to navigate the maze, we can’t rotate the board, as these rely on the objects being marked as static. Instead, we have the following hierarchy:
The VROrbitCamera object is positioned in the center of the maze, with the MazeUnityLogo object on one side of the maze, and MainCamera on the other:
This allows us to rotate the VROrbitCamera object, and the children pivot around the center of the Maze. The Unity logo acts as a visual reference frame, and tricks us into thinking that only the board is rotating.
To accomplish the rotation, the VROrbitCamera has two components:
CameraOrbit, which handles the rotation
Rigidbody, which allows us to easily change the Mass and damping parameters
This CameraOrbit component has three important Orbit Style options:
Smooth - This is the option we’re using in the project. This rotates the board smoothly, and slowly dampens the movement. Try changing the Rotation Increment and the properties on the RigidBody to achieve different variations on the speed, damping, and amount the camera orbits per swipe.
Step - Sometimes we might want to immediately snap to the next position when we rotate. Changing Rotation Increment will alter how much the camera rotates.
Step With Fade - This works the same as Step, but very quickly fades the camera to black, rotates the board, and fades back up. Fading the camera quickly, moving the camera’s position, and fading back up is a useful way to move the user around a VR environment without inducing nausea. Changing Rotation Fade Duration will alter the duration of each fade.
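To make the three styles more concrete, here is a much-simplified sketch of how an orbit parent could behave in each mode. It is not the actual CameraOrbit source from the VR Samples project; swipe input handling is omitted, and the class, field names, and torque units are illustrative only.

```csharp
using System.Collections;
using UnityEngine;

// Much-simplified sketch of the three orbit behaviours described above.
// Attach to an object at the centre of the maze, with the camera and logo as
// children, so that rotating this parent pivots them around the maze.
[RequireComponent(typeof(Rigidbody))]
public class SimpleCameraOrbit : MonoBehaviour
{
    public enum OrbitStyle { Smooth, Step, StepWithFade }

    [SerializeField] private OrbitStyle orbitStyle = OrbitStyle.Smooth;
    [SerializeField] private float rotationIncrement = 45f;      // degrees (or spin impulse) per swipe
    [SerializeField] private float rotationFadeDuration = 0.1f;  // duration of each quick fade

    private Rigidbody rb;

    private void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;                               // only used for mass / angular drag damping
        rb.constraints = RigidbodyConstraints.FreezePosition; // the pivot should spin in place, not translate
    }

    // Call with +1 or -1 when a swipe is detected (input handling omitted here).
    public void Rotate(float direction)
    {
        switch (orbitStyle)
        {
            case OrbitStyle.Smooth:
                // Give the rigidbody some spin; its angular drag then damps it over time.
                rb.AddTorque(Vector3.up * direction * rotationIncrement, ForceMode.VelocityChange);
                break;
            case OrbitStyle.Step:
                // Snap immediately to the next position.
                transform.Rotate(0f, direction * rotationIncrement, 0f, Space.World);
                break;
            case OrbitStyle.StepWithFade:
                StartCoroutine(StepWithFade(direction));
                break;
        }
    }

    private IEnumerator StepWithFade(float direction)
    {
        // Fade to black here (e.g. with a fade helper like the one sketched earlier).
        yield return new WaitForSeconds(rotationFadeDuration);
        transform.Rotate(0f, direction * rotationIncrement, 0f, Space.World);
        // Fade back up.
        yield return new WaitForSeconds(rotationFadeDuration);
    }
}
```

Adjusting the rigidbody's mass and angular drag, together with the rotation increment, changes how far and how quickly the Smooth mode orbits per swipe.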
Movement and avoiding nausea are critical areas to consider when developing VR content. We highly recommend thoroughly reading the Oculus guides on Simulator Sickness and Motion, which cover these important topics in depth.
Our other two sample scenes simply use rotational head tracking for aiming, along with a simple firing input. The environment does not move around the player - targets simply appear for them to look at and shoot within the static environment.
These scenes are marked as 'Comfortable for most' in the Menu scene, as this kind of gameplay has proven to be the least likely to cause nausea in users.
You should now have a basic understanding of movement in VR, how we accomplished this in our sample scenes, and why we chose the type of movement in each one. You also have several resources for further reading on these topics.
Original article: http://blog.csdn.net/qq_15807167/article/details/52053070