| Category | Hard disks |
| Created | 2015-09-19 |
| Owner | sandywang5230 |
| Title | the user pushes their head through a virtual wall |
Description:

…or a shared library.

Why did Oculus announce it was delaying shipping DK2 units to do more work on the SDK? Why not, for example, just ship the units when they were ready and push the SDK update live once it was done?

The earlier 0.3 Oculus SDK code branch wasn't truly ready for DK2. It didn't include the display driver or the service model that make the headset significantly easier to use. Forcing developers to manage the portrait display manually would have led to frustration, plus the creation of applications that rely on the old display setup. With the 0.4 SDK and runtime nearly ready, we needed that extra week to improve its stability and robustness. Shipping the SDK alongside the hardware meant that developers would have a better out-of-the-box experience. We pulled in the schedule on 0.4 to bring the big improvements to DK2 right at launch, and we needed the extra time to stabilize the newest features.

Can you give me some clear examples of how developers can make good use of those new features?

To enable positional tracking, the Oculus SDK reports head pose as a combination of an orientation quaternion and a 3D position vector in space. In earlier versions of the SDK, translation was computed solely from the head model; starting with DK2, it includes correct positional data while the user is within the tracking volume. It should be easy to apply this tracking data to the camera view in most game engines, allowing players to move around in 3D space.

Translating in 3D virtual space is, however, the easy part of the challenge. Next you'll need to figure out how head translation interacts with game scenery and engine mechanics. What happens, for example, if the user pushes their head through a virtual wall? Or moves out of the camera's tracking range? Sergio Hidalgo discussed some of the challenges related to positional tracking in his article "VR: Letting Go of the Avatar".
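To illustrate the tracking data described above, here is a minimal, self-contained sketch of applying a head pose (orientation quaternion plus position vector) to a game camera. The types and function names are illustrative, not the actual Oculus SDK API:

```cpp
#include <cmath>
#include <cassert>

// Minimal stand-ins for the pose data the SDK reports: a unit quaternion
// for head orientation and a 3D vector for head position.
struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };  // unit quaternion

Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// Rotate v by unit quaternion q using  v' = v + 2w(u x v) + 2(u x (u x v)),
// where u is the quaternion's vector part.
Vec3 rotate(Quat q, Vec3 v) {
    Vec3 u = { q.x, q.y, q.z };
    Vec3 t = cross(u, v);
    t = { 2 * t.x, 2 * t.y, 2 * t.z };
    Vec3 tt = cross(u, t);
    return { v.x + q.w * t.x + tt.x,
             v.y + q.w * t.y + tt.y,
             v.z + q.w * t.z + tt.z };
}

// Compose the tracked head position with the avatar's base position to get
// the camera's world-space position; the head orientation then drives the
// camera's view direction via rotate().
Vec3 cameraPosition(Vec3 avatarPos, Vec3 headPos) {
    return { avatarPos.x + headPos.x,
             avatarPos.y + headPos.y,
             avatarPos.z + headPos.z };
}
```

Most engines expose an equivalent hook: each frame, read the latest head pose and offset the first-person camera by the tracked translation instead of leaving it fixed at the avatar's eye point.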
One option for handling walls is to fade out the screen until the player moves back into a known space, but more elegant solutions may be waiting to be discovered.

Beyond first-person experiences, positional tracking provides a new dimension of input for developers to explore. While a handful of very new experiences like Lucky's Tale and Superhot have highlighted some of these new possibilities, I'm excited to see what the broader Oculus development community comes up with once they have a DK2 and the new SDK.

I'm also looking forward to seeing developers begin to leverage the new display driver. From an engineering perspective, it's quite easy to use: just create a window whose swap chain matches the resolution of the Rift and call ovrHmd_AttachToWindow on it. All of the swap chain output will show up on the Rift. Having the output redirected from a window does, however, open up the possibility of using the window surface for other things. Besides mirroring, it could potentially be used to display a third-person view or game statistics to an external observer.

So how has your work with engine makers like Unity and Epic evolved, and how is that reflected in the new SDK?
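The display-driver flow described above might look roughly like this against the 0.4-era C API. This is a sketch from memory, not a verified build: consult the SDK headers for the exact signatures, and the window-creation call is a placeholder for platform-specific code:

```cpp
// Open the first detected headset (0.4-era C API; check OVR_CAPI.h).
ovrHmd hmd = ovrHmd_Create(0);

// Create a native window whose swap chain matches the Rift's resolution.
// createWindow() is a hypothetical stand-in for platform window creation.
void* nativeWindow = createWindow(hmd->Resolution.w, hmd->Resolution.h);

// Redirect the window's swap-chain output to the headset; after this call,
// everything presented to the window shows up on the Rift, leaving the
// desktop window surface free for mirroring or a spectator view.
ovrHmd_AttachToWindow(hmd, nativeWindow, NULL, NULL);
```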
| Type | PC |
| Price | |
| Promotion level | None |
