Darkroland wrote:
From what I've read, they'll be focusing a lot on scalability with room-scale, everything from a relatively small area up to 15x15.
I have to say, after trying a vive out, I'm preordering. The presence is amazing, and I can't even imagine VR without the 1:1 hand tracking at this point. It's amazingly immersive.
I think I may not have been clear. Yes, the Lighthouse technology is definitely designed to be remarkably scalable. But how much care and attention does a developer have to put into their UI to make the experience work in differently sized environments?
To use an example -- let's say that I'm supposed to be able to walk around my room to, I don't know, investigate a crime scene -- maybe in a Heavy Rain-type game. The developers map out the game world area to be roughly the same size as their target room size of, I don't know, 12x12. So in a 12x12 room, I can walk around the entire scene basically one-to-one, and it's really cool and immersive and intuitive.
Now, my neighbor has his setup in a smaller room. His room is 10x10, with 2 feet taken up by his computer desk. How does the game map this 10x8 area onto its 12x12 space? Maybe the API takes care of it with straight scaling. Does that get disorienting? After all, not only is my neighbor's movement not one-to-one the way the developers were targeting, but the scaling factor differs depending on which direction he moves: 1.2x along the 10-foot axis, but 1.5x along the 8-foot one. That could get really trippy and potentially even nauseating, especially if the Lighthouse system autodetects an even more oblique space like 6x12. Or maybe the API doesn't take care of it automatically, precisely because of quirks like this, and it's judged better to let the developers handle it in a way that's tailored to their games' needs.
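To make that concrete, here's a minimal sketch of the straight-scaling case. The names and the per-axis approach are pure assumptions on my part for illustration -- this isn't anything from Valve's actual SDK:

```python
# Hypothetical straight scaling of a 10x8 ft physical area onto a 12x12 ft
# virtual space. Illustration only -- not any real SDK's API.

PHYSICAL = (10.0, 8.0)   # tracked play area in feet (x, z)
VIRTUAL = (12.0, 12.0)   # game-world area in feet

def to_virtual(px, pz):
    """Map a physical position into the virtual space, scaling each axis."""
    sx = VIRTUAL[0] / PHYSICAL[0]   # 1.2x along the 10 ft axis
    sz = VIRTUAL[1] / PHYSICAL[1]   # 1.5x along the 8 ft axis
    return (px * sx, pz * sz)

# One real foot of movement covers a different virtual distance by direction:
print(to_virtual(1.0, 0.0))   # (1.2, 0.0)
print(to_virtual(0.0, 1.0))   # (0.0, 1.5) -- the anisotropy I'm worried about
```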
So now, maybe Developer A decides to keep the scale factor uniform, and only recognizes input in an 8x8 portion of my neighbor's space (sketched below). Developer B decides that they'll introduce some kind of scrolling system when my neighbor stands two feet from a wall, to avoid scaling at all. Are these adequate solutions? Maybe one gets implemented poorly -- Developer B's got a great idea, but they don't account well for spaces bigger than the 12x12 target, and so if you move away from the center of your room, you clip through the intended constraints of the space and start seeing culled faces and such in the game environment.
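Developer A's option is easy enough to sketch, using the same made-up numbers as above: keeping the scale uniform just means letting the most constrained axis win, and eating the lost floor space:

```python
# Developer A, sketched: pick one uniform scale factor from the most
# constrained axis, accepting that only part of the room gets used.

def uniform_fit(physical, virtual):
    """Return a uniform scale and the usable slice of physical space."""
    scale = max(virtual[0] / physical[0], virtual[1] / physical[1])
    usable = (virtual[0] / scale, virtual[1] / scale)
    return scale, usable

scale, usable = uniform_fit((10.0, 8.0), (12.0, 12.0))
print(scale, usable)   # 1.5 (8.0, 8.0) -- consistent movement, smaller area
```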
That's what I was talking about when I said scalability, and how it's handled, will be important. Not the tech factor of the Lighthouse hardware actually detecting position in different-sized rooms, which I've read enough about to be confident Valve is on top of. But just because Valve is on top of this kind of consideration doesn't mean that every developer they hand their APIs to will be, and that'll reflect on the relative value of the Lighthouse room-scale thing. The tech can be cool as hell and still end up with **** software implementations. Just like the Wii and Kinect can be awesome, powerful hardware with paradigm-shifting potential and end up gimmicky, ignored, or used for dance games.
I totally agree with you on the hand tracking thing. I think that Oculus may have made a serious misstep in fracturing their install base over their Touch hardware. It's one thing to say "Hey, guys, we got started on this late -- initially we thought the third-party solutions (from Razer et al.) were going to be more mature and interoperable, and when we realized they weren't, we were coming into it too late to hit our headset's launch window. But we're committed to delivering the Oculus Touch experience to everybody when it's ready, so that our game developers can count on you having the best, most intuitive, and most immersive hand-tracking input system to pair with the headset you're buying. To make sure that happens, we're including it in the Oculus Rift package, and if you're scheduled to get your Rift pre-order delivered before they're ready in the Summer, we'll ship them to you separately on our dime. Also, we've partnered with Microsoft to include a top-notch game controller in the meantime."
It's quite another thing to say "Yeah, we think that having standardized controls is important, so here's the best traditional game controller, which doesn't really play into the VR thing very naturally in the long run. But wait till you see the accessory to our platform that comes out in the Summer, just in time for developers to ignore it as irrelevant and not properly part of the Rift experience."
Now, Oculus might surprise me and recover somehow. Maybe that'll come by leveraging their control over access to their Oculus Store -- after the Touch comes out, they could certainly say "Okay, everybody, you've got 6 months to integrate Touch controls if you want to continue to be sold through our platform." Or they could say, "Starting in September, any game added to the Oculus store needs to support all current Oculus hardware."
The good news for Oculus might turn out to be that the Oculus Touch can match the Vive's hand controllers feature for feature, and that the translation between the two can be handled well enough at the API level that porting Valve's OpenVR stuff over to the Touch is trivial. Then, if it turns out that the gesture and finger-pose features on the Touch are big enough deals, they might be able to penetrate the user input market of Vive owners, too, since it'll be a standalone SKU.
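If that happens, the translation layer could be as thin as a common hand-state schema that both controllers fill in. To be clear, this is emphatically not OpenVR's real interface -- just a made-up illustration of why the conversion might end up trivial:

```python
# Made-up abstraction layer -- not OpenVR's actual API. It only shows that
# if both controllers report the same core state, a game written against
# the common schema runs on either device.

from dataclasses import dataclass, field

@dataclass
class HandState:
    position: tuple               # (x, y, z) in meters
    trigger: float                # 0.0 .. 1.0
    grip: float                   # 0.0 .. 1.0
    finger_poses: dict = field(default_factory=dict)  # empty if unsupported

def from_vive_wand(raw):
    # "raw" dicts stand in for whatever the real drivers actually report
    return HandState(raw["pos"], raw["trigger"], raw["grip"])

def from_touch(raw):
    return HandState(raw["pos"], raw["trigger"], raw["grip"], raw["fingers"])

# The open question is what a game does when finger_poses comes back empty.
```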
It'll be an exciting year, is all I know for sure. And probably a bit rocky until this all gets ironed out and we're looking at a fairly stable platform with incremental, interoperable upgrades.