This is pretty exciting: Valve may have been a bit late to the OpenXR party, but they're throwing their weight behind it. OpenXR is looking like it'll become The API, despite long odds! https://steamcommunity.com/games/250820/announcements/detail/2522527900755718764
"In early 2021 we’ll offer a new way to distribute your work in the Quest ecosystem"
I'll believe it when I see it. How will they handicap this? How will they maintain their requisite absolute control? And we won't see it until sometime next year? 🤔 https://developer.oculus.com/blog/the-next-chapter-of-oculus-development-and-a-new-quest-distribution-path-coming-in-2021/
This happened a year or two ago, but it's stuff like this that makes me feel that the web will never be excellent, and at best it will be good. On what other platform would it be acceptable to purposefully make a feature worse like this? Seems like they just shrugged, threw their hands in the air, and said, "that's the best we got."
Not sure if this idea will work out, but it didn't even occur to me as a possibility on Tuesday. This is why it's good to put things on the shelf for a bit and think on them. Now it's time to code it up, see if it actually works, and compare the developer experience against traditional raycasting.
Instead, I'm going to try building out a retained-mode raycasting system. That is, you create "Raycaster" objects that can be trivially enabled/disabled/repositioned, even frame-by-frame. When enabled, the Raycaster has a fresh result every frame: THAT is available synchronously.
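A minimal sketch of what that retained-mode API could look like. All the names here (`RaycastSystem`, `Hit`, the `cast` callback) are hypothetical placeholders, not any real engine's API; the point is that game code reads `result` synchronously while the actual scene query happens once per frame inside the async loop:

```typescript
// Hypothetical retained-mode raycasting: Raycaster objects persist across
// frames; the system refreshes each enabled raycaster's result once per
// frame, so reading a result never blocks or touches the scenegraph.

type Vec3 = { x: number; y: number; z: number };
type Hit = { point: Vec3; distance: number } | null;

class Raycaster {
  enabled = true;
  origin: Vec3 = { x: 0, y: 0, z: 0 };
  direction: Vec3 = { x: 0, y: 0, z: -1 };
  // Last frame's result; available synchronously at any time.
  result: Hit = null;
}

class RaycastSystem {
  private casters: Raycaster[] = [];

  create(): Raycaster {
    const r = new Raycaster();
    this.casters.push(r);
    return r;
  }

  // Called once per frame, at the one point where synchronous scenegraph
  // access is allowed. `cast` stands in for the engine's intersection test.
  update(cast: (origin: Vec3, dir: Vec3) => Hit): void {
    for (const r of this.casters) {
      if (r.enabled) r.result = cast(r.origin, r.direction);
    }
  }
}
```

Game code repositions a `Raycaster` frame-by-frame and reads `result` the next frame, rather than issuing an immediate raycast call at arbitrary call sites.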
On Tuesday, I had two ideas on this. First, is there a way to pause script execution and run some synchronous calls? Not really, esp. not for unknown-N arbitrary call sites. Second, can we keep a copy of all physics bodies & do our own raycast? Yes, but not efficient or accurate.
I'm glad I sat on this for a few days while dealing with other dev & ops work. In the background, my brain was crunching on it. The standard approach is for raycasting to happen immediately, synchronously, assuming full synchronous access to the scenegraph. What if you don't?
Putting everything in an asynchronous loop is all fine and dandy until you have to raycast. Wondering how I made it so far into development on this without needing to raycast. Coming to the realization that I need to write my own so it can be done synchronously within the async loop 😡 Maybe it's time to call it a day.
Making VR - https://www.soundboxing.co
The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!