I'm having some trouble with locking

I don’t appear to be able to use spatial anchors or locks of any kind - no matter the scale, passthrough or not, with a freshly modelled room, reset spatial anchors, reset everything … nothing, nada, rien!

I do however see ‘Spatial system not ready’ and ‘Can’t lock - Waiting for spatial anchor system’ on my wrist … I wonder, is there a debug log anywhere?

Also, is it possible to define spatial anchors manually?

hi @Oom_Wat and welcome! If you’re feeling social, stop by and say hello!

Spatial anchors are a tricky lil fella, and it looks like you’ve tackled some of the troubleshooting as well. There are a few other things you can do that might help.

Walk around your space

This sounds a bit silly, but for the anchor system to activate in Arkio, the headset needs to understand your space before it can drop an anchor :anchor: A good way to see this is to watch how your boundary (the standard grid Meta uses on all headsets) builds a mesh of your space while you’re creating it. I’ve noticed that the headset will recommend a small seated boundary while I’m standing in a wide-open space. The remedy in Arkio is to walk around your space and slowly move your head around so your space gets recognized.

Check your headset for updates

Make sure your headset OS is up to date. The Spatial Anchors API shipped a while ago, and I’m guessing you’ve probably covered this already, but you never know…

Clear and reset your boundary data

In the Physical Space settings in your headset, clear your boundary data and set up a new one. I do this quite often, as it’s the best way to get an accurate floor position. I’m not sure if it actually helps trigger the anchoring system, but I choose to believe it does so I can feel better about my actions.

Using the Lock Position button you can… but that brings us back to square 1.

Let me know if this helps!

I’ll give the boundary reset a go; I’m fairly sure I tried that already, but hey.

Also, note this is with a blank scene - the Quest room import expects walls to be straight and plumb, but my house is made of rocks and lime, so nothing is square.

Also, Arkio seems to just refuse to import rooms, so manually placing geometry is easier.

Is there any debug logging that can be switched on?

OK, I think you might be onto something … while I was re-creating the boundary I noticed that it doesn’t let me create a roomscale boundary - it forces me to use a stationary one.

Would that make a difference to being able to pick up the spatial anchors … does a stationary boundary even have anchors?

Ah, sorry … false positive … Turns out you need to drag the line to create a roomscale boundary rather than doing point-to-point.

Still no spatial anchors after creating a roomscale boundary.

Are you trying to import the room first before creating an anchor? This shouldn’t impact anchors, but it’s good to know the steps you’re taking.

This is really good to know. Because Arkio is a mixed reality app, we don’t require a boundary space to be created; instead, Meta generates one at runtime that we can use.

Nope, doing the boundary reset is merely a way of seeing how Meta is understanding your space. Arkio generates a boundary dynamically.

The headset shouldn’t need to be told whether it’s stationary vs. room scale; it should be able to suggest room scale if your space is clear-ish and has some light. This is similar to what happens when Arkio is trying to establish an anchor. If your space keeps coming up as stationary and you need to set it manually, then I would assume Arkio is not getting enough spatial context to drop an anchor.

Try redoing this process and see how much you need to move around before the headset understands it’s at room scale and not stationary. Ideally, that should represent the amount of spatial context needed to grab an anchor.
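For a sense of what “dropping an anchor” looks like under the hood, here’s a minimal sketch (not Arkio’s actual code) using Meta’s OpenXR spatial entity extension. It assumes an already-initialized XrInstance/XrSession and a base XrSpace, and the function and struct names are from the XR_FB_spatial_entity extension as I recall them, so double-check against the OpenXR registry:

```c
// Minimal sketch: request a spatial anchor via Meta's XR_FB_spatial_entity
// OpenXR extension. Assumes `instance`, `session`, `baseSpace` (e.g. a STAGE
// reference space) and `now` (an XrTime) already exist. Not Arkio's code.
#include <openxr/openxr.h>
#include <stdio.h>

XrResult request_spatial_anchor(XrInstance instance, XrSession session,
                                XrSpace baseSpace, XrTime now)
{
    // Extension functions are not exported directly; load the pointer first.
    PFN_xrCreateSpatialAnchorFB pfnCreateSpatialAnchorFB = NULL;
    XrResult res = xrGetInstanceProcAddr(instance, "xrCreateSpatialAnchorFB",
                                         (PFN_xrVoidFunction*)&pfnCreateSpatialAnchorFB);
    if (res != XR_SUCCESS || pfnCreateSpatialAnchorFB == NULL) {
        printf("spatial entity extension not available (res=%d)\n", res);
        return res;
    }

    // Anchor at the identity pose of the base space, i.e. "right here".
    XrSpatialAnchorCreateInfoFB info = { XR_TYPE_SPATIAL_ANCHOR_CREATE_INFO_FB };
    info.space = baseSpace;
    info.poseInSpace.orientation.w = 1.0f;  // identity rotation, zero offset
    info.time = now;

    // Creation is asynchronous: the runtime answers later with a completion
    // event, and (as I understand it) the request can only complete once the
    // headset has enough spatial understanding of the room.
    XrAsyncRequestIdFB requestId = 0;
    res = pfnCreateSpatialAnchorFB(session, &info, &requestId);
    printf("anchor create request submitted: res=%d requestId=%llu\n",
           res, (unsigned long long)requestId);
    return res;
}
```

The key point is that the call only *requests* an anchor; the runtime completes it later, which lines up with the advice above about walking the space so the headset builds enough context first.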

If, in the process of manually modelling a 3 ft thick wall, a 5 ft thick wall, and the tunnel-like doorway where they join, I haven’t given the headset time to work out where I am, despite the geometry mostly matching the physical space, then I’m not sure it’s ever going to catch up!

Would it be possible to add logging to the codebase, or at least give us somewhere to see the stack traces that must be getting generated when it borks?

I think it’s less about the headset “keeping up” and more about it securing enough data to start tracking. Once the first anchor is dropped, the tracking system does a great job of updating.
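To illustrate the “keeps updating” part: once an anchor exists, an app typically just locates the anchor’s space against a reference space every frame, and the runtime keeps refining that pose as tracking improves. A minimal sketch with core OpenXR calls (the names `anchorSpace`, `stageSpace` and `predictedDisplayTime` are assumed to exist; again, not Arkio’s code):

```c
// Minimal sketch: query an anchor's pose each frame with core OpenXR.
#include <openxr/openxr.h>
#include <stdio.h>

void poll_anchor_pose(XrSpace anchorSpace, XrSpace stageSpace,
                      XrTime predictedDisplayTime)
{
    XrSpaceLocation location = { XR_TYPE_SPACE_LOCATION };
    XrResult res = xrLocateSpace(anchorSpace, stageSpace,
                                 predictedDisplayTime, &location);

    if (res == XR_SUCCESS &&
        (location.locationFlags & XR_SPACE_LOCATION_POSITION_TRACKED_BIT)) {
        printf("anchor at (%.2f, %.2f, %.2f)\n",
               location.pose.position.x,
               location.pose.position.y,
               location.pose.position.z);
    } else {
        // Tracking lost or not yet established, e.g. not enough spatial context.
        printf("anchor not currently tracked\n");
    }
}
```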

We have logging in our development builds that we use to really pick apart what’s happening during the anchoring process, but those builds stay internal.

Could I side-load one of those builds? I understand it probably won’t be usable with all the logging running; I just want to know why it’s failing.

I spent 15 years coding business apps in Java, if that helps.

We don’t share our .apk publicly. You could poke around Meta’s different SDKs to find out more.
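One thing you can do without a development build is capture the device-side log over adb while reproducing the failure; OS- and runtime-level tracking or anchor errors sometimes show up there. A rough sketch that just wraps the adb call (assumes developer mode is enabled on the headset and adb is on your PATH; the output filename is arbitrary and nothing here is an official Arkio log):

```c
// Rough sketch: dump the headset's log buffer to a file over adb while you
// reproduce the "Can't lock" message, then search it for anchor/tracking lines.
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    // -d dumps the current log buffer and exits; drop it to stream instead.
    const char *cmd = "adb logcat -d -v time > quest_log.txt";
    int rc = system(cmd);
    if (rc != 0) {
        fprintf(stderr, "adb logcat failed - is the headset connected and "
                        "developer mode enabled?\n");
        return 1;
    }
    printf("Wrote quest_log.txt - search it for lines mentioning anchors "
           "or tracking.\n");
    return 0;
}
```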