@nickw1_gitlab https://hikar.org/webapp does not work on my iPhone SE (iOS 14.4.1). I can see all the labels, which seem to be placed in the correct direction, but the camera freezes. The same happens with the red-box example in GitHub issue AR-js-org/AR.js#216. The iOS Safari console outputs this error:

```
Unhandled Promise Rejection: NotAllowedError: The request is not allowed by the user agent or the platform in the current context, possibly because the user denied permission. (aframe-ar-nft.js:210535)
```
When I replace `videoTexture: true;` with `embedded`, the camera works but 'click' does not, and with both `videoTexture: true;` and `embedded`, the screen is totally black.
Any idea what could be the problem? Thx in advance🙏🏽
Hi everyone, I will just post this here in case someone has a similar issue or wants to make a pull request. I needed to display far objects (>1 km), as in the Hikar example. I was fighting with `videoTexture: true` because the image from the camera gets squished and does not keep its aspect ratio. I wanted to maintain the aspect ratio and make the camera image cover the whole area. I came up with this solution:
```js
// main <a-scene> element - you need to wait for it to load (use the `loaded` event)
var scene = document.getElementById('scene');

// this is the extra component created by AR.js which puts the camera feed
// into the scene without making far objects invisible
var webcam = document.querySelector('[arjs-webcam-texture]');
var component = webcam.components['arjs-webcam-texture'];

// you also need to wait for the video to start playing (via the `loadeddata`
// event or similar), otherwise `videoWidth` and `videoHeight` are zero
var viewportRatio = scene.clientWidth / scene.clientHeight;
var videoRatio = component.video.videoWidth / component.video.videoHeight;

// scale the plane so the video covers the viewport while keeping its aspect ratio
var w = Math.max(1, videoRatio / viewportRatio);
var h = Math.max(1, viewportRatio / videoRatio);

component.geom.attributes.position.array = new Float32Array([
  -w / 2,  h / 2, 0,
   w / 2,  h / 2, 0,
  -w / 2, -h / 2, 0,
   w / 2, -h / 2, 0
]);
component.geom.attributes.position.needsUpdate = true;

// you should also call this code whenever the viewport size changes
// (the window's `resize` event)
```
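The cover-scaling math can also be isolated into a pure function, which makes it easy to test and to call again from a `resize` handler. A sketch; `computePlaneSize` is my own helper name, not part of AR.js:

```javascript
// Compute the width/height of the webcam texture plane so the video
// covers the viewport while preserving its aspect ratio (like CSS "cover").
// The better-fitting axis stays at 1; the other axis is scaled up past 1.
function computePlaneSize(videoWidth, videoHeight, viewportWidth, viewportHeight) {
  var viewportRatio = viewportWidth / viewportHeight;
  var videoRatio = videoWidth / videoHeight;
  return {
    w: Math.max(1, videoRatio / viewportRatio),
    h: Math.max(1, viewportRatio / videoRatio)
  };
}

// Example: a landscape 16:9 video in a portrait 9:16 viewport -
// the plane must be much wider than the viewport to cover it.
var size = computePlaneSize(1280, 720, 720, 1280);
// size.h === 1, size.w ≈ 3.16
```

Note that at least one of `w` and `h` is always exactly 1, so the video overflows (and is cropped on) at most one axis rather than being stretched.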
I am not sure how this code affects the placement precision of 3D content in the image (you should probably adjust the FOV as well, but I am not that far yet; outdoor testing is required).
And one small note: you should account for magnetic declination when displaying location-based augmented content, because the device's compass gives you the angle to magnetic north, which is not true north. You can calculate the declination with a library (like magvar) and rotate the whole scene a bit, like this:
```html
<a-scene id="scene">
  <!-- magnetic declination: 5 degrees -->
  <a-entity rotation="0 5 0" id="parent">
  </a-entity>
</a-scene>
```
You obviously need to set the rotation attribute dynamically, as the declination differs between locations. Then put all your location-based content into the #parent element.
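Setting the rotation dynamically could look like the sketch below. The `declinationToRotation` helper is my own name, and I am assuming magvar exposes a `get(latitude, longitude)` call returning declination in degrees (check its docs for the exact signature):

```javascript
// Convert a declination (degrees) into an A-Frame rotation string.
// Rotating the parent entity about the Y axis by the declination lines up
// compass-derived (magnetic north) headings with true north.
function declinationToRotation(declinationDeg) {
  return '0 ' + declinationDeg + ' 0';
}

// Browser wiring (sketch, assumes magvar's get(lat, lon) API):
// navigator.geolocation.getCurrentPosition(function (pos) {
//   var decl = magvar.get(pos.coords.latitude, pos.coords.longitude);
//   document.getElementById('parent')
//           .setAttribute('rotation', declinationToRotation(decl));
// });
```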
`true` when creating the `video` element. If you look at the PR you should be able to figure out the code change you need to make; let me know if not.
`<a-text>` and set the `gps-projected-entity-place` component to the appropriate latitude and longitude. If you look at the 'connecting to a web API' example at https://ar-js-org.github.io/AR.js-Docs/location-based-tutorial/ this gives an example of what to do.
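For reference, a minimal sketch of what such a text entity might look like; the coordinates are placeholders and the exact component options should be taken from the linked tutorial:

```html
<a-scene vr-mode-ui="enabled: false" arjs="sourceType: webcam; videoTexture: true">
  <a-camera gps-projected-camera rotation-reader></a-camera>
  <!-- placeholder coordinates; set these from your own data -->
  <a-text value="POI name" scale="50 50 50"
          gps-projected-entity-place="latitude: 51.05; longitude: -0.72"></a-text>
</a-scene>
```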