{"_id":"5ade3c180916ec000353abda","category":{"_id":"5ade3c170916ec000353ab9e","version":"5ade3c170916ec000353ab98","project":"578c4badbd223d2000cc1441","__v":0,"sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-08-01T23:04:12.838Z","from_sync":false,"order":5,"slug":"api-reference","title":"API Reference"},"project":"578c4badbd223d2000cc1441","user":"576c22a3808cf02b00d37419","parentDoc":null,"version":{"_id":"5ade3c170916ec000353ab98","project":"578c4badbd223d2000cc1441","__v":1,"createdAt":"2018-04-23T20:03:35.726Z","releaseDate":"2018-04-23T20:03:35.726Z","categories":["5ade3c170916ec000353ab99","5ade3c170916ec000353ab9a","5ade3c170916ec000353ab9b","5ade3c170916ec000353ab9c","5ade3c170916ec000353ab9d","5ade3c170916ec000353ab9e"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"2.6.1","version":"2.6.1"},"__v":0,"updates":["5ac6c5cb92aeaa000379380b"],"next":{"pages":[],"description":""},"createdAt":"2017-09-16T17:27:27.260Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":74,"body":"The `ViroARScene` component allows developers to logically group their experiences and components and switch between them using the [ViroARSceneNavigator](doc:viroarscenenavigator). \n\nThis component also hosts various properties that enable developers to control and interact with the AR subsystem, such as `displayPointCloud`, which configures the renderer to draw the AR point cloud. 
The `onAnchorFound|Updated|Removed` functions work in conjunction with `ViroARPlane's` [manual anchoring](doc:viroarplane#anchoring) mode to enable developers to fully control their experience.\n\n######Example use:\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"<ViroARScene onTrackingUpdated={this._trackingUpdated} >\\n  <ViroARPlane>\\n    <ViroBox position={[0, .5, 0]} />\\n  </ViroARPlane>\\n</ViroARScene>\",\n      \"language\": \"javascript\"\n    }\n  ]\n}\n[/block]\n\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Props\"\n}\n[/block]\n##Optional Props \n[block:parameters]\n{\n  \"data\": {\n    \"16-0\": \"**onPlatformUpdate**\",\n    \"26-0\": \"**rotation**\",\n    \"27-0\": \"**style**\",\n    \"28-0\": \"**text**\",\n    \"29-0\": \"**transformBehaviors**\",\n    \"31-0\": \"**visible**\",\n    \"30-0\": \"**width**\",\n    \"h-0\": \"PropKey\",\n    \"h-1\": \"PropType\",\n    \"16-1\": \"**React.PropTypes.func**\\n\\nCallback method set to be notified of platform specific information like headset type or controller type.\\n\\nExample Code:\\n```  \\n_onPlatformUpdate(platformInfo){\\n\\tvar platform = platformInfo.vrPlatform;\\n\\tvar headset = platformInfo.headset;\\n\\tvar controller = platformInfo.controller;\\n}\\n```\\n\\nList of supported platforms:\\n\\n| |Cardboard iOS|Cardboard Android|Daydream|GearVR\\n|:- ------------------|:- --------------|:- --------------|:- --------------|:- --------------|\\n|Platform|gvr|gvr|gvr|ovr-mobile|\\n|Headset|cardboard|cardboard|daydream|gearvr|\\n|Controller|cardboard|cardboard|daydream|gearvr|\",\n    \"26-1\": \"PropTypes.arrayOf(PropTypes.number)\\n\\nPut the PropType Description here.\",\n    \"27-1\": \"stylePropType\",\n    \"28-1\": \"PropTypes.string\\n\\nPut the PropType Description here.\",\n    \"29-1\": \"PropTypes.arrayOf(PropTypes.string)\\n\\nPut the PropType Description here.\",\n    \"30-1\": \"PropTypes.number\\n\\nPut the PropType Description here.\",\n    
\"31-1\": \"PropTypes.bool\\n\\nPut the PropType Description here.\",\n    \"14-0\": \"**onHover**\",\n    \"14-1\": \"**React.PropTypes.func**\\n\\nFunction to be invoked when the user hovers on the scene. If this is defined, it is invoked **ONLY** if no other object captures the onHover event.\\n\\nFor example:\\n```  \\n_onHover(isHovering, position, source)  {\\n    if(isHovering) {\\n        // user is hovering on the scene.\\n    } else {\\n        // user is hovering on another object in the scene.\\n    }\\n}\\n```\",\n    \"11-0\": \"**onClickState**\",\n    \"11-1\": \"**PropTypes.func**\\n\\nCalled for each click state an object goes through as it is clicked. Supported click states and their values are the following:\\n\\n|State Value|Description|\\n|:- -----|:- ---------:|\\n|1| Click Down: Triggered when the user has performed a click down action while hovering on this control.|\\n|2| Click Up: Triggered when the user has performed a click up action while hovering on this control.|\\n|3| Clicked: Triggered when the user has performed both a click down and click up action on this control sequentially, thereby having \\\"Clicked\\\" the object.|\\n\\nExample code:\\n```  \\n_onClickState(stateValue, position, source)  {\\n    if(stateValue == 1) {\\n        // Click Down\\n    } else if(stateValue == 2) {\\n        // Click Up\\n    } else if(stateValue == 3) { \\n        // Clicked\\n    }\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\\n\\nThis is **ONLY** invoked if a click is not captured on another object within a scene.\",\n    \"18-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a scroll action, while hovering on the scene.\",\n    \"19-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a swipe gesture, while hovering on the scene.\",\n    \"20-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a touch action, while hovering on the control. 
Provides the touch state type and the x/y coordinate at which this touch event has occurred.\\n\\n|State Value|Description|\\n|:- -----|:- ---------:|\\n|1| Touch Down: Triggered when the user makes physical contact with the touch pad on the controller. |\\n|2| Touch Down Move: Called when the user moves around the touch pad immediately after having performed a Touch Down action. |\\n|3| Touch Up: Triggered after the user is no longer in physical contact with the touch pad after a Touch Down action. |\\n\\nFor example:\\n```  \\n_onTouch(state, touchPos, source)  {\\n   var touchX = touchPos[0];\\n   var touchY = touchPos[1];\\n    if(state == 1) {\\n        // Touch Down\\n    } else if(state == 2) {\\n        // Touch Down Move\\n    } else if(state == 3) { \\n        // Touch Up\\n    }\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\\n\\nUnsupported VR Platforms: Cardboard (Android and iOS).\",\n    \"18-0\": \"**onScroll**\",\n    \"19-0\": \"**onSwipe**\",\n    \"20-0\": \"**onTouch**\",\n    \"25-0\": \"**soundRoom**\",\n    \"25-1\": \"**PropTypes.shape**\\n\\nDescribes the acoustic properties of the room around the user, based on the room's dimensions and its surface materials. 
Note: This is not supported in Cardboard iOS.\\n\\nCode Example:\\n```  \\n    soundRoom={{\\n      size: [2,2,2],\\n      wallMaterial: \\\"acoustic_ceiling_tiles\\\",\\n      ceilingMaterial: \\\"glass_thin\\\",\\n      floorMaterial: \\\"concrete_block_coarse\\\"\\n    }}\\n```\\nList of soundRoom properties:\\n\\n|Name|Description|\\n|:- ------------------|:- --------------|\\n|size|The 3D dimensions of the room.|\\n|wallMaterial|Sound Material for the four walls.|\\n|ceilingMaterial|Sound Material for the ceiling.|\\n|floorMaterial|Sound Material for the floor.|\\n\\nList of Supported Sound Materials:\\n\\n|Name|Description|\\n|:- ------------------|:- --------------|\\n|acoustic_ceiling_tiles|Acoustic ceiling tiles, absorbs most frequencies.|\\n|brick_bare|Bare brick, relatively reflective.|\\n|brick_painted|Painted brick.|\\n|concrete_block_coarse|Coarse surface concrete block.|\\n|concrete_block_painted|Painted concrete block.|\\n|curtain_heavy|Heavy curtains.|\\n|fiber_glass_insulation|Fiber glass insulation.|\\n|glass_thin|Thin glass.|\\n|glass_thick|Thick glass.|\\n|grass|Grass.|\\n|linoleum_on_concrete|Linoleum on concrete.|\\n|marble|Marble.|\\n|metal|Galvanized sheet metal.|\\n|parquet_on_concrete|Wooden parquet on concrete.|\\n|plaster_rough|Rough plaster surface.|\\n|plaster_smooth|Smooth plaster surface.|\\n|plywood_panel|Plywood panel.|\\n|polished_concrete_or_tile|Polished concrete or tile surface.|\\n|sheet_rock|Sheet rock.|\\n|transparent|Acoustically transparent material, reflects no sound.|\\n|water_or_ice_surface|Surface of water or ice.|\\n|wood_ceiling|Wooden ceiling.|\\n|wood_panel|Wood paneling.|\",\n    \"13-0\": \"**onFuse**\",\n    \"13-1\": \"**PropTypes.oneOfType**\\n```  \\nPropTypes.oneOfType([\\n      React.PropTypes.shape({\\n        callback: React.PropTypes.func.isRequired,\\n        timeToFuse: PropTypes.number\\n      }),\\n      React.PropTypes.func,\\n])\\n```  \\nAs shown above, onFuse takes one of two types: either a 
callback, or a dictionary with a callback and duration. \\n\\nIt is called after the user hovers over and remains hovered over the control for the duration specified by timeToFuse, in milliseconds. \\n\\nWhile hovering, the reticle will display a countdown animation as it fuses towards timeToFuse.\\n\\nNote that timeToFuse defaults to 2000ms.\\n\\nFor example:\\n```  \\n_onFuse(source){\\n   // User has hovered over object for timeToFuse milliseconds\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\",\n    \"10-0\": \"**onClick**\",\n    \"10-1\": \"**PropTypes.func**\\n\\nFunction to be invoked when a user clicks on a scene. This is **ONLY** invoked if a click is not captured on another object within a scene.\\n\\nDefining this can be used to register clicks for 360 Photos and videos.\",\n    \"3-0\": \"**ignoreEventHandling**\",\n    \"12-0\": \"**onDrag**\",\n    \"15-0\": \"**onPinch**\",\n    \"17-0\": \"**onRotate**\",\n    \"21-0\": \"**onTrackingInitialized** (Deprecated)\",\n    \"4-0\": \"**onAmbientLightUpdate**\",\n    \"24-0\": \"**physicsWorld**\",\n    \"23-0\": \"**postProcessEffects**\",\n    \"12-1\": \"**PropTypes.func**\\n\\nCalled while the view is being dragged. The dragToPos parameter provides the current 3D location of the dragged object. \\n\\nExample code:\\n```  \\n_onDrag(dragToPos, source)  {\\n    // dragToPos[0]: x position\\n    // dragToPos[1]: y position\\n    // dragToPos[2]: z position\\n}\\n``` \\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section. 
\\n\\nUnsupported VR Platforms: Cardboard iOS\",\n    \"3-1\": \"**PropTypes.bool**\\n\\nWhen set to true, this control will ignore events and not prevent controls behind it from receiving event callbacks.\\n\\nThe default value is false.\",\n    \"15-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a pinch gesture on the control. When the pinch starts, the scale factor is set to 1; subsequent values are relative to the initial distance between the two touch points.\\n\\nFor example:\\n```\\n  _onPinch(pinchState, scaleFactor, source) {\\n       if(pinchState == 3) {\\n      // update scale of obj by multiplying by scaleFactor when pinch ends.\\n        return;\\n       }\\n     //set scale using native props to reflect pinch.\\n  }\\n```\\npinchState can be the following values:\\n\\n|State Value|Description|\\n|:- -----|:- ---------:|\\n|1| Pinch Start: Triggered when the user has started a pinch gesture.|\\n|2| Pinch Move: Triggered when the user has adjusted the pinch, moving both fingers. |\\n|3| Pinch End: When the user has finished the pinch gesture and released both touch points. |\\n\\n**This event is only available in AR**.\",\n    \"17-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a rotation touch gesture on the control. Rotation factor is returned in degrees.\\n\\nWhen setting rotation, the rotation should be relative to its current rotation, *not* set to the absolute value of the given rotationFactor.\\n\\nFor example:\\n\\n```\\n    _onRotate(rotateState, rotationFactor, source) {\\n\\n      if (rotateState == 3) {\\n        //set to current rotation - rotationFactor.\\n        return;\\n      }\\n     //update rotation using setNativeProps\\n    },\\n\\n```\\nrotateState can be the following values:\\n\\n|State Value|Description|\\n|:- -----|:- ---------:|\\n|1| Rotation Start: Triggered when the user has started a rotation gesture.|\\n|2| Rotation Move: Triggered when the user has adjusted the rotation, moving both fingers. 
|\\n|3| Rotation End: When the user has finished the rotation gesture and released both touch points. |\\n\\n**This event is only available in AR**.\",\n    \"4-1\": \"**PropTypes.func**\\n\\nFunction that provides an estimate of the light intensity and color temperature.\\n\\n|Parameter|Description|\\n|- --|- --|\\n|intensity| a number representing the estimated intensity of the ambient light as detected by the camera|\\n|colorTemperature|a number representing the estimated colorTemperature of the ambient light as detected by the camera|\",\n    \"21-1\": \"**PropTypes.func** \\n\\n**WARN**: This function will be deprecated in the upcoming release, in favor of onTrackingUpdated.\\n\\nFunction called when the AR system has properly initialized. The platform maintains a right-handed coordinate system, where the origin of the system is the user's location at the time AR tracking was initialized. The camera's forward vector is [0, 0, -1] and up vector is [0,1,0].\",\n    \"23-1\": \"**PropTypes.arrayOf(PropTypes.string)**\\n\\nSpecifies which post-process effects to enable. Refer to [Post-Process Effects](doc:viroarscene#post-process-effects) for more information.\",\n    \"24-1\": \"**PropTypes.shape({\\n      gravity: PropTypes.arrayOf(PropTypes.number).isRequired,\\n       drawBounds: PropTypes.bool,\\n    })**\\n\\nContains and processes the physics bodies of all Viro controls that are physics-enabled in this scene. Environmental physics properties are also applied, like gravity. \\n\\n|SubPropType|Description|\\n|:------|:----------:|\\n|gravity| A constant gravitational acceleration that is applied to all physics body objects in this scene. It is a vector, expressed in meters per second squared. 
Defaults to [0, -9.81, 0].|\\n|drawBounds| If true, renders the mesh representing the shape of all physics bodies in this scene.|\",\n    \"5-0\": \"**onAnchorFound**\",\n    \"6-0\": \"**onAnchorUpdated**\",\n    \"7-0\": \"**onAnchorRemoved**\",\n    \"5-1\": \"**PropTypes.func**\\n\\nCalled when the AR system finds an Anchor.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see [Anchor](doc:viroarscene#anchor) |\",\n    \"6-1\": \"**PropTypes.func**\\n\\nCalled when the AR system detects changed properties of a previously found Anchor.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see [Anchor](doc:viroarscene#anchor) |\",\n    \"7-1\": \"**PropTypes.func**\\n\\nCalled when the AR system detects that a previously found Anchor no longer exists.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see [Anchor](doc:viroarscene#anchor) |\",\n    \"2-0\": \"**dragType**\",\n    \"2-1\": \"**PropTypes.oneOf([\\\"FixedDistance\\\", \\\"FixedToWorld\\\"])**\\n\\nDetermines the behavior of drag if **onDrag** is specified.\\n\\n|Value|Description|\\n|:- -----|:- ---------:|\\n|FixedDistance| Dragging is limited to a fixed radius around the user.|\\n|FixedToWorld| Dragging is based on intersection with real world objects. 
**Available only in AR** |\\n\\nThe default value is \\\"FixedDistance\\\".\",\n    \"1-0\": \"**displayPointCloud**\",\n    \"1-1\": \"**PropTypes.bool**\\nor\\n**{pointCloudOptions}** described below.\\n\\nSetting this property to `true` draws the point cloud using a default configuration.\\n\\nSetting this property to `false` disables the drawing of the point cloud.\\n\\nThis property can also take a dictionary of properties which enable point cloud drawing with the given **pointCloudOptions:**\\n\\n|Key|Description|\\n|---|---|\\n| imageSource | image used to represent each point|\\n|imageScale | scale of the image used for each point, the default is [.01,.01,.01]|\\n|maxPoints| the max number of points drawn each frame|\\n\\nExample: \\n```\\n<ViroARScene displayPointCloud={{\\n    imageSource : require(\\\"./res/pointCloudPoint.png\\\"),\\n    imageScale : [.02,.02,.02],\\n    maxPoints : 100 }} />\\n```\",\n    \"8-0\": \"**onARPointCloudUpdate**\",\n    \"8-1\": \"**PropTypes.func**\\n\\nThis callback is invoked whenever the point cloud is updated. \\n\\n|Parameters | Description|\\n|---|---|\\n|pointCloud|A JavaScript object containing the point cloud in the format below|\\n\\n```\\n{\\n  \\\"pointCloud\\\" : {\\n    \\\"points\\\" : [ [x, y, z, confidence], ... ],\\n    \\\"identifiers\\\" : [ identifier1, identifier2, ... 
]\\n  }\\n}\\n```\\n\\nwhere:\\n\\n`x, y, z` - the x, y, z coordinates of the point in world space\\n`confidence` - a float value from 0 to 1 that represents the confidence that the underlying system has for this point (Android only)\\n`identifier` - a number that is unique to the corresponding point in the points array and that allows the user to track points between point cloud updates (iOS only)\",\n    \"0-0\": \"**anchorDetectionTypes**\",\n    \"0-1\": \"**PropTypes.string**\\nor \\n**PropTypes.arrayOf(PropTypes.string)**\\n\\nDetermines what types of anchors the scene should return.\\n\\nCurrently supports the following values:\\n*None*\\n*PlanesHorizontal*\\n*PlanesVertical* - vertical planes are an ARKit 1.5+/iOS 11.3+ feature\",\n    \"9-0\": \"**onCameraARHitTest**\",\n    \"9-1\": \"**PropTypes.func**\\n\\nIf defined, a callback is invoked returning the camera position and orientation along with a set of hit results in an array consisting of [ARHitTestResult](doc:viroarscene#arhittestresult) objects. The hit test results correspond to the AR points found by the AR system along the ray extending from the camera's position in the direction of its view. 
\\n\\nThis can be used to show a tracking plane placed in the world while the user moves, or to inform the user of the confidence of the area being looked at.\\n\\nIf defined, this callback is invoked as often as possible in order to keep up with the frame rate.\\n\\nThe following object structure is returned:\\n\\n```\\n{\\n    \\\"hitTestResults\\\": [ ARHitTestResult1, ARHitTestResult2, ... ],\\n    \\\"cameraOrientation\\\": {\\n        position: [x, y, z], \\n        rotation: [x, y, z], \\n        forward: [x, y, z], \\n        up: [x, y, z]\\n      }\\n}\\n```\\nThe [ARHitTestResult](doc:viroarscene#arhittestresult) format is described below.\\n\\ncameraOrientation consists of position, the rotation of the camera in degrees, and the current forward and up vectors of the camera.\",\n    \"22-0\": \"**onTrackingUpdated**\",\n    \"22-1\": \"**PropTypes.func** \\n\\nInvoked when the tracking state of the device changes. The tracking state indicates how well the device is able to track its position within the real world. Tracking state is subject to lighting conditions, the speed at which the device is moving, and other environmental factors.\\n\\nSample code:\\n\\n(Note that we use [ViroConstants](doc:viroconstants) to properly compare different tracking states): \\n\\n```  \\n_onTrackingUpdated(state, reason) {\\n    if (state == ViroConstants.TRACKING_NORMAL){\\n      // Show my AR Scene experience\\n    } else if (state == ViroConstants.TRACKING_NONE){\\n      // Prompt user to move phone around\\n    }\\n  },\\n```  \\n\\nTracking states include:\\n\\n|AR Tracking State Values|Description|\\n|:- -----|:- ---------:|\\n|1: TRACKING_UNAVAILABLE| Tracking is unavailable: the camera's position in the world is not known. |\\n|2: TRACKING_LIMITED| Tracking is available, but the camera's position in the world may be inaccurate and should not be used with confidence. |\\n|3: TRACKING_NORMAL| Camera position tracking is providing optimal results. 
|\\n\\nFor iOS, a possible diagnosis for limited tracking quality is provided in the second parameter: \\\"reason\\\". These states include:\\n\\n|AR Tracking State Reason|Description|\\n|:- -----|:- ---------:|\\n|1: TRACKING_REASON_NONE|The current tracking state is not limited. |\\n|2: TRACKING_REASON_EXCESSIVE_MOTION| The device is moving too fast for accurate position tracking. |\\n|3: TRACKING_REASON_INSUFFICIENT_FEATURES| The scene visible to the camera does not contain enough distinguishable features for optimal position tracking. |\"\n  },\n  \"cols\": 2,\n  \"rows\": 26\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Methods\"\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async findCollisionsWithRayAsync(from: arrayOf(number), to: arrayOf(number), closest: bool, viroTag: string)\",\n    \"0-0\": \"This function is used to find collisions between [physics](doc:physics) bodies and a line emanating from the given `from` position to the `to` position. 
Collided components have their `onCollision` callbacks invoked.\\n\\n|Parameters|Description|\\n|- --|- --|\\n|from|the origin position of the line|\\n|to|the end position of the line|\\n|closest| if true, only the first object intersected by the line (determined by closest distance to the origin) receives the `onCollision` callback|\\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\\n\\n|Return Values|Description|\\n|- --|- --|\\n|hasHit| true/false whether or not a collision was detected|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async findCollisionsWithShapeAsync(from:arrayOf(number), to:arrayOf(number), shapeString: string, shapeParam: object, viroTag: string)\",\n    \"0-0\": \"This function is used to find collisions between [physics](doc:physics) bodies and the given shape moving from the given `from` position to the `to` position. Collided components have their `onCollision` callbacks invoked.\\n\\nIf the `from` and `to` positions are the same, then this function invokes the `onCollision` callbacks of all components within the given shape.\\n\\n|Parameters|Description|\\n|- --|- --|\\n|from|the origin position of the line|\\n|to|the end position of the line|\\n|shapeString| the name of the shape to use in this test|\\n|shapeParam| the configuration of the shape used in this collision test|\\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\\n\\n|Return Value|Description|\\n|- --|- --|\\n|hasHit| true/false whether or not a collision was detected|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async getCameraOrientationAsync()\",\n    \"0-0\": \"This function is used to fetch the current Camera's orientation.\\n\\n|Return Value|Description|\\n|- --|- --|\\n|orientation|an object that contains the camera's `position`, `rotation`, `forward` vector and `up` vector as number 
arrays|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async performARHitTestWithRay(ray: arrayOf(number))\",\n    \"0-0\": \"This function performs an AR system-backed hit test with the given ray from the camera's position outward.\\n\\n|Return Value|Description|\\n|---|---|\\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async performARHitTestWithPosition(position: arrayOf(number))\",\n    \"0-0\": \"This function performs an AR system-backed hit test with the ray from the camera to the given position.\\n\\n|Return Value|Description|\\n|---|---|\\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async performARHitTestWithPoint(x:number, y:number)\",\n    \"0-0\": \"This function performs an AR system-backed hit test with the given 2D screen coordinates in pixels. You may need to scale the x and y position by the pixel ratio to get the correct result:\\n\\nFor example:\\n```\\nperformARHitTestWithPoint(evt.nativeEvent.locationX * PixelRatio.get(), evt.nativeEvent.locationY * PixelRatio.get()) \\n```\\n\\n|Return Value|Description|\\n|---|---|\\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"ARHitTestResult\"\n}\n[/block]\nThese are the individual objects in the array of ARHitTestResults returned by the `performARHitTest...` functions. 
\n\n```\narHitTestResult = (object) {\n  type : string,\n  transform : (object) {\n    position : array(number),\n    rotation : array(number),\n    scale : array(number)\n  }\n}\n```\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Key\",\n    \"h-1\": \"Description\",\n    \"0-0\": \"type\",\n    \"1-0\": \"transform\",\n    \"0-1\": \"**string**\\n\\nThe type of point returned; it can only be one of the following:\\n\\n\\\"ExistingPlaneUsingExtent\\\"\\n\\\"ExistingPlane\\\"\\n\\\"EstimatedHorizontalPlane\\\"\\n\\\"FeaturePoint\\\"\",\n    \"1-1\": \"**object**\\n\\nThe transform of the point. Contains the following keys:\\n\\n`position`, `rotation`, `scale` as arrays of numbers.\"\n  },\n  \"cols\": 2,\n  \"rows\": 2\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Anchor\"\n}\n[/block]\nThis is the object given to the developer through the `onAnchorFound`, `onAnchorUpdated` and `onAnchorRemoved` callback functions.\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Key\",\n    \"h-1\": \"Value\",\n    \"0-0\": \"anchorId\",\n    \"3-0\": \"rotation\",\n    \"4-0\": \"center ([ViroARPlane](doc:viroarplane) only)\",\n    \"6-0\": \"width ([ViroARPlane](doc:viroarplane) only)\",\n    \"7-0\": \"height ([ViroARPlane](doc:viroarplane) only)\",\n    \"6-1\": \"**number**\\n\\nCurrent width of the attached plane.\",\n    \"7-1\": \"**number**\\n\\nCurrent height of the attached plane.\",\n    \"0-1\": \"**string**\\n\\nID of the anchor.\",\n    \"3-1\": \"**arrayOf(number)**\\n\\nRotation of the anchor in degrees.\",\n    \"4-1\": \"**arrayOf(number)**\\n\\nCenter of the plane relative to the plane's position.\",\n    \"2-0\": \"position\",\n    \"2-1\": \"**arrayOf(number)**\\n\\nPosition of the anchor in world coordinates.\",\n    \"1-0\": \"type\",\n    \"1-1\": \"**string**\\n\\nType of the anchor.\",\n    \"5-0\": \"alignment ([ViroARPlane](doc:viroarplane) only)\",\n    \"5-1\": \"**string**\\n\\nThe plane alignment, one of the following 
values:\\n\\\"horizontal\\\" - iOS only\\n\\\"HorizontalDownwards\\\" - Android only\\n\\\"HorizontalUpwards\\\" - Android only\\n\\\"NonHorizontal\\\" - Android only\",\n    \"8-0\": \"vertices\",\n    \"8-1\": \"**arrayOf(arrayOf(number))**\\n\\nAn array of 3D points representing the vertices along the boundary of a polygonal plane for this ViroARPlane. Although the contents of this property consist of 3D points, the represented polygonal plane is always two-dimensional, and is always positioned only along the x and z axes. These points are always placed relative to the ViroARPlane's center transform.\"\n  },\n  \"cols\": 2,\n  \"rows\": 9\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Post-Process Effects\"\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Effect\",\n    \"h-1\": \"Description\",\n    \"0-0\": \"grayscale\",\n    \"1-0\": \"sepia\",\n    \"2-0\": \"sincity\",\n    \"3-0\": \"baralleldistortion\",\n    \"4-0\": \"pincushiondistortion\",\n    \"5-0\": \"thermalvision\",\n    \"6-0\": \"crosshatch\",\n    \"7-0\": \"pixelated\",\n    \"0-1\": \"An effect where the resulting image is in black and white.\",\n    \"1-1\": \"An effect where the resulting image has a dark reddish-brown pigment color effect on it.\",\n    \"2-1\": \"A Sin City-like effect where the resulting image is in black and white, except in places where there are saturated red colors.\",\n    \"3-1\": \"A fish-eye-like effect where the fish-eye lens distortion becomes more pronounced towards the center of the image.\",\n    \"4-1\": \"A cushioning effect where the resulting image is \\\"pinched\\\" into the center.\",\n    \"5-1\": \"A coloring effect where the resulting image gives off a \\\"radiant heat\\\" look from a thermal sensor.\",\n    \"6-1\": \"An effect where the resulting image is made up of tiny crossed lines that recreate the scene.\",\n    \"7-1\": \"An effect where the resulting image is pixelated.\"\n  },\n  \"cols\": 2,\n  \"rows\": 
8\n}\n[/block]","excerpt":"","slug":"viroarscene","type":"basic","title":"ViroARScene"}
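The `onARPointCloudUpdate` payload documented above can be post-processed in plain JavaScript. A minimal sketch, assuming the payload shape given in that section; the `filterConfidentPoints` helper and the confidence threshold are illustrative, not part of the react-viro API:

```javascript
// Sketch: reducing an onARPointCloudUpdate payload to confident points.
// Assumes the { points, identifiers } shape documented above.
// filterConfidentPoints and minConfidence are illustrative, not react-viro API.
function filterConfidentPoints(pointCloud, minConfidence) {
  return pointCloud.points.reduce(function (acc, point, i) {
    var confidence = point[3];
    // iOS does not report a confidence component; keep those points.
    if (confidence === undefined || confidence >= minConfidence) {
      acc.push({
        id: pointCloud.identifiers[i], // stable across updates (iOS only)
        position: [point[0], point[1], point[2]]
      });
    }
    return acc;
  }, []);
}
```

A handler could call this inside `onARPointCloudUpdate` and, for example, only render markers once enough confident points have accumulated.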
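When consuming the arrays returned by the `performARHitTest...` methods, a common step is picking the nearest plane-type result to the camera. A sketch under the ARHitTestResult shape documented above; the `nearestPlaneHit` helper and the preference for plane types over feature points are illustrative choices, not part of the API:

```javascript
// Sketch: choosing the nearest plane-type ARHitTestResult to the camera.
// Assumes the { type, transform: { position } } shape documented above.
// nearestPlaneHit is an illustrative helper, not part of react-viro.
var PLANE_TYPES = ['ExistingPlaneUsingExtent', 'ExistingPlane', 'EstimatedHorizontalPlane'];

function nearestPlaneHit(results, cameraPosition) {
  var best = null;
  var bestDist = Infinity;
  results.forEach(function (result) {
    if (PLANE_TYPES.indexOf(result.type) === -1) return; // skip FeaturePoint
    var p = result.transform.position;
    var dx = p[0] - cameraPosition[0];
    var dy = p[1] - cameraPosition[1];
    var dz = p[2] - cameraPosition[2];
    var dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
    if (dist < bestDist) {
      bestDist = dist;
      best = result;
    }
  });
  return best; // null when no plane-type result is present
}
```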
The `ViroARScene` component allows developers to logically group their experiences and components and switch between them using the [ViroARSceneNavigator](doc:viroarscenenavigator). This component also hosts various properties that enable developers to control and interact with the AR subsystem. Like `displayPointCloud` which configures the renderer to draw the AR point cloud. The `onAnchorFound|Updated|Removed` functions work in conjunction with `ViroARPlane's` [manual anchoring](doc:viroarplane#anchoring) mode to enable developers to fully control their experience. ######Example use: [block:code] { "codes": [ { "code": "<ViroARScene onTrackingUpdated={this._trackingUpdated} >\n <ViroARPlane>\n <ViroBox position={[0, .5, 0]} />\n </ViroARPlane>\n</ViroARScene>", "language": "javascript" } ] } [/block] [block:api-header] { "type": "basic", "title": "Props" } [/block] ##Optional Props [block:parameters] { "data": { "16-0": "**onPlatformUpdate**", "26-0": "**rotation**", "27-0": "**style**", "28-0": "**text**", "29-0": "**transformBehaviors**", "31-0": "**visible**", "30-0": "**width**", "h-0": "PropKey", "h-1": "PropType", "16-1": "**React.PropTypes.func**\n\nCallback method set to be notified of platform specific information like headset type or controller type.\n\nExample Code:\n``` \n_onPlatformUpdate(platformInfo){\n\tvar platform = platformInfo.vrPlatform;\n\tvar headset = platformInfo.headset;\n\tvar controller = platformInfo.controller;\n}\n```\n\nList of supported platforms:\n\n| |Cardboard iOS|Cardboard Android|Daydream|GearVR\n|:- ------------------|:- --------------|:- --------------|:- --------------|:- --------------|\n|Platform|gvr|gvr|gvr|ovr-mobile|\n|Headset|cardboard|cardboard|daydream|gearvr|\n|Controller|cardboard|cardboard|daydream|gearvr|", "26-1": "PropTypes.arrayOf(PropTypes.number)\n\nPut the PropType Description here.", "27-1": "stylePropType", "28-1": "PropTypes.string\n\nPut the PropType Description here.", "29-1": 
"PropTypes.arrayOf(PropTypes.string)\n\nPut the PropType Description here.", "30-1": "PropTypes.number\n\nPut the PropType Description here.", "31-1": "PropTypes.bool\n\nPut the PropType Description here.", "14-0": "**onHover**", "14-1": "**React.PropTypes.func**\n\nFunction to be invoked when the user hovers on on the scene. If this is defined, it is invoked **ONLY** if no other object captures the onHover event.\n\nFor example:\n``` \n_onHover(isHovering, position, source) {\n if(isHovering) {\n // user is hovering on the scene.\n } else {\n // user is hovering on another object in the scene.\n }\n}\n```", "11-0": "**onClickState**", "11-1": "**PropTypes.func**\n\nCalled for each click state an object goes through as it is clicked. Supported click states and their values are the following:\n\n|State Value|Description|\n|:- -----|:- ---------:|\n|1| Click Down: Triggered when the user has performed a click down action while hovering on this control.|\n|2| Click Up: Triggered when the user has performed a click up action while hovering on this control.|\n|3| Clicked: Triggered when the user has performed both a click down and click up action on this control sequentially, thereby having \"Clicked\" the object.|\n\nExample code:\n``` \n_onClickState(stateValue, position, source) {\n if(stateValue == 1) {\n // Click Down\n } else if(stateValue == 2) {\n // Click Up\n } else if(stateValue == 3) { \n // Clicked\n }\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\n\nThis is **ONLY** invoked if a click is not captured on another object within a scene.", "18-1": "**React.PropTypes.func**\n\nNEED TO UPDATE DESCRIPTION", "19-1": "**React.PropTypes.func**\n\nNEED TO UPDATE DESCRIPTION", "20-1": "**React.PropTypes.func**\n\nCalled when the user performs a touch action, while hovering on the control. 
Provides the touch state type, and the x/y coordinate at which this touch event has occurred.\n\n|State Value|Description|\n|:---|:---:|\n|1| Touch Down: Triggered when the user makes physical contact with the touch pad on the controller. |\n|2| Touch Down Move: Called when the user moves around the touch pad immediately after having performed a Touch Down action. |\n|3| Touch Up: Triggered after the user is no longer in physical contact with the touch pad after a Touch Down action. |\n\nFor example:\n``` \n_onTouch(state, touchPos, source) {\n var touchX = touchPos[0];\n var touchY = touchPos[1];\n if(state == 1) {\n // Touch Down\n } else if(state == 2) {\n // Touch Down Move\n } else if(state == 3) { \n // Touch Up\n }\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\n\nUnsupported VR Platforms: Cardboard (Android and iOS).", "18-0": "**onScroll**", "19-0": "**onSwipe**", "20-0": "**onTouch**", "25-0": "**soundRoom**", "25-1": "**PropTypes.shape**\n\nDescribes the acoustic properties of the room around the user by allowing the developer to describe the room based on its dimensions and its surface properties. 
Note: This is not supported in Cardboard iOS.\n\nCode Example:\n``` \n soundRoom={{\n size: [2,2,2],\n wallMaterial: \"acoustic_ceiling_tiles\",\n ceilingMaterial:\"glass_thin\",\n floorMaterial:\"concrete_block_coarse\"\n }}\n```\nList of soundRoom properties:\n\n|Name|Description|\n|:---|:---|\n|size|The 3D dimensions of the room.|\n|wallMaterial|Sound Material for the four walls.|\n|ceilingMaterial|Sound Material for the ceiling.|\n|floorMaterial|Sound Material for the floor.|\n\nList of Supported Sound Materials:\n\n|Name|Description|\n|:---|:---|\n|acoustic_ceiling_tiles|Acoustic ceiling tiles, absorbs most frequencies.|\n|brick_bare|Bare brick, relatively reflective.|\n|brick_painted|Painted brick.|\n|concrete_block_coarse|Coarse surface concrete block.|\n|concrete_block_painted|Painted concrete block.|\n|curtain_heavy|Heavy curtains.|\n|fiber_glass_insulation|Fiber glass insulation.|\n|glass_thin|Thin glass.|\n|glass_thick|Thick glass.|\n|grass|Grass.|\n|linoleum_on_concrete|Linoleum on concrete.|\n|marble|Marble.|\n|metal|Galvanized sheet metal.|\n|parquet_on_concrete|Wooden parquet on concrete.|\n|plaster_rough|Rough plaster surface.|\n|plaster_smooth|Smooth plaster surface.|\n|plywood_panel|Plywood panel.|\n|polished_concrete_or_tile|Polished concrete or tile surface.|\n|sheet_rock|Sheet rock.|\n|transparent|Acoustically transparent material, reflects no sound.|\n|water_or_ice_surface|Surface of water or ice.|\n|wood_ceiling|Wooden ceiling.|\n|wood_panel|Wood paneling.|", "13-0": "**onFuse**", "13-1": "**PropTypes.oneOfType**\n``` \nPropTypes.oneOfType([\n React.PropTypes.shape({\n callback: React.PropTypes.func.isRequired,\n timeToFuse: PropTypes.number\n }),\n React.PropTypes.func,\n])\n``` \nAs shown above, onFuse takes one of the types - either a callback, or a dictionary with a callback and duration. 
\n\nIt is called after the user hovers over, and remains hovered over, the control for the duration specified by timeToFuse, in milliseconds. \n\nWhile hovering, the reticle will display a countdown animation while fusing towards timeToFuse.\n\nNote that timeToFuse defaults to 2000ms.\n\nFor example:\n``` \n_onFuse(source){\n // User has hovered over object for timeToFuse milliseconds\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.", "10-0": "**onClick**", "10-1": "**PropTypes.func**\n\nFunction to be invoked when a user clicks on a scene. This is **ONLY** invoked if a click is not captured on another object within a scene.\n\nDefining this can be used to register clicks for 360 Photos and videos.", "3-0": "**ignoreEventHandling**", "12-0": "**onDrag**", "15-0": "**onPinch**", "17-0": "**onRotate**", "21-0": "**onTrackingInitialized** (Deprecated)", "4-0": "**onAmbientLightUpdate**", "24-0": "**physicsWorld**", "23-0": "**postProcessEffects**", "12-1": "**PropTypes.func**\n\nCalled when the view is currently being dragged. The dragToPos parameter provides the current 3D location of the dragged object. \n\nExample code:\n``` \n_onDrag(dragToPos, source) {\n // dragToPos[0]: x position\n // dragToPos[1]: y position\n // dragToPos[2]: z position\n}\n``` \nFor the mapping of sources to controller inputs, see the [Events](doc:events) section. \n\nUnsupported VR Platforms: Cardboard iOS", "3-1": "**PropTypes.bool**\n\nWhen set to true, this control will ignore events and not prevent controls behind it from receiving event callbacks.\n\nThe default value is false.", "15-1": "**React.PropTypes.func**\n\nCalled when the user performs a pinch gesture on the control. When the pinch starts, the scale factor is set to 1, relative to the distance between the two touch points. 
\n\nFor example:\n```\n _onPinch(pinchState, scaleFactor, source) {\n if(pinchState == 3) {\n // update scale of obj by multiplying by scaleFactor when pinch ends.\n return;\n }\n //set scale using native props to reflect pinch.\n }\n```\npinchState can be the following values:\n\n|State Value|Description|\n|:---|:---:|\n|1| Pinch Start: Triggered when the user has started a pinch gesture.|\n|2| Pinch Move: Triggered when the user has adjusted the pinch, moving both fingers. |\n|3| Pinch End: When the user has finished the pinch gesture and released both touch points. |\n\n**This event is only available in AR**.", "17-1": "**React.PropTypes.func**\n\nCalled when the user performs a rotation touch gesture on the control. Rotation factor is returned in degrees.\n\nWhen setting rotation, the rotation should be relative to its current rotation, *not* set to the absolute value of the given rotationFactor.\n\nFor example:\n\n```\n _onRotate(rotateState, rotationFactor, source) {\n\n if (rotateState == 3) {\n //set to current rotation - rotationFactor.\n return;\n }\n //update rotation using setNativeProps\n },\n\n```\nrotateState can be the following values:\n\n|State Value|Description|\n|:---|:---:|\n|1| Rotation Start: Triggered when the user has started a rotation gesture.|\n|2| Rotation Move: Triggered when the user has adjusted the rotation, moving both fingers. |\n|3| Rotation End: When the user has finished the rotation gesture and released both touch points. 
|\n\n**This event is only available in AR**.", "4-1": "**PropTypes.func**\n\nFunction that provides an estimate of the light intensity and color temperature.\n\n|Parameter|Description|\n|---|---|\n|intensity| a number representing the estimated intensity of the ambient light as detected by the camera|\n|colorTemperature|a number representing the estimated colorTemperature of the ambient light as detected by the camera|", "21-1": "**PropTypes.func** \n\n**WARN**: This function will be deprecated in the upcoming release, in favor of onTrackingUpdated.\n\nFunction called when the AR system has properly initialized. The platform maintains a right-handed coordinate system, where the origin of the system is the user's location at the time AR tracking was initialized. The camera's forward vector is [0, 0, -1] and up vector is [0,1,0].", "23-1": "**PropTypes.arrayOf(PropTypes.string)**\n\nSpecifies which post-process effects to enable. Refer to [Post-Process Effects](doc:viroarscene#post-process-effects) for more information.", "24-1": "**PropTypes.shape({\n gravity: PropTypes.arrayOf(PropTypes.number).isRequired,\n drawBounds: PropTypes.bool,\n })**\n\nContains and processes the physics bodies of all Viro controls that have been physics-enabled in this scene. Environmental physics properties are also applied, like gravity. \n\n|SubPropType|Description|\n|:------|:----------:|\n|gravity| A constant gravitational acceleration that is applied to all physics body objects in this scene. It is a vector, in meters per second squared. 
Defaults to [0, -9.81, 0].|\n|drawBounds| If true, renders the mesh representing the shape of all physics bodies in this scene.|", "5-0": "**onAnchorFound**", "6-0": "**onAnchorUpdated**", "7-0": "**onAnchorRemoved**", "5-1": "**PropTypes.func**\n\nCalled when the AR system finds an Anchor.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "6-1": "**PropTypes.func**\n\nCalled when the AR system detects changed properties of a previously found Anchor.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "7-1": "**PropTypes.func**\n\nCalled when the AR system detects that a previously found Anchor no longer exists.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "2-0": "**dragType**", "2-1": "**PropTypes.oneOf([\"FixedDistance\", \"FixedToWorld\"])**\n\nDetermines the behavior of drag if **onDrag** is specified.\n\n|Value|Description|\n|:---|:---:|\n|FixedDistance| Dragging is limited to a fixed radius around the user|\n|FixedToWorld| Dragging is based on intersection with real world objects. 
**Available only in AR** |\n\nThe default value is \"FixedDistance\".", "1-0": "**displayPointCloud**", "1-1": "**PropTypes.boolean**\nor\n**{pointCloudOptions}** described below.\n\nSetting this property to `true` draws the point cloud using a default configuration.\n\nSetting this property to `false` disables the drawing of the point cloud.\n\nThis property can also take a dictionary of properties which enable point cloud drawing with the given **pointCloudOptions:**\n\n|Key|Description|\n|---|---|\n| imageSource | image used to represent each point|\n|imageScale | scale of the image used for each point; the default is [.01,.01,.01]|\n|maxPoints| the max number of points drawn each frame|\n\nExample: \n```\n<ViroARScene displayPointCloud={{\n imageSource : require(\"./res/pointCloudPoint.png\"),\n imageScale : [.02,.02,.02],\n maxPoints : 100 }} />\n```", "8-0": "**onARPointCloudUpdate**", "8-1": "**PropTypes.func**\n\nThis callback is invoked whenever the point cloud is updated. \n\n|Parameters | Description|\n|---|---|\n|pointCloud|A JavaScript object containing the point cloud in the format below|\n\n```\n{\n \"pointCloud\" : {\n \"points\" : [ [x, y, z, confidence], ... ],\n \"identifiers\" : [ identifier1, identifier2, ... 
]\n }\n}\n```\n\nwhere:\n\n`x, y, z` - represent the x, y, z coordinates of the point in world space\n`confidence` - is a float value from 0 -> 1 that represents the confidence that the underlying system has for this point (Android only)\n`identifier` - is a number that is unique to the corresponding point in the points array that allows the user to track points between point cloud updates (iOS only)", "0-0": "**anchorDetectionTypes**", "0-1": "**PropTypes.string**\nor \n**PropTypes.arrayOf(PropTypes.string)**\n\nDetermines what types of anchors the scene should return.\n\nCurrently supports the following values:\n*None*\n*PlanesHorizontal*\n*PlanesVertical* - vertical planes are an ARKit 1.5+/iOS 11.3+ feature", "9-0": "**onCameraARHitTest**", "9-1": "**PropTypes.func**\n\nIf defined, a callback is invoked returning the camera position and orientation along with a set of hit results in an array consisting of [ARHitTestResult](doc:viroarscene#arhittestresult) objects. The hit test results correspond to the AR points found by the AR system defined by the ray shooting from the camera direction and position. \n\nThis can be used to show a tracking plane placed in the world while the user moves or to inform the user of the confidence of the area being looked at.\n\nIf defined, this callback is invoked as often as possible in order to keep up with the frame rate.\n\nThe following object structure is returned:\n\n```\n{\n \"hitTestResults\": [ [ARHitTestResult1],\n [ARHitTestResult2],...]\n \"cameraOrientation\": {\n position: [x, y, z], \n rotation:[x,y,z], \n forward:[x,y,z], \n up: [x,y,z]\n }\n}\n```\nThe [ARHitTestResult](doc:viroarscene#arhittestresult) format is described [here](doc:viroarscene#arhittestresult).\n\ncameraOrientation consists of position, the rotation of the camera in degrees, and the current forward and up vectors of the camera.", "22-0": "**onTrackingUpdated**", "22-1": "**PropTypes.func** \n\nInvoked when the tracking state of the device changes. 
The tracking state indicates how well the device is able to track its position within the real world. Tracking state is subject to lighting conditions, the speed at which the device is moving, and other environmental factors.\n\nSample code:\n\n(Note that we use [ViroConstants](doc:viroconstants) to properly compare different tracking states): \n\n``` \n_onTrackingUpdated(state, reason) {\n if (state == ViroConstants.TRACKING_NORMAL){\n // Show my AR Scene experience\n } else if (state == ViroConstants.TRACKING_NONE){\n // Prompt user to move phone around\n }\n },\n``` \n\nTracking states include:\n\n|AR Tracking State Values|Description|\n|:---|:---:|\n|1: TRACKING_UNAVAILABLE| Tracking is unavailable: the camera's position in the world is not known. |\n|2: TRACKING_LIMITED| Tracking is available, but the camera's position in the world may be inaccurate and should not be used with confidence. |\n|3: TRACKING_NORMAL| Camera position tracking is providing optimal results. |\n\nFor iOS, a possible diagnosis for limited tracking quality is provided in the second parameter: \"reason\". These states include:\n\n|AR Tracking State Reason|Description|\n|:---|:---:|\n|1: TRACKING_REASON_NONE|The current tracking state is not limited. |\n|2: TRACKING_REASON_EXCESSIVE_MOTION| The device is moving too fast for accurate position tracking. |\n|3: TRACKING_REASON_INSUFFICIENT_FEATURES| The scene visible to the camera does not contain enough distinguishable features for optimal position tracking. 
|" }, "cols": 2, "rows": 26 } [/block] [block:api-header] { "title": "Methods" } [/block] [block:parameters] { "data": { "h-0": "async findCollisionsWithRayAsync(from: arrayOf(number), to: arrayOf(number), closest: bool, viroTag: string)", "0-0": "This function is used to find collisions between [physics](doc:physics) bodies and a line emanating from the given `from` position to the `to` position. Collided components have their `onCollision` callbacks invoked.\n\n|Parameters|Description|\n|---|---|\n|from|the origin position of the line|\n|to|the end position of the line|\n|closest| if true, only the first object intersected by the line (determined by closest distance to the origin) receives the `onCollision` callback|\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\n\n|Return Values|Description|\n|---|---|\n|hasHit| true/false whether or not a collision was detected|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async findCollisionsWithShapeAsync(from:arrayOf(number), to:arrayOf(number), shapeString: string, shapeParam: object, viroTag: string)", "0-0": "This function is used to find collisions between [physics](doc:physics) bodies and the given shape moving from the given `from` position to the `to` position. 
Collided components have their `onCollision` callbacks invoked.\n\nIf the `from` and `to` positions are the same, then this function invokes the `onCollision` callbacks of all components within the given shape.\n\n|Parameters|Description|\n|---|---|\n|from|the origin position of the line|\n|to|the end position of the line|\n|shapeString| the name of the shape to use in this test|\n|shapeParam| the configuration of the shape used in this collision test|\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\n\n|Return Value|Description|\n|---|---|\n|hasHit| true/false whether or not a collision was detected|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async getCameraOrientationAsync()", "0-0": "This function is used to fetch the current Camera's orientation.\n\n|Return Value|Description|\n|---|---|\n|orientation|an object that contains the camera's `position`, `rotation`, `forward` vector and `up` vector as number arrays|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async performARHitTestWithRay(ray: arrayOf(number))", "0-0": "This function performs an AR system-backed hit test with the given ray from the camera's position outward.\n\n|Return Value|Description|\n|---|---|\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async performARHitTestWithPosition(position: arrayOf(number))", "0-0": "This function performs an AR system-backed hit test with the ray from the camera to the given position.\n\n|Return Value|Description|\n|---|---|\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async 
performARHitTestWithPoint(x:number, y:number)", "0-0": "This function performs an AR system-backed hit test with the given 2D screen coordinates in pixels. You may need to scale the x and y position by the pixel ratio to get the correct result:\n\nFor example:\n```\nperformARHitTestWithPoint(evt.nativeEvent.locationX * PixelRatio.get(), evt.nativeEvent.locationY * PixelRatio.get()) \n```\n\n|Return Value|Description|\n|---|---|\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|" }, "cols": 1, "rows": 1 } [/block] [block:api-header] { "title": "ARHitTestResult" } [/block] These are the individual objects in the array of ARHitTestResults returned by the `performARHitTest...` functions. ``` arHitTestResult = (object) { type : string, transform : (object) { position : array(number), rotation : array(number), scale : array(number) } } ``` [block:parameters] { "data": { "h-0": "Key", "h-1": "Description", "0-0": "type", "1-0": "transform", "0-1": "**string**\n\nThe type of point returned, can only be one of the following:\n\n\"ExistingPlaneUsingExtent\"\n\"ExistingPlane\"\n\"EstimatedHorizontalPlane\"\n\"FeaturePoint\"", "1-1": "**object**\n\nThe transform of the point. Contains the following keys:\n\n`position`, `rotation`, `scale` as arrays of numbers." }, "cols": 2, "rows": 2 } [/block] [block:api-header] { "title": "Anchor" } [/block] This is the object given to the developer through the `onAnchorFound`, `onAnchorUpdated` and `onAnchorRemoved` callback functions. 
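For illustration, a handler could collect the documented fields off the anchor object like this (a minimal sketch; `summarizeAnchor` is a hypothetical helper, not part of the Viro API):

```javascript
// Minimal sketch (not part of the Viro API): gather the documented Anchor
// fields into a plain object, e.g. from inside an onAnchorFound handler.
function summarizeAnchor(anchor) {
  var summary = {
    id: anchor.anchorId,       // string id of the anchor
    type: anchor.type,         // string type of the anchor
    position: anchor.position, // [x, y, z] in world coordinates
    rotation: anchor.rotation  // rotation of the anchor in degrees
  };
  // Plane-backed anchors (ViroARPlane) additionally report their extent.
  if (anchor.width !== undefined && anchor.height !== undefined) {
    summary.extent = [anchor.width, anchor.height];
  }
  return summary;
}
```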
[block:parameters] { "data": { "h-0": "Key", "h-1": "Value", "0-0": "anchorId", "3-0": "rotation", "4-0": "center ([ViroARPlane](doc:viroarplane) only)", "6-0": "width ([ViroARPlane](doc:viroarplane) only)", "7-0": "height ([ViroARPlane](doc:viroarplane) only)", "6-1": "**number**\n\nCurrent width of the attached plane", "7-1": "**number**\n\nCurrent height of the attached plane", "0-1": "**string**\n\nId of the anchor", "3-1": "**arrayOf(number)**\n\nRotation of the anchor in degrees.", "4-1": "**arrayOf(number)**\n\nCenter of the plane relative to the plane's position.", "2-0": "position", "2-1": "**arrayOf(number)**\n\nPosition of the anchor in world coordinates.", "1-0": "type", "1-1": "**string**\n\ntype of the anchor", "5-0": "alignment ([ViroARPlane](doc:viroarplane) only)", "5-1": "**string**\n\nThe plane alignment, one of the following values:\n\"horizontal\" - iOS only\n\"HorizontalDownwards\" - Android only\n\"HorizontalUpwards\" - Android only\n\"NonHorizontal\" - Android only", "8-0": "vertices", "8-1": "**arrayOf(arrayOf(number))**\n\nAn array of 3D points representing the vertices along the boundary of a polygonal plane for this\nViroARPlane. Although the contents of this property consist of 3D points, the represented polygonal plane is always two-dimensional, and is always positioned along only the x and z axes. These points are always placed relative to the ViroARPlane's center transform." 
}, "cols": 2, "rows": 9 } [/block] [block:api-header] { "title": "Post-Process Effects" } [/block] [block:parameters] { "data": { "h-0": "Effect", "h-1": "Description", "0-0": "grayscale", "1-0": "sepia", "2-0": "sincity", "3-0": "baralleldistortion", "4-0": "pincushiondistortion", "5-0": "thermalvision", "6-0": "crosshatch", "7-0": "pixelated", "0-1": "An effect where the resulting image is in black and white.", "1-1": "An effect where the resulting image has a dark reddish-brown pigment color effect on it.", "2-1": "A Sin City-like effect where the resulting image is in black and white, except where there are saturated red colors.", "3-1": "A fish-eye-like effect where the lens distortion becomes more pronounced towards the center of the image.", "4-1": "A cushioning effect where the resulting image is \"pinched\" into the center.", "5-1": "A coloring effect where the resulting image gives off a \"radiant heat\" look from a thermal sensor.", "6-1": "An effect where the resulting image is made up of tiny crossed lines that recreate the scene.", "7-1": "An effect where the resulting image is pixelated." }, "cols": 2, "rows": 8 } [/block]
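These effect names are passed as plain strings via the `postProcessEffects` prop described in the Props section. As a small illustrative sketch (the `validEffects` helper below is hypothetical, not part of the Viro API; the effect keys are copied verbatim from the table, including the library's spelling of `baralleldistortion`):

```javascript
// Effect keys copied verbatim from the Post-Process Effects table above.
var POST_PROCESS_EFFECTS = [
  "grayscale", "sepia", "sincity", "baralleldistortion",
  "pincushiondistortion", "thermalvision", "crosshatch", "pixelated"
];

// Hypothetical helper: drop unknown effect names before passing the
// list to <ViroARScene postProcessEffects={...}>.
function validEffects(requested) {
  return requested.filter(function (name) {
    return POST_PROCESS_EFFECTS.indexOf(name) !== -1;
  });
}
```

For example, `<ViroARScene postProcessEffects={validEffects(["sepia"])}>` would enable only the sepia effect.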