{"_id":"5a06037134873d0010b39219","category":{"_id":"5a06037134873d0010b39204","version":"5a06037134873d0010b391fe","project":"578c4badbd223d2000cc1441","__v":0,"sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-08-01T23:04:12.838Z","from_sync":false,"order":5,"slug":"api-reference","title":"API Reference"},"project":"578c4badbd223d2000cc1441","user":"576c22a3808cf02b00d37419","parentDoc":null,"version":{"_id":"5a06037134873d0010b391fe","project":"578c4badbd223d2000cc1441","__v":1,"createdAt":"2017-11-10T19:52:17.163Z","releaseDate":"2017-11-10T19:52:17.163Z","categories":["5a06037134873d0010b391ff","5a06037134873d0010b39200","5a06037134873d0010b39201","5a06037134873d0010b39202","5a06037134873d0010b39203","5a06037134873d0010b39204"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"2.1.0","version":"2.1.0"},"__v":0,"updates":[],"next":{"pages":[],"description":""},"createdAt":"2017-09-16T17:27:27.260Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":8,"body":"The `ViroARScene` component allows developers to logically group their experiences and components and switch between them using the [ViroARSceneNavigator](doc:viroarscenenavigator). \n\nThis component also hosts various properties that enable developers to control and interact with the AR subsystem. Like `displayPointCloud` which configures the renderer to draw the AR point cloud. 
The `onAnchorFound|Updated|Removed` functions work in conjunction with `ViroARPlane`'s [manual anchoring](doc:viroarplane#anchoring) mode to enable developers to fully control their experience.\n\n**Example use:**\n\n```\n<ViroARScene onTrackingInitialized={this._hideLoadingUI} >\n  <ViroARPlane>\n    <ViroBox position={[0, .5, 0]} />\n  </ViroARPlane>\n</ViroARScene>\n``` \n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Props\"\n}\n[/block]\n##Optional Props \n[block:parameters]\n{\n  \"data\": {\n    \"13-0\": \"**onPlatformUpdate**\",\n    \"22-0\": \"**rotation**\",\n    \"23-0\": \"**style**\",\n    \"24-0\": \"**text**\",\n    \"25-0\": \"**transformBehaviors**\",\n    \"27-0\": \"**visible**\",\n    \"26-0\": \"**width**\",\n    \"h-0\": \"PropKey\",\n    \"h-1\": \"PropType\",\n    \"13-1\": \"**React.PropTypes.func**\\n\\nCallback method set to be notified of platform-specific information, like headset type or controller type.\\n\\nExample Code:\\n```  \\n_onPlatformUpdate(platformInfo){\\n\\tvar platform = platformInfo.vrPlatform;\\n\\tvar headset = platformInfo.headset;\\n\\tvar controller = platformInfo.controller;\\n}\\n```\\n\\nList of supported platforms:\\n\\n| |Cardboard iOS|Cardboard Android|Daydream|GearVR|\\n|:-------------------|:---------------|:---------------|:---------------|:---------------|\\n|Platform|gvr|gvr|gvr|ovr-mobile|\\n|Headset|cardboard|cardboard|daydream|gearvr|\\n|Controller|cardboard|cardboard|daydream|gearvr|\",\n    \"22-1\": \"PropTypes.arrayOf(PropTypes.number)\\n\\nPut the PropType Description here.\",\n    \"23-1\": \"stylePropType\",\n    \"24-1\": \"PropTypes.string\\n\\nPut the PropType Description here.\",\n    \"25-1\": \"PropTypes.arrayOf(PropTypes.string)\\n\\nPut the PropType Description here.\",\n    \"26-1\": \"PropTypes.number\\n\\nPut the PropType Description here.\",\n    \"27-1\": \"PropTypes.bool\\n\\nPut the PropType Description here.\",\n    \"11-0\": \"**onHover**\",\n    \"11-1\": 
\"**React.PropTypes.func**\\n\\nFunction to be invoked when the user hovers on on the scene. If this is defined, it is invoked **ONLY** if no other object captures the onHover event.\\n\\nFor example:\\n```  \\n_onHover(isHovering)  {\\n    if(isHovering) {\\n        // user is hovering on the scene.\\n    } else {\\n        // user is hovering on another object in the scene.\\n    }\\n}\\n```\",\n    \"8-0\": \"**onClickState**\",\n    \"8-1\": \"**PropTypes.func**\\n\\nCalled for each click state an object goes through as it is clicked. Supported click states and their values are the following:\\n\\n|State Value|Description|\\n|:- -----|:- ---------:|\\n|1| Click Down: Triggered when the user has performed a click down action while hovering on this control.|\\n|2| Click Up: Triggered when the user has performed a click up action while hovering on this control.|\\n|3| Clicked: Triggered when the user has performed both a click down and click up action on this control sequentially, thereby having \\\"Clicked\\\" the object.|\\n\\nExample code:\\n```  \\n_onClickState(stateValue, source)  {\\n    if(stateValue == 1) {\\n        // Click Down\\n    } else if(stateValue == 2) {\\n        // Click Up\\n    } else if(stateValue == 3) { \\n        // Clicked\\n    }\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\\n\\nThis is **ONLY** invoked if a click is not captured on another object within a scene.\",\n    \"15-1\": \"**React.PropTypes.func**\\n\\nNEED TO UPDATE DESCRIPTION\",\n    \"16-1\": \"**React.PropTypes.func**\\n\\nNEED TO UPDATE DESCRIPTION\",\n    \"17-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a touch action, while hovering on the control. 
Provides the touch state type, and the x/y coordinate at which this touch event has occurred.\\n\\n|State Value|Description|\\n|:------|:----------:|\\n|1| Touch Down: Triggered when the user makes physical contact with the touch pad on the controller. |\\n|2| Touch Down Move: Called when the user moves around the touch pad immediately after having performed a Touch Down action. |\\n|3| Touch Up: Triggered after the user is no longer in physical contact with the touch pad after a Touch Down action. |\\n\\nFor example:\\n```  \\n_onTouch(state, touchPos, source)  {\\n   var touchX = touchPos[0];\\n   var touchY = touchPos[1];\\n    if(state == 1) {\\n        // Touch Down\\n    } else if(state == 2) {\\n        // Touch Down Move\\n    } else if(state == 3) { \\n        // Touch Up\\n    }\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\\n\\nUnsupported VR Platforms: Cardboard (Android and iOS).\",\n    \"15-0\": \"**onScroll**\",\n    \"16-0\": \"**onSwipe**\",\n    \"17-0\": \"**onTouch**\",\n    \"21-0\": \"**soundRoom**\",\n    \"21-1\": \"**PropTypes.shape**\\n\\nDescribes the acoustic properties of the room around the user, defined by the room's dimensions and its surface materials. 
Note: This is not supported in Cardboard iOS.\\n\\nCode Example:\\n```  \\n    soundRoom={{\\n      size: [2,2,2],\\n      wallMaterial: \\\"acoustic_ceiling_tiles\\\",\\n      ceilingMaterial:\\\"glass_thin\\\",\\n      floorMaterial:\\\"concrete_block_coarse\\\"\\n    }}\\n```\\nList of soundRoom properties:\\n\\n|Name|Description|\\n|:-------------------|:---------------|\\n|size|The 3D dimensions of the room.|\\n|wallMaterial|Sound Material for the four walls.|\\n|ceilingMaterial|Sound Material for the ceiling.|\\n|floorMaterial|Sound Material for the floor.|\\n\\nList of Supported Sound Materials:\\n\\n|Name|Description|\\n|:-------------------|:---------------|\\n|acoustic_ceiling_tiles|Acoustic ceiling tiles, absorbs most frequencies.|\\n|brick_bare|Bare brick, relatively reflective.|\\n|brick_painted|Painted brick.|\\n|concrete_block_coarse|Coarse surface concrete block.|\\n|concrete_block_painted|Painted concrete block.|\\n|curtain_heavy|Heavy curtains.|\\n|fiber_glass_insulation|Fiber glass insulation.|\\n|glass_thin|Thin glass.|\\n|glass_thick|Thick glass.|\\n|grass|Grass.|\\n|linoleum_on_concrete|Linoleum on concrete.|\\n|marble|Marble.|\\n|metal|Galvanized sheet metal.|\\n|parquet_on_concrete|Wooden parquet on concrete.|\\n|plaster_rough|Rough plaster surface.|\\n|plaster_smooth|Smooth plaster surface.|\\n|plywood_panel|Plywood panel.|\\n|polished_concrete_or_tile|Polished concrete or tile surface.|\\n|sheet_rock|Sheet rock.|\\n|transparent|Acoustically transparent material, reflects no sound.|\\n|water_or_ice_surface|Surface of water or ice.|\\n|wood_ceiling|Wooden ceiling.|\\n|wood_panel|Wood paneling.|\",\n    \"10-0\": \"**onFuse**\",\n    \"10-1\": \"**PropTypes.oneOfType**\\n```  \\nPropTypes.oneOfType([\\n      React.PropTypes.shape({\\n        callback: React.PropTypes.func.isRequired,\\n        timeToFuse: PropTypes.number\\n      }),\\n      React.PropTypes.func,\\n])\\n```  \\nAs shown above, onFuse takes one of two types - either a 
callback, or a dictionary with a callback and duration. \\n\\nIt is called after the user hovers onto and remains hovered on the control for the duration specified (in milliseconds) by timeToFuse. \\n\\nWhile hovering, the reticle will display a count-down animation while fusing towards timeToFuse.\\n\\nNote that timeToFuse defaults to 2000ms.\\n\\nFor example:\\n```  \\n_onFuse(source){\\n   // User has hovered over object for timeToFuse milliseconds\\n}\\n```\\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\",\n    \"7-0\": \"**onClick**\",\n    \"7-1\": \"**PropTypes.func**\\n\\nFunction to be invoked when a user clicks on a scene. This is **ONLY** invoked if a click is not captured on another object within a scene.\\n\\nDefining this is a common way to register clicks on 360 photos and videos.\",\n    \"2-0\": \"**ignoreEventHandling**\",\n    \"9-0\": \"**onDrag**\",\n    \"12-0\": \"**onPinch**\",\n    \"14-0\": \"**onRotate**\",\n    \"18-0\": \"**onTrackingInitialized**\",\n    \"3-0\": \"**onAmbientLightUpdate**\",\n    \"20-0\": \"**physicsWorld**\",\n    \"19-0\": \"**postProcessEffects**\",\n    \"9-1\": \"**PropTypes.func**\\n\\nCalled continuously while the view is being dragged. The dragToPos parameter provides the current 3D location of the dragged object. \\n\\nExample code:\\n```  \\n_onDrag(dragToPos, source)  {\\n    // dragToPos[0]: x position\\n    // dragToPos[1]: y position\\n    // dragToPos[2]: z position\\n}\\n``` \\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section. \\n\\nUnsupported VR Platforms: Cardboard iOS\",\n    \"2-1\": \"**PropTypes.bool**\\n\\nWhen set to true, this control will ignore events and not prevent controls behind it from receiving event callbacks.\\n\\nThe default value is false.\",\n    \"12-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a pinch gesture on the control. 
When the pinch starts, the scale factor is set to 1; as the gesture continues, it is updated relative to the initial distance between the two touch points.  \\n\\nFor example:\\n```\\n  _onPinch(pinchState, scaleFactor, source) {\\n       if(pinchState == 3) {\\n      // update scale of obj by multiplying by scaleFactor when pinch ends.\\n        return;\\n       }\\n     //set scale using native props to reflect pinch.\\n  }\\n```\\npinchState can be the following values:\\n\\n|State Value|Description|\\n|:------|:----------:|\\n|1| Pinch Start: Triggered when the user has started a pinch gesture.|\\n|2| Pinch Move: Triggered when the user has adjusted the pinch, moving both fingers. |\\n|3| Pinch End: Triggered when the user finishes the pinch gesture and releases both touch points. |\\n\\n**This event is only available in AR iOS**.\",\n    \"14-1\": \"**React.PropTypes.func**\\n\\nCalled when the user performs a rotation touch gesture on the control. The rotation factor is returned in degrees.\\n\\nWhen setting rotation, apply rotationFactor relative to the object's current rotation; do *not* set it as the absolute rotation value.\\n\\nFor example:\\n\\n```\\n    _onRotate(rotateState, rotationFactor, source) {\\n\\n      if (rotateState == 3) {\\n        //set to current rotation - rotationFactor.\\n        return;\\n      }\\n     //update rotation using setNativeProps\\n    },\\n\\n```\\nrotateState can be the following values:\\n\\n|State Value|Description|\\n|:------|:----------:|\\n|1| Rotation Start: Triggered when the user has started a rotation gesture.|\\n|2| Rotation Move: Triggered when the user has adjusted the rotation, moving both fingers. |\\n|3| Rotation End: Triggered when the user finishes the rotation gesture and releases both touch points. 
|\\n\\n**This event is only available in AR iOS**.\",\n    \"3-1\": \"**PropTypes.func**\\n\\nFunction that provides an estimate of the light intensity and color temperature.\\n\\n|Parameter|Description|\\n|- --|- --|\\n|intensity| a number representing the estimated intensity of the ambient light as detected by the camera|\\n|colorTemperature|a number representing the estimated colorTemperature of the ambient light as detected by the camera|\",\n    \"18-1\": \"**PropTypes.func**\\n\\nFunction called when the AR system has properly initialized. Until this function is called, the camera position is\",\n    \"19-1\": \"**PropTypes.arrayOf(PropTypes.string)**\\n\\nSpecifies which post-process effects to enable. Refer to [Post-Process Effects](doc:viroarscene#post-process-effects) for more information.\",\n    \"20-1\": \"**PropTypes.shape({\\n      gravity: PropTypes.arrayOf(PropTypes.number).isRequired,\\n       drawBounds: PropTypes.bool,\\n    })**\\n\\nContains and processes the physics bodies of all viro controls that have been physics enabled in this scene. Environmental physics properties are also applied, like gravity. \\n\\n|SubPropType|Description|\\n|:------|:----------:|\\n|gravity| A constant gravitational acceleration that is applied to all physics body objects in this scene. It is a vector in the terms of meters per second. 
Defaults to [0, -9.81, 0].|\\n|drawBounds| If true, renders the mesh representing the shape of all physics bodies in this scene.|\",\n    \"4-0\": \"**onAnchorFound**\",\n    \"5-0\": \"**onAnchorUpdated**\",\n    \"6-0\": \"**onAnchorRemoved**\",\n    \"4-1\": \"**PropTypes.func**\\n\\nCalled when the AR system finds an Anchor.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see  [Anchor](doc:viroarscene#anchor) |\",\n    \"5-1\": \"**PropTypes.func**\\n\\nCalled when the AR system detects changed properties of a previously found Anchor.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see  [Anchor](doc:viroarscene#anchor) |\",\n    \"6-1\": \"**PropTypes.func**\\n\\nCalled when the AR system detects that a previously found Anchor no longer exists.\\n\\n|Parameters  | Description |\\n|---|---|\\n|anchor| see  [Anchor](doc:viroarscene#anchor) |\",\n    \"1-0\": \"**dragType**\",\n    \"1-1\": \"**PropTypes.oneOf([\\\"FixedDistance\\\", \\\"FixedToWorld\\\"])**\\n\\nDetermines the behavior of drag if **onDrag** is specified.\\n\\n|Value|Description|\\n|:------|:----------:|\\n|FixedDistance| Dragging is limited to a fixed radius around the user.|\\n|FixedToWorld| Dragging is based on intersection with real world objects. 
**Available only in AR** |\\n\\nThe default value is \\\"FixedDistance\\\".\",\n    \"0-0\": \"**displayPointCloud**\",\n    \"0-1\": \"**PropTypes.bool**\\nor\\n**{pointCloudOptions}** described below.\\n\\nSetting this property to `true` draws the point cloud using a default configuration.\\n\\nSetting this property to `false` disables the drawing of the point cloud.\\n\\nThis property can also take a dictionary of properties that enables point cloud drawing with the given **pointCloudOptions:**\\n\\n|Key|Description|\\n|---|---|\\n| imageSource | Image used to represent each point.|\\n|imageScale | Scale of the image used for each point; the default is [.01,.01,.01].|\\n|maxPoints| The maximum number of points drawn each frame.|\\n\\nExample: \\n```\\n<ViroARScene displayPointCloud={{\\n    imageSource : require(\\\"./res/pointCloudPoint.png\\\"),\\n    imageScale : [.02,.02,.02],\\n    maxPoints : 100 }} />\\n```\"\n  },\n  \"cols\": 2,\n  \"rows\": 22\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Methods\"\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async findCollisionsWithRayAsync(from: arrayOf(number), to: arrayOf(number), closest: bool, viroTag: string)\",\n    \"0-0\": \"This function is used to find collisions between [physics](doc:physics) bodies and a line emanating from the given `from` position to the `to` position. 
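For instance, the `to` point can be derived by extending the camera's `forward` vector (as returned by `getCameraOrientationAsync()`) out from its `position`. A minimal sketch; the helper name, the `arScene` ref, and the `camera` variable are illustrative, not part of the API:

```javascript
// Compute the end point of a ray that starts at 'position' and extends
// 'distance' meters along the (normalized) 'forward' vector.
function rayEndPoint(position, forward, distance) {
  return position.map((p, i) => p + forward[i] * distance);
}

// Hypothetical usage against this scene:
// const to = rayEndPoint(camera.position, camera.forward, 5);
// this.refs['arScene']
//   .findCollisionsWithRayAsync(camera.position, to, true, 'rayTag')
//   .then((hasHit) => { /* hasHit: whether anything was intersected */ });
```
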
Collided components have their `onCollision` callbacks invoked.\\n\\n|Parameters|Description|\\n|---|---|\\n|from|The origin position of the line.|\\n|to|The end position of the line.|\\n|closest| If true, only the first object intersected by the line (determined by closest distance to the origin) receives the `onCollision` callback.|\\n|viroTag|The string tag passed to collided components' `onCollision` callbacks.|\\n\\n|Return Values|Description|\\n|---|---|\\n|hasHit| Whether or not a collision was detected.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async findCollisionsWithShapeAsync(from:arrayOf(number), to:arrayOf(number), shapeString: string, shapeParam: object, viroTag: string)\",\n    \"0-0\": \"This function is used to find collisions between [physics](doc:physics) bodies and the given shape moving from the given `from` position to the `to` position. Collided components have their `onCollision` callbacks invoked.\\n\\nIf the `from` and `to` positions are the same, then this function invokes the `onCollision` callbacks of all components within the given shape.\\n\\n|Parameters|Description|\\n|---|---|\\n|from|The start position of the shape's movement.|\\n|to|The end position of the shape's movement.|\\n|shapeString| The name of the shape to use in this test.|\\n|shapeParam| The configuration of the shape used in this collision test.|\\n|viroTag|The string tag passed to collided components' `onCollision` callbacks.|\\n\\n|Return Value|Description|\\n|---|---|\\n|hasHit| Whether or not a collision was detected.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async getCameraOrientationAsync()\",\n    \"0-0\": \"This function is used to fetch the current camera's orientation.\\n\\n|Return Value|Description|\\n|---|---|\\n|orientation|An object that contains the camera's `position`, `rotation`, `forward` vector and `up` vector as number
arrays|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async performARHitTestWithRay(ray: arrayOf(number))\",\n    \"0-0\": \"This function performs a AR system-backed hit test with the given ray from the camera's position outward.\\n\\n|Return Value|Description|\\n|---|---|\\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"async performARHitTestWithPosition(position: arrayOf(number))\",\n    \"0-0\": \"This function performs an AR system-backed hit test with the ray from the camera to the given position.\\n\\n|Return Value|Description|\\n|---|---|\\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|\"\n  },\n  \"cols\": 1,\n  \"rows\": 1\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"ARHitTestResult\"\n}\n[/block]\nThese are the individual objects in the array of ARHitTestResults returned by the two `performARHitTest...` functions. \n\n```\narHitTestResult = (object) {\n  type : string,\n  transform : (object) {\n    position : array(number),\n    rotation : array(number),\n    scale : array(number)\n  }\n}\n```\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Key\",\n    \"h-1\": \"Description\",\n    \"0-0\": \"type\",\n    \"1-0\": \"transform\",\n    \"0-1\": \"**string**\\n\\nThe type of point returned, can only be one of the following:\\n\\n\\\"ExistingPlaneUsingExtent\\\"\\n\\\"ExistingPlane\\\"\\n\\\"EstimatedHorizontalPlane\\\"\\n\\\"FeaturePoint\\\"\",\n    \"1-1\": \"**object**\\n\\nThe transform of the point. 
Contains the following keys:\\n\\n`position`, `rotation`, `scale` as arrays of numbers.\"\n  },\n  \"cols\": 2,\n  \"rows\": 2\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Anchor\"\n}\n[/block]\nThis is the object given to the developer through the `onAnchorFound`, `onAnchorUpdated` and `onAnchorRemoved` callback functions.\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Key\",\n    \"h-1\": \"Value\",\n    \"0-0\": \"anchorId\",\n    \"3-0\": \"rotation\",\n    \"4-0\": \"center ([ViroARPlane](doc:viroarplane) only)\",\n    \"6-0\": \"width ([ViroARPlane](doc:viroarplane) only)\",\n    \"7-0\": \"height ([ViroARPlane](doc:viroarplane) only)\",\n    \"6-1\": \"**number**\\n\\nCurrent width of the attached plane.\",\n    \"7-1\": \"**number**\\n\\nCurrent height of the attached plane.\",\n    \"0-1\": \"**string**\\n\\nId of the anchor.\",\n    \"3-1\": \"**arrayOf(number)**\\n\\nRotation of the anchor in degrees.\",\n    \"4-1\": \"**arrayOf(number)**\\n\\nCenter of the plane relative to the plane's position.\",\n    \"2-0\": \"position\",\n    \"2-1\": \"**arrayOf(number)**\\n\\nPosition of the anchor in world coordinates.\",\n    \"1-0\": \"type\",\n    \"1-1\": \"**string**\\n\\nType of the anchor.\",\n    \"5-0\": \"alignment ([ViroARPlane](doc:viroarplane) only)\",\n    \"5-1\": \"**string**\\n\\nThe plane alignment, one of the following values:\\n\\\"horizontal\\\" - iOS only\\n\\\"HorizontalDownwards\\\" - Android only\\n\\\"HorizontalUpwards\\\" - Android only\\n\\\"NonHorizontal\\\" - Android only\"\n  },\n  \"cols\": 2,\n  \"rows\": 8\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"Post-Process Effects\"\n}\n[/block]\n\n[block:parameters]\n{\n  \"data\": {\n    \"h-0\": \"Effect\",\n    \"h-1\": \"Description\",\n    \"0-0\": \"grayscale\",\n    \"1-0\": \"sepia\",\n    \"2-0\": \"sincity\",\n    \"3-0\": \"baralleldistortion\",\n    \"4-0\": \"pincushiondistortion\",\n    \"5-0\": \"thermalvision\",\n    \"6-0\": 
\"crosshatch\",\n    \"7-0\": \"pixelated\",\n    \"0-1\": \"An effect where the resulting image is in black and white.\",\n    \"1-1\": \"An effect where the resulting image has a dark reddish-brown pigment color effect on it.\",\n    \"2-1\": \"A sin-city like effect where the resulting image is in black and white, except for places where there is saturated red colors.\",\n    \"3-1\": \"A fish-eye-like effect where the fish eye lens \\ndistortion becomes more pronounce towards the center of the image.\",\n    \"4-1\": \"A cushioning effect where the resulting image is \\\"pinched\\\" into the center.\",\n    \"5-1\": \"A coloring effect where the resulting image gives of a \\\"radiant heat\\\" look from a thermal sensor.\",\n    \"6-1\": \"An effect where the resulting image is made up of tiny crossed lines that recreates the scene.\",\n    \"7-1\": \"An effect where the resulting image is pixelized.\"\n  },\n  \"cols\": 2,\n  \"rows\": 8\n}\n[/block]","excerpt":"","slug":"viroarscene","type":"basic","title":"ViroARScene"}
The `ViroARScene` component allows developers to logically group their experiences and components and switch between them using the [ViroARSceneNavigator](doc:viroarscenenavigator). This component also hosts various properties that enable developers to control and interact with the AR subsystem. Like `displayPointCloud` which configures the renderer to draw the AR point cloud. The `onAnchorFound|Updated|Removed` functions work in conjunction with `ViroARPlane's` [manual anchoring](doc:viroarplane#anchoring) mode to enable developers to fully control their experience. **Example use:** ``` <ViroARScene onTrackingInitialized={this._hideLoadingUI} > <ViroARPlane> <ViroBox position={[0, .5, 0]} /> </ViroARPlane> </ViroARScene> ``` [block:api-header] { "type": "basic", "title": "Props" } [/block] ##Optional Props [block:parameters] { "data": { "13-0": "**onPlatformUpdate**", "22-0": "**rotation**", "23-0": "**style**", "24-0": "**text**", "25-0": "**transformBehaviors**", "27-0": "**visible**", "26-0": "**width**", "h-0": "PropKey", "h-1": "PropType", "13-1": "**React.PropTypes.func**\n\nCallback method set to be notified of platform specific information like headset type or controller type.\n\nExample Code:\n``` \n_onPlatformUpdate(platformInfo){\n\tvar platform = platformInfo.vrPlatform;\n\tvar headset = platformInfo.headset;\n\tvar controller = platformInfo.controller;\n}\n```\n\nList of supported platforms:\n\n| |Cardboard iOS|Cardboard Android|Daydream|GearVR\n|:- ------------------|:- --------------|:- --------------|:- --------------|:- --------------|\n|Platform|gvr|gvr|gvr|ovr-mobile|\n|Headset|cardboard|cardboard|daydream|gearvr|\n|Controller|cardboard|cardboard|daydream|gearvr|", "22-1": "PropTypes.arrayOf(PropTypes.number)\n\nPut the PropType Description here.", "23-1": "stylePropType", "24-1": "PropTypes.string\n\nPut the PropType Description here.", "25-1": "PropTypes.arrayOf(PropTypes.string)\n\nPut the PropType Description here.", "26-1": 
"PropTypes.number\n\nPut the PropType Description here.", "27-1": "PropTypes.bool\n\nPut the PropType Description here.", "11-0": "**onHover**", "11-1": "**React.PropTypes.func**\n\nFunction to be invoked when the user hovers on on the scene. If this is defined, it is invoked **ONLY** if no other object captures the onHover event.\n\nFor example:\n``` \n_onHover(isHovering) {\n if(isHovering) {\n // user is hovering on the scene.\n } else {\n // user is hovering on another object in the scene.\n }\n}\n```", "8-0": "**onClickState**", "8-1": "**PropTypes.func**\n\nCalled for each click state an object goes through as it is clicked. Supported click states and their values are the following:\n\n|State Value|Description|\n|:- -----|:- ---------:|\n|1| Click Down: Triggered when the user has performed a click down action while hovering on this control.|\n|2| Click Up: Triggered when the user has performed a click up action while hovering on this control.|\n|3| Clicked: Triggered when the user has performed both a click down and click up action on this control sequentially, thereby having \"Clicked\" the object.|\n\nExample code:\n``` \n_onClickState(stateValue, source) {\n if(stateValue == 1) {\n // Click Down\n } else if(stateValue == 2) {\n // Click Up\n } else if(stateValue == 3) { \n // Clicked\n }\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\n\nThis is **ONLY** invoked if a click is not captured on another object within a scene.", "15-1": "**React.PropTypes.func**\n\nNEED TO UPDATE DESCRIPTION", "16-1": "**React.PropTypes.func**\n\nNEED TO UPDATE DESCRIPTION", "17-1": "**React.PropTypes.func**\n\nCalled when the user performs a touch action, while hovering on the control. 
Provides the touch state type, and the x/y coordinate at which this touch event has occurred.\n\n|State Value|Description|\n|:- -----|:- ---------:|\n|1| Touch Down: Triggered when the user makes physical contact with the touch pad on the controller. |\n|2| Touch Down Move: Called when the user moves around the touch pad immediately after having performed a Touch Down action. |\n|3| Touch Up: Triggered after the user is no longer in physical contact with the touch pad after a Touch Down action. |\n\nFor example:\n``` \n_onTouch(state, touchPos, source) {\n var touchX = touchPos[0];\n var touchY = touchPos[1];\n if(state == 1) {\n // Touch Down\n } else if(state == 2) {\n // Touch Down Move\n } else if(state == 3) { \n // Touch Up\n }\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.\n\nUnsupported VR Platforms: Cardboard(Android and iOS).", "15-0": "**onScroll**", "16-0": "**onSwipe**", "17-0": "**onTouch**", "21-0": "**soundRoom**", "21-1": "**PropTypes.shape**\n\nDescribes the acoustic properties of the room around the user by allowing the developer to describe the room based on its dimensions and its surface properties. 
Note: This is not supported in Cardboard iOS.\n\nCode Example:\n``` \n soundRoom={{\n size: {[2,2,2]},\n wallMaterial: \"acoustic_ceiling_tiles\",\n ceilingMaterial:\"glass_thin\",\n floorMaterial:\"concrete_block_coarse\"\n }}\n```\nList of soundRoom properties:\n\n|Name|Description|\n|:- ------------------|:- --------------|\n|size|The 3D dimensions of the room.|\n|wallMaterial|Sound Material for the four walls.|\n|ceilingMaterial|Sound Material for the ceiling|\n|floorMaterial|Sound Material for the floor|\n\nList of Supported Sound Materials:\n\n|Name|Description|\n|:- ------------------|:- --------------|\n|acoustic_ceiling_tiles|Acoustic ceiling tiles, absorbs most frequencies.|\n|brick_bare|Bare brick, relatively reflective.|\n|brick_painted|Painted brick|\n|concrete_block_coarse|Coarse surface concrete block.|\n|concrete_block_painted|Painted concrete block.|\n|curtain_heavy|Heavy curtains.|\n|fiber_glass_insulation|Fiber glass insulation.|\n|glass_thin|Thin glass.|\n|glass_thick|Thick glass.|\n|grass|Grass.|\n|linoleum_on_concrete|Linoleum on concrete.|\n|marble|Marble.|\n|metal|Galvanized sheet metal.|\n|parquet_on_concrete|Wooden parquet on concrete.|\n|plaster_rough|Rough plaster surface.|\n|plaster_smooth|Smooth plaster surface.|\n|plywood_panel|Plywood panel|\n|polished_concrete_or_tile|Polished concrete or tile surface.|\n|sheet_rock|Sheet rock|\n|transparent|Acoustically transparent material, reflects no sound.|\n|water_or_ice_surface|Surface of water or ice.|\n|wood_ceiling|Wooden ceiling.|\n|wood_panel|Wood paneling.|", "10-0": "**onFuse**", "10-1": "**PropTypes.oneOfType**\n``` \nPropTypes.oneOfType([\n React.PropTypes.shape({\n callback: React.PropTypes.func.isRequired,\n timeToFuse: PropTypes.number\n }),\n React.PropTypes.func,\n])\n``` \nAs shown above, onFuse takes one of the types - either a callback, or a dictionary with a callback and duration. 
\n\nIt is called after the user hovers onto and remains hovered on the control for a certain duration of time, as indicated in timeToFuse that represents the duration of time in milliseconds. \n\nWhile hovering, the reticle will display a count down animation while fusing towards timeToFuse.\n\nNote that timeToFuse defaults to 2000ms.\n\nFor example:\n``` \n_onFuse(source){\n // User has hovered over object for timeToFuse milliseconds\n}\n```\nFor the mapping of sources to controller inputs, see the [Events](doc:events) section.", "7-0": "**onClick**", "7-1": "**PropTypes.func**\n\nFunction to be invoked when a user clicks on a scene. This is **ONLY** invoked if a click is not captured on another object within a scene.\n\nDefining this can be used to register clicks for 360 Photos and videos.", "2-0": "**ignoreEventHandling**", "9-0": "**onDrag**", "12-0": "**onPinch**", "14-0": "**onRotate**", "18-0": "**onTrackingInitialized**", "3-0": "**onAmbientLightUpdate**", "20-0": "**physicsWorld**", "19-0": "**postProcessEffects**", "9-1": "**PropTypes.func**\n\nCalled when the view is currently being dragged. The dragToPos parameter provides the current 3D location of the dragged object. \n\nExample code:\n``` \n_onDrag(dragToPos, source) {\n // dragtoPos[0]: x position\n // dragtoPos[1]: y position\n // dragtoPos[2]: z position\n}\n``` \nFor the mapping of sources to controller inputs, see the [Events](doc:events) section. \n\nUnsupported VR Platforms: Cardboard iOS", "2-1": "**PropTypes.bool**\n\nWhen set to true, this control will ignore events and not prevent controls behind it from receiving event callbacks.\n\nThe default value is false.", "12-1": "**React.PropTypes.func**\n\nCalled when the user performs a pinch gesture on the control. When the pinch starts, the scale factor is set to 1 is relative to the points of the two touch points. 
\n\nFor example:\n```\n _onPinch(pinchState, scaleFactor, source) {\n if(pinchState == 3) {\n // update scale of obj by multiplying by scaleFactor when pinch ends.\n return;\n }\n //set scale using native props to reflect pinch.\n }\n```\npinchState can be the following values:\n\n|State Value|Description|\n|:- -----|:- ---------:|\n|1| Pinch Start: Triggered when the user has started a pinch gesture.|\n|2| Pinch Move: Triggered when the user has adjusted the pinch, moving both fingers. |\n|3| Pinch End: When the user has finishes the pinch gesture and released both touch points. |\n\n**This event is only available in AR iOS**.", "14-1": "**React.PropTypes.func**\n\nCalled when the user performs a rotation touch gesture on the control. Rotation factor is returned in degrees.\n\nWhen setting rotation, the rotation should be relative to it's current rotation, *not* set to the absolute value of the given rotationFactor.\n\nFor example:\n\n```\n _onRotate(rotateState, rotationFactor, source) {\n\n if (rotateState == 3) {\n //set to current rotation - rotationFactor.\n return;\n }\n //update rotation using setNativeProps\n },\n\n```\nrotationFactor can be the following values:\n\n|State Value|Description|\n|:- -----|:- ---------:|\n|1| Rotation Start: Triggered when the user has started a rotation gesture.|\n|2| Rotation Move: Triggered when the user has adjusted the rotation, moving both fingers. |\n|3| Rotation End: When the user has finishes the rotation gesture and released both touch points. 
|\n\n**This event is only available in AR iOS**.", "3-1": "**PropTypes.func**\n\nFunction that provides an estimate of the light intensity and color temperature.\n\n|Parameter|Description|\n|---|---|\n|intensity| a number representing the estimated intensity of the ambient light as detected by the camera|\n|colorTemperature|a number representing the estimated color temperature of the ambient light as detected by the camera|", "18-1": "**PropTypes.func**\n\nFunction called when the AR system has properly initialized. Until this function is called, the camera position is not yet reliably tracked.", "19-1": "**PropTypes.arrayOf(PropTypes.string)**\n\nSpecifies which post-process effects to enable. Refer to [Post-Process Effects](doc:viroarscene#post-process-effects) for more information.", "20-1": "**PropTypes.shape({\n gravity: PropTypes.arrayOf(PropTypes.number).isRequired,\n drawBounds: PropTypes.bool,\n })**\n\nContains and processes the physics bodies of all Viro controls that have been physics-enabled in this scene. Environmental physics properties, like gravity, are also applied. \n\n|SubPropType|Description|\n|:------|:----------:|\n|gravity| A constant gravitational acceleration that is applied to all physics body objects in this scene. It is a vector in terms of meters per second squared. 
Defaults to [0, -9.81, 0].|\n|drawBounds| If true, renders the mesh representing the shape of all physics bodies in this scene.|", "4-0": "**onAnchorFound**", "5-0": "**onAnchorUpdated**", "6-0": "**onAnchorRemoved**", "4-1": "**PropTypes.func**\n\nCalled when the AR system finds an Anchor.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "5-1": "**PropTypes.func**\n\nCalled when the AR system detects changed properties of a previously found Anchor.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "6-1": "**PropTypes.func**\n\nCalled when the AR system detects that a previously found Anchor no longer exists.\n\n|Parameters | Description |\n|---|---|\n|anchor| see [Anchor](doc:viroarscene#anchor) |", "1-0": "**dragType**", "1-1": "**PropTypes.oneOf([\"FixedDistance\", \"FixedToWorld\"])**\n\nDetermines the behavior of drag if **onDrag** is specified.\n\n|Value|Description|\n|:------|:----------|\n|FixedDistance| Dragging is limited to a fixed radius around the user|\n|FixedToWorld| Dragging is based on intersection with real-world objects. 
**Available only in AR** |\n\nThe default value is \"FixedDistance\".", "0-0": "**displayPointCloud**", "0-1": "**PropTypes.bool**\nor\n**{pointCloudOptions}** described below.\n\nSetting this property to `true` draws the point cloud using a default configuration.\n\nSetting this property to `false` disables the drawing of the point cloud.\n\nThis property can also take a dictionary of properties which enable point cloud drawing with the given **pointCloudOptions:**\n\n|Key|Description|\n|---|---|\n| imageSource | image used to represent each point|\n|imageScale | scale of the image used for each point; the default is [.01,.01,.01]|\n|maxPoints| the maximum number of points drawn each frame|\n\nExample: \n```\n<ViroARScene displayPointCloud={{\n imageSource : require(\"./res/pointCloudPoint.png\"),\n imageScale : [.02,.02,.02],\n maxPoints : 100 }} />\n```" }, "cols": 2, "rows": 22 } [/block] [block:api-header] { "title": "Methods" } [/block] [block:parameters] { "data": { "h-0": "async findCollisionsWithRayAsync(from: arrayOf(number), to: arrayOf(number), closest: bool, viroTag: string)", "0-0": "This function is used to find collisions between [physics](doc:physics) bodies and a line emanating from the given `from` position to the `to` position. 
Collided components have their `onCollision` callbacks invoked.\n\n|Parameters|Description|\n|---|---|\n|from|the origin position of the line|\n|to|the end position of the line|\n|closest| if true, only the first object intersected by the line (determined by closest distance to the origin) receives the `onCollision` callback|\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\n\n|Return Value|Description|\n|---|---|\n|hasHit| true if a collision was detected, false otherwise|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async findCollisionsWithShapeAsync(from:arrayOf(number), to:arrayOf(number), shapeString: string, shapeParam: object, viroTag: string)", "0-0": "This function is used to find collisions between [physics](doc:physics) bodies and the given shape moving from the given `from` position to the `to` position. Collided components have their `onCollision` callbacks invoked.\n\nIf the `from` and `to` positions are the same, then this function invokes the `onCollision` callbacks of all components within the given shape.\n\n|Parameters|Description|\n|---|---|\n|from|the start position of the shape|\n|to|the end position of the shape|\n|shapeString| the name of the shape to use in this test|\n|shapeParam| the configuration of the shape used in this collision test|\n|viroTag|the string tag passed to collided components' `onCollision` callbacks|\n\n|Return Value|Description|\n|---|---|\n|hasHit| true if a collision was detected, false otherwise|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async getCameraOrientationAsync()", "0-0": "This function is used to fetch the current Camera's orientation.\n\n|Return Value|Description|\n|---|---|\n|orientation|an object that contains the camera's `position`, `rotation`, `forward` vector and `up` vector as number arrays|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async 
performARHitTestWithRay(ray: arrayOf(number))", "0-0": "This function performs an AR system-backed hit test with the given ray from the camera's position outward.\n\n|Return Value|Description|\n|---|---|\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|" }, "cols": 1, "rows": 1 } [/block] [block:parameters] { "data": { "h-0": "async performARHitTestWithPosition(position: arrayOf(number))", "0-0": "This function performs an AR system-backed hit test with the ray from the camera to the given position.\n\n|Return Value|Description|\n|---|---|\n|arHitTestResults| returns an array of [ARHitTestResult](doc:viroarscene#arhittestresult) corresponding to the AR points found by the AR system along the ray.|" }, "cols": 1, "rows": 1 } [/block] [block:api-header] { "title": "ARHitTestResult" } [/block] These are the individual objects in the array of ARHitTestResults returned by the two `performARHitTest...` functions. ``` arHitTestResult = (object) { type : string, transform : (object) { position : array(number), rotation : array(number), scale : array(number) } } ``` [block:parameters] { "data": { "h-0": "Key", "h-1": "Description", "0-0": "type", "1-0": "transform", "0-1": "**string**\n\nThe type of point returned; one of the following:\n\n\"ExistingPlaneUsingExtent\"\n\"ExistingPlane\"\n\"EstimatedHorizontalPlane\"\n\"FeaturePoint\"", "1-1": "**object**\n\nThe transform of the point. Contains the following keys:\n\n`position`, `rotation`, `scale` as arrays of numbers." }, "cols": 2, "rows": 2 } [/block] [block:api-header] { "title": "Anchor" } [/block] This is the object given to the developer through the `onAnchorFound`, `onAnchorUpdated` and `onAnchorRemoved` callback functions. 
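As a sketch (not part of the Viro API itself), a minimal `onAnchorFound` handler that reads the anchor fields; the mock anchor is hypothetical illustration data shaped like the object documented here:

```javascript
// Hypothetical handler for ViroARScene's onAnchorFound callback.
// It reads the anchorId, type, and position fields of the anchor
// object; the mock anchor below stands in for one supplied by the
// AR system.
function _onAnchorFound(anchor) {
  const [x, y, z] = anchor.position; // world coordinates
  return anchor.type + " anchor " + anchor.anchorId +
    " found at (" + x + ", " + y + ", " + z + ")";
}

// Mock anchor for illustration only:
const mockAnchor = {
  anchorId: "anchor-0",
  type: "plane",
  position: [0, -0.5, -1],
  rotation: [0, 0, 0],
};

console.log(_onAnchorFound(mockAnchor));
// → plane anchor anchor-0 found at (0, -0.5, -1)
```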
[block:parameters] { "data": { "h-0": "Key", "h-1": "Value", "0-0": "anchorId", "3-0": "rotation", "4-0": "center ([ViroARPlane](doc:viroarplane) only)", "6-0": "width ([ViroARPlane](doc:viroarplane) only)", "7-0": "height ([ViroARPlane](doc:viroarplane) only)", "6-1": "**number**\n\nCurrent width of the attached plane", "7-1": "**number**\n\nCurrent height of the attached plane", "0-1": "**string**\n\nId of the anchor", "3-1": "**arrayOf(number)**\n\nRotation of the anchor in degrees.", "4-1": "**arrayOf(number)**\n\nCenter of the plane relative to the plane's position.", "2-0": "position", "2-1": "**arrayOf(number)**\n\nPosition of the anchor in world coordinates.", "1-0": "type", "1-1": "**string**\n\nType of the anchor", "5-0": "alignment ([ViroARPlane](doc:viroarplane) only)", "5-1": "**string**\n\nThe plane alignment, one of the following values:\n\"horizontal\" - iOS only\n\"HorizontalDownwards\" - Android only\n\"HorizontalUpwards\" - Android only\n\"NonHorizontal\" - Android only" }, "cols": 2, "rows": 8 } [/block] [block:api-header] { "title": "Post-Process Effects" } [/block] [block:parameters] { "data": { "h-0": "Effect", "h-1": "Description", "0-0": "grayscale", "1-0": "sepia", "2-0": "sincity", "3-0": "baralleldistortion", "4-0": "pincushiondistortion", "5-0": "thermalvision", "6-0": "crosshatch", "7-0": "pixelated", "0-1": "An effect where the resulting image is in black and white.", "1-1": "An effect where the resulting image has a dark reddish-brown pigment color effect on it.", "2-1": "A Sin City-like effect where the resulting image is in black and white, except for places with saturated red colors.", "3-1": "A fish-eye-like effect where the lens distortion becomes more pronounced toward the center of the image.", "4-1": "A cushioning effect where the resulting image is \"pinched\" into the center.", "5-1": "A coloring effect where the resulting image gives off a \"radiant heat\" look from a thermal sensor.", 
"6-1": "An effect where the resulting image is made up of tiny crossed lines that recreate the scene.", "7-1": "An effect where the resulting image is pixelated." }, "cols": 2, "rows": 8 } [/block]
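The effect names above are plain strings passed to the `postProcessEffects` prop as an array. As a standalone sketch (no Viro imports; the `buildEffects` helper is hypothetical, not part of the API), validating an effects array before handing it to the scene:

```javascript
// Effect names from the table above; postProcessEffects expects an
// array of these strings.
const SUPPORTED_EFFECTS = [
  "grayscale", "sepia", "sincity", "baralleldistortion",
  "pincushiondistortion", "thermalvision", "crosshatch", "pixelated",
];

// Hypothetical helper: returns the list unchanged if every name is
// in the supported set, otherwise throws, catching typos early.
function buildEffects(...names) {
  const unknown = names.filter((n) => !SUPPORTED_EFFECTS.includes(n));
  if (unknown.length > 0) {
    throw new Error("Unsupported effect(s): " + unknown.join(", "));
  }
  return names;
}

console.log(buildEffects("grayscale", "sincity"));
// e.g. <ViroARScene postProcessEffects={buildEffects("grayscale")} />
```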