The Annotated VRML 97 Reference

Chapter 3:
Node Reference



3.51 TouchSensor

TouchSensor { 
  exposedField SFBool  enabled TRUE
  eventOut     SFVec3f hitNormal_changed
  eventOut     SFVec3f hitPoint_changed
  eventOut     SFVec2f hitTexCoord_changed
  eventOut     SFBool  isActive
  eventOut     SFBool  isOver
  eventOut     SFTime  touchTime
}

A TouchSensor node tracks the location and state of the pointing device and detects when the user points at geometry contained by the TouchSensor node's parent group. A TouchSensor node can be enabled or disabled by sending it an enabled event with a value of TRUE or FALSE. If the TouchSensor node is disabled, it does not track user input or send events.

TECHNICAL NOTE: TouchSensor was originally called ClickSensor, and was specified in a "mouse-centric" way. Sam Denton rewrote this section so that it was easier to map alternative input devices (e.g., 3D wands and gloves) into the semantics of the TouchSensor.

The TouchSensor generates events when the pointing device points toward any geometry nodes that are descendants of the TouchSensor's parent group. See "2.6.7.5 Activating and manipulating sensors" for more details on using the pointing device to activate the TouchSensor.

The isOver eventOut reflects the state of the pointing device with regard to whether it is pointing towards the TouchSensor node's geometry or not. When the pointing device changes state from a position such that its bearing does not intersect any of the TouchSensor node's geometry to one in which it does intersect geometry, an isOver TRUE event is generated. When the pointing device moves from a position such that its bearing intersects geometry to one in which it no longer intersects the geometry, or some other geometry is obstructing the TouchSensor node's geometry, an isOver FALSE event is generated. These events are generated only when the pointing device has moved and changed "over" state. Events are not generated if the geometry itself is animating and moving underneath the pointing device.

TIP: The isOver event makes it easy to implement a technique called locate highlighting. Locate highlighting makes an active user interface widget change color or shape when the mouse moves over it, letting the user know that something will happen if they press the mouse button. Users are not yet familiar with interaction inside a 3D scene, and they probably will not be able to tell which objects are "hot" and which are just decoration simply by looking at the scene. Writing a Script that takes isOver events and changes the geometry's color (or activates a Switch that displays a "Click Me" message on top of the sensor's geometry, or starts a Sound of some kind, or does all three!) will make interaction much easier and more fun for the user.
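As a sketch of the mapping such a Script would perform (the function name, choice indices, and color values here are illustrative assumptions, not from the book), the isOver-to-feedback logic could look like this:

```javascript
// Hypothetical helper for a VRML Script node implementing locate
// highlighting. It maps the sensor's boolean isOver event to a Switch
// choice index (0 = plain, 1 = highlighted) and a diffuse color for a
// routed Material. Names and color values are illustrative only.
function locateHighlight(isOver) {
  return {
    choice: isOver ? 1 : 0,          // route to Switch.set_whichChoice
    color: isOver ? [1, 1, 0.2]      // bright when the pointer is over
                  : [0.2, 0.2, 0]    // dim otherwise
  };
}
```

In a world file, this logic would sit inside a Script node's `url "javascript: ..."` body, with ROUTEs from TouchSensor.isOver into the Script and from the Script's eventOuts to the Switch and Material.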

As the user moves the bearing over the TouchSensor node's geometry, the point of intersection (if any) between the bearing and the geometry is determined. Each movement of the pointing device, while isOver is TRUE, generates hitPoint_changed, hitNormal_changed and hitTexCoord_changed events. hitPoint_changed events contain the 3D point on the surface of the underlying geometry, given in the TouchSensor node's coordinate system. hitNormal_changed events contain the surface normal vector at the hitPoint. hitTexCoord_changed events contain the texture coordinates of that surface at the hitPoint.

TIP: The combination of isActive and isOver gives four possible states in which a TouchSensor can exist:
  1. isOver FALSE, isActive FALSE: The user has clicked on some other object or the user hasn't clicked at all and the mouse isn't over this sensor's geometry.
  2. isOver TRUE, isActive FALSE: The mouse is over this sensor's geometry but the user hasn't clicked yet. If something will happen when the user clicks, it is a good idea to provide some locate-highlighting feedback indicating this.
  3. isOver TRUE, isActive TRUE: The user has clicked down on the geometry and is still holding the button down, and is still over the geometry. Further feedback at this point is a good idea, but it is also a good idea to allow the user to abort the click by moving the mouse off the geometry.
  4. isOver FALSE, isActive TRUE: The user clicked down on the geometry and is still holding down the button, but has moved the mouse off the geometry. Feedback to the user that they are aborting the operation is appropriate.
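The four-state table above can be sketched as a small function; the state names chosen here are illustrative, not part of the VRML specification:

```javascript
// Sketch of the (isOver, isActive) state table as a plain function
// usable inside a VRML Script node. State names are illustrative.
function touchState(isOver, isActive) {
  if (!isOver && !isActive) return "idle";     // state 1: nothing happening
  if (isOver && !isActive)  return "hover";    // state 2: locate-highlight here
  if (isOver && isActive)   return "pressed";  // state 3: click in progress
  return "abort-pending";                      // state 4: dragged off geometry
}
```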

If isOver is TRUE, the user may activate the pointing device to cause the TouchSensor node to generate isActive events (e.g., by pressing the primary mouse button). When the TouchSensor node generates an isActive TRUE event, it grabs all further motion events from the pointing device until it is released and generates an isActive FALSE event (other pointing-device sensors will not generate events during this time). Motion of the pointing device while isActive is TRUE is termed a "drag." If a 2D pointing device is in use, isActive events reflect the state of the primary button associated with the device (i.e., isActive is TRUE when the primary button is pressed and FALSE when it is released). If a 3D pointing device is in use, isActive events will typically reflect whether the pointing device is within (or in contact with) the TouchSensor node's geometry.
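The "grab" rule can be illustrated with a small sketch (this models the observable behavior only, not any actual browser's internals; all names are assumptions):

```javascript
// Illustrative model of pointing-device grabbing: once one sensor goes
// isActive TRUE, it receives all further motion until release, and the
// other pointing-device sensors stay silent during the drag.
function makePointerArbiter(sensorIds) {
  var grabbed = null;
  return {
    press: function (overId) { grabbed = overId; },  // sensor under the bearing grabs
    motion: function () {
      // During a drag, only the grabbing sensor sees motion events.
      return grabbed !== null ? [grabbed] : sensorIds.slice();
    },
    release: function () { grabbed = null; }         // grab ends with isActive FALSE
  };
}
```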

A touchTime event is generated when all three of the following conditions are true:

  1. the pointing device was pointing towards the geometry when it was initially activated (isActive is TRUE),
  2. the pointing device is currently pointing towards the geometry (isOver is TRUE),
  3. the pointing device is deactivated (an isActive FALSE event is also generated).

Further information about this behaviour may be found in "2.6.7.3 Pointing-device sensors", "2.6.7.4 Drag sensors", and "2.6.7.5 Activating and manipulating sensors."
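The three conditions above can be sketched as a tiny event tracker (the API shape here is an illustration, not browser internals): it remembers whether the press began over the geometry and reports touchTime on release only if the pointer is still over it.

```javascript
// Hedged sketch of when touchTime fires: the press must have begun over
// the geometry, the pointer must still be over it, and the device must
// be deactivating. Returns the event time, or null if no event fires.
function makeTouchTracker() {
  var over = false, beganOver = false;
  return {
    setOver: function (v) { over = v; },       // mirrors isOver events
    press: function () { beganOver = over; },  // isActive TRUE
    release: function (now) {                  // isActive FALSE
      return (beganOver && over) ? now : null; // touchTime or nothing
    }
  };
}
```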

TECHNICAL NOTE: TouchSensor is designed to be abstract enough to apply to a variety of input devices (e.g., wand, glove) and simple enough for the lowest common denominator hardware found on general-purpose computers today--a pointing device with a single button. The success of Apple's Macintosh proves that multiple buttons aren't necessary to create a really good user interface, and since minimalism was one of the design goals for VRML 2.0, only one-button support is required.


EXAMPLE (click to run): The following example illustrates the TouchSensor. The first TouchSensor is used to move a small Box on the surface of a Sphere. The TouchSensor's hitPoint_changed eventOut is routed to the translation field of the Transform affecting the Box. This has the net effect of translating the Box to the intersection point with the TouchSensor's geometry, the Sphere. Note, however, that the second TouchSensor is used as a toggle button to activate and deactivate the first TouchSensor. This is accomplished with a simple Script node that is routed to the first TouchSensor's enabled field. The Switch nodes are used to change the color of the toggle button (Cone) and the Box, based on the activation state (on or off):
#VRML V2.0 utf8
Transform { children [
  Transform { children [
    # Sphere on which the box is moved.
    DEF TOS1 TouchSensor { enabled FALSE }
    Shape {
      geometry Sphere {}
      appearance Appearance {
        material Material { diffuseColor 1 0 1 }
      }
    }
  ]}
  DEF T1 Transform { children [
    # Box that moves and changes activation color.
    DEF SW1 Switch {
      whichChoice 0
      choice [
        Shape {   # choice 0 = off state
          geometry DEF B Box { size 0.25 0.25 0.25 }
          appearance Appearance {
            material Material { diffuseColor 0.2 0.2 0 }
          }
        }
        Shape {   # choice 1 = on state
          geometry USE B
          appearance Appearance {
            material Material { diffuseColor 1 1 0.2 }
          }
        }
      ]
    }
  ]}
  Transform {
    # toggle button which turns box on/off.
    translation -3 0 0
    children [
      DEF TOS2 TouchSensor {}
      DEF SW2 Switch {
        whichChoice 0
        choice [
          Shape {   # choice 0 = off state
            geometry DEF C Cone {}
            appearance Appearance {
              material Material { diffuseColor 0.8 0.4 0.4 }
            }
          }
          Shape {   # choice 1 = on state
            geometry USE C
            appearance Appearance {
              material Material { diffuseColor 1.0 0.2 0.2 }
            }
          }
        ]
      }
      DEF S2 Script {
        eventIn SFTime touchTime
        field SFBool enabled FALSE
        eventOut SFBool onOff_changed
        eventOut SFInt32 which_changed
        url "javascript:
          function initialize() {
            // Initialize to off state.
            which_changed = 0;
            onOff_changed = false;
          }
          function touchTime( value, time ) {
            // Toggle state on each click.
            enabled = !enabled;
            onOff_changed = enabled;
            which_changed = enabled ? 1 : 0;
          }"
      }
    ]
  }
]}
ROUTE TOS2.touchTime TO S2.touchTime
ROUTE S2.onOff_changed TO TOS1.enabled
ROUTE S2.which_changed TO SW1.whichChoice
ROUTE S2.which_changed TO SW2.whichChoice
ROUTE TOS1.hitPoint_changed TO T1.set_translation