X3D Model Documentation: AnimatedViewpointRecorderPrototype.x3d

  1  <?xml version="1.0" encoding="UTF-8"?>
  2  <!DOCTYPE X3D PUBLIC "ISO//Web3D//DTD X3D 3.0//EN" "https://www.web3d.org/specifications/x3d-3.0.dtd">
  3  <X3D profile='Immersive' version='3.0' xmlns:xsd='http://www.w3.org/2001/XMLSchema-instance' xsd:noNamespaceSchemaLocation='https://www.web3d.org/specifications/x3d-3.0.xsd'>
  4       <head>
  5            <meta name='title' content='AnimatedViewpointRecorderPrototype.x3d'/>
  6            <meta name='description' content='Record camera position and orientation as user navigates, then filter values and produce output, both into the console output window and as a replayable node group. Future work: further filtering.'/>
  7            <meta name='creator' content='Don Brutzman, Ken Curtin, Duane Davis, Christos Kalogrias'/>
  8            <meta name='created' content='24 October 2003'/>
  9            <meta name='modified' content='12 October 2023'/>
 10            <meta name='reference' content='AnimatedViewpointRecorderExample.x3d'/>
 11            <meta name='reference' content='AnimatedViewpointRecorderSample.x3d'/>
 12            <meta name='reference' content='http://www.realism.com/Web3D/Examples#WhereAmI'/>
 13            <meta name='reference' content='http://www.realism.com/vrml/Example/WhereAmI/WhereAmI_Proto.wrl'/>
 14            <meta name='subject' content='recording animated viewpoint tour'/>
 15            <meta name='identifier' content='https://savage.nps.edu/Savage/Tools/Authoring/AnimatedViewpointRecorderPrototype.x3d'/>
 16            <meta name='generator' content='X3D-Edit 4.0, https://savage.nps.edu/X3D-Edit'/>
 17            <meta name='license' content='../../license.html'/>
 18       </head>
<!-- Event Graph ROUTE Table shows event connections. -->
<!-- Index for DEF nodes: NewViewpointGroup, RecordingScript, RouteHolder, WhereSensor
     Index for Viewpoint node: Viewpoint_1
     Index for ProtoDeclare definition: AnimatedViewpointRecorder -->
 19       <Scene>
 20            <!-- ==================== -->
 21            <WorldInfo title='AnimatedViewpointRecorderPrototype.x3d'/>
 22            <ProtoDeclare name='AnimatedViewpointRecorder' appinfo='AnimatedViewpointRecorder captures a viewpoint position and orientation tour to create a guided-tour animation. The recording output goes to the browser console, where the .x3d (or .x3dv) output can be cut/pasted for further use.'>
 23                 <ProtoInterface>
 24                      <field name='start' type='SFBool' accessType='inputOnly'
                     appinfo='Set start=true to commence recording viewpoint position/orientation.'/>
 25                      <field name='stop' type='SFBool' accessType='inputOnly'
                     appinfo='Set stop=true to finish recording viewpoint position/orientation. Resulting VRML is added to the scene; resulting X3D and VRML is output to the console.'/>
 26                      <field name='samplingInterval' type='SFTime' value='0.1' accessType='initializeOnly'
                     appinfo='default 0.1 seconds'/>
 27                      <field name='outputX3D' type='SFBool' value='true' accessType='initializeOnly'
                     appinfo='whether to output .x3d syntax on browser console'/>
 28                      <field name='outputClassicVRML' type='SFBool' value='false' accessType='initializeOnly'
                     appinfo='whether to output ClassicVRML .x3dv syntax on browser console'/>
 29                      <field name='filterDeadTime' type='SFBool' value='false' accessType='initializeOnly'
                     appinfo='TODO not yet implemented'/>
 30                 </ProtoInterface>
 31                 <ProtoBody>
 32                      <Group>
 33 
                         <!-- Group NewViewpointGroup is a DEF node that has 1 USE node: USE_1 -->
                         <Group DEF='NewViewpointGroup'/>
 34                           <!-- it's a big old world out there! -->
 35 
                         <!-- ROUTE information for WhereSensor node:  [from RecordingScript.recordingInProgress to enabled ] [from position_changed to RecordingScript.set_position ] [from orientation_changed to RecordingScript.set_orientation ] -->
                         <ProximitySensor DEF='WhereSensor' size='1000000000 1000000000 1000000000'/>
 36 
                         <!-- ROUTE information for RecordingScript node:  [from WhereSensor.position_changed to set_position ] [from WhereSensor.orientation_changed to set_orientation ] [from recordingInProgress to WhereSensor.enabled ] -->
                         <Script DEF='RecordingScript' directOutput='true'>
  37                                <field name='start' type='SFBool' accessType='inputOnly'/>
  38                                <field name='stop' type='SFBool' accessType='inputOnly'/>
  39                                <field name='samplingInterval' type='SFTime' accessType='initializeOnly'
                                appinfo='seconds'/>
  40                                <field name='outputX3D' type='SFBool' accessType='initializeOnly'
                                appinfo='whether to output .x3d syntax on browser console'/>
  41                                <field name='outputClassicVRML' type='SFBool' accessType='initializeOnly'
                                appinfo='whether to output ClassicVRML .x3dv syntax on browser console'/>
  42                                <field name='recordingInProgress' type='SFBool' accessType='outputOnly'
                                appinfo='persistent state variable'/>
  43                                <field name='set_position' type='SFVec3f' accessType='inputOnly'/>
  44                                <field name='set_orientation' type='SFRotation' accessType='inputOnly'/>
  45                                <field name='positionArray' type='MFVec3f' accessType='initializeOnly'/>
  46                                <field name='positionTimeArray' type='MFTime' accessType='initializeOnly'/>
  47                                <field name='orientationArray' type='MFRotation' accessType='initializeOnly'/>
  48                                <field name='orientationTimeArray' type='MFTime' accessType='initializeOnly'/>
  49                                <field name='filterDeadTime' type='SFBool' accessType='initializeOnly'
                                appinfo='not yet implemented'/>
  50                                <field name='newViewpointGroup' type='SFNode' accessType='initializeOnly'>
  51                                     <Group USE='NewViewpointGroup'/>
  52                                </field>
  53                                <field name='numberOfToursCreated' type='SFInt32' value='0' accessType='initializeOnly'
                                appinfo='persistent holding variable'/>
  54                                <field name='precedingPosition' type='SFVec3f' value='0 0 0' accessType='initializeOnly'
                                appinfo='persistent holding variable'/>
  55                                <field name='precedingOrientation' type='SFRotation' value='0 1 0 0' accessType='initializeOnly'
                                appinfo='persistent holding variable'/>
  56                                <field name='precedingPositionSampleTime' type='SFTime' value='0' accessType='initializeOnly'
                                appinfo='persistent holding variable'/>
  57                                <field name='precedingOrientationSampleTime' type='SFTime' value='0' accessType='initializeOnly'
                                appinfo='persistent holding variable'/>
  58                                <field name='r' type='SFFloat' value='1' accessType='initializeOnly'
                                appinfo='normalization factor'/>
  59                                <field name='positionEventsReceived' type='SFBool' value='false' accessType='initializeOnly'
                                appinfo='track output of ProximitySensor'/>
  60                                <field name='orientationEventsReceived' type='SFBool' value='false' accessType='initializeOnly'
                                appinfo='track output of ProximitySensor'/>
 61                                <IS>
  62                                     <connect nodeField='start' protoField='start'/>
  63                                     <connect nodeField='stop' protoField='stop'/>
  64                                     <connect nodeField='samplingInterval' protoField='samplingInterval'/>
  65                                     <connect nodeField='outputX3D' protoField='outputX3D'/>
  66                                     <connect nodeField='outputClassicVRML' protoField='outputClassicVRML'/>
  67                                     <connect nodeField='filterDeadTime' protoField='filterDeadTime'/>
 68                                </IS>
  <![CDATA[
            
ecmascript:

function initialize()
{
   positionArray        = new MFVec3f ();
   orientationArray     = new MFRotation ();
   positionTimeArray    = new MFTime ();
   orientationTimeArray = new MFTime ();

      positionEventsReceived = false;
   orientationEventsReceived = false;
}

function roundoff (value, digits)
{
	resolution = 1;
	for (i = 1; i <= digits; i++ )
	{
		resolution *= 10;
	}
	return Math.round (value*resolution) / resolution; // round to resolution
}
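// Worked example (added comment, a sketch rather than part of the original script):
// roundoff (0.123456, 3) computes resolution = 10^3 = 1000, then
// Math.round (0.123456 * 1000) / 1000 = 123 / 1000 = 0.123.
// The interpolator keys built in stop() are rounded to 5 digits, e.g. roundoff (0.3333333, 5) = 0.33333.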
function filterPositions()
{
    // TODO
}

function filterOrientations()
{
    // TODO
}

function set_position (eventValue, timestamp)
{
// Browser.println ('position=' + eventValue);
   // we are counting on an initialization eventValue being sent by ProximitySensor
   positionEventsReceived = true;
   if ( positionArray.length == 0 )
   {
   	positionArray[0]     = eventValue; // initialize
   	positionTimeArray[0] = timestamp;  // initialize
   }
   precedingPositionSampleTime = positionTimeArray[ positionArray.length - 1 ];

   // seconds duration since last valid sample
   if ( (timestamp - precedingPositionSampleTime) > samplingInterval )
   {
	// append values to each array
	positionArray[positionArray.length] = eventValue;
	positionTimeArray[positionTimeArray.length] = timestamp;
   }
   precedingPosition = eventValue;
}
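// Sampling sketch (added comment, not part of the original script): with the default
// samplingInterval = 0.1 seconds, position events arriving at t = 0.00, 0.03, 0.08 and 0.12
// yield recorded samples only at t = 0.00 (array initialization) and t = 0.12 (first event
// more than samplingInterval after the previously recorded sample). precedingPosition still
// tracks every event, so stop() can append the final value.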

function set_orientation (eventValue, timestamp)
{
   // we are counting on an initialization eventValue being sent by ProximitySensor
   orientationEventsReceived = true;
   if ( orientationArray.length == 0 )
   {
     r = Math.sqrt (eventValue.x*eventValue.x + eventValue.y*eventValue.y + eventValue.z*eventValue.z);
//   Browser.println ('orientation=' + eventValue.toString() + ', r=' + r); // trace
     if (r != 0)
     {
        eventValue.x = eventValue.x / r;
        eventValue.y = eventValue.y / r;
        eventValue.z = eventValue.z / r;
     }
   	 orientationArray[0]     = eventValue; // initialize
   	 orientationTimeArray[0] = timestamp;  // initialize
   }
   precedingOrientationSampleTime = orientationTimeArray[ orientationTimeArray.length - 1 ];

   // append sample values to each array
   if ( (timestamp - precedingOrientationSampleTime) > samplingInterval )
   {
     orientationTimeArray[orientationTimeArray.length] = timestamp;
	 // normalize SFRotation axis if needed
     r = Math.sqrt (eventValue.x*eventValue.x + eventValue.y*eventValue.y + eventValue.z*eventValue.z);
//    Browser.println ('orientation=' + eventValue.toString() + ', r=' + r); // trace
     if (r != 0)
     {
        eventValue.x = eventValue.x / r;
        eventValue.y = eventValue.y / r;
        eventValue.z = eventValue.z / r;
        // auto append to array, no need to allocate
        orientationArray[orientationArray.length] = eventValue;
     }
     else // illegal zero-magnitude axis returned by browser, so just use previous rotation
     {
        // auto append to array, no need to allocate
        orientationArray[orientationArray.length] = precedingOrientation;
     }
   }
   precedingOrientation = eventValue;
}
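// Normalization sketch (added comment, not part of the original script): browsers may report
// an un-normalized rotation axis, so the axis is scaled to unit length before being recorded.
// For example an axis (0 2 0) has magnitude r = 2 and becomes (0 1 0); a degenerate axis with
// r = 0 is replaced by the previously recorded orientation.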

function start (eventValue, timestamp)
{
	if (eventValue == false) return; // only accept start if eventValue == true
	if (recordingInProgress == true) return; // ignore repeated starts while already running
	recordingInProgress  = true;
    // arrays need to be reinitialized from previous run
	initialize();

    Browser.println ('    <!-- start recording ' + numberOfToursCreated + ' -->');
}

function stop (eventValue, timestamp)
{
	if (eventValue == false) return; // only accept stop  if eventValue == true
	if (recordingInProgress == false)
	{
	   Browser.println ('    <!-- stopped recording without first starting. -->');
       return;
	}

    // ensure legal array lengths in case some events were never sent due to not moving
    if (positionEventsReceived == false)
    {
       Browser.println ('<!-- warning:  no position values received! no action taken. -->');
       return;
    }
    if (orientationEventsReceived == false)
    {
       Browser.println ('<!-- warning:  no orientation values received! no action taken. -->');
       return;
    }
	recordingInProgress = false;

	// preceding values were captured at the last sampleInterval (by either set_position or set_orientation)
	// append one more entry to each array, since the sensor sends no further values while the view is not changing
       positionArray[   positionArray.length] = precedingPosition;
    orientationArray[orientationArray.length] = precedingOrientation;
       positionTimeArray[   positionTimeArray.length] = timestamp;
    orientationTimeArray[orientationTimeArray.length] = timestamp;

    if (positionArray.length != positionTimeArray.length)
    {
       Browser.println ('<!-- internal error:  positionArray.length (' + positionArray.length + ') != positionTimeArray.length (' + positionTimeArray.length + ') -->');
    }
    if (orientationArray.length != orientationTimeArray.length)
    {
       Browser.println ('<!-- internal error:  orientationArray.length (' + orientationArray.length + ') != orientationTimeArray.length (' + orientationTimeArray.length + ') -->');
    }

   filterPositions();
   filterOrientations();

   // if position and orientation events were sent simultaneously, either array could be used with start/stop times synchronized;
   // however that might be a bad assumption... so reset start times to match
   if (positionTimeArray[0] > orientationTimeArray[0])    positionTimeArray[0] = orientationTimeArray[0];
   if (positionTimeArray[0] < orientationTimeArray[0]) orientationTimeArray[0] = positionTimeArray[0];

   startTime = positionTimeArray[0];
   stopTime  = positionTimeArray[positionTimeArray.length-1];
   interval  = stopTime - startTime;

   x3dString =
   '    <!-- ********** start recorded Animated Tour ' + numberOfToursCreated + ' using .x3d syntax ********** -->' +
   '    <Group>\n' +
   '      <Viewpoint DEF=\"AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '\" description=\"Animated Tour ' + numberOfToursCreated + '\"\n' +
   '         position=\"'    + positionArray[0].x + ' '    + positionArray[0].y + ' '    + positionArray[0].z + '\" \n' +
   '         orientation=\"' + orientationArray[0].x + ' ' + orientationArray[0].y + ' ' + orientationArray[0].z + ' ' + orientationArray[0].angle + '\"/>\n' +
   '      <!-- samplingInterval=' + samplingInterval + ' seconds, default TimeSensor loop=true -->' +
   '      <TimeSensor DEF=\"AnimatedViewpointRecorderTimer' + numberOfToursCreated + '\" cycleInterval=\"' + interval + '\"\n' +
   '        enabled=\"true\" loop=\"true\"/>\n' +
   '      <PositionInterpolator DEF=\"AnimatedViewpointRecorderPosition' + numberOfToursCreated + '\" key=\"\n' ;
   for (counter = 0; counter < positionTimeArray.length; counter++)
   {
    x3dString = x3dString +  roundoff(((positionTimeArray[counter] - positionTimeArray[0]) / interval),5) + ' \n';
   }
   x3dString = x3dString + '\"      keyValue=\"\n';
   for (counter = 0; counter < positionArray.length; counter++)
   {
      x3dString = x3dString +   positionArray[counter].x + ' ' + positionArray[counter].y + ' ' + positionArray[counter].z + ', \n';
   }
   x3dString = x3dString + '         \"/>\n' +
   '      <OrientationInterpolator DEF=\"AnimatedViewpointRecorderOrientation' + numberOfToursCreated + '\" key=\"\n';
   for (counter = 0; counter < orientationTimeArray.length; counter++)
   {
    x3dString = x3dString +   roundoff(((orientationTimeArray[counter] - orientationTimeArray[0]) / interval),5) + ' \n';
   }
   x3dString = x3dString +   '\"      keyValue=\"\n';
   for (counter = 0; counter < orientationArray.length; counter++)
   {
      var r = Math.sqrt(orientationArray[counter].x*orientationArray[counter].x + orientationArray[counter].y*orientationArray[counter].y + orientationArray[counter].z*orientationArray[counter].z); // normalize
      if (r == 0) r = 1; // avoid divide by zero
      x3dString = x3dString + (orientationArray[counter].x / r) + ' ' + (orientationArray[counter].y / r) + ' ' + (orientationArray[counter].z / r) + ' ' + orientationArray[counter].angle + ', \n';
   }
   x3dString = x3dString + '         \"/>\n' +
   '      <Group>\n' +
   '        <ROUTE fromField=\"bindTime\" fromNode=\"AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '\"\n' +
   '          toField=\"startTime\" toNode=\"AnimatedViewpointRecorderTimer' + numberOfToursCreated + '\"/>\n' +
   '        <ROUTE fromField=\"fraction_changed\" fromNode=\"AnimatedViewpointRecorderTimer' + numberOfToursCreated + '\"\n' +
   '          toField=\"set_fraction\" toNode=\"AnimatedViewpointRecorderPosition' + numberOfToursCreated + '\"/>\n' +
   '        <ROUTE fromField=\"fraction_changed\" fromNode=\"AnimatedViewpointRecorderTimer' + numberOfToursCreated + '\"\n' +
   '          toField=\"set_fraction\" toNode=\"AnimatedViewpointRecorderOrientation' + numberOfToursCreated + '\"/>\n' +
   '        <ROUTE fromField=\"value_changed\" fromNode=\"AnimatedViewpointRecorderPosition' + numberOfToursCreated + '\"\n' +
   '          toField=\"position\" toNode=\"AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '\"/>\n' +
   '        <ROUTE fromField=\"value_changed\"\n' +
   '          fromNode=\"AnimatedViewpointRecorderOrientation' + numberOfToursCreated + '\" toField=\"orientation\" toNode=\"AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '\"/>\n' +
   '      </Group>\n' +
   '    </Group>\n';

   if (outputX3D) Browser.println (x3dString);

   vrmlString =
      '# ********** start recorded Animated Tour ' + numberOfToursCreated + ' using .x3dv syntax ********** \n' +
      'Group {\n' +
      '  children [\n' +
      '      DEF AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + ' Viewpoint {\n' +
      '        description \"Animated Tour ' + numberOfToursCreated + '\"\n' +
      '        orientation ' + orientationArray[0].x + ' ' + orientationArray[0].y + ' ' + orientationArray[0].z + ' ' + orientationArray[0].angle + '\n' +
      '        position ' + positionArray[0].x + ' '    + positionArray[0].y + ' '    + positionArray[0].z + '\n' +
      '      }\n' +
      '      DEF AnimatedViewpointRecorderTimer' + numberOfToursCreated + ' TimeSensor {\n' +
      '        cycleInterval ' + interval +  '\n' +
      '        loop TRUE\n' +
      '      }\n' +
      '      DEF AnimatedViewpointRecorderPosition' + numberOfToursCreated + ' PositionInterpolator {\n' +
      '        key [\n';
   for (counter = 0; counter < positionTimeArray.length; counter++)
   {
      vrmlString = vrmlString + roundoff(((positionTimeArray[counter] - positionTimeArray[0]) / interval),5) + ' \n';
   }
   vrmlString = vrmlString + '         ]\n' +
      '        keyValue [\n';
   for (counter = 0; counter < positionArray.length; counter++)
   {
      vrmlString = vrmlString +   positionArray[counter].x + ' ' + positionArray[counter].y + ' ' + positionArray[counter].z + ', \n';
   }
   vrmlString = vrmlString + '         ]\n' +
      '      }\n' +
      '      DEF AnimatedViewpointRecorderOrientation' + numberOfToursCreated + ' OrientationInterpolator {\n' +
      '        key [\n';
   for (counter = 0; counter < orientationTimeArray.length; counter++)
   {
    vrmlString = vrmlString +   roundoff(((orientationTimeArray[counter] - orientationTimeArray[0]) / interval),5) + ' \n';
   }
   vrmlString = vrmlString + '         ]\n' +
      '        keyValue [\n';
   for (counter = 0; counter < orientationArray.length; counter++)
   {
      vrmlString = vrmlString + orientationArray[counter].x + ' ' + orientationArray[counter].y + ' ' + orientationArray[counter].z + ' ' + orientationArray[counter].angle + ', \n';
   }
   vrmlString = vrmlString + '         ]\n' +
      '      }\n' +
      '      Group {\n' +
      '         ROUTE AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '.bindTime TO AnimatedViewpointRecorderTimer' + numberOfToursCreated + '.startTime\n' +
      '         ROUTE AnimatedViewpointRecorderTimer' + numberOfToursCreated + '.fraction_changed TO AnimatedViewpointRecorderPosition' + numberOfToursCreated + '.set_fraction\n' +
      '         ROUTE AnimatedViewpointRecorderTimer' + numberOfToursCreated + '.fraction_changed TO AnimatedViewpointRecorderOrientation' + numberOfToursCreated + '.set_fraction\n' +
      '         ROUTE AnimatedViewpointRecorderPosition' + numberOfToursCreated + '.value_changed TO AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '.position\n' +
      '         ROUTE AnimatedViewpointRecorderOrientation' + numberOfToursCreated + '.value_changed TO AnimatedViewpointRecorderViewpoint' + numberOfToursCreated + '.orientation\n' +
      '      }\n' +
      '   ]\n' +
      '}\n';

   Browser.println ();
   if (outputClassicVRML) Browser.println (vrmlString);

   numberOfToursCreated++;
   // TODO
   // newNode = new SFNode(vrmlString);
   // newViewpointGroup.children[numberOfToursCreated] = newNode;
}
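
// Key computation sketch (added comment, not part of the original script): each interpolator key
// is the sample time normalized by the recording duration, key[i] = (t[i] - t[0]) / (stopTime - startTime),
// so keys always span 0 to 1. For samples at t = 10, 12 and 15 seconds the keys are 0, 0.4 and 1,
// matching a TimeSensor cycleInterval of 5 seconds.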

          
]]>
 70                           </Script>
 71                           <Group DEF='RouteHolder'>
  72                                <ROUTE fromNode='WhereSensor' fromField='position_changed' toNode='RecordingScript' toField='set_position'/>
  73                                <ROUTE fromNode='WhereSensor' fromField='orientation_changed' toNode='RecordingScript' toField='set_orientation'/>
  74                                <ROUTE fromNode='RecordingScript' fromField='recordingInProgress' toNode='WhereSensor' toField='enabled'/>
 75                           </Group>
 76                      </Group>
 77                 </ProtoBody>
 78            </ProtoDeclare>
 79            <!-- ==================== -->
  80            <Background groundColor='0.2 0.4 0.2' skyColor='0.2 0.2 0.4'/>
  81            <Viewpoint description='Animated Viewpoint Recorder' position='0 0 14'/>
 82            <Anchor description='AnimatedViewpointRecorder Example'   url=' "AnimatedViewpointRecorderExample.x3d" "https://savage.nps.edu/Savage/Tools/Authoring/AnimatedViewpointRecorderExample.x3d" "AnimatedViewpointRecorderExample.wrl" "https://savage.nps.edu/Savage/Tools/Authoring/AnimatedViewpointRecorderExample.wrl" '>
 83                 <Shape>
 84                      <Text string='"AnimatedViewpointRecorderPrototype" "is a prototype definition file" "" "Click this text to see" "AnimatedViewpointRecorderExample"'>
  85                           <FontStyle justify='"MIDDLE" "MIDDLE"' size='1.2'/>
 86                      </Text>
 87                      <Appearance>
 88                           <Material diffuseColor='0.6 0.8 0.4'/>
 89                      </Appearance>
 90                 </Shape>
 91            </Anchor>
 92       </Scene>
 93  </X3D>
X3D Tooltips element index: Anchor, Appearance, Background, connect, field, FontStyle, Group, head, IS, Material, meta, ProtoBody, ProtoDeclare, ProtoInterface, ProximitySensor, ROUTE, Scene, Script, Shape, Text, Viewpoint, WorldInfo, X3D, plus documentation for accessType definitions, type definitions, XML data types, and field types

Event Graph ROUTE Table entries with 3 ROUTE connections total, showing X3D event-model relationships for this scene.

Each row shows an event cascade that may occur during a single timestamp interval between frame renderings, as part of the X3D execution model.

     
The following ROUTE chain begins an event-routing loop! Loop occurs at nodeDepth=3.

(1) ROUTE RecordingScript.recordingInProgress (Script, SFBool) TO WhereSensor.enabled (ProximitySensor, SFBool), then
  (2) ROUTE WhereSensor.orientation_changed (ProximitySensor, SFRotation) TO RecordingScript.set_orientation (Script, SFRotation), then
    (3) ROUTE RecordingScript.recordingInProgress (Script, SFBool) TO WhereSensor.enabled (ProximitySensor, SFBool), then
  (2) ROUTE WhereSensor.position_changed (ProximitySensor, SFVec3f) TO RecordingScript.set_position (Script, SFVec3f), then
    (3) ROUTE RecordingScript.recordingInProgress (Script, SFBool) TO WhereSensor.enabled (ProximitySensor, SFBool)

line 82: Anchor description='AnimatedViewpointRecorder Example' (user-interaction hint for this node)

Additional guidance on X3D animation can be found in the 10-Step Animation Design Process and Event Tracing hint sheets. Have fun with X3D! 😀
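
To record a tour from an authoring scene, the prototype is declared externally, instantiated, and driven by Boolean events routed into its start and stop fields. The following is a minimal sketch (not part of this file): the relative url and the StartTouch/StopTouch TouchSensor triggers are assumptions, while the field signatures match the ProtoInterface above. AnimatedViewpointRecorderExample.x3d, referenced in the scene metadata, demonstrates complete usage.

    <ExternProtoDeclare name='AnimatedViewpointRecorder' url='"AnimatedViewpointRecorderPrototype.x3d#AnimatedViewpointRecorder"'>
      <field name='start' type='SFBool' accessType='inputOnly'/>
      <field name='stop' type='SFBool' accessType='inputOnly'/>
      <field name='samplingInterval' type='SFTime' accessType='initializeOnly'/>
      <field name='outputX3D' type='SFBool' accessType='initializeOnly'/>
      <field name='outputClassicVRML' type='SFBool' accessType='initializeOnly'/>
      <field name='filterDeadTime' type='SFBool' accessType='initializeOnly'/>
    </ExternProtoDeclare>
    <ProtoInstance DEF='ViewpointRecorder' name='AnimatedViewpointRecorder'>
      <fieldValue name='samplingInterval' value='0.25'/>
      <fieldValue name='outputX3D' value='true'/>
    </ProtoInstance>
    <!-- StartTouch and StopTouch are hypothetical TouchSensor nodes supplied by the authoring scene -->
    <ROUTE fromNode='StartTouch' fromField='isActive' toNode='ViewpointRecorder' toField='start'/>
    <ROUTE fromNode='StopTouch'  fromField='isActive' toNode='ViewpointRecorder' toField='stop'/>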

<!-- Online at
https://savage.nps.edu/Savage/Tools/Authoring/AnimatedViewpointRecorderPrototypeIndex.html -->
<!-- Version control at
https://gitlab.nps.edu/Savage/Savage/Tools/Authoring/AnimatedViewpointRecorderPrototype.x3d -->

<!-- Color-coding legend: X3D terminology <X3dNode DEF='idName' field='value'/> matches XML terminology <XmlElement DEF='idName' attribute='value'/>
(Light-blue background: event-based behavior node or statement) (Grey background inside box: inserted documentation) (Magenta background: X3D Extensibility)
    <ProtoDeclare name='ProtoName'> <field name='fieldName'/> </ProtoDeclare> -->

<!-- For additional help information about X3D scenes, please see X3D Tooltips, X3D Resources, and X3D Scene Authoring Hints. -->