using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System.IO;

public class Reporting : MonoBehaviour {

    string path = "Assets/Logs.txt";

    Vector3 current;
    Vector3 previous;
    Vector3 delta;

    // Use this for initialization
    void Start () {
        // Write the CSV header
        StreamWriter writer = new StreamWriter(path, true);
        writer.WriteLine("transform x, transform y, transform z, delta x, delta y, delta z");
        writer.Close();

        current = transform.localPosition;
        previous = transform.localPosition;
    }

    // Update is called once per frame
    void Update () {
        // Append this frame's position and per-frame delta to the log file
        StreamWriter writer = new StreamWriter(path, true);

        current = transform.localPosition;
        // Change in position since the last frame, normalized by frame time
        delta = (current - previous) / Time.deltaTime;

        writer.WriteLine(transform.localPosition.x + "," + transform.localPosition.y + "," +
                         transform.localPosition.z + "," + delta.x + "," + delta.y + "," + delta.z);
        writer.Close();

        previous = transform.localPosition;
    }
}
[Graphs: X vs Y, X vs Z, Y vs Z]
Then I graphed each position vs time:
[Graphs: X, Y, and Z position vs time]
I decided that I didn’t want to use the raw transform data, as it was fairly volatile, especially if the player wasn’t walking exactly in place. I had also collected the change in position between subsequent frames:
[Graphs: Delta X, Delta Y, Delta Z]
Based on this data it seemed like delta x would be the best measurement on which to base a step detection algorithm. I created an algorithm that detects the change of delta x from a negative value to a positive one; each of these crossings is assumed to be a step. What I actually found when I implemented this was that it worked beautifully… unless the player turned their head rapidly from side to side. Since I wanted users to be able to look around during the experience, this wasn’t going to work. Fortunately, it did work when applied to the delta y variable. This had the added advantage of allowing users to simply bob their heads if they didn’t want to actually walk, while still allowing them to look around.
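A minimal sketch of what that zero-crossing check on delta y might look like is below, assuming the script sits on the headset (camera) transform; the StepDetector name, the OnStep event, and the stepThreshold value are illustrative rather than taken from my actual project.

using UnityEngine;

// Sketch: counts a step whenever the per-frame vertical delta of the headset
// crosses from negative to positive (the head finishing its downward bob).
public class StepDetector : MonoBehaviour {

    // Hypothetical tuning value; a real version would calibrate this per user.
    public float stepThreshold = 0.002f;

    public int stepCount;                 // running total of detected steps
    public event System.Action OnStep;    // raised once per detected step

    Vector3 previous;
    float previousDeltaY;

    void Start () {
        previous = transform.localPosition;
    }

    void Update () {
        Vector3 current = transform.localPosition;
        float deltaY = current.y - previous.y;

        // A step is assumed when delta y flips from negative to positive,
        // ignoring tiny jitters below the threshold.
        if (previousDeltaY < -stepThreshold && deltaY > stepThreshold) {
            stepCount++;
            if (OnStep != null) OnStep();
        }

        previousDeltaY = deltaY;
        previous = current;
    }
}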
The second part of walking was to move the player each time a step was detected. I opted to use one of the controllers as a forward pointer, allowing the player to look around while walking. The movement of the player is a bit jerky, but I found that this was less likely to cause VR sickness in users than if the movement was smoothed. In a future iteration I would like both to calibrate the step detection threshold for each user and to let the user control the degree to which the movement is smoothed out.
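The movement half might look something like the following sketch, assuming a rig root and a tracked controller transform are assigned in the inspector; stepDistance and the component names are placeholders, not values from my implementation.

using UnityEngine;

// Sketch of the movement half: on each detected step, nudge the rig forward
// along the controller's pointing direction, flattened to the ground plane.
public class StepMover : MonoBehaviour {

    public Transform rig;                // the play-area / camera rig root to move
    public Transform pointerController;  // tracked controller used as the forward pointer
    public StepDetector detector;        // the step detector sketched above
    public float stepDistance = 0.5f;    // hypothetical metres moved per step

    void OnEnable ()  { detector.OnStep += Move; }
    void OnDisable () { detector.OnStep -= Move; }

    void Move () {
        // Project the controller's forward vector onto the horizontal plane so
        // pointing slightly up or down doesn't move the player vertically.
        Vector3 forward = pointerController.forward;
        forward.y = 0f;
        forward.Normalize();

        // Unsmoothed, discrete movement per step (deliberately not interpolated).
        rig.position += forward * stepDistance;
    }
}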