The user has just woken up and is heading to the bathroom; the story begins with him standing at the bathroom door. He shares the house with three other housemates, all of whom have some kind of vision problem. There are four pairs of glasses lying around in the bathroom. Help him find the right pair!
Original (blurry)
Four kinds of glasses:
Double vision
Zoom In
Zoom Out
Normal (clear vision)
The user will hear a voice when he puts on a pair of glasses, like the following:
Double vision – “Wow…I’m seeing double”
Zoom In – “Hmm…this doesn’t seem quite right”
Zoom Out – “These aren’t mine”
Normal (clear vision) – “Finally, I see things properly!”
For our project, we want to create an environment that relates to sustainability on campus. In the “real world,” someone can pass by trash without picking it up and face no consequences; we wanted to challenge that. In our alternate reality, we hope to provide negative feedback so that the user/recipient is transformed, translating into different actions/reactions in the real world. We hope to use a Gazelle article written about recycling on campus to inform our interaction design.
Some initial questions: how campus-focused should it be? Should we create an environment that is a realistic representation of campus, or make it more abstract? When designing virtual reality experiences, how do we provide feedback to the user when they have reached the edges of our world? How should the trash respond when someone walks past? Does it rise and float in the user’s field of view? Is there some sort of angry sound that increases with time? What feedback is provided if the user puts the trash in the wrong compartment (plastic vs. paper vs. general, like the campus receptacles)?
Our group’s storyboard sketched by the talented Lauren!
From this initial concept, we decided to just start building the piece in Unity to see what we are capable of accomplishing in a relatively short amount of time. We split up the work: Lauren and I will do the interactions and Ju hee and Yufei will build the environment.
After the first weekend, we had an environment built with a skybox and some objects. As a team, we’ve decided to change directions in terms of the environment…we want to build an abstract version of the campus. This will delay things as we only have a week left and the environment will take at least three days to build, but I think it’ll be worth it in the long run. I’d rather have less complex interactions and a more meaningful environment at the end of the day. Since Lauren has a very strong vision of what she wants the environment to look like, we shall separate the tasks differently. She and Ju hee will do the environment and Yufei and I will try to implement the interactions.
Here is the lovely environment that Lauren and Ju hee have built. It looks very much like campus! Yufei and I have just integrated the Vive and SteamVR system into the environment and are looking around the space. I wish we had integrated it earlier, as there are a few scale issues and the space is very, very large, things that can only be seen through the headset. We shall have to implement a teleport system and rescale objects.
Yufei is working on the teleport system. SteamVR 2.0 makes it quite simple to add teleport! We just needed to add the ‘Teleporting’ prefab and the teleport points. One thing we are struggling with is the height of the teleport system. The points need to sit at the player’s arm level, and we’ve tried various combinations of heights for the ground, player, and teleport points, but when we make them the same level, the player teleports in lower for some reason. For now, we shall place the teleport points slightly above the ground.
Yufei made the system a teleport area rather than individual points. The arc distance of the raycast seems to be something we need to play around with to match a comfortable level for the player’s arms. For now we have set it to 10, which makes it easy to teleport far but difficult to teleport to a close location.
Unfortunately, we have spent a lot of time setting up our base stations. Additionally, whenever we look at the environment through the headset and move our heads, the buildings seem to flicker in and out and sometimes disappear completely. A forum search reveals that we need to adjust the clipping planes, which apparently define the region of the scene that is visible to the camera. We have adjusted the near and far parameters to 2 and 2000 respectively and that seems to work just fine! Additionally, the textures on the grass and floor look very pixelated and stretched out, so I’ve increased the tiling on their shaders.
tiling of campus ground
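(For future reference, both of these fixes can also be applied from code rather than the inspector. Here’s a minimal sketch, assuming the VR camera and the ground renderer are assigned in the inspector; the component name and the tiling values are just placeholders, not our actual project code.)

```csharp
using UnityEngine;

// Hypothetical helper: applies the clipping-plane and tiling fixes from a script.
public class SceneFixes : MonoBehaviour
{
    public Camera vrCamera;         // the camera inside the SteamVR Player prefab
    public Renderer groundRenderer; // the grass/floor renderer with the stretched texture

    void Start()
    {
        // Widen the visible region so distant buildings stop flickering and disappearing.
        vrCamera.nearClipPlane = 2f;
        vrCamera.farClipPlane = 2000f;

        // Repeat the ground texture more often so it looks less stretched (placeholder values).
        groundRenderer.material.mainTextureScale = new Vector2(50f, 50f);
    }
}
```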
Time to implement interactions on the trash! I’ve added the Throwable and Interactable scripts to all the objects. For now, there are cereal boxes, toilet paper rolls, wine bottles, food cans, and water bottles. Yufei and I decided to delete the toilet paper rolls (why would one throw those away?) and the wine bottles, since they have liquid in them and glass can only be recycled on campus in a few places. We also deleted the cans, as metal can only be recycled in the dorms and we wanted it to feel like waste disposal while walking around campus. We did add a chip bag, as we wanted something to go into the waste bin rather than one of the recycling bins.
Speaking of the bins, I’ve added labels to them. At first I used the UI Text component, but it could be seen through all other objects and was quite blurry. To rectify the blurriness, I increased the font size substantially so that it was bigger than the character size and then reduced the scale of the text object down to the size I wanted. Another forum search revealed that UI Text is also why the labels could be seen through everything, so I have switched to the TextMeshPro text and the problems seem to be fixed.
fixing blurry text
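(For anyone hitting the same issue, here is a minimal sketch of creating a world-space label with the 3D TextMeshPro component instead of UI Text. The label text, offset, and sizes are placeholders, not our actual values.)

```csharp
using UnityEngine;
using TMPro;

// Hypothetical example: a 3D TextMeshPro label that is occluded by other objects,
// unlike the screen-space UI Text we started with.
public class BinLabel : MonoBehaviour
{
    void Start()
    {
        var label = new GameObject("Label").AddComponent<TextMeshPro>();
        label.transform.SetParent(transform, false);
        label.transform.localPosition = Vector3.up * 1.2f; // float above the bin (placeholder offset)

        label.text = "PLASTIC";
        label.fontSize = 36f;                              // big font, small scale = crisp text
        label.transform.localScale = Vector3.one * 0.1f;
        label.alignment = TextAlignmentOptions.Center;
    }
}
```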
I am testing the objects to see if they can be picked up, but they are quite far from the player since they are on the ground. Yufei and I are continually playing around with the ground, teleport, and player levels to find something that works, but nothing seems to. We’ve tried putting the ground and player at 0 like it says to do online, but when we also add the teleport in, the teleport level seems to change a lot. I shall make the objects float for now, as we are running out of time.
I am struggling to find the right attachment settings on the scripts. Additionally, our binding UI does not work, so we seem stuck with the default binding of picking up an object with the inner button on the controller. The objects still don’t seem to pick up.
I don’t know what I’ve done differently, but I can pick up one of the objects now. However, it doesn’t seem to stay in my hand, so I’ve only managed to nudge it. It acts like a projectile and takes whatever velocity my hand gives it in the direction of the nudge. Not good!
Yufei has been working on the objects and says that the objects need gravity for us to be able to throw them. She has also found the sound files. The question now is whether we should just place the trash on tables. I’ll play around with it and see how it looks; after all, it could just be like the D2 tables I guess. It doesn’t look that bad honestly, but Max comes to save the day by helping us find the right combination of ground, teleport, and player levels. Also, it seems our room setup was incorrect, which is why it was difficult to reach the floor. Either way, the system works a lot better now, and I feel less nauseous when testing since it feels more natural. I still need to find an arc distance that works; 7 seems best for now. I have also kept the tables, as they are reminiscent of D2, but moved the objects so that they are scattered on the floor.
For the bins’ interaction, I planned to add tags to the objects and the bins. If the tags matched, say they were both ‘plastic,’ then the object was in the correct bin. I added the test for this condition inside the loop of the target hit effect script on all the objects, not realizing that the loop only checks for the target collider and never considers a non-target collider. I modified the script to add two public wrong colliders for the other two bins. If the object hits the target, I want the correct sound to play and the object to be destroyed upon collision with the bin. If it hits a wrong one, the incorrect sound should play and a message should pop up saying: “This object should be placed in the bin marked: “ + the tag of the object. Thus, for the chip bag, water bottle, and cereal box, their target collider is the bin they should be placed in and the wrong colliders are the two remaining bins.
adding ‘wrong colliders’
Settings for the target hit effect script
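(This isn’t our actual modified SteamVR target hit effect script, just a minimal sketch of the same logic, assuming each bin has a trigger collider tagged ‘plastic,’ ‘paper,’ or ‘general’ to match the trash objects. Field names are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: attach to each piece of trash. The trash object's tag
// ("plastic", "paper", or "general") must match the tag of the bin it belongs in.
public class TrashSorting : MonoBehaviour
{
    public AudioSource correctSound;
    public AudioSource incorrectSound;

    void OnTriggerEnter(Collider other)
    {
        // Only react to the bins, not to tables, the ground, etc.
        if (!other.CompareTag("plastic") && !other.CompareTag("paper") && !other.CompareTag("general"))
            return;

        if (other.CompareTag(tag))
        {
            // Right bin: play the happy sound and remove the trash.
            correctSound.Play();
            Destroy(gameObject);
        }
        else
        {
            // Wrong bin: play the sad sound and tell the player where it should go.
            incorrectSound.Play();
            Debug.Log("This object should be placed in the bin marked: " + tag);
        }
    }
}
```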
I’ve added two audio sources for the incorrect and correct sounds. However, I keep getting an error in the debug log about there being more than one audio listener. The culprit is, strangely, the player prefab. It has one attached to the VR camera as expected, but it has another one as well, which is strange because a Unity scene can only have one audio listener. I delete the one that isn’t on the camera and hope for the best. Now my sound works!
Now the sounds work, but the destroy-on-collision and the message do not. There is already a boolean variable for the destroy on the script, but it doesn’t seem to be working, perhaps because I modified it? I just wrote my own destroy method in the script and the problem seems to be resolved. I also need to adjust the height of the bins to be more natural.
I’ve also fixed the cat animation so that it goes from idle A to idle B to walk.
The environment still feels quite big. After you pick up an object, it’s such a chore to walk all the way to the bin on the other side. I’m going to play around with the position of everything in the environment to make it easier to move around. Additionally, I want to make it obvious that one should pick up the trash by featuring the bins quite prominently in the scene, as I don’t want to ruin the feel of the piece with instructions.
It’s now 1 am Sunday night, so I’m off to bed. But hopefully someone in my group or I can work on this Monday morning to implement the nice-to-haves:
If you place something in the wrong bin, a message could pop up saying which bin to put it in, so it’s more instructional in nature
Having the trash glow (emission) or make a sound if you are within a certain distance of it
Having some sort of reaction if you pass this distance without picking it up
Having more trash and more complex trash
I am working on the message now, but I’m having issues with positioning the UI and with getting the tag to concatenate onto the message. If I can’t get it working before class, I shall simply delete it.
We started off with some brainstorming for our interactions and actions:
Initial Ideas:
Throwing crumpled paper into a basket
Implement points based on how far back you are → makes you move around
Obstacles (desk, etc.)
Crumpling paper
Classroom, library
Putting food onto tray- cafeteria
Washing face
Taking care of plants
Zen
If you cut the plants they just float around
Twisting knob motion to speed up time → plants grow, lighting changes
Drawing
Slingshot
Flipping coin into fountain
Something could pop out, you have to catch it
After settling on the plant idea, which we all enjoyed, we went into more detail:
Taking care of plants:
Time
Lighting changes
Sun/moon
Plant growth
Environment ideas:
Dorm room
Windowsill
Small cottage
Outside garden, fence
Interaction
Watering
Cutting
Picking fruit/flowers
Growing bamboo
With a solid idea in mind, we went ahead and designed our storyboard:
–Step 1–
Clump of bamboo in front of you
To your side: tree stump with watering can + cutting tool
Surrounding mountains and other bamboo
You’re inside a circle of rocks
Butterflies are flying around
It’s golden hour
–Step 2–
You have the water picked up
Water is gone from stump
–Step 3–
Bamboo is taller
–Step 4–
Replace water with axe
Now the water is back on the stump and the axe is gone
–Step 5–
Show the particles of the bamboo disappearing
–Step 6–
Now an empty spot of bamboo
Our storyboard:
Mar 6
After our group roles were self-assigned (I got the responsibility of scripting), I thought it would be important to start as soon as possible. Since I knew very little about C#, Unity, and scripting, I got to practicing immediately.
The first goal for the day was to play around with some scripts from SteamVR and the online Unity manuals. The first script I created was heavily dependent on what I found online. I assembled a script that caused an object to turn into a new prefab once it collided with a sphere. To make this functional I had to make sure the sphere and the cube both had rigid bodies and colliders. I also made sure that I could throw the sphere, using the components from SteamVR, so that I could pick it up with the back button of the remote. The prefab the sphere turned into was a cup from another asset pack. This was quite a hilarious scene but very beneficial, because it taught me how to specify a particular prefab to use for the transformation and how to prevent the same effect when touching any other named object.
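(The script itself went through many iterations, but a minimal sketch of the idea looks something like this, assuming the replacement prefab and the name of the triggering object are set in the inspector. The names are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: when a specifically named object (e.g. the throwable sphere)
// collides with this cube, replace the cube with a different prefab (the cup).
public class TransformOnCollision : MonoBehaviour
{
    public GameObject replacementPrefab;            // e.g. the cup prefab from the asset pack
    public string triggeringObjectName = "Sphere";  // only this object causes the swap

    void OnCollisionEnter(Collision collision)
    {
        // Ignore collisions from anything other than the named object.
        if (collision.gameObject.name != triggeringObjectName)
            return;

        Instantiate(replacementPrefab, transform.position, transform.rotation);
        Destroy(gameObject);
    }
}
```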
The next project was to write a script that would imitate the effect of bamboo disappearing when hit with a sickle. I used a capsule in place of the sickle and made the movement happen through a public variable. The movement could be in any direction; the script would just add that amount to the current position. The cylinder I attached this script to needed a rigid body and a collider for it to have any effect. I also made sure gravity was activated, because whenever I hit it with the capsule I wanted it to fall back down to the floor. The big issue I found with this setup was that the cylinder could fall out of reach really easily and there was no way of retrieving it.
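(A minimal sketch of that first experiment, with the displacement exposed as a public field. The name check on the cutter is my own assumption here, not necessarily how the original script did it.)

```csharp
using UnityEngine;

// Hypothetical sketch: push the cylinder (stand-in bamboo) by a fixed offset
// whenever the capsule (stand-in sickle) hits it; gravity then pulls it back down.
public class MoveOnHit : MonoBehaviour
{
    public Vector3 movement = new Vector3(0f, 1f, 0f); // direction and amount to move per hit
    public string cutterName = "Capsule";              // assumed name of the hitting object

    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.name == cutterName)
        {
            // Simply add the offset to the current position.
            transform.position += movement;
        }
    }
}
```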
Realizing how easy it was to make effects occur on objects, I brought in several prefabs for similar experiments. I brought in a sickle and stacked bamboo cups (to imitate a bamboo shoot). The script that I created this time was a lot simpler: whenever the sickle collided with the bamboo, it would simply destroy the object. This is very effective in making it look like the shoot is being chopped at different points. If many cups were stacked, I believe it would be quite entertaining to see them all fall down or have some separate effect. This could be played with a bit to obtain the desired result; I will have to discuss it with the group. Overall, today I have found a way to make tools that can be picked up and brought into contact with bamboo shoots, which can then either be knocked away or deleted.
Mar 8
Today I started off by recreating a really crude representation of the environment. I did this so that I could start writing scripts that would be realistic for the area the objects will be placed in at the end. This area included a table, the two tools to be used (sickle and watering can), and a patch of soil with bamboo.
I worked relentlessly on the script responsible for the function of the watering can. My goal was to make a raycast that would detect the bamboo and then return it. I also wanted the object to know when it was being poured and when it was not, which depended on the tilting of the object. Past a certain z rotation, as could be seen in the scene, it looked correct to be pouring water. I could not get the raycast to work (I couldn’t even see the debug line that was supposed to show up), but I succeeded in making the program know when it was pouring.
In the end I was very frustrated, because the line wasn’t working and I didn’t know how to make the tilting of the can do anything. I placed a child Particle System on the can where I thought the water should come from and then left the work for another day.
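(For anyone curious, a minimal sketch of what I was going for looks roughly like this; the tilt threshold, range, and bamboo tag are placeholders. The Debug.DrawRay line is the one I couldn’t see at this point.)

```csharp
using UnityEngine;

// Hypothetical sketch: detect when the watering can is tilted enough to be "pouring"
// and raycast from the nozzle to see whether the water would land on bamboo.
public class WateringCan : MonoBehaviour
{
    public Transform nozzle;       // child transform placed at the spout
    public float pourAngle = 60f;  // z rotation past which the can counts as pouring
    public float range = 5f;

    void Update()
    {
        // Convert the 0..360 euler angle into a signed tilt around z.
        float tilt = transform.eulerAngles.z;
        if (tilt > 180f) tilt -= 360f;
        if (Mathf.Abs(tilt) < pourAngle) return;   // not tilted enough to pour

        // Draw the ray in the Scene view so we can see where the can is "aiming".
        Debug.DrawRay(nozzle.position, nozzle.forward * range, Color.blue);

        if (Physics.Raycast(nozzle.position, nozzle.forward, out RaycastHit hit, range)
            && hit.collider.CompareTag("bamboo"))
        {
            // The bamboo was detected; growth would eventually be triggered here.
            Debug.Log("Watering " + hit.collider.name);
        }
    }
}
```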
Mar 9
Today I worked on three scripts that I thought would be quite vital for the comfort and enjoyment of the final product: one that gets the bamboo cut by the sickle, one that brings items back into range if thrown too far, and one that is responsible for the can’s particle system and detection of the bamboo.
I started off with the bamboo script. I ended up keeping it quite simple and basing it off a previous script that I talked about a few days ago. I made it so that the object that can cut the bamboo is not fixed: the name of the cutting object can be set in a public field.
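(A minimal sketch of that script, with the cutter’s name exposed as a public field; the names here are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: destroy this bamboo segment when the named cutting object hits it.
public class CutByTool : MonoBehaviour
{
    public string cutterName = "Sickle"; // set in the inspector, so the cutter isn't hard-coded

    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.name == cutterName)
        {
            Destroy(gameObject);
        }
    }
}
```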
I noticed a large issue with the script when I was trying to hit the bamboo. Only when the sickle was dropped on the bamboo would it disappear; this was because of some problems with the interaction script from SteamVR. The Attachment Flags had to be altered so that the sickle could not clip through items. This also made the entire system a bit more realistic.
Next, I worked on the script that brings the items back from a distance. This took a lot of time, but in the end I found out that you have to reset the rotation and the position of the item before bringing it back, or else it flies away. The final product looked very good: the item just drops back onto the table once thrown too far. I also made the distance at which it gets teleported back adjustable; I found 2 to be good in the z and x directions. I attached this script to both the can and the sickle.
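(Roughly, the logic ended up looking like the sketch below, assuming a fixed home position on the table and a configurable range; the values and names are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: if the tool is thrown too far from its home spot,
// reset its motion and rotation and drop it back onto the table.
public class ReturnWhenThrownFar : MonoBehaviour
{
    public float maxDistance = 2f;   // allowed offset in x and z before it snaps back

    Vector3 homePosition;            // where the tool sits on the stump/table
    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        homePosition = transform.position;
    }

    void Update()
    {
        Vector3 offset = transform.position - homePosition;
        if (Mathf.Abs(offset.x) > maxDistance || Mathf.Abs(offset.z) > maxDistance)
        {
            // Reset rotation and motion first, otherwise it keeps flying away after teleporting back.
            rb.velocity = Vector3.zero;
            rb.angularVelocity = Vector3.zero;
            transform.rotation = Quaternion.identity;
            transform.position = homePosition + Vector3.up * 0.2f; // drop it slightly above the table
        }
    }
}
```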
Finally, I tried working on the watering raycast issue, but I ran out of time. I did find out why the raycast debug line wasn’t visible, though: it can only be seen in the Scene editor. It was pointing the wrong way, so I changed its direction (in a rather annoying way) and moved it up so that it aligned well with the nozzle (this required some testing since the scales are so messed up).
So much research, video watching, and web scrolling was required to get even here. It makes the final products feel so good when they work…
Mar 11
I noticed that it was difficult to know exactly when the watering can was pointing directly at the bamboo (causing it to grow), so I decided to add a pointer of sorts. Initially I tried to work with line renderers, but this was very difficult: the position had to be continuously updated and the hit position (or end position of the line) could not be infinity. After not resolving the issue, I found an extremely simple solution using a cube. I created a long, thin cube oriented the way the raycast was and colored it orange, just to make it more appealing and fitting with the environment.
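(We set the cube up in the editor, but if one wanted to do the same from code, a sketch might look like this, reusing the assumed nozzle transform from the watering can script above; lengths and colors are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: a long thin cube that shows where the watering can is pointing,
// used instead of a LineRenderer.
public class CubePointer : MonoBehaviour
{
    public Transform nozzle;   // same nozzle transform the raycast starts from
    public float length = 3f;

    Transform pointer;

    void Start()
    {
        pointer = GameObject.CreatePrimitive(PrimitiveType.Cube).transform;
        Destroy(pointer.GetComponent<Collider>());                 // purely visual, no physics
        pointer.localScale = new Vector3(0.02f, 0.02f, length);
        pointer.GetComponent<Renderer>().material.color = new Color(1f, 0.5f, 0f); // orange
    }

    void Update()
    {
        // Keep the cube centered along the ray coming out of the nozzle.
        pointer.position = nozzle.position + nozzle.forward * (length / 2f);
        pointer.rotation = nozzle.rotation;
    }
}
```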
Next, I worked heavily on the growing of the bamboo. I found the growing-flower script from SteamVR and worked off that. I first made two different kinds of bamboo shoots, bambooOG and bamboo, which are essentially the stump the bamboo grows out of and the bamboo that actually grows out of it. I did this because I noticed it would make the entire system easier: I could make only the bamboo vulnerable to the sickle while keeping the bambooOG permanent. I added the growing script from SteamVR to the bambooOG.
After trying to grow the bamboo I noticed two large problems. The first was that the bamboo would spawn way too fast, because the growth occurred as long as the can was pointing at the bambooOG (so with every update). To resolve this I set up a counter that increases with every frame until 60, then runs the growth and restarts. This means that if Update occurs 60 times a second, the bamboo grows one segment per second. Next to tackle was the issue of getting the bamboo segments to stay on top of each other after growing (new segments push the top ones up). The bamboo did not want to go any higher than two shoots tall; it continuously knocked the top one away. This occurred regardless of the position where the bamboo was spawned.
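(A minimal sketch of the counter fix for the first problem; the grow call is a placeholder for whatever the SteamVR growing script actually does.)

```csharp
using UnityEngine;

// Hypothetical sketch: only allow one growth step per 60 frames (~1 second at 60 fps),
// even though the watering check runs every Update.
public class GrowthThrottle : MonoBehaviour
{
    int counter = 0;

    // Called every frame while the can is pointing at the bambooOG stump.
    public void TryGrow()
    {
        counter++;
        if (counter >= 60)
        {
            counter = 0;
            Grow();
        }
    }

    void Grow()
    {
        // Placeholder: here the growing script would spawn the next bamboo segment.
        Debug.Log("Grow one bamboo segment");
    }
}
```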
While doing trial and error with spawning positions and checking/unchecking all the boxes in all the scripts attached to the bamboo, I came across something wonderfully beautiful: you can lock the position and rotation of the spawned bamboo shoots by altering their prefab’s rigid body. What I tried first was freezing the y position, but then I realized that was incorrect (all the bamboo would just fly sideways). Next I increased the mass (for more stability and bounciness) and locked the x and z positions. I came up with a wonderfully appealing result: the bamboo grows in an amazing way that looks like the individual segments are balancing on each other. Even better is the fact that the bamboo can be interacted with by the can (spinning the segments and bouncing them up and down). Cutting them down with the sickle is even more satisfying too, because you can see them quickly get destroyed (without any effect yet, though).
Naturally, I could have locked the rotations as well and it would really look like the bamboo was growing straight up, but I didn’t think the result was nearly as appealing. With this constant balancing, the user was way more pleased (user tested with Max) and wanted to keep making them grow, because it was partially random too. I also realized that the user wanted to keep growing the shoots endlessly, so I decided to add a piece to the bamboo script that releases the freezing of the rigidbody when a certain height is reached (a public int). The final step would perhaps be to make the bamboo disappear a while after its height (y position) drops to or below ground level (to prevent cluttering).
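(A minimal sketch of the constraint trick, with the release height as a public field; the exact mass and height values are placeholders.)

```csharp
using UnityEngine;

// Hypothetical sketch: spawned bamboo segments are locked on x and z so they stack,
// and the lock is released once a segment climbs above a set height.
public class BambooSegment : MonoBehaviour
{
    public float releaseHeight = 3f;   // height at which the segment is allowed to topple
    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.mass = 5f;                                        // heavier = more stable stacking
        rb.constraints = RigidbodyConstraints.FreezePositionX
                       | RigidbodyConstraints.FreezePositionZ;
    }

    void Update()
    {
        if (transform.position.y > releaseHeight)
        {
            // Let physics take over so the top segments can fall off.
            rb.constraints = RigidbodyConstraints.None;
        }
    }
}
```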
Mar 12
Today I made finishing touches to the coding. The first thing I worked on was making sure that the segments at the top of the bamboo sprout fall off after a desired height and eventually disappear after some time on the ground. I did this by making two functions: one that takes off all the restraints at a desired height (when I did this before there were still some problems) and another that waits a period of time before deleting the segment. The first function was relatively simple, but the second one required a counter. I tried using WaitForSeconds(), but apparently that can only be used inside a coroutine (an IEnumerator). I suspected Update runs roughly 60 times a second, so I made a counter based off that.
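(A minimal sketch of the delayed deletion: the frame counter mirrors what I ended up doing, and the coroutine shows how WaitForSeconds would be used instead. The ground tag and delay are placeholders.)

```csharp
using UnityEngine;
using System.Collections;

// Hypothetical sketch: delete a fallen bamboo segment a few seconds after it lands.
public class DelayedRemoval : MonoBehaviour
{
    public float delaySeconds = 5f;
    int frames = 0;
    bool grounded = false;

    void OnCollisionEnter(Collision collision)
    {
        // Start counting once the segment lands on the terrain (tag is a placeholder).
        if (collision.gameObject.CompareTag("ground")) grounded = true;
    }

    void Update()
    {
        if (!grounded) return;
        frames++;
        if (frames >= delaySeconds * 60f)   // assumes roughly 60 Updates per second
            Destroy(gameObject);
    }

    // The alternative I first tried: WaitForSeconds only works when yielded inside a coroutine.
    IEnumerator RemoveAfterDelay()
    {
        yield return new WaitForSeconds(delaySeconds);
        Destroy(gameObject);
    }
}
```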
The next part I worked on was cleaning up the code of the teleporting-an-item-back script. I noticed that this code was not originally very lenient on placement. The items had to be close to the origin of the environment. I fixed this simply by changing some of the math operations. In the end everything worked wonderfully! The next step is to combine this with the environment!
Mar 17
Today was the final day. We all grouped together to make sure everything was working and that the last day of project development could go smoothly. When I came in there were still some issues with the colliding of the water and the execution of the growing command. After three of us worked on the code for a while, we finally got it to work. The big issue had to do with layers: in the end, the terrain had to be placed in a different layer along with the can, because the water particle system was constantly colliding with it. We got the physics and actions to work efficiently. The environment looked good and the objects worked well.