Project 2: Development Blog

This project is going to be like something out of the Harry Potter universe. It places the recipient in a large, Victorian-style greenhouse, in front of a planting station. They are provided with seeds, a watering can, and a planting pot. If the user follows what is pretty much expected and plants the seed and waters it, they end up growing a giant man-eating plant that eats them. Just behind them is a second plant of the same species with a danger sign, an easter-egg warning that the user may or may not see.

I envisioned the greenhouse looking like the ones at Kew Gardens in London, which I visited last summer.

We first set out to find some ready-made assets, primarily a greenhouse and the man-eating plant. I managed to find a greenhouse that cast some nice shadows and came with a bunch of planting pots and benches. I also managed to find an animated plant with teeth. Getting the animation to loop is something we have yet to figure out. We built the planting area by combining some of the benches that came with the greenhouse.

Then we began work on the interactions. We brought the player in from one of the example scenes in the Unity VR package. We also brought in a sphere that we will be using as the seed. Adding colliders to the pot and the table allowed us to place the objects on the surface and drop the seed into the pot. Figuring out how to detect the tilt of the watering can to start playing the water's particle animation took some time, but Max was able to figure it out.
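As a rough illustration of the seed drop, a small trigger collider inside the pot can flag when the seed lands. This is a minimal sketch, not our exact script; the "Seed" tag and the class name are assumptions:

using UnityEngine;

// Sketch: a trigger collider placed inside the pot detects the seed landing.
// The "Seed" tag is an assumption.
public class PotTrigger : MonoBehaviour
{
    public bool seedPlanted;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Seed"))
            seedPlanted = true; // later combined with watering to start the growth
    }
}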

I built an expansive terrain around the greenhouse to create a sprawling forest. I sprayed the area near the greenhouse with patches of grass and a single species of tree, but nothing too extreme or different, so that the focus would remain on the inside. The greenhouse was populated with strange alien plants: trees that reach above the recipient's head and some in planters close to the user, some of which came with their own animations. The bench forms a visual barrier around the user. These worked wonders for bringing the space to life. Some of them emit clouds of spores, which became quite distracting, so I ended up removing them.

To add another 'alternate' element to the world, we added a creature: a giant butterfly in the sky. The butterfly makes the outside seem an even more daunting space than the inside.

We also added some gardening equipment into the space that the recipient can pick up and play around with. I think it may make a fun ending to the game if the user were to pick up one of these and fight off the monster plant.

The growing of the flytrap is triggered when both the soil and the water have been in contact with the seed for a certain amount of time. This took several hours to figure out. The plant, which is already in the pot but extremely small, grows larger and animates, lunging at the viewer.
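A minimal sketch of that timed trigger follows; the names, the three-second threshold, and the scale-up are assumptions rather than our exact implementation:

using UnityEngine;

// Hedged sketch of the timed growth trigger: the seed must sit in the soil
// and under the water stream for a set time. Names and values are assumptions.
public class SeedGrowth : MonoBehaviour
{
    public GameObject plant;        // the tiny plant already sitting in the pot
    public float requiredTime = 3f; // seconds of combined soil + water contact
    public bool inSoil;             // set by the pot's trigger collider
    public bool beingWatered;       // set while water particles hit the seed

    private float contactTime;
    private bool grown;

    void Update()
    {
        if (grown) return;

        if (inSoil && beingWatered)
            contactTime += Time.deltaTime;
        else
            contactTime = 0f; // require continuous contact (an assumption)

        if (contactTime >= requiredTime)
        {
            grown = true;
            plant.transform.localScale = Vector3.one; // grow from "extremely small"
            // the lunge animation would also be triggered here
        }
    }
}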

Finally, we decided to add some ambient and 3D sound.
We found a bunch of sounds on freesound.com. The sound of a tropical rainforest plays around the recipient as the plant in the cage behind them emanates low, rumbling growls.

Project #2 Development Blog

Mar 3

Our group: Vivian, Adham, Cassie, Nico

We started off with some brainstorming for our interactions and actions:

Initial Ideas:

  • Throwing crumpled paper into a basket 
    • Implement points based on how far back you are → makes you move around
    • Obstacles (desk, etc.)
    • Crumpling paper
    • Classroom, library
  • Putting food onto a tray – cafeteria
  • Washing face
  • Taking care of plants
    • Zen
    • If you cut the plants they just float around
    • Twisting knob motion to speed up time → plants grow, lighting changes
  • Drawing
  • Slingshot
  • Flipping coin into fountain
    • Something could pop out, you have to catch it

After deciding on the plant idea we enjoyed most, we went into more detail:

Taking care of plants:

  • Time
    • Lighting changes
    • Sun/moon
    • Plant growth
  • Environment ideas:
    • Dorm room
    • Windowsill
    • Small cottage
    • Outside garden, fence 
  • Interaction
    • Watering
    • Cutting
    • Picking fruit/flowers
    • Growing bamboo

With a solid idea in mind, we went ahead and designed our storyboard:

–Step 1–

Clump of bamboo in front of you

To your side: tree stump with watering can + cutting tool

Surrounding mountains and other bamboo

You’re inside a circle of rocks

Butterflies are flying around

It’s golden hour

–Step 2–

You have the water picked up

Water is gone from stump

–Step 3–

Bamboo is taller

–Step 4–

Replace water with axe

Now the water is back on the stump and the axe is gone

–Step 5–

Show the particles of the bamboo disappearing

–Step 6–

Now an empty spot of bamboo

Our storyboard:

Mar 10

Started work on the particle system – creating the effect of water coming out of the watering can when the user grabs it and pours it towards the bamboo.

In order to make the water from the watering can look realistic, I changed the following parameters: start lifetime, start speed, and start size; the gravity modifier (to 0.3); and the scaling mode (to hierarchy). Under the Emission module, I changed the "rate over time" to 200, and for "force over lifetime" I set Y to -3 and applied it in "world" space instead of local. For "rotation by speed", I changed the angular velocity to 300; I started it at 100, but at that value the water in the game could not keep up with the player's movement speed.
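These settings were dialed in through the Inspector, but for reference, here is a hedged sketch of the same values set from a script; the start lifetime, speed, and size numbers are placeholders since the exact values aren't recorded above:

using UnityEngine;

// Sketch: the watering-can particle settings described above, set from code.
public class WaterSettings : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startLifetime = 1f;  // placeholder value
        main.startSpeed = 3f;     // placeholder value
        main.startSize = 0.05f;   // placeholder value
        main.gravityModifier = 0.3f;
        main.scalingMode = ParticleSystemScalingMode.Hierarchy;

        var emission = ps.emission;
        emission.rateOverTime = 200f;

        var force = ps.forceOverLifetime;
        force.enabled = true;
        force.y = -3f;
        force.space = ParticleSystemSimulationSpace.World;

        var rotation = ps.rotationBySpeed;
        rotation.enabled = true;
        rotation.z = 300f * Mathf.Deg2Rad; // 300 in the Inspector; the scripting API takes radians
    }
}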

Mar 13

Today I worked on making the particle system turn on and off when the object is rotated to a certain angle – when the watering can faces downwards, the water particle system turns on, and when it is back in its normal position, the particle system turns off and the water effect is not shown.

I reached the goal by using transform.eulerAngles and reading the Z angle of the watering can object. We have a boolean-style function called "IsPouring"; I grabbed the particle system under it and added code so that if the angle is beyond the range, the system stops, and otherwise it plays. We call "IsPouring" inside "void Update" to make sure it runs all the time.
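A minimal sketch of that check is below; the 60–300 degree "pouring" range is an assumption rather than the exact threshold we used:

using UnityEngine;

// Sketch: toggle the water particles based on the watering can's Z tilt.
public class WateringCan : MonoBehaviour
{
    public ParticleSystem water; // the child water particle system

    void Update()
    {
        IsPouring(); // keep checking the tilt every frame
    }

    void IsPouring()
    {
        float z = transform.eulerAngles.z; // always reported in the 0–360 range
        bool pouring = z > 60f && z < 300f; // assumed "facing downwards" range

        if (pouring && !water.isPlaying)
            water.Play();
        else if (!pouring && water.isPlaying)
            water.Stop();
    }
}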

There was a small problem when I tested the code – the particle system was always on while the game was playing. I assumed it was disconnected from its parent object, so I added a print statement to the IsPouring function to check whether it was connected to the watering can while the code was running. It turned out that nothing was printed in the console log, so I dragged the particle system onto the watering can to make sure it appeared in the component section (although the particle system was already a child of the watering can), and then it worked.

Mar 15 & 16

Today I'm working on two things: the interaction code so that the bamboo grows when the water particle system is pointed at it (instead of growing when pointed at by the pointer ray), and the floating effect of the rocks (to create a sense of zen in the environment).

  1. The floating rocks effect:

In order to improve the user experience and create the sense of zen, I added the floating effect. I grabbed a simple float-up-and-down script from the Unity community:

using UnityEngine;

// Community snippet, wrapped in a MonoBehaviour so it compiles as-is.
public class FloatUpDown : MonoBehaviour
{
    public float amplitude; // set in Inspector
    public float speed;     // set in Inspector

    private float tempVal;
    private Vector3 tempPos;

    void Start()
    {
        tempVal = transform.position.y;
        tempPos = transform.position; // keep x/z so the rock only bobs vertically
    }

    void Update()
    {
        tempPos.y = tempVal + amplitude * Mathf.Sin(speed * Time.time);
        transform.position = tempPos;
    }
}

2. Bamboo grows when pointed at by the particle system:

We finished this part by using OnParticleCollision to detect collisions between the bamboo and the particle system attached to the watering can. In the beginning we decided to add the particle collision detection to the BambooOG script, because the growing function is in the same script, so it would be easier to call. However, even when put on a different layer and set to "only collide with bamboo", the particle system would literally collide with everything. We then tried writing the particle collision detection only in the watering can's script and calling the bamboo grow function from the other script, so the two parts would not get mixed up with each other. So basically, in the particle system script we say that once it collides with bamboo, it triggers the grow function from the BambooOG script – and then it worked.
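The watering-can side looks roughly like this (a minimal sketch: "Send Collision Messages" must be enabled in the particle system's Collision module, and the "Bamboo" tag and Grow() method name are assumptions):

using UnityEngine;

// Sketch: lives on the water particle system; on collision with bamboo,
// it calls the grow function on the BambooOG script.
public class WaterCollision : MonoBehaviour
{
    void OnParticleCollision(GameObject other)
    {
        if (other.CompareTag("Bamboo"))
        {
            BambooOG bamboo = other.GetComponent<BambooOG>();
            if (bamboo != null)
                bamboo.Grow(); // assumed name for the grow function
        }
    }
}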

Mar 17

Today I worked on the mist effect, which should only be triggered when the sickle is cutting/colliding with the bamboo.

At first I was thinking about attaching the particle system (the mist) to the bamboo script, so that whenever the bamboo was detected colliding with the sickle (the sickle hitting the bamboo), with the result of destroying a GameObject (a piece of bamboo), the mist particle system would be triggered to play. However, this design has two significant difficulties. One is that OnParticleCollision is really hard to reposition for each instantiated piece so that the mist effect only shows on the specific piece of bamboo hit by the sickle (since a lot of bamboo grows out of the OG bamboo). The other is that, because the game object is destroyed at the same moment the child effect on it is triggered, the effect is not shown at all: the moment it is triggered, its parent dies, so the mist has nothing to play on.

Taking these conditions into consideration, I created a new mist script just for the sickle, separate from the bamboo function, so we don't have to reposition the particle system for each specific piece of bamboo. At first I tried to detect the collision of the particle system with OnParticleCollision, but that turned out to be very hard to detect accurately, since there are millions of small particles and they collide with almost everything. I therefore switched to detecting the collision of the sickle itself – once the sickle hits a game object, the particle system (mist) attached to the sickle is triggered.
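A minimal sketch of that sickle script follows; the exact filtering we used isn't recorded, so this version plays the mist on any collision, with a comment where a tag check could narrow it:

using UnityEngine;

// Sketch: the mist lives on the sickle, so it survives the bamboo
// being destroyed at the moment of impact.
public class SickleMist : MonoBehaviour
{
    public ParticleSystem mist; // mist particle system attached to the sickle

    void OnCollisionEnter(Collision collision)
    {
        // A collision.gameObject.CompareTag("Bamboo") check could filter here.
        mist.Play();
    }
}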

Representation in VR

My friend and I were talking about the fact that, despite the radical differences between different groups of languages, the structure of word groups is usually the same. Thus, if we map all the words in English, for example, in a virtual space, we can expect to see that all the English words related to family cluster together. Meanwhile, when we map all the Chinese words into the same space, all the Chinese words related to family will gather in the same region. Such a corresponding relationship between English words and Chinese words makes translation and language learning much easier, considering how different the languages are in terms of grammar, algorithms, characters, etc.

The reason why I think words should be mapped in a 3D space rather than a 2D one is that the connections between words and word groups are too complex to be represented in 2D. In a 3D world, the "distance" between different words will be less skewed: people pick one word or word group, look around, and then see all the connected words and word groups around them. This becomes an interface because people can be immersed in the world of words, and seeing the connections between words and word groups shapes their understanding of language.

Form of Representation Suited for VR: Coding World

I've always been attracted to VR simulations that go beyond entertainment. I firmly believe that the possibilities for VR are limitless and that it can be used to solve problems that hinder humans all over the world. Therefore, I believe that the best VR simulation that can be implemented is an educational one. Ever since I was a kid, I have considered myself a visual learner. Despite the fact that I can understand things on a superficial level just by listening, all information that is cemented in my brain has some form of visual representation attached to it. The same applies nowadays as I study Data Structures and Algorithms. I always need to spend time making drawings and concept maps in order to fully understand the concepts from class. Some concepts are really difficult to follow just by listening to a lecture, and a virtual 3D animation of concepts like recursion, binary trees, and sorting algorithms would be best understood by learners all over the world who struggle with them. This world would resemble platforms like Scratch that teach programming to kids, but would match the complexity of higher-level programming concepts and algorithms with a medium like virtual reality, which represents the concepts much better.

Teaching Coding to kids

Visualizing Data in VR

During Bret Victor's talk, I loved learning about William Playfair and how he invented the bar chart and other graphical methods to represent data. Related to these methods are "explorable explanations": abstract representations that show how a system works, or a way for authors to see what they are authoring without the black box of code.

Data visualizations are a powerful representation that is suited for VR. Though some visualizations have been developed in VR, they usually rely on the game engine to navigate between charts, or they have some irrelevant motion, like the bars of a bar graph rising when it first loads. I think with VR we can do more to incorporate the different modes of understanding that Bret Victor mentioned. For instance, we can build upon our spatial understanding to understand quantities, time, associations between nodes of information, or even how the charts are organized (like a library of books). We can build upon our aural understanding by having audio that explains the data and walks the user through it at a level specific to the user's experience.

VR can make a data visualization an interface to information that makes the data accessible and easy to understand through abstraction. However, there is also potential for it to unpack the layers of abstraction and show how the data visualization was made, or even give the context behind the data. For instance, if there is a chart showing the amount of snowfall, could the user be immersed in the environment showing the snowfall alongside the data visualization of its levels? Data visualizations are a person's story of that data, so they are already created with a specific objective for their audience in mind. The trouble with these visualizations is that they tend to dehumanize the context behind the data, so VR really has the ability to use its potential for immersion to help the audience better understand the story. However, it is important for VR not to exploit this potential and falsify the data by creating a specific immersive experience that causes a different perception of that data. I also think using VR to visualize data relates to the dynamic models that Bret Victor discusses at the end of his talk. Imagine data being updated in real time and seeing how the representation changes: a bar increasing, a point being added to a line graph, etc.

Interaction

A few years ago, a Kickstarter showed up for a game called Superhot. It had a mechanic I had never seen before, which I really liked and thought was really interesting. The game is a shooter where you have to fight through different set stages, but time only moves when you do. So you can assess the situation without moving and then calculate your moves in order to not die and win the round; however, if you look around, the bullets continue flying towards you. It started off as a normal computer game, but it came out around the time VR became popular, and the game was perfect for VR, so it incorporated VR too.

My Favorite Interaction(s)

Actually, there are two interactions I would like to talk about. The first one is a website (shown by Craig in all his classes) that provides an interactive experience. As users access the website via mobile devices, they are able to make their own paper plane, stamp it, and throw it away. Users can also catch a paper plane, unfold it, see from the stamps where the plane has been, and stamp it. The design of this experience is quite simple and intuitive, and the instructions are quite clear. What users do on their mobile devices closely resembles what they would really do if they were making their own paper plane.

https://paperplanes.world/

The other interaction is from a short video I saw, between a user and a piggy bank. The piggy bank looks like a cardboard box. As the user puts a coin on the plate and presses it, a cat sneaks its claw out from the box and 'steals' the coin away. The interaction is quite simple, but it tells a vivid story of the coin being stolen by a greedy cat.

Interaction and Title

Samsung Be Fearless: Fear of Heights – Cityscapes

As someone who used to be afraid of heights as a kid, I felt really connected to this application. It immerses the user in three different types of outdoor settings (an elevator, a skywalk, and a virtually created tower), each with multiple difficulty levels in order to slowly ease you into the experience. Usually, I tend to see that most virtual reality applications are designed for entertainment purposes. However, I am interested in learning more about VR experiences with impactful applications that can change someone's life for the better, and Cityscapes does exactly this. In addition, the application can pair with a Gear S2 to measure the user's heartbeat in order to gauge their progress.

In terms of communicating information and composing an attractive view, the app developers did an outstanding job of simulating a real-life environment that can actually test the user's fear of heights in a variety of ways. As such, I believe careful research was done in order to design the best environment possible, one that takes the user into this simulated world and lets them leave with less fear after hours of use.

After perusing different VR titles, I realized that I want to create one that lets the user interact with non-human entities from our world. For my first project, I created a simulation of a camping site, and I really enjoyed doing so. As such, I would like to continue delving into the idea of simulating humans' interaction with nature in order to raise awareness of the impact (oftentimes a negative one) that humans have on nature, and to extend the significance this interaction has for the user.

Interaction I like: Instagram Stories Stickers

Instagram engagement is no longer just likes and comments; it also includes engagement from your stories, which builds connections between you and your followers, encourages your followers to chat and share their opinions and experiences with you, and therefore lets you interact with your viewers to foster loyalty and stickiness.


A simple interaction I like between two human beings is the way we comment and reply to each other on Instagram stories via story stickers like the question, poll, and vote stickers. They are easy to use, quick to get responses, and the results of the interaction are clearly visualized.

Here are two examples that show how story stickers increase Instagram engagement and interaction:

The Question Sticker:

Nothing sparks conversation more than a good AMA (Ask Me Anything) on Instagram Stories. Influencers have been known to use the question sticker to help their followers get to know them better, and it's also a great opportunity for your followers to get to know you or a specific brand better, or to get more information about your products.

On the flip side, it's a great place for you to ask your followers some questions. You could spark a conversation about your VR project inspiration, your next season's color palette, or what product lines they'd like to see more of. It's engagement, conversation, and customer feedback all together, and it's designed to be user-friendly for both the instagrammer and their followers. All you need to do is drag the bar or comment your ideas on the spot.

Poll and Vote Stickers:

Asking people to vote can make your decision-making much easier and give your followers a sense that they are participating in the choices in your life; people are also curious about what other people's choices are. All they need to do to join the decision-making is simply tap an answer, and the proportion of votes for each choice is shown after they make their own choice.

The Dream Forest: documentation

It is very difficult for me to focus, especially when I’m trying to sleep. Listening to music certainly helps block out darting thoughts, but I wanted to create a visual space that I could focus on before going to bed. Thus, I wanted this space to be peaceful, minimalist, and beautiful.

Inspired by the place I grew up in, I wanted to create some sort of dream forest that felt very natural even if it had some mystical elements. To me, a forest environment conveys solitude and peace, and has no extraneous elements that could be a distraction, helping the viewer to be more immersed in the environment. One of my favorite things is looking up at trees and seeing the criss-crossed layers of branches against the sky. Thus, I was particularly excited to create a forest in VR because the viewer would be able to look across the forest, but also up at the tree branches. I began by creating the forest using a mixture of free tree assets. I ended up removing the leaves of the trees because, though they contributed to a feeling of peace, they went against the meditative aesthetic I was trying to convey – perhaps because the leaves limited how far the viewer could see in every direction, which I felt was crucial to creating a sense of reflection.

The most challenging part of creating the forest was determining the optimal density of the trees. Too few and the environment felt unnatural and bare; too many and the viewer could not see into the distance. Something I didn't take into account was how much space the viewer needed, so I originally placed the player camera in the center of the forest without changing anything for the viewer. This made the environment feel chaotic and cluttered, the opposite of what I aimed for. I ended up creating a clearing in the trees around the player camera so that the environment felt more personalized to the viewer and gave them more room to breathe.

Once the basic form was created, I could focus on the little details that would create the identity of dream-like peace. I began by changing the skybox to put in a night sky. I felt that a darker environment would be more dream-like and conducive to using the environment before sleeping. However, the dark sky with the barren trees gave the ambience of something dark and sinister rather than calming and beautiful. Thus, I knew I needed to add some elements that would make it dream-like, conveying the sense of being in an alternate reality rather than just any forest at night. I added blue fog, which added a tinge of magic but also aerial perspective for the trees in the distance. I added a moon ray, which gave the forest a white glow that made it feel more peaceful. I played around with several elements – a pond, mushrooms, mist, swaying flowers, flying birds – but ended up choosing floating orbs of light and a gentle wave. I wanted something with soft, regular motion, like breathing or rocking a baby to sleep. I chose to create a wave that flowed through the entire forest because of the sense of peace it gave me and the supernatural ambience it added. I tuned the wave so that it would barely be there and then gently fade in and out. For the orbs of light, I created a particle system and adjusted its properties so that the orbs would be a rose gold color to balance the cool tones of the forest. Additionally, I changed the size of the particles and the radius of the system so that the orbs would float up from the whole forest, which the viewer could see if they looked upwards. I wanted the orbs of light to balance the darkness of the forest and to be something calming that also invited a sense of awe. Finally, I added soft music with the sounds of waves to reiterate the peacefulness of the environment.

I am quite happy with the results, though a bit disappointed that I couldn't get it to work with the Google Cardboard. For some reason, every time I built the project with the Google VR Player prefab, my computer would crash and Unity would automatically quit. I did get it to build successfully once, but in the build I couldn't seem to move, which was surprising because it worked perfectly fine when running in Unity. But I suppose this is okay, as I have plenty of time to figure out how to make it work. I actually opened the environment once last night when I was feeling stressed, and it did calm me down a bit, though that could merely be a placebo effect resulting from my bias towards my personal environment. It would be nice to playtest it, see how others respond to the environment, and adjust my design from there. One thing I want to play around with is a script that changes the skybox depending on the viewer's time of day. This is something I want to experiment with rather than definitively do, because I'm not sure how the barren trees would look during the day. Overall, I'm happy that I got better at ambient lighting and creating particle systems in Unity, which will be very helpful for future projects. One thing I learned through this project is how much these little details contribute to the identity of the environment. I originally intended to create a dark forest like the Forbidden Forest in Harry Potter and ended up with a dream-like identity through just a few simple elements.
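As a starting point for that skybox idea, here is a minimal sketch, assuming two skybox materials assigned in the Inspector and a crude 6 am–6 pm daytime window (both assumptions):

using UnityEngine;

// Sketch: swap the skybox based on the viewer's local time of day.
public class SkyboxByTime : MonoBehaviour
{
    public Material daySkybox;   // assigned in the Inspector
    public Material nightSkybox; // assigned in the Inspector

    void Start()
    {
        int hour = System.DateTime.Now.Hour;
        bool isDay = hour >= 6 && hour < 18; // crude daytime window
        RenderSettings.skybox = isDay ? daySkybox : nightSkybox;
        DynamicGI.UpdateEnvironment(); // refresh ambient lighting to match
    }
}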

Link to build: https://drive.google.com/open?id=1_yssadg1_JHMBkA9gciECKkH8AFv1Ikc

Link to project folder: https://drive.google.com/open?id=1aKHo-y6Jyy-8rRRMqRCYTvH2BMCCXRO5

Link to class presentation slides: https://docs.google.com/presentation/d/1iZphCYgfIraWx_qzIPrnPdoGto7hJEd31g_X83OTU0w/edit?usp=sharing