Development Blog_Project 2

As we first planned the project, we came up with the idea of building an experience where the user can learn more about sorting trash. We wanted it to be not only fun but also genuinely educational.

My role in this project was to create the virtual reality space. We first tried using a pre-existing skybox of a city. We tried placing three different garbage bins and garbage such as plastic water bottles, cans, glass bottles, tissues, etc. We found some prefabs that included different kinds of garbage and placed them around the city. The skybox on its own looked amazing, but as soon as the garbage was placed, something didn’t look right. So we decided to get rid of the city-scape and build our campus from scratch.

We decided to focus on the D2 area. We built the buildings, created the grass patches, and added trees. We also added the seating areas in front of the dining hall where people usually eat. Below are the first draft of our environment and the reference image of our campus.

At this stage, we had one main building in the center (the dining hall) and mostly grass patches. Some trash was added, as seen above. However, there was another issue: we wanted to make sure all three sides were enclosed so that the user would never face the problem of not knowing where the environment ends.

Later, we added two more buildings, one on each side of the dining hall.



The environment part of the project was completed. Now we only need to focus on the interactive part of the project.

Midterm project: development 02

Moving on, I built out the space around The Arts Center more. It took a lot of trial and error to get the right look and to see which details would add to it. I realized that getting the overall shape of the buildings and the colors correct really makes them look realistic. Adding windows is another fine element that made the buildings come alive.

Screenshots from work:

building more grass patches around the area

I also looked to see what other structures I could add to make the space more recognizable and realized that there were long, rectangular direction platforms around the grass patches. I went ahead and recreated this in the space:

After that I started building the Dining Hall as well as A6 on the other side.

I identified A6 as being the tallest building among the three, and made sure that was clearly visible in the space.

the initial stage of A6
A6 rising up from the ground!

And finally, here we have the final look of the actual environment!

Midterm project: development 01

For our group, we decided to make a project about bringing awareness to the environment and the waste issue at NYU Abu Dhabi. It seems that many of our students don’t know how to recycle properly or do so infrequently, so we want to use this project to address that problem.

The interaction we want to recreate in this VR project is the act of picking up trash from the ground and recycling it into the right bin. We decided to use our own campus as the actual environment in the project. One important decision we made about the campus was to focus on one portion of it – not too much space, but enough to move around in and see trash littered about, while still feeling like a closed space so players stay within the designated area instead of wandering off. Our set area was the space right outside D2, where there are grass patches and A6 and The Arts Center are on the side.

So far, the groups have been set to take on the following duties:

Building the environment – Lauren (making buildings and environment) & Ju Hee (adding objects, trash and cats to the space)

Making the interaction – Yufei & Simran (picking up of the trash, recycling into the bin)

For the environment, I decided to make the actual campus space (buildings, composition, staging, and lighting). Initially, I was going to pick a color palette based on the campus and build a space using it. We thought maybe we could build an “angular” version of our campus using simple shapes. But after trying that out with simple cubes, I decided the key was for players to be able to identify it as our campus, so it had to look more realistic. So I went and took some reference photos of that area and also looked at Google Maps to get an accurate sense of the space.

referencing Google Maps for accurate details

I started working on The Arts Center first because it’s big and easy to recognize. Once I have it finished, I can build the space around it. I also searched for a lot of textures that I could use for the buildings as well as the ground/environment.

The Arts Center in the making!

The Arts Center with more details

a close up of The Arts Center

I’m happy with the progress I made for today. More coming!

Project #2: Development Blog

For this project, Junior, Claire, and I decided to play around with the everyday theme of having to find your glasses. Coincidentally, all three of us have bad eyesight, so we either wear glasses or contact lenses. We shared our experiences of how we struggle to find our glasses, especially in the mornings, as we tend to forget where we placed them the night before.

Rough Draft of the Setting

We decided to choose the bathroom as our location and include a bathtub, a toilet, a sink, a bathroom shelf, a towel hanger, and towels. We had originally planned to place different objects, but after placing some objects, figuring out each of their positions, and removing some according to the overall balance, we decided to settle on the current bathroom setting. We also added the hanging light as our light source.

Getting the correct Position, Rotation, and Scale

Claire and I were in charge of creating the bathroom setting, matching the virtual space to the actual available space in the classroom, placing the four glasses in different places, and making sure that the user felt that he/she was in a bathroom trying to find his/her glasses.

Top View of Bathroom Setting
Side View of Bathroom Setting
Different Side View of Bathroom Setting

For our next steps, we need to add the different functions of the four glasses in order to make the experience interactive. Each of the four glasses will have a different function – zooming in, zooming out, different colored tints, or corrected vision. Although we understand that blurry vision may cause nausea for the user and may not be suitable for long use, we will play around with the degree of blurriness to see how we can make it work.
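A rough sketch of how these effects might be wired up (the component, effect list, and values below are assumptions for illustration, not our final implementation; note that in a headset the camera field of view is usually controlled by the device, so a lens overlay or post effect may be needed instead):

using UnityEngine;

// Sketch only: each pair of glasses carries a "vision effect" that is applied
// to the main camera when the player puts them on.
public class GlassesEffect : MonoBehaviour
{
    public enum EffectType { ZoomIn, ZoomOut, Tint, Corrected }
    public EffectType effect = EffectType.Corrected;
    public Renderer tintOverlay;   // transparent quad parented to the camera (optional)
    public Color tintColor = new Color(1f, 0.8f, 0.2f, 0.3f);

    float defaultFov;

    void Start()
    {
        defaultFov = Camera.main.fieldOfView;
    }

    // Called by the interaction script when the user "wears" these glasses.
    public void Apply()
    {
        switch (effect)
        {
            case EffectType.ZoomIn:
                Camera.main.fieldOfView = defaultFov * 0.5f;   // narrower view = zoomed in
                break;
            case EffectType.ZoomOut:
                Camera.main.fieldOfView = defaultFov * 1.5f;   // wider view = zoomed out
                break;
            case EffectType.Tint:
                if (tintOverlay != null)
                {
                    tintOverlay.enabled = true;
                    tintOverlay.material.color = tintColor;
                }
                break;
            case EffectType.Corrected:
                Camera.main.fieldOfView = defaultFov;          // normal, corrected vision
                break;
        }
    }
}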

Demo Video

Documentation [Zenboo]

Zenboo was based on the concept of a Zen environment with a simple yet endlessly repeatable action in place. Originally, the plan was to have flowers that could be endlessly grown, but then the idea of bamboo came up. Since, in reality, bamboo grows incredibly quickly and is aesthetically attractive, we decided on this vegetation instead (fig.1). A positive addition was that bamboo already has an inherent connection to the idea of Zen. We wanted to place the user in a comforting environment that presented them clearly with a task they could do continuously in order to relax from the daily stresses of everyday life. All the artistic choices behind the environment were directed toward this comforting attitude. The interactions were also kept simple and obvious: picking up two objects and using them. A watering can could be picked up and used to pour water on the bamboo, which would make it grow, and a sickle could be used to chop the grown bamboo and make it disappear (fig.2).

Knowing that there was a lot to be done, we split the tasks evenly into two groups: scripting and designing. One person was responsible for designing the environment, another was in charge of the music and sound effects, and two were responsible for making all the desired actions feasible. I was responsible for scripting the actions and did most of my testing in a separate scene from the one where the environment was being designed. Since the actions had to be understandable without any description, we made sure to use everyday objects and to code recognizable physics for them. This meant that the watering can could be lifted and that water would only appear when it was tilted past a certain angle, and that bamboo would grow upward when water touched it. The testing area was modeled around what the final scene would present to the user. The tools were placed near the user’s spawn point and could be used on the bamboo close by (fig.3-4).

The behaviors of the objects were expected because they were similar to reality, which made them feel like everyday actions. This meant that tools could be lifted, thrown, and dropped, and behaved correctly when coming into contact with other objects or when being poured. The only area where an unexpected result appeared was when the bamboo grew (fig.5). We discovered during testing that bamboo balancing on itself was a lot more attractive and brought more comfort to the user, similar to stacking stones (fig.6), so it replaced the regular straight growth of bamboo shoots. Having the segments of bamboo fall to the ground after they reached a certain height was also part of this balancing. This brought new possibilities to a familiar behavior and also prevented clutter by having the segments disappear after a moment. After all the objects were designed and equipped with their respective behaviors, they were made into prefabs and placed in the final scene at similar coordinates (see Cassie’s blog). The environment was designed with warm sunset lighting, comforting wind, grass, and hills to bring ease to the user. A small oddity was the floating rocks; although these objects do not follow real-world physics, they look mesmerizing and so were kept in the environment. Generally, having a few quirks that gave the area personality was expected to give the user more reason to want to relax in this world.

Our expected world was a place a user could freely spend their time in with the goal of alleviating stress. This was achieved because the user had a simple task that could be continued endlessly and a surrounding that promoted comfort. With more experience and time, the world could eventually be expanded. There could be more tasks for the user to indulge in and more scenery that is intriguing to look at and enjoy. Expanding is always a possibility to entertain the user, but keeping them in a roughly enclosed area is a solution too. Keeping them enclosed with only a few tasks to focus on lets them possibly enter a form of meditation, which is by far the best stress-relieving method. Better design of the current scene could have involved matching asset styles and accounting for certain behaviors. Making the fallen bamboo interactable by hand, and making sure the tools were always held the correct way, would have been logical. Matching the tools’ materials to the design style of the environment would have been more attractive. After showing the project in class, I also noticed that some of the music could have been reworked to be less harsh, and the water system needed some tweaking. Simply put, there were a few factors that made the objects in the project seem unworldly and made it harder for the player to immerse themselves.

Though there were several factors that could be worked on, there was also a sure sign that the project was a success. This is evident in three behaviors: players wanted to place the controller back on the stump when they were done, players tried to move out of the way of falling bamboo, and players continued to water the bamboo endlessly without tiring. This shows that players were able to connect their reality with the world we created to such an extent that the lines between the two became blurred.

Project 2: Development Blog

This project is going to be like something out of the Harry Potter universe. It places the recipient in a large, Victorian-style greenhouse, in front of a planting station. They are provided with seeds, a watering can, and a planting pot. If the user does what is pretty much expected – plants the seed and waters it – they end up growing a giant man-eating plant that eats them. Just behind them is a second plant of the same species with a danger sign, an easter egg warning that the user may or may not see.

I envisioned the greenhouse to look like the ones found at the Kew Gardens in London that I visited last summer.

We first set out to find some ready-made assets, primarily a greenhouse and the man-eating plant. I managed to find a greenhouse that cast some nice shadows and came with a bunch of planting pots and benches. I also managed to find an animated plant with teeth. Getting the animation to loop is something we have yet to figure out. We built the planting area by combining some of the benches that came with the greenhouse.
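One possible fix for the looping issue mentioned above (assuming the plant uses Unity’s legacy Animation component; the clip name "Bite" is a placeholder) is to set the clip’s wrap mode from a script; if the plant uses an Animator instead, ticking "Loop Time" on the clip’s import settings should do the same thing:

using UnityEngine;

// Sketch only: loop the plant's imported clip at runtime.
public class LoopPlantAnimation : MonoBehaviour
{
    void Start()
    {
        Animation anim = GetComponent<Animation>();
        anim.wrapMode = WrapMode.Loop;   // repeat the clip indefinitely
        anim.Play("Bite");               // placeholder clip name
    }
}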

Then, we began work on the interactions. We brought the player in from one of the example scenes in the Unity VR package. We also brought in a sphere that we will be using as the seed. Adding colliders to the pot and the table allowed us to place objects on the surface and drop the seed into the pot. Figuring out how to detect the tilt of the watering can to start playing the particle animation of the water took some time, but Max was able to figure it out.
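For the seed-in-pot check, a minimal version could look like the sketch below (the tag and field names are assumptions, not our exact setup): a trigger collider inside the pot flips a flag when the seed falls in.

using UnityEngine;

// Sketch only: a trigger volume sitting inside the pot. The pot's collider is
// marked "Is Trigger" and the seed has a Rigidbody and the tag "Seed".
public class PotTrigger : MonoBehaviour
{
    public bool seedPlanted;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Seed"))
        {
            seedPlanted = true;
            Debug.Log("Seed dropped into the pot");
        }
    }
}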

I built a large terrain around the greenhouse to create an expansive forest. I sprayed the area near the greenhouse with patches of grass and a single species of tree, but nothing too extreme or different, so that the focus would remain on the inside. The greenhouse was populated with strange alien plants: trees that reach above the recipient’s head and some close to the user in planters, some of which came with their own animations. The benches form a visual barrier around the user. These plants worked wonders for bringing the space to life. Some of them emitted clouds of spores, which became quite distracting, so I ended up removing them.

To add another ‘alternate’ element to the world we added a creature, a giant butterfly in the sky. The butterfly makes the outside seem an even more daunting space than the inside.

We added some gardening equipment into the space that the recipient can pick up and play around with as well. I think it might make a fun ending to the game if the user were to pick up one of these and fight off the monster plant.

The growing of the flytrap is triggered when both the soil and the water have been in contact with the seed for a certain amount of time. Figuring out how to do this took several hours. The plant, which is already in the pot but extremely small, grows larger and animates, lunging at the viewer.
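A simplified version of that trigger logic might look like this (the field names, tags, and the way the plant is revealed are illustrative, not our exact code): the seed accumulates "watered time" only while it is resting in the soil, and once the timer passes a threshold the hidden plant scales up and its lunge animation plays.

using UnityEngine;

// Sketch only: attached to the seed. Assumes a Rigidbody on the seed, a collider
// tagged "Soil" in the pot, and "Send Collision Messages" enabled on the water
// particle system so OnParticleCollision fires.
public class SeedGrowth : MonoBehaviour
{
    public GameObject plant;           // the tiny flytrap already sitting in the pot
    public float requiredSeconds = 3f; // how long water + soil contact is needed

    bool touchingSoil;
    float wateredTime;
    bool grown;

    void OnCollisionStay(Collision other)
    {
        if (other.gameObject.CompareTag("Soil"))
            touchingSoil = true;
    }

    void OnParticleCollision(GameObject other)
    {
        // Water particles are hitting the seed; count time only while it sits in soil.
        if (touchingSoil && !grown)
        {
            wateredTime += Time.deltaTime;
            if (wateredTime >= requiredSeconds)
                GrowPlant();
        }
    }

    void GrowPlant()
    {
        grown = true;
        plant.transform.localScale *= 10f;          // reveal the plant by scaling it up
        plant.GetComponent<Animation>().Play();     // play the lunge animation
    }
}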

Finally, we decided to add some ambient and 3D sound.
We found a bunch of sounds on freesound.com. The sound of a tropical rainforest plays around the recipient while the plant in the cage behind them emanates low, rumbling growls.
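The 3D part mostly comes down to the AudioSource’s spatial blend; a minimal setup like the sketch below (the clip field and distance values are placeholders) makes the growl positional, so it gets louder as the recipient approaches the cage.

using UnityEngine;

// Sketch only: a positional (3D) growl attached to the caged plant.
public class GrowlSound : MonoBehaviour
{
    public AudioClip growlClip;   // low, rumbling growl clip

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = growlClip;
        source.loop = true;
        source.spatialBlend = 1f;   // 1 = fully 3D; volume falls off with distance
        source.minDistance = 1f;
        source.maxDistance = 15f;
        source.Play();
    }
}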

A form of representation that suits VR

A form of representation I feel would be suited to the VR medium is imaging for doctors during surgery.

It creates an interface to the world in which the doctor can comprehensively assess the patient’s physical situation, instead of just looking at a flat screen and being influenced by the surrounding environment. It is common nowadays for a doctor’s hands to be performing surgery while his or her head is turned in another direction to watch the live image on a screen; this is an “inhumane” design for doctors because it breaks the coherence between eyes and hands. What’s more, multiple other external conditions might influence a doctor’s judgment – the light in the operating room, the anxious patient, or even the breathing of the surgical assistants. Implementing VR in this specific situation would therefore minimize external influences so the doctor can perform better.

By operating with precise equipment attached to VR, the doctor would be able to focus on fine details in the surgery (e.g., suturing the wound) and reduce the risk of misjudgment.

Project #2 Development Blog

Mar 3

Our group: Vivian, Adham, Cassie, Nico

We started off with some brainstorming for our interactions and actions:

Initial Ideas:

  • Throwing crumpled paper into a basket 
    • Implement points based on how far back you are → makes you move around
    • Obstacles (desk, etc.)
    • Crumpling paper
    • Classroom, library
  • Putting food onto tray- cafeteria
  • Washing face
  • Taking care of plants
    • Zen
    • If you cut the plants they just float around
    • Twisting knob motion to speed up time → plants grow, lighting changes
  • Drawing
  • Slingshot
  • Flipping coin into fountain
    • Something could pop out, you have to catch it

After deciding on the plant idea, which we all enjoyed, we went into more detail:

Taking care of plants:

  • Time
    • Lighting changes
    • Sun/moon
    • Plant growth
  • Environment ideas:
    • Dorm room
    • Windowsill
    • Small cottage
    • Outside garden, fence 
  • Interaction
    • Watering
    • Cutting
    • Picking fruit/flowers
    • Growing bamboo

With a solid idea in mind, we went ahead and designed our storyboard:

–Step 1–

Clump of bamboo in front of you

To your side: tree stump with watering can + cutting tool

Surrounding mountains and other bamboo

You’re inside a circle of rocks

Butterflies are flying around

It’s golden hour

–Step 2–

You have the water picked up

Water is gone from stump

–Step 3–

Bamboo is taller

–Step 4–

Replace water with axe

Now the water is back on the stump and the axe is gone

–Step 5–

Show the particles of the bamboo disappearing

–Step 6–

Now an empty spot of bamboo

Our storyboard:

Mar 10

Started to work on the particle system – creating the effect of water coming out of the watering can when the user grabs it and pours it toward the bamboo.

To make the water from the watering can look realistic, I changed the following parameters: start lifetime, start speed, and start size; the gravity modifier (to 0.3); and the scaling mode (to Hierarchy). Under the Emission module I set “Rate over Time” to 200, and for “Force over Lifetime” I set Y to -3 and applied it in “World” space instead of local. For “Rotation by Speed” I changed the angular velocity to 300; I started with 100, but at that speed the water could not keep up with the player’s movement.
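For reference, here is roughly what those settings look like when applied from a script instead of the Inspector (which is where I actually set them); note that the rotation value is converted because the scripting API expects radians where the Inspector shows degrees, and the start lifetime/speed/size values are not repeated here.

using UnityEngine;

// Sketch only: the same particle settings described above, applied from code.
public class WaterParticleSetup : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.gravityModifier = 0.3f;
        main.scalingMode = ParticleSystemScalingMode.Hierarchy;
        // start lifetime / start speed / start size were also tuned (values set in the Inspector)

        var emission = ps.emission;
        emission.rateOverTime = 200f;

        var force = ps.forceOverLifetime;
        force.enabled = true;
        force.space = ParticleSystemSimulationSpace.World;
        force.y = -3f;

        var rotation = ps.rotationBySpeed;
        rotation.enabled = true;
        rotation.z = 300f * Mathf.Deg2Rad;   // Inspector shows degrees per second
    }
}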

Mar 13

Today I worked on turning the particle system on and off when the object is rotated to a certain angle – when the watering can faces downward the water particle system turns on, and when it is in its normal position the particle system is off and the water effect is not shown.

I achieved this by using transform.eulerAngles and reading the Z angle of the watering can object. We have a function called “IsPouring”, so I grabbed the particle system under it and added code so that if the angle is beyond the pouring range the system stops, and otherwise it plays. We call “IsPouring” from inside “void Update” to make sure it runs all the time.

There was a small problem when I tested the code – the particle system was always on while the scene was playing. I assumed it was disconnected from its parent object, so I added a print statement to the IsPouring function to check whether it was connected to the watering can while the code was running. It turned out that nothing was printed to the console, so I dragged the particle system onto the watering can to make sure it appears in the component section (even though the particle system was already a child of the watering can), and then it worked.
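A trimmed-down sketch of that check (the threshold angles here are placeholders, not the exact values we use):

using UnityEngine;

// Sketch only: read the watering can's Z rotation every frame and play the water
// particles only while the can is tipped over.
public class WateringCan : MonoBehaviour
{
    public ParticleSystem waterParticles;   // dragged in via the Inspector

    void Update()
    {
        IsPouring();
    }

    void IsPouring()
    {
        float z = transform.eulerAngles.z;   // 0-360 degrees

        // Pour only while the can is tipped between roughly 60 and 300 degrees.
        bool pouring = z > 60f && z < 300f;

        if (pouring && !waterParticles.isPlaying)
            waterParticles.Play();
        else if (!pouring && waterParticles.isPlaying)
            waterParticles.Stop();
    }
}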

Mar 15&16

Today I’m working on two things: the interaction code so that the bamboo grows when the particle system (the water) is pointed at it (instead of growing when pointed at by the pointer ray), and the floating effect of the rocks (to create the sense of zen in the environment).

  1. The floating rocks effect:

In order to improve the user experience and create the sense of ZEN, I added the floating effect:

I simply grabbed a float-up-and-down script from the Unity community:

using UnityEngine;

// Attach to each floating rock; amplitude and speed are set in the Inspector.
public class FloatingRock : MonoBehaviour
{
    public float amplitude; // set in Inspector
    public float speed;     // set in Inspector
    public float tempVal;
    public Vector3 tempPos;

    void Start()
    {
        tempVal = transform.position.y;
        tempPos = transform.position;   // keep the original x and z
    }

    void Update()
    {
        // Oscillate around the starting height with a sine wave.
        tempPos.y = tempVal + amplitude * Mathf.Sin(speed * Time.time);
        transform.position = tempPos;
    }
}

2. Bamboo grows when hit by the particle system:

We finished this part by using OnParticleCollision to detect the collision between the bamboo and the particle system attached to the watering can. In the beginning we decided to put the particle collision detection in the BambooOG script, because the growing function is in the same script and so would be easier to call. However, even after being put on a different layer and set to “only collide with bamboo”, the particle system would literally collide with everything. We then tried writing the particle collision detection only in the watering can’s script and calling the bamboo grow function from the other script, so the two parts would not interfere with each other. So, basically, in the watering can’s script we say that once the particles collide with bamboo, the grow function from the BambooOG script is triggered, and then it worked. The code we used is shown below:
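In outline, it looks roughly like this (the Grow() method body and the “Bamboo” tag are stand-ins, not necessarily our exact names):

using UnityEngine;

// Sketch only: attached to the same object as the water ParticleSystem.
// The Collision module needs "Send Collision Messages" enabled.
public class WaterCollision : MonoBehaviour
{
    void OnParticleCollision(GameObject other)
    {
        if (other.CompareTag("Bamboo"))
        {
            BambooOG bamboo = other.GetComponent<BambooOG>();
            if (bamboo != null)
                bamboo.Grow();
        }
    }
}

// Sketch only: attached to each bamboo shoot.
public class BambooOG : MonoBehaviour
{
    public float growStep = 0.1f;

    public void Grow()
    {
        // Stretch the shoot upward a little each time water hits it.
        transform.localScale += new Vector3(0f, growStep, 0f);
    }
}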

Mar 17

Today I worked on the mist effect, which is only triggered when the sickle is cutting (colliding with) the bamboo.

At first I was thinking about attaching the particle system (the mist) to the bamboo script, so that whenever the bamboo is detected colliding with the sickle (the sickle hitting the bamboo), which results in a GameObject (a piece of bamboo) being destroyed, the mist particle system would be triggered to play. However, this design has two significant difficulties. One is that it is really hard to reposition the particle system on the instantiated pieces so that the mist only appears on the specific piece of bamboo hit by the sickle (since a lot of bamboo grows out of the OG bamboo). The other is that the game object is destroyed at the same moment its child effect is triggered, so the mist never shows at all – the moment it is triggered, its parent dies and the mist has nothing to play on.

Taking these conditions into consideration, I created a new mist script just for the sickle, separate from the bamboo functions, so we don’t have to reposition the particle system for each specific piece of bamboo. At first I tried to detect the collision of the particle system with OnParticleCollision, but it turned out to be very hard to detect accurately, since there are millions of small particles and they collide with almost everything. Therefore I switched to detecting the collision of the sickle itself – once the sickle hits a game object, the particle system (the mist) attached to the sickle is triggered. The code is shown below:
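In outline, the sickle-side version looks roughly like this (the tag and the contact-point handling are stand-ins, not necessarily our exact code):

using UnityEngine;

// Sketch only: the mist particle system lives on the sickle, so it survives even
// when the piece of bamboo it hit is destroyed.
public class SickleMist : MonoBehaviour
{
    public ParticleSystem mist;   // child of the sickle, assigned in the Inspector

    void OnCollisionEnter(Collision other)
    {
        if (other.gameObject.CompareTag("Bamboo"))
        {
            // Move the mist to the point of contact and play it; the bamboo script
            // handles destroying the piece that was cut.
            mist.transform.position = other.contacts[0].point;
            mist.Play();
        }
    }
}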

Molecular Pathways – A Better Representation

In molecular and cellular biology there are two three-dimensional aspects that are often portrayed in two dimensions: molecules and their interactions within pathways. Molecules are usually drawn as simple blobs or shapes in order to better visualize the different domains that have specific functions (proteins have very complex shapes). With these simple shapes, their interactions are usually connected by a vast map of arrows and inhibitions, which forms the pathways that biologists study and use to develop function-specific drugs or site-specific research. This decision to go simple and 2D has made studying and using biology a lot simpler, but it has taken out a very important factor: molecules and pathways have movement. This movement, which is important in the interaction of proteins, depends greatly on size and surroundings, which are all three-dimensional. Naturally, modeling this mathematically is quite complex and requires a lot of prior knowledge and computational power, but once achieved it has great benefits. If these pathways could be brought into a dynamically moving, interactable, three-dimensional world, there would be the possibility of better research into and understanding of medicine. By better understanding what happens when certain interactions are “physically” and “visually” removed, there would be less wasted effort in pathway structure and experimental design.

Check out this cancer pathway map: https://www.qiagen.com/dk/shop/genes-and-pathways/pathway-details/?pwid=301

Interaction

My favorite interaction in life is board games. The way they are made is so simple, but there are rules that you have to follow. An example would be a game called “Munchkin”. The game is a tiny version of “Dungeons and Dragons”. The goal of the game is to reach level 10 (or level 20 in the expansions).

The game is simple and complicated at the same time. You interact with it using dice, some coins, and of course the cards.