Brainstorming the Final Project

I really wanted to work with the “close the door” idea, and what immediately came to mind was inspired by the movie Howl’s Moving Castle, in which there is a ‘moving house’ that is somewhat alive, with a special front door and a dial next to it. As you rotate the dial, the castle takes you to a specific place.

The castle
The door with the dial

But as I kept sketching out the idea, I could not seem to build a story or a set pathway around the concept…

Sticking with the closing-door idea, I thought of different movies where a door hides a different or unusual world. I found inspiration in movies like ‘The Chronicles of Narnia: The Lion, the Witch and the Wardrobe’ and ‘Monsters, Inc.’

What both of these ideas had in common was that the door usually led to a normal closet, but at other times to a whole different world. Branching from that concept, I wanted to recreate that experience in VR.

The player would be placed in a plain, simple bedroom with one door. Drawn to the door, they open it and discover that it is just a normal closet filled with ordinary items. The twist is that if the player ‘closes the door’ with themselves inside, they are faced with mystical things floating around them and realize they are no longer in their mundane room. The door would still be visible to them in that world, though, and if opened again, it would take them back to where they started: the closet and the room.

Project 3 Ideation and Storyboarding

Project Title: Windows XPerience

Team members: Junior, Ju Hee, Adham

For this project, we will go back in time to revisit a staple piece of technology: a Windows XP computer with the classic green valley and blue sky background. Our inspiration came after a rigorous brainstorming session where we touched on a lot of ideas that interested us. After presenting some of the ideas to the class and deliberating further within our group, we decided that this idea provided the best balance between abstract storytelling and realistic implementation, considering the timeframe and themes available. The theme our piece will be based on is “close the door”, and we hope to address it in an interesting way, giving the user the possibility to decide the ending of a story in which, inevitably, the door will be closed.

As seen in Scene 1, the user will be in a room with an old-style Windows computer in front of them. When they click on it, they will be sucked inside the computer, echoing the idea presented by Richard Moore in the Disney animated film Wreck-It Ralph 2. In order to escape the computer, the user will have to click on a set of icons in a predefined sequence. Each icon will transport the user into a different scene, and it is up to the user to click on the right set of icons to escape.

Possible Assets:

  • Windows XP Icons
  • Furniture for the room
  • Laptop
  • Paint brushes (Paint Program)

Interactions to Design and Code:

  • Clicking on the icons
  • Scene Manager to move from one scene to another (see the sketch below)
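
A minimal sketch of what the icon-click interaction could look like in Unity, assuming each icon is a clickable object that loads the next scene by name; the class name, field name, and scene names here are illustrative, not our final implementation:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical icon-click handler: loads a target scene when this icon is clicked.
public class IconClick : MonoBehaviour
{
    [SerializeField] private string targetScene; // name of the scene this icon leads to

    // OnMouseDown fires when a collider on this object is clicked
    // (a VR pointer would need to forward an equivalent event).
    private void OnMouseDown()
    {
        SceneManager.LoadScene(targetScene);
    }
}
```

The target scenes would also need to be added to the project’s Build Settings for LoadScene to find them.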

Sound:

  • God-like voice that tells the user about the situation they are trapped in and gives them clues to escape the computer
  • Sound effects for clicking on different icons
  • Sound effect of getting absorbed into the monitor
  • Peaceful background music when the user is in the green valley

Lighting:

  • Stark contrast in lighting between the room and the inside of the computer.
Windows Background we are trying to emulate

Final project: Lauren & Shenuka

For our final project, we’ve decided to create an environment where the player can move pebbles on the ground of a deserted island surrounded by sea, and have the stars up in the sky reflect the movement of these pebbles as they are moved. After a certain amount of time has passed, the sun rises and sets, thus “renewing” the sky and giving the player a new blank canvas to create another constellation on.

This idea came from reading a chapter of Invisible Cities, where things happening on the ground affect what’s above them in a similar manner.

Here’s what we imagine the environment to look like:

courtesy of Shenuka Corea

*note: the sea surrounding the little deserted island provides natural boundaries that constrain the player’s space.

And a storyboard of how the world would work:

courtesy of Shenuka Corea

Our project aims to use the space, the objects within it, the relationship between them, and the sense of time in relation to cause and effect to convey “the stuff of story.” The interaction between the player and the world lends itself to discoveries and experiments.

Assets we’ll need:

  • stars in the sky
  • island terrain and water for the ground
  • night sky as skybox
  • animation of sun rising and setting to restart the sky

Interactions to design (+code):

  • moving pebbles – objects with gravity that respond to where the player moves them
  • a mirrored movement of the stars above
  • an extra time element (delay) added to the stars so they leave a trail behind as they move into the places corresponding to the pebbles, creating the effect of shooting stars! (see the sketch below)
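
Here is a rough sketch of how the delayed star movement might work in Unity, assuming each star stores a reference to its pebble and a TrailRenderer handles the streak; the names and values are placeholders, not final code:

```csharp
using UnityEngine;

// Hypothetical star script: the star lerps toward a sky position derived from
// its pebble, so it lags behind while the pebble moves. A TrailRenderer on the
// star turns that lag into a shooting-star streak.
public class StarFollower : MonoBehaviour
{
    public Transform pebble;                              // the pebble this star mirrors
    public Vector3 skyOffset = new Vector3(0f, 50f, 0f);  // maps ground positions up into the sky
    public float followSpeed = 0.5f;                      // lower = longer delay and longer trail

    void Update()
    {
        Vector3 target = pebble.position + skyOffset;
        // Smooth movement toward the target keeps the star trailing the pebble.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
    }
}
```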

Sound/light:

  • ambient during nighttime – see first photo referenced above
  • calming sounds of ocean waves in the background
  • stars above sparkling a little

DEVELOPMENT BLOG – BETRAY

Our initial idea was to tell the story of the development of the UAE, based on the theme “renewal”. I took a core class last semester called The History and Environment of the Middle East, and I was shocked by the fact that this region used to have oases in ancient times. Thus, we thought it would be interesting to show how this place changed from oasis to desert, and then from desert to a modernized city with skyscrapers and trees planted along the roads. Our idea was to set our story on an isolated island representing the UAE, because recreating the whole country would be challenging.

Then we thought that it would be too complicated to tell the history of the UAE from the beginning, and we failed to figure out how to make the transitions between the three scenes – oasis, desert, and modern city – natural and intuitive, so we decided to focus on the last two stages: how the UAE developed from desert to a modernized city. After some research, we identified three key stages for economic growth in the UAE, and we listed the elements that should appear in each stage:

Stage #1: desert, tent, barren, camel, cactus, pearl picking

Stage #2: souq, discovery of oil, the unification of the seven emirates

Stage #3: Asian cup, skyscraper, NYU, city park, luxury car

In terms of transitions between scenes and the experience of time, the player would be an old man who has lived in the UAE his entire life. Through the player’s interactions and conversations with the people around him in the three stages, the player gets a sense of his age. For example, in stage #1 the player sees his parents collecting pearls; in stage #2 the player trades with other people in middle age; and in stage #3 the player lives happily with his grandchildren.

However, we still felt that the elements we wanted to include in our environment and the player were only loosely connected: if the player can see the history of the UAE from a third-person view, why are we designing a VR experience for users to interact with the environment? Following that question, we tried to narrow the scope of the story: instead of telling the story of the whole country, we wanted to focus on what this old man experienced during his life. The experience would start with the player lying in bed in his own luxurious bedroom, and we designed three interactions based on the three stages mentioned above:

Interaction #1: The player approaches the window. Looking outside, he sees the modern city (stage #3).

Interaction #2: The player approaches the wall and touches the photo hanging on it, triggering his midlife memory (stage #2).

Interaction #3: The player approaches a desk with a handmade toy on top of it. Picking up the toy, he remembers how his mother made it for him when they lived in tribes (stage #1).

Then we realized another issue with this design: after the player enters stage #2, for example, how do they come back to the main scene (the bedroom)? We wanted to avoid back buttons because we wanted the experience to be more coherent. Bearing that question in mind, we tried to come up with interactions that could actually push the story forward and make the transitions between stages more fluid. Then we thought we could set our environment at the Corniche. The player could collect pearls at the beach, and as they collect more pearls, the environment changes: more people gather, and the souq gradually forms. Then the player can trade with other people, and as more people trade, skyscrapers appear. However, a huge problem with this idea is that the rapid development of the country over the past several decades is due to the discovery of oil, yet we could not come up with a way to give the player a role in that discovery.

At this point, Vivian and I felt like we were trapped by the huge idea of presenting the history of the UAE, and we realized that this was an impossible task considering the number of factors involved in the story. We decided to start over and look at the other two themes.

I had another idea of simulating how people feel after taking hallucinogens, the effects of which have been at the center of debate for decades. Meanwhile, Vivian found an interesting project: a 3D data visualization of four seconds of brain activity from someone falling in love. We thought it would be interesting to create a multi-sensory experience of love and betrayal. We decided to name our project Betray; it will be a musical, bittersweet love story conveyed through beats, background music, and key words that follow the story’s timeline.

Our environment will be quite illusory. We will create an endless world, as if players are floating in the universe, but we will also distort the color of the background so that the environment looks unfamiliar. A list of words associated with love and betrayal (maybe also trust?) will appear in order, accompanied by background music. Players can interact with the words: by clicking the trigger, throwing the word away, and so on, the player can set off different sound and visual effects. There will be a certain level of randomness in the experience, but certain properties of the effects, such as the volume and pitch of the sounds and the movement of the visual objects, will be decided by how the player interacts with the word and the category of the word. For example, if they interact with a word associated with love, the sound and visual effect will be soft and calming, while if they interact with a word associated with betrayal, the sound and visual effect will be harsh and intense.
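
As a rough sketch of that mapping (all names and values here are assumptions, not our actual code), a word object could carry its category and adjust a sound effect’s volume and pitch accordingly when triggered:

```csharp
using UnityEngine;

// Illustrative only: softer sound for "love" words, harsher for "betray" words,
// plus a little randomness, as described above.
public class WordEffect : MonoBehaviour
{
    public enum Category { Love, Betray }
    public Category category;
    public AudioSource audioSource;

    // Call this when the player clicks the trigger on the word or throws it away.
    public void OnWordTriggered()
    {
        audioSource.volume = (category == Category.Love) ? 0.4f : 1.0f;
        audioSource.pitch  = (category == Category.Love) ? 0.9f : 1.3f;
        audioSource.pitch += Random.Range(-0.1f, 0.1f); // a touch of randomness
        audioSource.Play();
    }
}
```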

See Vivian’s blog post for our storyboard.

I made a demo using p5.js for the visual effects of our project. However, after presenting it to the class, I realized that I had focused too much on how our project looks and ignored the story behind it. What message are we trying to convey to users through the sound and visual effects? I also shared an alternative idea of creating a virtual version of the Museum of Broken Relationships, where users can play around with objects in the environment and figure out the story behind each object. Sarah suggested that the storyline should have top priority if we want this to be immersive storytelling.

After discussing with Vivian, we came up with the idea of having multiple objects and a cardboard box in the environment. By pairing the objects up and putting them into the box, users trigger the voiceover of one part of the story. The order in which users put the objects into the box does not matter, since we wanted to leave enough space for imagination and free interpretation of the story. We also decided to have a female and a male version of the story, because we believe there are gender differences in the understanding and perception of love. Users will be able to choose whether to enter the female or the male version of the story.
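
A minimal sketch of how the box could detect a completed pair and play its voiceover, assuming each object carries a shared pair ID; the class and field names here are hypothetical:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Attach to each story object; both halves of a pair share the same pairId.
public class PairItem : MonoBehaviour
{
    public string pairId;          // e.g. "toothbrush"
    public AudioSource voiceover;  // narration clip for this pair of objects
}

// Attach to the cardboard box (with a trigger collider).
public class MemoryBox : MonoBehaviour
{
    private readonly Dictionary<string, int> counts = new Dictionary<string, int>();

    private void OnTriggerEnter(Collider other)
    {
        PairItem item = other.GetComponent<PairItem>();
        if (item == null) return;

        int n;
        counts.TryGetValue(item.pairId, out n);
        counts[item.pairId] = n + 1;

        // Both objects of the pair are now in the box: play that part of the story.
        if (counts[item.pairId] == 2)
            item.voiceover.Play();
    }
}
```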

Here is our story script:

Male Version:

Movie Ticket & Popcorn Box: I finally got the courage to invite her out for a movie. Touched her hand when we reached for the popcorn at the same time.

Paired Toothbrush & Cup: The first night she slept over, she brought her toothbrush with her. The next morning, she changed my toothbrush into a paired one

Dog Food & Dog Cage: She always wanted a pet dog. I brought home a cute puppy one day, and I couldn’t forget how excited she was: her dream finally came true.

Iron & Suit: First day of work after my promotion. She ironed my suit for me. Looking forward to our future.

Ring & Ring Case: I saved all my money for this ring, prepared for a month for this. Today is finally the day.

Passport & flight ticket: Leaving the country tomorrow. I guess I will have to put this memory in the box and leave it behind…

Female Version:

Movie Ticket & Popcorn Box: He finally asked me to hang out. I’ve been waiting for this for a month. He’s so shy but so cute .

Paired Toothbrush & Cup: I was like a kid after I stayed at his place. I paired our toothbrush up.

Dog Food & Dog Cage: Suddenly one day, he brought home a cute puppy. I was so surprised! I always wanted a pet. I guess now we are a family of three.

Iron & Suit: I want to make his first day after promotion special, so I got up early to iron his suit. Looking forward to the bright future of ours

Ring & Ring Case: I said yes.

Passport & flight ticket: His work left us no choice but to lock all these memories up, forever.

I was a bit worried about not being able to find prefabs for passport, flight ticket, movie ticket, ring, and ring case online, and when we looked into the asset store, we really couldn’t find those. So we changed our storyline a bit, based on what we were able to find online.

New Version:

Male Version:

Paired Toothbrush & Cup: A paired toothbrush is the first step to show we are a family, she says.

Candles & Plates:

I finally got the courage to invite her out for dinner.

Candlelight on her cheeks is the cutest thing ever.

Dog Food & Dog Cage: I brought home a new family member that day, and I couldn’t forget how excited she was: her dream finally came true.

Iron & Iron Board:

First day of work after my promotion.  

She ironed my suit in the morning.  

Looking forward to our bright future.

Pen & Notebook:

She used to keep all of our memories in this notebook …

but it’s meaningless now.

Vase & Watering Can:

Roses die. So do our promises….

Eventually it turns out to be a wedding without her…

Female Version:

Paired Toothbrush: Pairing our toothbrushes makes us look more like a family…

Dog Food & Dog Cage: That day he brought home a cute puppy. I was so surprised! I always wanted a pet. I guess now we are a family of three.

Iron & Iron Board: I want to make his first day after promotion special, so I got up early to iron his suit. Looking forward to our bright future …

Candles and Plate: Can’t ask for anything better than a candlelight dinner for the first date.

Pen & Notebook: It’s been a while since he left me… I used to keep a diary every day when we were together…. How stupid I was.

Vase & Watering Can: Roses die. So do our promises…. Eventually it turns out to be a wedding without him…

Project 2 Documentation. Don’t Feed The Plants.

This project was created together with Mai Lootah and Shenuka Corea. Since they worked more on the environment, textures, and animation, I worked on the scripts, and that is mostly what I will discuss.

Description:
This project is a greenhouse on an alien planet where you can see all sorts of alien plants. When you place a seed in the pot and water it, something weird grows. Some plants make sounds, like the one behind you in a cage.
There were two main interactions and some secondary interactions. The first main interaction is tied to the seed: you can pick it up and throw it, but the goal is to put it into the pot. Once the seed is in the pot, you can water it to make it grow. That is the second interaction: tilting the watering jug so that it pours water. This script was a disaster, since it was the first script I had ever written. The secondary interactions are the gardening tools, which can be picked up and thrown.

Implementation:
We started with an empty terrain, placed a greenhouse imported from the Asset Store (it cost $1), and added tree and grass meshes outside. After that we created a workstation, which included some benches from the same greenhouse asset along with the pots. We then added mesh colliders to every part of the workstation and brought in a new asset for the gardening tools. The main camera was set in front of the workstation and consisted of the “Player” from the SteamVR interaction example scene. After Shenuka and Mai set up the environment, I started working on the scripts. The first script makes the particle system attached to the water canister play when it is tilted: it reads transform.eulerAngles (which took me a while to figure out) and plays the particles when the angle is in the desired range. The next script handles collision detection between the seed and the pot; that one was not hard: when the seed is on the soil of the pot, it sets a boolean to true. The next script detects collisions between the water particles and the seed, which works the same way. Next came a counter on the seed that counts the water particles, but only while the seed is in the pot, which is where the seed-and-pot collision script comes in. Once the seed has received enough water, another script named “growth” takes over: the seed begins to shrink and the plant grows. Those are all the scripts created for Project 2.
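
Below is a simplified reconstruction of that logic in two scripts. The angle range, water threshold, and the instant shrink/grow are assumptions; the actual project split this across more scripts and grew the plant gradually.

```csharp
using UnityEngine;

// Plays the water particles only while the jug is tilted into a pouring range.
public class WateringJug : MonoBehaviour
{
    public ParticleSystem waterParticles;

    void Update()
    {
        // eulerAngles wraps around 0-360, so treat a band away from upright as "pouring".
        float z = transform.eulerAngles.z;
        bool pouring = z > 60f && z < 300f; // illustrative range

        if (pouring && !waterParticles.isPlaying) waterParticles.Play();
        else if (!pouring && waterParticles.isPlaying) waterParticles.Stop();
    }
}

// Counts water hits on the seed while it is in the pot, then swaps the seed for the plant.
public class SeedGrowth : MonoBehaviour
{
    public bool inPot;            // set by the seed/pot collision script
    public GameObject plant;      // the grown plant, disabled at start
    public int waterNeeded = 200; // assumed threshold

    private int waterCount;

    // Requires "Send Collision Messages" on the particle system's Collision module.
    void OnParticleCollision(GameObject other)
    {
        if (!inPot) return;
        waterCount++;
        if (waterCount >= waterNeeded)
        {
            transform.localScale = Vector3.zero; // seed shrinks away
            plant.SetActive(true);               // plant appears in its place
        }
    }
}
```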

Reflection:
Our goal was to make a greenhouse with a man-eating plant. You can find a clue that it is dangerous if you look behind you (the same plant sits there in a cage). As can be seen in the pictures in the blog post about our expectations before the project started, we met those expectations. The environment was made well enough that you understand you are actually on an alien planet.

Storyboard before the project was created
Expectation before the project was created
First view of the greenhouse
Upgraded view of the greenhouse
The inside of the greenhouse
implementing the water pouring when tilted
Final space

Project 1 Documentation. Sunset Valley.

This project was made to show an environment that could possibly be perfect. The environment I set as a goal was a sunset valley.

Description:
I created a mountain area with trees. The main camera was set at the top of a mountain, not the highest one, but still well above ground level. From that point of view you can see a lovely sunset and some trees below you.

Process and implementation:
Using the Terrain tool, I raised the terrain in some areas to make it look like mountains, and using the brush tool I added tree and grass meshes. After that I added some filters to the main camera to make the sunset look even better, and made the camera move using the “Smooth Mouse Look” script. This made a wonderful representation of the sunset valley.

Reflection:
This place is built to immerse a person in a nice setting where you can clear your thoughts and just relax. I have personally never seen a place with such a sunset before, but now I have a goal of finding somewhere in real life that looks very similar.

Project 1 Documentation

Project 1 Development Blog Link: http://alternaterealities.nyuad.im/2019/02/11/project-1-development-blog-5/

In this project, the goal I set out to accomplish was to create a peaceful environment. Even though the implementation differed in many aspects from my initial idea for the project, the overarching goal of creating a peaceful environment was definitely accomplished. As such, I will use this piece to talk about the similarities and differences between my project’s ideation and its actual implementation, as well as the process I went through to get from the former to the latter.

Differences

If you look at my first entry in the development blog (link above), my initial idea had the user inside the tent. Even though I could have accomplished this in my project, I decided that placing the user outside the tent created a more meaningful impact, given that the user can enjoy more of the scenery thanks to the 360-degree view of the natural landscape I created.

I also didn’t add the snack assets to my project. I could not find any of the assets I wanted, so instead I placed camping tools and wood logs all over my scene. Cutting wood logs was also something I used to do quite often on my camping trips, so this turned out to be a really good alternative.

Similarities

All in all, I can say that the tent part of my idea was accomplished effectively. However, I didn’t expect to get so invested in designing the natural landscape. I spent more than 70% of my time placing trees and playing around with the mountain assets. I really liked this portion of my project, and it made me retrospectively analyze the importance I placed on nature. Whenever I went camping, I thought that the most enjoyable part of the experience was spending time with my friends and that the landscape/scenery was an added bonus. After doing this project, I now know that the refreshing look of nature is more important than I initially perceived it to be, and I hope to value it more as I go on more camping trips.

Implementation:

As stated in the development blog, the design of the scenery was not the most strenuous part of the process. The most time-consuming part was making the executable file. However, looking at my project in a Google Cardboard kit was worth it, as it gave me a newfound appreciation of the scene I created. Also, the Terrain object proved really difficult to alter in the version of Unity I had installed on my computer. As such, I had to use a cube for my floor and a mountain asset to fill the user’s distant view with mountain tops.

Project 2 Documentation

What I have learned from this project

I have learned to build bigger environments more effectively. Before, it was hard for me to plan out the environment and build it, but this time I was able to see the environment three-dimensionally, which made it definitely easier to build.

Limitations and Reflection

The project could be improved with these elements in the future:

  • More complex objects
  • Having the trash react angrily if you pass without picking up
  • Having a message indicate the right place to recycle something if you don’t put it in the correct bin
  • Developing the environment and interaction system more in tandem with each other

Moreover, we should have had more conversations about interactions and the environment. We faced some hardships when it came to adding interactions because we built the environment separately from interactions.

What I want to work on in my next project

I would like to work on the interaction part, and while doing so, also work on the environment to make sure it is scaled properly. I want the user to feel like they are actually part of the environment, at the right height and the right scale.

Here is the link to my presentation

Project #2 Documentation

Project Description

For project #2, our group decided to build an experience that tackles the problem of sustainability on campus. We wanted to base our scene on campus, with trash on the ground. In the real world, if someone passes by trash and ignores it, there are no consequences. Besides, people tend to have the mindset that someone else will act on it. We wanted to raise awareness within the NYUAD community by creating an alternate reality where, if people walk by a piece of trash without picking it up, they receive negative feedback indicating that they are not acting properly.

Besides, because of the diversity of the community, there isn’t a shared standard for recycling that everyone agrees upon. Having always been rather ignorant about the environment, I really get confused when I throw away an empty Leban bottle: should I put it in general waste or plastics? The bottle is definitely recyclable, but only after I clean it. Recycling can be extremely complicated: I still remember being shocked when the RA told us that we should recycle the lid of a Starbucks cup but throw the paper cup into general waste. By creating an educational environment that mimics what actually happens on campus, we hope to teach people how to recycle in an entertaining way. Through repeated interaction within our scene, users might come to perceive recycling as less burdensome as they become more familiar with it.

Process and Implementation

The tasks were divided up: Ju Hee and Lauren were in charge of the environment, while Simran and I were in charge of the interaction. After the environment was created, our scene looked like this:



When Simran and I started to work on the interaction with trash in our environment, we found a lot of problems with the environment. First, because we failed to set up our VR station when we first started the project, we didn’t have a sense of the size of our VR space and how it is reflected in Unity. If I had figured out that we needed to set up the VR station before Lauren and Ju Hee started to build the environment, we could have saved a lot of time and energy rescaling the space. The environment was too large, so users’ movements were not significant: users couldn’t really tell that they were moving inside the environment. So we decided to add teleporting. We divided our tasks: I would be mainly in charge of the teleporting and Simran would focus on the interactions, but we helped each other out throughout the process.

I went through several tutorials to understand how teleporting in SteamVR works in Unity. Here are the links to the tutorials: https://unity3d.college/2016/04/29/getting-started-steamvr/

https://vincentkok.net/2018/03/20/unity-steamvr-basics-setting-up/

At first, I decided to place teleport points next to each piece of trash so that users could easily reach the trash by aiming at the right teleport point. Then I realized that, since we have such a huge space, users would never be able to go to areas with no trash, so I thought it would be nice to make the whole space teleportable: users should be free to move in our space, and they also have the choice of going directly to the trash and completing the training if they are not interested in exploring our VR campus.

Adding the Teleporting object to the scene, setting up the teleport points in the environment, and attaching the TeleportArea script to the ground were easy. However, it became frustrating when we had to figure out the scale and the position of our camera. The environment was built in a way that the ground was not at position (0, 0, 0), and the objects were not tightly attached to the ground. So when we teleported, we got teleported beneath the buildings.

At first I tried to change the y-position of the camera so that we could actually see everything, but after raising the camera we could no longer see our controllers because they were so far away. Then I tried to raise the y-position of the Player, but we were still teleported to a place below the ground. Then I figured that, instead of making the ground teleportable, I could create a separate teleportable plane and raise it a little bit. By doing that, I fixed the problem.
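
In practice this was all set up in the editor, but as a rough sketch the workaround boils down to something like the following; the offset and size are assumed values, and TeleportArea comes from the SteamVR Interaction System:

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Creates a slightly raised, invisible plane that carries the TeleportArea
// component, instead of making the uneven ground itself teleportable.
public class TeleportPlaneSetup : MonoBehaviour
{
    void Start()
    {
        GameObject plane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        plane.transform.position = new Vector3(0f, 0.2f, 0f);    // a bit above the ground (assumed offset)
        plane.transform.localScale = new Vector3(50f, 1f, 50f);  // large enough to cover the campus (assumed)
        plane.AddComponent<TeleportArea>();
        plane.GetComponent<MeshRenderer>().enabled = false;      // hide the plane; the teleport highlight still shows
    }
}
```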

I also doubled the scale of everything so that the proportions looked right. Then we found several problems when we viewed the environment through the headset. First, the buildings, or parts of them, disappeared when we looked at them.

Then I figured out that the camera’s near and far clipping distances should be adjusted according to the scale of our environment.
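
These correspond to the camera’s clipping planes, which can be changed in the Inspector or from a script like the sketch below; the values here are just examples for a scaled-up scene:

```csharp
using UnityEngine;

// Pushes the far plane out so large, distant buildings stop being culled,
// while keeping the near plane close enough that the controllers remain visible.
public class FixClippingPlanes : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        cam.nearClipPlane = 0.1f;   // example value
        cam.farClipPlane = 2000f;   // example value for the doubled-scale campus
    }
}
```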

Another problem we encountered was how to get close to the trash. Because our scene is at such a huge scale, we could not even touch the trash lying on the ground because it was so far away, so we decided to have the trash float in the air, at approximately the same level as the teleport plane, so that users could grab it with the controllers. However, if we simply disabled the gravity of the trash, it would fly away.

But if we enabled gravity and kinematics at the same time, the trash wouldn’t be throwable: it couldn’t be dropped into the trash bin. So I searched online for the correct settings for the Throwable script in SteamVR and also asked Vivian how her group did it. In order to make it work properly, we have to set “Use Gravity” to true and “Is Kinematic” to false on the Rigidbody. Then, on the Throwable script, we need to select “DetachFromOtherHand”, “ParentToHand”, and “TurnOffGravity” for the attachment flags.
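
These are normally ticked in the Inspector, but as a sketch, the same settings could be applied from code roughly like this (assuming the SteamVR Interaction System’s Throwable component exposes an attachmentFlags field, as in the plugin version we used):

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Applies the Rigidbody and Throwable settings described above to a piece of trash.
public class TrashSetup : MonoBehaviour
{
    void Awake()
    {
        Rigidbody rb = GetComponent<Rigidbody>();
        rb.useGravity = true;    // "Use Gravity" on
        rb.isKinematic = false;  // "Is Kinematic" off

        Throwable throwable = GetComponent<Throwable>();
        throwable.attachmentFlags = Hand.AttachmentFlags.DetachFromOtherHand |
                                    Hand.AttachmentFlags.ParentToHand |
                                    Hand.AttachmentFlags.TurnOffGravity;
    }
}
```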

I also added the ambient sound to the scene, created the sound objects for positive and negative feedback, set up the sound script, and attached it properly to each sound object.

Reflection/Evaluation

One of the takeaways from this project is that in a VR experience, the scene and the interaction cannot and should not be separated. After dividing the tasks up, Simran and I did not really communicate with Lauren and Ju Hee. We then took over an already-made environment that was extremely large, and the objects in the scene were somewhat off scale. We spent a lot of time fixing the scale of everything, and I felt really bad about not communicating with them beforehand. We could have saved a lot of time.

Another thing I should bear in mind for future projects is that we should never ignore the fact that the hardware might go down. We almost ran out of time when creating the interactions because the sensors kept disconnecting from each other and the controllers kept disappearing from the scene. We should have planned everything ahead rather than leaving it all to the last minute.

Overall, I enjoyed the process of learning from peers and from obstacles, and our project turned out nicely: we didn’t expect users to be so engaged in our game or to have so much fun throwing trash.