Categories
Year2

2023 VR research lab with BA Sound Arts

Group 3

Virtual Experience Developer: Ronger Huang & Clara Childerley Garcia

Sound Arts Designer:

Juice Shuting Cui

Maria Grigoriu

Rysia Kaczmar

Benjamin Thorn

Hanifa Uddin

Jacob Lyttle

Week 1 Unit Introduction and Brainstorming

A rough central theme of Surrealism was agreed on, with very little disagreement, at our first in-person meeting, held a week after the unit introduction. All subsequent brainstorming was based on this central topic, relating to fantastical, unreal and creepy VR visuals and sound art.

Breton’s definition of surrealism (more automatic/an exploration of lucidity and dreams) is: “Psychic automatism in its pure state, by which one proposes to express—verbally, by means of the written word, or in any other manner—the actual functioning of thought. Dictated by thought, in the absence of any control exercised by reason, exempt from any aesthetic or moral concern.”

Magritte’s surrealism is more representational. René Magritte described his paintings as “visible images which conceal nothing; they evoke mystery and, indeed, when one sees one of my pictures, one asks oneself this simple question, ‘What does that mean?’. It does not mean anything, because mystery means nothing either, it is unknowable.”

Surrealism is form following function – the function being to contradict the constraints of perceived reality.

Surrealism is a cultural movement that developed in Europe in the aftermath of World War I in which artists depicted unnerving, illogical scenes and developed techniques to allow the unconscious mind to express itself.[1] Its aim was, according to leader André Breton, to “resolve the previously contradictory conditions of dream and reality into an absolute reality, a super-reality”, or surreality.[2][3][4] It produced works of painting, writing, theatre, filmmaking, photography, and other media.

“the mash-up of cuteness and darkness is the central theme to Madoka, and Kyubey is an epitome of that theme”. A central goal in Urobuchi’s writing was to highlight the moral and ethical dissonance between Kyubey and the young middle school girls, done through actions like Kyubey eating its own corpse to recycle energy. He compared the character to monsters in the works of horror fiction author H. P. Lovecraft, commenting on Kyubey: “he isn’t evil, it is his lack of feelings that make him scary”.

Surrealism aims to revolutionise the human experience. It balances a rational vision of life with one that asserts the power of the unconscious and dreams. The movement’s artists find magic and strange beauty in the unexpected, uncanny, disregarded, and unconventional. At the core of their work is the willingness to challenge imposed values and norms, and a search for freedom.

This weird, creepy style of animation from Madoka was the very first thing that came to my mind. This particular final monster refers to a girl who wanted to save her friend; the designer was keen to build a broken image of a lost girl who finally went mad. The queen monster is sad, upset and mostly lost, and is led by a group of little creatures acting as her dark inner side. These toys, so to speak, are drawn in a cute style that would only be seen in a children’s storybook, yet they cut things apart with smiling faces, which may be why it looks so creepy; even the bloody moments are painted in a childlike style.

“Dark fairy tale” became the best guide for researching the theme at the time, so I collected more material under the keyword “creepy collage art” to find out what kinds of strange feelings the contrast between 2D and 3D animation can create. A group of girl soldiers wearing dresses grabbed my attention, and it felt even more uncomfortable to see so many repeated characters appearing continuously; the repetition is slightly irritating, and that is exactly what generates the feeling of fear. So this time, repetition would be the method used to create a weird gaming environment.

For further research on themes and level designs, I focused on “collage art”, “creepy models”, “endless environments”, and “the contrast between real and unreal”. Searching online, I found several examples that best explained my understanding, including artworks by Korean artists, Japanese animators, and industrial design showcases. The image of having a monster in each level was more or less settled in my mind: an interactable life-like talking face in a cyberpunk room; several elders playing with collaged animals; a huge blue bear in an abandoned construction area, and so on.

After the idea-sharing process in our first week and a half, the initial design was changed a lot to make it achievable: adding lighter colours, finding objects that are more abstract, linking the surreal idea into the design, and so on.

Alice in Wonderland

From its first appearance in 1865, Alice’s Adventures in Wonderland was devised as a visual work as much as a text. In the first room of the exhibition, we can marvel at the original manuscript by Charles Lutwidge Dodgson (who adopted Carroll as a pen name) with his own sketches of characters, plus learn how the Oxford maths tutor recruited John Tenniel as the illustrator for its original publication.

What’s it all about? Imagination, of course—but there’s more to this “literary nonsense.” Just like kids’ adventures in real life, Alice’s adventures in Wonderland help her work through awkwardness, confusion, and sadness to get where she’s going. Alice in Wonderland is a story that can’t be forgotten; it is so inspiring that many people have drawn on it for everything from artworks to interesting DIY crafts.

From this starting point, the main theme was more or less decided: a fantastical wonderland containing several imaginative experiences that take the player to another environment. It is a VR sound experience where everything happens like a wonderful dream. A sun-drenched attic may be an unfamiliar space, or it may evoke a certain moment in your memory. It is like an unreal corridor into an unknown universe, where you move between fantasy and reality and no longer know what is real and what is not. You may catch a glimpse of an interesting story on the way, or come across a mystery, but all you have to do is return to that initial space and to a memory that may not be real. A whale sealed in a painting, a music box that leads to another world, a memorable book and a faintly working vintage computer will be the gateways to fantasy, leading you to infinite possibilities. There is never an end: after you return to this attic countless times, it will become your ‘reality’ and your ‘place of belonging’.

Week 1.5 Narrative Buildup

Narrative type

Storytelling formats broadly fall into two types: linear and non-linear.

a linear environment could be seen as beneficial as mentioned above (like one long hallway)

could also be more nodal

Within the “telling a story” element, another approach could be environmental storytelling, where the user moves through an open space, creating their own narrative through interaction with objects and the environment.

in writing about this, I’m more referring to a space that the user/player moves through, as opposed to a ‘rollercoaster’ experience (I guess the pure experiential?) where the user is mostly stationary and the environment happens around them.

There are still elements that could be interesting with this; the film A Ghost Story by David Lowery is set in a house that moves through time (based on the perception of the camera/main character)

could also explore a matryoshka/Russian doll kind of idea where the user’s surroundings gradually expand (like walls crumbling down to reveal a larger space).

The final result of the narrative discussion is that we will use the second option, non-linear: the player can choose their own start and ending, which may differ from person to person and lead to different endings. The simple idea is a main space in which the player chooses what to interact with and returns to, with several interactable objects acting as transportation points to the other rooms.

Functions follow form

This refers to a game/experience designed around a specific style: when the form/style comes first, the function is designed in accordance with the form. By contrast, “form follows function” means that, given a space whose purpose is to serve as an office, it should be designed in a manner appropriate for an office: modular, spacious and efficient. Our project is based on “function follows form”, where the style of Surrealism is decided before the logical design flow.

Storyboard & Storyline

Imagining an attic room as the start of the whole game, I designed a playable space as the introduction and put forward the idea of a creepy but surreal talking face appearing at the attic window to make the atmosphere feel stranger (shown on the second board). A simple scene transition was also made for the music box that leads the player to the dancehall (shown on the first board).

The player would be able to interact with four objects in the bedroom. The objects would encourage interaction through sound. 

The interaction involves the player shrinking (mechanics to be discussed) to the object’s scale and entering the world attached to it. (transitioning from the macro of the hub to the micro of the objects)

The interactive objects were chosen for their ability to create virtual worlds.

Environmental Storytelling/Exploration

Each location has different aesthetics, and ‘rules’ of sound.

This leads to five different storyboards, one for the story happening in each scene. For example, the main scene has its own storyline with four possible progressions; the music box (aka dancehall) has a specific line built around finding puzzles around the scene; and the painting scene basically progresses around a whalebone, with instructions placed in the environment.

The storyboards shown below are for the music box and the main attic-room scene, with a description of each story progression listed.

The game flow and story are presented to the player in this way: a music box, painting, computer and book each act as an open door to a world the player can go and explore.

Exploring the imaginary worlds/virtuality of objects. Interacting with an object in the bedroom enters its world.

Initially, we were going with a more representational surrealist style, but it’s a bit closer to magic realism.

A Miro board that facilitates collaborative brainstorming and idea generation can greatly enhance group development in game scene designs. By providing a shared digital canvas, it allows every team member, including VR/game designers and sound art designers, to contribute their imaginations and ideas to each game scene design. This open and inclusive approach encourages diverse perspectives and fosters creativity within the group.

With a Miro board, team members can visually express their ideas by adding images, sketches, and text annotations to specific game scenes. This enables a free-flowing exchange of concepts and stimulates a wide range of possibilities. Different team members may have unique insights or expertise that can contribute to the overall design, and the Miro board allows for the seamless integration of these diverse ideas.

The interactive nature of a Miro board also encourages collaboration and builds upon the suggestions made by others. Team members can build upon existing concepts, add their own elements, and create a rich tapestry of ideas. Discussions can take place directly on the board, fostering real-time communication and enabling the refinement of ideas through constructive feedback and iteration.

Overall, a Miro board empowers the team to collectively generate a variety of ideas, leveraging the strengths and perspectives of each member. It encourages a collaborative and inclusive environment that leads to more innovative and comprehensive game scene designs, ultimately enhancing the quality of the final product.

Team roles

Collaboration between virtual reality (VR) or game designers and sound art designers is crucial to creating immersive and engaging experiences. Establishing an effective workflow between these two teams can greatly enhance the quality and coherence of the final product. A workflow chart serves as a visual representation of the process, outlining the various stages, tasks, and dependencies involved in the collaborative effort.

The workflow between VR/game designers and sound art designers typically begins with concept development. Both teams come together to brainstorm and discuss the overall vision, theme, and aesthetic of the project. This initial stage is important for aligning their creative ideas and ensuring a shared understanding of the desired outcome.

Once the concept is established, the workflow moves into the design phase. VR/game designers focus on creating the virtual environment, designing characters, and mapping out gameplay mechanics. Simultaneously, sound art designers start crafting audio assets such as music, sound effects, and ambient sounds. Continuous communication is essential during this phase, as the sound designers need to understand the specific requirements and timing of the visuals, while the VR/game designers should provide feedback and guidance to ensure the audio complements the immersive experience.

After the design phase, the workflow progresses to implementation. The VR/game designers integrate the visual assets and mechanics into the virtual environment, while the sound art designers work on integrating the audio elements. This stage requires close collaboration to synchronize visuals with corresponding sounds, ensuring a seamless and immersive user experience. Regular meetings and iterative testing help identify and resolve any issues or discrepancies between the audio and visual components.

Once the implementation is complete, the workflow transitions into the refinement and polish stage. Both teams work together to fine-tune the details, optimize performance, and enhance the overall audio-visual experience. This stage often involves multiple iterations and feedback loops to achieve the desired level of quality.

A workflow chart acts as a roadmap for this collaborative process. It provides a clear overview of the sequential steps involved and helps manage the workflow efficiently. The chart can include milestones, deadlines, and designated responsibilities for each team member, ensuring that everyone is aware of their role and the project’s progress. By visualizing the workflow, it becomes easier to identify bottlenecks, allocate resources effectively, and maintain effective communication between VR/game designers and sound art designers.

In conclusion, establishing a well-defined workflow and utilizing a workflow chart is essential for effective collaboration between VR/game designers and sound art designers. It promotes clear communication, ensures alignment of creative ideas, and helps manage the project efficiently, resulting in a cohesive and immersive virtual reality or game experience.

Mechanics Overview

Interactive objects — This will be applied to each object in the starting room to increase the “exploring” factor

“Shrinking” effects — In some special cases, the surrounding environment objects are made larger, or the player’s own perspective is scaled proportionally smaller, to achieve an effect similar to “Alice in Wonderland” (see the sketch at the end of this overview)

Triggering conditions — This refers to special scenes or perspective changes that appear when the player meets certain conditions or interacts with certain objects

Scene transitions — The cinematic transition effect

Special effects (environmental) — In terms of “surrealism”, several special processes are used to beautify the rendering of the environment (e.g. particles, reflections, realistic lights)

3D painting effect — A popular presentation of 3D artworks (pictures or paintings) that combines a 2D plane with 3D content.

Face capture — A Unity program that allows detailed 3D facial expressions and auto-generated facial animations.

A list of mechanic functions was drawn up after sharing that basic vision with the sound art students. For my two scenes it basically consists of password input and puzzle recognition: the password belongs to the painting scene, and the puzzle pattern belongs to the music box scene.
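The “shrinking” effect mentioned above can be sketched in a simple way. The snippet below is only an illustration of the approach, not the project’s actual implementation: it assumes the XR Origin’s root transform is assigned in the Inspector and scales it down over a couple of seconds, so the surrounding room appears to grow around the player.

using System.Collections;
using UnityEngine;

// Illustrative sketch of the "Alice in Wonderland" shrink: scaling the XR rig
// root down makes the environment feel proportionally larger. "rigRoot",
// "targetScale" and "duration" are assumed names, not the project's own.
public class PlayerShrink : MonoBehaviour
{
    public Transform rigRoot;          // XR Origin root transform (assigned in the Inspector)
    public float targetScale = 0.1f;   // final uniform scale of the rig
    public float duration = 2f;        // seconds the shrink takes

    private void OnTriggerEnter(Collider other)
    {
        // Same pattern as the other trigger areas in the project: react to the player only.
        if (other.CompareTag("Player"))
        {
            StartCoroutine(ShrinkRoutine());
        }
    }

    private IEnumerator ShrinkRoutine()
    {
        Vector3 start = rigRoot.localScale;
        Vector3 end = Vector3.one * targetScale;
        float t = 0f;

        while (t < duration)
        {
            t += Time.deltaTime;
            rigRoot.localScale = Vector3.Lerp(start, end, t / duration);
            yield return null;   // wait one frame between scale steps
        }
        rigRoot.localScale = end;
    }
}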

Main Scene

Design & Modeling

This is the place where everything starts: an old attic that might be familiar to everyone. It is a magical space where the player does not know what will happen when they touch anything or get close to something; it is an ordinary rooftop room full of dust and unused stuff, but it can also be the doorway to another universe.

The earliest step in a well-thought-out design is to visualise the design in the mind of each group member through picture references, to gradually and thoroughly understand each member’s ideas, to highlight the parts of the design that could be improved and implemented from the perspective of the actual designer, and to put them together for realisation and adjustment. In this process there is, of course, no shortage of trial and error, including problems such as poor models, inappropriate materials, and imagery that is too abstract for modelling purposes. As the designer and operator, I take on the responsibility of communicating with and correcting members; if a member’s idea is too difficult to realise, I and the rest of the group have to explain why and pass the task on. I also make sure that the other person’s wishes and feelings are taken into account, and that we do not simply reject or deny ideas but include compliments and acknowledgements in the dialogue as appropriate. This is particularly important for me as the designer, because all of the designs, realistic and unrealistic, are presented and visualised by me alone, so there is no group meeting during the whole project that I can afford to miss.

Billed as a “story exploration game,” Gone Home has users exploring an empty house and piecing together why no one is home.

My imagination was more dominant in the design and realisation of the main scene, so I had to communicate more frequently with the group. The space was originally not an old attic but a child’s bedroom, intended to realise the idea of designing it as a ‘cradle of memories’. The story of the otherworld would start in a cosy but old room, and the idea is to guide the player to explore this strange yet familiar space and find objects to interact with, so the area needed to be familiar to everyone. However, a boy’s room would be mainly blue and grey, with personal elements such as basketball games, while a girl’s room would generally be pink and white, with more personal items such as cute dolls. So I decided to talk to the group about changing the bedroom to a loft that every family would have. An attic is a space that is inherently old and stereotypically evocative, and at the same time it can hold any object that seems impractical, such as a discarded computer, an old music box or even an unused tricycle, all of which seem logical in an attic.

Environment

This particular attic model, named “Grandpa’s Attic,” was obtained from the Unreal Engine assets store. Its pre-existing design saved me valuable time, as I didn’t have to engage in extensive modelling work during the environment-building process. However, finding a suitable asset proved to be a challenge. The images showcased the model in a way that didn’t align with the desired theme for the main room. While the original model featured elements related to “life,” “bedroom,” and “attic rooftop,” its style and environment presentation didn’t precisely reflect the “familiar to everyone” setting we intended to create. Additionally, the assortment of items within the model, such as snacks, unwashed dishes, and tobacco cases, didn’t fit the concept of childlike objects, including abandoned toys, books, and children’s artwork that we were aiming for. Notably, the presence of a motorcycle in the corner further deviated from our vision.

Consequently, we made the decision to change assets, recognizing the need for a more appropriate model. However, finding a detailed model that fulfilled our specific requirements proved to be a challenging task, necessitating extensive Internet research. After selecting a suitable asset from the Unreal Engine asset library and successfully acquiring the package, our focus shifted towards transferring the assets from Unreal Engine to Unity, which presented its own set of difficulties. I had to rely on tutorials to learn how to import the prefab asset folder into Unreal Engine and ensure that each material was appropriately aligned, and ready for export. Working with prefab folders in UE5, while assigning all the necessary materials, proved to be a complex task during that period, as I was still unfamiliar with the process.

Overall, the process of sourcing and implementing the appropriate assets required careful consideration and effort to align with our project’s professional standards.

These are the interactive objects placed in the main scene: a painting, a music box, a computer, etc.

Mechanics

The mechanics research for this scene was quiet at the very start; the main goal was to make the scene transitions feel like a small scene in themselves for everyone. The original idea was to have an animation triggered by the music box, so that once the player picked it up the animation would play, and a stencil shader for the painting, to make a life-like picture on the wall that the player could move closer to.

Music Box

The trigger setup for the music box is basically built around an animation on it, so the very first step was obviously to animate a music box.

So the very first thing to do was to find a model for the music box and animate it.

The needs

-Animated music box that can be triggered by the player’s hand

-A melody that plays at the same time as the “rolling” animation

-A scene transition after the animation is played

using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneTransition : MonoBehaviour
{
    public string SceneName;
    public bool SceneHasTriggered = false;
    public GameObject ItemToDestory;

    public WorldManager WM;
    public Animator FadeAnim;

    public GameObject WhalecCard;

    public Transform Player;
    public Transform Target;
    public float distance;

    public Vector3 dirToTarget;

    //yield return new WaitForSeconds(fadeScreen.fadeDuration);
    //}

    private void Start()
    {

        WM = GameObject.Find("WM").GetComponent<WorldManager>();

        //check world manager script, is it 1st time visit? or 2nd
        //if 2nd time, destroy me

        WM.TriggerArea1 = transform.gameObject;
        WM.ST_Script = transform.gameObject.GetComponent<SceneTransition>();

        if (WM.LevelState == "MB" && WM.LevelPlayed == 2)
        {
            Destroy(this.gameObject);
            WhalecCard.SetActive(true);

        }

        if (WM.LevelState == "PT" )
        {
            Destroy(this.gameObject);
            WhalecCard.SetActive(true);
        }
    }


    // Start is called before the first frame update
    void OnTriggerEnter(Collider other)
    {
        if (!SceneHasTriggered && other.tag == "Player")
        {
            //FadeAnim.SetBool("SceneChanging", true);
            SceneHasTriggered = true;
            WM.LevelState = "PT";
            WM.LevelPlayed += 1;
            SceneManager.LoadScene(SceneName);
            Debug.Log("Scene loaded");
           
            //change world manager state, +1 visit
            //Destroy(ItemToDestory);
        }
    }

    public void Update()
    {
        distance = Vector3.Distance(Player.position, Target.position);
        if(distance < 2.2f)
        {
            FadeAnim.SetBool("SceneChanging", true);
        }
      
    }

}
using UnityEngine;
using UnityEngine.SceneManagement;

public class Musicbox_Animation : MonoBehaviour
{
    
    //public XROrigin XROrigin;
    public Animator MusicBoxAnim;
    public bool MscBoxhasAnimated = false;
    public GameObject ItemToDestroy;
    public Animator FAnim;

    public GameObject MusicCard;

    public WorldManager WM;

    void Start()
    {
        WM = GameObject.Find("WM").GetComponent<WorldManager>();

        WM.MusicboxOpening = transform.gameObject;
        WM.MA_Script = gameObject.GetComponent<Musicbox_Animation>();

        if (WM.LevelState == "MB" || WM.LevelPlayed == 2)
        {
            Destroy(this.gameObject);
            MusicCard.SetActive(true);
        }
    }


    private void OnTriggerEnter(Collider other)
    {
        if (!MscBoxhasAnimated && other.tag == "Player")
        {
            MusicBoxAnim.SetBool("BoxOpen", true);
        }
    }

    public void OnSceneFade()
    {
        FAnim.SetBool("SceneChanging", true);
    }

    public void OnMusicBoxAnimationEnd()
    {
        MscBoxhasAnimated = true;
        WM.LevelState = "MB";
        WM.LevelPlayed += 1;
        SceneManager.LoadScene("DanceHall");
        Debug.Log("Scene loaded");
    }


    // Update is called once per frame
    //void Update()
    //{
    //    if(MscBoxhasAnimated == true)
    //    {
    //        Destroy(ItemToDestroy);
    //        MusicBoxAnim.SetBool("BoxOpen", false);
    //    }
    //}

}

The animation was made in Maya by simply adding and playing keyframes on the music box cover, with a rolling melody-playing animation following after the box is opened. Both movements were made inside the same animation clip instead of separate ones. This is because the script controlling the animation play point depends on the player’s triggering action, recognised by OnTriggerEnter on an invisible trigger area: if I used two different animation clips, it would be hard for a single trigger point to know which of them should play, and if I put two different trigger areas on the same object, it would be hard for the controller to interact with exactly one of them in the correct order, where the cover opening should happen before the melody-playing animation in a logical sequence.

As a result of this consideration, I chose to put the two animations in one animation clip, triggered by only one player action; this is much more achievable and makes more sense at this point.

Several declarations related to the World Manager also have to be added to the “Musicbox_Animation” script, so that it can recognise whether the player has entered the music box scene before, destroy the music box’s physical object, and generate a musical-note wooden puzzle that lets the player exit the game.

using UnityEngine;

public class MusicBoxPlay : MonoBehaviour
{
    public AudioClip Musicbox;
    public AudioSource AudioSource;
    public AudioClip CollideSound;

    // Start is called before the first frame update
    void Start()
    {
       
    }

    private void OnCollisionEnter(Collision collision)
    {
        AudioSource.PlayOneShot(CollideSound);
    }

    // Update is called once per frame
    void Update()
    {
        
    }

    void MusicboxSound()
    {
        AudioSource.PlayOneShot(Musicbox);
    }
}
public class InstrumentPLay : MonoBehaviour
{
    public AudioSource AudioSource;
    public AudioClip[] GuitarClips;



    // Start is called before the first frame update
    void OnTriggerEnter(Collider other)
    {
        if (other.tag == "Player")
        {
            PlayRandomGuitarSound();
        }
    }

    void PlayRandomGuitarSound()
    {
        int randomIndex = Random.Range(0, GuitarClips.Length); // Generate a random index within the array length
        AudioClip randomClip = GuitarClips[randomIndex];

        AudioSource.PlayOneShot(randomClip);
    }
}
public class CollisionSound : MonoBehaviour
{
    public AudioSource AudioSource;
    public AudioClip Clip;

    // Start is called before the first frame update
    void Start()
    {
        
    }

    private void OnCollisionEnter()
    {
        AudioSource.PlayOneShot(Clip);
    }

    // Update is called once per frame
    void Update()
    {
        
    }
}

These small scripts were used to make several interactable sounds in the main scene to immerse the player. The physical interaction is recognised in the same way as the other trigger areas: an invisible area reads the player’s contact via OnTriggerEnter, and an AudioSource plays one shot after the interaction. The collision sound script is used to make sounds when toys or other interactable objects are dropped on the ground or collide with each other; in this case the collider does not need to be tagged as “Player”, it responds to everything that can physically collide with it.

using UnityEngine;

public class Follow : MonoBehaviour
{

    // Path (waypoint circuit) script
    [SerializeField]
    private WaypointCircuit circuit;
    

    //distance moved along the route
    public float dis;
    //movement speed
    public float speed;
    public bool ActivateMove;  //this activate the cat movement to next point

    public string CatPos;
    public Animator CatAnim;

    public AudioSource AudioSource;
    public AudioClip CatCry1;
    public AudioClip CatSpeak;
    public AudioClip CatPur;
    public AudioClip CatStep;

    public GameObject ItemToDestroy;

    public ActivateCat AC_Script;

    public WorldManager WM;

    // Use this for initialization
    void Start()
    {
        CatPos = "Cat_At_A";
        WM = GameObject.Find("WM").GetComponent<WorldManager>();

        WM.cat_model = transform.gameObject;
        WM.Cat_Script = gameObject.GetComponent<Follow>();

        if (WM.LevelState.Length == 2)
        {
            ActivateMove = true;
            CatAnim.SetBool("Run", true);
        }

        else {
            dis = 0;
        }
            
        //speed = 2;
    }

    void Update()
    {

        if (ActivateMove == true)
        {
            CatAnim.SetBool("Run", true);
            //AudioSource.PlayOneShot(CatStep);
            //accumulate the distance travelled
            dis += Time.deltaTime * speed;
            //get the position on the route at this distance
            transform.position = circuit.GetRoutePoint(dis).position;
            //get the direction on the route at this distance
            transform.rotation = Quaternion.LookRotation(circuit.GetRoutePoint(dis).direction);
            speed = 2;

            if (WM.LevelState.Length > 0)
            {
                Destroy(ItemToDestroy);
            }
        }

        if (ActivateMove == false)
        {

        }
    }



    public void Intro()
    {
        Debug.Log("Point A triggered");
        Destroy(ItemToDestroy);
        ActivateMove = false;
        CatAnim.SetBool("CatSpeak", true);
        //AudioSource.PlayOneShot(CatSpeak);
    }

    void CatSpeakStart()
    {
        AudioSource.PlayOneShot(CatSpeak);
    }

    public void OnCatSpeakEnd()
    {
        ActivateMove = true;
        CatAnim.SetBool("Run", true);
    }


    void OnTriggerEnter(Collider other)
    {

        if (other.tag == "PointB") //a Pole loop
        {
            CatAnim.SetBool("PointB_Stop", true);
            ActivateMove = false;
            AudioSource.PlayOneShot(CatPur);
            print("PointBTriggered");

            CatPos = "Cat_At_B";
        }

        if (other.tag == "Cat_Toy")
        {
            ActivateMove = true;

            CatAnim.SetBool("Ball_Play", true);
        }

        if (other.tag == "PointD") //a dig
        {
            ActivateMove = false;
            print("PointDTriggered");

            CatAnim.SetBool("PointD_Stop", true);
            AudioSource.PlayOneShot(CatPur);
            CatPos = "Cat_At_D";
        }
    }



    void OnTriggerExit(Collider other)
    {
        if (other.tag == "PointB")
        {
            ActivateMove = true;
            CatAnim.SetBool("PointB_Stop", false);
            CatAnim.SetBool("Ball_Play", false);
            CatPos = "Nowhere";
        }


        if (other.tag == "PointD")
        {
            ActivateMove = true;
            CatAnim.SetBool("PointD_Stop", false);
            CatAnim.SetBool("Ball_Play", false);
            CatPos = "Nowhere";
        }

    }

    //void OnColliderEnter(Collider other)
    //{
    //    if (CatPos == "Cat_At_B")
    //    {
    //        CatAnim.SetBool("Ball_Play", true);
    //    }


    //    if (CatPos == "Cat_At_D")
    //    {
    //        CatAnim.SetBool("Ball_Play", true);
    //    }

    //}




    public void ResetState()
    {
        ActivateMove = true;
        CatAnim.SetBool("Run", true);
        //CatAnim.SetBool("PointB_Stop", false);
        //CatAnim.SetBool("Ball_Play", false);
        CatPos = "Nowhere";
    }


}
using UnityEngine;

[System.Serializable]
public class WaypointList
{
    public WaypointCircuit circuit;
    public Transform[] items = new Transform[0];
}

public struct RoutePoint
{
    public Vector3 position;
    public Vector3 direction;

    public RoutePoint(Vector3 position, Vector3 direction)
    {
        this.position = position;
        this.direction = direction;
    }
}

public class WaypointCircuit : MonoBehaviour
{
    public WaypointList waypointList = new WaypointList();
    [SerializeField] bool smoothRoute = true;
    int numPoints;
    Vector3[] points;
    float[] distances;

    public float editorVisualisationSubsteps = 100;
    public float Length { get; private set; }
    public Transform[] Waypoints { get { return waypointList.items; } }

    //this being here will save GC allocs
    int p0n;
    int p1n;
    int p2n;
    int p3n;

    private float i;
    Vector3 P0;
    Vector3 P1;
    Vector3 P2;
    Vector3 P3;

    // Use this for initialization
    void Awake()
    {
        if (Waypoints.Length > 1)
        {
            CachePositionsAndDistances();
        }
        numPoints = Waypoints.Length;
    }

    public RoutePoint GetRoutePoint(float dist)
    {
        // position and direction
        Vector3 p1 = GetRoutePosition(dist);
        Vector3 p2 = GetRoutePosition(dist + 0.1f);
        Vector3 delta = p2 - p1;
        return new RoutePoint(p1, delta.normalized);
    }

    public Vector3 GetRoutePosition(float dist)
    {
        int point = 0;

        if (Length == 0)
        {
            Length = distances[distances.Length - 1];
        }

        dist = Mathf.Repeat(dist, Length);

        while (distances[point] < dist) { ++point; }


        // get nearest two points, ensuring points wrap-around start & end of circuit
        p1n = ((point - 1) + numPoints) % numPoints;
        p2n = point;

        // found point numbers, now find interpolation value between the two middle points

        i = Mathf.InverseLerp(distances[p1n], distances[p2n], dist);

        if (smoothRoute)
        {
            // smooth catmull-rom calculation between the two relevant points



            // get indices for the surrounding 2 points, because
            // four points are required by the catmull-rom function
            p0n = ((point - 2) + numPoints) % numPoints;
            p3n = (point + 1) % numPoints;

            // 2nd point may have been the 'last' point - a dupe of the first,
            // (to give a value of max track distance instead of zero)
            // but now it must be wrapped back to zero if that was the case.
            p2n = p2n % numPoints;

            P0 = points[p0n];
            P1 = points[p1n];
            P2 = points[p2n];
            P3 = points[p3n];

            return CatmullRom(P0, P1, P2, P3, i);

        }
        else
        {

            // simple linear lerp between the two points:

            p1n = ((point - 1) + numPoints) % numPoints;
            p2n = point;

            return Vector3.Lerp(points[p1n], points[p2n], i);
        }
    }

    Vector3 CatmullRom(Vector3 _P0, Vector3 _P1, Vector3 _P2, Vector3 _P3, float _i)
    {
        // comments are no use here... it's the catmull-rom equation.
        // Un-magic this, lord vector!
        return 0.5f * ((2 * _P1) + (-_P0 + _P2) * _i + (2 * _P0 - 5 * _P1 + 4 * _P2 - _P3) * _i * _i + (-_P0 + 3 * _P1 - 3 * _P2 + _P3) * _i * _i * _i);
    }


    void CachePositionsAndDistances()
    {
        // transfer the position of each point and distances between points to arrays for
        // speed of lookup at runtime
        points = new Vector3[Waypoints.Length + 1];
        distances = new float[Waypoints.Length + 1];

        float accumulateDistance = 0;
        for (int i = 0; i < points.Length; ++i)
        {
            var t1 = Waypoints[(i) % Waypoints.Length];
            var t2 = Waypoints[(i + 1) % Waypoints.Length];
            if (t1 != null && t2 != null)
            {
                Vector3 p1 = t1.position;
                Vector3 p2 = t2.position;
                points[i] = Waypoints[i % Waypoints.Length].position;
                distances[i] = accumulateDistance;
                accumulateDistance += (p1 - p2).magnitude;
            }
        }
    }


    void OnDrawGizmos()
    {
        DrawGizmos(false);
    }

    void OnDrawGizmosSelected()
    {
        DrawGizmos(true);
    }

    void DrawGizmos(bool selected) //this function for DrawingLine Debug
    {
        waypointList.circuit = this;
        if (Waypoints.Length > 1)
        {
            numPoints = Waypoints.Length;

            CachePositionsAndDistances();
            Length = distances[distances.Length - 1];

            Gizmos.color = selected ? Color.yellow : new Color(1, 1, 0, 0.5f);
            Vector3 prev = Waypoints[0].position;
            if (smoothRoute)
            {
                for (float dist = 0; dist < Length; dist += Length / editorVisualisationSubsteps)
                {
                    Vector3 next = GetRoutePosition(dist + 1);
                    Gizmos.DrawLine(prev, next);
                    prev = next;
                }
                Gizmos.DrawLine(prev, Waypoints[0].position);
            }
            else
            {

                for (int n = 0; n < Waypoints.Length; ++n)
                {
                    Vector3 next = Waypoints[(n + 1) % Waypoints.Length].position;
                    Gizmos.DrawLine(prev, next);
                    prev = next;
                }
            }
        }
    }
}

Another interactable thing in the main scene is the cat, which I was already using in my last project; it reacts to different objects applied to it so that it feels “playable”. This is achieved by adding two functional scripts to the animated cat model. The first one, the waypoint circuit, is used to generate the route of the movement: to move the cat from one point to another, I update its position values every frame to create a smooth movement. During the research process I also found several alternative methods to move an object (character or cat). One results in unnatural movement but uses very simple code, which is editing the position values of the GameObject directly; another makes the changes in the Animator instead, where several keys with different positions complete the move. Since I am already doing a lot of animation on the cat itself, the last method would not be the best choice. This approach lets me control the cat’s moving path with a WaypointCircuit script and a Follow script, using several GameObjects (cubes) to mark the separate points of the path design. The Follow script acts as the control command applier that tells the object to follow the path generated by the waypoints: the attached object transitions its x, y, z world position along each small point of the path shown as a yellow line, and it lets the user set exactly the speed and displacement, with the ActivateMove flag controlling when this movement happens.

A special wait-before-start command can be called in this script as well, where a specific time to wait is set for whenever the game starts. This uses an IEnumerator coroutine, which acts like a timer when the game (or something else) starts and makes an action happen after the timing ends.
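A minimal sketch of that wait-before-start idea, assuming the Follow script shown above is assigned in the Inspector (“catFollow” and “startDelay” are illustrative names only):

using System.Collections;
using UnityEngine;

// Waits a set number of seconds after the scene loads, then lets the cat start
// moving by switching the public ActivateMove flag on the Follow script.
public class DelayedCatStart : MonoBehaviour
{
    public Follow catFollow;       // the Follow script shown above
    public float startDelay = 3f;  // seconds to wait before the cat begins moving

    void Start()
    {
        StartCoroutine(StartCatAfterDelay());
    }

    IEnumerator StartCatAfterDelay()
    {
        // The coroutine acts as the timer described above.
        yield return new WaitForSeconds(startDelay);
        catFollow.ActivateMove = true;   // Follow.Update then drives the movement along the circuit
    }
}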

Stencil Shader

The stencil shader is used to make the painting “alive” by turning the 2D image into 3D models inside a visible mask; the models behind this mask are only visible when the camera looks at the scene directly through it.

The stencil shader in Unity URP is a powerful tool that allows developers to create complex rendering effects by selectively rendering specific parts of a scene based on a stencil buffer. The stencil buffer is an additional buffer that stores information about pixel visibility, and it can be used to mark pixels that meet certain criteria defined by the developer. With coding, developers can utilize the stencil shader to achieve various effects such as masking, outlining, and selective rendering.

To use the stencil shader in Unity URP, developers first need to define the stencil buffer operations and comparisons. This is done through coding by setting up stencil states and configuring the desired stencil operations and comparisons. These operations and comparisons determine how the stencil buffer is updated and how pixels are rendered based on the stencil values. For example, a developer can set up the stencil buffer to mark pixels that belong to a specific object or pass a certain depth test.

Once the stencil states are defined, developers can apply the stencil shader to specific materials or renderers in the scene. By attaching the stencil shader to a material, developers can control how the rendered pixels interact with the stencil buffer. For instance, they can choose to render only the pixels that pass a specific stencil test or perform custom operations based on the stencil values. This allows for advanced rendering effects like rendering outlines around objects or applying specific rendering techniques to selected parts of the scene.

But the trickiest thing to notice before actually starting to make a stencil shader is that Unity has related project/renderer settings that need changing in order for the stencil properties to be read by the engine; otherwise it will not give you any results, even if all your shader settings are right.
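The stencil rules themselves live in the shader, but the masking idea can be sketched from the C# side. The snippet below is only an illustration and assumes a custom URP stencil shader that exposes “_StencilRef” and “_StencilComp” properties (hypothetical names, not the shader actually used in the project): the invisible “window” quad writes a reference value into the stencil buffer, and the hidden world behind the painting only renders where that value is already present.

using UnityEngine;
using UnityEngine.Rendering;

// Illustrative stencil setup: the mask and the hidden content share one reference
// value, so the 3D world behind the wall is only visible through the painting.
public class PaintingStencilSetup : MonoBehaviour
{
    public Material maskMaterial;   // material on the invisible "window" over the painting
    public Material worldMaterial;  // material on the 3D content hidden behind the wall
    public int stencilRef = 1;      // shared stencil reference value

    void Start()
    {
        // Mask always passes the stencil test and writes stencilRef into the buffer.
        maskMaterial.SetInt("_StencilRef", stencilRef);
        maskMaterial.SetInt("_StencilComp", (int)CompareFunction.Always);

        // Hidden world only draws where the buffer already equals stencilRef,
        // i.e. where the mask quad was rendered first.
        worldMaterial.SetInt("_StencilRef", stencilRef);
        worldMaterial.SetInt("_StencilComp", (int)CompareFunction.Equal);
    }
}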

Painting Scene

Design & Modeling

In the artistic environment I envision, the predominant color palette revolves around shades of blue, evoking a sense of tranquility, calmness, and sadness. The overall atmosphere carries a subtle warmth, offering a contrasting element to the melancholic tone. As visitors step into this ethereal world, they are transported to an abandoned rounded habitat area that has been ravaged by a catastrophic event, such as a massive flood. The remnants of this disaster become evident through the presence of a lonely door, standing as a solemn reminder of what was once a bustling and thriving community.

The door itself exudes a sense of desolation, its weathered appearance telling tales of a bygone era. It bears the scars of time and the forceful impact of the flood that swept through the area. Its wooden frame is cracked, and the paint has peeled away, revealing layers of history and vulnerability. The door’s forlorn state signifies the passage of time and the abandonment of what was once a vibrant habitat.

Behind the door lies a poignant symbol of the disaster and a call for environmental protection—a whale’s bone. Resting gracefully on the ground, the bone acts as a haunting testament to the power and fragility of nature. Its sheer size and presence remind visitors of the impact humans have on the environment and the need for conservation and stewardship. The juxtaposition of the lonely door and the whale’s bone creates a sense of melancholy and reflection, urging viewers to contemplate the consequences of our actions and the importance of preserving our natural world.

Despite the underlying sadness of the environment, a feeling of warmth permeates the air. Soft lighting casts a gentle glow, hinting at the hope that remains even amidst tragedy. The warmth is represented through warm-toned lighting fixtures scattered throughout the area, casting a soft golden glow that contrasts with the cool blue hues. This combination creates a harmonious balance between melancholy and solace, inviting visitors to embrace the emotions evoked by the scene and find solace within the poignant narrative.

In summary, the artistic environment I envision is a melancholic yet warm space, characterized by shades of blue and a sense of profound sadness. The abandoned rounded habitat area houses a weathered and cracked door, serving as a reminder of a once-thriving community that fell victim to a devastating flood. Behind the door lies a whale’s bone, symbolizing the need for environmental protection. Despite the melancholic atmosphere, the presence of warm lighting offers a glimmer of hope, encouraging visitors to reflect on the consequences of our actions and find solace amidst the beauty of the scene.

The process of using Maya to model the signature piece of the whole painting scene was not as difficult as I thought at first; it was quite straightforward, since I have experience cutting faces and objects into different shapes. Another core tool I used is the bend tool, a nonlinear deformation of the selected mesh; with it I can freely create smooth, curved forms on models, and it was used to make the cage round and to bend the columns at their tops.

Maya, a powerful 3D modeling software, can bring the specific environmental design I envisioned to life with its extensive capabilities. Using Maya, I can meticulously craft the details of the abandoned rounded habitat area, capturing the weathered texture of the broken door and the subtle nuances of the surrounding environment. With Maya’s versatile modeling tools, I can recreate the cracked wooden frame of the door, showcasing its aged appearance and conveying the passage of time. The software’s rendering features allow me to experiment with various lighting setups, ensuring the warm and melancholic atmosphere is precisely captured. Additionally, Maya provides me with the flexibility to sculpt the whale’s bone, enabling me to create intricate details that reflect its significance and emphasize the importance of environmental protection. With Maya’s comprehensive toolset, I can seamlessly merge artistry and technicality to bring this evocative environment to fruition.

It was quite hard to match the size of the models to the height settings of the XR Origin and to the scales of the other models at the start of the modelling and environment-building process. As a beginner, modelling buildings without any scale references, with only several design pictures to go on, made it hard to judge the exact height and width while imagining how to build up the base shape from the design sketch. For this specific environment, with a round water plane at the centre, a bird-cage-like construction around it and half-circle-shaped buildings aligned alongside, judging the depth of the space and making it match the height of each column was extremely time-consuming. Despite my best efforts, the whole garden, so to speak, ended up somewhat smaller than I aimed for, and the height of the most obvious part of the scene (the broken construction in the middle) ended up like a shortened version of the one in the reference picture. Because I realised this very late, I had no option to change anything, as all the other scaling settings had already been done according to it.

The last model I created for this painting scene is the entrance to the whole park. To match the Baroque style, I chose to make a church-window pattern as the fence around the main entrance door, which gives a weird but not unsettling visual feeling to this area as the “spawn place” where the player first enters. The modelling process didn’t take too long, as I was familiar with all the tools in Maya by this point and had a rough design draft in my mind; the core was using the nonlinear bend tool to reshape all the meshes at once.

Environment

The next step after modelling and matching was to put all of my models together to build up the complete scene from our imaginations. This was not difficult to do, but plenty of work was needed to find a specific material for each model and surface; not all of them had one at first, so I sometimes had to search the Internet for suitable realistic materials with normal maps and put effort into adjusting colours. I adjusted the colour, normal details, and lighting settings in Unity’s Universal Render Pipeline (URP) to create a more realistic appearance for the 3D models in my scene. To achieve this, I focused on three key elements: the water shader, the glass material, and the circular Baroque-style buildings.

I fine-tuned the colour properties of the models. I experimented with different colour schemes and adjusted the saturation, brightness, and contrast to evoke a natural and lifelike feel. By carefully selecting appropriate colour palettes for each object, I ensured they harmonized with the overall scene. Next, I paid attention to the normal details of the models. By modifying the normal maps and adjusting the intensity of the surface details, I enhanced the perception of depth and texture. This added a sense of realism to the objects, making them appear more tangible and believable within the environment.

In terms of lighting, I meticulously adjusted the light sources and their properties. I considered the position, intensity, and colour of the lights to create the desired atmosphere. Soft, warm lighting was used to simulate natural sunlight, while subtle shadows were incorporated to add depth and dimension. By carefully balancing the lighting setup, I aimed to recreate the interplay of light and shadow as observed in real-world environments, specific attention was given to the water shader and glass material. I applied appropriate shaders and adjusted their properties to achieve realistic reflections and refractions. The water shader was configured to simulate the movement and distortion of waves, while the glass material was made transparent and refractive to mimic the appearance of real glass surfaces.

Throughout this process, I relied on my artistic judgment and knowledge of real-world materials and lighting. By carefully adjusting the colour, normal details, and lighting settings, I was able to create a scene in Unity URP that portrayed the 3D models as lifelike and convincing, bringing a sense of realism to the virtual environment.

Final result

Mechanics

Apart from making it beautiful, the interactions in this scene were not decided until the final render result was out. After seeing this specific creation, the sound art students gave me a wonderful idea to make it more interesting and playable: adding sounds to the bones. This was not expected at the very start, and I believe our project gradually became more professional through these basic processes; for instance, an idea might change while a sound creator is trying to record something for a specific object and suddenly comes up with an interesting but crazy new thing that can only be found when you are actually making things. This is amazing, because we did find lots of creative interactions by doing that. The bone-bell sound popped into Maria and Ben’s minds, which seemed super fancy to me at the time, and then the main mechanic goal gradually emerged: a password system on the bone with playable music notes.

The keypad system, with code written by Herman, was hard for me to read at first; the only thing I could do was download the entire package, click through literally every section, whether game objects or scripts, and try to understand it by breaking it into parts.

This image shows the process of me trying to understand how those scripts work.

I tried to divide it into two rough parts: the physical models and the scripts on them. The two main scripts are the keypad manager and the button script. The button script is placed on each bone I want to activate; it recognises the input and then sends it to the manager to check whether the input is correct. The manager controls the events for a correct password (the whalebone dissolves) and leaves things unchanged if it is incorrect. To compare the actual input with the correct answer, a canvas text showing the entered numbers is read every frame against the correct answer stored as the string “Answer”; of course, a reset function is called once the input is longer than the answer.

To adapt these functions to my own keypad, several changes had to be made as well.

using System.Collections;
using TMPro;
using UnityEngine;

public class Keypad : MonoBehaviour
{

   // public Text Ans;
    public TMP_Text Ans;

    public string Answer;

    public int Input;

    public string K_State;

    public DissolveChilds D_script;

    public float WaitingTime;

    public GameObject ItemToDestory1;
    public GameObject ItemToDestory2;

    public bool IsDestoried = false;

    public AudioSource AudioSource;
    public AudioClip WrongSound;

    public void Start()
    {
        K_State = "Lock";
        Input = 0;
        Ans.text = "";  //reset to nothing
    }


    public void Number(int number)
    {
        if (K_State == "Lock")
        {
        Ans.text += number.ToString();
        Input += 1;
        }
    }


    public void Reset()
    {
        if (K_State == "Lock")
        {
            Input = 0;
            Ans.text = "";  //reset to nothing
        }
    }


    void Update()
    {
        if (K_State == "Lock")
        {
            if (Input > 4)
            {
                AudioSource.PlayOneShot(WrongSound);
                Reset();
                return;
            }
           
            if (Ans.text == Answer)
            {
                print("Correct");
                K_State = "Unlock";
                Ans.text = "" + K_State;
                StartCoroutine(DissolveCoroutine());
                Destroy(ItemToDestory2);
            }

        }

        if (K_State == "Unlock")
        {
            D_script.Dissolve();
        }
    }

    IEnumerator DissolveCoroutine()
    {
        //StartCoroutine(FadeOut());
        
        //Print the time of when the function is first called.
        Debug.Log("Started Coroutine at timestamp : " + Time.time);

        //yield on a new YieldInstruction that waits for WaitingTime seconds.
        yield return new WaitForSeconds(WaitingTime);
        
        
        //After we have waited, print the time again.
        Debug.Log("Finished Coroutine at timestamp : " + Time.time);

        //Destroy the chosen item
        Destroy(ItemToDestory1);
        
        yield return IsDestoried = true;

    }
    
}
using UnityEngine;
using UnityEngine.InputSystem;

namespace DissolveExample
{
    public class DissolveChilds : MonoBehaviour
    {
        // Start is called before the first frame update
        //List<Material> materials = new List<Material>();
        public bool PingPong = false;

        public Material DissolveM;

        public float value;

        public ParticleSystem PS;

        public InputAction gripAction;

        //AudioSource audioSource;

        public AudioSource BoneAS;
        public AudioClip BoneDisapearClip;

        public int Playonce;

        void Start()
        {
            //audioSource = GetComponent<AudioSource>();
            PS.Stop();
        }



        private void Reset()
        {
            Start();
            DissolveM.SetFloat("_Dissolve", 0);
        }

        
        // Update is called once per frame
        public void Dissolve()
        {
            Playonce += 1;
            //var value = Mathf.PingPong(Time.time * 0.05f, 1f);
            //SetValue(value);
            DissolveM.SetFloat("_Dissolve", value);
            //AudioSource = this.GetComponent<AudioSource>();
            //audioSource.Play();

            
            value += 0.001f;
            PS.Play();
           

        }

        void Update()
        {
            if (Playonce > 0 && Playonce < 10)
            {
                BoneAS.clip = BoneDisapearClip;
                BoneAS.Play();
                Playonce = 11;
            }
        }

    }
}

public class Button : MonoBehaviour
{
    public string TypeofButton;

    public int Num; //my assigned number

    public Keypad KP_script;



    public void Input()
    {
        if (TypeofButton == "Number")
        {
            KP_script.Number(Num);
            print("inputted");
        }
      
    }

    void OnCollisionEnter(Collision collision)
    {

        print("touched");
        if (collision.gameObject.tag == "Stick")
        {
            Input();
        }
    }
}

This is how it works: when the wooden stick tagged "Stick" collides with one of the whalebones, the collision sends a specific number straight to the keypad manager. After several tries, once the number of inputs exceeds the set amount (4 in this case), the Reset function is called to clear the inputs back to 0. If the input shown on the invisible text plate equals the correct answer, the dissolve system with its lighting and sound effects is activated, and a physical key hidden on top of the bone drops: the "IEnumerator DissolveCoroutine" counts out the length of the disappearing effect and destroys the physical mesh of the whalebone when it ends, so the key appears.

The script on the key and the door that leads to the scene transition was easy to write: a door-open animation is called after the key triggers the door handle, and a scene-transition action is placed right after the door animation ends by adding an animation event at the very last keyframe.
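
As a rough illustration of that flow (not the project's exact script), a minimal door-handle trigger might look like the sketch below; the "Key" tag and the "DoorOpen" animator parameter are assumptions for the example:

using UnityEngine;

// Hypothetical sketch: sits on the door handle's trigger collider. When the
// grabbed key touches the handle, the door-open animation starts; the door
// noise and the scene change are then fired by animation events on that clip,
// handled by the SceneTransition3 script shown below.
public class DoorKeyTrigger : MonoBehaviour
{
    public Animator DoorAnim; // animator that holds the door-open clip

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Key")) // the "Key" tag is an assumption for this sketch
        {
            DoorAnim.SetBool("DoorOpen", true); // "DoorOpen" parameter name is also assumed
        }
    }
}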

"Fade anim" is a completely white plane placed right in front of the camera, so the whole visible screen is filled with white, creating the barrier for the scene-transition effect. The material applied should be lit and specular. This screen-fading animation is made by decreasing and increasing the transparency with keyframes.

using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneTransition3 : MonoBehaviour
{
    public Animator FadeAnim;
    public AudioSource AudioSource;
    public AudioClip DoorNoise;
    public void OnSceneTransiting()
    {
        FadeAnim.SetBool("SceneChanging", true);
    }
    // Called by an animation event on the last keyframe of the door-open clip
    public void OnDoorOpenAnimationEnd()
    {
        SceneManager.LoadScene("maintest");
    }

    public void OnDoorAnimationPlay()
    {
        AudioSource.PlayOneShot(DoorNoise);
    }
}

DanceHall Scene

Design & Modeling

The scene matches the wood tones of a music box. The music box might play a basic version of the music, and then a more interesting one becomes available in the actual space. The original idea for the world inside the music box was a typical middle-age dancehall made with Baroque-style columns and lighting; here are the image references for that vision. The sound space should match the exact physical space of the dancehall to drive the ambience volume and reverb zone settings, but that depends on the finished model, which wasn't easy for me at the time, so what I did for the sound art students was find plenty of reference pictures for them to imagine from.
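
Since the reverb zone should hug the hall's physical footprint, a minimal sketch of setting that up from code might look like the following; the reverb settings were ultimately handled with the sound art students, so the ConcertHall preset and distance ratios here are placeholder assumptions rather than the project's actual values:

using UnityEngine;

// Hypothetical sketch: sizes an AudioReverbZone so its full-reverb radius
// roughly matches the dancehall model's bounds.
public class DancehallReverb : MonoBehaviour
{
    public Renderer HallRenderer; // any renderer that spans the hall geometry

    void Start()
    {
        AudioReverbZone zone = gameObject.AddComponent<AudioReverbZone>();
        float radius = HallRenderer.bounds.extents.magnitude;

        zone.minDistance = radius;          // full reverb inside the hall
        zone.maxDistance = radius * 1.5f;   // fade out just beyond the walls
        zone.reverbPreset = AudioReverbPreset.Concerthall; // placeholder preset
    }
}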

The style was decided as a splendid musical dancehall, a realm where joyous melodies and graceful movements intertwine. As you step into this enchanting scene, your senses are immediately captivated by a symphony of colors, sounds, and motion. The dancehall is adorned with ornate Baroque-style columns that reach towards the heavens, their intricate designs reminiscent of a bygone era of grandeur and opulence. Soft, warm lighting casts a gentle glow, bathing the space in an inviting ambiance.

Amidst this breathtaking setting, you’ll find an array of whimsical nut toys joyously playing orchestral instruments. From delicate squirrels gracefully strumming violins to charismatic chipmunks skillfully tickling the ivories of grand pianos, the air is filled with a harmonious cacophony of music. Each nut toy is meticulously crafted with attention to detail, their expressions reflecting pure delight as they bring the instruments to life. As you explore the dancehall, the melodies seamlessly blend together, transporting you to a world where music transcends boundaries and touches the very depths of your soul. It’s a magical spectacle that sparks a sense of wonder and invites you to join in the rhythm and movement, creating an immersive experience that celebrates the beauty of music and dance.

Using Maya, I meticulously crafted a nut toy character that embodied a unique blend of creepiness, cuteness, and fanciness. I began by sculpting the character’s body, giving it a round, nut-like shape with subtle textures and details. With a mischievous grin and large, expressive eyes, the nut toy exuded a sense of intrigue and playfulness.

To add an eerie touch, I incorporated subtle details like cracks on the surface, giving the impression of a slightly damaged and aged toy. The nut toy was adorned with intricate, fanciful attire, including a dapper top hat and a frilly collar, creating a whimsical and theatrical vibe.

Animating the nut toy was an exciting endeavor. I brought it to life with a combination of quirky movements and graceful gestures. Its motions were carefully choreographed to complement the music and dancehall setting. The nut toy’s animations combined unsettling yet endearing actions, such as a wobbly walk, sudden jumps, and playful spins. This created an eerie yet captivating presence within the splendid dancehall, adding an unexpected element of surprise and whimsy to the overall experience.

I rigged the model with Mixamo's auto-rigger by importing the FBX file on its web page and letting it generate a skeleton first, but the completed FBX I got back was too complicated for this case, where I only needed several slight movements. So I redid it myself with the simple bones and skeleton provided by Maya's humanoid rig, which made it much easier to set the animation keys.

Environment

The entire model of the dance hall was found in the Unreal Engine package center, and I bought it from a Chinese reseller website. As with the process I used for the main attic, transporting UE5 models to Unity was quite harsh, since they are built for two completely different engines, and finding the right way to place and light the scene could take me a whole day in Unreal Engine. But thanks to the "prefab" concept in both Unreal Engine and Unity, it was easier to get all the meshes placed.

In Unity URP, I harnessed the power of camera post-processing settings and carefully crafted lighting to elevate the splendor of my scene. I began by fine-tuning the camera’s post-processing effects, such as color grading and bloom. By adjusting the color grading, I enhanced the vibrancy and richness of the colors in the scene, making them more visually appealing and captivating. The bloom effect added a touch of ethereal beauty, creating a soft and radiant glow around the light sources, giving the scene an enchanting and dreamlike atmosphere.
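
For reference, the same Bloom and Color Adjustments overrides could also be nudged from code through the URP Volume API; the sketch below uses placeholder values, since in the project these settings were authored directly on the Volume in the Inspector:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical sketch: reads the Bloom and ColorAdjustments overrides from a
// global Volume profile and adjusts them, mirroring the Inspector setup.
public class SplendourTuning : MonoBehaviour
{
    public Volume SceneVolume; // the global post-processing volume in the scene

    void Start()
    {
        if (SceneVolume.profile.TryGet(out Bloom bloom))
        {
            bloom.intensity.value = 1.2f;  // placeholder: soft glow around the lights
            bloom.threshold.value = 0.9f;  // placeholder: only bright sources bloom
        }

        if (SceneVolume.profile.TryGet(out ColorAdjustments colour))
        {
            colour.saturation.value = 15f;   // placeholder: richer, warmer colours
            colour.postExposure.value = 0.3f;
        }
    }
}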

To further enhance the scene’s splendor, I meticulously placed and adjusted the lights. I used a combination of point lights and spotlights strategically positioned to highlight key elements and create depth and dimension. Soft, warm lights were employed to mimic the gentle glow of ambient lighting, while carefully positioned spotlights accentuated important focal points, such as the dance floor or ornate decorations. The interplay of light and shadow added a sense of drama and elegance to the scene, amplifying its visual impact and evoking a feeling of grandeur and magnificence.

By leveraging the camera post-processing settings and fine-tuning the lighting, I transformed my scene into a truly splendid spectacle. The harmonious combination of vibrant colors, ethereal glow, and carefully crafted lighting created a visually captivating experience that immersed the viewers in a world of beauty and grandeur.

Mechanics

To add an element of playability to the dance hall scene, I came up with the idea of incorporating a music notes puzzle. I strategically placed several missing pieces of the puzzle throughout the dance hall, cleverly hidden amidst the grandeur and intricacies of the environment. These missing pieces would need to be discovered by the players, encouraging them to explore every nook and cranny of the scene while engaging their curiosity and sense of discovery.

Each missing piece of the puzzle would be shaped like a music note, and players would need to find and collect them to complete the puzzle. The challenge lay in locating these hidden pieces within the vast dance hall, as they were cleverly concealed in unexpected places. This interactive puzzle added an engaging and interactive aspect to the scene, encouraging players to actively participate and solve the musical mystery while enjoying the splendid surroundings of the dance hall.

The coding goal I had for this specific functionality was similar to the keypad puzzle, but with a slight difference. In this case, no input numbers were required to determine correctness. Instead, I needed to create empty places in the puzzle where objects could snap into position, accompanied by a snapping effect. The main challenge was detecting when an object had entered the designated area and triggering the appropriate response in the puzzle manager.

To tackle this challenge, I approached it in two steps. First, I created an area with the XRsocketInteractable module applied, which allowed the objects to snap into place when they entered the area. This provided the desired snapping effect and ensured the objects aligned correctly. The second step involved creating a separate area responsible for communicating with the puzzle manager. When an object entered this area, it would trigger a message to the manager, but crucially, it would only send the input once. To achieve this, I programmed the area to destroy itself after sending the message, preventing any further instructions from being processed.

By implementing these two distinct areas—one for the snapping effect and another for triggering the puzzle manager—I successfully solved the challenge of ensuring the correct interaction and input handling in the puzzle. This allowed the player to place the objects in their designated spots, triggering the necessary actions without the risk of duplicate or unnecessary inputs.

public class PuzzleSlot : MonoBehaviour
{
    public PuzzleManager PM_script;
    public string PuzzleTag;
    public bool isFilled = false;
    private Collider TriggerCollider;
    public GameObject ItemToDestory;
    public AudioSource AudioSource;
    public AudioClip SnapSound;

    private void Start()
    {
        TriggerCollider = GetComponent<Collider>();
    }

    private void OnTriggerEnter(Collider other)
    {
        if(other.tag == PuzzleTag)
        {
            print("Applied");
            isFilled = true;
            PM_script.AmountofPuz += 1;
            Destroy(ItemToDestory);
            AudioSource.PlayOneShot(SnapSound);
        }
    }

    void Update()
    {
        if(isFilled == true)
        {
            TriggerCollider.enabled = false;
        }
    }
}
public class PuzzleManager : MonoBehaviour
{

    public int AmountofPuz; //track total amount of puzzle
    public bool P_State = false;
    public GameObject ExitDoor;
    public GameObject ItemToDestroy;

    public AudioSource AudioSource;

    void Start()
    {
        AmountofPuz = 0;
        ExitDoor.SetActive(false);
    }


    void Update()
    {
        // Guard with P_State so the unlock only fires once, not every frame
        if (AmountofPuz == 3 && P_State == false)
        {
            P_State = true;
            print("unlock");
            ExitDoor.SetActive(true);
            AudioSource.Play();
            Destroy(ItemToDestroy);
            //FadeAnim.SetBool("Scenechanging", true);
            //SceneManager.LoadScene("maintest");
        }
    }
}

Whether a puzzle piece belongs to a given slot is detected by assigning interaction layers to each correct puzzle piece and to its own snapping area. By doing this, the player cannot snap the wrong piece into a slot, so the invisible area set up to give input to the manager never sends a message for a mismatched piece.
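
In the project those interaction layers were assigned in the Inspector; purely as an illustrative sketch of the same idea (assuming XR Interaction Toolkit 2.x, where sockets expose an interactionLayers mask, and a hypothetical per-piece layer assigned to the MatchingLayer field):

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: each slot's socket only accepts pieces whose interaction
// layer matches its own, so a wrong piece never snaps and never reaches the
// detection area behind it.
public class SlotLayerSetup : MonoBehaviour
{
    public XRSocketInteractor Socket;          // the snapping area on this slot
    public InteractionLayerMask MatchingLayer; // e.g. a "PuzzlePieceA" layer set in project settings

    void Start()
    {
        Socket.interactionLayers = MatchingLayer;
    }
}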

The "IEnumerator coroutine" is used as an automatic timer for the specific object its script is attached to: by writing this code, the Unity engine counts seconds against a timestamp and then activates something. In this case, the event to trigger is the toyanim "ToyWalk" state, meaning the toy walking along its path only plays after "timeBeforeStart" has elapsed.

The same pattern was used in every one of my scenes for different purposes, such as setting the time before the whalebone is destroyed or the time before the cat finishes the introduction animation.

using System.Collections;
using UnityEngine;

public class ToyWalk : MonoBehaviour
{
    //public bool ActivateMove;
    [SerializeField] float timeBeforeStart;
    public Animator ToyAnim;

    private void Start()
    {
        ToyAnim.SetBool("ToyWalk", false);
        StartCoroutine(MyCoroutine());
    }

    IEnumerator MyCoroutine()
    {
        //Print the time when the coroutine is first started.
        Debug.Log("Started Coroutine at timestamp : " + Time.time);

        //Wait for the configured delay before the toy starts walking.
        yield return new WaitForSeconds(timeBeforeStart);

        //After the wait, print the time again.
        Debug.Log("Finished Coroutine at timestamp : " + Time.time);

        ToyAnim.SetBool("ToyWalk", true);
        //ActivateMove = true;
    }


}

Story & World

public class CatWelcome : MonoBehaviour
{
    public Animator CatWAnim;

    public Animator CatDisplace;
    // Start is called before the first frame update
    void Start()
    {
        
    }

    private void OnTriggerEnter(Collider other)
    {
        if(other.tag == "Player")
        {
            print("isTriggered");
            CatWAnim.SetBool("RunDoor", true);
            CatDisplace.SetBool("DisCat", true);
        }
    }

    // Update is called once per frame
    void Update()
    {
        
    }
}

In this scene where everything starts, we designed a returnable place the player can come back to from the other scenes. The main mechanic needed to achieve this is the "world manager", aka "level manager". This manager is the "remembering system" placed in the very first scene with the condition "don't destroy me" (DontDestroyOnLoad). It must not be destroyed because the Unity engine tends to load every scene listed in the build settings from its own beginning and treat it as a brand-new level; this means that every time the player travels back from another scene, the main scene would be exactly as it was at the start, with none of the player's progress or messages saved. So we have to put in a big manager that can read, or be read by, the scene every time it is loaded, so that messages and level information are saved and remembered.

using UnityEngine;
using UnityEngine.SceneManagement;

public class WorldManager : MonoBehaviour
{


    public string LevelState;
    public SceneTransition ST_Script;
    public Musicbox_Animation MA_Script; // MusicboxOpening script
    public Follow Cat_Script;
    public GameObject TriggerArea1;
    public GameObject MusicboxOpening;
    public GameObject cat_model;
    public GameObject BookSlot;

    public bool onetime;

    public int LevelPlayed;

    void Awake()
    {
        LevelPlayed = 0;
        SceneManager.LoadScene("Welcome");

        DontDestroyOnLoad(this.gameObject);
    }


    // Update is called once per frame
    void Update()
    {
        if (LevelState == "PT")
        {
            TriggerArea1 = null;
            ST_Script = null;
        }

        if(LevelState == "MB")
        {
            MusicboxOpening = null;
            MA_Script = null;
        }
    }
}

The world manager script starts as a simple public class that references every script in each scene that we want to communicate with the manager, and therefore passes messages between them. There are two ways of doing this: letting the manager script do everything we want on the different levels, or telling the related scripts to read the manager themselves and handle the conditions or events on their own. In my situation the first method didn't seem to be available, as the Unity engine was having trouble working through the scripts that are not destroyable, so I only had one option left: to let the scripts read the manager's orders.
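
A minimal sketch of that second approach might look like this; the AtticStateReader name and the PaintingPortal object are assumptions for the example, and the actual behaviour per scene varied:

using UnityEngine;

// Hypothetical sketch of the "scripts read the manager" approach: a scene object
// looks up the persistent WorldManager when its scene loads and reacts to the
// saved level state itself, instead of the manager driving the scene.
public class AtticStateReader : MonoBehaviour
{
    public GameObject PaintingPortal; // assumed scene object for the example

    void Start()
    {
        WorldManager WM = GameObject.Find("WM").GetComponent<WorldManager>();

        // If the painting world was already completed, hide its portal trigger.
        if (WM.LevelState == "PT")
        {
            PaintingPortal.SetActive(false);
        }
    }
}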

In the overall experience, I strategically placed a book towards the end to serve as a guide for the player. Upon entering one of the worlds and successfully completing it, the player would encounter a sign with the words “put puzzle here.” This sign was designed to draw their attention to a specific interactable object where they could place the wooden puzzle they had acquired during their journey. It served as a visual cue for the player to recognize that they needed to place the puzzle in its original location.

To add a sense of challenge and condition for exiting the experience, I implemented a script to check if the player had applied the right puzzle to the right place. The script would verify if the player had correctly placed more than two puzzles in their respective locations. This condition ensured that the player had engaged with the puzzles throughout the experience and had successfully solved them. Only when this condition was fulfilled would the player be able to exit the experience and progress to the next stage.

By incorporating the book as a visual guide and implementing the script to check the placement of the puzzles, I created a clear objective for the player and added a layer of complexity to the overall puzzle-solving experience. This encouraged the player to pay attention to details, complete multiple puzzles, and ultimately unlock the path towards advancing in the game.

public class PieceSlot : MonoBehaviour
{

    public GameObject ExitHint;
    public GameObject EndingText;

    public int PieceNum;

    public AudioSource BookAS;
    public AudioClip EndingM;

    public WorldManager WM;

    // Start is called before the first frame update
    void Start()
    {
        WM = GameObject.Find("WM").GetComponent<WorldManager>();
        WM.BookSlot = transform.gameObject;

        if(WM.LevelState == "MB" || WM.LevelPlayed == 2 || WM.LevelState == "PT")
        {
            ExitHint.SetActive(true);
        }
    }

    

    //Update is called once per frame
    void Update()
    {
        // Guard on activeSelf so the ending only fires once, not every frame
        if(PieceNum == 2 && EndingText.activeSelf == false)
        {
            EndingText.SetActive(true);
            BookAS.PlayOneShot(EndingM);
        }
    }
}
public class PieceSnap : MonoBehaviour
{
    public bool isSnapping = false;
    public GameObject Piece;
    public PieceSlot Slot_Scipt;

    // Start is called before the first frame update
    void Start()
    {
        isSnapping = false;
    }

    void OnTriggerEnter(Collider other)
    {
        if(other.tag == "WhalePiece")
        {
            isSnapping = true;
            Slot_Scipt.PieceNum += 1;
            Destroy(this.gameObject);
        }
    }
    // Update is called once per frame
    //void Update()
    //{
        
    //}
}

This works the same way as the music puzzle: there is a snap area and an area for detection, and the event happens once every condition is fulfilled.

Final presentation

Pitching!

Step into a world where imagination knows no bounds and dreams come alive in a stunning virtual reality experience. In this captivating adventure, players will be transported to an attic filled with immersive objects that hold the keys to surreal dreamscapes, waiting to be unlocked.

Imagine standing in the centre of a room surrounded by a diverse array of fascinating objects—a breathtaking painting that seems to pulsate with life, an ancient book whispering forgotten tales, and a mysterious computer glowing with untold secrets. Each object holds a portal to a unique and mesmerising dream world, where reality bends and dreams become tangible.
As the player explores, they will have the freedom to choose any object that captivates their curiosity. Once they make their selection, they will be instantly transported, diving headfirst into an awe-inspiring, surreal dreamscape crafted to their chosen object’s essence.

The attic will gradually transform as you immerse yourself in the dreamscapes. The once unfamiliar space becomes a haven, a sanctuary where you can return between your adventures, forming a deeper connection with your surroundings. It becomes a place of solace, a true home within the virtual realm.

The beauty of this experience lies not only in its captivating visuals and immersive environments but also in its ability to evoke profound emotional responses. From the wonder and awe of exploring extraordinary landscapes to the contemplation of life’s mysteries, players will be deeply engaged and connected with each dream world they encounter.

By fusing cutting-edge virtual reality technology with surreal dreamscapes, we open a gateway to an entirely new form of experiential storytelling. It transports players beyond the confines of reality, offering an escape into worlds limited only by their imagination.

Prepare to embark on a transformative journey, where dreams manifest as reality, and the boundaries of your imagination blur. We invite you to experience the magic, wonder, and limitless possibilities of the human mind. Are you ready to explore the surreal depths of your dreams?

Critical Reflection

Introduction:

In the realm of virtual reality (VR), the game designer’s role is to transport players into immersive and captivating experiences. As a college student specializing in VR game design, I had the opportunity to work on a project called “Whimsy Attic.” This critical reflection delves into my step-by-step process, overcoming coding and modeling challenges, and the importance of effective collaboration with both my fellow game design partner and students from the BA Sound Art course.

Crafting Immersive Dreamscapes:

The first step in creating "Dreamscapes" involved meticulously designing the virtual environments that would serve as portals to surreal dream worlds. Drawing inspiration from various sources such as paintings, books, and mysterious objects, I aimed to evoke a sense of wonder and captivation. Through careful selection of color palettes, attention to detail in modeling objects, and realistic lighting, I sought to imbue the dreamscapes with a tangible sense of realism. By utilizing my artistic judgment and knowledge of real-world materials, we aimed to blur the boundaries between reality and imagination.

Overcoming Technical Challenges:

In any game development project, coding and modeling hurdles are bound to arise. As a college student, I faced my fair share of challenges. One obstacle I encountered was the creation of the "world manager" or "level manager". This component was crucial for ensuring seamless transitions between scenes and saving player progress. Initially, I attempted to develop a single manager that could handle all required tasks. However, due to Unity Engine limitations, I had to explore an alternative approach: by allowing individual scripts to read the manager's orders and execute the necessary conditions or events, I managed to overcome this obstacle. Beyond this, the scripts all followed the same logic, built around "OnTriggerEnter" with various actions and circulating through three main functional patterns: the "puzzle manager", the "keypad manager" and the "sound playing trigger".

Collaboration: Bridging Design and Sound Art:

Collaboration played a vital role in bringing "Dreamscapes" to life. Communication with my game design partner was key to aligning our visions and merging our expertise effectively. In fact, this in-VR communication was harder than the cross-course collaboration with the sound students, because it was not easy to find an effective way to optimize the VR experience and enhance the narrative journey together. Luckily, the collaboration with students from the BA Sound Art course added another layer of depth to the project. By understanding the physical space of the dancehall and its corresponding ambience volume and reverb zone settings, rather than only imagining the breathtaking environment inside the painting, we could synchronize sound design with the virtual environment, elevating the immersive experience to new heights.

Effective Communication and Problem-Solving:

Throughout the development process, effective communication was crucial in overcoming obstacles and fostering a collaborative environment. Regular meetings with my sound art partners allowed us to address challenges, share progress, and provide valuable feedback to improve each other's work. Additionally, maintaining open lines of communication with each other enabled us to align our visions, synchronize sound elements, and ensure a cohesive and immersive audiovisual experience. The working format of distributing the art-creation sections into roles helped us understand clearly what we each needed to do in future work and, most importantly, avoid the potential controversy of an imbalanced task distribution.

Conclusion:

Designing "Whimsy Attic" as a college student VR game designer was a transformative journey that required a multidisciplinary approach. By meticulously crafting immersive dreamscapes, overcoming technical challenges through problem-solving, and fostering effective collaboration with both my game design partner and students from the BA Sound Art course, I was able to create an experience that transcended the boundaries of reality and imagination. This project not only honed my technical skills but also emphasized the importance of communication, adaptability, and collaboration in the realm of virtual reality game design. As I embark on future endeavors, I will carry the valuable lessons learned from "Dreamscapes" and apply them to create even more captivating and immersive experiences, pushing the boundaries of virtual reality and storytelling. The limitless possibilities of the human mind await, and I am ready to continue exploring the surreal depths of dreams through the transformative power of VR.