TechTalk 3: Inclusive Immersive Technology

In this TechTalk, we will look at examples of inclusive immersive technology.


Transcript:

If you’re interested in audio, music, and technology, then you’ve come to the right place.

Hi! I’m Francesca, and you’re watching my TechTalk.

In the last TechTalk, I talked about XR. In this video, we’ll look at how immersive technology can be used in special needs music. Let’s get started!

Immersive technology is becoming more and more popular nowadays thanks to the gaming and entertainment industries. Many immersive reality headsets, like the Oculus Rift, Google Cardboard, and HTC Vive, are now available on the market. Although these are mostly used for gaming and media consumption, more industries are trying to take advantage of this technology too. In the healthcare sector, it is used in a variety of ways. For the purpose of this video, we will talk about its use in special needs education and music.

After a brief search on the internet, it seems that there are already quite a number of projects using immersive technology for special needs, whether for educational or therapeutic purposes. In the United States, the Department of Education invested $2.5 million in a project called “VOISS: Virtual Reality Opportunities to Implement Social Skills.” Through the project, the team aims to give people with high-functioning autism and disabilities a safe virtual space where they can learn social skills and other behaviors at their own pace. The project has shown promising results, which suggests there are many more ways this technology could make the lives of people with autism and disabilities easier.

Another example of inclusive immersive technology is a project called “Performance Without Barriers,” which aims to design virtual reality instruments for musicians with disabilities. The research team partnered with a software development company to develop “Infinite Instrument” for the HTC Vive headset. Although the virtual instrument uses hand-held controllers, it is designed with different types of mobility in mind. The controllers are a good addition because they make the whole experience more immersive: musicians engage not only their senses of sight and hearing, but also their sense of touch.

The Performance Without Barriers team mentioned that most immersive technologies are marketed to able-bodied users, and I think this is true. With the ever-growing popularity of immersive technologies, it is important to create immersive experiences that are inclusive too.

The examples above show that the design of any immersive tool needs to be well thought out, especially if it is going to be used in special needs learning or music settings. Since I aim to develop a mixed reality musical instrument app for my final MSc project, I need to research the best way to design my app’s interface so that it is easy for people with limited movement to use.

In my next TechTalk, I will discuss design considerations, available AR app development software, and the MR hardware I plan to use. 

Once again, I’m Francesca, and thank you for listening to my TechTalk.

Want to connect or collaborate? Send me an email here.

TechTalk 2: Am I Real?

In this TechTalk, we will talk about the different types of extended realities: what they are and how they came to be.


Transcript:

If you’re interested in audio, music, and technology, then you’ve come to the right place.

Hi! I’m Francesca, and you’re watching my TechTalk.

In the last TechTalk, I gave a brief description of AR, MR, and special needs music. In this video, we will be discussing virtual technology in more depth. Let’s get started!

When talking about augmented reality and mixed reality, it’s hard not to mention virtual reality too. These technologies, as well as other types of immersive experiences, fall under the term XR, otherwise known as extended reality. The X in XR can be viewed as a stand-in variable, meaning it can be replaced with a letter representing a specific technology: VR, AR, or MR. Often, though, the term XR is used to group VR, AR, and MR together.

The earliest XR concept can be traced back to Sir Charles Wheatstone, a British scientist and inventor who contributed to various scientific fields, particularly sound and vision. In 1838, he invented the stereoscope, a device that plays with our visual perception, or how our brain sees things. Basically, viewing a scene through a stereoscope produces a three-dimensional image. The illusion is created by presenting each eye with a slightly different view of the same scene; the brain then interprets the two offset perspectives as a single three-dimensional image. In the 1840s, the Scottish physicist Sir David Brewster improved the design of the stereoscope by making it more portable. By then, photography had arrived, so aside from drawings and artwork, people were now able to view three-dimensional images of the real world.
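The stereoscopic effect can even be sketched in code: shifting a flat image horizontally in opposite directions produces a left/right pair, and when each view is shown to one eye (as in Wheatstone’s stereoscope) the brain fuses the offset views into an image with apparent depth. A minimal illustrative sketch in Python with NumPy (the function name and disparity value are my own, purely for illustration):

```python
import numpy as np

def make_stereo_pair(image, disparity):
    """Create a left/right image pair by shifting a 2-D image
    horizontally in opposite directions. Viewed one per eye,
    the offset views are fused by the brain into a single
    image with apparent depth."""
    left = np.roll(image, -disparity, axis=1)   # left eye's view, shifted left
    right = np.roll(image, disparity, axis=1)   # right eye's view, shifted right
    return left, right

# Toy 4x6 "image" of increasing brightness values
img = np.arange(24).reshape(4, 6)
left, right = make_stereo_pair(img, disparity=1)
# The two views are identical except for a small horizontal offset,
# which is exactly the cue the brain turns into depth.
```

A real stereoscope pair would of course use two photographs taken from slightly different camera positions, not a simple pixel shift, but the principle of a per-eye horizontal offset is the same.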

Through the years, people built upon the idea of three-dimensional images, and in 1935 a concrete concept of virtual reality first came to light. Stanley G. Weinbaum, an American sci-fi writer, wrote Pygmalion’s Spectacles. Long story short, it involves a pair of spectacles that stimulates all five senses to fully immerse the user in a movie or story so that they can be a part of it. Something similar was invented by Morton Heilig in 1957: a machine called the Sensorama, which offered users an immersive movie experience by stimulating senses such as sight, hearing, and smell. This is the basis of the VR goggles we know today.

The invention of virtual reality gave birth to the development of many other realities, and since then immersive technology has seen massive improvements. The term augmented reality appeared in 1990, and in 1992 the first operational mixed reality system was created at the USAF Armstrong Labs at Brooks Air Force Base in San Antonio, Texas. Louis Rosenberg created a system called Virtual Fixtures, which was made up of binocular magnifiers and robot arms. The device overlaid graphical information onto a real-world environment and provided simple haptic feedback. It was used in military training to increase work efficiency, especially for physical tasks: the robot arms appeared in the same position as the user’s physical arms, and the generated visuals were barriers, fields, guides, and so on.

Although 1992 saw the first mixed reality system, it wasn’t until 1994 that the term was officially defined. Paul Milgram and Fumio Kishino, in their paper “A Taxonomy of Mixed Reality Visual Displays,” introduced the virtuality continuum concept, sometimes referred to as the reality-virtuality continuum. According to the paper, MR is “…anywhere between the extrema of the virtuality continuum.” In simple terms, MR lies anywhere on the spectrum between the real world and the virtual world, so it can lean toward either augmented reality or augmented virtuality. Based on this definition, my MSc project sits on the augmented reality side, as I hope to place virtual musical instruments in the real world.
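One way to picture Milgram and Kishino’s continuum is as a simple numeric scale. In this hypothetical Python sketch, 0.0 stands for the fully real environment and 1.0 for the fully virtual one; everything strictly in between counts as mixed reality, leaning toward AR or AV. The function, the 0.5 midpoint, and the labels are my own illustration, not something from the paper:

```python
def classify(virtuality):
    """Place an experience on the reality-virtuality continuum.

    0.0 = real environment, 1.0 = fully virtual environment (VR).
    Everything strictly in between is mixed reality (MR), leaning
    toward AR (real world augmented with virtual objects) or
    AV (virtual world augmented with real objects)."""
    if virtuality <= 0.0:
        return "real environment"
    if virtuality >= 1.0:
        return "virtual environment (VR)"
    side = "augmented reality (AR)" if virtuality < 0.5 else "augmented virtuality (AV)"
    return "mixed reality: " + side

# A musical-instrument app that places virtual instruments in the
# real world sits near the real end of the continuum:
print(classify(0.2))  # mixed reality: augmented reality (AR)
```

The continuum is of course conceptual rather than a measurable number, but treating it as a scale makes it easy to see why an app that overlays virtual instruments on the real world belongs on the AR side.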

So the question now is how can MR, or any XR technology for that matter, be used for developing technological solutions for music therapy or special needs music? The answer? Well, you have to watch my next TechTalk to find out.

Once again, I’m Francesca, and thank you for listening to my TechTalk.
