APPLE MUSIC CHANGES THE WAY YOU FEEL MUSIC WITH NEXT GEN FEATURE 

words by ROMÉE AVRIL

Apple Music’s latest feature, Music Haptics, celebrates not only 40 years of accessibility at Apple but, most importantly, the physical experience of music.

To launch Apple Music’s newest addition to the Accessibility family, the Battersea Power Station in London served as the setting for an intimate event. After entering the room, I grabbed a glass of water and struck up a conversation with a fellow invitee: “How long have you been deaf?” he asked. I responded by apologizing, explaining that I don’t experience any hearing difficulties. He was surprised and questioned, “Why are you here then?” It was a good question, as I found myself a minority in a room full of people from the deaf and hard-of-hearing community. Later, I realized why I was exactly where I needed to be.

In our society, music is widely interpreted as an auditory experience that triggers the sense of hearing. Music, as a universal language, holds power in the offline world—it can become physical through shows, festivals, clubs, etc. Artists create environments where people feel the music by dancing, feeling the vibrations of the bass, and absorbing the energy of the crowd. But what if you want to speak that language individually, and the physical elements aren’t there to trigger your senses? If the language of music could be interpreted multi-sensorially in a communal way, the world would breathe more joie de vivre. The answer? Music Haptics. Using the Taptic Engine in iPhone, different types of pulses align with the music you are playing while listening to Apple Music or Shazam, offering an elevated vibration experience for all listeners.

Accessibility has been embedded in Apple’s DNA as one of its core values for the last forty years. It’s no surprise that Apple continues to evolve its ecosystem by analyzing the needs and values of its users and translating those into technologies that work better for everyone. “It is not an afterthought, but part of the core of our products,” Sarah Herrlinger, Global Head of Accessibility at Apple, explains. She continues: “With disability being part of the human experience, most of our Accessibility tools have focused on practical needs: from sound recognition and eye tracking to hearing tests. Until Music Haptics, a feature that is all about joy.”

Music Haptics is designed to break down barriers for the deaf and hard-of-hearing community by providing them with a unique music experience that stimulates their other senses. But the feature also enhances the experience for anyone looking to deepen their understanding of their favorite music. In London, we had the chance to try it out, and I was in awe. I expected to feel only the vibrations of the drums and bass through the phone, but the complexity of the vibrations went deeper than that, allowing me to discover different elements within the track. You feel vocal riffs, sense the distinct quality of the bass, and trace intricate melodies—all contributing to a layered, physical phenomenon of music. I got creative and placed the phone on my chest, resulting in a domino effect of goosebumps that electrified my soul.

As a music editor, my purpose is to spot the meaningful developments within the industry. With the launch of Music Haptics, I feel that by spreading this message I give voice to the possibilities of the language we all speak—one that opens new doors for those who have been excluded from speaking it with us.

Interested in trying it for yourself? Music Haptics is now available to all listeners using iPhone 12 and later. Simply open Settings, tap “Accessibility,” and navigate to “Music Haptics.” Toggle the feature on, and get ready to feel through Apple Music.
