When I began studying the principles of visual and UI design, I couldn’t help but connect them to the music practice I’ve maintained for most of my life. The processes of creating visual and audio work are very different, but each medium has techniques and qualities that can be leveraged to communicate an intent more effectively. Creativity without intent, whether in music, design, or any other art form, is relatively low stakes. There’s nothing wrong with creating for its own sake, but when we create to evoke a specific emotion, feeling, or experience, we leave less room for creative luck and must rely on our skills and understanding of the medium we’re working with.
This piece was written to outline some fundamental correlations that exist between the visual medium of design and the audible medium of music. My good friend Byron Harden is a visually impaired audio engineer and educator who owns and operates a music studio and organization called I See Music. My goal is that after reading and listening to the examples in this article, you too will be able to see music, regardless of your familiarity or experience with the medium. Let’s start with the basics.
Color theory is one of the most important and well-known areas of visual design practice. The colors and color schemes used within a project can have a huge impact on the feel and mood of a visual experience.
Similarly in music, pitch and pitch schemes, known as scales, completely dictate the musical mood and feel of a song. Just as a designer may choose to fill their palette with warm and bright tones to reflect a cheery mood, for example, a composer can select chords based on specific scales that evoke a warm and positive feeling. Here’s an example of two different chords with very distinct moods associated with them.
The first chord has a very bright and positive aesthetic while the second has a dark and tense feeling.
In practice, the colors of the visible light spectrum are also organized and selected similarly to the pitches in the chromatic scale. The circular color wheel is often used to find colors that relate to one another, and designers know the intervals that produce complementary, triadic, and analogous color schemes. Musicians, too, organize their pitch schemes in ways that make it easy to identify the intervals between them. Take the circle of fifths, seen below, as an example.
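The regularity of these intervals is easy to demonstrate in code. Here’s a small Python sketch (the sharp-only note spellings and the starting note are my own assumptions for illustration) that builds the circle of fifths by repeatedly stepping seven semitones around the twelve-note chromatic scale:

```python
# The chromatic scale: 12 pitch classes, here spelled with sharps.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_of_fifths(start: str = "C") -> list[str]:
    # A perfect fifth is 7 semitones; stepping by 7 (mod 12) visits
    # every pitch class exactly once before returning to the start.
    i = NOTES.index(start)
    return [NOTES[(i + 7 * step) % 12] for step in range(12)]

print(circle_of_fifths())
# ['C', 'G', 'D', 'A', 'E', 'B', 'F#', 'C#', 'G#', 'D#', 'A#', 'F']
```

Because 7 and 12 share no common factor, the walk covers all twelve pitches, which is exactly why the circle closes back on itself, much like a color wheel.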
The physics of how we experience color and pitch are also quite analogous. While colors are in fact just narrow frequency bands within the visible light spectrum, pitches are also narrow bands within the audible frequency spectrum. In both cases, the color or pitch that we experience is directly related to the frequency of the light or sound wave that reaches our body.
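That frequency relationship is precise enough to compute. In equal temperament, each semitone multiplies the frequency by 2^(1/12); the sketch below assumes the common A4 = 440 Hz tuning reference and MIDI note numbering, neither of which the article depends on:

```python
def midi_to_hz(note: int) -> float:
    # MIDI note 69 is A4 = 440 Hz; each semitone step scales
    # frequency by the twelfth root of two.
    return 440.0 * 2 ** ((note - 69) / 12)

print(round(midi_to_hz(69)))      # 440 (A4)
print(round(midi_to_hz(60), 2))   # 261.63 (middle C)
```

Doubling the frequency raises the pitch by exactly one octave, just as doubling the frequency of a light wave would move a color far along the visible spectrum.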
Good designers know how to use negative space, just as good musicians know how to use silence. Negative space within a design is the space around and between subject elements. Often, this manifests as the background on which the subject content lies. On this page alone, there’s negative space between individual lines of text, before and after headings, and between each paragraph. All of this helps us identify and appreciate the different chunks of information in the article. If the negative space were eliminated on this page, it would become much more difficult to understand and process the individual elements being presented.
The same principles of negative space apply to music. In music notation, silence is called rest, and without it there would be no distinction between individual notes, phrases, or ideas. Thoughtful musicians are mindful of the space between individual notes and phrases, just as a designer should be mindful of the space between individual subject elements. Here, I’ve created an example of two identical melodic and bass lines, each with a different degree of negative space between the notes. You can hear for yourself how much the amount of space, or rest, between notes changes the listening experience.
An audio example first with less negative space, then with more
Within the visual design practice, contrast is the optical difference between an element and its surroundings or background. Although contrast can describe many differences including size, shape, and position, the simplest example is the literal contrast between the subject and background. Good designers can leverage this form of contrast to direct the eye toward important content. Generally, the elements with greater figure-ground contrast tend to stand out and attract our eyes.
In music, this phenomenon can be experienced through note intensity. In the auditory sense, silence is the analog to the white background. The louder, or more intense, a musical element is, the more our ears distinguish and process that sound. Here’s an example of a drum pattern made with only one sound playing the same rhythm throughout. See if you distinguish all of the notes equally, or if you primarily hear the pattern outlined by the loudest, accented notes.
An audio example of intensity contrast with an accented pattern
One other form of contrast is worth mentioning: a colored element in a field of greyscale elements receives a similar emphasis. If I add color to one of the squares from the previous visual example, you can test where your eyes are drawn now.
Even though the black square farthest to the right has the greatest contrast ratio to the background, many people will now find their eyes attracted to the red square simply because it has the contrast of color on a page that is otherwise black and white.
If we apply this same principle to the drum pattern example, we can add color in the form of a new tone. While the snare drum pattern with the intensity accents remains the same, see which note stands out the most now when we switch one of them to a jingle bell.
The same audio example as previously with a new color
Layouts and grids within a visual UI design form a foundation for the arrangement of a particular page or piece. Even if the design is responsive to screen size, we generally establish a layout scheme early to ensure that the content and UI are distributed nicely, with some degree of consistent alignment and spacing. This consistent spacing is sometimes even referred to as “rhythm,” which draws a direct parallel to the concepts of rhythmic pattern and meter within music.
Just like a layout and grid system, the meter of a particular piece dictates the arrangement and rhythm of musical notes in a composition. Meter can come in any numerical value just like a layout arrangement can have any number of columns or rows in a particular view.
The benefits of using a grid in design are also similar to the benefits grids have in music. The grid, within the musical context, refers to the perfect subdivision of a bar, i.e. the metronome that clicks silently along with a track. In digital music production, these beats typically display as a literal grid, and we often have the ability to snap our digital recordings to the grid, or “quantize” them. Just as designers use grids to ensure that elements line up perfectly, some types of music sound best when musical elements are perfectly aligned with one another and with the metronome.
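Quantizing is conceptually simple: each recorded note onset moves to the nearest grid subdivision. A minimal sketch, assuming onset times measured in beats and a sixteenth-note grid (both illustrative choices):

```python
def quantize(onsets, grid=0.25):
    # Snap each onset (in beats) to the nearest multiple of the grid.
    # grid=0.25 means sixteenth-note subdivisions in 4/4.
    return [round(t / grid) * grid for t in onsets]

# Onsets tapped in slightly early or late, then snapped to the grid.
played = [0.02, 0.27, 0.49, 0.77, 1.01]
print(quantize(played))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

This is the same move a designer makes when nudging an element onto the nearest column of a layout grid.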
The music example below shows two completely different meters. You can hear how the meter essentially dictates the entire arrangement of musical elements with regard to the spacing and rhythm of notes. Each example is played first in an unquantized version, where I tap the beat in with my computer keyboard as accurately as I can. The second take of each example shows the effect of quantizing, or snapping to the grid, so that each note is perfectly aligned with the others and with the master tempo.
Two examples of different meters. Each example is played naturally first and then quantized to grid second
In design, we judge proximity based on the distance between elements in either the X, Y, or Z directions. In music, proximity is most commonly experienced in relation to the temporal location meaning the relationship of elements within the timeline of a song. Just like the visual design Gestalt principle of proximity, music notes that are close together sound more related than notes that are spaced farther apart.
As the proximity between identical elements continues to get closer, it becomes harder to differentiate the individual elements until eventually they are perceived as a single larger element. Listen to the recording below. Although each note is identical, our ears associate the notes that are closest to one another as being the most related. When the proximity becomes close enough, we can experience multiple notes as one large note, just as the squares above appear to be one large rectangle when pushed against each other.
An audio example showing 4 groups of 6 identical notes with different levels of proximity. Each group is separated by a bell tone.
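This grouping-by-gap behavior can be sketched directly: notes whose onsets fall within some threshold of each other are bundled into one perceived phrase. The threshold below is an arbitrary illustration, not a perceptual constant:

```python
def group_by_proximity(onsets, max_gap=0.5):
    # Group note onsets (in seconds) so that consecutive notes closer
    # than max_gap land in the same group, mirroring how the ear
    # hears tightly spaced notes as one phrase.
    groups = [[onsets[0]]]
    for prev, cur in zip(onsets, onsets[1:]):
        if cur - prev <= max_gap:
            groups[-1].append(cur)
        else:
            groups.append([cur])
    return groups

print(group_by_proximity([0.0, 0.2, 0.4, 2.0, 2.2, 4.0]))
# [[0.0, 0.2, 0.4], [2.0, 2.2], [4.0]]
```

The same logic describes the visual case: swap time gaps for pixel gaps and the groups become the clusters your eye picks out on the page.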
Size, in visual design, refers to the dimensional scale of an element, especially with regard to the elements around it. Visual elements of greater size are generally perceived as more important, or of greater hierarchy, than related smaller elements. Good visual designers can use size and scale to sequentially direct the viewer’s eyes to the content or information that is most important or useful. The main thing to remember is that size alone does not create importance or hierarchy; rather, the difference in size between elements is what creates this visual relationship. Take the image below as an example. Even though the small grey square sits on the left, where English readers generally look first, the large grey rectangle commands more visual real estate, which our minds perceive as importance.
The same principle could be applied to auditory elements, however, instead of elements competing for physical screen space, they compete for space within the audible spectrum. For this reason, audio engineers are mindful not to place too many elements that occupy the same frequency range on top of each other. This is often referred to as “making room in the mix”. Not allowing for each element to occupy its own space is the musical equivalent to placing many elements in the same location on the screen; no matter what, some sounds will be obscured. In the competition for audio real estate, generally the loudest wins.
You don’t have to be an audiophile to hear this phenomenon at play either. In this example, I’ve played the same bass line twice, but using a filter, I reduced the size of the sound in one version by allowing only a small fraction of the audible spectrum through. I also normalized the volume between the two versions so that the two you hear are at very nearly the same level. Despite being the same volume, one version sounds larger, or “fatter,” than the other.
An audio example of a single bass line first with a limited frequency range, then with a full frequency range
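The filter used for this kind of demonstration can be as simple as a one-pole low-pass, which lets low frequencies through and attenuates everything above a cutoff, shrinking the sound’s share of the spectrum. A minimal sketch (the cutoff and sample rate are illustrative assumptions, and real mixes use far more sophisticated EQ):

```python
import math

def low_pass(samples, cutoff_hz, sample_rate=44100):
    # One-pole (RC-style) low-pass filter: each output sample moves a
    # fraction alpha toward the input, smoothing away fast wiggles.
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

Run a bass-range sine through this filter and it passes almost untouched; run an 8 kHz sine through and it comes out drastically smaller, which is exactly the “reduced size” heard in the example above.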
Repetition within a visual UI design system can be used to underpin and emphasize the meaning and context of our designs. The value in repeating similar UI patterns or paradigms is that it allows the user to build a visual library in their head of how and when they should interact with specific UI elements. Take the example below. We all know it, love it, and recognize it, but you may not be able to tell me exactly where I got this image.
You may have guessed that this is a Google search bar, but you may not have guessed that this is specifically the search bar that appears in an image search. Regardless of whether you’re on Google’s front page, searching on Google Images, or looking up an address in Google Maps on iOS, all of the search bars have this same visual appearance. The repetition of this specific UI across contexts is so consistent that it actually looks strange when we come across a Google search bar that is still square around the edges.
This concept of repetition is very common in music and especially common (perhaps too much so) in modern pop music. One of the more obvious forms of repetition is the chorus, where artists literally repeat a part of the song. The chorus is often the catchiest and most central theme of the song, and repeating it throughout gives casual listeners a section to latch onto and associate with that song. Having a catchy refrain is essentially a prerequisite to becoming a radio hit, at least in the US market.
Verses often use some degree of repetition as well, if not in the vocals or vocal delivery, then often within the instrumental. Phrasing with repetition is useful in music writing because it allows an artist to create movement while maintaining a level of familiarity. Ultimately, most pop music listeners are much like Google search users: we don’t always want to be analyzing new patterns, especially within a single song. Consistency through repetition makes the listening experience feel more stable and predictable.
Elevation is often used within visual and UI design to mimic the real world by introducing 3D cues into an otherwise 2D image. Visually, this is often achieved with drop shadow variations made to resemble the way a shadow changes when an object is lifted off a surface. In the real world, our attention most often goes toward objects that are closer to our faces. The goal of elevation within visual UI design is essentially to bring specific objects closer to the screen on which they are viewed. This generally has the effect of drawing attention to and emphasizing the objects shown closest to the viewer.
Like designers, musicians have tools and techniques to feign a 3D listening environment within musical elements. Just as a design specimen would appear flat without any elevation effects on it, a piece of music would sound nondimensional and flat if it were delivered in complete mono without stereo manipulation. Stereo, within the context of music, describes the methods of creating auditory perspective within a song or musical element. This way, audio can be experienced from the left, right, center, or a combination of the three. In doing so, it’s possible to direct listeners’ attention by putting a sound near the center versus in the peripheries of the audio environment.
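One of the simplest of these stereo tools is panning. A common approach is constant-power panning, where the left and right gains trace a quarter circle so overall loudness stays steady as a sound moves across the field. A sketch (the choice of pan law here is my assumption; engineers use several variants):

```python
import math

def pan_gains(pan: float) -> tuple[float, float]:
    # pan = -1 is hard left, 0 is center, +1 is hard right.
    # Mapping pan onto [0, pi/2] and using cos/sin keeps
    # left^2 + right^2 == 1, i.e. constant perceived power.
    angle = (pan + 1) * math.pi / 4
    return math.cos(angle), math.sin(angle)  # (left gain, right gain)

left, right = pan_gains(0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707 (centered)
```

Multiplying a mono signal by these two gains places it anywhere in the stereo image, the auditory equivalent of positioning an element on the X axis of a layout.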
As audio sounds completely different in every room, engineers can use reverb to mimic the effect of hearing sounds in different listening environments. All of these effects, which commonly play into the stereo experience of a piece, have the impact of mimicking or modifying the listening experience with regard to the environments we most commonly hear the audio in. Skilled musicians and engineers can leverage these stereo techniques in conjunction with an understanding of how people perceive audio in their environment to sculpt a 3D audio image that draws attention to the correct musical elements.
The same audio example three times with increasing stereo dynamics. Best if heard with headphones or on a stereo system.
While the mediums of visual design and music rely on completely different sensory experiences, there are a number of common principles and techniques we can use when developing visual environments and audio soundscapes. Even if as a designer you’re not particularly interested in music or audio, I believe it’s worthwhile to consider how we can map our visual ideas effectively in the audio space. If at the very least this article has helped you empathize with users who can’t fully appreciate visual design due to disability or impairment, then I would consider this to be time well spent. I encourage everyone to take a moment today and listen to a piece of music you enjoy with your eyes completely closed. With enough time and practice, you will begin to develop a mind's eye that responds to the music you hear and allows you to “see” sound by painting a beautiful picture of it deep within your mind.