I work for the music streaming service Audiomack. And I do a lot of things at Audiomack, but my broad directive is to deal with data and personalization. The former includes everything from royalty reporting to making sure internal curators can see how their playlists are performing. The latter includes building all the experiences — from song recommendations to paywalls — that make our app and website feel like they were made specifically for you.
When I talk about my job, I get many more questions about the personalization piece than the data piece. I understand why. Personalization is one of the hottest topics in music. In fact, I often hear people complain that personalization is ruining music for artists and listeners. Of course, I don’t believe this. I couldn’t do my job in good faith if I did. But I have been noticing some concerning trends in the world of music personalization that I think are worth worrying about. As always, this newsletter is also available as a podcast. Listen on Spotify and Apple Podcasts or click play at the top of this page.
A Title Made for You
Often when you hear people bemoaning the dangers of social media or the problems of music streaming, they point to one thing: “the algorithm.” This makes sense — algorithms do power our online experiences — but it is also somewhat silly. An algorithm is just a very specific set of instructions. Here’s an example.
Last week, I was in Disneyland with my girlfriend and our two friends. At every ride, we’d see some measuring stick indicating how tall you had to be to get on. This is a very basic example of an algorithm, namely one that decides if a person is able to get on the ride. If you are greater than or equal to the indicated height, you may get on. If you are shorter than that, you may not. Here’s how you would render that as a function in Python, a popular programming language:
def can_i_ride(height):
    # height is measured in inches; this hypothetical ride requires at least 42
    if height >= 42:
        print("Yes")
    else:
        print("No")
“Algorithms are tools,” Spotify’s former Data Alchemist Glenn McDonald told me a few months ago. “Tools have no moral stature on their own. They're good if they increase our ability to do interesting, humane things. They're bad if they don't do that.” Let’s conjure an example of a music algorithm that I think would increase our ability to do “interesting, humane things.”
Let’s say you’re becoming a big fan of Ella Fitzgerald. According to her website, “During Ella’s 50-plus year career she recorded over 200 albums and around 2,000 songs.” That’s a lot of music to sift through even if you’ve got a world of time. Imagine a tool that showed you the most popular Ella Fitzgerald songs that you haven’t heard yet. That would be very useful. It would be even more useful if it were generalized.
Imagine a tool called the “Discovery Machine” that allowed you to enter any artist’s name. Once you did, it would run an algorithm that returned a list of that artist’s songs that you’d never heard, sorted from most to least popular. I would love a tool like that. It seems plainly beneficial. (Frankly, it sounds like something I should mention to my bosses.)
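To make that concrete, here’s a minimal sketch of the Discovery Machine’s core logic in Python. The catalog and listening_history structures, and the play counts below, are stand-ins I invented for illustration; a real streaming service would pull this data from its databases.

def discovery_machine(artist_name, catalog, listening_history):
    # catalog maps an artist's name to their songs, each with a play count
    # listening_history is the set of song titles the user has already heard
    unheard = [
        song for song in catalog[artist_name]
        if song["title"] not in listening_history
    ]
    # most popular first
    return sorted(unheard, key=lambda song: song["plays"], reverse=True)

catalog = {
    "Ella Fitzgerald": [
        {"title": "Dream a Little Dream of Me", "plays": 95_000_000},
        {"title": "Summertime", "plays": 80_000_000},
        {"title": "Blue Skies", "plays": 30_000_000},
    ]
}
history = {"Summertime"}

for song in discovery_machine("Ella Fitzgerald", catalog, history):
    print(song["title"])  # "Dream a Little Dream of Me", then "Blue Skies"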
But algorithms aren’t all plainly beneficial. Imagine I wrote one that was explicitly designed to tell you that you were a worthless pile of garbage every time you opened your phone. Though the algorithm itself might be computationally perfect, I think we would agree that it was morally wrong.