Daily review constructed from an AcousticBrainz track analysis: Nov. 24 2014

AcousticBrainz aims to automatically analyze the world's music, in partnership with MusicBrainz, and "provide music technology researchers and open source hackers with a massive database of information about music." The effort is crowd-sourced: people from all over the world contribute data by having their computers crunch through their MusicBrainz-identified music libraries and automatically upload the low-level features extracted from each track.

I construct today's review from low- and high-level data recently extracted from a particular music track in AcousticBrainz. Can you guess what it is, and what characteristics it has? ("Probabilities" are in parentheses.) The answer is revealed below.

This female-gendered (0.81) vocal (0.78) track is most likely not danceable (0.87), but it has a high probability of being electronic (1.0) and/or ambient (0.57) and/or classical (0.45) and/or jazz (0.31). It is in C major, with a tempo of about 148 bpm, and has a Tango rhythm (0.91). It has a bright timbre, is probably atonal (0.83), is labeled probably happy (0.63), but is most likely not relaxed (0.96).
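For readers curious how such a review might be assembled programmatically: each high-level classifier in AcousticBrainz reports a winning label and a probability, so pulling out the "confident" characteristics is a small filtering exercise. Below is a minimal sketch; the dictionary is hand-built to mirror the numbers above (not fetched from the API), and the field names follow AcousticBrainz's high-level naming, though the real JSON response nests these under a `highlevel` key with additional detail.

```python
# Illustrative AcousticBrainz-style high-level output, hand-built from the
# figures quoted in this review (not an actual API response).
highlevel = {
    "gender": {"value": "female", "probability": 0.81},
    "voice_instrumental": {"value": "voice", "probability": 0.78},
    "danceability": {"value": "not_danceable", "probability": 0.87},
    "mood_happy": {"value": "happy", "probability": 0.63},
    "mood_relaxed": {"value": "not_relaxed", "probability": 0.96},
    "tonal_atonal": {"value": "atonal", "probability": 0.83},
}

def confident_labels(hl, threshold=0.8):
    """Keep only classifier outputs whose probability clears the threshold."""
    return {name: out["value"]
            for name, out in hl.items()
            if out["probability"] >= threshold}

print(confident_labels(highlevel))
```

With the threshold at 0.8, this keeps the female-gendered, not-danceable, not-relaxed, and atonal labels and drops the less certain ones, which is roughly the hedging ("probably", "most likely") used in the prose above.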

The track is "With God on Our Side (feat. Joan Baez)" by Bob Dylan.


About this Entry

This page contains a single entry by Bob L. Sturm published on November 24, 2014 2:37 PM.

Paper of the Day (Po'D): The problem of accuracy as an evaluation criterion edition was the previous entry in this blog.

Daily review constructed from an AcousticBrainz track analysis: Nov. 25 2014 is the next entry in this blog.
