June 16, 2020 By Joseph P. Farrell

In my book Microcosm and Medium, I explored the idea that the arts in general, and music in particular, are a form of mind manipulation, usually (though not always) in the best sense. There is something "intangibly true" about that old adage that music is the universal language of mankind.

But there may be more to it than that, according to this article spotted and shared by M.W. (to whom a very big thank you for sending an article not having to do with Baal Gates, the Fauci virus, or the magical Gnostic transformations of language wherein rioters become protesters and looters become "alternative shoppers"):

Music Synchronizes the Brains of Performers and Their Audience

There's much in this article to speculate about, because the implications multiply like rabbits. But some things really stand out for their stunning ramifications if one thinks a moment. For example, what really caught my eye were these statements:

In the study, a violinist performed brief excerpts from a dozen different compositions, which were videotaped and later played back to a listener. Researchers tracked changes in local brain activity by measuring levels of oxygenated blood. (More oxygen suggests greater activity, because the body works to keep active neurons supplied with it.) Musical performances caused increases in oxygenated blood flow to areas of the brain related to understanding patterns, interpersonal intentions and expression.

Data for the musician, collected during a performance, were compared to those for the listener during playback. In all, there were 12 selections of familiar musical works, including “Edelweiss,” Franz Schubert’s “Ave Maria,” “Auld Lang Syne” and Ludwig van Beethoven’s “Ode to Joy.” The brain activities of 16 listeners were compared to that of a single violinist.

All the musical pieces resulted in synchronization of brain activity between the musician and listener, but this was especially true of the more popular performances. Interbrain coherence was insignificant during the early part of each piece and greatest toward its end. The authors explained that the listener required time to initially understand the musical pattern and was later able to enjoy the performance because it matched that person’s expectations.

Synchronous brain activity was localized in the left hemisphere of the brain, to an area known as the temporal-parietal junction. This area is important for empathy, the understanding of others’ thoughts and intentions, and verbal working memory used for expressing thought. It may function in the retrieval of sounds and patterns that give rise to musical expectations.

But it is the right brain hemisphere that is most often associated with interpretation of musical melody—in contrast to the left hemisphere, which is specialized for the interpretation of speech. In the right hemisphere, synchronization was localized to areas involved in recognizing musical structure and pattern (the inferior frontal cortex) and interpersonal understanding (the inferior frontal and postcentral cortices). These sites also involve “mirror neurons,” brain cells that are thought to enable a mirroring or internalization of others’ thoughts and actions.


...The brain activity of that person playing air guitar at your concert is closer to that of a true performer than you might have realized.

What fascinates me here is that this could be a way to explore aspects of the old doctrine of Affektenlehre by comparing performer-listener brain activity during the performance of specific musical procedures. Additionally, we're all probably familiar with "that person playing air guitar"; for us keyboardists, we play "air keyboards" or tap our fingers on the arm of a chair (or in my case, a crossed leg will do). But what the article is also suggesting is that the more-or-less abstract musical forms themselves - the rhythms, the harmonies, the patterns of notes in individual lines of music - are producing measurable physiological and neurological responses, simply by an individual's "active listening" to them.

It's this that I find the most fascinating, because this is straight out of Affektenlehre, i.e., the idea that music can "conjure" more or less general and objective intellectual and emotional responses. The description of that response may vary from individual to individual, and therein lies the reason, perhaps, that many people view emotional-intellectual responses to music as being predominantly subjective. But underneath these subjective descriptions, the article is suggesting that there are synchronous and similar brainwave patterns, and it would be those which would perhaps ultimately be shown to ground certain ideas from the old Affektenlehre cosmology.
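As a rough illustration of what "interbrain coherence" can mean in practice (this is a toy sketch, not the study's actual fNIRS analysis pipeline; the signals, coupling model, and window length here are all invented for demonstration), one can compute a windowed correlation between a "performer" signal and a "listener" signal that gradually locks onto it - mirroring the article's observation that coherence was insignificant early in each piece and greatest toward its end:

```python
# Toy illustration of "interbrain coherence": windowed Pearson correlation
# between two oxygenation-like time series. All names and parameters here
# are assumptions for demonstration, not the study's method.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)                 # a 60 s "performance" sampled at 10 Hz

performer = np.sin(2 * np.pi * 0.1 * t)     # slow hemodynamic-like oscillation
coupling = np.linspace(0.0, 1.0, t.size)    # listener "locks on" as the piece unfolds
listener = coupling * performer + (1 - coupling) * rng.normal(size=t.size)

def windowed_correlation(a, b, win=100):
    """Pearson correlation of a and b over consecutive non-overlapping windows."""
    out = []
    for start in range(0, len(a) - win + 1, win):
        x, y = a[start:start + win], b[start:start + win]
        out.append(np.corrcoef(x, y)[0, 1])
    return np.array(out)

coherence = windowed_correlation(performer, listener)
print(np.round(coherence, 2))  # correlation rises toward the end of the piece
```

Run as-is, the early windows show weak correlation (the listener signal is mostly noise) while the final windows show strong correlation, which is the qualitative pattern the study reports.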

And this suggests yet another implication. In Microcosm and Medium I referred to the early (and quite secret) research conducted by neurologists on brainwave responses to particular words. Eventually, this research led to the compilation of what I called "electro-encephalographic dictionaries": literal catalogues of brainwave responses to particular words, averaged across several individuals. These dictionaries, as I outlined there, could in turn be used to "read" an individual's "interior conversation" remotely, and eventually also led to the capability of projecting a conversation into someone's brain by modulating those brainwave patterns onto standard microwaves.

One wonders if a similar technique could be applied to specific melodic, contrapuntal, rhythmic, and harmonic procedures to create a kind of "electro-encephalographic lexicon" of Affektenlehre.

I suspect, given the implications of this article, plus some of the research that I reviewed in Microcosm and Medium, that this very probably has been underway for some time... This of course is MK-Ultra, set to music...

See you on the flip side...