King Arthur Flour has been having a moment recently, as people under stay-at-home orders turn to baking to fill time and relieve stress. The company has ramped up production and posted recipes, blog entries, and how-to videos aimed at the quarantine market.
I had actually been thinking about King Arthur flour and planning my own blog entry about it even before the virus hit us. In fact, I’d just picked up the interlibrary loan books I needed when my institution went to remote learning, but the demands of online teaching meant I haven’t been able to blog until now.
King Arthur flour had come up in my Making History class (an introductory course for history majors) as part of our study of medievalism (the appropriation of medieval material in modern contexts). We’d read about American medievalism in Marcus Bull’s Thinking Medieval, which had a passing mention of the Knights of King Arthur, a boys’ organization similar to the Boy Scouts, founded in New England in 1893.1 In class, I asked the students if they were aware of any other Arthurian connections associated with New England, and one student answered right away, “You mean King Arthur flour?” I did. But I didn’t know anything more about the company than the name and that they were based in New England. Time for research!
I knew that Howard Pyle’s illustrated retellings of the King Arthur stories (which I read as a child and which I’m sure influenced my choice of scholarly specialty) dated from around 1900.2 Given that the Arthurian boys’ clubs were founded in 1893, I wondered if the flour company also began around that time and could be seen as an example of a fin-de-siècle American fashion for things Arthurian.
It turned out that I was half right. The company was actually founded in Boston in 1790, not 1890, as Henry Wood and Company, selling flour imported from England. They began selling American-grown and milled flour in the 1820s. In 1896, the company, known since 1890 as Sands, Taylor, and Wood, introduced a new, high-end flour. One of the partners, George Wood, had recently attended a play about King Arthur in Boston, and, in the words of the company history published in their 200th anniversary cookbook,3
came away feeling that the values inherent in the Arthurian legends, purity, loyalty, honesty, superior strength and a dedication to a higher purpose, were the values that most expressed their feelings about their new flour. So it was decided that King Arthur would be its symbol.
The new product was introduced at the Boston Food Fair in 1896, promoted by a man dressed in armor riding through the streets of Boston on a horse. An article in the Boston Post described the scene:4
A horseman clad in glittering armor and armed cap-a-pie [head-to-foot] has been creating no small sensation of late as he guided his prancing steed through the streets of the Hub. No stranger contrast can well be imagined than this figure of medieval romance set down in the busy turmoil and traffic of modern Boston.
It seems at first sight that one of Walter Scott’s heroes had come to life again, or, perchance, that a new Don Quixote had arisen to tilt against the deadly trolley.
The Crusaders’ cross gleams on the coat of mail and adorns the silken standard that he bears aloft. It is, in truth, King Arthur come to earth again—the picture of that gallant warrior is literally perfect. The standard bears the legend ‘King Arthur Flour,’ and the inference is obvious—that as King Arthur was a champion without fear and above reproach, so is King Arthur Flour the peerless champion of modern civilization.
The writer makes up in enthusiasm what he lacks in accuracy. Sir Walter Scott’s novel Ivanhoe (which we also read in Making History) was enormously popular in the nineteenth century and enormously influential in creating the modern understanding of the Middle Ages (which is the reason we read it in Making History), but it has nothing to do with King Arthur—it’s set in England in the late twelfth century, the time of Good King Richard and Bad Prince John.5 Further, King Arthur never went on Crusade, so his depiction with a Crusaders’ cross can’t really be called “literally perfect.” And I’m not at all sure what Don Quixote is doing there. Over a thousand years of history has been compressed into a single medieval moment—King Arthur, Crusaders, Wilfred of Ivanhoe, and Don Quixote all ride through the streets of modern Boston at the same time.
This medieval mashup is also seen in the original King Arthur Flour logo, a version of which still adorns the company’s bags of unbleached flour. In the original version, not only is King Arthur dressed as a Crusader—a Templar, in fact—but in the background you can see palm trees and a desert sun setting behind the walls of a Middle Eastern city.
This discrepancy was eventually noticed by the company, and in the 200th anniversary cookbook, the author, Brinna Sands (married to a fifth-generation member of the Sands family that had been involved with the company since 1840), wrote,6
When our logo was conceived a century ago, the artist inadvertently placed King Arthur in the Middle East as if he were a crusader. King Arthur may have been a crusader, but not in the sense the term is generally accepted. His Crusade was in the land of hill fort “castles” and ancient oaks which we have substituted for the palm trees and mosques.
The logo is simplified now, with neither palm trees nor ancient oaks in the background. They still celebrate the Arthurian connection, however. The company name was changed from Sands, Taylor and Wood to The King Arthur Flour Company in 1999.7 The commercial product line (sold only in 50-pound bags) includes Sir Galahad (all-purpose), Sir Lancelot (high gluten), and Round Table (low protein) flours. Queen Guinevere, however, has been dethroned, as her namesake product, a bleached cake flour, was discontinued when they developed an unbleached version. The campus in Vermont, where they moved in 1984, is known as Camelot.
I’ll definitely be doing some research on the Arthurian boys’ clubs for a future blog. But now I think I’ll go bake something.
I wrote this essay originally for a summer faculty seminar at Mount St Mary’s University held in 2007 on the American founding, directed by Dr Peter Dorsey of the English department. I learned most of the medieval material from the graduate seminars I took from Brian Tierney at Cornell: Francis of Assisi and the Franciscans, Church and State in the Middle Ages, Medieval Conciliarism, and Medieval Canon Law, as well as from his published works as specified in the notes. I publish it here as a tribute to him.
In the preface to his 1927 book The Twelfth-Century Renaissance, Charles Homer Haskins wrote,
“The title of this book will appear to many to contain a flagrant contradiction. A renaissance in the twelfth century! Do not the Middle Ages, that epoch of ignorance, stagnation, and gloom, stand in the sharpest contrast to the light and progress and freedom of the Italian Renaissance which followed? How could there be a renaissance in the Middle Ages, when men had no eye for the joy and beauty and knowledge of this passing world, their gaze ever fixed on the terrors of the world to come?”
Haskins’ rhetorical questions apply equally to my subtitle. Medieval Origins of the Declaration of Independence! Do not the Middle Ages, that epoch of ignorance, stagnation, and gloom, stand in the sharpest contrast to the light and progress and freedom of the Enlightenment? How could the Middle Ages have anything to do with the Declaration of Independence, when medieval people knew nothing of natural rights, government by consent, or a right of rebellion, their gaze ever fixed on the terrors of the world to come?
Haskins justified his title this way: “The answer is that the continuity of history rejects such sharp and violent contrasts between successive periods, and that modern research shows us the Middle Ages less dark and less static, the Renaissance less bright and less sudden, than was once supposed.”1 The historiographical movement Haskins represented has been termed “The Revolt of the Medievalists”; this is the tradition in which Brian Tierney worked. I would like to extend this revolt by tracing the medieval roots of the Declaration.
In an episode of the 1980s sitcom Family Ties, Alex Keaton, played by Michael J. Fox, dreams that he is watching Thomas Jefferson (played by Alex’s father in the dream) draft the Declaration of Independence. In addition to proposing a few changes in wording (“try ‘self-evident,’ Mr. Jefferson”), Alex also suggests that he use “brown crinkly paper” to write it on.2 The scene works because not only do we know what the Declaration is supposed to sound like—we know that it says “we hold these truths to be self-evident,” not “really obvious”—we know what it’s supposed to look like. And what it looks like is medieval.
Thomas Jefferson didn’t produce the famous version on the brown crinkly paper (actually parchment, which is also what medieval documents were written on); his handwritten draft survives, written in a surprisingly legible hand and with no special formatting. After the final text of the Declaration was approved on July 4, the Continental Congress directed that the text be “engrossed on parchment.”3 This task was undertaken by Timothy Matlack, assistant to Charles Thomson, Secretary of the Congress. 4 The layout of Matlack’s version follows point by point the layout of medieval papal documents.
Here we see the familiar image of the official copy of the Declaration below a privilege issued by Pope Gregory IX in 1234. The languages and the scripts are different, of course, but otherwise the two documents look strikingly similar. Both highlight their opening words in larger letters that use a different script from the main body of the text. Both occasionally vary the script used in the body. Both have witnesses’ signatures, each of which is accompanied by a distinguishing flourish, arranged in vertical columns at the bottom of the page, with the main signatures (John Hancock and Pope Gregory IX) larger and in the center.
I’m not suggesting, of course, that Timothy Matlack had a papal charter next to him when he dipped his quill into the ink to begin writing “In Congress.” My hypothesis is that the visual format of ecclesiastical documents influenced the look of royal documents, probably by means of clergymen working in royal chanceries—it’s no accident that “cleric” and “clerk” are the same word in Latin (clericus). Then I suspect that colonial charters followed the format of other royal documents, and the colonial charters influenced the look of the Declaration. Someday I’ll take the time to document this hypothesis (yet another retirement project!).
This same path was followed by several of the important ideas in the Declaration. Concepts developed in the church, especially by canon lawyers, were applied to secular governments, including the kingdom of England, and then some of them crossed the Atlantic. These ideas include the existence of natural rights, government by consent, and the right of rebellion.
Among the truths that the Declaration of Independence holds to be self-evident is that all men “are endowed by their Creator with certain inalienable rights, that among these are life, liberty, and the pursuit of happiness.” Jefferson, clearly, is drawing on the early modern philosophical tradition of natural rights, especially as developed by John Locke (whose formulation was “life, liberty, and property”). But the early modern rights theorists were themselves drawing on a medieval tradition that began with the twelfth-century Decretists and really got going in the fourteenth-century disputes between the Franciscans and the papacy, a tradition that has been documented in the works of Brian Tierney and which I studied in his courses.
Gratian’s Decretum is a collection of canon law texts compiled around 1140; it includes papal decrees, conciliar pronouncements, and excerpts from the Church Fathers, all arranged topically. Many of the texts contradict each other (the collection’s official title is Concordia Discordantium Canonum, or “Concordance of Discordant Canons”), so canon lawyers immediately began to write commentaries that explored the issues raised by these opposing texts. These commentators on the Decretum are called Decretists.
One of the issues the twelfth-century Decretists debated in their commentaries was the origins of private property. The Decretum includes a text that states, “by natural law all things are common.” Human institutions are supposed to reflect natural law, so the Decretists needed somehow to account for the existence of private property. The Decretist Huguccio, for example, concluded that “common” (commune in Latin) meant “to be shared [communicanda] in time of necessity.” But otherwise, individuals had a right to their own property. 5
All the definitions, distinctions, and speculations of the Decretists regarding property were put to good use in the fourteenth-century Franciscan poverty disputes. For centuries Benedictine monks had given up all personal possessions when they joined the monastery, but the monastery as a whole owned plenty of property which the monks were able to share (their model was the early Christian community described in Acts 4:32-35, which “held everything in common”). In his attempt to follow the commandments of the Gospel literally, Francis of Assisi had embraced absolute poverty, enjoining his friars, as the formula had it, to own nothing “either individually or in common.” The problem is that it’s difficult to live that way, especially as the order grew larger and expanded its ministry. The working solution, established in the bull Ordinem vestrum issued in 1245 by Pope Innocent IV, was that buildings, furniture, books, clothing and so forth donated to the Franciscans would be owned by the church as a whole and just “used” by the Franciscans.
This compromise distinguishing between ownership and use was not acceptable to all the Franciscans, however. A splinter group, known as the Spirituals, saw this compromise as a corruption of Franciscan ideals (and therefore of the Gospel). They insisted on what they called “poor use”—it wasn’t enough simply to renounce legal ownership; one should actually live in poverty. The papacy saw the Spirituals as dangerous, because they could easily go from claiming to be holier than churchmen who lived in luxury to claiming that all property held by the church was illegitimate, because it was contrary to the absolute poverty of Christ and the Apostles.
The Spirituals’ position played into the hands of supporters of the Holy Roman Emperor against the temporal claims of the papacy. Faced with this threat, in 1323 Pope John XXII, in the bull Cum inter nonnullos, declared the belief that Christ and the Apostles were absolutely poor to be heretical. To justify the papal position, opponents of the Spirituals asserted a natural right to property. They argued that it is impossible to renounce this right (in other words, it is inalienable) because, while one might give up one’s possessions, one can never renounce the right to one’s own body or to items consumable in use (like food—how can you say you don’t own the food that you swallow and digest?). The rich tradition of medieval discussion of rights was passed on to the seventeenth-century theorists. 6
Another self-evident truth found in the Declaration of Independence is that governments derive “their just powers from the consent of the governed.” The idea that this consent could best be expressed by means of a representative assembly such as a Parliament developed in the Middle Ages. Eighteenth-century American Whigs frequently referred to the Magna Carta as one of the sources of their rights as Englishmen. Item twelve of Magna Carta states that the King agrees that “No scutage or aid [types of monetary contributions to the crown] may be levied in our kingdom without its general consent.” (This, of course, is the urtext for “No taxation without representation”).
But an additional source for the idea of consent comes not from common law but from canon law. Beginning around 1200, canonists began to cite a formula they found in Roman law (although in a completely different context 7), Quod omnes tangit ab omnibus approbetur, or “What touches all ought to be approved by all,” when referring first to the operation of ecclesiastical corporations (such as monasteries, religious orders or cathedral chapters) and then as a justification for church councils. 8
Soon the phrase began to appear in secular contexts as well. For example, in 1293, the government of the Florentine popolo issued a law code called the Ordinances of Justice whose first rubric echoes Quod omnes tangit: “that is agreed to be most perfect which . . . is approved by the judgment of all.” 9 Two years later, King Edward I of England issued a summons to Parliament that included these words: “a most just law, established by the careful providence of sacred princes, exhorts and decrees that what affects all, by all should be approved.” 10
When a government based on consent begins to act tyrannically, wrote Jefferson, the people have a right to rebel against it: “When a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government.” This passage is another part of the Declaration that is usually considered to be Lockean. But again, medieval canonists also wrote about circumstances in which a people might remove their ruler. In the twelfth century, Decretists became concerned about what would happen if the pope were to be a heretic. This would be very dangerous, because a heretical pope could infect the faithful with his incorrect teachings and thereby jeopardize their eternal salvation. The Decretists concluded that the Church, as represented in a General Council, could depose a heretical pope. But some Decretists took their logic a bit further—what if the pope committed not heresy but some other serious sin? Given his position, wouldn’t that be just as bad? What if, for example, he were a fornicator? Huguccio imagines the situation:
“But I believe that it is the same in any notorious crime, that the pope can be accused and condemned if, being admonished, he is not willing to desist. What then? Behold, he steals publicly, he fornicates publicly, he keeps a concubine publicly, he has intercourse with her publicly in the church, near the altar or on it, and being admonished will not desist. Shall he not be accused? Shall he not be condemned? Is it not like heresy to scandalize the church in such a fashion?”11
Huguccio’s list of imaginary papal sins reminds me of the crimes Jefferson attributed to George III. Huguccio denied that a General Council could actually sit in judgment on a sinful pope, because a pope can be judged by no one. Rather, because of his sins he was incapable of being pope and therefore automatically deposed himself. The Council simply “declared” that he had done so. Is that possibly what Jefferson thought he was doing when he listed the king’s crimes as part of declaring independence?
The twelfth-century canonists were thinking hypothetically, but in the late fourteenth century a situation actually arose in which the pope appeared to be endangering the whole body of the church. During the Great Schism, which began in 1378, first two, then (after 1409) three men all claimed to be the legitimately elected pope and all of them refused any concession or compromise. Drawing on the ideas of Decretists like Huguccio, writers known as Conciliarists argued that in such a dire situation the whole church, as represented in a General Council, had a right to depose the schismatic popes. The first attempt to do so, the Council of Pisa, failed (that’s where the third pope came from), but the 1415 Council of Constance successfully asserted the powers of a General Council, deposed all three popes, and elected a new one.
The Conciliarists, however, went beyond the emergency situation; they believed that the church would be better off if General Councils met regularly, instead of only in a crisis, in hopes that crises would not develop. In other words, they believed in parliamentary government for the church. In 1417 the Council of Constance issued the decree Frequens, which stipulated that from then on General Councils should meet at regular intervals. 12
The Conciliarists, as you may have noticed, ultimately were not successful; the Catholic Church did not become a constitutional monarchy. But their writings were eagerly adapted by seventeenth-century English Parliamentarians during the English Civil War—wherever they read “Pope” they substituted “King,” and for “General Council” they substituted “Parliament.” The 1689 English Bill of Rights includes a provision “that for redress of all grievances, and for the amending, strengthening and preserving of the laws, Parliaments ought to be held frequently.” This was clearly inspired partly by Charles I’s failure to summon Parliament between 1629 and 1640, but it is also a definite echo of Frequens.
Alan Gibson, in Interpreting the Founding, characterizes J.G.A. Pocock’s republican approach to the American founding as
“a sweeping narrative that traces the transmissions and transformations of the civic humanist tradition of political thought through three reconstructions: the first in fifteenth- and sixteenth-century Florence (“The Machiavellian Moment”), the second in early modern England (“The Harringtonian Moment”), and a third in revolutionary America.”13
I would like to suggest that Pocock’s “sweeping narrative” didn’t begin far enough back, and further that it is itself trapped in a sweeping narrative invented in the Renaissance and strengthened in the Enlightenment—the threefold periodization of the western tradition into ancient, medieval and modern. American history is modern history; it therefore by definition can’t be medieval. Brian Tierney, the historian who has done the most to uncover the medieval, and especially the canonical, roots of modern political ideas, has written that the history of constitutional thought can’t be understood “unless we consider the whole period from 1150 to 1650 as a single era of essentially continuous development.” 14
Or, to put it another way, perhaps we should consider the ideological origins of the American Revolution to begin, not with a Machiavellian, but a Huguccian Moment.
One gorgeous summer afternoon a few years ago, while in Gettysburg, PA for a chamber music camp, I used our afternoon break from playing string quartets to visit the Gettysburg National Military Park and take some photos. I stopped the car at the most prominent monument I could see, which turned out to be the Pennsylvania Monument.
As I was walking around it looking for good photographic angles, I noticed how the summer sky was framed by the monument’s arch. “That’s beautiful,” I thought. “It looks just like a painting.” Then I realized, “Hold on—it looks like THE painting.” A quick search on my phone confirmed my suspicion. The Pennsylvania Monument is indeed very similar to the architectural setting of Raphael’s 1509 fresco The School of Athens, right down to the sky framed by the arch. (The other tourists visiting the Battlefield that day probably wondered why I was jumping up and down in excitement).
So, was The School of Athens the inspiration for the design of the Pennsylvania Monument? The monument was commissioned in 1907 by the Pennsylvania state legislature; architect W. Liance Cottrell was awarded the job. (Sculptor Samuel Murray, who studied with Thomas Eakins, got the sculpture commission.) The monument was still incomplete when dedicated in 1910; more money was appropriated and the finished memorial was rededicated on July 1, 1913, as part of the 50th anniversary commemoration of the Battle of Gettysburg.
I have so far been unable to find any evidence that Cottrell had Raphael’s fresco specifically in mind when he designed the memorial to Pennsylvanians who fought at Gettysburg. Cottrell was trained in the Beaux-Arts school of architecture, which made extensive use of classical style. Raphael and Cottrell may simply have chosen the same classical elements for their creations. But I like to imagine that Cottrell tried to bring Raphael’s imaginary building to life on the field of Gettysburg.
While white women were pushed to the margins of the Fair, the contributions of African-Americans to the story of American progress were not simply marginalized; they were erased. Not for nothing was the Fair nicknamed the “White City.” Only European-derived culture and achievements could be displayed in those gleaming neo-classical buildings. Visitors to the Fair could see Africans themselves displayed on the Midway in Dahomey Village, one of the living ethnological villages whose purpose was to set the utopian vision of progress in the adjacent White City into sharper relief.1 But no African-Americans were on the Fair’s planning commissions; no building was dedicated to the progress they had made since the abolition of slavery. There was a “Colored American Day,” analogous to other special “Days” at the Fair arranged to boost attendance. Antonin Dvořák, who was summering that year in Spillville, Iowa, conducted his Eighth Symphony and other works on Bohemian Day, for example. African-American musicians Harry T. Burleigh and Will Marion Cook (both of whom studied with Dvořák at the National Conservatory in New York) joined poet Paul Laurence Dunbar for a program on Colored American Day at which Frederick Douglass also spoke. Otherwise, African-American participation was unofficial and undocumented.
It is generally believed, based on oral traditions, that several “Piano Professors,” as they were called, playing music that would soon be known as “ragtime,” performed either on the Midway or at various establishments in the neighborhood of the Fair. Despite a lack of written documentation, scholars concur that Scott Joplin, the “King of Ragtime Writers,” was probably one of these Piano Professors. Ragtime has been called the first indigenous American musical style. Joplin established the ragtime form in his “Maple Leaf Rag” of 1899, which also became his biggest hit. Joplin composed over forty other rags after “Maple Leaf,” including the “Cascades” Rag inspired by the 1904 St Louis Fair commemorating the Louisiana Purchase, which he definitely did attend.2
The World’s Columbian Exposition closed over 125 years ago, on October 31, 1893. Little of the physical Fair remains today. The buildings of the White City, which were never intended to be permanent, are all gone, except for the Fine Arts building, now the Museum of Science and Industry. Besides its name, the Midway survives only as a wide grassy strip on the University of Chicago campus. The legacy of the Fair remains, however, in perhaps unexpected places. If you’ve ever ridden on a Ferris Wheel or enjoyed the midway at a county fair; drunk Welch’s grape juice or eaten Cracker Jack (both introduced at the Fair); recited the Pledge of Allegiance (written for the Fair’s Dedication Day ceremonies) or sung the fourth verse of “America the Beautiful” (with its reference to “alabaster cities”), you can thank the Chicago World’s Fair.
The Fair also left a musical legacy. Concert-goers who attend classical performances still mostly hear the music of dead European males, although, after being mostly forgotten after her death in 1944, Amy Beach has enjoyed a renaissance in recent years. Similarly, ragtime faded in popularity in the early twentieth century (although not before it influenced jazz), but experienced a revival in the 1970s, especially after the 1972 movie The Sting used Joplin tunes in its soundtrack. (For a time, it seemed that every piano student in the land was playing an arrangement of “The Entertainer.”) The issues raised by the experience of music at the Chicago World’s Fair—what to play, who should play it, how do you get an audience to come hear it, and how do you pay for it—are familiar to every classical music organization today.
For Further Reading:
Berlin, Edward A. King of Ragtime: Scott Joplin and His Era. New York and Oxford: Oxford Univ. Press, 1994.
Curtis, Susan. Dancing to a Black Man’s Tune: A Life of Scott Joplin. Columbia and London: Univ. of Missouri Press, 1994.
Both Thomas’ program of concerts and the Fair as a whole were designed to display progress. But progress is by its nature a comparative concept. The idea of progress as it arose in the Enlightenment implies that a society has journeyed from a worse state to a better one. So demonstrating progress requires showing its opposite— knowledge to compare with ignorance, reason with superstition, civilization with barbarism. This ideology of progress was mapped onto the geography of the Fair. Although Bertha Honoré Palmer, President of the Fair’s Board of Lady Managers, had negotiated a Women’s Building to celebrate female accomplishment, and engaged a woman architect, Sophia Hayden, to design it, the Women’s Building was not deemed worthy of a prime location on the Court of Honor.1 Rather, it was pushed, literally, to the margin of the Fair, on the extreme edge of the main Fair grounds adjacent to the Midway. In the Fair’s hierarchy, white women occupied a borderline space, on the threshold between the civilization of the White City and the barbarism of the Midway.
Women’s music was marginalized as well. Like Chadwick and Paine, composer Amy Beach is also considered a member of the Second New England School. Like Chadwick and Paine, she was commissioned to write a work for Dedication Day in October 1892. Unlike Chadwick and Paine, however, Beach was not to hear her piece performed at that ceremony. After much back-and-forth between male Fair officials and Bertha Palmer, Beach’s composition, the “Festival Jubilate” for chorus and orchestra, a setting of Psalm 100, “O be joyful in the Lord, all ye lands” (Opus 17), was instead performed at the dedication of the Women’s Building on May 1, 1893. The lack of music by women composers at Paderewski’s concert was typical of the programming of the rest of the Music Hall concerts (and, of course, typical of much classical music programming even today).2
Although Beach had already written one large-scale work, a Mass in E-flat (Opus 5, 1890), which could have been performed at one of the Choral Hall concerts, she was not given a place in any of the concerts planned by Thomas. She did return to the Fair on July 5-7 for the Women’s Musical Congress. The Fair’s organizers sponsored numerous International Congresses that ran concurrently with the Fair, meeting in downtown Chicago’s newly-constructed Art Institute. The Congresses assessed the state of the topic, discussed controversial issues, and debated what progress had been made and what remained to be done.3
Beach performed her own compositions on each of the Congress’ three days. The pieces she chose for these performances were not the large-scale works like symphonies and concertos that were featured in the Music Hall series. Rather, Beach highlighted smaller-scale genres whose very names—parlor songs, salon pieces, chamber works—emphasize the domestic setting that women musicians were associated with. On July 5, she played two piano pieces, “In Autumn” and “Fireflies,” from her Opus 15, Sketches, published the previous year. The following day she premiered her Romance for violin and piano, Opus 23, with Maud Powell, the first American violin virtuoso, as the soloist. The final day of the Congress, she accompanied vocalist Jeannette Dutton on Beach’s song “Sweetheart, Sigh no More,” whose melody she had adapted for the Romance. Although much of Beach’s oeuvre falls into these domestic genres, she did not confine her creative output to the parlor. In the years following the Fair, she composed her Gaelic Symphony in E minor, opus 32 (1897) and her Piano Concerto in C# minor, opus 45 (1900), both premiered by the Boston Symphony (the Concerto with Beach as the soloist).
Next: Progress and Piano Professors
For Further Reading:
Block, Adrienne Fried. Amy Beach, Passionate Victorian: The Life and Work of an American Composer, 1867-1944. New York and Oxford: Oxford Univ. Press, 1998.
Feldman, Ann E. “Being Heard: Women Composers and Patrons at the 1893 World’s Columbian Exposition.” Notes, 2nd series, 47, no. 1 (Sept. 1990), 7-20.
Let’s return to the Paderewski story. Theodore Thomas, a prominent conductor in late-nineteenth-century America, had recently become the conductor of what would later be known as the Chicago Symphony Orchestra; the Exposition Orchestra was in fact mostly made up of Chicago Symphony musicians. Thomas was also the Director of the Music Bureau of the Fair, and he had planned an ambitious series of concerts for the Fair’s six-month run. The Paderewski concert was the first of what was intended to be a series of orchestral concerts in the Music Hall; additional concerts were planned for the Choral Hall, the Fair’s other indoor music venue, as well as outdoor band concerts.
The Paderewski Concert: The Program
The program for Paderewski’s concert consisted entirely of well-known works by European composers, all (except for Paderewski himself) dead and all (except for the Poles Paderewski and Chopin) German. The program choices fit in with one of Thomas’ stated aims, to educate the American public and elevate their musical taste: “to bring before the people of the United States a full illustration of music in its highest forms, as exemplified by the most enlightened nations of the world.” To Thomas, the “highest form” of music was symphonic; the “most enlightened nation” was Germany. This aim perhaps conflicted with Thomas’ other goal, “to make a complete showing to the world of musical progress in this country.”1 Thomas had commissioned two works by American composers for the Fair’s Dedication Day in October 1892, the “Columbian Ode” by George Whitefield Chadwick and “Columbus March and Hymn” by John Knowles Paine, two leading American composers of the day and members of what is now known as the Second New England School.2 But when it came time to inaugurate his concert series, he chose a European musician performing European repertoire.
The Paderewski Concert: The Piano
Paderewski played the concert on a Steinway piano. He was what we would now call a “Steinway Artist”—Steinway and Sons supplied the instrument for his entire U.S. tour.3 Many pianos were on display in the immense Manufactures and Liberal Arts Building on the Fair’s Court of Honor. Piano-makers like Chickering, Kimball, Everett, and many others now forgotten showcased their latest models. Makers of accessories like piano stools and component parts like piano wire were also present. Some displays were quite creative: Alfred Dolge and Son, maker of hammers, dampers, and, as the official report on the display of musical instruments put it, all the “woolly parts” of instruments, adorned its display with lampposts in the shape of giant piano hammers.4
It is not surprising that pianos should be featured so prominently at the Fair. 1893 was in the midst of the Golden Age of the piano—it was standard equipment in every middle-class home, and a standard part of the education of every middle-class young girl, one of the “accomplishments,” along with drawing and needlework, that would show she was a lady. Many of the piano companies exhibiting at the Fair employed such accomplished young ladies, referred to as “pianistes,” to demonstrate their products.
Furthermore, the piano conformed to the Fair’s ideology of progress. The design and manufacture of pianos underwent significant improvements in the course of the nineteenth century. In 1895, Charles Daniell asserted that if Bartolomeo Cristofori, the 18th-century inventor of the modern piano, had “visited the World’s Columbian Exposition he would have been amazed at what he saw.” Daniell explained that “the evolution of the piano has been very great, from the tinkling little clavichord of the early eighteenth century to the perfect instrument of today.” He concluded that the exhibitors at Chicago “proved their spirit of progressiveness as never before.”5 It is fitting, therefore, that the first Music Hall concert should feature the piano.
The piano exhibitors, however, did not find it fitting at all. They had nothing against Paderewski himself or the choice of repertoire; it was his Steinway piano they objected to. Steinway and Sons, as well as some other eastern piano companies, had chosen not to exhibit at the Fair because they objected to the procedure to be used for awarding prizes. When the exhibitors heard that Paderewski planned to play his accustomed Steinway, they protested, demanding that he use a piano from one of the exhibiting companies. He refused, and what we would now call a flame war ensued in the Chicago and New York papers. Supported by Theodore Thomas, Paderewski prevailed, but it was not an auspicious beginning to Thomas’ concert series.
The inauspicious beginning didn’t get much better. After Paderewski’s opening concerts, which probably benefited from the soloist’s celebrity status (not to mention the publicity generated by the piano controversy), the remainder of Thomas’ carefully-planned Music Hall concerts played to near-empty houses. Maybe it was the one-dollar admission fee—twice the cost of admission to the Fair itself—that kept the crowds away. The Panic of 1893, a serious economic depression that began that summer, probably also contributed. Maybe it was Thomas’ insistence on programming “serious” music with no concession to popular taste, since the more pops-oriented concerts, which were free, packed them in. In fact, the most popular musical performances were the open-air band concerts. By August 12, Thomas’ accumulated problems had cost him the support of the Fair’s organizers, and he resigned.
Next: Progress and Parlor Music
For Further Reading:
Frank D., and Charles A. Daniell. Musical Instruments at the World’s Columbian Exposition. Chicago: The Presto.
David M. “From Yankee Doodle Thro’ to Handel’s Largo: Music at the World’s Columbian Exposition.” College Music Symposium 24, no. 1 (Spring, 1984), 81-96.
On May 2, 1893, Polish piano virtuoso Ignaz Paderewski performed at the inaugural concert of the Music Hall on the grounds of the World’s Columbian Exposition in Chicago (also known as the Chicago World’s Fair). The program opened with the 114-member Exposition Orchestra, conducted by Theodore Thomas, playing Beethoven’s “Consecration of the House” overture, followed by Paderewski performing as the soloist in his own piano concerto, playing his preferred Steinway instrument. This was followed by a selection of solo piano works by Chopin and Schumann. The orchestra returned to conclude the concert with Wagner’s Prelude to Die Meistersinger.
This apparently unremarkable story of a performance actually encapsulates the story of music, particularly piano music, at the Chicago World’s Fair. Every aspect of the performance—the event itself, the program, and the instrument—can serve as a window into the context of the Fair’s musical activities. At the same time, this seemingly routine account masks tensions regarding American identity, between highbrow and lowbrow forms of entertainment, and over the status of women and African-Americans that disturbed not only the Fair but also Gilded Age American society as a whole.
The Chicago World’s Fair
The Fair commemorated the 400th anniversary of Columbus’ discovery of America. It was located in Jackson Park on the shore of Lake Michigan, seven miles south of the Loop, where landscape architect Frederick Law Olmsted and supervising architect Daniel Hudson Burnham created what became known as the “White City.” The individual fair buildings, although designed by different architects, adhered to a common Neo-Classical style, known as “Beaux-Arts” from the school in Paris where many architects trained, and were all painted white. The main exhibition buildings, such as Machinery Hall, the Agriculture Building, and the gigantic Manufactures and Liberal Arts building, were arranged around a basin carved by Olmsted out of the marshy lakeshore and called the “Court of Honor.”
Perpendicular to the fairgrounds proper ran the “Midway Plaisance,” a wide boulevard about a mile long. Here were gathered not only food concessions, rides, and other entertainment options—giving its name to the “midway” of every subsequent state and county fair with their carnival rides and cotton-candy stands—but also living ethnological exhibits and the Fair’s signature attraction, the great Wheel designed by George Washington Gale Ferris and intended to surpass the iron tower constructed by Gustave Eiffel for the Exposition Universelle held in Paris in 1889.
The World’s Columbian Exposition was dedicated on October 21, 1892. Hold on, you say, isn’t Columbus Day October 12? Yes it is, but New York City had scheduled its Columbus commemoration for that day and Chicago didn’t want to compete either for attention or for dignitaries—they were hoping U.S. President Benjamin Harrison would attend. So they creatively reasoned that if the Gregorian calendar had been in use in 1492, the day Columbus sighted land would have been October 21, not October 12, which makes October 21 the “real” Columbus Day. As it turned out, Benjamin Harrison couldn’t come, as his wife was dying, but he sent Vice President Levi Morton in his place. Morton expressed the overall purpose of the Fair when he dedicated it “to the world’s progress in arts, in science, in agriculture, and in manufacture.”1 The new President, Grover Cleveland, did attend the Opening Day of the Fair on May 1, 1893. The building of the Fair continued through the winter of 1892-93, and it opened to the public on May 1, 1893, closing six months later on October 31.
Next: The Paderewski Concert
For Further Reading:
Badger, R. Reid. The Great American Fair: The World’s Columbian Exposition and American Culture. Chicago: Nelson Hall, 1979.
Harris, Neil, Wim de Wit, James Gilbert, and Robert W. Rydell. Grand Illusions: Chicago’s World’s Fair of 1893. Chicago: Chicago Historical Society, 1993.
Larson, Erik. The Devil in the White City: Murder, Magic, and Madness at the Fair that Changed America. New York: Vintage Books, 2004.
Levine, Lawrence W. Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America. Cambridge, Mass.: Harvard Univ. Press, 1988.
Muccigrosso, Robert. Celebrating the New World: Chicago’s Columbian Exposition of 1893. Chicago: Ivan R. Dee, 1993.
Rydell, Robert W. All the World’s a Fair: Visions of Empire at American International Expositions, 1876-1916. Chicago and London: Univ. of Chicago Press, 1984.
Rydell, Robert W., John E. Findling, and Kimberly D. Pelle. Fair America: World’s Fairs in the United States. Washington and London: Smithsonian Institution Press, 2000.