Posted on August 18, 2011 at 9:05 AM
Historically, American scholastic marching band has been associated with competition. From its earliest days around the turn of the 20th century, youth marching band and drum and bugle corps organizations were sponsored by VFW posts, Boy Scout troops, and church youth programs. These groups made the move into school systems along the way primarily as entertainment during football games.
Most of the youth organizations centered around summer competitive shows sponsored by the VFW in which units would start at the goal line (the starting line), march to midfield playing a march, post the colors on the front sideline, perform selections in concert formation, then parade off past the opposite goal line (the finish line). Groups were evaluated primarily on adherence to flag code and the level to which they performed cleanly and precisely. Evaluators typically used a "tick system", keeping a tally of hash marks on their clipboard for every error they saw or heard. The group with the fewest ticks would win.
Over the decades, as marching units became more innovative and artistic in nature, the system of adjudication adjusted to reward creativity and originality but still maintain precision as a major component. In the past ten years, most competitive circuits have adopted what is a very effective system of rewarding both creativity and precision for a wide range of groups, from novice units with a small skill set to the most advanced professional-level performers.
It's All In The Language
Judges' sheets vary from circuit to circuit primarily in their language and in how the numbers are managed. Most sheets are divided into two subcaptions, usually displayed on the sheet as the "top box" and "bottom box". Typically, the top box evaluates the content of the show from a design standpoint. It's the "what" in the equation. For example, on an Ensemble Music judges' sheet, the top box evaluates the effectiveness of the musical arrangements being played by the unit from a construction standpoint (do the arrangements highlight every section of the unit? Do they demonstrate the technical capabilities of the performers? etc.).
The bottom box typically evaluates the level of precision with which the group performs the content from the top box. It's the "how" in the equation. This box is often called "Excellence" and evaluates the technical performance of the unit - how well do they perform the content? For the aforementioned Ensemble Music judge, this box evaluates questions like "Are the voices of the ensemble equally present in balance? Does the unit perform with consistent intonation across sections? Are rhythmic timing and alignment consistent across the ensemble at all times?"
The language used to describe these two boxes is incredibly important in determining how an adjudicator evaluates the ensemble. Just a small change in the wording can have a tremendous effect on the kind of feedback the group receives.
Numbers Management - How Do They Come Up With Those Scores?
Scoring a competitive music ensemble is a challenging endeavor. Judges refer to the process of scoring units over the course of a competition as "numbers management." Because of the subjective nature of the art form, personal opinion is reflected in a judge's scoring and commentary. However, over the course of competitive music's long history, the system of evaluation has become quite effective at establishing a common level of expectation based on the construction of the judges' sheets and the general quality level of the competitive units in any given circuit of competition.
Scoring is typically divided into either four or five point ranges that are also known as boxes. These point ranges vary from one competition circuit to another. These boxes loosely correspond to the adjectives of "Superior" (box 5), "Excellent" (box 4), "Good" (box 3), "Fair" (box 2), and "Poor" (box 1). Traditionally, all competitive music judging systems use a 100-point scale to rank and rate competitive units. Very few, if any, judging systems allow a score lower than 50. Most units typically start the competitive season scoring in the 60-80 point range and finish the season scoring anywhere from the upper 70's to the upper 90's.
Each box carries descriptive phrases that characterize the level of performance in that box. Adjectives like "rarely", "sometimes", "frequently", or "always" are used to quantify the level of quality of a performance. To score a unit, a judge will reflect on the commentary they provided on the verbal recording and on the written sheet, then decide which of the scoring boxes the unit's performance in their caption falls into. They will then decide where within the spectrum of that box the unit's score should be - are they a box 2 approaching box 3? - and so on.
Ranking And Rating
When it comes to scoring, a judge's job is to first get the competitive units ranked in the correct order based on the performance displayed during this competition only. In general, judges do not compare scores from week to week and contest to contest (though we know that most band/corps directors do). Judges do begin to evaluate units as soon as they have wrapped up the scoring and commentary from the unit before, even if the official adjudication period has not begun yet. For example, how a unit comes into the stadium and takes the field already communicates quite a bit about their level of experience and skill as well as the level of excellence they have achieved so far.
A judge will usually be able to decide within the first few minutes of a show which scoring box this performance belongs in. As the show progresses, the judge may adjust this reaction up or down based on the entire content of the show. When it comes time to write down the number, the judge will decide which part of which box the unit's score belongs in. They will also take into consideration how many more units have yet to compete and will typically leave enough space between the scores they assign to allow them to score another group. The primary goal is to get the ranking order correct, then score the units appropriately based on their performance that day.
Individual judges don't always "get it right" when it comes to ranking and rating, but over the course of an entire competition, a panel of judges typically "gets it right" collectively as the individual caption scores combine into the 100-point total. For championship shows, many competitive circuits will use a double panel, having two judges for each caption in order to minimize the ability of one judge to sway a unit's score dramatically.
Judges Are People Too - Suggestions For Critique
Judges, just like umpires in baseball, are often the target of criticism and ire because their personal opinion can affect the outcome of the event dramatically. The majority of adjudicators employed by the various judging circuits are individuals with years of experience in the caption they have been assigned to judge. Whether they are a music teacher, a professional performer, an instructor, or just a life-long enthusiast for the activity, judges by and large want to see performing units succeed and want to help them improve their performances.
It is common at many competitions to have a post-show meeting with the judges to have direct interaction, commonly referred to as "critique". Here are some suggestions for any band/corps staff going into a critique session:
- If possible, listen to the judges' recordings and take notes on them before you go to the meeting. Draw a line down the middle of a sheet of paper and write positive comments in one column and negative comments in the other. Star or circle any comments that you don't understand or that you want to clarify for the judge if you think they missed something. Bring those points up in critique.
- Have something to say to each judge. If you don't talk, you'll get a nice recap of what the judge said on their recording and not much else. Good judges will elaborate further on what they commented on and may even have suggestions for ways you can address the issues. Engage them and get your questions answered.
- Be sure to identify show design problems with the ensemble and effect judges early. If you need to make changes to the show to improve your overall performance over the course of the season, you want to get those changes identified early. Typically, marching bands in particular have a limited amount of time to drop in changes in the fall and need to spend the majority of their rehearsal time cleaning the show and training young members.
- Be open to the notion that you and your band staff may need to improve and that it is not just the students' lack of ability, skill, or motivation. We all have things to learn that can make us better instructors and show designers.
- If you are not happy with the scores you are receiving, it is much more productive to probe judges for the things that are preventing them from scoring you differently than it is to ask why so-and-so scored higher than you did. The answer you will receive almost every time from a judge for the latter question will be "in today's competition, this is where you ranked." Judges don't want to talk about their numbers in general and dislike having to defend them. That doesn't mean you shouldn't question the numbers, particularly if you think you are not being scored correctly. Remember, it's all about perception, so if you aren't getting rewarded for something you are attempting, you may need to make that aspect of your show more obvious. Be sure to point out to judges that the next time they see you, you'd like them to evaluate that part of your program more closely.
- Judges are people too - if they make a suggestion to you for improvement and then you go and make a point of improving that aspect of your show, they will most likely notice the next time they see you and react positively. That does not mean, however, that you should make "judge pleasing" your entire raison d'être. Your staff should be meeting to discuss judges' feedback and prioritizing which comments to go after. If you are getting similar comments from multiple judges about a certain issue, that should become a high priority. You simply don't have time to get to everything they suggest, and quite frankly, not everything they suggest is worth worrying about.
- Judges work towards being as impartial as they can. If you feel that a particular judge "hates you" or "hates our show", you should either address it individually with the judge or contact the judging coordinator in charge of quality control. The judge may not dislike you at all, but again it's about perception, and they need to know that you feel you are being "black-balled".
Competitive music can be exhilarating, frustrating, character-building, and character-damaging. Having been a band director of both competitive and non-competitive bands, and having judged a wide variety of bands, I can honestly say that competition is not for everyone. Band directors should seriously consider whether competition is a good fit for their program, and whether the community they work in will be supportive of a competition band.
This article (c) 2011 Thomas J. West. All content on ThomasJWestMusic dot com is licensed under a Creative Commons Attribution-No Derivative Works 3.0 License. Please contact the author before publishing on or off-line.