With regard to Jean-Pierre Keller and Donald Byrd's letters in CMJ Vol. 8, #1, concerning the standardization of musical editing functions, several points should be considered.
First, there seem to be two ways in which standards emerge. Some are created "top down" by centralized pre-planning and decision on the part of assumed experts. Others are practices which emerge in "free market" competitive situations, becoming dominant because they fulfill needs and functions better than the available alternatives. There are pros and cons to both of these ways of development, and they should be thought through. Centralized decision may fail to provide for approaches with which it is unfamiliar or out of sympathy, or may misunderstand the needs of large groups of intended users of the standards it creates. On the other hand, it may provide better for some minority needs or views. Popular "vote" may produce periods of co-existing, incompatible standards (Betamax and VHS in video) or other inconveniences, or leave out minority views, but this method of standardizing may result in less complacency, owing to competition, and allow users to select what works best for them. It may maintain a greater variety of options or approaches and allow higher quality or unanticipated features to evolve into general usage.
It is worth noting that Conventional Music Notation (CMN) became standardized by common practice rather than by centralized design. The twentieth century has seen tremendous need to modify this standard practice as new compositional techniques, structures, and musical materials have been explored. CMN has survived in this century as a workable standard practice in part because it provides equally to each of its users the means to escape, expand, or modify the standard. One can freely use deviant symbols or formats as desired and add explanatory notes. Some deviations from common practice are intuitively understandable even without explanation, as CMN grows and evolves much as other natural languages do.
In contrast, pre-programmed software systems for music editing may not be able to provide the user with adequate means for their own modification and extension. From an end-user perspective, the defined limitations of software may seem as fixed and inflexible as those of hardware.
Second, a standard is common to some group. Therefore it is necessary for musical standards to involve input from as large and varied a group as possible, or else to define specifically the group toward whose purposes the standards will be oriented and the nature of their use. While CMA is a good vehicle for the development of standards, initial meetings at places like IRCAM may limit formative input from non-university or otherwise unaffiliated computer music practitioners from the Americas, Japan, Australia, and elsewhere, particularly those who may take original or unique approaches on personal computers or other smaller systems and who are not budgeted for world travel. If computer-based musical tools are to become the dominant musical media of the future (which is likely), standards must be defined with ample input from non-academic, non-institutional, even non-computer-using and non-European-tradition musicians, who are unlikely to be present at the ICMC for the planned meeting. Computer music editing standards may ultimately define the conceptual workspaces of all types of music, as digital media and tools proliferate. Therefore input is important from non-European, tonal, traditional, commercial, instrumental, and other musicians not likely to be found in the relatively avant-garde, computer-accustomed, and European-tradition atmosphere of the ICMC in Paris. It would make more sense to hold such initial meetings wherever the largest and most diverse user base for digital music systems already exists (probably the USA, which also has a larger base of instrument designers and producers). It is also important to solicit actively the participation of music editors involved in film, publishing, arranging, and other areas of music with editing practices and requirements different from those of composition and synthesis.
Third, when looking at the history of various standards in common usage, it is easy to see the ill effects of those which were defined too early in the evolution of their respective technologies. Once a standard is defined, it is difficult to change in light of subsequent improvements. America was the first country to develop high saturation of television use, and we're stuck with poor TV resolution compared with more recently developed standards in other countries. The Prestel graphics protocol, again a first, is vastly more limited than later computer graphics transmission protocols. Once the first version of NAPLPS (North American Presentation Level Protocol Syntax) was adopted and promulgated by ANSI, all subsequent improvements to it were required to maintain backward compatibility with the original standard, severely limiting future improvements. Such standards, once in place in a given country, are generally superseded by better ones only in other countries, rendering the country which standardized too early incompatible with later developments. If a standard is adopted internationally, incompatibility is not a problem, but being stuck with limitations is.
The question to consider is whether computerized editing of music is yet ready for standardization, or whether a standard attempted too early might limit or constrain future development. (Do we yet have editing tools with which Beethoven could have composed his works as easily as some of us do?) Do the "tools" we currently use on our computers reflect adequate study of traditional editing processes, as well as contemporary methods, for us to standardize them as sufficiently general? Does anyone really understand the full nature and interplay of the various creative processes that go on in the mind of anyone writing music with a pencil? I doubt it.
Are we still too close to the beginnings of research in perception, cognitive processing, psychoacoustics, signal processing, computer system design, user interface design, and the psychology of the creative process -- all of our fields of research -- to standardize? How much does the lack of standardization impede such research by preventing us from working together more easily (if that is an intended purpose of standardization)? And to what extent might such standardization impede this research instead of fostering it?
Fourth, editing cannot be divorced from other aspects of music for purposes of standardization. The nature of editing depends on the nature of what is being edited. This brings up the question of standardizing the representation of music in digital systems. Editing is a fundamental process in musical creation, and its standardization is likely to have strong effects on musical conceptualization and realization. Any language has conceptual biases (Sapir-Whorf). So does any technique. What is to be considered entity and what will be continuous dimension? How is time to be modelled? What are the semantic units of music above the level of the "event" or below it?
To standardize editing is to standardize the conceptual space in which music is conceived and evolved by individuals, so it is difficult to imagine standardizing editing independently of the general representation of music's dimensionality, processes, relationships, materials and structures.
A single representation can be edited by various techniques, but can one use the same editing procedures on all forms of musical representation? If editing and representation are interdependent, should they be tackled separately or together? If separately, which should be dealt with first? In text, ASCII (a representation) has been beneficially standardized, but word-processor functionality (editing) has not, nor has such standardization been advocated, owing to the vast diversity of text editing applications. Novelists and poets have different needs from secretaries or typesetters. A single system satisfying all needs would be excessive in cost and complexity for any individual using only a subset of it.
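To make the dependence of editing on representation concrete, here is a hypothetical sketch (the data structures and function names are my own illustration, not a proposal) contrasting two digital representations of pitch: a discrete, event-based one and a continuous, sampled one. Even so simple an edit as transposition must be implemented quite differently in each.

```python
from dataclasses import dataclass, replace

# Hypothetical event-based representation: music as discrete entities.
@dataclass(frozen=True)
class Note:
    onset: float     # seconds
    pitch: int       # semitone steps, MIDI-style (60 = middle C)
    duration: float  # seconds

def transpose_events(notes, semitones):
    # Entity-based edit: change one integer field per discrete note.
    return [replace(n, pitch=n.pitch + semitones) for n in notes]

# Hypothetical continuous representation: pitch as a sampled curve in Hz.
def transpose_curve(hz_samples, semitones):
    # Continuous edit: scale every frequency sample by 2^(k/12).
    factor = 2.0 ** (semitones / 12)
    return [hz * factor for hz in hz_samples]

melody = [Note(0.0, 60, 0.5), Note(0.5, 64, 0.5)]
up_a_tone = transpose_events(melody, 2)     # pitches become 62 and 66
octave_up = transpose_curve([440.0], 12)    # 440 Hz becomes 880.0 Hz
```

A "transpose" command standardized for the event list (add an integer to a field) has no direct meaning for the curve (multiply every sample), and vice versa, which is why editing procedures resist standardization independent of the underlying representation.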
Fifth, standards create communities, in relation to which individuals must either conform or deviate should a standard not prove all-encompassing. In a field in which individuality is as important as it is in music, and in which all human cultures participate, the inertia of any community unified by a shared standard may discourage the pursuit or introduction of differences and limit diversity. Communal values such as exchange of information, sharing, portability, and mobility of work tend to take precedence over the values of continued fresh exploration, individuality, and variety. Communality of practice may give a community's members a vested interest in maintaining what is already established, and give others incentives to compromise in order to participate in that community.
Sixth, a standard fulfills a purpose, such as increasing the ease of communication. MIDI is a communicative protocol, like traditional music notation, and its standardization is beneficial because communication is facilitated by shared definitions (representations). What purpose would the standardization of editing fulfill? Editing is a private, non-communicative activity. (Such a standard might make it easier for people to relocate among systems. It might set a minimum standard of "quality" as decided by some consensus -- assuming some group could formulate a true superset of all possible editing operations useful to all creative individuals working in all musical genres and professions, and assuming that the superset of useful editing procedures is not infinite and ever-changing, as the superset of meaningful music seems to be.)
The purpose of the proposed editing standard needs clarification before its potential benefits and detriments can be weighed.
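The communicative character of MIDI, by contrast, shows what a standard buys when shared definitions are the whole point: a conforming Note On event is exactly three bytes whose meanings are fixed by the specification. The small encoder below follows the published MIDI 1.0 byte layout (status byte 0x90 combined with a channel 0-15, then key and velocity, each 0-127).

```python
def note_on(channel: int, key: int, velocity: int) -> bytes:
    """Encode a MIDI 1.0 Note On message: 0x90 | channel, then key, velocity."""
    if not (0 <= channel < 16 and 0 <= key < 128 and 0 <= velocity < 128):
        raise ValueError("value out of range for MIDI data")
    return bytes([0x90 | channel, key, velocity])

# Middle C (key 60) at moderate velocity on channel 1 (numbered 0 on the wire):
msg = note_on(0, 60, 64)
# Any conforming sender and receiver interpret these same three bytes identically.
```

Because every participant shares these byte-level definitions, communication succeeds; private editing operations have no analogous payoff from such agreement.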
Seventh, there are several "levels" on which editing is done in music, as well as various dimensional axes. The acoustic, semantic, parametric, and architectural levels (or however a standard might define them), and such axes as sequence and simultaneity, within each level and among them, are grouped, interrelated, alternated between, and differentiated in different ways in various editing approaches. A truly general standard would have to avoid bias toward any single fixed conceptualization of the relationships among such common and useful but simplistic constructs. It would have to go beyond such common formulations and contemporary concepts and incorporate a strong focus on the mental processes of the creating mind, for example by basing itself on perceptually meaningful rather than acoustically descriptive data. Again, this suggests seeking a general representation of the musically meaningful prior to the standardization of editing practices. The ideal of generality becomes even more elusive when we consider that what is perceptually meaningful in music is culturally determined to an unknown (and perhaps unknowable) degree.
It definitely seems to be time to consider standardization in digital music, if only because standardization will happen in some aspects or areas with or without the participation of CMJ, the ICMC, ANSI, or UNESCO. The success of MIDI documents the need for some means of intercommunication, as does the amount of data-swapping within such larger single-format communities as AlphaSyntauri users, other system-based groups, or users of DARMS, despite the non-generality of those formats. An increasing amount of acoustic and compositional data exists in digital formats, and overlaps in functionality and concept exist. Each shared format already has one or more available approaches to editing as well.
Some standard will develop, whether by common usage in a free competitive evolution or by centralized design, or multiple standards may co-exist for different purposes. All of the above questions deserve serious thought, including the isolation of aspects of music which could benefit from the implementation of standards. Editing may or may not be among them.
Among the reasons that such caveats spring so readily to my mind upon reading the two letters in CMJ is the fact that I am currently involved (through Syntronics in Toronto) in projects in telecommunications, electronic publishing, compositional and editing languages, and related areas which depend upon standardized (intercommunicable) representations of musical material. This work has made me more aware of questions such as those raised above. The importance to music of careful consideration of when, what, and whether to standardize, as well as how, why, and why not, cannot be overestimated. Not only editing, but storage, retrieval, analysis, synthesis, publication, transmission, distribution, composition, education, freedom of conceptualization and expression, aesthetics, and other aspects of music may be affected by standardization and may be vulnerable to the deficiencies of any such standards.