Saturday, June 30, 2007

Musical Instrument Digital Interface (MIDI)




MIDI (Musical Instrument Digital Interface; IPA: /ˈmɪdi/) is an industry-standard electronic communications protocol that enables electronic musical instruments, computers and other equipment to communicate, control and synchronize with each other in real time.


Note names and MIDI note numbers.

MIDI does not transmit an audio signal or other media. It transmits digital "event messages", such as the pitch and intensity of musical notes to play, control signals for parameters such as volume, vibrato and panning, and cues and clock signals to set the tempo. As an electronic protocol, it is notable for its success, both in its widespread adoption throughout the industry and in remaining essentially unchanged in the face of technological developments since its introduction in 1983.

Contents
1 History
2 Overview
3 MIDI interfaces
4 MIDI message interoperability
5 How MIDI channel messages work
6 How MIDI Show Control works
7 The MIDI 1.0 Protocol
7.1 Hardware Transport (Electrical and Mechanical Connections)
7.2 Message Format
7.2.1 Low bandwidth
8 MIDI file formats
8.1 Standard MIDI File (SMF) Format
8.2 MIDI Karaoke File (.KAR) Format
8.3 XMF File Formats
8.4 RMI File Format
9 MIDI usage and applications
9.1 Extensions of the MIDI standard
9.1.1 General MIDI
9.1.2 General MIDI 2
9.1.3 SP-MIDI
9.1.4 Alternate Hardware Transports
9.1.5 Alternate Tunings
9.2 Other applications of MIDI
9.3 MIDI controllers: hardware, software, datastream
10 Beyond MIDI 1.0
10.1 OSC
10.2 mLAN
10.3 HD-MIDI
11 MIDI software
12 Sound samples
13 See also
14 External links
14.1 Official MIDI Standards Organizations
14.2 Unofficial Sources
14.3 MIDI Search engines
14.4 Other resources



History
By the end of the 1970s, electronic musical devices were becoming increasingly common and affordable. However, devices from different manufacturers were generally not compatible with each other and could not be interconnected. Different interfacing models included:

analog control voltages at various standards (such as 1 volt per octave, or the logarithmic "hertz per volt")
analog clock, trigger and "gate" signals (both positive "V-trig" and negative "S-trig" varieties, ranging from -15 V to +15 V)
proprietary digital interfaces such as Roland Corporation's DCB (digital control bus) and Yamaha's "keycode" system.
In an attempt to find a way forward from this situation, audio engineer and synthesizer designer Dave Smith of Sequential Circuits, Inc. proposed the MIDI standard in 1981 in a paper to the Audio Engineering Society. The proposal received widespread enthusiasm within the industry, and the MIDI Specification 1.0 was published in August 1983. Today, Dave Smith is generally regarded as the "Father of MIDI" and MIDI technology has been standardized and is maintained by the MIDI Manufacturers Association (MMA).


Overview
All official MIDI standards are jointly developed and published by the MIDI Manufacturers Association (MMA) in Los Angeles, California, USA (http://www.midi.org), and for Japan, the MIDI Committee of the Association of Musical Electronic Industry (AMEI) in Tokyo (http://www.amei.or.jp). The primary reference for MIDI is The Complete MIDI 1.0 Detailed Specification, document version 96.1, available only directly from MMA in English, or from AMEI in Japanese.

The MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) is an industry standard ratified by the MIDI Manufacturers Association in 1991 which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media — it simply transmits digital data providing information such as the type, timing and numbering of technical cues called during a multimedia or live theatre performance.

Almost all music recordings today use MIDI devices. In addition, MIDI is also used to control hardware including recording devices and live performance equipment such as stage lights and effects pedals.

MIDI allows computers, synthesizers, MIDI controllers, sound cards, samplers and drum machines to control one another, and to exchange system data.

MIDI was a major factor in bringing an end to the "wall of synthesizers" phenomenon in 1970s-80s rock music concerts, when keyboard instrument performers were sometimes hidden behind banks of various instruments. Following the advent of MIDI, many synthesizers were released in rack-mount versions, enabling performers to control multiple instruments from a single keyboard.

Another important result of MIDI has been the development of hardware and computer-based sequencers, which can be used to record, edit and play back performances. In the years immediately after the 1983 ratification of the MIDI specification, MIDI interfaces were released for both the Apple Macintosh computer and the Windows platform, allowing for the development of a market for powerful, inexpensive, and now-widespread computer-based MIDI sequencers.

Synchronization of MIDI sequences is made possible by the use of MIDI timecode, an implementation of the SMPTE time code standard using MIDI messages, and MIDI timecode has become the standard for digital music synchronization.

A number of music file formats have been based on the MIDI bytestream. These formats are very compact; a file as small as 10 KB can produce a full minute of music. This is advantageous for applications such as mobile phone ringtones, and some video games.

The term "MIDI sound" has often been used as a synonym for "bad-sounding computer music", but this reflects a misunderstanding: MIDI defines only the control protocol, not the sound. The perception probably stems from the poor-quality sound synthesis provided by many early sound cards, which relied on FM synthesis instead of wavetables to produce audio.


MIDI interfaces
All MIDI In and MIDI Out connectors are part of a MIDI interface. A MIDI interface converts internal binary data into MIDI messages sent out the MIDI Out connector for transmission to another device's MIDI In connector. It also converts incoming MIDI messages arriving on the MIDI In connector (from another device's MIDI Out connector) back into internal binary data. All MIDI compatible instruments have a built-in MIDI interface. Some computers' sound cards have a built-in MIDI interface, whereas others require an external MIDI interface, usually connected to the computer via USB or FireWire.


MIDI message interoperability
All MIDI compatible controllers, musical instruments, and MIDI-compatible software follow the same MIDI 1.0 specification, and thus interpret any given MIDI message the same way; as a result, they can communicate with and understand each other. For example, if a note is played on a MIDI controller, it will sound at the right pitch on any MIDI instrument whose MIDI In connector is connected to the controller's MIDI Out connector. On many PC sound cards, the joystick port doubles as a MIDI port.


How MIDI channel messages work
When a musical performance is played on a MIDI instrument (or controller), it transmits MIDI channel messages from its MIDI Out connector. A typical MIDI channel message sequence corresponding to a key being struck and released on a keyboard is:

The user presses the middle C key with a specific velocity (which is usually translated into the volume of the note but can also be used by the synthesiser to set the timbre as well). ---> The instrument sends one Note On message.
The user changes the pressure applied on the key while holding it down - a technique called aftertouch (can be repeated, optional). ---> The instrument sends one or more Aftertouch messages.
The user releases the middle C key, again with the possibility of velocity of release controlling some parameters. ---> The instrument sends one Note Off message.
Note On, Aftertouch, and Note Off are all channel messages. For the Note On and Note Off messages, the MIDI specification defines a number (from 0 to 127) for every possible note pitch (C, C#, D, etc.), and this number is included in the message. For example, Middle C played on any MIDI compatible musical instrument will always produce the same MIDI channel message at its MIDI Out connector.
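The key-press sequence above can be sketched as raw message bytes. The status bytes (0x90 for Note On, 0x80 for Note Off) and the 7-bit note and velocity fields follow the MIDI 1.0 specification; the helper names are ours:

```python
def note_on(channel, note, velocity):
    """Build a Note On channel message: status 0x90 | channel,
    then the 7-bit note number and 7-bit velocity."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(channel, note, velocity=64):
    """Build a Note Off channel message (status 0x80 | channel).
    The velocity here is the release velocity mentioned in the text."""
    return bytes([0x80 | channel, note & 0x7F, velocity & 0x7F])

# Middle C (note number 60) struck on channel 0 with velocity 100:
print(note_on(0, 60, 100).hex())  # 903c64
```

Every conformant receiver decodes these three bytes the same way, which is the interoperability property described earlier.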

Other performance parameters can be transmitted with channel messages, too. For example, if the user turns the pitch wheel on the instrument, that gesture is transmitted over MIDI using a series of Pitch Bend messages (also a channel message). The musical instrument generates the messages autonomously; all the musician has to do is play the notes (or make some other gesture that produces MIDI messages). This consistent, automated abstraction of the musical gesture could be considered the core of the MIDI standard.
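A pitch wheel gesture illustrates how a higher-resolution value travels over MIDI: Pitch Bend carries a 14-bit value split into two 7-bit data bytes, least significant byte first, with 8192 meaning "wheel centred". A minimal sketch:

```python
def pitch_bend(channel, value):
    """Encode a 14-bit pitch-bend value (0..16383, centre 8192) as
    status 0xE0 | channel, followed by the LSB then the MSB (7 bits each)."""
    assert 0 <= value <= 0x3FFF
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

# A centred pitch wheel on channel 0:
print(pitch_bend(0, 8192).hex())  # e00040
```

Moving the wheel produces a rapid series of such messages, one per sampled wheel position.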


How MIDI Show Control works
Main article: MIDI Show Control.

When any cue is called by a user (typically a stage manager) and/or by a preprogrammed timeline in a show control software application, the show controller transmits one or more Real Time System Exclusive messages from its MIDI Out port. A typical MSC message sequence is:

the user just called a cue
the cue is for lighting device 3
the cue is number 45.8
the cue is in cue list 7
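The cue sequence above could be packed into a single System Exclusive message. The framing below (F0 7F, device ID, sub-ID 0x02 for MSC, a command format byte, a command byte, then ASCII cue data) follows the MSC layout as commonly documented; the specific byte values for "Lighting (General)" (0x01) and "GO" (0x01), and the `msc_go` helper itself, should be treated as an illustrative assumption rather than a verified implementation:

```python
def msc_go(device_id, cue, cue_list=None):
    """Sketch of an MSC 'GO' Real Time System Exclusive message:
    F0 7F <device_id> 02 <command_format> <command> <data> F7.
    Assumes command_format 0x01 = Lighting (General) and command 0x01 = GO;
    cue number and cue list travel as ASCII strings separated by 0x00."""
    data = cue.encode("ascii")
    if cue_list is not None:
        data += b"\x00" + cue_list.encode("ascii")
    return bytes([0xF0, 0x7F, device_id, 0x02, 0x01, 0x01]) + data + bytes([0xF7])

# "GO cue 45.8 in cue list 7", addressed to lighting device 3:
print(msc_go(3, "45.8", "7").hex())
```

Note that, exactly as the text says, no lighting levels or media travel in the message: only the cue identification.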

The MIDI 1.0 Protocol
Main article: The MIDI 1.0 Protocol
IMPORTANT: Some of the information in this section diverges from the official MMA/AMEI MIDI specifications in terminology and in technical detail. Developers interested in maximizing interoperability are encouraged to work directly from the official MMA/AMEI specifications.

There are two sides to MIDI 1.0: the hardware transport specification describing the electrical and mechanical connection, and the message format specification.


Hardware Transport (Electrical and Mechanical Connections)

MIDI ports and cable.

The MIDI standard consists of a communications messaging protocol designed for use with musical instruments, as well as a physical interface standard. Physically, it is a one-way (simplex) digital current-loop serial connection signaling at 31,250 bits per second. Each byte is framed with one start bit (which must be 0), eight data bits, no parity bit, and one stop bit (which must be 1).
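The framing above fixes the timing of the wire: ten bits per byte at 31,250 bits per second. A quick back-of-the-envelope calculation, using only figures from the text:

```python
BAUD = 31250        # MIDI serial rate, bits per second
BITS_PER_BYTE = 10  # 1 start bit + 8 data bits + 1 stop bit

byte_time_us = BITS_PER_BYTE * 1_000_000 / BAUD  # microseconds per byte
note_on_time_us = 3 * byte_time_us               # a full 3-byte channel message

print(byte_time_us)      # 320.0 microseconds per byte
print(note_on_time_us)   # 960.0 microseconds per Note On
```

So a dense chord of ten notes already occupies nearly 10 ms of wire time, which is why the later section on low bandwidth matters.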


Message Format
Every MIDI connection is a one-way connection from the MIDI Out connector of the sending device to the MIDI In connector of the receiving device. Each such connection carries a stream of MIDI messages, most of which represent common musical performance events or gestures such as note-on, note-off, controller value changes (including volume, pedal and modulation signals), pitch bend, program change, aftertouch, and channel pressure. All of these messages include a channel number. There are 16 possible channels in the protocol. The channels are used to separate "voices" or "instruments", somewhat like the tracks of a multi-track recorder.
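The message types and the channel number listed above are both packed into the status byte, per the MIDI 1.0 specification: the opcode occupies the high nibble and the channel the low nibble. A small decoder makes this concrete:

```python
# Channel-message opcodes from the MIDI 1.0 specification (high nibble).
OPCODES = {
    0x8: "Note Off", 0x9: "Note On", 0xA: "Polyphonic Aftertouch",
    0xB: "Control Change", 0xC: "Program Change",
    0xD: "Channel Pressure", 0xE: "Pitch Bend",
}

def decode_status(status):
    """Split a channel-message status byte into its opcode name (high
    nibble) and channel number (low nibble, giving the 16 channels)."""
    return OPCODES[status >> 4], status & 0x0F

print(decode_status(0x93))  # ('Note On', 3)
```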


Low bandwidth
MIDI messages are extremely compact, due to the low bandwidth of the connection, and the need for real-time accuracy. Most messages consist of a status byte (channel number in the low 4 bits, and an opcode in the high 4 bits), followed by one or two data bytes. However, the serial nature of MIDI messages means that long strings of MIDI messages take an appreciable time to send, at times even causing audible delays, especially when dealing with dense musical information or when many channels are particularly active.

To further optimize the data stream, the protocol defines "running status", a convention that allows the status byte to be omitted when it would be the same as that of the previous message. This mitigates the bandwidth problem somewhat.
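Running status can be sketched as a simple encoder. The convention itself is from the MIDI 1.0 specification; the helper name and the tuple representation are ours:

```python
def encode_running_status(messages):
    """Serialize (status, data...) tuples into a MIDI byte stream,
    omitting any status byte that repeats the previous one
    (the 'running status' convention)."""
    out, last = bytearray(), None
    for status, *data in messages:
        if status != last:
            out.append(status)
            last = status
        out.extend(data)
    return bytes(out)

# Three Note On messages on channel 0: only the first carries the 0x90 status.
stream = encode_running_status([(0x90, 60, 100), (0x90, 64, 100), (0x90, 67, 100)])
print(stream.hex())   # 903c6440644364
print(len(stream))    # 7 bytes instead of 9
```

At 320 microseconds per byte, those two saved bytes matter when many channels are active at once.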


MIDI file formats

Standard MIDI File (SMF) Format
MIDI messages (along with timing information) can be collected and stored in a computer file system, in what is commonly called a MIDI file, or more formally, a Standard MIDI File (SMF). The SMF specification was developed by, and is maintained by, the MIDI Manufacturers Association (MMA). MIDI files are typically created using computer-based sequencing software (or sometimes a hardware-based MIDI instrument or workstation) that organizes MIDI messages into one or more parallel "tracks" for independent recording and editing. In most but not all sequencers, each track is assigned to a specific MIDI channel and/or a specific General MIDI instrument patch. Although most current MIDI sequencer software uses proprietary "session file" formats rather than SMF, almost all sequencers provide export or "Save As..." support for the SMF format.

An SMF consists of one header chunk and one or more track chunks. There are three SMF formats; the format is encoded in the file header. Format 0 contains a single track and represents a single song performance. Format 1 may contain any number of tracks, enabling preservation of the sequencer track structure, and also represents a single song performance. Format 2 may have any number of tracks, each representing a separate song performance. Sequencers do not commonly support Format 2.
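The header chunk described above has a fixed 14-byte layout per the SMF specification: the ASCII tag "MThd", a 4-byte big-endian length (always 6), then three big-endian 16-bit fields holding the format (0, 1 or 2), the track count, and the timing division. A minimal parser sketch:

```python
import struct

def parse_smf_header(data):
    """Parse the 14-byte SMF header chunk: b'MThd', a 4-byte length (6),
    then three big-endian 16-bit fields: format, track count, division."""
    chunk_id, length = struct.unpack(">4sI", data[:8])
    assert chunk_id == b"MThd" and length == 6
    fmt, ntrks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntrks, division

# A Format 1 file with 2 tracks at 480 ticks per quarter note:
header = b"MThd" + struct.pack(">IHHH", 6, 1, 2, 480)
print(parse_smf_header(header))  # (1, 2, 480)
```

The track chunks that follow ("MTrk") use the same tag-plus-length framing, which is what lets players skip chunks they do not understand.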

Large collections of SMFs can be found on the web, most commonly with the extension .mid. These files are most frequently authored with the assumption that they will be played on General MIDI players.


MIDI Karaoke File (.KAR) Format
MIDI-Karaoke files (which use the ".kar" file extension) are an "unofficial" extension of MIDI files, used to add synchronized lyrics to standard MIDI files. SMF players play the music as they would a .mid file but do not display the lyrics unless they have specific support for .kar messages. Players with such support typically display the lyrics synchronized with the music in "follow-the-bouncing-ball" fashion, essentially turning any PC into a karaoke machine.

MIDI-Karaoke file formats are not maintained by any standardization body.


XMF File Formats
The MMA has also defined (and AMEI has approved) a new family of file formats, XMF (eXtensible Music File), some of which package SMF chunks with instrument data in DLS format (Downloadable Sounds, also an MMA/AMEI specification), to much the same effect as the MOD file format. The XMF container is a binary format (not XML-based, although the file extensions are similar). See the main article Extensible Music Format (XMF).


RMI File Format
On Microsoft Windows, the system itself uses RIFF-based MIDI files with the .rmi extension. Note that Standard MIDI Files are not themselves RIFF-compliant; an RMI file, however, is simply a Standard MIDI File wrapped in a RIFF header. If the RIFF header is thrown away, the result should be a regular Standard MIDI File.
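"Throwing away the RIFF header" can be sketched in a few lines. The layout assumed here, a 'RIFF' container with form type 'RMID' and a single 'data' chunk holding the SMF bytes, is the common one, but treat it as an assumption rather than a normative description:

```python
import struct

def unwrap_rmi(rmi):
    """Strip the RIFF wrapper from an .rmi file, returning the embedded
    Standard MIDI File. Assumes the common layout:
    b'RIFF' <size> b'RMID' b'data' <size> <SMF bytes>, little-endian sizes."""
    assert rmi[:4] == b"RIFF" and rmi[8:12] == b"RMID"
    assert rmi[12:16] == b"data"
    (smf_len,) = struct.unpack("<I", rmi[16:20])
    return rmi[20:20 + smf_len]

# Round-trip a minimal SMF header through a hand-built RIFF wrapper:
smf = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 96)
rmi = (b"RIFF" + struct.pack("<I", 12 + len(smf))
       + b"RMID" + b"data" + struct.pack("<I", len(smf)) + smf)
print(unwrap_rmi(rmi) == smf)  # True
```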

The RMI file format is not maintained by any standardization body.


MIDI usage and applications
Main article: MIDI usage and applications

Extensions of the MIDI standard
Many extensions of the original official MIDI 1.0 spec have been standardized by MMA/AMEI. Only a few of them are described here; for more comprehensive information, see the MMA web site.


General MIDI
The General MIDI (GM) and General MIDI 2 (GM2) standards define a MIDI instrument's response to the receipt of a defined set of MIDI messages. As such, they allow a given, conformant MIDI stream to be played on any conformant instrument. Although dependent on the basic MIDI 1.0 specification, the GM and GM2 specifications are each separate from it. As such, it is not generally safe to assume that any given MIDI message stream or MIDI file is intended to drive GM-compliant or GM2-compliant MIDI instruments. General MIDI 1 was introduced in 1991.


General MIDI 2
Later, companies in Japan's Association of Musical Electronic Industry (AMEI) developed General MIDI Level 2 (GM2), incorporating aspects of the Yamaha XG and Roland GS formats, extending the instrument palette, specifying message responses in more detail, and defining new messages for custom tuning scales and more. The GM2 specs are maintained and published by the MMA and AMEI.

General MIDI 2 was introduced in 1999.


SP-MIDI
Later still, GM2 became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.

GM, GM2, and SP-MIDI are also the basis for selecting player-provided instruments in several of the MMA/AMEI XMF file formats (XMF Type 0, Type 1, and Mobile XMF), which allow extending the instrument palette with custom instruments in the Downloadable Sound (DLS) formats, addressing another major GM shortcoming.


Alternate Hardware Transports
In addition to the original 31.25 kbaud current-loop, 5-pin DIN transport, transmission of MIDI streams over USB, IEEE 1394 (FireWire) and Ethernet is now common. In the long run, the IETF's RTP MIDI specification for transport of MIDI streams over Ethernet and the Internet may completely supersede the original DIN transport, since RTP MIDI can provide the high-bandwidth channel that earlier alternatives to MIDI (such as ZIPI) were intended to bring. See the external links below for further information.


Alternate Tunings
By convention, instruments that receive MIDI use the conventional 12-pitch-per-octave equal temperament tuning system. Unfortunately, this tuning system makes many types of music inaccessible, because such music depends on a different intonation system. To address this issue in a standardized manner, the MMA ratified the MIDI Tuning Standard (MTS) in 1992. This standard allows MIDI instruments that support MTS to be tuned in any way desired, through the use of a MIDI Non-Real Time System Exclusive message.

MTS uses three bytes, which can be thought of as a three-digit number in base 128, to specify a pitch in logarithmic form. For a given frequency f in hertz, the encoded pitch value p is

    p = 69 + 12 × log2(f / 440)

where the integer part of p becomes the first byte and the fractional part, in units of 2^-14 of a semitone, is split across the remaining two bytes.

For a note in A440 equal temperament, this formula delivers the standard MIDI note number. Any other frequencies fill the space evenly.
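A sketch of the frequency-to-bytes conversion, assuming the pitch value p = 69 + 12 · log2(f/440) with the integer part in the first byte and the fraction, in units of 2^-14 semitone, packed into the two remaining 7-bit bytes (the `mts_bytes` helper name is ours, and edge cases such as rounding up to the next semitone are ignored):

```python
import math

def mts_bytes(freq_hz):
    """Encode a frequency in Hz as three MTS-style bytes: the nearest
    equal-tempered MIDI note number below the pitch, then the fraction
    of a semitone above it in units of 2**-14 semitone (two 7-bit bytes)."""
    p = 69 + 12 * math.log2(freq_hz / 440.0)
    semitone = int(p)
    frac = round((p - semitone) * 2**14)
    return semitone, (frac >> 7) & 0x7F, frac & 0x7F

print(mts_bytes(440.0))  # (69, 0, 0): concert A is MIDI note 69 exactly
```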

While support for MTS is not particularly widespread in commercial hardware instruments, it is nonetheless supported by some instruments and software, for example the free software programs TiMidity and Scala (program), as well as other microtuners.


Other applications of MIDI
MIDI is also used every day as a control protocol in applications other than music, including:

show control
theatre lighting
special effects
sound design
recording system synchronization
audio processor control
computer networking, as demonstrated by the early first-person shooter game MIDI Maze (1987)
animatronic figure control
Such non-musical applications of MIDI are possible because any device built with a standard MIDI Out connector should in theory be able to control any other device with a MIDI In port, just as long as the developers of both devices have the same understanding about the semantic meaning of all the MIDI messages the sending device emits. This agreement can come either because both follow the published MIDI specifications, or else in the case of any non-standard functionality, because the message meanings are agreed upon by the two manufacturers.


MIDI controllers: hardware, software, datastream
The term MIDI controller is used in two different ways.

In one sense, a MIDI controller is a hardware or software entity able to transmit MIDI messages via a MIDI Out connector to other devices with MIDI In connectors.
In the other (more technical) sense, a MIDI controller is any parameter in a device with a MIDI In connector that can be set with the MIDI Control Change message. For example, a synthesizer may use controller number 18 for a low-pass filter's frequency; to open and close that filter with a physical slider, a user would assign the slider to transmit controller number 18. Then, all changes in the slider position will be transmitted as MIDI Control Change messages with the controller number field set to 18; when the synthesizer receives the messages, the filter frequency will change accordingly.
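The slider scenario above maps onto a three-byte Control Change message: status 0xB0 | channel, then the controller number and value (7 bits each), per the MIDI 1.0 specification. Recall that controller 18's meaning (here, filter frequency) is the synthesizer's own assignment, not part of the standard:

```python
def control_change(channel, controller, value):
    """Build a Control Change message: status 0xB0 | channel,
    followed by the 7-bit controller number and 7-bit value."""
    return bytes([0xB0 | channel, controller & 0x7F, value & 0x7F])

# The slider pushed fully up: controller 18 at maximum on channel 0.
print(control_change(0, 18, 127).hex())  # b0127f
```

Dragging the slider produces a stream of such messages with the value byte sweeping from 0 to 127.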

Beyond MIDI 1.0
Although traditional MIDI connections work well for most purposes, a number of newer message protocols and hardware transports have been proposed over the years to try to take the idea to the next level. Some of the more notable efforts include:


OSC
The Open Sound Control (OSC) protocol was developed at CNMAT, the Center for New Music and Audio Technologies at UC Berkeley. OSC has been implemented in the well-known software synthesizer Reaktor and in other projects including SuperCollider, Pure Data, Isadora, Max/MSP, Csound, VVVV and ChucK. The Lemur Input Device, a customizable touch panel with MIDI controller-type functions, also uses OSC. OSC differs from MIDI over the traditional 5-pin DIN connection in that it can run at broadband speeds when sent over Ethernet. Unfortunately, few mainstream musical applications and no standalone instruments support the protocol so far, making whole-studio interoperability problematic. OSC is not owned by any private company, but neither is it maintained by any standards organization.
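To contrast with MIDI's 7-bit fields, an OSC message carries a human-readable address and typed arguments: a NUL-padded address string, a NUL-padded type-tag string, then big-endian binary arguments, each section aligned to 4 bytes. A minimal sketch for one float argument (the address "/filter/cutoff" is a made-up example; OSC addresses are defined by each application):

```python
import struct

def osc_pad(b):
    """NUL-terminate and pad a byte string to a multiple of 4, per OSC."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Minimal OSC message with one float32 argument: padded address
    string, padded type-tag string ',f', then the big-endian float."""
    return osc_pad(address.encode("ascii")) + osc_pad(b",f") + struct.pack(">f", value)

pkt = osc_message("/filter/cutoff", 0.5)
print(len(pkt))  # 24: every section is 4-byte aligned
```

Such packets are typically sent over UDP or TCP, which is where OSC's bandwidth advantage over 31.25 kbaud DIN MIDI comes from.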


mLAN
Yamaha has its mLAN[1] protocol, which is based on the IEEE 1394 (FireWire) transport and carries multiple MIDI message channels along with multiple audio channels. mLAN is a proprietary protocol and is not maintained by a standards organization, but it is open for licensing.
