Some user agents have music devices, such as synthesizers, keyboard and other controllers, and drum machines connected to their host computer or device. The widely adopted Musical Instrument Digital Interface (MIDI) protocol enables electronic musical instruments, controllers and computers to communicate and synchronize with each other. MIDI does not transmit audio signals: instead, it sends event messages about musical notes, controller signals for parameters such as volume, vibrato and panning, cues and clock signals to set the tempo, and system-specific MIDI communications (e.g. to remotely store synthesizer-specific patch data). This same protocol has become a standard for non-musical uses, such as show control, lighting and special effects control.

This specification defines an API supporting the MIDI protocol, enabling web applications to enumerate and select MIDI input and output devices on the client system and send and receive MIDI messages. It is intended to enable non-music MIDI applications as well as music ones, by providing low-level access to the MIDI devices available on the users' systems. The Web MIDI API is not intended to describe music or controller inputs semantically; it is designed to expose the mechanics of MIDI input and output interfaces, and the practical aspects of sending and receiving MIDI messages, without identifying what those actions might mean semantically (e.g., in terms of "modulate the vibrato by 20Hz" or "play a G#7 chord", other than in terms of changing a controller value or sending a set of note-on messages that happen to represent a G#7 chord).

To some users, "MIDI" has become synonymous with Standard MIDI Files and General MIDI. That is not the intent of this API; the use case of simply playing back a .SMF file is not within the purview of this specification (it could be considered a different format to be supported by the HTML5 <audio> element, for example). The Web MIDI API is intended to enable direct access to devices that respond to MIDI - controllers, external synthesizers or lighting systems, for example. The Web MIDI API is also explicitly designed to enable a new class of applications on the web that can respond to MIDI controller inputs - using external hardware controllers with physical buttons, knobs and sliders (as well as musical controllers like keyboard, guitar or wind instrument controllers) to control web applications.

The Web MIDI API is also expected to be used in conjunction with other APIs and elements of the web platform, notably the Web Audio API. This API is also intended to be familiar to users of MIDI APIs on other systems, such as Apple's CoreMIDI and Microsoft's Windows MIDI API.

Introduction

The Web MIDI API specification defines a means for web developers to enumerate, manipulate and access MIDI devices - for example, interfaces that may provide hardware MIDI ports with other devices plugged in to them, and USB devices that support the USB-MIDI specification. Having a Web API for MIDI enables web applications that use existing software and hardware synthesizers, hardware music controllers, light systems, and other mechanical apparatus controlled by MIDI. This API has been defined with this wide variety of use cases in mind.

The approaches taken by this API are similar to those taken in Apple's CoreMIDI API and Microsoft's Windows MIDI API; that is, the API is designed to represent the low-level software protocol of MIDI, in order to enable developers to build powerful MIDI software on top. The API enables the developer to enumerate input and output interfaces, and send and receive MIDI messages, but (similar to the aforementioned APIs) it does not attempt to semantically define or interpret MIDI messages beyond what is necessary to robustly support current devices.

The Web MIDI API is not intended to directly implement high-level concepts such as sequencing; it does not directly support Standard MIDI Files, for example, although a Standard MIDI File player can be built on top of the Web MIDI API. It is also not intended to semantically capture patches or controller assignments, as General MIDI does; such interpretation is outside the scope of the Web MIDI API (though again, General MIDI can easily be utilized through the Web MIDI API).

This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.

Implementations that use ECMAScript to implement the APIs defined in this specification MUST implement them in a manner consistent with the ECMAScript Bindings defined in the Web IDL specification [[!WEBIDL]], as this specification uses that specification and terminology.

Terminology

The concepts of queueing a task and event handlers are defined in [[!HTML]].

The Web Audio API and its associated interfaces and concepts are defined in [[!webaudio]].

The Event interface and how to fire an event are defined in [[!DOM]].

The DOMHighResTimeStamp type is defined in [[!HIGHRES-TIME]].

The terms MIDI, MIDI device, MIDI input port, MIDI output port, MIDI interface, MIDI message, MIDI System Real-Time message and system exclusive are defined in [[!MIDI]].

Promise objects are defined in [[!ECMASCRIPT]].

The DOMException interface is defined in [[!WEBIDL]].

Obtaining Access to MIDI Devices

Feature Policy Integration

requestMIDIAccess is a policy-controlled feature, as defined by Feature Policy.

The feature name for requestMIDIAccess is midi.

The default allowlist for requestMIDIAccess is ["self"].
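
For example, a cross-origin iframe would typically need to be granted the feature explicitly (e.g. with an allow="midi" attribute on the iframe element) before requestMIDIAccess can be used within it, since the default allowlist only permits same-origin use.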

Extensions to the Navigator interface

            partial interface Navigator {
              [SecureContext] Promise<MIDIAccess> requestMIDIAccess(optional MIDIOptions options);
            };
          
requestMIDIAccess

When invoked, returns a Promise object representing a request for access to MIDI devices on the user's system.

Requesting MIDI access SHOULD prompt the user for access to MIDI devices, particularly if system exclusive access is requested. In some scenarios, this permission may have already been implicitly or explicitly granted, in which case this prompt may not appear. If the user gives express permission or the call is otherwise approved, the vended Promise is resolved. The underlying system may choose to allow the user to select specific MIDI interfaces to expose to this API (i.e. pick and choose interfaces on an individual basis), although this is not required. The system may also choose to prompt (or not) based on whether system exclusive support is requested, as system exclusive access has greater privacy and security implications.

If the user declines or the call is denied for any other reason, the Promise is rejected with a DOMException parameter.

When the requestMIDIAccess method is called, the user agent MUST run the algorithm to request MIDI Access:

  1. Let promise be a new Promise object and resolver be its associated resolver.

  2. Return promise and run the following steps asynchronously.

  3. Let document be the calling context's Document.

  4. If document is not allowed to use the policy-controlled feature named midi, jump to the step labeled failure below.

  5. Optionally, e.g. based on a previously-established user preference, for security reasons, or due to platform limitations, jump to the step labeled failure below.

  6. Optionally, e.g. based on a previously-established user preference, jump to the step labeled success below.

  7. Prompt the user in a user-agent-specific manner for permission to provide the entry script's origin with a MIDIAccess object representing control over user's MIDI devices. This prompt may be contingent upon whether system exclusive support was requested, and may allow the user to enable or disable that access.

    If permission is denied, jump to the step labeled failure below. If the user never responds, this algorithm will never progress beyond this step. If permission is granted, continue the following steps.

  8. success: Let access be a new MIDIAccess object. (It is possible to call requestMIDIAccess() multiple times; this may prompt the user multiple times, so it may not be best practice, and the same instance of MIDIAccess will not be returned each time.)

  9. Call resolver's accept(value) method with access as value argument.

  10. Terminate these steps.

  11. failure: Let error be a new DOMException. This exception's .name should be "SecurityError" if the user or their security settings denied the application permission to create a MIDIAccess instance with the requested options, or if the error is the result of document not being allowed to use the feature; "AbortError" if the page is going to be closed due to a user navigation; "InvalidStateError" if the underlying system raises any errors; or otherwise "NotSupportedError".

  12. Call resolver's reject(value) method with error as value argument.

MIDIOptions Dictionary

This dictionary contains optional settings that may be provided to the requestMIDIAccess request.

          dictionary MIDIOptions {
            boolean sysex;
            boolean software;
          };
        
sysex

This member informs the system whether the ability to send and receive system exclusive messages is requested or allowed on a given MIDIAccess object. On the option passed to requestMIDIAccess, if this member is set to true but system exclusive support is denied (either by policy or by user action), the access request will fail with a "SecurityError" error. If this support is not requested and granted, the system will throw an exception if the application attempts to send system exclusive messages, and will silently mask out any system exclusive messages received on the port.

software

This member informs the system whether the ability to utilize any software synthesizers installed in the host system is requested or allowed on a given MIDIAccess object. On the option passed to requestMIDIAccess, if this member is set to true, but software synthesizer support is denied (either by policy or by user action), the access request will fail with a "SecurityError" error. If this support is not requested, the system should not include any software synthesizers in the MIDIAccess exposure of available ports.

Note that this may result in a two-step request procedure if software synthesizer support is desired but not required - software synthesizers may be disabled when MIDI hardware device access is allowed.
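
For example, a page that would prefer software synthesizer access but can function without it might fall back to a plain request if the first one is rejected; a minimal sketch (onMIDISuccess and onMIDIFailure are illustrative handlers, as in the examples later in this document):

    // Request access including software synthesizers; if that is rejected,
    // retry without the software option.
    navigator.requestMIDIAccess( { software: true } ).then(
      onMIDISuccess,
      function() {
        // Software synthesizer access was denied; fall back to hardware-only access.
        navigator.requestMIDIAccess().then( onMIDISuccess, onMIDIFailure );
      } );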

MIDIInputMap Interface

          interface MIDIInputMap {
            readonly maplike<DOMString, MIDIInput>;
          };
        

The MIDIInputMap is a maplike interface whose values are MIDIInput instances and whose keys are their IDs.

This type is used to represent all the currently available MIDI input ports. This enables:

    // to tell how many entries there are:
    var numberOfMIDIInputs = inputs.size;

    // add each of the ports to a <select> box
    inputs.forEach( function( port, key ) {
      var opt = document.createElement("option");
      opt.text = port.name;
      document.getElementById("inputportselector").add(opt);
    });

    // or you could express in ECMAScript 6 as:
    for (let input of inputs.values()) {
      var opt = document.createElement("option");
      opt.text = input.name;
      document.getElementById("inputportselector").add(opt);
    }

MIDIOutputMap Interface

          interface MIDIOutputMap {
            readonly maplike<DOMString, MIDIOutput>;
          };
        

The MIDIOutputMap is a maplike interface whose values are MIDIOutput instances and whose keys are their IDs.

This type is used to represent all the currently available MIDI output ports. This enables:

    // to tell how many entries there are:
    var numberOfMIDIOutputs = outputs.size;

    // add each of the ports to a <select> box
    outputs.forEach( function( port, key ) {
      var opt = document.createElement("option");
      opt.text = port.name;
      document.getElementById("outputportselector").add(opt);
    });

    // or you could express in ECMAScript 6 as:
    for (let output of outputs.values()) {
      var opt = document.createElement("option");
      opt.text = output.name;
      document.getElementById("outputportselector").add(opt);
    }

MIDIAccess Interface

This interface provides the methods to list MIDI input and output devices, and obtain access to an individual device.

        [SecureContext] interface MIDIAccess: EventTarget {
          readonly attribute MIDIInputMap inputs;
          readonly attribute MIDIOutputMap outputs;
          attribute EventHandler onstatechange;
          readonly attribute boolean sysexEnabled;
        };
      
inputs
The MIDI input ports available to the system.
outputs
The MIDI output ports available to the system.
onstatechange

The handler called when a new port is connected or an existing port changes the state attribute.

This event handler, of type statechange, MUST be supported by all objects implementing the MIDIAccess interface.

It is important to understand that leaving an EventHandler attached to this object will prevent it from being garbage-collected; when finished using the MIDIAccess, you should remove any onstatechange listeners.

Whenever a previously unavailable MIDI port becomes available for use, or an existing port changes the state attribute, the user agent SHOULD run the following steps:

  1. Let port be the MIDIPort corresponding to the newly-available, or the existing port.
  2. Let event be a newly constructed MIDIConnectionEvent, with the port attribute set to the port.
  3. Fire an event named "statechange" at the MIDIAccess, using the event as the event object.
sysexEnabled
This attribute informs the user whether system exclusive support is enabled on this MIDIAccess.
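
As a usage illustration of the attributes above, the following minimal sketch logs whether system exclusive access was granted and reports subsequent port changes (midiAccess is assumed to be a MIDIAccess obtained from requestMIDIAccess()):

    // Log whether sysex was granted, and log every subsequent port state change.
    console.log( "Sysex enabled: " + midiAccess.sysexEnabled );
    midiAccess.onstatechange = function( event ) {
      var port = event.port;
      console.log( "Port '" + port.name + "' (" + port.type + ") is now " +
                   port.state + " / " + port.connection );
    };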

MIDIPort Interface

This interface represents a MIDI input or output port.

        [SecureContext] interface MIDIPort: EventTarget {
          readonly attribute DOMString id;
          readonly attribute DOMString? manufacturer;
          readonly attribute DOMString? name;
          readonly attribute MIDIPortType type;
          readonly attribute DOMString? version;
          readonly attribute MIDIPortDeviceState state;
          readonly attribute MIDIPortConnectionState connection;
          attribute EventHandler onstatechange;
          Promise<MIDIPort> open();
          Promise<MIDIPort> close();
        };
      
id

A unique ID of the port. This can be used by developers to remember ports the user has chosen for their application. The User Agent MUST ensure that the id is unique to only that port. The User Agent SHOULD ensure that the id is maintained across instances of the application - e.g., when the system is rebooted - and when a device is removed from the system. Applications may want to cache these ids locally to re-create a MIDI setup. Some systems may not support completely unique persistent identifiers; in such cases, it will be more challenging to maintain identifiers when another interface is added or removed from the system. (This might throw off the index of the requested port.) It is expected that the system will do the best it can to match a port across instances of the MIDI API: for example, an implementation may opaquely use some form of hash of the port interface manufacturer, name and index as the id, so that a reference to that port id is likely to match the port when plugged in. Applications may use the comparison of id of MIDIPorts to test for equality.
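
For example, an application might cache the id of a user-selected output port so the same port can be located in a later session; a minimal sketch (the localStorage key name is illustrative):

    // Remember a chosen port by id, and look it up again later by the same id.
    function rememberOutput( port ) {
      localStorage.setItem( "preferredMIDIOutputId", port.id );  // illustrative key
    }

    function findRememberedOutput( midiAccess ) {
      var savedId = localStorage.getItem( "preferredMIDIOutputId" );
      // Returns undefined if the remembered port is not currently available.
      return savedId ? midiAccess.outputs.get( savedId ) : undefined;
    }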

manufacturer

The manufacturer of the port.

name

The system name of the port.

type

A descriptor property to distinguish whether the port is an input or an output port. For MIDIOutput, this MUST be "output". For MIDIInput, this MUST be "input".

version

The version of the port.

state
The state of the device.
connection
The state of the connection to the device.
onstatechange

The handler called when an existing port changes its state or connection attributes.

This event handler, of type statechange, MUST be supported by all objects implementing MIDIPort interface.

It is important to understand that leaving an EventHandler attached to this object will prevent it from being garbage-collected; when finished using the MIDIPort, you should remove any onstatechange listeners.

open

Makes the MIDI device corresponding to the MIDIPort explicitly available. Note that this call is NOT required in order to use the MIDIPort - calling send() on a MIDIOutput or attaching a MIDIMessageEvent handler on a MIDIInput will cause an implicit open(). The underlying implementation may not need to do anything in response to this call. However, some underlying implementations may not be able to support shared access to MIDI devices, so using explicit open() and close() calls will enable MIDI applications to predictably control this exclusive access to devices.

When invoked, this method returns a Promise object representing a request for access to the given MIDI port on the user's system.

If the port device has a state of "connected", when access to the port has been obtained (and the port is ready for input or output), the vended Promise is resolved.

If access to a connected port is not available (for example, the port is already in use in an exclusive-access-only platform), the vended Promise is rejected.

If open() is called on a port that is "disconnected", the port's .connection will transition to "pending", until the port becomes "connected" or all references to it are dropped.

When this method is called, the user agent MUST run the algorithm to open a MIDIPort:

  1. Let promise be a new Promise object and resolver be its associated resolver.

  2. Return promise and run the following steps asynchronously.

  3. Let port be the given MIDIPort object.

  4. If the device's connection is already "open" (e.g. open() has already been called on this MIDIPort, or the port has been implicitly opened), jump to the step labeled success below.

  5. If the device's connection is "pending" (i.e. the connection had been opened and the device was subsequently disconnected), jump to the step labeled success below.

  6. If the device's state is "disconnected", change the connection attribute of the MIDIPort to "pending", and enqueue a new MIDIConnectionEvent to the statechange handler of the MIDIAccess and to the statechange handler of the MIDIPort and jump to the step labeled success below.

  7. Attempt to obtain access to the given MIDI device in the system. If the device is unavailable (e.g. is already in use by another process and cannot be opened, or is disconnected), jump to the step labeled failure below. If the device is available and access is obtained, continue the following steps.

  8. Change the connection attribute of the MIDIPort to "open", and enqueue a new MIDIConnectionEvent to the statechange handler of the MIDIAccess and to the statechange handler of the MIDIPort.

  9. If this port is an output port and has any pending data that is waiting to be sent, asynchronously begin sending that data.

  10. success: Call resolver's accept(value) method with port as value argument.

  11. Terminate these steps.

  12. failure: Let error be a new DOMException. This exception's .name should be "InvalidAccessError" if the port is unavailable.

  13. Call resolver's reject(value) method with error as value argument.
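
As an illustration, the following minimal sketch explicitly opens an output port and handles failure (midiAccess and portID are assumed to come from the surrounding application):

    // Explicitly open an output port; on exclusive-access systems this reserves it.
    midiAccess.outputs.get( portID ).open().then(
      function( port )  { console.log( "Port '" + port.name + "' is open." ); },
      function( error ) { console.log( "Could not open port: " + error.name ); } );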

close

Makes the MIDI device corresponding to the MIDIPort explicitly unavailable (subsequently changing the connection from "open" to "closed"). Note that successful invocation of this method will result in MIDI messages no longer being delivered to MIDIMessageEvent handlers on a MIDIInput (although setting a new handler will cause an implicit open()).

The underlying implementation may not need to do anything in response to this call. However, some underlying implementations may not be able to support shared access to MIDI devices, and the explicit close() call enables MIDI applications to ensure other applications can gain access to devices.

When invoked, this method returns a Promise object representing a request for access to the given MIDI port on the user's system. When the port has been closed (and therefore, in exclusive access systems, the port is available to other applications), the vended Promise is resolved. If the port is disconnected, the Promise is rejected.

When the close() method is called, the user agent MUST run the algorithm to close a MIDIPort:

  1. Let promise be a new Promise object and resolver be its associated resolver.

  2. Return promise and run the following steps asynchronously.

  3. Let port be the given MIDIPort object.

  4. If the port is already closed (its .connection is "closed" - e.g. the port has not yet been implicitly or explicitly opened, or close() has already been called on this MIDIPort), jump to the step labeled closed below.

  5. If the port is an input port, skip to the next step. If the output port's .state is not "connected", clear all pending send data and skip to the next step. Otherwise, clear any pending send data in the system with timestamps in the future, then finish sending any messages with no timestamp or with a timestamp in the past or present, before proceeding to the next step.

  6. Close access to the port in the underlying system if open, and release any blocking resources in the underlying system.

  7. Change the connection attribute of the MIDIPort to "closed", and enqueue a new MIDIConnectionEvent to the statechange handler of the MIDIAccess and to the statechange handler of the MIDIPort.

  8. closed: Call resolver's accept(value) method with port as value argument.

  9. Terminate these steps.
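
Similarly, a minimal sketch of explicitly releasing a port the application has finished with (port is assumed to be an open MIDIPort):

    // Release the port so other applications can use it on exclusive-access systems.
    port.close().then( function( closedPort ) {
      console.log( "Port '" + closedPort.name + "' is now " + closedPort.connection );  // "closed"
    } );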

Whenever the MIDI port corresponding to the MIDIPort changes the state attribute, the user agent SHOULD run the following steps:

  1. Let port be the MIDIPort.

  2. Let event be a newly constructed MIDIConnectionEvent, with the port attribute set to the port.

  3. Fire an event named statechange at the MIDIPort, and statechange at the MIDIAccess, using the event as the event object.

MIDIInput Interface

          [SecureContext] interface MIDIInput: MIDIPort {
            attribute EventHandler onmidimessage;
          };
        
onmidimessage

This event handler, of type "midimessage", MUST be supported by all objects implementing MIDIInput interface.

If the handler is set and the connection attribute is not "open", the underlying implementation attempts to make the port available and changes the connection attribute to "open". If this succeeds, a MIDIConnectionEvent is delivered to the corresponding MIDIPort and MIDIAccess.

Whenever the MIDI port corresponding to the MIDIInput finishes receiving one or more MIDI messages, the user agent MUST run the following steps:

  1. Let port be the MIDIInput.

  2. If the MIDIAccess did not enable system exclusive access, and the message is a system exclusive message, abort this process.

  3. Let event be a newly constructed MIDIMessageEvent, with the timeStamp attribute set to the time the message was received by the system, and with the data attribute set to a Uint8Array of MIDI data bytes representing a single MIDI message.

  4. Fire an event named "midimessage" at the port, using the event as the event object.

It is specifically noted that MIDI System Real-Time Messages may actually occur in the middle of other messages in the input stream; in this case, the System Real-Time messages will be dispatched as they occur, while the normal messages will be buffered until they are complete (and then dispatched).
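
For example, an input handler that is only interested in channel messages might ignore interleaved System Real-Time messages by checking the status byte; a minimal sketch:

    function onMIDIMessage( event ) {
      var status = event.data[0];
      // System Real-Time messages (0xF8-0xFF) are dispatched as single-byte events.
      if (status >= 0xf8)
        return;  // ignore timing clock, start/continue/stop, active sensing, reset
      // ... handle channel and other system messages here ...
    }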

MIDIOutput Interface

          [SecureContext] interface MIDIOutput : MIDIPort {
            void send(sequence<octet> data, optional DOMHighResTimeStamp timestamp = 0);
            void clear();
          };
        
send

Enqueues the message to be sent to the corresponding MIDI port. The underlying implementation will (if necessary) coerce each member of the sequence to an unsigned 8-bit integer. The use of sequence rather than a Uint8Array enables developers to use the convenience of output.send( [ 0x90, 0x45, 0x7f ] ); rather than having to create a Uint8Array, e.g. output.send( new Uint8Array( [ 0x90, 0x45, 0x7f ] ) ); - while still enabling use of Uint8Arrays for efficiency in large MIDI data scenarios (e.g. reading Standard MIDI Files and sending sysex messages).

The data contains one or more valid, complete MIDI messages. Running status is not allowed in the data, as underlying systems may not support it.

If data is not a valid sequence or does not contain a valid MIDI message, throw a TypeError exception.

If data is a system exclusive message, and the MIDIAccess did not enable system exclusive access, throw an InvalidAccessError exception.

If the port is "disconnected", throw an InvalidStateError exception.

If the port is "connected" but the connection is "closed", asynchronously try to open the port.

sequence<octet> data
The data to be enqueued, with each sequence entry representing a single byte of data.
optional DOMHighResTimeStamp timestamp
The time at which to begin sending the data to the port (as a DOMHighResTimeStamp - a number of milliseconds measured relative to the navigation start of the document). If timestamp is set to zero (or another time in the past), the data is to be sent as soon as possible.
clear

Clears any pending send data that has not yet been sent from the MIDIOutput's queue. The implementation will need to ensure the MIDI stream is left in a good state, so if the output port is in the middle of a sysex message, a sysex termination byte (0xf7) should be sent.
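
For example, a message scheduled for the future can be cancelled before it is sent; a minimal sketch (output is assumed to be an open MIDIOutput):

    // Queue a note-on one second from now, then discard everything still pending.
    output.send( [0x90, 60, 0x7f], window.performance.now() + 1000 );
    output.clear();  // the queued note-on is dropped and will never be sent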

MIDIPortType Enum

          enum MIDIPortType {
            "input",
            "output",
          };
        
input
If a MIDIPort is an input port, the type member MUST be this value.
output
If a MIDIPort is an output port, the type member MUST be this value.

MIDIPortDeviceState Enum

        enum MIDIPortDeviceState {
          "disconnected",
          "connected",
        };
      
disconnected
The device that MIDIPort represents is disconnected from the system. When a device is disconnected from the system, it should not appear in the relevant map of input and output ports.
connected
The device that MIDIPort represents is connected, and should appear in the map of input and output ports.

MIDIPortConnectionState Enum

        enum MIDIPortConnectionState {
          "open",
          "closed",
          "pending",
        };
      
open
The device that MIDIPort represents has been opened (either implicitly or explicitly) and is available for use.
closed
The device that MIDIPort represents has not been opened, or has been explicitly closed. Until a MIDIPort has been opened either explicitly (through open()) or implicitly (by adding a midimessage event handler on an input port, or by calling send() on an output port), this should be the default state of the device.
pending
The device that MIDIPort represents has been opened (either implicitly or explicitly), but the device has subsequently been disconnected and is unavailable for use. If the device is reconnected, prior to sending a statechange event, the system should attempt to reopen the device (following the algorithm to open a MIDIPort); this will result in either the connection state transitioning to "open" or to "closed".

MIDIMessageEvent Interface

An event object implementing this interface is passed to a MIDIInput's onmidimessage handler when MIDI messages are received. Note that the DOM Event timeStamp attribute is defined as a DOMHighResTimeStamp, and represents the high-resolution time at which the message was received.

        [SecureContext, Constructor(DOMString type, optional MIDIMessageEventInit eventInitDict)]
        interface MIDIMessageEvent: Event {
          readonly attribute Uint8Array data;
        };
      
data

A Uint8Array containing the MIDI data bytes of a single MIDI message.

MIDIMessageEventInit Dictionary

          dictionary MIDIMessageEventInit: EventInit {
            Uint8Array data;
          };
        
data

A Uint8Array containing the MIDI data bytes of a single MIDI message.

MIDIConnectionEvent Interface

An event object implementing this interface is passed to a MIDIAccess' onstatechange handler when a new port becomes available (for example, when a MIDI device is first plugged in to the computer), or when a previously-available port becomes unavailable or available again (for example, when a MIDI interface is disconnected and then reconnected), and (if present) is also passed to the onstatechange handlers of any MIDIPorts referencing the port.

When a MIDIPort is in the "pending" state and the device is reconnected to the host system, prior to firing a statechange event the algorithm to open a MIDIPort is run on it to attempt to reopen the port. If this transition fails (e.g. the Port is reserved by something else in the underlying system, and therefore unavailable for use), the connection state moves to "closed", else it transitions back to "open". This is done prior to the statechange event for the device state change so that the event will reflect the final connection state as well as the device state.

Some underlying systems may not provide notification events for device connection status; such systems may have long time delays as they poll for new devices infrequently. As such, it is suggested that developers not rely heavily on connection events.
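
For example, rather than relying solely on connection events, an application might simply re-enumerate the port maps whenever a statechange event fires; a minimal sketch (rebuildPortList is an illustrative application function):

    function rebuildPortList( midiAccess ) {
      midiAccess.inputs.forEach( function( port ) { /* add port to the UI */ } );
      midiAccess.outputs.forEach( function( port ) { /* add port to the UI */ } );
    }

    rebuildPortList( midiAccess );                     // initial enumeration
    midiAccess.onstatechange = function() {
      rebuildPortList( midiAccess );                   // refresh on any change
    };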

        [SecureContext, Constructor(DOMString type, optional MIDIConnectionEventInit eventInitDict)]
        interface MIDIConnectionEvent: Event {
          readonly attribute MIDIPort port;
        };
      
port

The port that has been connected or disconnected.

MIDIConnectionEventInit Dictionary

          dictionary MIDIConnectionEventInit: EventInit {
            MIDIPort port;
          };
        
port

The port that has been connected or disconnected.

Examples of Web MIDI API Usage in JavaScript

The following are some examples of common MIDI usage in JavaScript.

Getting Access to the MIDI System

This example shows how to request access to the MIDI system.

var midi = null;  // global MIDIAccess object

function onMIDISuccess( midiAccess ) {
  console.log( "MIDI ready!" );
  midi = midiAccess;  // store in the global (in real usage, would probably keep in an object instance)
}

function onMIDIFailure(msg) {
  console.log( "Failed to get MIDI access - " + msg );
}

navigator.requestMIDIAccess().then( onMIDISuccess, onMIDIFailure );

Requesting Access to the MIDI System with System Exclusive Support

This example shows how to request access to the MIDI system, including the ability to send and receive system exclusive messages.

var midi = null;  // global MIDIAccess object

function onMIDISuccess( midiAccess ) {
  console.log( "MIDI ready!" );
  midi = midiAccess;  // store in the global (in real usage, would probably keep in an object instance)
}

function onMIDIFailure(msg) {
  console.log( "Failed to get MIDI access - " + msg );
}

navigator.requestMIDIAccess( { sysex: true } ).then( onMIDISuccess, onMIDIFailure );

Listing Inputs and Outputs

This example gets the list of the input and output ports and prints their information to the console log, using ES6 for...of notation.

function listInputsAndOutputs( midiAccess ) {
  for (var entry of midiAccess.inputs) {
    var input = entry[1];
    console.log( "Input port [type:'" + input.type + "'] id:'" + input.id +
      "' manufacturer:'" + input.manufacturer + "' name:'" + input.name +
      "' version:'" + input.version + "'" );
  }

  for (var entry of midiAccess.outputs) {
    var output = entry[1];
    console.log( "Output port [type:'" + output.type + "'] id:'" + output.id +
      "' manufacturer:'" + output.manufacturer + "' name:'" + output.name +
      "' version:'" + output.version + "'" );
  }
}

Handling MIDI Input

This example prints incoming MIDI messages from all input ports to the console log.

function onMIDIMessage( event ) {
  var str = "MIDI message received at timestamp " + event.timeStamp + " [" + event.data.length + " bytes]: ";
  for (var i=0; i<event.data.length; i++) {
    str += "0x" + event.data[i].toString(16) + " ";
  }
  console.log( str );
}

function startLoggingMIDIInput( midiAccess ) {
  midiAccess.inputs.forEach( function(entry) {entry.onmidimessage = onMIDIMessage;});
}

Sending MIDI Messages to an Output Device

This example sends a middle C note on message immediately on MIDI channel 1 (MIDI channels are 0-indexed, but generally referred to as channels 1-16), and queues a corresponding note off message for 1 second later.

function sendMiddleC( midiAccess, portID ) {
  var noteOnMessage = [0x90, 60, 0x7f];    // note on, middle C, full velocity
  var output = midiAccess.outputs.get(portID);
  output.send( noteOnMessage );  //omitting the timestamp means send immediately.
  output.send( [0x80, 60, 0x40], window.performance.now() + 1000.0 ); // Inlined array creation- note off, middle C,  
                                                                      // release velocity = 64, timestamp = now + 1000ms.
}

A Simple Loopback

This example loops all input messages on the first input port to the first output port - including system exclusive messages.

var midi = null;  // global MIDIAccess object
var output = null;

function echoMIDIMessage( event ) {
  if (output) {
    output.send( event.data, event.timeStamp );
  }
}

function onMIDISuccess( midiAccess ) {
  console.log( "MIDI ready!" );
  var input = midiAccess.inputs.values().next().value;
  if (input)
    input.onmidimessage = echoMIDIMessage;
  output = midiAccess.outputs.values().next().value;
  if (!input || !output)
    console.log("Uh oh! Couldn't get i/o ports.");
}

function onMIDIFailure(msg) {
  console.log( "Failed to get MIDI access - " + msg );
}

navigator.requestMIDIAccess().then( onMIDISuccess, onMIDIFailure );

A Simple Monophonic Sine Wave MIDI Synthesizer

This example listens to all input messages from all available input ports, and uses note messages to drive the envelope and frequency on a monophonic sine wave oscillator, creating a very simple synthesizer, using the Web Audio API. Note on and note off messages are supported, but sustain pedal, velocity and pitch bend are not. This sample is also hosted on webaudiodemos.appspot.com.

    var context=null;   // the Web Audio "context" object
    var midiAccess=null;  // the MIDIAccess object.
    var oscillator=null;  // the single oscillator
    var envelope=null;    // the envelope for the single oscillator
    var attack=0.05;      // attack speed
    var release=0.05;   // release speed
    var portamento=0.05;  // portamento/glide speed
    var activeNotes = []; // the stack of actively-pressed keys

    window.addEventListener('load', function() {
      // patch up prefixes
      window.AudioContext=window.AudioContext||window.webkitAudioContext;

      context = new AudioContext();
      if (navigator.requestMIDIAccess)
        navigator.requestMIDIAccess().then( onMIDIInit, onMIDIReject );
      else
        alert("No MIDI support present in your browser.  You're gonna have a bad time.");

      // set up the basic oscillator chain, muted to begin with.
      oscillator = context.createOscillator();
      oscillator.frequency.setValueAtTime(110, 0);
      envelope = context.createGain();
      oscillator.connect(envelope);
      envelope.connect(context.destination);
      envelope.gain.value = 0.0;  // Mute the sound
      oscillator.start(0);  // Go ahead and start up the oscillator
    } );

    function onMIDIInit(midi) {
      midiAccess = midi;

      var haveAtLeastOneDevice=false;
      var inputs=midiAccess.inputs.values();
      for ( var input = inputs.next(); input && !input.done; input = inputs.next()) {
        input.value.onmidimessage = MIDIMessageEventHandler;
        haveAtLeastOneDevice = true;
      }
      if (!haveAtLeastOneDevice)
        alert("No MIDI input devices present.  You're gonna have a bad time.");
    }

    function onMIDIReject(err) {
      alert("The MIDI system failed to start.  You're gonna have a bad time.");
    }

    function MIDIMessageEventHandler(event) {
      // Mask off the lower nibble (MIDI channel, which we don't care about)
      switch (event.data[0] & 0xf0) {
        case 0x90:
          if (event.data[2]!=0) {  // if velocity != 0, this is a note-on message
            noteOn(event.data[1]);
            return;
          }
          // if velocity == 0, fall thru: it's a note-off.  MIDI's weird, y'all.
        case 0x80:
          noteOff(event.data[1]);
          return;
      }
    }

    function frequencyFromNoteNumber( note ) {
      return 440 * Math.pow(2,(note-69)/12);
    }

    function noteOn(noteNumber) {
      activeNotes.push( noteNumber );
      oscillator.frequency.cancelScheduledValues(0);
      oscillator.frequency.setTargetAtTime( frequencyFromNoteNumber(noteNumber), 0, portamento );
      envelope.gain.cancelScheduledValues(0);
      envelope.gain.setTargetAtTime(1.0, 0, attack);
    }

    function noteOff(noteNumber) {
      var position = activeNotes.indexOf(noteNumber);
      if (position!=-1) {
        activeNotes.splice(position,1);
      }
      if (activeNotes.length==0) {  // shut off the envelope
        envelope.gain.cancelScheduledValues(0);
        envelope.gain.setTargetAtTime(0.0, 0, release );
      } else {
        oscillator.frequency.cancelScheduledValues(0);
        oscillator.frequency.setTargetAtTime( frequencyFromNoteNumber(activeNotes[activeNotes.length-1]), 0, portamento );
      }
    }

Security and Privacy Considerations of MIDI

There are two primary security and privacy concerns with adding the Web MIDI API to the web platform:

  1. Allowing the enumeration of the user's MIDI interfaces is a potential target for fingerprinting (that is, uniquely identifying a user by the specific MIDI interfaces they have connected). Note that in this context, what can be enumerated is the MIDI interfaces - not, for example, an individual sampler or synthesizer plugged into a MIDI interface, as these would not be enumerated, unless those devices are connected to the host computer with USB (USB-MIDI devices typically have their own MIDI interface, and would be enumerated). The interfaces that could be fingerprinted are equivalent to MIDI "ports", and for each device the API will expose the name of the device, manufacturer, and opaque identifier of the MIDI interface (but not any attached devices).

    Few systems will have significant numbers of MIDI devices attached; those systems that do will typically use hardware MIDI interfaces rather than fanning out a dozen USB-MIDI connections through USB hubs. In that case, of course, enumerating the MIDI “devices” will only see the hardware MIDI interface(s), not the synthesizers, samplers, etc. plugged into them on the other side. Given the small number of devices typically plugged in, the amount of information exposed here is fairly symmetric with the fingerprinting exposure of other APIs such as the Gamepad API; the vast majority of systems have relatively few MIDI interfaces attached.

  2. Separate from the fingerprinting concerns of identifying the available ports are concerns around sending and receiving MIDI messages. Those issues are explored in more depth below.

In brief, the general categories of things you can do with MIDI ports are:

  1. Sending short messages (all messages except SysEx)
  2. Receiving short messages (all messages except SysEx)
  3. Sending SysEx messages. SysEx messages include both commonly recognized MIDI Time Code and MIDI Sample Dump Standard, as well as device-specific messages (like “patch control data for a Roland Jupiter-80 synthesizer”) that do not apply to other devices.
  4. Receiving SysEx messages.

The impact of each of these is:

  1. Sending short messages: sending note-on/note-off/controller messages would let you cause sounds to be played by attached devices, including (on Mac and Windows) any default virtual synthesizers. This by itself does not cause any concerning exposure - you can already make sounds without interaction, through <audio>, Flash, or Web Audio. Some attached devices might be professional lighting control systems, so it’s possible you could control stage lighting; however, this is extremely rare, and no known system has the ability to cause lasting damage or information leakage based solely on short messages; at worst, a malicious page could flash lights, and the user could close the page and reset their lighting controller.
  2. Receiving short messages: receiving note-on/note-off/controller messages would not cause any information exposure or security issues, as there is no identifying data being received, just a stream of controller messages - all of which must be initiated by the user on that MIDI device (except clock-type messages). This is very analogous to receiving keyboard or mouse events.
  3. Sending and Receiving SysEx. This is the biggest concern, because it would be possible to write code that looked for system-specific responses to sysex messages, which could identify the hardware available, and then use it to download data - e.g. samples stored in a sampler - or replace that data (erasing sample data or patches in the device), although both of these scenarios would have to be coded for a particular device. It is also possible that some samplers might enable a system exclusive message to start recording a sample - so if the sampler happened to have a dedicated microphone attached (uncommon in practice, but possible), it would be possible to write code specific to a particular device that could record a short sample of sound and then upload it to the network without further user intervention. (You could not stream audio from the device; most samplers have fairly limited memory, and MIDI Sample Dump sysex is a slow way to transfer data - it has to transcode into 7-bit - so it is unlikely you could listen in for long periods.) More explicit fingerprinting is also a concern, as patch information, stored samples and user configuration could uniquely identify the system (although again, this requires much device-specific code; there is no standardized “grab all patches and hash it” capability). All of this suggests that system exclusive messages are in a security category of their own.

It's also useful to examine what scenarios are enabled by MIDI, mapped against these features:

  1. Receiving short messages. This is the most attractive scenario for Web MIDI, as it enables getting input from keyboards, drum pads, guitars, wind controllers, DJ/controllerist controllers, and more, and use those messages as input to control instruments and features in the Web Audio API as well as other control scenarios (MIDI is the protocol of choice for the multi-billion-dollar music production industry for getting physical controllers like knobs and buttons attached to your computer, both in pro/prosumer audio and media applications as well as consumer applications like Garageband.)
  2. Sending short messages - it’s tempting to say sending is significantly less interesting, as the scenario of attached output devices like hardware synthesizers is less common in today's market. The major exception to this is that many of the MIDI controllers have external host control of their indicator lights, and this makes them dramatically more useful. For example, the very popular Novation Launchpad controller uses MIDI note on/off messages sent to it to turn on/off and change colors of the buttons. The same is true of nearly all DJ controllers.
  3. Sending and receiving SysEx - obviously, for more advanced communication with high-end hardware devices, SysEx is required. Unfortunately, some common MIDI commands are also sent as system exclusive messages (MIDI Machine Control, for example - generic start/stop/rew/ffw commands) - and many devices use system exclusive to program patches, send advanced controller messages, download firmware, etc., which are much-demanded scenarios for Web MIDI. Some devices use sysex as a direct control protocol, as they can pack more data into a single “message”, and most devices use SysEx as a way to save and restore patches and configuration information on less-expensive computer storage. Several of the major music hardware producers have expressed strong interest in using Web MIDI to provide web-based configuration and programming interfaces to their hardware. In short, disabling sysex altogether would disable far more than just high-end scenarios.

In short: the additional fingerprinting exposure of enumerating MIDI devices is directly analogous to the Gamepad API’s additional fingerprinting exposure through gamepad enumeration; typical users will only have at most a few devices connected, their configuration may change, and the information exposed is about the interface itself (i.e., no user-configured data).

The additional security concern for receiving short messages is also small - it’s analogous to listening to keyboard, mouse, mobile/laptop accelerometer, touch input or gamepad events; there is no additional information exposed, and all messages other than clock signals must be initiated by the user.

The additional concerns about sending short messages are analogous to any audio output - you cannot overwrite or expose user information, but you can make sounds happen, change patches, or (in rare configurations) toggle lights - but non-destructively, and not persistently.

System Exclusive, on the other hand, has much less bounded potential, so distinguishing requests for SysEx separately in the API is a good idea, in order to provide more careful user security hooks. The suggested security model explicitly allows user agents to require the user's approval before giving access to MIDI devices, although it does not currently require prompting the user for this approval; it also requires that system exclusive support be explicitly requested as part of that access request.