Why You Should Care About MIDI 2.0 Now
If you're building a music app on Apple platforms, MIDI 2.0 is no longer a future consideration — it's a present reality. macOS has supported MIDI 2.0 at the CoreMIDI level since macOS 11, and the ecosystem of MIDI 2.0 hardware and software is growing rapidly.
MIDI 2.0 brings transformative improvements over the 40-year-old MIDI 1.0 standard:
- 16-bit velocity and 32-bit controller resolution — 512 and over 33 million times the precision of MIDI 1.0's 7-bit values, respectively
- Per-note controllers — apply pitch bend, pressure, and custom parameters to individual notes
- Device Discovery (MIDI-CI) — devices automatically negotiate capabilities
- Property Exchange — query and set device parameters without proprietary SysEx
- Bidirectional communication — devices can talk back to hosts
But Apple's CoreMIDI only provides the transport layer: sending and receiving raw UMP packets. Everything above that — discovery, property exchange, protocol negotiation — is left to you. That's where MIDI2Kit comes in.
MIDI2Kit handles the protocol complexity so you can focus on building great music experiences.
MIDI 2.0 Basics for Swift Developers
The Universal MIDI Packet (UMP)
MIDI 2.0 replaces the byte-stream format of MIDI 1.0 with a new packet format called the Universal MIDI Packet (UMP). Every UMP is 32-bit aligned and can be 1, 2, 3, or 4 words (32-bit integers) long. The first 4 bits of the first word encode the message type, which determines the packet length and semantics.
Here's how the message types break down:
- Type 0x0 — Utility messages (NOOP, JR Clock, JR Timestamp; 1 word)
- Type 0x1 — System Common and System Real Time (1 word)
- Type 0x2 — MIDI 1.0 Channel Voice (1 word, legacy compatibility)
- Type 0x3 — Data messages / SysEx 7-bit (2 words)
- Type 0x4 — MIDI 2.0 Channel Voice (2 words, the new hotness)
- Type 0x5 — Data messages / SysEx 8-bit (4 words)
With MIDI2Kit, you don't need to manually parse these bit patterns. The UMPParser and UMPBuilder types handle encoding and decoding for you.
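To see what that abstraction saves you, here's a minimal standalone sketch of the classification step — `umpWordCount` is a hypothetical helper written for illustration, not part of MIDI2Kit, and it covers only the message types listed above:

```swift
// Return how many 32-bit words a UMP occupies, based on the
// message-type nibble in bits 28-31 of the first word.
func umpWordCount(firstWord: UInt32) -> Int {
    switch firstWord >> 28 {
    case 0x0, 0x1, 0x2: return 1 // Utility, System, MIDI 1.0 Channel Voice
    case 0x3, 0x4:      return 2 // SysEx 7-bit data, MIDI 2.0 Channel Voice
    case 0x5:           return 4 // 128-bit data / SysEx 8-bit
    default:            return 1 // Reserved types omitted for brevity
    }
}

// A MIDI 2.0 Note On starts with type nibble 0x4, so it spans 2 words:
assert(umpWordCount(firstWord: 0x4090_3C00) == 2)
```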
MIDI-CI: The Intelligence Layer
MIDI-CI (Capability Inquiry) is the protocol layer that sits on top of UMP transport. It enables devices to discover each other, negotiate capabilities, exchange properties, and configure profiles. MIDI-CI messages are transported as SysEx within UMP packets.
MIDI-CI defines three major feature areas:
- Discovery — Devices announce their presence and capabilities (manufacturer, model, supported features)
- Property Exchange (PE) — A JSON-based request/response protocol for reading and writing device parameters
- Profile Configuration — Standardized behavior profiles (e.g., Piano Profile, Drawbar Organ Profile)
Setting Up Your Project
MIDI2Kit integrates via Swift Package Manager. Add it to your Package.swift:
// Package.swift
import PackageDescription
let package = Package(
    name: "MyMIDIApp",
    platforms: [.macOS(.v14), .iOS(.v17)],
    dependencies: [
        .package(
            url: "https://github.com/midi2kit/MIDI2Kit-SDK.git",
            from: "1.0.0"
        )
    ],
    targets: [
        .executableTarget(
            name: "MyMIDIApp",
            dependencies: [
                .product(name: "MIDI2Kit", package: "MIDI2Kit-SDK")
            ]
        )
    ]
)
Or in Xcode: File → Add Package Dependencies, paste https://github.com/midi2kit/MIDI2Kit-SDK.git, and select the MIDI2Kit product.
MIDI2Kit has zero external dependencies — it only relies on Apple system frameworks (CoreMIDI, Foundation). It supports iOS 17+, macOS 14+, tvOS 17+, watchOS 10+, and visionOS 1.0+.
MIDI2Kit is also available as a pre-built XCFramework for zero build-time integration. See the GitHub repository for details.
Sending and Receiving UMP Messages
Creating a MIDI2Kit Client
The central type in MIDI2Kit is MIDI2Client, a Swift actor that manages your connection to the MIDI subsystem. Here's how to initialize and start it:
import MIDI2Kit
// Create a client with your app's name
let client = try MIDI2Client(name: "MyMIDIApp")
// Start the client — this connects to CoreMIDI
// and begins MIDI-CI Discovery automatically
try await client.start()
Listening for UMP Events
MIDI2Kit delivers all events through an AsyncSequence-based event stream. This integrates naturally with Swift Concurrency:
// Create an event stream
let eventStream = await client.makeEventStream()
// Process events in a structured concurrency context
Task {
    for await event in eventStream {
        switch event {
        case .midi2ChannelVoice(let msg):
            switch msg.status {
            case .noteOn:
                print("Note On: \(msg.note) velocity: \(msg.velocity32)")
            case .noteOff:
                print("Note Off: \(msg.note)")
            case .controlChange:
                print("CC \(msg.index): \(msg.value32)")
            default:
                break
            }
        case .deviceDiscovered(let device):
            print("New device: \(device.displayName)")
        case .deviceDisappeared(let muid):
            print("Device left: \(muid)")
        default:
            break
        }
    }
}
Sending MIDI 2.0 Messages
Use UMPBuilder to construct messages and send them to a destination:
// Build a MIDI 2.0 Note On message with high-resolution velocity
let noteOn = UMPBuilder.midi2NoteOn(
    group: 0,
    channel: 0,
    note: 60,                      // Middle C
    velocity: .midi2(0xC000_0000), // ~75% in 32-bit resolution
    attributeType: .none
)
// Send to a specific destination
try await client.send(noteOn, to: destinationID)
// Build a per-note pitch bend (a MIDI 2.0 exclusive feature)
let perNoteBend = UMPBuilder.midi2PerNotePitchBend(
    group: 0,
    channel: 0,
    note: 60,
    value: 0xA000_0000 // Above the 0x8000_0000 center — a slight upward bend
)
try await client.send(perNoteBend, to: destinationID)
MIDI 1.0 and 2.0 Conversion
MIDI2Kit provides bidirectional conversion between MIDI 1.0 and MIDI 2.0 message formats. This is essential when communicating with devices that only support MIDI 1.0:
// Convert a MIDI 1.0 message to MIDI 2.0
let midi1NoteOn = UMPBuilder.midi1NoteOn(
    group: 0, channel: 0, note: 60, velocity: 100
)
let midi2Equivalent = UMPConverter.toMIDI2(midi1NoteOn)
// Convert a MIDI 2.0 message back to MIDI 1.0
let downscaled = UMPConverter.toMIDI1(midi2Equivalent)
// Velocity is scaled: MIDI 2.0 16-bit → MIDI 1.0 7-bit
// with proper rounding to minimize precision loss
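If you're curious what "proper scaling" involves: the MIDI Association's translation guidance recommends up-scaling by bit replication rather than a plain shift, so that the 7-bit maximum maps exactly to the 16-bit maximum. A standalone sketch of the 7-to-16-bit velocity up-scale, for illustration only (MIDI2Kit's converter does this for you):

```swift
// Upscale a 7-bit MIDI 1.0 velocity to a 16-bit MIDI 2.0 velocity.
// Shifting into the top bits and repeating the source bit pattern in the
// remainder maps 0 -> 0x0000 and 127 -> 0xFFFF exactly, which a plain
// left shift would not (127 << 9 == 0xFE00, not full scale).
func upscale7to16(_ velocity: UInt8) -> UInt16 {
    let v7 = UInt16(velocity & 0x7F)
    return (v7 << 9) | (v7 << 2) | (v7 >> 5)
}

assert(upscale7to16(0) == 0x0000)
assert(upscale7to16(127) == 0xFFFF) // full scale, not 0xFE00
```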
Device Discovery with MIDI-CI
One of the most powerful features of MIDI 2.0 is automatic device discovery. With MIDI-CI Discovery, your app can find all MIDI 2.0 capable devices, learn their capabilities, and decide how to interact with them — all without user intervention.
How Discovery Works
When you call client.start(), MIDI2Kit automatically begins the MIDI-CI Discovery process:
- A Discovery message is broadcast to all connected MIDI ports
- MIDI 2.0 capable devices respond with their MUID (unique identifier), manufacturer info, and supported features
- MIDI2Kit assigns each device a DiscoveredDevice object and delivers it through the event stream
- If a device disconnects, an InvalidateMUID message is sent and MIDI2Kit emits a deviceDisappeared event
// Discover devices and inspect their capabilities
for await event in await client.makeEventStream() {
    if case .deviceDiscovered(let device) = event {
        print("Device: \(device.displayName)")
        print(" Manufacturer: \(device.manufacturer)")
        print(" Model: \(device.model)")
        print(" MUID: \(device.muid)")
        print(" Supports PE: \(device.supportsPropertyExchange)")
        print(" Supports Profiles: \(device.supportsProfileConfiguration)")

        // Get all currently discovered devices at any time
        let allDevices = await client.discoveredDevices
        print("Total devices online: \(allDevices.count)")
    }
}
Filtering and Managing Devices
In a real application, you'll want to filter devices by capability and maintain a device list:
actor DeviceManager {
    private var devices: [MUID: DiscoveredDevice] = [:]

    func handle(_ event: MIDI2Event) {
        switch event {
        case .deviceDiscovered(let device):
            devices[device.muid] = device
        case .deviceDisappeared(let muid):
            devices.removeValue(forKey: muid)
        default:
            break
        }
    }

    /// Returns only devices that support Property Exchange
    var peCapableDevices: [DiscoveredDevice] {
        devices.values.filter { $0.supportsPropertyExchange }
    }
}
Property Exchange Basics
Property Exchange (PE) is MIDI-CI's request/response protocol for reading and writing device parameters. Think of it as a RESTful API for MIDI devices — you can GET and SET resources using JSON-like payloads, all transported over SysEx within UMP.
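For example, a GET of DeviceInfo exchanges roughly the following JSON. The field names here are illustrative; the actual payloads are defined by the Common Rules for Property Exchange, and the bytes are wrapped in MIDI-CI SysEx that MIDI2Kit manages for you:

```
Inquiry header (what to GET):   {"resource": "DeviceInfo"}
Reply header (HTTP-like code):  {"status": 200}
Reply body (the resource data): {"manufacturer": "ExampleCo", "model": "ExSynth-1", "version": "1.0.2"}
```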
Reading Device Properties
The most common PE operation is reading the DeviceInfo resource, one of the foundational resources that PE-capable devices are expected to support:
// Get basic device information
let deviceInfo = try await client.getDeviceInfo(from: device.muid)
print("Manufacturer: \(deviceInfo.manufacturerName ?? "Unknown")")
print("Product: \(deviceInfo.productName ?? "Unknown")")
print("Firmware: \(deviceInfo.firmwareVersion ?? "Unknown")")
print("Serial: \(deviceInfo.serialNumber ?? "N/A")")
Browsing Resource Lists
Devices can expose a list of available resources that you can query and modify:
// Get the list of available resources on a device
let resources = try await client.getResourceList(from: device.muid)
for resource in resources {
    print("Resource: \(resource.resource)")
    print(" Can GET: \(resource.canGet)")
    print(" Can SET: \(resource.canSet)")
    print(" Can Subscribe: \(resource.canSubscribe)")
}
// Read a specific resource
let channelList = try await client.getProperty(
    resource: "ChannelList",
    from: device.muid
)
// Set a property value
try await client.setProperty(
    resource: "ProgramName",
    value: ["name": "My Custom Patch"],
    on: device.muid
)
Subscribing to Property Changes
PE supports subscriptions, allowing your app to receive real-time updates when a device property changes:
// Subscribe to changes on a resource
let subscription = try await client.subscribe(
    to: "CurrentProgram",
    on: device.muid
)
// Receive updates as an AsyncSequence
Task {
    for await update in subscription.updates {
        print("Program changed: \(update.value)")
    }
}
// Unsubscribe when done
try await subscription.cancel()
Handling Connections: USB, Bluetooth, and Network
MIDI2Kit works with all transport types that CoreMIDI supports. The beauty of the UMP architecture is that your code doesn't need to change based on how a device is connected.
USB MIDI
USB MIDI devices are the most straightforward — they appear automatically through CoreMIDI. MIDI2Kit discovers them as soon as they're plugged in and emits deviceDiscovered events if they support MIDI-CI.
Bluetooth MIDI
For Bluetooth MIDI on iOS, you need to present the system's connection UI, which CoreAudioKit provides:
// Present the system Bluetooth MIDI picker (iOS)
#if os(iOS)
import CoreAudioKit
let picker = CABTMIDICentralViewController()
present(picker, animated: true)
// Once connected, the device appears in MIDI2Kit's
// event stream automatically — no extra setup needed
#endif
Network MIDI
macOS supports Network MIDI sessions through CoreMIDI. MIDI2Kit treats network MIDI endpoints the same as any other — once established, discovery and property exchange work identically:
// Network MIDI sessions are managed through
// Audio MIDI Setup on macOS.
// Once a session is established, endpoints appear
// to MIDI2Kit like any other MIDI device.
// You can identify the transport type if needed:
let endpoint = await client.endpoints.first {
    $0.displayName == "Network Session 1"
}
if let endpoint {
    print("Transport: \(endpoint.transportType)")
    // .usb, .bluetooth, .network, etc.
}
Common Patterns and Best Practices
1. Use Structured Concurrency
MIDI2Kit is built on Swift actors and AsyncSequence. Embrace structured concurrency to avoid common pitfalls:
// Good: Use TaskGroup for parallel operations
try await withThrowingTaskGroup(of: Void.self) { group in
    let devices = await client.discoveredDevices
    for device in devices where device.supportsPropertyExchange {
        group.addTask {
            let info = try await client.getDeviceInfo(from: device.muid)
            print("\(device.displayName): \(info.productName ?? "Unknown")")
        }
    }
}

// Good: Cancel event processing cleanly
let task = Task {
    for await event in await client.makeEventStream() {
        handleEvent(event)
    }
}
// Later, when shutting down:
task.cancel()
try await client.stop()
2. Handle Device Disconnection Gracefully
Devices can disconnect at any time. Always handle the deviceDisappeared event and clean up:
for await event in await client.makeEventStream() {
    switch event {
    case .deviceDiscovered(let device):
        await deviceManager.add(device)
        // Start PE queries only after discovery
        if device.supportsPropertyExchange {
            Task { await fetchDeviceDetails(device) }
        }
    case .deviceDisappeared(let muid):
        await deviceManager.remove(muid)
        // Cancel any pending PE transactions for this device
        await cancelPendingRequests(for: muid)
    default:
        break
    }
}
3. Respect Rate Limits
MIDI-CI transactions are SysEx-based and can be bandwidth-intensive. Avoid flooding devices with rapid-fire PE requests:
// Bad: Rapid-fire requests
for resource in resources {
    let value = try await client.getProperty(
        resource: resource.name, from: muid
    )
}

// Better: Rate-limit with a small delay
for resource in resources {
    let value = try await client.getProperty(
        resource: resource.name, from: muid
    )
    try await Task.sleep(for: .milliseconds(50))
}
4. Build a Responder for Testing
MIDI2Kit's MIDI2ResponderClient lets you create a virtual MIDI 2.0 device — perfect for testing without hardware:
// Create a virtual MIDI 2.0 responder
let responder = try MIDI2ResponderClient(
    name: "Virtual Synth",
    manufacturer: "MyCompany",
    model: "TestSynth"
)
// Add a typed resource
await responder.addResource("DeviceStatus") {
    ComputedResource(getTyped: { context in
        DeviceStatus(
            battery: 85,
            firmware: "2.1.0",
            activeVoices: 12
        )
    })
}
// Add a settable resource
await responder.addResource("ProgramName") {
    StoredResource(
        initialValue: ProgramInfo(name: "Init", bank: 0, number: 0),
        onSet: { newValue, context in
            print("Program changed to: \(newValue.name)")
            return .ok
        }
    )
}
// Start — the responder is now visible to other MIDI 2.0 apps
try await responder.start()
5. Use the Type System
MIDI2Kit uses strongly typed values wherever possible. Prefer typed APIs over raw bytes:
// Prefer typed velocity over raw integers
let velocity = Velocity.midi2(0xB000_0000) // ~69%
let velocity7 = Velocity.midi1(100) // 7-bit
let velocityMax = Velocity.max // Full velocity
// Use note numbers with octave info
let note = NoteNumber(60) // Middle C
print(note.name) // "C4"
print(note.frequency) // 261.63 Hz
// Build messages with type safety
let msg = UMPBuilder.midi2NoteOn(
    group: 0,
    channel: 0,
    note: note,
    velocity: velocity,
    attributeType: .none
)
Where to Go from Here
This guide covered the fundamentals of MIDI 2.0 development with Swift and MIDI2Kit. Here are some next steps:
- Read the API Reference for complete documentation
- Explore the Guides for deep dives into Property Exchange, Responders, and UMP parsing
- Check out the example projects on GitHub
- Read the official MIDI 2.0 specification from the MIDI Association
MIDI 2.0 is a generational upgrade to how musical devices communicate. With MIDI2Kit, you can start building for it today in pure Swift — no C bridges, no callback pyramids, no thread-safety concerns.
Ready to build with MIDI 2.0?
MIDI2Kit is open source, MIT licensed, and ready for production.