Why MIDI 2.0 Matters in 2026
MIDI 1.0 served musicians and developers for over 40 years. But in 2026, the transition to MIDI 2.0 is no longer theoretical — it's happening.
Windows MIDI Services went GA in February 2026. For the first time, both Windows and macOS offer native, OS-level MIDI 2.0 support. Microsoft's new stack brings UMP-native transport, multi-client access, and microsecond-precision timestamps to Windows 11 — a watershed moment for cross-platform MIDI 2.0 adoption.
Hardware manufacturers are shipping real products. KORG's Keystage controller uses Property Exchange to auto-configure synths like multi/poly and wavestate. Roland's A-88MKII implements the Piano Profile with Synthogy Ivory. StudioLogic, Rhodes, and others are following suit.
DAWs are catching up too. Cubase 14 delivers full MIDI 2.0 to VST3 translation with high-resolution velocity and per-note controllers. Logic Pro supports Property Exchange for automatic device control mapping. Studio One handles MIDI-CI Discovery and Profile negotiation.
The ecosystem is ready. The question for Apple platform developers is: how do you build for MIDI 2.0 in Swift?
What is MIDI2Kit?
MIDI2Kit is a Swift library that provides the complete MIDI 2.0 protocol stack for Apple platforms. It covers everything from low-level UMP parsing to high-level device discovery and property exchange — the layers that CoreMIDI leaves to you.
While Apple's CoreMIDI gives you the transport (sending and receiving UMP packets), MIDI2Kit builds the MIDI-CI protocol on top: Discovery, Property Exchange, and Responder capabilities. It is the only Swift library that fully implements these higher-level MIDI 2.0 protocols.
The entire API is built on Swift Concurrency. All managers are Swift actors. All data types are Sendable. Event streams use AsyncSequence. If you're writing modern Swift, MIDI2Kit feels native.
- 602+ tests across 77 test suites
- Swift 6 strict concurrency support from day one
- Zero external dependencies — only Apple frameworks
- All Apple platforms: iOS 17+, macOS 14+, tvOS 17+, watchOS 10+, visionOS 1.0+
Key Features
MIDI-CI Discovery
Automatically discover MIDI 2.0 capable devices across your MIDI connections. MIDI2Kit sends Discovery messages, parses responses, and delivers results as a real-time AsyncSequence event stream. Device appearance, disappearance, and capability changes are all tracked for you.
Property Exchange
Read and write device properties via the MIDI-CI Property Exchange protocol. MIDI2Kit handles the full PE transaction lifecycle: chunked SysEx assembly, request/response correlation, timeout management, and retry logic. Query device info, browse resource lists, or set custom parameters — all with typed, async/await APIs.
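Under the hood, PE payloads arrive as numbered SysEx chunks that must be stitched back together before the body can be parsed. Here is an illustrative, self-contained sketch of that reassembly step — the types below are hypothetical stand-ins for this post, not MIDI2Kit's internal API:

```swift
import Foundation

/// Hypothetical stand-in for one Property Exchange SysEx chunk.
/// Real PE messages carry "number of chunks" / "number of this chunk"
/// fields alongside a request ID in their message header.
struct PEChunk {
    let requestID: UInt8   // correlates chunks of one transaction
    let chunkIndex: Int    // 1-based, per the PE spec
    let totalChunks: Int
    let payload: Data
}

/// Collects chunks per request ID and returns the full body once
/// every chunk for that request has arrived.
struct PEReassembler {
    private var pending: [UInt8: [Int: Data]] = [:]

    mutating func accept(_ chunk: PEChunk) -> Data? {
        pending[chunk.requestID, default: [:]][chunk.chunkIndex] = chunk.payload
        guard let parts = pending[chunk.requestID],
              parts.count == chunk.totalChunks else { return nil }
        pending[chunk.requestID] = nil
        // Concatenate in chunk order to rebuild the property body.
        return parts.sorted { $0.key < $1.key }
                    .reduce(into: Data()) { $0.append($1.value) }
    }
}
```

Note that chunks may arrive out of order; the reassembler only emits the body once the final piece lands, which is exactly the kind of bookkeeping MIDI2Kit's transaction lifecycle takes off your hands.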
UMP Conversion
Bidirectional MIDI 1.0 and MIDI 2.0 message conversion. SysEx7 encoding/decoding, RPN/NRPN translation, and multi-packet reassembly are built in. The UMPBuilder and UMPParser give you full control over the Universal MIDI Packet format.
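For a feel of what UMPBuilder abstracts away, here is a self-contained sketch of the raw bit layout of a MIDI 2.0 Note On (a 64-bit UMP, message type 0x4), with a simple 7-to-16-bit velocity upscale. This is illustrative spec-level packing written for this post, not MIDI2Kit's API:

```swift
/// Packs a MIDI 2.0 Channel Voice Note On into its two 32-bit UMP words.
/// Word 0: [type:4][group:4][status:4][channel:4][note:8][attrType:8]
/// Word 1: [velocity:16][attrData:16]
func noteOnUMP(group: UInt8, channel: UInt8, note: UInt8, velocity16: UInt16) -> (UInt32, UInt32) {
    let word0 = UInt32(0x4) << 28           // message type: 64-bit channel voice
        | UInt32(group & 0xF) << 24
        | UInt32(0x9) << 20                 // status: Note On
        | UInt32(channel & 0xF) << 16
        | UInt32(note & 0x7F) << 8          // low byte stays 0: attribute type "none"
    let word1 = UInt32(velocity16) << 16    // attribute data 0 in the low 16 bits
    return (word0, word1)
}

/// Naive 7-bit to 16-bit velocity upscale by left shift.
/// (The MIDI 2.0 translation spec also defines a bit-repeat scheme for
/// filling the low bits; a plain shift is shown here for clarity.)
func upscaleVelocity(_ v7: UInt8) -> UInt16 {
    UInt16(v7 & 0x7F) << 9
}
```

So middle C (note 60) at MIDI 1.0 velocity 100 on group 0, channel 0 packs to words 0x4090_3C00 and 0xC800_0000 — the bit twiddling a UMP library exists to hide.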
Responder API
Publish your own device on the MIDI-CI network. MIDI2ResponderClient lets you register typed resources, respond to property queries, and control which devices can connect via connection policy filtering. Build a virtual instrument, a control surface, or a diagnostic tool — all visible to other MIDI 2.0 devices.
Quick Example: Discover MIDI 2.0 Devices in 10 Lines
Here's everything you need to discover MIDI 2.0 devices and read their properties:
```swift
import MIDI2Kit

let client = try MIDI2Client(name: "MyApp")
try await client.start()

for await event in await client.makeEventStream() {
    if case .deviceDiscovered(let device) = event {
        print("Found: \(device.displayName)")
        if device.supportsPropertyExchange {
            let info = try await client.getDeviceInfo(from: device.muid)
            print("Product: \(info.productName ?? "Unknown")")
        }
    }
}
```
That's it. No manual SysEx construction. No callback pyramids. No thread-safety concerns. The MIDI2Client actor handles all concurrency for you.
Want to go the other direction? Here's a responder that publishes a custom resource:
```swift
let responder = try MIDI2ResponderClient(name: "MyDevice")

// DeviceStatus is your own Encodable model.
await responder.addResource("Status") {
    ComputedResource(getTyped: { _ in
        DeviceStatus(battery: 85, firmware: "2.1.0")
    })
}

try await responder.start()
```
What's Next
MIDI2Kit covers Discovery and Property Exchange today. Here's what's coming:
- Profile Configuration — Currently in design (MIDI-CI v1.2 compliant). Enable the Piano Profile, mixer mappings, and custom profiles. The MIDI2Profile module will ship as a new target in an upcoming release.
- Network MIDI 2.0 — UDP-based MIDI 2.0 transport for local area networks. As hardware support grows (Amenote, Bome, Kissbox), we'll add a transport backend.
- DocC Documentation — Full API documentation with interactive tutorials, hosted on GitHub Pages.
- Sample Apps — A MIDI 2.0 Device Explorer for iOS and a Monitor app for macOS.
We're building in the open. Check the GitHub Issues for the current roadmap, and join the discussion if there's a feature you need.
Get Started
Add MIDI2Kit to your project with Swift Package Manager:
```swift
// Package.swift
dependencies: [
    .package(
        url: "https://github.com/midi2kit/MIDI2Kit-SDK.git",
        from: "1.0.0"
    )
]
```
Or in Xcode: File → Add Package Dependencies and paste the repository URL.
MIDI2Kit is distributed both as source code (for debugging and learning) and as a prebuilt XCFramework (to keep your build times down). Choose whichever fits your workflow.
Ready to build with MIDI 2.0?
MIDI2Kit is open source, MIT licensed, and ready for production.