Haptics on Apple Platforms
Developer guide introducing all the APIs provided by Apple around Haptics
Haptics create an experience of touch by applying forces, vibrations, or motions to the user. Apple provides various APIs to expose Haptics to app developers depending on the platform. This blog post serves as an introduction to these various techniques.
iOS
Haptics engage people's sense of touch to enhance the experience of interacting with onscreen interfaces. For example, the system plays haptics in addition to visual and auditory feedback to highlight the confirmation of an Apple Pay transaction. Haptics can also enhance touch gestures and interactions like scrolling through a picker or toggling a switch.
From Apple's Human Interface Guidelines we learn that standard UI elements, e.g. switches, sliders, and pickers, play Apple-designed system haptics by default. The HIG is an excellent place to learn more about system-defined haptics.
You can use a feedback generator to play one of several predefined haptic patterns in three categories: notification, impact, and selection.
Finally, you can use the CoreHaptics framework for more complex scenarios. It allows you to create a wide range of different haptic experiences by combining transient and continuous events, varying sharpness and intensity, and including optional audio content.
Feedback Generator
UIKit gives you feedback generators to add your own feedback to custom views and controls by playing one of several predefined haptic patterns.
The abstract class UIFeedbackGenerator is the superclass for all feedback generators; you don't instantiate it directly. Instead, use one of its three subclasses:
- UIImpactFeedbackGenerator: indicates that an impact has occurred, e.g. when a view snaps into place
- UISelectionFeedbackGenerator: indicates a change in selection
- UINotificationFeedbackGenerator: indicates that a task has succeeded, failed, or produced a warning
Here is an example of using UINotificationFeedbackGenerator in SwiftUI.
import SwiftUI
import UIKit

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Hello, World!")
                .onTapGesture {
                    let generator = UINotificationFeedbackGenerator()
                    generator.notificationOccurred(.success)
                }
        }
    }
}
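The other two generators work the same way. Here is a minimal sketch using UIImpactFeedbackGenerator; calling prepare() ahead of time spins up the Taptic Engine so the feedback plays with minimal latency (the snap-into-place scenario is illustrative):

```swift
import UIKit

// A feedback generator for physical impacts; the style controls the felt "weight".
let impactGenerator = UIImpactFeedbackGenerator(style: .heavy)

// Inform the Taptic Engine that feedback is imminent, reducing latency.
impactGenerator.prepare()

// Later, e.g. when a dragged view snaps into place:
impactGenerator.impactOccurred()
```

Note that prepare() is optional; the system may put the Taptic Engine back into an idle state if you wait too long after calling it.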
For more details, please read the documentation.
Core Haptics
Compose and play haptic patterns to customize your app's haptic feedback.
Core Haptics lets you add customized haptic and audio feedback to your app. Use haptics to engage users physically, with tactile and audio feedback that gets attention and reinforces actions. Some system-provided interface elements, like pickers, switches, and sliders, automatically provide haptic feedback as users interact with them. With Core Haptics, you extend this functionality by composing and combining haptics beyond the default patterns.
CoreHaptics is its own framework and not part of UIKit. CoreHaptics runs on
- iOS 13.0+
- iPadOS 13.0+
- Mac Catalyst 13.0+
- tvOS 14.0+
I highly recommend the Getting Started With Core Haptics tutorial from raywenderlich.com, followed by Apple's documentation.
Nevertheless, I want to give you an example of using Core Haptics in a SwiftUI application. I encapsulate the use of CoreHaptics in a reference type conforming to ObservableObject.
import CoreHaptics

class HapticManager: ObservableObject {
    private var hapticEngine: CHHapticEngine? = nil

    init() {
        // check that the device has haptics-capable hardware
        let hapticCapability = CHHapticEngine.capabilitiesForHardware()
        guard hapticCapability.supportsHaptics else {
            print("Device does not support Haptics")
            return
        }
        do {
            hapticEngine = try CHHapticEngine()
        } catch let error {
            print("Haptic engine not created: \(error)")
        }
    }
}
I create a custom haptic through an extension.
extension HapticManager {
    func intenseSharpTap() {
        var events = [CHHapticEvent]()

        // create one intense, sharp tap
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: 1)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 1)
        let event = CHHapticEvent(eventType: .hapticTransient, parameters: [intensity, sharpness], relativeTime: 0)
        events.append(event)

        // convert those events into a pattern and play it immediately
        do {
            // the engine must be running before a pattern can be played
            try hapticEngine?.start()
            let pattern = try CHHapticPattern(events: events, parameters: [])
            let player = try hapticEngine?.makePlayer(with: pattern)
            try player?.start(atTime: 0)
        } catch {
            print("Failed to play pattern: \(error.localizedDescription).")
        }
    }
}
Finally, I can use the reference type as a @StateObject in my view and assign the haptic through an onTapGesture view modifier.
import SwiftUI

struct ContentView: View {
    @StateObject var hapticManager = HapticManager()

    var body: some View {
        VStack {
            Text("Hello, World!")
                .onTapGesture(perform: hapticManager.intenseSharpTap)
        }
    }
}
macOS
UIKit and CoreHaptics can be leveraged for macOS applications with Mac Catalyst. But if you want to work with haptics related to your Mac's Force Touch trackpad, you have to use the AppKit framework.
AppKit gives you NSHapticFeedbackManager to access haptic feedback management attributes on a system with a Force Touch trackpad.
You can then trigger haptic feedback in three categories, as specified in the enum NSHapticFeedbackManager.FeedbackPattern:
- alignment: to be used in response to the alignment of an object the user is dragging around
- level: to be used as the user moves between discrete levels of pressure
- generic: use this when no other feedback patterns apply.
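A minimal sketch of triggering one of these patterns; note that the case for the "level" category is named .levelChange in the enum, and the feedback is only felt on hardware with a Force Touch trackpad:

```swift
import AppKit

// Ask the system's default haptic feedback performer to play a pattern.
NSHapticFeedbackManager.defaultPerformer.perform(
    .alignment,               // or .levelChange, or .generic
    performanceTime: .default // when to play: .default, .now, or .drawCompleted
)
```

Use .drawCompleted as the performance time when the feedback should coincide with the next screen update, e.g. when a dragged object visibly snaps to a guide.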
watchOS
Apple's Human Interface Guidelines for watchOS explain the system haptics available on Apple Watch.
The following styles of feedback are predefined through WKHapticType:
- notification
- directionUp
- directionDown
- success
- failure
- retry
- start
- stop
- click
- navigationGenericManeuver
- navigationLeftTurn
- navigationRightTurn
They are very easy to use in Swift with WKInterfaceDevice:

WKInterfaceDevice.current().play(.success)
Support Marco Eidinger by becoming a sponsor. Any amount is appreciated!