# Apple iOS export

The Apple Haptic and Audio Pattern (AHAP) format is used to represent haptic signals to be played back on iOS devices.

Note: The AHAP format is only supported on iPhone 8 or later, running iOS 13 or later.

# Exporting

Select the project whose tracks you want to export. Then open the exporter from the menu under File... -> Export to AHAP file...

The exporter interface lets you export individual tracks or all of them: check the checkbox next to each track you would like to include.

At the bottom of the exporter interface, you can specify the directory to which the resulting .ahap and .wav files will be written. To browse your filesystem for a location, press the button labeled "Choose".

Once you have selected the tracks and the target directory, press the button labeled "Export" to begin the process. A notification at the bottom of the screen will confirm when the export has completed.

# Deployment

When inspecting the target directory on your filesystem, you will see files with the .ahap extension and, if any of your exported tracks included an audio block, audio files with a .wav extension.

To incorporate the signals in your iOS application, add all exported files to your app's bundle (for example, by dragging them into your Xcode project).

# React Native

The easiest way to deploy your haptic signals in a React Native application is to use the react-native-hapticlabs npm package:

```js
import { playAHAP } from 'react-native-hapticlabs';

// Play an AHAP file
playAHAP('/path/to/your/ahap/file.ahap');
```

# Swift

In Swift, you can trigger the haptic signals as follows:

```swift
import AVFoundation
import CoreHaptics

var engine: CHHapticEngine?

// Create and configure a haptic engine.
do {
    // Associate the haptic engine with the default audio session
    // to ensure the correct behavior when playing audio-based haptics.
    let audioSession = AVAudioSession.sharedInstance()
    engine = try CHHapticEngine(audioSession: audioSession)
} catch let error {
    print("Engine Creation Error: \(error)")
}

// Resolve the path to the AHAP file before attempting to load it.
guard let path = Bundle.main.path(forResource: "YOUR AHAP FILENAME", ofType: "ahap") else {
    return
}

// Start the engine in case it's idle.
try engine?.start()

// Tell the engine to play your AHAP file.
try engine?.playPattern(from: URL(fileURLWithPath: path))
```

Note that you don't need to reference the audio files associated with your haptic signals explicitly: the engine finds them automatically.
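The system can stop a haptic engine at any time, for example when your app moves to the background. If that matters for your app, you can install CHHapticEngine's `stoppedHandler` and `resetHandler` to observe this and restart the engine. A minimal sketch, assuming `engine` is the optional engine variable from the example above:

```swift
// Log when the system stops the engine, and restart it after a reset.
engine?.stoppedHandler = { reason in
    print("Haptic engine stopped: \(reason)")
}
engine?.resetHandler = {
    // The engine lost its state (e.g. after an audio server restart);
    // start it again so subsequent playPattern calls succeed.
    do {
        try engine?.start()
    } catch {
        print("Failed to restart haptic engine: \(error)")
    }
}
```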

If your project contains stereo tracks, the exporter creates two corresponding AHAP files, one of which is marked as "Channel B". To reproduce the same effect as felt with the Hapticlabs Player application, trigger both signals simultaneously:

```swift
import AVFoundation
import CoreHaptics

var engine: CHHapticEngine?

// Create and configure a haptic engine.
do {
    // Associate the haptic engine with the default audio session
    // to ensure the correct behavior when playing audio-based haptics.
    let audioSession = AVAudioSession.sharedInstance()
    engine = try CHHapticEngine(audioSession: audioSession)
} catch let error {
    print("Engine Creation Error: \(error)")
}

// Resolve the paths to the AHAP files before attempting to load them.
guard let pathA = Bundle.main.path(forResource: "YOUR MAIN AHAP FILENAME", ofType: "ahap") else {
    return
}
guard let pathB = Bundle.main.path(forResource: "YOUR CHANNEL B AHAP FILENAME", ofType: "ahap") else {
    return
}

// Start the engine in case it's idle.
try engine?.start()

// Tell the engine to play both AHAPs.
try engine?.playPattern(from: URL(fileURLWithPath: pathA))
try engine?.playPattern(from: URL(fileURLWithPath: pathB))
```
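The two calls above can be wrapped in a small helper that resolves both files and starts them back to back so the channels begin at (nearly) the same time. This is a sketch: `playStereoPattern` and its resource-name parameters are illustrative names, not part of the export.

```swift
import AVFoundation
import CoreHaptics

// A sketch: play a stereo pair of exported AHAP files on one engine.
// The resource names are placeholders for your exported filenames.
func playStereoPattern(on engine: CHHapticEngine,
                       mainResource: String,
                       channelBResource: String) throws {
    guard let pathA = Bundle.main.path(forResource: mainResource, ofType: "ahap"),
          let pathB = Bundle.main.path(forResource: channelBResource, ofType: "ahap") else {
        print("AHAP files not found in the app bundle")
        return
    }
    try engine.start()
    // Trigger both patterns immediately after one another
    // so the two channels stay in sync.
    try engine.playPattern(from: URL(fileURLWithPath: pathA))
    try engine.playPattern(from: URL(fileURLWithPath: pathB))
}
```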