Knowledge Base Help Center
IAR – User Guide
1. Overview: The Power of Audio in Unreal Engine
Welcome to the Insane Audio Recorder (IAR) user guide, the plugin designed to capture, analyze, and synthesize audio in high quality directly within Unreal Engine. This guide will walk you through everything from initial setup to exploring advanced functionalities and MIDI interaction in your Blueprints.
The Insane Audio Recorder (IAR) is a comprehensive solution for all things audio in your Unreal Engine projects. Beyond simply recording audio from various sources, IAR excels with its real-time analysis and MIDI synthesis capabilities, opening up a universe of creative and functional possibilities.
With IAR, you can:
- Record audio from microphones, mixers, files, or even generate simulated audio.
- Extract detailed audio features in real-time (such as RMS, pitch, and Attitude-Gram features).
- Transcribe audio to MIDI events.
- Synthesize audio from MIDI events.
- Control external MIDI devices and play MIDI files.
- Integrate all these functionalities intuitively into your Blueprints.
Get ready to take the sound experience of your projects to a new level with IAR!
2. Using IAR: Basic Audio Recording in Blueprints
The IARAudioComponent is your central control point for all audio operations in IAR. By adding it to an actor, you gain access to audio recording, analysis, and synthesis functionality directly from your Blueprints.
2.1. The Central Component: IARAudioComponent
To interact with IAR, you will need a reference to the IARAudioComponent. This component can be added to any actor in your level that needs to manage audio operations, such as a Game Manager actor, a PlayerController, or the GameMode.
Blueprint Example: How to get a reference to IARAudioComponent
- If the component is on your own actor:
(Your Actor Blueprint) Event BeginPlay ----> Get IARAudioComponent (Self) ----> (Promote to variable) ----> Set (IARAudioComponent_Ref)
- If the component is on another actor:
(Your Blueprint) Event BeginPlay ----> Get All Actors Of Class (Your Actor with IARAudioComponent) ----> For Each Loop ----> Get IARAudioComponent (Array Element) ----> (Promote to variable) ----> Set (IARAudioComponent_Ref)
With the component reference in hand, you will be able to access all its functions and properties.
2.2. Starting and Stopping Audio Recording
The Start Recording and Stop Recording functions on the IARAudioComponent are essential for controlling the lifecycle of your audio recording sessions.
Important Configuration Properties for Basic Recording (FIAR_AudioStreamSettings):
Before starting any recording, you must configure the AudioStreamSettings on your IARAudioComponent. This structure defines how audio will be captured, processed, and saved.
- SampleRate (Integer): The audio sample rate, in Hertz (Hz). Example: 48000 (DVD/Blu-ray quality) or 44100 (CD quality).
- NumChannels (Integer): The number of audio channels. Example: 1 (Mono) or 2 (Stereo).
- BitDepth (Integer): The audio bit depth. Example: 16 (for 16-bit PCM audio).
- Codec (String): The audio codec for file recording (e.g., PCM for uncompressed, or compressed codecs such as AAC or MP3 for other purposes). Example: "PCM".
- Bitrate (Integer): The bitrate in bits per second (bps), mainly for compressed codecs. Example: 192000 (192 kbps for MP3).
- SourceType (Enum EIARAudioSourceType): Defines the origin of the audio to be recorded or processed.
  - Simulated: Generates synthetic audio (a sine wave) – great for pipeline testing.
  - AudioMixer: Captures audio from a system input device (such as a microphone or mixer).
  - AudioFile: Reads audio from a local audio file (.wav, .mp3).
  - Folder: Batch-processes multiple audio/MIDI files in a directory.
  - MIDIInput: Captures MIDI events from a physical MIDI input device.
  - MIDIFile: Reads MIDI events from a local .mid file.
- bEnableResampling (Boolean): If TRUE, IAR will attempt to resample the audio source to the configured SampleRate. Useful for ensuring consistency.
- bEnableRTFeatures (Boolean): When TRUE, IAR does not record audio to a file. Instead, it processes the audio and extracts features in real-time, sending them via a delegate (OnRealTimeAudioFrameReady). Ideal for visualizations or in-game analysis.
- bDebugDrawFeatures (Boolean): Enables debug visualization of features, such as spectrograms and waveforms, in dynamic textures.
- FilePath (String): (Used with SourceType = AudioFile or MIDIFile) The relative path to the audio or MIDI file. Example: Audio/MySong.wav.
- FolderPath (String): (Used with SourceType = Folder) The path to the folder containing media files.
- InputDeviceIndex (Integer): (Used with SourceType = AudioMixer or MIDIInput) The index of the input device to use. You can list the available devices.
- PlaybackSpeed (Float): (Used with SourceType = AudioFile or MIDIFile) The playback speed of the audio/MIDI file.
- bLoopPlayback (Boolean): (Used with SourceType = AudioFile, MIDIFile, or Folder) Whether file/folder playback should loop.
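Since these settings are plain data, it helps to sanity-check them before starting a session. Below is a minimal, engine-free C++ sketch: the struct is a hypothetical mirror of a few documented fields (the real FIAR_AudioStreamSettings USTRUCT lives in the plugin and may differ), and the validator catches common typos early:

```cpp
#include <cassert>
#include <string>

// Hypothetical plain-C++ mirror of FIAR_AudioStreamSettings, for illustration
// only; the real USTRUCT is defined by the plugin and may differ.
struct AudioStreamSettingsSketch
{
    int SampleRate = 48000;    // e.g. 48000 (DVD/Blu-ray) or 44100 (CD)
    int NumChannels = 2;       // 1 = mono, 2 = stereo
    int BitDepth = 16;         // e.g. 16-bit PCM
    std::string Codec = "PCM"; // "PCM" for uncompressed recording
    int Bitrate = 192000;      // mainly relevant for compressed codecs
};

// Check the values before calling Start Recording, so a typo
// (e.g. SampleRate = 4800) fails early instead of producing bad files.
bool IsValid(const AudioStreamSettingsSketch& S)
{
    const bool SampleRateOk = S.SampleRate == 44100 || S.SampleRate == 48000 || S.SampleRate == 96000;
    const bool ChannelsOk   = S.NumChannels == 1 || S.NumChannels == 2;
    const bool BitDepthOk   = S.BitDepth == 16 || S.BitDepth == 24 || S.BitDepth == 32;
    return SampleRateOk && ChannelsOk && BitDepthOk;
}
```

The same guard logic can be reproduced in Blueprint with a few Branch nodes before the Start Recording call.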
Blueprint Example: Setting Up and Starting Basic Audio Recording
This example shows how to start recording audio from a microphone (assuming InputDeviceIndex 0) when the player presses a key, and how to stop it with another key. The audio will be saved as a WAV file in the Saved/Recordings directory.
(Your Actor Blueprint, e.g.: PlayerController)
[Event Input: Key 'G' (Pressed)]
|
|---- If (Is Recording?) -- (False) ----------------------------------------------------
| |
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.SampleRate = 48000) -----------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.NumChannels = 2) --------------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.BitDepth = 16) ----------------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.Codec = "PCM") ----------------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.SourceType = AudioMixer) ------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.InputDeviceIndex = 0) ---------|
| |
|---- IARAudioComponent_Ref->Start Recording ----------------------------------------|
| |
|---- Print String (String: "Audio Recording Started!") -----------------------------|
| |
|---- (True) ------------------------------------------------------------------------|
| |
|---- Print String (String: "Already recording audio!") -----------------------------|
[Event Input: Key 'H' (Pressed)]
|
|---- If (Is Recording?) -- (True) -----------------------------------------------------
| |
|---- IARAudioComponent_Ref->Stop Recording -----------------------------------------|
| |
|---- Print String (String: "Audio Recording Stopped!") -----------------------------|
| |
|---- (False) -----------------------------------------------------------------------|
| |
|---- Print String (String: "Not recording audio!") ---------------------------------|
In the example above, IARAudioComponent_Ref is a variable reference to your IARAudioComponent, obtained previously in the Event BeginPlay.
- Start Recording: This Blueprint node initiates the audio capture and encoding process. It is a latent function, meaning the Blueprint execution flow might pause briefly while the audio pipeline initializes.
  - Important: If AudioStreamSettings.bEnableRTFeatures is set to TRUE, Start Recording will not initiate a file recording. Instead, it will enable the broadcast of real-time audio frames and features through the OnRealTimeAudioFrameReady delegate.
- Stop Recording: This Blueprint node ends the recording session. It finalizes the audio file and releases resources.
2.3. Next Steps: Exploring Audio Devices
To effectively use SourceType = AudioMixer, it’s helpful to know which audio input devices are available on your system.
- Enumerate Audio Input Devices: This Blueprint node (on the IARAudioComponent) populates a list of the audio input devices available on the system.

  (Your Actor Blueprint)
  [Event BeginPlay]
  |
  |---- Get IARAudioComponent (Self) ----> Enumerate Audio Input Devices

- Get Available Audio Input Devices List: Returns an Array of FIAR_AudioDeviceInfo structures, each containing DeviceName, DeviceId, NumInputChannels, and SampleRate. You can iterate over this array to display options to the user in an interface.
- Get Audio Input Device By Id: Allows you to get the details of a specific device by its DeviceId.
With these functions, you can create robust device selection logic for your users.
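As a sketch of that selection logic, the following engine-free C++ mirrors the documented FIAR_AudioDeviceInfo fields in a hypothetical struct and picks the first device that can actually record; the struct and helper names are assumptions, not the plugin's API:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical mirror of FIAR_AudioDeviceInfo, for illustration; the field
// names follow the ones documented above, but the real struct is the plugin's.
struct AudioDeviceInfoSketch
{
    std::string DeviceName;
    std::string DeviceId;
    int NumInputChannels = 0;
    int SampleRate = 0;
};

// Return the index of the first device with enough input channels -- the kind
// of logic you would build in Blueprint by iterating over the result of
// Get Available Audio Input Devices List. Returns -1 if none qualifies.
int FindUsableInputDevice(const std::vector<AudioDeviceInfoSketch>& Devices, int MinChannels)
{
    for (size_t i = 0; i < Devices.size(); ++i)
    {
        if (Devices[i].NumInputChannels >= MinChannels)
            return static_cast<int>(i);
    }
    return -1; // no suitable device found
}
```

The returned index could then feed AudioStreamSettings.InputDeviceIndex before starting a recording.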
3. Advanced Audio Features of IAR
3.1. Real-Time Audio Feature Capture
To leverage IAR’s analysis and processing capabilities without necessarily recording to a file, you can enable Real-Time Features mode. In this mode, IAR processes the audio and extracts valuable information, making it available in real time via a delegate.
How to Enable:
In your IARAudioComponent, within the AudioStreamSettings structure, set the property:
- bEnableRTFeatures (Boolean): Set this to TRUE.
When Start Recording is called with bEnableRTFeatures enabled, IAR will begin processing audio from the configured SourceType, but it will not create an audio file on disk. Instead, it will trigger the OnRealTimeAudioFrameReady delegate periodically, sending the FIAR_JustRTFrame structure (audio version) containing the raw processed audio data and all its extracted features.
Blueprint Example: Receiving Real-Time Audio Data and Features
To access this data, you must subscribe to the OnRealTimeAudioFrameReady delegate of your IARAudioComponent.
(Your Actor Blueprint, e.g.: PlayerController)
[Event BeginPlay]
|
|---- Get IARAudioComponent (Self) ----> (Promote to variable) ----> Set (IARAudioComponent_Ref)
| |
| |---- Bind Event to OnRealTimeAudioFrameReady (from IARAudioComponent_Ref)
| | |
| | |---- Custom Event (OnRealTimeAudioDataReceived)
| |
| |---- Print String (String: "Subscribed to RTFeatures Audio delegate")
[Custom Event: OnRealTimeAudioDataReceived] (with an input "RealTimeFrame" of type FIAR_JustRTFrame)
|
|---- (Here you will process the raw audio data and its features)
|
|---- Print String (String: "RT Audio Frame received! Timestamp: " + RealTimeFrame.Timestamp)
The FIAR_JustRTFrame structure for audio is similar to that for video, but the raw data is TArray<float> (representing audio samples) and the Features substructure contains FIAR_AudioFeatures.
- RealTimeFrame.RawAudioBuffer (Array of float): Contains the raw audio samples of the processed frame. These are floating-point samples, generally normalized between -1.0 and 1.0. They can be used for waveform visualizations or other direct processing.
- RealTimeFrame.SpectrogramTexture (UTexture2D*): If bDebugDrawFeatures is enabled (see section 3.3), this dynamic texture will contain the audio spectrogram, visualizing frequency distribution.
- RealTimeFrame.WaveformTexture (UTexture2D*): If bDebugDrawFeatures is enabled, this texture will contain the audio waveform, visualizing amplitude over time.
- RealTimeFrame.FilteredSpectrogramTexture (UTexture2D*): If Contextual Frequency Filtering is active (see section 3.3), this texture will show the spectrogram after the filter has been applied, useful for visualizing its effect.
- RealTimeFrame.Features (Structure FIAR_AudioFeatures): Contains all the metrics and analyses extracted from the audio.
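As an illustration of what you can do with RawAudioBuffer yourself, this standalone C++ sketch reduces a sample buffer to per-column min/max pairs, the usual first step when drawing a waveform (IAR's own WaveformTexture rendering is internal; this only shows the general technique):

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Reduce a raw sample buffer (like RealTimeFrame.RawAudioBuffer) to one
// (min, max) pair per drawn column -- the classic waveform-envelope step.
std::vector<std::pair<float, float>> BuildWaveformEnvelope(
    const std::vector<float>& Samples, int NumColumns)
{
    std::vector<std::pair<float, float>> Envelope;
    if (Samples.empty() || NumColumns <= 0)
        return Envelope;

    const size_t SamplesPerColumn = std::max<size_t>(1, Samples.size() / NumColumns);
    for (size_t Start = 0; Start < Samples.size(); Start += SamplesPerColumn)
    {
        const size_t End = std::min(Start + SamplesPerColumn, Samples.size());
        float Lo = Samples[Start], Hi = Samples[Start];
        for (size_t i = Start; i < End; ++i)
        {
            Lo = std::min(Lo, Samples[i]);
            Hi = std::max(Hi, Samples[i]);
        }
        Envelope.emplace_back(Lo, Hi); // vertical extent of this column
    }
    return Envelope;
}
```

Each pair maps directly to a vertical line segment in a UMG widget or a dynamic texture.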
3.2. FIAR_AudioFeatures Structure: Detailed Audio Analysis
The FIAR_AudioFeatures structure is at the heart of IAR’s audio analysis. It provides a rich collection of metrics from both the time and frequency domains, as well as features for musical analysis.
Members of FIAR_AudioFeatures:
- Time Domain Metrics:
  - RMSAmplitude (Float): Root Mean Square amplitude. Measures the average energy level of the audio signal – the “loudness” of the sound.
  - PeakAmplitude (Float): The absolute maximum value reached by the signal samples in the frame. Indicates volume peaks.
  - ZeroCrossingRate (Float): The rate at which the audio signal crosses zero. Can be used to estimate the pitch of periodic signals (such as voices or musical instruments) and for audio segmentation.
- Frequency Domain Metrics:
  - PitchEstimate (Float): The estimated fundamental frequency of the audio signal, in Hertz (Hz). For musical sounds, this corresponds to the main note.
  - DetectedNotes (Array of FIAR_AudioNoteFeature): An array of MIDI notes detected in the frame. IAR can detect the three most prominent notes. Each FIAR_AudioNoteFeature element describes a detected note:
    - NoteName (String): Note name (e.g., “C”, “Db”).
    - bIsBemol, bIsSharp (Boolean): Whether the note is flat or sharp.
    - PitchHz (Float): Exact frequency of the note, in Hz.
    - MIDINoteNumber (Integer): MIDI note number (0-127).
    - Octave (Integer): Octave of the note.
    - Velocity (Float): “Strength” or prominence of the note (normalized 0.0-1.0).
    - StartTime, Duration (Float): Start time and duration of the note (if the MIDI transcriber is active).
    - SemitonesFromPrevious (Float): How many semitones the current note differs from the main note of the previous frame. Fundamental for the Mu6 Melodic Contour.
- Attitude-Gram Features: (These metrics are calculated from the history of detected notes over time, not just the current frame, to reflect the musical “style” or “attitude.”)
  - OctavesUsed (Integer): Number of different octaves used in the performance.
  - AccidentalsUsed (Integer): Number of accidentals (sharps/flats) used.
  - AverageNoteDuration (Float): Average duration of notes.
  - MostUsedMidiNote (Integer): The MIDI number of the most frequent note.
  - UniqueMidiNotesCount (Integer): Number of unique MIDI notes played.
  - MaxConsecutiveRepeats (Integer): The maximum number of times the same note was repeated consecutively.
  - AverageBPM (Float): Estimated average tempo, in beats per minute (BPM).
  - AttitudeScore (Float): A score that attempts to quantify the musician’s “attitude,” combining duration, repetition, and melodic-complexity metrics.
3.3. Audio Pre-Processing
IAR offers audio pre-processing functionalities that can be applied directly to raw data, before feature analysis. This is useful for noise reduction, isolating specific frequencies, or shaping the sound.
You can enable and configure these filters on the IARAudioComponent:
- Noise Gate: Attenuates signals below a certain threshold, eliminating background noise.
  - bEnableNoiseGate (Boolean): Activates the noise gate.
  - NoiseGateThresholdRMS (Float): The RMS threshold. Signals with RMS below this value will be attenuated.
- Low-Pass Filter: Lets frequencies below a cutoff value pass through, attenuating higher frequencies.
  - bEnableLowPassFilter (Boolean): Activates the low-pass filter.
  - LowPassCutoffFrequencyHz (Float): The cutoff frequency, in Hz.
- High-Pass Filter: Lets frequencies above a cutoff value pass through, attenuating lower frequencies.
  - bEnableHighPassFilter (Boolean): Activates the high-pass filter.
  - HighPassCutoffFrequencyHz (Float): The cutoff frequency, in Hz.
- Contextual Frequency Filtering: An advanced feature of UIARAdvancedAudioFeatureProcessor that attenuates frequencies outside an expected tonal range, based on the main note detected in the previous frame. This can help “clean” the spectrum for more accurate pitch detection in noisy environments or with multiple instruments.
  - bEnableContextualFrequencyFiltering (Boolean): Enables this filter (requires UIARAdvancedAudioFeatureProcessor).
  - ContextualFilterSemitoneRange (Integer): The width of the allowed frequency “window,” in semitones, centered on the previous note.
  - ContextualFilterAttenuationFactor (Float): The attenuation factor applied to frequencies outside the window.
3.4. Audio to MIDI Transcription (UIARAudioToMIDITranscriber)
IAR can transcribe real-time audio into MIDI events, such as notes (Note On/Note Off). This is performed by the UIARAudioToMIDITranscriber, which receives FIAR_AudioFeatures and generates FIAR_MIDIEvent structures.
To use the transcriber:
- Configure the IARAudioComponent to send real-time features (AudioStreamSettings.bEnableRTFeatures = TRUE).
- The UIARAudioToMIDITranscriber is an internal object of the IARAudioComponent and processes the features automatically.
- Subscribe to the OnMIDITranscriptionEventGenerated delegate: on your IARAudioComponent, this delegate is triggered whenever a MIDI event is transcribed.
(Your Actor Blueprint, e.g.: PlayerController)
[Event BeginPlay]
|
|---- Get IARAudioComponent (Self) ----> (Promote to variable) ----> Set (IARAudioComponent_Ref)
| |
| |---- Bind Event to OnMIDITranscriptionEventGenerated (from IARAudioComponent_Ref->MIDITranscriber)
| | |
| | |---- Custom Event (OnMIDINoteTranscribed)
| |
| |---- Print String (String: "Subscribed to MIDI transcriber")
[Custom Event: OnMIDINoteTranscribed] (with an input "MIDIEvent" of type FIAR_MIDIEvent)
|
|---- If (MIDIEvent.Status == 144) -- (Note On, 0x90 in hexadecimal)
| |
| |---- (True) ---------------------------------------------------------------------
| | |---- Print String (String: "Note ON! MIDI: " + MIDIEvent.Data1 + " Vel: " + MIDIEvent.Data2)
| |
| |---- (False) --------------------------------------------------------------------
| | |---- If (MIDIEvent.Status == 128) -- (Note Off, 0x80 in hexadecimal)
| | | |
| | | |---- (True) -----------------------------------------------------------
| | | | |---- Print String (String: "Note OFF! MIDI: " + MIDIEvent.Data1)
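The status comparisons above follow the standard MIDI channel-message layout (high nibble = message type, low nibble = channel). This standalone C++ sketch makes the decoding explicit; the struct mirrors the documented FIAR_MIDIEvent fields, but the helper names are ours:

```cpp
#include <cassert>

// Illustrative mirror of the FIAR_MIDIEvent fields used in the example above.
struct MidiEventSketch
{
    unsigned char Status = 0; // message type (high nibble) + channel (low nibble)
    unsigned char Data1 = 0;  // note number for Note On/Off
    unsigned char Data2 = 0;  // velocity for Note On/Off
};

bool IsNoteOn(const MidiEventSketch& E)
{
    // Per the MIDI spec, a Note On with velocity 0 is treated as a Note Off.
    return (E.Status & 0xF0) == 0x90 && E.Data2 > 0;
}

bool IsNoteOff(const MidiEventSketch& E)
{
    return (E.Status & 0xF0) == 0x80 ||
           ((E.Status & 0xF0) == 0x90 && E.Data2 == 0);
}

int MidiChannel(const MidiEventSketch& E)
{
    return E.Status & 0x0F; // channel 0-15
}
```

Masking with 0xF0 instead of comparing the full byte means the check also works for events arriving on channels other than 0 (e.g. status 145 = Note On, channel 1).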
3.5. MIDI to Audio Synthesis (UIARMIDIToAudioSynthesizer)
IAR can generate audio from MIDI events, functioning as a rudimentary synthesizer. This is useful for playing MIDI files, monitoring live MIDI input, or bringing your MIDI transcriptions to life.
How to Enable:
- On the IARAudioComponent, locate the SynthesizerInstance property.
- Set bEnableMIDISynthesizerOutput (Boolean) to TRUE.
When enabled, the UIARMIDIToAudioSynthesizer will automatically listen for MIDI events and generate audio.
- If the SourceType of the IARAudioComponent is MIDIFile or MIDIInput, IAR will send the MIDI events it reads directly to the synthesizer.
- If the SourceType is audio (e.g., AudioMixer) and the transcriber is active, the transcribed MIDI events will also feed the synthesizer.
The synthesizer internally creates a USoundWaveProcedural and plays it, making the sound audible in the game.
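The core math of such a synthesizer is standard: convert the MIDI note number to a frequency and render a waveform buffer. The sketch below shows that math in standalone C++; the buffer it produces is the kind of data a USoundWaveProcedural would queue, but IAR's actual synthesis is internal and may be more elaborate:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Equal-temperament tuning: A4 = MIDI note 69 = 440 Hz.
float MidiNoteToFrequency(int Note)
{
    return 440.0f * std::pow(2.0f, (Note - 69) / 12.0f);
}

// Render one mono sine-wave buffer for the given note.
std::vector<float> RenderSine(int MidiNote, float Amplitude, int SampleRate, int NumSamples)
{
    const float Freq = MidiNoteToFrequency(MidiNote);
    std::vector<float> Out(NumSamples);
    for (int n = 0; n < NumSamples; ++n)
        Out[n] = Amplitude * std::sin(2.0f * 3.14159265f * Freq * n / SampleRate);
    return Out;
}
```

In a real synthesizer you would also apply an amplitude envelope so Note Off does not cut the sound with an audible click.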
3.6. Batch Media Processing via Folders (SourceType = Folder)
The SourceType = Folder functionality allows IAR to automatically process collections of media files (audio and MIDI) located in a specific folder. This is a powerful capability for:
- Generating Datasets: Create audio or MIDI datasets from existing collections.
- Converting Media Libraries: Convert large quantities of audio files to MIDI, or MIDI to audio.
- Preparing Content: Adapt your media files to new formats without manual intervention for each file.
How It Works:
When you configure the IARAudioComponent to use SourceType = Folder and start recording, IAR will not perform continuous real-time recording. Instead, it will:
- Scan the input folder (InputFolderPath) for supported media files (.wav, .mp3, .mid).
- For each file found, attempt the following conversions automatically:
  - If the input file is audio (.wav or .mp3), IAR will transcribe it to a MIDI (.mid) file.
  - If the input file is MIDI (.mid), IAR will synthesize it to an audio (.wav) file.
- Save converted files to the output folder (OutputFolderPath), retaining the original file name but with the new extension.
Important: IAR will attempt to perform BOTH conversions (audio to MIDI and MIDI to audio) if files of both types are present in the input folder.
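The naming rule described above (same file name, new extension) can be sketched as a small standalone helper; the function is an illustrative assumption, not the plugin's API:

```cpp
#include <cassert>
#include <string>

// Derive the batch converter's output file name from an input file name:
// audio (.wav/.mp3) becomes MIDI (.mid), and MIDI (.mid) becomes audio (.wav).
// Returns an empty string for unsupported inputs.
std::string DeriveOutputName(const std::string& InputName)
{
    const size_t Dot = InputName.rfind('.');
    if (Dot == std::string::npos)
        return ""; // no extension: unsupported

    const std::string Ext = InputName.substr(Dot);
    if (Ext == ".wav" || Ext == ".mp3")
        return InputName.substr(0, Dot) + ".mid"; // audio -> MIDI transcription
    if (Ext == ".mid")
        return InputName.substr(0, Dot) + ".wav"; // MIDI -> audio synthesis
    return ""; // unsupported media type
}
```

With bOverwriteExistingFiles = FALSE, a file whose derived output name already exists in the output folder is simply skipped.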
Configuration in Blueprint:
To use batch processing via folders, you will need to configure the AudioStreamSettings on your IARAudioComponent and, crucially, define the folder paths.
- Set the Source Type:
  - In your IARAudioComponent, access the AudioStreamSettings structure.
  - Set SourceType to Folder.
- Configure Input and Output Paths: When SourceType = Folder, the IARAudioComponent creates an instance of a UIARFolderSource object to manage the process. Some of its properties are accessible through the created instance:
  - InputFolderPath (String): The path to the folder where your original audio/MIDI files are located. You configure it via the FolderPath property within AudioStreamSettings. Example: IARAudioComponent_Ref->AudioStreamSettings.FolderPath = "C:/MyProject/Content/MyMediaInput". Internally, IAR uses this value to scan for files.
  - OutputFolderPath (String): The path to the folder where converted files will be saved. By default, IAR saves converted files to [ProjectSavedDirectory]/Recording/IAR_ConvertedMedia. If you want a different output path, get a reference to the created UIARFolderSource instance and set this property directly. Example: get IARAudioComponent->CurrentMediaSource and cast it to UIARFolderSource; from there, you can set OutputFolderPath on the returned object.
  - bOverwriteExistingFiles (Boolean): Controls whether existing output files (with the same name) are overwritten during conversion. By default it is FALSE (IAR does not overwrite, skipping conversion for a file if the destination already exists). Like OutputFolderPath, this property belongs to the UIARFolderSource instance and must be accessed via a cast.
- Processing Feedback Delegates: The IARAudioComponent triggers delegates that you can bind in your Blueprints to monitor the progress and result of folder processing:
  - OnFolderProcessingCompleted (Delegate): Triggered when all folder processing completes successfully. Parameters: OutputFolderPath (String) – the path to the folder where the files were saved.
  - OnFolderProcessingError (Delegate): Triggered if an error occurs during file processing or initialization. Parameters: ErrorMessage (String) – a description of the error.
  - OnFolderProcessingProgress (Delegate): Triggered periodically to report conversion progress. Parameters: CurrentFileName (String) – the name of the file currently being processed; ProgressRatio (Float) – the overall progress of the folder (0.0 to 1.0).
Blueprint Example: Setting Up and Starting a Folder Conversion
This example shows how to configure the IARAudioComponent to process a media folder and how to capture the progress and completion events.
(Your Actor Blueprint, e.g.: PlayerController or GameMode)
[Event BeginPlay]
|
|---- Get IARAudioComponent (Self) ----> (Promote to variable) ----> Set (IARAudioComponent_Ref)
| |
| |---- Bind Event to OnFolderProcessingCompleted (from IARAudioComponent_Ref)
| | |
| | |---- Custom Event (OnFolderConversionDone)
| |
| |---- Bind Event to OnFolderProcessingError (from IARAudioComponent_Ref)
| | |
| | |---- Custom Event (OnFolderConversionError)
| |
| |---- Bind Event to OnFolderProcessingProgress (from IARAudioComponent_Ref)
| | |
| | |---- Custom Event (OnFolderConversionProgress)
| |
| |---- Print String (String: "Listeners configured for Folder Processing")
[Custom Event: TriggerFolderConversion] (Can be activated by a UI button, for example)
|
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.SourceType = Folder) ----------------------------------------------------|
| |
|---- Set (IARAudioComponent_Ref->AudioStreamSettings.FolderPath = "C:/Users/YourUser/Documents/MyMediaInput") ----------------| (Set the path to your input folder)
| |
|---- Cast To UIARFolderSource (Object: IARAudioComponent_Ref->CurrentMediaSource) ----> Set bOverwriteExistingFiles = True ---| (Optional: overwrite existing files)
| |
|---- IARAudioComponent_Ref->Start Recording ----------------------------------------------------------------------------------|
| |
|---- Print String (String: "Folder Processing Started!") ---------------------------------------------------------------------|
[Custom Event: OnFolderConversionDone] (Input: OutputFolderPath (String))
|
|---- Print String (String: "Folder Processing COMPLETED! Files in: " + OutputFolderPath)
[Custom Event: OnFolderConversionError] (Input: ErrorMessage (String))
|
|---- Print String (String: "ERROR in Folder Processing: " + ErrorMessage)
[Custom Event: OnFolderConversionProgress] (Input: CurrentFileName (String), ProgressRatio (Float))
|
|---- Format Text (String: "Processing: {0} ({1}%)")
| | |
| | |---- (In_1: CurrentFileName)
| | |---- (In_2: ProgressRatio * 100)
|
|---- Print String (String: Formatted Text)
3.7. Interacting with MIDI Devices and Files (UIARKeyboard)
The UIARKeyboard is a separate object from the IARAudioComponent that provides direct functionality for sending and receiving MIDI events, including MIDI file playback. It uses Unreal Engine’s “MIDI Device Support” plugin.
How to Use UIARKeyboard:
- Create a UIARKeyboard Instance: As it is not a component, you can create it in your GameMode, PlayerController, or a dedicated actor.

  [Event BeginPlay]
  |
  |---- Create Object (Class: UIARKeyboard) ----> (Promote to variable) ----> Set (MyKeyboard_Ref)

- Initialize the UIARKeyboard: Connect it to physical MIDI devices if desired, using Initialize Keyboard:
  - InInputDeviceID: ID of the MIDI input device (to receive notes). Use -1 to disable input.
  - InOutputDeviceID: ID of the MIDI output device (to send notes). Use -1 for the first available device.
  - InDefaultVelocity: Default velocity for notes (0-127).
  - InMIDIChannel: MIDI channel to use (0-15).
Main UIARKeyboard Functionalities:
- Press Note: Sends a Note On message to the configured MIDI output device.
- Release Note: Sends a Note Off message to the configured MIDI output device.
- Play Note Once: A latent function that sends Note On, waits for a duration, and then sends Note Off. Useful for playing short notes directly from Blueprint.
- Play Midi File: Loads and plays a .mid file through the MIDI output device. FilePath: the full path to the MIDI file on disk.
- Stop Midi File Playback: Stops MIDI file playback.
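On the wire, Press Note and Release Note correspond to standard 3-byte MIDI channel messages. This standalone sketch builds those bytes; the byte values come from the MIDI specification, while the helper names are ours, not the plugin's:

```cpp
#include <array>
#include <cassert>

// Build a Note On message: status 0x90 + channel, then note and velocity.
// Note and velocity are 7-bit values (0-127), hence the & 0x7F masks.
std::array<unsigned char, 3> MakeNoteOn(int Channel, int Note, int Velocity)
{
    return { static_cast<unsigned char>(0x90 | (Channel & 0x0F)),
             static_cast<unsigned char>(Note & 0x7F),
             static_cast<unsigned char>(Velocity & 0x7F) };
}

// Build a Note Off message: status 0x80 + channel, note, velocity 0.
std::array<unsigned char, 3> MakeNoteOff(int Channel, int Note)
{
    return { static_cast<unsigned char>(0x80 | (Channel & 0x0F)),
             static_cast<unsigned char>(Note & 0x7F),
             static_cast<unsigned char>(0) };
}
```

For example, pressing middle C (note 60) at velocity 100 on channel 0 yields the bytes 144, 60, 100 — exactly the Status/Data1/Data2 values you see in FIAR_MIDIEvent.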
Receiving MIDI Events (from your MIDI keyboard or files):
The UIARKeyboard triggers the OnMIDIEventGenerated delegate whenever a MIDI event is received (from a physical input device or a MIDI file being played). You can subscribe to this delegate to create MIDI-based logic.
(Your Actor Blueprint where MyKeyboard_Ref is the reference to your UIARKeyboard)
[Event BeginPlay]
|
|---- Get (MyKeyboard_Ref) ----> Bind Event to OnMIDIEventGenerated
| |
| |---- Custom Event (OnKeyboardMIDIEvent)
[Custom Event: OnKeyboardMIDIEvent] (with an input "MIDIEvent" of type FIAR_MIDIEvent)
|
|---- (Process the received MIDI event, similar to the transcription example)