Interface IBufferSourceAudioTrack

BufferSourceAudioTrack inherits from LocalAudioTrack. It is an interface for audio created from a local audio file, and it adds several functions for controlling the processing of the audio buffer, such as starting, pausing, resuming, and stopping the processing, and seeking to a specified playback position.

You can create an audio track from an audio file by calling AgoraRTC.createBufferSourceAudioTrack.
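
For example, a minimal creation sketch, assuming the source is a local file selected by the user (the element selector and variable names are illustrative):

    import AgoraRTC from "agora-rtc-sdk-ng";

    // In an async context. The file comes from an <input type="file"> element on the page.
    const fileInput = document.querySelector("#audio-file") as HTMLInputElement;
    const file = fileInput.files![0];

    const bufferSourceTrack = await AgoraRTC.createBufferSourceAudioTrack({
      source: file, // string | File | AudioBuffer, see the source property below
    });

The other snippets on this page reuse the bufferSourceTrack created here.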

Events

source-state-change

  • Occurs when the state of processing the audio buffer in BufferSourceAudioTrack changes.

    Parameters

    • currentState: AudioSourceState

      The state of processing the audio buffer:

      • "stopped": The SDK stops processing the audio buffer. Reasons may include:
        • The SDK finishes processing the audio buffer.
        • The user manually stops the processing of the audio buffer.
      • "paused": The SDK pauses the processing of the audio buffer.
      • "playing": The SDK is processing the audio buffer.

    Returns void
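
    For example, a sketch of subscribing to this event on the bufferSourceTrack created above:

    bufferSourceTrack.on("source-state-change", (currentState) => {
      // currentState is "playing", "paused", or "stopped"
      console.log("Audio buffer state changed to:", currentState);
    });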

Properties

currentState

currentState: AudioSourceState

The current state of audio buffer processing, for example whether the audio is playing, paused, or stopped.

duration

duration: number

The total duration of the audio (seconds).

isPlaying

isPlaying: boolean

Whether a media track is playing on the webpage:

  • true: The media track is playing on the webpage.
  • false: The media track is not playing on the webpage.

source

source: string | File | AudioBuffer

The source specified when creating an audio track.

trackMediaType

trackMediaType: "audio" | "video"

The type of a media track:

  • "audio": Audio track.
  • "video": Video track.

Methods

close

  • close(): void
  • Closes a local track and releases the audio and video resources that it occupies.

    Once you close a local track, you can no longer reuse it.

    Returns void

getCurrentTime

  • getCurrentTime(): number
  • Gets the progress (seconds) of the audio buffer processing.

    Returns number

    The progress (seconds) of the audio buffer processing.
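
    For example, a sketch of a simple progress readout that combines getCurrentTime and the duration property (the 1-second polling interval is illustrative):

    setInterval(() => {
      const position = bufferSourceTrack.getCurrentTime(); // seconds processed so far
      const total = bufferSourceTrack.duration;            // total length in seconds
      console.log(`Progress: ${position.toFixed(1)} / ${total.toFixed(1)} s`);
    }, 1000);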

getListeners

  • getListeners(event: string): Function[]
  • Gets all the listeners for a specified event.

    Parameters

    • event: string

      The event name.

    Returns Function[]

getMediaStreamTrack

  • getMediaStreamTrack(): MediaStreamTrack
  • Gets the MediaStreamTrack object that the SDK uses for this track.

    Returns MediaStreamTrack

getStats

getTrackId

  • getTrackId(): string
  • Gets the ID of a media track, a unique identifier generated by the SDK.

    Returns string

    The media track ID.

getTrackLabel

  • getTrackLabel(): string
  • Gets the label of a local track.

    Returns string

The label of the track. The returned label depends on how the track was created:

    • The MediaDeviceInfo.label property, if the track is created by calling createMicrophoneAudioTrack or createCameraVideoTrack.
    • The sourceId property, if the track is created by calling createScreenVideoTrack.
    • The MediaStreamTrack.label property, if the track is created by calling createCustomAudioTrack or createCustomVideoTrack.

getVolumeLevel

  • getVolumeLevel(): number
  • Gets the audio level of a local audio track.

    Returns number

    The audio level. The value range is [0,1]. 1 is the highest audio level.
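
    For example, a sketch of a basic level meter (the 100 ms polling interval is illustrative):

    setInterval(() => {
      const level = bufferSourceTrack.getVolumeLevel(); // 0 (silent) to 1 (highest)
      console.log("Current audio level:", level.toFixed(2));
    }, 100);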

off

  • off(event: string, listener: Function): void
  • Removes the listener for a specified event.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback that corresponds to the event listener.

    Returns void

on

  • on(event: string, listener: Function): void
  • Listens for a specified event.

    When the specified event happens, the SDK triggers the callback that you pass.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void

once

  • once(event: string, listener: Function): void
  • Listens for a specified event once.

    When the specified event happens, the SDK triggers the callback that you pass and then removes the listener.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void

pauseProcessAudioBuffer

  • pauseProcessAudioBuffer(): void
  • Pauses processing the audio buffer.

    Returns void

play

  • play(): void
  • Plays a local audio track.

When playing an audio track, you do not need to pass any DOM element.

    Returns void

removeAllListeners

  • removeAllListeners(event?: undefined | string): void
  • Removes all listeners for a specified event.

    Parameters

    • Optional event: undefined | string

      The event name. If left empty, all listeners for all events are removed.

    Returns void

resumeProcessAudioBuffer

  • resumeProcessAudioBuffer(): void
  • Resumes processing the audio buffer.

    Returns void

seekAudioBuffer

  • seekAudioBuffer(time: number): void
  • Jumps to a specified time point.

    Parameters

    • time: number

      The specified time point (seconds).

    Returns void
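
    For example, a sketch that combines pauseProcessAudioBuffer, seekAudioBuffer, and resumeProcessAudioBuffer (the 30-second target is illustrative):

    bufferSourceTrack.pauseProcessAudioBuffer();
    bufferSourceTrack.seekAudioBuffer(30); // jump to the 30-second mark
    bufferSourceTrack.resumeProcessAudioBuffer();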

setAudioFrameCallback

  • setAudioFrameCallback(audioFrameCallback: null | function, frameSize?: undefined | number): void
  • Sets the callback for getting raw audio data in PCM format.

    After you successfully set the callback, the SDK continuously returns the audio frames of the local audio track to this callback as AudioBuffer objects.

    You can set the frameSize parameter to determine the frame size in each callback, which affects the interval between the callbacks. The larger the frame size, the longer the interval between them.

    track.setAudioFrameCallback((buffer) => {
      for (let channel = 0; channel < buffer.numberOfChannels; channel += 1) {
        // Float32Array with PCM data
        const currentChannelData = buffer.getChannelData(channel);
        console.log("PCM data in channel", channel, currentChannelData);
      }
    }, 2048);
    
    // ....
    // Stop getting the raw audio data
    track.setAudioFrameCallback(null);

    Parameters

    • audioFrameCallback: null | function

      The callback function for receiving the AudioBuffer object. If you set audioFrameCallback as null, the SDK stops getting raw audio data.

    • Optional frameSize: undefined | number

      The number of samples of each audio channel that an AudioBuffer object contains. You can set frameSize as 256, 512, 1024, 2048, 4096, 8192, or 16384. The default value is 4096.

    Returns void

setEnabled

  • setEnabled(enabled: boolean): Promise<void>
  • Since
       4.0.0

    Enables/Disables the track.

    After a track is disabled, the SDK stops playing and publishing the track.

    Parameters

    • enabled: boolean

      Whether to enable the track:

      • true: Enable the track.
      • false: Disable the track.

    Returns Promise<void>
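
    For example, a sketch of temporarily disabling the track and enabling it again later:

    await bufferSourceTrack.setEnabled(false); // stop playing and publishing the track
    // ...
    await bufferSourceTrack.setEnabled(true);  // enable the track again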

setPlaybackDevice

  • setPlaybackDevice(deviceId: string): Promise<void>
  • Since
       4.1.0

    Sets the audio playback device, for example, the speaker.

    This method supports Chrome only. Other browsers throw a NOT_SUPPORTED error when calling this method.

    Parameters

    • deviceId: string

      The device ID, which can be retrieved by calling AgoraRTC.getPlaybackDevices.

    Returns Promise<void>
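
    For example, a sketch that routes playback to the first available output device; AgoraRTC.getPlaybackDevices belongs to the same SDK, and picking the first device is only illustrative:

    const playbackDevices = await AgoraRTC.getPlaybackDevices();
    if (playbackDevices.length > 0) {
      await bufferSourceTrack.setPlaybackDevice(playbackDevices[0].deviceId);
    }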

setVolume

  • setVolume(volume: number): void
  • Sets the volume of a local audio track.

    Parameters

    • volume: number

      The volume. The value ranges from 0 (mute) to 1000 (maximum). A value of 100 is the original volume.

    Returns void
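
    For example:

    bufferSourceTrack.setVolume(50);  // half of the original volume
    bufferSourceTrack.setVolume(0);   // mute the track
    bufferSourceTrack.setVolume(100); // restore the original volume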

startProcessAudioBuffer

  • startProcessAudioBuffer(options?: AudioSourceOptions): void
  • Starts processing the audio buffer.

    Starting processing the audio buffer means that the processing unit in the SDK has received the audio data. If the audio track has been published, the remote user can hear the audio. Whether the local user can hear the audio depends on whether the SDK calls the play method and sends the audio data to the sound card.

    Parameters

    • Optional options: AudioSourceOptions

      Options for processing the audio buffer, such as whether to loop playback and where to start playing.

    Returns void
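
    For example, a sketch of a typical flow that publishes the track, starts processing the buffer, and plays the audio locally; it assumes client is an AgoraRTCClient that has already joined a channel:

    await client.publish(bufferSourceTrack);     // remote users hear the audio once processing starts
    bufferSourceTrack.startProcessAudioBuffer(); // begin processing the audio buffer
    bufferSourceTrack.play();                    // also play the audio locally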

stop

  • stop(): void
  • Stops playing the media track.

    Returns void

stopProcessAudioBuffer

  • stopProcessAudioBuffer(): void
  • Stops processing the audio buffer.

    Returns void