Interface IMicrophoneAudioTrack

MicrophoneAudioTrack inherits from LocalAudioTrack. It is the interface for the audio sampled by a local microphone, and it adds functions such as switching the sampling device.

You can create a local microphone audio track by calling AgoraRTC.createMicrophoneAudioTrack.
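A minimal usage sketch, assuming the SDK is loaded as the global `AgoraRTC` (for example, from a script tag) and that `client` is an AgoraRTCClient that has already joined a channel; `startMicrophone` is a hypothetical helper name:

```typescript
// Assumption: AgoraRTC is provided by the SDK at runtime.
declare const AgoraRTC: any;

async function startMicrophone(client: any): Promise<any> {
  // Prompts the user for microphone permission and starts sampling.
  const micTrack = await AgoraRTC.createMicrophoneAudioTrack();
  micTrack.play();                // local playback needs no DOM element
  await client.publish(micTrack); // send the track to remote users
  return micTrack;
}
```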


Properties

isPlaying

isPlaying: boolean

Whether a media track is playing on the webpage:

  • true: The media track is playing on the webpage.
  • false: The media track is not playing on the webpage.

trackMediaType

trackMediaType: "audio" | "video"

The type of a media track:

  • "audio": Audio track.
  • "video": Video track.

Methods

close

  • close(): void
  • Closes a local track and releases the audio and video resources that it occupies.

    Once you close a local track, you can no longer reuse it.

    Returns void

getListeners

  • getListeners(event: string): Function[]
  • Gets all the listeners for a specified event.

    Parameters

    • event: string

      The event name.

    Returns Function[]

getMediaStreamTrack

  • getMediaStreamTrack(): MediaStreamTrack
  • Gets the MediaStreamTrack object that the track contains.

    Returns MediaStreamTrack

    A MediaStreamTrack object.

getStats

  • getStats(): LocalAudioTrackStats
  • Gets the statistics of a local audio track.

    Deprecated from v4.1.0. Use AgoraRTCClient.getLocalAudioStats instead.

    Returns LocalAudioTrackStats

getTrackId

  • getTrackId(): string
  • Gets the ID of a media track, a unique identifier generated by the SDK.

    Returns string

    The media track ID.

getTrackLabel

  • getTrackLabel(): string
  • Gets the label of a local track.

    Returns string

    The label that the SDK returns may include:

    • The MediaDeviceInfo.label property, if the track is created by calling createMicrophoneAudioTrack or createCameraVideoTrack.
    • The sourceId property, if the track is created by calling createScreenVideoTrack.
    • The MediaStreamTrack.label property, if the track is created by calling createCustomAudioTrack or createCustomVideoTrack.

getVolumeLevel

  • getVolumeLevel(): number
  • Gets the audio level of a local audio track.

    Returns number

    The audio level. The value range is [0,1]. 1 is the highest audio level.
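Because getVolumeLevel returns an instantaneous level in [0, 1], a simple volume meter can poll it on a timer. A sketch, assuming `track` is a microphone audio track created elsewhere; `startVolumeMeter` is a hypothetical helper name:

```typescript
// Polls getVolumeLevel and reports the level as a percentage.
function startVolumeMeter(track: { getVolumeLevel(): number },
                          onLevel: (percent: number) => void): () => void {
  const timer = setInterval(() => {
    const level = track.getVolumeLevel(); // in [0, 1]
    onLevel(Math.round(level * 100));     // report as a percentage
  }, 200);
  return () => clearInterval(timer);      // call the returned function to stop
}
```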

off

  • off(event: string, listener: Function): void
  • Removes the listener for a specified event.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback that corresponds to the event listener.

    Returns void

on

  • on(event: string, listener: Function): void
  • Listens for a specified event.

    When the specified event happens, the SDK triggers the callback that you pass.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void

once

  • once(event: string, listener: Function): void
  • Listens for a specified event once.

    When the specified event happens, the SDK triggers the callback that you pass and then removes the listener.

    Parameters

    • event: string

      The event name.

    • listener: Function

      The callback to trigger.

    Returns void

play

  • play(): void
  • Plays a local audio track.

    When playing an audio track, you do not need to pass any DOM element.

    Returns void

removeAllListeners

  • removeAllListeners(event?: undefined | string): void
  • Removes all listeners for a specified event.

    Parameters

    • Optional event: undefined | string

      The event name. If left empty, all listeners for all events are removed.

    Returns void

setAudioFrameCallback

  • setAudioFrameCallback(audioFrameCallback: null | function, frameSize?: undefined | number): void
  • Sets the callback for getting raw audio data in PCM format.

    After you successfully set the callback, the SDK continuously returns the audio frames of a local audio track through this callback, using AudioBuffer objects.

    You can set the frameSize parameter to determine the frame size in each callback, which affects the interval between the callbacks. The larger the frame size, the longer the interval between them.

    track.setAudioFrameCallback((buffer) => {
      for (let channel = 0; channel < buffer.numberOfChannels; channel += 1) {
        // Float32Array with PCM data
        const currentChannelData = buffer.getChannelData(channel);
        console.log("PCM data in channel", channel, currentChannelData);
      }
    }, 2048);
    
    // ....
    // Stop getting the raw audio data
    track.setAudioFrameCallback(null);

    Parameters

    • audioFrameCallback: null | function

      The callback function for receiving the AudioBuffer object. If you set audioFrameCallback to null, the SDK stops getting raw audio data.

    • Optional frameSize: undefined | number

      The number of samples of each audio channel that an AudioBuffer object contains. You can set frameSize as 256, 512, 1024, 2048, 4096, 8192, or 16384. The default value is 4096.

    Returns void

setDevice

  • setDevice(deviceId: string): Promise<void>
  • Sets the device for sampling audio.

    You can call the method either before or after publishing an audio track.

    Parameters

    • deviceId: string

      The ID of the device, which can be retrieved by calling AgoraRTC.getMicrophones.

    Returns Promise<void>
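A device-switching sketch, assuming the SDK is loaded as the global `AgoraRTC` and `micTrack` was created by createMicrophoneAudioTrack; `switchMicByLabel` is a hypothetical helper name:

```typescript
// Assumption: AgoraRTC is provided by the SDK at runtime.
declare const AgoraRTC: any;

async function switchMicByLabel(micTrack: any, label: string): Promise<void> {
  // Enumerate available microphones (an array of MediaDeviceInfo objects).
  const mics = await AgoraRTC.getMicrophones();
  const target = mics.find((d: any) => d.label === label);
  if (!target) throw new Error(`No microphone labeled "${label}"`);
  // Works whether or not the track is already published.
  await micTrack.setDevice(target.deviceId);
}
```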

setEnabled

  • setEnabled(enabled: boolean): Promise<void>
  • Since
       4.0.0

    Enables/Disables the track.

    After a track is disabled, the SDK stops playing and publishing the track.

    Parameters

    • enabled: boolean

      Whether to enable the track:

      • true: Enable the track.
      • false: Disable the track.

    Returns Promise<void>
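A mute toggle can be layered on setEnabled, since disabling a track stops both playback and publishing. A sketch, where `setMuted` is a hypothetical helper name and the track is created elsewhere:

```typescript
// Mute by disabling the track; unmute by re-enabling it.
async function setMuted(micTrack: { setEnabled(enabled: boolean): Promise<void> },
                        muted: boolean): Promise<void> {
  // setEnabled(false) disables the track: the SDK stops playing and publishing it.
  await micTrack.setEnabled(!muted);
}
```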

setPlaybackDevice

  • setPlaybackDevice(deviceId: string): Promise<void>
  • Since
       4.1.0

    Sets the audio playback device, for example, the speaker.

    This method supports Chrome only. Other browsers throw a NOT_SUPPORTED error when you call this method.

    Parameters

    • deviceId: string

      The ID of the device, which can be retrieved by calling AgoraRTC.getPlaybackDevices.

    Returns Promise<void>
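A playback-routing sketch, assuming the SDK is loaded as the global `AgoraRTC` and the code runs on Chrome; `useFirstSpeaker` is a hypothetical helper name:

```typescript
// Assumption: AgoraRTC is provided by the SDK at runtime.
declare const AgoraRTC: any;

async function useFirstSpeaker(micTrack: any): Promise<void> {
  // Enumerate audio output devices (an array of MediaDeviceInfo objects).
  const speakers = await AgoraRTC.getPlaybackDevices();
  if (speakers.length === 0) throw new Error("No playback devices found");
  // Route local playback of this track to the chosen speaker.
  await micTrack.setPlaybackDevice(speakers[0].deviceId);
}
```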

setVolume

  • setVolume(volume: number): void
  • Sets the volume of a local audio track.

    Parameters

    • volume: number

      The volume. The value ranges from 0 (mute) to 1000 (maximum). A value of 100 is the original volume.

    Returns void
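Since the setVolume range is [0, 1000] with 100 as the original volume, a UI slider usually needs rescaling. A sketch, where `sliderToVolume` and `applySliderVolume` are hypothetical helper names:

```typescript
// Map a UI slider in [0, 100] onto setVolume's [0, 1000] range.
function sliderToVolume(sliderPercent: number): number {
  // Clamp to the slider range, then scale: slider 10 -> original volume (100).
  const clamped = Math.min(100, Math.max(0, sliderPercent));
  return Math.round(clamped * 10);
}

function applySliderVolume(track: { setVolume(volume: number): void },
                           sliderPercent: number): void {
  track.setVolume(sliderToVolume(sliderPercent));
}
```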

stop

  • stop(): void
  • Stops playing the media track.

    Returns void