FFmpegFrameRecorder recorder?.start() Hangs Indefinitely #2235

Open
nick2525 opened this issue May 18, 2024 · 1 comment

@nick2525

I am experiencing an issue where FFmpegFrameRecorder.start() hangs indefinitely when trying to start an RTSP stream. The issue occurs consistently and prevents the stream from being established.

Steps to Reproduce:

1. Initialize `FFmpegFrameRecorder` with the following settings:
```kotlin
val screenSize = Toolkit.getDefaultToolkit().screenSize
val recorder = FFmpegFrameRecorder("rtsp://0.0.0.0:8556/live", screenSize.width, screenSize.height).apply {
    format = "rtsp"
    videoCodec = avcodec.AV_CODEC_ID_H264
    audioCodec = avcodec.AV_CODEC_ID_AAC
    frameRate = 30.0
    videoBitrate = 2000000
    audioBitrate = 192000
    interleaved = true
    maxDelay = 500000
    setOption("rtsp_transport", "tcp")
}
```
2. Call `recorder.start()`.
3. The call to `recorder.start()` hangs indefinitely.

Expected Behavior:
The recorder should start without hanging, allowing the RTSP stream to be established.
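
Not a fix, but one way to keep the application from blocking forever is to call `start()` on a watchdog thread and bound the wait from the caller's side. This is a minimal sketch using only JDK/Kotlin facilities; the helper name, the 15-second timeout, and the daemon-thread choice are illustrative assumptions. If `start()` is stuck inside native code, that background thread stays blocked; this only keeps the calling thread responsive and makes the hang visible in the logs.

```kotlin
import org.bytedeco.javacv.FFmpegFrameRecorder
import java.util.concurrent.atomic.AtomicBoolean
import kotlin.concurrent.thread

// Hypothetical helper: run start() on a daemon thread and report whether it returned in time.
fun startOrReport(recorder: FFmpegFrameRecorder, timeoutMillis: Long = 15_000): Boolean {
    val started = AtomicBoolean(false)
    val starter = thread(isDaemon = true) {
        try {
            recorder.start()
            started.set(true)
        } catch (e: Exception) {
            e.printStackTrace() // start() failed outright
        }
    }
    starter.join(timeoutMillis)
    if (starter.isAlive) {
        println("recorder.start() did not return within $timeoutMillis ms - still blocked")
    }
    return started.get()
}
```

Usage would be `if (startOrReport(recorder)) println("FFmpegFrameRecorder started")` in place of the bare `recorder?.start()` call.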

Environment:

- JavaCV version: 1.5.6
- FFmpeg version: 4.4-1.5.6
- OS: [Your Operating System]
- JDK version: [Your JDK version]

Additional Information:
I have initialized the FFmpeg network components before starting the recorder. Here is the complete code used to reproduce the issue:

```kotlin
import org.bytedeco.ffmpeg.global.avcodec
import org.bytedeco.ffmpeg.global.avformat
import org.bytedeco.ffmpeg.global.avutil
import org.bytedeco.javacpp.Loader
import org.bytedeco.javacv.FFmpegFrameRecorder
import org.bytedeco.javacv.Java2DFrameConverter
import java.awt.Rectangle
import java.awt.Robot
import java.awt.Toolkit
import java.awt.image.BufferedImage
import javax.sound.sampled.*
import kotlin.concurrent.thread
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.ShortBuffer

class JvmRtspServer {

    private var recorder: FFmpegFrameRecorder? = null

    fun start() {
        thread {
            mainStartRtsp()
        }

        // Delay to ensure the RTSP server starts
        Thread.sleep(2000)

        // Initialize the FFmpeg libraries and networking
        try {
            Loader.load(avutil::class.java)
            Loader.load(avcodec::class.java)
            Loader.load(avformat::class.java)
            avformat.avformat_network_init()
            println("FFmpeg network initialized")
        } catch (e: Exception) {
            e.printStackTrace()
            return
        }

        // Initialize the recorder
        try {
            val screenSize = Toolkit.getDefaultToolkit().screenSize
            recorder = FFmpegFrameRecorder("rtsp://0.0.0.0:8556/live", screenSize.width, screenSize.height).apply {
                format = "rtsp"
                videoCodec = avcodec.AV_CODEC_ID_H264
                audioCodec = avcodec.AV_CODEC_ID_AAC
                frameRate = 30.0
                videoBitrate = 2000000
                audioBitrate = 192000
                interleaved = true
                maxDelay = 500000
                setOption("rtsp_transport", "tcp")
            }
            recorder?.start() // hangs here indefinitely
            println("FFmpegFrameRecorder started")
        } catch (e: Exception) {
            e.printStackTrace()
            return
        }

        // Delay to ensure the recorder starts
        Thread.sleep(1000)

        // Capture audio
        val audioLine = captureAudio()
        val audioFormat = audioLine.format
        val bufferSize = audioFormat.frameSize * audioFormat.sampleRate.toInt() * 2
        val audioBuffer = ByteArray(bufferSize)
        val converter = Java2DFrameConverter()

        // Capture and stream the screen
        thread {
            while (true) {
                try {
                    val image = captureScreen()
                    if (image != null) {
                        val frame = converter.convert(image)
                        if (frame != null) {
                            recorder?.record(frame)
                            println("Frame recorded successfully")
                        } else {
                            println("Failed to convert image to frame")
                        }
                    } else {
                        println("Failed to capture screen image")
                    }
                    Thread.sleep((1000 / recorder!!.frameRate).toLong())
                } catch (e: Exception) {
                    e.printStackTrace()
                }
            }
        }

        // Capture and stream audio
        thread {
            while (true) {
                try {
                    val bytesRead = audioLine.read(audioBuffer, 0, audioBuffer.size)
                    if (bytesRead > 0) {
                        // 2 bytes per sample for 16-bit audio; the line is opened little-endian,
                        // so decode the captured bytes with the matching byte order
                        val audioSamples = ShortArray(bytesRead / 2)
                        ByteBuffer.wrap(audioBuffer, 0, bytesRead)
                            .order(ByteOrder.LITTLE_ENDIAN)
                            .asShortBuffer()
                            .get(audioSamples)
                        val shortBuffer = ShortBuffer.wrap(audioSamples)
                        recorder?.recordSamples(audioFormat.sampleRate.toInt(), audioFormat.channels, shortBuffer)
                        println("Audio samples recorded successfully")
                    }
                } catch (e: Exception) {
                    e.printStackTrace()
                }
            }
        }
    }

    private fun captureScreen(): BufferedImage? {
        return try {
            val screenRect = Rectangle(Toolkit.getDefaultToolkit().screenSize)
            Robot().createScreenCapture(screenRect)
        } catch (e: Exception) {
            e.printStackTrace()
            null
        }
    }

    private fun captureAudio(): TargetDataLine {
        return try {
            val format = AudioFormat(44100.0f, 16, 2, true, false)
            val info = DataLine.Info(TargetDataLine::class.java, format)
            val line = AudioSystem.getLine(info) as TargetDataLine
            line.open(format)
            line.start()
            println("Audio line opened and started")
            line
        } catch (e: Exception) {
            e.printStackTrace()
            throw RuntimeException("Failed to capture audio")
        }
    }

    // Mock function to start the RTSP server. Replace with an actual implementation.
    fun mainStartRtsp() {
        println("RTSP server started")
    }
}

// Main function to create and start the RTSP server
fun main() {
    val server = JvmRtspServer()
    server.start()
}
```
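
For comparison, below is a sketch of the recorder pointed at an external RTSP server instead of a `0.0.0.0` URL. The RTSP output format in FFmpeg normally acts as a client that publishes to an already running RTSP server, so this variant assumes a separate server such as MediaMTX (rtsp-simple-server) is listening on `127.0.0.1:8554`; that server, the port, and the path are illustrative assumptions, not part of the original setup.

```kotlin
import org.bytedeco.ffmpeg.global.avcodec
import org.bytedeco.javacv.FFmpegFrameRecorder
import java.awt.Toolkit

// Sketch only: assumes an RTSP server is reachable at rtsp://127.0.0.1:8554/live.
fun buildPublishingRecorder(): FFmpegFrameRecorder {
    val screenSize = Toolkit.getDefaultToolkit().screenSize
    return FFmpegFrameRecorder("rtsp://127.0.0.1:8554/live", screenSize.width, screenSize.height).apply {
        format = "rtsp"
        videoCodec = avcodec.AV_CODEC_ID_H264
        videoBitrate = 2000000
        frameRate = 30.0
        interleaved = true
        setOption("rtsp_transport", "tcp") // TCP transport, matching the original options
    }
}
```

In that layout `mainStartRtsp()` would be replaced by actually launching the RTSP server, and `start()` would be expected to either complete the RTSP handshake or fail with an error rather than wait on an address nothing is serving.
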
Logs:

```
FFmpeg network initialized
Audio line opened and started
RTSP server started
```
Additional Information:

- This issue occurs even after ensuring FFmpeg network initialization.
- The same issue is reproducible on multiple machines with the same configuration.
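
To narrow down where `start()` blocks, it may help to enable FFmpeg's native logging before the recorder is created; JavaCV forwards it to the JVM via `FFmpegLogCallback`. A minimal sketch (the debug level is an arbitrary choice):

```kotlin
import org.bytedeco.ffmpeg.global.avutil
import org.bytedeco.javacv.FFmpegLogCallback

fun enableFfmpegDebugLogging() {
    FFmpegLogCallback.set()                      // route native FFmpeg log output to the JVM
    avutil.av_log_set_level(avutil.AV_LOG_DEBUG) // verbose; AV_LOG_TRACE is noisier still
}
```

Calling this before constructing the recorder should show the last RTSP exchange FFmpeg attempts before the hang, and a thread dump of the stuck process (`jstack <pid>`) would show the blocked frame on the JVM side.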

@saudet
Member

saudet commented May 19, 2024 via email
