diederikh

Reputation: 25281

HTTP Live Streaming Mac app

I am developing a Mac app which needs to provide an HTTP Live Stream (just the last 2 seconds or so) of the main screen (Desktop).

I was thinking of the following process:

  1. Create an AVCaptureSession with an AVCaptureScreenInput as input (sessionPreset = AVCaptureSessionPresetPhoto)
  2. Add an AVCaptureVideoDataOutput output to the session
  3. Capture the frames (in kCVPixelFormatType_32BGRA format) in captureOutput:didOutputSampleBuffer:fromConnection: and write them to an ffmpeg process for segmenting (using a pipe or something) that creates the MPEG-TS and playlist files.
  4. Use an embedded HTTP server to serve up the segment files and playlist file.

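Steps 1–3 might be sketched like this in Swift (a minimal outline, not the asker's code; the part that writes bytes to ffmpeg is left as a comment since the pipe setup is covered below):

```swift
import AVFoundation

// Sketch: capture the main display and receive raw BGRA frames
// via the sample-buffer delegate.
final class ScreenCapturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() {
        let input = AVCaptureScreenInput(displayID: CGMainDisplayID())
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "capture"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Frames arrive here (didOutput, not didDrop — the latter only
    // reports frames that were discarded).
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Lock the CVPixelBuffer and write its raw BGRA bytes
        // to the ffmpeg child process's stdin.
    }
}
```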
Is this the best approach, and is there no way to avoid the ffmpeg part for encoding and segmenting the video stream?

What is the best way to pipe the raw frames to ffmpeg?
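One option is to launch ffmpeg reading rawvideo from stdin and write each frame's bytes to that pipe. A hypothetical invocation (the resolution and frame rate must match what the capture session actually delivers):

```shell
# Read raw 32BGRA frames from stdin, encode with x264, and emit
# HLS segments plus a rolling playlist of ~3 two-second segments.
ffmpeg -f rawvideo -pix_fmt bgra -s 1440x900 -r 30 -i - \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f hls -hls_time 2 -hls_list_size 3 stream.m3u8
```

From the app, this would be spawned with NSTask (or posix pipes) and the frame bytes written to its standard input.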

Upvotes: 0

Views: 472

Answers (1)

vipw

Reputation: 7645

It sounds like a good approach. You can have ffmpeg output to a stream and use the segmenting tools from Apple to segment it. I believe that the Apple tools have a slightly better mux rate, but it might not matter for your use case.
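For example (a sketch, assuming Apple's HTTP Live Streaming Tools are installed): let ffmpeg write an MPEG-TS stream to stdout and pipe it into `mediastreamsegmenter`, which produces the segments and playlist:

```shell
# ffmpeg encodes raw BGRA frames from stdin to MPEG-TS on stdout;
# mediastreamsegmenter cuts 2-second segments, keeps a 3-entry
# sliding window (-s 3) and deletes old segments (-D).
ffmpeg -f rawvideo -pix_fmt bgra -s 1440x900 -r 30 -i - \
       -c:v libx264 -preset ultrafast -f mpegts - \
  | mediastreamsegmenter -f /tmp/hls -t 2 -s 3 -D
```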

Upvotes: 1
