Reputation: 595
I am trying to create an app that will allow me to stream video FROM the iPhone TO a server. My current theory as to how to do this is to create a series of FFmpeg files and send them to the server. As far as I can tell, I have compiled the FFmpeg library correctly for the iPhone: I followed the instructions here, and a series of executable files appeared in the folder, so I'm assuming it worked.
My question is: now what? How do I get these into an app? How do I make calls to these executable files? And most importantly, will this even work the way I want it to?
Upvotes: 1
Views: 2497
Reputation: 486
OK, you said that you successfully compiled FFmpeg for iPhone, right? As mvds said, you can't run them as executables. So, in order to use these libraries once the compilation finishes, you need to add all of the generated .a libs to your project (just as when you add other libraries or frameworks). These libraries are:
libavcodec.a
libavfilter.a
libavutil.a
libswscale.a
libavdevice.a
libavformat.a
libswresample.a
Then you have to configure your project
Click on your project -> Build Settings
Search "Header Search Paths" and add the folder location of your libraries (location can be absolute or relative)
Click on Build Phases -> Link Binary With Libraries -> Add Other, and add all the .a files
Voila! Now you can import and use the FFmpeg libraries in your project:
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
// ... more C and/or Objective-C code using the FFmpeg APIs
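For example, a quick way to confirm the headers and .a files are actually wired up (just a sketch; the helper function name is made up, but avcodec_version() and avformat_version() are real libav* calls):
#include <stdio.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

// Made-up helper: call it once at startup to confirm the static libraries link.
void checkFFmpegLinked(void)
{
    av_register_all();  // needed on the FFmpeg releases current at the time; deprecated/removed in newer ones
    printf("libavcodec  version: %u\n", avcodec_version());
    printf("libavformat version: %u\n", avformat_version());
}
If this compiles, links and prints the versions, the project setup above is working.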
To access individual uncompressed frames you can use the captureOutput:didOutputSampleBuffer:fromConnection: delegate method of AVCaptureVideoDataOutput (there are plenty of examples), and then encode them to H.264, for example by copying each frame into an AVFrame and passing it to an FFmpeg encoder. As far as I know FFmpeg can also stream over RTSP for live streaming, but it seems the documentation is close to zero :(
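To make that concrete, here is a rough sketch of the delegate method (only a sketch: it assumes an AVCodecContext *encoderCtx and an AVFrame *frame have been set up elsewhere, and it leaves out the pixel-format conversion and the actual network send; it requires #import <AVFoundation/AVFoundation.h> in the implementation file):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Copy (or sws_scale-convert) the camera planes into the AVFrame here;
    // the camera typically delivers bi-planar YpCbCr 4:2:0 data.
    // ... fill `frame` from pixelBuffer ...

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Hand the frame to the H.264 encoder; the exact call depends on your
    // FFmpeg version. The encoded packets are what you would push to the server.
}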
To answer your final question
and most importantly will this even work the way i want it to?
The answer is yes, it can work. I found two libraries that do just what you want to achieve: http://www.foxitsolutions.com/iphone_h264_sdk.html
Both use FFmpeg the same way you are suggesting. This question is a little old, but I have found many users trying to achieve this, so I have a question: did you have any success doing this? Can you share your experience or recommendations?
Upvotes: 0
Reputation: 3865
Although this issue is quite old, this could help other users in the future:
Just take a look at the source code here http://dev.wunderground.com/support/wunderradio/wunderradio.1.9lgpl.zip
Good luck
Upvotes: 1
Reputation: 47034
You have built the ffmpeg binary, which can run on an iPhone. However, you cannot run executables from an app on a (non-jailbroken) phone. So you would have to compile the library and link against that. Then, from your app, call the relevant functions directly, mimicking what the ffmpeg program does.
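For instance (only a sketch: the RTMP URL and container format are placeholders, and error handling and encoder/stream setup are left out), the muxing side of what the ffmpeg program does comes down to libavformat calls like these:
#include <libavformat/avformat.h>

// Sketch only: open a network output and write already-encoded packets to it.
void streamPackets(void)
{
    avformat_network_init();

    AVFormatContext *outCtx = NULL;
    avformat_alloc_output_context2(&outCtx, NULL, "flv",
                                   "rtmp://example.com/live/stream");  // placeholder URL
    // ... create an AVStream matching your encoder settings here ...
    avio_open(&outCtx->pb, "rtmp://example.com/live/stream", AVIO_FLAG_WRITE);
    avformat_write_header(outCtx, NULL);
    // for each encoded AVPacket pkt: av_interleaved_write_frame(outCtx, &pkt);
    av_write_trailer(outCtx);
    avio_close(outCtx->pb);
    avformat_free_context(outCtx);
}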
Upvotes: 2