Reputation: 523
So I'm using GStreamer for Java and I'm trying to play back a live video stream and record it at the same time. Right now I can do one or the other, but I'm not sure how to do both at once. I tried threading it, but there was a conflict since both threads were trying to access the same resource. Then my TA told me that I need to use a tee and queues; basically, have all of the different paths share the same resources instead of each trying to control them (that's my understanding, anyway). I'm not sure how to do this in Java, though, and right now the internet doesn't have a good Java tutorial about tees (I've looked around a bit, and all I found was code that doesn't compile on my machine). Here's an idea of what I was doing:
import java.awt.BorderLayout;
import java.awt.Dimension;
import javax.swing.JFrame;
import org.gstreamer.Caps;
import org.gstreamer.Element;
import org.gstreamer.ElementFactory;
import org.gstreamer.Gst;
import org.gstreamer.Pipeline;
import org.gstreamer.State;
import org.gstreamer.elements.AppSink;
import org.gstreamer.swing.VideoComponent;

public class Main {

    private static Pipeline pipe;
    private static Pipeline pipeB;

    public static void main(String[] args) {
        args = Gst.init("SwingVideoTest", args);
        pipe = new Pipeline("pipeline");
        pipeB = new Pipeline("pipeline");

        final Element tee = ElementFactory.make("queue", null); // this is where I tried the tee
        Element queue0 = ElementFactory.make("queue", null);
        Element queue1 = ElementFactory.make("queue", null);
        AppSink appsink = (AppSink) ElementFactory.make("appsink", null);
        tee.set("silent", false);
        appsink.set("emit-signals", true);

        final Element videosrc = ElementFactory.make("v4l2src", "source");
        videosrc.set("device", "/dev/video1");
        final Element colorspace = ElementFactory.make("ffmpegcolorspace", "colorspace");
        final Element videofilter = ElementFactory.make("capsfilter", "flt");
        videofilter.setCaps(Caps.fromString("video/x-raw-yuv, width=640, height=480, framerate=10/1"));
        final Element enc = ElementFactory.make("ffenc_mpeg4", "Encoder");
        final Element mux = ElementFactory.make("avimux", "muxer");
        final Element sink = ElementFactory.make("filesink", "File sink");
        sink.set("location", "./test.avi");

        final Element videosrcB = ElementFactory.make("v4l2src", "source");
        videosrcB.set("device", "/dev/video0");
        final Element videofilterB = ElementFactory.make("capsfilter", "flt");
        videofilterB.setCaps(Caps.fromString("video/x-raw-yuv, width=640, height=480"));

        VideoPlayer threadA = new VideoPlayer("screen", videosrcB, null, videofilterB, null, null, null, pipe);
        VideoPlayer threadB = new VideoPlayer("recorder", videosrc, colorspace, videofilter, enc, mux, sink, pipeB);
        new Thread(threadA).start(); // calling run() directly would not start a new thread
        new Thread(threadB).start();
    }
    public static class VideoPlayer implements Runnable {

        private String playerType;
        private Element videosrc, colorspace, videofilter, enc, mux, sink;
        private Pipeline pipe;

        VideoPlayer(final String playerType, final Element videosrc, final Element colorspace,
                    final Element videofilter, final Element enc, final Element mux,
                    final Element sink, final Pipeline pipe) {
            this.playerType = playerType;
            this.videosrc = videosrc;
            this.colorspace = colorspace;
            this.videofilter = videofilter;
            this.enc = enc;
            this.mux = mux;
            this.sink = sink;
            this.pipe = pipe;
        }

        public void run() {
            VideoComponent videoComponent = new VideoComponent();
            Element videosink = videoComponent.getElement();
            if (playerType.equals("screen")) {
                System.out.println(playerType);
                pipe.addMany(videosrc, videofilter, videosink);
                Element.linkMany(videosrc, videofilter, videosink);
                JFrame frame = new JFrame("Swing Video Test");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.add(videoComponent, BorderLayout.CENTER);
                videoComponent.setPreferredSize(new Dimension(640, 480));
                frame.pack();
                frame.setVisible(true);
                // Start the pipeline processing
                pipe.setState(State.PLAYING);
            } else if (playerType.equals("recorder")) {
                System.out.println(playerType);
                pipe.addMany(videosrc, colorspace, videofilter, enc, mux, sink);
                Element.linkMany(videosrc, colorspace, videofilter, enc, mux, sink);
                JFrame frame = new JFrame("Swing Video Test");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.add(videoComponent, BorderLayout.CENTER);
                videoComponent.setPreferredSize(new Dimension(640, 480));
                frame.pack();
                //frame.setVisible(true);
                pipe.setState(State.PLAYING);
            }
        }
    }
}
Kinda lengthy, but it's pretty easy to see what I was trying to do. If anyone could tell me how to implement the tee (I tried?), that would be great. Thanks!
Upvotes: 4
Views: 4212
Reputation: 2160
You shouldn't create two video sources as you did (videosrc and videosrcB). Basically, you should receive data from a single videosrc and feed it into a tee. The tee then has two src pads, which allows the two paths to function separately: the first src path connects to your enc and mux and goes on to recording, while the second path goes to the display. Both should work simultaneously.

The tee can be buffered by a queue in each path. From the point of view of the circuit this is not essential, but it is good practice, so that path #2 doesn't have to wait until path #1 finishes a blocking call.
Here is how the circuit looks:

v4l2src -> tee -> queue -> enc -> mux -> filesink   (recording path)
              \-> queue -> videosink                (display path)
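A minimal sketch of that circuit in gstreamer-java, using the same 0.10-era API as the question. The element names (v4l2src, ffenc_mpeg4, avimux) and the device path are taken from the question; xvimagesink stands in here for the question's Swing VideoComponent. Note that linking the tee with Element.link/linkMany requests a new src pad from the tee automatically, so you don't have to request pads by hand. This needs a real capture device, so treat it as a sketch rather than a tested program:

```java
import org.gstreamer.Element;
import org.gstreamer.ElementFactory;
import org.gstreamer.Gst;
import org.gstreamer.Pipeline;
import org.gstreamer.State;

public class TeeExample {
    public static void main(String[] args) {
        args = Gst.init("TeeExample", args);

        // One pipeline, one source, split by a tee into two paths
        Pipeline pipe = new Pipeline("pipeline");
        Element src = ElementFactory.make("v4l2src", "source");
        src.set("device", "/dev/video0");
        Element tee = ElementFactory.make("tee", "splitter");

        // Path 1: record to file
        Element queue1 = ElementFactory.make("queue", "q1");
        Element color  = ElementFactory.make("ffmpegcolorspace", "cs");
        Element enc    = ElementFactory.make("ffenc_mpeg4", "encoder");
        Element mux    = ElementFactory.make("avimux", "muxer");
        Element fsink  = ElementFactory.make("filesink", "fsink");
        fsink.set("location", "./test.avi");

        // Path 2: display on screen
        Element queue2 = ElementFactory.make("queue", "q2");
        Element vsink  = ElementFactory.make("xvimagesink", "vsink");

        pipe.addMany(src, tee, queue1, color, enc, mux, fsink, queue2, vsink);
        src.link(tee);
        // Each link from the tee requests a fresh src pad on it
        Element.linkMany(tee, queue1, color, enc, mux, fsink);
        Element.linkMany(tee, queue2, vsink);

        pipe.setState(State.PLAYING);
        Gst.main(); // run the GStreamer main loop
    }
}
```

Because both branches hang off the same tee inside one pipeline, there is no second thread fighting over the camera: the queues decouple the branches so recording and display run concurrently.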
Upvotes: 6