
I have two video clips. Both are 640x480 and last 10 minutes. One contains background audio, the other a singing actor. I would like to create a single 10-minute video clip measuring 1280x480 (in other words, I want to place the videos next to each other and play them simultaneously, mixing audio from both clips). I've tried to figure out how to do this with ffmpeg/avidemux, but so far I've come up empty. Everything I find refers to concatenating when I search for merging.

Any recommendations?

user
  • This answer is great, because it provides not only side-by-side examples but also more advanced grid layouts. https://stackoverflow.com/a/33764934/1576548 – Raleigh L. Jul 24 '23 at 05:23

5 Answers


To be honest, using the accepted answer resulted in a lot of dropped frames for me.

However, using the hstack filter via -filter_complex produced perfectly fluid output:

ffmpeg -i left.mp4 -i right.mp4 -filter_complex hstack output.mp4
Albus Dumbledore
    In case anyone is wondering how to stack more than 2 videos, you can simply specify `hstack=inputs=3` – Simbi Jun 17 '20 at 18:34
  • What to do when they're of different size? I got "Input 1 height 1080 does not match input 0 height 800" error. – trinity420 Jun 17 '23 at 23:08
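If the two inputs differ in height, one workaround is to scale one of them to the other's height before stacking. A sketch (the filenames and the target height of 480 are placeholders; adjust them to your inputs):

```shell
# Scale the second input to a height of 480 (matching the first input);
# width -2 keeps the aspect ratio and rounds to an even number of pixels.
ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
  "[1:v]scale=-2:480[right];[0:v][right]hstack" output.mp4
```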
ffmpeg \
  -i input1.mp4 \
  -i input2.mp4 \
  -filter_complex '[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid]' \
  -map '[vid]' \
  -c:v libx264 \
  -crf 23 \
  -preset veryfast \
  output.mp4

This essentially doubles the width of input1.mp4 by padding the right side with a black area the same size as the original video, then places input2.mp4 over that black area with the overlay filter. Note that only the combined video stream is mapped, so the output has no audio.

Source: https://superuser.com/questions/153160/join-videos-split-screen
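A variant that also mixes the audio from both inputs with the amix filter could look like this (a sketch, not part of the original answer; filenames are placeholders):

```shell
# Same pad/overlay layout as above, plus an amix branch that mixes the
# two audio streams into one.
ffmpeg \
  -i input1.mp4 \
  -i input2.mp4 \
  -filter_complex "[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid];[0:a][1:a]amix=inputs=2[aud]" \
  -map '[vid]' \
  -map '[aud]' \
  -c:v libx264 \
  -crf 23 \
  -preset veryfast \
  output.mp4
```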

Jan

This can be done with just two filters and the audio from both inputs will also be included.

ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
"[0:v][1:v]hstack=inputs=2[v]; \
 [0:a][1:a]amerge[a]" \
-map "[v]" -map "[a]" -ac 2 output.mp4
  • hstack will place each video side-by-side.
  • amerge will combine the audio from both inputs into a single, multichannel audio stream, and -ac 2 will make it stereo; without this option the audio stream may end up as 4 channels if both inputs are stereo.
llogan
  • Hello, Can I overlay one video on to another video? – Nisarg Sep 21 '16 at 11:38
  • @Nisarg Use the [overlay filter](http://ffmpeg.org/ffmpeg-filters.html#overlay). – llogan Sep 21 '16 at 22:25
  • I want upper video little transparent so one can see whats happening in background, Can you help me out with that? – Nisarg Sep 22 '16 at 05:01
  • Hello Sir, could you take a look at [this](http://superuser.com/q/1127002/510107)? – Nisarg Sep 22 '16 at 06:34
  • I lost audio with the accepted answer. This one works perfectly. – Matt Hough Feb 22 '19 at 08:48
  • `hstack` has the limitation that it requires constant resolution, which live videos streamed by Chrome (e.g. with WebRTC) won't have, because Chrome changes resolution dynamically depending on the available bandwidth. Then `hstack` fails. So best of both worlds, use the accepted answer for the video part, and this answer for the audio! – j1elo Sep 02 '21 at 11:00
  • @j1elo how would that command be? – Zee Sep 05 '21 at 13:04
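Combining the two answers as j1elo suggests, using pad/overlay for the video (so the input resolutions need not match) and amerge for the audio, could look roughly like this (a sketch; filenames are placeholders and it has not been tested against dynamic-resolution streams):

```shell
# pad/overlay builds the side-by-side video without requiring equal
# heights; amerge combines the audio; -ac 2 downmixes to stereo.
ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
  "[0:v]pad=iw*2:ih[bg];[bg][1:v]overlay=W/2:0[v]; \
   [0:a][1:a]amerge[a]" \
  -map "[v]" -map "[a]" -ac 2 output.mp4
```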
ffmpeg -y -ss 0 -t 5 -i inputVideo1.mp4 \
  -ss 0 -t 5 -i inputVideo2.mp4 \
  -i BgPaddingImage.jpg \
  -filter_complex "nullsrc=size=720x720[base];[base][2:v]overlay,format=yuv420p[base1];[0:v]setpts=PTS-STARTPTS,scale=345:700[upperleft];[1:v]setpts=PTS-STARTPTS,scale=345:700[upperright];[base1][upperleft]overlay=shortest=1:x=10:y=10[tmp1];[tmp1][upperright]overlay=shortest=1:x=366:y=10" \
  -preset ultrafast -an \
  output.mp4

This places the two videos side by side and also shows a background image in the padding around them. Replace BgPaddingImage.jpg with the path to your own background image.


Sanjay Hadiya

Gradle Dependency

implementation "com.writingminds:FFmpegAndroid:0.3.2"

Code

Command to combine two videos side by side into one:

val cmd = arrayOf("-y", "-i", videoFile!!.path, "-i", videoFileTwo!!.path, "-filter_complex", "hstack", outputFile.path)

Command to append two videos (one after another) into one

  val cmd = arrayOf("-y", "-i", videoFile!!.path, "-i", videoFileTwo!!.path, "-strict", "experimental", "-filter_complex",
                        "[0:v]scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih), pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v0];[1:v] scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih), pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v1];[v0][0:a][v1][1:a] concat=n=2:v=1:a=1",
                        "-ab", "48000", "-ac", "2", "-ar", "22050", "-s", "1920x1080", "-vcodec", "libx264", "-crf", "27",
                        "-q", "4", "-preset", "ultrafast", outputFile.path)

Note:

  • "videoFile" is the path of your first video.
  • "videoFileTwo" is the path of your second video.
  • "outputFile" is the path of the combined output video.

To create the output path for the video:

import android.content.Context
import android.os.Environment
import java.io.File
import java.text.SimpleDateFormat
import java.util.Date
import java.util.Locale

fun createVideoPath(context: Context): File {
    val timeStamp: String = SimpleDateFormat(Constant.DATE_FORMAT, Locale.getDefault()).format(Date())
    val videoFileName: String = "APP_NAME_" + timeStamp + "_"
    val storageDir: File? = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES)
    if (storageDir != null && !storageDir.exists()) storageDir.mkdirs()
    return File.createTempFile(videoFileName, Constant.VIDEO_FORMAT, storageDir)
}

Code to execute command

try {
    FFmpeg.getInstance(context).execute(cmd, object : ExecuteBinaryResponseHandler() {
        override fun onStart() {}

        override fun onProgress(message: String?) {
            callback!!.onProgress(message!!)
        }

        override fun onSuccess(message: String?) {
            callback!!.onSuccess(outputFile)
        }

        override fun onFailure(message: String?) {
            // Clean up the partial output file on failure.
            if (outputFile.exists()) {
                outputFile.delete()
            }
            callback!!.onFailure(IOException(message))
        }

        override fun onFinish() {
            callback!!.onFinish()
        }
    })
} catch (e: FFmpegCommandAlreadyRunningException) {
    // Must be caught before the generic Exception, otherwise this
    // branch is unreachable; report instead of swallowing silently.
    callback!!.onFailure(IOException(e))
} catch (e: Exception) {
    callback!!.onFailure(IOException(e))
}
  • Welcome to the site, and thank you for your contribution. Please edit your post to indicate which language you used for the proposed code, and ideally add a command-line example of how to apply it to the OP's problem. – AdminBee Nov 11 '21 at 09:38