FFMPEG and ROAV Dash Cam C1 Pro

I recently purchased a dedicated dashcam on sale to replace my GoPro setup for trip videos, which means I have a new file format to understand.

2018-05-18

The ROAV dashcam stores sequential MP4 files. When configuring the camera it’s possible to set the loop time, which is the duration of each MP4. There’s also an option to watermark the files; I have it turned on, and all it appears to add is the ROAV logo, timestamp, and speed in the bottom right. There does not appear to be a way of adjusting the size of that text.

My initial recordings were set to run at 1080p 60 fps. I wanted to concatenate multiple files, add some text of my own, and speed up the video. This was my first experience using the -filter_complex option of FFMPEG. Here’s what I came up with to put together three files, speed the output up by a factor of 60, and add some text. I’m dropping the audio completely. The ROAV can record audio inside the car, but I configured it not to, as I don’t want to hear what I was listening to on the radio or what I might be saying if I make a phone call.

ffmpeg.exe -hide_banner -i 2018_0512_130537_050A.MP4 -i 2018_0512_131537_051A.MP4 -i 2018_0512_132537_052A.MP4 -filter_complex "[0:v] [1:v] [2:v] concat=n=3:v=1 [v];[v]setpts=0.01666*PTS,drawtext=fontfile=C\\:/WINDOWS/Fonts/consola.ttf:fontcolor=white:fontsize=80:y=main_h-text_h-50:x=50:text=WimsWorld[o]" -map "[o]" -c:v libx265 -crf 23 -preset veryfast -movflags +faststart -bf 2 -g 15 -pix_fmt yuv420p  FirstMixSpeed60Concat.mp4

This first video was recorded at 1080p60. The camera can also record at 1440p30, which I will be trying soon to see whether things like license plates are more legible. The setpts factor I’m currently using is 1/60, so one minute of real time is compressed to one second of video and the extra frames are simply dropped. I expect to need to change the setpts factor to 1/30 because of the decreased frame rate at the higher resolution.
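If that turns out to be the only change needed, it’s a one-number edit to the command. Here’s an untested sketch of what I expect the 1440p30 version to look like, with placeholder filenames and the setpts factor set to 1/30 (0.03333):

ffmpeg.exe -hide_banner -i FILE1.MP4 -i FILE2.MP4 -i FILE3.MP4 -filter_complex "[0:v] [1:v] [2:v] concat=n=3:v=1 [v];[v]setpts=0.03333*PTS,drawtext=fontfile=C\\:/WINDOWS/Fonts/consola.ttf:fontcolor=white:fontsize=80:y=main_h-text_h-50:x=50:text=WimsWorld[o]" -map "[o]" -c:v libx265 -crf 23 -preset veryfast -movflags +faststart -bf 2 -g 15 -pix_fmt yuv420p 1440pMixSpeed30Concat.mp4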

FFMPEG and h.265

I’ve noticed that YouTube transcodes my videos after I upload them and wanted to know more. It turns out that they are internally using a form of h.265 video encoding, which reduces the data size significantly without reducing perceived quality over h.264 video compression.

I decided to run some tests using my GoPro time lapse program to see how much compression I’d get versus how much extra time for encoding.

First I had to read up on the settings for using h.265 in FFmpeg. According to https://trac.ffmpeg.org/wiki/Encode/H.265, if I add -c:v libx265 to my existing FFmpeg command line without changing anything else, I’ll get h.265 output with the defaults of -crf 28 and -preset medium.

The -preset value affects how much work is done in the compression, but shouldn’t affect the perceived quality. It will affect both the time to create the file and the output file size.

The -crf value affects the perceived quality. I’ve been using the default in my previous h.264 mp4 files, which is approximately -crf 23, and supposedly -crf 28 in h.265 is equivalent to that lower h.264 value. A -crf 0 would be a completely lossless conversion. For my tests, I left -crf at the h.265 default of 28.

I ran all of these tests on a set of 7,129 photos I’d captured while sailboat racing on March 24th using my GoPro Hero 3+ Black. Each input image was 4000×3000 and I’m creating an output video with resolution 3840×2160 by cropping it at 3/4 of the height and scaling to fit.

The original h.264 conversion took 28:36 minutes to create and was 1,162,079,866 bytes long.
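Each h.265 test used the same crop and scale as my normal time-lapse command, just with the libx265 options added. A representative command (reconstructed here for illustration, so treat the input pattern and option order as approximate) looked something like:

ffmpeg.exe -hide_banner -r 30 -i Wim%05d.JPG -vf crop=in_w:3/4*in_h -s 3840x2160 -pix_fmt yuv420p -c:v libx265 -preset veryfast -crf 28 -n 20180324-veryfast-2160p30-cropped-crf28.mp4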

Here’s a trimmed down and sorted file listing. The first column is the time it took to create, second is filesize, and third is filename.

28:36 1,162,079,866 20180324-2160p30-cropped-h264.mp4
30:25 1,006,292,960 20180324-veryfast-2160p30-cropped-h265-crf20.mp4
21:35   328,700,100 20180324-ultrafast-2160p30-cropped-crf28.mp4
21:59   339,438,778 20180324-superfast-2160p30-cropped-crf28.mp4
25:12   350,085,434 20180324-veryfast-2160p30-cropped-crf28.mp4
25:17   349,720,030 20180324-faster-2160p30-cropped-crf28.mp4
27:24   348,582,310 20180324-fast-2160p30-cropped-crf28.mp4
52:07   365,050,252 20180324-medium-2160p30-cropped-crf28.mp4

I created a single test at -crf 20 because I was interested in seeing a higher quality video and how different it would be in size. It took slightly longer than the original h.264 compression and had a slight improvement in size.

Two things became obvious to me from this test: more time spent in compression doesn’t always mean better compression, and for my application, running -preset medium actually hurts on both file size and time taken.

I ran all of these tests only a single time on my desktop computer with an Intel(R) Core(TM) i7-4771 CPU @ 3.50GHz, 3501 MHz, 4 Core(s), 8 Logical Processor(s) running Windows 10 Version 10.0.16299 Build 16299. I was running FFmpeg version N-88668-g723b6baaf8 from https://ffmpeg.zeranoe.com/builds/. The variations in time could be related to background tasks running on the machine, and I should have run a more comprehensive battery of tests, averaging the times with more accurate timekeeping.

Odd Wildcard Matching in Windows 10

I recently ran into odd behavior where more files matched a pattern than I expected. I’d used exiftool to modify the dates on files my GoPro produced, and it creates backup files of the original images when it modifies the tags. Here’s the command I ran.

exiftool.exe -r "-AllDates+=4:7:6 17:40:00" -ext jpg f:\GoPro\20170807

Now I had about 4000 files with the .JPG extension and another 4000 files with a .JPG_original extension.
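(In hindsight, exiftool can clean those backups up afterwards with its -delete_original option; something like the following should do it, though I hadn’t run it at this point.)

exiftool.exe -r -delete_original f:\GoPro\20170807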

I ran my program that parses the directory structure and turns all those images into a time lapse movie, and it seemed to be including both the file extensions, making a very disjointed movie.

I loaded my source code in the debugger and it seemed to be doing a findfirst / findnext specifically looking for .JPG files, and not some other extension, but it was definitely retrieving files with both .JPG and .JPG_original extensions.

I then ran a couple of commands at the Windows command prompt and was surprised to find the same results there.

dir F:\GoPro\20170807\372GOPRO\G*.JPG /p
dir F:\GoPro\20170807\372GOPRO\G???????.JPG /p

Each command returned both the JPG and JPG_original files.

dir F:\GoPro\20170807\372GOPRO\G*.JPG_original /p

returned just the JPG_original files.

dir F:\GoPro\20170807\372GOPRO\G??????.JPG /p

had one less question mark and correctly returned no files.

This is all unexpected behavior, though I’m glad to see that it was consistent with the operating system and not something specific to the C runtime. I’d love an explanation of what’s going on.
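Until I have that explanation, a simple workaround is to double-check the extension of every file the find returns instead of trusting the pattern. Here’s a minimal sketch (hypothetical, not the code from my program) using the same MFC CFileFind approach as the time-lapse program later on this page:

// Defensive check: verify the extension of each match explicitly.
// The wildcard appears to also be matched against the auto-generated 8.3 short
// names, whose extension gets truncated to .JPG, which would explain the extra hits.
std::vector<CString> JPGFiles;
CFileFind finder;
BOOL bWorking = finder.FindFile(_T("F:\\GoPro\\20170807\\372GOPRO\\G*.JPG"));
while (bWorking)
{
	bWorking = finder.FindNextFile();
	if (finder.GetFileName().Right(4).CompareNoCase(_T(".JPG")) == 0)
		JPGFiles.push_back(finder.GetFilePath()); // keep only true .JPG files
}
finder.Close();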

exiftool to manage DJI media files

DJI drones don’t seem to remember the image count after a media card is formatted. This creates a problem for me when I’m trying to back up and maintain my images and video.

Because the dates in the media files are all correct (retrieved from GPS data), organizing the files by renaming them based on the date works for me.

ExifTool by Phil Harvey is a great solution for pulling the metadata from the files and renaming them.

The command line that I was initially using is:

exiftool "-FileName<${CreateDate}.$filetype" -d %Y%m%d-%H%M%S%%-c -ext mp4 -ext dng -ext jpg dji*

Its problem is that it orphans the SRT subtitle files, which I’d like to keep matching their video files.

I’ve tried this variation to do it in one step but it doesn’t work, because the SRT files get renamed as MP4 files.

exiftool -verbose "-FileName<${CreateDate}" -d %Y%m%d-%H%M%S%%-c.%%le -ext mp4 -ext dng -ext jpg dji* -srcfile %f.srt

If anyone has a suggestion for how to rename all the media files in one directory I’d appreciate it. Even running two commands in sequence would be fine.

Update:

I’ve figured out that running these two commands in sequence will get me the results I am looking for:

exiftool "-FileName<${CreateDate}" -d %Y%m%d-%H%M%S%%-c.srt -ext mp4 -srcfile %f.srt dji*
exiftool "-FileName<${CreateDate}" -d %Y%m%d-%H%M%S%%-c.%%le -ext mp4 -ext dng -ext jpg dji*

I’m still looking for a way of doing it in a single command that may leave less room for error, but this is working for now.

FFMPEG and drawtext

Several years ago I wrote a program that consolidates time-lapse pictures into a directory and calls FFMPEG to create a video.

I had been wanting the time-code from when each picture was taken printed on the screen while the video was playing but had not figured out how to get it done until this weekend.

Frame from the video showing the embedded DateTimeOriginal timecode.

I’d gone down multiple paths in an attempt to get this result before finally getting the drawtext feature to work. My program manually pulled the metadata from the images before feeding them to ffmpeg. I’d tried creating both text files and image files for overlaying. None of those got the result I was looking for.

Now that I finally have everything working, it seems simple, but the underlying problem has to do with the amount of string escaping required to get the command to work.

Here’s an example command I was issuing to ffmpeg that got the result I was looking for.

ffmpeg.exe -hide_banner -r 30 -i Wim%05d.JPG -vf crop=in_w:3/4*in_h,drawtext=fontfile=C\\:/WINDOWS/Fonts/OCRAEXT.ttf:fontcolor=white:fontsize=160:y=main_h-text_h-50:x=main_w-text_w-50:text=WimsWorld,drawtext=fontfile=C\\:/WINDOWS/Fonts/OCRAEXT.ttf:fontcolor=white:fontsize=160:y=main_h-text_h-50:x=50:text=%{metadata\\:DateTimeOriginal} -s 3840x2160 -pix_fmt yuv420p -n Test-2160p30-cropped.mp4

If you look at the -vf option parameter, I’m cropping my input pictures to 3/4 their original height, then using the drawtext feature twice. First I write the static text to the bottom right of the frame, then I extract metadata from the source image and write it to the bottom left of the frame.

Because I’m calling this from a program, I needed extra escaping of the \ character in my code, and all of the escaping required a lot of trial and error to get working. I’m using OCRAEXT as my font, but I could be using any fixed-width font. Because the time changes every frame, a proportional font would make the text harder to read.
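For reference, here’s a sketch (hypothetical variable name, trimmed to just the filter string) of how that -vf parameter ends up looking as a string literal in my C++ source: every backslash that ffmpeg needs to see doubled has to be doubled again for the compiler.

// Each "\\\\" in source becomes "\\" in the argument ffmpeg receives.
const CString csDrawTextFilter(
	_T("crop=in_w:3/4*in_h,")
	_T("drawtext=fontfile=C\\\\:/WINDOWS/Fonts/OCRAEXT.ttf:fontcolor=white:fontsize=160:y=main_h-text_h-50:x=main_w-text_w-50:text=WimsWorld,")
	_T("drawtext=fontfile=C\\\\:/WINDOWS/Fonts/OCRAEXT.ttf:fontcolor=white:fontsize=160:y=main_h-text_h-50:x=50:text=%{metadata\\\\:DateTimeOriginal}"));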

Flashing ESCs on Hobbylord BumbleBee

I bought a BumbleBee quad from a local hobby shop a few months ago, and when I finally got around to building it with a proper autopilot, I found that its ESCs use a very uncommon protocol called UltraPWM.

I came across this page, https://github.com/sim-/tgy/issues/13 , which leads me to believe that I should be able to flash the ESCs with a simonk tgy firmware and use the hardware I already have.

I came across a cable from HobbyKing that is designed to make contact with the surface-mounted Atmel device and allow programming without any soldering or desoldering: http://hobbyking.com/hobbyking/store/__27195__Atmel_Atmega_Socket_Firmware_Flashing_Tool.html. It was designed to be used with an Atmel programming device that they also sell, http://hobbyking.com/hobbyking/store/__27990__USBasp_AVR_Programming_Device_for_ATMEL_proccessors.html, so I thought I’d be good to go. The cable cost $20 while the programmer cost $4, but not needing to solder anything made it worthwhile for me.

USBasp AVR Programming Device

Atmel Atmega Socket Firmware Flashing Tool

What I didn’t recognize until it all arrived was that HobbyKing has updated the USB programmer to use a 6-conductor connector, but has not updated the programming cable from its 10-conductor connector. The message boards on HobbyKing discuss the change and have pinout descriptions, but it’s been very frustrating because getting the parts wired correctly has not been as simple as plug and play.

Atmega contact points

Cable Pinout Description

This has been extremely frustrating to me as the parts I ordered were billed as no soldering required, but could not be simply plugged into each other.

Time-lapse videos from GoPro

In November 2013 I purchased a GoPro HERO3+ Black Edition to play with video recording, but almost immediately became enthralled with taking sequences of photos over long time periods.

The GoPro can be configured to take a picture every 0.5, 1, 2, 5, 10, 30, or 60 seconds. I’ve found that I like taking pictures every 2 seconds and then converting them to video at 30 fps (frames per second), which gets me an easy-to-convert time scale: 1 second of video comes from 1 minute of photos, and 1 minute of video comes from 1 hour of photos.

GoPro has a freely available software package to edit videos, as well as to create videos from sequences of images. Because of my past familiarity with FFMPEG, I wanted a more scriptable solution for creating videos from thousands of photos.

https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images has nice instructions for creating videos from sequences of images using FFMPEG. What it glosses over is that the first image in the sequence needs to be numbered zero or one. Another complication in the process is that the GoPro uses the standard camera file format where no more than 1000 images will be stored in a single directory. This means that with the 1800 images created in a single hour, at least two directories will hold the source images. An interesting issue I ran across is that sometimes the GoPro will skip a number in its image sequence, especially when it has just moved to the next directory in sequence. This is why I had to write my program using directory listings as opposed to simply looking for known files.
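Newer FFMPEG builds do have a -start_number option for the image sequence demuxer that relaxes the zero-or-one requirement; a hypothetical example (the numbers are made up) is below. It still doesn’t handle gaps in the numbering or sequences that span directories, which is why my program copies the files into its own gap-free sequence.

ffmpeg.exe -hide_banner -r 30 -start_number 2481 -i G%07d.JPG -s 1920x1080 -pix_fmt yuv420p -n Test-1080p30.mp4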

The standard GoPro battery will record just about two hours’ worth of photos. If the GoPro is connected to an external power supply, you’re limited only by the amount of storage space.

Here’s yesterday morning’s weather changing in Seattle.

Here’s a comparison of cropping vs compressing the video. I took this video on a flight from Seattle to Pullman last weekend. You can see much more of the landscape in the compressed version, and see that the top of the propeller leaves the frame in the cropped version.

Compressed:


Cropped:

I’ve written a program that takes three parameters, copies all of the images to a temporary location with an acceptable filename sequence, runs FFMPEG to create a video, then deletes the temporary images. The GoPro is configured to take full resolution still frames, 4000×3000, and I convert those to a 1080p video format using FFMPEG. Because the aspect ratio is different, and the GoPro uses a fish eye lens to begin with, both vertical and horizontal distortion shows up. I run FFMPEG twice, once creating a compressed video and a second time creating a cropped video. This allows me to choose which level of distortion I prefer after the fact.

The three parameters are the video name, the first image in the sequence, and the last image in the sequence. I am currently doing very little error checking. I’m presenting this code here, just to document what I’ve done so far. If you find this useful, please let me know.

Here are some helper functions I regularly use.

using namespace std;

/////////////////////////////////////////////////////////////////////////////
CString FindEXEFromPath(const CString & csEXE)
{
	CString csFullPath;
	CFileFind finder;
	if (finder.FindFile(csEXE))
	{
		finder.FindNextFile();
		csFullPath = finder.GetFilePath();
		finder.Close();
	}
	else
	{
		TCHAR filename[MAX_PATH];
		unsigned long buffersize = sizeof(filename) / sizeof(TCHAR);
		// Get the file name that we are running from.
		GetModuleFileName(AfxGetResourceHandle(), filename, buffersize );
		PathRemoveFileSpec(filename);
		PathAppend(filename, csEXE);
		if (finder.FindFile(filename))
		{
			finder.FindNextFile();
			csFullPath = finder.GetFilePath();
			finder.Close();
		}
		else
		{
			CString csPATH;
			csPATH.GetEnvironmentVariable(_T("PATH"));
			int iStart = 0;
			CString csToken(csPATH.Tokenize(_T(";"), iStart));
			while (csToken != _T(""))
			{
				if (csToken.Right(1) != _T("\\"))
					csToken.AppendChar(_T('\\'));
				csToken.Append(csEXE);
				if (finder.FindFile(csToken))
				{
					finder.FindNextFile();
					csFullPath = finder.GetFilePath();
					finder.Close();
					break;
				}
				csToken = csPATH.Tokenize(_T(";"), iStart);
			}
		}
	}
	return(csFullPath);
}
/////////////////////////////////////////////////////////////////////////////
static const CString QuoteFileName(const CString & Original)
{
	CString csQuotedString(Original);
	if (csQuotedString.Find(_T(" ")) >= 0)
	{
		csQuotedString.Insert(0,_T('"'));
		csQuotedString.AppendChar(_T('"'));
	}
	return(csQuotedString);
}
/////////////////////////////////////////////////////////////////////////////
std::string timeToISO8601(const time_t & TheTime)
{
	std::ostringstream ISOTime;
	struct tm UTC;// = gmtime(&timer);
	if (0 == gmtime_s(&UTC, &TheTime))
	{
		ISOTime.fill('0');
		ISOTime << UTC.tm_year+1900 << "-";
		ISOTime.width(2);
		ISOTime << UTC.tm_mon+1 << "-";
		ISOTime.width(2);
		ISOTime << UTC.tm_mday << "T";
		ISOTime.width(2);
		ISOTime << UTC.tm_hour << ":";
		ISOTime.width(2);
		ISOTime << UTC.tm_min << ":";
		ISOTime.width(2);
		ISOTime << UTC.tm_sec;
	}
	return(ISOTime.str());
}
std::wstring getTimeISO8601(void)
{
	time_t timer;
	time(&timer);
	std::string isostring(timeToISO8601(timer));
	std::wstring rval;
	rval.assign(isostring.begin(), isostring.end());
	
	return(rval);
}
/////////////////////////////////////////////////////////////////////////////

Here’s a routine I found useful to parse the standard camera file system naming format.

/////////////////////////////////////////////////////////////////////////////
bool SplitImagePath(
	CString csSrcPath,
	CString & DestParentDir,
	int & DestChildNum,
	CString & DestChildSuffix,
	CString & DestFilePrefix,
	int & DestFileNumDigits,
	int & DestFileNum,
	CString & DestFileExt
	)
{
	bool rval = true;
	DestFileExt.Empty();
	while (csSrcPath[csSrcPath.GetLength()-1] != _T('.'))
	{
		DestFileExt.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	csSrcPath.Truncate(csSrcPath.GetLength()-1); // get rid of dot

	CString csDestFileNum;
	DestFileNumDigits = 0;
	while (iswdigit(csSrcPath[csSrcPath.GetLength()-1]))
	{
		csDestFileNum.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		DestFileNumDigits++;
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	DestFileNum = _wtoi(csDestFileNum.GetString());

	DestFilePrefix.Empty();
	while (iswalpha(csSrcPath[csSrcPath.GetLength()-1]))
	{
		DestFilePrefix.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	csSrcPath.Truncate(csSrcPath.GetLength()-1); // get rid of backslash

	DestChildSuffix.Empty();
	while (iswalpha(csSrcPath[csSrcPath.GetLength()-1]))
	{
		DestChildSuffix.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}

	CString csDestChildNum;
	while (iswdigit(csSrcPath[csSrcPath.GetLength()-1]))
	{
		csDestChildNum.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	DestChildNum = _wtoi(csDestChildNum.GetString());

	DestParentDir = csSrcPath;
	return(rval);
}
/////////////////////////////////////////////////////////////////////////////

And here’s the main program.

/////////////////////////////////////////////////////////////////////////////
int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
	int nRetCode = 0;

	HMODULE hModule = ::GetModuleHandle(NULL);

	if (hModule != NULL)
	{
		// initialize MFC and print and error on failure
		if (!AfxWinInit(hModule, NULL, ::GetCommandLine(), 0))
		{
			// TODO: change error code to suit your needs
			_tprintf(_T("Fatal Error: MFC initialization failed\n"));
			nRetCode = 1;
		}
		else
		{
			CString csFFMPEGPath(FindEXEFromPath(_T("ffmpeg.exe")));
			CString csFirstFileName;
			CString csLastFileName;
			CString csVideoName;

			if (argc != 4)
			{
				std::wcout << "command Line Format:" << std::endl;
				std::wcout << "\t" << argv[0] << " VideoName PathToFirstFile.jpg PathToLastFile.jpg" << std::endl;
			}
			else
			{
				csVideoName = CString(argv[1]);
				csFirstFileName = CString(argv[2]);
				csLastFileName = CString(argv[3]);

				int DirNumFirst = 0;
				int DirNumLast = 0;
				int FileNumFirst = 0;
				int FileNumLast = 0;
				CString csFinderStringFormat;

				CString DestParentDir;
				CString DestChildSuffix;
				CString DestFilePrefix;
				CString DestFileExt;
				int DestFileNumDigits;
				SplitImagePath(csFirstFileName, DestParentDir, DirNumFirst, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNumFirst, DestFileExt);
				csFinderStringFormat.Format(_T("%s%%03d%s\\%s*.%s"), DestParentDir.GetString(), DestChildSuffix.GetString(), DestFilePrefix.GetString(), DestFileExt.GetString());
				SplitImagePath(csLastFileName, DestParentDir, DirNumLast, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNumLast, DestFileExt);

				std::vector<CString> SourceImageList;
				int DirNum = DirNumFirst;
				int FileNum = FileNumFirst;
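				// Scan each numbered source directory from DirNumFirst to DirNumLast,
				// keeping every file whose sequence number falls between FileNumFirst and FileNumLast.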
				do 
				{
					CString csFinderString;
					csFinderString.Format(csFinderStringFormat, DirNum);
					CFileFind finder;
					BOOL bWorking = finder.FindFile(csFinderString.GetString());
					while (bWorking)
					{
						bWorking = finder.FindNextFile();
						SplitImagePath(finder.GetFilePath(), DestParentDir, DirNum, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNum, DestFileExt);
						if ((FileNum >= FileNumFirst) && (FileNum <= FileNumLast))
							SourceImageList.push_back(finder.GetFilePath());
					}
					finder.Close();
					DirNum++;
				} while (DirNum <= DirNumLast);

				std::wcout << "[" << getTimeISO8601() << "] " << "First File: " << csFirstFileName.GetString() << std::endl;
				std::wcout << "[" << getTimeISO8601() << "] " << "Last File:  " << csLastFileName.GetString() << std::endl;
				std::wcout << "[" << getTimeISO8601() << "] " << "Total Files: " << SourceImageList.size() << std::endl;

				TCHAR szPath[MAX_PATH] = _T("");
				SHGetFolderPath(NULL, CSIDL_MYVIDEO, NULL, 0, szPath);
				PathAddBackslash(szPath);
				CString csImageDirectory(szPath);
				csImageDirectory.Append(csVideoName);
				if (CreateDirectory(csImageDirectory, NULL))
				{
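					// Copy the selected images into a temporary directory as a gap-free
					// Wim%05d.JPG sequence that ffmpeg's image pattern input can consume.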
					int OutFileIndex = 0;
					for (auto SourceFile = SourceImageList.begin(); SourceFile != SourceImageList.end(); SourceFile++)
					{
						CString OutFilePath(csImageDirectory);
						OutFilePath.AppendFormat(_T("\\Wim%05d.JPG"), OutFileIndex++);
						std::wcout << "[" << getTimeISO8601() << "] " << "CopyFile " << SourceFile->GetString() << " to " << OutFilePath.GetString() << "\r";
						CopyFile(SourceFile->GetString(), OutFilePath, TRUE);
					}
					std::wcout << "\n";

					CString csImagePathSpec(csImageDirectory); csImagePathSpec.Append(_T("\\Wim%05d.JPG"));
					CString csVideoFullPath(csImageDirectory); csVideoFullPath.Append(_T(".mp4"));
					if (csFFMPEGPath.GetLength() > 0)
					{
						csVideoFullPath = csImageDirectory + _T("-1080p-cropped.mp4");
						std::wcout << "[" << getTimeISO8601() << "] " << csFFMPEGPath.GetString() << " -i " << QuoteFileName(csImagePathSpec).GetString() << " -y " << QuoteFileName(csVideoFullPath).GetString() << std::endl;
						if (-1 == _tspawnlp(_P_WAIT, csFFMPEGPath.GetString(), csFFMPEGPath.GetString(), 
							#ifdef _DEBUG
							_T("-report"),
							#endif
							_T("-i"), QuoteFileName(csImagePathSpec).GetString(),
							_T("-vf"), _T("crop=in_w:3/4*in_h"),
							// _T("-vf"), _T("rotate=PI"), // Us this to rotate the movie if we forgot to put the GoPro in upside down mode.
							_T("-s"), _T("1920x1080"),
							_T("-y"), // Cause it to overwrite exiting output files
							QuoteFileName(csVideoFullPath).GetString(), NULL))
							std::wcout << "[" << getTimeISO8601() << "]  _tspawnlp failed: " /* << _sys_errlist[errno] */ << std::endl;
						csVideoFullPath = csImageDirectory + _T("-1080p-compressed.mp4");
						std::wcout << "[" << getTimeISO8601() << "] " << csFFMPEGPath.GetString() << " -i " << QuoteFileName(csImagePathSpec).GetString() << " -y " << QuoteFileName(csVideoFullPath).GetString() << std::endl;
						if (-1 == _tspawnlp(_P_WAIT, csFFMPEGPath.GetString(), csFFMPEGPath.GetString(), 
							#ifdef _DEBUG
							_T("-report"),
							#endif
							_T("-i"), QuoteFileName(csImagePathSpec).GetString(),
							// _T("-vf"), _T("rotate=PI"), // Us this to rotate the movie if we forgot to put the GoPro in upside down mode.
							_T("-s"), _T("1920x1080"),
							_T("-y"), // Cause it to overwrite exiting output files
							QuoteFileName(csVideoFullPath).GetString(), NULL))
							std::wcout << "[" << getTimeISO8601() << "]  _tspawnlp failed: " /* << _sys_errlist[errno] */ << std::endl;
					}
					do 
					{
						CString OutFilePath(csImageDirectory);
						OutFilePath.AppendFormat(_T("\\Wim%05d.JPG"), --OutFileIndex);
						std::wcout << "[" << getTimeISO8601() << "] " << "DeleteFile " << OutFilePath.GetString() << "\r";
						DeleteFile(OutFilePath);
					} while (OutFileIndex > 0);
					std::wcout << "\n[" << getTimeISO8601() << "] " << "RemoveDirectory " << csImageDirectory.GetString() << std::endl;
					RemoveDirectory(csImageDirectory);
				}
			}
		}
	}
	else
	{
		_tprintf(_T("Fatal Error: GetModuleHandle failed\n"));
		nRetCode = 1;
	}
	return nRetCode;
}