Govee H5075 and H5074, Bluetooth Low Energy, and MRTG

I have been wanting a method of keeping track of temperatures for a long time. Last week I acquired a Govee H5075 Bluetooth Thermometer Hygrometer. It communicates with an app from Govee on my iPhone using Bluetooth Low Energy (BLE).

I’ve now learned some details of BLE, and have written a program that listens for BLE advertisements from either model of thermometer (H5075 or H5074) and logs the temperature and humidity to a text file. The code for my project is available on GitHub. https://github.com/wcbonner/GoveeBTTempLogger
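Decoding the advertisement itself is compact. As a hedged sketch (the byte offsets, the scaling, and the function name here are my own reading of the project's documentation, not verified code from it), the H5075 packs both readings into a single three-byte integer:

```cpp
#include <cstdint>
#include <utility>

// Assumed H5075 layout: the manufacturer-specific data carries a three-byte
// big-endian integer that packs both readings. Temperature comes from the
// whole value, humidity from its last three decimal digits.
std::pair<double, double> decodeGoveeH5075(std::uint8_t b1, std::uint8_t b2, std::uint8_t b3)
{
	int packed = (int(b1) << 16) | (int(b2) << 8) | int(b3);
	double temperatureC = double(packed) / 10000.0;		// degrees Celsius
	double humidityRH = double(packed % 1000) / 10.0;	// percent relative humidity
	return std::make_pair(temperatureC, humidityRH);
}
```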

The same program can also be called to read the most recent value from the log and produce output compatible with MRTG. MRTG is not the best method for graphing these temperatures, because all of its graphs start with zero on the Y axis, and neither the temperature nor the humidity is likely to be near zero.
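For reference, MRTG runs an external target program and reads exactly four lines from it: the first variable, the second variable, an uptime string, and a target name. A minimal sketch of producing that output from the last logged reading (the function name and placeholder strings are mine, not from the project):

```cpp
#include <sstream>
#include <string>

// MRTG external-script output: four lines, in this order. Temperature and
// humidity stand in for the two graphed variables; the uptime line and
// target name are placeholders here.
std::string mrtgReport(double temperature, double humidity)
{
	std::ostringstream out;
	out << temperature << "\n";			// line 1: first variable
	out << humidity << "\n";			// line 2: second variable
	out << "unknown" << "\n";			// line 3: uptime (not tracked)
	out << "GoveeThermometer" << "\n";	// line 4: target name
	return out.str();
}
```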

MRTG graph of Temperature and Relative Humidity

My program seems to receive advertisements from each thermometer about every ten seconds. I’ve had a friend running the code in his location with a different set of thermometers, and it doesn’t receive advertisements nearly as frequently. I don’t know if that’s just because the environment is different, or if there’s something else going on.

Time-lapse videos from GoPro

In November 2013 I purchased a GoPro HERO3+ Black Edition to play with video recording, but almost immediately became enthralled with taking sequences of photos over long time periods.

The GoPro can be configured to take a picture every 0.5, 1, 2, 5, 10, 30, or 60 seconds. I’ve found that I like taking pictures every 2 seconds and then converting them to video at 30fps (frames per second), which gives me an easy-to-convert time scale: 1 second of video comes from 1 minute of photos, and 1 minute of video comes from 1 hour of photos.
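That conversion can be sketched as simple arithmetic (the function name is mine): each photo becomes one frame, so real time divides by the photo interval and then by the playback rate.

```cpp
// Time-lapse time scale: photos taken every photoIntervalSeconds,
// played back at fps frames per second.
double videoSecondsFromRealSeconds(double realSeconds, double photoIntervalSeconds, double fps)
{
	double photoCount = realSeconds / photoIntervalSeconds;
	return photoCount / fps;	// one frame per photo
}
```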

GoPro has a freely available software package to edit videos, as well as creating videos from sequences of images. Because of my past familiarity with FFMPEG I wanted a more scriptable solution for creating videos from thousands of photos.

https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images has nice instructions for creating videos from sequences of images using FFMPEG. What it glosses over is that the first image in the sequence needs to be numbered zero or one. Another complication is that the GoPro uses the standard camera file format, in which no more than 1000 images are stored in a single directory. This means that the 1800 images created in a single hour will span at least two directories. An interesting issue I ran across is that the GoPro sometimes skips a number in its image sequence, especially when it has just moved to the next directory. This is why I wrote my program to use directory listings instead of simply looking for known filenames.
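The renumbering step can be sketched in isolation (the names here are mine; the real program below also copies the files, which is elided): take the image numbers as actually found on disk, gaps and all, and give each a gap-free, zero-based name matching a Wim%05d.JPG pattern.

```cpp
#include <cstdio>
#include <string>
#include <utility>
#include <vector>

// Map source image numbers (possibly with gaps) to a gap-free, zero-based
// destination sequence that FFMPEG's image pattern input can consume.
std::vector<std::pair<int, std::string>> renumberForFFMPEG(const std::vector<int> & foundNumbers)
{
	std::vector<std::pair<int, std::string>> plan;
	char name[32];
	int destIndex = 0;
	for (int sourceNumber : foundNumbers)
	{
		std::snprintf(name, sizeof(name), "Wim%05d.JPG", destIndex++);
		plan.push_back(std::make_pair(sourceNumber, std::string(name)));
	}
	return plan;
}
```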

The standard GoPro battery will record just about two hours’ worth of photos. If the GoPro is connected to an external power supply, you are limited only by the amount of storage space.

Here’s yesterday morning’s weather changing in Seattle.

Here’s a comparison of cropping vs compressing the video. I took this video on a flight from Seattle to Pullman last weekend. You can see much more of the landscape in the compressed version, and see that the top of the propeller leaves the frame in the cropped version.

Compressed:


Cropped:

I’ve written a program that takes three parameters, copies all of the images to a temporary location with an acceptable filename sequence, runs FFMPEG to create a video, and then deletes the temporary images. The GoPro is configured to take full-resolution still frames, 4000×3000, and I convert those to a 1080p video format using FFMPEG. Because the aspect ratio is different, and the GoPro uses a fish eye lens to begin with, both vertical and horizontal distortion show up. I run FFMPEG twice, once creating a compressed video and a second time creating a cropped video. This allows me to choose which level of distortion I prefer after the fact.
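The cropped variant works out exactly because of the aspect ratios involved: the crop=in_w:3/4*in_h filter keeps the full width and three quarters of the height, turning the 4:3 stills into 16:9 before scaling. A quick check of the arithmetic (the struct and function names are mine):

```cpp
// Crop 4:3 input to 16:9 by keeping full width and 3/4 of the height,
// mirroring FFMPEG's crop=in_w:3/4*in_h filter.
struct Dimensions { int width; int height; };

Dimensions cropTo16x9(Dimensions in)
{
	Dimensions out;
	out.width = in.width;
	out.height = in.height * 3 / 4;
	return out;
}
```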

The three parameters are the video name, the first image in the sequence, and the last image in the sequence. I am currently doing very little error checking. I’m presenting this code here just to document what I’ve done so far. If you find this useful, please let me know.

Here are some helper functions I regularly use.

using namespace std;

/////////////////////////////////////////////////////////////////////////////
// Locate an executable: first as given, then next to this program, then along %PATH%.
CString FindEXEFromPath(const CString & csEXE)
{
	CString csFullPath;
	CFileFind finder;
	if (finder.FindFile(csEXE))
	{
		finder.FindNextFile();
		csFullPath = finder.GetFilePath();
		finder.Close();
	}
	else
	{
		TCHAR filename[MAX_PATH];
		unsigned long buffersize = sizeof(filename) / sizeof(TCHAR);
		// Get the file name that we are running from.
		GetModuleFileName(AfxGetResourceHandle(), filename, buffersize );
		PathRemoveFileSpec(filename);
		PathAppend(filename, csEXE);
		if (finder.FindFile(filename))
		{
			finder.FindNextFile();
			csFullPath = finder.GetFilePath();
			finder.Close();
		}
		else
		{
			CString csPATH;
			csPATH.GetEnvironmentVariable(_T("PATH"));
			int iStart = 0;
			CString csToken(csPATH.Tokenize(_T(";"), iStart));
			while (csToken != _T(""))
			{
				if (csToken.Right(1) != _T("\\"))
					csToken.AppendChar(_T('\\'));
				csToken.Append(csEXE);
				if (finder.FindFile(csToken))
				{
					finder.FindNextFile();
					csFullPath = finder.GetFilePath();
					finder.Close();
					break;
				}
				csToken = csPATH.Tokenize(_T(";"), iStart);
			}
		}
	}
	return(csFullPath);
}
/////////////////////////////////////////////////////////////////////////////
static const CString QuoteFileName(const CString & Original)
{
	CString csQuotedString(Original);
	if (csQuotedString.Find(_T(" ")) >= 0)
	{
		csQuotedString.Insert(0,_T('"'));
		csQuotedString.AppendChar(_T('"'));
	}
	return(csQuotedString);
}
/////////////////////////////////////////////////////////////////////////////
std::string timeToISO8601(const time_t & TheTime)
{
	std::ostringstream ISOTime;
	struct tm UTC;
	if (0 == gmtime_s(&UTC, &TheTime))
	{
		ISOTime.fill('0');
		ISOTime << UTC.tm_year+1900 << "-";
		ISOTime.width(2);
		ISOTime << UTC.tm_mon+1 << "-";
		ISOTime.width(2);
		ISOTime << UTC.tm_mday << "T";
		ISOTime.width(2);
		ISOTime << UTC.tm_hour << ":";
		ISOTime.width(2);
		ISOTime << UTC.tm_min << ":";
		ISOTime.width(2);
		ISOTime << UTC.tm_sec;
	}
	return(ISOTime.str());
}
std::wstring getTimeISO8601(void)
{
	time_t timer;
	time(&timer);
	std::string isostring(timeToISO8601(timer));
	std::wstring rval;
	rval.assign(isostring.begin(), isostring.end());
	
	return(rval);
}
/////////////////////////////////////////////////////////////////////////////

Here’s a routine I found useful to parse the standard camera file system naming format.

/////////////////////////////////////////////////////////////////////////////
// Parse a standard camera file system path into its parent directory, the
// numbered child directory and its suffix, the file prefix and number, and
// the extension, by peeling characters off the end of the string.
bool SplitImagePath(
	CString csSrcPath,
	CString & DestParentDir,
	int & DestChildNum,
	CString & DestChildSuffix,
	CString & DestFilePrefix,
	int & DestFileNumDigits,
	int & DestFileNum,
	CString & DestFileExt
	)
{
	bool rval = true;
	DestFileExt.Empty();
	while (csSrcPath[csSrcPath.GetLength()-1] != _T('.'))
	{
		DestFileExt.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	csSrcPath.Truncate(csSrcPath.GetLength()-1); // get rid of dot

	CString csDestFileNum;
	DestFileNumDigits = 0;
	while (iswdigit(csSrcPath[csSrcPath.GetLength()-1]))
	{
		csDestFileNum.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		DestFileNumDigits++;
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	DestFileNum = _wtoi(csDestFileNum.GetString());

	DestFilePrefix.Empty();
	while (iswalpha(csSrcPath[csSrcPath.GetLength()-1]))
	{
		DestFilePrefix.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	csSrcPath.Truncate(csSrcPath.GetLength()-1); // get rid of backslash

	DestChildSuffix.Empty();
	while (iswalpha(csSrcPath[csSrcPath.GetLength()-1]))
	{
		DestChildSuffix.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}

	CString csDestChildNum;
	while (iswdigit(csSrcPath[csSrcPath.GetLength()-1]))
	{
		csDestChildNum.Insert(0, csSrcPath[csSrcPath.GetLength()-1]);
		csSrcPath.Truncate(csSrcPath.GetLength()-1);
	}
	DestChildNum = _wtoi(csDestChildNum.GetString());

	DestParentDir = csSrcPath;
	return(rval);
}
/////////////////////////////////////////////////////////////////////////////

And here’s the main program.

/////////////////////////////////////////////////////////////////////////////
int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
	int nRetCode = 0;

	HMODULE hModule = ::GetModuleHandle(NULL);

	if (hModule != NULL)
	{
		// initialize MFC and print and error on failure
		if (!AfxWinInit(hModule, NULL, ::GetCommandLine(), 0))
		{
			// TODO: change error code to suit your needs
			_tprintf(_T("Fatal Error: MFC initialization failed\n"));
			nRetCode = 1;
		}
		else
		{
			CString csFFMPEGPath(FindEXEFromPath(_T("ffmpeg.exe")));
			CString csFirstFileName;
			CString csLastFileName;
			CString csVideoName;

			if (argc != 4)
			{
				std::wcout << "Command Line Format:" << std::endl;
				std::wcout << "\t" << argv[0] << " VideoName PathToFirstFile.jpg PathToLastFile.jpg" << std::endl;
			}
			else
			{
				csVideoName = CString(argv[1]);
				csFirstFileName = CString(argv[2]);
				csLastFileName = CString(argv[3]);

				int DirNumFirst = 0;
				int DirNumLast = 0;
				int FileNumFirst = 0;
				int FileNumLast = 0;
				CString csFinderStringFormat;

				CString DestParentDir;
				CString DestChildSuffix;
				CString DestFilePrefix;
				CString DestFileExt;
				int DestFileNumDigits;
				SplitImagePath(csFirstFileName, DestParentDir, DirNumFirst, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNumFirst, DestFileExt);
				csFinderStringFormat.Format(_T("%s%%03d%s\\%s*.%s"), DestParentDir.GetString(), DestChildSuffix.GetString(), DestFilePrefix.GetString(), DestFileExt.GetString());
				SplitImagePath(csLastFileName, DestParentDir, DirNumLast, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNumLast, DestFileExt);

				std::vector<CString> SourceImageList;
				int DirNum = DirNumFirst;
				int FileNum = FileNumFirst;
				do 
				{
					CString csFinderString;
					csFinderString.Format(csFinderStringFormat, DirNum);
					CFileFind finder;
					BOOL bWorking = finder.FindFile(csFinderString.GetString());
					while (bWorking)
					{
						bWorking = finder.FindNextFile();
						SplitImagePath(finder.GetFilePath(), DestParentDir, DirNum, DestChildSuffix, DestFilePrefix, DestFileNumDigits, FileNum, DestFileExt);
						if ((FileNum >= FileNumFirst) && (FileNum <= FileNumLast))
							SourceImageList.push_back(finder.GetFilePath());
					}
					finder.Close();
					DirNum++;
				} while (DirNum <= DirNumLast);

				std::wcout << "[" << getTimeISO8601() << "] " << "First File: " << csFirstFileName.GetString() << std::endl;
				std::wcout << "[" << getTimeISO8601() << "] " << "Last File:  " << csLastFileName.GetString() << std::endl;
				std::wcout << "[" << getTimeISO8601() << "] " << "Total Files: " << SourceImageList.size() << std::endl;

				TCHAR szPath[MAX_PATH] = _T("");
				SHGetFolderPath(NULL, CSIDL_MYVIDEO, NULL, 0, szPath);
				PathAddBackslash(szPath);
				CString csImageDirectory(szPath);
				csImageDirectory.Append(csVideoName);
				if (CreateDirectory(csImageDirectory, NULL))
				{
					int OutFileIndex = 0;
					for (auto SourceFile = SourceImageList.begin(); SourceFile != SourceImageList.end(); SourceFile++)
					{
						CString OutFilePath(csImageDirectory);
						OutFilePath.AppendFormat(_T("\\Wim%05d.JPG"), OutFileIndex++);
						std::wcout << "[" << getTimeISO8601() << "] " << "CopyFile " << SourceFile->GetString() << " to " << OutFilePath.GetString() << "\r";
						CopyFile(SourceFile->GetString(), OutFilePath, TRUE);
					}
					std::wcout << "\n";

					CString csImagePathSpec(csImageDirectory); csImagePathSpec.Append(_T("\\Wim%05d.JPG"));
					CString csVideoFullPath(csImageDirectory); csVideoFullPath.Append(_T(".mp4"));
					if (csFFMPEGPath.GetLength() > 0)
					{
						csVideoFullPath = csImageDirectory + _T("-1080p-cropped.mp4");
						std::wcout << "[" << getTimeISO8601() << "] " << csFFMPEGPath.GetString() << " -i " << QuoteFileName(csImagePathSpec).GetString() << " -y " << QuoteFileName(csVideoFullPath).GetString() << std::endl;
						if (-1 == _tspawnlp(_P_WAIT, csFFMPEGPath.GetString(), csFFMPEGPath.GetString(), 
							#ifdef _DEBUG
							_T("-report"),
							#endif
							_T("-i"), QuoteFileName(csImagePathSpec).GetString(),
							_T("-vf"), _T("crop=in_w:3/4*in_h"),
							// _T("-vf"), _T("rotate=PI"), // Use this to rotate the movie if we forgot to put the GoPro in upside down mode.
							_T("-s"), _T("1920x1080"),
							_T("-y"), // Cause it to overwrite existing output files
							QuoteFileName(csVideoFullPath).GetString(), NULL))
							std::wcout << "[" << getTimeISO8601() << "]  _tspawnlp failed: " /* << _sys_errlist[errno] */ << std::endl;
						csVideoFullPath = csImageDirectory + _T("-1080p-compressed.mp4");
						std::wcout << "[" << getTimeISO8601() << "] " << csFFMPEGPath.GetString() << " -i " << QuoteFileName(csImagePathSpec).GetString() << " -y " << QuoteFileName(csVideoFullPath).GetString() << std::endl;
						if (-1 == _tspawnlp(_P_WAIT, csFFMPEGPath.GetString(), csFFMPEGPath.GetString(), 
							#ifdef _DEBUG
							_T("-report"),
							#endif
							_T("-i"), QuoteFileName(csImagePathSpec).GetString(),
							// _T("-vf"), _T("rotate=PI"), // Use this to rotate the movie if we forgot to put the GoPro in upside down mode.
							_T("-s"), _T("1920x1080"),
							_T("-y"), // Cause it to overwrite existing output files
							QuoteFileName(csVideoFullPath).GetString(), NULL))
							std::wcout << "[" << getTimeISO8601() << "]  _tspawnlp failed: " /* << _sys_errlist[errno] */ << std::endl;
					}
					do 
					{
						CString OutFilePath(csImageDirectory);
						OutFilePath.AppendFormat(_T("\\Wim%05d.JPG"), --OutFileIndex);
						std::wcout << "[" << getTimeISO8601() << "] " << "DeleteFile " << OutFilePath.GetString() << "\r";
						DeleteFile(OutFilePath);
					} while (OutFileIndex > 0);
					std::wcout << "\n[" << getTimeISO8601() << "] " << "RemoveDirectory " << csImageDirectory.GetString() << std::endl;
					RemoveDirectory(csImageDirectory);
				}
			}
		}
	}
	else
	{
		_tprintf(_T("Fatal Error: GetModuleHandle failed\n"));
		nRetCode = 1;
	}
	return nRetCode;
}

WimTiVoServer changes to use FFProbe

WimTiVoServer was originally written using the libraries that FFMPEG is based on to retrieve details about video files. I had downloaded the packages from http://ffmpeg.zeranoe.com/builds/ and used the DLLs for the library calls. In other programs I’m building related to FFMPEG, I update FFMPEG on a regular basis. Maintaining the correct link path any time I came back to make a minor adjustment to WimTiVoServer became more effort than I wanted to deal with, so I investigated what else was available.

My solution has been to use FFProbe, which is distributed with FFmpeg. I spawn it as a child process and capture its standard output. I read the results of the command into an IStream memory stream object, then use an IXmlReader object to parse the XML for the items I’m looking for.

The command line I’m using for FFProbe is ffprobe.exe -show_streams -show_format -print_format xml INPUT. An example of the output it produces is:

<?xml version="1.0" encoding="UTF-8"?>
<ffprobe>
    <streams>
        <stream index="0" codec_name="ac3" codec_long_name="ATSC A/52A (AC-3)" codec_type="audio" codec_time_base="1/48000" codec_tag_string="[0][0][0][0]" codec_tag="0x0000" sample_fmt="fltp" sample_rate="48000" channels="6" bits_per_sample="0" dmix_mode="-1" ltrt_cmixlev="-1.000000" ltrt_surmixlev="-1.000000" loro_cmixlev="-1.000000" loro_surmixlev="-1.000000" id="0x27" r_frame_rate="0/0" avg_frame_rate="0/0" time_base="1/10000000" start_pts="22054844" start_time="2.205484" duration_ts="19133694951" duration="1913.369495" bit_rate="384000">
            <disposition default="0" dub="0" original="0" comment="0" lyrics="0" karaoke="0" forced="0" hearing_impaired="0" visual_impaired="0" clean_effects="0" attached_pic="0"/>
        </stream>
        <stream index="1" codec_name="ac3" codec_long_name="ATSC A/52A (AC-3)" codec_type="audio" codec_time_base="1/48000" codec_tag_string="[0][0][0][0]" codec_tag="0x0000" sample_fmt="fltp" sample_rate="48000" channels="2" bits_per_sample="0" dmix_mode="-1" ltrt_cmixlev="-1.000000" ltrt_surmixlev="-1.000000" loro_cmixlev="-1.000000" loro_surmixlev="-1.000000" id="0x28" r_frame_rate="0/0" avg_frame_rate="0/0" time_base="1/10000000" start_pts="23039510" start_time="2.303951" bit_rate="192000">
            <disposition default="0" dub="0" original="0" comment="0" lyrics="0" karaoke="0" forced="0" hearing_impaired="0" visual_impaired="0" clean_effects="0" attached_pic="0"/>
        </stream>
        <stream index="2" codec_name="mpeg2video" codec_long_name="MPEG-2 video" profile="Main" codec_type="video" codec_time_base="1001/120000" codec_tag_string="[0][0][0][0]" codec_tag="0x0000" width="1280" height="720" has_b_frames="1" sample_aspect_ratio="1:1" display_aspect_ratio="16:9" pix_fmt="yuv420p" level="4" timecode="00:00:00:00" id="0x29" r_frame_rate="60000/1001" avg_frame_rate="60000/1001" time_base="1/10000000" start_pts="31875510" start_time="3.187551">
            <disposition default="0" dub="0" original="0" comment="0" lyrics="0" karaoke="0" forced="0" hearing_impaired="0" visual_impaired="0" clean_effects="0" attached_pic="0"/>
        </stream>
        <stream index="3" codec_type="subtitle" codec_time_base="1/10000000" codec_tag_string="[0][0][0][0]" codec_tag="0x0000" id="0x2a" r_frame_rate="0/0" avg_frame_rate="0/0" time_base="1/10000000" start_pts="32209177" start_time="3.220918">
            <disposition default="0" dub="0" original="0" comment="0" lyrics="0" karaoke="0" forced="0" hearing_impaired="0" visual_impaired="0" clean_effects="0" attached_pic="0"/>
        </stream>
        <stream index="4" codec_name="mjpeg" codec_long_name="MJPEG (Motion JPEG)" codec_type="video" codec_time_base="1/90000" codec_tag_string="[0][0][0][0]" codec_tag="0x0000" width="200" height="113" has_b_frames="0" sample_aspect_ratio="1:1" display_aspect_ratio="200:113" pix_fmt="yuvj420p" level="-99" id="0xffffffff" r_frame_rate="90000/1" avg_frame_rate="0/0" time_base="1/90000" start_pts="198494" start_time="2.205489" duration_ts="172203255" duration="1913.369500">
            <disposition default="0" dub="0" original="0" comment="0" lyrics="0" karaoke="0" forced="0" hearing_impaired="0" visual_impaired="0" clean_effects="0" attached_pic="1"/>
            <tag key="title" value="TV Thumbnail"/>
        </stream>
    </streams>

    <format filename="d:\Recorded TV\Archer_FXPHD_2013_02_28_22_00_00.wtv" nb_streams="5" nb_programs="0" format_name="wtv" format_long_name="Windows Television (WTV)" start_time="2.205484" duration="1913.369495" size="1956642816" bit_rate="8180930" probe_score="100">
        <tag key="WM/MediaClassPrimaryID" value="db9830bd-3ab3-4fab-8a371a995f7ff74"/>
        <tag key="WM/MediaClassSecondaryID" value="ba7f258a-62f7-47a9-b21f4651c42a000"/>
        <tag key="Title" value="Archer"/>
        <tag key="WM/SubTitle" value="Live and Let Dine"/>
        <tag key="WM/SubTitleDescription" value="Archer, Lana and Cyril go undercover in celebrity chef Lance Casteau&apos;s hellish kitchen."/>
        <tag key="genre" value="Comedy;General;Series"/>
        <tag key="WM/OriginalReleaseTime" value="0"/>
        <tag key="language" value="en-us"/>
        <tag key="WM/MediaCredits" value="H. Jon Benjamin/Jessica Walter/Aisha Tyler/George Coe/Chris Parnell/Judy Greer;;;Anthony Bourdain"/>
        <tag key="service_provider" value="FXPHD"/>
        <tag key="service_name" value="FX HD (Pacific)"/>
        <tag key="WM/MediaNetworkAffiliation" value="Satellite"/>
        <tag key="WM/MediaOriginalChannel" value="728"/>
        <tag key="WM/MediaOriginalChannelSubNumber" value="0"/>
        <tag key="WM/MediaOriginalBroadcastDateTime" value="2013-02-28T08:00:00Z"/>
        <tag key="WM/MediaOriginalRunTime" value="19144791872"/>
        <tag key="WM/MediaIsStereo" value="false"/>
        <tag key="WM/MediaIsRepeat" value="false"/>
        <tag key="WM/MediaIsLive" value="false"/>
        <tag key="WM/MediaIsTape" value="false"/>
        <tag key="WM/MediaIsDelay" value="false"/>
        <tag key="WM/MediaIsSubtitled" value="false"/>
        <tag key="WM/MediaIsMovie" value="false"/>
        <tag key="WM/MediaIsPremiere" value="false"/>
        <tag key="WM/MediaIsFinale" value="false"/>
        <tag key="WM/MediaIsSAP" value="false"/>
        <tag key="WM/MediaIsSport" value="false"/>
        <tag key="WM/Provider" value="MediaCenterDefault"/>
        <tag key="WM/VideoClosedCaptioning" value="false"/>
        <tag key="WM/WMRVEncodeTime" value="2013-03-01 06:00:05"/>
        <tag key="WM/WMRVSeriesUID" value="!MCSeries!225842780"/>
        <tag key="WM/WMRVServiceID" value="!MCService!188913961"/>
        <tag key="WM/WMRVProgramID" value="!MCProgram!285145704"/>
        <tag key="WM/WMRVRequestID" value="0"/>
        <tag key="WM/WMRVScheduleItemID" value="0"/>
        <tag key="WM/WMRVQuality" value="0"/>
        <tag key="WM/WMRVOriginalSoftPrePadding" value="300"/>
        <tag key="WM/WMRVOriginalSoftPostPadding" value="120"/>
        <tag key="WM/WMRVHardPrePadding" value="-300"/>
        <tag key="WM/WMRVHardPostPadding" value="0"/>
        <tag key="WM/WMRVATSCContent" value="true"/>
        <tag key="WM/WMRVDTVContent" value="true"/>
        <tag key="WM/WMRVHDContent" value="true"/>
        <tag key="Duration" value="19151788198"/>
        <tag key="WM/WMRVEndTime" value="2013-03-01 06:32:00"/>
        <tag key="WM/WMRVBitrate" value="8.173201"/>
        <tag key="WM/WMRVKeepUntil" value="-1"/>
        <tag key="WM/WMRVActualSoftPrePadding" value="294"/>
        <tag key="WM/WMRVActualSoftPostPadding" value="120"/>
        <tag key="WM/WMRVContentProtected" value="true"/>
        <tag key="WM/WMRVContentProtectedPercent" value="99"/>
        <tag key="WM/WMRVExpirationSpan" value="9223372036854775807"/>
        <tag key="WM/WMRVInBandRatingSystem" value="255"/>
        <tag key="WM/WMRVInBandRatingLevel" value="255"/>
        <tag key="WM/WMRVInBandRatingAttributes" value="0"/>
        <tag key="WM/WMRVWatched" value="false"/>
        <tag key="WM/MediaThumbWidth" value="352"/>
        <tag key="WM/MediaThumbHeight" value="198"/>
        <tag key="WM/MediaThumbStride" value="1056"/>
        <tag key="WM/MediaThumbRet" value="0"/>
        <tag key="WM/MediaThumbRatingSystem" value="9"/>
        <tag key="WM/MediaThumbRatingLevel" value="17"/>
        <tag key="WM/MediaThumbRatingAttributes" value="0"/>
        <tag key="WM/MediaThumbAspectRatioX" value="16"/>
        <tag key="WM/MediaThumbAspectRatioY" value="9"/>
        <tag key="WM/MediaThumbTimeStamp" value="4647772712253334203"/>
    </format>
</ffprobe>

I parse the XML, keeping track of only the first video stream’s details and the first audio stream’s details, and then look for some specific items in the metadata tags. I store the information and return it to the TiVo both when it requests a list of what programs are available to transfer and when I transfer the file itself.

An interesting side effect of moving from the libraries to XML is that the XML created by FFProbe handles extended characters that are not in the ASCII character set. Because the XML parser I’m using works with Unicode by default, it takes care of those characters properly. When I was using the libraries, I was looping over AVDictionaryEntry values and doing comparisons with char values.

Here is the code that I’m currently using. It’s not the prettiest code, but it gets the job done and runs quickly enough.

void cTiVoFile::PopulateFromFFProbe(void)
{
	static const CString csFFProbePath(FindEXEFromPath(_T("ffprobe.exe")));
	if (!csFFProbePath.IsEmpty())
	{
		// Set the bInheritHandle flag so pipe handles are inherited. 
		SECURITY_ATTRIBUTES saAttr;  
		saAttr.nLength = sizeof(SECURITY_ATTRIBUTES); 
		saAttr.bInheritHandle = TRUE; 
		saAttr.lpSecurityDescriptor = NULL; 

		// Create a pipe for the child process's STDOUT. 
		HANDLE g_hChildStd_OUT_Rd = NULL;
		HANDLE g_hChildStd_OUT_Wr = NULL;
		if ( ! CreatePipe(&g_hChildStd_OUT_Rd, &g_hChildStd_OUT_Wr, &saAttr, 0x800000) ) 
			std::cout << "[" << getTimeISO8601() << "] "  << __FUNCTION__ << "\t ERROR: StdoutRd CreatePipe" << endl;
		else
		{
			// Ensure the read handle to the pipe for STDOUT is not inherited.
			if ( ! SetHandleInformation(g_hChildStd_OUT_Rd, HANDLE_FLAG_INHERIT, 0) )
				std::cout << "[" << getTimeISO8601() << "] "  << __FUNCTION__ << "\t ERROR: Stdout SetHandleInformation" << endl;
			else
			{
				// Create a child process that uses the previously created pipes for STDIN and STDOUT.
				// Set up members of the PROCESS_INFORMATION structure.  
				PROCESS_INFORMATION piProcInfo; 
				ZeroMemory( &piProcInfo, sizeof(PROCESS_INFORMATION) );
 
				// Set up members of the STARTUPINFO structure. 
				// This structure specifies the STDIN and STDOUT handles for redirection.
				STARTUPINFO siStartInfo;
				ZeroMemory( &siStartInfo, sizeof(STARTUPINFO) );
				siStartInfo.cb = sizeof(STARTUPINFO); 
				siStartInfo.hStdError = GetStdHandle(STD_ERROR_HANDLE);
				siStartInfo.hStdInput = GetStdHandle(STD_INPUT_HANDLE);
				siStartInfo.hStdOutput = g_hChildStd_OUT_Wr;
				siStartInfo.dwFlags |= STARTF_USESTDHANDLES;
 
				CString csCommandLine(QuoteFileName(csFFProbePath));
				csCommandLine.Append(_T(" -show_streams -show_format -print_format xml "));
				csCommandLine.Append(QuoteFileName(m_csPathName));

				TRACE(_T("CreateProcess: %s\n"), csCommandLine.GetString());
				// Create the child process.
				if (CreateProcess(NULL, 
					(LPTSTR) csCommandLine.GetString(),     // command line 
					NULL,          // process security attributes 
					NULL,          // primary thread security attributes 
					TRUE,          // handles are inherited 
					0,             // creation flags 
					NULL,          // use parent's environment 
					NULL,          // use parent's current directory 
					&siStartInfo,  // STARTUPINFO pointer 
					&piProcInfo))  // receives PROCESS_INFORMATION 
				{
					CloseHandle(g_hChildStd_OUT_Wr);	// If I don't do this, then the parent will never exit!
					CComPtr<IStream> spMemoryStreamOne(::SHCreateMemStream(NULL, 0));
					if (spMemoryStreamOne != NULL)
					{
						const int RAWDataBuffSize = 0x1000;	// 0x1000 is 4k
						char * RAWDataBuff = new char[RAWDataBuffSize];
						for (;;)
						{
							DWORD dwRead = 0;
							BOOL bSuccess = ReadFile(g_hChildStd_OUT_Rd, RAWDataBuff, RAWDataBuffSize, &dwRead, NULL);
							if( (!bSuccess) || (dwRead == 0)) break;
							ULONG cbWritten;
							spMemoryStreamOne->Write(RAWDataBuff, dwRead, &cbWritten);
						} 
						delete[] RAWDataBuff;
						// reposition back to beginning of stream
						LARGE_INTEGER position;
						position.QuadPart = 0;
						spMemoryStreamOne->Seek(position, STREAM_SEEK_SET, NULL);
						HRESULT hr = S_OK;
						CComPtr<IXmlReader> pReader; 
						if (SUCCEEDED(hr = CreateXmlReader(__uuidof(IXmlReader), (void**) &pReader, NULL))) 
						{
							if (SUCCEEDED(hr = pReader->SetProperty(XmlReaderProperty_DtdProcessing, DtdProcessing_Prohibit))) 
							{
								if (SUCCEEDED(hr = pReader->SetInput(spMemoryStreamOne))) 
								{
									int indentlevel = 0;
									XmlNodeType nodeType; 
									const WCHAR* pwszLocalName;
									const WCHAR* pwszValue;
									CString csLocalName;
									bool bIsFormat = false;
									bool bVideoStreamInfoNeeded = true;
									bool bAudioStreamInfoNeeded = true;

									//read until there are no more nodes 
									while (S_OK == (hr = pReader->Read(&nodeType))) 
									{
										if (nodeType == XmlNodeType_Element)
										{
											if (SUCCEEDED(hr = pReader->GetLocalName(&pwszLocalName, NULL)))
											{
												csLocalName = CString(pwszLocalName);
												if ((bVideoStreamInfoNeeded || bAudioStreamInfoNeeded) && !csLocalName.Compare(_T("stream")))
												{
													CString cs_codec_name;
													CString cs_codec_type;
													CString cs_codec_time_base;
													CString cs_width;
													CString cs_height;
													CString cs_duration;
													while (S_OK == pReader->MoveToNextAttribute())
													{
														if (SUCCEEDED(hr = pReader->GetLocalName(&pwszLocalName, NULL)))
															if (SUCCEEDED(hr = pReader->GetValue(&pwszValue, NULL)))
														{
															csLocalName = CString(pwszLocalName);
															if (!csLocalName.Compare(_T("codec_name")))
																cs_codec_name = CString(pwszValue);
															else if (!csLocalName.Compare(_T("codec_type")))
																cs_codec_type = CString(pwszValue);
															else if (!csLocalName.Compare(_T("codec_time_base")))
																cs_codec_time_base = CString(pwszValue);
															else if (!csLocalName.Compare(_T("width")))
																cs_width = CString(pwszValue);
															else if (!csLocalName.Compare(_T("height")))
																cs_height = CString(pwszValue);
															else if (!csLocalName.Compare(_T("duration")))
																cs_duration = CString(pwszValue);
														}
													}
													if (!cs_codec_type.Compare(_T("video")))
													{
														bVideoStreamInfoNeeded = false;
														if (!cs_codec_name.Compare(_T("mpeg2video")))
															m_VideoCompatible = true;
														m_SourceFormat = cs_codec_type + CString(_T("/")) + cs_codec_name;
														int width = 0;
														std::wstringstream ss;
														ss << cs_width.GetString();
														ss >> width;
														if (width >= 1280)
															m_VideoHighDefinition = true;
														double duration = 0;
														ss = std::wstringstream();
														ss << cs_duration.GetString();
														ss >> duration;
														m_Duration = duration * 1000 + 5;
													}
													else if (!cs_codec_type.Compare(_T("audio")))
													{
														bAudioStreamInfoNeeded = false;
														if (!cs_codec_name.Compare(_T("ac3")))
															m_AudioCompatible = true;
													}	
												}
												else if (!csLocalName.Compare(_T("format")))
												{
													bIsFormat = true;
													const CString ccs_duration(_T("duration"));
													while (S_OK == pReader->MoveToNextAttribute())
													{
														if (SUCCEEDED(hr = pReader->GetLocalName(&pwszLocalName, NULL)))
															if (SUCCEEDED(hr = pReader->GetValue(&pwszValue, NULL)))
														{
															if (!ccs_duration.Compare(pwszLocalName))
															{
																double duration = 0;
																std::wstringstream ss;
																ss << pwszValue;
																ss >> duration;
																m_Duration = duration * 1000 + 5;
															}
														}
													}
												}
												// Here's where I need to dig deeper.
												else if (bIsFormat && (!csLocalName.Compare(_T("tag"))))
												{
													CString csAttributeKey;
													CString csAttributeValue;
													while (S_OK == pReader->MoveToNextAttribute())
													{
														if (SUCCEEDED(hr = pReader->GetLocalName(&pwszLocalName, NULL)))
															if (SUCCEEDED(hr = pReader->GetValue(&pwszValue, NULL)))
														{
															if (!CString(_T("key")).Compare(pwszLocalName))
																csAttributeKey = CString(pwszValue);
															else if (!CString(_T("value")).Compare(pwszLocalName))
																csAttributeValue = CString(pwszValue);
														}
													}
													if (!csAttributeKey.CompareNoCase(_T("title")))
														m_Title = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("episode_id")))
														m_EpisodeTitle = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("description")))
														m_Description = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("WM/SubTitle")))
														m_EpisodeTitle = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("WM/SubTitleDescription")))
														m_Description = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("genre")))
														m_vProgramGenre = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("service_provider")))
														m_SourceStation = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("WM/MediaOriginalChannel")))
														m_SourceChannel = csAttributeValue;
													else if (!csAttributeKey.CompareNoCase(_T("WM/MediaCredits")))
													{
														m_vActor = csAttributeValue;
														while (0 < m_vActor.Replace(_T(";;"),_T(";")));
														while (0 < m_vActor.Replace(_T("//"),_T("/")));
													}
													else if (!csAttributeKey.CompareNoCase(_T("WM/WMRVEncodeTime")))
													{
														CTime OriginalBroadcastDate = ISO8601totime(std::string(CStringA(csAttributeValue).GetString()));
														if (OriginalBroadcastDate > 0)
															m_CaptureDate = OriginalBroadcastDate;
													}
													else if (!csAttributeKey.CompareNoCase(_T("WM/MediaOriginalBroadcastDateTime")))
													{
														CTime OriginalBroadcastDate = ISO8601totime(std::string(CStringA(csAttributeValue).GetString()));
														if (OriginalBroadcastDate > 0)
															m_CaptureDate = OriginalBroadcastDate;
													}
													m_Description.Trim();
												}
											}
										}
										else if (nodeType == XmlNodeType_EndElement)
										{
											if (SUCCEEDED(hr = pReader->GetLocalName(&pwszLocalName, NULL)))
												if (!CString(pwszLocalName).Compare(_T("format")))
													bIsFormat = false;
										}
									}
								}
							}
						}
					}
					// Close handles to the child process and its primary thread.
					// Some applications might keep these handles to monitor the status
					// of the child process, for example. 
					CloseHandle(piProcInfo.hProcess);
					CloseHandle(piProcInfo.hThread);
				}
			}
			CloseHandle(g_hChildStd_OUT_Rd);
		}
	}
}

Researching DVD Subtitle Format

I am attempting to stream webcam video from a BeagleBoneBlack to other computers over Ethernet, and I want to add an overlay with details about the video. I am capturing video from a Logitech C920 webcam, which does the hard work of encoding the video in H.264 format, and using FFMPEG to mux the video into a network stream. The current video stream runs at 3Mb/s over Ethernet, and seems to run at the same bitrate whether I’m sending 30FPS video at 1920×1080, 1280×720, or any other resolution I’ve tried. If I’m running the BBB at 1GHz, FFMPEG uses only 3% of the processor, while at 300MHz it uses 10%. Either processor speed indicates that I should have plenty of CPU available for creating one subtitle frame per second.

If I transcode the H.264 coming from the C920 to H.264 with FFMPEG, the BBB CPU is 100% used and I’ve not been able to get over 5 FPS. This has led me to the idea of adding a second stream with much more compressible data and requiring the client computer to know how to enable subtitles.

My understanding of DVD subtitles is that they are stored as image overlays. The images seem to be three colors plus transparency, with the colors indexed. They are RLE (Run Length Encoded) images, but don’t seem to conform to any standard that would be created by an image library such as OpenCV.

The most useful links I’ve come across related to the DVD subtitles are these three:

Using FFMPEG to examine a video that was ripped from a DVD into an MKV file with several subtitle layers shows the following:

Stream #0:0(eng): Video: mpeg2video (Main), yuv420p, 720x480 [SAR 32:27 DAR 16:9], SAR 186:157 DAR 279:157, 29.97 fps, 29.97 tbr, 1k tbn, 59.94 tbc
Stream #0:1(eng): Audio: ac3, 48000 Hz, 5.1(side), fltp, 448 kb/s (default)
Metadata:
  title           : 3/2+1
Stream #0:2(eng): Audio: ac3, 48000 Hz, 5.1(side), fltp, 384 kb/s
Metadata:
  title           : 3/2+1
Stream #0:3(spa): Audio: ac3, 48000 Hz, stereo, fltp, 192 kb/s
Metadata:
  title           : 2/0
Stream #0:4(fre): Audio: ac3, 48000 Hz, stereo, fltp, 192 kb/s
Metadata:
  title           : 2/0
Stream #0:5(eng): Subtitle: dvd_subtitle (default)
Stream #0:6(spa): Subtitle: dvd_subtitle
Stream #0:7(eng): Subtitle: dvd_subtitle
Stream #0:8(spa): Subtitle: dvd_subtitle
Stream #0:9(fre): Subtitle: dvd_subtitle

All of the descriptions of creating subtitle tracks that I’ve found are about creating textual subtitles, using tools that are wonderful for mainstream movie content but not what I want to do.

I’ve not figured out how to create my own subtitle stream and am still looking for information on that. I’ve not figured out what parameters may need to be passed to FFMPEG to indicate that I’m passing in a subtitle track. I’ve not figured out if there’s a way in FFMPEG to indicate that the subtitles should be on by default, or forced subtitles, while still keeping them as a separate stream.

It doesn’t help that the DVD subtitle files seem to use the STL extension and that same extension is used for the input files for many 3D Printers.

BeagleBoneBlack Webcam using a V4L2, FFMPEG and a FIFO

In my previous post I was starting FFMPEG with the command to read from standard input using a pipe. That’s a great method when you have a single input you want to supply to FFMPEG, but my ultimate goal is to supply multiple streams to FFMPEG and have it multiplex them into a single transport stream.

I believe that my end solution for this is going to be using several named pipes, which are First In First Out (FIFO) devices created by the mkfifo() call in Linux.

The following code snippet functions nearly identically to the snippet in my previous post except that it creates a FIFO in the /tmp/ directory and tells FFMPEG to read from the FIFO instead of from stdin.

if (0 != mkfifo("/tmp/Wim_C920_FFMPEG", S_IRWXU))
	std::cerr << "[" << getTimeISO8601() << "] FiFo /tmp/Wim_C920_FFMPEG NOT created Successfully" << std::endl;
else
	std::cerr << "[" << getTimeISO8601() << "] FiFo /tmp/Wim_C920_FFMPEG created Successfully" << std::endl;
/* Attempt to fork */
pid_t pid = fork();
if(pid == -1)
{
	fprintf(stderr,"Fork error! Exiting.\n");  /* something went wrong */
	exit(1);        
}
else if (pid == 0)
{
	/* A zero PID indicates that this is the child process */
	/* Replace the child fork with a new process */
	if(execlp("ffmpeg", "ffmpeg", "-i", "/tmp/Wim_C920_FFMPEG", "-vcodec", "copy", "-f", "rtp", "rtp://239.8.8.8:8090/", "-metadata", "title=\"Wims BoneCam\"", "-vcodec", "copy", OutputFilename.c_str(), NULL) == -1)
	{
		std::cerr << "execlp Error! Exiting." << std::endl;
		exit(1);
	}
}
else
{
	/* A positive (non-negative) PID indicates the parent process */
	open_device();
	init_device();
	start_capturing();
	//mainloop();
	int pipe_fd = open("/tmp/Wim_C920_FFMPEG", O_WRONLY);
	if (pipe_fd <= 0)
		std::cerr << "[" << getTimeISO8601() << "] FiFo NOT opened Successfully" << std::endl;
	else
	{
		std::cerr << "[" << getTimeISO8601() << "] FiFo opened Successfully" << std::endl;
		time_t CurrentTime;
		time(&CurrentTime);
		time_t FileStartTime = CurrentTime;
		while ((difftime(CurrentTime, FileStartTime) < (60 * 60)) && (bRun))
		{
			fd_set fds;
			struct timeval tv;
			int r;

			FD_ZERO(&fds);
			FD_SET(fd, &fds);

			/* Timeout. */
			tv.tv_sec = 2;
			tv.tv_usec = 0;

			r = select(fd + 1, &fds, NULL, NULL, &tv);
			if (-1 == r) 
			{
				if (EINTR == errno)
					continue;
				errno_exit("select");
			}
			if (0 == r) 
			{
				fprintf(stderr, "select timeout\n");
				exit(EXIT_FAILURE);
			}
			struct v4l2_buffer buf;
			WIMLABEL:
			CLEAR(buf);
			buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
			buf.memory = V4L2_MEMORY_MMAP;

			if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) 
			{
				switch (errno) 
				{
				case EAGAIN:
					goto WIMLABEL;
				case EIO:
					/* Could ignore EIO, see spec. */
					/* fall through */
				default:
					errno_exit("VIDIOC_DQBUF");
				}
			}
			else
			{
				assert(buf.index < n_buffers);
				write(pipe_fd, buffers[buf.index].start, buf.bytesused);
			}
			if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
					errno_exit("VIDIOC_QBUF");
			time(&CurrentTime);
		}
		stop_capturing();
		uninit_device();
		close_device();
		close(pipe_fd);		/* Close side of pipe I'm writing to, to get the child to recognize it's gone away */
		std::cerr << "\n[" << getTimeISO8601() << "] Pipe Closed, Waiting for FFMPEG to exit" << std::endl;
	}
	int ffmpeg_exit_status;
	wait(&ffmpeg_exit_status);				/* Wait for child process to end */
	std::cerr << "[" << getTimeISO8601() << "] FFMPEG exited with a  " << ffmpeg_exit_status << " value" << std::endl;
	if (0 == remove("/tmp/Wim_C920_FFMPEG"))
		std::cerr << "[" << getTimeISO8601() << "] FiFo /tmp/Wim_C920_FFMPEG removed Successfully" << std::endl;
	else
		std::cerr << "[" << getTimeISO8601() << "] FiFo /tmp/Wim_C920_FFMPEG NOT removed Successfully" << std::endl;
}

An interesting side effect is that ffmpeg can now read arbitrary commands from stdin, as if it had been run from the Linux command line, so the "q" key will cause it to quit running. I've not fully investigated what happens to the parent program when the FIFO consumer suddenly quits; initially, it seems that the FIFO supplier quits as well.

A downside of going with named pipes is that there should be a way of ensuring that the names used are unique and don't conflict with anything else on the system. They should also be cleaned up when the process exits. My current method has very little error handling and should not be used as an example beyond getting started.

BeagleBoneBlack Webcam using a V4L2, FFMPEG and a Pipe

I’ve been working on streaming video from my BeagleBoneBlack (BBB) over WiFi and also keeping a copy of the video on the local flash. My starting point for capturing the video using Video For Linux 2 (V4L2) came from demonstration code from Derek Molloy’s site. He has some very well done instructional videos that are much more accessible and complete than most of what I’ve put together.

I am doing most of my development in C/C++ simply because it’s my strongest language. I’m doing the development on the BBB under linux which means that some of the skills I’m using are fairly rusty.

Because I wanted to have the program pulling the video from the camera in charge of the entire process, including managing which local output files are created, and making sure that there is disk storage space, I decided to use pipe(), fork(), and exec(), to manage ffmpeg as a child process of my program.

I started with the same demonstration program from V4L2 and modified it so that instead of writing to stdout, it writes to a file descriptor I supply. This all happens inside the primary loop that generates the output filename for the ffmpeg command line and makes sure that there is enough free storage for an hour’s worth of video. Here’s the main code snippet.

int	pipefd[2];		/* This holds the fd for the input & output of the pipe */
/* Setup communication pipeline first */
if(pipe(pipefd))
{
	std::cerr << "[" << getTimeISO8601() << "] Pipe error! Exiting." << std::endl;
	exit(1);
}
/* Attempt to fork */
pid_t pid = fork();
if(pid == -1)
{
	std::cerr << "[" << getTimeISO8601() << "] Fork error! Exiting." << std::endl; /* something went wrong */
	exit(1);        
}
else if (pid == 0)
{
	/* A zero PID indicates that this is the child process */
	dup2(pipefd[0], 0);	/* Replace stdin with the in side of the pipe */
	close(pipefd[1]);	/* Close unused side of pipe (out side) */
	/* Replace the child fork with a new process */
	if(execlp("ffmpeg", "ffmpeg", "-i", "-", "-vcodec", "copy", "-f", "rtp", "rtp://239.8.8.8:8090/", "-metadata", "title=\"Wims BoneCam\"", "-vcodec", "copy", OutputFilename.c_str(), NULL) == -1)
	{
		std::cerr << "[" << getTimeISO8601() << "] execlp Error! Exiting." << std::endl;
		exit(1);
	}
}
else
{
	/* A positive (non-negative) PID indicates the parent process */
	close(pipefd[0]);		/* Close unused side of pipe (in side) */
	open_device();
	init_device();
	start_capturing();
	//mainloop();

	time_t CurrentTime;
	time(&CurrentTime);
	time_t FileStartTime = CurrentTime;
	while ((difftime(CurrentTime, FileStartTime) < (60 * 60)) && (bRun))
	//unsigned int count = frame_count;
	//while ((count-- > 0) && (bRun))
	{
		fd_set fds;
		struct timeval tv;
		int r;

		FD_ZERO(&fds);
		FD_SET(fd, &fds);

		/* Timeout. */
		tv.tv_sec = 2;
		tv.tv_usec = 0;

		r = select(fd + 1, &fds, NULL, NULL, &tv);
		if (-1 == r) 
		{
			if (EINTR == errno)
				continue;
			errno_exit("select");
		}
		if (0 == r) 
		{
			fprintf(stderr, "select timeout\n");
			exit(EXIT_FAILURE);
		}
		struct v4l2_buffer buf;
		WIMLABEL:
		CLEAR(buf);
		buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		buf.memory = V4L2_MEMORY_MMAP;

		if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) 
		{
			switch (errno) 
			{
			case EAGAIN:
				goto WIMLABEL;
			case EIO:
				/* Could ignore EIO, see spec. */
				/* fall through */
			default:
				errno_exit("VIDIOC_DQBUF");
			}
		}
		else
		{
			assert(buf.index < n_buffers);
			write(pipefd[1], buffers[buf.index].start, buf.bytesused);
		}
		if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
				errno_exit("VIDIOC_QBUF");
		time(&CurrentTime);
	}

	stop_capturing();
	uninit_device();
	close_device();
	close(pipefd[1]);		/* Close side of pipe I'm writing to, to get the child to recognize it's gone away */
	std::cerr << "\n[" << getTimeISO8601() << "] Pipe Closed, Waiting for FFMPEG to exit" << std::endl;
	int ffmpeg_exit_status;
	wait(&ffmpeg_exit_status);				/* Wait for child process to end */
	std::cerr << "[" << getTimeISO8601() << "] FFMPEG exited with a  " << ffmpeg_exit_status << " value" << std::endl;
}

You can see the calls to open_device(), init_device(), and start_capturing() that I didn’t change at all. My loop runs for a particular amount of time instead of a number of frames. I flattened the call to mainloop() into this routine in a very quick and dirty fashion to make sure I had access to the file descriptor for writing the data from the camera to the input of ffmpeg.

ffmpeg is started with the execlp() call. The parameters "-i -" tell ffmpeg to read its input from standard input, which in this case I’ve pointed at the output end of the pipe I created.

Webcam on BeagleBoneBlack using OpenCV

I’ve been working with my BBB and Logitech C920 webcam trying to stream video at low latency for some time and have not yet managed to get the latency under 2 seconds.

As a side project I wanted to use the BBB to create a time lapse video, capturing a picture a second, and then later stitching all of the pictures into a video using ffmpeg.

I’m using OpenCV for the first time. I’m really only using it for the capture/save and to draw some text and lines onto the image, which probably makes OpenCV significant overkill.

My C++ code for the process is:

#include <iostream> // for standard I/O
#include <string>   // for strings
#include <iomanip>  // for controlling float print precision
#include <sstream>  // string to number conversion
#include <unistd.h> // for sleep
#include <opencv2/opencv.hpp>
using namespace std;
using namespace cv;

std::string timeToISO8601(const time_t & TheTime)
{
	std::ostringstream ISOTime;
	struct tm * UTC = gmtime(&TheTime);
	ISOTime.fill('0');
	ISOTime << UTC->tm_year+1900 << "-";
	ISOTime.width(2);
	ISOTime << UTC->tm_mon+1 << "-";
	ISOTime.width(2);
	ISOTime << UTC->tm_mday << "T";
	ISOTime.width(2);
	ISOTime << UTC->tm_hour << ":";
	ISOTime.width(2);
	ISOTime << UTC->tm_min << ":";
	ISOTime.width(2);
	ISOTime << UTC->tm_sec;
	ISOTime << "Z";
	return(ISOTime.str());
}
std::string getTimeISO8601(void)
{
	time_t timer;
	time(&timer);
	return(timeToISO8601(timer));
}

int main()
{
    VideoCapture capture(-1);	// Using -1 tells OpenCV to grab whatever camera is available.
    if(!capture.isOpened()){
	    std::cout << "Failed to connect to the camera." << std::endl;
		return(1);
    }
    capture.set(CAP_PROP_FRAME_WIDTH,1920);
    capture.set(CAP_PROP_FRAME_HEIGHT,1080);
    //capture.set(CAP_PROP_FRAME_WIDTH,2304);	// This should be possible for still images, but not for 30fps video.
    //capture.set(CAP_PROP_FRAME_HEIGHT,1536);

	for (int OutputFolderNum = 100;	OutputFolderNum < 1000; OutputFolderNum++)
		for (int OutputImageNum = 1; OutputImageNum < 10000; OutputImageNum++)
		{
			Mat C920Image;
		    capture >> C920Image;
			if(!C920Image.empty())
			{
				std::ostringstream OutputFilename;
				OutputFilename.fill('0');
				OutputFilename << "/media/BONEBOOT/DCIM/";
				OutputFilename.width(3);
				OutputFilename << OutputFolderNum;
				OutputFilename << "WIMBO/img_";
				OutputFilename.width(4);
				OutputFilename << OutputImageNum;
				OutputFilename << ".jpg";

				line(C920Image, Point(0, C920Image.rows/2), Point(C920Image.cols, C920Image.rows/2), Scalar(255, 255, 255, 32)); // Horizontal line at center
				line(C920Image, Point(C920Image.cols/2, 0), Point(C920Image.cols/2, C920Image.rows), Scalar(255, 255, 255, 32)); // Vertical line at center

				circle(C920Image, Point(C920Image.cols/2, C920Image.rows/2), 240, Scalar(255, 255, 255, 32)); // Circles based at center
				putText(C920Image, "10", Point((C920Image.cols/2 + 240), (C920Image.rows/2)), FONT_HERSHEY_SIMPLEX, 1.0, Scalar(0, 0, 255));
				circle(C920Image, Point(C920Image.cols/2, C920Image.rows/2), 495, Scalar(255, 255, 255, 32)); // Circles based at center
				putText(C920Image, "20", Point((C920Image.cols/2 + 495), (C920Image.rows/2)), FONT_HERSHEY_SIMPLEX, 1.0, Scalar(0, 0, 255));
				circle(C920Image, Point(C920Image.cols/2, C920Image.rows/2), 785, Scalar(255, 255, 255, 32)); // Circles based at center
				putText(C920Image, "30", Point((C920Image.cols/2 + 785), (C920Image.rows/2)), FONT_HERSHEY_SIMPLEX, 1.0, Scalar(0, 0, 255));
				circle(C920Image, Point(C920Image.cols/2, C920Image.rows/2), 1141, Scalar(255, 255, 255, 32)); // Circles based at center
				putText(C920Image, "40", Point((C920Image.cols/2 + 1141), (C920Image.rows/2)), FONT_HERSHEY_SIMPLEX, 1.0, Scalar(0, 0, 255));

				string DateTimeText = "WimsWorld.com " + getTimeISO8601();
				int baseline=0;
				Size textSize = getTextSize(DateTimeText, FONT_HERSHEY_SIMPLEX, 1, 1, &baseline);
				putText(C920Image, DateTimeText, Point((C920Image.cols - textSize.width), (C920Image.rows - baseline)), FONT_HERSHEY_SIMPLEX, 1.0, Scalar(0, 0, 255));
				imwrite(OutputFilename.str(), C920Image);
				std::cout << DateTimeText << " Wrote File : " << OutputFilename.str() << std::endl;
			}
			std::cout << getTimeISO8601() << "\r" << std::flush;
			sleep(1);
		}
    return 0;
}

I compile it on the BBB with the command:

g++ -O2 `pkg-config --cflags --libs opencv` TimeLapse.cpp -o TimeLapse

I’ve got a bug in that I don’t automatically create the directory structure that I’m saving files into. That’s on the to-do list.

I had been interested in the angle of view of the C920 and found it stated on the Logitech support site that the “Diagonal Field of View (FOV) for the Logitech C920 is 78°”. Unfortunately I was not able to determine whether that varies based on the resolution being used. I’m currently using a resolution of 1920×1080, but for stills the camera can capture up to 2304×1536.

I did the geometry math to figure out that 10° off center would be a radius of 240, 20° off center would be a radius of 495, and 30° off center would be a radius of 785. Remembering SOHCAHTOA as Some Old Hags Can’t Always Hide Their Old Age from 9th grade math class came in useful. Using 1920×1080 and a 78° angle, my diagonal radius (opposite) works out to 1101 and an angle of 39° for the tangent, allowing me to calculate my eye height of 1360 = (1101/Tan(39°)). Once I had my eye height I could calculate the radius of circles at any angle by Radius = Tan(Angle) * EyeHeight.

I wanted the circles and angles of vision for my streaming video application and decided that seeing them drawn on the images created here would be helpful, along with both the horizontal and vertical center lines.

The thing I’m not happy with is that the application seems to be running between 30% and 60% of the CPU load on the BBB. When I stream video from the C920 using the native H.264 output the C920 can produce, I was only using about 3% of the BBB CPU. I’ve commented out my drawing code, and verified that the CPU load is primarily related to acquiring the image from the capture device and saving it out to a jpeg file. The lines and text drawing produce minimal incremental CPU. I want to keep the CPU load as low as possible because I’m powering this device from a battery and want it to have as long a runtime as possible.

I believe that the OpenCV library is opening the capture device in a movie streaming mode, and it’s using more CPU interpreting the stream as it’s coming in than the method I was using for streaming to a file. I’ve not yet figured out if there’s a way to define what mode OpenCV acquires the image from the camera.

I was trying to draw the lines and circles with some alpha transparency, but it seems that my underlying image is not the right number of channels and so the lines are being drawn fully opaque.

When the capture opens, it outputs several instances of the same error “VIDIOC_QUERYMENU: Invalid argument”; I’ve not figured out what they mean, or how to stop them from being produced.

I am working on a 32GB flash card, partitioned into two 16GB filesystems. The first is FAT32, has a simple uEnv.txt file in the root allowing the BBB onboard flash to be used, and follows the Design rule for Camera File system (DCF) standard for image naming. It allows me to take the card out, put it in a PC, and have it recognized just like a normal camera memory card.

Contents of uEnv.txt:

mmcdev=1
bootpart=1:2
mmcroot=/dev/mmcblk1p2
optargs=quiet

The camera seems to be focusing on the building across the street instead of West Seattle.

View from 1200 Western Ave, 13th Floor Elevator Room

1200 Western Ave, 13th Floor Elevator Room

WimTiVoServer Processes

I have a TiVoHD, and so have written my program specifically to support the features of the TiVo Series3 and TiVoHD. The newer TiVo Premiere units support more codecs and faster transfer speeds, and the earlier Series2 units do not support the 720p or 1080i resolutions that I allow to be transferred to the TiVo. My initial TiVo server serves files from specific shares on the server, \\server\TiVo\* and \\server\Videos\*, which are specified in a registry key; there is no user interface to modify them. I recursively parse the shares for files FFMPEG reports as video files and present them as a flattened container sorted by date.
[HKEY_LOCAL_MACHINE\SOFTWARE\WimsWorld\WimTiVoServer]
"Container"="//server/TiVo/*;//server/Videos/*"

The TiVo Server operates using the HTTP protocol. It can also operate using SSL/HTTPS. The TiVo itself uses HTTPS for all of the XML transfers, but transfers its videos using HTTP. The videos the TiVo transfers are encrypted using the Media Access Key (MAK) specific to each TiVo. Videos transferred to the TiVo do not need to be encrypted.

There are multiple ways that your server can announce itself to the TiVo. I’m using the simplest to develop, which is broadcasting a specially formatted UDP packet to port 2190 on the local network every minute. An example packet looks like:
tivoconnect=1
method=broadcast
platform=pc/win-nt
machine=FREDS-PC
identity={D936E980-79E3-11D6-A84A-00045A43EEE7}
services=FooService:1234,BarService:4321

Here is a current broadcast from my server indicating that it will serve TiVo compatible media via the HTTP protocol on port 56667. My program follows the TiVo recommendation of using a TCP port supplied by the OS and attempting to reuse the same port on subsequent restarts of the server. It also generates the GUID once, and attempts to save and reuse it.
tivoconnect=1
method=broadcast
platform=pc/WinNT:6.2.9200
machine=Acid
identity={FF121976-2D7B-4682-A1DA-464510DCACEB}
services=TiVoMediaServer:56667/http
swversion=20130604103659

After the TiVo has received a broadcast with the service it is interested in, it will make a request of the server to get the top level containers. http://{machine}/TiVoConnect?Command=QueryContainer&Container=/ The server responds with the XML describing the top level containers.

Here’s an example of the XML produced by my server, which is named Acid and has a single top level container with the same name. URLs embedded in the XML can be either relative or absolute. I’ve chosen to always use relative URLs because that way I don’t have to embed the TCP port I’m serving data from in the XML.

<?xml version="1.0" encoding="UTF-8"?>
<TiVoContainer xmlns="http://www.tivo.com/developer/calypso-protocol-1.6/">
  <Details>
    <Title>Acid</Title>
    <ContentType>x-tivo-container/tivo-server</ContentType>
    <SourceFormat>x-tivo-container/folder</SourceFormat>
    <TotalItems>1</TotalItems>
  </Details>
  <Item>
    <Details>
      <Title>Acid</Title>
      <ContentType>x-tivo-container/tivo-videos</ContentType>
      <SourceFormat>x-tivo-container/folder</SourceFormat>
    </Details>
    <Links>
      <Content>
        <Url>/TiVoConnect?Command=QueryContainer&amp;Container=%2FTiVoNowPlaying</Url>
        <ContentType>x-tivo-container/tivo-videos</ContentType>
      </Content>
    </Links>
  </Item>
  <ItemStart>0</ItemStart>
  <ItemCount>1</ItemCount>
</TiVoContainer>

When a user goes to the Now Playing screen on the TiVo, the top level containers should show up in the list at the bottom. Selecting the container will cause the TiVo to request the particular container from the server via its content URL. http://{machine}/TiVoConnect?Command=QueryContainer&Container=/Foo

That container may contain TiVoItems which have URLs for both details and content. The TiVo will only request the content URL after it has successfully requested the details URL, which was something that took me a while to understand.

Here is an example of the output when the TiVo requests the container with the URL http://acid:56667/TiVoConnect?Command=QueryContainer&Container=/TiVoNowPlaying&ItemCount=1 :

<?xml version="1.0" encoding="UTF-8"?>
<TiVoContainer xmlns="http://www.tivo.com/developer/calypso-protocol-1.6/">
  <ItemStart>0</ItemStart>
  <ItemCount>1</ItemCount>
  <Details>
    <Title>Acid</Title>
    <ContentType>x-tivo-container/folder</ContentType>
    <SourceFormat>x-tivo-container/folder</SourceFormat>
    <TotalItems>2116</TotalItems>
  </Details>
  <Item>
    <Details>
      <Title>ShowTitle</Title>
      <ContentType>video/x-tivo-mpeg</ContentType>
      <SourceFormat>video/h264</SourceFormat>
      <SourceSize>1805804000</SourceSize>
      <Duration>1289860</Duration>
      <CaptureDate>0x51b0ff42</CaptureDate>
    </Details>
    <Links>
      <Content>
        <ContentType>video/x-tivo-mpeg</ContentType>
        <Url>/TiVoConnect/TivoNowPlaying///Acid/TiVo/ShowTitle.mp4</Url>
      </Content>
      <CustomIcon>
        <ContentType>image/*</ContentType>
        <AcceptsParams>No</AcceptsParams>
        <Url>urn:tivo:image:save-until-i-delete-recording</Url>
      </CustomIcon>
      <TiVoVideoDetails>
        <ContentType>text/xml</ContentType>
        <AcceptsParams>No</AcceptsParams>
        <Url>/TiVoConnect?Command=TVBusQuery&amp;Url=/TiVoConnect/TivoNowPlaying///Acid/TiVo/ShowTitle.mp4</Url>
      </TiVoVideoDetails>
    </Links>
  </Item>
</TiVoContainer>

When I select “ShowTitle” in the TiVo onscreen interface, it will request this same URL with the anchoritem set to the content URL. It will then display the details about the show on the TV screen. After I select that I want the show transferred, it first requests an undocumented URL; my server responds that the URL was not found; then the TiVo requests the TiVoVideoDetails URL, and I respond with an XML chunk that I couldn’t find documented anywhere and had to piece together from looking at the output of TiVoDecode and pyTiVo. The really ugly bit of the XML is the initial element declaration. I wasn’t able to get the XML accepted by the TiVo unless it had the full namespace declarations, even though the hosts listed are not accessible.

<?xml version="1.0" encoding="UTF-8"?>
<TvBusMarshalledStruct:TvBusEnvelope xmlns:xs="http://www.w3.org/2001/XMLSchema-instance" xmlns:TvBusMarshalledStruct="http://tivo.com/developer/xml/idl/TvBusMarshalledStruct" xmlns:TvPgdRecording="http://tivo.com/developer/xml/idl/TvPgdRecording" xmlns:TvBusDuration="http://tivo.com/developer/xml/idl/TvBusDuration" xmlns:TvPgdShowing="http://tivo.com/developer/xml/idl/TvPgdShowing" xmlns:TvDbShowingBit="http://tivo.com/developer/xml/idl/TvDbShowingBit" xmlns:TvBusDateTime="http://tivo.com/developer/xml/idl/TvBusDateTime" xmlns:TvPgdProgram="http://tivo.com/developer/xml/idl/TvPgdProgram" xmlns:TvDbColorCode="http://tivo.com/developer/xml/idl/TvDbColorCode" xmlns:TvPgdSeries="http://tivo.com/developer/xml/idl/TvPgdSeries" xmlns:TvDbShowType="http://tivo.com/developer/xml/idl/TvDbShowType" xmlns:TvPgdBookmark="http://tivo.com/developer/xml/idl/TvPgdBookmark" xmlns:TvPgdChannel="http://tivo.com/developer/xml/idl/TvPgdChannel" xmlns:TvDbBitstreamFormat="http://tivo.com/developer/xml/idl/TvDbBitstreamFormat" xs:schemaLocation="http://tivo.com/developer/xml/idl/TvBusMarshalledStruct TvBusMarshalledStruct.xsd http://tivo.com/developer/xml/idl/TvPgdRecording TvPgdRecording.xsd http://tivo.com/developer/xml/idl/TvBusDuration TvBusDuration.xsd http://tivo.com/developer/xml/idl/TvPgdShowing TvPgdShowing.xsd http://tivo.com/developer/xml/idl/TvDbShowingBit TvDbShowingBit.xsd http://tivo.com/developer/xml/idl/TvBusDateTime TvBusDateTime.xsd http://tivo.com/developer/xml/idl/TvPgdProgram TvPgdProgram.xsd http://tivo.com/developer/xml/idl/TvDbColorCode TvDbColorCode.xsd http://tivo.com/developer/xml/idl/TvPgdSeries TvPgdSeries.xsd http://tivo.com/developer/xml/idl/TvDbShowType TvDbShowType.xsd http://tivo.com/developer/xml/idl/TvPgdBookmark TvPgdBookmark.xsd http://tivo.com/developer/xml/idl/TvPgdChannel TvPgdChannel.xsd http://tivo.com/developer/xml/idl/TvDbBitstreamFormat TvDbBitstreamFormat.xsd" xs:type="TvPgdRecording:TvPgdRecording">
  <recordedDuration>PT21M29S</recordedDuration>
  <vActualShowing />
  <vBookmark />
  <recordingQuality value="75">HIGH</recordingQuality>
  <showing>
    <showingBits value="0" />
    <time>2013-06-06:21:29:38Z</time>
    <duration>PT21M29S</duration>
    <program>
      <vActor />
      <vAdvisory />
      <vChoreographer />
      <colorCode value="4">COLOR</colorCode>
      <description></description>
      <vDirector />
      <episodeTitle></episodeTitle>
      <vExecProducer />
      <vProgramGenre />
      <vGuestStar />
      <vHost />
      <isEpisode>false</isEpisode>
      <originalAirDate>2013-06-06:21:29:38Z</originalAirDate>
      <vProducer />
      <series>
        <isEpisodic>false</isEpisodic>
        <vSeriesGenre />
        <seriesTitle>ShowTitle</seriesTitle>
      </series>
      <showType value="5">SERIES</showType>
      <title>ShowTitle</title>
      <vWriter />
    </program>
    <channel>
      <displayMajorNumber />
      <displayMinorNumber />
      <callsign />
    </channel>
  </showing>
  <startTime>2013-06-06:21:29:38Z</startTime>
  <stopTime>2013-06-06:21:51:07Z</stopTime>
</TvBusMarshalledStruct:TvBusEnvelope>

After the TvBusMarshalledStruct XML has been transferred to the TiVo, the TiVo finally requests the content URL from the original container listing. When the content is sent, it should be sent using HTTP chunked transfer encoding.
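As a sketch of what that chunked transfer looks like on the wire: each chunk of the stream is preceded by its length in hex, and the body ends with a zero-length chunk. The Content-Type and the chunk sizes shown here are illustrative assumptions, not values taken from the TiVo documentation.

```
HTTP/1.1 200 OK
Content-Type: video/x-tivo-mpeg
Transfer-Encoding: chunked

2000
...first 0x2000 bytes of the transcoded stream...
2000
...next 0x2000 bytes...
0

```

Chunked encoding matters here because the server is transcoding on the fly and cannot know the final content length when it starts sending.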

My current ffmpeg command line is fairly simple, and mostly copied from my reading of the pyTiVo source code. I make a couple of simple decisions based on the source file codecs. If the video is mpeg2video or the audio is ac3, I use “-vcodec copy” or “-acodec copy” respectively; otherwise I specify those codecs for transcoding. I also include the options “-b:v 16384k -maxrate 30000k -bufsize 4096k -ab 448k -ar 48000 -f vob”, which I need to spend more time learning about.
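Put together, a command line using the options above might look like the following. The input and output names are hypothetical, and this shows the full transcode path rather than the stream-copy path:

```shell
ffmpeg -i input.mkv \
    -vcodec mpeg2video -acodec ac3 \
    -b:v 16384k -maxrate 30000k -bufsize 4096k \
    -ab 448k -ar 48000 \
    -f vob output.mpg
```

In practice the output would be written to a pipe rather than a file, so the server can stream it out over the HTTP connection as it is produced.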

One of the gotchas that I spent a significant amount of time debugging was that the SourceSize element in the Item listing is significant. The TiVo appears to pre-allocate space for the file as the transfer starts, and if the transfer grows significantly larger than the reported source size, the TiVo will close its end of the HTTP connection. I have not verified how important any individual element of the TvBusMarshalledStruct is. I got it working, and have been adding metadata support as metadata is available in the source files, but that’s it currently.
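For reference, SourceSize lives in the Details block of each Item in the container listing. This fragment is a sketch based on my understanding of the HMO container format; the title, formats, size, and URL shown are hypothetical values:

```xml
<Item>
  <Details>
    <Title>ShowTitle</Title>
    <ContentType>video/x-tivo-mpeg</ContentType>
    <SourceFormat>video/x-ms-wtv</SourceFormat>
    <SourceSize>1431655765</SourceSize>
  </Details>
  <Links>
    <Content>
      <Url>/TiVoConnect/ShowTitle.wtv</Url>
    </Content>
  </Links>
</Item>
```

Since the final size of a transcoded stream isn’t knowable up front, the reported SourceSize has to be a generous estimate so the TiVo doesn’t hang up mid-transfer.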

It’s interesting to me that the transfer speed seems completely governed by the speed of the TiVo processor and what the TiVo is doing. The maximum speed I’ve been able to transfer a file to the TiVo is about 16Mb/s. That speed was achieved while the TiVo was not recording any shows and had a 480p program paused on the screen. If I’m playing a 1080i or 720p program, the transfer rate is usually much closer to 7Mb/s. This means that if I’m transferring a program that was recorded in HD on my Windows Media Center from my HDHomeRun Prime, I generally cannot watch it in real time, because it transfers slightly slower than the combined data rate of the video and audio.

WimTiVoServer: reasons for existence

I had multiple reasons for writing this program.

I wanted to work with the XMLLite library for XML processing, which Microsoft has included as part of their operating system for several years. http://msdn.microsoft.com/en-us/library/windows/desktop/ms752838(v=vs.85).aspx I’ve used XML as my data storage and transfer medium for years, but most of the time have processed the XML manually as opposed to using a prebuilt library. My reasoning behind my own parsing and generation was that I was building cross-platform solutions that ran on embedded systems interoperating with Windows programs. It was easier to build the same code to run on both platforms and assure compatibility than to deal with the overhead and maintenance of third party libraries. I’ve still not formed a full opinion on using the library for reading XML, but it certainly makes creating XML easier by making sure tags are properly closed and nested, and the option of automatic indentation in the output is nice.

I wanted experience using FFMPEG and its associated video processing libraries. With the release of FFMPEG v1 late last year, it became much more capable of dealing with all of the container formats and encoding types that I was interested in, including the WTV containers that Windows Media Center uses and the MKV containers that are common on the internet for high definition files. In the functioning version of my server, I’m using the libraries linked directly into the program to parse the media files for metadata, but spawning a full copy of FFMPEG to do the transcoding required to send the proper format to the TiVo. I’m considering migrating entirely to the spawned FFMPEG process to simplify licensing if I want to make my program publicly available. It would also simplify future support for codecs and containers that FFMPEG may add, since my service itself wouldn’t need to be relinked with updated libraries.

I’ve been frustrated with the state of the TiVo Desktop software provided by TiVo. It was designed so that it plugs into the Windows video codec stack for video transcoding, as well as the Apple protocol stack for Bonjour support. Both of those lead to complications when upgrading programs. Apple regularly releases updates to iTunes, and whenever an update needed to be installed it caused problems, because the TiVo service was running and keeping some of the Apple services locked in use. It essentially required a full uninstall and reinstall of iTunes every time I needed to update it, with several machine reboots in the process. Somehow I’d managed to get a codec installed on my machine that allowed the TiVo Desktop to support the MKV container. Duplicating that installation on a new machine or server was not something I wanted to attempt. Since modern FFMPEG supports MKV, I get that support without manipulating the video codec stack in Windows.

The TiVo Desktop software only runs as a user process, when a user is logged in; files are not served from a background service. This is an issue when I’d like to run the process on a headless server. There are ways around the issue, but my program solves it simply by being a pure service. The TiVo Desktop both announces itself via the UDP beacon process on port 2190 and listens for other servers on the same port. Because my program is purely a server, I saw no reason to listen for incoming beacons, and so I do not tie up the UDP port for receiving. This allows secondary programs to be started at the user’s discretion to listen for TiVo processes on the network.
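For context, the beacon itself is a small key=value text payload broadcast over UDP port 2190. The sketch below (in Python for brevity; my server is native code) shows the beacon fields as I understand them from the HMO documentation, with the identity and machine values made up for illustration:

```python
import socket

# Hypothetical beacon payload following the HMO beacon format as I
# understand it; the identity and machine values here are made up.
BEACON = "\n".join([
    "tivoconnect=1",
    "swversion=1",
    "method=broadcast",
    "identity=WimTiVoServer",
    "machine=ServerName",
    "platform=pc",
    "services=TiVoMediaServer:80/http",
]) + "\n"

def send_beacon(payload: str = BEACON, port: int = 2190) -> int:
    """Broadcast one beacon datagram on the TiVo discovery port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        return s.sendto(payload.encode("ascii"), ("255.255.255.255", port))
```

A send-only server like this just repeats the broadcast periodically; it never needs to bind and listen on port 2190, which is what leaves the port free for other programs.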

WimTiVoServer

With the lack of movement on TiVo Desktop support for Windows 8, and several other issues I’ve had, I decided to write my own implementation of a TiVo server. I first investigated what else was out there. The primary server is pyTiVo, a Python and FFMPEG based TiVo server. It has had significant work put into it, and supports both push and pull methods for transferring files to TiVos. My issue with it is that it requires the installation of Python, which is one more environment that I don’t really want to install and maintain on my Windows server if I don’t need to.

I used to run a TiVo Publisher Add-In for my Windows Home Server. http://durfee.net/software/2007/07/tivo-publisher-for-whs.html It hasn’t been updated in over 5 years, and didn’t handle newer codecs or install on newer servers. I’ve since upgraded to a much newer server and would like a similar service to run on it, so that files on the server can be transcoded directly to the TiVo without requiring a running desktop machine or a logged-in console.

In the decision to write my own software, I was reminded that the reason I’ve not done it in the past is that information on the TiVo protocol is so hard to come by, plus it has some confusing issues. The TiVo server protocol is referred to as the TiVo Home Media Option, or HMO server. The documents from TiVo are sometimes difficult to find, but a copy is located at http://www.tivocommunity.com/tivo-vb/showthread.php?p=5834238#post5834238

The fact that TiVo doesn’t host the documents in a standardized location says something about the company’s withering support for the developer community.

After starting to type some of this information up, I’ve realized how much information I’ve collected and distilled down to a small running program. Because of that I plan on breaking my thoughts into several posts.