On 8/27/17, Evert Vorster <evors...@gmail.com> wrote:
> Hi there.

Hi,
>
> I have asked this question on the ffmpeg-users list, but it's been oddly
> quiet on this front.
>
> I am trying to create an ffmpeg command line that re-maps and stitches
> together footage from the Samsung Gear 360 camera. The basic methodology
> can be applied to any 360-view camera. The reason I am doing this is that
> the software available for the job is closed source and extremely
> expensive. For a hobbyist, this is a bad situation.
>
> The first thing I do is load a specially crafted series of frames into
> Hugin and map the lenses precisely.
> Then I make the remap files that the ffmpeg remap filter uses with nona -c.
> I hand-craft an alpha map to do the blending between the lenses, and the
> results are really good.
>
> Unfortunately, I have a real problem with vignetting.
> The vignette filter in ffmpeg seems to have some room for improvement:
> it has only one variable plus the x,y center, which allows for only one
> type of vignetting correction with the "backward" option set.
>
> In Hugin (and the panotools) the lens is described with three variables
> and an x,y center. This lets the vignetting strength be described as a
> custom curve.
>
> If you are curious about the project, this is the GitHub page, with
> examples:
> https://github.com/evertvorster/dualfisheye2equirectangular_ffmpeg_remap
> There are example .pto files that, when loaded in Hugin, show the lens
> characteristics of the Samsung Gear 360 lenses, and some sample footage
> to test on.
>
> How difficult would it be to port the panotools' vignetting correction
> filter into ffmpeg?

What is it actually called?
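
If it is the usual radial polynomial from the .pto lens parameters
(1 + Vb*r^2 + Vc*r^4 + Vd*r^6, around a center offset by Vx,Vy), then the
per-pixel core of a port would look roughly like the sketch below. This is
only a sketch; the normalization of r is from memory rather than checked
against libpano, so treat it as a placeholder:

  static float vignette_gain(float x, float y, float cx, float cy,
                             float norm, float vb, float vc, float vd)
  {
      /* distance from the vignetting center, normalized; the exact
         normalization radius panotools uses is a placeholder here */
      float dx = (x - cx) / norm;
      float dy = (y - cy) / norm;
      float r2 = dx * dx + dy * dy;

      /* falloff model: 1 + Vb*r^2 + Vc*r^4 + Vd*r^6; correcting means
         dividing the pixel value by that factor, i.e. multiplying by
         the gain returned here */
      return 1.0f / (1.0f + r2 * (vb + r2 * (vc + r2 * vd)));
  }

The rest would be option parsing and looping over the planes, similar in
shape to the existing vf_vignette.c.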
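
On the remap side, just to check that I follow the pipeline: each lens is
pulled out with something like the line below (file names are placeholders,
the two PGMs being the maps written by nona -c), and the two remapped
halves are then blended with your hand-made alpha map?

  ffmpeg -i gear360.mp4 -i front_x.pgm -i front_y.pgm \
         -lavfi "[0:v][1:v][2:v]remap" front_remapped.mp4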