James Lowe <james.l...@datacore.com> writes:

> The google tracker issue does work, in as much as the main point of this
> is to split the LilyPond score into 'lines', and each line is exported
> to an EPS file - this bit is done by LilyPond (IIRC).  You can then
> manually put each line into a video using most standard video editing
> software - this bit is done by the 'user'.
That would be more like flashing line by line rather than a continuous
horizontal scroll, right?

> Then (again using the video software) you use the timing marks created
> manually, the music itself (which can be generated separately as MIDI
> from LilyPond or be a real recording) and the snippets of music to
> create a 'video'.  It works, but it is very clunky and takes a fair
> bit of work.  If you are comfortable with your video editing software
> and have a good recording of your music, then fine, but I am not sure
> it is quite the same as a 'follow along' video.

I'd likely be able to write (in a nontrivial amount of time) the
required code for matching MIDI time to sound file time.  That's
reasonably manageable signal processing.  Correlating grobs to MIDI
time would be another task.  And then, of course, generating the
equivalent of one long strip, and making that run through, likely in a
nice smoothed motion with some cursor following the notes more
accurately.

-- 
David Kastrup
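
For reference, here is a minimal sketch of the "matching MIDI time to
sound file time" step, assuming Python with librosa and pretty_midi and
using chroma features plus dynamic time warping - one standard approach,
not necessarily the one David has in mind.  The file names and the
grob_times/grob_xs inputs are placeholders; the grob-to-MIDI-time
correlation itself is the separate task he mentions.

    # Sketch of MIDI-to-recording time alignment via chroma DTW.
    import librosa
    import numpy as np
    import pretty_midi

    SR = 22050      # analysis sample rate
    HOP = 512       # analysis hop size in samples

    # The real recording and the MIDI generated from the LilyPond score.
    audio, _ = librosa.load("performance.wav", sr=SR)
    midi = pretty_midi.PrettyMIDI("score.mid")

    # Render the MIDI so both signals live in the same feature space;
    # chroma is fairly robust to timbre differences between a synthesized
    # MIDI rendering and a real recording.
    midi_audio = midi.synthesize(fs=SR)
    chroma_midi = librosa.feature.chroma_cqt(y=midi_audio, sr=SR, hop_length=HOP)
    chroma_audio = librosa.feature.chroma_cqt(y=audio, sr=SR, hop_length=HOP)

    # Dynamic time warping yields a monotonic path of
    # (midi_frame, audio_frame) pairs, returned from end to start.
    _, wp = librosa.sequence.dtw(X=chroma_midi, Y=chroma_audio, metric="cosine")
    wp = np.flipud(wp)
    midi_times = librosa.frames_to_time(wp[:, 0], sr=SR, hop_length=HOP)
    audio_times = librosa.frames_to_time(wp[:, 1], sr=SR, hop_length=HOP)

    def midi_to_audio_time(t):
        """Map a time in the MIDI/score timeline to a time in the recording."""
        return float(np.interp(t, midi_times, audio_times))

    def cursor_x(playback_t, grob_times, grob_xs):
        """Smoothed cursor position along the 'one long strip': interpolate
        between grob x coordinates whose score times are known.  grob_times
        and grob_xs stand in for the output of the grob-to-MIDI-time
        correlation step."""
        score_t = float(np.interp(playback_t, audio_times, midi_times))
        return float(np.interp(score_t, grob_times, grob_xs))

Linear interpolation between grob positions is what gives the smooth
horizontal motion of the cursor; anything fancier (easing, per-note
snapping) could be layered on top of the same time map.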