Hi,

> I spent much of the last week converting my compositions from MIDI files to
> Lily format. I tried midi2ly first, but found it lacking, and decided to roll
> my own in C++. I gather there are other converters around, as I saw at least
> one on GitHub. I may put mine on GitHub too, after its code settles down a
> bit. It assumes the MIDI file is already quantized, but it handles triplets,
> and works well enough for me. If you're curious, some of the scores I created
> with it are here:
> https://www.chriskorda.com/misc/scores.html
Just out of curiosity: have you tried first converting the MIDI to MusicXML (using one of the various available tools), then converting that MusicXML to LilyPond using either musicxml2ly or [xml2ly](https://github.com/jacques-menu/musicformats)? I'd expect this to produce much better results than midi2ly. (A rough sketch of how that route could be scripted is at the end of this message.)

> Presumably I could then scroll the resulting PNG file to generate video
> frames. But would I know the pixel offsets of notes/rests within each measure?

You need some Scheme code for that. See [here](https://github.com/aspiers/ly2video/blob/41364ad9c5d512c502de2c9b06f7878bd88b77e1/ly2video/cli.py#L1326) for what ly2video uses.
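If I remember right, the trick ly2video uses is to inject Scheme that hooks a callback onto every NoteHead grob and prints its horizontal coordinate while LilyPond typesets the score; the Python side then parses that output. Below is a minimal sketch of that idea, not ly2video's actual code: the file names, the NOTEHEAD-X marker, and the 150 dpi resolution are all made up, and the coordinates come out in staff spaces relative to the system, so you would still need to scale them by the staff size and the PNG resolution to get pixel offsets.

```python
import re
import subprocess

# Scheme hook (prepended to a copy of the score): print the X coordinate of
# every NoteHead, measured in staff spaces relative to the system it sits on.
# A score that defines its own \layout block may need the override moved there.
DUMPER = r"""
#(define (dump-notehead-x grob)
   (let* ((sys (ly:grob-system grob))
          (x (ly:grob-relative-coordinate grob sys X)))
     (format #t "NOTEHEAD-X ~a~%" x)))

\layout {
  \context {
    \Voice
    \override NoteHead.after-line-breaking = #dump-notehead-x
  }
}
"""

def notehead_offsets(ly_file: str) -> list[float]:
    """Run LilyPond on an instrumented copy of the score and collect offsets."""
    with open(ly_file) as f:
        source = f.read()
    with open("dump.ly", "w") as f:
        f.write(DUMPER + source)
    result = subprocess.run(
        ["lilypond", "--png", "-dresolution=150", "dump.ly"],
        capture_output=True, text=True, check=True,
    )
    # LilyPond's own progress chatter goes to stderr; search both streams.
    output = result.stdout + result.stderr
    return [float(m.group(1)) for m in re.finditer(r"NOTEHEAD-X ([-\d.]+)", output)]

if __name__ == "__main__":
    print(notehead_offsets("piece.ly"))  # "piece.ly" is a placeholder
```

ly2video does quite a bit more than this (it also needs the timing of each note to sync the scrolling to the MIDI), but the spatial half boils down to a callback along these lines.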
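And here is the rough sketch of the MusicXML route mentioned above. It assumes music21 for the MIDI-to-MusicXML step, purely as an example of "one of the various available tools" (MuseScore can export MusicXML from MIDI as well), and it assumes musicxml2ly is on your PATH; xml2ly could be dropped in instead. The file names are placeholders.

```python
import subprocess
from music21 import converter

def midi_to_lilypond(midi_path: str, xml_path: str, ly_path: str) -> None:
    # Step 1: parse the (already quantized) MIDI and write it out as MusicXML.
    score = converter.parse(midi_path)
    score.write("musicxml", fp=xml_path)

    # Step 2: let musicxml2ly turn the MusicXML into a .ly file.
    subprocess.run(["musicxml2ly", "--output", ly_path, xml_path], check=True)

if __name__ == "__main__":
    midi_to_lilypond("piece.mid", "piece.musicxml", "piece.ly")
```

There is nothing special about music21 here; the hope is simply that a dedicated MIDI importer plus musicxml2ly or xml2ly handles voices, durations, and enharmonic spelling better than midi2ly does on its own.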