On Mon, Jun 27, 2011 at 2:01 PM,  <tlaro...@polynum.com> wrote:
> On Mon, Jun 27, 2011 at 01:34:07PM -0400, erik quanstrom wrote:
>>
>> i don't even have an opinion on this.  i don't understand the conflation
>> of the input character set and tex's internal representations.  could
>> you explain why you are talking about them as the same?
>>
>> to be brutally honest, tex could internally use an array of monkeys
>> flinging poo to represent characters /internally/ and i would be much
>> happier than with a reasonable internal representation and a difficult
>> and incompatible external representation.  at least that way the monkeys
>> flinging poo are hermetically sealed within the program and not flinging
>> poo all over my system.  :-)
>
> In TeX there is, initially, a defined subset: ASCII, because TeX is a
> compiler/interpreter and one needs to be able to send it some
> "bootstrapping" commands. This initial mapping can be rapidly overridden
> (though it starts from ASCII-like characters) and made almost arbitrary.
>
> What people were arguing is precisely that external business: that
> "state of the art" (that is, soon to be "out of fashion") fonts and
> whatever mood "du jour" should lead to a rewrite of TeX's internals.
>
> I claim precisely that TeX's internals should be left alone. The majority
> of the work is external (the main part being in the dvi drivers). If I
> want to use ligatures, I shall be able to. If others want to put in the
> code for the ligatured glyph directly, they can, but that is their
> problem and not a holy rule.

That's not quite how OpenType works. It actually works more like TeX,
in that it allows files to store text in a basic ASCII/Latin-1/UTF-8
form which an OpenType-enabled renderer (such as XeTeX, InDesign or
even Office 2010) then presents (on screen or on page) as the correct
ligature. Thus the big advantage of OpenType over, say, Type 1 is that
it offers a feature set much closer to Computer Modern's full set of
ligatures, accents and alternates than Type 1 ever could (at least
without serious scripting to combine multiple Type 1 fonts containing
all the needed glyphs into a single "virtual font", as described in
your first post).
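To make the idea concrete, here is a toy sketch of what an OpenType
'liga' substitution does conceptually: the stored text stays plain
characters, and the shaper replaces runs of them with ligature glyph
names at render time. (This is only an illustration of the principle;
real GSUB lookups are binary tables inside the font, not Python dicts.)

```python
# Toy model of OpenType-style ligature substitution.  The document
# stores plain text; the renderer substitutes ligature glyphs on the
# fly.  Illustrative sketch only, not the actual GSUB format.

LIGATURES = {
    ("f", "f", "i"): "ffi",
    ("f", "f", "l"): "ffl",
    ("f", "f"): "ff",
    ("f", "i"): "fi",
    ("f", "l"): "fl",
}

def shape(text):
    """Greedily replace character runs with ligature glyph names."""
    glyphs = []
    i = 0
    while i < len(text):
        for length in (3, 2):  # try the longest ligatures first
            key = tuple(text[i:i + length])
            if key in LIGATURES:
                glyphs.append(LIGATURES[key])
                i += length
                break
        else:
            glyphs.append(text[i])
            i += 1
    return glyphs

print(shape("office"))  # → ['o', 'ffi', 'c', 'e']
```

The point is that the ligature lives only in the rendered output; the
underlying text (what you search, copy, or feed back to TeX) is
untouched, which is the same separation TeX itself makes.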

> Unfortunately wrong. Read back the thread (if you really have
> nothing more interesting to do). I have explained this "256 subfonts"
> business in the first message, and immediately got answers that
> the "correct way" was teaching TeX "modern" fonts.

The subfont system works fine if you both have a complete Type 1 font
set, including the "expert fonts" with the extra glyphs and the like,
AND are willing to put together a mapping for it. The problem is that
fonts haven't shipped (to consumers, at least) in that form for about
10 years. Unless I fundamentally misunderstand the subfont system
(which I admit I might), for any font made within the last 10 years or
so, using the subfont/virtual font system would entail the following
steps:
1. Break the complete OpenType font down into a combination of PFBs
and AFMs containing the complete set of characters between them,
carefully remapping each glyph outside of the 8-bit range so that they
all remain accessible. This may break the license agreement for many
fonts and would almost certainly lose many kerning pairs, hints and
other metadata (I'm not sure how much of that TeX uses, so it may not
be as big a problem as it sounds).
2. Build the virtual font mappings as with a "real" Type 1 set.
3. Hope for the best.
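As a rough sketch of the remapping in step 1 (my own illustration, not
any existing tool or naming convention), the "256 subfonts" scheme
amounts to splitting the glyph space into 8-bit pages, so that each
codepoint gets a (subfont, slot) address an 8-bit-only engine can use:

```python
# Sketch of the "256 subfonts" addressing scheme: each codepoint is
# mapped to an 8-bit slot in one of a series of subfonts.  The subfont
# naming convention below is hypothetical, for illustration only.

def subfont_slot(codepoint):
    """Return (subfont_index, slot), 256 glyphs per subfont."""
    return codepoint // 256, codepoint % 256

def subfont_name(base, codepoint):
    """Name the subfonts like 'lmr00', 'lmr01', ... (made-up scheme)."""
    index, slot = subfont_slot(codepoint)
    return "%s%02x" % (base, index), slot

# 'A' (U+0041) lands in subfont 0; the 'ffi' ligature (U+FB03) in 0xFB.
print(subfont_name("lmr", ord("A")))  # → ('lmr00', 65)
print(subfont_name("lmr", 0xFB03))    # → ('lmrfb', 3)
```

The virtual font layer in step 2 then has to record, for every glyph,
which subfont and slot it was exiled to, which is exactly the mapping
work that makes the process so laborious.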

Given the complexity of the process involved, I would hope you can
understand why, as a USER, teaching TeX to play nice with modern fonts
looks like a good way to go ;)

Again, none of this is meant as a put-down of your quite impressive
work, but rather as a reminder of some areas where others might run
into problems with making USE of said work.

Mike
