Andre Poenitz wrote:
>> The only problem that remains is extracting the cached_bufferview (which
>> is a problem common to all these 'solutions').
>
> I am starting to believe that all insets should cache a bufferview or some
> "context" after e.g. draw() is called ...
>
>> Am I correct in saying that you store it in the mathed insets?
>
> Yes. But I always thought of this as a kludge. But maybe it is not as much
> of a kludge after all.
>
> It certainly cleans up the rest of the interface a lot as the BufferView
> does not have to be passed around all the time. On the other hand, it
> makes things a bit more fragile...
>
>> Could you move this store into InsetBase?
>
> Sure. Should I? This would certainly help IU a lot...
>
> But maybe one or two of the others could give their opinions, as this is a
> rather serious architectural change, at least for inset/*. math used to
> work like that all the time AFAIK, so it is technically feasible, even if
> it's not really "nice"...
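To make the comparison concrete: if the cache did move into InsetBase, I
imagine it would look roughly like this (cacheView() and view_ are
placeholder names of my own, not code that exists anywhere yet):

	class BufferView;

	class InsetBase {
	public:
		InsetBase() : view_(0) {}
		virtual ~InsetBase() {}

		// Derived insets call this at the top of their draw();
		// thereafter view() is valid until the next draw.
		void cacheView(BufferView * bv) { view_ = bv; }
		BufferView * view() const { return view_; }

	private:
		// The view we were last drawn in; 0 before the first
		// draw. This is the "fragile" bit: the pointer dangles
		// if the BufferView is destroyed before the inset.
		BufferView * view_;
	};

Every method that currently takes a BufferView argument could then call
view() instead, which is what tidies up the rest of the interface.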
Perhaps I should show you what the alternative is, using the current,
signal-based solution modified to the new reality:

	class InsetBase {
	public:
		// The name is stored rather than obtained from a pure
		// virtual name(), because a virtual call from the d-tor
		// would never reach the derived class.
		explicit InsetBase(string const & name) : name_(name) {}
		virtual ~InsetBase() { hideDialog(name_); }
		string const & name() const { return name_; }
		// Emitted on destruction so that any open dialog is closed.
		boost::signal1<void, string const &> hideDialog;
	private:
		string name_;
	};

	void Dialogs::show(string const & name, string const & data,
			   InsetBase * inset)
	{
		if (!isValidName(name))
			return;

		Dialog * dialog = dialogs_[name].get();
		if (!dialog)
			return;

		dialog->show(data);
		open_insets_[name] = inset;
		if (inset)
			inset->hideDialog.connect(
				boost::bind(&Dialogs::hide, this, _1));
	}

	void Dialogs::hide(string const & name)
	{
		if (!isValidName(name))
			return;

		dialogs_[name]->hide();
		open_insets_[name] = 0;
	}

I.e., the hide signal is connected to the correct slot when the dialog is
shown, so it just 'does the right thing' when the inset d-tor is invoked.

The code above is 'safe' but I think that the other approach is far more
transparent.

-- 
Angus
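PS If anyone wants to play with the lifetime pattern in isolation, here is
a self-contained toy version. Dialog, the maps and main() are scaffolding
of my own and have nothing to do with the real frontend code:

	#include <boost/bind.hpp>
	#include <boost/signal.hpp>
	#include <iostream>
	#include <map>
	#include <string>

	using std::string;

	// Toy dialog: just reports what happens to it.
	struct Dialog {
		void show(string const & data)
			{ std::cout << "show: " << data << '\n'; }
		void hide()
			{ std::cout << "hide\n"; }
	};

	class InsetBase {
	public:
		explicit InsetBase(string const & name) : name_(name) {}
		virtual ~InsetBase() { hideDialog(name_); }
		string const & name() const { return name_; }
		boost::signal1<void, string const &> hideDialog;
	private:
		string name_;
	};

	class Dialogs {
	public:
		void show(string const & name, string const & data,
			  InsetBase * inset)
		{
			dialogs_[name].show(data);
			open_insets_[name] = inset;
			if (inset)
				inset->hideDialog.connect(
					boost::bind(&Dialogs::hide, this, _1));
		}
		void hide(string const & name)
		{
			dialogs_[name].hide();
			open_insets_[name] = 0;
		}
	private:
		std::map<string, Dialog> dialogs_;
		std::map<string, InsetBase *> open_insets_;
	};

	int main()
	{
		Dialogs dialogs;
		InsetBase * inset = new InsetBase("citation");
		dialogs.show("citation", "some data", inset);
		// Deleting the inset emits hideDialog("citation"),
		// which the connection forwards to Dialogs::hide; the
		// dialog closes without anybody calling hide() by hand.
		delete inset;
		return 0;
	}

Note that nothing ever calls Dialogs::hide explicitly; the inset d-tor
does all the work through the connection made in show().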