> My impression is that Sage's directory structure tries to address 
> different concerns (meaning here: different mathematical topics). Do you 
> think that the present directory structure does not succeed in keeping 
> different concerns apart? 
>
 
Here are a few of the folders in the main directory: 
- databases
- interfaces
- libs
- misc
- server
- tests

- finance
- geometry
- probability
- quadratic_forms 

I think we should cluster the directories the way I just did. We could have 
backend/ and math/. Or we could make the "political" choice of not using a 
math/ folder, leaving all the math subfolders in the main directory, but 
still moving all the backend/ stuff into a subdirectory. Obviously that 
would be a huge amount of work, but it would be made much easier if we 
announced this as a guideline, so that new code gets incorporated the right 
way from the start.
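
To make that concrete, the top level could look roughly like this (only an 
illustration built from the folders listed above, not a worked-out 
proposal): 

    sage/ 
        backend/ 
            databases/ 
            interfaces/ 
            libs/ 
            misc/ 
            server/ 
            tests/ 
        math/ 
            finance/ 
            geometry/ 
            probability/ 
            quadratic_forms/ 
            ... 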
 

> Ideally, every concern of Sage (according to the directory structure, it 
> is categories, coding, combinat, ..., structure, symbolic, tensor) 
> should have a top-level documentation, in addition to the documentation 
> of each class/method. I am actually not sure to what extent such 
> top-level documentation exists. 
>
It exists in some places (for instance the categories module), but then no 
one is forced to go look there.
 

> > - Separate the actual Categories framework as it currently exists into 
> >   two parts. The first would be a "Mathematical abstraction" layer, 
> >   where very abstract mathematical information is implemented. 
> > - Force the mathematical developer to think in terms of mathematics 
> >   first, i.e. require a category to be specified in order to be able to 
> >   construct a Parent. 
>
> +1. But how to enforce it? Some people (including myself) occasionally try 
> to shift old code to the new category and coercion model. Would it be 
> enough to ask the reviewers of a patch to take care that the category of 
> any *new* parent is duly defined? Or should one make doctests fail on 
> purpose, if a parent is not properly put into a category, to enforce its 
> use? 
>
I would strongly advocate making doctests fail on purpose to enforce this. 
Without a shared abstraction layer we are all running around in different 
directions instead of building on the same foundation.
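
Just to fix ideas, a minimal sketch of enforcing it at construction time 
(which would then make any doctest that builds such a parent fail). The 
class name is made up; this is not existing Sage code: 

    from sage.structure.parent import Parent 

    class StrictParent(Parent): 
        def __init__(self, category=None, **kwds): 
            # Make the omission fail immediately instead of silently 
            # falling back to some default category. 
            if category is None: 
                raise TypeError("a category must be specified for %s" 
                                % type(self).__name__) 
            Parent.__init__(self, category=category, **kwds) 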

Trying to shift old code over is likely to break a ton of stuff, for 
instance all the tickets sitting on trac and all the patches people are 
about to submit. If this were a clearly outlined guideline, it would help 
embolden people to do it anyway. Putting it in as a guideline shows that we 
have identified this as desirable in the long term, even if it causes 
headaches in the short term. 
 

> > - Specify the coercions that should exist at the "Mathematical 
> >   abstraction" level. 
> > - Specify for each new Parent how to coerce from the Category down to 
> >   the Parent. 
>
> Could you elaborate a bit more? What does "to coerce from the category 
> down to the parent" mean, and *what* coercions should exist at the 
> mathematical abstraction level? 
>

That there is a coercion from a field into a polynomial ring over that 
field is, for instance, a mathematical fact. It is true regardless of the 
implementation (i.e. the parent), so it should live in the abstraction 
layer (which could also construct all the associated tests). It bothers me 
that to explain how (something in a field) and (polynomials over a subring 
of that field) are coerced into a common category, one _always_ needs to be 
concerned about parents.
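
In today's Sage that knowledge is of course there, but it is attached to 
the concrete parents. For instance (an ordinary Sage session, nothing 
hypothetical here): 

    sage: R.<x> = QQ[] 
    sage: R.has_coerce_map_from(QQ) 
    True 
    sage: 1/2 + x        # the coercion QQ -> QQ[x] is applied implicitly 
    x + 1/2 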

change "To coerce from the category down to the parent" --> "To coerce from 
the abstraction layer to the parent". In this abstraction layer (which is 
close to the categories now available, but not quite the same), we would 
have a list of methods for the parent class and the element class of a 
Category C. Now say you have parentA and parentB both implementing the same 
category C. As part of its definition, parentA c/should tell how to 
initialize it using exclusively methods from this abstraction layer. But 
parentB implements some of these methods. With any luck there might be 
enough overlap that we automatically get a coercion from parentB -> 
parentA. This would work by going from parentB "up" to the abstraction 
layer's method and then "down" onto parentA. I mean up and down not in the 
sense of class hierarchy but in terms of abstraction level. 
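
A purely hypothetical sketch of what I mean (none of these names exist in 
Sage; assume the abstraction layer for "polynomial rings" guarantees gen() 
and zero() on the parent and dict() on the elements): 

    def derived_coercion(b_elt, parentA): 
        # "up": read the element of parentB only through abstraction-layer 
        # methods, never through parentB's implementation details 
        data = b_elt.dict()                      # {exponent: coefficient} 
        # "down": rebuild the corresponding element inside parentA, again 
        # using only abstraction-layer methods 
        x = parentA.gen() 
        return sum((c * x**e for e, c in data.items()), parentA.zero()) 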

 

> A while ago, I created a public worksheet on "how to implement 
> basic algebraic structures in Sage", using the category framework and 
> coercion. 
> I think at some point it was supposed to become part of the 
> documentation, but we lost momentum. 
>
I just found what you posted a year ago. That's great! As I said, exactly 
this should be available somewhere official and marked as current.

 

> Both decorators and metaclasses are used in Sage. Note one problem, 
> though: if you want to create a new class inheriting from classes with 
> distinct metaclasses, then Python complains. 

Yes. Yes. 
 

> I had experimental code to overcome that problem, namely by using 
> meta-metaclasses that dynamically create a new common metaclass out of any 
> list of metaclasses. Hence, if A and B are classes with metaclasses MA and 
> MB, then you can define 
>  class C(A,B) 
> so that C gets a dynamically created metaclass MC that is a sub-class of 
> both MA and MB. The idea is that both MA and MB (which *are* 
> metaclasses) have a common meta-metaclass MM, and MM is able to create a 
> new metaclass MC out of MA and MB. 
>
Yes. Since these metaclasses exist specifically to abstract away the 
implementation details of some concern, they are aspects, and this step is 
then aspect weaving. 
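
Just to sketch the mechanics (this is not Simon's experimental code, only 
the standard recipe for the same effect, with made-up names): 

    def combine_metaclasses(*metas): 
        # Build a common sub-metaclass MC of the given metaclasses, so that 
        # class C(A, B) can be created even when type(A) != type(B). 
        # A real implementation would also cache the result. 
        unique = [] 
        for m in metas: 
            if m is not type and m not in unique: 
                unique.append(m) 
        if not unique: 
            return type 
        if len(unique) == 1: 
            return unique[0] 
        name = "_".join(m.__name__ for m in unique) 
        return type(name, tuple(unique), {}) 

    # usage: 
    #   MC = combine_metaclasses(type(A), type(B)) 
    #   C  = MC("C", (A, B), {}) 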
 

> In that way, you could have metaclasses for different purposes, and can 
> freely combine them: You want a class that is pickled by construction 
> and whose instances are cached? Then simply combine 
> DynamicMetaclass and ClasscallMetaclass. No need to separately/manually 
> define DynamicClasscallMetaclass anymore - the meta-metaclass would do 
> that for you. 
>
Yes. I was thinking along the same lines. Do you think the ordering of the 
metaclasses should matter?  
 

> But it seems people found the idea of meta-metaclasses a bit confusing... 
>
Well, it is, but I think it's a valid solution. And 95% of people won't 
have to worry about these meta-metaclasses: the interface could be a 
customizable MagicObject, for instance. The cost is on the side of the 
metaclass, whose complexity is fairly high. The reward is that the 
decorators are trivial to use. 
 

>
> > - "This will slow down the one class I am using a lot": Not if it's 
> >   done right. In fact, if it's done very right it would considerably 
> >   speed up everything, 
>
> +1. 
>
> One example: one can write a FastHashMetaclass. If you then take a 
> Python class A that has a `__hash__` method and apply the 
> FastHashMetaclass to A, then A.__hash__ would automatically be turned 
> into a fast Cython hash that is moreover cached (so that the hash value 
> does not need to be computed repeatedly). 
>
> Best regards, 
> Simon 
>
>
>

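Regarding that last example, here is roughly how I picture the caching part 
of such a FastHashMetaclass, ignoring the Cython compilation side (a pure 
sketch with made-up names, not existing Sage code): 

    class CachedHashMetaclass(type): 
        def __new__(mcls, name, bases, namespace): 
            orig = namespace.get("__hash__") 
            if orig is not None: 
                def cached_hash(self, _orig=orig): 
                    # compute the hash once, then reuse the stored value 
                    try: 
                        return self._cached_hash_value 
                    except AttributeError: 
                        self._cached_hash_value = _orig(self) 
                        return self._cached_hash_value 
                namespace["__hash__"] = cached_hash 
            return type.__new__(mcls, name, bases, namespace) 

    # usage (version-agnostic spelling): 
    #   A = CachedHashMetaclass("A", (object,), {"__hash__": slow_hash}) 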