Steve Holden wrote:
Martin MOKREJŠ wrote:
Steve Holden wrote:
[...]
I will be *very* surprised if you can't get a much better (i.e.
easier and more efficient) solution by stepping back from the
programming
Hmm, I'm not convinced, but I'll put a few more words here then. ;)
details for a moment and explaining what it is you are actually
trying to achieve in user-space.
Can you describe the problem you are trying to solve, rather than the
solution you are hoping to adopt?
Users input data through HTML forms; the data has to be quality-checked
and stored into MySQL. Later, the data is read from MySQL and presented
on the web. There's nothing special about it, except that the tables
describe a lot of specific experimental parameters. The tables reflect
the type of information and group the data into logical units - so the
table name reflects the data it contains.
In general, there are two types of data, hence my X and Y objects.
The underlying data at some point go into SQL tables. Before that
happens, the data - i.e. the variable contents - are checked to make
sure they contain the expected values (in some cases enumerated values,
in some cases integers, sometimes chars, and only about 6 blobs). I
spent a year developing the database schema and PHP code, and the schema
is nearly optimal. I got bored with the PHP code, as it was partly
developed by a lazy guy (lazier than I am).
I went for Python - to get better error handling, and to have not only a
web app but reusable code for a standalone application (the HTML forms
can be replaced by any Tcl/Tk widget for M$ Windows). I had added SQL
transactions to the PHP code, but it still sucks to work with further.
My idea is to check some of the values while instantiating, as I get
that for free (assigning either a default value or raising an exception
when a variable is empty). In most cases this is not enough, and I have
to type in the allowed values.
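For illustration, such instantiation-time checking might look like this
sketch (the Sample class and its column names are made up):

```python
class Sample(object):
    """Hypothetical table class: check values at instantiation time."""
    def __init__(self, temperature=None, method="unknown"):
        # required value: raise an exception when the variable is empty
        if temperature is None:
            raise ValueError("temperature must be provided")
        self.temperature = int(temperature)
        # optional value: fall back to a default when the field was blank
        self.method = method or "unknown"

s = Sample(temperature="37")
print(s.temperature, s.method)
```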
1. In the case of enumerated types, I hope to find a tool able to read
SQL files and extract column definitions. In this particular case, the
program would dynamically read the allowed ENUM values, so whenever the
SQL table is altered, the program will recognize the newly allowed
values.
2. In most other cases, the values are simply some kind of string, and
.find() et al. will suffice.
3. In the case of data read from MySQL, I can verify that foreign keys
refer to what they should refer to.
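For point 1, one possibility (just a sketch - the regex only covers
simple mysqldump-style output, and the helper name is made up) is to
pull the ENUM values straight out of the dumped CREATE TABLE statement:

```python
import re

def enum_values(sql, column):
    """Return the allowed ENUM values for `column`, or None if the
    column is not declared as an ENUM in the given SQL dump."""
    pattern = r"`%s`\s+enum\(([^)]*)\)" % re.escape(column)
    m = re.search(pattern, sql, re.IGNORECASE)
    if m is None:
        return None
    return [v.strip("'") for v in m.group(1).split(",")]

schema = """
CREATE TABLE experiment (
  `id` int NOT NULL,
  `status` enum('foo','bar') NOT NULL
);
"""
print(enum_values(schema, "status"))  # ['foo', 'bar']
```

With a live connection you could instead run SHOW COLUMNS FROM
experiment and parse the Type field, so newly allowed ENUM values are
picked up without re-reading the dump.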
OK, I get the data written to MySQL. I can fetch it back, and I want to
dump it into XML and present it on the web/(local GUI).
I have the classes corresponding to all tables as superclasses of X and
Y as necessary. I came to ask on this list how to assign the variables
easily, because parts of the code are almost identical. I believe this
has been answered quite well.
I believe the approach of using classes corresponding to every single
table is right, when using them as superclasses for the two practically
used objects: X and Y.
To print the output on the web or in any GUI, I think I'll use the XML
output and just parse it. I need XML anyway for testing, and I
definitely want to be able to construct the HTML/GUI output from the XML
input - again, for testing. So the objects will more or less exist only
to get the necessary checks done for those myriads of variables, which
must be evaluated in the current context. I'd go crazy if I stored
things into bsddb - I'm not going to remember that a[0] is table1, a[1]
is table2, a[0][0] is the primary key called blah, a[0][22] is allowed
to be equal only to "foo" or "bar" ... and that if a[2][4] is defined
(actually a number), the number is the key to search for in c[key].
Simply put, that's what I use MySQL for - I don't want to reinvent the
database schema in bsddb in Python. ;)
It's simply data; it must be read into variables in some objects, and
those objects are grouped into just two superobjects. The superobjects
define check methods, define how to dump their data into XML, and define
how (in which order) to write the values into MySQL.
I'm sorry not to send the SQL schema + the code, but this is my PhD
thesis. ;)
I'm very glad there are so many people interested in helping - not
only - me. Thanks! Now I'm really looking forward to seeing how you
would rework this thing. It's simple and easy; it's just sometimes
tedious, as having 250 columns in 20 tables simply makes you bored of
typing the code after a while.
The only thing where I think I need help is how to easily dump into XML,
say, object X, having variables a, b, c, where c is a reference to
object B, containing variables p, q, r.
B = obj()
setattr(B, 'p', 44)
setattr(B, 'q', "sdjahd")
setattr(B, 'r', "qew")
X = obj()
setattr(X, 'a', 1)
setattr(X, 'b', 2)
setattr(X, 'c', B)
print do_magick(X)
<X>
<a>1</a>
<b>2</b>
<B>
<p>44</p>
<q>sdjahd</q>
<r>qew</r>
</B>
</X>
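do_magick doesn't exist as such, but a minimal recursive sketch is easy.
One catch: plain variable names like X aren't recoverable at runtime, so
the sketch below names each element after its class (i.e. it assumes one
class per table, as described above), and it does no escaping of XML
special characters:

```python
def do_magick(obj, tag=None):
    """Recursively dump an object's attributes as naive XML."""
    tag = tag or obj.__class__.__name__
    parts = ["<%s>" % tag]
    for name in sorted(vars(obj)):
        value = getattr(obj, name)
        if hasattr(value, "__dict__"):   # nested object: recurse
            parts.append(do_magick(value))
        else:
            parts.append("<%s>%s</%s>" % (name, value, name))
    parts.append("</%s>" % tag)
    return "\n".join(parts)

class B(object): pass
class X(object): pass

b = B(); b.p = 44; b.q = "sdjahd"; b.r = "qew"
x = X(); x.a = 1; x.b = 2; x.c = b
print(do_magick(x))
```

For anything beyond a sketch, xml.etree (or a DOM) would handle escaping
and pretty-printing properly.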
I still don't really see why you have to store this thing as objects,
but I appreciate that you can only give limited information and still
retain the validity of a thesis.
The project is not published yet. When it is, I'll make it free. I'm a
biologist, and most biologists care only about the content of the
database, not about *any* technical details. It's a very interesting
project for them/me, and I'm the only one who cares about the technical
details.
My own usual approach in such situations is to use metadata to describe
the structure and required processing of the data, as it's often much
easier to write small amounts of relatively flexible data-driven code
than it is to hard-wire all the logic around specific structures and data.
Can you give me some example? What is the "metadata"? Sure, I want to
learn something, and I don't insist on almost anything. But I simply
thought that the objects at least group together common methods and
common variables.
Anyway, when reading from or writing to a single SQL table, I have to
have handy which columns to expect. Supergrouping into a superobject
gives me a way to define the order in which I have to interact with the
set of MySQL tables. What's wrong here? ;)
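To hazard a guess at what Steve means: "metadata" would be a per-table
description of the columns and their checks, so one generic validator
replaces 250 hand-written ones. A minimal sketch with made-up table and
column names:

```python
# Per-table column descriptions drive a single generic validator.
TABLES = {
    "experiment": {
        "status": {"type": str, "allowed": ("foo", "bar")},
        "count":  {"type": int},
    },
}

def validate(table, row):
    """Check one row (a dict) against the table's metadata."""
    for column, spec in TABLES[table].items():
        value = row[column]
        if not isinstance(value, spec["type"]):
            raise TypeError("%s.%s: expected %s" %
                            (table, column, spec["type"].__name__))
        allowed = spec.get("allowed")
        if allowed is not None and value not in allowed:
            raise ValueError("%s.%s: %r not in %r" %
                             (table, column, value, allowed))
    return True

validate("experiment", {"status": "foo", "count": 3})
```

The same dict could, in principle, also drive the XML dump and the
INSERT ordering - which is roughly what "data-driven" means here.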
M.
--
http://mail.python.org/mailman/listinfo/python-list