On 31/12/2015 17:15, Rob Gaddi wrote:
I'm looking for some advice on handling data collection/analysis in
Python.  I do a lot of big, time-consuming experiments: I run a long
data collection (a day or a weekend) in which I sweep a bunch of
variables, then come back offline and try to cut the data into something
that makes sense.

For example, my last data collection looked (neglecting all the actual
equipment control code in each loop) like:

for t in temperatures:
    for r in voltage_ranges:
        for v in test_voltages[r]:
            for c in channels:
                for n in range(100):
                    record_data()

I've been using Sqlite (through peewee) as the data backend, setting up
a couple tables with a basically hierarchical relationship, and then
handling analysis with a rough cut of SQL queries against the
original data, Numpy/Scipy for further refinement, and Matplotlib
to actually do the visualization.  For example, one graph was "How does
the slope of straight line fit between measured and applied voltage vary
as a function of temperature on each channel?"

The whole process feels a bit grindy, like I keep having to stitch
things together ad hoc.  And I keep hearing about pandas, PyTables, and
HDF5.  Would those make my life notably easier?  If so, does anyone have
any references on them that they've found particularly useful?  The
tutorials I've seen so far don't give much detail on the point of what
they're doing; it's all "how you write the code" rather than "why you
write the code".  Paying money for books is acceptable; this is all on
the company's time/dime.

Thanks,
Rob


I'd start with pandas (http://pandas.pydata.org/) and see how you get on.
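To give a feel for why pandas might help: your "slope of measured vs applied voltage, per channel, at each temperature" question collapses to a single groupby once the raw sweep is in a DataFrame. This is only a sketch with made-up table and column names (a flat `measurements` table with `temperature`, `channel`, `applied_v`, `measured_v`) and an in-memory SQLite database standing in for your real file:

```python
# Sketch only: table/column names and the fake data are invented, and an
# in-memory SQLite database stands in for the real experiment database.
import sqlite3
import numpy as np
import pandas as pd

conn = sqlite3.connect(":memory:")  # stand-in for your on-disk .db file
conn.execute("CREATE TABLE measurements "
             "(temperature REAL, channel INTEGER, applied_v REAL, measured_v REAL)")
rows = [(t, c, v, 0.98 * v + 0.01 * c)      # fake readings, slope 0.98
        for t in (25.0, 85.0)
        for c in (0, 1)
        for v in (0.0, 1.0, 2.0, 5.0)]
conn.executemany("INSERT INTO measurements VALUES (?, ?, ?, ?)", rows)

# Pull the whole flat table into one DataFrame.
df = pd.read_sql_query("SELECT * FROM measurements", conn)

def slope(group):
    # Least-squares slope of measured vs applied voltage for one group.
    return np.polyfit(group["applied_v"], group["measured_v"], 1)[0]

# One slope per (channel, temperature) pair, no hand-written SQL joins.
slopes = df.groupby(["channel", "temperature"]).apply(slope)
print(slopes)
```

From there, `slopes.unstack("channel").plot()` hands the result straight to Matplotlib, one line per channel, which is exactly the graph you described.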

If and only if pandas isn't adequate, and I think that highly unlikely, try PyTables. Quoting from http://www.pytables.org/ "PyTables is a package for managing hierarchical datasets and designed to efficiently and easily cope with extremely large amounts of data." and "PyTables is built on top of the HDF5 library". I've no idea what the definition of "extremely large" is in this case. How much data are you dealing with?
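Note also that you may never need to touch PyTables directly: pandas uses it under the hood for HDF5 storage, so the DataFrame workflow stays the same and only the container changes. A hedged sketch (file name, key, and column names invented; needs the "tables" package installed):

```python
# Sketch only: file name, key, and columns are made up.  Requires the
# "tables" (PyTables) package, which pandas uses internally for HDF5.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "temperature": np.repeat([25.0, 85.0], 4),
    "channel": [0, 0, 1, 1] * 2,
    "measured_v": np.random.rand(8),
})

with pd.HDFStore("experiment.h5", mode="w") as store:
    # format="table" makes an on-disk, queryable table, so later reads
    # can pull a subset without loading the whole file into RAM.
    store.put("sweep", df, format="table", data_columns=True)

with pd.HDFStore("experiment.h5", mode="r") as store:
    # Only the hot-temperature rows come off disk.
    hot = store.select("sweep", where="temperature > 50")
```

The `where` clause is the main payoff over plain pickles or CSV: it filters on disk, which is what makes "extremely large" datasets workable.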

I don't understand your comment about tutorials. Once they've given you an introduction to the tool, isn't it your responsibility to manipulate your data in the way that suits you? If you can't do that, either you're doing something wrong, or the tool is inadequate for the task. For the latter I believe you've two options, find another tool or write your own.

I would not buy books, on the simple grounds that they go out of date far faster than the online docs :)

--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.

Mark Lawrence

--
https://mail.python.org/mailman/listinfo/python-list
