Kumar Appaiah wrote:
On Tue, Aug 03, 2010 at 12:12:26PM -0700, Dino Vliet wrote:
Dear debian people,
Can you help me with this task I have? I have a lot of files in a subdirectory
containing the following text:
You should use awk.
- cut -
You ought to read the Awk manual, and then it would be a matter of a
couple of hours of thought at most.
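As a minimal sketch of the awk approach (the input line and field layout
here are assumptions, not from the original files): awk splits each line
into fields, and rebuilding the record with OFS="," turns
whitespace-separated columns into CSV.

```shell
# Hypothetical input: whitespace-separated columns like "alice 23 NL".
printf 'alice 23 NL\nbob 31 BE\n' > /tmp/sample.txt
# $1 = $1 forces awk to rebuild the record using the comma OFS.
awk 'BEGIN { OFS = "," } { $1 = $1; print }' /tmp/sample.txt
# → alice,23,NL
#   bob,31,BE
```

The same one-liner extends to reordering or dropping columns (e.g.
`print $2, $1`) once you know which fields you actually need.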
you might want to start by perusing the "sed" manual - it's an even
simpler tool, though it might not be powerful enough for what you're doing
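A sed version of the same idea, as a sketch (again assuming
whitespace-separated input, which may not match your real files): sed
works purely on the text of each line, so a single substitution that
collapses runs of whitespace into commas is often enough.

```shell
# Replace every run of whitespace with a single comma.
printf 'alice 23 NL\n' | sed -E 's/[[:space:]]+/,/g'
# → alice,23,NL
```

Unlike awk, sed has no notion of fields, so this breaks down as soon as
a column itself contains spaces, which is one reason awk scales better
for this kind of job.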
also take a look at:
http://www.smashingmagazine.com/2009/04/10/25-text-batch-processing-tools-reviewed/
not Unix, but a collection of various visual tools for processing text
in batches
looks to me like your biggest problem is that each file has several
sections, each in a different format, so it's not just a matter of getting
everything into a uniform tabular structure for import into a
spreadsheet. You might want to think of this as a multi-step process
that either:
a. breaks each file into several files, each of a uniform format, then
processes each type of file separately, or
b. processes each file to normalize it into something that's easier to
turn into csv format
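Option (a) can be sketched in a few lines of awk. Everything here is an
assumption for illustration: the section separator ("=== "), the sample
input, and the output names part1.txt, part2.txt are all hypothetical,
since the original sample data was trimmed from the quote.

```shell
cd "$(mktemp -d)"
# Hypothetical file: sections introduced by lines starting with "=== ".
printf '=== header\nA 1\nB 2\n=== footer\ntotal 3\n' > input.txt
# Each separator line opens a new output file; section bodies are
# copied into part1.txt, part2.txt, ... in order of appearance.
awk '/^=== / { f = "part" ++n ".txt"; next } { print > f }' input.txt
```

After the split, each partN.txt has a single uniform format and can be
fed through its own awk or sed pass to produce CSV.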
Or, as someone suggested - hire someone. This is the silly kind of task
that's really easy if you're facile with regular expressions, shell
scripts, and such, but can end up taking forever to get right. Judging
from the sample data, I'm guessing you're at a university; there should be
enough student hackers around who work cheap.
Miles Fidelman
--
In theory, there is no difference between theory and practice.
In practice, there is. .... Yogi Berra
--
Archive: http://lists.debian.org/4c58969d.7080...@meetinghouse.net