> > split -b Considered Harmful.
>
> Just curious having not heard of split before, what's the purpose of
> it as-is? :)
Do you mean what's its use? I'm no unix (or Plan 9) elder, but I think it's a
file-based implementation of what some text editors call folding.
Not a trivial function.
# usage: sledge /tmp/foo /tmp/bar 31415
# could also be named crowbar
fn sledge {
	ifile=$1
	ofile=$2
	size=$3
	if (~ $#* 3) {
		# one dd per chunk; chunk i starts at block i of the input
		for (i in `{seq 0 `{ls -l $ifile | awk '{print int($6/'^$size^')}'}}) {
			echo dd -if $ifile -of $ofile.$i -bs $size -count 1 -iseek $i
		}
	}
}
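note that as written it only echoes the dd commands so they can be checked
first; drop the echo (or pipe the output through rc) once they look right.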
> Just curious having not heard of split before, what's the purpose of
> it as-is? :)
split(1) is pretty informative. in Þe auncien[t] dais, it was sometimes hard
to send big files by email or fit them on floppies or 9 track tapes.
rchistory(1) shows i haven't used it once in the last 5 years.
On 10 May 2010, at 15:52, Iruata Souza wrote:
> On Mon, May 10, 2010 at 11:01 AM, gas wrote:
>> Isn't the sensible solution to add a "-b bytes" option to split?
> split -b Considered Harmful.
Just curious having not heard of split before, what's the purpose of
it as-is? :)
--
Simplicity does not precede complexity, but follows it.
On Mon, May 10, 2010 at 11:01 AM, gas wrote:
> Isn't the sensible solution to add a "-b bytes" option to split?
>
split -b Considered Harmful.
Isn't the sensible solution to add a "-b bytes" option to split?
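for comparison, gnu's split already spells that option -b; 1.44MB
floppy-sized pieces would look like

split -b 1474560 $file $file.

(plan 9's split has no such flag, hence this thread.)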
On 10 May 2010 12:40, erik quanstrom wrote:
> this isn't awk's fault. awk gets the right result. you've illustrated
> the dismalness of seq.
good point. i should have worked that out!
it's not the first time i've been caught out by %g - perhaps
the default precision of %g should be 12 or more.
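a quick awk illustration of the default (awk's OFMT is %.6g, so anything
needing more than six significant digits flips into exponent form):

% awk 'BEGIN{print 12345678.9}'
1.23457e+07
% awk 'BEGIN{OFMT="%.12g"; print 12345678.9}'
12345678.9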
> but there's still a little annoyance - if the file size is an exact
> multiple of the block size, it will generate an unnecessary zero-length
> file at the end.
sometimes what should work, doesn't. that (and the whitespace)
are why my version is longer.
> i tried to fix it to get rid of this,
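one sketch of a fix (untested, reusing $file and $bs from russ's one-liner):
compute the last chunk index from size-1 rather than size, so an exact
multiple rounds down to one fewer piece --

seq 0 `{ls -l $file | awk '{print int(($6-1)/'$bs')}'}

a file of exactly n*bs bytes then gives indices 0 to n-1 instead of 0 to n.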
hi,
since i see many solutions, i thought i'd add mine too (if you can
use inferno). attached please see the limbo program i wrote for this.
it has helped me many times: i didn't have to come up with parameters
for the dd commands i used to use for this kind of task.
please note that t
i had a similar need some time back and used inferno's sh:
http://a-30.net/inferno/dis/split
arvindh
On 8 May 2010 18:35, Russ Cox wrote:
> bs=1474560
> cat $file | for(i in `{seq 0 `{ls -l $file | awk '{print int($6)/'$bs'}'}}) { dd -bs $bs -count 1 -of $file.$i }
that looks very plausible, but it doesn't actually work,
as dd doesn't coalesce short reads (it gets short
reads from the pipe)
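a sketch of one workaround (untested; assumes plan 9 dd's -iseek, which
seeks by records of the -bs size): let each dd read the file itself rather
than the pipe, seeking to block i --

bs=1474560
for(i in `{seq 0 `{ls -l $file | awk '{print int(($6-1)/'$bs')}'}}) {
	dd -if $file -of $file.$i -bs $bs -count 1 -iseek $i
}

which is more or less what the sledge function above does.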
bs=1474560
cat $file | for(i in `{seq 0 `{ls -l $file | awk '{print int($6)/'$bs'}'}}) { dd -bs $bs -count 1 -of $file.$i }
> How might I split a file into pieces specified by size?
dd(1)
cpue% dd -if /dev/zero -of /tmp/foo -bs 1 -count 1024
1024+0 records in
1024+0 records out
cpue% ls -l /tmp/foo
--rw-rw-r-- M 1106 fst fst 1024 May 7 22:37 /tmp/foo
cpue% dd -if /tmp/foo -of /tmp/foo1 -bs 1 -count 512
512+0 records in
512+0 records out
On Fri May 7 23:42:15 EDT 2010, yard-...@telus.net wrote:
> How might I split a file into pieces specified by size?
>
> split(1) lets me specify lines and regex contexts as delimiters, but what if
> I want a 40MB file split up into 1.44MB chunks, say?
in retrospect, this would be better in c.
How might I split a file into pieces specified by size?
split(1) lets me specify lines and regex contexts as delimiters, but what if I
want a 40MB file split up into 1.44MB chunks, say?