On 4/11/18 9:29 PM, cuddlycave...@gmail.com wrote:
I’m replying to your post on January 28th
Nice carefully chosen non-random numbers, Steven D'Aprano.
Was just doing what you asked, but you don’t remember 😂😂😂
Best practice is to include a quote of the thing you are replying to.
It makes it m
I’m replying to your post on January 28th
Nice carefully chosen non-random numbers, Steven D'Aprano.
Was just doing what you asked, but you don’t remember 😂😂😂
On Tue, 10 Apr 2018 23:36:27 -0700, cuddlycaveman wrote:
[snip a number of carefully chosen, non-random numbers shown in binary]
> Don’t know if that helps
Helps what?
With no context, we don't know who you are replying to, what they asked,
or why you think this is helpful.
According to my a
387420479
00110011 00111000 00110111 00110100 00110010 00110000 00110100 00110111 00111001
72 bits
Equal to
(9^9)-10
00101000 00111001 01011110 00111001 00101001 00101101 00110001 00110000
64 bits
387420499
00110011 00111000 00110111 00110100 00110010 00110000 00110100 00111001 00111001
72 bits
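For reference, a short Python sketch (mine, not from the thread) that reproduces the bit counts above by writing each character as an 8-bit ASCII code:

for text in ("387420479", "(9^9)-10", "387420499"):
    # Each character becomes one 8-bit ASCII byte, hence 9 chars -> 72 bits.
    bits = " ".join(format(ord(ch), "08b") for ch in text)
    print(text, "->", len(text) * 8, "bits")
    print(bits)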
On Sat, 27 Jan 2018 21:26:06 -0800 (PST), pendrysamm...@gmail.com wrote:
> If it is then show him this
>
> 387,420,489
> =
> 00110011 00111000 00110111 00101100 00110100 00110010 00110000 ...
To save the casual reader a moment of disorientation, the
above binary string is just the ASCII represent
On 2018-01-28, pendrysamm...@gmail.com wrote:
> I have it in my head, just need someone to write the program for me,
> I know nothing about data compression or binary data other than 1s
> and 0s and that you cannot take 2 numbers without a possible value
> more or less than themselves and compre
On Sat, 27 Jan 2018 21:50:24 -0800, pendrysammuel wrote:
> 387,420,489 is a number with only 2 repeating binary sequences
Okay. Now try these two numbers:
387420479
387420499
--
Steve
On Sat, 27 Jan 2018 22:14:46 -0800, pendrysammuel wrote:
> I have it in my head, just need someone to write the program for me,
Sure, my rate is $150 an hour.
> I
> know nothing about data compression or binary data other than 1s and 0s
> and that you cannot take 2 numbers without a possible v
Lawrence D’Oliveiro
In other words yes, I just need to be sober first.
I have it in my head, just need someone to write the program for me, I know
nothing about data compression or binary data other than 1s and 0s and that you
cannot take 2 numbers without a possible value more or less than themselves
and compress them, I have been working for 1 1/2 years on a sol
387,420,489 is a number with only 2 repeating binary sequences
In binary 387,420,489 is expressed as 00110011 00111000 00110111 00101100
00110100 00110010 00110000 00101100 00110100 00111000 00111001
387,420,489 can be simplified to 9^9, or nine to the power of nine
In binary 9^9 is represented
On Sun, Jan 28, 2018 at 4:26 PM, wrote:
> If it is then show him this
>
> 387,420,489
> =
> 00110011 00111000 00110111 00101100 00110100 00110010 00110000 00101100
> 00110100 00111000 00111001
>
> 9^9 = ⬇️ (^ = to the power of)
> = 387,420,489
>
> But
>
> 9^9
> =
> 00111001 01011110 00111001
I
Gregory Ewing writes:
> Ben Bacarisse wrote:
>> But that has to be about the process that gives rise to the data, not
>> the data themselves.
>
>> If I say: "here is some random data..." you can't tell if it is or is
>> not from a random source. I can, as a parlour trick, compress and
>> recover
On Sun, 29 Oct 2017 01:56 pm, Stefan Ram wrote:
> If the entropy of an individual message is not defined,
> then it is still available to be defined. I define it
> to be log2(1/p), where p is the probability of this
> message. I also choose a unit for it, which I call "bit".
That is exact
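For concreteness, a minimal Python sketch (mine) of the definition quoted above, self-information = log2(1/p):

import math

def self_information(p):
    # Information content, in bits, of a message with probability p.
    return math.log2(1.0 / p)

print(self_information(0.5))   # 1.0 bit
print(self_information(1/8))   # 3.0 bits
print(self_information(1.0))   # 0.0 bits -- a certain message carries no information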
On Sun, 29 Oct 2017 06:03 pm, Chris Angelico wrote:
> On Sun, Oct 29, 2017 at 6:00 PM, Ian Kelly wrote:
>> On Oct 28, 2017 5:53 PM, "Chris Angelico" wrote:
>>> One bit. It might send the message, or it might NOT send the message.
>>
>> Not sending the message is equivalent to having a second pos
On Sun, 29 Oct 2017 02:31 pm, Gregory Ewing wrote:
> Steve D'Aprano wrote:
>> I don't think that's right. The entropy of a single message is a
>> well-defined quantity, formally called the self-information.
>>
>> https://en.wikipedia.org/wiki/Self-information
>
> True, but it still depends on kn
Chris Angelico wrote:
One bit. It might send the message, or it might NOT send the message.
The entropy formula assumes that you are definitely
going to send one of the possible messages. If not
sending a message is a possibility, then you need
to include an empty message in the set of messages
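A minimal sketch (mine) of that point: once "send nothing" counts as a second, equally likely message, the Shannon entropy of the source is exactly one bit:

import math

def entropy(probs):
    # Shannon entropy H = -sum(p * log2(p)) over all possible messages.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: "message" or "no message", equally likely
print(entropy([1.0]))       # 0.0 bits: only one possible message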
On Sun, Oct 29, 2017 at 6:00 PM, Ian Kelly wrote:
> On Oct 28, 2017 5:53 PM, "Chris Angelico" wrote:
>> One bit. It might send the message, or it might NOT send the message.
>
> Not sending the message is equivalent to having a second possible message.
Okay, now we're getting seriously existenti
On Oct 28, 2017 5:53 PM, "Chris Angelico" wrote:
> One bit. It might send the message, or it might NOT send the message.
Not sending the message is equivalent to having a second possible message.
On Sun, Oct 29, 2017 at 2:08 PM, Gregory Ewing
wrote:
> Stefan Ram wrote:
>>
>> Well, then one can ask about the entropy of a data source
>> that only is emitting this message.
>
>
> You can, but it's still the *source* that has the entropy,
> not the message.
>
> (And the answer in that case
Steve D'Aprano wrote:
I don't think that's right. The entropy of a single message is a well-defined
quantity, formally called the self-information.
https://en.wikipedia.org/wiki/Self-information
True, but it still depends on knowing (or assuming) the
probability of getting that particular me
Stefan Ram wrote:
Well, then one can ask about the entropy of a data source
that only is emitting this message.
You can, but it's still the *source* that has the entropy,
not the message.
(And the answer in that case is that the entropy is zero.
If there's only one possible message you can
On Sun, Oct 29, 2017 at 1:32 PM, Chris Angelico wrote:
> On Sun, Oct 29, 2017 at 1:18 PM, Gregory Ewing
> wrote:
>> You're missing something fundamental about what
>> entropy is in information theory.
>>
>> It's meaningless to talk about the entropy of a single
>> message. Entropy is a function o
On Sun, Oct 29, 2017 at 1:18 PM, Gregory Ewing
wrote:
> You're missing something fundamental about what
> entropy is in information theory.
>
> It's meaningless to talk about the entropy of a single
> message. Entropy is a function of the probability
> distribution of *all* the messages you might
Steve D'Aprano wrote:
Random data = any set of data generated by "a source of random".
Any set of data generated by Grant Thompson?
https://www.youtube.com/user/01032010814
--
Greg
danceswithnumb...@gmail.com wrote:
10101011
This equals
61611
This can be represented using
0-6 log2(7)*5= 14.0367746103 bits
11010101
This equals
54543
This can be represented using
0-5 log2(6)*5= 12.9248125036 bits
You're missing something fundamental about what
entropy is
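Checking the arithmetic quoted above (a sketch, not part of the original post): five symbols drawn from alphabets of 7 and 6 symbols respectively:

import math

print(math.log2(7) * 5)   # 14.0367746103...
print(math.log2(6) * 5)   # 12.9248125036...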
Ben Bacarisse wrote:
But that has to be about the process that gives rise to the data, not
the data themselves.
If I say: "here is some random data..." you can't tell if it is or is
not from a random source. I can, as a parlour trick, compress and
recover this "random data" because I chose it
On Oct 28, 2017 10:30 AM, "Stefan Ram" wrote:
> Well, then one can ask about the entropy of a data source
> that only is emitting this message. (If it needs to be endless:
> that only is emitting this message repeatedly.)
If there is only one possible message then the entropy is zero.
-1.0 * l
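A one-line sketch (mine, assuming the truncated expression above continues along these lines) of the zero-entropy case:

import math

p = 1.0                   # probability of the only possible message
print(-p * math.log2(p))  # -0.0 -- a one-message source has zero entropy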
On Sun, 29 Oct 2017 07:03 am, Peter Pearson wrote:
> On Thu, 26 Oct 2017 19:26:11 -0600, Ian Kelly wrote:
>>
>> . . . Shannon entropy is correctly calculated for a data source,
>> not an individual message . . .
>
> Thank you; I was about to make the same observation. When
> people talk about t
On Thu, 26 Oct 2017 19:26:11 -0600, Ian Kelly wrote:
>
> . . . Shannon entropy is correctly calculated for a data source,
> not an individual message . . .
Thank you; I was about to make the same observation. When
people talk about the entropy of a particular message, you
can bet they're headed
Steve D'Aprano writes:
> On Fri, 27 Oct 2017 09:53 am, Ben Bacarisse wrote:
>
>> A source of random can be defined but "random data" is much more
>> elusive.
>
> Random data = any set of data generated by "a source of random".
(I had an editing error there; it should be "a source of random data
On Fri, 27 Oct 2017 09:53 am, Ben Bacarisse wrote:
> A source of random can be defined but "random data" is much more
> elusive.
Random data = any set of data generated by "a source of random".
--
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, thing
On Thu, Oct 26, 2017 at 8:48 PM, wrote:
> Shouldn't that be?
>
> py> 16 * (-7/16 * math.log2(7/16) - 6/16 * math.log2(6/16)) =
No, that's failing to account for 3/16 of the probability space.
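Evaluating the posted expression (a sketch, not from the thread) illustrates the point: the two probabilities cover only 13/16 of the space, so the result is not an average of the earlier figures:

import math

partial = 16 * (-7/16 * math.log2(7/16) - 6/16 * math.log2(6/16))
print(partial)      # ~16.84, larger than both 14.04 and 12.92
print(7/16 + 6/16)  # 0.8125 -- the probabilities don't sum to 1; 3/16 is missing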
On Thu, Oct 26, 2017 at 8:19 PM, wrote:
> It looks like that averages my two examples.
I don't know how you can look at two numbers and then look at a third
number that is larger than both of them and conclude it is the
average.
> H by the way that equation is really cool... why does it ret
Marko Rauhamaa writes:
> Ben Bacarisse :
>
>>> In this context, "random data" really means "uniformly distributed
>>> data", i.e. any bit sequence is equally likely to be presented as
>>> input. *That's* what information theory says can't be compressed.
>>
>> But that has to be about the process
Ben Bacarisse :
>> In this context, "random data" really means "uniformly distributed
>> data", i.e. any bit sequence is equally likely to be presented as
>> input. *That's* what information theory says can't be compressed.
>
> But that has to be about the process that gives rise to the data, not
Shouldn't that be?
py> 16 * (-7/16 * math.log2(7/16) - 6/16 * math.log2(6/16)) =
It looks like that averages my two examples. H by the way that equation is
really cool... why does it return a high bit count when compared to >>>dec to
bin?
On Thu, Oct 26, 2017 at 2:38 PM, wrote:
>
> Thomas Jollans
>
> On 2017-10-25 23:22, danceswi...@gmail.com wrote:
>> With every transform the entropy changes,
>
> That's only true if the "transform" loses or adds information.
>
> If it loses information, that's lossy compression, which is only use
Gregory Ewing writes:
> Ben Bacarisse wrote:
>> The trouble is a pedagogic one. Saying "you can't compress random data"
>> inevitably leads (though, again, this is just my experience) to endless
>> attempts to define random data.
>
> It's more about using terms without making sure everyone agree
Thomas Jollans
On 2017-10-25 23:22, danceswi...@gmail.com wrote:
> With every transform the entropy changes,
That's only true if the "transform" loses or adds information.
If it loses information, that's lossy compression, which is only useful
in very specific (but also extremely common) ci
On 2017-10-24 22:30, Steve D'Aprano wrote:
> On Wed, 25 Oct 2017 07:09 am, Peter J. Holzer wrote:
>
>> On 2017-10-23 04:21, Steve D'Aprano wrote:
>>> On Mon, 23 Oct 2017 02:29 pm, Stefan Ram wrote:
>>> If the probability of certain codes (either single codes, or sequences of
>>> codes) are n
On 2017-10-25 23:22, danceswithnumb...@gmail.com wrote:
> With every transform the entropy changes,
That's only true if the "transform" loses or adds information.
If it loses information, that's lossy compression, which is only useful
in very specific (but also extremely common) circumstances.
I
On Thu, 26 Oct 2017 08:22 am, danceswithnumb...@gmail.com wrote:
> with each pass you can compress until the entropy is so random it can no
> longer be compressed.
Which is another way of saying that you cannot compress random binary data.
--
Steve
“Cheer up,” they said, “things could be wo
So if the theoretical min compression limit (log2(n)*(x)) has a 3% margin but
your transform has a less than 3% inflate rate at most then there is room for
the transform to compress below the theoretical min. With every transform the
entropy changes, the potential for greater compression also ch
Whatever you do, you'll find that *on average* you
will need *at least* 34 bits to be able to represent
all possible 10-digit decimal numbers. Some might
be shorter, but then others will be longer, and
the average won't be less than 34.
The theoretical limit for arbitrary numbers 0 - 9 must
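A quick check of the 34-bit figure (my sketch): there are 10**10 possible 10-digit decimal values:

import math

print(math.log2(10**10))             # 33.219... bits of information
print(math.ceil(math.log2(10**10)))  # 34 whole bits to cover every value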
On 10/24/17, Richard Damon wrote:
> My understanding of the 'Random Data Comprehensibility' challenge is
> that is requires that the compression take ANY/ALL strings of up to N
> bits, and generate an output stream no longer than the input stream, and
> sometime less.
That's incorrect, at least o
Lele Gaifax wrote:
That's simple enough: of course one empty file would be
"music.mp3.zip.zip.zip", while the other would be
"movie.avi.zip.zip.zip.zip.zip"... some sort of
https://en.wikipedia.org/wiki/Water_memory applied to file system entries :-)
If you're allowed to alternate between two c
Ben Bacarisse wrote:
The trouble is a pedagogic one. Saying "you can't compress random data"
inevitably leads (though, again, this is just my experience) to endless
attempts to define random data.
It's more about using terms without making sure everyone agrees
on the definitions being used.
I
Steve D'Aprano wrote:
- Encrypted data looks very much like random noise.
There's actually a practical use for that idea. If you can feed
the output of an encryption algorithm through a compressor and
make it smaller, it means there is a cryptographic weakness
in the algorithm that could potent
On 10/24/17 6:30 PM, Steve D'Aprano wrote:
On Wed, 25 Oct 2017 07:09 am, Peter J. Holzer wrote:
On 2017-10-23 04:21, Steve D'Aprano wrote:
On Mon, 23 Oct 2017 02:29 pm, Stefan Ram wrote:
If the probability of certain codes (either single codes, or sequences of
codes) are non-equal, then yo
On Wed, Oct 25, 2017 at 9:11 AM, Steve D'Aprano
wrote:
> On Wed, 25 Oct 2017 02:40 am, Lele Gaifax wrote:
>
>> Steve D'Aprano writes:
>>
>>> But given an empty file, how do you distinguish the empty file you get
>>> from 'music.mp3' and the identical empty file you get from 'movie.avi'?
>>
>> Tha
On Wed, 25 Oct 2017 07:09 am, Peter J. Holzer wrote:
> On 2017-10-23 04:21, Steve D'Aprano wrote:
>> On Mon, 23 Oct 2017 02:29 pm, Stefan Ram wrote:
>>>
>> If the probability of certain codes (either single codes, or sequences of
>> codes) are non-equal, then you can take advantage of that by enc
On Wed, 25 Oct 2017 02:40 am, Lele Gaifax wrote:
> Steve D'Aprano writes:
>
>> But given an empty file, how do you distinguish the empty file you get
>> from 'music.mp3' and the identical empty file you get from 'movie.avi'?
>
> That's simple enough: of course one empty file would be
> "music.m
On Tue, Oct 24, 2017 at 12:20 AM, Gregory Ewing
wrote:
> danceswithnumb...@gmail.com wrote:
>>
>> I did that quite a while ago. 352,954 kb.
>
>
> Are you sure? Does that include the size of all the
> code, lookup tables, etc. needed to decompress it?
My bet is that danceswithnumbers does indeed h
On Tue, 24 Oct 2017 14:51:37 +1100, Steve D'Aprano wrote:
On Tue, 24 Oct 2017 01:27 pm, danceswithnumb...@gmail.com wrote:
> Yes! Decode reverse is easy... sorry, so excited I could shout.
Then this should be easy for you:
http://marknelson.us/2012/10/09/the-random-compression-challenge-
On 2017-10-23 04:21, Steve D'Aprano wrote:
> On Mon, 23 Oct 2017 02:29 pm, Stefan Ram wrote:
>>
> If the probability of certain codes (either single codes, or sequences of
> codes) are non-equal, then you can take advantage of that by encoding the
> common cases into a short representation, and th
On 24/10/2017 16:40, Lele Gaifax wrote:
Steve D'Aprano writes:
But given an empty file, how do you distinguish the empty file you get
from 'music.mp3' and the identical empty file you get from 'movie.avi'?
That's simple enough: of course one empty file would be
"music.mp3.zip.zip.zip", while
Steve D'Aprano writes:
> But given an empty file, how do you distinguish the empty file you get
> from 'music.mp3' and the identical empty file you get from 'movie.avi'?
That's simple enough: of course one empty file would be
"music.mp3.zip.zip.zip", while the other would be
"movie.avi.zip.zip.z
Steve D'Aprano writes:
> On Tue, 24 Oct 2017 06:46 pm, danceswithnumb...@gmail.com wrote:
>
>> Greg, you're very smart, but you are missing a big key. I'm not padding,
>> you are still thinking inside the box, and will never solve this by doing
>> so. Yes! At least you see my accomplishment, thi
Steve D'Aprano writes:
> On Tue, 24 Oct 2017 09:23 pm, Ben Bacarisse wrote:
>
>> Forget random data. For one thing it's hard to define,
>
> That bit is true.
>
>> but more importantly no one cares about it.
>
> But that's wrong.
All generalisations are false. I was being hyperbolic.
> For in
On Tue, 24 Oct 2017 06:46 pm, danceswithnumb...@gmail.com wrote:
> Greg, you're very smart, but you are missing a big key. I'm not padding,
> you are still thinking inside the box, and will never solve this by doing
> so. Yes! At least you see my accomplishment, this will compress any random
> fi
On Tue, 24 Oct 2017 09:23 pm, Ben Bacarisse wrote:
> Forget random data. For one thing it's hard to define,
That bit is true.
> but more importantly no one cares about it.
But that's wrong.
For instance:
- Encrypted data looks very much like random noise. With more and more data
traversin
On 24 October 2017 at 12:04, Ben Bacarisse wrote:
> Paul Moore writes:
>
>> On 24 October 2017 at 11:23, Ben Bacarisse wrote:
>>> For example, run the complete works of Shakespeare through your program.
>>> The result is very much not random data, but that's the sort of data
>>> people want to c
Paul Moore writes:
> On 24 October 2017 at 11:23, Ben Bacarisse wrote:
>> For example, run the complete works of Shakespeare through your program.
>> The result is very much not random data, but that's the sort of data
>> people want to compress. If you can compress the output of your
>> compre
On 24 October 2017 at 11:23, Ben Bacarisse wrote:
> For example, run the complete works of Shakespeare through your program.
> The result is very much not random data, but that's the sort of data
> people want to compress. If you can compress the output of your
> compressor you have made a good s
On Tue, 24 Oct 2017 05:20 pm, Gregory Ewing wrote:
> danceswithnumb...@gmail.com wrote:
>> I did that quite a while ago. 352,954 kb.
>
> Are you sure? Does that include the size of all the
> code, lookup tables, etc. needed to decompress it?
>
> But even if you have, you haven't disproved the th
danceswithnumb...@gmail.com writes:
> Finally figured out how to turn this into a random binary compression
> program. Since my transform can compress more than dec to binary. Then
> i took a random binary stream,
Forget random data. For one thing it's hard to define, but more
importantly no one
On 24 October 2017 at 09:43, Gregory Ewing wrote:
> Paul Moore wrote:
>>
>> But that's not "compression", that's simply using a better encoding.
>> In the technical sense, "compression" is about looking at redundancies
>> that go beyond the case of how effectively you pack data into the
>> bytes a
Paul Moore wrote:
But that's not "compression", that's simply using a better encoding.
In the technical sense, "compression" is about looking at redundancies
that go beyond the case of how effectively you pack data into the
bytes available.
There may be a difference in the way the terms are use
danceswithnumb...@gmail.com wrote:
My 8 year old can decode this back into base 10,
Keep in mind that your 8 year old has more information
than just the 32 bits you wrote down -- he can also
see that there *are* 32 bits and no more. That's
hidden information that you're not counting.
--
Greg
No leading zeroes are being dropped off... wish this board had an edit button.
Am 23.10.17 um 12:13 schrieb Marko Rauhamaa:
Thomas Jollans :
On 2017-10-23 11:32, danceswithnumb...@gmail.com wrote:
According to this website. This is an uncompressable stream.
https://en.m.wikipedia.org/wiki/Incompressible_string
12344321
No, it's not. According to that article,
Greg, you're very smart, but you are missing a big key. I'm not padding, you
are still thinking inside the box, and will never solve this by doing so. Yes!
At least you see my accomplishment, this will compress any random file.
danceswithnumb...@gmail.com wrote:
Compress this:
4135124325
Bin to dec...still very large
11110110 01111000 11111101 01100101
Wait right there! You're cheating by dropping off leading
0 bits. The maximum value of a 10 digit decimal number is
9,999,999,999, which in hex is 2540be3ff. That's 34
Gregory Ewing :
> What you *can't* do is compress 16 random decimal digits to less than
> 6.64 bytes.
More precisely:
Regardless of the compression scheme, the probability of shortening
the next bit sequence is less than 0.5 if the bits are distributed
evenly, randomly and independently
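The usual counting argument behind that claim, sketched in Python (my framing, not Marko's exact wording): there are more n-bit strings than there are strictly shorter strings, so no lossless scheme can shorten them all:

n = 16
print(2**n)                         # 65536 distinct n-bit inputs
print(sum(2**k for k in range(n)))  # 65535 possible outputs shorter than n bits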
danceswithnumb...@gmail.com wrote:
I did that quite a while ago. 352,954 kb.
Are you sure? Does that include the size of all the
code, lookup tables, etc. needed to decompress it?
But even if you have, you haven't disproved the theorem about
compressing random data. All you have is a program t
danceswithnumb...@gmail.com wrote:
12344321
It only takes seven 8 bit bytes to represent this
This is not surprising. The theoretical minimum size
for 16 arbitrary decimal digits is:
log2(10) * 16 = 53.15 bits = 6.64 bytes
I think you misunderstand what is meant by the phrase
"random
On Tue, Oct 24, 2017 at 2:28 AM, Paul Moore wrote:
> Hope this helps put the subject into context. Compression is a very
> technical subject, to "do it right". Special cases can be worked out,
> sure, but the "hidden assumptions" in a method are what make the
> difference between a "compression al
On Tue, 24 Oct 2017 03:13 pm, danceswithnumb...@gmail.com wrote:
> I did that quite a while ago. 352,954 kb.
Sure you did. Let's see the code you used.
--
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.
I did that quite a while ago. 352,954 kb.
On Tue, 24 Oct 2017 01:27 pm, danceswithnumb...@gmail.com wrote:
> Finally figured out how to turn this into a random binary compression
> program. Since my transform can compress more than dec to binary. Then I
> took a random binary stream, changed it to a decimal stream 0-9, transformed
> it into
Finally figured out how to turn this into a random binary compression program.
Since my transform can compress more than dec to binary. Then I took a random
binary stream, changed it to a decimal stream 0-9, transformed it into a
compressed/encrypted binary stream 23.7% smaller. Yes! Decode revers
On Mon, Oct 23, 2017 at 1:42 PM, wrote:
> Wow, do programmers actually use ZSCII? That is huge. So much wasted space.
Not really. ZSCII is only relevant if you're writing Z-code or a
Z-code interpreter. Those in turn are only relevant if you're writing
Infocom games.
Wow, do programmers actually use ZSCII? That is huge. So much wasted space.
On 2017-10-23, alister via Python-list wrote:
>> 12344321
>>
>> It only takes seven 8 bit bytes to represent this
>
> Would you care to provide the seven 8-bit bytes you propose to use?
> Paul
I would suspect he is using BCD & storing 2 values in each
>>>
Good point
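A sketch of the BCD-style packing alister suggests (my illustration; the byte counts mentioned elsewhere in the thread suggest the original string had more digits than the eight shown here):

digits = "12344321"
# Pack two decimal digits into each byte: high nibble, then low nibble.
packed = bytes(int(digits[i]) << 4 | int(digits[i + 1])
               for i in range(0, len(digits), 2))
print(packed.hex())   # '12344321' -- two decimal digits per byte
print(len(packed))    # 4 bytes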
I hope it has a use, other than a cute toy... I don't see it yet.
On 2017-10-23 17:39, danceswithnumb...@gmail.com wrote:
> Thanks Paul...blunt to the point.
>
> My 8 year old can decode this back into base 10, I still have to help him a
> bit going from base 10 to 8 bit bytes... it's incredibly simple to decode. No
> dictionary, can easily be done with pencil
On 2017-10-23 07:39, Steve D'Aprano wrote:
> By the way: here is a very clever trick for hiding information in the file
> system:
>
> http://www.patrickcraig.co.uk/other/compression.php
>
>
> but as people point out, the information in the file, plus the information in
> the file system, ends up
Thanks Paul...blunt to the point.
My 8 year old can decode this back into base 10, I still have to help him a bit
going from base 10 to 8 bit bytes... it's incredibly simple to decode. No
dictionary, can easily be done with pencil and paper, does not rely on
redundancies.
Jon Hutton
Just trying to find a practical application for this alg. Not real useful as it
stands now.
Jon Hutton
On 23 October 2017 at 15:29, wrote:
> I'm really not trolling, and even though some are sarcastic I am learning
> from your comments.
I'm willing to believe that, but if you're trying to claim you have
"compressed" data (in a way that satisfies the technical,
information-theoretic meaning of th
I'm really not trolling, and even though some are sarcastic I am learning from
your comments. Dec to bin is not bad at removing wasted space, but there is a
better way. Here is an example. How would you compress these numbers. If you
look for redundancy and then code to a bulky dictionary or cha
On Mon, 23 Oct 2017 13:40:59 +, Neil Cerutti wrote:
> On 2017-10-23, Chris Angelico wrote:
>> On Mon, Oct 23, 2017 at 11:18 PM, alister via Python-list
>> wrote:
>>> On Mon, 23 Oct 2017 10:41:55 +0100, Paul Moore wrote:
>>>
On 23 October 2017 at 10:32,
wrote:
> According to th
On 2017-10-23, Chris Angelico wrote:
> On Mon, Oct 23, 2017 at 11:18 PM, alister via Python-list
> wrote:
>> On Mon, 23 Oct 2017 10:41:55 +0100, Paul Moore wrote:
>>
>>> On 23 October 2017 at 10:32,
>>> wrote:
According to this website. This is an uncompressable stream.
https://en
On Mon, Oct 23, 2017 at 11:18 PM, alister via Python-list
wrote:
> On Mon, 23 Oct 2017 10:41:55 +0100, Paul Moore wrote:
>
>> On 23 October 2017 at 10:32, wrote:
>>> According to this website. This is an uncompressable stream.
>>>
>>> https://en.m.wikipedia.org/wiki/Incompressible_string
>>>
>>>
On Mon, 23 Oct 2017 10:41:55 +0100, Paul Moore wrote:
> On 23 October 2017 at 10:32, wrote:
>> According to this website. This is an uncompressable stream.
>>
>> https://en.m.wikipedia.org/wiki/Incompressible_string
>>
>> 12344321
>>
>> It only takes seven 8 bit bytes to represent this
>
danceswithnumb...@gmail.com writes:
> ... First let me clarify before you lump this in with
> perpetual motion, or cold fusion. It is a mapping solution to compress
> ANY i repeat ANY random file with numbers of only 0 - 9 such as are in
> the million rand numbers page. Entirely possible.
Of cour
Thomas Jollans :
> On 2017-10-23 11:32, danceswithnumb...@gmail.com wrote:
>> According to this website. This is an uncompressable stream.
>>
>> https://en.m.wikipedia.org/wiki/Incompressible_string
>>
>> 12344321
>
> No, it's not. According to that article, that string is incompressible
I really do not think this has a value besides being a trinket or cute toy.
Like I said, I cannot see how it can be adapted to work as a rand binary
compression alg...it only works with 0-9 in any seq. It's taken me six years to
solve, but so what.
Jon Hutton
danceswithnumb...@gmail.com