> ...the "The Go Programming Language" book. My
> code is based on this example, but whereas du4 briefly peaks at 70-80% CPU
> my own code never exceeds 50%. So I guess the problem is with my code:-(
>
> On Monday, February 27, 2017 at 6:59:11 PM UTC, Guillermo Estrada wrote:
>>
>> Are you sure your virtual machine has 4 cores assigned?
Hi, first I would avoid using 2D arrays for storing pixels and such; that's
not even a Go thing (the Go standard library itself uses a 1D array for
storing the pixels of images). Even in C or C++ a 1D array is recommended:
it is faster, to say the least, and more efficient.
img[x][y] becomes img[y*width+x]
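
For what it's worth, a minimal sketch of that flat-indexing idea (width,
height, and the pix slice here are just illustrative; the standard
library's image.RGBA likewise keeps its pixels in a flat Pix slice
addressed through a Stride):

package main

import "fmt"

func main() {
	const width, height = 4, 3 // illustrative dimensions
	// One flat slice instead of a slice of slices: a single allocation,
	// contiguous memory, friendlier to the cache.
	pix := make([]uint8, width*height)

	// img[x][y] becomes pix[y*width+x]
	x, y := 2, 1
	pix[y*width+x] = 255

	fmt.Println(pix)
}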
Are you sure your virtual machine has 4 cores assigned? You might have 4
cores, but when you create the VM you can assign any number of cores to it
(1 being the default, IIRC).
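
If it helps, a tiny sketch for checking what the Go runtime actually sees
inside the VM (both calls are standard library; runtime.GOMAXPROCS(0) just
queries the current value without changing it):

package main

import (
	"fmt"
	"runtime"
)

func main() {
	// NumCPU reports the logical CPUs visible to the process, i.e.
	// whatever the VM actually exposes to the guest OS.
	fmt.Println("NumCPU:    ", runtime.NumCPU())
	fmt.Println("GOMAXPROCS:", runtime.GOMAXPROCS(0))
}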
On Monday, February 27, 2017 at 12:39:31 PM UTC-6, Mark wrote:
>
> I just tried this on an old Windows 7 32-bit machine which
Thanks! I guess I missed them because the byte-order variables are at the
very top and the ByteOrder interface is last. I knew there had to be a way
and that I was missing something.
On Monday, February 27, 2017 at 12:51:08 PM UTC-6, howar...@gmail.com wrote:
>
> Look a little further down - they
On Monday, February 27, 2017 at 12:48:53 PM UTC-6, howar...@gmail.com wrote:
>
> You are missing that it is an encoding. Specifically, using VarInt is
> asking for variable-length integer encoding:
> https://developers.google.com/protocol-buffers/docs/encoding#varints
>
> The important bit here is that the most significant bit of each byte
> signals whether more bytes follow, so small values need fewer bytes.
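
To make the variable-length behavior concrete, here is a small sketch of
encoding/binary's PutVarint at work (the sample values are arbitrary):

package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	buf := make([]byte, binary.MaxVarintLen64)
	// Small magnitudes encode in fewer bytes, large ones in more;
	// PutVarint returns how many bytes were actually written.
	for _, v := range []int64{1, 300, 1 << 40} {
		n := binary.PutVarint(buf, v)
		fmt.Printf("%14d -> %d byte(s): % x\n", v, n, buf[:n])
	}
}

Note that PutVarint zig-zag encodes signed values, so small negative
numbers stay short as well.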
, b2, "->", i2)
}
On Monday, February 27, 2017 at 12:44:56 PM UTC-6, Guillermo Estrada wrote:
>
> On Monday, February 27, 2017 at 12:41:43 PM UTC-6, zeebo wrote:
>>
>> You're using the variable-width encoding. The number of bytes of a
>> variable-width encoded int64 will depend on the magnitude of the value.
On Monday, February 27, 2017 at 12:41:43 PM UTC-6, zeebo wrote:
>
> You're using the variable-width encoding. The number of bytes of a
> variable-width encoded int64 will depend on the magnitude of the value. If
> you use binary.BigEndian or binary.LittleEndian you can use the
> *PutUint64* method.
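
For comparison, a fixed-width sketch along the lines suggested above, using
binary.BigEndian.PutUint64 (the value 42 is arbitrary; LittleEndian works
the same way):

package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	v := int64(42) // arbitrary sample value
	buf := make([]byte, 8)

	// Fixed-width encoding: always exactly 8 bytes, whatever the magnitude.
	binary.BigEndian.PutUint64(buf, uint64(v))
	fmt.Printf("% x\n", buf)

	// Round-trip back to int64.
	fmt.Println(int64(binary.BigEndian.Uint64(buf)))
}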
Hey Gophers! I'm having a bit of trouble understanding something about the
standard library. I'm pretty sure either it is not wrong or there is a
reason behind it, but either way I don't understand which. As the title
suggests, I'm using encoding/binary to write an int64 into a byte slice, but