On 01.08.2013 at 15:33, Henrik Johansen <henrik.s.johan...@veloxit.no> wrote:

> 
> On Jul 23, 2013, at 5:33 , Norbert Hartl <norb...@hartl.name> wrote:
> 
>> Max,
>> 
>> On 23.07.2013 at 17:27, Max Leske <maxle...@gmail.com> wrote:
>> 
>>> 
>>> On 23.07.2013, at 15:32, Mariano Martinez Peck <marianop...@gmail.com> 
>>> wrote:
>>> 
>>>> 
>>>> 
>>>> 
>>>> On Tue, Jul 23, 2013 at 9:48 AM, Norbert Hartl <norb...@hartl.name> wrote:
>>>>> Mariano,
>>>>> 
>>>>> On 23.07.2013 at 14:43, Mariano Martinez Peck 
>>>>> <marianop...@gmail.com> wrote:
>>>>> 
>>>>>> Norbert, does the model contain lots of strings? If so, you could try 
>>>>>> enabling compression; it may even increase speed. There is always a 
>>>>>> trade-off against the extra time spent compressing and decompressing, 
>>>>>> but if the model holds a fair number of strings, then, since writing to 
>>>>>> disk is also slow and the resulting stream would be smaller, it could 
>>>>>> end up faster. 
>>>>> Thanks. Yes, the model consists largely of Dictionaries with string 
>>>>> keys, and there are many more of them. The usage pattern is write once, 
>>>>> read many, and 30 ms is good enough for me. So I might try compression 
>>>>> just to see the difference, and maybe to reduce storage size: the mcz is 
>>>>> currently 261 kb. 
>>>>> 
>>>>> Thanks, I'll see, but I am already satisfied with the current approach. 
>>>> 
>>>> Mmmm, I thought it was ready; sorry, my bad. It seems it was neither 
>>>> ready nor integrated: https://code.google.com/p/fuel/issues/detail?id=160
>>>> But I might be wrong. 
>>>> Max?
>>> 
>>> No, that issue is still open. You're welcome to use the experimental 
>>> version though (Fuel-MaxLeske.757.mcz in 
>>> http://www.squeaksource.com/FuelExperiments) but keep in mind that this is 
>>> a few months behind stable.
>> I'm not sure the priority is high enough to try.
>> 
>>> The other idea however, that Mariano describes in the linked issue, is 
>>> available but probably not desirable: simply pass a compressing stream to 
>>> Fuel (see FLGZippedBasicSerializationTest for an example on how to do 
>>> that). That will compress *all* data, not only strings (which is the idea 
>>> outlined in the linked issue).
>> I don't think that will gain anything. A Monticello package is already 
>> written as a zip, so I wouldn't expect much from an additional compression 
>> pass.
>> 
>> thanks,
>> 
>> Norbert
> Unless MAPExampleModels tcap is something like
>       ^'mySerializedTCAP.bin' asFile readStream binary contents
> instead of 
> tcap
>       ^#( 0 0 0  … ), 
> compressing the Fuel binary output would save you some runtime memory 
> (method literal size) and some .changes space (method source), though for 
> test resources I guess that's a less relevant aspect than for other 
> resources.
> 
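[Editor's note: Henrik's trade-off can be sketched outside Smalltalk as well. This is a hypothetical Python illustration, with an invented fixture standing in for the serialized TCAP bytes: the literal is kept compressed at rest, and each access pays a small decompression cost, which is exactly the per-test-run penalty discussed below.]

```python
import zlib

# Hypothetical fixture: a repetitive binary blob standing in for a
# serialized TCAP message stored in a test-resource method.
RAW_TCAP = bytes(range(256)) * 100          # 25 600 bytes uncompressed
COMPRESSED_TCAP = zlib.compress(RAW_TCAP)   # stored form: much smaller

def tcap() -> bytes:
    """Decompress on every access: smaller resident 'method literal',
    but a little extra CPU each time a test uses the fixture."""
    return zlib.decompress(COMPRESSED_TCAP)

print(len(RAW_TCAP), len(COMPRESSED_TCAP))
```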
Right! The difference is in the image. I'll check that; it might come in handy 
once I have more of these methods. But then I'll need to measure, because every 
method is used multiple times, and keeping it compressed in the image will add 
some time to every test run. And then I'd be measuring penalties (again).

thanks,

Norbert
