It's a problem with the traditional Smalltalk-80 implementation of Dictionaries.
Using my compiler and library,
1311 usec to build
 498 usec to delete (1..10000)
1318 usec to build
 472 usec to delete (10000..1)
(The time difference between building and deleting is mostly #printString.)
There is no defect in the Pharo compiler here; it's the hash table design
itself that is flawed if you want to delete entries.
My library uses separate chaining
(https://en.wikipedia.org/wiki/Hash_table#Separate_chaining)
which makes deletion simple and fast and allows 'null' keys.
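To see why chained deletion is cheap, here is a minimal sketch in Python
rather than Smalltalk (the class and method names are illustrative, not my
library's API): each bucket is an independent list, so removing an entry
touches only its own chain and never disturbs the rest of the table.

```python
# Minimal separate-chaining table (illustration only, not the library's code).
class ChainedDict:
    def __init__(self, n_buckets=16):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # hash(None) is defined, so a 'null' key works like any other key.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

    def remove(self, key):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                del bucket[i]  # O(chain length); no other slot is touched
                return
        raise KeyError(key)
```

Deleting 10000 keys is then just 10000 short list removals, in any order,
with no rehashing.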

Pharo uses open addressing, which makes deletion much harder: simply
clearing a slot would break every probe chain that runs through it, so the
table must either leave a tombstone behind or re-insert every entry in the
collision cluster after the hole.
It doesn't seem to improve retrieval performance either.
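For contrast, here is a linear-probing sketch in Python (again purely
illustrative; Pharo's Dictionary has its own probing and repair code) that
shows the cluster repair a delete forces on an open-addressed table:

```python
# Open-addressing table with linear probing (illustration only).
class OpenDict:
    def __init__(self, capacity=32):
        self.slots = [None] * capacity  # None = empty, else (key, value)

    def _probe(self, key):
        # Walk forward from the home slot until we hit the key or a hole.
        i = hash(key) % len(self.slots)
        while self.slots[i] is not None and self.slots[i][0] != key:
            i = (i + 1) % len(self.slots)
        return i

    def put(self, key, value):
        self.slots[self._probe(key)] = (key, value)

    def get(self, key):
        slot = self.slots[self._probe(key)]
        if slot is None:
            raise KeyError(key)
        return slot[1]

    def remove(self, key):
        i = self._probe(key)
        if self.slots[i] is None:
            raise KeyError(key)
        self.slots[i] = None
        # Repair the cluster: every entry between the hole and the next
        # empty slot must come out and go back in, or lookups that probed
        # past the removed key would stop early at the hole and fail.
        # This extra loop is the cost that separate chaining avoids.
        i = (i + 1) % len(self.slots)
        while self.slots[i] is not None:
            k, v = self.slots[i]
            self.slots[i] = None
            self.put(k, v)
            i = (i + 1) % len(self.slots)
```

Each `remove` pays for the whole cluster behind the deleted key, so deleting
every key of a well-filled table degenerates badly, which is exactly the
behaviour Pierre measured.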

On Mon, 3 Feb 2020 at 23:18, LABORDE Pierre
<[email protected]> wrote:
>
> Hi all,
>
> I have a problem with Dictionaries :
>
> dic := Dictionary new: 10000.
>
> 1 to: 10000 do: [ :i |
>     dic at: i put: i printString.
> ].
>
> 1 to: 10000 do: [ :i |
>     dic removeKey: i.
> ]
>
> Removing each key is very slooow; the execution-time difference between
> add and remove is crazy.
>
> Any idea for fixing that? Can I use another Dictionary implementation
> or settings?
>
> Thanks.
>
> Pierre
>
