My statement may seem illogical when evaluating and comparing the languages as 
a whole.

I thought that when I give a small number to the program, the more decimal 
digits I can see after the dot in the output, the more human-readable it is.

When I see a bunch of numbers with 'e' in them, I know the numbers are small, 
but it is hard to compare them to other 'e' numbers, especially by eye.

I don't know much about Scala, actually. I just tried giving it 0.0001 and it 
returned a representation with an 'e', whereas Python takes 0.0001 and gives 
back 0.0001. That made me think Python handles this particular thing better.
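
For example, at the Python prompt (a quick sketch; the exact repr can vary a 
little between versions, this is standard CPython 3 behaviour):

    >>> 0.0001      # four decimal places: plain notation
    0.0001
    >>> 0.00001     # five decimal places: repr switches to 'e'
    1e-05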

However, Python too starts to give an 'e' number once the input has 5 decimal 
places. There may well be systems around that handle this better, but the 
other things I can achieve in Python outweigh this particular disadvantage.
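
That said, the default repr isn't the only option; Python's standard format 
specifiers can force fixed-point output when you want it. A minimal sketch:

    x = 0.00001
    print(x)           # 1e-05    (default str/repr)
    print(f"{x:f}")    # 0.000010 (fixed-point, default 6 decimal places)
    print(f"{x:.5f}")  # 0.00001  (fixed-point, 5 decimal places)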