On 25/07/23 19:58, Antonio wrote:
The 10^23 FLOPs estimate, as someone has worked out, is the total count of floating-point
operations needed to train GPT-3.
Found it :)
Here (p. 6) it is all explained: https://arxiv.org/pdf/2104.10350.pdf

Very interesting. Thanks for the time you invested, and for sharing what came out of it.

Bye,
DV

Number of processors: 10000
Processor type: NVIDIA V100
Per-unit power draw: 300 W
Average per-unit performance: 24.6 TFLOP/s
Training time: 14.8 days
Total compute: 3.14E+23 FLOPs (= 10000 * 24.6E+12 * 86400 * 14.8)
Total energy consumption: 1287 MWh
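For completeness, a quick Python sketch of the same arithmetic (values copied from the figures above; note that the energy line below is raw GPU draw only, so it lands under the 1287 MWh reported in the paper, which presumably also covers overhead beyond the GPUs themselves):

```python
# Back-of-the-envelope check of the GPT-3 training figures quoted above.
N_GPUS = 10_000          # NVIDIA V100s
TFLOPS_PER_GPU = 24.6    # average sustained throughput, TFLOP/s
DAYS = 14.8              # training time
WATTS_PER_GPU = 300      # per-GPU power draw

seconds = DAYS * 86_400
total_flops = N_GPUS * TFLOPS_PER_GPU * 1e12 * seconds
print(f"Total compute: {total_flops:.2e} FLOPs")      # ~3.1e+23

# GPU-only energy (1 MWh = 3.6e9 J); the 1287 MWh figure in the
# paper is larger, as it presumably includes more than GPU draw.
gpu_energy_mwh = N_GPUS * WATTS_PER_GPU * seconds / 3.6e9
print(f"GPU-only energy: {gpu_energy_mwh:.0f} MWh")   # ~1066
```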

Antonio
_______________________________________________
nexa mailing list
nexa@server-nexa.polito.it
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa

--
Damiano Verzulli
e-mail: dami...@verzulli.it
---
possible?ok:while(!possible){open_mindedness++}
---
"...I realized that free software would not generate the kind of
income that was needed. Maybe in USA or Europe, you may be able
to get a well paying job as a free software developer, but not
here [in Africa]..." -- Guido Sohne - 1973-2008
   http://ole.kenic.or.ke/pipermail/skunkworks/2008-April/005989.html


