Just for some additional perspective: I have also tried this on some general chemistry word problems. In general, it gets basic single-step processes correct (e.g., a single-step dilution or a grams -> moles conversion). It does poorly on problems with multiple steps or ones that require understanding the physical situation. That said, I think it does better than some of my weakest students. It does not seem to be able to use significant figures in computations (also a problem for my weaker students).
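(Since this is the SymPy list, here is a rough sketch of the kind of one-step grams -> moles calculation I mean, done with sympy.physics.units. The numbers are just an illustration I made up, not from any ChatGPT transcript.)

    from sympy import Rational
    from sympy.physics.units import gram, mole, convert_to

    # Example: how many moles are in 18.0 g of water (molar mass ~18.015 g/mol)?
    mass = Rational("18.0") * gram
    molar_mass = Rational("18.015") * gram / mole

    moles = convert_to(mass / molar_mass, mole)
    print(moles)  # 1200/1201*mole, i.e. about 0.999 mol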
It seems to be improving rapidly. If it can get to reliably differentiating between correct (workable) solutions and erroneous ones, it will be more useful to most people (including my students) than searching the internet or a cheating site such as Chegg. My two cents' worth of opinion.

Jonathan

On Wednesday, December 14, 2022 at 4:28:05 PM UTC-6 Francesco Bonazzi wrote:

> [image: chatgpt.sympy.matrix_diag.png]
>
> On Wednesday, December 14, 2022 at 11:26:37 p.m. UTC+1 Francesco Bonazzi wrote:
>
>> Not everything is perfect... ChatGPT misses the *convert_to(...)* function in *sympy.physics.units*; furthermore, the given code does not work:
>>
>> [image: chatgpt.sympy.unit_conv.png]
>>
>> On Wednesday, December 14, 2022 at 11:24:29 p.m. UTC+1 Francesco Bonazzi wrote:
>>
>>> [image: chatgpt.sympy.logical_inference.png]
>>>
>>> On Wednesday, December 14, 2022 at 11:23:43 p.m. UTC+1 Francesco Bonazzi wrote:
>>>
>>>> https://en.wikipedia.org/wiki/ChatGPT
>>>>
>>>> Some tested examples attached as pictures to this post. Quite impressive...
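(Footnote on the quoted unit-conversion point above: the convert_to function in sympy.physics.units does handle this kind of thing directly. A minimal hand-written example of my own, not the code from the screenshot:)

    from sympy.physics.units import convert_to, meter, kilometer, newton, joule

    print(convert_to(3500 * meter, kilometer))  # 7*kilometer/2, i.e. 3.5 km
    print(convert_to(joule, newton * meter))    # meter*newton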