bornach, 1 month ago

Or maybe it was taught by Bing Chat/Copilot - isn't Microsoft reportedly using GPT-4? Its solution is even more tortuous, and it later admits to measuring the wrong amount of water. It never realises it could have stopped after step 1.