I used ChatGPT to solve an open problem in convex optimization. *Part III* 1/N
Actually, the real open problem is to establish point convergence of the Nesterov accelerated gradient (NAG) method. That is, the discrete-time, implementable algorithm. 2/N
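(For concreteness, and as a paraphrase rather than the exact form from the 1983 paper, one common way to write NAG for an $L$-smooth convex $f$, with $y_1 = x_0$ and $t_1 = 1$, is
\[
x_k = y_k - \tfrac{1}{L}\nabla f(y_k), \qquad t_{k+1} = \tfrac{1 + \sqrt{1 + 4t_k^2}}{2}, \qquad y_{k+1} = x_k + \tfrac{t_k - 1}{t_{k+1}}\,(x_k - x_{k-1}).
\]
Point convergence asks whether the iterates $x_k$ themselves converge to a minimizer $x^\star$, not merely whether $f(x_k) \to f^\star$; the latter, with its $O(1/k^2)$ rate, has been known since 1983.)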
NAG was introduced by Nesterov in 1983 as an accelerated improvement upon plain gradient descent, yet its point convergence remained unresolved until today. The Nesterov ODE was studied as a simplified proxy, aimed at gaining insight into the behavior of its discrete counterpart. 3/N
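(For readers joining with this part: the Nesterov ODE here refers to the continuous-time limit of NAG derived by Su, Boyd, and Candès,
\[
\ddot X(t) + \frac{3}{t}\,\dot X(t) + \nabla f(X(t)) = 0,
\]
and the continuous-time proof referenced below concerns the trajectory $X(t)$ of this equation.)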
Indeed, translating the continuous-time proof into the discrete-time setup turned out to be relatively straightforward. With one piece of feedback, ChatGPT was able to work out the proof. chatgpt.com/share/68fc3977-b… 4/N
The proof, cleaned up and typed up by me, resolves the 42-year-old open problem. 5/N
In 2016, Kim and Fessler introduced a variant called OGM, which improves upon NAG by a constant factor of 2. We also show that OGM exhibits point convergence. 6/N
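(To put the "factor of 2" in rough quantitative terms, with exact constants depending on the indexing convention: NAG's worst-case guarantee is on the order of $f(x_k) - f^\star \le 2L\|x_0 - x^\star\|^2/(k+1)^2$, whereas OGM brings the constant down to roughly $L\|x_0 - x^\star\|^2/(k+1)^2$, about half, by optimizing the method's coefficients against the worst case.)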
After the initial tweet went out, several colleagues in the optimization community reached out with excitement and amusement. It was a lot of fun reconnecting with old friends. And I now have some interesting conversational material for the next optimization conference. 7/N
In particular, Radu Ioan Boţ, Jalal Fadili, and Dang-Khoa Nguyen reached out with a preprint of their own, capitalizing on the ideas in the continuous-time proof to also establish point convergence of Nesterov's 1983 method, and more! @RaduIoanBot 8/N
The arXiv preprint of [Boţ, Fadili, Nguyen 2025] will go public in a few days. They also prove weak convergence in the infinite-dimensional Hilbert space setting. (ChatGPT will now be happy.) x.com/ErnestRyu/status/19812… 9/N
Quoted tweet (from the earlier thread):
ChatGPT noticed this resolves the open problem. But instead of congratulating me, it said (paraphrased): “Well... that doesn’t *really* solve the open problem, because this only works in finite dimensions. What if you were optimizing in an infinite-dimensional real Hilbert space?” 6/N
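(Background for that remark: in a Hilbert space, weak convergence $x_k \rightharpoonup x^\star$ means $\langle x_k, y\rangle \to \langle x^\star, y\rangle$ for every $y$. In finite dimensions this coincides with ordinary convergence, but in infinite dimensions it is a genuinely weaker notion, so the Hilbert-space result is a separate statement rather than an immediate corollary of the finite-dimensional argument.)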
They also argue that the FISTA method [Beck, Teboulle 2009] exhibits point convergence. Check out their work when it goes public! (As far as I know, [Boţ, Fadili, Nguyen 2025] did not use AI tools.) 10/N
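(For reference, FISTA is the proximal extension of NAG to composite problems $\min_x f(x) + g(x)$, with $f$ $L$-smooth convex and $g$ convex with an inexpensive proximal operator. In its standard form, again as my paraphrase,
\[
x_k = \operatorname{prox}_{g/L}\!\big(y_k - \tfrac{1}{L}\nabla f(y_k)\big), \qquad t_{k+1} = \tfrac{1 + \sqrt{1 + 4t_k^2}}{2}, \qquad y_{k+1} = x_k + \tfrac{t_k - 1}{t_{k+1}}(x_k - x_{k-1}),
\]
which reduces to NAG when $g = 0$.)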
Conclusion: Starting from the key idea of the first tweet, we extended the convergence result to several related settings and resolved the main 42-year-old open problem, with ChatGPT doing most of the heavy lifting along the way. 11/N

Overall, this entire journey took just a week and less than 30 hours of my time. ChatGPT’s assistance provided a significant speedup; without it, I would most likely have given up after three days of slow progress, as I have in the past. 12/N
Again, ChatGPT is now at the level of solving some math research questions, but you do need an expert guiding it. I strongly encourage fellow mathematicians to try incorporating AI assistance into their workflow. It takes some getting used to, but it can be worth it. 13/N
This concludes my three-part Twitter series. Now that we have the mathematical results sufficient for a publication, I will post our work on arXiv on Monday. After a week or two of polishing the writing and gathering feedback, I’ll submit it for peer review. 14/N, N=14