That’s concerning. If it were “I generated a function with an LLM and reviewed it myself” I’d be much less concerned, but 14k added lines and 10k removed lines is crazy. We already know that LLMs don’t generate code that’s up to scratch quality-wise…
I won’t use PostgreSQL with ntfy, and I’ll keep an eye on the project to see whether they continue down this path for other parts of ntfy. If so, I’ll have to switch to another UP provider.

Except you can already download and run models on your local machine for free with ollama. Price increases might at least calm the AI craze among normies, though. Probably not among developers who know how to run LLMs locally.