Discussion about this post

Brian Heming

Generally speaking, local language models aren't much worse for coding than the biggest cloud models, at least the local ones that barely fit in 16 GB of VRAM. They still can't produce correct programs by themselves for cases that don't have working examples on Stack Overflow, still reliably produce boilerplate, test cases, and sample code that resembles Stack Overflow answers loosely adapted to your use case, still handle rapid prototyping well, and still can't substitute for an actual programmer.

Relying on a cloud service instead of one running on your own computer, as a programmer who surely has the level of skill needed to set up an off-the-shelf local language model, just seems dumb.
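The 16 GB VRAM figure above can be sanity-checked with back-of-the-envelope arithmetic (illustrative assumptions, not from the comment: weights dominate memory use, and KV cache plus runtime overhead add roughly 20%):

```python
# Rough VRAM estimate for a quantized local model.
# Assumptions (hypothetical, for illustration): weight storage dominates,
# and KV cache / runtime overhead add about 20% on top of the weights.
def vram_gb(params_billion: float, bits_per_weight: float,
            overhead: float = 0.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 13B model at 4-bit quantization fits comfortably in 16 GB:
print(round(vram_gb(13, 4), 1))  # → 7.8

# A 30B model at 4-bit already exceeds a 16 GB card:
print(round(vram_gb(30, 4), 1))  # → 18.0
```

Under these assumptions, models in the roughly 13B-to-27B range at 4-bit quantization are about what "barely fits in 16 GB of VRAM" means.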

Nibmeister

I look forward to consumer hardware that can run these giant LLMs directly, bypassing Big Tech and their gargantuan server farms.
