Having so much fun with this vibe coding: what used to take me two or three hours can now be done in a single day
@futurebird see now clearly you need to be using the latest models and adopt the MDITA2 MEGA-do-it-all-2.0 markdown framework in order to make it take several other people's days
I feel obligated to try all of this stuff. And there was a moment when I was a little impressed and excited. I thought "wow, now I can make all of those apps I always think about but don't have time to make."
I think I felt that same way when I first started using the internet and seeing all of the code libraries people were just sharing.
But adapting the work of others is time intensive in a way that adapting your own work will never be.
The vibe coding thing does highlight how much code is pointless. A load of the things that I’ve seen people be impressed with are things that should be a couple of hundred lines of code but somehow modern frameworks have focused on making things require more code to accomplish the same thing. Systems like HyperCard or even Flash let people produce rich GUIs with almost no code. The kinds of things that could be built in a visual editor with a small amount of code 15-25 years ago are now being generated as tens of thousands of lines of unmaintainable and buggy LLM code.
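For a sense of scale, here's a purely illustrative sketch (mine, in Python/Tkinter; nothing from HyperCard or any actual framework) of a complete, working GUI app in about 25 lines:

```python
# Illustrative only: a complete, working to-do list GUI in about
# 25 lines of Python/Tkinter.
import tkinter as tk

root = tk.Tk()
root.title("To-Do")

entry = tk.Entry(root, width=40)
entry.pack(padx=8, pady=4)

items = tk.Listbox(root, width=40)
items.pack(padx=8, pady=4)

def add_item():
    text = entry.get().strip()
    if text:                          # ignore empty input
        items.insert(tk.END, text)
        entry.delete(0, tk.END)

def remove_item():
    for i in reversed(items.curselection()):   # delete from the end down
        items.delete(i)

tk.Button(root, text="Add", command=add_item).pack(pady=4)
tk.Button(root, text="Remove", command=remove_item).pack(pady=4)

root.mainloop()
```

That's the entire program. The modern-framework version of the same thing routinely arrives with a build system, a lock file, and tens of thousands of lines of dependencies.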
@david_chisnall @futurebird I talked to a friend who made a lot of money taking a company public and who is a huge booster of llm-assisted coding about this exact issue.
His response was kind of horrifying, but at least now I understand how these people think. He said "I don't care. Companies don't fail because their code is unsustainable, they fail because they don't have a product. By the time the tech debt comes due, you should have already sold the company."
@david_chisnall @futurebird But also, I've seen a lot of less experienced programmers, before vibe coding was possible, just write thousands of lines of code for something that could be a hundred, because no one teaches them the value of parsimony or demands of them the abstract/mathematical/architectural sophistication to really understand what's going on at a low level.
I know this is very Old Woman Yells At Clouds, but part of why even non-AI-generated code ends up being pointless is that someone decided Moore's Law was an excuse not to teach what was going on under the hood. I can't even have a conversation about how and why things are bad with juniors sometimes, because they aren't asked to think that way. Not ALL of them, by any means. But a lot.
Ah well. Since nobody can afford RAM anymore anyway, people will either run slop code in the cloud that no one can debug, use, or maintain, or learn the hard way.
I despair.
I think there are probably some interesting incentives for people to study here. It’s struck me a lot that the popular GUI frameworks today take far more code to achieve good results than good ones from the ‘90s (though less than the worst of the ‘90s). I suspect that it’s a combination of three things:
None of this is really to do with the cost of RAM or compute. Smalltalk-80 was a full GUI on a machine with 1 MiB of RAM and a CPU slower than the slowest Cortex-M0, and it ran interpreted bytecode.
@david_chisnall @grrrr_shark @futurebird
Back in the 1980s, we built a perfectly usable full X.400 email client that ran on BBC Micros (that's 32 kB of RAM, or 48 kB with a sideways RAM mod). Bloat has exploded since then.
@KimSJ @david_chisnall @futurebird Right? The concern I was expressing was this - useful things don't NEED to be huge. But so many people don't even have the skills to make them clean and small now.
Even when I write with bloated languages/frameworks/tools now, I still think about what the code I'm writing is going to do and try to be parsimonious. But it IS a skill and if folks don't learn it, of course they won't do it.
@david_chisnall @futurebird yeah, I can certainly agree that the incentives for writing small, clear maintainable programs are... Hidden, at best. That's part of what angers me.
And the management incentives (LOC counts, releases regardless of what's in them, integrating AI even if it will break everything) are powerful.
I would like to go live on the moon, please.
Could that be it?
I've mostly noticed that the kind of things I want to do with computers has generally gotten much more difficult to do, and far far more difficult to teach.
But making the computer do what you want remains a real source of joy. My 5th grade students were delighted to make a text adventure type program ... I thought they'd find it boring but they were so excited to have their friends try their adventures.
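For anyone curious how little code that takes, here's a hypothetical sketch (not my students' actual program) of the whole pattern: a dictionary of rooms and a read-eval loop.

```python
# A minimal sketch of the kind of text adventure a beginner can
# write: rooms, exits, and a loop. (Illustrative only.)
rooms = {
    "cave": {"desc": "A dark cave. Exits: north.", "north": "forest"},
    "forest": {"desc": "A sunny forest. Exits: south, east.",
               "south": "cave", "east": "castle"},
    "castle": {"desc": "You found the castle. You win!"},
}

here = "cave"
while True:
    print(rooms[here]["desc"])
    if here == "castle":          # reaching the castle ends the game
        break
    move = input("> ").strip().lower()
    if move in rooms[here]:
        here = rooms[here][move]
    else:
        print("You can't go that way.")
```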
Have you played with Godot at all? It’s been on my to-learn list for a couple of years and some initial poking suggested it would be a great learn-to-program platform:
@david_chisnall @futurebird One theory I have here is that "more effort" is easier to translate into "more money" than "quality" - especially when most people don't see the bloat.
hypercard was suppressed by The Man because it made the people too powerful!!
@futurebird @david_chisnall Visual Basic as well. I will not forget the snide comments from "experts" that a younger me received when trying to learn that "brain rot" language.
// I have used LLMs to make a few utilities and apps that I have been using every day for months now - things not interesting or profitable enough for anybody else to make.
@hiway @futurebird @david_chisnall I fixed a bug in my manager's Visual Basic program. He said it was still there.
On closer inspection, I found that every screen in the program was a complete copy and paste of the first page's code, with a tiny amount of code added for that specific functional area. The same bug was in the code twelve times.
To be fair, his speciality was technical drawing. I think of him as the original design pattern for LLMs.
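In miniature, the anti-pattern looked something like this (a hypothetical sketch, not his actual code): the bug lives in every copy, so fixing one instance fixes nothing.

```python
# Hypothetical sketch of the copy-paste anti-pattern: the same
# validation bug pasted into every screen, versus one shared helper.

# What the program did: each screen re-implements the check, so the
# off-by-one bug ("<" should be "<=") exists once per screen.
def screen_one_valid(qty):
    return 0 < qty < 100      # bug: rejects the legal maximum of 100

def screen_two_valid(qty):
    return 0 < qty < 100      # same bug, pasted again... twelve times

# What it should have done: one helper, so one fix covers every screen.
def quantity_valid(qty):
    return 0 < qty <= 100     # fixed once, fixed everywhere
```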
@AbramKedge @hiway @futurebird
A lot of VB code was like that. I did encounter one bit of in-house VB6 that was beautifully structured, had clean abstractions, and spoke to a SQL Server back end, so I at least have an existence proof that good, clear, maintainable code was possible in VB. I never managed to write any though. Somewhere I have some floppy disks full of truly terrible VB2 to VB4 that I wrote as a child.
In the ‘90s there was a huge push in software engineering towards component models. COM and CORBA both came out of this. The idea was to build libraries as reusable blocks. Brad Cox wrote a lot about this and created Objective-C as a way of packaging C libraries with late-bound interfaces that could be exposed to higher-level languages easily.
This combined with the push towards visual programming, where you’d be able to drag these libraries into your GUI and then wire things up to their interfaces with drag-and-drop UIs. The ‘Visual’ in Visual Studio is a hangover from this push.
Advocates imagined stores of reusable components and people being able to build apps for precisely their use case by just taking these blocks and assembling them.
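In spirit, the model looked something like this hypothetical Python sketch (standing in for COM/Objective-C; the names and classes are mine, the point is the late-bound wiring):

```python
# Hypothetical sketch: components expose named operations, and a
# wiring layer connects an output to an input by name at runtime
# (late binding), the way a visual editor would connect blocks.
from datetime import datetime

class Clock:
    def now(self):
        return datetime.now().isoformat(timespec="seconds")

class Label:
    def set_text(self, value):
        print(f"[label] {value}")

# "Drag in" two components; neither one knows about the other.
components = {"clock": Clock(), "label": Label()}

def wire(src, src_op, dst, dst_op):
    source = getattr(components[src], src_op)   # late-bound lookup by name
    sink = getattr(components[dst], dst_op)
    sink(source())                              # push output into input

wire("clock", "now", "label", "set_text")       # the label shows the time
```

The components never reference each other; the connection is made by name at runtime, which is what would let a visual editor do the wiring.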
It failed because the incentives were exactly wrong for proprietary COTS apps. Companies made money by locking people into app ecosystems. If it's easy for someone to buy a (small, cheap) new component for Word 95 that adds the new feature they need, how do you convince them to buy Word 97?
The incentives for F/OSS are the exact opposite. If another project can add a feature that some users want (but you don’t) without forcing you to maintain that code, everyone wins. But we now have an entire generation that has grown up with big monolithic apps who copy them in F/OSS ecosystems because it’s all they’ve ever known.
@david_chisnall
There are more problems with components than just monetization.
Plug-in style extensions add extra layers of complexity for both developers and users. End users have to source and manage their plug-ins. Developers often build their plug-in for only one operating system or one version of the application, then abandon it.
There are good technical and social reasons for projects (such as the Linux kernel) to use a monolithic model.