Depends on the language, I’d assume. The last thing I heard was that the current Codestral version is optimal for Python, for example.
Yeah, same with Codestral. You have to tell it what to do very specifically, and once it gets stuck somewhere you have to move to a new session to get rid of the history junk.
Both it and ChatGPT also repeatedly told me to save binary data I wanted to keep in memory as a list, with every 1024 bytes being a new entry… in the form of strings (supposedly). And the worst thing is that, given the way it extracted that data later on, this unholy implementation from hell would’ve probably even worked up to a certain point.
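For the curious, here’s a rough sketch of what that suggestion would have amounted to. All names are hypothetical, and the latin-1 encoding is my guess at how it could even round-trip, since the models never specified one:

```python
# Illustration of the suggested anti-pattern (hypothetical names): binary data
# chopped into 1024-byte chunks, each chunk stored as a string in a list.

CHUNK_SIZE = 1024

def store_blob_as_strings(blob: bytes) -> list[str]:
    """Split binary data into 1024-byte chunks and keep each chunk as a string."""
    chunks = []
    for offset in range(0, len(blob), CHUNK_SIZE):
        chunk = blob[offset:offset + CHUNK_SIZE]
        # latin-1 maps every byte value to a code point one-to-one, so the
        # decode/encode round-trips -- which is exactly why this hack would
        # have "worked up to a certain point".
        chunks.append(chunk.decode("latin-1"))
    return chunks

def restore_blob(chunks: list[str]) -> bytes:
    """Reassemble the original bytes from the stored string chunks."""
    return b"".join(chunk.encode("latin-1") for chunk in chunks)

if __name__ == "__main__":
    data = bytes(range(256)) * 5  # arbitrary binary payload
    assert restore_blob(store_blob_as_strings(data)) == data
    # The sane alternative is of course just keeping the bytes object
    # (or a bytearray / io.BytesIO) in memory in the first place.
```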
For a moment I wondered why the Rust code was so much more readable than I remembered.
This would make a nice VS Codium plugin to deal with all the visual clutter. I actually like this.
Instead they’ll become curiosities leading down rabbit holes to understand why and how they happened.
Yes, chromosomes are meaningless with regard to who someone is (except in edge cases).
No, sex and gender aren’t the same.
That’s a chromosome you encoded there, which is one of a few markers that define sex, not gender.
Probably to the “oh my god new slang so cringe this youth” crowd. You know, those who always said they’d never become those old, annoying, uncomprehending adults… and have now become exactly that. 😁
The Element Matrix client in a nutshell.
Yeah… I’m quickly reaching the point where thinking through and writing the Python code myself is faster than even writing the prompts. Let alone the additional time spent going through the generated stuff to adjust and fix things.
It’s good for getting a grip on syntax and terminology, and as an overly fancy (but very fast) search bot that can (mostly) apply your question to the very code that’s in front of you, at least in popular languages. But once you’ve got that stuff in your head… I don’t think I’ll bother too much in the future. There surely are tons of useful things you can do with multimodal LLMs; coding properly on its own just isn’t one of them. At least not with the current generation.