• 0 Posts
  • 21 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • No arguments from me that it’s better if people are just better at their job, and I like to think I’m good at mine too, but let’s be real - a lot of people are out of their depth, and I can imagine it helps there. OTOH, is it worth the investment in time (from people who could presumably be doing astonishing things themselves) and carbon energy? Probably not. I appreciate that the tech exists and needs to, but shoehorning it in everywhere is clearly bollocks. I just don’t know yet how people will find it useful, and I guess not everyone gets that spending an hour learning to do something that takes 10s once you know how is often better than spending 5 mins making someone or something else do it for you… And TBF to them, they might be right, if they only ever do the thing twice.

  • silasmariner@programming.dev to Science Memes@mander.xyz · Golden
    2 months ago

    There are so many different areas of computer science though… Everything from pure mathematics (e.g. ‘we found a new algorithm that does X in O(log x)’) to the absurdly specific (‘when I run the load tests with this configuration it’s faster’). The former would get published. The latter wouldn’t. And the stuff in the middle runs the gamut from ‘here’s my new GC algorithm that performs better in benchmarks on these sample sets’ to ‘looks like programmers have fewer bugs when you constrain them with these invariants’. All the way over on the other side, NFT/blockchain/AI announcement crap usually doesn’t even have a scientific statement to be expressed, so there’s nothing to confirm or deny. There are issues with some areas, but I’m not sure that replication is really the big one for most of these. The only ones it commonly applies to, IMO, are productivity or bug-frequency claims, which are generally hella suss.