• 1 Post
  • 111 Comments
Joined 3 years ago
Cake day: February 1st, 2023

  • There are many more interesting proposed approaches though, like creating a religious cult around avoiding nuclear waste, or all kinds of hostile-looking architecture. My favourite is the idea of storing the waste in containers so durable that any people advanced enough to break into them would have to be advanced enough to know how to behave around nuclear waste.




  • edinbruh@feddit.it to Lemmy Shitpost@lemmy.world · turing completeness
    English · 4 up / 1 down · edited · 1 month ago

    LLMs are not the path forward to simulating a person; this is a fact. By design they cannot reason. It's not a matter of advancement, it's literally how they work in principle: a statistical trick for generating random text that looks like thought-out phrases, with no reasoning involved.

    If someone tells you they might be the way forward to simulating a human, they are scamming you. No one who actually knows how they work says that, unless they are the CEO of a trillion-dollar company selling AI.
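    The "statistical trick" can be illustrated with a toy bigram model. This is a hypothetical minimal sketch (real LLMs use neural networks over far larger contexts), but the principle is the same: pick the next token from learned frequencies, with no reasoning step anywhere.

    ```python
    import random
    from collections import defaultdict

    # Toy bigram "language model": learn which words follow which in a corpus,
    # then generate text by sampling from those counts - statistics, not thought.
    corpus = "the cat sat on the mat and the dog sat on the rug".split()

    # Record every observed successor of each word
    following = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev].append(nxt)

    def generate(start, length, seed=0):
        """Sample a chain of words that locally looks like the corpus."""
        random.seed(seed)
        word, out = start, [start]
        for _ in range(length):
            choices = following.get(word)
            if not choices:  # dead end: no observed successor
                break
            word = random.choice(choices)
            out.append(word)
        return " ".join(out)

    print(generate("the", 6))
    ```

    The output is locally plausible English precisely because each word statistically follows the previous one, yet nothing in the process models meaning.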


  • I don’t like it because people don’t shut up about it and insist everyone should use it when it’s clearly stupid.

    LLMs are language models; they don't actually reason (not even the "reasoning" models). When they nail a reasoning task, it's by chance, not by design. Anything that isn't language processing shouldn't be done by an LLM. Vice versa, they are pretty good with language.

    We already had automated reasoning tools. They are used for industrial optimization (e.g. finding optimal routes, deciding how to allocate production), and no one cared about those.
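    As a hedged sketch of the kind of tool meant here (city names and distances are made up for illustration): a brute-force route optimizer over a tiny distance matrix. Industrial solvers handle the same problem at scale with LP/MIP techniques rather than enumeration, but the point stands, this is deterministic reasoning with a provably optimal answer.

    ```python
    from itertools import permutations

    # Toy route optimization: shortest round trip visiting every city once.
    # Brute force is fine for 4 cities; real tools use dedicated solvers.
    distances = {  # symmetric distances between four hypothetical cities
        ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
        ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
    }

    def dist(a, b):
        return distances.get((a, b)) or distances[(b, a)]

    def best_route(cities, start="A"):
        others = [c for c in cities if c != start]
        # Try every ordering of the remaining cities and keep the cheapest tour
        best = min(permutations(others),
                   key=lambda p: sum(dist(x, y)
                                     for x, y in zip((start,) + p, p + (start,))))
        route = (start,) + best + (start,)
        cost = sum(dist(x, y) for x, y in zip(route, route[1:]))
        return route, cost

    route, cost = best_route(["A", "B", "C", "D"])
    print(route, cost)  # optimal tour costs 80
    ```

    Unlike an LLM, the answer here is guaranteed correct by construction, which is exactly what makes these tools useful for logistics.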

    As if that weren't enough: the internet is now full of slop, hardware companies are warmongering an arms race that is fueling an economic bubble, and people are being fired to be replaced by something that will not actually work in the long run, because it does not reason.




  • If Turing were alive, he would say that LLMs waste computing power doing something a human should be able to do on their own, and that we therefore shouldn't waste time studying them.

    Which is roughly what he said about compilers and high-level languages (where "high level" meant something like Fortran, not Python).