• expatriado@lemmy.world · 8 days ago

    The Physics Nobel prize awarded for a computer science achievement; actual physics is having a dry spell, I guess.

    • kamenLady.@lemmy.world · 8 days ago

      Beyond recognizing the laureates’ inspirations from condensed-matter physics and statistical mechanics, the prize celebrates interdisciplinarity. At its core, this prize is about how elements of physics have driven the development of computational algorithms to mimic biological learning, impacting how we make discoveries today across STEM.

      They explain the flex at least

    • CarbonIceDragon@pawb.social · 8 days ago

      To be fair, regardless of one’s stance on the utility of current AI or the wisdom of developing it, it is an extremely difficult and potentially world-changing technical achievement. And given there isn’t a Nobel prize in computer science, physics is probably the most relevant category for it.

      • vrighter@discuss.tchncs.de · 7 days ago

        Not really. A lot of the techniques have been known for decades; what we didn’t have back then was insane compute power.

        And there’s the Turing Award for computer science.

        • PixelProf@lemmy.ca · 7 days ago (edited)

          Insane compute wasn’t everything. Hinton helped develop techniques that allowed more data to be processed across more layers of a network without totally losing coherence. Before then it was more of a toy, because it capped out on how much data could be used, how many layers of a network could be trained, and, I believe, even on whether GPUs could be used efficiently for ANNs, though I could be wrong on that last one.

          Either way, after Hinton’s research in ~2010-2012, problems that had seemed extremely difficult to solve (e.g., classifying images and identifying objects in them) became borderline trivial, and in under a decade ANNs went from an almost fringe technology that many researchers saw as a toy useful for a few problems to dominating essentially all AI research and CS funding. In almost no time, every university suddenly needed machine learning specialists on payroll, and now, about ten years later, we are pumping out papers and tech that had seemed decades away. Every year. Across a very broad range of problems.

          The 580 and CUDA made a big impact, but Hinton’s work was absolutely pivotal in being able to utilize that, and in making ANNs seem feasible at all; and it was an overnight thing. Research very rarely explodes this fast.

          Edit: also worth clarifying, Hinton was one of the few researching these techniques in the ’80s and has remained a force in the field, so these big leaps are the culmination of a lot of old, but also very recent, work.
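          As an aside, one concrete instance of the “deeper layers lose coherence” problem the comment alludes to is the vanishing-gradient effect. This is an illustrative sketch only, not a reconstruction of Hinton’s actual methods: it pushes a gradient backwards through a stack of random linear layers and shows how sigmoid’s bounded derivative (≤ 0.25) crushes the signal, while ReLU-style pass-through with matched He-style initialization keeps it alive. All names and parameters here are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 30, 64  # a deep stack of small layers, purely for illustration

def backprop_norm(local_grad, weight_std):
    """Norm of a gradient pushed back through `depth` random linear layers.

    Each step multiplies by the activation's local derivative, then by W.T,
    mimicking one layer of backpropagation.
    """
    g = np.ones(width)
    for _ in range(depth):
        W = rng.normal(0.0, weight_std, (width, width))
        g = W.T @ (local_grad() * g)
    return np.linalg.norm(g)

# Sigmoid's derivative never exceeds 0.25, so every layer shrinks the gradient
# geometrically, even with norm-preserving (std = 1/sqrt(width)) weights.
sigmoid = backprop_norm(lambda: np.full(width, 0.25), 1 / np.sqrt(width))

# ReLU passes the gradient through unchanged on active units (about half),
# and He-style initialization (std = sqrt(2/width)) compensates for the rest.
relu = backprop_norm(lambda: (rng.random(width) < 0.5).astype(float),
                     np.sqrt(2 / width))

print(f"sigmoid: {sigmoid:.2e}  relu: {relu:.2e}")  # sigmoid collapses toward 0
```

          The point is only qualitative: after 30 layers the sigmoid-style gradient is vanishingly small while the ReLU-style one stays at a usable magnitude, which is one reason deep networks were so hard to train before the 2010s wave of techniques.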

    • skillissuer@discuss.tchncs.de · 7 days ago

      I’m here to remind you that for the last 20-ish years, half the time the chemistry Nobel has gone to biologists, and now they’ve doubled down on the AI wankery by giving it to AlphaFold.

  • lambda_notation@lemmy.ml · 7 days ago

    “They used physics to do it” is just a laughably pathetic motivation. Nobel hated “abstract wankery” and “intellectual masturbation” and wanted to promote results that benefited the common man and society directly; this is, incidentally, also why there is no Nobel prize in economics. The Nobel prize committee has long since abandoned Nobel’s will in this matter, and it is anyone’s guess what order of magnitude of spin Nobel’s corpse has accumulated.

  • msantossilva@sh.itjust.works · 8 days ago

    I guess some people are genuinely concerned about AI wiping out humanity. Do not worry, that will never happen. We are already doing a fine job fostering our own extinction. If we keep going down our current path, those soulless robots will never even get the chance.

    Now, in truth, I do not know what will kill us first, but I reckon it is important to stay positive.

    • slaacaa@lemmy.world · 7 days ago

      I mean, it’s definitely helping, but not in the way I imagined. It is becoming a major driver of CO2 emissions due to the large computational power it needs, which will only increase in the future. The planet is boiling, and they will keep building more server farms for the next LLM upgrade, giving up on stopping or controlling climate change.

      • mindaika@lemmy.dbzer0.com · 7 days ago

        Wouldn’t that be something: we choke to death trying to create a supercomputer to tell us to stop doing exactly that

        True irony

      • Zos_Kia@lemmynsfw.com · 7 days ago

        To clarify: AI is NOT a major driver of CO2 emissions. Even the most pessimistic estimates place it at a fraction of a percent of global energy consumption by 2030.

          • Zos_Kia@lemmynsfw.com · 7 days ago

            I mean, it is also true for crypto. BTC, the most energy-hungry blockchain, is estimated to burn ~150 TWh/year, against a global consumption of ~180,000 TWh/year.

            Now, is that consumption useless? Yes, it is completely wasted. But it is a drop in the bucket. One shouldn’t underestimate the astounding energy consumption of legacy industries; as a whole, the tech industry is estimated to represent just a few percent of the global energy budget.
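            Taking the comment’s own figures at face value (they are the commenter’s estimates, not verified data), the claimed share is a one-line check:

```python
btc_twh = 150          # commenter's estimate of annual BTC consumption, TWh
global_twh = 180_000   # commenter's estimate of global annual consumption, TWh

share = btc_twh / global_twh
print(f"{share:.4%}")  # about 0.08% of global energy use
```

            So on those numbers, Bitcoin sits under a tenth of a percent of the global energy budget, which is the “drop in the bucket” being argued.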

  • zlatiah@lemmy.world · 8 days ago

    So it was the physics Nobel… I see why the Nature News coverage called it “scooped” by machine learning pioneers

    Since the news tried to be sensational about it, I tried to see what Hinton meant by fearing the consequences. I believe he is genuinely trying to prevent AI development from proceeding without proper regulation. This is a policy paper he was involved in (https://managing-ai-risks.com/), and it did mention some genuine concerns. Quoting it:

    “AI systems threaten to amplify social injustice, erode social stability, and weaken our shared understanding of reality that is foundational to society. They could also enable large-scale criminal or terrorist activities. Especially in the hands of a few powerful actors, AI could cement or exacerbate global inequities, or facilitate automated warfare, customized mass manipulation, and pervasive surveillance”

    like bruh, people have already lost jobs because of ChatGPT, which can’t even do math properly on its own…

    There is also quite some irony in the preprint containing the quote “Climate change has taken decades to be acknowledged and confronted; for AI, decades could be too long,” considering that a serious risk of AI development is its climate impact.

  • mindaika@lemmy.dbzer0.com · 7 days ago (edited)

    It’s probably easier to righteously quit your job after a decade of collecting a senior executive salary.

    Also: physics?

    • catloaf@lemm.ee · 7 days ago

      Is that because you also don’t know any physics Nobels off the top of your head?