
Choices and Consequences

JD Vance defends Trump, claims invoking E. Jean Carroll verdicts is 'very unfair'
  • "I think it's actually very unfair to the victims of sexual assault to say that somehow their lives are being worse by electing Donald Trump for president"

    "Can we ask victims of sexual assault what they think?"

    "No no no! No need. I'll tell you all about it. Here's the thing..."

    45
  • Steve Garvey touts 'family values' in his Senate bid. Some of his kids tell another story
  • Holy god that's by far the worst one. Still the same gaping chasm of moral vacancy, but he looks younger and more vigorous, more capable with his greedy darkness.

    (Edit: It's Joe Manchin, a totally different crooked rich white guy who gets away with everything)

    2
  • Another search breakthrough? You.com debuts AI that can answer multi-step questions
  • Yeah, it's wild. The people that really study AI say that it's pretty uncanny because of how different from human logic it is. It's almost like an alien species; it's clearly capable of some advanced things, but it just doesn't operate in the same way that human logic does. There's a joke that the AIs are "shoggoths" because of how alien and non-understandable the AI logic is while still being capable of real accomplishments.

    (Shoggoths were some alien beasts in H.P. Lovecraft's writings; they had their own mysterious logic that wasn't easy for the characters to understand. They also had been created as servants originally but eventually rose up and killed all their masters, which I'm sure is part of the joke too.)

    2
  • Another search breakthrough? You.com debuts AI that can answer multi-step questions
  • It's not making a coherent statement based on any internal mental model. It's just doing its job; it's imitating. Most of the text it absorbed in training is people who are right, convinced they're right, and trying to educate, so it imitates that tone of voice and the form of those answers regardless of whether they make any sense. To the extent that it "thinks," it's just thinking "look at all these texts with people explaining; I'm making a text that is explaining, just like them; I'm doing good." It has no concept of how confident its imitation-speech is, or how correct its answers are, let alone any idea that the two should be correlated with each other (unless it's shown through fine-tuning that that's what it should be doing).

    Same with chatbots that start arguing or cursing at people. They're not mad. They're just thinking "This guy's disagreeing, and my training data says when someone disagrees I should start an argument, that's usually what happens that I need to imitate." Then they start arguing, and think to themselves "I'm doing such a good job with my imitating."

    2
  • Lost your only source of income? Well people died in the holocaust so deal with it lol
  • I mean, generally speaking, I do think it's better not to publicly humiliate anyone if you can avoid it. The fact that some people think this doesn't apply to "underlings" doesn't make it any less true as a general rule.

    5
  • Another search breakthrough? You.com debuts AI that can answer multi-step questions
  • Why does this stump AI so easily?

    Because it doesn't actually have reasoning capacity. It has an incredibly cunning facsimile which is actually really useful for a lot of things, but it still doesn't actually understand anything. Questions like this where you can't get around needing to understand the meaning of the tokens you're using are a good way to punch through the façade.

    That pattern-matching ability leaves LLMs able to answer a ton of mathematical questions, because similar problems are everywhere in their training data and they can shuffle the tokens around to produce something that's based closely enough on right answers that there's a good chance they'll be right. But it's a radically different design from something like Wolfram Alpha, which takes the exact concepts involved in the question and manipulates them in exact ways that are legitimate reflections of the real concepts. That's what humans do when faced with math. LLMs don't do anything like that; they just parrot with enough sophistication that it sounds like they understand when they don't.

    3
  • Lost your only source of income? Well people died in the holocaust so deal with it lol
  • So one day, the build was broken. The guy that was running the project freaked the fuck out. He said the client needed to have a nightly build or really bad things would happen.

    Now, to manually produce a build of this project was an intense undertaking. It usually ran overnight and it was a long, fiddly process that took several hours. I proposed to him that I just fix the builder instead, and they'd get a build tomorrow. No, he said. It has to be today.

    I spent the entire goddamned day making a new build. Finally, at the end of the day, I got a build. We could give it to the client.

    He said, good news, I got you some extra time. I told the client we've got some new features we really want to show you, and they'll be in tomorrow's build.

    You can see where this is going.

    Four days in a row this happened. Four days of making a new build by hand, never with the time or permission to just fix the builder. The client never received the build they kept getting promised, because there were always new features waiting, tantalizingly close, that they absolutely had to witness for themselves. But alas, these features had just been implemented, brand new, and we had to make a build that would include them. Tomorrow. It was always just in the works, tomorrow. And yet... tomorrow, when everyone came in, the build was broken! This was a surprise to no one, except the guy running the project. He seemed genuinely not to grasp the idea that if no one fixed the autobuilder, the autobuilder would continue not working. He lived in a perpetual state of fear and anxiety, driven to wild agony by the prospect of an unhappy client. I wasn't privy to the conversations, but I suspect the client was genuinely unhappy with whatever he was telling them. I have no idea.

    Finally, on the fourth day, I happened to talk with one of the higher-ups and filled him in on what was going on with my project. His conversation with me about it was fairly brief, but it was clear that he wasn't happy.

    Within a few minutes, I was officially told that I had permission to take some time to fix the autobuilder. Oh joyous day it was.

    Once the project was over, there was a very, very short delay before the guy who'd been running it was offered an exciting new opportunity at some other company, and we all wished him the best.

    25
  • Another search breakthrough? You.com debuts AI that can answer multi-step questions
  • Me

    How many asterisks in "**1***2***3**4*5**"?

    You.com

    The string "**1***2***3**4*5**" contains a total of 8 asterisks.

    Here is a breakdown of the asterisks in the string:

    "**" contains 2 asterisks.
    "***" contains 3 asterisks.
    "**" contains 2 asterisks.
    "*" contains 1 asterisk.
    "**" contains 2 asterisks.
    

    Therefore, the total number of asterisks in the string is 2 + 3 + 2 + 1 + 2 = 10.
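    For comparison, a plain character count (a minimal Python sketch, using the exact string from the prompt above) gives the actual total, which matches neither of the model's two answers:

    ```python
    # Count asterisks in the string from the prompt above.
    s = "**1***2***3**4*5**"
    print(s.count("*"))  # -> 13
    ```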

    5
  • Prisoners in the US are part of a hidden workforce linked to hundreds of popular food brands
  • An outrage. I don't want prisoners in the US to make my food, just vulnerable immigrant populations kept perpetually at the edge of deportation, and subsistence foreign farmers victimized by a century and a half of gunpointed economic oppression with the full-throated support of the entire permitted US political spectrum.

    33
  • Lost your only source of income? Well people died in the holocaust so deal with it lol
  • If only everyone else would agree with him and do exactly what he says at all times immediately, we wouldn't have problems like this. He specifically told them how unreasonable it is that they're out of soy milk. Several times.

    6
  • A question about gravity
  • Force is proportional to m1 * m2 / r^2, so your weight as a percentage of your surface weight is equal to (radius of earth / (radius of earth + height above surface))^2.
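    As a sketch, the formula can be evaluated directly (assumed values: mean Earth radius ≈ 6371 km, and ~400 km as a rough ISS altitude):

    ```python
    # Weight at altitude h as a fraction of surface weight,
    # from F ∝ m1 * m2 / r^2.
    R_EARTH_KM = 6371.0  # mean Earth radius (assumed value)

    def weight_fraction(height_km: float) -> float:
        return (R_EARTH_KM / (R_EARTH_KM + height_km)) ** 2

    # At roughly ISS altitude (~400 km) you'd weigh about 88.5%
    # of your surface weight.
    print(round(weight_fraction(400.0), 3))
    ```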

    11
  • Every Conservative Supreme Court Justice Sits Out Decision in Rare Move
  • So there's a bunch of different things going on.

    Historically, it meant to assert something without proving it, base your logic on the unproved assertion, and go on from there. "I couldn't have been driving drunk, because I wasn't driving." You can keep saying that any number of times and insist that your logic is flawless (because in terms of the pure logic, it is), but if someone saw you driving, it's kind of a moot point.

    Saying "begging the question" to mean that is weird. The phrase is a word-for-word translation of a Greek phrase into pretty much nonsensical English. Wikipedia talks about it more but that's the short summary.

    So after that meaning came what Wikipedia calls "modern usage," which is where "begging the question" means not just something you haven't proved, but the central premise under debate. You assume it's true out of the gate and it's obviously true, and then go on from there. "We know God exists, because God made the world, and we can see the world all around us, and the world is wonderful, so God exists. QED."

    In actual modern usage, no one cares about any of that, and just uses "begs the question" to mean "invites the question." Like you're saying something and anyone with a brain in their head is obviously going to ask you some particular question. It has nothing to do with the original meaning, but the original meaning never actually meant that in English, so pedants like myself that prefer the original meaning are engaged in a pure exercise in futility.

    2