• 0 Posts
  • 231 Comments
Joined 10 months ago
Cake day: August 27th, 2023

  • I only recently tossed a Handbook Of Christian Apologetics by Tacelli & Kreeft. I was never devout, let alone outright brainwashed into anti-science nonsense, but at the cusp of my reddit atheist phase I figured a question this big deserved a fair shake. So I got a big ol’ book of the best arguments anyone had. They all sucked. So that was that.

    The one that made me put the book down and go ‘yep, atheist’ was “the argument from magic.” You think about moving your hand. Your hand moves by thought alone. Magic! Therefore, Jesus. I fucking wish I was exaggerating.

    Took another decade to figure out the people pushing these arguments don’t actually give a shit about being right. The point is performing loyalty to the ingroup. There’s a conclusion, and it comes from people above you, so your job is to make whatever mouth noises get there. A monotheistic god is only the purest expression of that tribalist worldview.

  • The Deftones managed to pull off what they were heading toward in their self-titled album, which was deeply iffy back in 2003. Going from Around The Fur and White Pony to the likes of “Good Morning Beautiful” or “Lucky You” was jarring. Even promising tracks like “Minerva” felt unpolished and off-brand. “When Girls Telephone Boys” had the right energy but wasn’t exactly an easy recommendation for new listeners. It’s the kind of song where you think you know the lyrics, and then you read what they’re supposed to be, and you understand even less than you did before.

    Saturday Night Wrist captured more of the frisson from their earlier work. “Cherry Waves” and “Kimdracula” in particular. But it’s considerably more relaxed than anything previous. Even seemingly frenetic tracks like “Combat” feel halfway to pop. Diamond Eyes has more of the vibe but less of the polish. It does let parts of each song stand out more. “Command / Control” and “You’ve Seen The Butcher” aren’t great songs, but they understood what was good about them, and let it come to the fore. “Risk” is solid.

    They finally nailed it for Koi No Yokan. The first song is so-so - as is tradition - but beyond that it’s just A+ material. Spacious, detailed, bombastic, and like little else. The classic vitriol comes out in “Leathers.” The gentle new stuff becomes “Entombed.” The dreamy frisson evolves into “Rosemary.” It’s unambiguously a fantastic album.

    No idea what the fuck happened on Gore. They had it and they lost it. The first track is one of their best songs ever, and then it’s half an hour of ehhh. The album only picks up for some transitions on either side of “Phantom Bride.” For about two non-consecutive minutes, Sergio Vega and Stephen Carpenter get to flex, and then it closes with a track that just leaves you wanting.

    Ohms veers back toward the right idea. It’s a lot more 90s than even their self-titled album, between the opening track and the middle run of “The Spell Of Mathematics” through “This Link Is Dead.” Tracks like “Error” are novel, and modern, and clean. Holy shit, the things this band has done to clarify and distinguish all the work they put in. I can’t even listen to 80s wall-of-sound sludge anymore. Better to convey the world’s loudest guitar than a whole orchestra constantly going hard.

    But now I want to check back in after another twenty years, because they pissed off the bassist they’ve had since Diamond Eyes. I have no fucking idea what they’re going to sound like next. I might be more interested in whatever that guy does without them.



  • Years ago I found myself explaining to Chinese Room dinguses that in a neural network, the part that does stuff is not the part written by humans.

    I’m not sure it’s meaningful to say this sort of AI has source code. You can have open data sets. (Or rather you can be open about your data sets. I don’t give a shit if LLMs list a bunch of commercial book ISBNs.) But rebuilding a network isn’t exactly a matter of hitting “compile” and going out for coffee. It can take months, and the power draw of a small city… and even then it can’t be exact. There’s so much randomness involved in the process that it’d be iffy whether you get the same weights twice, even if you built everything around that goal.
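    The weights-aren’t-reproducible point can be seen even at toy scale. Here’s a minimal sketch in pure NumPy (not any real framework, and everything in it is illustrative): train the same tiny network twice from different random initializations. Both runs fit the same data, but the learned weights don’t match, which is the sense in which retraining doesn’t “rebuild” the original artifact.

```python
import numpy as np

def train_tiny_net(seed, steps=300, lr=0.1):
    """Fit y = x^2 with a 1-8-1 tanh network via plain gradient descent."""
    rng = np.random.default_rng(seed)
    X = np.linspace(-1.0, 1.0, 32).reshape(-1, 1)
    y = X ** 2                      # target: a simple curve
    W1 = rng.normal(size=(1, 8))    # random init -- the source of divergence
    W2 = rng.normal(size=(8, 1))
    for _ in range(steps):
        h = np.tanh(X @ W1)         # hidden layer
        err = h @ W2 - y            # prediction error
        # mean-gradient descent on squared error (constant factor folded into lr)
        dW2 = h.T @ err / len(X)
        dW1 = X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, float(np.mean(err ** 2))

W1_a, loss_a = train_tiny_net(seed=0)
W1_b, loss_b = train_tiny_net(seed=1)
print(loss_a, loss_b)               # final training errors of each run
print(np.allclose(W1_a, W1_b))      # False: the learned weights differ
```

    At real scale the seeds aren’t even the whole story: parallel floating-point reductions and hardware scheduling add nondeterminism you can’t fully pin down, which is why exact weight reproduction is such a dubious goal.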

    Saying “here’s the binary, do whatever” is honestly a lot better for neural networks than for code, because it’s not like the people who made it know how it works either.