I'm pretty sure his point was that the levels had more presence than usual for a shooter in August of 95. Duke 3D is known for being a pioneer in designing levels to actually resemble locations, and that came later.
-
Upcoming discussion with sociologist / statistician Jacy Reese Anthis
Shaddy replied to Ross Scott's topic in General News
I'm happy to see Ross alongside other people, I just don't love his recent choice of company. Granted, there's plenty of fearmongering and apocalyptic conspiracy theorizing much worse than the AI stuff that he could be platforming; I just hope it stays away.
Upcoming discussion with sociologist / statistician Jacy Reese Anthis
Shaddy replied to Ross Scott's topic in General News
I clicked on the dude's website and it immediately got stuck on a loading screen that looked like this: is this what AF is gonna be now? Just platforming LessWrong and effective altruism cranks?
Upcoming discussion / possible debate with AI expert Eliezer Yudkowsky
Shaddy replied to Ross Scott's topic in General News
Neural nets have been around for years (we've been using them to upscale old game textures for a while); the current splash is due to the improvements in text bots and image generation, and the current controversy is over the reference material being unethically sourced. We're also seeing text generators being used to run bots, and image and voice generators spreading disinformation and impersonating people. Nobody doubts their capability as tools, but we do need conversations about how those tools are used, because they're easily misused. I bring it up because Yud and his cult of personality mostly just speculate about the robot apocalypse, but since both these things use the word "AI", it's very convenient for the weirdos who want your money.

Okay, "assume he's a bad person and move on" is definitely not a good idea, but I mostly included it because it's well-sourced and I didn't want the things I was saying to sound baseless. Obviously any wiki has a degree of bias, and maybe it is a little weird to fixate on a person in this manner, but I don't think their assessment is incorrect.

Well, I mean, this is a discussion about how much disruption something new is causing. If you don't care, I'm not sure what I or anyone else is supposed to do to convince you.

It is essentially Pascal's wager but CYBERRRR and not sarcastic, yes. Rich people with outsized influence on the world have lasting effects and consequences. This is something that people pay attention to and document, both because it can cause a lot of problems, and because understanding those things is an important step in actually solving them. I feel like you're not really seeing the forest for the trees here. Objectively, these kinds of technological leaps should be a great thing that makes a ton of work easier for everyone. But like all other forms of automation, the system they exist under is ripe for abuse. It's good to talk about that.
Upcoming discussion / possible debate with AI expert Eliezer Yudkowsky
Shaddy replied to Ross Scott's topic in General News
I mean, I don't think he's gonna pose a danger to Ross or anything, but Yudkowsky and his LessWrong friends are just cranks. He's not really a programmer, not really an expert beyond having written a lot of things, and the "research" his foundation does with all the giant piles of money donated to it amounts to wild speculation about sci-fi shit becoming real. The current wave of neural network popularity and discourse is probably a convenient advertising strat, but it's mostly unrelated to the robot apocalypse they keep predicting.

If we wanna talk about the threats of AI, we're going to be talking about automation, impersonation, data and statistical bias: things that accentuate the problems we already face in the real world due to centuries of compounded socioeconomic tension and systemic injustice. The current problems are not a break with tradition, they are the tradition, only more. By contrast, the thing that Yudkowsky (and Elon Musk, in case you wanted more evidence it's a fake problem) is most afraid of amounts to a time-traveling BonziBuddy that tortures you for not thinking about it hard enough. I highly recommend RationalWiki's dutifully maintained and sourced page for more info on all this garbage. A lot of it is really funny.
Upcoming discussion / possible debate with AI expert Eliezer Yudkowsky
Shaddy replied to Ross Scott's topic in General News
This dude claims he sat in a room for a while and logically solved all of human morality by thinking really hard. I hope Ross is prepared for an hour of bad-faith arguments followed by a bunch of crypto chodes claiming the other guy won.
Eliezer Yudkowsky is, in fact, just some chump.
-
I thought this was just a joke about the title, but no. That really is what the game is
-
That's awfully optimistic. Given the past...okay, forever, but especially during Donny's run, I can't be anything but cynical at the idea of anyone with power in this country facing real justice. Also, we haven't exactly made significant progress against the voter suppression that has won Republicans so many prior elections, and it's clear that Democrats just...hate being in power? I'm not sure any political party is more self-destructive right now, other than maybe the current UK Labour Party. Republicans have been just fucking disgusting, but there's nothing about their current strand of reactionary absurdity that breaks with their regular, traditional reactionary absurdity.
-
Sorry! I was definitely being a bit aggressive before. It IS cool work; the language of "solved" just put me off a bit, because this is a thing I think is both important and interesting, and sometimes seeing a work-in-progress sold as a solution gets under my skin and makes me worry that it's not gonna continue improving. You see it in reverse, too! People talk about Y2K like it was a fake problem people were wrong to ever worry about, instead of an issue that a lot of people worked very hard to prevent from being too destructive. We're probably gonna see the same mentality brought out whenever we get around to climate mitigation.
-
"This is solved now" bruh, no it's not. The advances in neural net tech have been impressive (and make a lot of the ideas in Ross's video seem achievable), but they clearly aren't at "solution" level.
-
I should probably watch FLCL again after like six years, but I still vividly remember all the pillows songs.
-
I think it'd take longer to finish all that anyway, but doing things in that manner would be both inefficient and potentially harder. I don't think anyone wants a repeat of the late-2014 cram to finish Freeman's Mind 1.
-
I wonder if they'll take this time to finally finish that fucking comic book.