slatestarcodex

Probably what *this* should be called.

Post by Doug » Mon Apr 19, 2021 7:02 pm

I also got 8 wrong, that's funny
Spoiler!
I thought the "atoms densely packed" one was a trick question, and that the trick was that it's the molecules that are more densely packed. But it wasn't a trick question lol
My confidence on animal questions was low, but I aced them. My confidence on population questions was also low, and I missed three


[attachment: Answers.png]
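
A minimal sketch (not from the thread; the numbers are made up) of how a calibration quiz like the one above gets scored: bucket the answers by the confidence you stated, then compare each bucket's stated confidence to the share you actually got right.

Code:
from collections import defaultdict

def calibration_report(answers):
    """answers: list of (stated_confidence, was_correct) pairs."""
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)
    for confidence in sorted(buckets):
        results = buckets[confidence]
        accuracy = sum(results) / len(results)
        print(f"stated {confidence:.0%}  actual {accuracy:.0%}  (n={len(results)})")

# made-up answers, roughly "low confidence on animals but aced them,
# low confidence on populations and missed three"
calibration_report([
    (0.55, True), (0.55, True), (0.55, True),    # animal questions
    (0.65, False), (0.65, False), (0.65, False), # population questions
    (0.95, True), (0.95, True),
])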

Post by Doug » Mon Apr 19, 2021 7:08 pm

They should have had a 100 percent bet the farm confidence level

Post by Bracketbot » Mon Apr 19, 2021 7:51 pm

Doug wrote: Mon Apr 19, 2021 7:08 pm They should have had a 100 percent bet the farm confidence level
they should have a mandatory donation level, i.e. you have to give them your credit card details to activate it, and if you get it wrong they charge your card


Post by Skeletor » Mon Apr 19, 2021 7:58 pm

Spoiler!
[attachment: 284F114E-8680-4997-AECE-F4C165D319E9.jpeg]

Post by Ashenai » Mon Apr 19, 2021 8:00 pm

Which 95% did you get wrong?


Post by Skeletor » Mon Apr 19, 2021 8:04 pm

Same as you

Post by Ashenai » Mon Apr 19, 2021 8:09 pm

yesss


Post by Skeletor » Mon Apr 19, 2021 8:19 pm

In fact I still don't believe it. Next you're gonna tell me that their knees are actually ankles

Post by Khaos » Mon Apr 19, 2021 8:25 pm

i got the flamingo one right. i remember watching an episode of the magic school bus where arnold turns orange from eating too many cheetos or something like that


Post by Jeb Bush 2012 » Mon Apr 19, 2021 8:30 pm

Spoiler!
I got the camel and the flamingo ones wrong, but with very low certainty, because in both cases I figured "eh, I've heard of this but it also sounds like the kind of thing that would be false common knowledge", which I guess is not terrible from a calibration standpoint


Post by Crunchums » Mon Apr 19, 2021 8:57 pm

https://astralcodexten.substack.com/p/y ... nd-poverty this feels convincing
which makes me want to read things that argue against it, but https://www.academia.edu/1338040/A_Sear ... f_Georgism like i'm gonna read that, let alone https://ideas.repec.org/a/kap/revaec/v2 ... 1-461.html a reply to it

Post by Khaos » Mon Apr 19, 2021 9:42 pm

Image


Post by Crunchums » Tue May 11, 2021 2:43 am

https://astralcodexten.substack.com/p/t ... ne-culture oh boy


Post by pterrus » Tue May 11, 2021 1:34 pm

Crunchums wrote: Tue May 11, 2021 2:43 am https://astralcodexten.substack.com/p/t ... ne-culture oh boy
memetic history
Thanks for posting this. I am now cautiously optimistic that NPR will not be all woke racial issues 24/7 until the heat death of the universe.


Post by Doug » Tue May 11, 2021 1:40 pm

The real conversation is about iron stars

Post by pterrus » Tue May 11, 2021 1:54 pm

Gaining a life every time a player casts a red spell is as real as it gets.


Post by Crunchums » Tue May 11, 2021 6:02 pm

meditations on moloch indeed

Post by pterrus » Tue May 11, 2021 6:42 pm

Wow that is high quality even for GPT-2.


Post by Ashenai » Tue May 11, 2021 6:50 pm

Deep Leffen uses GPT-3 now, and it's really impressive (even if the results are curated)


Post by Crunchums » Tue May 11, 2021 9:06 pm

imo that is not even close to peak-funny levels for that account
(i just posted it because Moloch. oh hey there's more of them
https://twitter.com/search?q=from%3A%40 ... yped_query )

two of my favorites:
https://twitter.com/DeepLeffen/status/1 ... 4438643713
https://twitter.com/DeepLeffen/status/1 ... 6256775168

Post by Crunchums » Tue May 25, 2021 10:37 pm

Peer Review Request: Depression
Give me feedback on my take on fighting depression
Vitamin C powder; cutting out refined sugar, most meats, coffee, dairy, and soda; yoga; and skateboarding work for me
u gotta skate

Post by Crunchums » Sun Jun 27, 2021 5:46 am

i enjoyed this https://liminalwarmth.com/on-the-assaul ... -memeplex/ and also "Moloch" is in it

Post by Crunchums » Thu Jul 01, 2021 8:49 pm

the thing i've never understood about the risk of a super-AI is, ok, you can build AIs that are superhuman at individual tasks like Chess or Go or videogames or whatever. but those are very well-defined sorts of problems - there are discrete game states, win conditions are clear, etc. and GPT is impressive, but ultimately it's just spitting out some output based on some input. I don't get how that's ever getting you to something that's actually capable of understanding the real world in all its complexity.

my best understanding is that it's something like: human brains are capable of that understanding, so AI having that sort of understanding is definitely possible. which i buy. but i don't see how AlphaZero/GPT gets us any closer to that beyond "hey look there is progress being made in the field of ML"
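
A toy illustration (not from the thread; Nim stands in for chess to keep it tiny) of what "well-defined" means here: the entire problem fits a small, discrete interface with an explicit state, explicit legal moves, and an explicit win condition, which is exactly what "go act in the real world" doesn't give you.

Code:
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class NimState:
    stones: int   # the whole game state: discrete and fully observable
    to_move: int  # player 0 or player 1

    def legal_moves(self) -> List[int]:
        # every available option is enumerable
        return [n for n in (1, 2, 3) if n <= self.stones]

    def play(self, n: int) -> "NimState":
        return NimState(self.stones - n, 1 - self.to_move)

    def winner(self) -> Optional[int]:
        # unambiguous win condition: whoever takes the last stone wins
        return 1 - self.to_move if self.stones == 0 else None

s = NimState(stones=5, to_move=0)
print(s.legal_moves())              # [1, 2, 3]
print(s.play(3).play(2).winner())   # 1 (player 1 took the last stone)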

Post by Ashenai » Thu Jul 01, 2021 9:01 pm

Crunchums wrote: Thu Jul 01, 2021 8:49 pm GPT is impressive, but ultimately it's just spitting out some output based on some input.
That's all you're doing too.
I don't get how that's ever getting you to something that's actually capable of understanding the real world in all it's complexity.
Nobody is capable of this.
my best understanding is that it's something like: human brains are capable of that understanding, so AI having that sort of understanding is definitely possible. which i buy. but i don't see how AlphaZero/GPT gets us any closer to that beyond "hey look there is progress being made in the field of ML"
GPT proves that many problems we thought of as "higher thought" can be reduced to complicated pattern-matching. For example, did you know that GPT-3 can write code?

And you can of course try to explain how it's not really writing code, right, it's just "spitting out some output based on some input". But when you didn't yet know that GPT-3 can do this, if I asked you if a simple GPT-type deep learning system could do this, you would have thought obviously not. I thought obviously not. I thought writing code based on a simple and somewhat vague description of the desired result would be evidence of "true AI".

Well, I guess it's not. But what's actually happening is that we've been forced backwards again in our God Of The Gaps-style strategy of defining real thinking as "whatever humans can do but machines can't".

It used to be that the test of truly intelligent AI was the Turing Test. Then machines passed the Turing Test, and we decided that oh, that was not good enough actually.

Writing code is not good enough anymore either.

What is good enough? What defines "true AI" or "real thought"? (It has to be something testable, no handwaving about "being aware of its own existence" please.)

The answer is we don't know, but the REAL answer is that we don't want to say because the gaps are rapidly getting smaller and narrower. The God Of The Gaps that is "actual thought" or "self-consciousness" or "real understanding" or "true AI" is sort of... vanishing.
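
Not the example Ashenai linked, just a made-up illustration of the kind of thing being described: a vague, human-style prompt and the sort of completion a GPT-3-class model can produce, judgement calls included ("adults" read as 18+, "oldest first" read as descending age).

Code:
# Prompt (vague, human-style): "write me something that takes a list of
# people with their ages and gives me back just the adults, oldest first"

def adults_oldest_first(people):
    """Return the entries aged 18 or over, sorted oldest to youngest."""
    adults = [p for p in people if p["age"] >= 18]
    return sorted(adults, key=lambda p: p["age"], reverse=True)

print(adults_oldest_first([
    {"name": "Ann", "age": 34},
    {"name": "Bo", "age": 12},
    {"name": "Cy", "age": 71},
]))
# [{'name': 'Cy', 'age': 71}, {'name': 'Ann', 'age': 34}]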


Post by Crunchums » Thu Jul 01, 2021 9:21 pm

Ashenai wrote:
Crunchums wrote: Thu Jul 01, 2021 8:49 pm GPT is impressive, but ultimately it's just spitting out some output based on some input.
That's all you're doing too.
in a sense, but my inputs are not discrete. the real world is not like a game of chess or a block of text. and i get that from a certain point of view you can just say "yeah it is, they're both just data", but i still think that those two problems (chess and spitting out text based on a prompt) are in some sense less complicated than starcraft or paperclip maximization.
I don't get how that's ever getting you to something that's actually capable of understanding the real world in all it's complexity.
Nobody is capable of this.
i mean a less strict meaning of "understanding" than you do. i'm talking about the level of understanding where if you ask me to be a paperclip maximizer, i could take action in pursuit of that goal. chess is a game of perfect information and when you move a bishop you know exactly what that means. text is what it is. buying a paperclip factory (for example) is "fuzzier"

Post by Rylinks » Thu Jul 01, 2021 9:22 pm

general intelligence is not defined with respect to any specific task! that's what makes it general


Post by Doug » Thu Jul 01, 2021 9:25 pm

I don't think anybody knows exactly how to test someone else for sentience, but nonetheless we should make Data a lieutenant in Starfleet and let him command the Enterprise when everyone else is absent

Post by Ashenai » Thu Jul 01, 2021 9:26 pm

Crunchums wrote: Thu Jul 01, 2021 9:21 pm i mean a less strict meaning of "understanding" than you do. i'm talking about the level of understanding where if you ask me to be a paperclip maximizer, i could take action in pursuit of that goal. chess is a game of perfect information and when you move a bishop you know exactly what that means. text is what it is. buying a paperclip factory (for example) is "fuzzier"
Did you check the code writing example I linked? That was a fuzzy description! That's the impressive thing. It didn't translate a precise specification into code. That would have also been impressive, but what it did was way more mindblowing: it took a human-style explanation of what it needed to do, and then did it, making judgement calls just like a human would have done.


Post by Ashenai » Thu Jul 01, 2021 9:28 pm

Rylinks wrote: Thu Jul 01, 2021 9:22 pm general intelligence is not defined with respect to any specific task! that's what makes it general
That doesn't answer my question, unless you're going for the ultimate cop-out answer of "I know it when I see it" or "it is impossible to tell if an entity is intelligent, regardless of its behavior or how many or what types of tests it passes"


Post by Crunchums » Thu Jul 01, 2021 9:29 pm

Crunchums wrote: i mean a less strict meaning of "understanding" than you do. i'm talking about the level of understanding where if you ask me to be a paperclip maximizer, i could take action in pursuit of that goal. chess is a game of perfect information and when you move a bishop you know exactly what that means. text is what it is. buying a paperclip factory (for example) is "fuzzier"
like if you take GPT or AlphaZero and hook it up to the world with some input and some output and tell it "maximize paperclips", what's going to happen? well what are your inputs and outputs is the first question. outputs, for the sake of example say you give it a twitter account (that you publicize the existence of). inputs, feed it paperclip sale data (from amazon or staples or whatever).

i don't think this system ever takes off (in the sense of paperclipping the entire universe or whatever) because it's not "seeing" enough of the world. ok so you've got to feed it more input somehow. you give it the ability to follow other twitter accounts and to see what they are posting. i still don't think this system ever takes off for the same reason. even if it could "see" the whole world i still don't think it's getting there, because maximizing paperclips is not discrete in the way that chess is.

but my focus is, how do you get it to see the world? is the answer here just "expose it to enough of the internet and it will figure it out"?
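
A minimal sketch of the loop being described, with every interface stubbed out (none of these functions are real Twitter or sales APIs, and the names are made up); the point is how narrow the system's window on the world is.

Code:
import random

def fetch_paperclip_sales():
    """Stub input channel: one sales figure per step is all the model 'sees'."""
    return random.randint(900, 1100)

def post_tweet(text):
    """Stub output channel: emitting text is all the model can 'do'."""
    print(f"[tweet] {text}")

def propose_tweet(history):
    """Stub for a GPT-style model choosing the next tweet from past (tweet, sales) pairs."""
    return f"Buy paperclips! (attempt {len(history) + 1})"

def run_agent(steps=3):
    history = []
    for _ in range(steps):
        tweet = propose_tweet(history)
        post_tweet(tweet)
        history.append((tweet, fetch_paperclip_sales()))
    # the only feedback is the sales trend; the system never sees factories,
    # supply chains, or anything else about the world
    return history

run_agent()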