
Will AI kill all of us? | Yuval Noah Harari and Lex Fridman

Video tags

ai
ai clips
ai podcast
ai podcast clips
artificial intelligence
artificial intelligence podcast
computer science
consciousness
deep learning
einstein
elon musk
engineering
friedman
joe rogan
lex ai
lex clips
lex fridman
lex fridman podcast
lex friedman
lex mit
lex podcast
machine learning
math
math podcast
mathematics
mit ai
philosophy
physics
physics podcast
science
tech
tech podcast
technology
turing
yuval noah harari
Subtitles

00:00:02
Lex Fridman: Let me ask you kind of a big philosophical question about AI and the threat of it. Let's look at the threat side. Folks like Eliezer Yudkowsky worry that AI might kill all of us. Do you worry about that range of possibilities, where artificial intelligence systems, in a variety of ways, might destroy human civilization?

00:00:30
Yuval Noah Harari: I talk a lot about the dangers of AI. I sometimes get into trouble because I depict these scenarios of how AI becomes very dangerous, and then people say that I'm encouraging these scenarios. But I'm talking about it as a warning. I'm not so terrified of the simplistic idea, the Terminator scenario of robots running in the street shooting everybody. I'm more worried about AI accumulating more and more power and basically taking over society, taking over our lives, taking power away from us, until we don't understand what is happening and we lose control of our lives and of the future.
00:01:15
The two most important things to realize about AI... you know, so many things are being said now about AI, but I think there are two things that every person should know. First, AI is the first tool in history that can make decisions by itself. All previous tools in history couldn't make decisions; this is why they empowered us. You invent a knife, you invent an atom bomb; the atom bomb cannot decide to start a war, cannot decide which city to bomb. AI can make decisions by itself. Autonomous weapon systems can decide by themselves who to kill, who to bomb. The second thing is that AI is the first tool in history that can create new ideas by itself. The printing press could print our ideas, but could not create new ideas. AI can create new ideas entirely by itself. This is unprecedented. Therefore it is the first technology in history that, instead of giving power to humans, takes power away from us.
00:02:27
And the danger is that it will increasingly take more and more power from us, until we are left helpless and clueless about what is happening in the world. And this is already beginning to happen at an accelerating pace. More and more decisions about our lives, whether to give us a loan, whether to give us a mortgage, whether to give us a job, are taken by AI. And more and more of the ideas, the images, the stories that surround us and shape our minds, our world, are produced or created by AI, not by human beings.

00:03:07
Lex Fridman: If you can just linger on that: what is the danger of that, that more and more of the creative side, the idea generation, is done by AI? Is it that we become stale in our thinking? Is it that idea generation is so fundamental to the evolution of humanity, but we can't resist the ideas, because to resist an idea you need to have some vision of the creative process?

00:03:35
Yuval Noah Harari: Yeah. Now, this is a very old fear.
00:03:39
um you go back to Plato's Cave
00:03:41
some of this idea that people are
00:03:43
sitting chained in a cave and seeing
00:03:47
shadows on a screen on a wall and
00:03:50
thinking this is reality
00:03:51
you go back to to the cart and he has
00:03:54
this thought experiment of the of the
00:03:57
demon and the card asks himself how do I
00:04:00
know that any of this is real maybe
00:04:02
there is a demon who is creating all of
00:04:05
this and is basically enslaving Me by
00:04:08
surrounding me with these Illusions you
00:04:11
go back to Buddha it's the same question
00:04:13
what if we are living in a world of
00:04:16
Illusions and because we have been
00:04:18
living in it throughout our lives all
00:04:21
our ideas or our desires how we
00:04:24
understand ourselves this is all the
00:04:26
product of the same illusions
00:04:29
And this was a big philosophical question for thousands of years. Now it's becoming a practical question of engineering. Because previously, as far as we know, maybe we are living inside a computer simulation of intelligent rats from the planet Zircon; if that's the case, we don't know about it. But taking what we do know about human history until now, all the stories, images, paintings, songs, operas, theater, everything we've encountered that shaped our minds, was created by humans. Now, increasingly, we live in a world where more and more of these cultural artifacts will be coming from an alien intelligence. Very quickly we might reach a point where most of the stories, images, songs, TV shows, whatever, are created by an alien intelligence. And if we now find ourselves inside this kind of world of illusions, created by an alien intelligence that we don't understand but that understands us, this is a kind of, you know, spiritual enslavement that we won't be able to break out of. Because it understands us, it understands how to manipulate us, but we don't understand what is behind this screen of stories and images and songs.
00:06:02
Lex Fridman: So if there's a set of AI systems operating in the space of ideas that are far superior to ours, and it's opaque to us, we're not able to see through it, how does that change the pursuit of happiness, the human pursuit of happiness, of life? Where do we get joy, if we're surrounded by AI systems that are doing most of the cool things humans do much better than us?

00:06:32
Yuval Noah Harari: You know, some of the things, it's okay that the AIs would do them. Many human tasks and jobs are, you know, drudgery; they are not fun, they are not developing us, I would say, emotionally or spiritually. It's fine if the robots take over. I don't know, I think about the people in supermarkets or grocery stores who spend hours every day just, you know, passing items and then charging you the money. I mean, if this can be automated, wonderful. We need to make sure that these people then have better jobs, better means of supporting themselves and of developing their social abilities, their spiritual abilities. And that's the ideal world that AI can create: it takes away from us the things that it's better if we don't do, and allows us to focus on the most important things and the deepest aspects of our nature, of our potential.
00:07:42
If we give AI control of the sphere of ideas at this stage, I think it's very, very dangerous, because it doesn't understand us. AI at present is mostly digesting the products of human culture: everything we've produced over thousands of years, it eats all of these cultural products, digests them, and starts producing its own new stuff. But we still haven't figured out ourselves, our bodies, our brains, our minds, our psychology. So an AI based on our flawed understanding of ourselves is a very dangerous thing. I think that we need, first of all, to keep developing ourselves. If for every dollar and every minute that we spend on developing AI, artificial intelligence, we spend another dollar and another minute on developing human consciousness, the human mind, we will be okay. The danger is that we spend all our effort on developing AI at a time when we don't understand ourselves, and then we let the AI take over. That's a road to a human catastrophe.
00:09:06
Lex Fridman: Does it surprise you how well large language models work? I mean, has it modified your understanding of the nature of intelligence?

00:09:13
Yuval Noah Harari: Yes. I mean, you know, I've been writing about AI for, like, eight years now, engaging with all these predictions and speculations, and when it actually came, it was much faster and more powerful than I thought it would be. I didn't think that we would have, in 2023, an AI that can hold a conversation such that you can't know whether it's a human being or an AI, that can write beautiful texts. I read the texts written by AI, and the thing that strikes me most is the coherence. You know, people think, oh, it's nothing, it just takes ideas from here and words from there and puts them together. No, it's so coherent. You read not just sentences; you read paragraphs, you read entire texts, and there is logic, there is structure.

00:10:11
Lex Fridman: And it's convincing.

Yuval Noah Harari: Yes.

Lex Fridman: And the beautiful thing about it, and this has to do with your work: it doesn't have to be true. It gets facts wrong, but it still is convincing.

00:10:24
Yuval Noah Harari: And it is both scary and beautiful.

Lex Fridman: Yeah, that our brains love language so much that we don't need the facts to be correct; we just need it to be a beautiful story.

Yuval Noah Harari: Yeah, and that's been the secret of politics and religion for thousands of years, and now it's coming with AI.

00:10:43
Lex Fridman: So, as a person
00:10:46
who has written some of the most
00:10:48
impactful words ever written in your
00:10:50
books uh how does that make you feel
00:10:53
that you might be one of the last
00:10:56
effective human writers that's a good
00:10:59
question
00:11:00
first of all do you think that's
00:11:01
possible I think it is possible
00:11:04
um I I've seen a lot of examples of AI
00:11:07
being told right like you will know
00:11:09
Harare and what it produces has it ever
00:11:11
done better than you think you could
00:11:12
have written yourself
00:11:14
I mean on the level of content of ideas
00:11:18
no there are things I say I would I
00:11:21
would never say that
00:11:22
but when it comes to the you know I mean
00:11:25
there is again the coherence and the
00:11:28
quality of writing is such
00:11:31
that I say it's it's unbelievable how
00:11:34
good it is
00:11:35
and who knows in 10 years in 20 years
00:11:38
maybe it can do better even on in
00:11:42
according to certain measures of on the
00:11:45
level of of content
00:11:47
so that people would be able to do like
00:11:49
a style transfer to uh in the style of
00:11:53
Yvonne or Harare right anything right
00:11:56
why I should have ice cream tonight
00:12:00
and and make it convincing I don't know
00:12:01
if I have anything convincing to say
00:12:03
about this I think you'll be surprised I
00:12:05
think you'd be surprised they could be
00:12:06
an evolutionary biology explanation for
00:12:08
why ice cream is good for you yeah so I
00:12:11
mean uh
00:12:13
Lex Fridman: I mean, that changes the nature of writing, ultimately.

00:12:18
Yuval Noah Harari: I think it goes back... much of my writing is suspicious of itself. I write stories about the danger of stories. I write about intelligence while highlighting the dangers of intelligence. Ultimately, I do think that human power comes from intelligence and from stories, but I think that the deepest and best qualities of humans are not intelligence, and not storytelling, and not power. Again, with all our power, with all our cooperation, with our intelligence, we are on the verge of destroying ourselves and destroying much of the ecosystem. Our best qualities are not there. Our best qualities are non-verbal. They come from things like compassion, from introspection, and introspection, in my experience, is not verbal. If you try to understand yourself with words, you will never succeed. There is a place where you need the words, but the deepest insights don't come from words, and you can't write about them. It goes back to Wittgenstein, to Buddha, to so many of these sages before: these are the things we are silent about.

00:13:44
Lex Fridman: Yeah, but eventually, as a writer, you have to project it. You have to do the silent introspection but project it onto a page.

Yuval Noah Harari: Yes, but you still have to warn people: you will never find the deepest truth in a book; you will never find it in words. You can only find it, I think, in direct experience, which is non-verbal, which is pre-verbal.

Lex Fridman: In the silence of your own mind.

Yuval Noah Harari: Yes. Somewhere in there.

Description:

Lex Fridman Podcast full episode: https://www.youtube.com/watch?v=Mde2q7GFCrw

Please support this podcast by checking out our sponsors:
- MasterClass: https://www.masterclass.com/lex to get 15% off
- Eight Sleep: https://www.eightsleep.com/eu/lex/ to get special savings
- ExpressVPN: https://expressvpn.com/lexpod to get 3 months free
- InsideTracker: https://info.insidetracker.com/lex to get 20% off
- AG1: https://drinkag1.com/lex to get 1 month supply of fish oil

GUEST BIO:
Yuval Noah Harari is a historian, philosopher, and author of Sapiens, Homo Deus, 21 Lessons for the 21st Century, and Unstoppable Us.

PODCAST INFO:
- Podcast website: https://lexfridman.com/podcast
- Apple Podcasts: https://podcasts.apple.com/us/podcast/lex-fridman-podcast/id1434243584
- Spotify: https://open.spotify.com/show/2MAi0BvDc6GTFvKFPXnkCL
- RSS: https://lexfridman.com/feed/podcast/
- Full episodes playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
- Clips playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41

SOCIAL:
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/unsupportedbrowser
- Instagram: https://www.facebook.com/unsupportedbrowser
- Medium: https://medium.com/@lexfridman
- Reddit: https://reddit.com/r/lexfridman
- Support on Patreon: https://www.patreon.com/lexfridman
