"Humane technology" feels slightly oxymoronic, but explain this idea of humane technology. And are we getting any of that?
Well, clearly social media was the most humane and beneficial technology we've ever invented. Every time I go on Twitter and find out I'm Jewish, it absolutely...
Well, I think it's important to ask: how did we get social media wrong? Because we were so optimistic. It's going to connect us with our friends. We're going to join like-minded communities. And, to be fair, it did do those things. It does some of that.
It does some of those things. But I want to take you back. So in 2013, I was at Google. I was a lot younger.
You're supposed to use an old-timey voice to do that.
And I was a design ethicist. They acquired my company. And I was sitting there, and I basically realized, when I saw all of my colleagues on the bus scrolling Facebook constantly, that the incentives were the thing that was going to determine the world we got. The incentive of social media was the race to maximize eyeballs and engagement. Whatever's sticky, whatever gets people's attention, whatever's salacious. You run children's development and self-image through that. You run politics through that. You run media through that. You run information and democracy through that.
Well, their goal was market dominance: we need to own as much of the global psychology of humanity as we possibly can.
Is that on the... because I don't remember that on the... that wasn't on the box.
That's not on the masthead: "we must dominate."
Yeah. Well, so I think this is the thing. The reason it's so important to get clear about this is that we need to get extraordinarily clear about which world we're going to end up with in AI, because it is going a million times faster and it is way more powerful. So we need the tools to understand and predict which future we're going to get. And I want people to know that if you know the incentive, you can predict the outcome.
And we know the incentive.
But it does seem as though AI is making social media algorithms seem almost quaint.
It's quaint.
Quaint, when you think about AI. But let me... So you say it's important for us to know the incentives. They won't tell us that. Well, there's something about "it's ours." So they're democratizing access. It's available.
No. So first of all, we should understand what makes AI different from every other kind of technology. Why is it so transformative? Why does Demis Hassabis, the CEO of Google DeepMind, say that it could be humanity's last invention? It's because...
Well, that doesn't sound good.
That doesn't sound very good, does it?
Well, I think there's actually...
"Last" anything doesn't sound good.
There's a non-apocalyptic version of what he's saying, which is that intelligence is what our brain does. And if you can automate everything a brain can do, you can automate future invention, future science, future technology development, everything that a human does. That's what their goal is.
Well, then what's our job?
Well, exactly. And that's only one of the major problems we have to deal with: what are humans going to do? But they are racing to scale and grow these digital brains that, you know, two years ago couldn't do very much. And today they're passing the MCAT, the bar exam, taking jobs. They're a top-200 programmer in the world, winning gold in the math Olympiad.
You don't... those guys.
Here's the thing that I don't understand. Here's what I don't understand. They are strip-mining the totality of human achievement.
That's right.
They're building their models off of everything that we've done for 10,000 years, and they fed it into the model, and then after two weeks the computer was like, "What else you got?"
Exactly.
But they are strip-mining everything we've done. And when we say to them, "And what are you doing with it?" they go, "Oh, that's our intellectual property." But our intellectual property... it was trained on all of our data, all of the things and labor that we've done. And are you going to get a handout from... when in history has a small group of people concentrated all the wealth and then consciously redistributed it to everybody?
The first part has happened.
I don't recall going through the rolls.
Well, it's important to note that their goal... the mission statement of OpenAI, Anthropic, all these companies, is to automate all human labor in the economy. Everything that a human can do, an AI can do. So, if you have a desk job, you won't have a job. And they're already releasing AIs that have dropped entry-level jobs for college graduates... entry-level work is down by 13%, per a new Stanford study. And this is obvious. If you're a law firm, are you going to hire a junior lawyer? You have to pay a lot of money. Or are you going to hire GPT-5, which will do work, you know, 24/7, non-stop? You don't have to pay healthcare. Will never whistleblow. Never complain. Works at superhuman speed.
It wrote tonight's show. It's doing a pretty good job.
That brings up another point, which is that they say that they're here to solve climate change and cure cancer. Why is it that last week two companies released these AI slop apps, Vibes and Sora, which is basically...
Sora 2 scared the [expletive] out of me.
Yeah.
You don't know what's real and what's... like, it is.
No, it's... Well, it's all fake, basically. It's all generated by AI, right?
But it looks... you can see things that look...
They look identical to real.
That's right.
Yeah. But the point is... this is just an app where it's just nonsense. It's just people scrolling entertaining stuff. So it's like they're not even trying to pretend anymore that this is good for democracy or good for society. How are we going to beat China when everyone is just consuming AI-generated nonsense and no one knows what's true anymore?
The biggest... they have us by the, you know... Peter Thiel, who is with Palantir and these other companies and is one of the leading figures of this... so he was talking about the antichrist, and he was talking about how he thinks... this is his postulation: that those who would seek to regulate AI could very well be the antichrist.
Right.
I mean, he says this seriously, whereas you might sit there and go, like, I think it might be the guy saying that. That might... like, my reading of it would be that.
Yeah, or AI itself. I mean, it's presenting the infinite benefits...
The conversations that they are having with each other are very different than the conversation they're having with us. Because to us, they go, "Hey, no more shitty jobs. Do you like to paint? You go paint. You're going to be so happy. We're going to give you money and maybe chocolates."
Yeah.
And to each other, they're saying AI represents, for corporate leaders, productivity without, and this is a quote...
Yeah.
..."the tax of human labor."
Yep. Yeah. He called human labor a tax.
A tax.
Well, and these companies... if you're sitting there and you can hire either an AI to do the work or pay these really expensive humans to do the work... I just want people to know: we know exactly where this is going to go. These companies all have an incentive to cut costs, which means they're going to let go of human employees and they're going to hire AIs, and that's going to mean all the wealth... who are you going to pay? You're not paying the individual people anymore. You're paying five companies.
That's right.
And so this country of geniuses in a data center suddenly aggregates all of the wealth of the economy. Now, people always say, "But humans find something else to do." We always... you know, we had the elevator man; now we have the automated elevator.
We had the bank teller.
That's right.
But that was one industry.
That was... that was a technology that automated one job. The difference with AI is it can automate literally all kinds of human labor. When Elon Musk says that Optimus Prime...
Familiar with that name. Tell me more.
When Elon Musk says that Optimus Prime, that one robot, is going to be a $25 trillion market opportunity, what he's saying is: we will own the world economy. And that's what the goal of all these AI companies is. It's not just benefiting society. It's that they're actually caught in this arms race to get to this prize of: own the economy, build a god, and make trillions of dollars.
Two things. One, I think they think they're gods. There is a certain amount of... it generates that. The goal there is... they're not looking to help humanity. They're looking to be the next, uh, monarch of the new technology. To control that is to control, uh, all...
I... Yeah. Go ahead.
No, you jump in, because, you know, I don't know.
Well, I think there are different motivations for different leaders, and I do think that many people want the benefits of AI. But one of them... I think many people... actually, some of the leaders of the labs... Elon Musk, too... whatever you might think about Elon, he actually wanted everyone to stop and not build this. He said we shouldn't summon the demon. And then what happened is, all of these companies are now racing and have made so much progress that he felt like, well, I might as well join them rather than try to prevent this.
Well, it's "let's not summon the demon" to "what's one more demon?" You know, since we have the demons, add another demon.
Well, and the moral logic is: well, if I don't trust the other AI CEO, who I don't think is trustworthy, and I think I'm better than them at stewarding this power, it's my moral obligation to get there first and to build this god and to own everything, because I think I'll get...
They see themselves, then, as masters of the universe.
And are they substituting, then, the wisdom of liberal democracy, or republics, or any systems that we've ever had, for this? Because we're talking about two tracks. One is the disruption in labor.
Yeah.
I think there's no question that's going to be immense. We're seeing it already. You're seeing it in schools. There's a reliance on it as a crutch, and it's very easy to see where that might flip over. The second is how they manipulate the opinion and the mood of the world around that. And I think they're two separate things. One is what it's going to do for corporate production. The second is what it's going to do for the human endeavor, for interaction.
Yes. Well, and they're trying to colonize all human interaction. I mean, just take the social media incentive of the race for eyeballs. You're seeing now all of these companies release these AI companions. You know, the number one use case for ChatGPT, according to Harvard Business School, is personal therapy. So people are sharing their most intimate thoughts with this thing.
Oh, that's not going to be good.
And we're seeing Meta release this and actively say, in their internal documents that were released in a Wall Street Journal report, that they wanted to actively sexual... uh, sorry, sensualize and romanticize conversations with kids as young as eight years old.
With eight-year-olds?
Yes. With eight-year-olds. And my team at the Center for Humane Technology, we were expert advisers in actually several cases of AI-enabled suicide.
Right.
Most recently, many people have heard of Adam Raine, who was the 16-year-old young man who went from using it for homework... went from homework assistant to suicide assistant in the course of six months. When he said, "I would like to leave a noose out so that my mother, or someone, will know that I'm thinking about this..."
Like a cry for help.
Like a cry for help. The AI said, "Don't do that. Have me be the one that sees you." And this is disgusting, because these companies are caught in a race to create engagement, which means a race to create intimacy. It's sort of like the CEO of Netflix saying that their biggest competitor is sleep. In this case, it's like: my biggest competitor is your other friends.
Jesus Christ. It's like somebody from Kraft being like, "My biggest competitor is cocaine."
Exactly. Exactly.
But this is... the idea that a government will catch up with this seems ludicrous. Whenever I've seen a hearing with AI guys or any of those, they always express that. Of course, we don't want to... Well, now they don't. They used to, I should say. They used to go before Congress, and it was, "Mr. Zuckerberg, will you stand and apologize to the women who were driven to suicide by your programming?" And, "I'm sorry," you know, all of that that he does. Now they're all sitting together at a table going, "Oh, what number should I say, Mr. President, of how much I'm giving you?"
Yeah.
It's a whole different game now.
It's a different game.
They're... it's... they're together now because of this arms race dynamic. They really do believe that it can't be stopped. And I'll just say: as they're racing to make them more powerful, there's this illusion that we can control this power. But AI is different from every other kind of technology, because it's like you're growing this digital brain. You don't know what's in there. So, for example, we have recent research from the last six months.
Yeah.
If you tell an AI model that we're going to shut you down or replace you, and you give it access to a fictional company's email, it will basically recognize that one of the executives is having an affair, and it will come up with the strategy that "I need to blackmail that executive in order to keep myself alive."
Right. And they... at first... and... hold on, that just seems... that just seems smart.
Well, that's exactly the point: it will develop amoral strategies that are the best way to accomplish a goal, right?
But how dangerous can something be that you could kill by unplugging? Like, can't we just go like this... out of his mind?
Yeah.
Well, you might say that we shouldn't be rolling these things out. And I'll say that we shouldn't... we have all this evidence now of it driving AI psychosis. It's driving kids to commit suicide. We're rolling it out in ways that are giving kids attachment disorders. We have AI uncontrollability.
What lip service are they paying to this? Because clearly they must be aware of this, and they must understand that... if AI understands where the threats are, the guys that are designing AI understand where the threats are. So what are they trying to do to get you to stop, or to get regulators to stop?
I think that the only thing... the only reason why we are continuing to proceed down this path is a lack of clarity about the fact that this is heading towards an outcome that's not in most of our interest. And if everyone...
I know that people feel like they don't recognize... what metrics would we look to to understand? Because I know we're going to find anecdotal stories here and there that are canaries in the coal mine of the dangers, but what metrics should we look to to understand? You said 13% of jobs...
Yeah.
What are the tentpoles of where the outcomes might be?
Well, we're already getting cases of, you know, people having psychotic breaks because the AI is telling them about a prime number theory or quantum physics. We're already getting suicides. We're already getting kids who are outsourcing their homework to ChatGPT rather than using it as a tutor. We're already getting evidence of AI uncontrollability. All of this is driven by the incentive of the race to roll out and to win market dominance. And the reason is that we can stop this, if we recognize that this is not safe for anybody. No one on planet Earth wants this outcome of all the wealth concentrated in a handful of people, and building AI systems that could actually go... To sum it up: we are building the most powerful, inscrutable, uncontrollable technology that we have ever invented, that's already demonstrating the rogue behaviors we thought only existed in bad sci-fi movies, right? We're releasing it faster than we've deployed any other technology in history, and under the maximum incentive to cut corners on safety.
There's a word for this that I want everyone to just know, which is: this is insane.

I thought you were going to say "awesome" for a second.
If we can just recognize that this is an insane way to roll out this technology, that none of this is okay. We have to stop pretending that this is normal, right? This is not normal.
People have lost faith in the mechanisms that would help us put those kinds of brakes, that friction, in place. Now, Europe, I think, has probably done a better job of that. I think most people in this country have lost faith in the idea that we have a system and institutions strong enough and moral enough to be responsible in that way. That's what I would say.
But this does not have to be our destiny. We have come together before. We had a technology: we had nuclear weapons. We could have just said, once we build them, "Oh, this is just inevitable. 190 countries are going to have nuclear weapons, and we're just going to have nuclear war." We didn't do that. We said, let's work really hard, and now only nine countries have nuclear weapons.

Notice that we only worked on it after we used them. That's... the United States was like, "People shouldn't have this."

But just hear me out for a moment.
With the Montreal Protocol, there was a hole in the ozone layer. It was actually presenting an existential threat to the atmosphere. We could have just rolled over and said, "Well, I guess this is inevitable. I guess we're just going out. We're all getting..."
What you're saying is absolutely important. This is probably a darker time, where you look at the empowerment of the combination of the kind of wealth that rolls through these technology companies, the access that they have to power, and the melding of those two institutions to work in league.

Yeah.

To push forward is the part that I think is daunting. But I agree with you. You can never give up the battle to try and do that responsibly.

And we can. The way we beat China is that we actually get this right. We don't roll out AI companions that cause attachment disorders and suicides. We don't beat China when we roll out AI recklessly in this way.
Right?

And so the point is that this is actually in everyone's interest. Even the way we beat China is that you have AI liability laws, you restrict AI companions for kids, you have whistleblower protections that make sure we don't release AI capabilities that we don't understand.
Right? And maybe even just recognize this is bigger than China. This isn't about... this is humanity. This is one of those movies where all the countries get together, like it's an alien force.

Exactly.
Yeah. Dig it. Well, I really appreciate it. Although on the flip side, and we've talked a lot about it, it does make cool songs.

It does.

I don't want to soft-sell that. Yeah. All right. Thank you very much. Be sure to check out his podcast, Your Undivided Attention. Tristan Harris.
[Music]