TikTok, Boom. (Director's Cut)
Dissecting one of the most influential platforms of the contemporary social media landscape, TIKTOK, BOOM., directed by CODED BIAS filmmaker Shalini Kantayya, examines the algorithmic, socio-political, economic, and cultural influences and impact of the history-making app. This rigorous exploration balances a genuine interest in the TikTok community and its innovative mechanics with a healthy skepticism around the security issues, global political challenges, and racial biases behind the platform. A cast of Gen Z subjects, led by influencer Feroza Aziz, remains at its center, making this one of the most needed and empathetic films exploring what it means to be a digital native.
With this new work, Kantayya, a Sundance Fellow, continues her engagement in the space where technology meets, amplifies, and opposes our humanity. Her incisive, current look at the power and complexity of tech continues to advance a conversation that is bettered by her careful stewardship.
"A thoughtful conversation starter."
The Hollywood Reporter
"An impressive scope with its journalism."
Nick Allen, RogerEbert.com
"An intriguing exploration."
CBS News
"Engrossing...a cliffhanger"
The Wrap
"A great little historical film that reminds us where this generational new outlet came from."
IndieWire
"Sprightly, informative...there are more levels to the TikTok phenomenon than there are to almost any other blockbuster app."
Variety
"...the documentary shows that there's so much more going on behind the scenes that most of its users aren't even aware of."
Golden Globe Awards
"Raises many pressing concerns about the power of TikTok and its lack of accountability"
That Shelf
"The film is a story of personal branding versus personal privacy."
POV Magazine
"The documentary, TIKTOK, BOOM. is here to remind you of the good and the bad."
Geek Vibes Nation
"Incisive look at the power and complexities of technology"
Black Film
"Shalini Kantayya's is the kind of film that should be mandatory viewing for the rising numbers of kids who dream of being social media influences one day..."
Slash Film
"At once fascinating and frightening...insightfully explores the social media platform."
Flickering Myth
"Fascinating and insightful"
Splash Magazine
Citation
Main credits
Kantayya, Shalini (film director)
Kantayya, Shalini (film producer)
Dinerstein, Ross M. (film producer)
Mynard, Danni (film producer)
Aziz, Feroza (on-screen participant)
Other credits
Director of photography, Steve Acevedo; editor, Seth Anderson; original score, Katya Mihailova.
Distributor subjects
No distributor subjects provided.
Transcript
00:00:30.034 --> 00:00:33.935
FEROZA:
TikTok was a new app.
00:00:34.069 --> 00:00:39.535
I knew millions of people
were downloading it by the day.
00:00:39.668 --> 00:00:42.602
And I just thought,
"I want to post on a platform
00:00:42.702 --> 00:00:46.768
where younger
generations are."
00:00:46.869 --> 00:00:49.335
Once I got into junior year
of high school,
00:00:49.435 --> 00:00:51.601
I was like, "Okay,
I'll give TikTok a try."
00:00:53.502 --> 00:00:56.435
I remember when I started
seeing views pile up
00:00:56.568 --> 00:00:58.834
in the thousands,
and hundred-thousands.
00:01:01.568 --> 00:01:03.235
I didn't know
I had this much power
00:01:03.335 --> 00:01:09.268
just because of me
putting my voice on an app.
00:01:09.335 --> 00:01:11.968
On TikTok,
anything can happen.
00:01:17.635 --> 00:01:20.535
♪ Watermelon sugar high
00:01:20.635 --> 00:01:23.168
♪ Watermelon sugar
high ♪
00:01:23.268 --> 00:01:24.434
♪ Watermelon sugar
00:01:24.567 --> 00:01:26.201
♪ Dream, dream, uh
00:01:26.301 --> 00:01:27.235
♪ Yeah
00:01:27.335 --> 00:01:28.968
♪ Listen to me now
00:01:29.068 --> 00:01:32.934
[beatboxing]
00:01:39.734 --> 00:01:42.535
I guess I'm on TikTok now.
00:01:42.668 --> 00:01:43.834
MAN: TikTok has
been downloaded
00:01:43.934 --> 00:01:45.101
more than two billion times,
00:01:45.201 --> 00:01:49.268
more than any app ever.
00:01:49.368 --> 00:01:51.702
GIRL: I think TikTok
right now is probably
00:01:51.802 --> 00:01:54.534
on the cutting edge
of all social media...
00:01:56.401 --> 00:01:59.101
and it is becoming
a world unto itself
00:01:59.168 --> 00:02:01.068
for a lot of people,
00:02:01.168 --> 00:02:02.934
especially young people.
00:02:04.768 --> 00:02:06.602
NEWSCASTER: TikTok is
the first Chinese app
00:02:06.668 --> 00:02:09.667
to threaten the dominance
of Silicon Valley.
00:02:11.468 --> 00:02:14.368
GIRL: It's
a cyber security story.
00:02:14.502 --> 00:02:17.268
It's an algorithm story.
00:02:17.368 --> 00:02:19.368
It's a bias story.
00:02:19.502 --> 00:02:20.901
It's a geopolitical story.
00:02:20.968 --> 00:02:24.001
...TikTok.
We may be banning TikTok.
00:02:24.134 --> 00:02:27.801
GIRL: It was bizarre
how suddenly this
00:02:27.901 --> 00:02:30.201
kind of a fun
little kids' app
00:02:30.301 --> 00:02:31.535
became wrapped up
00:02:31.635 --> 00:02:33.868
in this huge
geopolitical storm
00:02:34.001 --> 00:02:36.767
between the US and China
00:02:36.868 --> 00:02:40.001
that was only
getting hotter.
00:02:43.235 --> 00:02:45.534
[sound of explosion]
00:02:45.634 --> 00:02:47.101
[electricity crackles]
00:02:47.201 --> 00:02:48.702
[car door closing]
00:02:48.802 --> 00:02:51.534
BOY: Fix your mirrors.
Put it in drive.
00:02:51.667 --> 00:02:53.901
FEROZA:
Put it in drive?
00:02:57.502 --> 00:03:00.834
So my family's
from Afghanistan.
00:03:00.934 --> 00:03:02.968
They were
just very grateful
00:03:03.068 --> 00:03:05.968
because they could finally
come here.
00:03:06.068 --> 00:03:08.468
They were just amazed
by the privileges
00:03:08.568 --> 00:03:10.034
that were being brought
to them.
00:03:10.134 --> 00:03:12.101
You didn't--you didn't
use your signal there.
00:03:12.235 --> 00:03:15.435
I did.
Oh, I thought I did.
00:03:15.502 --> 00:03:20.368
My parents,
their dreams of America
00:03:20.502 --> 00:03:23.068
were flourishing
before 9/11,
00:03:23.134 --> 00:03:26.502
and then once those planes
hit the towers,
00:03:26.602 --> 00:03:28.168
their dreams
were shattered too,
00:03:28.268 --> 00:03:32.300
because it was as if they
were responsible for that.
00:03:33.968 --> 00:03:36.335
Growing up
as an Afghan-American,
00:03:36.401 --> 00:03:37.767
it was really rough.
00:03:41.301 --> 00:03:44.034
I'll just have
a small amount.
00:03:44.134 --> 00:03:45.968
Yesterday, he was teaching me
how to drive.
00:03:46.034 --> 00:03:47.801
Every five seconds,
"Don't scratch my rims.
00:03:47.901 --> 00:03:49.535
Don't scratch my rims."
I'm like,
00:03:49.635 --> 00:03:52.502
I'm not gonna scratch your rims.
I know how to drive.
00:03:52.635 --> 00:03:54.434
The only times
where I feel like I belong
00:03:54.534 --> 00:03:56.435
is with my family.
00:03:56.502 --> 00:03:59.034
My cousins, my brother...
00:03:59.101 --> 00:04:01.134
I--I feel like I have
a place with them,
00:04:01.201 --> 00:04:05.734
because they understand
where I come from.
00:04:05.834 --> 00:04:07.468
How'd you make it look pretty?
00:04:07.602 --> 00:04:08.601
- Why?
- Like, it looks like
00:04:08.701 --> 00:04:10.367
an Instagram picture.
00:04:10.467 --> 00:04:11.901
We had to face
00:04:12.001 --> 00:04:15.001
growing up
as children of refugees
00:04:15.068 --> 00:04:18.467
in a country
that doesn't like refugees.
00:04:23.101 --> 00:04:25.467
And my school
isn't diverse at all.
00:04:27.735 --> 00:04:30.134
I don't feel
like I'm one of them.
00:04:30.235 --> 00:04:31.535
I don't think my classmates
00:04:31.635 --> 00:04:34.201
even think
that I'm one of them.
00:04:34.335 --> 00:04:36.368
I got called a terrorist.
00:04:36.468 --> 00:04:38.101
I got called Bin Laden.
00:04:38.168 --> 00:04:40.534
I've been called
a part of the Taliban,
00:04:40.634 --> 00:04:41.934
since I'm Afghan.
00:04:42.001 --> 00:04:43.734
I've been called
all these things.
00:04:46.168 --> 00:04:49.200
And I always felt
like an outsider.
00:04:55.368 --> 00:04:57.401
My mom made these two.
00:04:57.468 --> 00:05:01.502
I wore this dress on my TikTok.
00:05:01.568 --> 00:05:04.435
I have gone viral
a few times.
00:05:04.535 --> 00:05:06.934
It was my first time
posting on TikTok.
00:05:07.034 --> 00:05:10.268
It was like 40,000 views
I've gotten on it.
00:05:10.368 --> 00:05:13.969
And it was just me dancing
with my Afghan clothes on.
00:05:14.069 --> 00:05:17.635
The response was like amazing
because through TikTok
00:05:17.735 --> 00:05:19.568
I found
so many other Afghans,
00:05:19.668 --> 00:05:22.235
and so many
other Afghan-Americans.
00:05:22.335 --> 00:05:25.501
I never knew how big the
Afghan-American community was
00:05:25.634 --> 00:05:27.835
'til I joined TikTok
00:05:27.902 --> 00:05:30.567
and I've seen more people
accepting me for who I am.
00:05:30.667 --> 00:05:32.934
[exotic music playing]
00:05:33.068 --> 00:05:34.668
I got so excited.
[giggles]
00:05:34.768 --> 00:05:37.934
I really wanna be a part
of my community more
00:05:38.034 --> 00:05:39.868
through TikTok.
00:05:46.335 --> 00:05:48.534
♪ Wipe, wipe,
wipe it down ♪
00:05:48.634 --> 00:05:50.034
♪ Wipe, wipe
00:05:50.168 --> 00:05:53.368
TikTok is an app
that's completely different
00:05:53.468 --> 00:05:55.068
than any other type
of social media
00:05:55.201 --> 00:05:57.368
or entertainment platform
that we've ever seen before.
00:05:57.435 --> 00:06:00.901
TikTok was the first platform
to really popularize
00:06:01.001 --> 00:06:03.301
high quality vertical
video content,
00:06:03.401 --> 00:06:05.901
which just makes it
so easy to consume.
00:06:07.535 --> 00:06:08.535
The "For You" page
is completely,
00:06:08.635 --> 00:06:10.101
individually tailored to you,
00:06:10.201 --> 00:06:13.435
based on the data
that they gather about you.
00:06:13.568 --> 00:06:15.468
I mean, I cannot explain
00:06:15.568 --> 00:06:17.335
how fantastic
this algorithm is
00:06:17.401 --> 00:06:19.734
at delivering you exactly
what you didn't even know
00:06:19.801 --> 00:06:21.101
that you wanted.
00:06:23.401 --> 00:06:26.168
What TikTok does so well
is discovery,
00:06:26.235 --> 00:06:28.869
it allows you to drill down
on the whole internet,
00:06:28.969 --> 00:06:31.667
and find these really
specific groups of people
00:06:31.767 --> 00:06:33.767
that resonate with you.
00:06:35.935 --> 00:06:39.035
It's just any kind of niche,
subculture or community
00:06:39.135 --> 00:06:42.168
where you can find creators
that are in that niche.
00:06:42.301 --> 00:06:44.134
People just get discovered
much faster,
00:06:44.235 --> 00:06:45.535
they blow up much faster,
00:06:45.635 --> 00:06:47.735
and it's just...
everything goes ten times
00:06:47.835 --> 00:06:50.801
more viral than it would
on any other social app.
00:06:50.934 --> 00:06:53.534
It's remaking
the food landscape,
00:06:53.634 --> 00:06:56.168
the fashion landscape...
00:06:56.268 --> 00:06:58.468
People learning on TikTok...
00:06:58.568 --> 00:07:02.034
It's hard to find an industry
that TikTok hasn't infiltrated
00:07:02.101 --> 00:07:03.335
or disrupted.
00:07:03.468 --> 00:07:05.635
ANNOUNCER: TikTok
has captured the attention
00:07:05.702 --> 00:07:07.601
of the world's
most lucrative market,
00:07:07.701 --> 00:07:09.001
young people.
00:07:09.101 --> 00:07:12.701
And with it, the power
to reshape the future.
00:07:12.801 --> 00:07:15.635
[robotic vocalization]
00:07:15.735 --> 00:07:19.235
DJ Spencer in the...
[humming] Mix.
00:07:19.335 --> 00:07:23.068
[beatboxing]
00:07:23.168 --> 00:07:25.068
My friend Scott
comes up to me,
00:07:25.168 --> 00:07:27.701
and he goes, "Hey, man,
I wanna show you something
00:07:27.801 --> 00:07:29.901
you've never heard before."
00:07:30.001 --> 00:07:32.168
[beatboxing]
00:07:32.235 --> 00:07:35.668
And I was like, "What did you
just do right now?"
00:07:35.768 --> 00:07:38.401
Like, "Is this a trick?
Is this a game?" Like...
00:07:38.468 --> 00:07:40.868
And he was like, "No, no, no,
dude, it's called beatboxing.
00:07:40.968 --> 00:07:42.835
You make music
with your mouth."
00:07:42.935 --> 00:07:46.034
And I'm like, "You are
doing that with your face?"
00:07:46.101 --> 00:07:48.201
[beatboxing]
00:07:48.301 --> 00:07:52.268
I was 15. I was a sophomore
in high school,
00:07:52.335 --> 00:07:55.734
and I remember that moment
just being magic to me.
00:07:56.868 --> 00:07:59.567
Like it was love
at first sound.
00:07:59.667 --> 00:08:01.802
[beatboxing]
00:08:01.869 --> 00:08:04.300
This is Spencer Polanco,
and this is what I do.
00:08:04.434 --> 00:08:06.002
[beatboxing]
00:08:06.069 --> 00:08:10.034
And I decided that night
I was gonna be a beatboxer.
00:08:10.134 --> 00:08:13.034
But you can also imagine me
busting out of my room
00:08:13.134 --> 00:08:15.134
just, you know,
wild Spencer...
00:08:15.235 --> 00:08:16.969
full of excitement
in my eyes.
00:08:17.102 --> 00:08:20.534
"Dad! Mom! I know what I wanna
do for the rest of my life.
00:08:20.634 --> 00:08:23.301
I wanna be a beatboxer."
00:08:23.368 --> 00:08:25.201
And then they looked at me
like I was crazy.
00:08:25.335 --> 00:08:29.934
[beatboxing]
00:08:30.001 --> 00:08:31.868
♪ Spencer beatbox
00:08:31.968 --> 00:08:33.835
I grew up in New York City.
00:08:33.935 --> 00:08:37.501
My father is from Ecuador,
and he came here
00:08:37.601 --> 00:08:39.101
and found my mom.
00:08:39.201 --> 00:08:43.201
My mom is first generation,
uh, Chinese family.
00:08:43.301 --> 00:08:44.502
My father, uh...
00:08:44.635 --> 00:08:46.201
He wanted me to be
a tennis player,
00:08:46.268 --> 00:08:48.235
and my mom wanted me
to be a doctor.
00:08:48.335 --> 00:08:50.435
And then when
I expressed something
00:08:50.535 --> 00:08:53.535
they didn't understand,
which is beatboxing,
00:08:53.668 --> 00:08:56.802
they were accepting,
but confused,
00:08:56.902 --> 00:08:59.501
and then unaccepting
and even more confused.
00:08:59.601 --> 00:09:02.901
[beatboxing]
00:09:03.001 --> 00:09:05.901
I--I struggled
with a little bit of anxiety
00:09:06.001 --> 00:09:08.401
and depression
when I was in college.
00:09:08.502 --> 00:09:09.868
I had to make a decision,
00:09:09.968 --> 00:09:13.468
and I said no
to formal education.
00:09:13.568 --> 00:09:15.435
And I said yes
to beatboxing.
00:09:15.502 --> 00:09:16.968
I mean, I was
a struggling artist.
00:09:17.068 --> 00:09:19.934
I was a...
typical artist that was
00:09:20.068 --> 00:09:22.734
performing in the streets
of New York City.
00:09:22.834 --> 00:09:25.201
I was in the subways,
busking.
00:09:25.301 --> 00:09:28.868
Some days, you make $20
working all day.
00:09:31.368 --> 00:09:32.734
When I started TikTok,
00:09:32.834 --> 00:09:34.235
I had to look at myself
in the mirror,
00:09:34.368 --> 00:09:37.534
and say, "If this isn't
gonna happen right now,
00:09:37.634 --> 00:09:38.868
it's not gonna happen."
00:09:38.968 --> 00:09:43.201
Okay, this is
the first viral video.
00:09:43.301 --> 00:09:46.068
[chugging]
00:09:46.201 --> 00:09:47.101
[exhaling]
00:09:47.235 --> 00:09:50.334
[beatboxing]
00:09:50.434 --> 00:09:51.902
Look at my hair.
00:09:52.035 --> 00:09:55.868
I didn't even, like,
fix my hair for this video.
00:09:55.968 --> 00:09:58.634
And that was
a six-second video.
00:09:58.701 --> 00:09:59.634
[beatboxing]
00:09:59.734 --> 00:10:01.268
I woke up the next day
00:10:01.368 --> 00:10:05.335
and it had, like,
three million views.
00:10:05.435 --> 00:10:07.635
And I was just
genuinely confused.
00:10:07.735 --> 00:10:10.134
I was like, "Oh, my God.
They like me."
00:10:10.235 --> 00:10:13.835
I looked at it and I'm like,
"There's something here."
00:10:13.969 --> 00:10:17.068
So I decided to post more,
and I decided to do
00:10:17.168 --> 00:10:19.134
every single idea
I possibly could.
00:10:19.201 --> 00:10:22.701
[beatboxing]
00:10:26.834 --> 00:10:28.134
TikTok topping Facebook
00:10:28.235 --> 00:10:30.735
to become the world's
most downloaded app.
00:10:30.835 --> 00:10:32.934
TAYLOR:
TikTok is absolutely massive.
00:10:33.034 --> 00:10:34.235
Last year,
they reported they had
00:10:34.335 --> 00:10:35.568
over two billion users,
00:10:35.668 --> 00:10:38.335
with almost a billion
monthly active users.
00:10:38.468 --> 00:10:40.168
People spend more time
on TikTok per day
00:10:40.301 --> 00:10:44.802
than Facebook, Snapchat,
Instagram, YouTube.
00:10:44.902 --> 00:10:48.001
Facebook is absolutely
desperate right now
00:10:48.134 --> 00:10:50.435
to regain any semblance
of relevance,
00:10:50.535 --> 00:10:52.835
which they lost
quite a while ago.
00:10:52.935 --> 00:10:55.834
In many areas,
we're behind our competitors.
00:10:55.968 --> 00:10:58.335
The fastest growing app
is TikTok.
00:10:58.435 --> 00:11:00.001
The elephant in the room,
of course,
00:11:00.101 --> 00:11:02.268
is the fact that TikTok
is owned by ByteDance,
00:11:02.368 --> 00:11:03.835
a Chinese company.
00:11:03.902 --> 00:11:06.635
And it's the first time
we've seen a huge Chinese
00:11:06.702 --> 00:11:09.601
consumer tech company
come in and dominate
00:11:09.701 --> 00:11:11.334
the American market.
00:11:14.335 --> 00:11:16.301
SHELLY: So there was this guy
from China
00:11:16.368 --> 00:11:18.101
named Zhang Yiming.
00:11:18.168 --> 00:11:19.735
He worked for a number
of startups
00:11:19.835 --> 00:11:22.535
before starting up
his own company,
00:11:22.602 --> 00:11:24.068
a company called ByteDance.
00:11:25.368 --> 00:11:28.401
The thing that struck me
the most about TikTok
00:11:28.502 --> 00:11:30.968
was just how calculated
their founders were
00:11:31.034 --> 00:11:32.968
from the very beginning
about their goal,
00:11:33.068 --> 00:11:35.201
which was to become this kind
of global force,
00:11:35.268 --> 00:11:39.201
and really, like, penetrate
the, you know, the zeitgeist
00:11:39.301 --> 00:11:41.068
of the US.
00:11:42.468 --> 00:11:43.834
The story is going to sound
very similar
00:11:43.901 --> 00:11:46.335
to many Silicon Valley
entrepreneurs.
00:11:46.468 --> 00:11:49.201
Zhang Yiming came
from a middle class family.
00:11:49.301 --> 00:11:51.401
And he basically
just had this dream
00:11:51.502 --> 00:11:54.601
that he was going to make
something important of himself.
00:11:56.335 --> 00:11:59.235
So, when the iPhone came out,
in the late 2000s,
00:11:59.335 --> 00:12:02.001
he was just really
blown away by the fact
00:12:02.068 --> 00:12:04.901
that you could have
a full computing device,
00:12:05.034 --> 00:12:06.201
small enough to fit
in your pocket,
00:12:06.301 --> 00:12:08.035
and he was just convinced
mobile internet
00:12:08.102 --> 00:12:10.934
was going to be the next
once in a lifetime opportunity
00:12:11.068 --> 00:12:12.835
that you'd read about
in history books,
00:12:12.902 --> 00:12:14.601
and he was determined
to take advantage
00:12:14.701 --> 00:12:16.334
of this big wave.
00:12:30.834 --> 00:12:32.134
They started
in an apartment,
00:12:32.201 --> 00:12:34.034
which is very common
in Chinese...
00:12:34.134 --> 00:12:36.834
sort of internet
entrepreneurship lore.
00:12:46.168 --> 00:12:49.335
About 2015 or so,
ByteDance was doing
00:12:49.468 --> 00:12:50.702
experiments with video,
00:12:50.802 --> 00:12:53.568
and Zhang Yiming had
this sort of strange idea
00:12:53.702 --> 00:12:56.502
about recommendation engines
which serve you content
00:12:56.602 --> 00:12:58.934
that they think you'll be
interested in.
00:13:14.268 --> 00:13:16.234
So ByteDance built Douyin.
00:13:19.335 --> 00:13:20.901
Within six months
after launch
00:13:21.001 --> 00:13:23.068
it hit this inflection point.
00:13:23.168 --> 00:13:25.367
Somehow it just became viral.
00:13:26.901 --> 00:13:32.133
[singing in Chinese]
00:13:44.934 --> 00:13:45.834
Yo, what's going on?
00:13:45.934 --> 00:13:48.101
Dude, it is so hot.
00:13:48.235 --> 00:13:49.101
Dude, why don't
you just go get some
00:13:49.235 --> 00:13:50.602
Ben & Jerry's ice cream?
00:13:50.668 --> 00:13:55.368
People pay me to put
their product in my videos.
00:13:55.468 --> 00:13:57.134
It's--it's--it's crazy.
00:13:57.268 --> 00:13:58.868
These are from this week.
00:13:58.968 --> 00:14:00.535
There are so many.
Look.
00:14:00.668 --> 00:14:03.934
I get, like, clothes
from, like, all different
00:14:04.034 --> 00:14:05.734
major brands.
00:14:05.834 --> 00:14:08.133
Look at that.
00:14:23.602 --> 00:14:26.068
ANNOUNCER: After the success
of Douyin in China,
00:14:26.168 --> 00:14:27.934
ByteDance acquired
Musical.ly,
00:14:28.034 --> 00:14:30.001
a lip-synch app
already popular
00:14:30.101 --> 00:14:33.934
with kids and young teens
in the US.
00:14:34.034 --> 00:14:36.434
And merged them,
rebranding the app
00:14:36.534 --> 00:14:38.467
as TikTok.
00:14:40.201 --> 00:14:42.834
SHELLY: One thing Zhang Yiming
did was to create
00:14:42.968 --> 00:14:44.201
two different products.
00:14:44.301 --> 00:14:46.767
Douyin was
for the Chinese market,
00:14:46.868 --> 00:14:48.068
and it was a Chinese app,
00:14:48.201 --> 00:14:49.168
and TikTok was
00:14:49.268 --> 00:14:50.868
for the global market.
00:14:51.001 --> 00:14:52.401
And what that meant
00:14:52.502 --> 00:14:54.201
was that he could keep
the Chinese app
00:14:54.301 --> 00:14:56.702
walled off in China,
with Chinese rules,
00:14:56.835 --> 00:15:00.201
and then keep TikTok
for the rest of the globe.
00:15:00.301 --> 00:15:02.168
TikTok was,
all of a sudden,
00:15:02.268 --> 00:15:05.034
this huge success
from day one.
00:15:05.134 --> 00:15:07.034
TikTok has done
what no other Chinese-made app
00:15:07.168 --> 00:15:08.435
has done before.
00:15:08.568 --> 00:15:09.934
It's cracked
the international market,
00:15:10.034 --> 00:15:11.535
and become a global sensation.
00:15:11.602 --> 00:15:13.068
ANNOUNCER: TikTok is
now available
00:15:13.168 --> 00:15:16.568
in 154 countries
and 75 languages,
00:15:16.668 --> 00:15:19.400
rivaling Silicon Valley's
biggest apps.
00:15:26.168 --> 00:15:27.635
TikTok and its owner ByteDance
00:15:27.735 --> 00:15:30.435
were the first Chinese
social media company
00:15:30.502 --> 00:15:32.502
to really provide
a wake-up call
00:15:32.602 --> 00:15:35.201
to Facebook, Google,
Amazon, and others,
00:15:35.301 --> 00:15:37.968
that it's not just
about Silicon Valley
00:15:38.101 --> 00:15:39.668
bringing technology
to the world,
00:15:39.768 --> 00:15:41.701
but, really,
that China's a real force
00:15:41.801 --> 00:15:42.767
to be reckoned with.
00:15:45.934 --> 00:15:48.034
MATTHEW: There's sort
of this perception
00:15:48.168 --> 00:15:49.068
that there was too much
00:15:49.168 --> 00:15:50.535
of a cultural difference
00:15:50.635 --> 00:15:53.368
between China
and the rest of the world
00:15:53.468 --> 00:15:56.201
and that Chinese companies
didn't know how to build
00:15:56.301 --> 00:15:58.668
globally successful
social media.
00:15:58.802 --> 00:16:03.468
I think TikTok's completely
blown that out of the water.
00:16:03.568 --> 00:16:07.068
The world we all grew up in
was one where America
00:16:07.134 --> 00:16:09.201
dominated culturally,
00:16:09.335 --> 00:16:11.702
where it dominated
technologically,
00:16:11.802 --> 00:16:14.568
and the world that we
will end up retiring in
00:16:14.668 --> 00:16:18.235
will probably be one
where China dominates
00:16:18.368 --> 00:16:20.401
in most of those areas.
00:16:20.502 --> 00:16:23.568
Power is shifting
to China rapidly.
00:16:23.635 --> 00:16:26.467
And ByteDance and the story
of TikTok is part of that.
00:16:32.434 --> 00:16:34.001
- Hi.
- Hi.
00:16:34.068 --> 00:16:35.868
- How are you?
- Good.
00:16:35.968 --> 00:16:37.168
So what are you gonna do?
00:16:37.268 --> 00:16:39.901
I wanna play with...
adding just, like,
00:16:40.001 --> 00:16:41.502
more dimension
to my hair,
00:16:41.602 --> 00:16:43.534
it's, like, a little flat...
00:16:43.634 --> 00:16:44.668
Yes.
00:16:44.735 --> 00:16:46.068
DEJA: I think
that it's scary
00:16:46.134 --> 00:16:48.235
to be the first generation
00:16:48.335 --> 00:16:50.501
to have
our entire lives documented.
00:16:52.201 --> 00:16:54.602
Every action, you know,
every haircut,
00:16:54.702 --> 00:16:58.368
every look change
is on people's radars.
00:16:58.468 --> 00:17:00.768
You know, I work a job
where so much of it
00:17:00.869 --> 00:17:03.568
is, like, looking...
being looked at, or...
00:17:03.668 --> 00:17:06.667
- Yeah.
- Looking at myself, honestly.
00:17:06.767 --> 00:17:07.768
What do you do?
00:17:07.902 --> 00:17:10.502
Some influencer
and content creation work,
00:17:10.602 --> 00:17:13.869
so I went viral
for the first time,
00:17:13.935 --> 00:17:16.734
uh, when I was 16 years old
for a confrontation
00:17:16.834 --> 00:17:17.902
with my Senator.
00:17:18.002 --> 00:17:20.201
Republican Senator Jeff Flake
held a town hall
00:17:20.335 --> 00:17:22.268
in Mesa, Arizona last night,
00:17:22.368 --> 00:17:24.068
and got an earful
from 16-year-old activist
00:17:24.168 --> 00:17:25.735
Deja Foxx.
Take a look.
00:17:25.835 --> 00:17:28.602
So I'm a young woman,
and you're a middle-aged man.
00:17:28.702 --> 00:17:30.768
- I'm a person of color...
- Ouch.
00:17:30.835 --> 00:17:33.835
...and you're white.
Uhm, I come from a background
00:17:33.935 --> 00:17:36.001
of poverty and I didn't
always have parents
00:17:36.101 --> 00:17:37.134
to guide me through life.
00:17:37.235 --> 00:17:39.234
You come from privilege.
00:17:47.301 --> 00:17:49.767
Why would you deny me
the American Dream?
00:17:49.901 --> 00:17:51.468
DEJA: I woke up
the next morning,
00:17:51.568 --> 00:17:54.401
and 18 million people
had seen that video.
00:17:54.535 --> 00:17:56.901
And I had a request
in my email to go live on CNN.
00:17:57.001 --> 00:17:59.702
I can't sit idly by
while women like me
00:17:59.768 --> 00:18:02.235
are countlessly and constantly
being ignored
00:18:02.335 --> 00:18:03.735
on Capitol Hill.
00:18:03.835 --> 00:18:06.134
And I suddenly realized
that all those things
00:18:06.235 --> 00:18:07.668
I've been putting
on social
00:18:07.768 --> 00:18:10.235
that I thought were only
going to the people
00:18:10.335 --> 00:18:14.301
who knew me, overnight
I could be visible
00:18:14.401 --> 00:18:16.734
to people
that I will never know.
00:18:18.201 --> 00:18:20.701
The internet connected me
to the entire world.
00:18:26.802 --> 00:18:31.068
TikTok is a blowup overnight
kind of place.
00:18:31.168 --> 00:18:36.668
On any platform where you can
get extreme reach,
00:18:36.802 --> 00:18:38.767
you open yourself up
to benefits, right?
00:18:38.868 --> 00:18:41.801
Larger followings,
more views,
00:18:41.868 --> 00:18:44.001
more attention.
00:18:44.101 --> 00:18:46.802
You must open yourself up
to more hate,
00:18:46.902 --> 00:18:51.034
to people who are going
to tear you down,
00:18:51.134 --> 00:18:53.968
tear you apart,
pick at you...
00:18:54.101 --> 00:18:56.701
I live in a pretty constant
state of anxiety.
00:18:58.635 --> 00:19:01.068
I don't know what it's like
to live in a world
00:19:01.201 --> 00:19:03.801
where I'm not
being perceived always.
00:19:05.101 --> 00:19:06.667
And it's this tug of war
between...
00:19:06.801 --> 00:19:08.468
that's kind of what I want,
00:19:08.568 --> 00:19:10.934
and it's kind of the thing
I fear the most.
00:19:13.668 --> 00:19:16.235
I think there is a really
interesting line
00:19:16.335 --> 00:19:19.368
between what it means
to be empowered
00:19:19.468 --> 00:19:20.968
by your sexuality,
00:19:21.101 --> 00:19:23.701
versus being exploited
by it online.
00:19:25.502 --> 00:19:28.834
I definitely think
that women are seeing
00:19:28.934 --> 00:19:30.902
larger followings,
more attention,
00:19:30.969 --> 00:19:34.301
from the ways that they're
being sexualized online.
00:19:34.368 --> 00:19:38.235
You can be a bad bitch
in a bikini
00:19:38.368 --> 00:19:41.168
and a boss bitch
in a blazer.
00:19:41.235 --> 00:19:43.400
Do both.
00:19:45.101 --> 00:19:49.968
The posts where I'm showing
more skin do better.
00:19:51.235 --> 00:19:53.535
But I also question
why that is.
00:19:53.635 --> 00:19:56.635
If we think that these platforms
really are just showing us
00:19:56.735 --> 00:19:59.668
the most popular content,
without really interrogating
00:19:59.768 --> 00:20:02.400
why we're seeing
what we're seeing,
00:20:02.501 --> 00:20:04.968
that's dangerous.
00:20:05.101 --> 00:20:07.768
There is so much mystery
to the algorithm.
00:20:07.902 --> 00:20:10.802
The algorithm.
Like, what does that even mean?
00:20:10.935 --> 00:20:14.734
When I call your name
I shall place the sorting hat
00:20:14.801 --> 00:20:16.468
on your head...
00:20:16.568 --> 00:20:18.535
and you'll be sorted
into your houses.
00:20:18.635 --> 00:20:20.535
EUGENE: I refer to TikTok
as a sorting hat,
00:20:20.602 --> 00:20:22.201
in reference
to the sorting hat
00:20:22.301 --> 00:20:23.568
from "Harry Potter."
00:20:23.668 --> 00:20:25.768
When the kids show up
at Hogwarts,
00:20:25.835 --> 00:20:27.502
there's this magical
sorting hat
00:20:27.635 --> 00:20:30.435
that sorts them into one
of the four schools.
00:20:30.535 --> 00:20:33.335
TikTok's recommendation
algorithm serves
00:20:33.435 --> 00:20:35.335
as that type
of sorting hat.
00:20:35.468 --> 00:20:38.735
It sorts its users
into different audiences,
00:20:38.869 --> 00:20:42.535
then it sorts videos
into different clusters
00:20:42.635 --> 00:20:44.335
that appeal
to different audiences.
00:20:44.435 --> 00:20:47.368
Most apps, like Facebook,
Twitter, Instagram--
00:20:47.502 --> 00:20:49.401
you actually have to follow
a lot of accounts,
00:20:49.535 --> 00:20:51.568
or other people,
or people you know.
00:20:51.668 --> 00:20:54.535
TikTok was very different,
in that, even if you
00:20:54.602 --> 00:20:57.134
didn't follow anybody,
you would over time,
00:20:57.235 --> 00:20:58.668
just by using the app,
00:20:58.768 --> 00:21:01.367
get a very personalized
entertainment experience
00:21:01.467 --> 00:21:03.768
for yourself.
00:21:03.869 --> 00:21:06.535
Is anyone else, like,
a little weirded out
00:21:06.668 --> 00:21:09.168
about how specific TikTok's
algorithm gets
00:21:09.301 --> 00:21:10.802
for the "For You" page.
00:21:10.902 --> 00:21:13.534
EUGENE: The "For You" page
on TikTok is the default
00:21:13.634 --> 00:21:15.635
that the app opens into.
00:21:15.735 --> 00:21:17.368
On the one hand, it has
a bunch of attributes
00:21:17.468 --> 00:21:18.702
about the video.
00:21:18.802 --> 00:21:21.568
It has the song in it.
It has a dog.
00:21:21.668 --> 00:21:23.834
And on the other, it has a bunch
of attributes about you:
00:21:23.968 --> 00:21:27.869
you're this age,
you live here,
00:21:28.002 --> 00:21:31.568
those are contextual clues
to feed their algorithm
00:21:31.668 --> 00:21:34.834
to determine
what your tastes are.
00:21:34.934 --> 00:21:36.602
My TikTok algorithm
is just like,
00:21:36.668 --> 00:21:39.034
"You have ADHD,
you have BPD,
00:21:39.134 --> 00:21:40.602
you're depressed."
00:21:40.702 --> 00:21:42.668
EUGENE: When you're looking
at each video
00:21:42.768 --> 00:21:45.535
on the "For You" page,
TikTok, the app,
00:21:45.668 --> 00:21:51.468
is looking at how we react
to that video.
00:21:51.568 --> 00:21:54.568
The algorithm starts
to become smarter
00:21:54.668 --> 00:21:56.301
just off of
these long sessions
00:21:56.368 --> 00:21:59.435
where you're addictively
scrolling through videos.
00:21:59.535 --> 00:22:04.735
And then it adjusts what videos
it shows you in the future.
00:22:04.835 --> 00:22:06.868
And over time, it builds
almost a fingerprint
00:22:06.968 --> 00:22:09.968
of your tastes.
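The mechanism Eugene describes can be pictured with a short sketch. What follows is a minimal, purely illustrative Python model of a taste "fingerprint" recommender, assuming made-up video attributes and a watch-time engagement signal; every name and number here is hypothetical, and TikTok's actual system is proprietary and far more complex.

```python
# Illustrative only: a toy "taste fingerprint" recommender in the spirit
# described above. All attributes and numbers are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    attrs: tuple  # e.g. (has_song, has_dog, is_dance, is_cooking)

@dataclass
class User:
    # The "fingerprint": a running watch-time-weighted average of the
    # attributes of videos this user engaged with.
    taste: list = field(default_factory=lambda: [0.0, 0.0, 0.0, 0.0])
    weight: float = 0.0

    def watch(self, video: Video, watch_fraction: float) -> None:
        """Fold one viewing session into the taste vector."""
        self.weight += watch_fraction
        for i, a in enumerate(video.attrs):
            self.taste[i] += (a - self.taste[i]) * (watch_fraction / self.weight)

def score(user: User, video: Video) -> float:
    """Predicted engagement: dot product of fingerprint and attributes."""
    return sum(t * a for t, a in zip(user.taste, video.attrs))

def recommend(user: User, candidates: list) -> Video:
    """Serve the candidate the model thinks the user wants most."""
    return max(candidates, key=lambda v: score(user, v))

if __name__ == "__main__":
    catalog = [
        Video("dog does a dance", (1, 1, 1, 0)),
        Video("pb&j tutorial",    (0, 0, 0, 1)),
        Video("lip-synch clip",   (1, 0, 1, 0)),
    ]
    u = User()
    u.watch(catalog[1], watch_fraction=1.0)  # watched the sandwich video fully
    u.watch(catalog[0], watch_fraction=0.1)  # skipped the dance video quickly
    print(recommend(u, catalog).title)       # -> pb&j tutorial
```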
00:22:10.068 --> 00:22:11.635
I'm talking, like,
I was just thinking about
00:22:11.702 --> 00:22:13.134
making a peanut butter
and jelly sandwich,
00:22:13.268 --> 00:22:14.935
and then, out of nowhere,
someone is making
00:22:15.035 --> 00:22:16.869
a peanut butter and jelly
on my "For You" page.
00:22:16.969 --> 00:22:19.268
But lately, I kid you not,
it hasn't been things
00:22:19.335 --> 00:22:21.268
that I Google or I talk about,
00:22:21.335 --> 00:22:22.802
it's been thoughts.
00:22:22.935 --> 00:22:24.902
Are any other girls, like,
kind of aggravated
00:22:25.002 --> 00:22:26.735
that it took more
than 20 years to figure out
00:22:26.835 --> 00:22:28.968
we were bisexual, but
it took my TikTok algorithm
00:22:29.068 --> 00:22:31.134
like 37 seconds?
00:22:31.235 --> 00:22:33.735
EUGENE: TikTok is just
the latest manifestation
00:22:33.835 --> 00:22:36.168
of the power that comes
from connecting
00:22:36.235 --> 00:22:39.668
billions of people in
the world with really powerful
00:22:39.802 --> 00:22:43.535
machine learning
recommendation algorithms.
00:22:43.635 --> 00:22:45.668
The creator economy
is the fastest growing type
00:22:45.768 --> 00:22:47.635
of small business, with more
than 50 million people
00:22:47.735 --> 00:22:49.667
around the world
who consider themselves
00:22:49.767 --> 00:22:51.068
to be content creators.
00:22:52.768 --> 00:22:55.068
TAYLOR: All the biggest
brands, they wanna do TikTok
00:22:55.168 --> 00:22:57.468
campaigns, because it has
the hype right now.
00:22:57.568 --> 00:23:00.301
TikTok has just
really resonated
00:23:00.401 --> 00:23:02.835
with this Gen-Z audience,
00:23:02.935 --> 00:23:04.768
which is where
the most valuable users are.
00:23:04.869 --> 00:23:07.602
We have brands poised
to spend over $15 billion
00:23:07.702 --> 00:23:10.101
in the next year,
on influencer marketing alone.
00:23:10.235 --> 00:23:14.467
Eyeballs bring money,
and brands chase the eyeballs--
00:23:14.567 --> 00:23:15.767
chase young users.
00:23:15.868 --> 00:23:17.335
JIMMY: In less than a year,
00:23:17.468 --> 00:23:19.934
you have almost 35 million
followers on TikTok.
00:23:20.068 --> 00:23:23.335
Over two billion likes...
Billion. Wow.
00:23:23.435 --> 00:23:25.168
[audience cheering]
00:23:25.301 --> 00:23:27.734
TAYLOR: TikTokers, they've
embraced that idea of being
00:23:27.834 --> 00:23:29.101
the entrepreneur.
00:23:29.201 --> 00:23:30.969
They're not just in it
to get famous,
00:23:31.069 --> 00:23:32.901
they're also in it
to get rich and successful.
00:23:32.968 --> 00:23:34.968
The social media
influencer market
00:23:35.068 --> 00:23:37.468
is a multi-billion
dollar industry,
00:23:37.568 --> 00:23:40.668
and some young people
are cashing in big time.
00:23:40.768 --> 00:23:42.268
TAYLOR: I mean, these
TikTokers are making more
00:23:42.401 --> 00:23:45.001
in their first year
than a lot of huge YouTubers
00:23:45.101 --> 00:23:47.834
have made throughout
the entire past decade.
00:23:49.034 --> 00:23:51.134
Late 2019
was my first brand deal
00:23:51.235 --> 00:23:52.601
and it was for Nike.
00:23:52.667 --> 00:23:54.969
When I first told my parents
00:23:55.069 --> 00:23:57.634
that I was doing something
for Nike, they were like,
00:23:57.701 --> 00:24:02.168
"What? Nike?
Like Nike?"
00:24:02.268 --> 00:24:05.034
The fact that anyone wants
to use beatboxing for anything
00:24:05.168 --> 00:24:07.768
is the coolest thing
on Earth to me.
00:24:07.835 --> 00:24:10.468
This is a doll made
for Little Caesar's.
00:24:10.602 --> 00:24:13.235
Life-size.
Even got the hair.
00:24:13.335 --> 00:24:14.802
[beatboxing]
00:24:14.869 --> 00:24:18.535
I'm a beatboxer,
and I'm hanging out
00:24:18.635 --> 00:24:23.667
at Jason Derulo's house,
like, "How did this happen?"
00:24:23.767 --> 00:24:25.168
Like this is dope.
00:24:25.268 --> 00:24:29.001
[singing]
00:24:29.101 --> 00:24:30.201
[beatboxing]
00:24:30.301 --> 00:24:32.268
SPENCER: TikTok has made
music more fun.
00:24:32.368 --> 00:24:35.101
When we talk about
revolutionizing, you know,
00:24:35.168 --> 00:24:36.101
an industry...
00:24:36.168 --> 00:24:37.401
[beatboxing]
00:24:37.535 --> 00:24:40.168
...that has existed
for so long by itself...
00:24:40.268 --> 00:24:41.535
[beatboxing]
00:24:41.635 --> 00:24:44.201
TikTok's taking away
that guardian, that little gate.
00:24:44.301 --> 00:24:46.368
Now there's so many
independent artists,
00:24:46.468 --> 00:24:49.268
especially coming from TikTok,
that have just, like,
00:24:49.401 --> 00:24:51.635
a cult following that'll
have people that believe
00:24:51.735 --> 00:24:54.001
in them, being outside
of the industry.
00:24:54.101 --> 00:24:56.235
[beatboxing]
00:24:56.335 --> 00:25:00.101
It wasn't until I was on TV,
when I presented
00:25:00.235 --> 00:25:01.968
at the
"Billboard Music Awards,"
00:25:02.068 --> 00:25:03.801
that's when my parents,
they were like,
00:25:03.934 --> 00:25:06.235
"All right, Spencer's
on TV now,
00:25:06.335 --> 00:25:08.268
I don't know what to do
with this information."
00:25:08.368 --> 00:25:11.034
[beatboxing]
00:25:13.969 --> 00:25:15.801
TAYLOR: TikTok has also
completely upended
00:25:15.901 --> 00:25:17.568
the talent industry
in Hollywood.
00:25:17.668 --> 00:25:19.902
Hollywood people love TikTok,
because they're like,
00:25:20.002 --> 00:25:23.902
"Great, more famous people,
more people to make us money."
00:25:24.035 --> 00:25:26.802
These TikTokers are launching
their own brands.
00:25:26.902 --> 00:25:28.834
You have a beauty line
coming out, right?
00:25:28.901 --> 00:25:30.535
TAYLOR: Think of Addison Rae,
right?
00:25:30.668 --> 00:25:31.868
She's a hugely popular
TikToker.
00:25:31.968 --> 00:25:33.335
She launched Item Beauty.
00:25:33.468 --> 00:25:35.869
So you see a lot
of these young TikTokers
00:25:35.935 --> 00:25:40.301
building already
multi-million dollar empires.
00:25:40.401 --> 00:25:42.702
Someone like Bella Poarch
went viral for this, like,
00:25:42.768 --> 00:25:44.001
lip-synching video,
where she's making
00:25:44.134 --> 00:25:45.901
these funny
facial expressions.
00:25:45.968 --> 00:25:47.134
♪ It's M to the B,
it's M to the B ♪
00:25:48.869 --> 00:25:51.068
TAYLOR: She used that
to launch a pop star career.
00:25:51.168 --> 00:25:54.368
I mean, she's had several
huge hit songs now.
00:25:54.435 --> 00:25:56.968
Somebody like Lil Nas X
is also a good example,
00:25:57.068 --> 00:25:58.968
somebody that really
leveraged the internet
00:25:59.034 --> 00:26:00.969
to promote his own career.
00:26:01.035 --> 00:26:05.268
♪ Oh, I'm gonna ride
'til I can't no more ♪
00:26:05.368 --> 00:26:06.767
TAYLOR:
TikTok defines "Top 40."
00:26:06.901 --> 00:26:08.901
If you go
to the Trending List
00:26:09.001 --> 00:26:10.602
on Spotify of
the most viral hits,
00:26:10.735 --> 00:26:14.568
it's all TikTok songs,
and it is a really crazy mix.
00:26:14.668 --> 00:26:17.868
We saw Fleetwood Mac "Dreams",
you know, resurged,
00:26:17.968 --> 00:26:19.834
because of a viral video.
00:26:19.968 --> 00:26:21.968
♪ It's only right
00:26:22.068 --> 00:26:24.768
♪ That you should
play the way... ♪
00:26:24.835 --> 00:26:26.901
TAYLOR: The whole media
ecosystem has migrated
00:26:27.001 --> 00:26:28.301
towards this
personality-driven
00:26:28.435 --> 00:26:30.668
form of entertainment,
so often people think,
00:26:30.768 --> 00:26:32.068
"Oh, I'm on Content Creators,
right?
00:26:32.168 --> 00:26:33.734
That's some teenager
that's dancing".
00:26:33.834 --> 00:26:37.134
No! Online influence
is influence.
00:26:37.268 --> 00:26:39.868
And if you can make
an impact online,
00:26:39.968 --> 00:26:43.300
you have the ability
to reshape the world.
00:26:52.735 --> 00:26:55.467
FEROZA: I started realizing
TikTok had power.
00:26:57.201 --> 00:26:59.368
Even if it was for
comedy or makeup.
00:26:59.468 --> 00:27:02.934
Whatever I was posting,
people wanted to watch it.
00:27:03.901 --> 00:27:05.934
And I was, like,
"Okay,
00:27:06.068 --> 00:27:08.400
so anything's possible
on this app."
00:27:10.502 --> 00:27:14.801
Anyone can basically go viral
on this app.
00:27:20.368 --> 00:27:23.969
After joining TikTok,
I decided, like,
00:27:24.069 --> 00:27:28.768
"Maybe I wanna make more
political-savvy videos,"
00:27:28.835 --> 00:27:32.401
and I got more--more views
on that.
00:27:32.502 --> 00:27:35.134
Hi, if you, um, actually
think all lives matter,
00:27:35.235 --> 00:27:36.868
I want you to speak up
about the kids in cages
00:27:36.968 --> 00:27:38.335
at the border,
I want you to speak up
00:27:38.435 --> 00:27:40.001
about the kids dying
in the Middle East.
00:27:40.068 --> 00:27:41.535
I want you to speak out
about the child...
00:27:41.668 --> 00:27:44.168
I first read about the Uyghurs
through social media
00:27:44.268 --> 00:27:46.768
since I do follow
Muslim pages,
00:27:46.869 --> 00:27:48.968
and try to keep up
with my community.
00:27:49.101 --> 00:27:50.568
NEWSCASTER:
Across the Northwestern
00:27:50.702 --> 00:27:53.268
province of Xinjiang,
an estimated one million
00:27:53.368 --> 00:27:56.368
Chinese Muslims have vanished
into a vast network
00:27:56.468 --> 00:28:00.101
of detention centers
that targets Uyghur Muslims.
00:28:00.201 --> 00:28:03.834
I saw someone post a picture
of these Uyghurs,
00:28:03.934 --> 00:28:06.001
Uyghur prisoners,
00:28:06.101 --> 00:28:08.002
and when I did more research,
I found out
00:28:08.069 --> 00:28:10.101
that this genocide
is happening in front of us,
00:28:10.201 --> 00:28:12.133
and no one is
speaking about it.
00:28:13.968 --> 00:28:17.869
♪ Okay, okay...
00:28:17.969 --> 00:28:19.934
It says, "News outlets
when innocent Muslims
00:28:20.034 --> 00:28:21.301
"are getting murdered
every day
00:28:21.368 --> 00:28:23.534
in the Middle East
and in China."
00:28:26.535 --> 00:28:29.068
The next day,
after I posted that video,
00:28:29.168 --> 00:28:31.034
I--I looked at my feed,
00:28:31.168 --> 00:28:33.168
and I saw where
the post used to be,
00:28:33.268 --> 00:28:36.435
it was no longer the image
of my face on there.
00:28:36.535 --> 00:28:39.200
It was just
a black little box.
00:28:41.602 --> 00:28:43.134
When I clicked on it,
it just would say,
00:28:43.235 --> 00:28:44.934
"Video unavailable."
00:28:46.435 --> 00:28:49.435
I was shocked at that time,
but once I found out
00:28:49.535 --> 00:28:51.534
that TikTok is
a Beijing-owned app,
00:28:51.667 --> 00:28:54.067
I was no longer shocked.
00:28:55.635 --> 00:28:58.001
I found out how TikTok's
basically using our data,
00:28:58.101 --> 00:28:59.301
using our information,
00:28:59.401 --> 00:29:00.535
and using it
for their own benefit.
00:29:00.635 --> 00:29:02.235
TikTok is a Beijing-owned app.
00:29:02.335 --> 00:29:04.468
It has censored videos
that are against the CCP.
00:29:04.602 --> 00:29:06.235
I don't know about you, guys,
but I wanna know
00:29:06.335 --> 00:29:08.033
what TikTok's doing
with our information.
00:29:10.834 --> 00:29:13.001
AI is--is hungry for data,
00:29:13.101 --> 00:29:14.401
so the more data you have,
00:29:14.502 --> 00:29:16.535
the more accurate
the AI becomes,
00:29:16.668 --> 00:29:21.068
so in the age of AI,
uh, data's the new oil,
00:29:21.168 --> 00:29:23.434
and China is
the new Saudi Arabia.
00:29:24.902 --> 00:29:30.101
SCOTT: Your data is an asset
to a lot of companies.
00:29:30.168 --> 00:29:32.934
Google and Amazon
and Facebook are so big,
00:29:33.034 --> 00:29:34.634
and they have so much money
because they have
00:29:34.767 --> 00:29:36.468
all of your data.
00:29:36.568 --> 00:29:38.934
And there's a whole secondary
market for data,
00:29:39.001 --> 00:29:40.468
called "Data Brokers."
00:29:40.568 --> 00:29:42.201
And they're gathering
all of this data,
00:29:42.301 --> 00:29:44.168
and they're selling it
to each other,
00:29:44.235 --> 00:29:47.168
and it's really no different
than the stock market.
00:29:47.268 --> 00:29:50.101
If a company can start
gathering that data,
00:29:50.235 --> 00:29:52.468
thousands of points of data
a day,
00:29:52.568 --> 00:29:56.168
from the time someone's five
until the time they're 18,
00:29:56.268 --> 00:29:58.335
those companies,
when they sell that data,
00:29:58.468 --> 00:30:01.068
they have a profile
that knows a child
00:30:01.168 --> 00:30:03.968
way better
than a parent would.
00:30:04.101 --> 00:30:06.767
And that is extremely valuable
to advertisers,
00:30:06.901 --> 00:30:09.301
and to people who want
to persuade you
00:30:09.368 --> 00:30:12.534
to do something you might
otherwise not want to do.
00:30:15.768 --> 00:30:19.134
If social media is determining
and tracking you
00:30:19.235 --> 00:30:20.868
in different ways,
and--and telling you
00:30:20.968 --> 00:30:22.201
what your dreams
are going to be,
00:30:22.301 --> 00:30:24.035
based on the ads
you're getting,
00:30:24.135 --> 00:30:28.534
that impacts the kid's brain
and it impacts their goals.
00:30:30.435 --> 00:30:33.635
There's a lot of harms
happening with these companies
00:30:33.735 --> 00:30:36.235
that are--are based
in Northern California.
00:30:36.301 --> 00:30:38.834
What's different
about TikTok is,
00:30:38.934 --> 00:30:42.001
where is this data going?
00:30:42.101 --> 00:30:43.868
NEWSCASTER:
TikTok is reportedly
00:30:43.934 --> 00:30:45.535
under federal investigation.
00:30:45.635 --> 00:30:47.535
The US Government
reportedly launching
00:30:47.635 --> 00:30:50.001
a national security review
of the company's
00:30:50.134 --> 00:30:53.001
data collection,
and censorship practices,
00:30:53.134 --> 00:30:55.335
amid concerns that users'
personal data
00:30:55.435 --> 00:30:57.802
could be accessible
to foreign governments.
00:30:57.902 --> 00:31:01.235
SCOTT: There's very little
transparency.
00:31:01.335 --> 00:31:03.968
Just because a tech company
says something...
00:31:04.068 --> 00:31:05.568
and I'm not just
talking about TikTok.
00:31:05.668 --> 00:31:09.901
We don't have to take the tech
company at its word.
00:31:10.001 --> 00:31:11.869
I think Facebook
is somewhat scared
00:31:11.935 --> 00:31:15.101
of the quick rise of TikTok,
because all that data
00:31:15.201 --> 00:31:18.467
that Facebook was getting
is now going to TikTok.
00:31:24.201 --> 00:31:25.935
China, one of the few
markets where Facebook
00:31:26.035 --> 00:31:28.801
is unavailable right now,
because of government censors...
00:31:28.901 --> 00:31:30.401
While Facebook CEO,
Mark Zuckerberg,
00:31:30.502 --> 00:31:33.868
appears to be trying bit by bit
to reenter the world's
00:31:34.001 --> 00:31:36.901
largest internet population,
China.
00:31:37.034 --> 00:31:38.401
SHELLY:
Facebook was probably
00:31:38.468 --> 00:31:40.468
the most aggressive
social media company
00:31:40.568 --> 00:31:42.435
to try to get
into China,
00:31:42.502 --> 00:31:45.567
because social media in China
had been blocked
00:31:45.667 --> 00:31:47.501
for a number of years.
00:31:49.235 --> 00:31:51.001
Mark Zuckerberg was trying
to find a way
00:31:51.101 --> 00:31:52.934
that they could exist
somehow in China.
00:31:53.034 --> 00:31:55.367
And so he learned Mandarin.
00:32:04.101 --> 00:32:06.468
He went to a number
of conferences,
00:32:06.568 --> 00:32:09.568
where he could put himself
in front of Xi Jinping,
00:32:09.668 --> 00:32:11.969
and speak
to Xi Jinping directly.
00:32:12.069 --> 00:32:15.601
And he really was aggressive
in saying, you know,
00:32:15.701 --> 00:32:17.068
"This is a huge market.
00:32:17.134 --> 00:32:20.268
"How can we be
this global connector
00:32:20.335 --> 00:32:24.702
for the world and not have
China be part of it?"
00:32:24.768 --> 00:32:28.101
But it became clear to him
in the last few years
00:32:28.168 --> 00:32:29.935
that it's not going to work.
00:32:30.035 --> 00:32:34.602
And so he completely changed,
he completely had a 180
00:32:34.702 --> 00:32:37.201
and he went,
uh, on the offensive.
00:32:37.301 --> 00:32:40.168
A decade ago, almost all
of the major internet platforms
00:32:40.268 --> 00:32:41.468
were American.
00:32:41.602 --> 00:32:45.602
Today, six of the top ten
are Chinese.
00:32:45.702 --> 00:32:47.835
He realized that,
"If I can't win over
00:32:47.935 --> 00:32:50.635
"the Chinese market, then I'm
gonna make it harder for them
00:32:50.735 --> 00:32:53.635
to win over my market
in the US."
00:32:53.702 --> 00:32:56.168
NEWSCASTER: Mark Zuckerberg
has reportedly called TikTok
00:32:56.301 --> 00:32:58.468
"A threat to democracy."
00:32:58.535 --> 00:33:00.668
What the Facebook CEO
failed to mention
00:33:00.768 --> 00:33:03.501
is that he tried to purchase
TikTok's predecessor,
00:33:03.601 --> 00:33:05.069
Musical.ly.
00:33:05.202 --> 00:33:08.602
MAN: Zuckerberg's clearly
very concerned about TikTok,
00:33:08.702 --> 00:33:11.535
because it's the most genuine
new competition
00:33:11.635 --> 00:33:14.334
he's received for a long time.
00:33:16.002 --> 00:33:19.834
Guys, remember when Facebook was
the number one place on Earth?
00:33:19.934 --> 00:33:21.667
My grandmother
doesn't even use Facebook.
00:33:21.734 --> 00:33:22.667
She's too cool.
00:33:22.767 --> 00:33:24.300
No, I'm kidding.
00:33:27.602 --> 00:33:29.335
I posted a prank on my page,
00:33:29.435 --> 00:33:31.901
and it has like
three million views right now.
00:33:32.001 --> 00:33:34.401
[imitates buzzing sound]
00:33:34.535 --> 00:33:36.268
- Oh!
- You know what?
00:33:36.368 --> 00:33:37.667
That's what you get, my boy.
00:33:37.767 --> 00:33:38.934
Uh-huh.
00:33:39.034 --> 00:33:41.435
Come here.
Yeah, yeah. Mm...
00:33:41.568 --> 00:33:43.668
- [screams]
- [laughs]
00:33:43.802 --> 00:33:46.768
SPENCER: I think it's all
about how creative you are.
00:33:46.835 --> 00:33:50.634
I mean, that's what TikTok
has taught me.
00:33:50.734 --> 00:33:52.634
Eh!
00:33:55.868 --> 00:33:59.534
[beatboxing]
00:34:02.268 --> 00:34:03.734
- That was good.
- I like that one.
00:34:03.801 --> 00:34:06.034
Content's easy
with you, Merrick.
00:34:06.168 --> 00:34:08.834
My name is Merrick Hanna.
I am 16 years old.
00:34:08.934 --> 00:34:12.068
[beatboxing]
00:34:12.201 --> 00:34:13.969
When people think
of influencers...
00:34:14.035 --> 00:34:17.767
I think they think
it's very leisurely. It's not.
00:34:20.034 --> 00:34:23.667
[beatboxing]
00:34:24.934 --> 00:34:26.635
- It's fast.
- Is that how it goes?
00:34:26.702 --> 00:34:28.268
- Yeah, it's that fast.
- Why is it so fast?
00:34:28.368 --> 00:34:30.468
- 'Cause it loops.
- Ah.
00:34:30.568 --> 00:34:32.335
Yeah, so it seems like it's
longer when you watch it.
00:34:32.435 --> 00:34:33.767
I don't think it needs
to be quite as fast
00:34:33.901 --> 00:34:35.001
as you're suggesting.
00:34:35.134 --> 00:34:37.301
Try to make it longer,
you're rushing so much.
00:34:37.401 --> 00:34:39.702
To manage Merrick's career
at this point
00:34:39.768 --> 00:34:43.969
is definitely a full-time job,
on top of my full-time job.
00:34:44.069 --> 00:34:46.969
Reading all the emails for him,
reading the contracts,
00:34:47.069 --> 00:34:49.834
reading offers, replying,
the back and forth...
00:34:49.934 --> 00:34:51.635
My dad helps a lot
with finding ideas,
00:34:51.735 --> 00:34:53.435
because that
is a big part of it.
00:34:53.535 --> 00:34:55.401
He'll find a trend
that he thinks I can do,
00:34:55.502 --> 00:34:56.735
he'll show it to me,
and I'll be like,
00:34:56.835 --> 00:34:58.435
"All right,
I know what to do."
00:34:58.535 --> 00:35:01.201
Hit, hit, hit.
Then I'll push back.
00:35:01.301 --> 00:35:02.235
Even though it may not
seem like it,
00:35:02.335 --> 00:35:03.735
he does a lot of the work.
00:35:03.835 --> 00:35:06.934
I--I don't sleep quite as much
as I used to.
00:35:09.400 --> 00:35:12.801
[beatboxing]
00:35:16.201 --> 00:35:17.568
SHAWN:
It's like a gold rush.
00:35:17.668 --> 00:35:19.768
Brands wisely
are now seeing
00:35:19.835 --> 00:35:22.401
that you can
pinpoint an audience
00:35:22.502 --> 00:35:24.401
better, probably,
through social media,
00:35:24.468 --> 00:35:26.002
and TikTok especially,
00:35:26.069 --> 00:35:29.702
than a lot of traditional
means of advertising,
00:35:29.802 --> 00:35:31.635
and then, there's a lot
of people who are trying
00:35:31.735 --> 00:35:33.101
to take advantage
of the gold rush,
00:35:33.201 --> 00:35:34.902
who just shouldn't be,
who are incompetent,
00:35:35.002 --> 00:35:38.168
but a lot of the influencers
are young and inexperienced,
00:35:38.301 --> 00:35:40.735
- and don't know better.
- Right.
00:35:40.835 --> 00:35:43.702
I think that having a parent
filter social media messages
00:35:43.768 --> 00:35:46.134
is critical
parent involvement,
00:35:46.235 --> 00:35:47.901
if you wanna keep an eye
00:35:48.001 --> 00:35:49.934
- on your child--
- It's very important.
00:35:50.034 --> 00:35:51.968
Yeah.
00:35:52.068 --> 00:35:54.134
♪ Inferno
00:35:54.235 --> 00:35:56.901
♪ Baby, I'm the reason
why bad's so fun ♪
00:35:57.001 --> 00:35:59.401
NEWSCASTER: "A third
of TikTok's US users
00:35:59.502 --> 00:36:03.334
may be 14 or under,
raising safety questions."
00:36:05.201 --> 00:36:07.934
An extremely popular video app
that was called Musical.ly
00:36:08.034 --> 00:36:10.335
and is now called TikTok
has agreed to pay
00:36:10.435 --> 00:36:12.568
millions of dollars in fines
for illegally collecting
00:36:12.668 --> 00:36:14.634
personal information
from children.
00:36:20.968 --> 00:36:22.702
Hey, Carter.
00:36:22.802 --> 00:36:25.568
SCOTT: These companies
are preying on children.
00:36:25.668 --> 00:36:28.668
TikTok is amassing
a profile on them,
00:36:28.802 --> 00:36:31.635
so that they can be
targeted by advertisers.
00:36:31.735 --> 00:36:34.401
They can push ideas
to that child,
00:36:34.502 --> 00:36:35.868
and that is dangerous.
00:36:38.368 --> 00:36:42.168
One of the unique features
of TikTok
00:36:42.301 --> 00:36:44.201
is that a child
could post a video,
00:36:44.301 --> 00:36:47.768
uh, dancing and having fun,
uh, and there's a feature
00:36:47.902 --> 00:36:50.435
on it called "Duets",
where you have children
00:36:50.535 --> 00:36:52.801
posting their own videos
and then you just have
00:36:52.934 --> 00:36:54.634
these older men
staring at them.
00:36:54.734 --> 00:36:57.601
♪ Good morning, beautiful
00:36:57.701 --> 00:37:00.068
♪ How was your night?
00:37:00.168 --> 00:37:02.635
SCOTT: Why are these older men
doing... you know,
00:37:02.735 --> 00:37:05.768
just part of "I wanna be
in a duet with you."
00:37:05.869 --> 00:37:09.102
We have one where the girl
went to kiss her camera
00:37:09.202 --> 00:37:11.902
at the same time one of these
older men kisses his camera,
00:37:12.002 --> 00:37:15.634
so it looks like you're having
an older man making out
00:37:15.734 --> 00:37:17.702
with--with a young girl.
00:37:17.802 --> 00:37:19.902
These people
are seeing your children.
00:37:20.002 --> 00:37:23.535
And they could contact them
through the TikTok app.
00:37:23.602 --> 00:37:25.868
There were child predators
before social media,
00:37:25.968 --> 00:37:27.734
but they didn't have
direct access
00:37:27.834 --> 00:37:30.235
to your child's inbox.
00:37:30.335 --> 00:37:32.068
♪ Walk up in dat bit'
too clean, I'm froze ♪
00:37:32.201 --> 00:37:33.635
♪ They don't fight you
where I'm from ♪
00:37:33.735 --> 00:37:35.401
♪ Like the beat,
I keep a drum ♪
00:37:35.502 --> 00:37:36.834
♪ You ain't got doubt,
lil' boy ♪
00:37:36.934 --> 00:37:38.635
I'm involved in working
on litigation
00:37:38.735 --> 00:37:41.235
against TikTok, and my son
is up in his bedroom,
00:37:41.335 --> 00:37:44.835
you know, doing livestream
of TikTok, it turned out.
00:37:44.935 --> 00:37:46.835
So, like, the work I do,
is that--is that ever
00:37:46.935 --> 00:37:49.134
on the back of your mind,
like, "I'm using all these apps,
00:37:49.235 --> 00:37:52.335
"and my dad has these lawsuits
against these companies
00:37:52.435 --> 00:37:54.268
for data
and protecting people..."
00:37:54.368 --> 00:37:55.802
Like, do you think
about that?
00:37:55.935 --> 00:37:57.901
No. I just think about it
'cause you come home
00:37:57.968 --> 00:37:59.568
and tell me
about all that stuff,
00:37:59.668 --> 00:38:02.034
and then I'm like,
technology is, like--someone
00:38:02.134 --> 00:38:05.802
my age--is, like,
so essential to everything
00:38:05.902 --> 00:38:08.002
I do that it's, like,
I kind of have to live
00:38:08.102 --> 00:38:11.034
with the fact that there's gonna
be people that are profiting
00:38:11.134 --> 00:38:15.869
off my data, and I have
no real recourse for that.
00:38:15.935 --> 00:38:18.568
What's more concerning
is, like, the--the accuracy
00:38:18.702 --> 00:38:20.934
of the algorithm, like,
I could be talking
00:38:21.034 --> 00:38:24.001
about a movie
and then later that day,
00:38:24.134 --> 00:38:26.968
that movie, like, shows up
on my feed.
00:38:27.068 --> 00:38:28.635
And you're just talking
randomly to somebody
00:38:28.768 --> 00:38:30.068
- about the movie?
- That happens, like...
00:38:30.168 --> 00:38:32.934
way more often
than I'm comfortable with.
00:38:33.001 --> 00:38:34.535
And that
doesn't freak you out?
00:38:34.635 --> 00:38:36.435
Oh, it does, to some degree,
but, like, I guess
00:38:36.535 --> 00:38:37.567
I'm used to it
at this point,
00:38:37.634 --> 00:38:39.434
like it happens
so much.
00:38:41.401 --> 00:38:43.201
I am on the frontlines
00:38:43.268 --> 00:38:45.602
of fighting privacy battles
for children.
00:38:45.702 --> 00:38:49.468
And my kids know
that's what I do,
00:38:49.568 --> 00:38:51.935
but they're on the app,
00:38:52.069 --> 00:38:55.301
so it's a fight you fight,
but it's a difficult fight
00:38:55.401 --> 00:38:56.935
to win with your kids.
00:38:57.035 --> 00:39:00.335
You can only do so much,
and these companies know that.
00:39:00.435 --> 00:39:03.834
If there's no regulation of it,
you, as a parent,
00:39:03.934 --> 00:39:05.635
you don't have any control
00:39:05.735 --> 00:39:08.501
over what's being pushed
to these kids.
00:39:12.702 --> 00:39:15.834
AVRIEL:
Gen-Z is a unique generation.
00:39:15.901 --> 00:39:17.835
To be a digital native
is to be someone
00:39:17.935 --> 00:39:22.068
who doesn't know a world
without the internet.
00:39:22.168 --> 00:39:24.268
Teenagers are at a really
sensitive point
00:39:24.335 --> 00:39:25.802
in their development,
00:39:25.902 --> 00:39:28.401
both in terms of how
their brains are rewiring,
00:39:28.535 --> 00:39:31.701
and in terms of how they're
making sense of themselves
00:39:31.767 --> 00:39:33.033
and their place in the world.
00:39:35.068 --> 00:39:37.134
What ends up happening
is that the algorithms
00:39:37.235 --> 00:39:40.435
themselves end up shaping
the development
00:39:40.568 --> 00:39:42.201
of teenagers on these apps,
00:39:42.301 --> 00:39:46.934
in ways that we don't
understand at all.
00:39:49.002 --> 00:39:52.235
With any recommendation
algorithm, you run the risk
00:39:52.335 --> 00:39:56.435
of individuals who look
similar to each other,
00:39:56.502 --> 00:39:59.702
in terms of their activity,
getting pushed closer
00:39:59.802 --> 00:40:02.268
and closer and closer together
in terms of content
00:40:02.368 --> 00:40:03.934
that they're being recommended,
00:40:04.034 --> 00:40:06.268
and whatever information
is gonna confirm
00:40:06.335 --> 00:40:08.134
your preexisting beliefs.
00:40:09.835 --> 00:40:13.602
By not allowing people
from diverse perspectives
00:40:13.735 --> 00:40:15.802
to come into contact
with each other...
00:40:15.902 --> 00:40:19.201
it lessens their ability
for empathy.
00:40:19.301 --> 00:40:23.033
The algorithms are
reinforcing social disparities.
00:40:25.468 --> 00:40:26.768
It's not just TikTok.
00:40:26.835 --> 00:40:30.835
It is the technology
that TikTok relies on,
00:40:30.935 --> 00:40:35.834
but recommendation algorithms
have infiltrated
00:40:35.901 --> 00:40:38.034
all aspects of our society.
00:40:39.468 --> 00:40:43.468
Humans are relying
on recommendation systems
00:40:43.535 --> 00:40:45.734
to tell them
what decisions to make.
00:40:47.301 --> 00:40:51.702
And they are determining
our futures moment by moment,
00:40:51.802 --> 00:40:54.734
in ways that we have
very little control over.
00:40:57.768 --> 00:40:59.368
If we fail to regulate
social media
00:40:59.468 --> 00:41:02.969
and the impact that it's
having on this generation,
00:41:03.069 --> 00:41:07.834
we're gonna see a lot
of marginalized teenagers
00:41:07.901 --> 00:41:10.768
experiencing harms
that none of us
00:41:10.902 --> 00:41:12.568
have experienced,
and that none of us
00:41:12.702 --> 00:41:14.934
are prepared
to help them navigate.
00:41:23.968 --> 00:41:25.400
DEJA: I've been
in the spotlight
00:41:25.501 --> 00:41:26.801
since I was 16.
00:41:26.901 --> 00:41:29.968
And it is exhausting.
00:41:30.068 --> 00:41:32.602
There's definitely
this tension, uh, always
00:41:32.702 --> 00:41:34.001
between produce,
produce, produce,
00:41:34.101 --> 00:41:35.301
stay relevant,
00:41:35.435 --> 00:41:37.801
and produce things
that you're going to be
00:41:37.901 --> 00:41:39.235
proud of in ten years,
00:41:39.335 --> 00:41:41.534
and they don't always
go together.
00:41:41.634 --> 00:41:43.235
They can't.
00:41:43.335 --> 00:41:47.868
And... I think it leads me
to question now,
00:41:47.968 --> 00:41:49.535
"Am I doing the right thing?
00:41:49.635 --> 00:41:52.167
Did I make
the right choices?"
00:41:55.134 --> 00:41:57.134
I was raised
by a single mom,
00:41:57.235 --> 00:42:01.801
and I grew up in a household
that, like many,
00:42:01.901 --> 00:42:04.401
couldn't afford
the basics.
00:42:04.468 --> 00:42:10.235
Probably the hardest time
was when my mom started
00:42:10.301 --> 00:42:12.434
to decline
into substance abuse.
00:42:14.502 --> 00:42:17.268
When I was 15, I walked out
of my mom's house,
00:42:17.335 --> 00:42:21.068
because I couldn't get
what I needed there.
00:42:21.168 --> 00:42:22.702
And so, for me,
that looked
00:42:22.768 --> 00:42:24.134
like living
at a friend's house,
00:42:24.235 --> 00:42:27.502
until I graduated
and moved to college.
00:42:27.568 --> 00:42:29.668
I was seeing
a therapist at Columbia
00:42:29.768 --> 00:42:31.668
for the first time ever,
and I, like,
00:42:31.768 --> 00:42:33.934
went into her office
and I was shaking
00:42:34.034 --> 00:42:35.568
and crying
and she couldn't understand,
00:42:35.668 --> 00:42:37.201
and then she was telling me,
"You know,
00:42:37.301 --> 00:42:40.268
why don't you just delete
your social media?"
00:42:40.401 --> 00:42:42.802
I was like, "What you
don't understand
00:42:42.902 --> 00:42:44.401
"is that I can't delete
these accounts,
00:42:44.468 --> 00:42:47.667
"because they are what
keeps me financially stable.
00:42:47.767 --> 00:42:50.268
"I pay all of my own bills,
00:42:50.368 --> 00:42:52.868
and then, additionally,
I pay my mom's bills."
00:42:52.968 --> 00:42:56.468
And monetizing
on social media's given me
00:42:56.568 --> 00:42:59.868
the opportunity to do that."
00:42:59.968 --> 00:43:04.168
When I'm being abused
or harassed online,
00:43:04.268 --> 00:43:07.235
it's almost impossible
for me to step away.
00:43:07.335 --> 00:43:09.300
And it's kind of like
an abusive relationship
00:43:09.434 --> 00:43:10.602
in that regard.
00:43:10.668 --> 00:43:12.502
I have to open myself up
to this hate,
00:43:12.635 --> 00:43:15.234
because this is what creates
financial stability for me.
00:43:20.268 --> 00:43:24.401
What starts out as just
a place to be creative,
00:43:24.502 --> 00:43:25.801
and express yourself,
00:43:25.901 --> 00:43:29.901
becomes this rat race
for attention.
00:43:29.968 --> 00:43:31.901
And this need
to constantly chase
00:43:32.001 --> 00:43:33.701
like counts,
follower counts,
00:43:33.767 --> 00:43:34.734
and view counts.
00:43:34.834 --> 00:43:36.535
A need to constantly perform
00:43:36.602 --> 00:43:38.467
in a way that can really
break people down.
00:43:39.902 --> 00:43:43.168
Social media influencer
was the fourth highest
00:43:43.235 --> 00:43:46.200
aspiration among
elementary school students.
00:43:48.001 --> 00:43:51.001
On the outside,
the life of influencers
00:43:51.068 --> 00:43:54.034
looks really fun
and glamorous.
00:43:54.134 --> 00:43:56.468
On the inside,
a lot of those influencers,
00:43:56.568 --> 00:43:59.268
in addition to getting
some external validation,
00:43:59.368 --> 00:44:02.668
they're getting
a lot of harassment and hate.
00:44:02.768 --> 00:44:06.934
They have to perform
happiness all of the time.
00:44:07.001 --> 00:44:10.634
Many of them are struggling
with depression,
00:44:10.734 --> 00:44:13.368
anxiety, burnout,
00:44:13.435 --> 00:44:19.201
and that is having
very real-world consequences.
00:44:19.268 --> 00:44:22.502
So that algorithm that's
always trying to figure out
00:44:22.635 --> 00:44:24.301
what the hottest trends are,
00:44:24.401 --> 00:44:26.335
is constantly lifting
something up
00:44:26.468 --> 00:44:28.367
to the stratosphere,
and then taking it
00:44:28.467 --> 00:44:29.767
back down again.
00:44:29.901 --> 00:44:31.502
To me, this is bigger
than TikTok.
00:44:31.602 --> 00:44:34.568
It's about who in our society
gets heard,
00:44:34.702 --> 00:44:37.767
and what you have to do
in our society to get heard.
00:44:46.235 --> 00:44:48.134
FEROZA: After my first video
about the Uyghurs
00:44:48.235 --> 00:44:51.901
was taken down, I knew I had
to disguise my video,
00:44:52.001 --> 00:44:54.301
so I grabbed
my pink eyelash curler,
00:44:54.401 --> 00:44:57.702
and I started curling
my lashes.
00:44:57.835 --> 00:44:59.535
This is the one
that started it all.
00:44:59.635 --> 00:45:01.068
Hi, guys,
I'm gonna teach you guys
00:45:01.134 --> 00:45:02.335
how to get long lashes,
00:45:02.435 --> 00:45:04.235
so the first thing
you need to do
00:45:04.335 --> 00:45:06.201
is grab your lash curler,
curl your lashes, obviously,
00:45:06.335 --> 00:45:08.034
then you're gonna
put them down
00:45:08.134 --> 00:45:09.568
and use your phone
that you're using right now
00:45:09.668 --> 00:45:11.435
to search up
what's happening in China,
00:45:11.568 --> 00:45:13.201
how they're getting
concentration camps,
00:45:13.301 --> 00:45:14.968
throwing innocent Muslims
in there,
00:45:15.068 --> 00:45:16.702
separating their families
from each other,
00:45:16.802 --> 00:45:18.235
kidnapping them,
murdering them,
00:45:18.335 --> 00:45:20.001
raping them,
forcing them...
00:45:20.101 --> 00:45:22.468
I spoke about that in just,
like, 40 seconds,
00:45:22.602 --> 00:45:24.901
and then I continued on
to the eyelash tutorial.
00:45:25.034 --> 00:45:26.235
...this is another Holocaust,
00:45:26.335 --> 00:45:28.001
yet no one is talking
about it.
00:45:28.101 --> 00:45:31.835
Please, be aware.
Please, spread awareness.
00:45:31.935 --> 00:45:35.101
And... yeah, so you can grab
your lash curler again...
00:45:35.201 --> 00:45:38.301
It reached millions,
it reached millions, and...
00:45:38.401 --> 00:45:39.301
people were...
00:45:39.401 --> 00:45:41.567
People were shocked
in the comments.
00:45:43.134 --> 00:45:45.034
You popped up in my "For You"
page and I was like,
00:45:45.134 --> 00:45:46.301
"Oh, my God, that's Feroza.
00:45:46.368 --> 00:45:48.101
I was, like,
"I sent this to her,"
00:45:48.201 --> 00:45:49.735
and I was like, "Why are you
on my 'For You' page?
00:45:49.835 --> 00:45:52.368
And why do you have, like, so
many likes?" It was like crazy.
00:45:52.468 --> 00:45:54.168
I had to tell my mom
right after that.
00:45:54.268 --> 00:45:55.868
I was like, "Ugh, people
are sending it now.
00:45:56.001 --> 00:45:58.101
I should tell my mom
I have TikTok."
00:45:58.201 --> 00:46:00.834
My mom, like,
we were watching the news,
00:46:00.934 --> 00:46:02.734
and she's like,
"Is that Feroza?"
00:46:02.834 --> 00:46:04.534
I was like, "Oh, my gosh.
00:46:04.634 --> 00:46:05.801
Yes, it's my friend."
00:46:07.368 --> 00:46:10.635
A 40-second video
going viral in just one day.
00:46:10.735 --> 00:46:13.534
That's, like, the power
that TikTok holds.
00:46:15.068 --> 00:46:17.468
So I decided to post
two more videos, uh,
00:46:17.568 --> 00:46:20.068
the following two days,
to just post more information
00:46:20.168 --> 00:46:21.668
on how to help.
00:46:21.768 --> 00:46:23.968
Hey, guys, you wanted
a second part to the video
00:46:24.034 --> 00:46:25.435
on how to get longer lashes,
so here it is.
00:46:25.535 --> 00:46:27.001
And, by the way,
I say that so TikTok
00:46:27.101 --> 00:46:28.668
doesn't take down my videos.
00:46:28.802 --> 00:46:32.667
I don't think TikTok noticed
what I posted at first,
00:46:32.767 --> 00:46:34.301
and then the following day,
00:46:34.435 --> 00:46:36.635
it was like a Monday,
I wake up at 5:00 am
00:46:36.735 --> 00:46:40.567
for school, and I go on TikTok
to see how many views
00:46:40.667 --> 00:46:42.034
the next two videos got,
00:46:42.101 --> 00:46:45.168
and I see that I can't even
go on TikTok.
00:46:45.301 --> 00:46:47.100
My--my account's suspended.
00:46:48.702 --> 00:46:51.335
"Your account is
temporarily suspended
00:46:51.435 --> 00:46:55.435
because it goes against
community guidelines."
00:46:55.535 --> 00:46:58.535
Nothing from my posts
violates community guidelines.
00:46:58.635 --> 00:47:02.602
I show no hate speech,
I show no profanity,
00:47:02.668 --> 00:47:05.534
everything I spoke
about was factual evidence.
00:47:06.869 --> 00:47:09.768
My first thought after seeing
this black screen
00:47:09.902 --> 00:47:13.934
on my phone was, "I'm not
letting them silence me."
00:47:14.034 --> 00:47:16.602
I--I quickly made
a Twitter account.
00:47:16.735 --> 00:47:21.567
I quickly posted on Instagram
that, "Hey, I was silenced
00:47:21.667 --> 00:47:22.869
for speaking up,"
00:47:22.969 --> 00:47:25.802
and I'm not gonna let them
get away with that.
00:47:25.902 --> 00:47:28.634
And I asked people
to continue sharing the video.
00:47:30.034 --> 00:47:32.968
I'm so grateful
that people heard me
00:47:33.068 --> 00:47:36.101
saying that my voice
was taken away.
00:47:36.235 --> 00:47:37.834
17-year-old Feroza Aziz...
00:47:37.934 --> 00:47:39.401
[speaks in foreign language]
00:47:39.468 --> 00:47:41.702
NEWSCASTER: Joining us now
is Feroza Aziz...
00:47:41.835 --> 00:47:44.834
FEROZA: In less than a few
days, I was on Al-Jazeera,
00:47:44.934 --> 00:47:46.635
BBC, CNN...
00:47:46.702 --> 00:47:51.001
More than 1.5 million people
watched it just on TikTok.
00:47:51.101 --> 00:47:52.801
What kind of responses
have you had?
00:47:52.901 --> 00:47:54.335
Just from regular people...
00:47:54.435 --> 00:47:56.768
Half of them are positive,
and the other half
00:47:56.869 --> 00:47:59.101
is, "Well, I didn't know
this was happening.
00:47:59.201 --> 00:48:00.767
"Why am I hearing this
on TikTok
00:48:00.834 --> 00:48:02.401
and why not on the news?"
00:48:02.502 --> 00:48:05.268
I did, um, feel
a little bit upset, though,
00:48:05.368 --> 00:48:07.502
because it felt as if
more attention was brought
00:48:07.635 --> 00:48:10.101
to me being silenced
than to what I was actually
00:48:10.201 --> 00:48:11.368
speaking about.
00:48:11.468 --> 00:48:13.535
I remember seeing
headlines saying,
00:48:13.668 --> 00:48:16.535
"Oh, Beijing-owned app
takes down video."
00:48:16.635 --> 00:48:20.334
And not, "Oh, like, there's
a Uyghur genocide happening."
00:48:22.235 --> 00:48:24.635
I felt very overwhelmed
with the news coverage.
00:48:24.735 --> 00:48:26.802
One of my idols--
I have, like, her picture
00:48:26.935 --> 00:48:31.400
on my wall, mm, AOC retweeted
an article about me.
00:48:32.768 --> 00:48:35.268
I didn't expect, um,
politicians from China
00:48:35.368 --> 00:48:36.835
to even comment on it.
00:48:36.935 --> 00:48:39.467
China's Foreign Ministry
said it had no specifics
00:48:39.601 --> 00:48:40.601
of this case.
00:48:42.635 --> 00:48:44.635
You're saying the content
is still on the TikTok account?
00:48:44.735 --> 00:48:46.401
I'm not aware
of the situation.
00:48:46.468 --> 00:48:47.602
How could I know
what's happening
00:48:47.668 --> 00:48:49.435
on the account
of one individual?
00:48:49.502 --> 00:48:51.535
I thought I had
the freedom of speech,
00:48:51.668 --> 00:48:55.268
but I guess... under TikTok
that's not possible
00:48:55.368 --> 00:48:57.767
for me to have that right.
00:49:03.502 --> 00:49:06.568
DAVID POLGAR: Content
moderation is a process
00:49:06.668 --> 00:49:08.401
of determining
what's appropriate
00:49:08.502 --> 00:49:11.502
and what's not
appropriate online.
00:49:11.635 --> 00:49:14.668
One of the natural tensions
becomes if you have
00:49:14.768 --> 00:49:17.969
a company but it's
all throughout the globe,
00:49:18.069 --> 00:49:21.467
do you adjust to the cultural
norms of another country?
00:49:22.902 --> 00:49:25.602
And a lot of people,
when on Facebook or TikTok
00:49:25.702 --> 00:49:28.868
or Instagram, they talk
about it by using concepts
00:49:28.968 --> 00:49:30.502
like freedom of speech,
00:49:30.602 --> 00:49:35.701
because all throughout society,
specifically American society,
00:49:35.801 --> 00:49:38.335
we have debated:
00:49:38.401 --> 00:49:40.668
What's appropriate?
How do you balance
00:49:40.768 --> 00:49:43.468
individual autonomy
and expression
00:49:43.602 --> 00:49:45.168
with the societal impact?
00:49:45.235 --> 00:49:48.767
That used to reside
in governmental bodies.
00:49:52.068 --> 00:49:54.335
With social media,
the power
00:49:54.435 --> 00:49:55.602
of determining speech
00:49:55.668 --> 00:49:58.535
has been far
too consolidated.
00:49:58.668 --> 00:50:00.934
Major tech companies...
if they have the power
00:50:01.068 --> 00:50:04.268
of deciding what's okay
and what's not okay
00:50:04.368 --> 00:50:07.168
with what I say,
if they have the power
00:50:07.301 --> 00:50:09.068
to deplatform,
00:50:09.134 --> 00:50:13.001
that puts a tremendous
level of power
00:50:13.134 --> 00:50:15.235
in an unelected official.
00:50:15.335 --> 00:50:16.934
That's antidemocratic.
00:50:20.635 --> 00:50:22.201
I mean, the thing
about TikTok is,
00:50:22.301 --> 00:50:23.968
it's, so it's--it's
Chinese-owned,
00:50:24.034 --> 00:50:26.869
or just that the owners,
they don't control it.
00:50:27.002 --> 00:50:31.201
In fact, TikTok takes
a much, much stronger attitude
00:50:31.268 --> 00:50:33.268
against the sort
of content that...
00:50:33.368 --> 00:50:34.802
well, that
the Chinese government
00:50:34.902 --> 00:50:37.268
wouldn't like to see
on a social media app.
00:50:37.368 --> 00:50:39.535
There's no orders coming down
from up high,
00:50:39.668 --> 00:50:43.668
there's just the understanding
that you will do what Beijing
00:50:43.768 --> 00:50:46.201
wants and you'll try
and guess what they want
00:50:46.301 --> 00:50:48.467
and do it
without being asked.
00:50:58.635 --> 00:51:01.401
You know, I might--I might
go livestream
00:51:01.468 --> 00:51:03.435
on Douyin a little bit.
00:51:03.502 --> 00:51:06.034
That's what's hot
on Douyin, but
00:51:06.134 --> 00:51:09.667
there's one problem,
'cause all the restrictions
00:51:09.734 --> 00:51:11.868
of my tattoos.
00:51:11.934 --> 00:51:13.801
They might shut me down.
00:51:13.901 --> 00:51:14.868
Here we go.
00:51:14.968 --> 00:51:16.634
We're live!
00:51:16.734 --> 00:51:18.368
Tattoos? No.
00:51:18.468 --> 00:51:21.601
Also the piercing? No.
00:51:35.435 --> 00:51:39.268
Ah! Sorry, sorry.
00:51:39.368 --> 00:51:41.602
You see this?
00:51:41.702 --> 00:51:45.734
Because I have tattoos,
I can't--I can't go livestream.
00:51:56.535 --> 00:51:58.602
That's--that's really messed up.
You know?
00:51:58.735 --> 00:52:00.435
Yeah, on Douyin
you have to watch
00:52:00.535 --> 00:52:02.268
everything that you say.
00:52:02.401 --> 00:52:06.201
Just one word...
one frame
00:52:06.301 --> 00:52:08.234
could set your whole video off.
00:52:10.802 --> 00:52:14.834
For the first time in history,
a person can write something
00:52:14.901 --> 00:52:17.968
or say something
and have it reach
00:52:18.101 --> 00:52:20.368
a large segment of the world.
00:52:20.502 --> 00:52:23.101
So this brings up the topic
of censorship,
00:52:23.201 --> 00:52:24.969
which is really tricky.
00:52:25.069 --> 00:52:29.567
The internet allows
for the fastest spread of ideas
00:52:29.667 --> 00:52:31.535
in the history of the world.
00:52:31.635 --> 00:52:33.168
China with
the Great Firewall
00:52:33.268 --> 00:52:36.702
and with government moderation
has taken a very active hand
00:52:36.802 --> 00:52:39.568
in controlling what topics
are discussed,
00:52:39.635 --> 00:52:42.168
what ideas are acceptable
to discuss on the internet.
00:52:42.268 --> 00:52:46.401
We've never had to grapple
with questions around censorship
00:52:46.502 --> 00:52:48.802
in an era where
so many people have
00:52:48.869 --> 00:52:51.667
a global megaphone now
in their hands.
00:52:55.568 --> 00:52:58.101
It was back in late December
when Dr. Li Wenliang
00:52:58.201 --> 00:52:59.602
first warned friends
on WeChat
00:52:59.702 --> 00:53:02.168
about a SARS-like
disease going around.
00:53:02.268 --> 00:53:04.702
Li sent a group message
saying that a test result
00:53:04.802 --> 00:53:06.602
from a patient quarantined
at the hospital
00:53:06.702 --> 00:53:09.835
where he worked showed
a patient had a coronavirus,
00:53:09.935 --> 00:53:12.934
but hours after hitting send,
Wuhan City health officials
00:53:13.034 --> 00:53:14.401
tracked Li down,
00:53:14.502 --> 00:53:16.968
questioning where he got
the information.
00:53:27.301 --> 00:53:32.901
Dr. Li sounded the alarm early
in the COVID-19 outbreak.
00:53:32.968 --> 00:53:35.901
He soon faced
government intimidation,
00:53:35.968 --> 00:53:39.534
and then contracted
the virus.
00:53:39.634 --> 00:53:41.268
When he passed away,
00:53:41.368 --> 00:53:43.901
I was among
many Chinese netizens
00:53:44.001 --> 00:53:46.834
who expressed grief
and outrage
00:53:46.934 --> 00:53:49.401
at the events on Weibo,
00:53:49.468 --> 00:53:52.534
only to have
my account deleted.
00:53:54.568 --> 00:53:57.969
I felt guilt
more than anger.
00:53:58.069 --> 00:54:01.934
At the time, I was
a tech worker at ByteDance,
00:54:02.034 --> 00:54:05.335
where I helped develop tools
and platforms
00:54:05.435 --> 00:54:07.401
for content moderation.
00:54:07.502 --> 00:54:11.502
In other words,
I had helped build a system
00:54:11.602 --> 00:54:14.768
that censored accounts
like mine.
00:54:14.835 --> 00:54:18.902
The technologies we created
supported the entire company's
00:54:19.035 --> 00:54:22.435
content moderation,
including Douyin at home,
00:54:22.535 --> 00:54:25.334
and its international
equivalent, TikTok.
00:54:28.668 --> 00:54:32.101
There was a long,
constantly updated list
00:54:32.201 --> 00:54:36.702
of sensitive words,
dates and names.
00:54:36.768 --> 00:54:39.702
If a user mentioned
a sensitive term,
00:54:39.768 --> 00:54:43.101
they would shut down the
ongoing livestreaming session
00:54:43.168 --> 00:54:47.001
and even suspend
or delete the account.
00:54:48.468 --> 00:54:50.567
Many of my colleagues
felt uneasy
00:54:50.667 --> 00:54:53.201
about what we were doing,
00:54:53.301 --> 00:54:57.134
but we all felt that
there was nothing we could do.
00:54:58.768 --> 00:55:01.301
Dr. Li warned
his colleagues and friends
00:55:01.401 --> 00:55:04.335
about an unknown virus.
00:55:04.435 --> 00:55:07.001
He was punished for that.
00:55:07.134 --> 00:55:11.034
Just imagine,
had any social media platform
00:55:11.168 --> 00:55:13.002
been able to reject
the government's
00:55:13.102 --> 00:55:17.701
censorship directives,
perhaps millions of lives
00:55:17.801 --> 00:55:21.968
would have been saved today.
00:55:24.035 --> 00:55:26.268
NEWSCASTER: TikTok faces
government investigation
00:55:26.368 --> 00:55:27.934
in over seven countries,
00:55:28.034 --> 00:55:30.335
all citing concerns
over national security
00:55:30.468 --> 00:55:33.234
and content moderation.
00:55:42.602 --> 00:55:47.702
CHRIS: In 2019, we had someone
contact us,
00:55:47.802 --> 00:55:51.168
claiming to have internal
information and access
00:55:51.268 --> 00:55:54.801
to internal TikTok
moderation guidelines.
00:55:56.301 --> 00:56:01.235
And I don't think we realized,
at the time,
00:56:01.301 --> 00:56:04.068
how big the story would get.
00:56:04.134 --> 00:56:06.100
[cell phone buzzing]
00:56:08.035 --> 00:56:13.702
What we saw was that TikTok
was very explicit
00:56:13.768 --> 00:56:17.001
about what it wanted to have
on the platform,
00:56:17.101 --> 00:56:20.234
and what it didn't want
to show on the platform.
00:56:21.701 --> 00:56:24.467
TikTok rarely deletes content.
00:56:24.601 --> 00:56:25.601
They don't have to.
00:56:25.701 --> 00:56:27.034
They can just hide it.
00:56:28.468 --> 00:56:31.835
The guidelines were explicitly
instructing moderators
00:56:31.935 --> 00:56:38.034
to deal with people who were
LGBTQ or had disabilities
00:56:38.101 --> 00:56:41.235
or for whatever reason
TikTok felt
00:56:41.301 --> 00:56:46.100
were vulnerable to bullying
by hiding their content.
00:56:49.368 --> 00:56:50.702
So it was in Mandarin,
00:56:50.768 --> 00:56:53.501
and underneath
a fairly awkward
00:56:53.601 --> 00:56:55.301
English translation,
00:56:55.368 --> 00:56:59.401
so it says "subjects
who is susceptible to bullying
00:56:59.502 --> 00:57:03.667
or harassment, based on their
physical or mental condition.
00:57:03.767 --> 00:57:07.934
"Example, facial disfigurement,
00:57:08.068 --> 00:57:11.667
"autism,
Down syndrome,
00:57:11.734 --> 00:57:13.468
"disabled people or people
00:57:13.535 --> 00:57:16.201
"with some facial
problems, etc.
00:57:16.335 --> 00:57:20.001
"Content of subjects likely
to incite cyberbullying
00:57:20.134 --> 00:57:24.834
will be allowed,
but marked with risk tag 4."
00:57:27.168 --> 00:57:29.568
Basically, there are, like...
different levels
00:57:29.635 --> 00:57:32.201
of what we call
"algorithmic punishment"
00:57:32.301 --> 00:57:33.768
or "algorithmic visibility,"
00:57:33.835 --> 00:57:36.901
so they were put in a category
called "Risk 4,"
00:57:37.034 --> 00:57:39.868
which means that as soon
as their videos
00:57:39.968 --> 00:57:43.235
would reach a certain threshold
of views,
00:57:43.301 --> 00:57:47.567
they would also automatically be
removed from the "For You" feed.
00:57:52.602 --> 00:57:56.968
Later on,
other leaks surfaced.
00:58:18.134 --> 00:58:21.734
I actually have the ugly
content policy
00:58:21.834 --> 00:58:22.934
right in front of me.
00:58:23.034 --> 00:58:25.401
So... crazy to read this.
00:58:25.535 --> 00:58:28.335
"Abnormal body shape,
chubby,
00:58:28.435 --> 00:58:30.101
"ugly facial looks,
00:58:30.168 --> 00:58:32.801
"not limited
to 'disformatted' face,
00:58:32.901 --> 00:58:35.068
"fangs, lack of front teeth,
00:58:35.168 --> 00:58:37.635
senior people
with too many wrinkles..."
00:58:37.735 --> 00:58:39.868
And it just goes
on and on, right?
00:58:51.001 --> 00:58:52.801
NEWSCASTER:
In a statement, TikTok said...
00:59:08.201 --> 00:59:10.868
It's just a lot of the "move
fast and break things"
00:59:10.968 --> 00:59:12.134
attitude that we've seen
00:59:12.235 --> 00:59:15.301
from other
Silicon Valley companies.
00:59:15.435 --> 00:59:20.133
It's not like only TikTok
was doing these things.
00:59:22.368 --> 00:59:27.668
Obviously, the representation
that we see in media
00:59:27.735 --> 00:59:32.801
is not an accurate picture
of society,
00:59:32.934 --> 00:59:35.101
but I think there is
a difference
00:59:35.201 --> 00:59:38.835
that, you know, no TV station,
nor Hollywood,
00:59:38.969 --> 00:59:43.001
pretends to be open access
to everybody,
00:59:43.101 --> 00:59:45.068
whereas this is a promise
00:59:45.201 --> 00:59:49.068
that social media
platforms make.
00:59:52.268 --> 00:59:53.902
Am I the only one
that has noticed
00:59:54.035 --> 00:59:57.168
that Black creators get
least favored by the algorithm?
00:59:57.268 --> 00:59:58.968
How is it that my followers
are not seeing my video?
00:59:59.068 --> 01:00:00.502
What's up
with that algorithm?
01:00:00.602 --> 01:00:03.801
I've had some of my TikTok
videos get zero views,
01:00:03.901 --> 01:00:06.001
and I've been shadowbanned.
01:00:06.101 --> 01:00:09.767
EMILY: Shadowbanning
on TikTok is just when
01:00:09.901 --> 01:00:11.635
there's something
in the algorithm
01:00:11.735 --> 01:00:14.134
that just kind
of shuts you out completely.
01:00:14.235 --> 01:00:16.934
They just, like, find a way
to make it
01:00:17.034 --> 01:00:19.400
so nobody sees
any of your content.
01:00:22.368 --> 01:00:24.468
I am an apprenticing
ocularist,
01:00:24.602 --> 01:00:26.868
an artist who works
in the medical field, uh,
01:00:27.001 --> 01:00:29.834
making prosthetic eyes.
01:00:31.101 --> 01:00:33.868
TikTok's algorithm
is very good.
01:00:33.968 --> 01:00:35.935
You know, you can
create an account
01:00:36.035 --> 01:00:38.301
and within a couple of hours
or a couple of days
01:00:38.401 --> 01:00:40.701
that algorithm
knows who you are.
01:00:40.801 --> 01:00:42.902
You know?
01:00:43.002 --> 01:00:45.568
So for that same algorithm
to kind of just...
01:00:45.702 --> 01:00:48.368
rip the rug out from under thousands
of Black creators,
01:00:48.502 --> 01:00:51.901
it kind of pulls you back
for a second.
01:00:51.968 --> 01:00:53.101
We know the history
01:00:53.201 --> 01:00:54.435
that this nation
has with Black people.
01:00:54.535 --> 01:00:56.734
We know the savagery
that they had to endure
01:00:56.834 --> 01:00:58.235
because of colonizers,
01:00:58.368 --> 01:00:59.968
and the savagery that
they still have to endure...
01:01:00.068 --> 01:01:02.335
I got on, and I had made
a video talking
01:01:02.435 --> 01:01:06.468
about how my "For You" page
was only of white creators
01:01:06.568 --> 01:01:10.934
and by that point,
I would say I had...
01:01:11.034 --> 01:01:14.535
150,000 to 200,000 followers.
01:01:14.635 --> 01:01:17.902
And so I had a video sitting
that I'd published
01:01:17.969 --> 01:01:20.901
for three hours,
and it said zero views.
01:01:21.001 --> 01:01:23.435
That was the first time
where I was, like,
01:01:23.568 --> 01:01:25.668
this is blatant shadowbanning.
01:01:25.768 --> 01:01:28.134
Nobody's seeing us
'cause they're ensuring
01:01:28.235 --> 01:01:29.801
that nobody can.
01:01:30.701 --> 01:01:32.168
NEWSCASTER: Tech troubles.
01:01:32.268 --> 01:01:35.401
TikTok says a technical glitch
is making it appear
01:01:35.502 --> 01:01:38.068
as if posts with the hashtag
"#blacklivesmatter"
01:01:38.168 --> 01:01:40.601
and "#georgefloyd"
received no views.
01:01:40.667 --> 01:01:42.001
The video platform says
01:01:42.101 --> 01:01:44.235
it's dealing
with a display issue
01:01:44.301 --> 01:01:46.468
adding that videos
featuring those tags
01:01:46.568 --> 01:01:49.735
have amassed more
than two billion views.
01:01:49.869 --> 01:01:52.468
Normally when you go
to use a tag on TikTok,
01:01:52.568 --> 01:01:57.902
it'll tell you how many views
have been on that tag.
01:01:57.969 --> 01:02:00.201
And you go to write
"blacklivesmatter",
01:02:00.335 --> 01:02:03.602
and it says zero
or, you know, "#blm,"
01:02:03.668 --> 01:02:06.601
"georgefloyd,"
"ahmaudarbery," anything.
01:02:06.701 --> 01:02:08.802
It would tell you zero.
01:02:08.869 --> 01:02:10.134
NEWSCASTER: TikTok said
in a statement,
01:02:10.268 --> 01:02:11.367
which reads in part:
01:02:12.467 --> 01:02:13.435
"We want to..."
01:02:13.535 --> 01:02:15.235
"Last week,
a technical glitch made it
01:02:15.335 --> 01:02:19.668
"temporarily appear as if posts
uploaded using #blacklivesmatter
01:02:19.735 --> 01:02:23.068
"and #georgefloyd
would receive zero views.
01:02:23.134 --> 01:02:24.335
"We understand
that many assume
01:02:24.435 --> 01:02:26.201
"this bug to be
an intentional act
01:02:26.335 --> 01:02:28.301
"to suppress experiences
and invalidate the emotions
01:02:28.401 --> 01:02:29.901
"felt by the Black community,
01:02:29.968 --> 01:02:31.268
"and we know
we have work to do
01:02:31.368 --> 01:02:33.735
to regain
and repair that trust."
01:02:33.835 --> 01:02:35.734
Kind of like the normal
checkpoints
01:02:35.801 --> 01:02:37.034
that people go through.
01:02:37.134 --> 01:02:38.601
"We're growing,
we're learning,
01:02:38.701 --> 01:02:41.668
we're trying to do better."
01:02:41.802 --> 01:02:44.335
I would love to believe
that it was a technical glitch.
01:02:44.435 --> 01:02:46.301
'Cause you're like,
"That's absolutely possible.
01:02:46.435 --> 01:02:49.535
100%," but it's
so oddly specific
01:02:49.635 --> 01:02:52.868
that I can't attribute that
to just being a glitch.
01:02:54.368 --> 01:02:56.501
SHELLY: TikTok has said
that their content moderation
01:02:56.601 --> 01:02:57.968
has changed.
01:02:58.101 --> 01:03:00.168
Some of what you see on there
backs that up.
01:03:00.301 --> 01:03:03.535
In the sense that you see
a lot of activism there.
01:03:03.602 --> 01:03:05.168
You saw Black Lives
Matter content
01:03:05.235 --> 01:03:07.568
eventually be
up on there.
01:03:07.668 --> 01:03:11.435
But it's constantly changing,
it's a constant black box.
01:03:11.535 --> 01:03:15.034
We have no idea what's going
into any of these algorithms.
01:03:15.134 --> 01:03:17.968
And there's zero transparency.
01:03:21.335 --> 01:03:23.068
NEWSCASTER: ByteDance,
the Beijing-based
01:03:23.168 --> 01:03:26.068
owner of TikTok,
apologized for the suspension,
01:03:26.134 --> 01:03:28.401
blaming a human
moderation error.
01:03:28.502 --> 01:03:31.235
And TikTok says it doesn't
apply Chinese moderation
01:03:31.335 --> 01:03:34.200
principles to its product
outside of mainland China.
01:03:35.935 --> 01:03:39.934
FEROZA: After a few days,
TikTok gave my account back.
01:03:40.001 --> 01:03:41.535
People don't seem
to understand
01:03:41.635 --> 01:03:44.168
what it feels like to have
someone try to take away
01:03:44.235 --> 01:03:47.268
your voice, and then
they give it back to you.
01:03:47.401 --> 01:03:49.901
It's my voice,
and them deciding
01:03:50.034 --> 01:03:51.235
to give me back
my account
01:03:51.335 --> 01:03:52.702
after taking it away.
It was...
01:03:52.835 --> 01:03:54.834
as if they could control
what I could say
01:03:54.934 --> 01:03:56.535
and what I could do,
01:03:56.668 --> 01:03:59.467
and it's just disgusting
to see an app do that.
01:04:01.468 --> 01:04:03.968
To this day,
my classmates will post
01:04:04.068 --> 01:04:06.034
on my social media accounts,
01:04:06.168 --> 01:04:08.035
leaving hate comments.
01:04:08.169 --> 01:04:10.368
I can delete the comments,
but I'm gonna go to class
01:04:10.468 --> 01:04:12.168
the next day, and I'm gonna
sit next to the person
01:04:12.268 --> 01:04:15.368
who hates my guts,
for just speaking on issues
01:04:15.468 --> 01:04:17.968
I believe
need to be spoken about.
01:04:19.468 --> 01:04:22.802
When you're so invested in
apps like TikTok,
01:04:22.869 --> 01:04:26.235
when something bad happens
on social media,
01:04:26.301 --> 01:04:27.467
your life is torn apart.
01:04:32.368 --> 01:04:35.767
Story time. We all worked
on the Kamala Harris campaign,
01:04:35.901 --> 01:04:37.602
in the presidential primary.
01:04:37.735 --> 01:04:41.501
And this is your sign to get
a tattoo with your work besties.
01:04:41.567 --> 01:04:43.001
For the people.
01:04:43.134 --> 01:04:44.401
DEJA:
I was 19 years old
01:04:44.502 --> 01:04:46.468
when I started
on the Kamala Harris campaign.
01:04:46.568 --> 01:04:49.801
I withdrew
my sophomore year at Columbia.
01:04:49.901 --> 01:04:51.401
And it was a huge move.
01:04:51.502 --> 01:04:53.401
I think that our perspective
as young people
01:04:53.502 --> 01:04:55.968
is what led us to think
TikTok is important.
01:04:56.034 --> 01:04:57.401
There's a lot
of young people there.
01:04:57.468 --> 01:05:00.901
We were the first campaign
that was putting content
01:05:01.001 --> 01:05:02.468
on TikTok directly.
01:05:02.568 --> 01:05:04.968
I feel like we have to take
a moment for this... this.
01:05:05.068 --> 01:05:05.968
Oh, thank you.
01:05:06.068 --> 01:05:07.101
Oh. Yes.
01:05:07.201 --> 01:05:09.901
All of Jessica's iconic shots.
01:05:09.968 --> 01:05:12.235
So good.
01:05:12.368 --> 01:05:15.435
I mean, you pioneered
vertical video.
01:05:15.568 --> 01:05:17.801
I--I love that shot
in the rain.
01:05:17.901 --> 01:05:19.502
She was boogie-ing it down.
01:05:19.602 --> 01:05:22.368
That one was everywhere.
I love that.
01:05:22.468 --> 01:05:25.001
I just remember, like,
all those airplane videos
01:05:25.068 --> 01:05:26.734
really making the rounds
on TikTok.
01:05:26.834 --> 01:05:27.934
Absolutely.
01:05:28.034 --> 01:05:29.301
They were
showing up everywhere.
01:05:29.435 --> 01:05:30.835
And it was interesting
to see the progression.
01:05:30.902 --> 01:05:33.835
Like, when different candidates
started to have, like,
01:05:33.935 --> 01:05:37.034
their official TikToks
and what kind of content
01:05:37.101 --> 01:05:39.101
they made on TikTok.
01:05:39.201 --> 01:05:41.201
I think we felt like
we were kind of starting
01:05:41.301 --> 01:05:42.468
to hit our stride on TikTok,
01:05:42.535 --> 01:05:45.235
and then we, like,
had to stop.
01:05:45.335 --> 01:05:48.101
That email, that dreadful,
dreadful email we got.
01:05:48.201 --> 01:05:49.502
The dark day
when we were told
01:05:49.568 --> 01:05:52.134
that we couldn't be
on TikTok anymore.
01:05:52.235 --> 01:05:53.735
DEJA: I get this email that,
01:05:53.835 --> 01:05:56.635
because of security reasons,
we were all being asked
01:05:56.768 --> 01:06:01.235
to delete TikTok on government
and military phones...
01:06:01.301 --> 01:06:03.435
That was a sad email.
That was a tough one.
01:06:03.502 --> 01:06:05.602
NEWSCASTER: TikTok's ownership
by a Chinese parent company
01:06:05.702 --> 01:06:07.335
subject to Chinese
surveillance law
01:06:07.435 --> 01:06:10.101
has made the app's
popularity problematic,
01:06:10.201 --> 01:06:12.034
causing concerns
from the US Army,
01:06:12.134 --> 01:06:14.301
the Navy, the TSA,
the DNC, the RNC,
01:06:14.368 --> 01:06:15.702
and the Biden campaign,
01:06:15.768 --> 01:06:17.701
all banning TikTok
from their phones.
01:06:19.468 --> 01:06:21.268
SHELLY: There were many young
soldiers in the military
01:06:21.368 --> 01:06:23.768
who were using TikTok,
they were in all sorts
01:06:23.869 --> 01:06:27.001
of US military bases
around the world,
01:06:27.101 --> 01:06:28.368
and so they would go
01:06:28.468 --> 01:06:29.969
and they would do
pushup contests.
01:06:30.069 --> 01:06:32.802
They would, you know,
do tours of the bases, then...
01:06:32.869 --> 01:06:35.702
You know, they were really
showing some pretty...
01:06:35.835 --> 01:06:38.734
top secret assets
to anyone in the world
01:06:38.868 --> 01:06:40.034
wanting to see them,
01:06:40.101 --> 01:06:41.702
and this was
at a moment where...
01:06:41.802 --> 01:06:44.935
people were
not taking TikTok seriously.
01:06:45.035 --> 01:06:49.001
But what they realized was
this silly little kids' app
01:06:49.101 --> 01:06:50.702
was collecting a ton
of information
01:06:50.768 --> 01:06:55.134
on GPS, on location,
of all these soldiers,
01:06:55.235 --> 01:06:56.702
and all of that,
in the end,
01:06:56.802 --> 01:07:00.869
was heading back
into a Chinese company.
01:07:00.969 --> 01:07:02.802
♪ It took too long,
it took too long ♪
01:07:02.902 --> 01:07:05.502
♪ It took too long
for you to call back ♪
01:07:05.602 --> 01:07:07.834
♪ And normally I would
just forget that... ♪
01:07:07.901 --> 01:07:09.934
From a nation state's
perspective,
01:07:10.068 --> 01:07:12.101
well, data is the new oil.
01:07:12.201 --> 01:07:15.335
If I can understand
the connections between people,
01:07:15.435 --> 01:07:17.502
I can start to target
my misinformation,
01:07:17.568 --> 01:07:20.168
so that one person is likely
to take actions
01:07:20.268 --> 01:07:22.868
in the real world,
like vote.
01:07:24.268 --> 01:07:26.401
So the data
that TikTok collects
01:07:26.502 --> 01:07:28.335
is on par with what
other social media companies
01:07:28.435 --> 01:07:31.002
are collecting.
01:07:31.102 --> 01:07:35.001
So, the question really becomes,
"Why is TikTok being picked on?"
01:07:35.101 --> 01:07:37.468
Xenophobia should certainly
be considered a part of this.
01:07:37.568 --> 01:07:39.702
We've seen
a rise in hate crimes
01:07:39.835 --> 01:07:42.268
against Asian-Americans,
and so I think,
01:07:42.368 --> 01:07:44.734
being very clear
about the differences,
01:07:44.868 --> 01:07:47.535
uh, of the practices
of a government
01:07:47.668 --> 01:07:50.268
versus the people
that happen to reside
01:07:50.368 --> 01:07:51.702
inside of that nation state.
01:07:51.802 --> 01:07:54.068
After all, I don't agree
with 100% of the things
01:07:54.134 --> 01:07:56.101
our nation does.
01:07:56.201 --> 01:07:58.334
♪ I'm singing Trump 2020
01:07:58.434 --> 01:08:00.235
♪ Trump 2020
01:08:00.335 --> 01:08:04.734
♪ Trump 2020,
Trump 2020 ♪
01:08:06.235 --> 01:08:09.602
ALEYSHA: When I first found
Gen-Z comedians online,
01:08:09.668 --> 01:08:12.435
it was so inspiring to me
as a comedian,
01:08:12.535 --> 01:08:15.968
seeing how easy it is
to build traction
01:08:16.068 --> 01:08:18.001
on apps like TikTok.
01:08:19.834 --> 01:08:21.702
One of my friends
had posted
01:08:21.802 --> 01:08:23.702
that Donald Trump's
Tulsa rally
01:08:23.802 --> 01:08:26.502
had free tickets,
and my first thought
01:08:26.602 --> 01:08:30.502
was just how easy it was
to get a ticket.
01:08:30.568 --> 01:08:32.401
Guys, Donald Trump
is having a rally.
01:08:32.502 --> 01:08:34.134
All you had to do
was give your phone number
01:08:34.235 --> 01:08:38.401
and so I got two tickets,
but I totally forgot
01:08:38.502 --> 01:08:41.968
that I had to pick
every individual piece
01:08:42.034 --> 01:08:43.901
of lint off of my floor,
01:08:44.034 --> 01:08:46.801
and then sort them by size,
so I can't...
01:08:46.901 --> 01:08:48.735
make it for Friday.
01:08:48.835 --> 01:08:53.268
I had realized
the potential of this.
01:08:53.368 --> 01:08:56.435
You should be really careful
going to do this, you know.
01:08:56.535 --> 01:08:58.802
You don't want
a bunch of empty seats.
01:08:58.902 --> 01:09:01.734
And when I posted it,
I didn't think much of it...
01:09:03.835 --> 01:09:06.869
...but in two days
it just blew up.
01:09:07.002 --> 01:09:10.435
Oh, my God! I just registered
for Trump's rally,
01:09:10.535 --> 01:09:13.034
and I'm so excited to not go.
01:09:13.134 --> 01:09:14.201
Mm-hmm.
01:09:14.268 --> 01:09:15.734
We've never had an empty seat.
01:09:15.834 --> 01:09:17.435
And we certainly
won't in Oklahoma.
01:09:17.535 --> 01:09:19.335
TikTok users may well be
01:09:19.468 --> 01:09:21.869
President Trump's
latest adversary,
01:09:21.969 --> 01:09:24.868
after thousands of people
who've gotten tickets online
01:09:24.968 --> 01:09:25.968
didn't show up,
01:09:26.101 --> 01:09:28.868
thanks to a secret campaign
on TikTok.
01:09:28.968 --> 01:09:32.601
We've gotten over
a million tickets sold
01:09:32.734 --> 01:09:34.602
and only 6,000 people showed up.
01:09:34.735 --> 01:09:37.534
President Trump was frustrated
and angry.
01:09:42.702 --> 01:09:46.368
GIRL: TikTok is definitely
giving teenagers new power.
01:09:46.435 --> 01:09:50.235
I think it's unbelievable
that I was able to prank
01:09:50.335 --> 01:09:51.869
an American President.
01:09:51.969 --> 01:09:53.602
NEWSCASTER: Trump nemesis,
New York Congresswoman
01:09:53.702 --> 01:09:56.067
Alexandria Ocasio-Cortez,
gloated.
01:10:00.602 --> 01:10:02.735
As soon as that rally happened,
01:10:02.869 --> 01:10:06.767
that's when the rhetoric
on TikTok rose to a level
01:10:06.901 --> 01:10:08.468
that we hadn't seen before.
01:10:08.535 --> 01:10:11.035
The real China hawks
in his Administration
01:10:11.135 --> 01:10:13.568
were ready to go after this
company, and then were kind
01:10:13.668 --> 01:10:16.701
of waiting for the moment,
and this rally
01:10:16.801 --> 01:10:18.535
and the pandemic
came together
01:10:18.602 --> 01:10:20.134
to give them
that moment that they needed.
01:10:20.235 --> 01:10:23.134
The Pentagon,
the Department of State,
01:10:23.235 --> 01:10:24.702
the Department
of Homeland Security,
01:10:24.802 --> 01:10:27.968
and the TSA have all banned
their employees
01:10:28.068 --> 01:10:29.568
and service members
01:10:29.668 --> 01:10:32.001
from using TikTok
on government devices,
01:10:32.101 --> 01:10:34.301
and we know that it's
a national security risk.
01:10:34.435 --> 01:10:36.502
People really pounced
on this moment,
01:10:36.635 --> 01:10:39.034
not only the China hawks
and the US government,
01:10:39.134 --> 01:10:43.235
but also the tech companies,
particularly Facebook.
01:10:43.335 --> 01:10:45.502
Do you believe
that the Chinese government
01:10:45.602 --> 01:10:48.568
steals technology
from US companies?
01:10:48.702 --> 01:10:50.869
Uh, Congressman,
I think it's well documented
01:10:50.935 --> 01:10:53.034
that the Chinese government
steals technology
01:10:53.134 --> 01:10:54.901
from American companies.
01:10:54.968 --> 01:10:57.535
And so Mark Zuckerberg
saw this as a moment
01:10:57.635 --> 01:10:59.401
and Facebook pounced
on this moment,
01:10:59.502 --> 01:11:02.001
where TikTok was coming
under pressure,
01:11:02.068 --> 01:11:04.435
and he said, "I'm gonna
turn this up even more."
01:11:04.535 --> 01:11:06.835
And so they started
making their case
01:11:06.902 --> 01:11:08.702
to the different people
in Congress
01:11:08.802 --> 01:11:10.835
who were really
going after Facebook,
01:11:10.902 --> 01:11:12.535
and they were saying,
"You know what?
01:11:12.635 --> 01:11:14.567
"You're looking at us
as the Boogeyman,
01:11:14.701 --> 01:11:16.601
"but we're just a distraction,
01:11:16.701 --> 01:11:17.901
"from the real problem,
01:11:18.001 --> 01:11:19.568
"which are the Chinese
tech companies,
01:11:19.668 --> 01:11:20.901
"and those are the companies
01:11:20.968 --> 01:11:22.702
that you should be
looking at."
01:11:22.802 --> 01:11:24.568
Now the "Wall Street Journal"
is reporting that...
01:11:24.668 --> 01:11:27.868
Not only did
Mark Zuckerberg publicly
01:11:27.968 --> 01:11:29.435
go against TikTok,
01:11:29.568 --> 01:11:33.368
he lobbied behind the scenes
against the company,
01:11:33.435 --> 01:11:35.635
in a private dinner
with the President.
01:11:35.735 --> 01:11:37.501
There was a moment
during the pandemic
01:11:37.634 --> 01:11:39.401
where cases were going up,
01:11:39.502 --> 01:11:41.034
we didn't have a vaccine,
01:11:41.134 --> 01:11:43.835
Donald Trump's campaign
wasn't doing so well,
01:11:43.902 --> 01:11:47.869
and so Donald Trump started
really hammering this idea home
01:11:47.969 --> 01:11:51.467
that we need to blame China
for the Coronavirus,
01:11:51.534 --> 01:11:52.868
and this pandemic.
01:11:52.934 --> 01:11:54.868
"Kung-flu."
01:11:54.934 --> 01:11:56.468
The Chinese virus.
01:11:56.535 --> 01:11:57.902
REPORTER: Why do you keep
using this?
01:11:57.969 --> 01:11:59.634
- Because it comes from China.
- Sounds racist.
01:11:59.734 --> 01:12:01.301
It's not racist at all.
No.
01:12:01.401 --> 01:12:04.035
Not at all.
It comes from China.
01:12:04.135 --> 01:12:06.201
SHELLY: Trump loved that.
He wanted to play to that,
01:12:06.301 --> 01:12:10.001
because it became this kind
of rallying cry in the US
01:12:10.101 --> 01:12:11.869
to go after China,
01:12:11.969 --> 01:12:16.434
and TikTok kind of became, um,
this symbol of China
01:12:16.501 --> 01:12:18.102
at that moment.
01:12:18.202 --> 01:12:20.535
NEWSCASTER: There have been
more than 2,500 incidents
01:12:20.635 --> 01:12:22.934
of anti-Asian hate crimes.
01:12:23.034 --> 01:12:24.768
NEWSCASTER: It's not just
in the US,
01:12:24.869 --> 01:12:28.201
Asians around the world
have reported discrimination
01:12:28.335 --> 01:12:29.968
linked to Coronavirus.
01:12:30.068 --> 01:12:32.235
Asian hate, it didn't
just start now.
01:12:32.368 --> 01:12:34.235
It was always there.
01:12:34.368 --> 01:12:35.567
It was
always there.
01:12:58.834 --> 01:13:00.934
Suddenly,
because of the pandemic,
01:13:01.034 --> 01:13:05.368
TikTok became this symbol
of this fight
01:13:05.435 --> 01:13:08.368
between the US and China,
and a way for Donald Trump
01:13:08.468 --> 01:13:10.168
to kind of deflect blame.
01:13:10.268 --> 01:13:12.001
NEWSCASTER: It all started
last Friday,
01:13:12.101 --> 01:13:14.002
when President Trump
sent shockwaves
01:13:14.069 --> 01:13:16.601
through social media
after making this comment.
01:13:16.701 --> 01:13:17.834
We're looking at TikTok.
01:13:17.934 --> 01:13:19.635
We may be banning TikTok.
01:13:19.702 --> 01:13:20.634
NEWSCASTER: The President
threatened to block
01:13:20.734 --> 01:13:21.934
the popular video app,
01:13:22.068 --> 01:13:24.101
citing national
security concerns.
01:13:24.201 --> 01:13:26.968
No. We're--we're not
a national security threat.
01:13:27.101 --> 01:13:28.968
And we've said that time
and again,
01:13:29.101 --> 01:13:31.901
we have very strict
data access and controls.
01:13:31.968 --> 01:13:33.535
NEWSCASTER: TikTok has said,
01:13:33.635 --> 01:13:34.767
"American user data is stored
in the US,
01:13:34.868 --> 01:13:36.235
"and backed up in Singapore,
01:13:36.335 --> 01:13:37.802
not in China."
01:13:37.902 --> 01:13:41.068
We are at a time when we're
seeing very much...
01:13:41.168 --> 01:13:42.801
a geopolitical tension,
as you know,
01:13:42.901 --> 01:13:44.602
between the US and China,
01:13:44.702 --> 01:13:45.968
and we are
in the middle of that.
01:13:46.034 --> 01:13:48.468
In China, there is
a cyber security law
01:13:48.602 --> 01:13:50.435
that states, "If we ask you
for information,
01:13:50.535 --> 01:13:51.801
then you have
to give it to us."
01:13:51.901 --> 01:13:53.869
NEWSCASTER:
The 2017 law mandates
01:13:53.969 --> 01:13:55.801
that Chinese-owned companies
have to cooperate
01:13:55.868 --> 01:13:57.268
with the Communist Party.
01:13:57.368 --> 01:13:59.268
And so that's kind
of the heart of the problem.
01:13:59.368 --> 01:14:02.400
TikTok can swear up and down
that they've never been asked
01:14:02.534 --> 01:14:03.968
to give information,
01:14:04.068 --> 01:14:05.468
but that doesn't stop
the Chinese government
01:14:05.568 --> 01:14:07.968
from taking information
in the future.
01:14:09.835 --> 01:14:12.235
I remember this moment,
where we had the pandemic,
01:14:12.335 --> 01:14:13.768
we had Black Lives Matter
protests,
01:14:13.835 --> 01:14:16.201
we have wildfires
in California,
01:14:16.301 --> 01:14:18.435
like the world feels
like it's falling apart
01:14:18.535 --> 01:14:21.335
and the only thing people
on the news are talking about
01:14:21.435 --> 01:14:23.268
is this ban of TikTok.
01:14:23.368 --> 01:14:26.101
Just the threat alone
has already had a huge impact.
01:14:26.201 --> 01:14:28.634
Advertisers have been
hitting pause on campaigns,
01:14:28.734 --> 01:14:30.034
worth millions of dollars...
01:14:30.101 --> 01:14:33.134
It caused absolute chaos
in the tech industry,
01:14:33.235 --> 01:14:34.702
like Apple and Google,
01:14:34.802 --> 01:14:36.101
and everyone
was sort of struggling
01:14:36.201 --> 01:14:38.368
to get a handle on it,
and thinking,
01:14:38.468 --> 01:14:40.734
"Can a president
even do this?"
01:14:40.834 --> 01:14:42.168
Yo, what's up, guys?
01:14:42.268 --> 01:14:44.601
I'm sure all you guys
heard the news.
01:14:44.701 --> 01:14:46.134
TikTok's getting banned.
01:14:46.201 --> 01:14:51.567
I'm... going across TikTok,
and all my friends
01:14:51.667 --> 01:14:54.001
are saying bye to TikTok:
01:14:54.101 --> 01:14:56.168
"It's so sad I have
to leave you guys."
01:14:56.235 --> 01:14:57.801
There's another one saying,
"Please follow me
01:14:57.901 --> 01:14:59.735
on all my other
social media."
01:14:59.835 --> 01:15:03.335
And I was fearing
for my career.
01:15:03.435 --> 01:15:04.835
♪ But it's done now
01:15:04.935 --> 01:15:06.901
I have five million followers.
How can this get banned?
01:15:06.968 --> 01:15:09.101
This is my living,
it's what I do.
01:15:09.201 --> 01:15:12.468
And I'm starting a video
petition with #savetiktok.
01:15:12.568 --> 01:15:14.601
You all mean the world to me.
01:15:14.701 --> 01:15:15.634
[sighs]
01:15:15.767 --> 01:15:17.268
Thank you for everything.
01:15:17.401 --> 01:15:19.168
Thank you for a career.
01:15:19.268 --> 01:15:22.734
Thank you for... making all
my beatbox dreams come true.
01:15:24.068 --> 01:15:25.502
That was tough.
I...
01:15:25.635 --> 01:15:28.201
I thought of a million
different possibilities in my...
01:15:28.268 --> 01:15:31.235
Maybe I gotta perform
or maybe I gotta go busk
01:15:31.301 --> 01:15:33.068
or something, like,
"What am I gonna do?"
01:15:33.168 --> 01:15:35.834
Like, it was definitely tough
for me to see,
01:15:35.934 --> 01:15:38.334
because I didn't want
that to happen.
01:15:38.434 --> 01:15:39.702
TikTok.
01:15:39.802 --> 01:15:41.201
A few days after
the executive order,
01:15:41.301 --> 01:15:44.935
we hear that Microsoft
is in deal talks to buy TikTok.
01:15:45.035 --> 01:15:47.801
Then we started hearing,
"Okay, well, maybe Oracle
01:15:47.934 --> 01:15:49.301
"wants to buy TikTok,
01:15:49.401 --> 01:15:51.002
"maybe all these
other companies, you know,
01:15:51.135 --> 01:15:53.435
"because if they bought TikTok
then it would no longer
01:15:53.568 --> 01:15:55.634
be owned
by a Chinese company,"
01:15:55.767 --> 01:15:57.134
and suddenly
that would be okay
01:15:57.235 --> 01:15:59.001
for Donald Trump,
and, oh, by the way,
01:15:59.101 --> 01:16:01.368
Donald Trump also wanted
to take a finder's fee
01:16:01.468 --> 01:16:04.168
to get some money
to the Treasury,
01:16:04.301 --> 01:16:06.468
which is probably
the most bizarre part
01:16:06.568 --> 01:16:08.534
of the entire storyline.
01:16:10.335 --> 01:16:11.701
TikTok kept saying,
"We're trying to find a deal,
01:16:11.767 --> 01:16:13.168
we're trying
to find a deal,"
01:16:13.268 --> 01:16:17.735
but, in the meantime,
nothing was actually happening,
01:16:17.835 --> 01:16:19.802
and then the Chinese government
stepped in.
01:16:19.902 --> 01:16:23.468
Tonight, state media have been
lashing out, once again,
01:16:23.568 --> 01:16:26.767
saying that Beijing would,
quote, undoubtedly prepare
01:16:26.868 --> 01:16:28.768
proportional countermeasures
01:16:28.869 --> 01:16:32.201
for what it says could become
piracy and looting
01:16:32.301 --> 01:16:33.535
by the United States.
01:16:33.635 --> 01:16:37.201
All of a sudden came this law
that banned the export
01:16:37.301 --> 01:16:39.734
or sale of any
artificial intelligence
01:16:39.868 --> 01:16:41.235
from China.
01:16:41.301 --> 01:16:43.335
ByteDance, and TikTok
at its core,
01:16:43.468 --> 01:16:45.168
is an AI company.
01:16:45.301 --> 01:16:47.301
And that was really
what stopped the discussion
01:16:47.401 --> 01:16:50.935
and could prevent
the sale of TikTok.
01:16:51.035 --> 01:16:54.502
TikTok is one of the opening
salvos in an emerging battle
01:16:54.568 --> 01:16:57.901
of technology between
the world's two largest
01:16:57.968 --> 01:17:00.501
and most dynamic economies.
01:17:00.634 --> 01:17:03.167
A new tech cold war.
01:17:07.102 --> 01:17:09.868
SHELLY: And then, the November
US Presidential Election
01:17:09.968 --> 01:17:12.134
started heating up,
and the story of TikTok
01:17:12.235 --> 01:17:14.268
became the biggest deal
of the century
01:17:14.335 --> 01:17:16.401
that never actually
ended up happening,
01:17:16.502 --> 01:17:19.767
because Donald Trump
lost the Presidency,
01:17:19.868 --> 01:17:21.567
Biden took over
01:17:21.667 --> 01:17:23.502
and we never revisited it.
01:17:23.602 --> 01:17:26.068
Well, it turns out the clock
won't stop for TikTok.
01:17:26.168 --> 01:17:28.368
President Biden has signed
a new executive order,
01:17:28.468 --> 01:17:31.335
voiding the Trump-era decision
seeking to ban
01:17:31.401 --> 01:17:32.368
the social media app.
01:17:32.435 --> 01:17:34.068
Everybody, calm down.
Calm down.
01:17:34.201 --> 01:17:35.868
TikTok is not getting banned.
01:17:36.001 --> 01:17:38.702
I just love how Trump
tried to ban TikTok.
01:17:38.835 --> 01:17:41.868
And now TikTok
has banned Trump.
01:17:44.235 --> 01:17:48.101
TikTok is just one app
in what is going to be
01:17:48.201 --> 01:17:50.767
a long line of applications
01:17:50.901 --> 01:17:54.502
and new ways of communicating.
01:17:54.602 --> 01:17:57.702
And so acknowledging
our own humanity in this.
01:17:57.802 --> 01:18:01.268
On the internet, we treat
others like they're disposable,
01:18:01.401 --> 01:18:05.602
but, you know,
nobody is disposable.
01:18:05.702 --> 01:18:08.101
Okay, mama, what do you think
of my haircut?
01:18:08.235 --> 01:18:10.001
- I love it.
- You love it?
01:18:10.101 --> 01:18:11.968
My mom's now been sober
01:18:12.068 --> 01:18:14.869
for a little more
than four years.
01:18:14.969 --> 01:18:17.435
And now we have
a really great relationship.
01:18:17.535 --> 01:18:19.702
I've actively worked...
to remind myself
01:18:19.802 --> 01:18:23.101
that she's someone
who's capable of change.
01:18:26.601 --> 01:18:28.001
As a digital native,
01:18:28.134 --> 01:18:32.001
it's exhausting to grow up
and make mistakes
01:18:32.101 --> 01:18:33.468
in front of everyone.
01:18:33.568 --> 01:18:35.335
And not just the people
who are looking at me right now,
01:18:35.401 --> 01:18:38.434
but inevitably, the people who
are looking at me in five years,
01:18:38.534 --> 01:18:39.668
ten years...
01:18:39.802 --> 01:18:42.767
The things you put
on the internet are forever.
01:18:42.834 --> 01:18:44.801
But...
Hey! How's it going?
01:18:44.868 --> 01:18:46.901
I founded GenZ Girl Gang
01:18:47.034 --> 01:18:48.834
because social media
can be used
01:18:48.934 --> 01:18:50.934
as a community-building tool.
01:18:51.068 --> 01:18:53.535
As my generation gets older,
01:18:53.668 --> 01:18:56.101
and we live
more life documented,
01:18:56.201 --> 01:19:00.068
I hope that--that we
learn to live
01:19:00.168 --> 01:19:02.068
with this technology,
01:19:02.168 --> 01:19:04.467
and really live
with it, right?
01:19:04.601 --> 01:19:06.235
Live full lives with it.
01:19:06.335 --> 01:19:09.168
Live our mistakes through it.
01:19:09.268 --> 01:19:11.834
That we can all create the space
for one another
01:19:11.934 --> 01:19:14.167
to--to change.
01:19:18.667 --> 01:19:21.635
[beatboxing]
01:19:21.735 --> 01:19:24.668
SPENCER: The way that old
Hollywood was
01:19:24.735 --> 01:19:30.102
is very different
than how new Hollywood is.
01:19:30.202 --> 01:19:33.534
TikTok's so, so young that
everything that's happening
01:19:33.634 --> 01:19:34.934
is so fresh right now.
01:19:35.034 --> 01:19:37.400
We're really gonna see
who shines.
01:19:37.534 --> 01:19:39.968
[beatboxing]
01:19:40.068 --> 01:19:41.335
I have a lot of followers.
01:19:41.435 --> 01:19:43.934
Like, right now,
I have 54 million,
01:19:44.034 --> 01:19:45.300
as of yesterday.
01:19:49.301 --> 01:19:52.534
We follow you! Can you take
a picture with my kids?
01:19:52.634 --> 01:19:53.734
Oh, my God!
01:19:53.834 --> 01:19:58.634
[all speak at once]
01:19:58.701 --> 01:20:00.068
Don't cry.
01:20:00.168 --> 01:20:01.368
I'm happy.
01:20:01.468 --> 01:20:03.634
- I know. We're happy too.
- Aw.
01:20:03.734 --> 01:20:04.667
No...
01:20:04.767 --> 01:20:06.001
You're gonna make me cry.
01:20:06.101 --> 01:20:07.868
Let me take a picture
really quick.
01:20:07.968 --> 01:20:10.667
[beatboxing]
01:20:10.734 --> 01:20:12.535
My boy!
'Pérate! [Wait up!]
01:20:12.635 --> 01:20:14.134
You guys have a great day,
all right?
01:20:14.235 --> 01:20:15.735
Thank you!
That was awesome.
01:20:15.835 --> 01:20:20.134
To me, I think, fame is
that support you give people
01:20:20.235 --> 01:20:24.734
that didn't really have it
before you existed.
01:20:24.801 --> 01:20:26.767
- I follow you.
- Aw.
01:20:26.834 --> 01:20:27.767
I will follow you
guys back.
01:20:27.868 --> 01:20:29.034
I follow you on TikTok.
01:20:29.134 --> 01:20:31.401
I just put out
a music video, like--
01:20:31.468 --> 01:20:33.834
- Uh-huh, I found it.
- You did? Aw, thank you.
01:20:35.702 --> 01:20:39.868
I want kids to be like,
"I know I can do that too.
01:20:39.968 --> 01:20:43.168
I know there's
a chance out there."
01:20:43.268 --> 01:20:45.934
I would've never dreamed,
in a million years,
01:20:46.034 --> 01:20:48.902
it would happen like this.
01:20:49.002 --> 01:20:51.601
TikTok has really changed
my entire life.
01:20:55.335 --> 01:20:57.401
And I--I think
if I wanna speak
01:20:57.535 --> 01:21:00.134
to anyone out there
that ever has a dream
01:21:00.201 --> 01:21:04.268
and think that it's too crazy...
01:21:04.368 --> 01:21:07.535
too crazy to accomplish, uh,
you can do it.
01:21:07.668 --> 01:21:09.467
You can--you can do anything
that you want.
01:21:09.567 --> 01:21:12.501
Sorry.
01:21:14.602 --> 01:21:16.902
It's--it's so, it's so weird
for me, 'cause, like, being
01:21:17.002 --> 01:21:22.034
a beatboxer... it was so hard
for me to be accepted.
01:21:22.134 --> 01:21:25.400
And... [sniffles]
01:21:25.501 --> 01:21:29.234
[exhales]
01:21:32.134 --> 01:21:35.300
I'm just really, really glad
I never gave up.
01:21:39.968 --> 01:21:42.101
FEROZA:
It's graduation day.
01:21:42.235 --> 01:21:44.034
I'm definitely nervous,
01:21:44.134 --> 01:21:46.901
I'm trying to wear
my Afghan sash
01:21:47.001 --> 01:21:48.835
to graduation.
01:21:48.902 --> 01:21:50.635
I was told
that I can't wear the sash
01:21:50.702 --> 01:21:55.802
because it goes
against dress code.
01:21:55.902 --> 01:21:59.635
But I'm sure that no matter
what people say about me,
01:21:59.735 --> 01:22:02.368
at the end of the day,
I'm proud and Afghan,
01:22:02.468 --> 01:22:04.334
and there's no other
human being like me.
01:22:06.435 --> 01:22:08.001
Take a lot.
Just tap a lot.
01:22:08.101 --> 01:22:09.601
And if it glitches,
use your phone.
01:22:11.702 --> 01:22:14.502
It feels a little embarrassing
to see, like, my mom
01:22:14.602 --> 01:22:17.268
and the whole family, like,
celebrating me graduating,
01:22:17.368 --> 01:22:20.401
'cause I'm like, "Ugh, it's
not that big of a deal."
01:22:20.502 --> 01:22:22.268
But then I look back at them,
and I'm like...
01:22:22.335 --> 01:22:25.034
Honestly, it is, 'cause
I'm the first female
01:22:25.134 --> 01:22:27.802
in my family
to graduate high school.
01:22:27.935 --> 01:22:30.735
My mother, she went
to elementary school,
01:22:30.835 --> 01:22:34.401
but then, once the violence
in Kabul, Afghanistan,
01:22:34.502 --> 01:22:37.668
got too much, she had to be
taken out at third grade.
01:22:37.802 --> 01:22:39.901
I'm the first anyone
in my family
01:22:40.001 --> 01:22:42.033
to even go to college now.
01:22:53.201 --> 01:22:54.834
[cheering and applause]
01:22:54.901 --> 01:22:56.834
MAN: Welcome to graduation.
01:22:56.934 --> 01:23:00.400
We're the class of 2021.
01:23:02.602 --> 01:23:05.834
I didn't expect myself
to go viral
01:23:05.934 --> 01:23:09.168
and be this activist.
01:23:09.235 --> 01:23:13.868
What inspired me to speak up
was seeing those around me
01:23:13.968 --> 01:23:15.101
staying silent.
01:23:18.468 --> 01:23:20.235
WOMAN: Feroza Aziz.
01:23:20.301 --> 01:23:22.435
[cheering and applause]
01:23:22.502 --> 01:23:26.802
I wanna do more in the future
on human rights issues,
01:23:26.902 --> 01:23:30.068
and I wanna do more than just
speaking on social media.
01:23:30.168 --> 01:23:31.801
I actually wanna
physically help.
01:23:34.235 --> 01:23:35.134
MAN: Congratulations.
01:23:35.235 --> 01:23:39.367
[cheering and whistling]
01:23:40.468 --> 01:23:41.801
[speaks in foreign language]
01:23:41.901 --> 01:23:43.400
Thank you.
01:23:52.935 --> 01:23:55.802
TAYLOR: TikTok has infiltrated
American culture,
01:23:55.902 --> 01:23:57.668
the Hollywood
and entertainment system,
01:23:57.768 --> 01:24:00.735
and--and politics, and all
of these different facets
01:24:00.869 --> 01:24:04.435
of American life
in--in such a deep way.
01:24:04.568 --> 01:24:06.535
There's very legitimate reasons
to think critically
01:24:06.635 --> 01:24:09.602
about the impact that this
massive tech conglomerate
01:24:09.702 --> 01:24:12.802
is having on America
and it's really important
01:24:12.902 --> 01:24:14.635
to think about issues
around data privacy
01:24:14.735 --> 01:24:16.568
with all of these
tech platforms.
01:24:16.635 --> 01:24:18.201
It's called
the Log Off movement,
01:24:18.301 --> 01:24:19.701
and it's a nonprofit
organization.
01:24:19.834 --> 01:24:21.301
It's really been started
by kids,
01:24:21.435 --> 01:24:23.801
for ways to promote
healthy ways to exist
01:24:23.868 --> 01:24:25.368
on social media.
01:24:25.468 --> 01:24:27.368
I'm really inspired
by the Log Off movement,
01:24:27.468 --> 01:24:30.534
they're a group
of high school students
01:24:30.667 --> 01:24:32.134
from all over the planet.
01:24:32.235 --> 01:24:33.468
They're not
just telling people
01:24:33.535 --> 01:24:35.668
to spend less time
on the apps.
01:24:35.735 --> 01:24:39.034
They're pushing back by talking
to members of Congress,
01:24:39.134 --> 01:24:42.235
by talking to people
at the platforms themselves,
01:24:42.335 --> 01:24:44.834
to try to change how
these systems are built.
01:24:44.968 --> 01:24:47.734
Companies like TikTok
need to be watched,
01:24:47.834 --> 01:24:49.435
they need
to be held accountable
01:24:49.568 --> 01:24:51.335
the same way that we hold
01:24:51.401 --> 01:24:53.467
other institutions
of power accountable.
01:24:54.668 --> 01:24:56.034
NEWSCASTER:
TikTok has tightened
01:24:56.134 --> 01:24:57.435
privacy measures.
01:24:57.535 --> 01:24:59.834
Anybody under 15
will automatically have
01:24:59.934 --> 01:25:01.335
a private account.
01:25:01.435 --> 01:25:03.802
Federal regulators
have already ordered the app
01:25:03.902 --> 01:25:08.135
to disclose how its practices
do affect young people.
01:25:08.236 --> 01:25:11.168
SHELLY: Personally, I don't
think it's fair to single out
01:25:11.268 --> 01:25:14.802
an individual company
just because it's popular.
01:25:14.902 --> 01:25:17.268
Personally, I think it makes
more sense to pass
01:25:17.368 --> 01:25:19.968
cohesive laws against
all companies,
01:25:20.068 --> 01:25:22.435
so that not only can TikTok
not do some of this,
01:25:22.535 --> 01:25:24.768
so neither can Facebook
or Google, or Amazon
01:25:24.835 --> 01:25:28.001
or any other companies,
regardless of nationality.
01:25:28.101 --> 01:25:29.902
[beatboxing]
01:25:30.035 --> 01:25:33.034
If the story ended today,
I would say, hands down,
01:25:33.101 --> 01:25:34.667
TikTok won.
01:25:36.869 --> 01:25:39.668
All the Trump ban did
was make TikTok even bigger,
01:25:39.768 --> 01:25:41.767
because it caused people
to download the app,
01:25:41.901 --> 01:25:43.368
it caused people
to talk about it,
01:25:43.502 --> 01:25:45.301
and so all it did
was create more growth
01:25:45.368 --> 01:25:47.535
and more revenue
for this Chinese company
01:25:47.635 --> 01:25:50.201
that is even bigger in the US
than it was
01:25:50.335 --> 01:25:51.968
when Trump first started
going after it.
01:25:53.568 --> 01:25:56.235
Now, the story's not over.
01:25:56.368 --> 01:25:57.535
[beatboxing]
01:25:57.602 --> 01:26:00.568
♪ Everybody wants
to be somebody ♪
01:26:00.635 --> 01:26:04.235
♪ Everybody wants
to be somebody ♪
01:26:04.335 --> 01:26:07.268
♪ Everybody...
wants to be somebody ♪
01:26:07.368 --> 01:26:09.367
[beatboxing]
01:26:09.501 --> 01:26:10.934
♪ Everybody
01:26:12.868 --> 01:26:14.868
♪ Be somebody
01:26:16.001 --> 01:26:19.734
♪ Everybody
01:26:19.834 --> 01:26:21.968
♪ Be somebody
01:26:35.568 --> 01:26:38.034
♪ Everybody
wants to be somebody ♪
01:26:38.134 --> 01:26:41.968
♪ Everybody wants
to be somebody ♪
01:26:42.068 --> 01:26:46.268
♪ Everybody
wants to be somebody ♪
01:26:46.368 --> 01:26:49.168
♪ Everybody wants
to be somebody ♪
01:26:49.301 --> 01:26:53.367
[beatboxing]
01:26:58.002 --> 01:27:00.034
...generations before us
didn't have the same power
01:27:00.168 --> 01:27:02.334
as we do now,
and that's technology.
01:27:02.434 --> 01:27:03.401
You have power.
01:27:03.502 --> 01:27:04.901
You can create change.
01:27:05.034 --> 01:27:06.834
[beatboxing]
01:27:06.934 --> 01:27:09.834
[pop song playing]
Distributor: Women Make Movies
Length: 87 minutes
Date: 2021
Genre: Expository
Language: English
Grade: 10-12, College, Adults
Closed Captioning: Available