Parental Discard™ · The Timeline · Part 2.1

NOT AN

ACCIDENT

The Technological Precedent for Parental Discard

Between 2004 and 2020, a series of decisions were made inside technology companies that directly altered how the 1985 to 2004 cohort developed. Not in theory. In documented, measurable, structural terms.

The people who made those decisions have since spoken publicly about what they built, how it worked, and what it targeted. Their admissions are on record. Their mechanisms are documented. The ages of the population on the receiving end are arithmetic.

This timeline presents those admissions in chronological order, paired with the specific mechanism each person deployed and the developmental stage of the cohort at the time of deployment.

On Record · Chronological

THE ADMISSIONS.
THE MECHANISMS.
THE YEARS.

These are not accusations. These are public statements made by the people who designed and deployed these systems. The timeline runs from 2004 to 2020.

2004 – 2006

SEAN PARKER

Founding President, Facebook

“It is exactly the kind of thing that a hacker like myself would come up with, because you are exploiting a vulnerability in human psychology… The inventors, creators… understood this consciously. And we did it anyway.”

Parker built the foundational business model on exploiting human psychological vulnerabilities. The model was not about connection. It was about capturing attention by targeting the part of the brain that needs approval from others. He stated publicly that the people who designed it understood exactly what they were doing.

Cohort age at deployment: Newborns to 21 years old. Brains either not yet formed or in the earliest stages of learning decision-making, impulse regulation, and lasting bond formation.

1 of 11
2006

TIM KENDALL

Director of Monetization, Facebook. Later President, Pinterest

“Let us figure out how to get as much of this person's attention as we possibly can. How much time can we get you to spend? How much of your life can we get you to give to us?”

Kendall was tasked with building the advertising model. The question his team operated on daily was not whether this served the user. It was how much more of the user's life could be extracted. He admitted Facebook could dial up monetization or manipulate user numbers as needed. The user was not the customer. The user was the inventory.

Cohort age at deployment: 2 to 21 years old. Entering critical social developmental windows as their attention became the product being sold to advertisers.

2 of 11
2006

AZA RASKIN

Co-founder, Mozilla Labs. Inventor of infinite scroll

“Advertisers are the customers. We are the thing being sold… These things tilt the floor of human behavior. They make some behavior harder and some easier… changing what billions of people think and do.”

Raskin invented infinite scroll. Before this feature existed, every screen had a bottom. The user reached the end, the brain registered “done,” and the session stopped. Raskin removed that signal. The content never ends. The brain never receives the cue to stop. The user continues not out of desire, only out of the absence of a reason to quit.
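The difference Raskin introduced can be shown with a toy sketch. Nothing here is real feed code; it only contrasts a loop that terminates with one that cannot:

```python
import itertools

def paginated_feed(items, page_size=10):
    """Pre-infinite-scroll pattern: the feed has a bottom; iteration terminates."""
    for i in range(0, len(items), page_size):
        yield items[i:i + page_size]
    # the generator exhausts here: the session receives a "done" signal

def infinite_feed(items):
    """Infinite scroll: reaching the 'bottom' just loads more; no end signal ever arrives."""
    return itertools.cycle(items)  # stand-in for an endless server fetch

pages = list(paginated_feed(list(range(25))))                   # 3 pages, then done
endless = list(itertools.islice(infinite_feed(range(10)), 25))  # stops only because we forced it
print(len(pages), len(endless))
```

The paginated loop ends on its own; the infinite one ends only when something outside it imposes a stop, which is the cue Raskin's design removed.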

Cohort age at deployment: 2 to 21 years old. The youngest members of this generation would never experience a digital interface with a natural stopping point.

3 of 11
2007

JUSTIN ROSENSTEIN

Co-inventor, Facebook Like button, Google Drive, Gmail Chat

“When we were making the like button, our entire motivation was, 'Can we spread positivity and love in the world?'”

The Like button attached a public scoreboard to every thought, photo, and moment shared on the platform. The score updates on no set schedule. Sometimes fast. Sometimes slow. Sometimes not at all. That unpredictability is the single most powerful conditioning pattern in behavioral science. It is identical to the mechanism that keeps a person at a slot machine. Post. Check. Check again. The number changes or it does not. The uncertainty is what sustains the behavior.
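The variable-ratio pattern described above can be simulated in a few lines. This is a toy model with invented probabilities and an invented stopping rule, not data from any platform; it only shows how unpredictable rewards sustain checking behavior:

```python
import random

def check_feed(reward_probability=0.3):
    """One check of the scoreboard: the number moved, or it did not."""
    return random.random() < reward_probability  # reward arrives on no fixed schedule

def session(max_checks=20, reward_probability=0.3):
    """Checks until the simulated user gives up (two 'dry' checks in a row)."""
    checks, dry_streak = 0, 0
    while checks < max_checks and dry_streak < 2:
        checks += 1
        if check_feed(reward_probability):
            dry_streak = 0   # a hit resets the urge to stop
        else:
            dry_streak += 1
    return checks

random.seed(0)
print(sum(session() for _ in range(1000)) / 1000)  # mean checks per session
```

Raising the reward probability lengthens sessions in this model; it is the intermittent arrival of the reward, not its size, doing the work.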

Cohort age at deployment: 3 to 22 years old. Deployed directly into the social validation system during the precise developmental window when the need for peer approval peaks and the brain region that regulates it is furthest from complete.

4 of 11
2007 – 2011

CHAMATH PALIHAPITIYA

Head of Growth, Facebook

“We want to psychologically figure out how to manipulate you as fast as possible and then give you back that dopamine hit.”

Palihapitiya pioneered the use of systematic A/B testing to find the most effective methods for steering user behavior. Thousands of controlled experiments on millions of users. Each test measured which design, which notification, which trigger produced the fastest return and the longest session. Colleagues described the process as manipulation. Palihapitiya did not dispute the word. He used it himself.
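The experimental loop works the same way at any scale. The sketch below is hypothetical throughout: the variant names, session lengths, and noise model are invented for illustration, and a real growth pipeline would measure live users rather than a simulator. The point is the objective function, which optimizes time spent and nothing else:

```python
import random
from statistics import mean

# Hypothetical notification variants and the mean session length (minutes)
# each one produces in this toy world.
VARIANT_EFFECT = {"badge_red": 11.0, "badge_blue": 9.5, "no_badge": 8.0}

def simulate_session(variant):
    """Noisy observed session length for one user shown this variant."""
    return max(0.0, random.gauss(VARIANT_EFFECT[variant], 2.0))

def ab_test(variants, users_per_arm=1000):
    """Assign users to arms, measure, and ship whichever arm maximizes time on site."""
    results = {v: mean(simulate_session(v) for _ in range(users_per_arm))
               for v in variants}
    return max(results, key=results.get), results

random.seed(0)
winner, results = ab_test(list(VARIANT_EFFECT))
print(winner)  # the variant that extracted the longest sessions wins
```

Note what the loop never asks: whether the winning variant served the user. Only the session length is scored.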

Cohort age at deployment: 3 to 26 years old. Subjected to mass-scale behavioral experimentation during the most formative psychological years. Not informed. No consent.

5 of 11
2009

LOREN BRICHTER

Designer, Tweetie (acquired by Twitter, 2010)

“Smartphones are useful tools, but they are addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things.”

Brichter created pull-to-refresh. Pull the screen down. New content appears. Or it does not. That uncertainty is the entire mechanism. It operates identically to a slot machine. Pull the lever. Sometimes a reward. Sometimes nothing. The not-knowing is what drives the repetition. This gesture was embedded into every app on every phone, placed directly into the hands of a population whose impulse-control systems were years from completion.

Cohort age at deployment: 5 to 24 years old. The region of the brain responsible for impulse control was years from completion. The region that responds to reward was fully operational.

6 of 11
2009

SANTAMARIA & MARCELLINO

Apple engineers. Inventors of push notification technology

“It is not inherently evil to bring people back to your product. It is capitalism.”

Push notifications gave every app the ability to interrupt a user at any moment. A buzz. A sound. A banner. The brain learns to respond to the interruption before the person decides whether they want to respond. Over time, the decision disappears entirely. The reaction becomes automatic. Every app now had a direct line into the user's attention, at any hour, with nothing between the alert and the response.

Cohort age at deployment: 5 to 24 years old. Phones began interrupting school, homework, sleep, and in-person interaction. The interruption became normalized before the developing brain could recognize it as an interruption.

7 of 11
2011 – 2012

SANDY PARAKILAS

Facebook, 2011–2012

“False information makes the companies more money than the truth. The truth is boring.”

Parakilas described how the algorithm selected content not for accuracy or wellbeing, only for engagement. The content that keeps users on the screen longest is content that triggers anger, fear, or outrage. False content performed better than accurate content. The platform learned this and served more of it. Not by accident. By design.
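The selection pressure Parakilas describes falls out of a single design choice: rank by predicted engagement and read nothing else. This is a deliberately simplified sketch with invented posts and scores; real feed rankers are far more complex, but the accuracy field below is ignored by construction, just as he describes:

```python
# Hypothetical posts with a predicted-engagement score and an accuracy flag.
# The ranker never reads the 'accurate' field: engagement is its only input.
posts = [
    {"text": "calm, accurate report",     "engagement": 0.2, "accurate": True},
    {"text": "outrage-bait, false claim", "engagement": 0.9, "accurate": False},
    {"text": "nuanced analysis",          "engagement": 0.3, "accurate": True},
]

def rank_feed(posts):
    """Order the feed purely by predicted engagement, ignoring accuracy."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

feed = rank_feed(posts)
print([p["text"] for p in feed])  # the false, high-arousal post rises to the top
```

No one has to instruct the system to promote false content. If false content engages more, an engagement-only objective promotes it automatically.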

Cohort age at deployment: 7 to 27 years old. Developing worldviews, relationship models, and trust frameworks were being shaped by a system financially incentivized to prioritize false content over true content.

8 of 11
2012 – 2019

TRISTAN HARRIS

Studied under B.J. Fogg, Stanford Persuasive Technology Lab. Design Ethicist, Google. U.S. Senate testimony, 2019

The platforms are designed to work “all the way down the brain stem.”

Harris studied under the man who wrote the scientific playbook for designing technology that changes human behavior. He went inside Google and saw the playbook running. He left. He told the United States Senate, under oath, that these platforms do not target the part of the brain that thinks. They target the part that reacts before thinking is possible. The part that existed before language. Before reasoning. Before the capacity to say no.

Cohort age at time of Senate testimony (2019): 15 to 34 years old. The oldest had completed brain development. Whatever was built in was permanent. The youngest were still in the formation window.

9 of 11
2017

JOE TOSCANO

Former Google employee. Left June 2017 over ethical concerns

“You pull down and you refresh, it is gonna be a new thing at the top… Which, in psychology, we call a positive intermittent reinforcement.”

Toscano left Google over what he observed. He confirmed publicly what Brichter designed. Pull-to-refresh is not a convenience feature. It is a conditioning tool. The reward arrives on no fixed schedule. That unpredictability is the mechanism. The brain does not chase certainty. It chases maybe. Every app was built on maybe.

Cohort age at deployment: 13 to 32 years old. For the oldest members, conditioning was locked into permanent structure. For the youngest, it was still being written.

10 of 11
Not a Confession. Internal Strategy.
2015 – 2020

MARK ZUCKERBERG

CEO, Meta

Every person above this line came forward voluntarily. They spoke publicly about what they helped build. They were not required to do so. This section is different. These are internal company documents entered into evidence in a federal courtroom. They show a company that knew its product was being used by children below its own stated age minimum, and responded by building a strategy to capture them younger.

Company goal: increase the amount of time 10-year-olds spent on Instagram.

2015 internal document

“If we wanna win big with teens, we must bring them in as tweens.”

2018 internal document

11-year-olds were four times as likely as older users to keep coming back.

2020 internal document

Zuckerberg refused to remove appearance-altering beauty filters that experts confirmed caused body image damage in girls. He called the removal “paternalistic.”

Federal trial evidence, 2026

2015: cohort was 11 to 30. 2018: cohort was 14 to 33. 2020: cohort was 16 to 35. The youngest were mid-formation. The children Meta identified as their most retained users were in the most critical developmental window a human being goes through.

11 of 11
The Precedent

WHAT THIS
TIMELINE ESTABLISHES

This is not a catalog of regrets. It is a documented sequence of engineering decisions, made with stated awareness of the psychological consequences, deployed into a population during the precise years that population's brain architecture was being constructed.

The mechanisms did not arrive all at once. They stacked. Attention capture in 2004. Monetization of that attention in 2006. Removal of stopping cues in 2006. Installation of variable-ratio conditioning in 2007. Mass-scale behavioral experimentation from 2007 to 2011. Slot machine mechanics in 2009. Interruption conditioning in 2009. Algorithmic prioritization of false content by 2011. Senate-testified targeting below conscious thought by 2019. Internal strategy documents showing deliberate targeting of children below the platform's own age minimum from 2015 to 2020.

The 1985 to 2004 cohort did not encounter any one of these mechanisms in isolation. They encountered all of them, layered across every year their brains were forming. No single deployment would have produced the outcome observed at population scale. The accumulation across the developmental window is the precedent.

That precedent is the foundation for understanding what Parental Discard is, why it presents differently from estrangement, and why the existing clinical models fail to account for it. The timeline is not background. The timeline is the cause.

CONTINUE THE
RESEARCH SERIES

The timeline establishes what was deployed, when, and into whom. The next section examines what that deployment produced.

© 2026, 2025 Parental Discard™ — All Rights Reserved  ·  M.F. Shaw, MSPSY  ·  parentaldiscard.com