Let Your Child Go

Childhood hasn’t always been what it is today. In fact, it is fair to say that in 17th-century England and the American colonies, childhood hardly existed at all,[1] says Holly Brewer, professor of history at the University of Maryland, because there was little demarcation between being a child and being an adult.

Brewer gives several examples of this: a 5-year-old signed a binding labor contract (despite being unable to read or write); an 8-year-old was executed for arson; a 3-year-old was married; a 6-month-old was held responsible for her own death because she rolled into a fire; and a 9-year-old and an 11-year-old were executed for stealing a hat. In the 1660s, 40 members of the English Parliament were between 12 and 21 years of age. One of them, 13-year-old Christopher Monck, opened the impeachment proceedings against the Earl of Clarendon. And in America, a 4-year-old girl was imprisoned during the Salem witch trials.

Around the world, young children have traditionally worked in the fields and in households. I saw this myself in Kenya in the 1960s, where very young girls carried their infant siblings in slings on their backs and boys wielding switches larger than themselves herded cattle. But Brewer points to something different: children weren’t merely used as labor; they were held to the same level of responsibility as adults, no matter how young.

In parts of Japan,[2] toddlers are sometimes given chores, such as buying milk, that require them to cross streets and interact with strangers. This is inconceivable to many Americans, who believe that young children should not be burdened with any responsibilities.

In America, the common belief is that people should be held responsible only when they are able to give informed consent. Brewer finds that the shift to the modern understanding of childhood began as England moved away from monarchical rule toward democracy. Under the older system, a person was born into a status, and that status conferred roles and responsibilities. In a democratic society, however, the question of who is eligible to vote becomes central. The systematic exclusion of black people from the vote was often rationalized by claiming that those of African descent, while human beings, were deficient in intellectual qualities and thereby unable to give informed consent. Today, children, people in comas, and those with profound intellectual disabilities have rights but do not bear responsibilities.

At what age does childhood end and adulthood begin? The answer shifts with time, place, and context. I’ve experienced this myself. I was in the army at 17, and the following year I was legally allowed to drink. I couldn’t vote for another three years. And when I applied for a marriage license just shy of my 21st birthday, my future mother-in-law had to vouch for me. From the military’s point of view, I was old enough to learn how to kill; from the government’s point of view, I wasn’t mature enough to cast a ballot; and the alcohol industry considered me an adult while New Jersey wouldn’t let me marry without parental consent.

Since then, some of the lines between childhood and adulthood have switched, so the voting age now comes before the drinking age. The problem extends beyond informed consent; it is really about whom society trusts, and under what circumstances. A ballot? An eighteen-year-old isn’t going to have a great impact on the outcome of an election, but an eighteen-year-old who has been drinking while driving is dangerous. Of course, drinking and driving is a terrible idea for anyone. The idea behind the age limit on drinking is that those younger than 21 have less impulse control and are therefore more likely to be reckless. Data[3] from New York State seem to back this up: although drivers under 21 make up 5 percent of drivers, they are involved in 14 percent of fatal crashes, nearly three times their share.

Numerous studies support the rationale behind policies that set age limits on dangerous behavior. It is well established that sensation seeking and low impulse control are common in adolescence, and that by late adolescence and early adulthood, for the majority, impulse control strengthens while sensation seeking moderates. This means adolescents are less likely than adults to control their drinking or drug use while driving.

Psychology professor Laurence Steinberg has looked at adolescence from various perspectives.[4] He concluded, based on studies of neuroplasticity, that the cut-off point of adolescence should be set at about 25, since the human brain isn’t fully developed until then.

The historical and psychological accounts give pause to the notion that childhood is purely biological and beyond changing. It is not. As Denise Park of the University of Texas and Chih-Mao Huang of the University of Illinois wrote, “There is clear evidence that sustained experiences may affect both brain structure and function. Thus, it is quite reasonable to posit that sustained exposure to a set of cultural experiences and behavioral practices will affect neural structure and function.”

The evidence suggests that adolescents in the past may have been more mature than they are today. To what extent is arguable. But there seems to be an element of self-fulfilling prophecy: if you expect little from children, if adolescents aren’t challenged, if they aren’t given real responsibilities, then their brains mature more slowly, and the upper limit of childhood extends well into what was once adulthood.

Is this a good thing or a bad thing? My experience at the university is that many students chafe at being held back from full adulthood, while parents complain that colleges, citing privacy laws and regulations, shield them from their children’s school records. We have a muddle in which few are happy. If we admit that it is society that draws the lines, not biology alone, then we can begin to sort out what we mean by maturity. Neither biology nor psychology can answer the question by itself, but society needs to look to both to make an informed decision.