Head, Artificial Intelligence and Machine Learning, World Economic Forum
Fellow, Artificial Intelligence and Machine Learning, World Economic Forum
4 questions parents should ask about educational tech during COVID-19
- About 1.5 billion children globally are out of school due to COVID-19.
- Many parents are turning to online education technology, but is it safe?
- Here are four questions to ask, including: does the technology prioritize privacy?
With 1.5 billion children globally out of school due to COVID-19, many parents with access to technology and internet connectivity are increasingly turning to online education technology, smart toys, and video games to keep their kids learning at home. Kids are using Zoom for classes and video calls, YouTube for education and leisure, online EdTech to learn, and video games for entertainment.
Because children and youth are especially vulnerable to the risks posed by technology, strong ethics and governance are urgently needed both during and after the COVID-19 pandemic. When deciding whether a technology is safe and educational for their children, parents and guardians should ask four questions, covering education, safety and privacy, responsible use, and inclusion and fairness.
1. Does the technology have a strong educational foundation and encourage creativity?
The technology should have a clear pedagogical foundation for teaching children and provide data to demonstrate its educational value and impact. Not every EdTech product will have a peer-reviewed study to support it, as the app Bedtime Math does, but providers should offer quantitative analyses of their impact. Parents can also look for research-based evaluations from third parties, like Common Sense Media.
Simply teaching skills is not enough: The technology should also encourage creativity and independent thought. Technology should enhance, not hinder, creativity, and students who have creativity incorporated into their curricula have better learning outcomes, according to a Gallup Education study. EdTech should leverage technology to promote creativity and critical thinking, not confine children to thinking within the constraints of a program or game.
2. Does the technology protect the child, prioritize privacy, and safely store the child’s data?
The technology should have clear safety policies in place to protect children from potential bullying, harassment, exploitation or other security risks. If children can communicate with other users on the platform or within a game, they are exposed to additional risks, and parents may find it challenging to track their online activity and communications.
If parents choose to allow the technology to gather data, it should have clear safeguards in place to protect and anonymize that data, both for internal use and so that hackers cannot identify individual children if the data is stolen. Parents should also check that the technology does not sell children's data to third parties without parental consent, and they should weigh that consent carefully, understanding that personal data is sold widely within the private sector.
3. Is the technology designed for responsible use and to prevent addiction?
Many technologies are designed to maximize use, and many smart toys and games are inherently addictive. The technology should have limits in place to discourage children from overuse. Parents should encourage children to moderate their use of technology and should lead by example. Children ages 2 to 5 should spend no more than one hour of screen time per day, and parents of children over 5 should set "consistent" limits on screen time, according to the American Academy of Pediatrics.
But COVID-19 makes these guidelines challenging—if not impossible—for many families. Even UNICEF is rethinking its screen time guidelines. Jenny Radesky, a professor of pediatrics at the University of Michigan and author of the AAP guidelines, tweeted the AAP's current advice during COVID, which included, "Challenge your children to practice 'tech self-control' and turn off tech themselves." But as a mother herself, she admits that during COVID this advice is challenging, explaining, "I'm making this up as I go, too!"
4. Is the technology inclusive, fair and unbiased?
The technology should make clear that it is designed for a broad, diverse base of children to use. Some technologies are designed with a specific child consumer in mind, but all technologies should promote accessibility for every potential user, regardless of ability, language, or visual, auditory, or other impairments.
If the technology uses artificial intelligence (AI) or machine learning, such as facial recognition, parents must also ensure that it is fair and unbiased. Many AI models exhibit bias against certain groups. The technology should make clear that it treats all children fairly and prevents bias or discrimination based on age, gender identity, ethnicity, or any other demographic characteristic.