• lmmarsano@group.lt · 9 hours ago

    Explain your objection. It’s a parenting problem, not everyone else’s.

    • nymnympseudonym@piefed.social · 4 hours ago

      My objection is that it’s my operating system running on my computer.

      Not yours. MINE.

      I can make its logic gates do anything I want, as long as it’s not sending CP or malware over the Internet.

      • lmmarsano@group.lt · 2 hours ago

        While I agree, I don’t think the language “every operating system provider has to create” means it must be installed whether you want it or not. Parental control software exists for Linux, it’s available from the package manager, and we can opt out of installing it.

        I doubt “every operating system” is meant literally. Embedded OSs for specialized hardware (eg, routers, satellites, rockets, missiles, drones, calculators, industrial lasers) aren’t typically meant for children to browse the web. If TempleOS supported networking, it might be in trouble. Viable legislation would probably be restricted to OSs designed to allow children to access content over the internet.

        The main thrust of the suggestion is to prefer parental controls over age verification. Better ways to ensure availability of parental controls (like government services to provide the software free) fit that broad idea.

        I can make its logic gates do anything I want, as long as it’s not sending CP or malware over the Internet.

        That stipulation doesn’t need to be stated. It can be programmed to do anything, and that’s fine. Laws already exist for illegal activity. Anyone who’d fuss over the absence of that stipulation lacks credibility.

    • tyler@programming.dev · 8 hours ago

      Parents already have the tools to block this at the network layer, including in mobile OSes. There’s no need to add age verification to anything at all. Parents control their kids’ devices, so don’t give them a device they can access this stuff on.

      These tools have existed for literal decades at this point. Anyone trying to add something now is just trying to make it easier for the government to spy on you.
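      The network-layer blocking mentioned above usually means DNS- or hosts-based filtering, which blocks an entire domain and everything under it. A minimal Python sketch of that matching rule (the blocklist entries below are made-up placeholders, not a real list):

```python
# Toy sketch of the hostname matching a DNS-level family filter performs.
# The blocklist entries and domains here are illustrative placeholders.

BLOCKLIST = {"adult.example", "gamble.example"}

def is_blocked(hostname: str) -> bool:
    """Return True if hostname or any parent domain is on the blocklist."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check the name itself and every parent domain: DNS filters typically
    # block whole zones, so cdn.adult.example matches adult.example.
    for i in range(len(labels) - 1):
        if ".".join(labels[i:]) in BLOCKLIST:
            return True
    return False
```

      Router-level filters (Pi-hole, dnsmasq blocklists, OpenDNS-style services) apply essentially this rule to every device on the home network, which is what makes the network layer a convenient place for parents to enforce it.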

      • lmmarsano@group.lt · 3 hours ago (edited)

        Cool: agreed. Your objection was ambiguous.

        If we had to choose, though, I’d consider the professor’s suggestion preferable to age verification. While I disagree with mandating it, the mandate would pretty much do nothing, because it’s already reality: most mainstream OSs include parental controls. The “criteria” would establish standards for parental controls, which isn’t altogether a bad idea. A better idea would be to promote a standard & replace mandates with public services to provide parental control technologies free & to educate parents.

        In the late 90s, when the US Congress attempted to regulate minors’ access to adult content, those laws commissioned studies that drew similar conclusions even then. The studies & federal courts concluded that, to meet the government’s compelling interest in “protecting minors from harmful content”, there were more narrowly tailored alternatives to criminalization & age verification that are less restrictive of fundamental rights & at least as effective:

        • client-side filters to block content from the receiving end
        • government programs to train parents & provide them resources to “protect” their children from “harmful content”
        • public education campaigns.

        They pointed out that while client-side filters may have false positives & negatives

        • they can be monitored & corrected
        • they’re a more complete solution that can restrict all internet protocols (not just web) from any geographic source (not only in legal jurisdiction) with content of any type (including dynamic such as live chat)
        • they allow restriction of other kinds of content (eg, violence, hate speech)
        • they can vary restrictions per child (eg, age-appropriateness)
        • they let parents disable them
        • they don’t obstruct access by adults.
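        As a toy illustration of two of those points (restrictions that vary per child, and a parental off-switch), a sketch along these lines, with invented profiles and categories:

```python
# Toy per-child filter profiles, illustrating two properties the studies
# highlight: restrictions can vary per child, and parents can disable them.
# The categories and profiles are invented for the example.

from dataclasses import dataclass, field

@dataclass
class Profile:
    blocked_categories: set = field(default_factory=set)
    enabled: bool = True  # a parent can switch the filter off entirely

def allowed(profile: Profile, content_categories: set) -> bool:
    """Permit content unless the active profile blocks one of its categories."""
    if not profile.enabled:  # parental override: filter disabled
        return True
    return not (profile.blocked_categories & content_categories)

# Different restrictions for different ages, as the studies describe.
teen = Profile(blocked_categories={"adult"})
child = Profile(blocked_categories={"adult", "violence"})
```

        The point of the sketch is only that these properties are trivial to provide on the client side, whereas source-side age verification can offer none of them.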

        Criminalizing access to adult content at the source obstructs everyone’s access & burdens them with loss of privacy & with security risk.

        Despite their age, those studies’ findings remain relevant.

        • COPA Commission

          In October 1998 Congress enacted the Child Online Protection Act and established the Commission on Online Child Protection to study methods to help reduce access by minors to certain sexually explicit material, defined in the statute as harmful to minors. Congress directed the Commission to evaluate the accessibility, cost, and effectiveness of protective technologies and methods, as well as their possible effects on privacy, First Amendment values and law enforcement. This report responds to the Congressional request.

        • National Research Council

          In November 1998, the U.S. Congress mandated a study by the National Research Council (NRC) to address pornography on the Internet (Box P.1).

        COPA Commission summary

        The COPA Commission found Age Verification ID to have the highest adverse impact on cost, privacy, fundamental rights, and law enforcement and to score poorly on effectiveness and accessibility. They found other technologies & methods to be more effective & accessible with much lower adverse impact including

        • client-side filtering
        • family education programs
        • acceptable use policies
        • top-level domains for materials “not harmful” to minors
        • “greenspaces” containing only child-appropriate materials.

        Some recommendations to highlight

        Public Education:

        • Government and the private sector should undertake a major education campaign to promote public awareness of technologies and methods available to protect children online.
        • Government and industry should effectively promote acceptable use policies.

        Consumer Empowerment Efforts:

        • Resources should be allocated for the independent evaluation of child protection technologies and to provide reports to the public about the capabilities of these technologies.
        • Industry should take steps to improve child protection mechanisms, and make them more accessible online.
        • A broad, national, private sector conversation should be encouraged on the development of next-generation systems for labeling, rating, and identifying content reflecting the convergence of old and new media.
        • Government should encourage the use of technology in efforts to make children’s experience of the Internet safe and useful.

        Industry Action:

        • The ISP industry should voluntarily undertake “best practices” to protect minors.
        • The online commercial adult industry should voluntarily take steps to restrict minors’ ready access to adult content.

        NRC summary

        The NRC found “no single or simple answer”: it agreed on the capability of filters to prevent exposure that is inadvertent or not highly motivated, but also stressed social & educational strategies to address motivation, coping, & responsible behavior.

        Social and educational strategies are intended to teach children how to make wise choices about how they behave on the Internet and to take control of their online experiences: where they go; what they see; what they do; who they talk to. Such strategies must be age-appropriate if they are to be effective. Further, such an approach entails teaching children to be critical, skeptical, and self-reflective of the material that they are seeing.

        An analogy is the relationship between swimming pools and children. Swimming pools can be dangerous for children. To protect them, one can install locks, put up fences, and deploy pool alarms. All of these measures are helpful, but by far the most important thing that one can do for one’s children is to teach them to swim.

        Perhaps the most important social and educational strategy is responsible adult involvement and supervision.

        Internet safety education is analogous to safety education in the physical world, and may include teaching children how sexual predators and hate group recruiters typically approach young people, how to recognize impending access to inappropriate sexually explicit material, and when it is risky to provide personal information online. Information and media literacy provide children with skills in recognizing when information is needed and how to locate, evaluate, and use it effectively, irrespective of the media in which it appears, and in critically evaluating the content inherent in media messages. A child with these skills is less likely to stumble across inappropriate material and more likely to be better able to put it into context if and when he or she does.

        Education, supervision, & parental controls/filters seem a more compelling solution. However, bring that up in regard to legislation to age-restrict social media, & the tune on lemmy changes dramatically: that seems inconsistent.

      • IsoKiero@sopuli.xyz · 5 hours ago

        A lot of parents sadly lack the skills to use those tools, or don’t even know they exist. I’m not inherently against the approach where the user agent sends some rough age (allowed R-rating or something) to the website, which can then block minors from accessing porn/violence/whatever. If it were just that, locally stored info on whether the user is a minor or an adult, it could be a pretty decent approach, even for technically less inclined parents, to set some limits on what their kids can do.

        But as with nearly every ‘protect the kids’ thing, it’s a pretty damn slippery and steep slope. If adult verification requires anything more than a local variable, that’s the point when the whole system becomes a tool for surveillance instead of a helpful thing for parents/schools, and all of these “solutions” worldwide seem to be going in that direction.
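        The “local variable” version could be as simple as the browser attaching a parent-configured rating ceiling to each request, with the site comparing it against its own rating. A hedged sketch in Python; the header name X-Max-Rating is hypothetical, not any existing standard:

```python
# Sketch of the locally stored age signal: the browser sends a coarse,
# parent-configured rating ceiling, and the server compares it to the
# content's rating. "X-Max-Rating" is a hypothetical header name.

RATING_ORDER = {"G": 0, "PG": 1, "PG-13": 2, "R": 3}

def may_serve(content_rating: str, headers: dict) -> bool:
    """Serve content only if the client's declared ceiling covers its rating."""
    ceiling = headers.get("X-Max-Rating", "R")  # no header: treat as adult
    return RATING_ORDER[content_rating] <= RATING_ORDER.get(ceiling, 3)
```

        Nothing identifying leaves the device in this scheme: the server learns only a coarse ceiling, which is exactly what separates it from the surveillance-prone identity checks described above.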