
FutureNet — Content Assessment Framework Research

Building frameworks for age-appropriate content classification and online safety.

FutureNet is a research initiative exploring content assessment frameworks for online safety and age-appropriate content classification. It aims to develop practical, evidence-based frameworks that help regulators, platforms, educators, and parents create safer digital environments for children.

This project is part of the larger vision to build a digital village (kampung) for children — a safe environment that promotes responsible technological growth through effective content assessment and moderation.


🌱 Project Vision

FutureNet imagines a world where children use technology safely, supported by robust content assessment frameworks that balance protection with access to age-appropriate content.

Our research so far shows that content assessment needs vary significantly across age groups and contexts.

Key Focus Areas

Age-Appropriate Content Classification

  • Developing frameworks for assessing content suitability across age bands (5-8, 9-12, 13-16)
  • Understanding developmental needs and risk factors at different ages
  • Creating practical assessment criteria that platforms can implement (a minimal, illustrative sketch follows this list)
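
To make the idea of "practical assessment criteria" concrete, below is a minimal, hypothetical Python sketch of how the age bands and a per-content assessment record might be represented. The class names, enum members, and risk factor list are illustrative assumptions for discussion, not part of any framework defined in this repository.

```python
# Hypothetical sketch only: one way a platform could encode FutureNet-style
# criteria. Names, age bands, and risk factors below are illustrative.
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class AgeBand(Enum):
    """Age bands referenced in this research (inclusive ranges)."""
    EARLY = (5, 8)
    MIDDLE = (9, 12)
    TEEN = (13, 16)


class RiskFactor(Enum):
    """Example content risk factors; a real framework would define these precisely."""
    VIOLENCE = "violence"
    SEXUAL_CONTENT = "sexual_content"
    HARMFUL_BEHAVIOURS = "harmful_behaviours"
    AI_GENERATED = "ai_generated"


@dataclass
class ContentAssessment:
    """One assessment record for a single piece of content."""
    content_id: str
    risk_factors: set = field(default_factory=set)
    # Youngest age band the content is judged appropriate for, or None if it
    # is not considered suitable for any of the bands studied here.
    minimum_band: Optional[AgeBand] = None

    def suitable_for(self, band: AgeBand) -> bool:
        """True if the content is judged appropriate for the given age band."""
        if self.minimum_band is None:
            return False
        return band.value[0] >= self.minimum_band.value[0]


# Example: content flagged for violence, judged suitable from the 9-12 band upwards.
assessment = ContentAssessment(
    content_id="video-123",
    risk_factors={RiskFactor.VIOLENCE},
    minimum_band=AgeBand.MIDDLE,
)
print(assessment.suitable_for(AgeBand.TEEN))   # True
print(assessment.suitable_for(AgeBand.EARLY))  # False
```

A real framework would also need to handle content outside these bands and jurisdiction-specific rules; the point of the sketch is only that criteria of this kind can be made machine-readable.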

Online Safety Assessment

  • Evaluating current content moderation approaches
  • Identifying gaps in existing regulatory frameworks
  • Researching AI-assisted content assessment tools
  • Understanding platform safety mechanisms

Stakeholder Collaboration

  • Engaging with regulators, platform providers, and safety experts
  • Learning from international frameworks (IMDA, eSafety Commissioner, UNICEF)
  • Incorporating perspectives from educators, parents, and advocacy groups

Effective content assessment requires collaboration across technology, policy, and child development expertise.

Research Approach

FutureNet explores:

  • Evidence-based frameworks grounded in child development research
  • Practical assessment tools that platforms and regulators can use
  • Multi-stakeholder engagement to ensure diverse perspectives
  • International best practices from leading regulatory bodies
  • AI and technology integration for scalable content assessment

This repository contains research documentation, interview findings, and framework development work.


📅 Research Roadmap

FutureNet follows a phased research approach to develop evidence-based content assessment frameworks.

Phase overview:

  1. Landscape Research
     Review existing content assessment frameworks from IMDA, eSafety Commissioner, UNICEF, and other regulatory bodies. Analyze current platform approaches to content moderation and age-rating.

  2. Stakeholder Interviews
     Conduct interviews with regulators, platform providers, content safety experts, educators, and parent advocacy groups to understand needs, challenges, and opportunities.

  3. Analysis & Synthesis
     Synthesize research findings to identify gaps in current frameworks, regulatory requirements, technical constraints, and opportunities for innovation.

  4. Framework Development
     Develop draft content assessment frameworks, including criteria, tools, and implementation guidance for different stakeholder groups.

  5. Validation & Refinement
     Engage experts and stakeholders to review and refine the framework, ensuring practical applicability and alignment with child rights principles.


📋 Core Framework Components

Assessment Criteria

  • Age-appropriateness indicators across developmental stages
  • Content risk factors (violence, sexual content, harmful behaviors)
  • Platform safety features evaluation
  • AI-generated content considerations

Stakeholder Tools

  • Regulatory compliance checklists
  • Platform self-assessment frameworks
  • Educator and parent guidance materials
  • Technical implementation guidelines

Research Areas

  • Current content moderation practices
  • AI-assisted assessment tools
  • International regulatory approaches
  • Child development and online safety research

🛠 Documentation Stack

| Component | Technology |
| --- | --- |
| Documentation | Markdown |
| Version Control | Git/GitHub |
| Collaboration | GitHub Issues & Discussions |
| Research Tools | Interview transcripts, synthesis documents |

Research outputs: Framework documents, assessment tools, stakeholder guidance, and policy recommendations.


🎯 Research Goals

  • Understand current content assessment practices and gaps
  • Develop evidence-based frameworks for age-appropriate content classification
  • Create practical tools for regulators, platforms, and educators
  • Incorporate child rights principles (UNCRC, UNICEF guidance)
  • Foster multi-stakeholder collaboration on online safety
  • Bridge policy, technology, and child development perspectives

📚 This is a research and framework development project — implementation and deployment are separate future considerations.


📁 Repository Structure

This repository contains research documentation, interview materials, and framework development work organized into key areas:

  • docs/research/ — Research findings, regulatory reviews, and analysis
  • docs/interviews/ — Stakeholder interview guides, transcripts, and synthesis
  • docs/solution-ideas/ — Framework concepts and assessment tools
  • docs/roadmap/ — Project timeline and milestones

📁 Key Resources

For easier navigation, here are links to the main project areas:

| Area | Description | Link |
| --- | --- | --- |
| Research | Regulatory reviews, content assessment analysis, and research findings | docs/research |
| Interviews | Stakeholder interview guides, transcripts, notes, and synthesis | docs/interviews/README.md |
| Framework Development | Content assessment framework concepts and tools | docs/solution-ideas/content-assessment-tool |
| Stakeholder Personas | Profiles of key stakeholder types (regulators, platforms, educators) | docs/interviews/user-personas |

💡 Always refer to the relevant README in each folder for detailed instructions and workflow.


🤝 Contributing

We welcome contributions from researchers, policy experts, educators, and anyone interested in online safety and content assessment.

Ways to contribute:

  • Conduct stakeholder interviews (see docs/interviews/README.md)
  • Review and analyze existing regulatory frameworks
  • Provide expert feedback on framework development
  • Share relevant research and best practices
  • Help synthesize interview findings and insights

For detailed contribution guidelines, see CONTRIBUTING.md.
