Child Safety Standards
Match & Help — Developed by Nicole Tang
Last Updated: February 6, 2026
1. About Match & Help
Match & Help (published on Google Play as "Match & Help", developed by Nicole Tang, package name: com.buildersongroup.conhelp) is a secure, organization-managed platform that connects individuals in need with volunteer support. The app provides communication features including text chat, voice calls, and video calls between users and volunteers.
Match & Help operates on an invitation-only, organization-controlled model. There is no public sign-up. All users — both end users and volunteers — are invited and managed exclusively by their organization's administrators (GroupAdmins). This controlled access model provides an inherent layer of safety and accountability.
2. Prohibition of Child Sexual Abuse and Exploitation (CSAE)
Match & Help strictly prohibits all forms of child sexual abuse and exploitation (CSAE) on its platform. This includes, but is not limited to:
- Any child sexual abuse material (CSAM), including images, videos, text, or any other media that depicts, promotes, or glorifies the sexual exploitation of minors
- Grooming behavior — any attempt to build a relationship with a minor for the purpose of sexual exploitation or abuse
- Solicitation of minors for sexual purposes or any inappropriate contact with minors
- Sharing, distributing, requesting, or possessing any material that sexually exploits children
- Any communication that sexualizes minors or promotes sexual contact with minors
- Sextortion or any form of coercion involving minors
- Any behavior that facilitates, encourages, or promotes the sexual exploitation or abuse of children
3. Organization Responsibility
Match & Help is an organization-managed platform. Organizations bear significant responsibility for the safety of their communities:
- User Vetting: Organizations are responsible for vetting all users (end users and volunteers) before inviting them to the platform. Only organization administrators can add members.
- No Public Registration: There is no public sign-up option. Every user must be explicitly invited by an organization administrator using a unique invitation code.
- Volunteer Approval: All volunteer applications must be reviewed and approved by the organization's GroupAdmin before they can interact with end users.
- Minor Safety: The platform is designed for adult users. Organizations are responsible for ensuring that minors are not invited to the platform unless appropriate safeguards are in place.
- User Removal: Organization administrators can immediately suspend or remove any user who violates these standards or exhibits inappropriate behavior.
- Monitoring: Organizations are expected to actively monitor user interactions within their community and promptly address any concerns.
- Compliance: Organizations using Match & Help must comply with all applicable child safety laws and regulations in their jurisdiction.
4. Enforcement and Consequences
Match & Help takes the following actions when CSAE-related content or behavior is identified:
- Immediate Removal: Any content related to CSAE is immediately removed from the platform upon detection or report.
- Account Termination: Users found to be involved in CSAE-related activity will have their accounts permanently terminated without warning.
- Organization Notification: The user's organization administrator will be immediately notified of the violation and the action taken.
- Law Enforcement Reporting: All confirmed instances of CSAE are reported to the appropriate law enforcement authorities, including local police and relevant national agencies.
- NCMEC Reporting: In accordance with applicable laws, confirmed CSAM is reported to the National Center for Missing & Exploited Children (NCMEC) through their CyberTipline.
- Evidence Preservation: Relevant evidence is preserved and provided to law enforcement as required by law.
- Organization Suspension: Organizations that fail to address CSAE concerns within their community may have their access to the platform suspended or terminated.
5. Reporting Mechanisms
Match & Help provides multiple channels for reporting child safety concerns:
- In-App Reporting: Users can report inappropriate content or behavior directly within the app through the feedback and reporting features available in chat and user profile screens.
- Organization Administrators: Users can report concerns to their organization's GroupAdmin, who has the authority to take immediate action including suspending users and escalating reports.
- Direct Contact: Child safety concerns can be reported directly to our dedicated child safety team via email (see contact information below).
- Anonymous Reporting: Reports can be submitted anonymously. All reports are treated with the utmost seriousness and confidentiality.
All reports of potential CSAE are reviewed promptly. Match & Help commits to taking action on credible reports within 24 hours.
6. CSAM Detection and Handling
Match & Help employs the following measures to detect and handle child sexual abuse material (CSAM):
- User reports of suspected CSAM are treated as highest priority and reviewed immediately.
- Any confirmed CSAM is removed from the platform immediately upon detection.
- The offending user's account is permanently terminated and their organization is notified.
- All confirmed CSAM incidents are reported to NCMEC via the CyberTipline, as required by law.
- Relevant information and evidence are preserved and made available to law enforcement agencies upon request or as required by law.
- Match & Help cooperates fully with law enforcement investigations related to CSAM and child exploitation.
7. In-App User Feedback and Safety Features
Match & Help provides the following in-app mechanisms to support user safety:
- Feedback System: Users can submit feedback about their experience, including safety concerns, directly within the app.
- Two-Way Review System: Both end users and volunteers can rate and review each interaction, creating accountability and transparency.
- Chat Reporting: Users can report inappropriate messages or behavior encountered during chat, voice, or video interactions.
- Organization Oversight: GroupAdmins have visibility into request activity and can intervene when safety concerns arise.
- User Blocking: Users can block other users to prevent further communication.
- Request Management: Users can cancel requests at any time if they feel uncomfortable or unsafe.
8. Legal Compliance
Match & Help is committed to complying with all applicable child safety laws and regulations, including but not limited to:
- The Protection of Children and Juveniles Ordinance (Cap. 213) of Hong Kong
- The Prevention of Child Pornography Ordinance (Cap. 579) of Hong Kong
- The Crimes Ordinance (Cap. 200) of Hong Kong — provisions relating to child exploitation
- The Personal Data (Privacy) Ordinance (Cap. 486) of Hong Kong — as it relates to children's data
- United States federal law regarding CSAM reporting obligations (18 U.S.C. § 2258A)
- All other applicable local, national, and international laws related to child safety and protection
9. Child Safety Contact
For any child safety concerns, reports, or inquiries related to Match & Help, please contact our dedicated child safety team:
Match & Help — Child Safety Team
Developer: Nicole Tang
Email: tangn090821@gmail.com
Website: https://matchandhelp.com
We aim to respond to all child safety reports within 24 hours. For emergencies involving immediate danger to a child, please contact your local law enforcement immediately.
10. Policy Updates
Match & Help reserves the right to update these Child Safety Standards at any time. Significant changes will be communicated to organizations and users through the app and this website. Continued use of the platform after changes constitutes acceptance of the updated standards.
Organizations using Match & Help are expected to review these standards periodically and ensure their members are aware of and comply with these requirements.