UNICEF Venture Fund Call for Applications | Up to US$100K in grant funding


The UNICEF Venture Fund is looking to invest in Open Source frontier technology solutions that have the potential to create radical change for children. We are offering up to US$100K in equity-free funding for early-stage, for-profit technology start-ups that can improve the lives of children. If your company is leveraging cutting-edge technologies such as artificial intelligence (AI), machine learning (ML), or blockchain, we want to hear from you!

We are particularly looking for companies registered in one of UNICEF’s programme countries that have impressive working prototypes and a commitment to Open Source licensing and practices.

Area 1

Misinformation and Disinformation

Are you building tools, platforms or games that leverage new technologies to verify information and combat misinformation and/or disinformation? Or are you delivering behavioural interventions to continuously inform young people about misinformation and/or disinformation? We are particularly interested in approaches that address mis/disinformation in multiple languages and formats (e.g. audio, video, image) and encourage platforms accessible to people with disabilities. Potential applications may explore solutions such as:

  • Game-based social and behavioural change interventions or platforms that help identify mis/disinformation
  • Mechanisms to review information, identify mis/disinformation, and/or provide legitimacy to true information shared online, for example software that can detect deepfakes in videos and images
  • Platform-agnostic tools using data science and AI to identify and analyze false or inaccurate content, and track the source and/or spread of this content
  • Interactive tools or games for children to engage with and learn about fact-checking
  • Mechanisms to tag data or watermark content at the source, or while it is being circulated, to improve trust in it
  • Tools that leverage the power of crowds to collectively monitor or identify data inaccuracy and to build trust through the power of social networks
  • Tools that can eliminate the need for third-party auditing through innovative use of blockchains in a data collection or recording use case
  • Tools to audit non-transparent social media platforms (such as platforms that do not disclose their content recommendation algorithms, or peer-to-peer messaging platforms) to determine how effective they are at removing or labelling mis/disinformation
  • AI-enabled systems to address mis/disinformation during crises and ensure the dissemination of accurate information

Area 2

Data generation, collection and analysis

Are you using novel approaches to compile and validate large amounts of training data? Or generating new data through field data collection, crowdsourcing, or social network platforms? This could include use cases such as:

  • Building safe and secure data collection and management systems following Open standards (for transparency and accountability) while anonymizing sensitive data or leveraging privacy-enhancing technologies
  • Developing models to analyse large amounts of data and generate insights for decision-making and resource allocation
  • Identifying methods to address emotional or cognitive bias in data collection
  • Generating new data through field data collection, crowdsourcing or social network platforms to understand trends and conduct situational analysis
  • Providing transparency and accountability in how data is collected, managed, analysed, benchmarked, and generated
  • Developing systems for navigating existing sources and available information

Area 3

Digital Trust 

Are you leveraging existing and new technologies to build digital trust? Or are you generating insights to assess and mitigate the threats and harms children face in digital environments? We are looking for startups that are building new tools, for instance:

  • Decentralized protocols for content ownership, attribution, and licensing using blockchain technology
  • ML/AI applications to monitor and model potential online risks to children, including those generated by AI systems
  • Blockchain or AI tools to ensure credible proof of humanity and secure “KYC” processes
  • Tools that use digital footprints from sources like social media or mobility patterns to generate insights, such as risk analyses or forecasts that trigger interventions before a crisis occurs
  • Tools that leverage blockchain to verify online content, for example by creating trusted collections of information voted on by verified sources against clear criteria
  • Systems that improve data provenance and auditability
  • Game-based educational tools and guidance for children to learn about the principles of privacy, respect and sharing of content online

We are not limited to the areas mentioned. We are actively looking for companies that push the boundaries with frontier technologies in innovative and scalable ways with global relevance.

If you think you meet the UNICEF Venture Fund criteria, we want to hear from you!

Why are we interested in Data and Trust?

With over 4 billion people online (71% of them 15-24 year olds) and 1 in 3 children connected to the internet, children’s lives are being shaped behind a screen. Digital technology provides children and young people with wider access to information, culture, communication, and entertainment in a way that was impossible to imagine just 20 years ago.

These remarkable benefits also carry risks: these same tools can heighten children’s exposure to online risks and harms. Being online can magnify the traditional threats and harms that many children already face offline and can further increase vulnerabilities, with online risks present 24/7/365. Additionally, as technologies shape children’s daily experiences, their information is gathered, monitored, combined, examined, and sometimes used for financial gain. One major risk children and youth experience relates to weakened data integrity, high levels of mis/disinformation, and limited media literacy among children and youth globally.

As technology evolves, so will digital trust. We therefore seek to invest in cutting-edge solutions that are producing new systems to strengthen online and offline data collection; models to analyze large amounts of data in a safe and secure manner; tools and platforms to empower and actively engage young people in understanding digital trust; and technology to verify content and audit unsafe tools. We are interested in platforms and tools that uncover and address mis/disinformation, and in systems that contribute to a safe and secure “online life.”
