Annual Program Statement for Internet Freedom Round 1 Grant (From $500,000 – $3,000,000)

The Bureau of Democracy, Human Rights, and Labor (DRL) announces a Request for Statements of Interest (RSOI) for the Annual Program Statement for Internet Freedom Round 1 Grant (From $500,000 – $3,000,000) from organizations interested in submitting Statements of Interest (SOI) for programs that support Internet freedom under the funding themes listed below. Through integrated assistance to civil society for technology, digital safety, policy and advocacy, and applied research projects, DRL seeks to defend the open, interoperable, secure, and reliable Internet by advancing fundamental freedoms, human rights, and the free flow of information online. DRL invites interested organizations to submit SOIs describing program ideas that would achieve this objective.

 

 

Deadline for Annual Program Statement for Internet Freedom Round 1 Grant (From $500,000 – $3,000,000): May 25, 2023

 

Funding:

 

From $500,000 – $3,000,000

 

Eligibility details

The following prerequisites must be met by organizations submitting SOIs:

·      Be a private, public, or state institution of higher education, or a for-profit organization or business. Note that there are restrictions on the payment of fees and/or profits under grants and cooperative agreements, including those outlined in 48 CFR 30, “Cost Accounting Standards Administration,” and 48 CFR 31, “Contract Cost Principles and Procedures.” DRL reserves the right to request additional background information from organizations that have not previously managed U.S. Government grants.

·      These applicants may be offered limited funding on a pilot basis.

·      Applicants may form consortia and submit a single SOI. One organization should serve as the lead applicant, with the others included as sub-award partners.

 

Internet Freedom Funding Themes (Annual Program Statement for Internet Freedom Round 1 Grant (From $500,000 – $3,000,000))

Funding Theme #1: Technology

Goal(s):

Develop, enhance, and deploy technology to support unrestricted, secure access to the Internet worldwide and/or to serve the objectives of the other Funding Themes listed below.

Current Problems of Interest include, but are not limited to, the following:

 

1. Advanced surveillance, censorship, filtering, or blocking of websites or online services;

2. Internet shutdowns and degradation of access;

3. Splintering of the Internet;

4. The repressive use of spyware, especially when used against civil society, human rights defenders, or independent media.

 

To be considered, programs must:

1. Be built on proven open-source technologies that are sufficiently mature to be used responsibly in relevant repressive, precarious, or conflict-affected environments and with identified at-risk, marginalized, or vulnerable populations.

2. Include a strong use case for human rights in their application.

3. Demonstrate a solid understanding of the adversarial actions that could affect how a proposed technology is used, along with a plan for addressing them.

4. Clearly support and defend particular technical claims and their contribution to results connected to the specified Funding Theme(s)’ Goal(s) (e.g. what specific technologies, protocols, etc. are being used; why a specific technology is being used instead of others; how the technology works to address specific identified threats; etc.)

5. In accordance with DRL criteria, submit technology for a security audit by an impartial third party.

 

To be eligible, programs must NOT:

1. Propose closed-source technology projects (i.e., technology published under proprietary licenses that prohibit code reuse or adaptation).

 

2. Propose the creation of hypothetical or aspirational technology without a user base already in place or a demonstrated use case for safeguarding human rights online.

3. Introduce technologies that lack adequate security to relevant at-risk populations.

 

The following activities, among others, are typically not considered competitive:

1. Technology that aims to facilitate unrestricted and secure access to the Internet but does not specifically address the repressive risks faced by the populations it serves or provide sufficient information about how it will do so.

2. The bandwidth capacity and/or core server infrastructure for anti-censorship technology.

3. Expanding physical Internet infrastructure or removing first-order obstacles to online access (i.e., the physical availability and inherent quality of network connections independent of deliberate government interference or targeted repression).

4. The use of digital technologies without a clear strategic application for and focus on online human rights protection, such as artificial intelligence, blockchain, and virtual reality.

5. Support for aspirational technologies that are still in the proof-of-concept stage, unless they are created to conduct research or address an emerging threat to Internet freedom.

6. The deployment of technologies that fail to adequately address the particular needs, challenges, and use cases of their target populations, that do not exhibit demand-driven development, or that do not take into account feedback from local communities.

7. Technology that aims to bring curated information to markets that are restricted.

 

Funding Theme #2: Digital Safety

Goal(s):

Conduct programs that help marginalized, vulnerable, and at-risk groups, or those who protect them, prepare for, prevent, detect, investigate, and/or obtain relief from repressive digital attacks or other forms of repression (such as online surveillance and censorship) intended to keep these groups from exercising their fundamental rights and freedoms online.

Current problems of interest include, but are not limited to:

 

1. The repressive use of spyware, particularly when it is directed against civil society, human rights advocates, or independent media.

2. DoS attacks that affect freedom of expression and target human rights advocates, independent media, and civil society.

3. International digital repression.

 

To be eligible, programs must:

 

1. Have a distinct focus on safeguarding human rights online.

2. Demonstrate a thorough understanding of adversarial efforts and a plan for addressing them.

3. Address the immediate repressive risks the populations served are facing.

4. Demonstrate a thorough awareness of the operational risks associated with working in local contexts.

5. Clearly demonstrate strong internal capabilities and extensive expertise in risk management and operational security, along with a track record of successfully implementing similar initiatives in high-risk settings.

 

To be eligible, programs must NOT:

1. Recommend or deploy technology in relevant repressive, fragile, or conflict-affected environments, or with identifiable at-risk, marginalized, or vulnerable groups, that has not reached sufficient maturity and security to be used responsibly.

2. Fail to outline the security training methodologies that would be used.

3. Provide beneficiaries with generalized “digital literacy” instruction without a discernible benefit that enhances security.

4. Include efforts to moderate and/or counter online content unless they explicitly commit to using only methods that do not restrict freedom of expression (such as online self-regulation by users, privacy protection measures, etc.).

5. Concentrate on thwarting offline remote surveillance.

 

The following activities, among others, are typically not considered competitive:

1. Projects that are broadly intended to counter efforts to curtail human rights and fundamental freedoms but are not expressly focused on the online exercise of those rights or freedoms.

2. Digital security education or capacity-building initiatives that are not in response to a clear and genuine threat, a recent or anticipated shift in the threat environment affecting the target audience, or an underserved at-risk group.

3. The development of new generalized security educational or informative materials that are largely topical and that are intended for the general public.

4. Only making instructional or informational materials available to program participants and not making them available for reuse, revision, or adaptation by other relevant communities or service providers in the field of protection.

5. Purchasing large quantities of hardware or licenses for commercial technologies or encryption products. To be competitive, programs that provide beneficiaries with equipment or services must be targeted interventions that reduce the risk or impact of a) recent digital attacks that beneficiaries have experienced, or b) specific near-term threats that beneficiaries are expected to face.

 

Funding Theme #3: Policy and Advocacy

Goal(s):

Conduct or enable policy advocacy to counter laws, court rulings, regulations, guidelines, corporate policies, and protocols that restrict fundamental freedoms and human rights online; enable the objectives of the Digital Safety or Technology funding themes; and/or otherwise promote and extend Internet freedom.

 

Current problems of interest include, but are not limited to:

1. Internet shutdowns, including degradation of access.

2. Splintering of the Internet.

3. Policies or laws that limit basic liberties and human rights online under the pretense of enhancing cybersecurity or combating online fraud, defamation, and hate speech.

4. International digital repression.

 

To be eligible, programs must:

1. Clearly identify and describe a specific Internet freedom policy target area for advocacy.

2. Present a clear advocacy strategy that details planned activities and establishes measurable objectives and outcomes for policy change.

3. Clearly demonstrate an understanding of the local policy advocacy environment.

 

4. Demonstrate a thorough understanding of the operational risks involved in operating in local environments.

 

Programs must NOT:

1.     Address digital technology policies or laws that are not focused on, or do not have clear direct consequences for, the protection of human rights and fundamental freedoms on the global Internet.

 

The following activities, among others, are typically not considered competitive:

1. Digital technology projects (such as blockchain, virtual reality, and artificial intelligence) without a clear strategic application for and focus on upholding human rights online

2. General advocacy capacity-building support that does not encourage or promote locally relevant and locally driven advocacy benefiting local civil society actors or marginalized, vulnerable, and at-risk communities.

3. Activities that disseminate research findings to U.S. Government allies or stakeholders.

 

Funding Theme #4: Applied Research

Goal(s):

Conduct research initiatives to inform and advance Internet freedom globally, as described in the Goal(s) of the Funding Themes above, or to otherwise better understand and address challenges to Internet freedom.

Current problems of interest include, but are not limited to:

 

1. The oppressive use of spyware, particularly for the surveillance, restriction, or repression of independent media, civil society organizations, and human rights advocates.

2. Internet shutdowns, degradation of access, and splintering of the Internet.

3. Internet censorship laws, regulations, policies, procedures, and protocols.

4. Reducing the effects of cyberbullying and harassment without limiting freedom of expression.

5. DoS attacks that affect freedom of expression and target human rights advocates, independent media, and civil society.

 

Eligibility for Applied Research

To be eligible, Applied Research projects must:

1. Have a clear and immediate application to Internet freedom Policy and Advocacy, Digital Safety, or Technology.

2. Demonstrate a thorough understanding of the operational risks associated with operating in local environments.

3. Convey how they add to existing research rather than duplicating it.

4. Be transparent about their research methods to allow for third-party validation, peer review, and further investigation.

 

To be eligible, Applied Research programs must NOT:

1. Conduct solely academic research with no plans to safeguard Internet freedom for particular marginalized, vulnerable, or at-risk populations.

2. Engage in theoretical research on security and/or technological challenges that does not specifically address a danger to Internet freedom that has been articulated.

3. Use actively targeted, at-risk, or marginalized populations as research subjects.

 

The following activities, among others, are typically not considered competitive:

1. Research scopes that don’t show a solid foundational understanding of the problem areas.

2. Data/information collection, monitoring, or mapping activities that are not contributing to, collaborating with, or partnering with existing data/information collection, mapping, and tracking projects and cannot articulate how the research under their project is complementary to, and/or different from, those existing projects

 

3. The use of social media monitoring tools or other large-scale data collection without informed consent unless the project can demonstrate a clear commitment and capacity to do so responsibly; a strong technical and operational framework for ensuring the safety and privacy of those being monitored; and a compelling case for why this approach is more useful and would produce more relevant information than more basic research methods that require informed consent.

 

4. Digital security, policy, or technology research that does not directly advance the project’s stated objectives, results, or goals

5. Projects that lack strong prospective security, legal, privacy, ethical, or technical arguments for not making their research methods, data, and/or research outcomes openly and publicly available and accessible.

6. Research that does not take into account how its release could interact both favorably and unfavorably with local contexts that are fragile or high-risk or have other effects on local people’s lives and interests.

7. Data/information gathering, monitoring, or mapping tasks that lack a strategy for guaranteeing the long-term sustainability of the project’s resources.

 

Key Program Considerations:

To assist applicants in creating responsive, comprehensive program proposals, the following list of program considerations is offered.

 

1. Projects must have a plan for long-term sustainability after the grant ends.

2. Projects that establish communities of practice and expertise, which not only include but also elevate stakeholders from local communities, will be given preference.

3. Where appropriate, DRL invites applicants to develop cooperative relationships with local organizations in the target nations and/or regions. Where applicable, candidates are encouraged to join consortia with one lead (“prime”) applicant to submit a combined proposal.

4. DRL strongly encourages applicants to consider building on, collaborating with, or partnering with the creators of existing related research, educational tools, or other resources before producing identical or similar items.

 

5. When working with disadvantaged and vulnerable populations, preference will be given to initiatives that expressly focus on issues relating to those groups and/or work in substantive partnership with organizations or groups made up of or led by members of the populations being supported.

6. DRL works to ensure that its initiatives preserve the dignity and enhance the rights of the most vulnerable and at-risk populations. A clear plan for conducting their work responsibly and safely, the necessary capacity and expertise to carry out that plan, and the ability to respond to emerging risks to the program, implementers, and/or beneficiaries are all requirements for projects that directly engage with or focus on such groups or with activities in repressive environments.

7. The 9 Principles for Use of AI in Government outlined in Executive Order (E.O.) 13960 must be followed in any creation or use of artificial intelligence and/or machine learning.

8. In accordance with administration policy, all peer-reviewed scholarly publications authored by individuals or institutions as a result of research carried out under proposed programs must be made freely and publicly available and accessible by default, without any embargo or delay after publication. Research projects will need to thoroughly justify and account for any constraints or limitations on data access, use, and disclosure.

 

To be eligible, ALL programs must:

1. Clearly address one or more of the Internet Freedom Funding Themes listed above.

To be eligible, ALL programs must NOT:

1. Place an emphasis on digital technologies (such as algorithmic tools, blockchain, virtual reality, the Internet of Things, and facial recognition) without a clear strategic application for human rights online.

2. Include efforts to moderate and/or counter online content unless they explicitly commit to using only methods that do not restrict freedom of expression (such as online self-regulation by users, privacy protection measures, etc.).

3. Include offensive cybersecurity activities, such as hacking and counterattacks.

Within ANY program theme, programs and activities that are typically not considered competitive include, but are not limited to:

1. Activities that exceed an organization’s demonstrated competence or for which the applicant fails to provide evidence of its capacity to conduct those activities safely, responsibly, and with the intended impact;

2. Initiatives that aim to advance Internet freedom in particular countries or regions but have no concrete real-world impact.

3. Geographically or locally targeted initiatives that fail to explain how their solutions specifically address or are made to account for the particular requirements, dangers, difficulties, use cases, and cultural settings of their target audiences.

4. Programs that do not clearly consider how the context may affect the program’s efforts, how the program may positively or negatively change the local context, and how the program may otherwise affect local human lives and interests, particularly those that target at-risk, marginalized, or vulnerable populations.

5. The large-scale collection of personal information without the subjects’ knowledge, using social media monitoring tools or other methods. Projects that propose this must present a strong technical and operational framework for ensuring the safety and privacy of those being monitored, as well as a convincing argument for why this approach is more useful and would produce more relevant information than more basic research methods that require informed consent.

6. Initiatives aimed at enhancing the physical infrastructure of the Internet or removing first-order obstacles to accessing it (i.e., the physical availability and inherent quality of network connections independent of deliberate government interference or targeted repression).

7. The development of new educational or informational resources that are unable to express how they complement and do not duplicate similar ongoing and past initiatives. Projects that are unable to achieve this must give a compelling explanation for why they were unable to expand upon, contribute to, revive, update, translate, or locally adapt existing resources to their needs.

8. Projects that aim to establish, produce, or create resources, technology, research initiatives, service delivery mechanisms, or networks with the intention that they will be used long after the project has ended without making concrete efforts in the project design to ensure this sustainable continuity.

9. Initiatives that do not intend to make the materials, tools, and/or research results they produce freely and widely accessible. For their availability to be restricted, these projects must have strong prospective security, legal, privacy, ethical, or technical justifications.

10. Initiatives that take a national rather than a regional or international perspective.

11. Independent campaigns for public awareness.

12. Study abroad trips, fellowships, and exchange programs.

 

 

Additional Information Link for Annual Program Statement for Internet Freedom Round 1 Grant (From $500,000 – $3,000,000)

More Information

Link to Opportunity in SAMS Domestic

 

To apply, CLICK HERE (you will first have to create an account).

 

Contact information: If you have trouble accessing the full notice online, please contact:

 

InternetFreedom@state.gov


 
