Deadline: November 4, 2021
Applications are open for the Mozilla Technology Fund Bias and Transparency in AI Awards 2021. The Mozilla Technology Fund supports projects which can expose elements of how artificial intelligence (AI) systems work, in order to reduce bias in and increase the transparency of those systems.
They hope to fund projects which can empower watchdogs (including technologists and journalists) to hold the designers of AI systems accountable and to give them tools to reduce bias and increase transparency.
For years, bias has been shown to be one of the major risks of operating AI systems at scale. Systemic bias and bias embedded in training data result in real-world harms for people whose access to economic opportunity, social interaction, education and banking is increasingly mediated by these systems. Creating more transparent AI systems requires developing transparency and documentation best practices that help unlock accountability, agency, and fairness.
- Through the MTF: Bias and Transparency in AI Awards, they will provide awards of up to $50,000 USD each to open source technology projects.
- Open to all applicants regardless of geographic location or institutional affiliation, except where legally prohibited.
- Mozilla is especially interested in receiving applications from members of the Global Majority or Global South; Black, Indigenous, and other People of Color; women, transgender, non-binary and/or gender-diverse applicants; migrant and diasporic communities; and persons from climate-displaced or climate-impacted communities. They strongly encourage all such applicants to apply.
Eligibility Requirements:
To be considered, applicants must:
- Have a product or working prototype in hand; projects which have not moved beyond the idea stage will not be considered.
- Already have a core team in place to support the development of the project (this team might include software developers working in close collaboration with AI researchers, designers, product/project managers and subject matter experts).
- Embrace openness, transparency, and community stewardship as methodology.
- Make their work available under an open-source license.
What They Are Looking For:
They imagine that the Bias and Transparency in AI Awards will support a variety of software projects (including utilities and frameworks), data sets, tools and design concepts. They will not consider applications for policy or research projects, though software projects which leverage, support or amplify policy and research initiatives will be considered (for example, bias metrics and statistical analyses turned into easy-to-use, easy-to-interpret software implementations). Some example projects they can imagine:
- Projects that help expose elements of how AI systems in consumer technology products work, the bias that may be inherent in them and/or how to mitigate the bias in these systems
- Utilities that help developers understand and identify bias when building datasets and AI systems
- Components that allow developers to provide more transparency to users on the inner workings of AI systems
- Tools to help identify, understand and mitigate the bias inherent in publicly-available datasets
Applications will close on November 4, 2021 at 12pm Eastern Time (UTC-4). Applicants can expect to hear back within eight weeks of submitting an application; email [email protected] with any questions.
For more information, visit MTF.