
Leveraging artificial intelligence (AI) technologies to support marginalized social groups in the labor market, such as people with disabilities, has become increasingly important, especially given the persistent difficulties these groups face in finding employment. The United Nations asserts that AI has the potential to create more inclusive and accommodating environments, helping to remove the barriers such groups face in the job market.
The United Nations Convention on the Rights of Persons with Disabilities notes that the technological means, support, and services made available to people with disabilities have a significant impact on shaping their reality and future, since disability fundamentally arises from the interaction between individuals and environmental barriers that may hinder their full and effective participation in society on an equal basis with others.
In this context, the Organisation for Economic Co-operation and Development (OECD) addressed these persistent labor-market challenges in a report published in November 2023, analyzing AI's potential to expand job opportunities for people with disabilities, the obstacles that could stand in the way, and what governments can do in this area.
More Prone to Unemployment:
The OECD reported that people with disabilities have long faced persistent challenges in the labor market: in 2019 they were more likely to be unemployed, and their employment rate was 27% lower than that of people without disabilities. The organization warned that poorly managed AI could exacerbate these disparities and leave the talents of people with disabilities untapped.
The organization believes that innovative AI-based solutions represent a shift for the international community comparable to the advent of the internet, given the speed and quality of these solutions, their reliance on vast amounts of input data, and their ability to generate future scenarios. At the same time, it warns that deploying these technologies broadly without the tools and aids that make work accessible to people with disabilities, such as screen readers for visually impaired users or sign language translation for deaf users, could create unfairness in who benefits from AI and deepen the exclusion of people with disabilities from the labor market. In this context, the use of AI to support people with disabilities faces several challenges, including:
- Research and development (R&D) efforts struggle to secure funding, as initial financing is often insufficient. Investors perceive that assistive AI technologies may not yield profitable or guaranteed returns, and the high cost of data collection makes it difficult to obtain the data needed to develop innovative solutions.
- Gaps in developers’ basic training and knowledge of programming and computer science, together with a limited grasp of end-user design, can leave them with a poor understanding of the needs of people with disabilities, resulting in ICT products that are difficult for these users to operate.
- Small and medium-sized enterprises, unlike large companies, struggle to attract the highly qualified talent they need. They also find it hard to secure sustainable long-term funding, since AI solutions are costly for end users and people with disabilities cannot purchase them independently without a fair payment mechanism.
- A lack of awareness among employers of the value of adopting innovative AI-based applications, combined with the absence of genuine end-user participation in the development process, can produce solutions designed in a vacuum: ones that fail to meet actual needs and are disconnected from the ecosystem they are supposed to fit into, including existing solutions, policies, stakeholders, and support systems.
- Users in the disability community, especially the elderly, may lack familiarity with IT and have to repeatedly learn new technologies, leading to frustration. Additionally, the absence of standards that enforce compatibility between systems, software, AI solutions, and other assistive technology can be a barrier. For instance, new solutions might require updates to hardware or software to become usable, making associated costs a hurdle for user acceptance.
Bridging the Employment Gap:
The OECD report highlights that bridging the employment gap for people with disabilities requires attention to AI-based solutions and to the barriers that hold back their development and adoption. Encouraging the development of these solutions could significantly reduce costs, ensure interoperability, provide more sustainable funding, and foster further innovation. The report also notes that while discrimination against sexual and ethnic minorities is widely recognized, disability-based discrimination remains largely absent from the thinking of those shaping the global discourse on ethical and responsible AI.
The organization also addressed the difficulty innovators face in consulting every disability group on their projects, and the fear that this burden might discourage some from making their products accessible to all. Given the desire of people with disabilities to protect their personal data from biased hiring algorithms that systematically violate their right to work, the organization affirmed their right to be interviewed through accessible, innovative methods that let them demonstrate their abilities without risk.
The report highlighted regulatory frameworks that operate in isolation from the interests of people with disabilities and fail to cover all the risks they may face, including exclusion and discrimination even in their private lives. The organization pointed to a perceived disconnect between disability rights policymakers and those working in AI, noting that there is “no effort yet in government corridors to improve the accessibility of people with disabilities to the market using AI,” and that the “idea that AI can assist in employing people with disabilities has not yet taken root within governments.”
Government Actions and Risk Management:
The OECD calls for stronger government intervention, recommending that national AI regulations embed human rights principles and explicit bans on discrimination and rights violations. It argues that preventing risks efficiently means holding AI developers, including those who deploy AI, accountable for any harm, including discrimination. This would encourage deployers to “become more cautious” and to ensure that a product is “safe for marginalized groups,” including people with disabilities, before it is put to use.
The organization stressed the importance of encouraging developers to address risks early, before biased algorithms reach the market; innovators would then have a commercial incentive to make AI products accessible and interoperable and to provide the necessary support when problems arise. It also noted that bringing users with disabilities together with AI developers could help identify risks, support informed decisions, and improve outcomes.
The report emphasized the continued absence of initiatives aimed at enhancing AI literacy. It stressed the importance of policies that incentivize the development of AI-supported solutions, not only by setting standards for protection against risk but also by contributing to inclusive innovation. On the proper allocation of financial resources for AI solutions, the organization sees a need to direct funds to disability rights advocacy groups, which would in turn guide developers in facilitating access for people with disabilities to the job market.
Proposed Responses:
The OECD proposed several measures to put AI solutions into practice and facilitate access to the job market for people with disabilities, including: enacting new policies that explicitly ban AI uses leading to discrimination against people with disabilities, and reviewing liability laws and AI procurement guidelines to give developers and buyers incentives that ensure product safety while upholding the rights of marginalized groups.
Additional measures include making key AI solutions interoperable by default, developing AI-supported remote-working technologies that facilitate work for people with disabilities in fields such as logistics and transportation, and establishing standards for the regular access of people with disabilities to the job market that keep pace with technological and social changes, including new AI technologies.
The organization also suggests directing government-backed capital toward available AI solutions and increasing public funding, and enhancing the participation of people with disabilities in the innovation process through measures that improve their representation and their involvement in setting priorities and requirements for the final product. It further calls for appropriate infrastructure, including easy access to internet services, to broaden AI use while ensuring that end users can afford the costs, and for collaboration with universities to provide training in computer science and innovative design that supports people with disabilities in the job market and helps companies develop specialized solutions to attract AI talent from the disability community.
In conclusion, the OECD emphasized that current government policies worldwide remain too siloed, treating AI solutions and their implementation in the job market as separate issues without considering how they could be combined to serve people with disabilities. It stressed the need for governments to review their policies and future plans in order to address AI-related risks, seize available opportunities, and recognize the capabilities and potential of AI to support people with disabilities in the job market.
Source: OECD Artificial Intelligence Papers, “Using AI to support people with disability in the labour market – opportunities and challenges”, OECD Publishing, November 2023.