Comment on HUD Work Requirements and Term Limits Rule

Date: April 1, 2026
Submitted to: Department of Housing and Urban Development
Docket: FR-6520-P-01, RIN 2501-AE15
Organization: The Scaffold Initiative

Executive Summary

The Scaffold Initiative submits this comment on HUD's proposed rule establishing flexibility for Public Housing Agencies and certain Multifamily Housing owners to implement work requirements and term limits. We make two recommendations: first, that the Department explicitly enumerate AI literacy training as a qualifying work activity under the proposed rule's supportive services framework, ensuring residents are prepared for the workforce as it actually exists; and second, that the Department require any automated systems used to monitor work requirement compliance or determine exemptions to be audited for discriminatory outcomes, with notice to residents and human review before any termination of assistance.

SUBJECT: Public Comment on Proposed Rule — Establishing Flexibility for Implementation of Work Requirements and Term Limits

Submitted by: The Scaffold Initiative | thescaffoldinitiative.org | policy@thescaffoldinitiative.org

Submitted to: Office of General Counsel, Regulations Division, Department of Housing and Urban Development, 451 7th Street SW, Room 10276, Washington, DC 20410-0500

Date: April 1, 2026

Re: Docket No. FR-6520-P-01, RIN 2501-AE15


Dear Secretary Turner:

The Scaffold Initiative respectfully submits this comment in response to the proposed rule published at 91 FR 10016 on March 2, 2026, which would establish flexibility for Public Housing Agencies and certain Multifamily Housing owners to implement work requirements and term limits.

The Scaffold Initiative is a Wyoming 501(c)(4) social welfare organization (EIN 41-4911679) dedicated to AI literacy and workforce development for sole proprietors, freelancers, and independent workers in underserved communities.

We write to make two recommendations: first, that the Department explicitly enumerate AI literacy training as a qualifying work activity under the proposed rule's supportive services framework; and second, that the Department require any automated systems used to monitor work requirement compliance or determine exemptions to be audited for discriminatory outcomes.

I. AI Literacy Training Should Be an Enumerated Qualifying Work Activity

The proposed rule provides that housing providers implementing work requirements must offer supportive services, and it defines qualifying work activities broadly. In addition to employment, education, and job training, the rule includes a catch-all category encompassing “any other services and resources, including case management, optional services, and specialized services appropriate to assist eligible families to achieve economic independence and self-sufficiency.”

We urge the Department to explicitly enumerate AI literacy training as a qualifying work activity. AI literacy is not a niche skill — it is a prerequisite for workforce participation in 2026 and beyond.

The Joint Center for Political and Economic Studies' State of the Dream 2026 report documents the scale of the challenge: Black unemployment surged to 7.5% by December 2025, and 271,000 federal jobs were eliminated, disproportionately affecting Black workers who comprise 19% of the federal workforce but only 13% of the overall labor force.1 The technology skills gap compounds this displacement. Workers without AI literacy are increasingly locked out of entry-level positions that now require familiarity with AI-powered tools — from logistics and customer service to healthcare administration and retail management.

The National Urban League's Entrepreneurship Centers, led by Senior Vice President for Programs Cy Richardson, integrate workforce development with entrepreneurship support across thirteen locations nationwide. These centers demonstrate that housing stability and workforce readiness are mutually reinforcing: residents with stable housing are more likely to pursue training, launch businesses, and achieve self-sufficiency.2 The proposed rule's stated goal is self-sufficiency. AI literacy training directly serves that goal.

If the Department does not enumerate AI literacy training as a qualifying activity, the risk is that individual PHAs will exercise the proposed rule's flexibility inconsistently — some recognizing AI literacy as qualifying, others requiring traditional job search activities that do not prepare residents for the labor market they actually face.

II. Algorithmic Compliance Monitoring Must Be Audited for Bias

The proposed rule does not specify how PHAs and housing owners should verify compliance with work requirements. In practice, compliance monitoring at scale will inevitably involve automated systems: databases tracking employment verification, algorithmic tools cross-referencing reported work hours with third-party data, and automated eligibility determination engines.

The Department's own May 2024 guidance on algorithmic tenant screening recognized that AI and algorithmic tools in housing contexts can result in discrimination. That guidance documented how automated screening systems relying on credit scores, eviction records, and criminal background data disproportionately affect people of color and individuals with disabilities.3

The same risks apply to work requirement compliance monitoring. If a PHA uses an automated system to verify whether a resident has met their 40-hour weekly work requirement, and that system relies on incomplete employment databases, gig economy records that undercount informal work, or algorithmic determination of exempt status, the system's errors will disproportionately harm the populations the proposed rule exempts from work requirements — persons with disabilities, caretakers, and pregnant individuals.

Dr. Joy Buolamwini, founder of the Algorithmic Justice League, has documented through peer-reviewed research and her book Unmasking AI how automated systems in housing, employment, and government services can “excode” individuals — systematically excluding them through algorithmic decisions that encode existing biases. Buolamwini's foundational “Gender Shades” research demonstrated error rates of up to 34.7% for darker-skinned females in commercial AI classification systems, compared to 0.8% for lighter-skinned males.4 The Algorithmic Justice League's dedicated research on AI in housing has further documented how “Landlord Tech” tools automate unfair practices in tenant screening, pricing, and property monitoring.5

We recommend that the final rule include a requirement that any automated system used by PHAs or housing owners to monitor work requirement compliance, determine exemptions, or enforce term limits must:

  1. Be subject to periodic audit for disparate impact on the basis of race, disability status, familial status, and other protected characteristics;
  2. Provide residents with notice that automated tools are being used and an opportunity to challenge automated determinations; and
  3. Maintain human review as a final decision-making step before any termination of assistance based on noncompliance.
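To make the audit in item 1 concrete: one widely used screening statistic for disparate impact is the adverse impact ratio associated with the EEOC's "four-fifths rule" (29 CFR 1607.4(D)). The sketch below is purely illustrative of that statistic as applied to automated compliance determinations; the group labels, sample data, and use of the 0.8 threshold are assumptions for illustration, not requirements of the proposed rule or of HUD guidance.

```python
# Illustrative sketch: adverse impact ratio ("four-fifths rule") applied to an
# automated work-requirement compliance system's determinations. Group labels,
# data, and the 0.8 review threshold are assumptions, not HUD requirements.

def adverse_impact_ratio(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (found_compliant, total_reviewed).

    Returns each group's favorable-outcome rate divided by the highest
    group's rate. Under the four-fifths rule of thumb, ratios below 0.8
    are commonly treated as evidence of possible disparate impact
    warranting further review.
    """
    rates = {g: ok / total for g, (ok, total) in outcomes.items() if total}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical audit data: automated "met work requirement" determinations.
audit = {
    "group_a": (90, 100),  # 90% found compliant
    "group_b": (63, 100),  # 63% found compliant
}
ratios = adverse_impact_ratio(audit)
flagged = [g for g, r in ratios.items() if r < 0.8]  # groups needing review
```

A full audit would of course go further than this single statistic (examining error rates, exemption determinations, and data quality by group), but even a simple periodic check of this kind would surface the kinds of disparities the Department's May 2024 guidance warns about.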

These safeguards are consistent with the Department's own May 2024 guidance on algorithmic tenant screening and with the proposed rule's supportive services framework, which carries an implicit expectation that compliance determinations will be fair and accurate.

III. Conclusion

The proposed rule's stated purpose is self-sufficiency. The Scaffold Initiative supports that goal. We urge the Department to advance it by ensuring that:

  1. AI literacy training is explicitly recognized as a qualifying work activity, so that residents are prepared for the workforce as it actually exists.
  2. Any automated compliance monitoring systems are subject to bias audits, notice requirements, and human review, so that the technology used to implement the rule does not undermine its purpose.

Respectfully submitted,

Ricky Tucker
Executive Director, The Scaffold Initiative
policy@thescaffoldinitiative.org
thescaffoldinitiative.org


References

  1. Joint Center for Political and Economic Studies, State of the Dream 2026, reporting Black unemployment at 7.5% as of December 2025 and documenting 271,000 federal job eliminations with disproportionate impact on Black workers.
  2. National Urban League, Entrepreneurship Centers program, operating thirteen locations with integrated workforce development and entrepreneurship support services. Cy Richardson serves as Senior Vice President for Programs.
  3. U.S. Department of Housing and Urban Development, guidance on the applicability of the Fair Housing Act to algorithmic tenant screening (May 2024), documenting how AI tools in housing can result in discrimination through reliance on incomplete or biased data.
  4. Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of the 1st Conference on Fairness, Accountability and Transparency (FAT), PMLR 81:77-91 (2018).
  5. Algorithmic Justice League, “AI and Housing” research initiative, documenting how automated tools in tenant screening, pricing, and property monitoring encode discriminatory outcomes.

Frequently Asked Questions

Why should AI literacy training be a qualifying work activity?

AI literacy is no longer a niche skill — it is a prerequisite for workforce participation. Workers without AI literacy are increasingly locked out of entry-level positions in logistics, customer service, healthcare administration, and retail management. Enumerating AI literacy training as a qualifying activity ensures residents are prepared for the labor market as it actually exists, directly serving the proposed rule's goal of self-sufficiency.

What risks does algorithmic compliance monitoring pose?

Automated systems used to verify work requirement compliance may rely on incomplete employment databases, undercount gig and informal work, or make algorithmic determinations about exempt status. These errors disproportionately harm the populations the proposed rule exempts — persons with disabilities, caretakers, and pregnant individuals — and can result in wrongful termination of housing assistance.

What safeguards does the Scaffold Initiative recommend for automated systems?

Three safeguards: periodic audits for disparate impact on the basis of race, disability status, and other protected characteristics; notice to residents that automated tools are being used with an opportunity to challenge determinations; and human review as a final decision-making step before any termination of assistance based on noncompliance.

How does this connect to HUD's existing guidance on algorithmic tools?

HUD's own May 2024 guidance on algorithmic tenant screening recognized that AI tools in housing contexts can result in discrimination through reliance on incomplete or biased data. The same risks apply to work requirement compliance monitoring. The recommended safeguards are consistent with that existing guidance.