By the assigned deadline of February 23, 2026, medical societies, companies, health-care systems, and other stakeholders had responded to a request for information from the Department of Health and Human Services (HHS) regarding the use of artificial intelligence (AI) in clinical practice.
The Request
On December 23, 2025, the HHS issued a notice requesting broad public comment on the steps needed to expedite the adoption and use of AI within clinical care.
“Artificial intelligence will be a transformative force for good across America,” said Jim O’Neill, Deputy Secretary of the HHS. “We want to hear from you. Our efforts to accelerate AI adoption must be guided by the real needs and experiences of those developing these tools and delivering care.”
The request for information was directed to anyone who builds, buys, evaluates, uses, or receives care from AI health-care tools, as well as those prevented from doing so by obstacles. In addition to a number of specific questions, the request solicited feedback on regulation, reimbursement, and research and development to determine how AI could be implemented confidently and safely to improve productivity and health outcomes while reducing health-care costs.
Specific questions addressed barriers to adoption and usage, incentivization, research evaluation and testing, and the best approaches to support the future adoption of AI in clinical care.
“Artificial intelligence is powered by data. Data liquidity and the trust patients and providers have in how data moves are essential,” said Thomas Keane, MD, MBA, Assistant Secretary for Technology Policy and National Coordinator for Health IT. “Through our interoperability work, we are designing for both, bringing true data access to patients and enabling AI. We look forward to hearing how these tools can best strengthen care.”
The request demonstrates alignment with the HHS AI Strategy, released in December 2025, and the Department’s “OneHHS” approach to harness and implement AI technologies across all areas of health care and the federal workforce.
ASCO Response
The American Society of Clinical Oncology (ASCO) submitted comments answering each of the HHS’ questions about AI adoption and usage in clinical care, with a focus on oncology practices.
ASCO emphasized the need for a single, unified federal regulatory framework for AI rather than a patchwork of state-led guidance and laws, commenting that the absence of such a framework remains one of the most significant barriers to AI adoption and usage in clinical practice. A unified framework, the organization noted, would promote a more accessible, affordable health-care system across the country while preserving the competitive advantages of AI adoption and protecting against potential risks.
ASCO also noted that gaps in data sharing limit the accuracy of AI systems and the potential benefits of their outputs. The organization recommended incentivizing data sharing across all health-care systems, as was previously done during the adoption of electronic health record systems, to ensure broad adoption and the most robust and reliable AI outputs. It also recommended the use of real-world data and decentralized methods for sharing medical data. To enable such health information sharing, the organization recommended implementing the minimal Common Oncology Data Elements (mCODE) and Fast Healthcare Interoperability Resources (FHIR) standards, as well as creating standards for the transparency, privacy, and oversight of AI tools and technologies.
Overall, ASCO recommended closing gaps in access, care, and knowledge of AI in health care for the benefit of all health-care professionals and patients.
The organization concluded by reiterating that AI is a tool in clinical care, and though it can empower professionals, it ultimately cannot replace clinicians.
COA Response
The Community Oncology Alliance (COA) also published its response to the request for information, announcing the creation of an AI and Digital Transformation Task Force to address the challenges of implementing AI technologies and tools in community oncology settings.
“The digital transformation required to integrate AI into clinical care is substantial, necessitating not only financial investment but also significant time and human resources. Implementing AI solutions involves a steep learning curve and the adaptation of existing workflows, which can overwhelm small practices with limited personnel and financial bandwidth,” said Debra Patt, MD, PhD, MBA, FASCO, President of COA. “Without proper incentives, the risks may outweigh the potential benefits, making it especially challenging for community practices to engage in this essential evolution.”
COA focused on answering several of the HHS’ specific questions about barriers, uncertainty and concerns, and incentivization. The group noted the potential for AI health-care tools and other advancements to remain limited to larger health-care systems, which could undermine broader national goals for access and affordability in cancer care delivery.
The organization recommended financial incentives, reimbursement models, and training programs through the HHS to support the adoption of AI in clinical care. Additionally, COA suggested simplifying the regulatory process for approving and integrating new AI tools in health care.
The group further emphasized the need for transparency to gain the trust of patients and caregivers and to increase the adoption of these tools by health-care providers in the community setting.
ACS CAN Response
The American Cancer Society (ACS) Cancer Action Network (CAN) also highlighted in its response the potential dangers of piecemeal or state-led regulations and laws for AI use in oncology care. Instead, the advocacy group recommended a clear regulatory framework for risk- and evidence-based oversight that would protect patients’ safety and privacy.
ACS CAN encouraged the HHS and the Centers for Medicare & Medicaid Services (CMS) to develop and align regulatory guardrails so that AI tools in clinical care could be transparent and accountable to a regulatory body. The group also recommended that areas of research and development be guided by the National Institutes of Health (NIH).

