Key cybersecurity takeaways from the 2026 NDAA

AI and machine learning security and procurement requirements: Recognizing that AI now underpins everything from battlefield planning to intelligence analysis, the bill introduces sweeping requirements to safeguard these systems from emerging digital threats. The NDAA spells out a spate of policy and procurement practices the military must meet regarding artificial intelligence and machine learning (ML).

First, the DoD, in consultation with other federal agencies, has 180 days after the date of enactment to develop and implement a department-wide policy for the cybersecurity and associated governance of AI and ML systems and applications, as well as the AI and ML models used in national defense applications.

The policy must protect against security threats to AI and machine learning, including model serialization attacks, model tampering, data leakage, adversarial prompt injection, model extraction, model jailbreaks, and supply chain attacks. It also must employ cybersecurity measures throughout the life cycle of systems using artificial intelligence or machine learning.

Moreover, the policy must reflect the adoption of industry-recognized frameworks to guide the development and implementation of AI and ML security best practices. Likewise, it must follow standards for governance, testing, auditing, and monitoring of AI and ML systems to ensure their integrity and resilience against corruption and unauthorized manipulation.

Finally, the AI and machine learning policy must accommodate training requirements for the department's workforce to ensure personnel are prepared to identify and mitigate vulnerabilities specific to AI and ML.

The bill further spells out physical and cybersecurity procurement requirements for AI and machine learning systems.
It specifies that the defense secretary must develop a framework for implementing cybersecurity and physical security standards and best practices for AI and ML technologies to mitigate the risks those technologies pose to the department.

The NDAA specifies that the framework must cover all relevant aspects of AI and ML system security, including: risks posed to and by the DoD workforce, including insider threats; training and workforce development requirements for AI security awareness; AI-specific threats and vulnerabilities; professional development and education; supply chain threats (including counterfeits); tampering risks; unintended exposure or theft of AI systems or data; security management practices; and more.

It also requires the framework to draw on existing frameworks, including the NIST Special Publication 800 series and existing DoD frameworks such as the Cybersecurity Maturity Model Certification framework.

Finally, under the legislation, the framework must prioritize the most highly capable AI systems that may be of highest interest to cyber threat actors, based on risk assessments and threat reporting, and impose security requirements on contractors.

Other AI provisions under the NDAA require the DoD to revise the mandatory cybersecurity training for members of the Armed Forces and civilian employees of the department to include content on the unique cybersecurity challenges posed by artificial intelligence.

The bill further says that by April 1, 2026, the DoD must establish a task force on AI sandbox environments to identify, coordinate, and advance department-wide efforts to develop and deploy the AI sandbox environments needed to support experimentation, training, familiarization, and development across the military.
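To make one of the threat classes above concrete: "model serialization attacks" typically refers to malicious payloads embedded in serialized model files, which can execute code when loaded. The NDAA does not prescribe any specific mitigation; the following is a minimal, self-contained Python sketch (using only the standard library, not any DoD-specified tooling) of why pickle-based model formats are dangerous and how a restricted loader can reject code-bearing payloads.

```python
import io
import pickle

# A stand-in for a tampered model file: unpickling it would invoke
# an attacker-chosen callable via __reduce__.
class MaliciousModel:
    def __reduce__(self):
        return (print, ("payload executed during model load",))

blob = pickle.dumps(MaliciousModel())  # what a poisoned "model file" contains

# Mitigation sketch: an unpickler that refuses to resolve any global,
# so payloads that reference callables (the attack vector) fail to load.
class SafeUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

try:
    SafeUnpickler(io.BytesIO(blob)).load()
    loaded = True
except pickle.UnpicklingError:
    loaded = False

print(loaded)  # the restricted loader rejects the payload
```

In practice, defenses along these lines favor data-only model formats (weights without executable objects) over pickle; the sketch only illustrates the attack surface the legislation is pointing at.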
