DevSecOps Blind Spots: Research, AI, and Integrations

In the realm of DevSecOps, scholarly and industrial research wields only modest influence.

That is the view put forward by ANSSI, France's national cybersecurity agency. Most of the papers it analyzed concentrate on isolated technical facets: enriching software bills of materials (SBOMs), examining the software supply chain, automating CI/CD, and so on. Holistic models are rarely explored.

Even under regulatory pressure, the field still relies largely on descriptive approaches, with few tests conducted in real-world conditions, the agency notes. This yields fragmented insights rather than a unified S-SDLC framework.


It should be noted that the analysis is not exhaustive: it covered eight publications, some of which were not peer-reviewed. ANSSI acknowledges this and attributes it to “time constraints of the study.” Its selection “remains subject to caution and only partially reflects the state of the art in DevSecOps,” the agency adds.

| Publication | Origin |
| --- | --- |
| Technical Paper: Supply Chain Security | ECSO (European Cyber Security Organisation) |
| An Empirical Study of DevSecOps Focused on Continuous Security Testing | Three Portuguese institutes (INOV, INESC-ID, IST) |
| A Reality Check on SBOM-based Vulnerability Management: An Empirical Study and a Path Forward | King Abdullah University of Science and Technology (Saudi Arabia) |
| Effective Integration of Database Security Tools into SDLC Phases: A Structured Framework | Three Egyptian universities (Nile, Banha, and Mansoura) |
| Integrating DAST in Kanban and CI/CD: A Real-World Security Case Study | Virginia Tech |
| A Practical Guide for Building Robust AI/ML Pipeline Security | OpenSSF (white paper) |
| DevSecMLOps: A Security Framework for Machine Learning | A security engineer at Dish Network (a U.S. television and telecom operator) |
| Software security in practice: knowledge and motivation | Carleton University (Canada) |

AI has not yet beaten false positives

Beyond the current state of research, ANSSI also surveyed the market by classifying AppSec solutions into twelve categories.

| Type of solution | Description |
| --- | --- |
| Static Application Security Testing (SAST) | Identifies vulnerabilities by analyzing source code or binaries without running the application. |
| Dynamic Application Security Testing (DAST) | Detects security issues by testing the running application, simulating external attacks. |
| Interactive Application Security Testing (IAST) | Combines SAST and DAST by instrumenting the application to analyze its behavior in real time. |
| Runtime Application Self-Protection (RASP) | Monitors and protects the application, detecting and blocking threats during execution. |
| Software Composition Analysis (SCA) | Detects vulnerabilities and risks tied to third-party components and open-source software, typically leveraging a software bill of materials (SBOM) as a structured inventory of all components, dependencies, and related risks. |
| IaC (Infrastructure as Code) scanner | Analyzes IaC templates (e.g., Terraform, CloudFormation) to identify misconfigurations and security risks before deployment. |
| Artifact scanner | Inspects compiled artifacts (e.g., containers, binaries) to uncover known vulnerabilities and compliance issues. |
| Secrets detection and management solutions | Store sensitive data securely and detect secrets embedded in code or exposed in pipelines. |
| Threat modeling solutions | Assist in identifying and assessing potential threats during the design phase of the software lifecycle. |
| Artifact signing solutions | Ensure the authenticity and integrity of software artifacts through cryptographic signatures. |
| Application Security Posture Management (ASPM) | Aggregates and prioritizes security findings from multiple tools to provide a consolidated view of the application's security posture. |
| AppSec/DevSecOps training platforms | Offer training focused on secure development practices and embedding security into DevOps workflows. |
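Of these categories, secrets detection is the most direct to illustrate. A minimal sketch, assuming two illustrative rules (an AWS-style access key ID and a generic quoted API key); production scanners combine hundreds of such rules with entropy analysis and live verification of candidate credentials:

```python
import re

# Illustrative detection rules; real scanners ship far larger rule sets.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9]{20,}['\"]"
    ),
}

def scan_text(text: str) -> list[tuple[str, str]]:
    """Return (rule_name, matched_string) pairs found in `text`."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

# Fake credentials for demonstration only.
sample = 'aws_key = "AKIAABCDEFGHIJKLMNOP"\napi_key = "abcd1234efgh5678ijkl9012"'
for rule, value in scan_text(sample):
    print(rule, value)
```

In a pipeline, a tool like this typically runs as a pre-commit hook or CI step and fails the build on any finding.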

ANSSI studied three of these categories in depth (SCA/SBOM, ASPM, and secrets detection/management): on the one hand through interviews with 13 vendors (six of them European), on the other by surveying nine organizations with a questionnaire anchored in the OWASP DevSecOps Maturity Model.

While there is a discernible trend toward “platformization” (ten of the thirteen vendors have adopted it), some integrations remain partial. Between SBOM management and vulnerability detection, for instance, the workflows, data sources, and user journeys often stay separate.
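The gap is easy to picture: an SBOM inventories components, a vulnerability feed lists advisories, and something has to join the two. A hypothetical sketch, assuming a CycloneDX-style SBOM and a hand-rolled advisory index keyed by package URL (purl); real tooling would query a live feed such as a vulnerability database API instead:

```python
import json

# Illustrative CycloneDX-style SBOM fragment (not a real project's SBOM).
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"name": "requests", "version": "2.19.0",
     "purl": "pkg:pypi/requests@2.19.0"},
    {"name": "urllib3", "version": "2.2.2",
     "purl": "pkg:pypi/urllib3@2.2.2"}
  ]
}
"""

# Illustrative advisory index keyed by purl (a stand-in for a real feed).
KNOWN_VULNS = {
    "pkg:pypi/requests@2.19.0": ["CVE-2018-18074"],
}

def vulnerable_components(sbom: dict) -> list[tuple[str, list[str]]]:
    """Join each SBOM component's purl against the advisory index."""
    hits = []
    for component in sbom.get("components", []):
        advisories = KNOWN_VULNS.get(component.get("purl", ""), [])
        if advisories:
            hits.append((component["purl"], advisories))
    return hits

sbom = json.loads(sbom_json)
print(vulnerable_components(sbom))
```

The hard part in practice is not this join but keeping the SBOM fresh and mapping advisory version ranges onto the exact versions the SBOM records.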

As in many other markets, AI is spreading, but maturity varies greatly. ANSSI deems it insufficient to substantially reduce false positives—particularly with SAST—or to improve prioritization after analysis. Organizations are indeed building internal solutions, but progress is slow and the results are modest. They also expect vendors to shift from general-purpose large language models toward more specialized ones.
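Internal prioritization efforts of the kind ANSSI mentions often amount to re-scoring raw findings with context. A toy sketch, in which the fields (`internet_facing`, `likely_false_positive`) and weights are purely illustrative assumptions, not a standard:

```python
def priority_score(finding: dict) -> float:
    """Re-score a raw finding using contextual signals (illustrative weights)."""
    score = finding["cvss"]                    # base severity, 0-10
    if finding.get("internet_facing"):         # exposure raises urgency
        score *= 1.5
    if finding.get("likely_false_positive"):   # triage signal lowers it
        score *= 0.2
    return score

# Two hypothetical findings: a probable false positive with a high base
# score, and a lower-severity issue on an exposed service.
findings = [
    {"id": "F1", "cvss": 9.8, "internet_facing": False,
     "likely_false_positive": True},
    {"id": "F2", "cvss": 6.5, "internet_facing": True},
]
ranked = sorted(findings, key=priority_score, reverse=True)
print([f["id"] for f in ranked])  # prints ['F2', 'F1']
```

Even this toy version shows why context matters: the "critical" F1 drops below the medium-severity but exposed F2 once the signals are applied.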

Remediation also proves challenging, especially for findings from secret scanners and SCA analyses, which can require substantial restructuring of the codebase.
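On the secrets side, even the simplest remediation pattern touches code: replace the hardcoded value with a runtime lookup (and rotate the exposed credential, since it remains in version-control history). A hedged sketch with hypothetical names:

```python
import os

# Before (the kind of line a secret scanner flags):
# DB_PASSWORD = "s3cr3t-in-source"

def get_db_password() -> str:
    """After: resolve the credential from the environment, failing fast."""
    password = os.environ.get("DB_PASSWORD")
    if not password:
        raise RuntimeError("DB_PASSWORD is not set; inject it at deploy time")
    return password

# Simulate the CI/CD runner or orchestrator injecting the value.
os.environ["DB_PASSWORD"] = "injected-at-deploy-time"
print(get_db_password())
```

SCA remediation is usually heavier still: upgrading a vulnerable dependency can pull in breaking API changes, which is where the codebase restructuring comes from.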

Compounding these elements are regulatory frameworks that remain phrased in broad terms. Operational guidance is limited: translating compliance requirements into concrete, actionable practices remains difficult.

Dawn Liphardt

I'm Dawn Liphardt, the founder and lead writer of this publication. With a background in philosophy and a deep interest in the social impact of technology, I started this platform to explore how innovation shapes — and sometimes disrupts — the world we live in. My work focuses on critical, human-centered storytelling at the frontier of artificial intelligence and emerging tech.