Product Security Tooling Landscape and Inventory
Audience: AppSec, DevSecOps, platform security, directors building a program from scratch
Use this page when: you want a broad map of the tooling universe across AppSec, DevSecOps, CI/CD, Kubernetes, cloud, API, SDL, secrets, and runtime.
What this page is and is not
This page is a practical map of the tool categories that appear most often in real Product Security programs, with a companion workbook that lists a broader inventory of tools.
It is not a recommendation to buy every tool.
Companion workbook
The workbook groups tools by:
- primary domain;
- primary task;
- vendor / maintainer;
- open source vs commercial;
- current vs legacy / historical relevance;
- official site.
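The workbook fields above can be modeled as a simple record type. This is a minimal sketch, not the workbook's actual schema; the class name, field names, and the example entry are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ToolEntry:
    """One row of the companion workbook (hypothetical schema)."""
    name: str
    primary_domain: str   # e.g. "AppSec", "Cloud", "Kubernetes / containers"
    primary_task: str     # e.g. "SAST", "CSPM", "admission control"
    vendor: str           # vendor or maintainer
    open_source: bool     # open source vs commercial
    current: bool         # current vs legacy / historical relevance
    official_site: str

# Hypothetical example entry -- not a real tool.
entry = ToolEntry(
    name="ExampleScanner",
    primary_domain="AppSec",
    primary_task="SAST",
    vendor="Example Org",
    open_source=True,
    current=True,
    official_site="https://example.org",
)
```

Keeping the inventory in a typed structure like this makes it easy to filter by domain, licensing, or legacy status when reviewing coverage.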
How to use the tool inventory well
Start with control needs, not product names
Ask:
- what risk or workflow do we actually need to improve?
- is the answer a default/platform control rather than another dashboard?
- do we already have overlapping coverage somewhere else?
- who will own the findings, evidence, or policy outputs?
A healthy stack usually looks like this
- 1–2 code and dependency analysis layers
- 1 source-control baseline and secret-scanning layer
- 1 CI/CD trust and evidence layer
- 1 cloud posture and identity layer
- 1 Kubernetes admission / runtime layer
- 1 secrets / key-management layer
- 1 vulnerability intake / exception / evidence workflow
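The layer counts above can be turned into a quick coverage check. This is an illustrative sketch only: the layer keys and the `coverage_gaps` helper are hypothetical names, and the recommended bands simply restate the list above.

```python
# Recommended (min, max) tool count per stack layer, mirroring the list above.
RECOMMENDED = {
    "code_and_dependency_analysis": (1, 2),
    "source_control_and_secret_scanning": (1, 1),
    "cicd_trust_and_evidence": (1, 1),
    "cloud_posture_and_identity": (1, 1),
    "kubernetes_admission_runtime": (1, 1),
    "secrets_key_management": (1, 1),
    "vuln_intake_exception_evidence": (1, 1),
}

def coverage_gaps(stack: dict) -> list:
    """Return layers that are missing or over-provisioned in a given stack."""
    gaps = []
    for layer, (lo, hi) in RECOMMENDED.items():
        n = stack.get(layer, 0)
        if n < lo:
            gaps.append(f"{layer}: missing ({n} < {lo})")
        elif n > hi:
            gaps.append(f"{layer}: redundant ({n} > {hi})")
    return gaps
```

For example, a stack with only two SAST/SCA tools and nothing else would report six missing layers, which is exactly the "too many scanners, no coverage" failure mode described next.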
A weak stack usually looks like this
- too many scanners;
- no ownership;
- no deduplication;
- no risk model;
- no release integration;
- no evidence path for audits or customers.
Example categories in the workbook
| Domain | Examples of tool classes |
|---|---|
| AppSec | SAST, SCA, secret scanning, DAST, IAST/RASP, fuzzing, API testing |
| DevSecOps / CI/CD | pipeline policy, artifact signing, provenance, release evidence, repo governance |
| Cloud | CSPM, CIEM, CWPP, IaC scanning, guardrails, logging/evidence |
| Kubernetes / containers | image scanning, admission control, policy enforcement, runtime detection, service mesh |
| Data / identity / secrets | vaults, KMS/HSM, service identity, access review, session recording |
| Assurance / governance | ASOC/ASPM, exception workflow, maturity models, audit evidence |
| Learning / validation | labs, training platforms, local test apps, intentionally vulnerable apps |
Selection reminders
- prefer controls that reduce manual work or remove repeated failure modes;
- prefer tools that integrate into developer workflows and release gates;
- avoid buying a new tool when a platform default or open-source control covers the gap;
- keep legacy tools documented if they still explain historical findings or older customer evidence.
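The first three reminders can be expressed as a simple adoption gate. This is a hedged sketch: the function and the candidate attribute names are hypothetical, chosen only to mirror the bullets above.

```python
def should_adopt(candidate: dict) -> bool:
    """Hypothetical gate reflecting the selection reminders above:
    adopt only tools that reduce manual work, integrate with
    developer workflows / release gates, and are not already
    covered by a platform default or open-source control."""
    return (
        candidate.get("reduces_manual_work", False)
        and candidate.get("integrates_with_release_gates", False)
        and not candidate.get("covered_by_platform_default", False)
    )
```

A real evaluation would weigh more factors (cost, ownership, evidence outputs), but even this coarse filter screens out tools that only add another dashboard.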
Suggested cross-links
- DevSecOps Toolchain – Practical Map, Legacy vs Current
- ASOC and ASPM Orchestration Platforms
- Semgrep / CodeQL / SonarQube Positioning
- Runtime Security / Detection / Incident Response / Resilience – Operating Model and Product Map
References
- NIST SSDF – https://csrc.nist.gov/pubs/sp/800/218/final
- OWASP SAMM – https://owasp.org/www-project-samm/
- OpenSSF Scorecard – https://www.scorecard.dev/
- SLSA – https://slsa.dev/
Author attribution: Ivan Piskunov, 2026 – Educational and defensive-engineering use.