Product Security Knowledge Base

🧰 Product Security Tooling Landscape and Inventory

Audience: AppSec, DevSecOps, platform security, directors building a program from scratch
Use this page when: you want a broad map of the tooling universe across AppSec, DevSecOps, CI/CD, Kubernetes, cloud, API, SDL, secrets, and runtime.

What this page is and is not

This page is not a recommendation to buy every tool.
It is a practical map of the tool categories that appear most often in real Product Security programs, with a companion workbook that lists a broader inventory of tools.

Companion workbook

The workbook groups tools by:

  • primary domain;
  • primary task;
  • vendor / maintainer;
  • open source vs commercial;
  • current vs legacy / historical relevance;
  • official site.
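If you mirror the workbook in code, the columns above map naturally onto a small record type. The sketch below is illustrative only: the field names, example tools, and grouping helper are assumptions, not the workbook's actual schema.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical record mirroring the workbook columns listed above;
# field names are illustrative, not the workbook's real schema.
@dataclass
class ToolEntry:
    name: str
    domain: str        # primary domain
    task: str          # primary task
    maintainer: str    # vendor / maintainer
    open_source: bool  # open source vs commercial
    current: bool      # current vs legacy / historical relevance
    site: str          # official site

def group_by_domain(entries):
    """Group workbook entries by primary domain for a quick coverage view."""
    grouped = defaultdict(list)
    for e in entries:
        grouped[e.domain].append(e.name)
    return dict(grouped)

# Example rows (real tools, but their placement here is illustrative).
inventory = [
    ToolEntry("Semgrep", "AppSec", "SAST", "Semgrep, Inc.", True, True, "https://semgrep.dev"),
    ToolEntry("Trivy", "Kubernetes / containers", "image scanning", "Aqua Security", True, True, "https://trivy.dev"),
]
print(group_by_domain(inventory))
```

A grouping like this makes coverage gaps visible at a glance: any domain with zero entries, or with many overlapping ones, is worth a closer look.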

How to use the tool inventory well

Start with control needs, not product names

Ask:

  1. what risk or workflow do we actually need to improve?
  2. is the answer a default/platform control rather than another dashboard?
  3. do we already have overlapping coverage somewhere else?
  4. who will own the findings, evidence, or policy outputs?
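The four questions above work well as a hard gate before any vendor evaluation starts. A minimal sketch, assuming you track answers as free text; the gating rule (every question answered non-emptily) is an illustrative choice, not a formal policy.

```python
# Hypothetical pre-selection checklist encoding the four questions above.
# The gating logic is illustrative: evaluation only proceeds once every
# question has a non-empty answer.
QUESTIONS = [
    "What risk or workflow do we actually need to improve?",
    "Is the answer a default/platform control rather than another dashboard?",
    "Do we already have overlapping coverage somewhere else?",
    "Who will own the findings, evidence, or policy outputs?",
]

def ready_to_evaluate(answers: dict) -> bool:
    """Return True only when every checklist question has a real answer."""
    return all(answers.get(q, "").strip() for q in QUESTIONS)
```

Blocking evaluation on unanswered questions is deliberate: it forces the "do we need this at all?" conversation to happen before product demos anchor the discussion.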

A healthy stack usually looks like this

  • 1–2 code and dependency analysis layers
  • 1 source-control baseline and secret-scanning layer
  • 1 CI/CD trust and evidence layer
  • 1 cloud posture and identity layer
  • 1 Kubernetes admission / runtime layer
  • 1 secrets / key-management layer
  • 1 vulnerability intake / exception / evidence workflow
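The layer list above can double as an automated coverage check over your inventory. This is a sketch under stated assumptions: the layer names and target counts simply restate the bullets, and the inventory shape (layer name to list of tool names) is hypothetical.

```python
# Illustrative coverage check against the healthy-stack layers above.
# Target counts mirror the bullet list; the inventory mapping is hypothetical.
HEALTHY_LAYERS = {
    "code and dependency analysis": (1, 2),
    "source-control baseline and secret scanning": (1, 1),
    "CI/CD trust and evidence": (1, 1),
    "cloud posture and identity": (1, 1),
    "Kubernetes admission / runtime": (1, 1),
    "secrets / key management": (1, 1),
    "vulnerability intake / exception / evidence workflow": (1, 1),
}

def stack_gaps(inventory: dict) -> list:
    """Return (layer, count) pairs whose tool count falls outside the healthy range."""
    gaps = []
    for layer, (lo, hi) in HEALTHY_LAYERS.items():
        n = len(inventory.get(layer, []))
        if not (lo <= n <= hi):
            gaps.append((layer, n))
    return gaps
```

Running this periodically catches both failure modes: missing layers (count below the floor) and scanner sprawl (count above the ceiling).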

A weak stack usually looks like this

  • too many scanners;
  • no ownership;
  • no deduplication;
  • no risk model;
  • no release integration;
  • no evidence path for audits or customers.
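Of the weaknesses above, missing deduplication is the easiest to fix in code: most overlapping scanner output collapses once findings are keyed on a stable fingerprint. A minimal sketch; the key fields (rule, path, line) are a common minimal choice, not a standard, and the finding dict shape is hypothetical.

```python
# Hypothetical finding fingerprint for deduplicating overlapping scanner output.
# Keying on (rule, path, line) is an illustrative minimal choice; real
# pipelines often add code snippets or hashes to survive line shifts.
def fingerprint(finding: dict) -> tuple:
    return (finding["rule"], finding["path"], finding["line"])

def dedupe(findings: list) -> list:
    """Keep the first occurrence of each fingerprint, preserving order."""
    seen, unique = set(), []
    for f in findings:
        key = fingerprint(f)
        if key not in seen:
            seen.add(key)
            unique.append(f)
    return unique
```

Even this crude key removes the most demoralizing noise: the same issue reported by two scanners, or by the same scanner on every pipeline run.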

Example categories in the workbook

  • AppSec: SAST, SCA, secret scanning, DAST, IAST/RASP, fuzzing, API testing
  • DevSecOps / CI/CD: pipeline policy, artifact signing, provenance, release evidence, repo governance
  • Cloud: CSPM, CIEM, CWPP, IaC scanning, guardrails, logging/evidence
  • Kubernetes / containers: image scanning, admission control, policy enforcement, runtime detection, service mesh
  • Data / identity / secrets: vaults, KMS/HSM, service identity, access review, session recording
  • Assurance / governance: ASOC/ASPM, exception workflow, maturity models, audit evidence
  • Learning / validation: labs, training platforms, local test apps, intentionally vulnerable apps

Selection reminders

  • prefer controls that reduce manual work or remove repeated failure modes;
  • prefer tools that integrate into developer workflows and release gates;
  • avoid buying a new tool when a platform default or open-source control covers the gap;
  • keep legacy tools documented if they still explain historical findings or older customer evidence.

Author attribution: Ivan Piskunov, 2026. Educational and defensive-engineering use.