Secure at the Speed of DevOps – What the State of DevOps Report Tells Us

The 2017 State of DevOps Report (Puppet Labs) shows a strong correlation between IT performance and the speed of change: high performers deploy more frequently and have far shorter lead times for changes.

Organisations are driven by a constant requirement to create and deploy new IT services and applications, with strong emphasis on reducing the time it takes for the development process to run its course. As a result, DevOps teams are encouraged to constantly innovate. First-mover advantage strongly applies in the competitive battle between rival organisations – one that is increasingly waged on a digital battlefront.

Anyone with scarred knees and elbows from childhood bicycle mishaps will have learned that speed in isolation is not without risk. The same applies in IT and DevOps. You don’t have to look far to find a media article about a large enterprise that “should know better” falling victim to a data breach or, if it’s lucky, discovering for itself that it has inadvertently exposed sensitive data. Do I hear someone shouting about compliance breaches?

Typically, such organisations fall foul on security not because their IT function moves quickly, but because of the order in which their development process operates: security comes last. A typical example breaks this down:

Yesterday’s sluggish Dev strategy…

IT teams use DevOps processes to build, deploy and run applications, a growing number of which are destined for public cloud delivery. In modern enterprises, such application development typically follows this process:

  1. Developers download container images (e.g. Docker base images) and use these as templates to build and test their application code.
  2. This code makes its way to continuous integration and delivery (CI/CD) pipelines managed by DevOps teams.
  3. DevOps teams write infrastructure-as-code templates, allowing them to provision the newly created resources on the public cloud platforms intended to host the application.
  4. Security teams then ‘sweep up’ these cloud deployments to identify configuration issues, security vulnerabilities and any other risks within the deployment.

Only then are any issues discovered and reported back to DevOps to be addressed. This results in a long development cycle and slower time to market for new applications – something that keeps C-level Execs up at night. This problem only compounds as the output from DevOps increases.
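The cost of this late feedback can be sketched with a back-of-the-envelope model. The stage names and day counts below are invented for illustration, not taken from the report: a finding discovered only after deployment forces a rerun of every earlier stage, while a finding caught during development costs only a repeat of that stage.

```python
# Toy model of lead time (all numbers are illustrative assumptions).
STAGE_DAYS = {"develop": 5, "ci_cd": 1, "provision": 1, "security_sweep": 2}

def lead_time(findings, found_after_deploy=True):
    """Days until a clean release, given a number of security findings."""
    base = sum(STAGE_DAYS.values())  # one full pass through the pipeline
    if found_after_deploy:
        # Each post-deployment finding triggers a full rework cycle.
        return base + findings * base
    # Caught during development: only the develop stage repeats.
    return base + findings * STAGE_DAYS["develop"]

print(lead_time(findings=2, found_after_deploy=True))   # reactive: 27 days
print(lead_time(findings=2, found_after_deploy=False))  # preventative: 19 days
```

Even in this crude model, the reactive approach loses more than a week per release, and the gap widens as DevOps output (and therefore the number of findings) grows.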

…Accelerated by today’s intelligent security approach.

Prisma Public Cloud, from Palo Alto Networks, makes the above process antiquated. Today’s process can look like this:

  1. API integrations scan container images for vulnerabilities “at source,” as they are built.
  2. Developers can address any vulnerability right away, or utilise more secure image files/templates.
  3. Additional scanning services inspect infrastructure-as-code templates (e.g. AWS CloudFormation, Terraform, YAML manifests) and identify any risks well in advance of production deployment.

The process of securing DevOps has become preventative, rather than reactive. Dev teams can focus on making sure configurations are validated against best practice checks and benchmarks during, not after, development itself.
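To make the idea of a pre-deployment best-practice check concrete, here is a deliberately simplified sketch. The rule and data structures are invented for this example; a real tool such as Prisma Public Cloud applies far richer benchmarks and understands actual template formats.

```python
# Toy "shift-left" check on a parsed infrastructure-as-code fragment:
# block the build if a security-group rule exposes a risky port to the
# whole internet. Rules and port list are illustrative assumptions.
RISKY_PORTS = {22, 3389}  # SSH and RDP

def check_security_group(rules):
    """Return findings for ingress rules open to 0.0.0.0/0 on risky ports."""
    findings = []
    for rule in rules:
        if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") in RISKY_PORTS:
            findings.append(f"port {rule['port']} open to the internet")
    return findings

# A hypothetical template fragment, already parsed into Python dicts.
template_rules = [
    {"port": 443, "cidr": "0.0.0.0/0"},  # public HTTPS: fine
    {"port": 22, "cidr": "0.0.0.0/0"},   # SSH to the world: flagged
]

findings = check_security_group(template_rules)
if findings:
    # Failing the build here keeps the issue out of production entirely.
    print("Build blocked:", "; ".join(findings))
```

Because the check runs against the template rather than the deployed estate, the misconfiguration never reaches production in the first place.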

As a consequence, Security teams are presented with a deployment that already has a much smaller attack surface and are left to focus on production-related security risks.

The net result is significantly improved time to market for applications and a competitive advantage for your business. Your C-Level Execs may even sleep better – and that’s good for everyone!

Speak to Gyrocom today to secure your business “at the Speed of DevOps!”


Posted by Gyrocom

Gyrocom is a network and security company. We support your digital transformation with secure, automated and simple to manage solutions for the data centre, branch office and cloud.