#NeotysPAC 2020 - Todd DeCapua
Nov 4, 2020

How to aggregate and correlate data into informed decisions by leveraging all of your data collectors and automating your pipeline for speed, quality, and low cost, through this proven and innovative practice.

Discover how to deliver the highest-quality code at the highest speed and the lowest cost by automating and optimizing your path to production through your pipelines, and by aggregating and correlating data across all of your ‘collectors’.

The challenge today comes from the evolution and explosion of tools (data collectors) used throughout the code pipeline. Each is a point solution that becomes a collector of data interrogated in its own silo, one among many, crippling the quality gates, automation, and speed we set out to achieve.
An innovative approach and proven practice observed in some organizations is to treat each of these point solutions as the collector it is: harvest and normalize the data, apply AI / ML / PA, etc. against it, and make it available to endpoints (human and machine) so they can make informed decisions and take action, as sketched below.
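To illustrate the normalization step, here is a minimal sketch in Python. The field names and the common event schema are illustrative assumptions, not a published standard; the point is simply that every collector's payload is reduced to one shape before any analytics run against it.

```python
from datetime import datetime, timezone
from typing import Any, Dict

def normalize_event(source: str, raw: Dict[str, Any]) -> Dict[str, Any]:
    """Map a collector-specific payload onto a shared (hypothetical) event schema."""
    return {
        "timestamp": raw.get("time", datetime.now(timezone.utc).isoformat()),
        "source": source,                          # e.g. "neoload", "apm", "security-scan"
        "pipeline_id": raw.get("build_id", "unknown"),
        "metric": raw.get("metric_name") or raw.get("check"),
        "value": raw.get("value"),
        "status": raw.get("status", "info"),       # pass / fail / info
        "raw": raw,                                # keep the original payload for drill-down
    }

# Two very different point solutions reduced to the same shape:
load_test = normalize_event("neoload", {"metric_name": "avg_response_time_ms",
                                        "value": 412, "status": "pass", "build_id": "1234"})
security = normalize_event("security-scan", {"check": "dependency-cve",
                                             "value": 3, "status": "fail", "build_id": "1234"})
print(load_test["metric"], security["metric"])
```

Once every collector feeds this one shape, downstream AI / ML and alerting only ever have to query a single schema instead of one silo per tool.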
The innovative framework we will demo, shows how you can maximize your quality and speed and minimize your cost, so to achieve the objectives defined from your activities in Agile / DevOps / SRE / DevSecOps, etc. We will show you how you can call an automated performance test in Neoload Cloud from your build pipeline, which will then kick off and execute and store the results, then have this full payload automatically sent into Splunk, so you can visualize and alert / notify on that data including: Load / Performance, APM, Infrastructure, Application, Security, and other results.
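To make that flow concrete, here is a minimal Python sketch of the pipeline step: trigger a test run, then forward the stored results to Splunk's HTTP Event Collector. The NeoLoad endpoint path, environment variable names, and result fields are illustrative assumptions rather than the documented NeoLoad Web API; only the Splunk HEC call (POST to /services/collector/event with an `Authorization: Splunk <token>` header) follows the standard HEC contract.

```python
import os
import requests

# Placeholder URLs and tokens -- substitute your own NeoLoad Web / Splunk details.
NEOLOAD_API = os.environ.get("NEOLOAD_API", "https://neoload-api.example.com/v1")
NEOLOAD_TOKEN = os.environ["NEOLOAD_TOKEN"]
SPLUNK_HEC = os.environ.get("SPLUNK_HEC", "https://splunk.example.com:8088/services/collector/event")
SPLUNK_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]

def run_performance_test(project_id: str, scenario: str) -> dict:
    """Trigger a load-test run via a hypothetical NeoLoad Web REST endpoint
    and return the stored result payload."""
    resp = requests.post(
        f"{NEOLOAD_API}/projects/{project_id}/runs",   # endpoint name is illustrative
        headers={"accountToken": NEOLOAD_TOKEN},
        json={"scenario": scenario},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. {"testId": "...", "status": "...", "statistics": {...}}

def send_to_splunk(event: dict, sourcetype: str = "neoload:results") -> None:
    """Forward the full result payload to Splunk's HTTP Event Collector."""
    resp = requests.post(
        SPLUNK_HEC,
        headers={"Authorization": f"Splunk {SPLUNK_TOKEN}"},
        json={"event": event, "sourcetype": sourcetype},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    results = run_performance_test(project_id="my-project", scenario="smoke-load")
    send_to_splunk(results)
```

In a real build pipeline this script would run as a post-test stage, so every run's payload lands in Splunk automatically and becomes available for dashboards, alerts, and quality gates.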
Looking forward to sharing this with you, so you can learn it, apply it in your environment(s) tomorrow, and carry it forward to better yourself personally and professionally.