Storage and DevOps: What is required for agile development?
Source: computerweekly.com
DevOps brings continuous and agile application development. Storage has to fit in and make itself just as responsive to development that does away with the inertia of stifling hierarchies.
In the beginning, there were developers. And then software development got harder, more complicated and less straightforward. That resulted in the birth of a new engineering discipline – DevOps.
If you imagine a Venn diagram with software developers as one circle and systems engineers as another, then DevOps is the bit in the middle where they overlap. Engineers working in this niche write code that orchestrates the day-to-day realities of software delivery, such as builds and deployments.
And, interestingly, storage is a fairly major consideration in DevOps.
Before we delve into specific storage concerns, it is worth defining what we mean by DevOps and what a typical DevOps role entails. This is crucial because DevOps is a relatively recent innovation, and so has several interpretations that differ in subtle but meaningful ways.
The term itself is only 10 years old. It was coined in 2009 by Patrick Debois, a Belgian IT consultant who is profoundly influential in this space, and has since authored several well-respected books on the topic. Quite quickly, the term evolved into a veritable philosophy, in turn creating a market that is estimated to be worth $9.41bn by 2023.
What makes DevOps so tricky to define is that it is not, strictly speaking, an exclusively technical discipline. It elevates culture and procedure to a high status, and is heavily influenced by the lean and agile philosophies that were popularised in the 2000s and remain widespread today.
Muddying the water further is the fact that many DevOps engineers regard themselves as a logical bridge between development and operations teams, not necessarily usurping either group, but augmenting them.
Automation integral
So, what do DevOps engineers do? Automation is a huge part of the job. Modern development workflows involve a lot of moving parts, from building software to actually deploying it. By automating these mundane tasks, developers can work faster while ensuring a level of consistency that would be impossible to achieve manually.
Automation is also valuable because it reduces the scope for human error. To achieve it, DevOps engineers use a variety of tools. There are far too many to list exhaustively, but here are a carefully selected few.
There’s Chef, which lets engineers provision cloud resources through simple Ruby-based scripts called “recipes”; Puppet, which focuses on day-to-day configuration and maintenance of servers; and Ansible, an agentless tool that covers similar configuration and orchestration ground.
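To make the idea of scripting away mundane build-and-deploy work concrete, here is a minimal sketch of the kind of glue script a DevOps engineer might write. It assumes the Docker CLI is installed; the image name and registry are placeholders rather than anything from a real environment.

```python
# Minimal sketch of a "mundane task" scripted away: build a container image
# and push it to a registry in one repeatable step.
# Assumes the Docker CLI is available; the image reference is a placeholder.
import subprocess

IMAGE = "registry.example.com/myapp:latest"  # placeholder image reference

def build_and_push(context_dir: str = ".") -> None:
    """Build the image from the given directory and push it to the registry."""
    subprocess.run(["docker", "build", "-t", IMAGE, context_dir], check=True)
    subprocess.run(["docker", "push", IMAGE], check=True)

if __name__ == "__main__":
    build_and_push()
```

Run by hand or from a CI job, the result is the same every time – which is precisely the consistency that manual deployment struggles to deliver.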
DevOps also permits a level of agility that was previously hard to achieve. It allows engineers to push new builds of software to users continuously, rather than following the older, release-driven model of software development.
This agility also manifests itself in QA processes, with DevOps engineers able to take a data-driven and often programmatic look into how applications work, thereby identifying and resolving issues more quickly.
With great power comes great responsibility
Perhaps the most important thing to note is that DevOps teams often work independently and are held in a level of esteem that permits them a lot of autonomy, particularly when it comes to procurement. Put simply, DevOps engineers are now decision-makers.
Part of that shift is because the current generation of software permits it. As the application lifecycle speeds up, so do the tools and practices that underpin it.
This is particularly evident when looking at container platforms like Kubernetes, which can be configured to automatically scale computational resources to meet demand. Building on that are third-party tools, such as the NetApp-developed Trident project, which can automatically scale and provision storage based on real-world conditions.
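The general pattern here is that storage is requested declaratively and fulfilled automatically by a dynamic provisioner. The sketch below, using the official Kubernetes Python client, shows that pattern; the namespace and the storage class name are placeholders for whatever a provisioner such as Trident exposes in a given cluster, and this is not presented as Trident's own API.

```python
# Minimal sketch: requesting storage dynamically from a Kubernetes cluster.
# Assumes the "kubernetes" Python client is installed and a kubeconfig is
# available; "ontap-nas" and the "ci" namespace are placeholders.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig for cluster access

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="build-cache"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="ontap-nas",  # placeholder storage class
        resources=client.V1ResourceRequirements(
            requests={"storage": "20Gi"}  # capacity requested by the pipeline
        ),
    ),
)

# A dynamic provisioner watches for claims like this and creates the backing
# volume, so storage is allocated on demand rather than pre-provisioned by hand.
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="ci", body=pvc
)
```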
And we are increasingly seeing hardware – particularly storage – configurations evolve to keep pace with modern DevOps practices.
Speed, obviously, is tremendously important. After all, the entire DevOps movement is driven by a desperate need for efficiency. Fast flash storage is invariably useful here, such as when moving large Docker images or virtual machines, both of which routinely measure in the gigabytes.
And then there’s efficiency. As DevOps engineers deal with disparate environments, configurations and file sets, they face the difficult challenge of ensuring they are using their scarce resources in the most effective way possible. A storage system that supports copy data management (CDM) natively is therefore hugely helpful.
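One way to picture what copy data management buys you is the snapshot-and-clone pattern: a writable copy of a dataset for a test environment, without a full physical duplicate. The sketch below uses ZFS commands purely as an illustration of that pattern, not any specific CDM product's interface; it assumes a pool named "tank" with a dataset "appdata" already exists and that the script has the privileges to manage it.

```python
# Illustrative copy-data-management style workflow: snapshot a dataset, then
# clone it for testing. Snapshot and clone names are placeholders.
import subprocess

SNAPSHOT = "tank/appdata@ci-run-42"   # placeholder snapshot name
CLONE = "tank/appdata-test-42"        # placeholder clone dataset

# Point-in-time snapshot of the source dataset (near-instant, space-efficient).
subprocess.run(["zfs", "snapshot", SNAPSHOT], check=True)

# Writable clone of that snapshot for the test environment; only changed
# blocks consume new capacity.
subprocess.run(["zfs", "clone", SNAPSHOT, CLONE], check=True)
```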
Finally, automation is vital to the DevOps world, and many storage platforms aimed at this niche place automation front and centre, particularly when it comes to provisioning additional or new resources for an application.
Storage-focused DevOps
When examining how storage fits into the DevOps world, it is worth remembering that the DevOps discipline is merely a consequence of current fast-paced software development practices, where new updates are released each day, rather than every month.
This will be a bitter pill for many organisations to swallow, because it means changing how things work. As mentioned above, the DevOps movement is as much cultural as it is technical, which means decentralising decision-making to empower those working on the front line. Facilitating that usually requires significant architectural change.
That is what makes DevOps so fundamentally interesting. From a storage perspective, it is not something that can be accomplished simply by acquiring new assets, but rather by thinking about how to optimise and democratise existing infrastructure in the most agile way possible. Although faster storage is always welcome, it is arguably more important to look at how existing assets can be integrated with other tools, using scripted workflows and application programming interfaces (APIs).
As you might expect, there is no shortage of suppliers targeting this niche. Pure Storage, for example, talks about how its FlashArray//X is particularly suited to modern DevOps environments. This NVMe-driven box touts pre-built integration with tools such as Docker and Kubernetes, which sit at the heart of how DevOps teams deploy software.
On the cloud front, there are companies like Wasabi, which extols the virtues of its Hot Cloud Storage platform for DevOps engineers. Like other storage platforms, it offers an S3-compatible API that allows those working in the space to integrate the service into their existing infrastructure and methodologies.
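Because the API is S3-compatible, existing S3 tooling can usually be pointed at it simply by overriding the endpoint. A minimal sketch with boto3 follows; the bucket name, object keys and credentials are placeholders, and the endpoint shown is Wasabi's default (region-specific endpoints may differ).

```python
# Minimal sketch: pointing an existing S3-style workflow at an S3-compatible
# cloud store. Assumes boto3 is installed; bucket and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",   # Wasabi's default S3-compatible endpoint
    aws_access_key_id="WASABI_ACCESS_KEY",     # placeholder credential
    aws_secret_access_key="WASABI_SECRET_KEY", # placeholder credential
)

# Upload a build artefact exactly as you would to any S3-compatible store.
s3.upload_file("app-build.tar.gz", "ci-artifacts", "builds/app-build.tar.gz")
```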
Storage-focused professionals should not fear the rise of DevOps, but instead think of it as an opportunity. They are in a position to insert themselves into a discussion about how to make software better by using more consistent and agile workflows, which, in turn, permit faster release cycles.
DevOps emphasises being responsive to change. At its core are technologies that permit that, such as CI/CD (continuous integration/continuous delivery) platforms.
It is therefore crucial that DevOps engineers are empowered to make decisions pertaining to storage, such as adjusting capacity without having to climb the corporate hierarchy.
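What that empowerment can look like in practice is a pipeline step that checks and grows a volume itself, rather than raising a ticket. The sketch below is entirely hypothetical: the endpoint, token and payload are illustrative placeholders, not any vendor's real API.

```python
# Hypothetical sketch: a pipeline step that grows a volume when free space
# drops below a threshold. The endpoint, token and field names are
# illustrative placeholders, not a specific vendor's API.
import requests

ARRAY_API = "https://storage-array.example.com/api/v1"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer PLACEHOLDER_TOKEN"}  # placeholder credential

vol = requests.get(f"{ARRAY_API}/volumes/ci-scratch", headers=HEADERS).json()

# Expand the volume by 20% once it is more than 80% full - no ticket required.
if vol["used_bytes"] / vol["size_bytes"] > 0.8:
    requests.patch(
        f"{ARRAY_API}/volumes/ci-scratch",
        headers=HEADERS,
        json={"size_bytes": int(vol["size_bytes"] * 1.2)},
    ).raise_for_status()
```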
And remember, DevOps engineers are nothing without their tools and APIs. These are what allow them to build the elaborate automated workflows that confer efficiency and predictability. Integrating existing storage architecture with these tools is therefore a must.