After the American and French Revolutions, the Western world economies began to grow – fast. The first big demand from the burgeoning middle class was for ready-made fabrics fashioned into all types of clothing, bedding, sails for shipping, towels, etc.  Cotton was a versatile material that could satisfy many needs, and it was also comfortable. But production was limited because of the manual processes of the time, until the advent of the cotton gin.

The cotton gin, invented by Eli Whitney in 1793, helped usher in the industrial era of textile production. It made the processing of raw cotton roughly 40 times more efficient, and more cotton than ever became available. Manual spinners and weavers, unable to keep up with this huge supply or with the growing demand for cotton fabrics, gave way to increasingly automated spinning and weaving machines. People began to think differently about meeting production needs and applied automation to other industries as well, leading to many new devices, with the automobile as one of the crowning achievements.

In today’s environment, software development is growing – fast. Startup companies are competing with established enterprises across all verticals (not just in hi-tech, where the stories of garage-based giants like HP, Apple, and Microsoft emerged in the last century). The internet has leveled the playing field, and software can be delivered to the masses at relatively low entry cost, especially on mobile devices. Just like cotton in the early 19th century, requirements for new software are building up at an astonishing rate. For the current spinners and weavers of applications, environments, and production support, these accelerating requirements have pushed the need for lean, agile development cycles that span weeks instead of years, as well as a high degree of integration across multiple (and potentially geographically diverse) parallel teams. DevOps is intended to help meet these demands and accelerate the cycle.

Is DevOps a “revolution” in the building of software? Automation is what software already provides for many business processes other than software development. Is it too much to ask that the ability to build, provision, change, and track the way that software is developed and deployed should be automated as well?

Continuous Build → Continuous Deployment → Continuous Operations

If you are adopting lean, agile development and delivery techniques, you will eventually need some level of DevOps in your teams. If you are moving into the cloud, DevOps should be a Day One capability for your cloud environment. If you are trying to keep pace with disruptors in your industry (e.g., fintech startups, “born on the web” enterprises), you will need to deploy DevOps the way that they do.
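As a minimal illustration of the continuous build → deploy → operate loop, the stages can be sketched as chained steps that each gate the next. The stage names and checks below are hypothetical and not tied to any particular CI/CD product:

```python
# Minimal sketch of a build -> deploy -> operate pipeline.
# Stage names and health checks are invented for illustration.

def build(commit: str) -> dict:
    """Compile, run unit tests, and label the artifact with its commit."""
    artifact = {"commit": commit, "tests_passed": True}
    if not artifact["tests_passed"]:
        raise RuntimeError(f"build failed for {commit}")
    return artifact

def deploy(artifact: dict, env: str) -> dict:
    """Install the labeled artifact into a target environment."""
    return {"env": env, "commit": artifact["commit"], "healthy": True}

def operate(deployment: dict) -> str:
    """Monitor the deployment; roll back automatically on failure."""
    return "ok" if deployment["healthy"] else "rolled-back"

status = operate(deploy(build("abc123"), "production"))
```

The point of the sketch is that each stage consumes the labeled output of the previous one, so a failure anywhere stops the pipeline instead of reaching production.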

There are four major components to be addressed for DevOps:

  1. Application Code (Build) – enable multiple teams to check out and develop modules and/or services independently and in parallel; move source code management check-ins right into well-labeled continuous builds; deploy builds into virtual containers to provide flexible installation and rollbacks across all environments; run unit tests, code style checking, and quality assessments during the build; automate integration and regression tests
  2. Environments / Infrastructure (Run) – size and provision various Development, Quality Assurance, User Acceptance, Stress, and Performance test environments appropriately and quickly (within an hour) and only as needed; grow and shrink Production and Disaster Recovery environments as your business patterns change; auto-detect and flag configuration drift between environments
  3. Data – synchronize changes to your relational and object database schemas, static data, files, collaboration content, documents, and big data repositories - in all stages of development and test - with your application code and software versions; automate archiving and storage tiers
  4. Production Assurance (Manage) – instrument applications consistently; use pattern matching against log files to detect problems before they impact your business; automate fixes for known breaks; standardize monitoring, release management, and upgrades
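The pattern-matching idea in item 4 can be sketched as a small log scanner. The error signatures and log lines below are made up for illustration; a real deployment would load patterns from configuration and stream logs continuously:

```python
import re

# Hypothetical error signatures to watch for.
PATTERNS = {
    "db_timeout": re.compile(r"timeout.*database", re.IGNORECASE),
    "disk_full": re.compile(r"no space left on device", re.IGNORECASE),
}

def scan(log_lines):
    """Return the names of all patterns found, so an automated fix
    can be triggered before users notice the problem."""
    hits = set()
    for line in log_lines:
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                hits.add(name)
    return hits

sample = [
    "2016-05-01 12:00:01 INFO request served in 18ms",
    "2016-05-01 12:00:02 ERROR Timeout while querying database",
]
print(scan(sample))  # -> {'db_timeout'}
```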

You can see in the above graphic that many tools exist for building, running, and managing applications. Infrastructure itself is becoming programmable (“Infrastructure as Code”), just like application code. Whether through “serverless” computing or dynamic management of compute and storage resources, tools such as AWS Lambda and Chef are enabling infrastructure automation. Cloud has been a big enabler: it started out as an R&D, test, and development environment with DevOps constructs, and those constructs are now seen as critical for production environments on the cloud as well, enabling rapid delivery of innovation and business value.
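To make the “Infrastructure as Code” idea (and the configuration-drift detection mentioned in item 2 above) concrete, here is a toy sketch: once the desired state is expressed as data, drift between environments reduces to a diff. The setting names are invented for illustration; real tools such as Chef go further and converge the environment automatically:

```python
# Toy "infrastructure as code": desired state is data, so drift
# between environments can be detected by a simple diff.
DESIRED = {"java_version": "1.8", "heap_gb": 4, "tls": "enabled"}

def detect_drift(desired: dict, actual: dict) -> dict:
    """Return {setting: (expected, found)} for every mismatch."""
    return {
        key: (value, actual.get(key))
        for key, value in desired.items()
        if actual.get(key) != value
    }

qa_reported = {"java_version": "1.8", "heap_gb": 2, "tls": "enabled"}
print(detect_drift(DESIRED, qa_reported))  # -> {'heap_gb': (4, 2)}
```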

While there are many approaches and tools to support creating functionality with application code and provisioning multiple environments and elastic infrastructure, there have been few, if any, tools and methodologies to automate the management of data across development, test, and production.
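As a sketch of what automated data management could look like, here is a minimal versioned-migration runner in Python over SQLite. The table and migration names are hypothetical; the idea is that each schema change is code, checked in with the application and applied exactly once, in order:

```python
import sqlite3

# Ordered, versioned schema changes checked in with the application code.
MIGRATIONS = [
    ("001_create_customers",
     "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email",
     "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply any migrations not yet recorded, in order."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_version")}
    for name, ddl in MIGRATIONS:
        if name not in applied:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_version (name) VALUES (?)", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)   # brings a fresh database fully up to date
migrate(conn)   # safe to re-run: already-applied changes are skipped
```

Because the runner is idempotent, the same script runs unchanged in development, test, and production, keeping schemas synchronized with each software version.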

Ironically, one could argue that agility of innovation matters most in the Big Data analytics and data science space, where new volumes of data must be ingested continuously, ad-hoc data science applied, and the results operationalized under data governance via a structured data warehouse for reporting. These data pipelines need to be built quickly and made available to data scientists. Once the cognitive analytics era takes hold, we will essentially be looking at “continuous analytics” powering transactions, and DevOps will be dominant in that space.

 Benefits of a DevOps approach for Data:

  • Higher productivity: focus more on creating value from your data than on writing database DDL/DML scripts; avoid shutting down environments and holding meetings to coordinate all of the moving deployment pieces between code and data
  • Faster time to market: roll out code, data, and environment changes together (or independently, as required) any day of the week, any time, while driving new solutions to your customers; manage the huge demand for analytics that is coming to every industry
  • Protection of your enterprise: eliminate the human error inherent in performing these activities manually. Mismatched code, data, and environment settings cause project rollout delays and system crashes, which lead to customer dissatisfaction, lower revenues, costs to fix and rebuild, delays to future scheduled initiatives, and financial impacts from incorrect transactions, not to mention the reputational damage
  • Peace and harmony: reduce the finger pointing and blame, within teams and management, over why projects run late, why errors impact schedules, and who put the cotton into the server fan

 The high degree of relevance of DevOps to traditional database activities and new Big Data solutions should raise a call to action across the IT community. At Elevondata, our data-driven frameworks and data lake products follow the above philosophies for efficient SaaS offerings and functionality releases to provide a continuous cycle of data platform improvement to end customers in the quickest and most seamless manner.

Elevondata is a leading-edge data management advisory and data lake solutions company. Steve Rycroft is a Big Data product visionary and strategist, as well as a Senior Advisor to Elevondata clients. Elevondata has DevOps frameworks we can apply to your solutions. Steve has actually seen a working cotton gin in person and can be reached at

DevOps Build, Run, Manage image courtesy of Relevance Labs. Elevondata and Relevance Labs are both Basil Partners portfolio companies.

Author Steve Rycroft