DevOps: Standardizing The Journey

https://pixabay.com/en/background-chaos-mess-colorful-2496214/
geralt

Sometimes I come across an article which is narrowly focused, but has applicability throughout the universe.  OK, maybe not that all-encompassing, but almost.  For instance, just the other day I was reading an article entitled “DevOps and the DBA”.  Kind of a narrow focus, but a couple of sentences really stood out for me.

Operations staff take the view that speed invites chaos, resulting in instability, downtime, and a lack of sleep.  The reality is that chaos, instability and downtime are not the result of speed, but the result of variance.

The goal is to enforce consistency, and to manage our resources so that the software is deployed in the same way that a car factory builds an automobile.

Wow.  That kind of sums up the DevOps movement in general.  In order to automate something, you need to standardize something – the tool, the approach, the steps – and the more you standardize the more you can automate and the faster you can automate.  I was attending a Forrester conference call a couple of weeks ago and the presenter was talking about how Amazon has seven different templates to move things into production.  Just seven.  Now, by any measure, Amazon is a big company.  If they can do it, why can’t we?  Why is it that we have to customize all of these deployments for our deployment tool instead of using just a few templates?

But the idea of standardization is not new, nor is it restricted to just deploying applications into production.  The entire software development lifecycle is full of standards, but sometimes not the correct standard.

Back in the good old days, we used Excel for project management.  (Yes, some of you still use Excel, but there are better tools out there.)  It was a standard template for how we built applications.  The template remained the same and we just added line items for the exact pieces that we were building.  If we were building two web pages, one low complexity and one medium complexity, we added two lines into the Design section, two lines in the Build section and we were done.  We told the spreadsheet what complexity the web pages were and it used a lookup table to pull back the numbers for the estimate.  Testing?  It was a set percentage of the Design and Build estimate.  It was standardized.  It was simple.  It was accurate.  We used previously executed projects to populate the lookup table and we had years’ worth of information that we could mine.  By standardizing on a template we were able to make effective use of the data that we accumulated doing the last project and the one before that and the one before that and …
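That template logic is simple enough to sketch in a few lines of code.  The complexity categories, hour values, and 30% testing factor below are illustrative assumptions, not the numbers from the original spreadsheet:

```python
# Lookup table mined from previously executed projects (hypothetical values).
HOURS_BY_COMPLEXITY = {
    "low":    {"design": 4,  "build": 8},
    "medium": {"design": 8,  "build": 16},
    "high":   {"design": 16, "build": 40},
}

TESTING_FACTOR = 0.30  # testing as a set percentage of Design + Build


def estimate(line_items):
    """Sum design/build hours for each line item, then add testing on top."""
    design = sum(HOURS_BY_COMPLEXITY[c]["design"] for c in line_items)
    build = sum(HOURS_BY_COMPLEXITY[c]["build"] for c in line_items)
    testing = (design + build) * TESTING_FACTOR
    return {"design": design, "build": build, "testing": testing}


# Two web pages: one low complexity, one medium complexity.
print(estimate(["low", "medium"]))
```

Change the lookup table as your historical data accumulates and every estimate made with the template improves.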

Now, imagine a world where you could go to a single site and tell it that you are starting a new project.  You give it a name, an acronym, the people on the project and by pushing a button you get:

  • Servers created
  • Active Directory groups created
  • Accounts created
  • Databases Created
  • Project Plan created
  • Automated workflows created

This is possible through standardization.  This is the dream of DevOps, this is the target that we should be striving for.  The fewer decisions that need to be made on mundane things (Where do I go to get servers created?  What databases do I need?  What should I call them?  What security needs to be set up?  What do I need to put in my Project Plan?  When should I get the automation workflows created?) the more effort can be spent on building a system that exceeds everyone’s expectations.
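As a rough sketch, that “push a button” experience is just a standardized entry point that runs the same steps in the same order every time.  Everything here is hypothetical – the step functions stand in for whatever tooling (Terraform, AD scripts, database DDL, a project-management API) your shop actually uses:

```python
# Hypothetical stand-ins for real provisioning tooling.
def create_servers(acronym):      return f"{acronym}-app01 provisioned"
def create_ad_groups(acronym):    return f"{acronym}-devs group created"
def create_accounts(members):     return f"{len(members)} accounts created"
def create_databases(acronym):    return f"{acronym}_db created"
def create_project_plan(name):    return f"plan for {name} created from template"
def create_workflows(acronym):    return f"automated workflows for {acronym} created"


def provision_project(name, acronym, members):
    """One standardized entry point: same steps, same order, every project."""
    return [
        create_servers(acronym),
        create_ad_groups(acronym),
        create_accounts(members),
        create_databases(acronym),
        create_project_plan(name),
        create_workflows(acronym),
    ]


for line in provision_project("Payroll Modernization", "pay", ["alice", "bob"]):
    print(line)
```

The value isn’t in the code; it’s in the fact that there is exactly one way to start a project, so nobody has to decide where to go or what to call things.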

We shouldn’t be lulled into the idea that DevOps is restricted to just the technology pieces of development and operations; it is pervasive in nature and covers the entire SDLC, including the management of projects.  DevOps isn’t a thing; it is a culture shift that is both invasive and inclusive.  It will change how you plan and develop applications, and it requires that information be fed back into the process in order to continually improve and to ensure that the process is advantageous to the organization.

DevOps, from planning to executing to implementing, is about making things easier.  It’s a long road to get there, but the more we look at the entire SDLC the more advantages we can discover on the DevOps journey.

Stop Overthinking

https://pixabay.com/en/thinking-thinking-work-man-face-272677/
geralt

Psychology Today had an article last year entitled “Why Does Overthinking Sabotage the Creative Process” where they stated:

Many scientists believe that the creative process springs as much from the subconscious as it does from a conscious thought process. Most often, creative solutions are not wrestled from your mind through sheer force of will.


Concentration Is Not Overrated

https://commons.wikimedia.org/wiki/File:City,_crowded_office_space_(2899334278).jpg
Wikimedia Commons

Being able to concentrate on a problem is a wonderful thing.  The idea that you can have some peace and quiet and really get your head around a problem is, to me, a blessing.

So why do many organizations screw this up?  Joel Spolsky is the CEO of Stack Overflow.  For those unaware of Stack Overflow, it is a very popular site for developers to go to get answers to questions.  It is the site to go to.  If they don’t have the answer then you have a truly unique problem or you need to use different keywords when searching.  Joel doesn’t like the concept of the open office.  His anecdotal evidence indicates that developers don’t like it either.

APIs May Require A Cultural Change

https://www.flickr.com/photos/flying_cloud/2667225198
Flying Cloud (Flickr.com)

When you call a service you invoke an API (Application Programming Interface).  The idea of an API is old but has survived the introduction of thousands of computer languages and processes because it is a necessary piece of application development.  The advent of cloud computing has raised the status of APIs to a new level.  There are APIs to access services from numerous cloud providers.  These APIs allow you to do almost anything you can imagine, from creating virtual machines (PowerShell on Azure), to exploring the latest news feeds from a company (RSS), to seeing what your favourite celebrity is posting (Twitter, Tumblr, Instagram, etc.).
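For the curious, here is roughly what an API call looks like at the HTTP level, using only Python’s standard library.  The endpoint, payload, and token are made up for illustration; a real cloud provider documents its own URLs, auth scheme, and payloads:

```python
import json
import urllib.request

endpoint = "https://api.example.com/v1/vms"          # hypothetical endpoint
payload = json.dumps({"name": "web01", "size": "small"}).encode("utf-8")

# Build a POST request with a JSON body and a bearer token.
request = urllib.request.Request(
    endpoint,
    data=payload,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",           # placeholder credential
    },
)

# We only build the request here; urllib.request.urlopen(request) would send it.
print(request.get_method(), request.full_url)
```

Every cloud SDK is, underneath, assembling requests like this one for you.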

Data Has Escaped

https://pixabay.com/en/data-binary-one-null-privacy-2248217/
geralt

A few days ago I received an interesting email: it was from the New Mexico Medical Center welcoming me to the Patient Portal.  I was a little confused as I don’t live in New Mexico, have never been to New Mexico and, to the best of my knowledge, have no known relatives in New Mexico.  A little curious, I opened up the first email and started reading:

Dear DANIEL JESSOP,

We are pleased to inform you that online access to your electronic health record is now available through New Mexico Medical’s Patient Portal. The Patient Portal is a secure website that allows you to communicate with your health care provider and to view parts of your electronic health record.  This tool will help you better manage your care and enhance your partnership with your health care team.

OK, so obviously someone put in the wrong email address (mine) instead of theirs and I got sent the welcoming email.  I was happy that they didn’t actually provide any confidential information in the email as that would have been … awkward.  The part I liked was:

For security reasons, the activation code will be sent to you separately.

So the New Mexico Medical Center was going to send me an activation code that I would need to activate my account.  Cool, so they will send it to the correct mailing address and I don’t have to worry about it.  I pop out of this email and … uh oh.  Sitting in my inbox is the Patient Portal Activation Code.  Well, this sucks.  Daniel Jessop created a userID with the name of “djessop1”.  (Wait, there is another djessop other than Daniel in New Mexico?)  I now have the link to activate his account and the activation code necessary to do it.

What do I do?  I can now activate Daniel Jessop’s account with his health care provider and pull out all of the information about him.

Well, I’m a sucker.  I thought that I would be nice.  The note says that if there are any problems with setting things up to give them a call.  So I proceeded to do that.  Multiple times, across multiple days.  In fact, I checked time zones to make sure that I was calling at the correct time.  No one answered the helpline.  I didn’t even get a recorded message saying that they were busy.  I didn’t get voice mail.  I got diddly.

So there are some interesting lessons to be learned here:

  1. If you are setting up an account for someone, make sure that they can actually access the email account that they enter.  They may have made a mistake and you need to take that into account.  Time limiting the activation to xx minutes would help, but you need to do something.
  2. There is no point at all in sending a welcoming email and a separate email with the activation code to the same address.  Either put them into one email or send the activation code via another method, but two emails to the same account?  Completely amateur.
  3. If you have a help number for people to call, you darn well better staff it or at least let the phone go to voicemail.  No response is not adequate, particularly since you are dealing with someone’s electronic health record.
  4. Clean up your website.  (OK, I dissected the link a bit.)  If I go to the root of the activation link I get the default IIS web page.  I now know which operating system and version of IIS they are using.  I know that it is an ASPX page hosted by a third party and that third party has not kept up with security patches.  If you are going to host patient data you better harden your website.
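Lesson 1 is cheap to implement.  Here is a minimal sketch of a time-limited activation code; the 30-minute lifetime and token format are my assumptions, but the point is that validation checks both the code and the clock:

```python
import secrets
import time

ACTIVATION_LIFETIME_SECONDS = 30 * 60   # assumed 30-minute window


def issue_activation(store, user_id, now=None):
    """Create a one-time code and remember when it was issued."""
    now = time.time() if now is None else now
    code = secrets.token_urlsafe(16)
    store[user_id] = (code, now)
    return code


def activate(store, user_id, code, now=None):
    """Accept the code only if it matches and hasn't expired."""
    now = time.time() if now is None else now
    issued = store.get(user_id)
    if issued is None:
        return False
    stored_code, issued_at = issued
    if now - issued_at > ACTIVATION_LIFETIME_SECONDS:
        return False
    return secrets.compare_digest(stored_code, code)


store = {}
code = issue_activation(store, "djessop1", now=0)
print(activate(store, "djessop1", code, now=60))        # within the window
print(activate(store, "djessop1", code, now=31 * 60))   # expired
```

An expired code landing in a stranger’s inbox is an annoyance; a live one, as in my case, is a breach waiting to happen.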

In essence, if you are ever in New Mexico, avoid the Eastern New Mexico Medical Center as your data could end up in Canada.

Test Data

https://www.flickr.com/photos/tt2times/2568645910
Tony Werman (Flickr.com)

Here are some interesting quotes:

You can’t be Agile if you don’t use the right test data

Developers often copy subsets of production data, sometimes anonymizing it but not always and rarely considering whether the test data contains sufficient diversity to exercise all important edge cases. And those are the more mature dev shops!

The quotes come from two different Forrester articles but are representative of dozens of articles about testing and quality assurance.  But this is not just representative of Forrester, it is representative of the IT industry as a whole.  Let’s dissect each piece and see what we come up with.
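A rough sketch of the practice the second quote describes – subset the production data, then anonymize it before it becomes test data.  The records and masking rules here are illustrative; a real shop needs an anonymization policy per field:

```python
import hashlib

# Pretend production data (invented for illustration).
production = [
    {"id": 1, "name": "Daniel Jessop", "email": "djessop@example.com", "balance": 120},
    {"id": 2, "name": "Jane Roe",      "email": "jroe@example.com",    "balance": 0},
    {"id": 3, "name": "John Doe",      "email": "jdoe@example.com",    "balance": -45},
]


def pseudonym(value):
    """Deterministic masking: same input, same fake value, so joins survive."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:8]


def make_test_data(rows, keep):
    """Subset the rows, then replace direct identifiers with pseudonyms."""
    subset = [row for row in rows if keep(row)]
    return [
        {**row,
         "name": pseudonym(row["name"]),
         "email": pseudonym(row["email"]) + "@test.invalid"}
        for row in subset
    ]


# Deliberately keep the edge cases: a zero balance and a negative balance.
test_rows = make_test_data(production, keep=lambda r: r["balance"] <= 0)
print([r["balance"] for r in test_rows])
```

Note the `keep` predicate: choosing *which* rows make it into the test set is where edge-case diversity is won or lost, and it deserves as much thought as the masking.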

Scaling Out versus Scaling Up

https://www.flickr.com/photos/torkildr/3462607995
Torkild Retvedt (flickr.com)

So, cloud computing.  Kind of a big topic with so many different things that can be discussed.  But let’s focus on one very narrow thing:  scaling up vs. scaling out.

There are two main ways of increasing the performance/capacity of an application: scaling it up (faster processors, bigger machines) or scaling it out (more machines).  When you have an application on premises, in your own data centre, you get your choice of how to scale it.
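A back-of-the-envelope model shows the difference.  The baseline number is invented and the Amdahl-style serial fraction is an assumption, but it captures why scaling out rarely delivers a perfectly linear gain:

```python
BASELINE_RPS = 1000  # one machine handles 1,000 requests/second (illustrative)


def scale_up(speedup):
    """One bigger/faster machine: capacity follows the hardware speedup."""
    return BASELINE_RPS * speedup


def scale_out(machines, serial_fraction=0.05):
    """More identical machines, with a slice of work that can't be spread out.

    Amdahl-style limit: the serial slice (shared state, coordination,
    a single database) caps the benefit of adding machines.
    """
    return BASELINE_RPS / (serial_fraction + (1 - serial_fraction) / machines)


print(scale_up(2.0))   # one box, twice as fast
print(scale_out(4))    # four boxes behind a load balancer
```

With a 5% serial slice, four machines get you roughly 3.5x, not 4x, and the curve flattens from there; that trade-off is what the scale-up vs. scale-out decision is really about.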

Two Approaches To Security

There are two radically different approaches to something like security:  assume that you can prevent something from happening, or assume that the worst is going to happen and plan how you are going to recover.

The first approach (“prevention”) is a wonderful thing.  You spend thousands or millions of dollars on preventative measures.

Weapons of Math Destruction

Weapons of Math Destruction.

I can’t lay claim to the name, it comes from mathbabe.org.  Her subtitle is “How Big Data Increases Inequality and Threatens Democracy”.  You would think, based on that subtitle, that she doesn’t like Big Data.  On the contrary, she likes it, but it needs to be used effectively.

Batch Processing

https://www.flickr.com/photos/pargon/2444943158
Pargon (flickr.com)

I love learning new words, especially if they help to explain something in a way that I never thought of.  For instance, let’s take the following word:

idempotent: the property of certain operations in mathematics and computer science, that can be applied multiple times without changing the result beyond the initial application

Or, at least, that is what Wikipedia says.  But, what does it really mean?  Suppose you had a function that updated someone’s birthdate and you called it with the value ‘1984-04-03’.  Afterward, that person’s birthdate would be ‘1984-04-03’.  It wouldn’t matter how many times you called the update function, the value would still be the same.  Reading data is an idempotent operation as no matter how often you read it the values are the same.
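The birthdate example can be shown directly in code – a minimal sketch contrasting an idempotent update with an operation that is not:

```python
def set_birthdate(person, date):
    """Idempotent: calling this once or ten times leaves the same state."""
    person["birthdate"] = date
    return person


def add_loyalty_point(person):
    """Not idempotent: every call moves the state further."""
    person["points"] = person.get("points", 0) + 1
    return person


person = {"name": "Daniel"}
set_birthdate(person, "1984-04-03")
set_birthdate(person, "1984-04-03")   # no further effect
print(person["birthdate"])            # 1984-04-03

add_loyalty_point(person)
add_loyalty_point(person)             # the state keeps changing
print(person["points"])               # 2
```

The distinction matters enormously for batch jobs: if a run of idempotent updates dies halfway through, you can simply rerun it from the top, while a rerun of non-idempotent operations double-applies everything it already did.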