Tag Archive | "Disasters"

Outages, Monitoring and Being Prepared

Tags: Amazon EC2, Business Service Management, Cloud Computing, Disasters, Enterprise IT, Outages


Last week, Lori MacVittie had a blog post on DevCentral about earning your data center merit badge. The message was delivered up front and was simple to understand: be prepared. MacVittie is right, of course: the best way to stay out of trouble is to put systems in place that prevent it from happening in the first place.

But today’s outage at Amazon EC2 showed us something else: no matter how well prepared you are, things happen that are totally out of your control, and they can spiral pretty quickly. And lest you think that because you don’t use a public cloud Infrastructure as a Service (IaaS) offering like Amazon EC2 you have nothing to worry about, think again.

If it can happen to Amazon, it can happen to you, because at its heart, what is Amazon but a giant data center whose core business is keeping other businesses running? That would suggest Thursday’s outage was something extraordinary, bypassing all of the fail-safes a system like Amazon’s must have in place to keep things going. Today it all fell apart, and it could just as easily happen to you, because chances are your data center has nowhere near the number of contingencies in place that Amazon has.

That means that ultimately you’re probably closer to a disaster like yesterday morning’s than Amazon ever was (yet it happened to them anyway).

None of this is meant to scare you; IT pros know the score on these things. But it is a reminder that having systems in place to monitor and alert you *before* disaster strikes is more important than ever. It may turn out that no amount of preparation matters if a disaster strikes that is completely beyond the scope of anything you could have imagined in a reasonable contingency plan.
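The "alert before disaster strikes" idea can be made concrete with a small sketch. This is a hypothetical illustration, not any specific monitoring product: it tracks a sliding window of periodic health-check results and fires an alert once failures cross a threshold, so you hear about degradation before a total outage.

```python
from collections import deque

class HealthMonitor:
    """Alerts when too many recent health checks fail (hypothetical example)."""

    def __init__(self, window=5, max_failures=3):
        self.results = deque(maxlen=window)  # sliding window of recent checks
        self.max_failures = max_failures

    def record(self, check_passed: bool) -> bool:
        """Record one check result; return True if an alert should fire."""
        self.results.append(check_passed)
        failures = sum(1 for ok in self.results if not ok)
        return failures >= self.max_failures

monitor = HealthMonitor(window=5, max_failures=3)
checks = [True, True, False, False, False]  # a service gradually degrading
alerts = [monitor.record(ok) for ok in checks]
print(alerts)  # the third consecutive failure triggers the alert
```

Real systems layer much more on top (escalation, deduplication, paging), but the core decision — watch a trend, not a single failure — is the same.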

All you can do is follow MacVittie’s simple advice and be prepared for whatever comes. It might not always be enough, but if you do your best, you’ll minimize major outages and be ready to deal with them when they do happen. Remember: disasters happen to everyone at some point, whether you’re in the cloud or running your own in-house data center, and you need to be ready.

Photo by rbrwr on Flickr. Used under Creative Commons License.

Monitoring the Internet in Japan After The Disaster

Tags: Disasters, Internet, Japan, Keynote Systems, Monitoring


At a time when the Internet remains perhaps the most critical communications channel for the people devastated by last week’s earthquake and tsunami, it has, remarkably, continued to operate in spite of the dismal conditions on the ground throughout much of the country.

Keynote Systems, an Internet monitoring company that has been watching the health of the Internet inside Japan since the disaster struck, found that the Internet continued to run in spite of the level of destruction across Japan.

Dave Karow, senior product manager for Internet testing and monitoring at Keynote, said over the weekend, “At a macro level, the Internet did what it’s supposed to do. It didn’t even blink. Access from Tokyo to major internet properties based on the Keynote Business 40 was not impacted in any meaningful way. Additionally, access between Tokyo and regional hubs including Seoul, Singapore and Taipei, as well as San Francisco, was not impacted either.” That’s pretty amazing when you consider some of the video that was coming out of Japan on Friday.

Further updates from Keynote indicated there were some problems on Monday, but certainly fewer than you would expect given the situation. The latest update also included a status message from NTT, Japan’s main internet backbone provider, saying that repair crews were on their way to fix damaged undersea cables.

You can view Keynote’s online Internet monitoring tool here. It’s a very interesting look at the health of the Internet backbone across the world.

As Steven J. Vaughan-Nichols pointed out on ZDNet, it may seem low on the list of priorities after a disaster of these proportions, but the fact that people can access the Internet means they can get news, communicate and try to find the whereabouts of loved ones. In that sense, it is extremely important that the Internet has continued to run as a key communications channel for those affected by the disaster.

Tools like Keynote’s can help us understand the situation and get details about the state of the Internet when it is so crucial that these channels remain open.

Photo by Silveira Netto on Flickr. Used under Creative Commons License.

Monitoring the Japan Earthquake and Tsunami

Tags: BSM, Disasters, Monitoring


The speed at which the earthquake and tsunami hit Japan last Friday, and the devastation they left in their wake, was shocking and horrific. Technology, and how we use it, without a doubt seems insignificant against such a backdrop, yet it’s worth noting that monitors were in place during this horrible event, and they played a key role in early warnings for other countries and in building our body of knowledge ahead of future earthquakes and tsunamis.

Wayne Rash, writing in eWeek, described the tsunami monitoring system deployed throughout the Pacific Rim. He explained that there are two types of monitors: DART buoys that record tsunami activity as it rolls over them, and tide stations attached to piers and other coastal structures that measure the severity of the tsunami as it begins to hit shore. He describes it as follows:

Each of these buoys, located mostly around the highly seismically active Pacific Rim (also known as the “Ring of Fire”), reports the signs of a tsunami as it passes. Once this data is gathered and processed at the tsunami-warning centers in Hawaii and elsewhere, it delivers a nearly instantaneous, real-time picture of the speed, direction and severity of a tsunami.

As the waves arrive, they trigger a device called a tide station. These perform a similar function to the DART buoys, but they are attached to piers and other coastal structures, and measure the actual severity of the tsunamis as they arrive from the open ocean.

You can see from this video (which was likely generated using this monitoring equipment) just how much of the Pacific basin was affected.

In the end, the fact that monitoring was in place may have helped in some small way: as the tsunami rushed across the ocean, it gave coastal authorities a warning they might not otherwise have had. While monitors couldn’t stop the waves, they could at least do their job, providing warnings and data to build a higher level of scientific understanding for the future.