 

Y2K, the Bad: Fear, hype and the blame game

Robert L. Mitchell
29.12.2009, 15:59

Editor's note: This is part II of our series, "Y2K: The good, the bad, and the crazy." If you missed it, see Part I, "Y2K, the Good." And check back tomorrow for "Y2K, the crazy." Also, share your stories of how you spent Millennium Eve. Working? Partying? Or a little of both?

 


The Y2K bug may have been IT's moment in the sun, but it also cast a long shadow in its wake.

The years and months leading up to the year 2000 were a hard slog for virtually everyone in IT, from project managers to programmers. Then, after IT definitively slew the Y2K beast, IT executives were greeted not with cheers but with suspicious questions.

Was the Y2K threat overblown? Had the managers who controlled the purse strings been hoodwinked into paying for far more tech upgrades than were necessary? No good deed, as they say, goes unpunished. Welcome to the dark side of Y2K.

Fear ruled the day

Even as they were working to prevent a potential Y2K catastrophe, techies had another disaster in mind -- a career catastrophe. Many IT organizations were given the funds they had requested to do the job right, but they were also on the hot seat to deliver what was, for many organizations, the single biggest IT project they had ever undertaken.

"People were scared for their jobs and their reputations," says Dick Hudson, who was CIO at oil drilling company Global Marine Inc. at the time.

Staffers feared that if they were fired for failing to remedy Y2K problems, the stigma would prevent them from ever getting a job in IT again. "Then there was the fear that someone like Computerworld would report it, and it would be on the front page," Hudson says.

It wasn't an unfounded concern -- in the months leading up to the new millennium, the technology, business and even mainstream media covered the Y2K bug with an intensity normally reserved for the latest celebrity sex scandal.

"We had this fear of not completing on time," says Michael Israel, former chief operating officer for IT services provider AMC Computer Corp., who oversaw the Y2K remediation work at client Continuum Health Partners and its affiliated hospitals in and around New York City.

"We had to touch basically every system at Continuum Health," Israel says. His team was replacing hardware while it was working on Y2K remediation efforts on the software running on those systems. Tight budgets and tight project schedules kept everyone on edge. "It was a very crazy project," says Israel, now senior vice president of information services at Six Flags Theme Parks Inc.

Nothing else got done

Unless IT was working on a mission-critical project, not much else got done while the Y2K remediation efforts were under way. "We dedicated the last three quarters of 1999 to preparing for Y2K, and reserved the first quarter of 2000 for fixing any issues that came up. Every nonessential project was put aside," says Benny Lasiter, a senior data management analyst who worked on Y2K testing for a real-time trading floor application at Texaco Natural Gas, a division of Texaco U.S., which is now part of Chevron Products Co.

At Ace Hardware Corp., as in many other organizations, new and strategic IT projects were shelved while Y2K remediation work progressed. At Ace, those other projects were held up for two years. "We had to divert resources to do this," says Paul Ingevaldson, who was senior vice president of technology for Ace's global operations at the time.

While Y2K projects succeeded, they also represented a huge cost in terms of lost opportunities, he says. "Y2K remediation had no value to the company other than that the company could run on January first. You lost the opportunity to do positive systems development."

Projects that could have helped business increase profitability or market share simply sat idle for two years.

Business executives didn't like that, Ingevaldson says, but with too few resources and Y2K remediation inflating IT budgets, there was simply no choice. "That's the way it had to go."

Long hours took a toll

To meet deadlines, many IT organizations had to work flat out, with few breaks. "We were working double shifts up until two weeks before [the end of the year]," AMC's Israel recalls.

At Texaco, vacations were suspended for a period of time. "In the second half of 1999, we were literally locked down. No one could have any time off," Lasiter says. Later, he says, employees were rewarded with compensation time and bonuses. But for much of 1999, it was all hands on deck.

Many IT workers spent late nights and weekends at work so that they could test systems after hours, when they could be safely taken offline. "We were literally doing full system backups, setting the clocks on everything to ten minutes before midnight to see how things would run," Lasiter says.

New software still needed attention

What annoyed Lasiter the most was the time spent testing relatively new systems that weren't a problem. Y2K compliance audits required thorough testing even of new systems that didn't use a 2-digit year. "No one was exempt from the rigorous testing plan we had, even if you had newer applications that didn't have Y2K issues," Lasiter recalls.
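The defect those audits were hunting for is easy to sketch. With only two digits stored for the year, dates in 2000 compare as smaller than dates in 1999, so sorting, age calculations and expiry checks all break at the rollover. A minimal illustration (the field layout and "windowing" cutoff here are illustrative, not from any system named in the article):

```python
def two_digit_key(date_str):
    """Sort key for a YYMMDD date string -- the buggy pre-Y2K scheme."""
    return int(date_str)  # "000101" (Jan 1, 2000) becomes 101

records = ["991231", "000101"]  # Dec 31, 1999 and Jan 1, 2000

# With two-digit years, the new millennium sorts *before* 1999:
assert sorted(records, key=two_digit_key) == ["000101", "991231"]

def four_digit_key(date_str):
    """Remediated key: expand the year via a century 'window'."""
    yy = int(date_str[:2])
    century = 2000 if yy < 50 else 1900  # a common windowing cutoff
    return (century + yy) * 10000 + int(date_str[2:])

# After remediation, chronological order is restored:
assert sorted(records, key=four_digit_key) == ["991231", "000101"]
```

Windowing was the cheap fix -- infer the century from the two digits already stored -- while the thorough fix was widening the field to four digits, which is what drove so much of the testing described here.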

He said a rising tide of fear drove management to ask for all sorts of redundancies that probably weren't needed -- and ran up the bill. "They had that fear that all of those horror stories would come true and that our IT systems and applications wouldn't run, data would be corrupted and communications wouldn't work anymore," he says. So IT installed redundant lines and other emergency backup measures. "You name it, the redundancy was there, just in case," he says.

Hudson was appalled when vendors of application software packages that had no known Y2K issues added last minute patches to -- in his view -- cover their butts with their lawyers. "We had to put those patches in so we could get the Y2K certification from the vendors," he says. That created extra work for his staff, which had to install each patch and retest the systems, even though reports from outside consultants confirmed that those systems worked fine in the first place.

On the other end of the spectrum, Ingevaldson was disgusted to find that even some of his newest software wasn't Y2K-compliant. "There were some [noncompliant] applications that we had bought relatively recently. We had to modify those, and that really pissed me off," he says. "To have outside software [be] incompatible, that just blew my mind."

New Year's Eve was a work night for techies

Although IT executives across the globe were confident that they had the problem licked, a nagging fear followed them right up until New Year's Eve. While most people were out celebrating the turn of the century, IT executives and their staffs were either monitoring events in the office or standing by at home.

"Nobody in the IT organization was celebrating, because we were all on call," recalls Lasiter, who is now an IT strategy consultant in Houston.

Concern over what could happen was so intense that in some companies, CIOs were sending updates to top management as midnight came and went. "One of my roles was to keep the president and board chairman apprised," says Ingevaldson. That kind of scrutiny was unprecedented, he says, and added to the pressure.

It wasn't until New Year's Eve itself, when it became clear that the world infrastructure wasn't going to collapse, that Ingevaldson and his team of about 30 IT people felt they could finally relax, at least somewhat.

"We had a conference room where we set up televisions," he remembers. A few minor issues came up that night, which the team fixed before morning. Ingevaldson checked in with the CEO after midnight and stayed at work a few hours longer to make sure everything was okay. "I got home at about 5:00 a.m.," he recalls.

Y2K got fixed. IT got kicked.

On Jan. 1, 2000, catastrophe reportedly loomed. Planes would fall from the sky. Power plants would shut down. Elevators would stop. Entire supply chains would freeze up. (For more, check out this video of Leonard Nimoy describing the myriad ways the world was supposed to grind to a halt.)

After hearing all this, top management watched nervously as midnight came and went without any major incidents in their organizations. At first they were relieved -- until they noticed that no one, anywhere, was having any major problems.

Then they became suspicious.

"It was a no-win situation," Ingevaldson says. "People said, 'You IT guys made this big deal about Y2K, and it was no big deal. You oversold this. You cried wolf.' "

Suddenly, management was questioning why all that money had been spent in the first place, says Israel. "After the event was done, the CFO said, 'I signed off on $20 million and I didn't hear one thing,'" he recalls. The CFO doubted that the problem was as big as he'd been led to believe. "No matter how much salesmanship you did, they kind of doubted you after that."

Some organizations faced more than just criticism and snide remarks. "The reputation of IT took a hit," sums up Dale Vecchio, an analyst at Gartner Inc. "A lot more outsourcing happened after that."

Some felt it most keenly in the pocketbook. "There was a backlash from the CEO and CFO. They felt that Y2K was overhyped, that it was just IT's way of getting a lot of money out of them," Hudson says.

While Hudson was able to defend to his management the value of the work his team had done and then move forward with other, long-delayed projects, many of his colleagues saw their capital budgets slashed as resentment mounted. "For a couple of years, they couldn't get squat. [Management] felt that they had been snookered by IT and the press," he says.

They had a point, says Bruce Schneier, currently chief security technology officer at enterprise security provider BT Global Services as well as a noted author of books on risk and security. "If it was really bad, you would think in some cases something would have gone wrong somewhere. But nothing went wrong," says Schneier, who has chronicled the ways in which people overreact to some risks while ignoring others. With Y2K, he asserts, the level of risk was overstated.

Stuart McGill was VP of Y2K business at programming tools vendor Micro Focus, where he is now chief technology officer. He's not sure that the risks were accurately presented to management in all cases. "The most likely consequence [of not fixing Y2K issues] would have been irritation rather than disaster," he contends, adding that some IT organizations probably should have been clearer with management about that.

Overhyped or not, the experience left many CIOs feeling as though IT got a bum rap after working at a furious pace. Israel, for example, had managed a team of 37 people who successfully worked on 37,000 systems in the span of just ten months. But after it was all over, he says, he still had to explain to the CFO why it was money well spent.

There's a lesson that came out of Y2K, says Hudson. "No matter what the media says, don't overhype a crisis. A lot of IT people just said, 'You've read the papers. I need the money.' They used the fear factor." And, as a result, "there were credibility issues there." Those doubts about IT were "unfair and unfounded," Hudson says, "but nevertheless they happened."

On the other hand, IT executives who gave more detailed and accurate assessments of risks and costs fared better on the reputation front, he says.

Y2K hangover was a long-term budget-buster

After the Y2K crisis abated, IT executives faced a New Year's hangover of downward pressure on budgets.

"Immediately after Y2K, spending on mainframe and critical systems seemed to fall away," observes McGill. While his company made money selling tools used to fix Y2K issues, even he thinks IT spent "slightly too much" on Y2K.

Indeed, a 2006 IDC report pegged total spending for Y2K remediation and contingencies in the U.S. at $134 billion. If that work hadn't been done, IDC calculated, those companies would have experienced a potential $97 billion in lost revenue. That means Y2K remediation cost $37 billion more than the potential damage would have, explains IDC analyst John Gantz.

Worldwide, IDC estimated that organizations spent $308 billion to prevent $237 billion in potential lost revenue, an overspending of $71 billion.
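IDC's overspending figures are straight subtraction of avoided losses from remediation spending. A quick check of both estimates, using the figures cited above (in billions of U.S. dollars):

```python
# IDC estimates, in billions of USD (figures as cited in the article).
us_spent, us_loss_avoided = 134, 97
world_spent, world_loss_avoided = 308, 237

us_overspend = us_spent - us_loss_avoided          # 134 - 97 = 37
world_overspend = world_spent - world_loss_avoided  # 308 - 237 = 71

assert us_overspend == 37 and world_overspend == 71
```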

In many organizations, that money had been diverted away from other parts of the business to handle Y2K remediation. In the aftermath, business leaders felt they'd spent enough on modernizing IT and needed to focus next on those other areas. As a result, IT budgets went down and didn't come back for a few years, says McGill.

Israel agrees with that assessment. Post-millennium, he says, the prevailing attitude was, "You spent so much in 1999, what more could you possibly need?"

Check back tomorrow for "Y2K, the Crazy: Computer glitch or mind-blowing catastrophe?" or go back and read "Y2K, the Good".


Copyright 2009 IDG Magazines Norge AS. All rights reserved
