This rage comic is a little busy but you’ll get the point.
As is usually the way with these things, a year-end upgrade of our full-disk encryption software to the latest version was planned. In a perfect world, it would uninstall the old version and install the new one. It was tested on several machines in our headquarters only, with no adverse effects. A gradual rollout was scheduled to begin AFTER the encryption administrator came back from a short vacation.
The day after the administrator left on vacation, we started to get laptops coming in from across North America with the new version of the encryption software on them. The machines were being shipped to headquarters for reimages due to various unrelated issues, not because of the encryption itself. We groaned to each other, because this is par for the course for us, and made ourselves a note to say unfriendly words to our administrator when he came back.
Our administrator returned from vacation, and admitted that SOMEHOW the encryption upgrade had been pushed to 3500 machines outside of headquarters. He assured us that he would halt the rollout, and that machines that hadn’t begun to install the new version would not be upgraded.
Several days later, we started to see machines come in with the new version of encryption installed, but with a twist: the encryption software would not boot. We took one over to our administrator, who tinkered with it, concluded that he couldn't do anything with it, and said we should reimage it. This went on for several hours, until we noticed that the number of tickets indicating a machine inbound for reimage had ballooned, all reporting the same fatal encryption error. We quickly realized that we would be receiving several hundred of these machines in the space of a few days, when our normal volume for reimages is only 10-20 machines per day.
Cue the panic.
We bluntly informed the powers-that-be that we had neither the physical space to stage several hundred machines, nor the manpower to reimage several hundred machines (at least, not quickly), nor the server space to perform data backups (as we usually do whenever we reimage).
The powers-that-be quickly responded, getting us several conference rooms set up for reimaging, and getting more than a dozen IT employees to volunteer for some crash-course training in reimaging.
However, the server space turned out to be a problem. We had at our disposal a PowerEdge with a 1.3 TB RAID array and a normal desktop computer with a 1 TB hard drive. Normally, we could make this last for about two weeks before we had to start deleting old data to make room for new data. However, our Security group firmly insisted that we needed to retain all the data from machines afflicted with this encryption error.
The other IT employees immediately began plumbing the depths of whatever resources they had. Someone offered to archive old data on their group’s network share, another person donated a legacy server and configured it with a 900 GB RAID array. However, we knew (and told the powers-that-be) that we needed way more than that.
Amidst everyone else running around like beheaded poultry, our chief security officer (who happens to be one of the developers of BackTrack) walked up to me and asked about how much space was actually needed. I ballparked high and told him 20 TB. He nodded, grabbed his coat, and walked out.
Thirty minutes later, he returned from Best Buy laden down with bags full of 1 TB external hard drives. He had driven to the nearest Best Buy, walked up to the customer service counter, and casually purchased their entire stock of them.
And he bought a coffee maker.
I work as a Sys Admin at an Ivy League college, for my department only. Our department manager thinks we do nothing but carry around keyboards and USB keys; she has stated as much to my supervisor. She is a mean, cold, evil woman who can't remember her own name half the time. The other day I get a phone call (instead of an email, because she can't remember to email support instead of contacting us individually) that she can no longer "hear sounds."
She stated that she had gone into the Control Panel and her sound settings appeared to be OK. I have no idea how she remembered to do that, but I gave her credit and told her I would come take a look.
As I walk into her office, I can see that her speakers are off (no light). So I ask her to please turn on her speakers (she sits in her chair and "guards" her computer unless you ask to touch it). She looks at me with her usual deer-in-headlights look and says, "How?"
I ask for permission to reach around her and quickly turn them on and leave…BEFORE I fall over laughing. As it happened, her friend and partner-in-crime (the HR director) who sits two doors down from her had the EXACT same issue a week before.
[Picture Source: Brian Lane Winfield Moore (CC)]
Is this bad? I mean, can you recover my photos?
No, users NEVER get their password wrong. Never.
Customer calls in. Outlook is working, but he can't log into some other mail program. He "knows" he has the right password.
Me: “Ok, so you just tried the password that you think it is and that didn’t work?”
Cust: “That’s right.”
Me: "Why don't we just reset the password on the server? Now, if Outlook stops working, that means you had the wrong password, so make sure you carefully type what you believe the password is into the iPhone." (device irrelevant)
Cust: “Ok but I don’t think that’s the problem.”
Me: “Let’s just consider this a test.”
Cust: (follows directions to reset his mailbox password)
Me: “Ok. Let me push that update to the server. It’ll take up to two minutes max.”
Cust: (before the update finishes) "Ok, the iPhone's working. Let me check Outlook."
Cust: “Ok, now Outlook isn’t working.”
Cust: “What do I do?”
Me: "Put the password you just reset to into Outlook's password box."
Cust: “Are you sure?”
Cust: "Ok, I'm typing that in. There, I just clicked OK."
Me: "And I see a successful login on the server now."
Cust: “Well, I don’t know what happened there.”
Me: "You had the wrong password noted, so when you reset it, it didn't match Outlook's saved password, and Outlook stopped working, like I said it would."
Cust: “Well I know I had the right password. I guess my Outlook profile was messed up or something. Thanks for your help.”
Me: (after the call ended) "I JUST FUCKING EXPLAINED THAT, YOU MORON! WRONG PASSWORD MEANS WRONG PASSWORD. WE JUST PROVED IT…" And then I threw something across the room.
I really, really hate it when I see stuff like this.
This is probably from the same remote facility where they store the janitorial equipment in the switch closet.
This one was entitled “Procurve switch we got back from an office that closed.”
Ever have a day where you're just in an off mood? Last week I had a user call about a stuck CD tray. I politely told them I would be up shortly and would use the "magic hole" to get the disc out. They never even questioned it.
So that leads me to ask… what strange, unusual, absurd, off-the-wall terms have you used on a client or user? Put your entry in the comments! We'll pick the top three and then put them to you guys to vote on! The winner will receive the most epic prize ever: BACON JERKY!
Submissions will be taken today through Wednesday night, and the top three will be available Thursday and Friday for voting. Put your submissions in the comments, and make sure you don't post as anonymous so we can get a hold of the winner next week.
I know we've had a post like this in the past, but WTF. Don't dead things normally smell?
via: [Fail Blog]
User: Hey! Is something going on with the time clock it’s like so slow!
Me: What do you mean? Does it respond?
User: Yes, it worked but it so slow! (no that’s not a typo)
Me: Oh ok, so do you know if anyone else is having a problem?
User: Yeah, Rhonda said it's so slow for her too. What's going on?!
Me: Weird, let me log in on my PC and see what happens.
User: I already punched out, see you later, I gotta go!
Me: …. are you f**king kidding me? (The timeclock loads fine.)
With every day that passes, I am more and more likely to answer the phone like this: "IT, what's your ticket number?"