The Crowdstrike Falcon Apocalypse - Here's how my night went

The Crowdstrike Falcon Apocalypse was a nightmare for Rich. If you're curious about what happened, what all the news is about with #crowdstrike #falcon, and, more importantly, what it was like to be boots on the ground when it happened, watch this video!

**GET SOCIAL AND MORE WITH US HERE!**
Get help with your Homelab, ask questions, and chat with us!

**Check out HomeLab Gear for all your homelab needs!**

Subscribe and follow us on all the socials, would ya?

Find all things 2GT on our website!

More of a podcast kinda person? Check out our Podcast here:

Support us through the YouTube Membership program! Becoming a member gets you priority comments, special emojis, and helps us make videos!

**Chapters**
0:00 Introduction
0:34 What happened with Crowdstrike Falcon
1:11 Who is Crowdstrike and what do they do?
2:13 So, what happened, and what is the problem?
4:17 What it was like to experience Crowdstrike Falcon Apocalypse firsthand
12:41 What recovering from Crowdstrike Falcon Apocalypse was like
14:58 Final thoughts on everything
15:32 Closing! Get some sleep!
**Comments**

RIP to all IT people dealing with BitLockered systems.

dosmaiz

As a retired IT worker who went through Y2K and is still irritated by all those who said the work we put in to prevent this sort of outcome was a waste of time and resources, I feel pained for those doing the firefighting. I hope they get the recognition they deserve, but I suspect they'll be partly scapegoated for not having (unaffordable) automated backup and recovery systems in place.

colinutube

Working for a company that's still recovering from a cyberattack (we switched to CrowdStrike in the wake of that), this was the worst nightmare I could wake up to.

johnp

I have to admit, this took me back to my IT days 20 years ago, and I literally had chills up and down my back. My heart goes out to all the keyboard warriors who had to go through what they've gone through and those who are still plugging away. There's got to be a better way, and of course that's the same thing we said 20 years ago...

msmithsr

And suddenly, we went from no cash to a paper sign that says cash only.

JustWatchMeDoThis

I appreciate you guys and gals. I work for a small IT company out of Alaska. We were not affected, thankfully, but I can't imagine the massive scale of this outage and the tedious manual work that will have to be done to get all systems back up. I'm in the sales department; I bring in new clients for the company I work for. I've always had respect for tech. I'm somewhat of a nerd/sales guru and appreciate the work behind the scenes that keeps systems running smoothly and safely.

lampkinmedia

We weren’t affected, but in the end it’s just dumb luck that we chose a different EDR product. Lots of lessons learned today, and I’m sure there are going to be a lot of great discussions in the coming weeks. Sleep well! You deserve it.

MicroAdventuresCA

My night/day started at 11:41 PM Central time. Finished at 7:30 PM tonight.

Craig-mlnw

The first thing I thought about when I heard about the outage was the line "Skynet IS the virus".

TomNimitz

We had Azure VMs, and Microsoft suggested doing a disaster recovery process, which includes changing the region, etc. I was determined not to mess with the VMs because they are set up very specifically for the apps we had; basically, I was hoping Microsoft would be willing to fix it, since I didn't do anything to break it. 45 minutes later they came back up, and I was glad that I waited.

freddiecrumb

Something like this was inevitable in a world where enterprises don't control/delay updates and leave it to the vendors.

foxale

Right there with you... worked from 3:30 AM to 8:00 PM, with another long day tomorrow...

druxpack

Seeing so many machines down all at the same time was definitely the craziest thing I have ever seen in my 25+ years in IT

Reformatt

Happy to learn that I'm not the only one who thought of Lawnmower Man 🤓

noahz

This is an EXCELLENT RECAP
Thank YOU for explaining it for regular folks.

TraciD

The company I work for was hit hard. To make matters more fun, some of our smaller locations, like mine, don't have any IT personnel. That meant that 2 of us who know computers had to get all of them working once we were given instructions. On my computer I have local admin rights, so on that computer I was able to boot into safe mode and delete the file. On all other computers we either had to PXE boot or go through the Microsoft menus to get to the command prompt.

We still have 3 computers with BitLocker that we cannot access. No one in the company has the correct access code.

jeffolson

Thanks for sharing your experience. I am not working in IT, but I do work in the tech industry. I am lucky my company doesn't use CrowdStrike, and so far in my country only the airport and a few airlines were affected. I am not travelling, so I just went about my day. To everyone involved in fixing this mess, thank you. I try my best to explain to my friends and family what is going on, and I have been emphasizing the situation the sysadmins and IT helpdesk staff are facing, with the occasional F U to CrowdStrike. Everyone in IT should be appreciated more, and I can only hope this can be studied everywhere and something can be done to prevent the same thing from happening again. Remember, it is not just the good guys learning from this outage; I am pretty sure the bad guys, the threat actors, are also learning from this.

boringNerd

If you have VMware, you could fire up a VM that has PowerCLI, shut down all the Windows VMs, then loop through every disk image to mount it on the new VM, delete that file, and unmount it again.
PROBABLY best to use a Linux VM for this, as Linux is far 'nicer' when it comes to mounting and unmounting drives.
I'm GUESSING other hypervisors would have similar tools.
Hyper-V shops will probably be an even bigger level of screwed, with data loss due to the hypervisors themselves resetting.

Then there's a next level of screwed if BitLocker is in use.
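
A minimal Python sketch of that per-disk cleanup loop, assuming the affected VMs' system disks have already been attached to a Linux helper VM and mounted (the /mnt/vm-disks layout below is hypothetical, and BitLocker-protected volumes would first need their recovery keys); the C-00000291*.sys pattern is the channel file named in CrowdStrike's public remediation guidance:

```python
#!/usr/bin/env python3
"""Remove the faulty CrowdStrike channel file from already-mounted Windows volumes.

Sketch only: the hypervisor side (shutting VMs down, attaching and detaching
their disks) is assumed to have happened already, and each Windows volume is
assumed to be mounted read-write under MOUNT_ROOT.
"""
from pathlib import Path

# Hypothetical location where each VM's Windows volume has been mounted.
MOUNT_ROOT = Path("/mnt/vm-disks")

# Channel-file directory, relative to the root of a Windows system volume.
# Adjust the casing if your NTFS mount is case-sensitive and differs on disk.
CROWDSTRIKE_DIR = Path("Windows/System32/drivers/CrowdStrike")


def clean_volume(volume_root: Path) -> int:
    """Delete any C-00000291*.sys files under one mounted Windows volume."""
    target_dir = volume_root / CROWDSTRIKE_DIR
    if not target_dir.is_dir():
        return 0  # not a Windows system volume, or the sensor isn't installed
    removed = 0
    for channel_file in target_dir.glob("C-00000291*.sys"):
        print(f"Removing {channel_file}")
        channel_file.unlink()
        removed += 1
    return removed


if __name__ == "__main__":
    if not MOUNT_ROOT.is_dir():
        raise SystemExit(f"{MOUNT_ROOT} does not exist; mount the VM disks first")
    total = sum(clean_volume(v) for v in sorted(MOUNT_ROOT.iterdir()) if v.is_dir())
    print(f"Removed {total} channel file(s); unmount and reattach the disks to their VMs.")
```

After the script runs, the disks still have to be unmounted cleanly and handed back to their original VMs, which is the part PowerCLI (or your hypervisor's equivalent tooling) would handle.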

Consequator

I work for a large company that used to use CrowdStrike... we removed it and replaced it with the MS E5 license security stack when we went to E5 as a cost thing. But we still had 40 machines affected, as even though it was removed we had had instances of CrowdStrike reinstalling itself... really bizarre... those 2 servers have been fixed, and the other PCs for users will be fixed when desktop staff come in Monday, as they were at remote sites... what a

stevemeier

This was like listening to some strange dude describe my Friday.

keithstarks