Last month my fellow employees and I wrapped up our third annual Palmetto Cyber Defense Competition (PCDC). Inspired by the Collegiate Cyber Defense Competitions, PCDC is a computer security competition for high schools and colleges in South Carolina. I've written about the event itself in the past (PCDC 2014), so I won't go into detail about what the event actually is; those details can be found at the official PCDC site. Instead, I want to focus on a couple of the things that make PCDC unique, lessons learned from putting on a computer security competition, and where we are going in the future. For those of you looking for a detailed Red Team write-up, that can be found here: after the competition I put together an in-depth analysis of the Red Team's process for anyone looking to learn from the mistakes made at this year's event. The rest of this post is a few of my thoughts from an organizer's perspective, not from the perspective of the Red Team lead.
First off, PCDC is unique in that for the first two years, both high schools and colleges competed. Each group had its own dedicated competition day, which meant that PCDC was actually run twice: after the first day, all the machines had to be reset, re-imaged, and configured slightly differently for the next day. This year we added a third day: a professional day. For the first time, PCDC would be run three times, with high schools on day one, colleges on day two, and professionals on day three. For the professional day, we had groups representing a mixture of government and private industry. On the government side, teams were made up of members from the 24th Air Force and U.S. Cyber Command, while the industry teams had members from Scientific Research Corporation (SRC) and SPARC. All in all, we had 4 government teams and 4 teams from industry.
Our goal from the beginning was to design the competition network to be believable and realistic. Since we did not know during the planning and design phase that there would be a professional day, this year's theme was built around an online game development company. Each Blue Team was responsible for making sure its development company continued to function through the course of the day and delivered the latest version of its game to its user base through audience-accessible tablets connected to each Blue Team via a dedicated wireless connection. One thing we quickly realized was that our ambitions greatly eclipsed the amount of time we had available to build the network. Remember, a decent portion of our time goes into infrastructure development so that the competition can be rerun the next day. To add to that pressure, we do not have control over the facility in which the competition is hosted. As a result, our preparation time from the end of one day to the beginning of the next is usually around 3 hours.
To put it simply, there were a lot of different features we wanted to include in this year's competition, but we ran out of time. One of the biggest things we feel these types of competitions lack when trying to simulate real-world networks is realistic user activity. This year we attempted to remedy that by developing simulated users. We got all the code for the user simulators developed and tested, but due to a hardware failure, we were unable to deploy them to the competition network this year. In the interest of education and sharing, I have open sourced the code for the user simulators on GitHub. We'd really like to hear from anyone that is doing something similar.
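To give a flavor of the idea, here is a minimal sketch of what a simulated user might look like. This is not the code released on GitHub — the class name, actions, and weights are all illustrative assumptions — but it shows the core loop: each "user" repeatedly picks a weighted-random action and leaves behind the kind of on-disk traces a real employee would.

```python
import random
import time
from pathlib import Path

class SimulatedUser:
    """Hypothetical sketch of a simulated user, not the released code."""

    def __init__(self, name, workdir):
        self.name = name
        self.workdir = Path(workdir)
        self.doc_count = 0
        # (action, weight) pairs; weights skew toward common activity
        self.actions = [(self.write_document, 3), (self.idle, 1)]

    def write_document(self):
        # Leave a plausible artifact on disk, like a user saving work
        self.doc_count += 1
        doc = self.workdir / f"{self.name}_notes_{self.doc_count}.txt"
        doc.write_text(f"meeting notes by {self.name}\n")
        return doc

    def idle(self):
        # A fuller simulator might sleep longer, browse, or send mail
        time.sleep(0.01)
        return None

    def step(self):
        # Expand the weighted list and pick one action at random
        funcs = [a for a, w in self.actions for _ in range(w)]
        return random.choice(funcs)()
```

Running a handful of `step()` calls per user on each workstation would populate home directories, generate logins, and give the Red Team real accounts to target — exactly the activity the competition networks were missing.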
A few of us have been involved in multiple CCDCs and PCDCs, and every year we make the comment that the scoring system needs to be altered. Although this was in the works before this year's competition even took place, we hadn't had time to finalize what fixing the scoring means. At this point, I think we have a much better idea of how we are going to fix it. To highlight why scoring is such an issue, I want to talk about how winning Blue Teams typically approach this competition. Within the first few minutes, the strategy includes removing all unnecessary accounts, changing all the default passwords, and, for some, unplugging their entire network while they continue to harden. From a strategic perspective, with the goal of winning a game in mind, I can't argue with this approach. The issue I do have, however, is that it leaves the networks in a pretty unrealistic state.
Each Blue Team is given an information packet at the beginning of the competition. That packet includes the names of the accounts that the automated scoring engine will use to log into their systems and perform service checks to make sure the Blue Teams still have their services up and running. Once these account names have been identified, the Blue Teams will delete every other account off the workstation or server. This means you could have 3 or 4 domain-joined Windows workstations with no user accounts other than the scoring engine account. It is important to note that the Red Team is not allowed to leverage the scoring engine account to gain access to the Blue Teams' networks. It's also not realistic. A computer security competition should force the students and competitors to perform real security tasks in the presence of real users. Since this is a competition and recruiting that many unbiased volunteer users is unrealistic, we need simulated users.
Another area where scoring needs improvement is the way the scoring engine actually evaluates successful checks. Up to this point, a common service to check is a functioning MySQL database. Typically the scoring engine will log into the MySQL database, make a query, and check for a specific key-value pair or for the presence of a specific table. This simply isn't good enough. For a real company, the database needs to have constant transactions generated by realistic activity. Right now, Blue Teams get away with making a backup of the database at the beginning of the day and simply restoring it any time the Red Team deletes it. As long as the Blue Team restores the database between scoring engine rounds, the scoring engine gives that team a perfect service check score. This is mitigated somewhat by the fact that the Red Team reports incidents to the Gold Team, which can decide to take away points, but these types of scenarios need to have a bigger impact on the day-to-day operations of the Blue Teams' networks.
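One way to close that loophole is to make the service check stateful: instead of looking for a static key-value pair, the engine remembers the newest transaction id it saw last round and only awards the point if activity has advanced since. A restored backup that is frozen in time then fails the check. Here is a minimal sketch of the idea, using SQLite as a stand-in for MySQL (the table name and schema are illustrative, not from the actual scoring engine):

```python
import sqlite3

def check_transactions(conn, last_seen_id):
    """Return (passed, newest_id). Passes only if activity advanced
    past what the engine recorded during the previous round."""
    cur = conn.execute("SELECT COALESCE(MAX(id), 0) FROM transactions")
    newest = cur.fetchone()[0]
    return newest > last_seen_id, newest

# Demo with an in-memory database standing in for the Blue Team's MySQL
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER PRIMARY KEY, amount REAL)")

passed, seen = check_transactions(conn, 0)       # no new activity -> fail
conn.execute("INSERT INTO transactions (amount) VALUES (19.99)")
passed2, seen2 = check_transactions(conn, seen)  # new row appeared -> pass
```

Paired with simulated users generating the inserts, this turns "is the database up?" into "is the business actually running?", which is the question we really want to score.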
Our plan for the future is to combine 3 facets of scoring. The first is financial: the scoring engine will no longer add points, but will instead score the Blue Teams' companies in a financial sense. The second facet is an internal employee and systems perspective: employees must be able to perform their job-related duties, and interdependent systems must be able to communicate with each other. The third facet comes from the Red Team. This year we tried something new by giving the Red Team specific targets/flags to capture when we gained access to the Blue Teams' networks, including the credit card numbers in their customer database, the source code to their latest game, and a few other things.
Now, I know some people will argue that the CCDCs and even PCDC already take these things into consideration with the scoring engine, but we would argue they are not taken into account enough. The example we like to use is the one where the Blue Team unplugs its network. Sure, they aren't getting any points from the scoring engine, but in the real world, you can't just unplug your entire company from the network. Not only would you be losing sales, but you're paying your employees to do a job they can't accomplish. And not to mention, the security or IT department has no authority to make that type of decision.
We have thought long and hard about the scoring, and we think we have something new and exciting for next year. I don't want to give away too much here until things are more settled. Additionally, we want to find a way to help the audience understand what is going on. PCDC is free and open for the community to come in and view. This year we attempted to show what was taking place by visualizing the traffic between the Blue Teams and the Red Team in real time. I wrote the code for this and am also releasing it on GitHub. You can see a video demo of it on YouTube.
We have a lot of exciting things planned for next year's PCDC! Stay tuned for more, and if you have any feedback from this year's event, we'd love to hear it.
Showing posts with label Red Team. Show all posts
Sunday, May 31, 2015
Wednesday, April 15, 2015
South East Regional CCDC 2015 - Red Team
This time last week I wrapped up Red Teaming for the 2015 South East regional Collegiate Cyber Defense Competition (SECCDC). The SECCDC is special to me for a few reasons: it was my first exposure to the whole CCDC arena, many of my close friends form the Red Team, and last year's CCDC national champs are from this region.
This year's scenario was similar to last year's. The Blue Teams were responsible for maintaining the operational status of the HAL business network while completing a series of business-related injects. The network layout changed a bit from last year, however. This year all the Blue Teams had a few machines that were public facing, plus a group of privately networked workstations. The public-facing images were comprised of a pfSense software firewall, 2 SuSE Linux boxes, and a Windows 2012 R2 server. The SuSE boxes hosted backup DNS, the MySQL database, and the e-commerce web server, while the Windows box primarily performed the normal functions of domain controller and primary DNS.
After scanning the networks, we quickly determined that the Blue Teams were running all of their public-facing services off of an ESXi server. Additional investigation revealed that the ESXi servers were version 5.5.0 and vulnerable to Heartbleed. This vulnerability became our primary attack vector: by leveraging Heartbleed, we could force the ESXi servers to leak login credentials in clear text whenever the Blue Teams logged in. After gaining root access to the ESXi servers, my goal was to reach the domain controller, which is a little tricky when you want to go unnoticed. We were able to jump on a couple of domain controllers where Blue Teams had logged in but left the console open and unattended. The Linux boxes were much easier to compromise: either they were logged in as root, so we could change the password, add users, start SSH, lock the console, and log in remotely, or we rebooted the machine into single-user mode, changed the root password, and rebooted. Finally, by leveraging a combination of default credentials and unattended console sessions, we used the pfSense firewalls to lock the Blue Teams out of their own networks by turning off the internal interface while still allowing ourselves in via the external interface.
Throughout the course of the competition, the Blue Teams slowly started to kick us out of their networks, which forced us to get more creative with our access methods. The first area we looked at was the WordPress site running on the e-commerce server. Although we couldn't take advantage of any vulnerable plugins, we could take advantage of the fact that the WordPress configuration let anyone register an account. Registering an account in and of itself isn't exciting, but what is exciting is the fact that WordPress emails you your password. This is important because we had read/write access to the MySQL database backing WordPress. In the database we could clearly see the administrator's hashed password, and once we created our own user, we could also see our hashed password. The next step was just receiving the password generated by WordPress. With the Blue Teams' network configuration, the WordPress instance couldn't email out to a public address, but it could pass the email along to a local Python SMTP server that we controlled. During account creation, we simply specified our email address as 'user@<ipaddress>' rather than providing a public domain name. This worked like a charm. Now that we had a clear-text password and a corresponding WordPress password hash, we leveraged our read/write access to the MySQL database to overwrite the administrator's password hash, letting us log into the WordPress administrator's account with a password that we knew.
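A local SMTP "sink" like the one described above only needs to speak enough of the protocol to accept a message and capture its body. The sketch below is an illustration of the idea, not the actual server we ran at the event — it answers every command with a success code, records everything between `DATA` and the terminating dot, and stores the captured message (e.g. the WordPress-generated password) in a list:

```python
import smtplib
import socket
import threading

def smtp_sink(sock, mailbox):
    """Accept one SMTP delivery and append the message body to mailbox."""
    conn, _ = sock.accept()
    f = conn.makefile("rb")
    conn.sendall(b"220 sink ready\r\n")
    data_mode, body = False, []
    for raw in f:
        line = raw.decode(errors="replace").rstrip("\r\n")
        if data_mode:
            if line == ".":                      # end of DATA section
                data_mode = False
                conn.sendall(b"250 ok\r\n")
            else:
                body.append(line)
        elif line.upper().startswith("DATA"):
            data_mode = True
            conn.sendall(b"354 end with .\r\n")
        elif line.upper().startswith("QUIT"):
            conn.sendall(b"221 bye\r\n")
            break
        else:                                    # HELO/EHLO/MAIL/RCPT/...
            conn.sendall(b"250 ok\r\n")
    conn.close()
    mailbox.append("\n".join(body))

# Demo: deliver a message to the sink over loopback
sock = socket.socket()
sock.bind(("127.0.0.1", 0))
sock.listen(1)
port = sock.getsockname()[1]
mailbox = []
t = threading.Thread(target=smtp_sink, args=(sock, mailbox))
t.start()

client = smtplib.SMTP("127.0.0.1", port)
client.sendmail("wordpress@blueteam", ["user@127.0.0.1"],
                "Subject: Your password\r\n\r\nYour new password is s3cret\r\n")
client.quit()
t.join()
sock.close()
```

Because SMTP will happily deliver to `user@<ipaddress>` when no MX record is involved, pointing the registration email at a box running something like this is all it takes to receive the clear-text password.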
Our other attack vector was actually found by digging through the pfSense source code. For a few of the teams we still had valid credentials for the firewalls, but the web administration interface had been shut off. It turns out that pfSense exposes an XML-RPC interface. As I said, we had the username and password, but we couldn't turn the web console back on, and not all the routers had SSH enabled. So we created our own shell: using the XML-RPC interface and a little PHP voodoo, we pulled down a PHP shell and created our own web console. In most of the Red Team's opinion, this was one of our coolest finds of the event.
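For readers curious what driving that interface could look like, here is a sketch of building such a request with Python's standard library. Treat the method name `pfsense.exec_php` and the parameter layout (credentials as leading parameters) as assumptions recalled from the 2015-era pfSense code rather than a documented API, and the PHP payload here is just a harmless marker:

```python
import xmlrpc.client

# Hypothetical PHP payload; a real attack would fetch and write a shell
php = 'echo "owned-by-red-team";'

def build_call(username, password, php_code):
    # Serialize an XML-RPC methodCall body; pfSense-style XML-RPC methods
    # (assumption) took the credentials as the leading parameters
    return xmlrpc.client.dumps((username, password, php_code),
                               methodname="pfsense.exec_php")

payload = build_call("admin", "pfsense", php)

# At the event, a body like this would be POSTed to the firewall's
# /xmlrpc.php endpoint over HTTPS; here we only build and inspect it.
```

The point is less the specific method than the lesson: an administration surface you can't see in the web UI can still be reachable, so disabling the console alone doesn't close the door.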
One of the largest differences I noticed this year was how a couple of Blue Teams were able to almost completely block out the Red Team. The teams that were quick to correctly configure their routers and the whitelists on their ESXi servers removed the largest holes in their networks, and their service scores showed it. As a Red Team we really took advantage of Heartbleed and default pfSense credentials; without those footholds, we weren't able to do much. Smaller attack surfaces seemed to be a trend for a few of the CCDC regional events this year. My previous blog post talked about how at the Pacific Rim regional, the Red Team really only had 2 targets, and the vulnerabilities were default credentials. That was definitely not the case for the South East, but the Red Team definitely noticed a lack of attack surface. I've been talking to a lot of people about these observations, and we all agree that we want modern systems and network configurations — but how do you open up the attack surface without making it unrealistic?
All in all, I had an absolute blast at SECCDC. I'm already looking forward to next year. I know all the organizers of the SECCDC work incredibly hard to put on this event every year. Their efforts have been noticed, and I thank them for all the time and effort they put forth to make this event a reality. And congratulations to UCF for winning a second year in a row! Good luck at Nationals, and keep the championship in the South East!!
Tuesday, March 31, 2015
Pacific Rim Regional CCDC 2015 - Red Team
A week and a half ago I got to participate in the 2015 Pacific Rim (Pac-Rim) Collegiate Cyber Defense Competition as a member of the Red Team. My more experienced friend Dan has already written a couple of posts about this season's CCDC events on his blog (lockboxx), including Pac-Rim. I want to use this post not to talk about what I did as a member of the Red Team, but about my developing opinions of CCDC and these types of "cyber defense" competitions.
To give context to this post, I'll give a brief description of Pac-Rim's scenario. The Blue Teams were cast as the IT/security department of the Centers for Disease Control (CDC) while the world experienced a zombie outbreak. While trying to manage their network, they also had to address the growing zombie scare and how it impacted their jobs at the CDC.
As far as network design, the Blue Teams were given an external/public-facing network and a couple of internal networks with varying security levels. From the Red Team's perspective, we saw 3 primary targets: a VyOS router, a Windows 2012 R2 Exchange server, and a Fedora 20 web server. Initial vulnerability scans turned up very little. These were pretty modern systems that weren't running a lot of services; not a lot to attack. The VyOS router did have port-forwarding rules set to proxy connections through to servers on the internal network, so we could see a couple of services beyond the router, such as MySQL. For the Red Team, the only way we actually got access to any of these machines was via default credentials.
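The initial recon was, of course, done with standard scanning tools, but the underlying check is simple enough to sketch: attempt to complete a TCP handshake on each port of interest and record which ones answer. This toy connect-scan (an illustration, not the tooling we used) is how those forwarded services behind the VyOS router would show up:

```python
import socket

def probe(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection."""
    open_ports = []
    for port in ports:
        try:
            # A completed handshake means something is listening there
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return open_ports

# Demo against a listener we control on loopback
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]

found = probe("127.0.0.1", [open_port])
listener.close()
```

Against the Pac-Rim targets, a scan like this returning almost nothing is exactly the "not a lot to attack" situation described above.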
Leaving default credentials in place is such a silly thing for Blue Teams to miss, and it is such an easy vector for Red Teams to leverage. Default credentials are usually the very first thing Blue Teams change. That being said, once the default credentials have been changed, the Red Team has to be more cunning to find another way into the Blue Team's networks. Unfortunately, at the Pac-Rim event, it seemed that more than half of the teams separated themselves simply because they changed their passwords faster than the other teams. I want to focus on this point a little more.
When I say a Blue Team was quicker in changing their default passwords, I mean they were quicker by a minute, or even seconds. This year we split the Red Team into cells. Each cell was tasked with attacking a specific Blue Team for the duration of the event, and each cell was to stay in sync with all the other cells. This sounds like a great concept because it seems very fair, and I agree that it is fair in theory. The issue is the opening few minutes: this entire event was determined by the first five minutes. The Blue Teams that changed their passwords on the Win2012 Exchange server before their router did better than the teams that changed their router's password first, because the Exchange server could be used to pivot all around the internal domain; the router did not provide that kind of access. We even had a couple of teams that changed both the router and Exchange server passwords extremely quickly, and as a result the Red Team really couldn't do much to them. Now, this isn't meant to be a Red Team sob story. I love when the students lock the Red Team out. It means they are learning, and that they are equipped with the skills to make our industry safer. That to me is an amazing thing. The issue I have is that changing 2 default passwords and locking the Red Team out for a day and a half is not a learning experience. Furthermore, since all the Red Team cells stayed in sync with each other, a Blue Team could get away with leaving a gaping hole in their systems as long as the cells weren't attacking that particular issue at the time.
The primary point of CCDC is to provide students with a unique learning experience they will never get in the classroom. At the end of the competition, I got to sit down with the Blue Team I had attacked all weekend. They were full of questions and eager to learn from their experience. The problem was, I couldn't answer all of them. Due to a scheduling issue, I had to work alone as a Red Team cell, so my focus was extremely stretched. My Blue Team changed their Exchange server password right away, and I never got access to it; I only got access to their router and MySQL database. Since the Blue Teams' networks were made up mostly of Windows systems, I was asked a lot of questions about how well they had configured their workstations, domains, and other Windows-related services. Unfortunately, I had to tell them that because they changed 1 default password, I wasn't able to give them an accurate perspective. They could have had a terribly configured domain; I wouldn't know. And this was the case for a lot of Blue Teams. They simply didn't get all the feedback the Red Team could have provided had there been more access.
I go back to the point that CCDC is supposed to be a learning experience. It's hard to find the balance between giving the students unrealistically insecure systems that the Red Team can stomp all over, and modern, secure systems where the Red Team still has decent access. I also want to emphasize that the point of these competitions is to focus on cyber security. One of my Blue Team's members told me they spent the entire first day (more than 8 hours) dealing with customer phone calls from the Orange Team. What?! Customer service has very little to do with developing cyber security skills. I understand that the Orange Team is there to act as real-world customers, but this is a competition: Blue Teams are obviously going to put their best 'people person' on the phone. That person probably won't learn anything new about dealing with people, and they'll miss out on the actual technical education.
In the end, if my Blue Team was able to learn 1 thing, then in my opinion the experience was worth it. I love doing these types of events as a way to connect with students and provide guidance the way it was once provided to me. Next week I'll be participating in the South East CCDC regional and my company's own cyber defense competition (PCDC), so I should have a lot more to report on.