Sunday, May 31, 2015

Palmetto Cyber Defense Competition - 2015

Last month my fellow employees and I wrapped up our third annual Palmetto Cyber Defense Competition (PCDC). Inspired by the Collegiate Cyber Defense Competitions, PCDC is a computer security competition for high schools and colleges in South Carolina. I've written about the event itself in the past (PCDC 2014), so I won't go into detail about what the event actually is; those details can be found at the official PCDC site. Instead, I want to focus on a few of the things that make PCDC unique, lessons learned from putting on a computer security competition, and where we are going in the future. For those of you looking for a detailed Red Team write-up, that can be found here. After the competition, I took the time to create an in-depth analysis of the Red Team's process for those looking to learn from their mistakes at this year's competition. The rest of this blog post is a few of my thoughts from an organizer's perspective, not from the perspective of the Red Team lead.

First off, PCDC is unique in that for the first two years, both high schools and colleges competed. Each group had its own dedicated competition day, which meant that PCDC was actually run twice: after the first day, all the machines had to be reset, re-imaged, and configured slightly differently for the next day. This year we added a third day: a professional day. For the first time, PCDC would be run three times. The schedule of events was as follows: the first day was high school, the second day was college, and the third day was for professionals. For the professional day, we had groups representing a mixture of government and private industry. On the government side, teams included members from the 24th Air Force and U.S. Cyber Command, while the industry teams had members from Scientific Research Corporation (SRC) and SPARC. All in all, we had four government teams and four teams from industry.

Our goal from the beginning was to design the competition network to be believable and realistic. Since we did not know during the planning and design phase that there would be a professional day, this year's theme was based around an online game development company. Each Blue Team was responsible for making sure its development company continued to function through the course of the day and delivered the latest version of its game to its user base through audience-accessible tablets connected to each Blue Team via a dedicated wireless connection. One of the things we quickly realized was that our ambitions greatly eclipsed the amount of time we had available to create the network. Remember, a decent portion of our time goes into infrastructure development so that the competition can be rerun the next day. To add to that pressure, we do not control the facility in which the competition is hosted. As a result, our preparation time from the end of one day to the beginning of the next is usually around 3 hours.

To put it simply, there were a lot of different features we wanted to include in this year's competition, but we ran out of time. One of the biggest things we feel these types of competitions lack when trying to simulate real-world networks is realistic user activity. This year we attempted to remedy that by developing simulated users. We got all the code developed and tested for the user simulators, but due to a hardware failure, we were unable to deploy them to the competition network this year. In the interest of education and sharing, I have open sourced the code for the user simulators on GitHub. We'd really like to hear from anyone who is doing something similar.
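To give a flavor of the approach, here's a minimal sketch of what a simulated-user loop can look like. This is illustrative only, with hypothetical hosts and file names throughout; the real simulators are the code on GitHub.

```python
import random
import time
import urllib.request

ACTIONS = []

def action(fn):
    """Register a function as one of the simulated user's possible activities."""
    ACTIONS.append(fn)
    return fn

@action
def browse_intranet():
    """Fetch an internal web page the way an employee checking the news would."""
    url = "http://intranet.example.local/news"  # hypothetical internal host
    try:
        urllib.request.urlopen(url, timeout=5).read()
    except OSError:
        pass  # a dead service is the Blue Team's problem to notice, not ours

@action
def edit_document():
    """Append to a scratch file to generate workstation disk activity."""
    with open("quarterly_report.txt", "a") as f:
        f.write("TODO: finish before the next status meeting\n")

def main():
    while True:
        random.choice(ACTIONS)()             # pick one human-like action
        time.sleep(random.uniform(30, 300))  # humans act at irregular intervals

if __name__ == "__main__":
    main()
```

The important design point is that actions fire on a randomized, human-scale schedule rather than a fixed interval, so the activity doesn't look machine-generated to the Blue Teams watching their logs.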

A few of us have been involved in multiple CCDCs and PCDCs, and every year we make the comment that the scoring system needs to be altered. Although this was in the works before this year's competition even took place, we haven't had time to finalize what fixing the scoring actually means. At this point, I think we have a much better idea of how we are going to fix it. To highlight why scoring is such an issue, I want to talk about how winning Blue Teams typically approach this competition. Within the first few minutes, the strategy includes removing all unnecessary accounts, changing all the default passwords, and, for some, unplugging their entire network while they continue to harden. From a strategic perspective, with the goal of winning a game in mind, I can't argue with this approach. The issue I do have, however, is that it leaves the networks in a pretty unrealistic state.

Each Blue Team is given an information packet at the beginning of the competition. That packet includes the names of the accounts the automated scoring engine will use to log into their systems and perform service checks to make sure the Blue Teams still have their services up and running. Once these account names have been identified, the Blue Teams will delete every other account off the workstation or server. This means you could have 3 or 4 domain-joined Windows workstations with no user accounts other than the scoring engine account. It is important to note that the Red Team is not allowed to leverage the scoring engine account to gain access to the Blue Teams' networks, so this strategy is safe. It's also not realistic. A computer security competition should force the students and competitors to perform real security tasks in the presence of real users. Since this is a competition and recruiting that many unbiased volunteer users is unrealistic, we need simulated users.

Another area where scoring needs improvement is the way the scoring engine actually evaluates successful checks. Up to this point, a common service to check is a functioning MySQL database. Typically the scoring engine will log into the MySQL database, make a query, and check for a specific key-value pair or for the presence of a specific table. This simply isn't good enough. For a real company, the database needs to have constant transactions generated by realistic activity. Right now, Blue Teams get away with making a backup of the database at the beginning of the day and just restoring it any time the Red Team deletes it. As long as the Blue Team restores the database between scoring engine rounds, the scoring engine gives that Blue Team a perfect service check score. This is slightly mitigated by the fact that the Red Team reports incidents to the Gold Team, and the Gold Team can decide to take away points, but these types of scenarios need to have a bigger impact on the 'day-to-day' operations of the Blue Teams' networks.
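To make the problem concrete, here's a hedged sketch contrasting the two kinds of checks. It assumes the pymysql driver and made-up hosts, credentials, and table names; the actual scoring engine's internals are different.

```python
import datetime
import pymysql  # assumed driver: pip install pymysql

def naive_check(conn):
    """Passes as long as a known static row exists. A backup restored
    from this morning will satisfy this check all day long."""
    with conn.cursor() as cur:
        cur.execute("SELECT 1 FROM players WHERE username = %s", ("scorebot",))
        return cur.fetchone() is not None

def activity_check(conn, window_minutes=15):
    """Passes only if new transactions landed within the last scoring
    window, so restoring a static backup no longer earns full points."""
    cutoff = datetime.datetime.utcnow() - datetime.timedelta(minutes=window_minutes)
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM purchases WHERE created_at > %s", (cutoff,))
        (recent,) = cur.fetchone()
        return recent > 0

# Hypothetical host, credentials, and schema for illustration only.
conn = pymysql.connect(host="10.0.1.20", user="scorebot",
                       password="changeme", database="gamestore")
print(naive_check(conn), activity_check(conn))
```

The second check only works if something, like the simulated users described above, is continuously generating transactions, which is exactly why the two ideas belong together.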

In the future, we plan to combine three facets of scoring. The first is financial: the scoring engine will no longer simply add points, but will score each Blue Team's company in a financial sense. The second facet is an internal employee and systems perspective: employees must be able to perform their job-related duties, and interdependent systems must be able to communicate with each other. The third facet comes from the Red Team. This year we tried something new by giving the Red Team specific targets/flags to capture once it gained access to the Blue Teams' networks. These included the credit card numbers in the customer database, the source code to each team's latest game, and a few other things.

Now, I know some people will argue that the CCDCs and even PCDC already take these things into consideration with the scoring engine, but we would argue they are not taken into account enough. The example we like to use is the one where the Blue Team unplugs their network. Sure, they aren't getting any points from the scoring engine, but in the real world, you can't just unplug your entire company from the network. Not only would you be losing sales, but you'd be paying your employees to do a job they can't accomplish. Not to mention, the security or IT department has no authority to make that type of decision.
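As a rough illustration of how a financial model punishes the unplug strategy, consider this sketch. Every rate and penalty here is an invented placeholder, and this is a direction we're exploring, not the final scoring design.

```python
# All numbers below are made up for illustration.
REVENUE_PER_MINUTE = 50.0   # sales earned while the storefront is reachable
PAYROLL_PER_MINUTE = 20.0   # employees are paid whether or not they can work
FLAG_PENALTIES = {"credit_cards": 25000.0, "game_source": 40000.0}

def company_balance(minutes_elapsed, minutes_store_up, flags_lost):
    """Score a Blue Team as a running company balance instead of points."""
    earned = REVENUE_PER_MINUTE * minutes_store_up
    payroll = PAYROLL_PER_MINUTE * minutes_elapsed  # accrues even when unplugged
    breaches = sum(FLAG_PENALTIES[f] for f in flags_lost)
    return earned - payroll - breaches

# A team that unplugs for an hour of an 8-hour day stops earning but
# keeps paying payroll, and a captured flag still hurts on top of that:
print(company_balance(minutes_elapsed=480, minutes_store_up=420,
                      flags_lost=["credit_cards"]))
```

Under a model like this, unplugging is no longer a free pass; the balance keeps draining the whole time the store is dark.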

We have thought long and hard about the scoring, and we think we have something new and exciting for next year. I don't want to give away too much here until things are more settled. Additionally, we want to find a way to help the audience understand what is going on. PCDC is free and open for the community to come in and view. This year we attempted to show what was taking place by visualizing the traffic between the Blue Teams and the Red Team in real time. I wrote the code for this and am also releasing it on GitHub. You can see a video demo of it on YouTube.
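The released code is the real implementation, but the core idea fits in a few lines: sniff the competition network and tag each packet with the teams it involves. This sketch assumes the scapy library and made-up subnet assignments.

```python
from scapy.all import IP, sniff  # assumed dependency: pip install scapy

# Hypothetical subnet plan: one /24 per Blue Team, one for the Red Team.
TEAM_SUBNETS = {f"10.10.{n}.": f"Blue Team {n}" for n in range(1, 9)}
RED_SUBNET = "10.10.100."

def label(addr):
    """Map an IP address to the team it belongs to, if any."""
    for prefix, name in TEAM_SUBNETS.items():
        if addr.startswith(prefix):
            return name
    return "Red Team" if addr.startswith(RED_SUBNET) else None

def handle(pkt):
    if IP not in pkt:
        return
    src, dst = label(pkt[IP].src), label(pkt[IP].dst)
    if src and dst and src != dst:
        # The real tool feeds these pairs to a live display;
        # printing stands in for that here.
        print(f"{src} -> {dst} (proto {pkt[IP].proto})")

# Requires a mirrored/span port and usually root privileges.
sniff(filter="ip", prn=handle, store=False)
```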

We have a lot of exciting things planned for next year's PCDC! Stay tuned for more, and if you have any feedback from this year's event, we'd love to hear it.
