
Cybervisualization Post Mortem

A postmortem of our project visualizing and displaying cybersecurity competitions for general audiences. Authored by: Kevin Laporte, Justin Neft, and Robley Evans.

Justin Neft, Blogger

April 8, 2021

16 Min Read

Project Overview

    Over the course of 8 weeks we developed a data-visualization tool for cybersecurity competitions. The intent was to describe and present the events of a cybersecurity competition to an uninformed audience. The tool is meant to be deployed with a casting team over streaming services like YouTube or Twitch. Casters can update the data and modify the view being broadcast live to help explain and clarify the events of the competition. We initially built our tool for the Collegiate Penetration Testing Competition (CPTC). Our final product was built for, and broadcast during, the Northeast Collegiate Cyber Defense Competition (NECCDC).

Project Details

  • Original timeline was 15 weeks.

  • Actual timeline was 8 weeks.

  • No budget.

  • Developed in Unity version 2020.2.4f1.

  • Streamed using Twitch.

Developed by a team of three:

  • Robley Evans (Part time): Data pulling, AI for selecting what information to display, data formatting, programming

  • Justin Neft (Full time): Infrastructure-building, attacks, node state reading, programming, design

  • Kevin Laporte (Full time): Injects, videos, time simulation, random team generator, programming, design

What Went Right

Development:

  • We had good momentum and were able to generate good ideas and results with little outside direction.

  • We made good design decisions early on and stuck with them.

  • We were able to shift course easily when a major change in direction occurred.

  • We were able to make quick decisions about what we needed to do next.

  • We fixed bugs as they were discovered.

  • Three people made it happen.

  • The project was scoped well; no main features were cut, but some stretch goals were.

Product Launch/Broadcast:

  • Our tool was successful in reading in live data, interpreting it, and displaying it.

  • The team was able to quickly adapt (tool and production) on the fly to keep the show going.

  • We had an average viewership of 40 people throughout a 9.5-hour livestream.

Throughout this project we found that several things went in our favor during both development and the broadcast. These helped us complete the project on time and put on an engaging show for NECCDC. The development process lasted roughly 8 weeks, running from the beginning of the semester in January 2021 to the start of the 2021 NECCDC. The broadcasting portion took place during NECCDC itself. The first three weeks of development were spent building for CPTC before we switched to NECCDC.

    Within the first few days we were able to determine a short-term goal and work towards it. This helped us produce results quickly. There were a few dips in productivity when we switched competitions, but we were able to quickly bounce back. Because of this momentum, we were always able to keep busy working on improvements, bug fixes, or new features for the visualizer. This was especially helpful during the later stages of development when waiting on things from other teams became commonplace. To help keep the momentum, we made sure to fix bugs when they were discovered rather than waiting to solve them later. This ensured that major bugs didn’t slow down development and minor bugs didn’t have time to become bigger problems.

    Our agility was due in part to the resources provided by previous teams as well as the size of our team. Previous teams had produced a lot of reference and proof-of-concept material, which made it easy to get up to speed with the project and its goals. We started with two full-time students (Justin Neft and Kevin Laporte), which helped us quickly get the whole team on the same page. Once we transitioned to designing for NECCDC, a part-time student (Robley Evans) who had worked on the project previously also joined the team. Being able to integrate knowledge directly from previous teams was immensely helpful in making good design decisions early. Our small team was very agile, able to reassess our course quickly and make changes faster than a larger team could have.

    The transition from CPTC to NECCDC also went fairly smoothly. CPTC and NECCDC require similar forms of visualization, such as networks and systems. Thus, many features we had already developed could be ported or reused. This prevented a complete loss of three weeks of development time. With a definitive deadline, we were also able to plan out a timeline. We first broke down what features we had and what features we needed. We estimated how much time each feature would take, then doubled it. This doubled timeline was surprisingly accurate and allowed us to get all the core features implemented by the deadline. Stretch goals were the only elements explicitly cut during development.

    The good aspects didn’t stop at just the development cycle. Despite the broadcast being a very spontaneous production, many things went in our favor. The first major win during the broadcast was that our visualizer successfully completed its main task: reading in real, live data from the competition and displaying it. This was something that previous teams were unable to test since they did not have access to real competition data. The competition proved that the concepts both previous teams and our own team had worked on were viable.

    As with every product launch, there were plenty of issues. Without a proper dry run, many of those issues were discovered live. Fortunately, our team had the skills necessary to quickly resolve these issues on the fly with minimal downtime. We were quick to go live and inform the audience when major problems arose. We made sure to stay open about the issues and about the fact that this was a very new and experimental product. Our team also made great use of the downtime during interviews to fix major bugs while the visualizer was off screen.

    Many of the casted segments and interviews were planned mere hours beforehand. Despite the short notice, they were well received and helped bolster the livestream’s educational value. Our team had the benefit of being close friends, and we were able to talk naturally with each other during casted segments despite having little broadcast experience. Our stream consistently held an average of 40 viewers, with numbers rising upwards of 100 during interviews and casted segments. We believe these numbers demonstrate the potential of turning a cybersecurity competition into an interesting live stream.

What Went Wrong

Development:

  • Lack of a clear, defined vision and direction for the project.

  • Very little project management.

  • It was a big task for three people, especially with one working part time.

  • We didn’t get real data until the day before the competition. We had to make lots of assumptions until then.

  • GitHub caused some problems due to our lack of experience with the tool.

  • Shifting competitions caused some development time to be lost.

  • We had very few coding standards.

  • We didn’t do nearly as much research as we should have.

Product Launch/Production:

  • There was no full broadcast dry run.

  • We had to cut a lot of features (injects and inject videos, attack visualization).

  • We had to change how data was read in mere hours before the competition went live.

  • Our main role was software development, which left us little time and experience to prepare as a broadcasting team.

  • We tried to schedule the project to avoid crunch time, but it still ended up occurring.

    This was one of the first major products that our team members had shipped. As such, we encountered many issues that we did not initially anticipate.

    One of our most prominent issues was finding a clear direction and vision for the project. This is something inherent to working in an emerging field. When a topic is so new, it’s hard to find inspiration or reference material. This sometimes made it very difficult to decide what our visualizer needed to be and what features it needed to have. Properly researching cybersecurity concepts and the competitions in question would also have helped considerably here.

    Having a hard-to-define direction made it difficult for our small team to maintain clear project management. Although we were agile and able to make quick decisions, we found a bit too much safety in that methodology. There were often moments where we would ask, “what do we do next?” Having a designated project manager would have helped a lot. Unfortunately, with a team of two full-time developers and one part-time developer, we didn’t see a need for one at first. We were able to plan out what was needed for the visualizer once we had a deadline.

    Part of our development and planning was stunted by a lack of real data. We weren’t able to get real, properly formatted data from the scoring system until a day before the competition, which is also when we first learned how that data was formatted. This meant that a lot of development time was spent building systems based on assumptions. For example, we assumed each node in the visualizer corresponded to an IP address. In reality, the scoring system passed us service check names rather than IP addresses. Having to quickly rework the code to handle the real data contributed to the crunch time as well as a loss in visualized data.
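
    To illustrate the kind of assumption we had to unwind, here is a minimal sketch of remapping scoring records to visualizer nodes once the feed turned out to identify hosts by service check name rather than IP address. The C# below uses hypothetical check names and a hand-built table; it is a sketch of the approach, not the actual visualizer code.

```csharp
using System.Collections.Generic;

// Hypothetical lookup: we originally keyed visualizer nodes by IP address,
// but the real scoring feed identified hosts by service check name instead.
public class NodeLookup
{
    // Example mapping from service check name to the node label we render.
    // A table like this had to be assembled by hand shortly before going live.
    private readonly Dictionary<string, string> serviceCheckToNode =
        new Dictionary<string, string>
        {
            { "team01-http", "Web Server"  },
            { "team01-smtp", "Mail Server" },
            { "team01-dns",  "DNS Server"  },
        };

    // Resolve a scoring record to a node label, falling back to "Unknown"
    // so an unexpected check name does not break the display mid-stream.
    public string ResolveNode(string serviceCheckName)
    {
        return serviceCheckToNode.TryGetValue(serviceCheckName, out var node)
            ? node
            : "Unknown";
    }
}
```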

    Early on in the project, we decided to use tools that we knew rather than learn new ones. This allowed us to hit the ground running and produce results as early as week two. One of the tools we used was GitHub. Unfortunately, we did not know Git and GitHub as well as we thought. A .gitignore file is an extremely useful way to keep files from being tracked. However, we learned far too late that it is a preventative measure, not an active protection: it stops untracked files from being added, but does nothing about files that are already being tracked. After only a week of the .gitignore file sitting in the wrong folder, Git had started tracking thousands of junk Unity files, which in turn created hundreds of merge conflicts. Eventually we realized the issue and had to spend an entire day learning Git Bash to solve it.
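
    For anyone who ends up in the same spot, the cleanup boils down to something like the following. This is a sketch of the general Git procedure (using the typical Unity-generated folder names), not a record of our exact commands: move the .gitignore so it actually covers the generated folders (usually the repository root), untrack the junk that was already committed, and commit the result.

```
# Remove already-committed Unity junk from the index only; --cached keeps
# the files on disk so the editor still works, Git just stops tracking them.
git rm -r --cached Library/ Temp/ Obj/ Logs/
git commit -m "Stop tracking Unity-generated files"
```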

    Some development time was lost early on due to a change in which competition we were building for. Originally the tool was being designed for CPTC. This competition mainly focuses on student teams attempting to attack a network and report their findings. A few weeks in, the project pivoted towards building for NECCDC. Although both competitions visualize similar information, NECCDC is focused on student teams defending a network. The way information is gathered differs between the two competitions, as does the way we needed to visualize it. This caused multiple already-developed features to be shelved since they were no longer relevant for NECCDC. Fortunately, the plan was to return to developing for CPTC after NECCDC had completed, so shelved features may be used in the future. In terms of building for NECCDC, however, those features were lost time.

    One major issue that became apparent late in the development cycle was the lack of a consistent coding standard across the team. Although we each have decent habits for commenting and writing clear code, those habits are distinctly different. This made it difficult to quickly understand another member’s code when they weren’t around, and it caused a few issues in the final days before the competition. Having a consistent standard we all followed, or regularly reviewing code with each other, could have helped alleviate these issues.

    We didn’t spend nearly as much time as we should have researching cybersecurity topics, the competitions we were building for, or even how to commentate a live stream. Having a better understanding of cybersecurity would have helped us work more closely with the Cyber Security Capstone team we’ve been working with. Knowing the competition we were building for would have helped us determine what we needed and how to get it early on. With our visualizer being in its early stages, it was only able to show whether a service was up or down. We didn’t learn until the competition itself that teams were actually being locked out of their systems, not having their systems turned off. As for the live stream, we learned a lot about what a commentator would need from our visualizer by being the commentators ourselves. It was less than ideal to learn what we needed from the tool at the moment we needed it rather than beforehand.

    In terms of the broadcast, the first major issue was the lack of a comprehensive test run. This was due in part to not having real data until the competition. We had done plenty of feature testing throughout development, but small feature tests don’t always replicate practical applications. Multiple problems arose during the live show that could have been caught by a full test run, such as videos playing at the wrong times, data being pulled from the wrong date, and many audio issues. We also decided not to visualize attacks with the features we had built because we lacked visualization content for them. Due to some of these issues, multiple features were cut from the visualizer during the live stream. This was not ideal, but we were able to keep the show running despite these hiccups.

    Crunch time was also a very real part of the project. Although we had planned out a lot of requirements beforehand and even doubled our expected timeline, we had not left room for error or for unexpected features. One of those errors was not getting real data until shortly before the competition. Since the format of the data we were receiving was different from what we expected, we had to make quick decisions shortly before we went live. These decisions influenced which systems we were tracking in the competition, and thus what data was being visualized. The decisions were mostly uninformed and largely came down to “what sounds more important to you?” Another unfortunate side effect of crunch time was the shift from writing clean, efficient code to writing code that just worked.

    The final major issue was the lack of a full broadcasting team. Our team’s original job was to be software developers for the visualizer. As the competition loomed closer, we quickly filled the roles of commentators and broadcasting team as well. Although we are far from experienced broadcasters, we managed to fill both roles, maintaining and fixing the tool live while also producing an interesting and engaging live stream. However, had we been able to focus solely on development, we would have had more time to finalize the visualizer for its debut.

Risk Management

    Development was played fairly safe, with decent time management and little risk-taking. The production side, however, was completely unplanned. As such, we took many risks to try to make the broadcast good. Overall, these production risks paid off in a major way and really helped our tool shine. The production also gave us great insight into how to further develop our tool and what it really needs to be complete.

 

Mid-Project Changes

    Our first major change was having to pivot from developing our tool for CPTC to developing it for NECCDC. The pains of this change were mitigated mainly because the tools we had developed to that point were easily ported to the new competition format. This meant minimal code refactoring was required during the shift. The real work came from changing our plans and designing new interfaces. We handled this by planning out exactly what we would need to do for NECCDC. We then broke down what features we already had from CPTC and what we would still need. Once that was done, we estimated how long it would take to implement all of these features and doubled that time to arrive at a deliberately high estimate for completing the tool. That estimate was vital in determining what was necessary and what was a stretch goal.

    Our second major change came when we decided to handle production for the broadcast while also running our tool at the same time. This meant we had to quickly create content for the live stream while maintaining our tool and fixing any unforeseen issues that arose throughout the competition. Unfortunately, we did not have much time to prepare for this change. We had to move quickly to make the stream work while finding our footing throughout the production.

Conclusions

    In review, we think that our team really suffered from a lack of strong project management and direction. We didn’t have great ways to keep track of what tasks needed to get done, and we were quite informal with our processes for completing new tasks. Stronger project management and review processes would likely have allowed us to maintain a clear direction throughout development.

    A lot of this development cycle was spent figuring out what needed to get done, who we were developing for, and what exactly we were even making. While the lack of answers gave us a good deal of flexibility and autonomy, we think it led to inefficiencies and greater confusion throughout the entire process. However, we also learned a lot along the way, which will allow us to structure our future work better. These lessons also leave us better positioned to set up the project for future teams, giving them an easier time learning, improving, and modifying it in the future.

Plan of Action

    With this knowledge in mind, we’re using project management tools and methods to help keep the team on track and focused. We are also contacting experts in software engineering, cybersecurity, and project management to help guide us in setting up these workflows to be efficient and effective. Finally, we’re getting in contact with the CPTC staff now so everyone has more time to prepare the competition with integrating the tool in mind. This will also help establish persistent connections between CPTC and future co-op teams working on the project.
