by Angela Horneman
Network Intelligence Analyst
When I was pursuing my master’s degree in information security, two of the required classes were in cognitive psychology and human factors: one about how we think and learn and one about how we interact with our world. Students were often less interested in these courses and preferred to focus on more technical topics, but I personally found them to be two of the most beneficial. In the years since I took those classes, I’ve worked with people in many organizations whose job it is to think: security operations center (SOC) analysts, researchers, software developers, and decision makers. Many of these people are highly technical, intelligent, and creative. In my interactions with these groups, however, the discussion rarely turns to how to think about thinking. For people whose jobs entail pulling together and interpreting data to answer a question or solve a problem (i.e., analyze), ignoring human factors and how we and others perceive, think, and remember can lead to poor outcomes. In this blog post, I explore the importance of thinking like an analyst and introduce a framework to help guide SOC staff and other network analysts.
I recently came across Psychology of Intelligence Analysis by Richards J. Heuer, Jr., a book from the CIA’s Center for the Study of Intelligence. In reading it, I realized that analysts in cybersecurity today struggle with many of the same issues that the book addresses for intelligence analysts. I would love to quote the whole first chapter (read it online here), but I will settle for the following two excerpts:
What people perceive, how readily they perceive it, and how they process this information after receiving it are all strongly influenced by past experience, education, cultural values, role requirements, and organizational norms, as well as by the specifics of the information received.
Training is needed to (a) increase self-awareness concerning generic problems in how people perceive and make analytical judgments concerning foreign events, and (b) provide guidance and practice in overcoming these problems. Not enough training is focused in this direction–that is, inward toward the analyst’s own thought processes. Training of intelligence analysts generally means instruction in organizational procedures, methodological techniques, or substantive topics. More training time should be devoted to the mental act of thinking or analyzing. It is simply assumed, incorrectly, that analysts know how to analyze.
In this quote, “foreign” and “intelligence” could both easily be replaced with “cyber.” The best analysis requires more than subject matter expertise and data; it also requires the ability to identify what perceptions or biases cloud or enhance interpretation. The best analysts I have encountered exhibit a willingness to entertain unexpected ideas. These analysts look at an interpretation and ask, “If this is wrong, what else could it be?” In short, the best analysts do not rely solely on a formula, an algorithm, or a single technique. They understand the problem (or question, or hypothesis) and how to choose and use the relevant data and techniques to reach the appropriate resolution. If the techniques are limited to those that are computational or associative (those you use Excel for, write a script to accomplish, or that follow a playbook), analysts miss important events, researchers misinterpret their results, software developers assume that what is important to them is important to end users, and decision makers miss the essence of the needs they must fulfill.
The Problem(s) With Not Thinking Like an Analyst
In our work with government and industry, we often work with SOC staff and other analysts whose goal is to investigate an incident. Problems may arise, however, if, during the course of that investigation, they forget to think like an analyst.
One of the first problems we see is that analysts often have only a vague idea of what they want an investigation to accomplish. Consequently, they cannot make a plan that identifies where to start and how to reach useful results. Many analysts forget the importance of adequately defining what they are trying to do. For example, analysts often want to find beacons on a network: network traffic that a program sends somewhere periodically, usually at or near a set interval. Because many types of malware use beaconing, analysts assume that finding all beaconing would be a reliable indicator of malware. But they forget that much legitimate software also beacons: operating systems, email and chat clients, antivirus software, network time protocol (NTP) clients, and many other programs that regularly check for updates, fetch new content, or send “health” reports to their vendors. If the goal is to find malicious beacons, analysts must say so explicitly and decide what characteristics make a beacon malicious or suspicious.
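To make the beaconing example concrete, the sketch below shows one simple way beacon candidates might be surfaced from flow records: group traffic by source and destination, then flag pairs whose inter-arrival times are nearly constant. It is only an illustration of the detection idea, not a tool described in this post; the function name, thresholds, and record layout are all my own assumptions. Note that, as discussed above, anything this flags is only a beacon candidate; separating malicious beacons from the many legitimate ones still requires analysis and context.

```python
from statistics import mean, pstdev
from collections import defaultdict

def find_beacons(flows, min_events=5, max_jitter=0.1):
    """Flag (src, dst) pairs whose traffic recurs at a near-constant interval.

    flows: iterable of (timestamp_seconds, src, dst) tuples.
    A pair is a beacon candidate when the relative spread (coefficient of
    variation) of its inter-arrival times is at most max_jitter.
    Returns {(src, dst): estimated_interval_seconds}.
    """
    by_pair = defaultdict(list)
    for ts, src, dst in flows:
        by_pair[(src, dst)].append(ts)

    candidates = {}
    for pair, times in by_pair.items():
        if len(times) < min_events:
            continue  # too few observations to judge periodicity
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        if avg > 0 and pstdev(gaps) / avg <= max_jitter:
            candidates[pair] = avg
    return candidates

# Toy example: one host contacts a server roughly every 60 seconds
# (beacon-like), while another connects at irregular intervals.
flows = [(t, "10.0.0.5", "203.0.113.9") for t in (0, 60, 119, 180, 241, 300)]
flows += [(t, "10.0.0.8", "198.51.100.2") for t in (5, 12, 95, 97, 300, 301)]
print(find_beacons(flows))  # only the periodic pair is reported
```

Even this toy version makes the definitional problem visible: the choice of `max_jitter` and `min_events` is exactly the kind of "what makes a beacon suspicious?" decision that analysts must make explicitly before the results mean anything.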
Another problem we see is that analysts often get caught up in the process and focus solely on strict adherence to a checklist. When analysts are just following a checklist, they may forget to consider whether an event is something malicious that simply isn’t covered by the current checklist, or they may overlook legitimate reasons why the observed behaviors might occur.
Other problems crop up during incident analysis when investigators forget to take context into account. For example, to an external investigator with little or no context about an organization, almost any file movement, including one employee emailing a file to another, can look like exfiltration. Understanding employee roles, access rights, and employee dynamics (e.g., is an employee leaving or disgruntled?) provides the context needed to determine whether a file transfer is malicious or benign, or whether there is not enough information to tell.
A Framework for Thinking like an Analyst
I recently reached out to Jared Ettinger, a researcher with the SEI’s Emerging Technology Center (ETC) who teaches an intelligence analysis class, to ask which aspects of his work apply to other cyber analysis disciplines. He pointed me to the ETC-developed cyber intelligence framework, which I found to be very applicable to the skills needed to think like an analyst when analyzing a cyber incident. This conceptual framework includes the following six components:
- analytical acumen: facilitates timely, actionable, and accurate analysis of a cyber issue
- environmental context: provides scope for the analytical effort
- data gathering: acquires and aligns data for analysis
- microanalysis: assesses the functional implications of cyber issues, the what and how questions
- macroanalysis: assesses the strategic implications of cyber issues, the who and why questions
- reporting and feedback: offers courses of action to enhance decision making
This framework will be the basis of a workshop, How to Think Like an Analyst, that we are offering at the 2018 FloCon conference. The training will focus on teaching analysts to work through problems by understanding biases, context, and their environment, and by identifying what information they would need to answer their questions.
Learning how to think like a security analyst will have a profound impact on SOC staff and network analysts in government and industry. Beyond that realm, learning to think like an analyst could apply to almost any role or responsibility in today’s economy. Often, when organizational problems arise on teams, those problems can be traced back to management or security staff who aren’t thinking like an analyst. They aren’t thinking about how to think. How do they break a problem down to get to the point where they have all the pieces ready to reassemble? We will cover these topics in greater detail in a future blog post.
We welcome your feedback in the comments section below.
How to Think Like an Analyst will be part of the training presented at the 2018 FloCon conference.