March 21, 2014
Heather Bearfield, National Technology Assurance Services Practice Group Leader, Featured in The Metropolitan Corporate Counsel Article, "Pay Attention: While Malicious External Cybersecurity Threats Abound, Many Others Are Hidden In Plain Sight."
The Editor interviews Heather Bearfield, Principal in Marcum LLP’s Boston office and National Technology Assurance Services Practice Group Leader.
Editor: Please tell us about your professional background.
Bearfield: I started my career in the IT group at a Big 4 firm, where I focused mainly on financial services firms with respect to compliance and internal controls. I then transitioned to Bank of America right around the time it acquired Fleet, so there was a strong focus on security controls, and I was exposed to the security world to a much greater extent. From there, my boss and I moved to UHY (a predecessor of Marcum) and started the technology practice. This occurred during the Sarbanes-Oxley era, so we did a lot of internal and external controls reviews.
When the economy took a turn for the worse, we found that many people, fearful of losing their jobs, were almost holding data hostage from their companies. That really kickstarted what today is Marcum’s Technology Assurance and Advisory Services Practice, where we make it our mission to ensure that our clients’ access controls are locked down, that their data is very much secured and that only authorized individuals are able to see and/or manipulate that data. We then put strong policies and procedures, such as acknowledgment and confidentiality agreements, into place. Currently, I oversee our national practice, with experts in almost all of Marcum’s 23 offices in the U.S., China and the Caymans.
Editor: What are some best practices around risk assessment?
Bearfield: Organizations should assess their threat level on an external and internal basis, then classify and quantify that risk, taking into account the organizational culture as well as the compensating controls the organization has in place.
The main issue we see is that people don’t take their entire risk universe into consideration. They’re too close to the process, so they need to take a step back and have an independent third party come in with a fresh perspective to identify not only external threats but also, very importantly, internal threats. Identifying all relevant risks is imperative. It’s critical to analyze remote access connections, the effectiveness of monitoring controls (or lack thereof) for cybersecurity threats, physical controls around the removal of data, segregation of duties, etc. For example, you may have an employee who’s been with you for 30 years, and he is the only person who knows how to perform his job function. If there is no backup, and that individual doesn’t have his policies and procedures written down, that’s a huge risk to the organization. What would happen if he didn’t come in one day? Your developer might be your sole IT person, and she has the keys to the kingdom because she can control access to the data and every aspect of the network domain. We’ve also seen instances where a trusted person was actually stealing from the organization: one employee developed a wire transfer application that diverted a fraction of every transaction to a different bank account.
Editor: How can these internal threats be identified ahead of time?
Bearfield: Marcum’s professionals conduct a variety of internal threat detection tests. One service we offer that has really blossomed is social engineering testing. In one test, we send a simulated phishing attack, such as a happy birthday email with a link, and see how many people click through. We assess the results with the organization, and they’ll use the test as a training exercise for their employees. I always tell people, if you don’t know the person who sent the email, don’t touch the link without verifying the sender!
Another test involves sending a request to employees to reset their passwords. Invariably, the IT department reports that instead of questioning the request first, people will try to execute the process multiple times. Each step of the process – i.e., opening email, clicking the link and entering information onto the landing page – is recorded.
We also perform physical social engineering testing in which we simply observe employee behavior. For example, in one test, we tell people around the client’s office that we’re from the IT group and ask them if we can jump on their computer. Nine times out of 10, they don’t even question us. They say, “sure, do what you need to do,” then leave to get a cup of coffee. Social engineering risk can be especially evident in a hospital setting, where an employee might want to make a visitor feel at home and let him/her into restricted areas by holding the door – just to be considerate. Hospitals should also ensure that their internal wireless network is secure and kept separate from the guest wireless network. We see too many physicians bringing in their own wireless routers to bypass stronger security measures. Think of the huge threat this creates.
Other human-error vulnerabilities include simply taking information out of the office on a USB drive and having it slip out of your pocket on the train, or emailing an unencrypted file to your personal email so that you can work on it at home. It’s likely the security at your home isn’t up to the same standard as the organization’s! All such instances of unprotected data are vulnerabilities to the organization.