The “someone else’s problem” problem
Douglas Adams’ Hitchhiker’s Guide to the Galaxy series introduces the notion of the “SEP field”. As Adams’ character Ford Prefect explains, an SEP is something our brain simply “doesn’t see”. Why? Because it is Somebody Else’s Problem.
We can apply this way of thinking to cyber security risk. If we all treat security as somebody else’s problem, we leave ourselves exposed. For example, clients who outsource their IT or development work may assume security is solely the IT firm’s or developer’s responsibility. In truth, the client remains accountable.
This is why it’s so important to have frank and constructive conversations about security. Clients shouldn’t try to pass risk on to others; instead, they should feel confident in understanding and mitigating it.
The shared responsibility model
Data security is a prime concern for any business, which must ensure that all stakeholders’ personal and company details are kept safe. When the General Data Protection Regulation (GDPR) came into force in 2018, it formalised two key roles: data controllers and data processors.
Clients are generally data controllers, so they must be aware of the responsibilities that come with this role: safeguarding user data and complying with data protection regulations. Just because you work with a specialist IT firm or web developers who may know the rules better than you do, that doesn’t make them responsible. If it’s your business data they are managing, then you are still responsible: you are the data controller.
Third-party providers are generally data processors, authorised by the controller to process data according to a set of instructions. These instructions take the form of a data processing agreement, which describes what the processor can and cannot do and what steps should be taken if something goes wrong. Remember, even with a data processing agreement in place, the overall responsibility for security does not change.
That said, your IT firm or external development team are not without responsibility. They have hands-on access to the code, know what its dependencies are, and can scan log files to help identify suspicious activity. However, you can’t assume this is done automatically or without cost. As a rule of thumb, if your developers haven’t mentioned it already, they probably aren’t doing it (or at least not in a formalised way), so it’s always worth checking.
Common threats that developers should be aware of
To reduce risk, your developers should be prepared for common attack vectors:
SQL injection: When weaknesses in input validation and database query handling allow attackers to run their own commands against your data (a short sketch follows this list).
Cross-site scripting: When weaknesses in website inputs allow attackers to inject malicious scripts into a trusted website in order to hijack user sessions or send users to a malicious copy of the site.
Insecure authentication: Where developers haven’t properly secured login and session handling, allowing attackers to bypass authentication and gain access to personal data.
Failure to patch or update: Vendors release updates to patch (fix) known problems in their code. If your software isn’t kept up to date with the latest patches, it could be vulnerable. Check end-of-life (EOL) dates; this is when security patches stop being released for that version of the software.
Supply chain attacks: If a third-party provider, such as a sub-contracted developer, becomes compromised, this can give attackers a route into client systems.
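To make the first threat above concrete, here is a minimal sketch in Python, using an in-memory SQLite database with a hypothetical users table, of how a query built by pasting user input into SQL can be subverted, and how a parameterised query avoids it. The table, columns and input are illustrative only, not taken from any real system.

    import sqlite3

    # Illustrative in-memory database with a hypothetical "users" table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (email TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice@example.com', 'admin')")
    conn.execute("INSERT INTO users VALUES ('bob@example.com', 'staff')")

    # Attacker-controlled input: a crafted string instead of an email address.
    user_input = "' OR '1'='1"

    # Vulnerable: the input is concatenated straight into the SQL statement,
    # so the OR '1'='1' clause matches every row and leaks the whole table.
    vulnerable = "SELECT email, role FROM users WHERE email = '" + user_input + "'"
    print("Concatenated query returns:", conn.execute(vulnerable).fetchall())

    # Safer: a parameterised query treats the input as data, not as SQL,
    # so the same malicious string simply matches no rows.
    safe = "SELECT email, role FROM users WHERE email = ?"
    print("Parameterised query returns:", conn.execute(safe, (user_input,)).fetchall())

Asking whether data-access code uses parameterised queries (or an ORM that applies them by default) is a simple, concrete way for a client to open this conversation with their developers.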
Demonstrating security awareness
There are two key facets to demonstrating security awareness. First, you can ask your teams how they have handled risks in previous projects. Second, you can invest in security certifications like Cyber Essentials and Cyber Essentials Plus, which raise your overall security awareness and demonstrate to your clients that you follow best practice.
Equally, clients should check whether their developers hold these certifications. Cyber Essentials is a self-assessed certification, while Cyber Essentials Plus is independently audited, so it offers added assurance that the controls are in place. If a developer has neither, it could be a red flag.
If you’re not sure about a developer, you can verify their certificate status online. Cyber Essentials is not totally failsafe, but it provides assurance that minimum standards, such as firewalls, user access controls and malware protection, are in place. If your IT firm or external development team doesn’t take this seriously, why would you trust them with your data?
The takeaways
Rather than assuming security is “somebody else’s problem”, clients need to have open conversations with their developers. It is not enough to simply ask if something is secure. To understand the vulnerabilities specific to their systems, clients should ask the key questions below.
Key questions to ask developers:
What are the top security risks for my specific website or application?
Do you have any security certifications like Cyber Essentials or Cyber Essentials Plus?
How can I get an overview of my website infrastructure and code?
What dependencies does my website or application have, and when do they reach their end-of-life (EOL) dates?
Do we have a data processing agreement in place?
What is the plan for responding to a security incident or data breach?
How do you mitigate risks from your own suppliers or supply chain?
As the data controller, the client is responsible for discussing security risks with their developers. Understanding risks is the best way to mitigate them. Rather than taking a blame-led approach, we need to be proactive and look forward.
As the Scouts say, always be prepared.