This article was edited by SPRITE+ Research Associate Dmitry Dereshev, with responses and edits from Principal Lecturer in Systems Security Engineering Shamal Faily.
The spotlight today is on Shamal Faily – Principal Lecturer in Systems Security Engineering, co-coordinator of the Bournemouth University Cyber Security Research group (BUCSR), RISCS Fellow in Secure Development Practices, and a SPRITE+ Expert Fellow.
How would you describe your job to a 12-year-old?
I build tools like CAIRIS that software creators can use to make their apps safer. I try to design these tools so that they do not get in the way of the creation process. They can help create safer products for any context, whether for desktops, mobiles, indoors or outdoors.
Tell me more about that
Few understand how software is designed and developed in practice. Many perceive “a developer” as someone who does everything, from gathering requirements and designing, to testing, implementation and marketing. But unless you are a one-person start-up, this is unlikely to be the case.
In secure software development, app security is better when considered early. Getting product owners, designers, and developers to engage with security at an early stage without hurting their productivity is a particular research challenge that I look at.
When you download an app – can you know in advance whether that app is going to harm you? If you know something about technology and security, you might think about the services that app might use on your phone, the data it might get access to, or about threat models. With that knowledge you can make a reasonable judgement about the harm that app might cause, and whether to trust it or not. If you do not have any of that knowledge, what on earth are you going to do?
Dieter Gollmann described The Fundamental Dilemma of Computer Security: people who use apps and software have security requirements, but they don’t have security expertise. Users want the apps on their phone or computer to be safe and useful for what they want to do with them, but they do not have the expertise to understand what their own security needs are, the ability to convey these to others, or even the confidence to check whether these are being met. Security experts, on the other hand, have the technical know-how, but they lack an understanding of the problem space the users live in, or their contexts of use. This isn’t a new problem; Horst Rittel described it as a symmetry of ignorance. It makes designing for security a wicked problem, but recognising that this dilemma or symmetry exists is the first step towards tackling it.
Why are we talking about software security separately from software development?
There are several reasons for this. Historically, the first people who did research in computer security were the military. You can still find the Rainbow Series books describing security evaluation standards from back then. The military applications focused on preventing harm to military systems and defence infrastructure. Computers were more expensive than people: you would fit people around computers, and devise procedures to protect the machines. Nowadays technology is cheap, and it is people’s time that is more expensive, but the security philosophy has been slow to catch up with this idea.
The second problem is that we teach people about security as a stand-alone thing. This is reflected in computer science programs, and in industry as well. It is a chicken-and-egg situation: degree programs are built based on what the industry wants, but the folk in industry are those who have been through degree programs which emphasised that security is a stand-alone thing.
The idea that security must be embedded at the very early stages is something that people talk about, but few understand how to go about it. So long as this lack of understanding persists, not much is going to change. No one apart from academics really has the time to investigate how to build in security in a productive way. And even if we find these ways, we will have to convince people that by taking this new approach they are going to be able to do things quicker and still be secure.
Very few organisations have employees read an academic journal paper and say: “this is a very good idea and we are going to run with it”. The research is often hidden behind paywalls, so access is difficult for people outside academia. Besides, in many cases the academic work is not built upon. Academics hope that someone in practice will take it up and use it. Sadly, this does not happen as often as we would like.
Could you describe what you do during a typical workday?
I wake up early and start my day with a quiet activity like catching up with urgent overnight emails from students or collaborators. I also use quiet times in the morning to catch up on some writing, particularly if a deadline for a journal or conference submission is approaching. Since I maintain an open-source product, I often check whether any new bugs or feature requests have been raised (particularly for security) and investigate how much work would be involved in implementing them.
If I am not teaching, much of the rest of my day is spent in meetings. I meet with students to discuss ideas and answer questions related to courses. I also have chats and phone calls with industrial collaborators to exchange knowledge – they tell us about their experiences, and that gives us insight into problems we have not thought of. The discovery of research problems relies heavily on those working in professional practice; in many cases, they are closer to the symptoms of research problems than we will ever be.
That is my typical workday – lots of meetings and email. Interestingly, this has not changed much with COVID-19.
Could you describe a challenging project that you have recently worked on?
One of my tasks as a RISCS fellow in secure development practice is to gather evidence of effective practice. We want people to develop stuff in a secure way, but we also must have evidence that it works.
I have also been trying to understand what the UK’s current capabilities in secure development practice are. I have organised a workshop around that to bring the community together. It helps to know who is already doing research in this area. What I noticed is that there is not much research on this topic, and out of over 150 UK Higher Education Institutions, only 20-30 of them have the research capacity to put relevant content into their own cybersecurity degree programmes.
Planning a workshop may not appear to be a challenging activity. However, attracting the right participants, and covering the right ground with just a few hours over a virtual platform is non-trivial. The workshop not only needs to tackle the challenges of secure development practice head on, but at the end of it there should be outputs that can be used in more focused workshops.
There is some good work with respect to secure development practice in the Software Engineering and in Human-Computer Interaction (HCI) research communities, but this is not finding its way into practice. One reason is that it is not finding its way into education. This is one of the problems we will be looking at in our workshop. Education is important, but to get people to build things securely, researchers and practitioners need to be incentivised to transfer technology and know-how into the marketplace.
The cybersecurity market is worth several billion pounds. I am confident that if there were tools and services people could buy, this would help the take-up of secure design and development practices. This is important because, despite the complexity of software, the #1 tool for managing software security requirements is still Microsoft Word, or some combination of Word, Excel, and Visio. We need a fertile commercial ecosystem because, without it, how would you convince your manager to buy a ‘secure design and development’ product or service that nobody has used yet?
What training/experience did you have at the start of your career?
Before I went to university I was in the armed forces, which, if anything, taught me time management. I got my first degree in Business Computing at City University London, and I did an industrial placement as part of that. That placement gave me experience of eliciting and specifying software requirements. I think that is where my interest in requirements came from. I spent about 9-10 years working for Logica, and a lot of that was spent in the Space and Satellite Communications business in Germany as a software developer and technical architect.
Because of my interest in software engineering, I drifted into seeing how some of the problems that I was experiencing could be addressed. I started doing an MSc in Software Engineering at the University of Oxford, but then an opportunity to take up a DPhil studentship came up. I ended up leaving Logica to take what, at the time, was a career break that never actually ended!
How did you get into your current role?
While doing my DPhil, I took up a postdoc, and as that was coming to an end, I looked at the job market. I found a person I knew from my undergraduate days, who was a lecturer at Bournemouth University. They were advertising jobs, so we exchanged a few emails, and I ended up applying for, and getting, the job I am in now. There were several people there interested in usability and security – they had formed a cybersecurity unit trying to build more relationships with companies and run more consultancy work. The unit is not around anymore, but I thought that was a fertile environment for doing research in my area.
I was also required to do a postgraduate certificate in education practice at Bournemouth. In higher education there is a strong emphasis on lecturers having fellowship of the Higher Education Academy, and one way to get that is through a relevant postgraduate certificate. I think this was relevant training because it engaged me more closely with the pedagogy associated with cybersecurity. There was not a lot of relevant work back when I did the PGCert, but this is changing. I think a lot more people are focused on security education now than perhaps a few years ago.
What do you wish you had known when you started your career?
The importance of writing and communication. There is an adage that research is not research until it is published. Over the years I have found that to have research impact, we need to get people to use our stuff. That entails lots of writing, be it judiciously worded emails or grant proposals.
Writing is a large part of the academic career. I often tell students that academics are professional writers. I have written a published book. Academics write papers all the time; to get work accepted, it is not enough to have good research, it must be well-written too. If you look at the articles at top conferences, one thing they have in common is that they are all well-written. The value and the importance of writing is not something I appreciated when I started.
What would you recommend to people who want to follow in your footsteps?
It is not like in the good old days when you would get a job and stay in that job until you retire. These days we never really retire because, each year, the retirement age goes up! With that in mind, it is good to allow yourself the chance to do something different. If you can find something you are excited about – spend a few years doing that and see where it takes you. I was quite interested in how software was constructed and where the requirements came from. That drove a lot of where my career took me.
Thinking back to my first role at Logica – I think I got that job because the person who interviewed me knew I was interested in requirements, and I had some background in database design. Then there was a related team that wanted someone to get involved in data warehousing, and I thought that would be a great opportunity to get Unix experience. That helped me get a role on a financial services project, which in turn brushed up my C++ skills. That helped me get a job in the oil and gas industry, which subsequently took me to the space industry. It was always something that interested me linked to what I did, and what I wanted to do.
What troubles did you have progressing through your career?
The big challenge working in my area is that it is unlikely that you will get a paper accepted at top conferences by working in secure development practice. If you are looking for an academic job in universities, they value papers in those venues. The same applies to UKRI funding – it has been hard to obtain funds working in secure development practice. Fortunately, this is slowly starting to change.
The paradox is that secure development practice is an area of strong interest in industry. Although it has been hard to convince people of the merits of the research, finding industrial collaborators is comparatively easy, because we work in an area where people in practice see the value, even though it is hard for them to implement it.
Things are beginning to change in the academic job market too. When I was finishing my DPhil, we had the issue of “too many PhD graduates, and not enough jobs”. Now in cybersecurity it is the reverse: people doing research in cybersecurity move into industry, or policy, or into other areas. I look at people doing research like mine around the world, and there are very few who are still in higher education – most finished their PhDs, built some interesting tools, but moved on without anyone else taking up their work.
The body of knowledge in secure development practice is spread out across disciplines, and the journal and conference venues are less interested in publishing in this area compared to some of the others. Being an interdisciplinary field, reviewers from several fields must agree that your work is worth publishing, which can be difficult.
As much as interdisciplinarity is encouraged, as a young academic starting out it is hard to make a name for yourself doing research that sits on boundaries between established disciplines. Everyone wants to see interdisciplinary research, but getting it done in practice is a different story altogether. Perhaps more could be done to promote interdisciplinarity in cybersecurity and create a viable academic career progression there. Looking at the job market, you do not see many adverts looking for lecturers in interdisciplinary studies. If much of your salary comes from teaching, it becomes necessary to fit your research around your department’s disciplinary teaching requirements.
What stereotypes would you like to dispel about your job or industry?
There is a stereotype that academics live in an ivory tower, and we look out from that tower onto the battlefield of practice and reflect upon what that means to the future of mankind. That is false. We teach the next generation of people who must go out into that battlefield. A lot of people in industry do not realise how close to their everyday problems we are.
Another stereotype that even some naïve academics hold is that teaching is not important. Teaching is very important. It is our main vehicle for equipping the next generation of cybersecurity practitioners and getting our knowledge into the wild. As the maintainer of an open-source product used by students, I have also found that students are much more demanding as users of tools than practitioners will ever be. Students want to learn how to use cutting-edge, innovative platforms that will give them a head start in the job market, but – at the same time – their tolerance for bugs or poor usability is very low. Practitioners, on the other hand, will learn to work around a tool’s shortcomings, and they will never tell you about them. Vulnerabilities can go unnoticed for years without anyone realising they are there until they get exploited. With students – not so much.
How would you describe your research or business interest in relation to SPRITE+?
I am interested in some of the “how?” questions around the SPRITE+ themes. I am looking for opportunities for innovation: what do these themes tell us about how we design things? I like to listen in on discussions and think about how they could be put into practice, or how things could be linked in new ways for a design or engineering benefit.
How do you hope to benefit from working with SPRITE+ network?
I see SPRITE+ as a network of networks. The secure development practice work I am doing is in collaboration with SPRITE+. We are hoping that SPRITE+ will help us put the word out. Talking to the SPRITE+ leadership team, I have certainly gained interesting ideas to incorporate.
Which of the SPRITE+ Challenge Themes can you relate to from the job that you do? How does it impact your role?
I can probably relate to all of them since we are operationalising things. We talk about how these challenges can lead to stuff that I can use in practice, and in the process of doing that I hope we can find problems that feed back into theory.