Each year we ask both leaders and professionals what they consider to be the top technology issues to be addressed. Here Brian Runciman MBCS analyses the latest results to extract key information about continuing trends, current concerns, the role of chartership and the skills and resource gaps to be addressed.
When questioned about their top technology priorities, IT leaders have consistently named cloud and cybersecurity, with an occasional swapping of order. In 2022, cloud was a top priority for 27% (58% featured it in their top three) and cybersecurity for 24% (57% in top three) of IT professionals surveyed. In 2023 this changed markedly: cloud was the top priority for only 12%, while cybersecurity rose to 39%, and business process automation replaced cloud as the second priority, the first time cloud had dropped out of the top two since 2014. In 2024 cybersecurity maintained its pre-eminent place as a concern: 38% of leaders made it the number one priority, with 26% of IT professionals concurring. AI, however, was rated by 21% of both leaders and professionals as the second highest issue.
In 2025 leaders’ top priorities are cybersecurity (33%), followed by AI (24%), with business process automation and the ‘as-a-service’ model joint third at 11%. Cloud has thus been relegated to fifth place by leaders and sixth by professionals, a further indication of the maturing of cloud technology and its almost complete embedding in business.
What keeps you up at night?
When we asked respondents to list their top three concerns, the top two were the same as last year:
- Cybersecurity/resilience 36% (leaders) and 27% (professionals)
- AI 13% (leaders) and 21% (professionals)
Yet for IT professionals AI attracted 17% last year, so this concern appears to have increased. Last year leaders rated lack of resources as their third concern, and this is still the case in 2025. For IT professionals, the pace of change was third last year but is now matched by lack of resources. Also of note: access to skilled staff is an issue for IT leaders, with 37% rating it among the top three things that keep them awake at night.
Interestingly, net-zero policies and sustainability concerns do not feature in the top three ‘things that keep them awake at night’ for 88% of professionals and 92% of IT leaders. This may indicate an understandable focus on keeping the business’s lights on, or it may be that the issue needs raising further in the collective awareness. It would certainly benefit from more analysis in 2025.
Don’t worry just yet
Although quantum computing may have reached peak hype, our members haven’t been distracted yet. Only 1% list quantum technologies as the thing most likely to keep them awake at night. But this area is attracting more column inches by the day. To stay across this, BCS has a new Quantum Specialist Group — and the Fellows Technical Advisory Group (F-TAG) also covers the area.
Resource requirements
Our question on whether respondents feel they have enough resources to do their job for the year has hit a new low: at 5%, the figure is the lowest recorded in any BCS survey. With BCS’ policy goals and royal charter aims in mind, it is noteworthy that ‘enhanced IT capability and understanding in leadership team/board’ was cited as an issue by 51% of both IT leaders and IT professionals. Additionally, ‘increased digital literacy among general workforce’ is a requirement noted by 48% of leaders (46% in 2024) and 49% of professionals (up from 43% in 2024). These numbers are up on both 2024 and 2023.
Skills gaps
AI leads the skills gap concerns, with 61% of leaders (up from 59% last year) rating it as an issue. The related area of data science also shows strongly: after the ubiquitous cybersecurity, 33% cite it as a gap. This high showing for data science seems at odds with the top priorities, where data science came joint fifth, with 6% rating it as a top priority.
Accountability
As BCS pursues the ideals behind professionalism, we asked about the link between chartered status and accountability: for which disciplines or roles should government and industry expect accountability to be demonstrated through chartered status? Respondents could select more than one answer, and their selections demonstrated a definite appetite for this approach. The top answer was cybersecurity (74%), followed by AI (63%) and CIOs (48%).
We also had a wide selection of suggestions on other areas that would benefit from the accountability that chartered status ensures. These ranged from ‘all employees’ to ‘all IT activity that delivers to society’.
One commenter asked: ‘Can anyone really be accountable in today’s vastly changing world?’ To which BCS would answer a resounding yes. And the range of other suggestions for chartered recognition bears that out too. For example, other respondents stated that ‘any system development requiring mission- or safety-critical applications’ needs a high level of accountability.
For good measure, business analysis, business intelligence, IT managers, programme managers, project managers, architects, IT portfolio management, IT supply chain management, quality assurance, testing of functionality, software engineers and strategy heads were also listed.
AI deeper dive
Last year we took a deeper dive into the issue du jour, AI, first asking how respondents were considering developing their AI skills during the coming year. We also asked which sectors respondents felt AI would have the biggest impact on, and queried the reasons for their answers. In addition, we wanted to get a sense from our expert members of how we could best protect the public from AI’s unintended consequences. All of these responses can be found in the full report.
In the verbatim comments we had a variety of ideas. Governance was a key issue, alongside developing technical mitigations and digital and data skills frameworks: ‘governance frameworks that are focused on good practices’, as one commenter put it. Another comment centred on regulation with sufficient enforcement power and related funding, including a clear approach to reparations.
Another commented on the ‘transparency and visibility of all data flows and components including where, by whom and how data (both personal and impersonal) is being used for training ML/AI.’
Whilst some felt that AI will not be as impactful as people currently think, there were a number of comments on how we strike the balance between risk-taking and safety, with some advocating a slow-down to ensure we have ‘a comprehensive understanding of AI — including the learning models — currently we rush something out, and assume that it works generically as it works in the tested environments.’
BCS’ view resonates with some of these comments: we need to address unwarranted fears, including fear of change, which risk poor decision making. Educating leaders is important; one example given was that brute-force techniques like LLMs are not really AI.
A final comment on this: ‘We must embrace it, and teach responsibility. You cannot legislate against the ocean tides. Strong social morality messages need planting now for a generation ahead — akin to the effective reframing of drink-driving — which was less of a sin when I was a child, but is now widely understood to be unacceptable. The message needs to start now in a structured way.’