IT Strategy: Seeking value

I have been planning to post on IT strategy and some of the areas which I believe need to be considered.  Initially my thought was for a single post covering a number of different points, some obvious and some less so; however, as soon as I started writing it became clear that each point could be a post in itself, or would result in one really long post.  As such I decided to write a number of separate posts, of which this is the first:

Seeking Value

I remember someone telling me that IT is the third most expensive thing in a school after staffing costs and the cost of the building and school estate.  With such a large part of a school’s finances invested in technology, it is important to make sure that we are getting value.  Note my use of the word “value” as opposed to “impact”; impact is often associated with examination outcomes, which in my view is a narrow view of technology’s potential within education.  Exam results, for example, don’t measure the positive effect which technology can have for a student with Asperger’s who previously found it difficult to join classroom discussion but can now do so easily via an online chat facility.

For me, value suggests a broader classification.  It might include using technology to engage a particular student who previously wouldn’t or couldn’t access learning, as in the above example; it might include introducing students to experiences which were difficult, dangerous or costly without tech; or it might mean using technology to bring about new, more efficient processes for teachers, such as dictation of feedback.  Value is much more diverse, and also more context specific, than exam results.  Seeking value should be a key objective in all technology decision making, but it mustn’t be confused with cost cutting.

I have often heard that technology should be led by teaching and learning needs.  I agree with this to an extent, in that technology shouldn’t dictate what is done in the classroom; however, we must be careful that whatever technology we are considering actually brings value.  It is all too easy to fall for the salesman’s spiel, or to focus on a particularly nice feature, and not appreciate the wider implications of a technology’s use.  I remember VLEs being heralded for their potential to change learning, giving students access to resources and allowing teachers to set homework, provide feedback and so on.  Sadly, in my view, they never really provided value: first there was the cost of the software, then the resource cost of training and of creating, posting and updating content, then the limited ways that content could be organised and presented, which stifled the creativity inherent in good teaching.  The costs versus the benefits never added up for me, and so I didn’t see the value.  I can name a couple of other technologies which have been rolled out due to their potential to impact teaching and learning, but where the costs and resultant value are doubtful at best.

A discussion of value in relation to an educational technology project is never an easy one, given the concept of value is potentially so broad and all-encompassing.  Important things, such as a detailed consideration of value, are seldom easy.  Judgements on value are also often subject to the different perspectives of the people involved in the project.  To that, my answer is to look to the school’s values and what it stands for, and to see if the proposed technology fits with the school’s wider aims.  If it doesn’t, then the project should be dropped.  If it does, then a trial or pilot study may help surface the value, or lack of it, in the technology being examined.  Discussions with other schools may also help to establish value.  Assuming value can be established from such a trial, a wider roll-out, whether to a bigger pilot group, to a specific group or even whole school, can be considered and planned.

I have now added “Seeking value” as one of the value statements for my IT Services team, as a reminder and key focus in supporting IT across the school.  I believe it is important that we all have a similar reminder as we explore the many different and emerging technologies and technology solutions which might be considered for use in our schools.  Before proceeding we need to ask ourselves: does this add value?

CIO Summit 2019

Interesting day at the CIO Summit down in London yesterday.   This was my annual visit to an event focusing on IT in the wider world, including the corporate world, rather than within the education sector.   I make the effort to do this simply to try to get a wider view of IT, digital transformation and digital innovation, to help provide some context to my work in school.   There were four key messages which definitely resonated with me.

It’s about the problem we are solving, not the tech

The CDIO of HMRC, Jacky Wright, outlined the importance of focusing on the problems you are seeking to solve rather than on coding, or on the technology you have available.   This is a message I have often heard Mark Anderson (@ICTEvangelist) state in relation to education: it is not about the technology, it is about teaching and learning.   In a more recent post I think he hit the nail on the head when he said it’s not really #edtech after all, it’s simply #Ed.  It would seem that this need to focus on the end outcome or product, and not be distracted by shiny new technology, is something which applies to the wider IT world and not just to education.

Culture eats strategy

The importance of organisational culture was stated by a number of presenters.   Like a focus on the problem being solved, mentioned above, a focus on culture was identified as being more important than the tech being used.   I liked Rackspace’s mission of providing a “Fantastical Experience” as setting both the tone and the culture which they seek to achieve within the organisation.   I wonder whether schools could be a little more inspirational in the missions they set, rather than the usual “developing the best learners” or “preparing students for the future” style of mission which we commonly see.  At the end of the day the culture of an organisation is key to what it achieves or does not achieve.   The people, both the leadership team and the staff, shape that culture.

Sustainability

A number of presenters discussed the issue of sustainability in relation to technology.   This is a challenging area given that technology may be both part of the solution and part of the problem.   On the problem side, as we consume more data, use more technology and personally own more devices, we need more power.   We also consume valuable resources in the manufacturing process, and make use of valuable metals in the various tech products.   This all adds up to using more energy at a time when we want to be using less.    Thankfully tech can also be part of the solution: using AI to match availability and demand, harnessing greater amounts of renewable energy with greater levels of efficiency, and supporting remote collaboration to reduce the energy consumption associated with travel.

A particular area of discussion in relation to sustainability was the supply chain.   It was highlighted that organisations need to be aware of the energy consumption of the third parties they use, rather than treating this as an issue for the third party.  If you are using Microsoft or Google cloud services, the energy usage of their data centres, as used to store and process your data, needs to be considered when thinking about your organisation’s carbon footprint and energy usage.   In addition, looking at devices, including PCs, printers, etc., we also need to consider how suppliers source their materials, manage energy use during production and to what extent their devices can be recycled, refurbished or reused.

Cyber Security

This topic was always likely to arise as part of the discussion.   I found the presentation by Brigadier Alan Hill, in which he discussed his views, particularly interesting.   The key issue is ensuring that the risk associated with cyber security is understood at board level, and then working on constant review, testing and preparation for cyber events.      As he identified, any plan made won’t survive an encounter with the enemy; however, the act of having, and more importantly testing, a plan will at least make you and your team as prepared as you possibly can be for when, and not if, a cyber incident happens.

This was my second CIO Summit and once again I found it to be useful and informative.   Towards the end of the event the importance of sharing ideas and best practice with IT peers was discussed, and for me attendance at this event is a key part of that.   Our best chance for innovation and for security is collaboration and cooperation; we are all in this together.  And so, as I write this on the train on the way home, I look forward to reviewing my many pages of notes and identifying the actions to take as a result of this event.   I can’t wait for next year.

ISBA IT Strategy and Cyber Security Conference

The main conference venue before things began on Wednesday

On Wednesday I had the opportunity to present a session at the ISBA’s IT Strategy and Cyber Security Conference in London.   I had previously volunteered to contribute to the conference and had planned for a small breakout session, anticipating around 20 people.   Upon arriving on the day, I found out that my breakout session would follow Mark Steed’s keynote speech in the main conference venue, and would therefore have quite a few more than 20 people.

The session very much focused on my thoughts and experiences around cyber security, with key messages around the extent of the risk we all face, plus the opposing extremes of over-confidence in security efforts on one hand and a constant demand for heavy security measures at the expense of school operational efficiency on the other.    I described my approach as one of a “healthy” paranoia, backed by a robust risk assessment and risk recording process.

You can read my slides from the session here.

Backups: Do you test?

A little bit of a technology post today.  Backups, including redundant solutions, are increasingly important in organisations as we seek to keep our IT services up and running for our own internal users and also for external users or clients/customers.   This might mean taking backup copies of data to tape, having a redundant firewall or internet connection, or having a cloud-based service available to replicate on-premise services in the event of a disaster.   My concern, however, is that having these solutions in place can make us feel better, happy in the knowledge that we are more protected than if we didn’t have them.     The issue is that this sense of additional protection can be false: just having a backup solution of one type or another doesn’t mean it will work when things go wrong.

We also need to be cognisant that when things do go wrong the result is often stress and urgency, as we seek to restore services while under pressure from users, business leaders and process owners, among others.   We need to adopt a scientific mindset and test the backup solution to make sure it works as intended.    It is much better to test our backup solutions to a timetabled plan than to have the first test of a solution be a full-blown real-life incident, where failure of the system could result in difficulties for the organisation.   We also need to bear in mind that just because a solution worked on the day it was put in place, or even works today, doesn’t mean it will work in a week’s, a month’s or a year’s time, when we truly need it.    We need a robust programme of testing our backup solutions to ensure that they work, that we are aware of how they work and of any implications, and that those who need to use them are comfortable with their use.   Only by doing this can we be more comfortable in the knowledge that, when something does go wrong, we have a solution in place and are ready to put it to use.
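One way to make such testing routine is to script the simplest possible check: restore a known file and confirm it matches the original byte for byte. A minimal sketch in Python (the file names in the test of use below are purely illustrative):

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """A restore passes only if the restored copy exists and matches
    the original byte for byte; a missing or corrupted restore fails."""
    return restored.exists() and file_checksum(original) == file_checksum(restored)
```

A real programme would restore from the actual backup medium and cover many files and systems, but even a check at this level turns “we have backups” into “we have backups that restore”.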

The perfect example of the above, for me, was a recent test of our own backup solutions, which included a service that promised recovery to a redundant system within 4 hours, based on regularly taken data backups.    Upon testing the solution we found that the 4-hour recovery period was exceeded due to issues with the backup, and that the data was 3 days old.   We also found that there were implications for other systems when the test failure occurred.
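Whether a test like this passes comes down to two numbers: how long recovery took against the promised recovery time, and how old the recovered data was against the expected backup frequency. A small sketch of that pass/fail check (the exact overrun and the 24-hour backup target are assumptions for illustration; only the 4-hour promise and 3-day-old data come from the test above):

```python
def meets_targets(recovery_hours: float, data_age_hours: float,
                  rto_hours: float, rpo_hours: float) -> bool:
    """A recovery test passes only if it finished within the recovery
    time objective (RTO) and the restored data was no older than the
    recovery point objective (RPO)."""
    return recovery_hours <= rto_hours and data_age_hours <= rpo_hours

# Hypothetical figures: recovery overran the promised 4 hours and the
# restored data was 3 days (72 hours) old against a 24-hour backup cycle.
print(meets_targets(recovery_hours=5, data_age_hours=72,
                    rto_hours=4, rpo_hours=24))  # prints False
```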

It might be tempting to look on the above in a wholly negative fashion, focusing on why the solution didn’t work; however, I want to avoid this and focus instead on the positive side of things.    We now at least know the solution didn’t perform as anticipated, we know more about the implications of the tested failure area, and we are basically more knowledgeable than we were before the test.    We will therefore now work internally, and with the backup solution vendor, to arrive at solutions that better meet our needs and are hopefully more robust and reliable.

The moral of the story: nothing works until you test it and confirm it, so test your backup provision and test it often.

Cyber thoughts from the train

Sat on the train on my way back from London, I noticed my Samsung Galaxy phone was displaying a message telling me that it had detected a Samsung Gear device near me and wanted to connect.    The connection it was trying to establish was via Bluetooth, which was enabled to allow my phone to connect to my car’s audio system.   I hadn’t even thought to disable it.

As I look around the train I can see various people making use of mobile devices, including laptops, as we speed through the countryside.    The train is equipped with Wi-Fi, thereby allowing everyone to remain connected even as they travel.

Three things worry me about the above.  The first is that of stray connections, such as the one my phone tried to make with another passenger’s Samsung Gear.    As the various people on the train sit watching video on their devices, listening to music or working away, their mobile devices are constantly seeking to make connections: to Wi-Fi for internet access, and via Bluetooth to external speakers, wireless headphones or in-car audio.    As we use more and more technology, our devices become more and more interconnected.    In doing so, though, we expose ourselves to an increasing risk of inappropriate connections being made, either through device error or through human error, such as if I had accepted the connection my phone was trying to make without reading the actual message.    These inappropriate connections may then give rise to unauthorised access to, and download of, our data, or to malicious acts being committed via our devices.

The second thing that worries me is the free Wi-Fi.    Now, I suspect most people assume the train’s Wi-Fi is sufficiently secure, although I cannot be sure of this.   The issue is the ease with which a passenger could bring their own access point and set up a dummy Wi-Fi network, pretending to be the train provider’s network, for other passengers to connect to.   By doing so the owner of the dummy AP could gather data from those on the train who connect to it.   This just seems all too easy.
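One crude defence against such an “evil twin” is to notice when a single network name is being advertised by more than one transmitter. A minimal sketch, taking a hypothetical scan result of (SSID, BSSID) pairs as input:

```python
from collections import defaultdict

def suspicious_ssids(scan):
    """Given (ssid, bssid) pairs from a Wi-Fi scan, return the SSIDs
    advertised by more than one distinct BSSID -- a possible sign of a
    rogue access point impersonating a legitimate network."""
    seen = defaultdict(set)
    for ssid, bssid in scan:
        seen[ssid].add(bssid)
    return {ssid for ssid, bssids in seen.items() if len(bssids) > 1}
```

Large legitimate networks also broadcast one SSID from many access points, so a hit here proves nothing on its own; it simply tells you which networks deserve a closer look before you connect.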

The third thing that worries me is general awareness and consideration of security.    I doubt many people other than myself were giving much consideration to the cyber security of the many devices in use in the train carriage I sat in.    I would love to be able to survey people on a train, or in another public space where free Wi-Fi is available, in order to prove or disprove this assertion.   My belief, until I have evidence to the contrary, is that we are a little too accepting.

Events such as the recent National Health Service ransomware attack highlight the issue of cyber security; however, the impact is not limited to big incidents at big organisations like the NHS.   It affects each and every one of us, every day, even when sat on a train.    We also cannot afford to be outraged and concerned only when a large breach like the WannaCry attack occurs, before almost instantly returning to normal and forgetting all about security and its potential risks and implications.

We need a societal shift in terms of our perception of cyber security.

Cyber threats: Some thoughts

The recent WannaCry ransomware outbreak clearly identified the importance of keeping operating systems and other software up to date to protect against identified vulnerabilities.   Given the high level of news publicity, it is likely that a lot of us went home and updated our home PCs, and also checked with IT departments to make sure they had done the same with company machines.    The outbreak, in my opinion, highlights a number of critical issues.

The vulnerability in this case had been previously identified, and a patch made available by Microsoft; as such, had all machines in the world been patched, the impact would have been minimal.     But what if the vulnerability had not been previously identified?    Had this been the case, the attack could have been considered a “zero-day” attack, as it would have exploited an unidentified vulnerability.    This would have required the identification of the vulnerability followed by the coding and release of a patch, all after the initial infection.    In that case the impact of the ransomware would likely have been much more significant than it was.

The WannaCry ransomware was specific to machines running Microsoft operating systems.    This has already resulted in a number of comments online suggesting people make use of Linux or Apple machines, as these weren’t affected, the implication being that these may be safer systems.    Microsoft has the predominant share of the desktop and laptop operating system markets, although the specific figures are difficult to ascertain.    This makes Microsoft machines a preferred target, as there are simply more machines to attack.    Although there are differences in how the operating systems are managed, with Apple using a very closed development process and Linux an open source approach, Apple’s OS, Linux and Microsoft’s OSs are all comparably complex.   It is in this complexity that the risk of as-yet-unidentified vulnerabilities lies, and that risk is shared across all the above OSs.    The difference currently lies in the fact that Windows is the most common desktop OS; however, if we were all to go out and buy an Apple machine or install Linux, the threat of attack would likely follow the masses.

My final issue is that of the devices we don’t give much thought to.    We think about the operating system of our laptop or desktop, and these days even of our phone, and in thinking about these we carry out, or fail to carry out, the required updates.    Our homes, however, increasingly contain more and more internet-enabled devices, and I would suggest we don’t give these the same level of thought.   My router, with which I connect to the internet, runs software in order to connect, to present an admin page and to provide other functionality; this software is basically its operating system.     Your smart TV runs an operating system which allows it to respond to your voice commands, search the internet and carry out its other functions.    Your web-connected home surveillance system runs an operating system which allows it to connect to cameras around your house and to let you view footage remotely.   And what about your wireless printer?    The above is the tip of an ever-growing iceberg, yet do we know how to upgrade the software in these devices to protect against identified vulnerabilities?   Do we know whether these devices automatically update, or how to change the update settings?   Do we know how to check the version number, or when the last update was done?

Microsoft called the recent attack a “wake up call”.   I tend to agree.    We need to be more aware of the implications of the use of each technology item, be it hardware or software.   We need to be aware of the risk to which usage exposes us as well as the precautions which we need to take.

My biggest take away from the whole incident is a reminder of what Nassim Taleb described in “The Black Swan”.   On Thursday 11th May all was well, systems were generally safe and precautions were in place.   Largely we didn’t expect a serious whole world cyber incident.   By the following day it was clear all was not well and that significant vulnerabilities existed.   A global cyber incident was underway.   A lot changed in a day and we didn’t do too well at predicting and preparing for it.    What shape will the next incident take if we can’t predict it?     And are those areas where we believe we are the safest those which are most at risk given we are unable to predict the unexpected?

A cyber learning opportunity

The global cyber attack of yesterday presents a learning opportunity for discussing cyber security with our students.     It is important that our students are aware of the implications of such attacks, including their impact, and also the measures that can be taken to prevent attacks succeeding or at least to minimise their impact.

So what are the key learning points to take away from this incident and to discuss with our students?

OS and Software Updates:

One of the key points to take away is ensuring that desktop and server operating systems are regularly updated.  This includes both updates and upgrading of versions, for example from Windows 7 to Windows 10.    Older operating systems eventually stop receiving support from those that produced them, meaning that newly identified security flaws go unaddressed, leaving users vulnerable.  Support for Windows XP ended back in 2014, so users of XP are vulnerable to any flaws identified between then and now.     For more modern operating systems such as Windows 7 and 10 the key is the updates: these provide the fixes to security flaws as they are identified, so it is important to keep your system updated to make sure vulnerabilities are promptly addressed.      This extends beyond operating systems to application software, as applications which have not been updated may equally expose users to vulnerabilities which the appropriate updates would have addressed.
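Underneath, the check is mechanical: is the installed version at or beyond the version in which a flaw was fixed? A minimal sketch (the version numbers below are illustrative, not real patch levels):

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '10.0.17763' into a tuple of
    integers so that versions compare numerically, field by field."""
    return tuple(int(part) for part in v.split("."))

def is_patched(installed: str, fixed_in: str) -> bool:
    """True if the installed version already includes the fix
    released in version fixed_in."""
    return parse_version(installed) >= parse_version(fixed_in)
```

Comparing as integer tuples matters: compared as plain strings, “10.0.9” would wrongly sort after “10.0.10”.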

Data Backup:

In the case of ransomware, backup is critical, as the virus will encrypt all files it can get access to.  At that point you can either pay the ransom, which may or may not get your files back, or, assuming you have kept backups, roll back to your latest backup with only minor loss of data.    As such, regular backups represent the best protection against ransomware attacks.   The more regular the backup, the smaller the loss: a weekly backup means a loss of up to a week’s worth of work, whereas a nightly backup reduces this to at most one day’s worth of work in the event of a successful ransomware infection.
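The arithmetic behind that last point is worth making explicit: the worst-case loss equals the backup interval, and everything created since the last good backup is what an infection costs you. A tiny sketch:

```python
from datetime import datetime, timedelta

def worst_case_loss(backup_interval: timedelta) -> timedelta:
    """An infection striking just before the next scheduled backup
    destroys, at worst, one full interval's worth of new work."""
    return backup_interval

def data_lost(last_backup: datetime, infection: datetime) -> timedelta:
    """Everything created since the last good backup is gone."""
    return infection - last_backup

# Moving from weekly to nightly backups shrinks the worst case
# from 7 days of work at risk to 1 day.
weekly = worst_case_loss(timedelta(weeks=1))
nightly = worst_case_loss(timedelta(days=1))
```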

User Awareness:

The weakest point in the network is usually the user: the human being making use of the system.   An IBM report from 2014 identified that 95% of security incidents involved human error.    It is unlikely that this figure has changed much.   As such, it is important to educate users to exercise caution and to be aware of the precautions they should be taking in relation to suspicious emails, password security, etc.

Anti-virus:

While not protecting you against zero-day attacks or new variants, anti-virus software will provide some protection against existing identified threats.   It is also worth noting that newer anti-virus products are introducing capabilities such as heuristic-based identification of threats and sandboxing to provide additional protection.

Minimum Privileges:

A key security maxim has always been the assignment of the minimum privileges required.   This means ensuring that users only have access to the files they need in order to carry out their role.    It includes defining whether a user is limited to reading files or can in fact modify or delete them, and whether users have access to specific networks or whether their access is limited, as in the case of a guest user.     By limiting access in this way we limit, to some extent, the impact of ransomware or other viruses.   As such, in looking at the resources on our network, assigning minimum privileges is a key step.
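In practice this maxim reduces to a default-deny lookup: an action is permitted only if the role explicitly grants it on that resource. A minimal sketch with hypothetical roles and resources:

```python
# Hypothetical permission table: role -> resource -> allowed actions.
# Anything not listed is denied by default.
PERMISSIONS = {
    "teacher": {"class_files": {"read", "write"}},
    "student": {"class_files": {"read"}},
    "guest":   {},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Deny unless the role explicitly grants the action on the resource."""
    return action in PERMISSIONS.get(role, {}).get(resource, set())
```

With a table like this, ransomware running under a student account could read class_files but not overwrite (encrypt) them, which is exactly the containment the maxim is after.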


The recent attack is the largest I can remember since the Love Bug virus, which I vaguely recall from back in 2000.   It is likely that such attacks will become more common as we become more connected and reliant on technology, adding more and more connected devices to our homes and using more and more software apps in our daily lives.   As such, in preparing our students for the future, it is important that we take every opportunity to discuss how these attacks can and do impact us, and how we might all take appropriate precautions.    With the latest incident so widely reported in the news, now is a good time.