Digitally Speaking
Data backup

Alastair Broom
March 28, 2019

This Sunday marks the 8th year of World Backup Day. What began as a light-hearted campaign to encourage end-users to create secure copies of their treasured photos and files is slowly morphing into a message for the business community: without data there is no business.

Most UK businesses are now part way through digital transformation, which is fuelled by data. Almost every business decision is based upon the data held, yet losing that data is worryingly easy. From equipment theft and accidents to security breaches and simply failing to back up, businesses have to work hard to safeguard their data. And with the growing sophistication of cyber attacks, both the prevention measures and the price you pay for failure are increasing.

In February, hackers breached the servers of a US email provider, wiping the data from its US servers in what was termed an ‘attack and destroy’ incident. The firm did have backup measures but they weren’t secure enough to prevent the attackers from infiltrating their backup servers. No ransom, just an act of vandalism that put the company out of business overnight.

This senseless attack illustrates the point: we cannot overestimate the importance of data or its security. Fortunately, there are steps all businesses can take to protect their most precious asset.

Understand the value and location of your data

Your critical data is not just in your datacentre. It’s likely to be dispersed right across the organisation, from HR and marketing to sales databases and financial systems. It could be a spreadsheet attached to an email, but if that’s the only copy, and that data has a high value to the organisation, then that spreadsheet is critical data.

Understanding where the data is located and knowing its value is essential to determining your data protection policy. It’s not feasible to back up every single piece of data, so knowing what matters is crucial. This is where data discovery and classification do the heavy lifting, increasing your visibility of where data resides on the network, the endpoint and in the cloud, and allowing you to create scalable security solutions to protect that data.
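As a sketch of what discovery and classification can look like in practice, the snippet below walks a file tree and assigns each file a sensitivity tier based on simple pattern matching. The rules, tier names and patterns are illustrative assumptions for the sketch, not any particular product's taxonomy.

```python
import os
import re

# Illustrative classification rules: each rule is (name, pattern, tier).
# The patterns and tiers here are assumptions, not a real product's taxonomy.
RULES = [
    ("payment-card", re.compile(r"\b(?:\d[ -]?){13,16}\b"), "restricted"),
    ("email-address", re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "internal"),
    ("keyword", re.compile(r"confidential|payroll|salary", re.I), "internal"),
]

def classify_file(path):
    """Return the highest-sensitivity tier matched in a text file."""
    try:
        with open(path, errors="ignore") as f:
            text = f.read(100_000)  # sample only the first 100 KB
    except OSError:
        return "unreadable"
    tiers = {tier for _, pattern, tier in RULES if pattern.search(text)}
    if "restricted" in tiers:
        return "restricted"
    return "internal" if tiers else "public"

def discover(root):
    """Walk a directory tree and build an inventory of where data lives."""
    return {
        os.path.join(dirpath, name): classify_file(os.path.join(dirpath, name))
        for dirpath, _, filenames in os.walk(root)
        for name in filenames
    }
```

A real deployment would cover network shares, endpoints and cloud stores, and feed the resulting inventory into the backup and security policy rather than a dictionary.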

Arm yourself against ransomware

While organisations do still lose servers, and data centres do still go down and come back without their data, most businesses now have their most important data backed up. Backup has historically been used as protection against ransomware, a form of malware which locks the user out of their device or files and then demands a monetary ransom to restore access. However, cyber criminals have evolved their malware to seek out and encrypt backup servers as well as the primary data store, so it is important to ensure backup systems are adequately protected from attack.

In 2018 there were 4.7 million ransomware attacks, down 50% from the previous year, but still alarmingly high. And while attacks have decreased, they have become more targeted, meaning enterprises and SMBs alike need to do more to protect their data.

Backup is critical as it enables lost or ransomed data to be restored and business disruption minimised. Encrypting data will protect against data theft, but without appropriate protection for the data backups, corporate data can still be destroyed or rendered inaccessible during a cyber attack. Making sure backup systems are suitably distanced from the primary data is critical: if there is a link, cybercriminals can find a way in.

Put the right policies in place

It bears repeating that businesses must understand the value of their data and where it’s located. Discovery and classification enable a security policy to be applied to data and provide a platform for automated backup policies.

In order to build the most effective defences, you need a policy which delivers a level of protection that is appropriate to the value of the data. The policy should dictate different levels of resilience for your tiers of data. For really critical data, the best line of defence is to keep it secure on a completely different, non-digital system.
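One way to make "protection appropriate to the value of the data" concrete is a simple lookup from classification tier to backup policy. The tier names, frequencies and retention periods below are illustrative assumptions, not recommendations.

```python
# A minimal sketch of mapping data tiers to backup policies.
# Tier names and figures are illustrative, not recommendations.
POLICIES = {
    "restricted": {"frequency_hours": 4,   "retention_days": 365, "offline_copy": True},
    "internal":   {"frequency_hours": 24,  "retention_days": 90,  "offline_copy": True},
    "public":     {"frequency_hours": 168, "retention_days": 30,  "offline_copy": False},
}

def backup_policy(tier):
    """Look up the protection level appropriate to the value of the data."""
    try:
        return POLICIES[tier]
    except KeyError:
        # Fail safe: unclassified data gets the strictest policy until reviewed.
        return POLICIES["restricted"]
```

The fail-safe default reflects the point above: if you don't yet know a dataset's value, treat it as critical rather than leaving it unprotected.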

Don’t forget your role in cloud security

If you put your data into the cloud, don’t be fooled into thinking your cloud providers are responsible for security. They will, of course, ensure your data is secure from external attacks, but they can’t control your access policy. If you get breached internally then criminals can get access to your cloud-stored data. Poor user awareness and lax password policies can leave the door to the cloud wide open, so security must be made a high priority.

As with any security threat, internal training can create an extra line of defence. By training employees to spot and stop phishing emails and other forms of social engineering, organisations can prevent accounts becoming compromised and protect their data and backups from attack.

Your data is a valuable commodity and should be treated as such. By understanding its value, taking steps to remove links between your operating environment and your backup servers, and by protecting your business from ransomware attacks, you know you’ve done the best job you can.



Category: Security

Alastair Broom
January 18, 2019

We spoke to Paul Graziano, Cyber Security Compliance Manager at Transport for London (TFL), about defending the UK’s largest public transport network through innovative approaches.

The transport system is part of our critical national infrastructure – how do you see the cyber threat against this evolving?

I think cyber security within critical national infrastructure is becoming an increasingly complex task. A common security challenge is that many of the systems and devices we rely on were built when security wasn’t really an issue. They were designed for only a few people to have access to them. Now, more and more modern industrial control systems are being connected by IT Infrastructure, which is of course a positive thing.

However, this means we can find ourselves in a position where we have to retrofit security to protect legacy devices, which is what we’re currently working on. Instead of having security by design, we need to put monitoring around these systems, so we know which users are accessing them and what they’re doing, with network monitoring around these devices too.

What can we do to protect our critical infrastructure against attack?

There’s no simple answer to this. In some ways, what we’re trying to do now is apply a framework we’ve adopted in Information Technology (IT) to protect Operational Technology (OT), which involves a security strategy. Security needs to start right from the beginning of the procurement of new systems, and IT needs to make sure any new systems meet our security requirements.

I think we need to assume that we will be compromised. As long as the Security Operations team have that visibility, we will be able to respond to any incidents if need be and they’ll also need to be able to pull the plug in worst case scenarios.

A lot of work has been done within the government to help us out. The Centre for Protection of National Infrastructure (CPNI), a UK government body, released a number of frameworks for industrial control systems. They believe we should structure our industrial control systems to make them secure and deploy frameworks around them to achieve that.

What are your thoughts on the value of threat intelligence and security analytics in the fight against cybercrime?

Both are extremely important and a vital part of Security Operations. Threat intelligence is a necessity as we need to understand what the new threats are and how that could morph into an attack on TFL.

Security analytics is just as important because to protect your network you need to have good visibility over it and you can only really achieve that through security log analytics. To do this you need to have the capability to sort logs from a range of sources and correlate these for suspicious activity. This enables you to benchmark it in order to understand what normal looks like and then you can look out for anomalies.
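A minimal sketch of the baselining idea: learn what "normal" hourly event volume looks like, then flag hours that deviate sharply. Real analytics platforms use far richer models; the sigma threshold here is an illustrative assumption, not a tuned value.

```python
from statistics import mean, stdev

def find_anomalies(counts, threshold=3.0):
    """Flag hourly event counts that deviate sharply from the baseline.

    `counts` is a list of per-hour log event totals; anything more than
    `threshold` standard deviations above the mean is reported.
    """
    baseline, spread = mean(counts), stdev(counts)
    return [
        (hour, count)
        for hour, count in enumerate(counts)
        if spread and (count - baseline) / spread > threshold
    ]
```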

Will the Internet of Things change the way you address cyber crime?

Absolutely. Up until recently, the increasing number of devices being connected to the internet were mainly on the consumer side, but now they are entering the business side too. Every connected device is a potential target for botnet activity because such devices are not inherently secure, so they’re easily targeted and taken over as part of a larger attack. This is a very difficult problem to solve and we can’t be isolated as a business in dealing with it; there needs to be a concerted effort.

How are you using virtualisation to transform your business?

TFL is hugely into virtualisation. A really good example of this is our public-facing API that’s used for TFL’s journey planner, which powers applications like Citymapper too.

In terms of its impact on our business, it has changed the way in which we think about new projects and new applications. Previously we had to rack up a new server every time we started a project. Now we can rapidly develop, build and test prototypes for new applications at very little cost. It gives us a lot of flexibility as we can scale up pretty much automatically when we need to, which we would never have been in a position to do before virtualisation.

There is a resilience aspect to it too: if configured correctly, it’s relatively simple to spin up a new server if your primary one goes down. So there are many great benefits to virtualisation.

What other technologies would you suggest adopting in the fight against cyber-crime?

My idea is not so much a technology, but an awareness strategy. If you look at the number of high-profile attacks over the last year, a lot of them started with simple phishing attacks. Our spam filters will never be able to spot the first email from a newly set-up phishing campaign; such campaigns are purposefully designed to evade them. One of the only ways to protect against this is to ensure our users are sufficiently trained to detect suspicious emails. I think improving communication with employees, and making them aware of the impact an attack could potentially have, is vital.

In terms of actual technology, I think privileged identity access management is also key to a security team. It’s important to understand who has access to the most critical parts of your business as, during a compromise, hackers will look for the business’ ‘crown jewels’.

One of the technologies that the industry is looking at more widely is called ‘deceptive security technology’; this is essentially a very old security component called a ‘honeypot’. A honeypot is a system that is built to be insecure and placed in a public part of your network. By doing so, you can see who finds it and what they do to it. This enables you to understand what sort of attacks are being targeted at it, providing a wider picture of the threat landscape. It is essentially a piece of proactive threat intelligence to find out if people are attacking parts of your network.
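A honeypot can be sketched in a few lines: listen on an otherwise-unused port, record who connects and what they send, and never provide a real service. The port and connection limit below are illustrative choices; a production deployment would also isolate the host and forward the events to the SOC.

```python
import datetime
import socket

def honeypot(host="127.0.0.1", port=2222, log=None, max_conns=None):
    """Listen on an otherwise-unused port and record connection attempts.

    Any traffic that reaches this socket is by definition suspicious,
    since no legitimate service lives here.
    """
    events = log if log is not None else []
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen(5)
    seen = 0
    while max_conns is None or seen < max_conns:
        conn, addr = server.accept()
        conn.settimeout(2)
        try:
            data = conn.recv(1024)  # capture whatever the visitor sends first
        except socket.timeout:
            data = b""
        events.append({
            "time": datetime.datetime.utcnow().isoformat(),
            "source": addr[0],
            "payload": data.decode(errors="replace"),
        })
        conn.close()
        seen += 1
    server.close()
    return events
```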

Thank you very much for answering all of our questions Paul and we wish you a fruitful career at TFL.


Alastair Broom
January 2, 2019

With high profile breaches for the likes of Facebook, British Airways and Marriott fresh in the mind, it’s no surprise to see a backlash against the companies that hold our data and a new impetus to take back control. But how are CIOs tackling the information security challenge, and what do we expect the future to hold?

Assessing the threat

According to more than 840 CIOs that we interviewed for the sixth edition of the annual Logicalis Global CIO Survey, the role of the CIO is in flux. A traditional focus on “keeping the lights on” has seemingly given way to more strategic activities around service and product innovation as organisations take to the cloud and expand their IT estate. Despite all this, security continues to dominate CIOs’ time and attention, with 93% saying that they devote between 10% and 50% of their time to information security – and 54% spending at least 30% of their time on it.

CIOs are right to remain vigilant, as all evidence points to the fact that threats are increasing. We’re seeing no sign that external threats such as malware, ransomware, crypto-jacking and phishing are going anywhere, especially because it’s become even easier for the bad guys to launch these types of attack. They’re low cost and have the potential to reach either massive or highly targeted audiences.

One trend that has emerged from the survey is that CIOs are now far more focused on the human dimension of cyber risk than before. Whilst the 2017 report cited external threats as the clear focus, this year’s findings saw lack of staff awareness and mistakes as a concern for more than half (56%) of CIOs, while 39% are concerned about malicious insiders. The human dimension is interesting, and we’d certainly say that people and process should always be the place to start. This attitude doesn’t particularly bear out in what we’ve seen before, however, as technology tends to take precedence over training.

The expansion of the IT footprint and the ever evolving threat landscape have clear implications for security, and most CIOs signal that they are moving away from a purely defensive footing to one of cyber resilience, which brings together defence with detection and recovery. More than a third (37%) of CIOs say their organisation now adopts a resilience-based information security footing. This stance will be aided by some element of automation within the process, as the rate at which threats are accelerating is outpacing our ability to develop skills. AI represents an opportunity to keep pace, particularly when it comes to threat detection and response. We’d expect developments in this area to focus on the ‘response’ part of this process, although it will require something of a cultural leap to allow technology to make decisions for us.

The value of data

Data breaches were cited as a concern by 54% of CIOs, demonstrating how CIOs are attuned to the broader debate around data privacy and management. Despite measures introduced this year, it’s still very difficult to understand and manage data permissions, for both consumers and the businesses that own their data, and there’s a lot more that needs to be done to clean up the ethics around this.

According to our CIO sample, the impact of GDPR has fallen far short of the dire predictions, with nearly three quarters (71%) saying that GDPR passing into law has had no impact on their organisation at all. Based on what we heard from customers, GDPR was significantly overplayed and organisations became apathetic long before the May deadline. Faced with vendors trying to sell them solutions to address their GDPR requirements, a lack of understanding of what their liability was, and a misconception that they were too small to be significant, fed-up businesses appear to have opted for a view that “we only need to be as good as our neighbour”.

It would be wrong to suggest that GDPR, in and of itself, had no effect. Rather, most organisations did a great job of implementation, as those with longer memories will recall from Y2K. The Logicalis Global CIO Survey also assessed the cost of GDPR compliance and, again, the reality fell well short of the hype. Though the average investment of up to £25,000 is not insignificant, it suggests that the process was well and efficiently handled.

So what happens now?

Even though the fines for non-compliance are considerable, we’d expect the biggest financial impact in a post GDPR world to be seen in class-action lawsuits arising as a result of data breaches. The huge numbers of customers affected by these large data breaches make this inevitable. But this also has the potential to be hijacked by ‘ambulance-chasers’, and our survey found that 6% of CIOs have already been targeted by opportunists seeking to profit from non-compliance. So expect those PPI and whiplash calls on your phone to be replaced by an automated voice asking about the data breach that compromised your personal information.

We’re certainly at the point now where data breaches should be viewed as ‘when’ and not ‘if’. We’d expect organisations to be increasingly turning to encryption as a way to minimise the impact when a breach does happen, a trend that has been slow to date because of the costs associated. Perhaps the threat of users removing their personal data, and a greater understanding of its value to the business world will be the catalyst this needs.


Alastair Broom
December 13, 2017

Ransomware is a hot topic at the moment and the “attack du jour” for cybercriminals.   The code is easy to obtain and campaigns simple to execute thanks to the industrialised infrastructure that supports it.  The result – a rapidly growing market that is already estimated to be worth in excess of $1 billion.

Criminal gangs provide “ransomware as a service” with everything from code provision through to the monetisation of the attack making it easy to both execute and profit from.  Other types of cyber-attacks may involve complex, time-consuming or risky steps for the criminals but ransomware profits are immediate, usually via Bitcoin payment into the criminal’s wallet.

The traditional approach to cyber security has been to deploy more and more technology to protect against the evolving threat.  This has led to uncontrolled technology sprawl, creating a security management nightmare.  Throwing more technology at the problem implies more resources to manage the estate, and security resources are hard to come by.

Despite the increased level of sophistication and frequency of ransomware attacks, by following the steps described below, organisations can significantly reduce their risk.

Step 1: Security Awareness Training

Although technology plays an important part, the majority of malware and ransomware attacks involve a user doing something they shouldn’t, such as clicking on a malicious web link, opening an email attachment or installing a new application.  Usually this is due to a lack of understanding of the security risks associated with such actions.  Phishing, a form of social engineering, is a common technique used to get ransomware inside an organisation, effectively duping users into downloading malicious code or otherwise opening up the organisation’s network to cyber-attack.

Many organisations provide basic security training as part of company induction or as one-off exercises to address audit requirements.  However, security training should be a regular part of users’ development and its content kept up to date.  Training should be delivered at a pace and frequency that fits in with the employees’ work schedules, with progress monitored and tested for effectiveness as part of the program.  Security awareness is a critical part of any organisation’s security program and is fundamental to several security frameworks such as the UK National Cyber Security Centre’s “10 Steps to Cyber Security” program.

Step 2:  Vulnerability Management

Most malware infections, including ransomware, compromise our systems due to vulnerabilities in operating systems and applications.  All too often, these vulnerabilities remain unpatched for months or even years, allowing criminals to exploit the same flaws time and time again.

In 2015, a large proportion of exploited vulnerabilities had had a patch available for more than a year, and it doesn’t look as though that’s changing.  In addition, IBM research shows that when new vulnerabilities are discovered, the average time taken for hackers to exploit them has decreased from 45 days ten years ago to 15 days today.

Effective vulnerability management is critical in ensuring that existing vulnerabilities are dealt with and new ones are patched quickly, before they can be exploited.
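The reporting side of vulnerability management can be sketched simply: given scan findings and the date each fix was released, list the hosts whose patching is overdue. The 30-day SLA and the advisory identifiers in the test data are assumptions for the example; in practice findings would come from a vulnerability scanner and the SLA would vary by severity.

```python
from datetime import date

def overdue_patches(findings, today, max_age_days=30):
    """Return (host, advisory) pairs whose fix has been available too long.

    Each finding is a (host, advisory_id, patch_released) tuple. The 30-day
    SLA default is an illustrative assumption, not a recommendation.
    """
    return [
        (host, advisory)
        for host, advisory, released in findings
        if (today - released).days > max_age_days
    ]
```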

Step 3:  Web Protection

Almost all ransomware is delivered across the web.  Regardless of whether the initial infection is via email or a malicious/compromised web site, the malware will normally attempt to contact a remote server to download additional software such as exploit kits or encryption software.  Web security solutions can be used to detect this suspicious activity, preventing the dangerous malware payloads from being downloaded – even if the initial infection is successful.  Modern web security solutions make use of advanced threat intelligence to identify malicious domains and web servers and prevent the malware from receiving its instructions.

Step 4:  Endpoint Protection

Ransomware infection inevitably happens at the endpoint.  Typically, a laptop, PC or server will be compromised and used to propagate the malware throughout the network.  Traditional signature-based anti-malware solutions are largely ineffective against modern malware due to rapidly changing code and the time taken by security vendors to identify new malware variants and create and distribute signatures.  Behaviour-based endpoint protection is much more effective in dealing with modern malware as it will identify malicious behaviour such as file substitution and registry changes, rather than looking for a specific malware fingerprint in an ever-increasing signature database.
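The behavioural idea can be sketched with two filesystem snapshots: if a large fraction of files has been rewritten or renamed between them, something ransomware-like may be running. The 50% threshold is an illustrative assumption; real endpoint products watch many more signals (registry changes, process behaviour, entropy of writes) in real time.

```python
import os

def snapshot(root):
    """Record (mtime, size) for every file under root."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue
            state[path] = (st.st_mtime_ns, st.st_size)
    return state

def suspicious_change(before, after, threshold=0.5):
    """Flag a ransomware-like burst: a large fraction of files rewritten
    or renamed between two snapshots. The 50% threshold is illustrative."""
    if not before:
        return False
    changed = sum(1 for path, sig in before.items() if after.get(path) != sig)
    return changed / len(before) >= threshold
```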

Step 5:  Security Analytics

Steps 1 to 4 above will provide an effective defence against ransomware and malware in general.  However, it is impractical to expect your systems never to be breached, so it is imperative that you have visibility into the activity in your environment and the ability to identify breaches and react when they occur.  This visibility and identification is provided by a Security Information & Event Management (SIEM) platform.  This software will ingest logs from your security infrastructure, servers, routers etc. and search for suspicious activity that could signal a breach.  SIEM provides a “single pane of glass” view into disparate technology, overcoming many of the problems associated with the technology sprawl mentioned earlier and enabling security events to be identified so they can be dealt with quickly.
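As a toy illustration of the correlation a SIEM performs, the snippet below pulls failed-login events out of several log streams and flags source addresses that cross a threshold across the whole estate. The log formats and the threshold of five are illustrative assumptions, not any vendor's defaults.

```python
import re
from collections import Counter

# Illustrative patterns for failed logins in syslog-style lines.
FAILED_LOGIN = re.compile(r"(?:Failed password|authentication failure).*?from ([\d.]+)")

def correlate_failed_logins(log_streams, threshold=5):
    """Correlate failed-login events from several log sources and flag
    source IPs that exceed the threshold across the whole estate."""
    hits = Counter()
    for lines in log_streams:
        for line in lines:
            match = FAILED_LOGIN.search(line)
            if match:
                hits[match.group(1)] += 1
    return [ip for ip, count in hits.items() if count >= threshold]
```

The point of the sketch is the cross-source view: three failures in the SSH log and two in the VPN log each look benign alone, but together they cross the line.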

Ransomware is a plague that threatens the availability of the data we rely on for our businesses to operate.  If successful, ransomware attacks can bring organisations to their knees and result in substantial financial loss in ransom payments, system restoration, clean-up and the growing impact of regulatory fines. Following these 5 steps will significantly reduce the risk of ransomware attacks.   Logicalis can help you with solutions to address each of these steps helping you navigate to a more secure environment.  For more details, please feel free to contact us.


Alastair Broom
August 9, 2017

It’s common knowledge that there is a global shortage of experienced IT security professionals, right across the spectrum of skills and specialities, and that this shortage is exacerbated by an ongoing lack of cyber security specialists emerging from education systems.

Governments are taking action to address this skills shortage, but it is nevertheless holding back advancement and exposing IT systems and Internet businesses to potential attacks.

Because of this, and despite the fear that other industries may have of Artificial Intelligence (AI), the information security industry should be embracing it and making the most of it. As the connectivity requirements of various different environments become ever more sophisticated, so the number of security information data sources is increasing rapidly, even as potential threats increase in number and complexity. Automation and AI offer powerful new ways of managing security in this brave new world.

At the moment, the focus in AI is on searching and correlating large amounts of information to identify potential threats based on data patterns or user behaviour analytics. These first generation AI-driven security solutions only go so far, though: security engineers are still needed to validate the identification of threats and to activate remediation processes.

As these first generation solutions become more efficient and effective in detecting threats, they will become the first step towards moving security architectures into genuine auto-remediation.

To explore this, consider a firewall – it allows you to define access lists based on applications, ports or IP addresses. Working as part of a comprehensive security architecture, new AI-driven platforms will use similar access lists, based on a variety of complex and dynamic information sources. The use of such lists will undergird your auto-remediation policy, which will integrate with other platforms to maintain consistency in the security posture defined.
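The access-list analogy can be made concrete with a first-match-wins rule table; an AI-driven platform would update entries such as the "known-bad" network below from dynamic threat feeds rather than a hard-coded list. All addresses and rules here are illustrative.

```python
import ipaddress

# Ordered access list, first match wins. Addresses and rules are illustrative;
# 203.0.113.0/24 is a documentation range standing in for a threat feed entry.
RULES = [
    ("deny",  ipaddress.ip_network("203.0.113.0/24"), None),  # known-bad feed
    ("allow", ipaddress.ip_network("10.0.0.0/8"),     443),   # internal HTTPS
    ("deny",  ipaddress.ip_network("0.0.0.0/0"),      None),  # default deny
]

def evaluate(src_ip, dst_port):
    """Return the action of the first matching rule (first-match-wins)."""
    addr = ipaddress.ip_address(src_ip)
    for action, network, port in RULES:
        if addr in network and (port is None or port == dst_port):
            return action
    return "deny"
```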

As we move into this new era in security systems, in which everything comes down to gathering information that can be processed, with security in mind, by AI systems, we will see changes as services adapt to the new capabilities. Such changes will be seen first in Security Operations Centres (SOCs).

Today’s SOCs still rely heavily on security analysts reviewing reports to provide the level of service expected by customers. They will be one of the first environments to adopt AI systems, as they seek to add value to their services and operate as a seamless extension to digital businesses of all kinds.

SOCs are just one example. The security industry as a whole will get the most out of AI, but it needs to start recognising that machines should be left to do what machines do best, freeing people to do what they do best. Any use of this technology will enable the creation of new tools and processes in the cybersecurity space that can protect new devices and networks from threats even before a human can classify that threat.

Artificial intelligence techniques such as unsupervised learning and continuous retraining can keep us ahead of the cyber criminals. However, we need to be aware that hackers will also be using these techniques, so this is where the creativity of the good guys should focus on what is coming next, while the machines do their job of learning and continuous protection.

Don’t miss out: to find out more, contact us – we’ll be delighted to help you with emerging technology and use it to your benefit.


Alastair Broom
July 12, 2017

£170m lost on the London Stock Market in just over a week, and untold damage to the “World’s Favourite Airline”. That’s the cost within the UK to International Airlines Group, the owner of British Airways, after BA’s recent ‘power outage’ incident.

“It wasn’t an IT failure. It’s not to do with our IT or outsourcing our IT. What happened was in effect a power system failure or loss of electrical power at the data centre. And then that was compounded by an uncontrolled return of power that took out the IT system.” Willie Walsh (IAG Supremo) during a telephone interview with The Times.

Willie has since implied that the outage was caused by the actions of an engineer who disconnected and then reconnected a power supply to the data centre in “an uncontrolled and un-commanded fashion”. Could this then actually have something to do with the IT outsource after all, and did a staff member go rogue, or was it down to poor training and change control…?

For me, what this highlights is the need to place greater emphasis on the availability and uptime of the systems that support critical parts of a business’s or organisation’s services, along with robust processes and automation where possible to minimise the impact of an unplanned outage.

All businesses should expect their systems to fail. Sometimes it’s a physical failure of the infrastructure supporting the data centre (power, UPSes, generators, cooling etc.). It can be the power supply itself. Computing, storage or network equipment can fail. Software and systems can suffer an outage. And it can also come down to human error or poor maintenance of core systems or infrastructure.

Coping with a Power Failure

Even if you have two power feeds to your building, and even if they’re from two different power sub-stations and run through two different street routes, those sub-stations are still part of the same regional and national power grid. If the grid fails, so does your power. There’s no way around it, except to make your own. Power surges are handled by monitoring the power across cabinet PDUs, critical PDUs, UPSes, generators and transformers, while assigning a maximum load to all cabinets to make sure that we do not overload our customers’ systems.

Recovering from a Disaster

Recovering from a disaster is something that all organisations plan for; however, not all have a Disaster Recovery (DR) plan, as some consider High Availability (HA) to be more than sufficient. But HA only provides a localised system for failover, whereas DR is designed to cope with a site failure.

The challenge with DR for many of our customers is the cost:

  • First you need to prioritise which application workloads you want to failover in the event of a disaster.
  • Second you need to purchase and manage infrastructure and licensing for these workloads with continuous replication.
  • Third you need a 2nd location.
  • Fourth you need a robust DR plan that allows you to recover your workloads at the 2nd location.
  • Then lastly (which is considered harder) you’ll need to fail back these services once the primary site has been recovered.
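The prioritisation in the first step above can be sketched as a workload catalogue sorted by criticality. The tier values and recovery time objectives (RTOs) below are illustrative assumptions, not recommendations.

```python
# Illustrative workload catalogue: tier 1 is most critical. The names,
# tiers and RTO figures are assumptions for the sketch.
WORKLOADS = [
    {"name": "payments", "tier": 1, "rto_minutes": 15},
    {"name": "intranet", "tier": 3, "rto_minutes": 1440},
    {"name": "crm",      "tier": 2, "rto_minutes": 240},
]

def failover_order(workloads):
    """Recover the most critical workloads first (lowest tier, tightest RTO)."""
    ranked = sorted(workloads, key=lambda w: (w["tier"], w["rto_minutes"]))
    return [w["name"] for w in ranked]
```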

This can be an expensive option, but this is also where things like Cloud DR-as-a-Service can help minimise any expenditure, and the pain associated with owning and managing a DR environment.

Reducing the impact of an outage

Minimising the impact of any form of physical failure should be a priority over recovering from an outage. Workflow automation can help a business maintain uptime of applications and services. This can be defined as a policy where services can be moved to other systems locally, or all services can be re-provisioned to a DR location or a DR platform in the event of an outage caused either by a power issue or human error, helping a business minimise both the risk and the impact of an outage.

I’ll let you come to your own conclusions as to whether British Airways should adopt robust change control, automation or DR policies. Logicalis can assist and provide you with a number of options tailored to your particular needs, so that you are not the subject of the next press headline.

Alastair Broom
May 16, 2017

What if I told you that Ransomware is on its way to becoming a $1 billion annual market?

Eyebrows raised (or not), it is a matter of fact in 2017 that Ransomware is an extremely lucrative business, evolving at an alarming rate and becoming more sophisticated day by day.

But, the question remains, what is Ransomware?

Ransomware is malicious software – a form of malware – that either disables a target system or encrypts a user’s files and holds them ‘hostage’ until a ransom is paid. This malware generally operates indiscriminately, with the ability to target any operating system within any organisation. Once the malware has gained a foothold in an organisation, it can spread quickly, infecting other systems, even backup systems, and can therefore effectively disable an entire organisation. Data is the lifeblood of many organisations and without access to this data, businesses can literally grind to a halt. Attackers demand that the user pay a fee (often in Bitcoin) to decrypt their files and get them back.

On a global scale, more than 40% of ransomware victims pay the ransom, although there is no guarantee that you will actually get your data back, and copies of your data will now be in the attacker’s hands. In the UK, 45% of organisations reported that a severe data breach caused systems to be down, on average, for more than eight hours. This makes it apparent that the cost is not only the ransom itself, but also the significant resources required to restore systems and data. What is even more alarming is that the number of threats and alerts in the UK is significantly higher than in other countries (Cisco 2017 Annual Cybersecurity Report). Outdated systems and equipment are partially to blame, coupled with the belief that line managers are not sufficiently engaged with security. Modern and sophisticated attacks like ransomware require user awareness, effective processes and cutting-edge security systems to prevent them from taking your organisation hostage!

How can you protect your company?

As one of the latest threats in cybersecurity, much has been written and said about ransomware and potential ways of preventing it. A successful mitigation strategy involving people, process and technology is the best way to minimise both the risk of an attack and its impact. Your security programme should consider the stages before, during and after an attack: protecting the organisation from attack, detecting ransomware and other malware, and how the organisation should respond once an attack has occurred. Given that ransomware can penetrate organisations in multiple ways, reducing the risk of infection requires a holistic approach rather than a single point solution. It takes seconds to encrypt an entire hard disk, so IT security systems must provide the highest levels of protection, rapid detection, and strong containment and quarantine capabilities to limit damage. Paying the ransom should be viewed as an undesirable, unpredictable last resort, and every organisation should take effective measures to avoid this scenario.

Could your organisation be a target?

One might imagine that only large corporations are at risk of a ransomware attack, but this is far from the truth. Organisations of all industries and sizes report ransomware attacks leading to substantial financial loss, data exposure and potential brand damage. The reason is that all businesses rely on the availability of data – employee profiles, patents, customer lists, financial statements and so on – to operate. Imagine the impact of a ransomware attack on a police department, city council, school or hospital. Whether an organisation operates in the public or private sector, in banking or healthcare, it must have an agile security system in place to reduce the risk of a ransomware attack.

Where to start?

The first step in shielding your company against ransomware is to audit your current security posture and identify areas of exposure. Do you have the systems and skills to identify an attack? Do you have the processes and resources to respond effectively? As ransomware disguises itself and uses sophisticated hacking tactics to infiltrate your organisation’s network, it is important to constantly seek innovative ways to protect your data before any irreparable damage is done.

With our Security Consultancy, Managed Security Service offerings and threat-centric Security product portfolio, we are able to help our customers build the holistic security architecture needed in today’s threat landscape.

Contact us to discuss your cyber security needs and ensure you aren’t the next topic of a BBC news article.


Category: Security

Alastair Broom
March 10, 2017

As Logicalis’ Chief Security Technology Officer I’m often asked to comment on cyber security issues. Usually the request relates to specific areas such as ransomware or socially engineered attacks. In this article I’m taking a more holistic look at IT security.

Such a holistic approach to security is, generally, sorely lacking. This is a serious matter, with cyber criminals constantly looking for the weak links in organisations’ security, constantly testing the fence to find the easiest place to get through. So, let’s take a look at the state of enterprise IT security in early 2017, using the technology, processes and people model.


A brief, high-level look at the security market is all it takes to show that there are vast numbers of point products out there – ‘silver bullet’ solutions designed to take out specific threats. There is, however, little in the way of an ecosystem supporting a defence-in-depth architecture. Integration of, and co-operation between, the various disparate components is, although growing, typically weak or non-existent.

We’ve seen customers with more than 60 products deployed, from over 40 vendors, each intended to address a specific security issue. Having such a large number of products presents significant security challenges in itself, though. All these products share a common vulnerability: support and maintenance. Managing them and keeping them updated generates significant workload, and any mistakes or unresolved issues can easily become new weak points in the organisation’s security.

The situation has been exacerbated by the rapidly increasing popularity of Cloud and Open Source software. Both trends make market entry significantly simpler, allowing new players to quickly and easily offer new solutions, targeting whichever threat happens to be making a big noise at the moment.

Just as poor integration between security products is an issue, so is the lack of integration between the components on which they are built. Through weak coding or failure to make use of hardware security features – Intel’s hardware-level Software Guard Extensions (SGX) encryption technology is a good example – security holes are left open, waiting to be exploited.

The good news on the technology front is that we are seeing the early stages of the development of protocols, such as STIX, TAXII and CybOX, allowing different vendors’ products to interact and share standardised threat information. The big security vendors have been promoting the idea of threat information sharing and subsequent action for a while, but only within their own product ecosystems. It’s time for a broader playing field!
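To make this concrete, here is a minimal sketch of the kind of standardised threat information these protocols exchange: a STIX 2.1 indicator object describing a malicious file hash, built as plain JSON in Python. The name and hash value are invented for illustration.

```python
import json
import uuid
from datetime import datetime, timezone

def make_indicator(name, sha256):
    """Build a minimal STIX 2.1 indicator for a known-bad file hash."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": "indicator--" + str(uuid.uuid4()),
        "created": now,
        "modified": now,
        "name": name,
        # STIX patterning language: match any file with this SHA-256.
        "pattern": "[file:hashes.'SHA-256' = '%s']" % sha256,
        "pattern_type": "stix",
        "valid_from": now,
    }

indicator = make_indicator("Suspected ransomware dropper", "a" * 64)
print(json.dumps(indicator, indent=2))
```

Because the object is plain, vendor-neutral JSON, any product that speaks STIX can consume it; TAXII then defines how such objects are transported between organisations.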


IT security is one of the most important issues facing today’s enterprise, yet, while any self-respecting board will feature directors with responsibility for sales, marketing, operations and finance, few enterprises have a board level CISO.

Similarly, few organisations have a comprehensive and thoroughly considered security strategy in place, or proper security processes and policies suitable for today’s threat landscape and ICT usage patterns. A number of industry frameworks exist – ISO 27001, Cyber Essentials and NIST, to name but a few – and yet very few organisations adopt them beyond the bare minimum needed to meet regulatory requirements.

Most organisations spend considerable sums on security technology, but without the right security strategy in place, and user behaviour in line with the right processes and policies, they remain at risk of serious breaches.


The hard truth is that some 60% of breaches are down to user error. Recent research obtained through Freedom of Information requests found that 62% of breaches reported to the ICO are down to humans simply getting it wrong. People make poor password choices, use insecure public (and private!) WiFi, and use public cloud storage and similar services without taking the necessary security precautions. They do not follow, or indeed even know, corporate data classification and usage policies. The list, of course, goes on.

Training has a part to play here, increasing users’ awareness of the importance of security and of the behaviours they need to adopt (and discard) to stay secure. However, there comes a point at which the law of diminishing returns kicks in: we all make mistakes – even the most careful, well-trained of us.

We need to explore, discover and devise new ways in which technology can help, by removing the human element, where possible and desirable, and by limiting and swiftly rectifying the damage done when human error occurs. Furthermore, we need to leverage ever improving machine learning and artificial intelligence software to help augment human capability.

Enterprises need to work with specialists that can help them understand the nature of the threats they face, and the weak links in their defences that offer miscreants easy ways in. That means closely examining all aspects of their security from each of the technology, processes and people perspectives, to identify actual and potential weaknesses. Then robust, practical, fit-for-purpose security architectures and policies can be built.

For an outline of how this can work, take a look at Logicalis’ three-step methodology here or email us to discuss your cyber security needs.

Category: Security

Alastair Broom
December 15, 2016

Last week I read that you can now hijack nearly any drone mid-flight just by using a tiny gadget.

The gadget goes by the name of Icarus, and it can hijack a variety of popular drones mid-flight, allowing attackers to lock the owner out and take complete control of the device.

Besides drones, the gadget can fully hijack a wide variety of radio-controlled devices, including helicopters, cars, boats and other remote-controlled gear that runs over the most popular wireless transmission control protocol, DSMx.

Although this is not the first device we have seen that can hijack drones, it is the first to hand over full control. Icarus works by exploiting the DSMx protocol, granting attackers complete control over target drones: they can steer, accelerate, brake and even crash them.

The attack relies on the fact that the DSMx protocol does not encrypt the ‘secret’ key that pairs a controller with the controlled device, so it is possible for an attacker to recover this secret key through brute force.
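To see why an unencrypted, guessable pairing secret is so dangerous, consider the simple arithmetic of a brute-force search. The key sizes and guess rate below are illustrative assumptions, not figures from the Icarus research:

```python
def brute_force_seconds(key_bits, guesses_per_second):
    """Worst-case time to exhaust a key of the given size by trying every value."""
    return (2 ** key_bits) / guesses_per_second

# A small protocol secret falls in seconds or minutes to modest hardware,
# whereas a 128-bit key (as used by properly encrypted links) never will.
for bits in (16, 32, 128):
    secs = brute_force_seconds(bits, guesses_per_second=10_000_000)
    print(f"{bits}-bit key: {secs:.3g} seconds worst case")
```

The gap between the small keyspaces and the 128-bit case is the whole argument for building encryption into the pairing handshake rather than relying on obscurity.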

You can also watch the demonstration video to learn more about the Icarus box.

There is no mitigation for this issue at the moment, other than to wait for the affected manufacturers to release patches and update their hardware with encryption mechanisms that secure the communication between controller and device.

Having seen this video and the potential impact of this hijacking technique, my first thought was about the threat to Amazon’s forthcoming service, which will allow drones to deliver packages safely to people’s homes in under 30 minutes.

This is just another example of how important it is to define the right strategy around encryption as part of security in the digital era. Business data, and the way we want to access that data from any device, anywhere, at any time, only highlights the need for enhanced and clever security solutions.

There are different ways Logicalis can help our customers protect data located in data centres and on endpoints, with the help of an ecosystem of partners such as Cisco and Intel Security.

One offering worth mentioning is the Logicalis Endpoint Encryption Managed Service. This service gives our customers’ devices, and the data within them, a level of protection that provides peace of mind should a device be lost or stolen – and we at Logicalis manage the service for them. The service is a market leader in data protection, providing the highest levels of confidentiality, integrity and availability, and forms part of the global strategy adopted by Logicalis Group across EMEA.

Category: Automation, Security

Alastair Broom
December 13, 2016

Morpheus, in one of the most iconic scenes of the Matrix trilogy, said: “You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland, and I show you how deep the rabbit-hole goes.”

Let me ask you something: what about taking decisions like the one Morpheus offered based on additional information that could be used to better evaluate the two options? Would that have influenced Neo to change his mind?

According to the Harvard Business Review, many business managers still rely on instinct to make important decisions, often leading to poor results. However, when managers incorporate logic into their decision-making processes, the result is better choices and better outcomes for the business.

In today’s digital world, it’s difficult to ensure the integrity of mission critical networks without a detailed analysis of user engagement and an understanding of the user experience.

HBR outlines three ways to introduce evidence-based management principles into an organization. They are:

  • Demand evidence: Data should support any potential claim.
  • Examine logic: Ensure the logic is consistent and be on the lookout for faulty cause-and-effect reasoning.
  • Encourage experimentation: Invite managers to conduct small experiments to test the viability of proposed strategies and use the resulting data to guide decisions.

So, the big question is, would it be possible to introduce these three elements into the tasks assigned to the network manager?

The answer is ‘yes’ provided the manager is given the opportunity to integrate with network data that carries the context of users, devices, locations and applications in use and then given the opportunity to mine this captured data to gain insights into how and why systems and users perform the way they do.

Fortunately, the limitations of traditional networks can be overcome with new network platforms that provide in-depth visibility into application use across the corporate network, helping organisations deliver significant, cost-effective improvements to their critical technology assets. They achieve this by:

  • improving the experience of connected users
  • enhancing their understanding of user engagement
  • optimizing application performance
  • improving security by protecting against malicious or unapproved system use.

According to IDC, “With the explosion of enterprise mobility, social and collaborative applications, network infrastructure is now viewed as an enterprise IT asset and a key contributor to optimizing business applications and strategic intelligence.”

For companies facing the challenge of obtaining deep network insights to improve application performance and leverage business analytics, Logicalis is the answer.

Logicalis is helping its clients deliver digital-ready infrastructure as the main pillar for enhancing user experience and business operations, and for taking secure analytics to the next level of protection for business information.

Alastair Broom
December 11, 2016

21 times in the past 12 months! According to the following BBC article, that’s how many times Bournemouth University has been hit by ransomware attacks (University hit 21 times in one year by ransomware – BBC News). 23 universities have suffered the same fate, as have 28 NHS trusts. And the problem continues in the enterprise space. According to Kaspersky Labs research, over 50,000 corporate PCs were infected with ransomware in 2015, 100% year-on-year growth compared with 2014 – and that’s just the data Kaspersky has access to. The FBI states that in Q2 of 2015, 4 million unique ransomware samples were detected, and that in 2016 over $1 billion will be paid out in ransomware attacks to retrieve data. I may be in the wrong job…

The thing is, ransomware has in the past been what I would term a typically opportunistic attack. It was predominantly targeted at individuals and relied on relatively basic techniques, such as phishing scams or drive-by downloads, to get the infection done. Attacks ranged from scareware to screenlockers, all the way to encryption attacks, and would demand a reasonably small sum from the victim to decrypt locked files (assuming you were lucky enough to get the files back at all). You would hand over some Bitcoins – the currency hackers crave due to its global and anonymous nature – and hope the decryption key was handed over in return.

But that is no longer the case. Ransomware is now far better classed as an APT – an Advanced Persistent Threat – and carries all the industrialised process behind it that we expect of corporate- or state-funded cyber warfare. This is big business, and attackers are fully aware of the cost corporations are exposed to if they are unable to access that most precious of resources – their data. Attackers will spend huge effort researching their victim – footprinting, scanning, enumerating – and will then follow a very well-thought-out plan to realise their desired outcomes!

Not good news, but we can help! At Logicalis, we have been helping our clients prepare for and defend against ransomware and other APTs. Let’s specifically address ransomware.

We understand the key stages involved in a ransomware attack (infection, execution, backup spoliation, encryption, notification and clean-up), and so can advise you on the best lines of defence and provide a number of services to make sure you are protected. The list below is a great place to start:

  • Reduce logged-in user privileges so that users cannot execute PowerShell or any other code they do not require.
  • Remove the command prompt from Citrix or logon profile sessions.
  • Use application whitelisting so that only authorised applications can run.
  • Prevent or disallow macro execution via group policy.
  • Investigate using Microsoft EMET where appropriate.
  • Use a SIEM to monitor for, for example, process injection.
  • Ensure there is a full backup of all data and systems so you can restore if and when required.
  • Use AppLocker or a Software Restriction Policy (SRP).
  • Disable ActiveX in browsers.
  • Install a personal firewall or use HIPS.
  • Block binaries running from %APPDATA% and %TEMP% paths.
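On the backup point in particular, it is worth verifying backups as well as taking them: a backup that has itself been encrypted by ransomware is no backup at all. The sketch below is a deliberately simplified illustration of the idea (not a production tool): it records a SHA-256 hash of each backed-up file and later flags any file whose contents have changed.

```python
import hashlib
from pathlib import Path

def hash_file(path):
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(backup_dir):
    """Map each file's relative path to its hash at backup time."""
    root = Path(backup_dir)
    return {str(p.relative_to(root)): hash_file(p)
            for p in root.rglob("*") if p.is_file()}

def verify(backup_dir, manifest):
    """Return the relative paths whose contents no longer match the manifest."""
    current = build_manifest(backup_dir)
    return sorted(name for name, digest in manifest.items()
                  if current.get(name) != digest)
```

Run `build_manifest` when the backup is taken and store the result somewhere the backup host cannot write to; a non-empty result from `verify` on a supposedly static backup set is an early warning that something has tampered with it.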

With our Security consultancy, Managed Security Service offerings and threat-centric Security product portfolio, we are able to help our customers build the holistic security architecture needed in today’s threat landscape. When it comes to ransomware, our solutions in Anti-Virus, Endpoint Protection, Data Loss Protection, Advanced Malware Protection and Managed SIEM will ensure you aren’t the next topic of a BBC news article.

Category: Security

Alastair Broom
December 10, 2016

I was recently asked what I think will be three things making an impact on our world in 2017, with a few permutations of course:

Maximum of 3 technologies that will be significant for enterprises in terms of driving value and transforming business models and operations in 2017

Innovations that are most likely to disrupt industries and businesses

I’ve put my three below – it would be great to hear your thoughts and predictions in the comments!

Internet of Things

The Internet of Things is a big one for 2017. Organisations will move from exploring what IoT means for them in theory to rolling out sensors across key opportunity areas and starting to gather data from what were previously “dark assets”. The reason IoT is so important is the amount of data the things will generate, and the new insight this gives organisations, including physical asset utilisation and optimisation and proactive maintenance. Those organisations that take IoT seriously are going to see their customers, their data and their opportunities in completely new ways. Being able to add more and more data sources into the “intelligence stream” means decisions are backed by more facts. It’s Metcalfe’s Law – the value of a network is proportional to the square of the number of users. Data is the network, and each thing is another user.
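The Metcalfe’s Law point is easy to see numerically. A toy sketch, with purely illustrative numbers:

```python
def network_value(n):
    """Metcalfe's Law: a network's value grows with the square of its users."""
    return n * n

# Each connected "thing" is another user of the data network, so doubling
# the number of connected sensors quadruples the potential value.
for things in (100, 200, 400):
    print(f"{things} things -> value {network_value(things):,}")
```

The absolute figures mean nothing; the point is the shape of the curve – each additional data source multiplies, rather than adds to, the insight available.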

Being prepared to exploit the IoT opportunity, especially at scale, will take proper planning and investment. Organisations will need a strategy for IoT, one that identifies quick wins to help build the business case for further initiatives. The correct platform is key: an infrastructure for things. The platform that connects the things to the network will need to be robust and will likely be a mix of wired and wireless; and because it is unlikely to be a separate infrastructure, it needs the visibility and control required to ensure data is correctly identified, classified and prioritised.

Security, too, will be fundamental. Today the things are built for user convenience, with security a secondary concern. The IoT therefore represents a massively increased attack surface, one that is particularly vulnerable to unsophisticated attack. The network will need to be an integral part of the security architecture.

Edge Analytics

Edge analytics is another one to look out for. As the amount of data we look to analyse grows exponentially, the issue becomes twofold. One: what does it cost to move that data from its point of generation to a point of analysis? Bandwidth doesn’t cost what it used to, but paying to transport terabytes, and potentially petabytes, of information to a centralised data processing facility (the data centre, that is) will add significant cost to an organisation. Two: having to move the data, process it and then send an action back adds lag. The majority of data we have generated to this point has been for systems of record, where a lag to actionable insight may well be acceptable. But as our systems change to systems of experience, or indeed systems of action, lag is unacceptable.

Analytics at the edge equates to near real-time analytics. The ability to take data in real time, with its context, analyse it alongside potentially multiple other sources of data, and then present back highly relevant, in-the-moment intelligence – that’s amazing. Organisations once again need to ensure the underlying platform is up to the task: able to capture the right data, maintain its integrity, conform to privacy regulations and manage the data throughout its lifecycle. Technology will be needed to analyse the data at its point of creation; essentially, you will need to bring compute to the data (and not the other way round, as is typical today).
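A rough back-of-the-envelope calculation shows why both concerns bite at scale. All the figures below are illustrative assumptions, not quoted prices:

```python
def monthly_transport_cost(tb_per_day, cost_per_gb):
    """Cost of hauling raw telemetry to a central data centre each month."""
    return tb_per_day * 1024 * cost_per_gb * 30

def round_trip_ms(distance_km, processing_ms):
    """Naive lag: speed-of-light-in-fibre round trip plus central processing."""
    fibre_km_per_ms = 200  # light in fibre travels roughly 200 km per ms
    return 2 * distance_km / fibre_km_per_ms + processing_ms

# A site generating 5 TB/day at an assumed $0.05/GB transport rate:
print(f"${monthly_transport_cost(5, 0.05):,.0f}/month just to move the data")
# An assumed 1,500 km to the data centre plus 20 ms of central processing:
print(f"{round_trip_ms(1500, 20):.0f} ms before any action can be taken")
```

Processing at the edge attacks both terms at once: only summaries or exceptions travel over the wire, and the action loop never leaves the site.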

Cognitive Systems

Lastly, cognitive systems. Computers to this point have been programmed by humans to perform fairly specific tasks. Cognitive systems will not only “learn” what to do from human interaction, but also from the data they generate themselves, alongside data from other machines. Cognitive systems will continually reprogram themselves, each time getting better at what they do. And what computers do is help us do the things humans can do, but faster. Cognitive systems will expand our ability to make better decisions – to help us think better. They move us from computing systems essentially built to calculate really fast to systems built to analyse data and draw insights from it. This extends to predicting outcomes based on current information and the consequences of actions. And because it’s a computer, we can draw insight from a far greater base of information. We humans are really bad at remembering a lot of information at once, but computers (certainly for the short term) are constrained only by the amount of data we can hold in memory to present to a compute node for processing.

Alastair Broom
November 30, 2016

As you might have read back in November 2016, a huge Distributed Denial of Service (DDoS) attack against Dyn, a major Domain Name System (DNS) provider, broke large portions of the Internet, causing a significant outage for a host of websites and services, including Twitter, GitHub, PayPal, Amazon, Reddit, Netflix and Spotify.

How did the attack happen? What was the cause behind the attack?

Although exact details of the attack remain vague, Dyn reported that an army of hijacked internet-connected devices is thought to be responsible for the large-scale attack, similar to a method recently employed by hackers to carry out a record-breaking DDoS attack of over 1 Tbps against the French hosting provider OVH.

According to security intelligence firm Flashpoint, Mirai bots were detected driving much, but not necessarily all, of the traffic in the DDoS attacks against Dyn. Mirai is a piece of malware that targets Internet of Things (IoT) devices such as routers, security cameras and DVRs, and enslaves vast numbers of these compromised devices into a botnet, which is then used to conduct DDoS attacks.

This type of attack is notable and concerning because it largely relies on unsecured IoT devices, which are growing exponentially in number. Many of these devices are deployed in a way that means they cannot easily be updated, and they are thus nearly impossible to secure.

Manufacturers focus mainly on the performance and usability of IoT devices but ignore security measures and encryption mechanisms, which is why the devices are routinely hacked and are increasingly becoming part of DDoS botnets used as weapons in cyber attacks.

An online tracker of the Mirai botnet suggests there are more than 1.2 million Mirai-infected devices on the Internet, with over 166,000 active right now.

IoT botnets like Mirai are growing rapidly, and there is no easy way to stop them.

According to officials speaking to Reuters, the US Department of Homeland Security (DHS) and the FBI are both investigating the massive DDoS attacks hitting Dyn, but neither agency has yet speculated on who might be behind them.

At Logicalis UK, we take a threat-centric approach. We can help customers protect their applications and environments against DDoS attacks with on-premise, cloud-based or hybrid deployments, based on solutions from our partner F5.

F5 provides seamless, flexible and easy-to-deploy solutions that enable a fast response, no matter what type of DDoS attack you’re under. Together, Logicalis and F5 can:

  • Deliver multi-layered DDoS defence from a single box, with a fast-acting, dual-mode appliance that supports both out-of-band processing and inline mitigation, while enabling SSL inspection and guarding against layer 7 application attacks.
  • Stop attacks on your data centre immediately, with in-depth DDoS defence that integrates appliance and cloud services for immediate cloud off-loading.
  • Defeat threats cloaked behind DDoS attacks, with unique layer 7 application coverage that avoids impacting legitimate traffic.
  • Activate comprehensive DDoS defence with less complexity and greater attack coverage than most solutions.

If you would like to find out more about Logicalis’ advanced security practice, please do not hesitate to get in touch. Our experts are primed and ready to support you.

Alastair Broom
November 29, 2016

Last week we played host to a number of our customers (30 people from 18 different organisations, in fact) at an event held at Sushisamba in London. From the 39th floor, overlooking most of London, I had the privilege of hosting some of our existing and potential clients for a discussion prompted by the upcoming General Data Protection Regulation (GDPR). Over an absolutely fantastic lunch, which thankfully included many tasty meat dishes – I’m no huge fan of raw fish – we talked about how organisations are going to have to rethink their strategy around data governance and security in the face of a very tough new law.

I just wanted to give you a few takeaways from the day, none of which are edible – I’m sorry…

The first of our guest speakers was Lesley Roe, Data Protection Officer at the IET. Lesley spoke about what the IET is doing to get ready for GDPR. They hold a vast amount of personal data, and given that they advise their membership on all manner of related things, they need to lead by example. Key points from her presentation were:

  • GDPR is about giving people more control over their personal data. Every day we share an extraordinary amount of personal data with all manner of organisations, and this data is valuable. GDPR is about ensuring we retain the rights to that value: what the data is processed for, who can process it, and how it is retained or deleted once its useful life has expired.
  • Everyone has a part to play and training of staff & staff awareness are paramount. This, however, is no mean feat.
  • The process of data governance, and the education of that process throughout the organisation, will be the only way to fully comply with the regulation. How do the IET classify old & new data? How do they manage the lifecycle of the data? How do they make sure they are only obtaining, using and retaining the data they need and have consent for?
  • None of this, however, is possible without first knowing what personal data is within your organisational context, and where it lives.
  • Much of the thinking around GDPR will be a huge shift in the mindset of organisations today. Companies just do not think about their data assets and their responsibility for that data in the spirit of the regulation at all.
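Knowing what personal data you hold, and where it lives, is fundamentally a discovery and classification problem. The fragment below is a deliberately simplified illustration of the idea – a real discovery tool covers far more data types, file formats and storage locations – scanning a piece of text for two kinds of personal data:

```python
import re

# Illustrative-only patterns for two categories of personal data.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
}

def classify(text):
    """Return which categories of personal data appear in a piece of text."""
    return sorted(name for name, pattern in PATTERNS.items()
                  if pattern.search(text))

print(classify("Contact jane.doe@example.com, NI number QQ123456C"))
# → ['email', 'uk_ni_number']
```

Run across file shares, mailboxes and databases, this kind of pattern matching is how automated discovery builds the initial map of where personal data resides, which is then refined by the manual review the regulation’s spirit demands.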

Our next two speakers were from two of our technology partners, VMware and Palo Alto Networks. Things to remember here:

  • Technology, without a doubt, has a part to play in ensuring compliance. The regulators are far more savvy about what is possible in the security market, and they will expect organisations to leverage technologies – within reach of their budgets and proportionate to their exposure – to best mitigate any risks to the rights of individuals.
  • The ability to prevent, detect and report on the nature and extent of any breaches will be very important. Technologies will be needed to prove that organisations can do this effectively and efficiently, especially in the face of stringent reporting requirements.
  • State of the art will really mean state of the art. Regulators will assess how organisations are using the best possible mix of technologies to minimise both exposure to risk and the impact of any breach, if and when one occurs.

The last presentation was from Ed Charvet and his guest star Ian De Freitas. Ian is part of the alliance we have with legal experts BLP. The joint value proposition Ed and Ian spoke about is what I believe makes us entirely unique in this space:

  • The first step towards compliance is data discovery – what is personal data from the perspectives of both the GDPR and the organisations’ context? Where is the data? How is data currently classified? How is it processed? How are permissions obtained? This is delivered through a mix of manual and automated processes to help customers understand where they stand today.
  • But this process takes time. The regulation comes into force on 25th May 2018, and as Ian made clear, the regulator is taking a “zero day” approach. This effectively means that if you’re not fully compliant on that day, you are non-compliant, and the regulator has every power to come after you. With fines of up to 4% of global group revenues, or EUR20 million (whichever is the greater, of course), this is a regulation with teeth – and with what seems like a very real political agenda. Watch out Facebook, Google, Amazon…
  • Being compliant with the likes of the DPA today, while impressive, would still mean that on day zero you do not comply with the new law.
  • Key questions to ask are: do you have a legitimate interest to process the data? What exactly are you planning to do with it? These will need to be made very clear even before the gathering of data has begun.

What became clear throughout the day is that time is tight to reach compliance, and the ICO in the UK seems to be recruiting in earnest to gear up for real enforcement of the law. This feels like something that is going to change how data – and in particular personal rights and data – is valued and protected by the organisations that benefit most from it. What organisations need to do, as a matter of urgency, is find out what personal data they hold and where they store it. They need to assess their current security infrastructure and find out what gaps exist that could pose a risk of, and ultimately a loss of, personal data. They need to put the right people and procedures in place to comply with new and enhanced rights and tighter reporting deadlines, and they should be working out what Data Protection Impact Assessments need to look like for their organisation to satisfy regulatory requirements.

As a next step, please reach out to me, Ed Charvet or Jorge Aguilera to discuss how Logicalis can help you get ready for GDPR. From the data discovery workshop, to engaging with BLP on legal matters, to technology assessments powered by tools from the likes of VMware and Palo Alto Networks, we really can help customers on the road towards compliance.
