Digitally Speaking

Tim Wadey
March 19, 2018

What is your approach to Digital Transformation and is your business structured for it?


All modern companies are looking at digital transformation, and the key decision they need to make is whether to “become digital” or to “do digital”. “Becoming digital” means turning the whole business or business unit digital, re-engineering from the ground up to take full advantage of the benefits of technology across the value chain. “Doing digital” implies taking specific processes, maybe a customer interaction or a B2B transaction process, and making them digital. Depending on which of these options a business chooses, the approach and qualities of the Digital Transformation function will change.

Digital transformation has grown as a concept over the last few years, but in general it is taken to mean building additional business benefit on the data and data processes that a business owns. This can mean finding efficiencies through process improvement and automation, new opportunities buried within the value of corporate data, or new digital routes to market. A full transformation embraces all of these and more; the emergence of a connected environment, now known as IoT, is opening new opportunities with every technological development.

Becoming Digital: starts with a solid digital business culture

If a business has chosen to “become digital”, the leadership team needs to embrace the objective and fully support the change initiative. That said, the scale of investment and impact of the programme means that a single point of oversight is essential. In some businesses this might fall to a CIO, in others a Chief Digital Officer; either way, these leaders will need support from a team with excellent project and technical skills. In addition, the cultural change will require consideration throughout the process. Probably the most critical attributes that the transformation leaders will need are a clear vision of what digital looks like, the skills to understand how it will be delivered and, most importantly, the drive to sustain a multi-year transformational programme.

In many ways, the Digital Transformation Officer will have to lead the senior team through this programme, and these qualities combined with the soft skills to enable this leadership will eventually determine the success of the programme. This role is well suited to an interventional style – enabling the business to focus on BAU while the digital programme is delivered in a defined manner. There have been well-publicised initiatives similar to this in major UK retail banks and across industries, like the airlines, where all aspects of customer interaction have become fully digital.

Doing Digital: requires greater focus on technical skills

Alternatively, if the choice is to “do digital”, the transformation challenge is much more bounded. In this case, the challenge is more to do with having the technical understanding and project management skills to deliver tightly defined digital projects. While these transform the particular process involved, they do not require wholesale change across the business. For most organisations, this will be the chosen option, as there is less risk and disruption in such an iterative approach. We are seeing programmes like this often linked to IoT initiatives across our customer base.

Clearly the CEO will take a close interest in any of these initiatives. However, with the choice to “become digital” they are betting the company, and as such will want the transformation leadership to be part of their senior team and empowered to drive the vision to a conclusion. In choosing to “do digital”, the CEO contains the risk to particular areas and should use their management team to direct these initiatives through a skilled and technically able programme manager. Whatever the approach, there will be a material cost, and the benefits realisation after go-live needs to be driven and measured with similar control and vigour.

No matter what direction you choose to take, speak to one of our experts to help you through your digital transformation journey.


Natalie Goodwin
February 20, 2018

Digital Workplace is not a one size fits all approach

“Workplace of the Future”, “Digital Workplace” and “Future Digital Employee Experience” are definitely the latest buzz phrases in the HR world! Today we work anywhere at almost any time. We work in cafeterias, airport waiting areas, cars, with mobile devices, desktops, laptops, tablets, and the list goes on.

Delivering a great digital workplace in today’s business environment is tricky, yet it is one of the most important things a business can achieve – not least supporting simultaneous communication across different devices, regardless of geographic location.

Nonetheless, few businesses have identified the appropriate tools and processes to create an effective and user-friendly modern workplace. Considering how diverse and geographically scattered many organisations are, it is easy to understand why businesses need a tailored approach – one focused on retaining talent and ensuring employee wellbeing and engagement, all of which are greatly supported by the collaboration tools employees are given.

Digital Workplace in 2018 is all about Engagement, Experience, and Empowerment

Increasing Employee Engagement
Engagement means all employees can feel connected and part of the same team, no matter where they are located.

Engaged employees can increase a company’s performance by up to 202%
(Engage for Success, “The Importance of Employee Engagement”, August 2016).

With advances in technology, from software-defined WAN to tailored applications, co-workers are able to stay in constant communication and feel part of the team, wherever they are based. The easy-to-use, natural interfaces of today’s UC application tools allow employers to maximise adoption and deliver measurable engagement through built-in analytics capabilities.

Elevating Employee Experience
Gen X, Gen Y and now Gen Z talent represents a massive shift towards a more collaborative, connected and faster paced workplace, in which self-expression is encouraged, and autonomy, recognition and global awareness are core terms of employment.

The changing dynamic of a digital savvy workforce means organisations must address and tap into analytics and consider harnessing ‘as a service’ delivery models to raise the bar on talent acquisition, as well as to offer employees a productive, engaging and enjoyable work experience.

A recent study Deloitte completed with Facebook found that only 14 percent of companies believe their internal processes for collaboration and decision making are working well, and 77 percent believe email is no longer a viable tool for effective communication.

A great example of this is organisations starting to redesign their recruitment experience to resemble the consumer experience on e-commerce and social media platforms. They have examined how people search for common items, such as cars, music or other major purchases, and are now looking to weave that into their online recruitment campaigns.

Digital Workspace Employee Empowerment
Employees feel trusted when they are given the power to work when and where they want. When non-desk workers feel included and part of the team, they’re more productive. Every business wants empowered employees: they have been shown to be more satisfied in their roles, and thus more productive. Investing in your business’s digital workspace and enhancing your employees’ work experience is one way to make that happen.

Most organisations today are likely to have a ‘digital workspace’ of some sort, and it will only grow in the years to come. Competitive advantage increasingly comes from providing the right set of collaborative tools – letting employees use technology in the way they want to – and a business culture that puts people first.

Exclusive roundtable: The Future of Work in 2018
Join us and Fuze on March 21st at the Sky Garden for an exclusive roundtable dinner to discuss the future of work in 2018. In this session we will look into how employees’ demands create a complex environment for IT leaders, but at the same time present an opportunity to drive innovation and bolster productivity.

In particular, we will go through:
– The role of technology as the enabler of the future of work
– How to increase user adoption as well as workforce mobility and productivity
– How to reduce application sprawl and shadow IT

So bring your questions to the table and let’s see how we can help you get ahead of the changing working landscape.

Category: Collaboration


Tim Wadey
January 28, 2018

Data Privacy in the spotlight!

Data Privacy Day may not be an official holiday for your IT department, but it definitely should remind you that you need to focus and do more to protect confidential data.

Data Privacy Day was first introduced in 2009 by the Online Trust Alliance (OTA) in response to the increasing number of cybersecurity attacks and data privacy breaches, emphasising the need for effective data protection regulations and transparent processes for collecting and using personally identifiable information (PII).

Examples of PII that fall under data protection regulations are:
• Name;
• Social Security number, full and truncated;
• Driver’s license and other government identification numbers;
• Citizenship, legal status, gender, race/ethnicity;
• Birth date, place of birth;
• Biometrics;
• Home and personal cell telephone numbers;
• Personal email address, mailing and home address;
• Security clearance;
• Financial information, medical information, disability information;
• Law enforcement information, employment information, educational information

If one considers the sources that PII can be collected from – and how many new ones are added daily: big data, the Internet of Things, wearable technology – it is easy to understand why data privacy has become increasingly challenging. And let’s not forget ransomware attacks, the latest major data privacy challenge.

Despite the scale of the recent ransomware attacks, the majority of organisations still don’t have structured processes in place to prepare themselves and keep confidential data safe. Although there are effective steps for protecting against ransomware threats, the number of attacks has increased significantly, and companies often delay announcing breaches for fear of negative publicity.

In order to stop such actions from happening and improve the current data privacy practices, the European Union is introducing the General Data Protection Regulation (GDPR) taking effect in May 2018. This is the biggest shake up of data protection laws in the last 20 years.

What is GDPR?

GDPR is the latest EU-wide regulatory framework that aims to increase data privacy for individuals, and gives regulatory authorities greater power to take action against businesses that breach the new data privacy laws. GDPR also introduces rules relating to the free movement of personal data within and outside the EU.
In particular, GDPR involves:
• Consent for processing personal data must be obtained clearly and must seek an affirmative response.

• Data subjects have the right to be forgotten and erased from records.

• Users may request a copy of personal data in a profile format.

• Parental consent is required for the processing of personal data of children under the age of 16.

As a result, organisations need to be extremely aware of these changes, as they can face very strict fines in cases of non-compliance. Can your organisation afford to be fined up to €20 million, or 4% of annual global revenue, for failing this data privacy regulation, as required by the new General Data Protection Regulation?


24% of companies are unaware of the incoming data protection legislation, while one in three believe that GDPR isn’t relevant to them.*

Get Started with a GDPR Readiness Assessment

In response to the fast approaching data protection regulation, Logicalis UK Advisory Services team have developed a GDPR Readiness Assessment that will allow us to help you understand and frame your thoughts on your journey to compliance.

The Logicalis GDPR Readiness Assessment will help you answer a key question – Where am I on my journey to data privacy compliance, today? By investigating elements of your organisational landscape, we will produce an ‘as is’ assessment, where we will be able to gauge where you are on a standardised maturity curve, considering all things around cybersecurity and data protection.

Get in touch with our Advisory Services to discuss how we can help you in your journey to GDPR Readiness.



*Source: London Chamber of Commerce and Industry


Dean Mitchell
January 18, 2018

In the previous blog post on Capacity Management, we explored why IT projects fail and what you can do to prevent it.  What is even more important is to understand why, when it comes to securing your business, product alone isn’t the answer.

Ransomware, data breaches, insider threats, phishing scams… we’ve all seen the headlines. And, although these words, once reserved for IT departments, are becoming part of everyday vocabulary, that doesn’t make them any less concerning. They have the power to derail your entire business – everything that you’ve built – within seconds.

Nowadays, cybercrime is big business, and you can guarantee that for every security solution churned out by vendors, someone, somewhere is creating brand new malicious code to target vulnerabilities you didn’t even know existed within your organisation. Add to that modern working habits – with more and more businesses adopting cloud and IoT for day-to-day operations to keep up with their competitors, subsequently increasing the potential attack surface – and you soon see that organisations are under siege from all angles.

To state the obvious: cybersecurity is no longer optional.

And this is something that all CIOs are more than aware of. In fact, in our 2017 global CIO Survey, security was cited as the number one concern when it came to an increase in the use of cloud services, with 70% of respondents citing it as a challenge.

So the problem is common knowledge, but what’s the solution?

Well, if your automatic answer is ‘by investing in security products’, then you’re not alone. Many business leaders define ‘security strategy’ as lots of different solutions coming together to work as one protective shield. Each solution is built to defend against a single threat vector, so various email, cloud and web products all become separate pieces of a much larger security puzzle.

Given the sheer volume of security products readily available, it’s no surprise that this puzzle doesn’t come cheap, or that certain pieces aren’t as effective as others. But surely the more products you deploy – effective or otherwise – the more significant your overall security capabilities and the better the protection for your organisation. After all, it’s better to be safe, and slightly out of pocket, than sorry… right?

 It’s an easy trap to fall into.

In reality, a growing number of point solutions patched together is no longer an effective strategy. Instead, this method compounds complexity and creates the very vulnerabilities it is meant to mitigate.

This is because security devices raise an alert for each threat that they detect – that’s how they work. And when you have multiple tools in place, each detecting multiple threats, the chances are that alerts will be going off almost constantly. This is fine; it shows that the solutions are working.

But it’s unlikely that a single organisation will have the manpower needed to deal with each alert simultaneously. Instead, overwhelmed and under-resourced IT teams will probably try to prioritise, and as a result many of the alerts – and therefore threats – are ignored, leaving your organisation vulnerable.

So, when it comes to protecting your business, spending thousands and thousands of pounds worth of your budget on product alone is futile. It’s clear that the ‘best of breed approach’ has had its day, with an increasing number of organisations coming to the realisation that it’s not about how many solutions you have in place, it’s about how you’re using them.

 A problem shared is a problem halved

To simplify things, you can strip your security strategy back to three key areas that all need to be done well: threat insight, vulnerability management and managed endpoint security.

Then, you need to make sure that the solutions you have within these areas are being used correctly. The easiest way to do this, and to make the most out of your resources, is to undertake a collaborative approach.

Take the heat off your own IT team and share the security burden with a partner who can help you to plug the gaps with managed solutions like:

– Managed SIEM/SIRM: a Security Information and Event Management service working in conjunction with a Security Incident Response Management service will provide optimal threat insight. It will solve the biggest and longest headache for your internal IT team – the one that began when you started installing security solutions… External engineering teams will analyse and, effectively, filter the never-ending stream of alerts so that, before they even reach your team, they are prioritised in terms of risk to your business and come with clear actions on how to stop them in their tracks.

– Patch management: by combining patch management services with existing vulnerability scanning in a single service, you can achieve optimal vulnerability management. Believe it or not, this service will fill gaps in your security wall automatically: networks are regularly scanned for vulnerabilities, with any intelligence gathered then rolled into a patching programme. This significantly reduces the time between a vulnerability being identified and being patched.

– Security device management: this incorporates all endpoint security, including antivirus solutions, firewalls and device control, as a single managed service. Delivered via software on laptops, desktops and servers, the service can also detect rogue devices attaching to the network and provide web filtering.
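To make the alert-prioritisation idea in the Managed SIEM/SIRM point above concrete, here is a minimal, purely illustrative sketch of risk-based triage. The alert fields, severity scale and asset-criticality weights are all assumptions invented for the example, not a description of any particular SIEM product.

```python
# Illustrative risk-based alert triage: score = severity x asset criticality,
# then surface only the highest-risk alerts to the in-house team.

# Assumed criticality weights per asset class (hypothetical values).
ASSET_CRITICALITY = {"domain-controller": 5, "database": 4, "laptop": 2, "printer": 1}

def risk_score(alert):
    """Combine detection severity (1-5) with the business criticality of the asset."""
    return alert["severity"] * ASSET_CRITICALITY.get(alert["asset"], 1)

def triage(alerts, top_n=3):
    """Return the top_n alerts, ordered by descending risk score."""
    return sorted(alerts, key=risk_score, reverse=True)[:top_n]

alerts = [
    {"id": 1, "severity": 2, "asset": "printer"},
    {"id": 2, "severity": 4, "asset": "domain-controller"},
    {"id": 3, "severity": 5, "asset": "laptop"},
    {"id": 4, "severity": 3, "asset": "database"},
]

for a in triage(alerts):
    print(a["id"], risk_score(a))  # highest-risk alerts first
```

The point is not the arithmetic but the ordering: a medium-severity alert on a critical asset outranks a high-severity alert on a low-value one, which is exactly the business-risk context an external engineering team adds before alerts reach your team.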

The bottom line is that cybersecurity is not about product. It’s about people, processes and technology working coherently to manage risk and protect your organisation. Often, working collaboratively with providers who can manage your security can be the better option. They will have the resourcing and the skillset to help you deal with any potential threats, while offering more peace of mind.

Today, a third of CIOs see security as the most prominent barrier towards digital transformation. Outsourcing can change that by granting your internal IT teams the gift of time… time that can be used to pursue other areas of your business’ IT strategy.

Talk to us to find out how we can help you.


Originally posted on CBR, 21 November 2017


Category: Security

Alastair Broom
December 13, 2017

Ransomware is a hot topic at the moment and the “attack du jour” for cybercriminals. The code is easy to obtain and campaigns are simple to execute thanks to the industrialised infrastructure that supports them. The result: a rapidly growing market already estimated to be worth in excess of $1 billion.

Criminal gangs provide “ransomware as a service” with everything from code provision through to the monetisation of the attack making it easy to both execute and profit from.  Other types of cyber-attacks may involve complex, time-consuming or risky steps for the criminals but ransomware profits are immediate, usually via Bitcoin payment into the criminal’s wallet.

The traditional approach to cyber security has been to deploy more and more technology to protect against the evolving threat. This has led to uncontrolled technology sprawl, creating a security management nightmare. Throwing more technology at the problem implies more resources to manage the estate – and security resources are hard to come by.

Despite the increased level of sophistication and frequency of ransomware attacks, by following the steps described below, organisations can significantly reduce their risk.

Step 1: Security Awareness Training

Although technology plays an important part, the majority of malware and ransomware attacks involve a user doing something they shouldn’t, such as clicking on a malicious web link, opening an email attachment or installing a new application. Usually this is due to a lack of understanding of the security risks associated with such actions. Phishing, a form of social engineering, is a common technique used to get ransomware inside an organisation, effectively duping users into downloading malicious code or otherwise opening up the organisation’s network to cyber-attack.

Many organisations provide basic security training as part of company induction or as a one-off exercise to address audit requirements. However, security training should be a regular part of users’ development and regularly updated. Training should be delivered at a pace and frequency that fits in with employees’ work schedules, with progress monitored and tested for effectiveness as part of the programme. Security awareness is a critical part of any organisation’s security programme and is fundamental to several security frameworks, such as the UK National Cyber Security Centre’s “10 Steps to Cyber Security” programme.

Step 2:  Vulnerability Management

Most malware infections, including ransomware, compromise our systems via vulnerabilities in operating systems and applications. All too often, these vulnerabilities remain unpatched for months or even years, allowing criminals to exploit the same flaws time and time again.

In 2015, most exploited vulnerabilities had had a patch available for more than one year, and it doesn’t look as though that’s changing. In addition, IBM research shows that when new vulnerabilities are discovered, the average time taken for hackers to exploit them has decreased from 45 days ten years ago to 15 days today.

Effective vulnerability management is critical in ensuring that existing vulnerabilities are dealt with and new ones are patched quickly, before they can be exploited.

Step 3:  Web Protection

Almost all ransomware is delivered across the web.  Regardless of whether the initial infection is via email or a malicious/compromised web site, the malware will normally attempt to contact a remote server to download additional software such as exploit kits or encryption software.  Web security solutions can be used to detect this suspicious activity, preventing the dangerous malware payloads from being downloaded – even if the initial infection is successful.  Modern web security solutions make use of advanced threat intelligence to identify malicious domains and web servers and prevent the malware from receiving its instructions.

Step 4:  Endpoint Protection

Ransomware infection inevitably happens at the endpoint. Typically, a laptop, PC or server will be compromised and used to propagate the malware throughout the network. Traditional signature-based anti-malware solutions are largely ineffective against modern malware, due to rapidly changing code and the time taken by security vendors to identify new malware variants and to create and distribute signatures. Behaviour-based endpoint protection is much more effective at dealing with modern malware, as it identifies malicious behaviour – such as file substitution and registry changes – rather than looking for a specific malware fingerprint in an ever-increasing signature database.
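As a rough illustration of the behaviour-based idea – scoring what a process does rather than matching its code against a signature database – consider this sketch. The behaviour names, weights and threshold are invented for the example; real endpoint products use far richer telemetry and models.

```python
# Toy behaviour-based detector: rather than matching file hashes against
# signatures, score a process by the suspicious actions it performs.

# Hypothetical weights for behaviours typical of ransomware.
SUSPICIOUS_BEHAVIOURS = {
    "mass_file_rename": 4,      # many files rewritten with a new extension
    "registry_run_key_write": 3, # persistence via registry changes
    "shadow_copy_deletion": 5,   # destroying backups before encrypting
    "outbound_c2_beacon": 3,     # contacting a command-and-control server
}
THRESHOLD = 6  # assumed alert threshold for this example

def behaviour_score(observed):
    """Sum the weights of the suspicious behaviours observed for one process."""
    return sum(SUSPICIOUS_BEHAVIOURS.get(b, 0) for b in observed)

def is_malicious(observed):
    """Flag the process once its combined behaviour score crosses the threshold."""
    return behaviour_score(observed) >= THRESHOLD

print(is_malicious(["mass_file_rename", "shadow_copy_deletion"]))  # True
print(is_malicious(["registry_run_key_write"]))                    # False
```

Note how a brand-new ransomware variant with no known signature would still trip this kind of detector, because the behaviours it must perform to do its job are the very things being scored.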

Step 5:  Security Analytics

Steps 1 to 4 above will provide an effective defence against ransomware and malware in general. However, it is impractical to expect your systems never to be breached, so it is imperative that you have visibility into the activity in your environment and the ability to identify breaches and react when they occur. This visibility and identification is provided by a Security Information & Event Management (SIEM) platform. This software will ingest logs from your security infrastructure, servers, routers and so on, and search for suspicious activity that could signal a breach. SIEM provides a “single pane of glass” view into disparate technology, overcoming many of the problems associated with the technology sprawl mentioned earlier and enabling security events to be identified so they can be dealt with quickly.
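The “ingest logs and search for suspicious activity” step can be sketched very simply. Assuming plain-text authentication logs with a “failed login” marker and the source address as the last field – both assumptions made up for this example – the snippet below flags a burst of failures from a single source, one of the classic correlations a SIEM performs across far richer data.

```python
from collections import Counter

# Minimal SIEM-style correlation: flag any source address with a burst of
# failed logins, the kind of pattern a SIEM searches for across ingested logs.

FAILED_LOGIN_LIMIT = 3  # assumed threshold for the example

def failed_login_sources(log_lines):
    """Count failed-login lines per source (last whitespace-separated field)."""
    counts = Counter(
        line.split()[-1] for line in log_lines if "failed login" in line.lower()
    )
    return [src for src, n in counts.items() if n >= FAILED_LOGIN_LIMIT]

logs = [
    "2018-01-18 09:01:02 sshd failed login from 203.0.113.9",
    "2018-01-18 09:01:05 sshd failed login from 203.0.113.9",
    "2018-01-18 09:01:07 sshd accepted login from 198.51.100.4",
    "2018-01-18 09:01:09 sshd failed login from 203.0.113.9",
]

print(failed_login_sources(logs))  # ['203.0.113.9']
```

A real SIEM does this across thousands of log formats and correlates between sources, but the principle – normalise, count, threshold, alert – is the same.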

Ransomware is a plague that threatens the availability of the data we rely on for our businesses to operate. If successful, ransomware attacks can bring organisations to their knees and result in substantial financial loss in ransom payments, system restoration, clean-up and the growing impact of regulatory fines. Following these five steps will significantly reduce the risk of ransomware attacks. Logicalis can help you with solutions to address each of these steps, helping you navigate to a more secure environment. For more details, please feel free to contact us.

Category: Security

Dean Mitchell
December 5, 2017

Whether part of a large, international enterprise, a medium-sized organisation or a small startup, in this day and age, undertaking new IT projects is essential.

Businesses need to adopt new technologies in order to get all the benefits associated with new, innovative IT projects. However, it’s more than that… In our fast-moving, digitally competitive world, if you don’t adapt, you get left behind. For all organisations, it really is survival of the fittest, and the fittest are those who embrace new technologies and invest in innovative IT projects. Think about it – how can you stand up to your competitors if they’re constantly three digital steps ahead?

Undertaking new innovative IT projects has become the key focus of CIOs everywhere. You can spend weeks, even months pushing a project through the planning stages; going over the specific schedule and timings, working out the breakdown and total costings and redefining the objectives. But what happens if it then falls down?

Failure isn’t something that anyone wants to experience. When you’re a CIO who’s spent huge amounts of time and energy getting a proposed project through all the usual barriers to implementation, failure can be even more difficult to accept. And, whatever the size of your organisation, the larger the project, the larger – and more expensive – the problem if it does fail.

A recent global study from Fujitsu found that on average organisations lose £483,690 for every cancelled digital project. That’s a lot of money for a single project, especially for an outcome that could have potentially been avoided.

Why do IT projects fail?

Well, it all comes back to resourcing. When times get tough, CIOs have to throw their efforts into ‘keeping the lights on’, rather than implementing the exciting and innovative new projects that are designed to give their organisations the upper hand against competitors.

Often this will mean that they are working with limited resources from the outset when it comes to new IT projects, and to try to combat this, projects will be run in series rather than in parallel.

However, the fact is that, a lot of the time, various projects rely on the same elements or components. Each project will have a benefit realisation target, which will be recognised upon completion.

If there is a slippage during the implementation of the first project in the series, then the benefit realisation target is not met on time. This then has a knock-on effect; it results in resources being tied up for longer than initially planned which, in turn, affects all the other projects in the series. How can you start a new project when all your assets are tied up somewhere else? Simply put: you can’t, and this is what leads to stalled, and even failed, projects.

But why are resources so few and far between?

Interestingly, a recent independent survey discovered that 22% of CIOs see a lack of skills as the biggest barrier to achieving their objectives. This came ahead of money, culture, alignment and even technology.

So, even if you have the right solutions and technologies in place to complete a project, often it’s the human skills needed to implement them that are tied up, stalling projects and leading to their failure.

Why? Well, both the business landscape and our working habits have changed dramatically over the last decade or so. Whereas previous generations might have secured a job in their 20s and stayed with the same company until retirement, now it’s more common to change jobs every 2 or 3 years. And when people leave, they take their specialist in-house knowledge and their skillset with them, creating a lag or gap.

Add to this the fact that technology is changing at an ever-increasing speed, and the problem is only exacerbated. To keep up, employees often focus on – and are therefore more skilled in – one sort of technology or one area.

However, this means that when they leave the company, their absence is strongly felt. The cyber security skills gap is something that everyone has heard of; it’s well documented. But, the truth is that this skills gap is IT industry-wide.

In fact, according to figures released by Indeed in October, since 2014 demand for software developers and machine learning engineers has increased by 485% in the UK, with there now being an average of 2.3 jobs available for every qualified candidate. It’s no wonder that many organisations are feeling the pinch on the skills front!

All in all, resources are tight. There is very little wiggle-room – especially when it comes to human expertise and technical talent.

You need to focus on keeping business operations running as usual before you even start thinking about additional projects. But you need these additional projects in order to avoid falling behind your competitors in the innovation stakes. And with the speed that technology is changing, you ideally need to be undertaking multiple new innovative projects simultaneously.

So what can be done?

Simply put, there just are not enough resources to do everything.

Or are there?…

It’s true, you can’t just pull extra time and technical know-how out of thin air, or magically create an immediately accessible pool of skills where there isn’t one. It’s clear that, this time, the answers aren’t going to be found within your organisation – so why not look somewhere else?

Talk to us to help you with all the extra resources you need to invest in innovation while ‘keeping the lights on.’ It no longer has to be a dreaded choice, with the need to keep the business running as usual, stifling any form of innovation. Instead, by collaborating, you can have it all.


Originally posted on Information Age, 14 November 2017

Mark Rogers
November 21, 2017

Mark Rogers, CEO Logicalis Group, digs into the Logicalis Global CIO Survey 2017-2018 to pick out some of the major topics arising from the survey of 890 CIOs in 23 countries.

The big themes emerging from this year’s survey break CIO priorities down across three areas that could be mistaken for business as usual: simplify, secure and engage. But, on the contrary, each has its part to play in a much loftier goal – digital transformation.

Indeed, the headline from the 2017-18 survey is this: CIOs say a massive infrastructure overhaul must be coupled with culture change if organisations are to unlock the benefits of digital transformation.

Digital ambition versus digital reality

That headline finding stems from CIOs’ assessment of their organisations’ digital footing.

The survey tells a story of real digital ambition amongst CIOs, but of limited progress in delivering digital transformation. To use tech adoption bell curve terminology, only 5% of respondents call their organisations digital innovators right now, while 49% characterise their organisations as part of an early majority.

That’s not a significant change on last year’s figures – and the reality is that most CIOs see their organisations as partly digitally enabled at best.

Crucially, however, they contextualise those rather cautious views with a realistic and pragmatic assessment of the barriers to digital transformation – and it is their ambitious plans to overcome those barriers that give rise to the ‘simplify, secure, and engage’ triptych:


For 44% of respondents to this year’s CIO survey, complex legacy technology is the chief barrier to digital transformation.

In simple terms, the job of maintaining and managing those complex environments – in the face of ever more serious security threats, and business demand for ever more open architectures – is huge. So legacy complexity doesn’t just slow down or prevent digital projects; it also prevents a refocus on higher-level, strategic activity like digital transformation.

That is clearly not lost on CIOs, who understand very well the urgent need to simplify existing systems – indeed, 51% said they planned to adapt or replace existing infrastructure as a means of accelerating digital transformation.

It’s not hard to envisage CIOs making greater use of cloud services and third party support as a means of both simplifying those systems and handing off some of the management burden associated with them.


It’s no great surprise to see security high on the CIO agenda given the nature of the cyber threat landscape – and no great surprise either to see ransomware top of the threat list for CIOs: it is the biggest threat according to 71% of CIOs surveyed.

More surprising though, is the fact that one in three CIOs admit security concerns have led to the curtailment or cancellation of IT projects – a fact that must surely amplify the impact of security issues on digital transformation.

With that in mind, it is small wonder that so many CIOs (31%) see increased security investment as crucial to digital transformation – and not just to weathering the next cyber threat storm.

I’m in little doubt that CIOs’ security focus will drive an increased demand for services like Cisco Umbrella as organisations adopt multi-layered security solutions capable of defending against an ever-evolving array of cyber threats.


Perhaps most interesting, CIOs see organisational culture as a key barrier to digital transformation. That is, legacy technology brings with it a legacy relationship between business and technology, a ‘separateness’ that is incompatible with a digital model that puts technology at the heart of every aspect of the business.

In response, CIOs want to engage with line of business (LOB) to drive culture change. They want to be the digital ambassadors who create a new relationship between business and technology, and who foster an environment in which digital transformation can thrive.

Analytics offers a case in point. Back in 2015, 63% of CIOs ranked analytics as ‘very important’ or ‘critical’ to driving business innovation.

Two years later, the same barriers to delivering those benefits remain – complex systems and siloed data – but so does the issue of business engagement: the lack of a clear brief from the business as to what is required from analytics is still a problem for 41% of CIOs.

Crucially, though, they are responding: 54% of CIOs are working with LOB colleagues to bottom out requirements and 38% are setting up working groups to unravel complexity.

Those plans to tackle analytics suggest that CIOs are successfully adapting to a changing environment for business IT, an issue we first highlighted in 2015. The big question is whether they will be successful in replicating the approach as they seek to unlock the benefits of wider digital transformation.

In my view, the CIOs that are successful in tackling these three big issues will be those looking outside for help. The majority still spend between 60% and 80% of their time on day-to-day IT management – an issue that, in itself, is a barrier to change.

That’s partly because so much IT remains in-house. Only 25% outsource 50% or more of their IT – a situation that must surely change quickly if CIOs are to free themselves from the everyday and be digital change makers, not change managers.

Read the full Logicalis Global CIO Survey 2017-2018 here.

Justin Price
November 8, 2017

Year by year we are generating increasingly large volumes of data which require more complex and powerful tools to analyse in order to produce meaningful insights.

What is machine learning?

Anticipating the need for more efficient ways of spotting patterns in large datasets, machine learning was developed to give computers the ability to learn without being explicitly programmed.

Today, it largely remains a human-supervised process, at least in the development stage. This consists of monitoring a computer’s progress as it works through a number of “observations” in a data set, arranged to train it to spot patterns between attributes as quickly and efficiently as possible. Once the computer has started to build a model of the patterns identified, it goes through a looping process, seeking to develop a better model with each iteration.
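As a toy illustration of that looping process (not any specific product), here’s a minimal sketch in plain Python: a two-parameter model repeatedly adjusts itself to fit a handful of observations, improving with each iteration. The data points and learning rate are invented for illustration.

```python
# Fit y ≈ w*x + b by gradient descent: the model starts rough
# and improves a little with every pass over the observations.

observations = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]  # (x, y) pairs

w, b = 0.0, 0.0        # the initial, untrained model
learning_rate = 0.01

for iteration in range(2000):            # the looping process
    grad_w = grad_b = 0.0
    for x, y in observations:
        error = (w * x + b) - y          # how wrong the current model is
        grad_w += 2 * error * x
        grad_b += 2 * error
    # nudge the model towards a better fit
    w -= learning_rate * grad_w / len(observations)
    b -= learning_rate * grad_b / len(observations)

print(w, b)  # w ends up close to 2, the slope hidden in the data
```

Real machine learning works with far more parameters and observations, but the pattern – measure the error, adjust, repeat – is the same.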

How is it useful?

The aim is to allow computers to learn for themselves, anticipating fluctuations between variables and so helping us forecast what may happen in future. With a computer model trained on a specific data problem or relationship, data professionals can produce reliable decisions and results, leading to the discovery of new insights which would have remained hidden without this analytical technique.

Real-world Examples

Think this sounds like rocket science? Every time you’ve bought something from an online shop and had recommendations based on your purchase – that’s machine learning. Over thousands of purchases, the website has been able to aggregate the data, spot correlations in real users’ buying patterns, and present the most relevant items back to you based on what you viewed or bought. You may see these as “recommended for you” or “frequently bought together”. Amazon and eBay have been doing this for years, and more recently, Netflix.
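The co-purchase idea can be sketched in a few lines: count how often items appear in the same basket, then surface the items most often seen alongside a given purchase. The baskets below are invented, and real recommender systems are far more sophisticated, but the principle is the same.

```python
# "Frequently bought together" as simple co-occurrence counting.
from collections import Counter
from itertools import combinations

baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"phone", "charger"},
    {"laptop", "mouse"},
]

# Count every pair of items seen in the same basket (both directions).
co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, n=2):
    """Items most frequently bought together with `item`."""
    scores = Counter({other: c for (i, other), c in co_counts.items() if i == item})
    return [other for other, _ in scores.most_common(n)]

print(recommend("phone"))  # case and charger co-occur most with phone
```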

This sounds fantastic – but where can this help us going forward?

Deep learning

Deep learning is distinguished from other data science practices by its use of deep neural networks. Data models pass through networks of nodes, in a structure which mimics the human brain. Structures like this can adapt to the data they are processing, in order to execute in the most efficient manner.

Using these leading techniques, some applications now look ready to have profound impacts on how we live and interact with each other. We are currently looking at the imminent launch of commercially available real-time language translation, which requires a speed of analysis and processing never available before. Similar innovations have emerged in handwriting-to-text conversion with “smartpads” such as the Bamboo Spark, which bridge the gap between technology and traditional note-taking.

Other applications mimic the human components of understanding: classify, recognise, detect and describe. This has now entered mainstream use with anti-spam measures on website contact forms, where the software knows which squares contain images of cars or street signs.

Within the healthcare industry in particular, huge leaps are being made: at Szechwan People’s Hospital in China, models have been “taught” to spot the early signs of lung cancer in CT scan images. This meets a great need, as there is a shortage of trained radiologists to examine patients.

In summary, there have been huge leaps in data analysis and science in the last couple of years. The future looks bright, with ever more sophisticated techniques to apply to a wider range of real-world issues, tackling previously impossible challenges. Get in touch and let’s see what we can do for you.

Category: Analytics, Automation

Dean Mitchell
October 24, 2017

Overspending on resources?

We can all agree, it’s nothing new. In fact, it’s an issue faced by business leaders almost every day. In our increasingly digital world, overspending on technical resources, alongside the human resources (or skills) to back them up, is common.

If you view over-provisioning as a necessary evil, you’re not alone. A recent independent study discovered that 90% of CIOs feel the same way, with the majority only using about half of the cloud capacity that they’ve paid for.

But, why pay for resources that you’re not going to use?

Well, it’s no secret that over-provisioning on IT resources is better than the alternative. Understandably, you’d rather pay over the odds for ‘too many’ functional digital systems than risk the outages associated with ‘too few’. Yet a 2015 study by Populus discovered that almost a third of all outages on critical systems are still capacity-related, proving that over-provisioning is not the only problem here.

It can seem as if organisations are stuck between a rock and a hard place: do you spend thousands and thousands of pounds from your (already) tight budget and over-provision, or do you make an upfront saving and risk becoming one of the 29% of companies experiencing business disruption, downtime or worse when the demand on your services exceeds the resources you have in place? How do you optimise costs without risking future, potentially devastating, strain on your resources?

Enter IT Capacity Management…

In a nutshell, IT Capacity Management gives you a snapshot view of all your business resources against the demands placed upon them. This enables you to ‘right-size’ your resources and ensure that you can meet current requirements without over-provisioning and overspending.
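As a simplified, hypothetical illustration of ‘right-sizing’ (the figures and thresholds below are invented, not a real Capacity Management product), a check might compare provisioned capacity against observed peak demand plus headroom:

```python
# Flag resources that are over- or under-provisioned relative to
# observed peak demand plus a spare-capacity policy.
resources = {
    # name: (provisioned units, observed peak demand)
    "web-vm-pool": (100, 42),
    "db-storage-tb": (50, 47),
    "batch-cluster": (64, 12),
}

HEADROOM = 1.2  # assumed policy: keep 20% spare capacity for spikes

def right_size(provisioned, peak):
    target = peak * HEADROOM
    if provisioned > target * 1.5:
        return "over-provisioned"    # paying for capacity nobody uses
    if provisioned < target:
        return "under-provisioned"   # at risk of a capacity-related outage
    return "right-sized"

for name, (prov, peak) in resources.items():
    print(f"{name}: {right_size(prov, peak)}")
```

In practice the demand figures would come from continuous monitoring rather than a static table, which is exactly why Capacity Management runs as an ongoing process.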

The level of demand placed upon business resources is constantly fluctuating. That’s why Capacity Management models should run alongside your current operations as part of your ongoing business strategy. It’s one way to be proactive when it comes to resourcing.

However, it doesn’t stop there… Capacity Management also enables you to prepare your business for the future. It continually measures the performance and levels of use of your resources in order to make predictions, which will enable you to prepare for any future changes in terms of demand.

What can Capacity Management do for your business?

There are a number of benefits to having IT Capacity Management included in your company strategy. It gives you visibility of your entire IT infrastructure, including all physical, virtual and cloud environments. The importance of this should not be underestimated; it can enable you to:

● Optimise costs. It’s simple: if you have a clear view of all your resources, you can see where they’re not required, which means you won’t feel the need to purchase them “just in case”. Capacity Management can be seen as a long-term investment, especially given its ability to predict future trends based on current performance.
● Easily adjust IT resources to meet service demands. With the ability to see exactly which of your services are under the highest demand, you’ll be able to adjust your business plan accordingly to relieve some of that pressure, levelling the playing field by ensuring that one service area isn’t being drained whilst others sit idle. You’ll be able to add, remove or adjust compute, storage, network and other IT resources as and when they are needed.
● Deploy applications on time. You’ll be able to reserve IT resources to be used for new applications when needed, resulting in a faster time to deployment.
● Reduce time and human resource spend. Imagine the hours your employees spend planning and calculating capacity usage and availability. By implementing a real, ongoing plan which can run in the background, you free up more time for your employees to pursue higher-value tasks.

Capacity Management solves the age-old problem of optimising costs for today’s CIOs. While this has always been a priority for organisations, our new digital landscape has redefined its meaning and its importance. Working habits and IT business structures have evolved to include mobile working, shadow IT, unimaginable amounts of data and complex technological advancements that need a certain skillset to deploy. Therefore, it is impossible to view everything simultaneously and manage all resources accordingly, unless you deploy the correct tools and have the right strategy in place.

Capacity Management should be a key element of any business strategy. It’s a model built for your business’ resourcing needs, both today and in the future.

If you’d like to find out more about the Capacity Management and Cost Optimisation services that Logicalis provides, then contact us today.


Originally posted on Information Age, 18 October 2017.

Sara Sherwani
September 27, 2017

Throughout history, I don’t believe we’ve ever seen as much change as we see today in the world of technology! Just think: in the past 10 years we’ve had more iPhone releases than Henry VIII had wives.

Taking a page out of the tech giants’ books, from Apple to Salesforce, it’s clear that innovation is at the centre of what enables the industry to move at the pace it does. It would be fair to say that three major trends currently dominate the industry:

1. Service, service, service – Many big players in the hardware product space recognise that hardware is fast becoming a vanilla commodity. Vendors such as Cisco, Oracle, Ericsson, Nokia and HP have spent a number of years scrambling to enable value-added services on top of the hardware to increase margins.

 “Services are enabled by the specific knowledge, skills and experience you bring to the table which often drives business value through improved margins.”

Sometimes when I think about how you can build your brand of service that you deliver to customers, I like to compare it to food (one of my favourite subjects).

What keeps you going back to your favourite restaurant? Take McDonald’s, for instance. It could be the quality of the food, but ultimately you KNOW you will get fast, efficient service and a smile when they ask ‘would you like fries with that?’. The point being: it’s the trusted customer experience that underpins successful services. Remember this bit – I’ll come back to it later on.

2. Business process design driven by cost reduction, optimisation and automation – Ultimately, we use technology to make our lives simpler. Traditional IT has become so entrenched in complexity, and with that has come high cost. Businesses of all sizes are looking at their balance sheets with scrutiny and seeking to use the benefits of IT innovation to gain a competitive advantage. The principles of globalisation, business process optimisation and automation are all relevant now as we transform traditional IT towards the ultimate goal of simplicity.

3. Data-driven customer experience as an investment for the future – Products in the world of data analytics are booming as businesses recognise the power of data in enabling intelligent business decisions. One proven example of boosting business value is how telcos use customer location data to send relevant, pinpointed marketing text messages.

Imagine you’re at the airport: intelligent systems pick up your location and send you a text asking if you want to purchase an international data plan while you’re away. Instead of sending you random marketing messages, geo-location marketing becomes targeted and relevant. Through this intelligent marketing, telcos have been able to generate 40% more revenue than expected in that portfolio.

Keeping up with the pace of change within the industry can be overwhelming, unless you harness the key themes mentioned above and relate them to business value. Contact Logicalis today to learn how you can implement an agile business model and use its benefits to increase your business value.

Andrew Newton
September 8, 2017

Shadow IT is not a new concept, but it is certainly a big issue for many organisations today. Companies of all sizes are seeing a significant increase in the use of devices and/or services outside the organisation’s approved IT infrastructure.

A Global CIO Survey found that IT leaders are under growing pressure from Shadow IT and are gradually losing the battle to retain the balance of power in IT decision-making. The threat from Shadow IT is forcing CIOs to re-align their IT strategy to better serve the needs of their line-of-business colleagues, and to transform IT into the first choice for all IT service provision. However, Shadow IT continues to apply pressure to the many CIOs and IT leaders who do not have clear visibility of its use within their organisations and therefore cannot quantify the risks or opportunities.

So is Shadow IT a threat to your organisation or does it improve productivity and drive innovation?

According to Gartner’s report, Shadow IT will account for a third of the cyber-attacks experienced by enterprises by 2020. However, some customers have told us:

  • “Shadow IT is an opportunity for us to test devices or services before we go to IT for approval.”
  • “Shadow IT allows us to be agile and use services that IT doesn’t provide, so we can work more effectively.”

One of the most important aspects of Shadow IT is, of course, cost. What are the hidden costs to the business of a security breach or potential loss of data – and, for those with regulatory compliance requirements, the possibility of large fines and loss of reputation in their respective markets?

With an ever-changing and expanding IT landscape, and new regulations such as the General Data Protection Regulation (GDPR) coming into effect in May 2018, managing and controlling data whilst ensuring complete data security should be top of the priority list. Understanding the key challenges of Shadow IT is therefore fundamental to managing it effectively.

Shadow IT – The Key Challenges:

    • Identifying the use of Shadow IT
      Arguably the biggest challenge with Shadow IT is visibility within the organisation. How can IT leaders see who is using or consuming what, and for what purpose? If you can’t see it, or aren’t aware of it, how can you manage it?
    • Costs of Shadow IT
      Controlling Shadow IT spend is impossible if there is no visibility of what is being used. It is not just direct Shadow IT purchases that present a challenge, but also the consequences of a security breach resulting from Shadow IT: fines, reputational damage and future loss of business.
    • Securing the threat to your business
      One of the biggest areas of concern, and quite rightly so, is the security threat to the business from the use of non-approved IT sources. Not only does this have the potential to add to the organisation’s costs, it could also result in the loss of data, again with the risk of considerable fines.
    • Managing Shadow IT without stifling innovation
      The wrong approach to managing Shadow IT, such as “total lock-down” messaging, can signal to the organisation that IT is controlling, inflexible and unwilling to listen. The possible result is driving Shadow IT underground and, in some cases, actually increasing its use, thus increasing risks and costs.

Shadow IT is a complicated issue, but your response to it doesn’t have to be. Contact us to find out how we can help you manage Shadow IT, be forward thinking and fill the gaps within the current IT infrastructure.

Anis Makeriya
August 21, 2017

It’s always the same scenario: someone gives me some data files that I just want to dive straight into, exploring ways to depict them visually, but I can’t.

I’d fire up a reporting tool only to step right back, realising that for data to get into visual shapes, it needs to be in shape first! One pattern has appeared consistently over the years: the more time spent on ETL/ELT (Extract, Transform and Load, in varying sequences), the less often you are bounced from the reporting layer back to data prep.

Data preparation for the win

‘80% of time goes into data prep’ and ‘garbage in, garbage out (GIGO)’ are adages that have existed for some time, but they don’t actually hit home until you face them in practice and they suddenly translate into ‘backward progress’. Data quality issues vary from inconsistent date formats and multiple spellings of the same value to values not existing at all, in the form of nulls. So how can they all be dealt with? A data prep layer is the answer.
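A data prep layer can be sketched in plain Python. The column names, alias table and date formats below are invented for illustration; they show the three fixes just mentioned: normalising date formats, collapsing multiple spellings of the same value, and handling nulls.

```python
# Minimal data prep: clean each raw row before it reaches reporting.
from datetime import datetime

# Multiple spellings of the same value, mapped to one canonical form.
ALIASES = {"U.K.": "UK", "United Kingdom": "UK", "Eng": "UK"}
# Date formats seen in the wild for this (hypothetical) feed.
DATE_FORMATS = ("%d/%m/%Y", "%Y-%m-%d", "%d-%b-%Y")

def parse_date(value):
    """Try each known format; unparseable dates become missing values."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    return None

def clean_row(row):
    return {
        # collapse aliases, and turn nulls into an explicit marker
        "country": ALIASES.get(row.get("country"), row.get("country")) or "UNKNOWN",
        "order_date": parse_date(row.get("order_date") or ""),
    }

raw = [{"country": "U.K.", "order_date": "31/01/2017"},
       {"country": None, "order_date": "2017-01-31"}]
print([clean_row(r) for r in raw])
```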

Often, with complex transformations or large datasets, analysts find themselves turning to IT to perform the ETL process. Thankfully, over the years, vendors have recognised the need to include commonly used transformations in the reporting tools themselves. To name a few, tools such as Tableau and Power BI have successfully passed this power on to analysts, making time to analysis a flash. Features such as pivoting, editing aliases, and joining and unioning tables are available within a few clicks.

There may also be times when multiple data sources need joining, for example on matching company names. Whilst Excel and SQL fuzzy look-ups have existed for some time, dedicated ETL tools such as Paxata have embedded further intelligence, enabling them to go a step further and recognise that the solution lies beyond just similar spellings between the names.
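The core idea of fuzzy matching – scoring similar spellings – can be illustrated with Python’s standard library (dedicated tools such as Paxata go well beyond this):

```python
# Fuzzy company-name matching with a simple similarity score.
from difflib import SequenceMatcher

def best_match(name, candidates, threshold=0.8):
    """Return the candidate most similar to `name`, or None below threshold."""
    scored = [(SequenceMatcher(None, name.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return match if score >= threshold else None

candidates = ["Logicalis Group Ltd", "Acme Holdings", "Example Corp"]
print(best_match("logicalis group ltd.", candidates))
```

A score-based approach like this catches punctuation and casing differences, but as noted above, real matching problems often need intelligence beyond spelling similarity alone.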

All the tasks mentioned above are for the ‘T’ (Transform) of ETL, which is only the second or third step in the ETL/ELT process! If data can’t be extracted as part of the ‘E’ in the first place, there is nothing to transform. When information lies in disparate silos, it often cannot be merged unless the data is migrated or replicated across stores. Following the data explosion of the past decade, Cisco Data Virtualisation has gained traction for its core capability of creating a merged virtual layer over multiple data sources, enabling quick time to access as well as the added benefits of data quality monitoring and a single version of the truth.

These capabilities are now even more useful with the rise of data services such as Bloomberg and forex feeds, and APIs that can return weather information; and if we want to know how people feel about the weather, the Twitter API works too.

Is that it..?

Finally, after the extraction and transformation of the data, the load process is all that remains… but even that comes with its own challenges: load frequencies; load types (incremental vs. full loads) depending on data volumes; change data capture (slowly changing dimensions) to give an accurate picture of events; and storage and query speeds at the source, to name a few.
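As one illustration of the incremental-vs-full distinction, an incremental load can be sketched with a simple high-water mark: only rows changed since the last load are picked up, and the watermark then advances. The rows and column names here are invented.

```python
# Incremental load via a high-water mark on a last-updated column.
source_rows = [
    {"id": 1, "updated_at": "2017-08-01"},
    {"id": 2, "updated_at": "2017-08-15"},
    {"id": 3, "updated_at": "2017-08-20"},
]

def incremental_load(rows, high_water_mark):
    """Return rows newer than the last load, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > high_water_mark]
    new_mark = max((r["updated_at"] for r in fresh), default=high_water_mark)
    return fresh, new_mark

# Only the rows changed since 10 August are loaded this run.
fresh, mark = incremental_load(source_rows, "2017-08-10")
print(len(fresh), mark)
```

On small volumes a full reload is simpler; the watermark approach pays off when data volumes make reloading everything impractical.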

Whilst for quick analysis a capable analyst with best-practice knowledge will suffice, scalable, complex solutions need the right team from both the IT and non-IT sides, in addition to the tools and hardware to support them going forward. Contact us today to help you build a solid Data Virtualisation process customised to your particular needs.

Alastair Broom
August 9, 2017

It’s common knowledge that there is a global shortage of experienced IT security professionals, right across the spectrum of skills and specialities, and that this shortage is exacerbated by an ongoing lack of cyber security specialists emerging from education systems.

Governments are taking action to address this skills shortage, but it is nevertheless holding back advancement and exposing IT systems and Internet businesses to potential attacks.

Because of this, and despite the fear that other industries may have of Artificial Intelligence (AI), the information security industry should be embracing it and making the most of it. As the connectivity requirements of different environments become ever more sophisticated, the number of security information data sources is increasing rapidly, even as potential threats increase in number and complexity. Automation and AI offer powerful new ways of managing security in this brave new world.

At the moment, the focus in AI is on searching and correlating large amounts of information to identify potential threats based on data patterns or user behaviour analytics. These first generation AI-driven security solutions only go so far, though: security engineers are still needed, to validate the identification of threats and to activate remediation processes.

As these first generation solutions become more efficient and effective in detecting threats, they will become the first step towards moving security architectures into genuine auto-remediation.

To explore this, consider a firewall: it allows you to define access lists based on applications, ports or IP addresses. Working as part of a comprehensive security architecture, new AI-driven platforms will use similar access lists, based on a variety of complex and dynamic information sources. Such lists will undergird your auto-remediation policy, which will integrate with other platforms to maintain a consistent security posture.
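To make the idea concrete, here is an illustrative sketch (not a real firewall API): a dynamic block list, of the kind an AI-driven platform might maintain from threat analytics, drives a remediation decision with no human in the loop. All names and addresses are invented.

```python
# Auto-remediation driven by a dynamically maintained block list.
block_list = {"203.0.113.7", "198.51.100.9"}   # fed by threat analytics

def auto_remediate(flow):
    """Return the action for a traffic flow based on the dynamic list."""
    if flow["src_ip"] in block_list:
        return "drop"       # remediation applied automatically
    return "allow"

flows = [{"src_ip": "203.0.113.7"}, {"src_ip": "192.0.2.1"}]
print([auto_remediate(f) for f in flows])
```

The interesting part in a real deployment is not the lookup, but how the list is kept current: analytics platforms continuously add and age out entries as threat intelligence changes.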

As we move into this new era in security systems, in which everything comes down to gathering information that can be processed, with security in mind, by AI systems, we will see changes as services adapt to the new capabilities. Such changes will be seen first in Security Operations Centres (SOCs).

Today’s SOCs still rely heavily on security analysts reviewing reports to provide the level of service expected by customers. They will be one of the first environments to adopt AI systems, as they seek to add value to their services and operate as a seamless extension to digital businesses of all kinds.

SOCs are just one example. The security industry will get the most out of AI, but it needs to start recognising what machines do best. Using this technology will enable the creation of new tools and processes in the cybersecurity space that protect new devices and networks from threats even before a human can classify the threat.

Artificial intelligence techniques such as unsupervised learning and continuous retraining can keep us ahead of the cyber criminals. However, we need to be aware that hackers will be using these techniques too; this is where the creativity of the good guys can focus on what is coming next, while the machines do their job of learning and continuous protection.

Don’t miss out: to find out more, contact us – we’ll be delighted to help you with emerging technology and use it to your benefit.

Category: Security

Scott Reynolds
July 25, 2017

The amount of data that businesses generate and manage continues to explode. IBM estimates that across the world, 2.3 trillion gigabytes of data are created each day and this will rise to 43 trillion gigabytes by 2020.

From transactions and customer records to email, social media and internal record keeping – today’s businesses create data at rates faster than ever before. And there’s no question that storing and accessing this data presents lots of challenges for business. How to keep up with fast growing storage needs, without fast growing budgets? How to increase storage capacity without increasing complexity? How to access critical data without impacting on the speed of business?

It’s increasingly obvious that traditional storage can’t overcome these challenges. By simply adding more capacity, costs go up for both storage and management. And manually working with data across different systems can become an administrative nightmare – adding complexity, and taking up valuable IT resource.

So, what can you do? It’s likely that you’ve already got an existing infrastructure, and for many, scrapping it and starting again just isn’t an option. This is where flash and software-defined storage (SDS) could be your saviour. By separating the software that provides the intelligence from the traditional hardware platform, you gain many advantages, including flexibility, scalability and improved agility.

So I could add to what I already have?

Yes. Flash and tape aren’t mutually exclusive. Lots of businesses use a mix of the old and the new – what’s important is how you structure it. Think of it like a well-organised wardrobe. You need your everyday staples close at hand, and you store the less frequently worn items – also known in the UK as the summer wardrobe (!) – where you can access them if you need them, but not in prime position.

Your data could, and should, work like this. Use flash for critical workloads that require real-time access, and use your older tape storage for lower-priority data or lower-performance applications.

But won’t it blow my budget?

No. The cost of flash systems has come down over the last few years, and the lower operating costs deliver savings over the long term. It’s been proven that virtualisation of mixed environments can store up to five times more data, and that analytics-driven hybrid cloud data management reduces costs by up to 73%. In fact, we estimate that with automatic data placement and management across storage systems, media and cloud, it’s possible to reduce costs by up to 90%!

So how do I know what system will work for me?

Well, that’s where we come in. At Logicalis we’ve got over 20 years of experience working with IBM systems. Our experts work with clients to help them scope out a storage solution that meets their needs today, and the needs they’ll have tomorrow.

We start with a Storage Workshop that looks at the existing infrastructure and what you’re hoping to achieve. We’ll look at how your data is currently structured and what changes you could make to improve what you already have – reducing duplication and using the right solution for the right workload. We’ll then work with you to add software and capacity that will protect your business and won’t blow your budget.

If you want to hear more about the solutions on offer, feel free to contact us.

Category: Hybrid IT

Scott Reynolds
July 12, 2017

£170m lost on the London Stock Market in just over a week, and untold damage to the “World’s Favourite Airline”. That’s the cost within the UK to International Airlines Group, the owner of British Airways, of BA’s recent ‘power outage’ incident.

“It wasn’t an IT failure. It’s not to do with our IT or outsourcing our IT. What happened was in effect a power system failure or loss of electrical power at the data centre. And then that was compounded by an uncontrolled return of power that took out the IT system.” Willie Walsh (IAG Supremo) during a telephone interview with The Times.

Willie Walsh has since implied that the outage was caused by the actions of an engineer who disconnected and then reconnected a power supply to the data centre in “an uncontrolled and un-commanded fashion”. Could this actually have something to do with the IT outsourcing after all? Did a staff member go rogue, or was it down to poor training and change control…?

For me, what this highlights is the need to place greater emphasis on the availability and uptime of the systems that support critical parts of a business’s or organisation’s services, along with robust processes and automation, where possible, to minimise the impact of an unplanned outage.

All businesses should expect their systems to fail. Sometimes it is a physical failure of the infrastructure supporting the data centre (power, UPSs, generators, cooling etc.). It can be the power supply itself. Computing, storage or network equipment can fail. Software and systems can suffer an outage. And it can also come down to human error or poor maintenance of core systems or infrastructure.

Coping with a Power Failure

Even if you have two power feeds to your building, and even if they’re from two different power sub-stations and run through two different street routes, those sub-stations are still part of the same regional and national power grid. If the grid fails, so does your power. There is no way around it, except to make your own. Power surges are handled by monitoring the power across cabinet PDUs, critical PDUs, UPSs, generators and transformers, while assigning a maximum load to all cabinets to make sure that our customers’ systems are never overloaded.
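
As a toy illustration of that load-monitoring idea, here is a minimal sketch. The cabinet names and kW figures are invented, and real DCIM tooling does far more (alerting, trending, per-phase balancing), but the core check is just a comparison of live readings against each cabinet’s assigned maximum:

```python
# Sketch of assigned-maximum load checking: each cabinet has a maximum
# load, and live PDU readings are compared against it. All values here
# are invented for illustration.
def overloaded_cabinets(readings_kw, max_load_kw):
    """Return the cabinets whose current draw exceeds their assigned maximum."""
    return [cab for cab, kw in readings_kw.items() if kw > max_load_kw[cab]]

max_load_kw = {"cab-01": 8.0, "cab-02": 8.0, "cab-03": 12.0}
readings_kw = {"cab-01": 6.2, "cab-02": 8.9, "cab-03": 11.5}

print(overloaded_cabinets(readings_kw, max_load_kw))  # → ['cab-02']
```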

Recovering from a Disaster

Recovering from a disaster is something that all organisations plan for; however, not all have a Disaster Recovery (DR) plan, as some consider High Availability (HA) to be more than sufficient. Yet HA only provides a localised system for failover, whereas DR is designed to cope with a site failure.

The challenge with DR for many of our customers is the cost:

  • First, you need to prioritise which application workloads you want to fail over in the event of a disaster.
  • Second, you need to purchase and manage infrastructure and licensing for these workloads, with continuous replication.
  • Third, you need a second location.
  • Fourth, you need a robust DR plan that allows you to recover your workloads at that second location.
  • Finally, and often hardest, you’ll need to fail these services back once the primary site has been recovered.

This can be an expensive option, but this is also where things like Cloud DR-as-a-Service can help minimise any expenditure, and the pain associated with owning and managing a DR environment.
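
The prioritisation step above can be sketched in code. This is a hypothetical illustration (the workload names and RTO figures are invented, and a real runbook would also model dependencies between services), but it shows the basic idea of ordering recovery by business priority and recovery time objective:

```python
# Hypothetical DR runbook sketch: order workloads for failover by
# business priority (1 = most critical) and recovery time objective
# (RTO, in minutes). Names and figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    priority: int
    rto_minutes: int

def failover_order(workloads):
    """Return workloads in the order they should be recovered at the DR site."""
    return sorted(workloads, key=lambda w: (w.priority, w.rto_minutes))

workloads = [
    Workload("reporting", priority=3, rto_minutes=480),
    Workload("booking-engine", priority=1, rto_minutes=15),
    Workload("crm", priority=2, rto_minutes=60),
]

for w in failover_order(workloads):
    print(f"recover {w.name} (RTO {w.rto_minutes} min)")
```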

Reducing the impact of an outage

Minimising the impact of any form of physical failure should take priority over recovering from an outage. Workflow automation can help a business maintain the uptime of applications and services. This can be defined as a policy whereby services are moved to other systems locally, or re-provisioned to a DR location or DR platform, in the event of an outage caused either by a power issue or by human error, helping a business minimise both the risk and the impact of an outage.

I’ll let you come to your own conclusions as to whether British Airways should adopt a robust change control, automation or DR policy. Logicalis can assist and provide you with a number of options tailored to your particular needs, so that you are not the next press headline.

Richard Simmons
June 20, 2017

I have a confession to make: I love to read. Not just an occasional book on holiday or a few minutes on the brief, or often not so brief, train journey into and out of London, but all the time. There has never been a better time for those with a love of reading! The rise of digital media means that not only can you consume it pretty much anywhere at any time but, more importantly, it is making it easier for more people to share their ideas and experience.

Recently I came across a book called “Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations” by Pulitzer Prize winner Thomas L. Friedman, which I not only found fascinating to read but which has also helped to shape and change the way I view many of the challenges we face, both in business and in our personal lives. The premise of the book is that Friedman would often arrange to meet people for breakfast early in the morning, to do interviews or research stories, and occasionally these people would be delayed. These moments, rather than being a source of frustration, became time he actually looked forward to, as they allowed him simply to sit and think. Looking at the world, he concluded that we are living through an age of acceleration driven by constant technological evolution, globalisation and climate change, and he argues that these combined are the cause of many of the challenges we currently face.

The key point about this acceleration is that it is now reaching a level at which society and people are struggling to adapt. Within the technology world we talk about disruption a lot: a new business or technology arrives that disrupts a sector or market, the competition struggles to adapt and eventually a status quo is resumed. For example, Uber has undoubtedly caused a huge disruption in the world of transport, and governments are currently working through how they can better legislate for this new way of operating. The challenge is that new legislation can take 5-10 years to agree and implement, in which time Uber may well have been replaced by autonomous cars.

So what we are experiencing now is not just disruption but a sense of dislocation: the feeling that no matter how fast we try to change, it is never enough. In this environment it will be the people, businesses and societies that learn and adapt the fastest which will be most successful. In business we are constantly shown how being more agile in this digital world can drive efficiency, generate new business models and allow us to succeed, but I feel what is often lacking is guidance on how to get there. We have a wealth of different technology which can support a business, but which is right for me? What should I invest in first? And how do I make sure that I maximise the value of that investment?

My experience with many of our customers is that they understand the challenges and the opportunity, but simply do not have the time to think and plan. When they do have time, the amount of choice can be overwhelming and, frankly, daunting. In a small way this is the same challenge I face when looking for new books to read: I can go online, but with so much to choose from, how will I know what I will enjoy? The abundance that digital media provides, with more authors and more content, can actually make finding and choosing something valuable much harder.

At Logicalis, we understand the business challenges that you face and will discuss with you the different technology options that could support you, recommending those that can deliver the biggest value in the shortest time frame. Contact us to find out how we can help you keep up to speed with emerging technology and use it to your benefit.

Alastair Broom
May 16, 2017

What if I told you that ransomware is on its way to becoming a $1 billion annual market?

Eyebrows raised (or not), it is a matter of fact in 2017 that ransomware is an extremely lucrative business, evolving at an alarming rate and becoming more sophisticated day by day.

But the question remains: what is ransomware?

Ransomware is malicious software – a form of malware – that either disables a target system or encrypts a user’s files and holds them ‘hostage’ until a ransom is paid. This malware generally operates indiscriminately, with the ability to target any operating system within any organisation. Once it has gained a foothold, it can spread quickly, infecting other systems – even backup systems – and can therefore effectively disable an entire organisation. Data is the lifeblood of many organisations, and without access to this data, businesses can literally grind to a halt. Attackers demand that the user pay a fee (often in Bitcoin) to decrypt their files and get them back.

On a global scale, more than 40% of ransomware victims pay the ransom, although there is no guarantee that you will actually get your data back, and copies of your data will now be in the attacker’s hands. In the UK, 45% of organisations reported that a severe data breach caused systems to be down for more than eight hours on average. The cost, then, is not only the ransom itself, but also the significant resources required to restore systems and data. What is even more alarming is that the number of threats and alerts in the UK is significantly higher than in other countries (Cisco 2017 Annual Cybersecurity Report). Outdated systems and equipment are partially to blame, coupled with the belief that line managers are not sufficiently engaged with security. Modern and sophisticated attacks like ransomware require user awareness, effective processes and cutting-edge security systems to prevent them from taking your organisation hostage!

How can you protect your company?

As one of the most prominent current threats in cybersecurity, a lot has been written and said about ransomware and potential ways of preventing it. A successful mitigation strategy involving people, process and technology is the best way to minimise the risk of an attack and its impact. Your security programme should consider the before, during and after of an attack, giving due consideration to protecting the organisation, detecting ransomware and other malware, and how the organisation should respond following an attack. Given that ransomware can penetrate organisations in multiple ways, reducing the risk of infection requires a holistic approach rather than a single point solution. It takes seconds to encrypt an entire hard disk, so IT security systems must provide the highest levels of protection, rapid detection, and strong containment and quarantine capability to limit damage. Paying the ransom should be viewed as an undesirable, unpredictable last resort, and every organisation should therefore take effective measures to avoid this scenario.
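
To make the “rapid detection” point concrete, here is one illustrative heuristic that some detection tools use – offered as a sketch, not a recommendation or a complete defence. Encrypted files have near-uniform byte distributions, so unusually high Shannon entropy across many recently modified files can be one (imperfect) indicator of encryption in progress:

```python
# Illustrative heuristic only: measure Shannon entropy of file contents.
# Encrypted data looks close to random (near 8 bits/byte), while most
# plain text and documents score far lower. Compressed files also score
# high, so real tools combine this with other signals.
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte, from 0.0 (constant) to 8.0 (uniform)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    return shannon_entropy(data) > threshold

print(looks_encrypted(b"hello world, plain text" * 100))  # → False
```

The 7.5 bits/byte threshold is an assumption chosen for illustration; in practice the cut-off, and the number of files that must trip it, would be tuned against real workloads.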

Could your organisation be a target?

One would imagine that only large corporations would be at risk of a ransomware attack, but this is far from the truth. Organisations of all industries and sizes report ransomware attacks which lead to substantial financial loss, data exposure and potential brand damage. The reason is that all businesses rely on the availability of data – employee profiles, patents, customer lists, financial statements and so on – to operate. Imagine the impact of ransomware attacks on police departments, city councils, schools or hospitals. Whether an organisation operates in the public or private sector, banking or healthcare, it must have an agile security system in place to reduce the risk of a ransomware attack.

Where to start?

The first step to shield your company against Ransomware is to perform an audit of your current security posture and identify areas of exposure.  Do you have the systems and skills to identify an attack?  Do you have the processes and resources to respond effectively?  As Ransomware disguises itself and uses sophisticated hacking tactics to infiltrate your organisation’s network, it is important to constantly seek innovative ways to protect your data before any irreparable damage is done.

With our Security Consultancy, Managed Security Service offerings and threat-centric Security product portfolio, we are able to help our customers build the holistic security architecture needed in today’s threat landscape.

Contact us to discuss your cyber security needs and ensure you aren’t the next topic of a BBC news article.


Category: Security

Neil Thurston
April 25, 2017

Hybrid IT is often referred to as bimodal, a term coined by Gartner some four years ago to reflect the (then) new need for the simultaneous management of two distinct strands of work in a Hybrid IT environment – the traditional server-based elements on the one hand, and the Cloud elements on the other.

Since then, the two strands of the bimodal world have blended in various different ways. As they have engaged and experimented with new technologies, organisations have found that certain workload types are particularly suited to certain environments.

For example, DevOps work, with its strong focus on user experience elements such as web front ends, is typically well suited to cloud-native environments. Meanwhile, back end applications processing data tend to reside most comfortably in the traditional data centre environment.

The result is a multi-modal situation even within any given application, with its various tiers sitting in different technologies, or even different clouds or data centres.

The obvious question for IT management is this: how on earth do you manage an application which is split across multiple distinct technologies? Relying on technology to provide the management visibility you need drives you to traditional tools for the elements of the application based on traditional server technology, and DevOps tools for the cloud native side. Both sets of tools need to be continuously monitored. For every application, and every environment.

A new breed of tools is emerging, allowing you to play in both worlds at once. VMware vRealize Automation cloud automation software is a good example. Over the last three years, VMware has developed its long-standing traditional platform, adding Docker container capabilities, so that today vRealize is a wholly integrated platform allowing for the creation of fully hybrid applications, in the form of ‘cut-out’ blueprints containing both traditional VM images and Docker images.

This multi-modal Hybrid IT world is where every enterprise will end up. IT management needs clear visibility, for every application, of multiple tiers across multiple technologies – for security, scaling, cost management and risk management, to name just a few issues. Platforms with the capability to manage this hybrid application state will be essential.

This area of enterprise IT is moving rapidly: Logicalis is well versed, and experienced, in these emerging technologies both in terms of solution and service delivery, and in terms of support for these technologies in our own cloud. Contact us to find out more about multi-modal Hybrid IT and how we can help you leverage it.

Category: Hybrid IT

Fanni Vig
April 20, 2017

Finally, it’s out!

With acquisitions like Composite, ParStream, Jasper and AppDynamics, we knew something was bubbling away in the background for Cisco with regards to edge analytics and IoT.

Edge Fog Fabric – EFF

The critical success factor for IoT and analytics solution deployments is to provide the right data, at the right time, to the right people (or machines).

With the exponential growth in the number of connected devices, the marketplace requires solutions that provide data generating devices, communication, data processing, and data leveraging capabilities, simultaneously.

To meet this need, Cisco recently launched a software solution (predicated on hardware devices) that encompasses all the above capabilities and named it Edge Fog Fabric aka EFF.

What is exciting about EFF?

To implement high performing IoT solutions that are cost effective and secure, a combination of capabilities need to be in place.

  • Multi-layered data processing, storage and analytics – given the rate of growth in the number of connected devices and the volume of data, bringing data back from devices to a data centre environment can be expensive. Processing information on the EFF makes this a lot more cost effective.
  • Microservices – a standardised framework for data processing and communication services that can be programmed in standard programming languages such as Python or Java.
  • Message routers – effective communication between the various components and layers. Without state-of-the-art message brokerage, no IoT system could be secure and scalable in providing real-time information.
  • Data leveraging capabilities – ad hoc, embedded or advanced analytics capabilities to support BI and reporting needs. With the acquisitions of Composite and AppDynamics, EFF will enable an IoT platform to connect to IT systems and applications.
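
The first capability above – processing at the edge rather than shipping everything back – can be illustrated with a small sketch. This is not the EFF API, just the underlying idea: aggregate a window of raw sensor readings locally so that only a compact summary crosses the network:

```python
# Sketch of the edge-processing idea: reduce a window of raw readings
# (e.g. temperature samples from one device) to the small payload that
# is actually worth sending back to the data centre. Values are invented.
def summarise(readings):
    """Collapse a window of raw readings into a compact summary payload."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

window = [21.0, 21.2, 20.9, 35.7, 21.1]  # raw samples; 35.7 is an outlier worth noticing
print(summarise(window))
```

Even this trivial reduction sends four numbers upstream instead of the whole window, which is the cost argument made above, scaled to millions of devices.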

What’s next?

Deploying the above is no mean feat. According to Gartner’s view of the IoT landscape, no organisation has yet achieved the panacea of connecting devices to IT systems and vice versa, combined with the appropriate data management and governance capabilities embedded. So there is still a long road ahead.

However, with technology advancements such as the above, I have no doubt that companies and service providers will be able to accelerate progress and deliver further use cases sooner than we might think.

Based on this innovation, the two obvious next steps are:

  • Further automation – automating communication, data management and analytics services including connection with IT/ERP systems
  • Machine made decisions – once all connections are established and the right information reaches the right destination, machines could react to information that is shared with ‘them’ and make automated decisions.

Scott Hodges
April 18, 2017

Attending a recent IBM Watson event, somebody in the crowd asked the speaker, “So, what is Watson?” It’s a good question – and one there isn’t really a straightforward answer to. Is it a brand? A supercomputer? A technology? Something else?

Essentially, it is an IBM technology that combines artificial intelligence and sophisticated analytics to provide a supercomputer named after IBM’s founder, Thomas J. Watson. While interesting enough, the real question, to my mind, is this: “What sort of cool stuff can businesses do with the very smart services and APIs provided by IBM Watson?”

IBM provides a variety of services, available through Application Programming Interfaces (APIs), that developers can use to take advantage of the cognitive elements and power of Watson. The biggest challenge in taking advantage of these capabilities is to “think cognitively” and imagine how they could benefit your business or industry to give you a competitive edge – or, for not-for-profit organisations, how they can help you make the world a better place.

I’ve taken a look at some of the APIs and services available to see some of the possibilities with Watson. It’s important to think of them collectively rather than individually, as while some use-cases may use one, many will use a variety of them, working together. We’ll jump into some use-cases later on to spark some thoughts on the possibilities.

Natural Language Understanding

Extract meta-data from content, including concepts, entities, keywords, categories, sentiment, emotion, relations and semantic roles.


Discovery

Identify useful patterns and insights in structured or unstructured data.


Conversation

Add natural language interfaces such as chat bots and virtual agents to your application to automate interactions with end users.

Language Translator

Automate the translation of documents from one language to another.

Natural Language Classifier

Classify text according to its intent.

Personality Insights

Extract personality characteristics from text, based on the writer’s style.

Text to Speech and Speech to Text

Process natural language text to generate synthesised audio, or render spoken words as written text.

Tone Analyser

Use linguistic analysis to detect the emotional (joy, sadness etc.), linguistic (analytical, confident etc.) and social (openness, extraversion etc.) tone of a piece of text.

Trade-off Analytics

Make better choices when analysing multiple, even conflicting goals.

Visual Recognition

Analyse images for scenes, objects, faces, colours and other content.

All this is pretty cool stuff, but how can it be applied to work in your world? You could use the APIs to “train” your model to be more specific to your industry and business, and to help automate and add intelligence to various tasks.

Aerialtronics offers a nice example use-case of visual recognition in particular: the company develops, produces and services commercial unmanned aircraft systems. Essentially, it teams drones, an IoT platform and Watson’s Visual Recognition service to help identify corrosion, serial numbers, loose cables and misaligned antennas on wind turbines, oil rigs and mobile phone towers. This helps automate the process of identifying faults and defects.

Further examples showing how Watson APIs can be combined to drive powerful, innovative services can be found on the IBM Watson website’s starter-kit page.

At this IBM event, a sample service was created, live in the workshop. This application would stream a video, convert the speech in the video to text, and then categorise that text, producing an overview of the content being discussed. The application used the speech-to-text and natural language classifier services.
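
That workshop pipeline can be sketched in a few lines. To keep the example self-contained, the two “services” below are trivial stand-ins for the real Watson Speech to Text and Natural Language Classifier APIs (which would be called over HTTP with an API key); only the shape of the pipeline is the point:

```python
# Illustrative pipeline mirroring the workshop demo: transcribe speech,
# then categorise the resulting text. Both functions are toy stand-ins
# for the Watson services, not their actual APIs.
def speech_to_text(audio_chunks):
    """Stand-in transcriber: pretend each audio chunk yields its text directly."""
    return " ".join(audio_chunks)

def classify(text, categories):
    """Stand-in classifier: pick the category sharing the most words with the text."""
    words = set(text.lower().split())
    return max(categories, key=lambda c: len(words & categories[c]))

categories = {
    "finance": {"market", "shares", "bank"},
    "weather": {"rain", "sunny", "forecast"},
}

transcript = speech_to_text(["the forecast today", "is rain then sunny spells"])
print(classify(transcript, categories))  # prints "weather"
```

Swapping the stand-ins for real API calls leaves the structure unchanged, which is the appeal of composing Watson services this way.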

Taking this example further with a spot of blue sky thinking, for a multi-lingual organisation, we could integrate the translation API, adding the resulting service to video conferencing. This could deliver near real-time multiple dialect video conferencing, complete with automatic transcription in the correct language for each delegate.

Customer and support service chat bots could use the Conversation service, with the Tone Analyser gauging customer sentiment. Processes such as flight booking could be fulfilled by a virtual agent using the Natural Language Classifier to derive the intent of the conversation. Visual recognition could be used to identify production line issues, spoiled products in inventory or product types in retail environments.

Identification of faded colours or specific patterns within scenes or on objects could trigger remedial services. Detection of human faces, their gender and approximate age could help enhance customer analysis. Language translation could support better communication with customers and others in their preferred languages. Trade-off Analytics could help optimise the balancing of multiple objectives in decision making.

This isn’t pipe-dreaming: the toolkit is available today. What extra dimensions and capabilities could you add to your organisation, and the way you operate? How might you refine your approach to difficult tasks, and the ways you interact with customers? Get in contact today to discuss the possibilities.

Alastair Broom
March 10, 2017

As Logicalis’ Chief Security Technology Officer I’m often asked to comment on cyber security issues. Usually the request relates to specific areas such as ransomware or socially engineered attacks. In this article I’m taking a more holistic look at IT security.

Such a holistic approach to security is, generally, sorely lacking. This is a serious matter, with cyber criminals constantly looking for the weak links in organisations’ security, constantly testing the fence to find the easiest place to get through. So, let’s take a look at the state of enterprise IT security in early 2017, using the technology, processes and people model.

Technology

A brief, high-level look at the security market is all it takes to show that there are vast numbers of point products out there – ‘silver bullet’ solutions designed to take out specific threats. There is, however, little in the way of an ecosystem supporting a defence-in-depth architecture. Integration of, and co-operation between, the various disparate components is, although growing, typically weak or non-existent.

We’ve seen customers with more than 60 products deployed, from over 40 vendors, each intended to address a specific security issue. Having such a large number of products itself presents significant security challenges, though. All these products share a common vulnerability: support and maintenance. Managing them and keeping them updated generates significant workload, and any mistakes or unresolved issues can easily become new weak points in the organisation’s security.

The situation has been exacerbated by the rapidly increasing popularity of Cloud and Open Source software. Both trends make market entry significantly simpler, allowing new players to quickly and easily offer new solutions, targeting whichever threat happens to be making a big noise at the moment.

Just as poor integration between security products is an issue, so is lack of integration between the components on which they are built. Through weak coding or failure to make use of hardware security features – Intel’s hardware-level Software Guards Extensions (SGX) encryption technology is a good example – security holes are left open, waiting to be exploited.

The good news on the technology front is that we are seeing the early stages of the development of protocols, such as STIX, TAXII and CybOX, allowing different vendors’ products to interact and share standardised threat information. The big security vendors have been promoting the idea of threat information sharing and subsequent action for a while, but only within their own product ecosystems. It’s time for a broader playing field!
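
As a flavour of what that standardised sharing looks like, here is a minimal STIX 2.1-style indicator built as a plain Python dict. The id, hash and timestamps are invented for illustration; real objects are produced by STIX libraries and exchanged between products over TAXII:

```python
# A minimal STIX 2.1-style indicator object. The UUID, SHA-256 hash and
# timestamps below are invented placeholders, shown only to illustrate
# the standardised shape that lets different vendors' tools interoperate.
import json

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": "indicator--d81f86b9-975b-4c0b-875e-810c5ad45a4f",
    "created": "2017-03-10T09:00:00.000Z",
    "modified": "2017-03-10T09:00:00.000Z",
    "name": "Known ransomware payload",
    "pattern": "[file:hashes.'SHA-256' = 'aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f']",
    "pattern_type": "stix",
    "valid_from": "2017-03-10T09:00:00Z",
}

print(json.dumps(indicator, indent=2))
```

Because the format is vendor-neutral JSON, any product that speaks STIX can consume the same object, which is exactly the broader playing field argued for above.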

Processes

IT security is one of the most important issues facing today’s enterprise, yet, while any self-respecting board will feature directors with responsibility for sales, marketing, operations and finance, few enterprises have a board level CISO.

Similarly, few organisations have a comprehensive and thoroughly considered security strategy in place, or proper security processes and policies suitable for today’s threat landscape and ICT usage patterns. A number of industry frameworks exist – ISO 27001, Cyber Essentials and NIST, to name but a few – and yet very few organisations adopt them beyond the bare minimum needed to meet regulatory requirements.

Most organisations spend considerable sums on security technology, but without the right security strategy in place, and user behaviour in line with the right processes and policies, they remain at risk of serious breaches.

People

The hard truth is that some 60% of breaches are down to user error. Recent research obtained through Freedom of Information requests found that 62% of breaches reported to the ICO are down to humans simply getting it wrong. People make poor password choices, use insecure public (and private!) WiFi, and use public Cloud storage and similar services without taking the necessary security precautions. They do not follow, or indeed even know, corporate data classification and usage policies. The list, of course, goes on.

Training has a part to play here, to increase users’ awareness of the importance of security, as well as the behaviours they need to adopt (and discard) to stay secure. However, there will come a point at which the law of diminishing returns kicks in: we all make mistakes – even the most careful, well trained of us.

We need to explore, discover and devise new ways in which technology can help, by removing the human element, where possible and desirable, and by limiting and swiftly rectifying the damage done when human error occurs. Furthermore, we need to leverage ever improving machine learning and artificial intelligence software to help augment human capability.

Enterprises need to work with specialists that can help them understand the nature of the threats they face, and the weak links in their defences that offer miscreants easy ways in. That means closely examining all aspects of their security from each of the technology, processes and people perspectives, to identify actual and potential weaknesses. Then robust, practical, fit-for-purpose security architectures and policies can be built.

For an outline of how this can work, take a look at Logicalis’ three-step methodology here or email us to discuss your cyber security needs.

Category: Security

Neil Thurston
February 13, 2017

The explosive growth of Cloud computing in recent years has opened up diverse opportunities for both new and established businesses. However, it has also driven the rise of a multitude of ‘islands of innovation’. With each island needing its own service management, data protection and other specialists, IT departments find themselves wrestling with increased – and increasing – management complexity and cost.

Necessity is the mother of invention, and with cost and complexity becoming increasingly problematic, attitudes to Cloud are changing. Organisations are moving selected tools, resources and services back to on-premises deployment models: we’re seeing the rise of the Hybrid Cloud environment.

The trend towards Hybrid Cloud is driven by an absolute need for operational and service consistency, regardless of the on-premises/Cloud deployment mix – a single set of automation platforms, a single set of operational tools and a single set of policies. We’re looking at a change in ethos, away from multiple islands of innovation, each with its own policies, processes and tools, to a single tool kit – a single way of working – that we can apply to all our workloads and data, regardless of where they actually reside.

Disparate islands in the Cloud have also increasingly put CIOs in the unenviable position of carrying the responsibility for managing and controlling IT but without the capability and authority to do so. Many organisations have experimented (some might say dabbled) with cherry-picked service management frameworks such as ITIL.

With focus shifting to Hybrid Cloud, we’re now seeing greater interest in more pragmatic ITSM frameworks, such as IT4IT, pushing responsibility up the stack and facilitating the move to something more akin to supply chain management than pure hardware, software and IT services management.

There are two key pieces to the Hybrid IT puzzle. On the one hand, there’s the workload: the actual applications and services. On the other, there’s the data. The data is where the value is – the critical component, to be exploited and protected. Workloads, however, can be approached in a more brokered manner.

Properly planned and executed, Hybrid Cloud allows the enterprise to benefit from the best of both the on-premises world and the Cloud world. The ability to select the best environment for each tool, service and resource – a mix which will be different in different industries, and even in different businesses within the same industry – delivers significant benefits in terms of cost, agility, flexibility and scalability.

Key to this is a comprehensive understanding of where you are and where you want to be, before you start putting policies or technology in place. The Logicalis Hybrid IT Workshop can help enormously with this, constructing a clear view of where you are now, and where you want to be.

In the workshop we assess your top applications and services, where they reside and how they’re used in your business. We then look at where you want to get to. Do you want to own your assets, or not? Do you want to take a CAPEX route or an OPEX route? Do you have an inherent Cloud First strategy? What are your licensing issues?

We then use our own analysis tools, developed from our real world experience with customers, to create visualisations showing where you are today, where you want to eventually be and our recommended plan to bridge the gap, in terms of people, processes, technology and phases.

Hybrid Cloud offers significant benefits, but needs to be carefully planned and executed. To find out more about how Logicalis can help, see our website or call us on +44 (0)1753 77720.

Category: Hybrid IT

Fanni Vig
January 16, 2017

A friend of mine recently introduced me to the idea of the ‘runaway brain’ – a theory first published in 1993 outlining the uniqueness of human evolution. Here, we take a look at how artificial intelligence is developing into something comparable to the human brain, and at the potential caveats that concern us as human beings.

The theory considers how humans have created a complex culture by continually challenging their brains, leading to the development of ever more complex intellect throughout human evolution. It is a process which continues today and will no doubt continue for years to come – and it is what theorists claim is driving human intelligence towards its ultimate best.

There are many ways in which we can define why ‘human intelligence’ is considered unique. In essence, it’s characterised by perception, consciousness, self-awareness, and desire.

That conversation left me wondering: with human intelligence evolving alongside the emergence of artificial intelligence (AI), is it possible for the ‘runaway brain’ to reach a new milestone? After further research, I found some who say it already has.

They label it ‘runaway super intelligence‘.

Storage capacity of the human brain

Most neuroscientists estimate the human brain’s storage capacity to range between 10 and 100 terabytes, with some evaluations closer to 2.5 petabytes. In fact, new research suggests the human brain could hold as much information as the entire internet.

As surprising as that sounds, it’s not necessarily impossible. It has long been said that the human brain can be like a sponge, absorbing as much information as we throw at it. Of course we forget a large amount of that information, but consider those with photographic memory, those who practise a combination of innate skills, learned tactics and mnemonic strategies, or those with an extraordinary knowledge base.
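
The spread in these estimates comes down to two assumed numbers: how many synapses the brain has, and how many bits each synapse can store. A back-of-envelope sketch (all figures are rough assumptions drawn from the neuroscience literature, not settled facts) shows why the range runs from tens of terabytes into petabyte territory:

```python
# Back-of-envelope estimate of human brain storage capacity.
# Assumptions, not settled facts: synapse counts are commonly quoted
# between 10^14 and 10^15, and one 2016 estimate put the information
# per synapse at roughly 4.7 bits.
BITS_PER_SYNAPSE = 4.7

def capacity_terabytes(num_synapses: float) -> float:
    """Convert a synapse count into terabytes of storage."""
    bits = num_synapses * BITS_PER_SYNAPSE
    return bits / 8 / 1e12  # bits -> bytes -> terabytes

low = capacity_terabytes(1e14)   # conservative synapse count
high = capacity_terabytes(1e15)  # generous synapse count

print(f"low estimate:  {low:,.0f} TB")
print(f"high estimate: {high:,.0f} TB (~{high / 1000:.2f} PB)")
```

Depending on which assumptions you pick, the answer lands anywhere from roughly 60 TB to over half a petabyte, which is exactly why published figures vary so widely.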

Why can machines still perform better?

Ponder this – if human brains have the capacity to store significant amounts of data, why do machines continue to outperform human decision making?

The human brain has a huge range – data analysis and pattern recognition alongside the ability to learn and retain information. A human needs only to glance before recognising a car they’ve seen before, but AI may need to process hundreds or even thousands of samples before it can come to a conclusion. Call it human premeditative assumption, if you will: we save time by not analysing finer details for an exact match. Conversely, while AI functions may be more complex and varied, the human brain is unable to process the same volume of data as a computer.

It’s this efficiency of data processing that leads researchers to believe that AI will indeed dominate our lives in the coming decades and eventually lead to what we call the ‘technological singularity’.

Technology singularity

Technological singularity is the hypothesis that the invention of artificial super intelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilisation.

According to this hypothesis, an upgradable intelligent agent, such as software-based artificial general intelligence, could enter a ‘runaway reaction’ cycle of self-learning and self-improvement, with each new and increasingly intelligent generation appearing more rapidly, causing an intelligence explosion resulting in a powerful super intelligence that would, qualitatively, far surpass human intelligence.
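
That ‘runaway reaction’ can be caricatured with a toy model. Assume, purely for illustration, that each generation is 20% smarter than the last and that design time is inversely proportional to the designer’s intelligence:

```python
# Toy model of a 'runaway reaction' of self-improvement. Purely
# illustrative assumptions: each generation is 20% smarter, and the
# time to design the next generation shrinks in proportion to the
# designer's intelligence.
def singularity_model(growth: float = 1.2, generations: int = 50):
    intelligence, elapsed = 1.0, 0.0
    for _gen in range(generations):
        elapsed += 1.0 / intelligence  # smarter agents design faster
        intelligence *= growth         # each generation is smarter
    return intelligence, elapsed

iq, time_units = singularity_model()
print(f"after 50 generations: intelligence x{iq:,.0f} "
      f"in {time_units:.2f} time units")
```

Under these invented assumptions, capability multiplies over nine-thousand-fold while the total design time converges towards a finite limit of six time units – the hypothesised ‘intelligence explosion’ in miniature.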

Ubiquitous AI

When it comes to our day-to-day lives, algorithms often save time and effort. Take online search tools, Internet shopping and smartphone apps using beacon technology to provide recommendations based upon our whereabouts.

Today, AI uses machine learning. Provide AI with an outcome-based scenario and, to put it simply, it will remember and learn. The computer is taught what to learn, how to learn, and how to make its own decisions.
What’s more fascinating is how new AIs are modelling the human mind using techniques similar to our own learning processes.
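
That outcome-based learning loop can be illustrated with one of the simplest possible learners – a nearest-neighbour classifier. The data here is invented for the example; real systems apply the same idea at vastly larger scale:

```python
# A minimal illustration of 'learning from outcomes': a 1-nearest-
# neighbour classifier. Each training example pairs a feature vector
# with an observed outcome; new cases inherit the outcome of the
# closest known example.
import math

training = [  # (feature vector, outcome label) - invented data
    ((1.0, 1.0), "spam"),
    ((0.9, 1.2), "spam"),
    ((5.0, 5.2), "not spam"),
    ((5.5, 4.8), "not spam"),
]

def predict(point):
    """Label a new point with the outcome of its closest known example."""
    return min(training, key=lambda ex: math.dist(ex[0], point))[1]

print(predict((1.1, 0.9)))  # lands near the 'spam' examples
print(predict((5.2, 5.0)))  # lands near the 'not spam' examples
```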

Do we need to be worried about the runaway artificial general intelligence?

Consider the cautiously wise words of Stephen Hawking, who said that “success in creating AI would be the biggest event in human history”, before adding: “unfortunately, it might also be the last, unless we learn how to avoid the risks”.

Whether we should be worried depends on too many variables for a definitive answer. However, it is difficult to argue that AI will not play a growing part in our lives and businesses.

Rest assured: 4 things that will always remain human

Inevitably, one might ask: is there anything that humans will always be better at?

  1. Unstructured problem solving. Solving problems in which the rules do not currently exist; such as creating a new web application.
  2. Acquiring and processing new information. Deciding what is relevant; like a reporter writing a story.
  3. Non-routine physical work. Performing complex tasks in three-dimensional space requires a combination of skills 1 and 2, and is proving very difficult for computers to master. As a consequence, scientists like Frank Levy and Richard J. Murnane say we need to prepare children with an “increased emphasis on conceptual understanding and problem-solving“.
  4. And last but not least – being human. Expressing empathy, making people feel good, taking care of others, being artistic and creative for the sake of creativity, expressing emotions and vulnerability in a relatable way, and making people laugh.

Are you safe?

We all know that computers/machines/robots will have an impact (positive and/or negative) on our lives in one way or another. The rather ominous elephant in the room is whether or not your job can be done by a robot.

I am sure you will be glad to know there is an algorithm for it…
In a recent article, the BBC reported a prediction that 35% of current jobs in the UK are at ‘high risk’ of computerisation in the coming 20 years (according to a study by Oxford University and Deloitte).

It remains the case that jobs relying on empathy, creativity and social intelligence are considerably less at risk of being computerised. By comparison, roles including retail assistant (37th), chartered accountant (21st) and legal secretary (3rd) all rank among the top 50 jobs at risk.
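
To make the logic of such rankings concrete, here is a deliberately crude toy scorer – not the Oxford/Deloitte model – that follows the reasoning above: routine content raises risk, while empathy, creativity and social skill lower it. All ratings are invented:

```python
# Purely illustrative toy scorer, not the actual Oxford/Deloitte model.
# Each input is a 0-1 rating; the 'human' traits the article says
# protect a role act as a shield against the routine-driven risk.
def automation_risk(routine, empathy, creativity, social):
    """Return a 0-1 automation-risk score for a job profile."""
    human_shield = (empathy + creativity + social) / 3
    return max(0.0, min(1.0, routine * (1 - human_shield)))

# Invented example profiles.
print(f"legal secretary: {automation_risk(0.9, 0.2, 0.1, 0.3):.2f}")
print(f"care worker:     {automation_risk(0.3, 0.9, 0.4, 0.9):.2f}")
```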

Maybe it’s not too late to pick up that night course in computer science…

Alastair Broom
December 15, 2016

Last week I read that you can now hijack nearly any drone mid-flight just by using a tiny gadget.

The gadget responds to the name of Icarus and it can hijack a variety of popular drones mid-flight, allowing attackers to lock the owner out and give them complete control over the device.

Besides drones, the new gadget is capable of fully hijacking a wide variety of radio-controlled devices, including helicopters, cars, boats and other remote-controlled gear that runs over DSMx, the most popular wireless transmission control protocol.

Although this is not the first device we have seen that can hijack drones, it is the first to hand over full control. Icarus works by exploiting the DSMx protocol, granting attackers complete control over target drones: they can steer, accelerate, brake and even crash them.

The attack relies on the fact that the DSMx protocol does not encrypt the ‘secret’ key that pairs a controller with the controlled device, so it is possible for an attacker to recover this secret key through brute-force attacks.
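
Some rough arithmetic shows why an unprotected, short key is so exposed. The key lengths and guess rate below are illustrative assumptions, not the actual DSMx parameters:

```python
# Why an unprotected short pairing key falls to brute force: the search
# space is tiny by cryptographic standards. Illustrative figures only,
# not the real DSMx parameters.
def brute_force_seconds(key_bits: int, guesses_per_second: float) -> float:
    """Worst-case time to try every key of the given length."""
    return (2 ** key_bits) / guesses_per_second

for bits in (16, 32, 128):
    secs = brute_force_seconds(bits, 1e6)  # a modest 1M guesses/sec
    print(f"{bits:3d}-bit key: {secs:.3e} seconds worst case")
```

At a modest million guesses per second, a 16-bit space falls in a fraction of a second and 32 bits within hours, whereas a properly encrypted 128-bit key would be unreachable – which is exactly the mitigation manufacturers need to adopt.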

You can also watch the demonstration video to learn more about Icarus box.

There is no mitigation for this issue at the moment, other than to wait for the affected manufacturers to release patches and update their hardware with encryption mechanisms that secure the communication between controller and device.

Having seen this video and the potential impact of this hijacking technique, my first thought was about the threat for Amazon’s new service coming soon, which will allow drones to safely deliver packages to people’s homes in under 30 minutes.

This is just another example of how important it is to define the right strategy around using encryption as part of security in the digital era. Business data, and the way we want to access that data from any device, anywhere and at any time, highlights the need for enhanced and clever security solutions.

There are several ways Logicalis can help our customers protect data located in data centres and on endpoints, with the help of an ecosystem of partners such as Cisco and Intel Security.

An interesting offering to mention is the Logicalis Endpoint Encryption Managed Service. This service gives our customers’ devices, and the data within them, a level of protection that provides peace of mind should a device be lost or stolen – and Logicalis manages the service for them. The service is a market leader for data protection, providing the highest levels of confidentiality, integrity and availability, and is part of the global strategy adopted by Logicalis Group across EMEA.

Category: Automation, Security

Alastair Broom
December 13, 2016

Morpheus, in one of the most iconic scenes of the Matrix trilogies said, “You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland, and I show you how deep the rabbit-hole goes.”

Let me ask you something: what about taking decisions like the one offered by Morpheus with additional information that could be used to better evaluate the two options? Would that have influenced Neo to change his mind?

According to the Harvard Business Review, many business managers still rely on instinct to make important decisions, often leading to poor results. However, when managers incorporate logic into their decision-making processes, the result is better choices and better results for the business.

In today’s digital world, it’s difficult to ensure the integrity of mission critical networks without a detailed analysis of user engagement and an understanding of the user experience.

HBR outlines three ways to introduce evidence-based management principles into an organization. They are:

  • Demand evidence: Data should support any potential claim.
  • Examine logic: Ensure there is consistency in the logic, and be on the lookout for faulty cause-and-effect reasoning.
  • Encourage experimentation: Invite managers to conduct small experiments to test the viability of proposed strategies and use the resulting data to guide decisions.
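
The third point lends itself to a worked example. A two-proportion z-test is one standard way to judge whether a small experiment’s result is real or just noise; the conversion counts below are invented:

```python
# A sketch of the 'encourage experimentation' step: compare two
# strategy variants with a two-proportion z-test. The counts are
# invented example data.
import math

def z_score(success_a, n_a, success_b, n_b):
    """Two-proportion z-statistic for conversion-style experiments."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converted 70/400 users vs 48/400 for the current approach.
z = z_score(48, 400, 70, 400)
print(f"z = {z:.2f}  (|z| > 1.96 -> significant at the 5% level)")
```

Here the new variant clears the conventional 5% significance bar, so the data – not instinct – supports rolling it out.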

So, the big question is, would it be possible to introduce these three elements into the tasks assigned to the network manager?

The answer is ‘yes’, provided the manager is given the opportunity to integrate network data that carries the context of users, devices, locations and applications in use, and then to mine this captured data for insights into how and why systems and users perform the way they do.

Fortunately, the limitations of traditional networks can be overcome with new network platforms that provide in-depth visibility into application use across the corporate network, helping organisations deliver significant, cost-effective improvements to their critical technology assets. They achieve this by:

  • improving the experience of connected users
  • enhancing their understanding of user engagement
  • optimizing application performance
  • improving security by protecting against malicious or unapproved system use.
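
As a sketch of the kind of visibility these points describe, per-application usage can be summarised from flow-style records; the records here are invented:

```python
# Summarise per-application traffic from flow records - a toy version
# of the application visibility described above. The records are
# invented example data.
from collections import Counter

flows = [  # (user, application, bytes transferred)
    ("alice", "crm", 120_000),
    ("bob", "video", 9_500_000),
    ("alice", "video", 7_200_000),
    ("carol", "crm", 90_000),
]

usage = Counter()
for _user, app, nbytes in flows:
    usage[app] += nbytes

# Report the heaviest applications first.
for app, total in usage.most_common():
    print(f"{app}: {total / 1e6:.1f} MB")
```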

According to IDC, “With the explosion of enterprise mobility, social and collaborative applications, network infrastructure is now viewed as an enterprise IT asset and a key contributor to optimizing business applications and strategic intelligence.”

For companies facing the challenge of obtaining deep network insights in order to improve application performances and leverage business analytics, Logicalis is the answer.

Logicalis is helping its clients deliver digital-ready infrastructure as the main pillar for enhancing user experience and business operations, and for taking secure analytics to the next level of protection for business information.

Scott Reynolds
December 12, 2016

In the year of Brexit and Trump, Scott Reynolds, Hybrid IT practice lead, predicts that electronic digital crime will explode, data privacy breaches will claim scalps, automation will be 2017’s buzzword and the open source movement will challenge profit-making business models in his 2017 tech predictions.

It’s the time of year to engage in the oh-so risky game of making predictions for what is going to be hot for our customers in the coming year. Risky, because stunning twists and turns can take us off course at any point.

After the Brexit referendum and US election results confounded political and polling pundits, life’s certainties appear far less certain. Suddenly, identifying the big winners in 2017 seems a less straightforward affair. But as a person who doesn’t mind living life on the edge, I thought I’d take a punt anyway. Here are my top tech predictions for 2017.

Security Breaches – the worst is yet to come

Based on the number of high profile data breaches, 2016 hasn’t been a great year for digital. The stable door has been left open by companies and government departments around the world. Armies of Terminator 2 Cyborgs in the guise of home CCTV cameras are attacking the very infrastructure of the internet. I fear we’re only at the beginning of an escalation of electronic digital crime.

2017 will test the nerve of governments, businesses, citizens and consumers and challenge the perception of digital as a safe and secure way of doing business, unless there’s a massive investment in Fort Knox equivalent defences and white hat skills.

Data Privacy

GDPR (the General Data Protection Regulation), emanating from Europe, is going to hurt businesses that don’t take data privacy seriously. That is a problem, as evidence suggests companies are unaware of their obligations under this new punitive legislative regime and are taking too long to grab hold of the GDPR tail.

It’s highly possible that fines of up to 4% of global turnover will put some companies out of business in 2017, and beyond.

One Small Change for Mankind, One Giant Leap Forward for Automation

The IT industry is about to enter a time of mass automation… about time. To our shame, we’ve lagged behind other industries. You can now buy a car that can park itself at the touch of a button, but you need 24 buttons to change the configuration of a router.

Increased levels of automation will manifest themselves in robotic decision making; in the automation of security systems to guard against, and respond to, an avalanche of security threats; and in the automated provisioning of resources in the data centre and network (Software Defined).

Things Can Only Get Bigger

The Internet of Things is going to get bigger and more impactful. Gartner Group is still predicting that by 2020 there will be 50 billion things connected to the internet – that’s only three years away. In 2017, expect to see mass engagement by businesses in all sectors.

Hopefully we’ll move on from talking about the connected fridge that can order more lettuce when you run out, and recognise that IoT will fundamentally change how industries and organisations operate.

Open Source – Somebody wants to do everything you do for free

Somebody, somewhere, is trying to do what you charge for at much lower cost, or even for free. Open isn’t a thing, it’s a movement. We’re already seeing Open Source technologies impact our industry, with OpenStack becoming the operating system of choice for companies not wanting to ‘pay’ for mainstream software. Open technologies in automation, such as Puppet and Chef, now have a groundswell of support, with evangelical communities that want to delight people rather than turn a profit.

We’ve also witnessed a growing willingness to embrace Open Computing technologies. Now, Open isn’t without its complications and ultimately nothing in life is free – operating an open environment is still a complicated affair. But I think we’ll see a lot more traction, with many of our customers taking Open Source seriously, over the next 12 months.

2017 Tech predictions – a risky game

So, those are my top five tech trends for 2017. Now you’re probably decrying – how could I overlook analytics? I haven’t. I fully acknowledge that analytics and data are core to all the above. They will need to be embedded in the very fabric of a business, to bring my predictions to fruition. Otherwise, you can disregard everything I just said. As I said, making predictions is a risky game.

Alastair Broom
December 11, 2016

21 times in the past 12 months! According to the BBC, that’s how many times Bournemouth University has been hit by ransomware attacks (University hit 21 times in one year by ransomware – BBC News). 23 universities have suffered the same issue, as have 28 NHS trusts. And the problem continues in the enterprise space. According to Kaspersky Lab research, over 50,000 corporate PCs were infected with ransomware in 2015 – 100% year-on-year growth compared to 2014. And that’s only the data Kaspersky has access to. The FBI states that in Q2 of 2015, 4 million unique ransomware samples were detected, and that in 2016 over $1 billion will be paid out in ransomware attacks to retrieve data. I may be in the wrong job…

The thing is, ransomware in the past has been what I would term a typically opportunistic attack. It was predominantly targeted at individuals, and would rely on relatively basic techniques, such as phishing scams or drive-by downloads, to deliver the infection. Attacks would range from scareware to screenlockers, all the way to encryption attacks, and would demand a reasonably small sum from the victim to decrypt locked files (assuming you were lucky enough to get the files back at all). You would hand over some Bitcoins – the currency hackers crave due to its global and anonymous nature – and hope the decryption key was handed over in return.

But that is no longer the case. Ransomware is now far better classed as an APT – an Advanced Persistent Threat – and carries all the industrialised process behind it that we expect of corporate or state-funded cyber warfare. This is big business, and attackers are fully aware of the cost corporations are exposed to if they are unable to access that most precious of resources: their data. Attackers will spend huge effort researching their victim – footprinting, scanning, enumerating – and will then follow a very well thought-out plan to realise their desired outcomes.

Not good news, but we can help! At Logicalis, we have been helping our clients prepare for and defend against ransomware and other APTs. Let’s specifically address ransomware.

We understand the key stages involved in a ransomware attack (infection, execution, backup spoliation, encryption, notification & clean up), and so can help advise you on the best lines of defence, and provide a number of services to make sure you are protected. The below list is a great place to start:

  • Reduce logged-in user privileges so that users cannot execute PowerShell or any other code that is not required.
  • Remove the command prompt from Citrix or logon profile sessions.
  • Use application whitelisting so that only authorised applications can run.
  • Prevent or disallow macro execution via group policy.
  • Investigate using Microsoft EMET where appropriate.
  • Use a SIEM to monitor for indicators such as process injection.
  • Ensure there is a full backup of all data and systems, so you can restore if and when required.
  • Use a software restriction policy (SRP) in AppLocker.
  • Disable ActiveX in browsers.
  • Install a personal firewall or use HIPS.
  • Block binaries running from %APPDATA% and %TEMP% paths.
With our Security consultancy, Managed Security Service offerings and threat-centric Security product portfolio, we are able to help our customers build the holistic security architecture needed in today’s threat landscape. When it comes to ransomware, our solutions in Anti-Virus, Endpoint Protection, Data Loss Protection, Advanced Malware Protection and Managed SIEM will ensure you aren’t the next topic of a BBC news article.

Category: Security

Alastair Broom
December 10, 2016

I was recently asked what I think will be three things making an impact on our world in 2017, with a few permutations of course:

Maximum of 3 technologies that will be significant for enterprises in terms of driving value and transforming business models and operations in 2017

Innovations that are most likely to disrupt industries and businesses

I’ve put my three below – it would be great to hear your thoughts and predictions in the comments!

Internet of Things

The Internet of Things is a big one for 2017. Organisations will move from exploring ideas around what IoT means for them in theory, to rolling out sensors across key opportunity areas and starting to gather data from what were previously “dark assets”. The reason IoT is so important is the amount of data the things will generate, and the new insight this gives organisations, including physical asset utilisation and optimisation and proactive maintenance. Those organisations that take the IoT seriously are going to see their customers, their data and their opportunities in completely new ways. Being able to add more and more data sources into the intelligence stream means decisions are backed by more facts. It’s Metcalfe’s Law – the value of a network is proportional to the square of the number of users. Data is the network, and each thing is another user.
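
Metcalfe’s Law is easy to make concrete: the number of possible pairwise connections grows with the square of the number of participants, so each new ‘thing’ adds more value than the last.

```python
# Metcalfe's Law as invoked above: network value is proportional to
# the square of the number of users, here counted as the number of
# possible pairwise connections, n * (n - 1) / 2.
def metcalfe_value(n_users: int) -> int:
    """Number of possible pairwise connections between n users."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:5d} things -> {metcalfe_value(n):,} possible connections")
```

Going from 10 things to 1,000 multiplies the connection count more than ten-thousand-fold, which is why each additional data source matters so much.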

Being prepared to exploit the IoT opportunity, though, especially at scale, will take proper planning and investment. Organisations will need a strategy to address the IoT, one that identifies quick wins that help build the business case for further IoT initiatives. The correct platform is key: an infrastructure for things. The platform that connects the things to the network will need to be robust, will likely be a mix of wired and wireless, and – because it’s unlikely to be a separate infrastructure – will need the visibility and control required to ensure data is correctly identified, classified and prioritised.
Security too will be fundamental. Today the things are built for user convenience, with security a secondary concern. The IoT therefore represents a massively increased attack surface, one that is particularly vulnerable to unsophisticated attack. The network will need to be an integral part of the security architecture.

Edge Analytics

Edge analytics is another one to look out for. As the amount of data we look to analyse grows exponentially, the issue becomes twofold. One, what does it cost to move that data from its point of generation to a point of analysis? Bandwidth doesn’t cost what it used to, but paying to transport TB and potentially PB of information to a centralised data processing facility (data centre that is) is going to add significant cost to an organisation. Two, having to move the data, process it, and then send an action back adds lag. The majority of data we have generated to this point has been for systems of record. A lag to actionable insight in many scenarios here may very well be acceptable. But as our systems change to systems of experience, or indeed systems of action, lag is unacceptable.
Analytics at the edge equates to near real-time analytics. The value of being able to take data in real time, with its context, analyse it alongside potentially multiple other sources of data, and then present back highly relevant, in-the-moment intelligence – that’s amazing. Organisations once again need to ensure the underlying platform is up to the task: the ability to capture the right data, maintain its integrity, conform to privacy regulations and manage the data throughout its lifecycle. Technology will be needed to analyse the data at its point of creation; essentially, you will need to bring compute to the data (and not the other way round, as is typical today).
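
The twofold issue can be put into rough numbers. The transfer costs and latencies below are invented assumptions, purely to illustrate why bringing compute to the data wins as volumes grow:

```python
# Rough cost/latency comparison of shipping sensor data to a central
# data centre versus analysing it at the edge. All figures are invented
# assumptions for illustration only.
def central_cost(tb_per_day: float, cost_per_tb: float = 50.0) -> float:
    """Daily transport cost of hauling raw data to a data centre."""
    return tb_per_day * cost_per_tb

def round_trip_ms(distance_km: float, per_km_ms: float = 0.01,
                  processing_ms: float = 20.0) -> float:
    """Crude time-to-action: propagation both ways plus processing."""
    return 2 * distance_km * per_km_ms + processing_ms

print(f"central: ${central_cost(100):,.0f}/day transport, "
      f"{round_trip_ms(2000):.0f} ms to an action")
print(f"edge:    ~$0/day transport, {round_trip_ms(0):.0f} ms to an action")
```

Even with generous assumptions, the central route adds both a recurring transport bill and tens of milliseconds of lag per action – tolerable for systems of record, unacceptable for systems of action.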

Cognitive Systems

Lastly, cognitive systems. Computers to this point have been programmed by humans to perform pretty specific tasks. Cognitive systems will now “learn” what to do not only from human interaction, but from the data they generate themselves, alongside the data from other machines. Cognitive systems will continually reprogram themselves, each time getting better at what they do. And what computers do is help us do the things humans can do, but faster. Cognitive systems will expand our ability to make better decisions – to help us think better. They move us from computing systems essentially built to calculate really fast to systems built to analyse data and draw insights from it. This extends to predicting outcomes based on current information and the consequences of actions. And because it’s a computer, we can draw insight from a far greater base of information. We humans are really bad at holding a lot of information in mind at once, but computers (certainly for the short term) are constrained only by the amount of data we can hold in memory to present to a compute node for processing.

Neil Thurston
December 8, 2016

Nature has a balance, yin and yang, where apparently opposite forces complement each other. This is the goal of hybrid IT: to blend the apparent opposites of in-house IT and public clouds and deliver seamless services to end users. This can be achieved by delivering the right mix of on premise and cloud services, with consistent operational management, automation, security and service management across them.

Sitting in-between on premise and cloud infrastructures and the delivered services is a software layer – the hybrid IT operating system. This ‘operating system’ includes the software-defined data centre, the policy-based and self-service automation platform and the telemetry for analytics, security and service management platforms. What hybrid IT operating system you choose depends on your ideology.

The first ideology is ‘push’. Push says that you take the technologies, skills, processes and tools that you operate on premise and try to replicate those in the cloud. This enables a hybrid IT transition, rather than a transformation. If you use VMware virtualisation on premise, this would mean using a VMware-based public cloud (such as Logicalis Optimal Cloud, IBM Bluemix or VMware on AWS in 2017) and VMware cross-cloud services across these environments (such as cross-cloud vMotion).

The second ideology is ‘pull’. Pull says that you take the technologies that the cloud operates and replicate those on premise, refreshing your skills, processes and tools accordingly. This requires a transformation, which itself presents an opportunity to review and modernise your processes and tools. If you invest heavily in Microsoft, this could mean using Azure cloud and platform services, then implementing Azure Stack and Hyper-V on premise and using Azure Portal to manage everything.

There’s no right or wrong ideology – each has its own benefits and drawbacks. The reasons for choosing one way over the other include cost and risk, but also the expected mix of on premise and cloud-based services – for example, if you expect to have more than 50% on premise infrastructure then push is potentially more advantageous, and vice versa.
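
That rule of thumb reduces to a one-line decision helper. The 50% threshold is the heuristic described above, not a hard rule:

```python
# Toy decision helper for the push-vs-pull rule of thumb: favour
# 'push' when most workloads stay on premise, 'pull' when most move
# to the cloud. The 50% threshold is a heuristic, not a hard rule.
def hybrid_ideology(on_premise_fraction: float) -> str:
    """Suggest an ideology for a given on premise workload mix (0-1)."""
    if not 0.0 <= on_premise_fraction <= 1.0:
        raise ValueError("fraction must be between 0 and 1")
    return "push" if on_premise_fraction > 0.5 else "pull"

print(hybrid_ideology(0.7))  # mostly on premise -> push
print(hybrid_ideology(0.3))  # mostly cloud      -> pull
```

In practice cost, risk, regulation and existing skills would all weight the decision; this just encodes the single factor named in the paragraph above.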

What’s clear is that hybrid IT is the future state of IT. Regulations, security, bespoke workloads and the like will always err towards on premise services, whilst digital users and lines of business will gravitate towards the on-demand, elastic, pay-as-you-go cloud. Hybrid IT is required to blend these services together; delivering them with a consistent approach will drive operational, service and cost efficiencies. All you need to ask yourself is: ‘push’ or ‘pull’?

Category: Hybrid IT

Scott Reynolds
December 5, 2016

In the third of a nine-part series drawing on the Logicalis Global CIO study, Scott Reynolds explains why apps are central to digital transformation.

The statement ‘every company is a software company’ has been on repeat over the last few years. When it was first uttered it was more of a future-gazing, stake-in-the-ground pronouncement – and its application to today’s world is probably still a bit premature. Not every business is a software business, yet – but our global CIO survey suggests that we’re getting there, with the help of a few shining lights along the way.

In 2013, Forbes noted that Ford sells computers-on-wheels and FedEx boasts a developer skunkworks (a loosely structured group of people who research and develop a project primarily for the sake of radical innovation). Both are great examples of the happy union between traditional industries and technology industries – and, today, they are not as isolated as you might think. Over 700 CIOs now tell us that 77% of firms are similarly developing apps, either in-house, with the help of third parties, or drawing on a combination of internal and external skills.

In fact, not only is the volume of companies getting up close and personal with application development starting to swell, but app development as a strategic activity is also attracting more attention. Rather than being relegated to the fringes, application development is increasingly taking to the centre ground. Today, less than a quarter of apps (23%) are purely promotional. The majority are being used to build new services and revenue (57%) or streamline business processes (63%).

Developing for digital

We tend to associate apps with the Apple App Store or the Android marketplace, but they’re so much more than website spin-offs for mobile users. Enterprise-grade applications are replacing ‘big tech’. With the goal of putting automation at their core and providing frictionless self-service experiences, companies are bringing workloads up to the application level.

In the past, we’ve emphasised the benefits of instituting a DevOps strategy to develop code with fewer defects and support challenges once it’s released into production. My message to the 64% of businesses developing apps in-house would be to take a digital performance readiness approach and embrace agile from the beginning. Allowing updates to be made quickly and regularly, for constant refinement, will create ‘killer apps’ with the punch to disrupt for the better.

Apps = Smart software

As the research attests, all sorts of companies are creating their own luck and doing some sort of app wizardry to get ahead.

Book publishers in the business of printing books are transforming themselves into software companies to offer digital content and branded applications. Airline companies are building equipment-tracking apps to give engineers a live view of the location of each piece of airline maintenance equipment, and pharmaceutical companies are creating medication temperature monitoring apps, which use sensors to ensure the best possible delivery of medical supplies.

Overall, apps are making firms a lot smarter. Their ability to gather tremendous amounts of data from sensors and other sources, using machine learning algorithms and predictive analytics, makes them the brains behind a company’s transformation and the driving force behind our respondents’ digital transformation journey. Channelling James Carville, Bill Clinton’s campaign strategist: “it’s the apps, stupid”.

Category: SDN / Mobility

Alastair Broom
November 30, 2016

As you might have read back in November 2016, a huge Distributed Denial of Service (DDoS) attack against Dyn, a major domain name system (DNS) provider, broke large portions of the internet, causing a significant outage for a tonne of websites and services, including Twitter, GitHub, PayPal, Amazon, Reddit, Netflix and Spotify.

How did the attack happen? What was the cause behind the attack?

Although exact details of the attack remain vague, Dyn reported that an army of hijacked internet-connected devices is thought to be responsible for the large-scale attack, similar to a method recently employed by hackers to carry out a record-breaking DDoS attack of over 1 Tbps against the French hosting provider OVH.

According to security intelligence firm Flashpoint, Mirai bots were detected driving much, but not necessarily all, of the traffic in the DDoS attacks against Dyn. Mirai is a piece of malware that targets Internet of Things (IoT) devices such as routers, security cameras and DVRs, and enslaves vast numbers of these compromised devices into a botnet, which is then used to conduct DDoS attacks.

This type of attack is notable and concerning because it largely consists of unsecured IoT devices, which are growing exponentially with time. These devices are implemented in a way that they cannot easily be updated and thus are nearly impossible to secure.

Manufacturers focus mainly on the performance and usability of IoT devices but neglect security measures and encryption mechanisms, which is why these devices are routinely hacked, widely recruited into DDoS botnets and used as weapons in cyber-attacks.
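Mirai famously gained its foothold not through sophisticated exploits but by logging into devices with factory-default usernames and passwords. As a toy sketch of the kind of check an IoT security audit might run (the credential list is a small illustrative sample, not Mirai's actual dictionary):

```python
# A handful of well-known factory-default credential pairs, for illustration.
DEFAULT_CREDS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
    ("root", "12345"),
}

def is_factory_default(username: str, password: str) -> bool:
    """Flag a credential pair that still matches a known factory default."""
    return (username, password) in DEFAULT_CREDS

print(is_factory_default("admin", "admin"))   # a device like this is easy prey
print(is_factory_default("admin", "x7#Lq9!b"))
```

Changing default credentials at deployment time removes the single biggest attack vector Mirai relied on.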

An online tracker of the Mirai botnet suggests there are more than 1.2 million Mirai-infected devices on the Internet, with over 166,000 devices active right now.

IoT botnets like Mirai are growing rapidly, and there is no easy way to stop them.

According to officials speaking to Reuters, the US Department of Homeland Security (DHS) and the FBI are both investigating the massive DDoS attacks hitting Dyn, but neither agency has yet speculated on who might be behind them.

At Logicalis UK, we have a threat-centric approach. We can help customers protect their applications and environments against DDoS attacks with on-premise, cloud-based or hybrid deployments based on solutions through our partner F5.

F5 provides seamless, flexible, and easy-to-deploy solutions that enable a fast response, no matter what type of DDoS attack you’re under. Together, Logicalis and F5 can:

  • Deliver multi-layered DDoS defence from a single box with a fast-acting, dual-mode appliance that supports both out-of-band processing and inline mitigation, while enabling SSL inspection and guarding against layer 7 app attacks.
  • Stop attacks on your data centre immediately with an in-depth DDoS defence that integrates appliance and cloud services for immediate cloud off-loading.
  • Defeat threats cloaked behind DDoS attacks with unique layer 7 application coverage, without impacting legitimate traffic.
  • Activate comprehensive DDoS defence with less complexity and greater attack coverage than most solutions.
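The mechanics behind layer 7 mitigation of this kind ultimately come down to deciding, per client, whether each request is allowed through. The sketch below is not F5’s implementation – just a classic token-bucket rate limiter in Python to illustrate the principle of absorbing legitimate bursts while rejecting sustained floods:

```python
import time

class TokenBucket:
    """Toy rate limiter: allow `rate` requests/sec with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; otherwise reject the request."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 20 instantaneous requests against a bucket of capacity 5:
bucket = TokenBucket(rate=10, capacity=5)
results = [bucket.allow() for _ in range(20)]
print(results.count(True), "allowed,", results.count(False), "rejected")
```

Real DDoS appliances apply far richer signals (per-source buckets, behavioural baselines, SSL inspection), but the allow/reject decision at the core is the same shape.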

If you would like to find out more about Logicalis’ advanced security practice, please complete the form opposite. Our experts are primed and ready to support you.

If you would like to find out more about our security practice, please do not hesitate to get in touch:

Alastair Broom
November 29, 2016

Last week we hosted a number of our customers (30 people from 18 different organisations, in fact) at an event held at Sushisamba in London. From the 39th floor, overlooking most of London, I had the privilege of hosting some of our existing and potential clients for a discussion prompted by the upcoming General Data Protection Regulation (GDPR). Over an absolutely fantastic lunch, which thankfully included many tasty meat dishes – I’m no huge fan of raw fish – we talked about how organisations are going to have to rethink their strategy around data governance and security in the face of a very tough new law.

I just wanted to give you a few takeaways from the day, none of which are edible – I’m sorry…

The first of our guest speakers was Lesley Roe, Data Protection Officer from the IET. Lesley spoke about what the IET are doing to get ready for GDPR. They hold a vast amount of personal data, and given that they are advising their membership on all manner of related things, they need to lead by example. Key points from her presentation are:

  • GDPR is about giving people more control over their personal data. Every day we share an extraordinary amount of personal data with all manner of organisations, and this data is valuable. GDPR is about ensuring we retain the rights to that value: what it is processed for, who can process it, and how it is retained or deleted once its useful life has expired.
  • Everyone has a part to play and training of staff & staff awareness are paramount. This, however, is no mean feat.
  • The process of data governance, and the education of that process throughout the organisation, will be the only way to fully comply with the regulation. How do the IET classify old & new data? How do they manage the lifecycle of the data? How do they make sure they are only obtaining, using and retaining the data they need and have consent for?
  • None of this, however, is possible without first knowing what personal data is within your organisational context, and where it lives.
  • Much of the thinking around GDPR will be a huge shift in the mindset of organisations today. Companies just do not think about their data assets and their responsibility for that data in the spirit of the regulation at all.

Our next two speakers were from two of our technology partners, VMWare and Palo Alto Networks. Things to remember here:

  • Technology, without a doubt, has a part to play in ensuring compliance. The regulators are far more savvy about what is possible in the security market, and they will expect organisations to leverage technologies within reach of their budgets, and proportionate to their exposure, to best mitigate any risks to the rights of individuals.
  • The ability to prevent, detect and report on the nature and extent of any breaches will be very important. Technologies will be needed to prove that organisations can do this effectively and efficiently, especially in the face of stringent reporting requirements.
  • State of the art will really mean state of the art. Regulators will be assessing how organisations are using the best possible mix of technologies to minimise both exposure to risk, and impact of any breach if/when they should occur.

The last presentation was from Ed Charvet and his guest star Ian De Freitas. Ian is part of the alliance we have with legal experts BLP. The joint value proposition Ed and Ian spoke about is, I believe, what makes us entirely unique in this space:

  • The first step towards compliance is data discovery – what is personal data from the perspectives of both the GDPR and the organisation’s own context? Where is the data? How is it currently classified? How is it processed? How are permissions obtained? This is delivered through a mix of manual and automated processes to help customers understand where they stand today.
  • But this process takes time. The regulation comes into force on 25th May 2018, and as Ian made clear, the regulator is taking a “zero day” approach. This effectively means that if you’re not fully compliant on that day, you are non-compliant, and the regulator has every power to come after you. With fines of up to 4% of global group revenues, or €20 million (whichever is the greater, of course), this is a regulation with teeth – and with what seems like a very real political agenda. Watch out Facebook, Google, Amazon…
  • Being compliant with the likes of the DPA today, while impressive, would still mean on day zero you do not comply with the new law.
  • Key questions to ask are: do you have a legitimate interest to process the data? What exactly are you planning to do with it? These will need to be made very clear even before the gathering of data has begun.
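The automated side of a data discovery exercise often starts with simple pattern matching over text. Below is a minimal sketch of that first pass – the two patterns shown are a tiny illustrative subset of what real discovery tooling covers, and the sample text is invented:

```python
import re

# Illustrative patterns for two common categories of personal data.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\b0\d{4}\s?\d{6}\b"),
}

def find_personal_data(text: str) -> dict[str, list[str]]:
    """Return all matches in `text` for each personal-data pattern."""
    return {name: pattern.findall(text) for name, pattern in PATTERNS.items()}

sample = "Contact jane.doe@example.com or call 07700 900123 for details."
print(find_personal_data(sample))
```

A real programme would run this kind of scan across file shares, databases and mailboxes, then feed the hits into classification and lifecycle decisions – which is exactly where the manual, contextual side of discovery takes over.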

What became clear throughout the day is that time is tight to reach compliance, and the ICO in the UK seems to be recruiting in earnest to gear up for real enforcement of the law. This feels like something that is going to change how data – and in particular the protection of personal rights and data – is valued and protected by the organisations that benefit most from it. What organisations need to do as a matter of urgency is find out what personal data they hold and where they store it. They need to assess their current security infrastructure and find out what gaps exist that could pose a risk of, and ultimately lead to, the loss of personal data. They need to put the right people and procedures in place to comply with new and enhanced rights and tighter reporting deadlines, and they should be working out what Data Protection Impact Assessments need to look like for their organisation to satisfy regulatory requirements.

As a next step, please reach out to either myself, Ed Charvet, Alastair Broom or Jorge Aguilera to discuss how Logicalis can help our customers get ready for GDPR. From the data discovery workshop, to engaging with BLP on legal matters, to technology assessments powered by tools from the likes of VMware and Palo Alto Networks; we really can help customers on the road towards compliance.
