Digitally Speaking

Regina Bluman
January 15, 2020

The first report in our four-part series from the 7th Logicalis Global CIO Survey, ‘The Changing Role of the CIO’, discovered that digital transformations are driving major changes for CIOs. As well as managing the IT infrastructure, they are now also expected to contribute to the business by supporting business initiatives and revenue growth.

As the inexorable tide of digital transformation sweeps through industries across the world, the traditional role of the CIO as a technologist is markedly shifting as companies seek to harness new digital technologies to drive the business forward. Today, CIOs are expected to look beyond technology and consider it within the context of enhancing business strategy and growing revenue – a significant change.

When we launched our first CIO Survey seven years ago, CIOs were devoting the majority (70%) of their time to the day-to-day management of technology; by 2019 that figure had fallen to a third (33%).

The findings strongly suggest that the CIO role is now becoming one which encompasses broader executive responsibility. It is no longer the endpoint of a career path; rather the role is becoming one which is very much about driving business success.

Our results complement findings from IDC (IDC FutureScapes 2020, The Future Enterprise: Technology, Leadership and Value presentation 11 December 2019). IDC suggests that CIOs are now being measured, in descending order, on business alignment, cost management, risk management, customer satisfaction and revenue management.

Our survey discovered that 43% of CIOs are being measured on their contribution to revenue growth. In 2018 we asked the same question and the figure was 35%. This focus on contribution to revenue growth is supported by the fact that a significant number (74%) of CIOs say reshaping their organisations’ customer experience has become a more significant benchmark of their performance.

This shift is being driven by the urge to digitally transform operations, placing greater emphasis on building a tech foundation on cloud computing, and reducing reliance on user-owned hardware. The requirement to transform is also increasing reliance on subscription-based cloud services, as CIOs seek to unlock competitive advantage and improve operational efficiency.

As business responsibility increases, four in five (80%) CIOs are spending more time on building and operating digital platforms and operating models, while two-thirds (67%) have also experienced growth in supporting business initiatives.

The Logicalis findings reflect the wider global trend towards digital transformation and how this is reshaping the role of the CIO, which is also mirrored in the IDC presentation mentioned above. IDC says that 80% of CEOs are under pressure to execute a successful digital transformation, hence, as our research illustrates, traditional business responsibilities are cascading down to CIOs.

Key CIO findings

  • 35% of CIOs spend less time on the day-to-day management of IT than they did in previous years
  • 55% of CIOs are measured on their success of IT cost reductions, compared to 61% in 2018
  • CIOs spend 25% of their time on information and security compliance
  • 51% of CIOs have increased their time spent on innovation
  • Nearly half of respondents (47%) cite increased revenue and business growth as their priority in the next 12 months

To find out more about how CIOs are being tasked with greater business responsibility, and conflicting pressures, including meeting the demands of cybersecurity, compliance, business continuity and other large-scale projects, download the first part of the global survey here.

Smart Factory

Team Logicalis
November 4, 2019

We’re excited to be talking about all things IoT at Europe’s best digital manufacturing show. Join us on 13 and 14 November at Exhibition Centre Liverpool for the Smart Factory Expo.

Organised by The Manufacturer magazine and part of Digital Manufacturing Week 2019, Smart Factory Expo promises to showcase transformational manufacturing technologies and game-changing innovation from over 150 cutting-edge exhibitors – including us!

Register now

How we’re aiming to inspire you

Meet the team from Logicalis at stand E20 to see how we can support you in driving value from your existing environment through services such as condition monitoring, while also designing and building the foundations of a connected factory to support future growth.
– Wednesday 13 November 09:00 – 17:00 and Thursday 14 November 09:00 – 16:00 on stand E20

Hear Thomas Kugelmeier, a Client Solution Executive at Logicalis specialising in industrial networking, explain how combining OT and IT to build a converged plant environment can enable true business transformation.
– Wednesday 13 November at 14.30 in the Smart Factory Theatre

Hear Richard Simmons, our Head of European Centre of Excellence for IoT, discuss how manufacturers can deliver real business value from a connected factory.
– Thursday 14 November at 11.00 in the Digital Transformation Theatre

Read Richard Simmons’ thoughts on digital factories and the journey to Industry 4.0.

Sound good? We can’t wait to see you in Liverpool – register now and pre-book your meeting with our expert IoT team.


Smart Mirror

Richard Simmons
October 2, 2019

We’re excited to once again be taking part in Think Summit London, IBM’s flagship business and technology conference.

Taking place on Wednesday 16 October at Olympia London, the event will feature tech talks, immersive experiences, topical debates and thought-provoking guest speakers.

We’ll be in the Data & AI Campus, where our team will be demonstrating how organisations can unlock the benefits of IoT data with analytics.

Our visitors will be able to experience the Smart Mirror. Pick up and model a garment and the mirror, using RFID technology, will show you targeted, specific information about your product or area of interest.

Register now


Once you’ve visited us, there are many other activities in the Data & AI Campus that are well worth investigating:

  • Journey to AI expert bar and assessment. Discover where you are on your journey to AI and access specialist expertise
  • Watson in Action. See behind the scenes of the US Open with unprecedented access and test Watson’s ability to recognise the sights and sounds of the game.
  • Test your swing with Golf AI. An experimental project allowing you to match your golf swing to a professional golfer using an open source human pose estimation model.
  • Can AI help Leatherhead FC raise their game? View a dedicated application for Leatherhead FC that analyses statistics, match reports and social media to provide insights into team and individual performance and the opposition.

Register to secure your place now

Jump data lake

Team Logicalis
August 23, 2019

Data has the ability to create new opportunities and revenue models for organisations, but how to manage all the data generated and drive useful insights from it is something IT leaders are still grappling with. For good reason, data lakes continue to get a lot of attention; however, misconceptions remain. Read these five data lake myths before taking the plunge…

Myth one: A data lake is a product which you can buy

A data lake is a reference architecture that is independent of technology. It’s an approach that an organisation can use to put data at the heart of its operation that includes governance, quality and management of data, thereby enabling self-service analytics to empower all consumers of data.

As helpful as it would be, a data lake is not a product that you can just purchase. You can’t just buy any data warehouse solution and call it a data lake.

Myth two: There is only one data lake solution

A data lake could be developed and used based on many relational database management systems – you’re not tied into the prominent names; there are lots of vendors and systems available.

A data lake combines a variety of technologies to establish systems of insight to provide agile data exploration for data scientists to address business needs.

Myth three: Data lakes are for dumping data (and forgetting about governance)

While software and hardware are key components of a data lake solution, equally important is the cataloguing of data, quality of data, and data governance and management processes.

Just as some data warehouses have become massive black holes from which vast amounts of data never escape, a data lake can become a data swamp if good governance policies are not applied.

All data in a data lake must be catalogued, accessible, trusted, and usable; active governance, quality and information management are indispensable parts of the data lake.
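To make this concrete, here is a minimal sketch (in Python, with invented field names) of the kind of metadata a governed data lake might record for every dataset it ingests – a dataset that is missing an owner, a classification or a quality check would be flagged rather than exposed to users:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CatalogueEntry:
    """One entry in a (hypothetical) data lake metadata catalogue."""
    name: str
    source_system: str
    owner: str
    ingested_on: date
    classification: str              # e.g. "public", "internal", "restricted"
    quality_checked: bool = False
    tags: list = field(default_factory=list)

    def is_governed(self) -> bool:
        # A dataset is only exposed to users if it is owned, classified
        # and has passed a quality check.
        return bool(self.owner) and bool(self.classification) and self.quality_checked

entry = CatalogueEntry(
    name="iot_sensor_readings",
    source_system="factory-gateway",
    owner="data-platform-team",
    ingested_on=date(2019, 8, 1),
    classification="internal",
    quality_checked=True,
    tags=["iot", "raw"],
)
print(entry.is_governed())  # True – catalogued, owned, classified and checked
```

The value is less in the code than in the discipline: every dataset carries enough metadata to be found, trusted and governed.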

Myth four: Delivering access to the data lake is a measure of success

Having data in a central location is not a true analytics solution. The goal is to run data analyses that produce meaningful business insights; to uncover new revenue streams, customer retention models or product extensions.

But that data must be trusted, relevant, and available for all consumers of data. A data lake needs an intelligent metadata catalogue that can relate data to business terminology, giving cryptically coded data context and making it more understandable. It should also record the source and quality of data from both structured and unstructured information assets, and provide a governance fabric to ensure that information is protected, standardised, efficiently managed, and trustworthy.

Myth five: The data lake is a replacement for a data warehouse

The data lake can incorporate multiple enterprise data warehouses (EDW), plus other data sources such as those from social media or IoT. These all come together in the data lake where governance can be embedded, simplifying trusted discovery of data for users throughout the organisation.

Therefore, a data lake augments EDW environments, empowering data scientists and analysts to easily explore their data, discover new perspectives and insights, and accelerate innovation and business growth.

Thanks to mobile devices, apps and IoT, for example, the amount of unstructured data is growing exponentially, so it’s no wonder the demand for data storage is intensifying. According to IDC, data lake adoption is expected to rise from 30% of organisations worldwide to 90% within the next three years. To see if this approach could be a solution for your organisation, download our latest infographic. If you’d like to know more and identify some practical next steps, why not register for one of our Data Services Workshops?

Find out more 


Team Logicalis
August 16, 2019

Data, data everywhere but not a drop to use – that’s how the poem goes, doesn’t it? Something like that anyway… Is a data lake the answer to the challenge of making the most of ever-growing data volumes? Organisations we’re working with are increasingly exploring this approach to address demands for an agile yet secure and well-governed data environment.

What is a data lake?

A data lake is a shared data environment that comprises multiple repositories and capitalises on big data technologies. Unlike a data warehouse, a data lake uses a flat architecture that keeps data in its native format until it’s needed. It allows rapid landing and storage of data, and it provides ready, unfettered self-service access to data for analysis. Comprehensive governance capabilities help ensure data can be easily found, understood and stored without duplication.
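As an illustration of the ‘native format until it’s needed’ idea (often called schema-on-read), here is a small hypothetical sketch: raw events land untouched, and structure is applied only at query time. The field names and records are invented for the example:

```python
import json

# Raw events land in the lake exactly as produced – differently shaped
# records coexist in the same landing zone.
raw_landing_zone = [
    '{"machine": "press-04", "temp_c": 71.5, "ts": "2019-08-01T09:00:00"}',
    '{"machine": "press-04", "temp_c": 72.1, "ts": "2019-08-01T09:05:00"}',
    '{"machine": "lathe-02", "rpm": 1400, "ts": "2019-08-01T09:00:00"}',
]

def read_with_schema(records, required_fields):
    """Apply a schema only at read time: keep records that fit the question."""
    out = []
    for line in records:
        event = json.loads(line)
        if all(f in event for f in required_fields):
            out.append(event)
    return out

# A temperature analysis picks out only the records it understands;
# the lathe record stays in the lake, available for other analyses.
temps = read_with_schema(raw_landing_zone, ["machine", "temp_c"])
print(len(temps))  # 2
```

Contrast this with a warehouse, where the lathe record would have been rejected or reshaped on the way in.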

For more information on the difference between a data lake and a data warehouse, we like this ‘super simple data lake explanation’ from Forbes.

How might a data lake address some of your data challenges?

A well-governed data lake will:

  • Enable swift access to a full range of data in a timely fashion for business users, analysts, data scientists and developers, so they can generate accurate, meaningful insights
  • Accelerate analytics and data preparation to keep up with the speed of business
  • Facilitate collaboration between knowledge workers, data scientists (when deeper analytic capabilities are required) and data engineers (when it’s time to deploy data lake–based applications to line-of-business users).
  • Ensure data quality, security and governance to provide users with trusted, understandable data while protecting privacy and maintaining compliance with regulations
  • Accommodate a rapidly growing collection of data with a scalable approach to storage that supports fast-growing data volumes cost-effectively while maximising existing resources.

Download our infographic to see how a data lake could help your business

Considering taking the plunge into the data lake? Sign up to one of our Data Services Workshops where our data experts will get to grips with your current data environment and infrastructure, your business challenges and data goals, and then give you practical advice and achievable milestones to help you meet your data aspirations.

Find out more  

The digital factory

Richard Simmons
June 26, 2019

In the world of manufacturing Industrial Internet of Things (IIoT), flooring is a great visual metaphor for two, often-conflicting, sides of the business. The carpet tiles of the office – governed by IT, versus the epoxy resin floor of the factory – governed by operational technology (OT). Business needs, procurement cycles, downtime allowances; everything is different. And here lies the first, and often biggest, challenge in the journey towards the ‘digital factory’: harmonising two separate worlds.

Like the interchangeable carpet tiles, the IT equipment in the office is refreshed during relatively short investment cycles. But with costs running into the tens of millions, the plant, machinery and assets of the industrial side, like the long-wearing resin floor beneath, are expected to perform for decades.

With such colossal differences, it’s often difficult to get IT and OT to work together and consider each other’s business needs. But if the two can unify, there are real savings and improvements to be had, in terms of efficiencies, data and business insight, cost reductions and productivity.

What is a ‘digital factory’?

Industry 4.0. It’s the latest buzzword in IT and manufacturing and is generating lots of noise. But away from the chatter most manufacturers are actually focused on business as usual. Aside from the difficulty in breaking down the barriers between the office and factory, the term ‘digital factory’ itself has been a source of confusion.

When you delve further into what is required to deliver a digital factory, it quickly gets complicated. A manufacturer would have to build capability in a number of different areas to reach the level of maturity needed to achieve ‘digital factory’ status. They’d need to be generating data about what’s happening on the factory floor, driven by sensors or actuators; things that are either generating data or that can be sent messages to act on that data.

Then comes the need for networking and connectivity – both within the plant environment and industrial areas as well as within the IT areas. Next, you’d need data and analytics platforms, enabling you to bring that data together and then run analytics, which would then move into areas like machine learning and AI. And then the most valuable element: how you build out the business process, creating workflows to start automating these processes.

While there is some value in having a dashboard that shows if a machine is working as effectively as it should, the real value is to be able to start tuning that machine in real time using your data.

A true digital factory is made up of many layers. At the centre are devices, connectivity, data and analytics, and automation. And, as you can imagine, that’s quite a complicated mix for manufacturers that might be early on in their digital transformation journey and may not have the platforms in place or the in-house skills to manage them.

While IT equipment is refreshed every few years, the opportunity to update and upgrade plant equipment, machines and assets is hampered by eye-watering capex costs and downtime – a very real challenge. Machinery and equipment will run constantly for months and even years, so maintenance periods are few and the ability to install new technology is limited. Digitising and automating this environment is a long journey, and many OT teams simply don’t know where to start.

For those just setting off, it’s common to see small pilot projects. Perhaps a few sensors to condition and monitor the health of machinery, or to generate data for understanding machine efficiency. We’re talking small-scale, relatively inexpensive pilots to test and learn. But that’s still a long way from manufacturers investing millions to refresh entire operating environments, fully-integrated with IT, and used to make tangible operations savings.
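A pilot of this kind can start very small indeed. The sketch below is purely illustrative – the sensor, window size and threshold are invented – but it shows the shape of a basic condition-monitoring check that flags readings drifting well outside the recent baseline:

```python
from collections import deque
from statistics import mean, stdev

class VibrationMonitor:
    """Flags sensor readings that sit far outside the recent baseline."""

    def __init__(self, window: int = 20, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window)   # rolling baseline window
        self.z_threshold = z_threshold

    def check(self, value: float) -> bool:
        """Return True if the reading looks anomalous against the window."""
        anomalous = False
        if len(self.readings) >= 5:            # need a minimal baseline first
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomalous = True
        self.readings.append(value)
        return anomalous

monitor = VibrationMonitor()
for v in [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 1.0]:
    monitor.check(v)                           # healthy readings build the baseline
print(monitor.check(5.0))                      # True – well outside the baseline
```

A real deployment would of course add alerting, persistence and per-machine tuning, but the test-and-learn core of a pilot is often no bigger than this.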

So how can we get closer?

The first step is getting the two sides talking and working together. What’s needed is a unifying, cross-departmental manager who can engage the OT and IT sides of the business to develop a strategy whilst taking ownership of the delivery. Mythical though it may sound, little by little, the role of Head of Digital or Head of 4.0 is actually starting to appear in a handful of manufacturing firms.

The Head of Digital will be the architect of change; their role absolutely critical to the success of digitising the factory. Just having that view across the business; having the responsibility to integrate IT and OT without a bias in either direction, could help elevate projects above the common roadblocks, and see successful pilot projects move through to production.

Unifying people, culture and entrenched processes are key challenges that this role will have to manage. Change brings the greatest resistance, so, along with technical know-how, the Head of Digital will need impeccable people skills to break down the silos between the office and the factory.

For manufacturers just starting out on the journey to design a digital factory, the destination must feel like the summit of Mt. Everest, viewed from the foot. It’s a long, steep climb and not to be embarked upon lightly. My money would be on the Head of Digital to cut the best path to the top without leaving anyone behind.

Richard Simmons
April 26, 2019

If you work in IT like me, you’ll know what a struggle it can be to explain to your family and friends exactly what you do for a living. More specifically, that your job is probably so much more than fixing a laptop – although this is something that I must admit people still ask me to do now and then…

Not being able to easily explain to people what I do and the value of it was frustrating, but now this is changing along with the nature of my job. In the past two years I have been heavily involved in Artificial Intelligence (AI) and the Internet of Things (IoT). Now the solutions that I’m developing and the projects that I’m working with are having a direct impact on people, communities, and even potentially broader societies. From flood prevention to waste optimisation and robotic process automation, the application of technology is changing the way people live and work. So now, when I give examples of my work, people can relate to or even be amazed by how far technology has come.

Moving past repetitive tasks…

The impact of technology, and particularly the development of AI on businesses and society, is something that I spend a lot of time discussing with customers and partners. The advent of Cloud computing has enabled us to analyse vast quantities of data with ever more powerful computers, which is extending the possibilities of what we can do. Recently, there has been a lot of discussion around AI programs that have excelled at a particular task, predominantly a type of game. The AI could be trained to execute a specific task to a level beyond human capabilities. We see this approach being applied within manufacturing, where robots can be programmed to repeat a particular task to a precision and speed that a human would struggle to match, and can learn from their experience to develop more efficient ways of working.

The focus now for companies like DeepMind is to develop a general AI that can be good at more than one task. For DeepMind, this was initially also focused on games and the ability for machines to learn independently until they reach a level where they outperform humans. The hope is that these general AI programs will be able to apply this approach to multiple real-world problems, reducing the time they need to become effective. The concern, though, is that we will reach a point, sometimes called the Singularity, when we develop an AI that can learn and apply itself to any problem without needing input from human programmers. It is easy to see that this could have huge benefits, but for me, it is also a cause for concern.

Autonomous cars are a good example of this dilemma. We know that leveraging autonomous cars could have a huge impact on society. For the elderly or disabled, they could open up much greater freedom and mobility than is currently possible. For governments and councils, they would allow greater efficiency from the infrastructure that is in place and help manage the increased demands of a growing population. But they will never be entirely safe, and they also raise a fundamental problem: how do we program an AI with the rules and ethics that we as humans live by?

The most widely known example of this is the runaway car. A car is travelling down a hill; it cannot be stopped but can be directed. On one side is a cliff, on the other side is a child, and in the middle of the road are two adults. What should the AI in the car do? Swerve over the cliff and kill you, as you are in the car, but save the child and the adults? Swerve and kill the child but save you and the adults? Or not move at all and kill the two adults but save you and the child? What if there were two children, not one? What if the two adults were pregnant women, or there were children in the car? All of these situations would need to be defined in a set of rules, and even then you might need a setting in the car allowing you, as the occupant, to decide whether it should put more value on your life or on others’, as not everyone’s values are the same.
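To see how quickly a ‘simple’ rule set explodes, consider this deliberately crude, purely illustrative sketch of the dilemma as explicit rules – the weights are invented, and even the tie-breaking between equally ‘costly’ options is itself an ethical choice nobody has specified:

```python
def choose_action(occupants: int, child_on_side: int, adults_ahead: int,
                  value_occupant_more: bool) -> str:
    """Pick the option that 'costs' the fewest lives under crude weights."""
    occupant_weight = 2.0 if value_occupant_more else 1.0
    options = {
        "swerve_to_cliff": occupants * occupant_weight,  # the occupants die
        "swerve_to_side": child_on_side,                 # the child dies
        "continue_straight": adults_ahead,               # the adults die
    }
    # min() silently breaks ties by insertion order – yet another value
    # judgement hidden in an implementation detail.
    return min(options, key=options.get)

# One occupant, one child at the roadside, two adults ahead:
print(choose_action(1, 1, 2, value_occupant_more=True))   # swerve_to_side
```

Flip the occupant setting and the ‘right’ answer changes, which is exactly the point: the ethics live in the weights and the tie-breaking, not in the code.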

So what’s next?

This for me highlights one of the key points around the development of technology in the next few years. Technology will not suddenly take over, nor will AI dominate and compete directly with humans, unless we let it. Stephen Hawking, the world-renowned theoretical physicist, expressed his fear that AI might replace humans altogether. He believed that we should still move forward on Artificial Intelligence development, but we also need to be mindful of its very real dangers.

It is up to each business to decide if AI should replace a human and if they do, whether the employees can be given more rewarding work or be made redundant. It is up to society and governments to decide if AI should be applied to urban mobility and how they mitigate the impact on those that work in those sectors currently. What we can do with technology will have a direct impact on all of us, but I strongly believe that it is humans that will determine how that develops and how it is applied. Perhaps in trying to replicate how humans behave in AI solutions, we will actually learn a lot more about what makes us human in the first place.

Fanni Vig
January 11, 2019

In years to come, it’s possible that we’ll look back on 2018 as the year that Artificial Intelligence (AI) really arrived. When potential became possible, and hype gave way to enterprise adoption. But how much influence are CIOs having in its adoption, and how well understood is the term?

AI and innovation

We recently launched the sixth edition of the annual Logicalis Global CIO Survey, which gathered the views of more than 840 CIOs in a number of areas. Innovation was a dominant theme, with this year’s survey demonstrating real progress in the role of the CIO in this area. CIOs are finding that they are very much at the heart of the action – with 83% either leading (32%) or enabling (51%) innovation. It’s an area they’re being increasingly judged on too, with half (50%) now being measured according to their ability to deliver service innovation.

Given that technology now pervades innovation – either delivering it or enabling it – we were keen to understand how CIOs view the rate of adoption, utility and impact of emerging technologies – particularly the Internet of Things (IoT) and Artificial Intelligence (AI). Whilst IoT shows signs of becoming more mature in its use, AI and Machine Learning appear to be in an earlier phase of the hype cycle – possibly where IoT was 12 months ago. Nearly a fifth (19%) of CIOs claim that their organisations are already using AI. That adoption seems likely to continue at pace, with 66% saying AI will be in use in their organisations within three years.

Understanding who is responsible for AI

Before we get too excited about AI changing the face of business, it’s important that we stop and question just how the term is understood. Like IoT, AI is really complex, and many of the use cases we’ve seen up until now are just scraping the surface. The technology industry is still collectively scratching its head to work out what it really means, as the hype continues and threatens to dilute the concept. Do CIOs perceive AI simply as autonomous automated services and interfaces essentially guided by complex manually created rules? Or do they see it in its purest form, as technology that is not just autonomous and automated, but also able to learn and adapt independently based on context? In truth, it’s probably a bit of both.

The big issue facing CIOs when it comes to AI is less about the technology itself, and more about the people, process and culture which support its adoption. Whilst there’s clearly a value in products being ‘AI-ready’, the hangover from the rise of Shadow IT – which saw departments and individuals investing in their own hardware, software, apps and services – is organisations that have huge swathes of data residing in a wide range of places, some out of sight of the CIO.

When asked about where AI and machine learning were in use within their organisations, it was no surprise to see the CIOs we surveyed point to the IT department as the leading area. That’s likely because this is the department they’ve got the most sight of. The siloed nature of organisations, and the distributed data residing within, causes a headache in terms of ownership. Most businesses have a variety of people who own the data in each department, but few who want to be responsible for it. That’s where the CIO comes in, as issues such as data security and compliance sit squarely within their remit.

The opportunity for AI

So where can the CIO start, in order to truly realise the potential of AI? The first step is to get people together to understand and agree their processes around data. People and processes can change really quickly, so taking the time to agree a clear approach is important. Those silos aren’t going to be broken down overnight, so the CIO also needs to be realistic. Only then can you think about how to leverage what you’ve got in place, but even then this needs to be done in the right way, with issues such as security, performance and cost-effectiveness at front of mind.

From the survey findings, it was encouraging to see CIOs so bullish about their ability to engage with AI. This, more than anything, will be vital if organisations are to derive true value from AI. At present, rates of use across various business departments are low – with the exception of IT and customer service – which suggests operational, fringe and test cases. However, this also seems to be an opportunity for CIOs to build a culture of experimentation and small-scale deployments driven by clear customer or market needs.

AI should not be seen as a silver bullet. It isn’t the answer to everything. It needs more data and more resources, and you need the right foundations and infrastructure in place and ready to go; otherwise there is going to be a huge amount of inefficiency, especially as the speed of change in technology terms is unbelievable, with new vendors and solutions springing up on a regular basis. To understand whether these new options will deliver against your objectives, it’s imperative that you look at the entire ecosystem, with people, culture and process playing a pivotal role.

Pepper, the humanoid robot, talks AI

Justin Price
December 10, 2018

Logicalis recently attended IBM’s Think London event where Pepper the Robot was our honorary guest. With people lining up to talk to Pepper, we battled through the crowds to ask a few questions about the hot topics of the day, such as implementing AI. 

Q. Hi Pepper! Can you tell us about Think London?

IBM Think London was an incredible event where technology met humanity. There were a range of industry leaders and experts there to advise people on topics such as choosing the best fit cloud model for your business, leveraging data better, and innovating faster.

I was delighted to join my friends at Logicalis at the show, informing visitors about how they can modernise infrastructures, transform the way IT is delivered, and unlock the value of data. As ‘architects of change’, Logicalis were able to give me lots of examples of where they have delivered solutions and services that take advantage of the benefits driven by cloud, mobility, big data analytics and AI.

Pepper the humanoid robot at the Logicalis stand at IBM Think London

Q. There was a lot of talk about Artificial Intelligence at the show. Can you tell us a bit more?

Everyone is talking about Artificial Intelligence currently and the hype around it in the media is extraordinary. Logicalis recently released its sixth annual global CIO survey, which revealed that nearly a fifth (19%) of CIOs claim their organisations are already using AI. Moreover, 66% said that AI will be in use within their organisations within three years. In reality though, only 4% of organisations have successfully deployed AI.

Exciting times, yet the report also sounded a note of caution about how to make the best use of AI. You need a partner with strong capabilities and know-how to help you implement it correctly. And it’s worth noting that, outside of the IT department, the business area most likely to use AI is customer service (17% say it is in use here). Old-style chat bots have got nothing on AI in its purest form – technology that is able to learn and adapt independently based on context. Like yours truly…

Q. What sectors are driving AI adoption? 

Last year, financial fraud losses in the UK totalled £768 million. AI is a powerful tool that can help financial organisations combat this fraud challenge: from automating risk analysis, to detecting and investigating fraud, to assisting regulatory intelligence and automating IT functions.

AI can also provide retail organisations with a range of benefits including personalised shopping experiences, dynamic pricing or the use of real time tracking to optimise logistics.

Given the potential of AI, it’s not surprising that many businesses want to introduce it into their organisation to increase competitive advantage.

Thank you for chatting with us and giving us an insight into the event Pepper!

Want to learn more about implementing AI?

Join our live webinar on Wednesday 19th December at 11am to hear Justin Price, AI Lead and Chief Data Scientist, and Scott Hodges, Solutions Architect, discuss how to deliver a scalable AI strategy.

You will discover:

  • The main challenges businesses face in implementing AI 
  • How to put AI into practice inside your organisation
  • Powerful Artificial Intelligence and machine learning use cases
  • Practical advice for an AI-ready infrastructure

Register now



IT Leaders' Summit

Richard Simmons
October 25, 2018

Last week, we sponsored Computing’s annual IT Leaders’ Summit which took place at Carlton House, London. The summit provided an opportunity for senior IT executives across all industries to discuss how they can drive digital transformation in areas such as the cloud and Artificial Intelligence.

Richard Simmons, our Head of European AI & IoT Practice, presented to a packed house and hosted a round table; here’s a quick recap of our day.

The hype of AI

Artificial Intelligence has become the latest focal point in the conversation around data insights. Yet, contrary to the volume of noise surrounding it, industry studies suggest only 4% of organisations have actually deployed AI*, with most of these businesses still in the early phases of AI adoption and facing unexpected challenges. Richard took to the stage to bust some myths and provide the facts about AI that have been overlooked in the hype.

“AI is not the answer to everything. It needs more data, it requires more resources, you need the right foundations in place and your infrastructure has to be ready, otherwise there is a huge amount of inefficiency,” he said. “You cannot underestimate the time it will take to develop and build. It can take weeks, even months, to get an AI strategy up and running, and a massive 80% of an AI project’s time is spent in the data preparation phase.”

Richard also highlighted the extremely long training times required. This is partly due to the different sets of skills needed to fine-tune and deploy a model – the skill set of the person managing an AI project will be vastly different to the skill set of the person building it.

“After you have worked on a business strategy, then spent a long time preparing the data, you have to experience all that pain again. Because the more data you give an AI project, the more accurately it performs. If you want to really drive value from the data you have, the project is never-ending. AI is not a quick fix.”

Opening the discussion

Following Richard’s presentation, our over-subscribed (it’s almost like it’s a hot topic!) IT leaders round table began, where we discussed the key factors and approaches to be considered to deliver a scalable AI strategy.

The discussion started with the question ‘Who on this table has deployed an AI project in their company?’ Out of 29 people, only five IT leaders said yes, and those who did reinforced that their projects were in the early phases, starting small before scaling up.

“It can take a lot of time with very little return at the start of an AI strategy. So, it can be hard to encourage the rest of the business to support the project when they can’t see the rewards. This is why starting modest and breaking it down into smaller projects can help. You don’t want to bite off more than you can chew,” agreed Richard.

Sharing is caring

We are currently seeing a great push for data sharing across businesses, a concern that was raised during the discussion. As Richard said, AI operates most efficiently when it has been fed a lot of data, so it would make sense for businesses to share already processed and interpreted data with others in a similar sector. So why aren’t businesses doing this?

“There isn’t always a desire to share what you’ve worked so hard on. If you do share your data, there is a huge risk that the person you’ve shared it with will implement it and use the data better. If you share your work, get ready for the competition to begin.”

Who owns the data?

One topic that dominated the round table was the ownership of data within a business. Many of those who sat at the table expressed the desire to be a data-run business, but getting to that stage wasn’t as easy as they hoped. According to Richard, if a company wants to be data led and to use AI efficiently, there needs to be a “new culture built internally. Every part of the business needs to work with data in mind, not just the IT department and those involved with the AI process.”

This is where data ownership is a necessity. It was mentioned that whilst employees may be really interested in deploying AI, no one was excited about the management, governance and upkeep of the data needed for the AI to work efficiently! To combat this, one person said that their company recently wrote a data strategy – from compliance, to governance, to how the company values and uses data – in order for every employee to be on the same page.

As a final note, Richard said: “Getting every part of the business on board is vital, but it will take work. This culture change to a data-driven enterprise will not happen overnight; it should be ongoing, just like the AI project itself.”

Is your business ready for AI? Is your infrastructure? Find out how Logicalis UK and IBM can help you overcome AI infrastructure limitations and access an IDC expert infrastructure readiness report – Finance or Retail and Manufacturing.

Logicalis UK would like to thank Computing for hosting us at the IT Leaders’ Summit and to those who joined us at our panel session and the round table.


Ismail El Kadiri
April 4, 2018

Over the past decade or so, numerous planning and analytics solutions have emerged in an effort to keep up with an increasingly complex business environment. Most solutions compete on speed, scalability, visualisation capabilities, scenario modelling and Excel integration. Our recent Global CIO Survey revealed that analytics is still considered ‘very important’ or ‘critical’ for driving innovation and decision-making across the business.

Traditionally, planning tools have been aimed at the Finance department. Budgeting and forecasting needs, P&Ls, balance sheets and cash flows have been the bread and butter of planning and reporting solutions. However, this only scratches the surface of what can be achieved in the world of business planning. We are in an era when a truly successful planning practice is not solely based upon finance-focused analytics, but also includes customer, sales performance and workforce analytics.

Planning and analytics for the entire business

Although Finance is usually the right strategic area to begin implementing any planning solution, it should just serve as a starting point. A truly successful planning solution should incorporate your operational planning, giving you a more accurate and all-encompassing view of your business.

Apart from Finance, almost all business functions can benefit from agile planning processes and data analytics with payroll, sales and asset management at the top of the list.

Payroll analytics to decrease manual work

Payroll planning can be a very complex and frequently manual task for modern organisations. HR employees responsible for payroll face multiple components that add to the complexity of the payroll process, such as NI adjustments, complex bonus schemes, salary increases and benefits. Taking into consideration ongoing government changes, regulatory updates and HR-related modifications, payroll can prove to be a stressful and time-consuming process.

The most effective way to evolve a historically manual process and increase the speed and accuracy of payroll planning is through data. By taking advantage of analytics, you can create timely, reliable payroll plans that put employee and business insight into action. You benefit from faster processes, a uniform view of the data and simplified analytical steps that HR employees might not otherwise be able to execute.
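As a toy illustration of the kind of per-employee calculation a payroll plan rolls up – the bonus rule and NI rate below are placeholders, not statutory figures:

```python
def payroll_cost(salary, bonus_rate=0.05, ni_rate=0.138):
    """Total employer cost for one employee: salary, plus a bonus,
    plus employer NI on both. Rates are illustrative placeholders."""
    bonus = salary * bonus_rate
    ni = (salary + bonus) * ni_rate
    return round(salary + bonus + ni, 2)

# Roll a team's salaries up into a planned annual cost in one pass.
salaries = [30000, 42000, 55000]
team_cost = sum(payroll_cost(s) for s in salaries)
print(payroll_cost(30000))  # 35847.0
print(team_cost)
```

The value of a planning tool is that these rules live in one governed place, so a change to a rate or scheme updates every plan line at once instead of dozens of spreadsheets.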

Accurate sales forecasting with planning analytics

Sales is another department of an organisation that can greatly benefit from planning analytics. For most organisations, sales planning and forecasting is their lifeblood, as it directs the efforts of each department and helps define the overall strategy. Therefore, it is crucial to set realistic and accurate targets based on existing data.

With agile planning and analytics, businesses today can forecast sales volumes and adjust cost and price centrally to see the bottom-line impact of the Sales department. More than any other part of the organisation, this is the ideal area to take advantage of seasonality forecasting, what-if scenario modelling and phasing. The result is successfully steered sales activity, maintained margins and value delivered both to the client and to the business.
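To picture seasonality forecasting with a what-if lever at its very simplest, here is a seasonal-naive sketch – all figures invented – that projects next year’s monthly volumes from last year’s pattern, scaled by an assumed growth scenario:

```python
def seasonal_naive_forecast(monthly_sales, growth=1.0):
    """Project the next cycle by repeating last cycle's seasonal
    pattern, scaled by an assumed growth factor (the what-if lever)."""
    return [round(v * growth, 2) for v in monthly_sales]

last_year = [120, 95, 110, 130, 150, 170, 160, 155, 140, 135, 125, 180]
# What-if scenario: plan next year on last year's shape plus 10% growth.
plan = seasonal_naive_forecast(last_year, growth=1.10)
print(plan[0])  # 132.0
print(sum(plan))
```

Dedicated planning tools go far beyond this baseline, but every what-if scenario ultimately works the same way: hold the pattern, vary the assumption, and compare outcomes.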

Asset Management simplified through planning analytics

Often the biggest hurdle that companies face when managing their assets is the volume of data that needs to be collected, analysed and maintained. Increasing cost pressure, complex structures in supply chains and rising risks due to complex procurement mechanisms are just part of the challenge for modern businesses.

Effective and flexible networking of data is crucial in order to make fast and accurate decisions. With advanced planning and analytics, organisations can apply profiles to the assets to plan for depreciation and asset control.
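As a worked example of the depreciation planning mentioned above, here is the standard straight-line method (figures are illustrative):

```python
def straight_line_depreciation(cost, salvage, useful_life_years):
    """Year-end book values under straight-line depreciation:
    the same charge is written off each year down to salvage value."""
    annual_charge = (cost - salvage) / useful_life_years
    return [round(cost - annual_charge * year, 2)
            for year in range(1, useful_life_years + 1)]

# A £10,000 asset written down to a £1,000 salvage value over 3 years.
print(straight_line_depreciation(10000, 1000, 3))  # [7000.0, 4000.0, 1000.0]
```

Applying a profile like this to thousands of assets at once – and re-running it when assumptions change – is exactly where a planning layer earns its keep.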

At Logicalis, we take a holistic approach to planning analytics, moving beyond finance and helping you make data-driven decisions for the entire business.

Talk to our team of experts and discover how to start your planning journey.



Tim Wadey
January 28, 2018

Data Privacy in the spotlight!

Data Privacy Day may not be an official holiday for your IT department, but it definitely should remind you that you need to focus and do more to protect confidential data.

Data Privacy Day was first introduced in 2009 by the Online Trust Alliance (OTA) in response to the increasing number of cybersecurity attacks and data privacy breaches, emphasising the need for effective data protection regulations and transparent processes for collecting and using personally identifiable information (PII).

Examples of PII that fall under data protection regulations are:
• Name;
• Social Security number, full and truncated;
• Driver’s license and other government identification numbers;
• Citizenship, legal status, gender, race/ethnicity;
• Birth date, place of birth;
• Biometrics;
• Home and personal cell telephone numbers;
• Personal email address, mailing and home address;
• Security clearance;
• Financial information, medical information, disability information;
• Law enforcement information, employment information, educational information

If one considers the sources PII can be collected from, and how many new ones are added on a daily basis – big data, the Internet of Things, wearable technology – it is easy to understand why data privacy has become increasingly challenging. And let’s not forget ransomware attacks, the latest major data privacy challenge.

Despite the scale of the recent ransomware attacks, the majority of organisations still don’t have structured processes in place to prepare themselves and keep confidential data safe. Although there are effective steps for protection against ransomware threats, the number of attacks has increased significantly, and companies delay announcing breaches for fear of negative publicity.

In order to stop such incidents and improve current data privacy practices, the European Union is introducing the General Data Protection Regulation (GDPR), taking effect in May 2018. This is the biggest shake-up of data protection laws in the last 20 years.

What is GDPR?

GDPR is a new legal framework across the EU that aims to increase data privacy for individuals, and gives regulatory authorities greater power to take action against businesses that breach the new data privacy laws. GDPR also introduces rules relating to the free movement of personal data within and outside the EU.
In particular, GDPR involves:
• Consent for processing personal data must be obtained clearly and through an affirmative response.

• Data subjects have the right to be forgotten and erased from records.

• Users may request a copy of personal data in a portable format.

• Parental consent is required for the processing of personal data of children under the age of 16.

As a result, organisations need to be acutely aware of these changes, as non-compliance carries very strict penalties. Can your organisation afford a fine of up to €20 million, or 4% of annual global revenue, for failing to meet the new General Data Protection Regulation?
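The penalty ceiling is the greater of the fixed cap and the revenue percentage, which is simple to express:

```python
def max_gdpr_fine(annual_global_revenue, cap=20_000_000, pct=0.04):
    """Upper bound of a GDPR fine: the greater of the fixed cap
    and 4% of annual global revenue."""
    return max(cap, annual_global_revenue * pct)

# For a 2bn-revenue business the 4% figure (80m) dwarfs the fixed cap.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
print(max_gdpr_fine(100_000_000))    # 20000000
```

In other words, the larger the business, the further beyond the headline cap the exposure can grow.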


24% are unaware of the incoming data protection legislation, while one in three companies believe that GDPR isn’t relevant to them.*

Get Started with a GDPR Readiness Assessment

In response to the fast-approaching data protection regulation, the Logicalis UK Advisory Services team has developed a GDPR Readiness Assessment that will allow us to help you understand and frame your thoughts on your journey to compliance.

The Logicalis GDPR Readiness Assessment will help you answer a key question: where am I on my journey to data privacy compliance today? By investigating elements of your organisational landscape, we will produce an ‘as is’ assessment, gauging where you are on a standardised maturity curve across all things cybersecurity and data protection.

Get in touch with our Advisory Services to discuss how we can help you in your journey to GDPR Readiness.



*London Chamber of Commerce and Industry

Team Logicalis
September 27, 2017

Throughout history, I don’t believe we’ve ever seen as much change as we’re seeing now in the world of technology! Just think: in 10 years we’ve had more iPhone releases than Henry VIII had wives.

Taking a page out of the tech giants’ books, from Apple to Salesforce, it’s clear that innovation is at the centre of what enables the industry to move at the pace it does. It would be fair to say that three major trends currently dominate the industry:

1. Service, service, service – Many big players in the hardware product space recognise hardware is fast becoming a vanilla commodity. Vendors such as Cisco, Oracle, Ericsson, Nokia and HP have been scrambling over a number of years to enable value-added services on top of the hardware to increase margins.

“Services are enabled by the specific knowledge, skills and experience you bring to the table, which often drives business value through improved margins.”

Sometimes, when I think about how you can build the brand of service you deliver to customers, I like to compare it to food (one of my favourite subjects).

What keeps you going back to your favourite restaurant? Take McDonald’s, for instance. It could be the quality of the food, but ultimately you KNOW you will get fast, efficient service and a smile when they ask ‘would you like fries with that?’. The point being, it’s the trusted customer experience that underpins successful services – remember this bit, I’m going to come back to it later on.

2. Business process design driven by cost reduction, optimisation and automation – Ultimately, we use technology to make our lives simpler. Traditional IT has become so entrenched in complexity, and with that has come high cost. Businesses of all sizes are scrutinising their balance sheets and seeking to use the benefits of IT innovation to gain a competitive advantage. The principles of globalisation, business process optimisation and automation are all relevant now as we transform traditional IT to achieve the ultimate goal of simplicity.

3. Data-driven customer experience as an investment for the future – Products in the world of data analytics are booming as businesses recognise the power of data in enabling intelligent business decisions. One proven example of boosting business value is how telcos use customer location data to send relevant, targeted marketing text messages.

Imagine you’re at the airport, where intelligent systems pick up your location and send you a text asking if you want to purchase an international data plan while you’re away. Instead of sending random marketing messages, geo-location marketing becomes targeted and relevant. Through this intelligent marketing, telcos have been able to generate 40% more revenue than expected in that portfolio.

Keeping up with the pace of change within the industry can be overwhelming unless you harness the key themes mentioned above and relate them to business value. Contact Logicalis today to learn how you can implement an agile business model and use its benefits to increase your business value.

Team Logicalis
August 21, 2017

It’s always the same scenario: someone giving me some data files that I just want to dive straight into and start exploring ways to visually depict them, but I can’t.

I’d fire up a reporting tool only to step right back, realising that for data to get into visual shapes, it needs to be in shape first! One pattern has appeared consistently over the years: the more time spent on ETL/ELT (Extract, Transform and Load, in varying sequences) up front, the less quickly you find yourself bounced from the reporting layer back to data prep.

Data preparation for the win

‘80% of time goes into data prep’ and ‘garbage in, garbage out (GIGO)’ are sayings that have been around for some time, but they don’t actually hit home until you face them in practice and they suddenly translate into backward progress. Data quality issues vary from date formats and multiple spellings of the same value to values not existing at all, in the form of nulls. So, how can they all be dealt with? A data prep layer is the answer.
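A tiny, standard-library-only sketch of the fixes a data prep layer applies to exactly these issues – the date format and spelling map below are illustrative:

```python
from datetime import datetime

# Illustrative mapping: many spellings, one canonical value.
SPELLING_MAP = {"U.K.": "UK", "United Kingdom": "UK", "uk": "UK"}

def clean_record(raw_date, raw_country):
    """Normalise a d/m/Y date string to ISO format, unify alternative
    spellings of the same value, and turn empties into explicit None."""
    date = (datetime.strptime(raw_date, "%d/%m/%Y").date().isoformat()
            if raw_date else None)
    country = SPELLING_MAP.get(raw_country, raw_country) or None
    return date, country

print(clean_record("05/01/2017", "U.K."))  # ('2017-01-05', 'UK')
print(clean_record("", ""))                # (None, None)
```

Three lines of logic, yet this is exactly the sort of work that quietly consumes the infamous 80%.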

Often, with complex transformations or large datasets, analysts find themselves turning to IT to perform the ETL process. Thankfully, over the years, vendors have recognised the need to include commonly used transformations in the reporting tools themselves. To name a few, tools such as Tableau and Power BI have successfully passed this power on to analysts, making time-to-analysis a flash. Features such as pivoting, editing aliases, and joining and unioning tables are available within a few clicks.

There may also be times when multiple data sources need joining, such as matching company names. Whilst Excel and SQL fuzzy look-ups have existed for some time, dedicated ETL tools such as Paxata have embedded further intelligence, enabling them to go a step further and recognise that the solution lies beyond simply having similar spellings between the names.
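Python’s standard library is enough to illustrate the basic idea behind fuzzy matching of company names (dedicated tools like Paxata go far beyond this, but the principle is the same); the master list here is invented:

```python
from difflib import get_close_matches

def match_company(name, candidates, cutoff=0.6):
    """Return the closest master-list match for a messy company name,
    or None if nothing is similar enough (case-insensitive)."""
    lowered = {c.lower(): c for c in candidates}
    hits = get_close_matches(name.lower(), list(lowered), n=1, cutoff=cutoff)
    return lowered[hits[0]] if hits else None

master = ["Logicalis Group", "Acme Holdings", "Contoso Ltd"]
print(match_company("Logicalis Grp", master))      # Logicalis Group
print(match_company("ACME Holdings Ltd", master))  # Acme Holdings
print(match_company("Zzzz", master))               # None
```

The `cutoff` threshold is the important dial: too low and you merge different companies, too high and genuine variants slip through unmatched.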

All the tasks mentioned above are for the ‘T’ (Transform) of ETL, and that is only the second or third step in the ETL/ELT process! If data can’t be extracted as part of the ‘E’ in the first place, there is nothing to transform. When information lies in disparate silos, it often cannot be merged unless the data is migrated or replicated across stores. Following the data explosion of the past decade, Cisco Data Virtualisation has gained traction for its core capability of creating a merged virtual layer over multiple data sources, enabling quick time-to-access as well as the added benefits of data quality monitoring and a single version of the truth.

These capabilities are now even more useful with the rise of data services like Bloomberg and forex feeds, and APIs that can return weather information; if we also want to know how people feel about the weather, the Twitter API works too.

Is that it..?

Finally, after the extraction and transformation of the data, the load process is all that remains… but even that comes with its own challenges: load frequencies; load types (incremental vs. full loads) depending on data volumes; change data capture (changing dimensions) to give an accurate picture of events; and storage and query speeds at the source, to name a few.
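A minimal sketch of the incremental-vs-full distinction using a high-water-mark timestamp (the row shape is invented for illustration):

```python
def incremental_load(source_rows, target, last_loaded_at):
    """Append only rows newer than the high-water mark and advance it.
    Rows are (updated_at, payload) tuples; a full load would instead
    replace `target` with every source row regardless of timestamp."""
    new_rows = [row for row in source_rows if row[0] > last_loaded_at]
    target.extend(new_rows)
    return max((row[0] for row in new_rows), default=last_loaded_at)

warehouse = []
source = [(1, "order-a"), (2, "order-b"), (3, "order-c")]
mark = incremental_load(source, warehouse, last_loaded_at=1)
print(warehouse)  # [(2, 'order-b'), (3, 'order-c')]
print(mark)       # 3
```

The trade-off is the classic one: incremental loads move far less data, but the high-water mark and change capture have to be trustworthy, or you silently miss rows.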

Whilst a capable analyst with best-practice knowledge will suffice for quick analysis, scalable, complex solutions will need the right team from both the IT and non-IT sides, in addition to the tools and hardware to support it going forward smoothly. Contact us today to help you build a solid Data Virtualisation process customised to your particular needs.

Richard Simmons
June 20, 2017

I have a confession to make: I love to read. Not just an occasional book on holiday or a few minutes on the brief, or often not so brief, train journey into and out of London, but all the time. There has never been a better time for those with a love of reading! The rise of digital media means that not only can you consume it pretty much anywhere at any time, but, more importantly, it is making it easier for more people to share their ideas and experience.

Recently I came across a book called “Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations” by Pulitzer Prize winner Thomas L. Friedman, which I not only found fascinating to read but which has also helped to shape and change the way I view many of the challenges we face, both in business and in our personal lives. The premise of the book is that Friedman would often arrange to meet people for breakfast early in the morning, to do interviews or research stories, but occasionally these people would be delayed. These moments, rather than being a source of frustration, became time he actually looked forward to, as they allowed him simply to sit and think. Looking at the world, he believes we are living through an age of acceleration driven by constant technology evolution, globalisation and climate change, and he argues that these, combined, are the cause of many of the challenges we currently face.

The key point about this acceleration is that it is now reaching a level at which society and people are struggling to adapt. Within the technology world we talk about disruption a lot: a new business or technology arrives that can disrupt a sector or market, the competition struggles to adapt, and eventually a status quo is resumed. For example, Uber has undoubtedly caused a huge disruption in the world of transport, and governments are currently working through how they can better legislate for this new way of operating. The challenge is that new legislation can take 5-10 years to agree and implement, by which time Uber may well have been replaced by autonomous cars.

So what we are experiencing now is not just disruption but a sense of dislocation: the feeling that no matter how fast we try to change, it is never enough. In this environment, it will be the people, businesses and societies able to learn and adapt the fastest that will be most successful. In business, we are constantly shown how being more agile in this digital world can drive efficiency, generate new business models and allow us to succeed, but I feel what is often lacking is guidance on how to get there. We have a wealth of different technology that can support a business, but what is right for me? What should I invest in first? And how do I make sure that I maximise the value of that investment?

My experience with many of our customers is that they understand the challenges and the opportunity, but simply do not have the time to think and plan. When they do have time, the amount of choice can be overwhelming and, frankly, daunting. In a small way, this is the same challenge I face when looking for new books to read: I can go online, but with so much to choose from, how will I know what I will enjoy? The opportunity digital media provides, with more authors and more content, can actually make finding and choosing something valuable much harder.

At Logicalis, we understand the business challenges you face and will discuss with you the different technology options that could support you, recommending those that can deliver the biggest value in the shortest time frame. Contact us to find out how we can help you keep up to speed with emerging technology and use it to your benefit.

Fanni Vig
April 20, 2017

Finally, it’s out!

With acquisitions like Composite, ParStream, Jasper and AppDynamics, we knew something was bubbling away in the background for Cisco with regards to edge analytics and IoT.

Edge Fog Fabric – EFF

The critical success factor for IoT and analytics solution deployments is to provide the right data, at the right time, to the right people (or machines).

With the exponential growth in the number of connected devices, the marketplace requires solutions that simultaneously provide data-generating devices, communication, data processing and data-leveraging capabilities.

To meet this need, Cisco recently launched a software solution (predicated on hardware devices) that encompasses all of the above capabilities, named Edge Fog Fabric, aka EFF.

What is exciting about EFF?

To implement high-performing IoT solutions that are cost effective and secure, a combination of capabilities needs to be in place.

  • Multi-layered data processing, storage and analytics – given the rate of growth in the number of connected devices and the volume of data, bringing data back from devices to a DV environment can be expensive. Processing information on the EFF makes this a lot more cost effective.
  • Micro services – a standardised framework for data processing and communication services that can be programmed in standard languages such as Python, Java, etc.
  • Message routers – effective communication links between the various components and layers. Without state-of-the-art message brokerage, no IoT system could be secure and scalable in providing real-time information.
  • Data leveraging capabilities – ad hoc, embedded or advanced analytics capabilities to support BI and reporting needs. With the acquisitions of Composite and AppDynamics, EFF will enable an IoT platform to connect to IT systems and applications.
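To make the multi-layered processing point concrete, here is a sketch of the kind of aggregation an edge micro service might perform so that only summaries, rather than every raw sample, cross the network (names and data are invented):

```python
def edge_summarise(readings, window=4):
    """Collapse each window of raw readings into (min, max, mean) so
    only summaries, not every sample, leave the edge device."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append((min(chunk), max(chunk), sum(chunk) / len(chunk)))
    return summaries

# Eight raw temperature samples shrink to two summary tuples.
samples = [21, 22, 20, 21, 36, 36, 35, 36]
print(edge_summarise(samples))  # [(20, 22, 21.0), (35, 36, 35.75)]
```

Cutting the payload at the edge like this is what makes the economics work when device counts run into the millions.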

What’s next?

Deploying the above is no mean feat. According to Gartner’s view of the IoT landscape, no organisation has yet achieved the panacea of connecting devices to IT systems and vice versa, combined with the appropriate data management and governance capabilities embedded. So there is still a long road ahead.

However, with technology advancements such as the above, I have no doubt that companies and service providers will be able to accelerate progress and deliver further use cases sooner than we might think.

Based on this innovation, two obvious next steps are:

  • Further automation – automating communication, data management and analytics services including connection with IT/ERP systems
  • Machine made decisions – once all connections are established and the right information reaches the right destination, machines could react to information that is shared with ‘them’ and make automated decisions.

Scott Hodges
April 18, 2017

Attending a recent IBM Watson event, somebody in the crowd asked the speaker, “So, what is Watson?” It’s a good question – and one there isn’t really a straightforward answer to. Is it a brand? A supercomputer? A technology? Something else?

Essentially, it is an IBM technology that combines artificial intelligence and sophisticated analytics to provide a supercomputer named after IBM’s founder, Thomas J. Watson. While interesting enough, the real question, to my mind, is this: “What sort of cool stuff can businesses do with the very smart services and APIs provided by IBM Watson?”

IBM provides a variety of services, available through Application Programming Interfaces (APIs), that developers can use to take advantage of the cognitive elements and power of Watson. The biggest challenge in taking advantage of these capabilities is to “think cognitively” and imagine how they could benefit your business or industry to give you a competitive edge – or, for not-for-profit organisations, how they can help you make the world a better place.

I’ve taken a look at some of the APIs and services available to see some of the possibilities with Watson. It’s important to think of them collectively rather than individually, as while some use-cases may use one, many will use a variety of them, working together. We’ll jump into some use-cases later on to spark some thoughts on the possibilities.

Natural Language Understanding

Extract meta-data from content, including concepts, entities, keywords, categories, sentiment, emotion, relations and semantic roles.


Discovery

Identify useful patterns and insights in structured or unstructured data.


Conversation

Add natural language interfaces such as chat bots and virtual agents to your application to automate interactions with end users.

Language Translator

Automate the translation of documents from one language to another.

Natural Language Classifier

Classify text according to its intent.

Personality Insights

Extract personality characteristics from text, based on the writer’s style.

Text to Speech and Speech to Text

Process natural language text to generate synthesised audio, or render spoken words as written text.

Tone Analyser

Use linguistic analysis to detect the emotional (joy, sadness, etc.), linguistic (analytical, confident, etc.) and social (openness, extraversion, etc.) tone of a piece of text.

Trade-off Analytics

Make better choices when analysing multiple, even conflicting goals.

Visual Recognition

Analyse images for scenes, objects, faces, colours and other content.

All this is pretty cool stuff, but how can it be applied to work in your world? You could use the APIs to “train” your model to be more specific to your industry and business, and to help automate and add intelligence to various tasks.

Aerialtronics offers a nice example use-case of visual recognition in particular. The company develops, produces and services commercial unmanned aircraft systems, and it teams drones, an IoT platform and Watson’s Visual Recognition service to help identify corrosion, serial numbers, loose cables and misaligned antennas on wind turbines, oil rigs and mobile phone towers. This helps automate the process of identifying faults and defects.

Further examples showing how Watson APIs can be combined to drive powerful, innovative services can be found on the IBM Watson website’s starter-kit page.

At this IBM event, a sample service was created, live in the workshop. This application would stream a video, convert the speech in the video to text, and then categorise that text, producing an overview of the content being discussed. The application used the speech-to-text and natural language classifier services.
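The shape of that workshop pipeline can be sketched with stand-in stages; note that `transcribe` and `classify` below are simple stubs standing in for the Watson speech-to-text and Natural Language Classifier services, not real API calls:

```python
def transcribe(audio_chunks):
    """Stand-in for the speech-to-text stage: the live demo called
    Watson here; this stub just joins pre-transcribed chunks."""
    return " ".join(audio_chunks)

def classify(text, categories):
    """Stand-in for the classifier stage: score each category by
    keyword hits and return the best match."""
    words = text.lower().split()
    scores = {cat: sum(word in keywords for word in words)
              for cat, keywords in categories.items()}
    return max(scores, key=scores.get)

categories = {
    "security": {"breach", "firewall", "ransomware"},
    "analytics": {"data", "forecast", "model"},
}
chunks = ["the speaker covered data", "preparation and forecast models"]
print(classify(transcribe(chunks), categories))  # analytics
```

The real services replace each stub with a trained model behind an API, but the pipeline shape – audio in, text out, category out – is exactly this.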

Taking this example further with a spot of blue sky thinking, for a multi-lingual organisation, we could integrate the translation API, adding the resulting service to video conferencing. This could deliver near real-time multiple dialect video conferencing, complete with automatic transcription in the correct language for each delegate.

Customer and support service chatbots could use the Conversation service, paired with the Tone Analyser to gauge how customers feel. Processes such as flight booking could be fulfilled by a virtual agent using the Natural Language Classifier to derive the intent of the conversation. Visual recognition could be used to identify production-line issues, spoiled products in inventory or product types in retail environments.

Identification of faded colours or specific patterns within scenes or on objects could trigger remedial services. Detection of human faces, their gender and approximate age could help enhance customer analysis. Language translation could support better communication with customers and others in their preferred languages. Trade-off Analytics could help optimise the balancing of multiple objectives in decision making.

This isn’t pipe-dreaming: the toolkit is available today. What extra dimensions and capabilities could you add to your organisation, and the way you operate? How might you refine your approach to difficult tasks, and the ways you interact with customers? Get in contact today to discuss the possibilities.

Team Logicalis
December 12, 2016

In the year of Brexit and Trump, Scott Reynolds, Hybrid IT practice lead, predicts that electronic digital crime will explode, data privacy breaches will claim scalps, automation will be 2017’s buzzword and the open source movement will challenge profit-making business models in his 2017 tech predictions.

It’s the time of year to engage in the oh-so risky game of making predictions for what is going to be hot for our customers in the coming year. Risky, because stunning twists and turns can take us off course at any point.

After the Brexit referendum and US election results confounded political and polling pundits, life's certainties appear far less certain, and identifying the big winners in 2017 suddenly seems a less straightforward affair. But as a person who doesn't mind living life on the edge, I thought I'd take a punt anyway. Here are my top tech predictions for 2017.

Security Breaches – the worst is yet to come

Based on the number of high profile data breaches, 2016 hasn’t been a great year for digital. The stable door has been left open by companies and government departments around the world. Armies of Terminator 2 Cyborgs in the guise of home CCTV cameras are attacking the very infrastructure of the internet. I fear we’re only at the beginning of an escalation of electronic digital crime.

2017 will test the nerve of governments, businesses, citizens and consumers and challenge the perception of digital as a safe and secure way of doing business, unless there’s a massive investment in Fort Knox equivalent defences and white hat skills.

Data Privacy

The General Data Protection Regulation (GDPR), emanating from Europe, is going to hurt businesses that don't take data privacy seriously. That is a problem, as evidence suggests companies are unaware of their obligations under this new punitive legislative regime and are taking too long to get to grips with it.

It’s highly possible that fines of up to 4% of global turnover will put some companies out of business in 2017, and beyond.

One Small Change for Mankind, One Giant Leap Forward for Automation

The IT industry is about to enter a time of mass automation – about time, too. To our shame, we’ve lagged behind other industries. You can now buy a car that can park itself with the touch of a button, but you need 24 buttons to change the configuration of a router.

Increased levels of automation will manifest themselves in robotic decision making, in the automation of security systems to guard against and respond to an avalanche of security threats, and in the automated, software-defined provisioning of resources in the data centre and network.
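
As a toy illustration of what automated provisioning looks like in practice, the sketch below renders router interface configuration from a data description rather than hand-typed commands (the interface names, descriptions and addresses are invented for the example):

```python
# Minimal sketch of template-driven provisioning: describe the intent as
# data, render the device configuration automatically. Interface names
# and addresses are invented for the example.

INTERFACE_TEMPLATE = (
    "interface {name}\n"
    " description {description}\n"
    " ip address {ip} {mask}\n"
    " no shutdown"
)

def render_config(interfaces):
    """Render a config stanza for each interface definition."""
    return "\n!\n".join(INTERFACE_TEMPLATE.format(**i) for i in interfaces)

desired_state = [
    {"name": "GigabitEthernet0/1", "description": "uplink-to-core",
     "ip": "10.0.0.1", "mask": "255.255.255.252"},
    {"name": "GigabitEthernet0/2", "description": "branch-lan",
     "ip": "192.168.10.1", "mask": "255.255.255.0"},
]

print(render_config(desired_state))
```

The same desired-state description can drive hundreds of devices, which is precisely the point: change the data once, regenerate everywhere, no button-pressing required.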

Things Can Only Get Bigger

The Internet of Things is going to get bigger and more impactful. Gartner Group is still predicting that by 2020 there will be 50 billion things connected to the internet – that’s only three years away. In 2017, expect to see mass engagement by businesses in all sectors.

Hopefully we’ll move on from talking about the connected fridge that can order more lettuce when you run out, and recognise that IoT will fundamentally change how industries and organisations operate.

Open Source – Somebody wants to do everything you do for free

Somebody, somewhere, is trying to do what you charge for at much lower cost, or even for free. Open isn’t a thing, it’s a movement. We’re already seeing open source technologies impact our industry, with OpenStack becoming the operating system of choice for companies not wanting to pay for mainstream software. Open technologies in automation, such as Puppet and Chef, now have a groundswell of support, with evangelical communities built around delighting users rather than turning a profit.

We’ve also witnessed a growing willingness to embrace Open Computing technologies. Now, Open isn’t without its complications and ultimately nothing in life is free – operating an open environment is still a complicated affair. But I think we’ll see a lot more traction, with many of our customers taking Open Source seriously, over the next 12 months.

2017 Tech predictions – a risky game

So, those are my top five tech trends for 2017. Now you’re probably asking: how could I overlook analytics? I haven’t. I fully acknowledge that analytics and data are core to all of the above; they will need to be embedded in the very fabric of a business to bring my predictions to fruition. Otherwise, you can disregard everything I just said. As I said, making predictions is a risky game.

Alastair Broom
December 10, 2016

I was recently asked what I think will be three things making an impact on our world in 2017, with a few permutations of course:

  • A maximum of three technologies that will be significant for enterprises in terms of driving value and transforming business models and operations in 2017
  • Innovations that are most likely to disrupt industries and businesses

I’ve put my three below – it would be great to hear your thoughts and predictions in the comments!

Internet of Things

The Internet of Things is a big one for 2017. Organisations will move from exploring what IoT means for them in theory to rolling out sensors across key opportunity areas and starting to gather data from what were previously “dark assets”. The reason IoT is so important is the amount of data the things will generate, and the new insight this gives to organisations – things like physical asset utilisation and optimisation, and proactive maintenance. Those organisations that take the IoT seriously are going to see their customers, their data, and their opportunities in completely new ways. Being able to add more and more data sources into the “intelligence stream” means decisions are backed by more facts. It’s Metcalfe’s Law – the value of the network is proportional to the square of the number of users. Data is the network, and each thing is another user.
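
That compounding effect is easy to see with a back-of-envelope calculation. Under Metcalfe’s Law the marginal value of each newly connected “thing” keeps growing (the unit-value constant here is arbitrary, purely for illustration):

```python
# Back-of-envelope Metcalfe's Law: network value grows with the square of
# the number of connected nodes, so each new "thing" adds more value than
# the last. The unit-value constant k is arbitrary, for illustration only.

def metcalfe_value(n, k=1.0):
    """Value proportional to n^2 (Metcalfe's Law)."""
    return k * n * n

def marginal_value(n, k=1.0):
    """Extra value contributed by adding one node to an n-node network."""
    return metcalfe_value(n + 1, k) - metcalfe_value(n, k)

# The 1,001st sensor adds far more value than the 11th did.
print(marginal_value(10))    # 21.0
print(marginal_value(1000))  # 2001.0
```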

Being prepared to exploit the IoT opportunity, though, especially at scale, will take proper planning and investment. Organisations will need a strategy to address the IoT, one that identifies quick wins to build the business case for subsequent IoT initiatives. The correct platform is key: an infrastructure for things. The platform that connects the things to the network will need to be robust, will likely be a mix of wired and wireless, and – because it’s unlikely to be a separate infrastructure – will need the visibility and control to ensure data is correctly identified, classified and prioritised.

Security too will be fundamental. Today the things are built for user convenience, with security a secondary concern. What the IoT then represents is a massively increased attack surface, one that is particularly vulnerable to unsophisticated attacks. The network will therefore need to be an integral part of the security architecture.

Edge Analytics

Edge analytics is another one to look out for. As the amount of data we look to analyse grows exponentially, the issue becomes twofold. One: what does it cost to move that data from its point of generation to a point of analysis? Bandwidth doesn’t cost what it used to, but paying to transport terabytes, and potentially petabytes, of information to a centralised data processing facility (the data centre, that is) will add significant cost to an organisation. Two: having to move the data, process it, and then send an action back adds lag. The majority of data we have generated to this point has been for systems of record, where a lag to actionable insight may very well be acceptable. But as our systems change to systems of experience, or indeed systems of action, lag is unacceptable.

Analytics at the edge equates to near real-time analytics. Being able to take data in real time, analyse it in context alongside potentially many other sources, and present back highly relevant, in-the-moment intelligence is a remarkable capability. Organisations once again need to ensure the underlying platform is up to the task: it must capture the right data, maintain its integrity, conform to privacy regulations and manage the data throughout its lifecycle. Technology will be needed to analyse the data at its point of creation; essentially, you will need to bring compute to the data (and not the other way round, as is typical today).
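
A minimal sketch of the idea: the edge node reduces a window of raw sensor readings to a single compact summary (plus any threshold alerts) before anything crosses the network. The field names and alert threshold are invented for the example:

```python
# Sketch of edge analytics: summarise raw sensor readings locally and
# forward only a compact summary (plus any threshold alerts) upstream,
# instead of shipping every raw sample to the data centre.
# The threshold and field names are invented for the example.

def edge_summarise(readings, alert_threshold=75.0):
    """Reduce a window of raw readings to one small upstream message."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

window = [61.2, 64.8, 63.1, 90.5, 62.0]  # raw samples held at the edge
message = edge_summarise(window)
print(message["alerts"])  # [90.5]
```

Five raw samples become one summary message, and the saving scales with the window size; only the anomalous reading triggers an immediate upstream alert, so the lag-sensitive action happens at the edge.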

Cognitive Systems

Lastly, cognitive systems. Computers to this point have been programmed by humans to perform fairly specific tasks. Cognitive systems will not only “learn” what to do from human interaction, but also from the data they generate themselves, alongside the data from other machines. Cognitive systems will continually reprogram themselves, each time getting better and better at what they do. And what computers do is help us do things humans can do, but faster. Cognitive systems will expand our ability to make better decisions – they will help us think better. They move from computing systems essentially built to calculate really fast to systems built to analyse data and draw insights from it. This extends to predicting outcomes based on current information, and the consequences of actions. And because it’s a computer, we can draw insight from a far greater base of information. We humans are really bad at remembering a lot of information at the same time, but computers are (certainly for the short term) constrained only by the amount of data we can hold in memory to present to a compute node for processing.

Alastair Broom
November 30, 2016

As you might have read, back in October 2016 a huge Distributed Denial of Service (DDoS) attack against Dyn, a major domain name system (DNS) provider, broke large portions of the Internet, causing a significant outage to a host of websites and services, including Twitter, GitHub, PayPal, Amazon, Reddit, Netflix, and Spotify.

How did the attack happen? What was the cause behind the attack?

Although exact details of the attack remain vague, Dyn reported that an army of hijacked internet-connected devices is thought to be responsible for the large-scale attack, similar to a method recently employed by hackers to carry out a record-breaking DDoS attack of over 1 Tbps against the French hosting provider OVH.

According to security intelligence firm Flashpoint, Mirai bots were detected driving much, but not necessarily all, of the traffic in the DDoS attacks against Dyn. Mirai is a piece of malware that targets Internet of Things (IoT) devices such as routers, security cameras and DVRs, and enslaves vast numbers of these compromised devices into a botnet, which is then used to conduct DDoS attacks.

This type of attack is notable and concerning because it largely consists of unsecured IoT devices, which are growing exponentially with time. These devices are implemented in a way that they cannot easily be updated and thus are nearly impossible to secure.

Manufacturers focus mainly on the performance and usability of IoT devices but ignore security measures and encryption mechanisms, which is why the devices are routinely hacked and increasingly become part of DDoS botnets, used as weapons in cyber-attacks.

An online tracker of the Mirai botnet suggests there are more than 1.2 million Mirai-infected devices on the Internet, with over 166,000 devices active right now.

IoT botnets like Mirai are growing rapidly, and there is no easy way to stop them.

According to officials speaking to Reuters, the US Department of Homeland Security (DHS) and the FBI are both investigating the massive DDoS attacks hitting Dyn, but neither agency has yet speculated on who might be behind them.

At Logicalis UK, we take a threat-centric approach. We can help customers protect their applications and environments against DDoS attacks with on-premise, cloud-based or hybrid deployments, based on solutions from our partner F5.

F5 provides seamless, flexible, and easy-to-deploy solutions that enable a fast response, no matter what type of DDoS attack you’re under. Together, Logicalis and F5 can:

  • Deliver multi-layered DDoS defense from a single box with a fast-acting, dual-mode appliance that supports both out-of-band processing and inline mitigation, while enabling SSL inspection and guarding against layer 7 app attacks.
  • Stop attacks on your data centre immediately with an in-depth DDoS defense that integrates appliance and cloud services for immediate cloud off-loading.
  • Defeat threats cloaked behind DDoS attacks with unique layer 7 application coverage, without impacting legitimate traffic.
  • Activate comprehensive DDoS defense with less complexity and greater attack coverage than most solutions.

If you would like to find out more about Logicalis’ advanced security practice, please get in touch. Our experts are primed and ready to support you.

Alastair Broom
November 29, 2016

Last week we welcomed a number of our customers – 30 people from 18 different organisations, in fact – to an event held at Sushisamba in London. From the 39th floor, overlooking most of London, I had the privilege of hosting some of our existing and potential clients for a discussion prompted by the upcoming General Data Protection Regulation (GDPR). Over an absolutely fantastic lunch, which thankfully included many tasty meat dishes (I’m no huge fan of raw fish), we talked about how organisations are going to have to rethink their strategy around data governance and security in the face of a very tough new law.

I just wanted to give you a few takeaways from the day, none of which are edible – I’m sorry…

The first of our guest speakers was Lesley Roe, Data Protection Officer from the IET. Lesley spoke about what the IET are doing to get ready for GDPR. They hold a vast amount of personal data, and given that they are advising their membership on all manner of related things, they need to lead by example. Key points from her presentation are:

  • GDPR is about giving people more control over their personal data. Every day we share an extraordinary amount of personal data with all manner of organisations, and this data is valuable. GDPR is about ensuring we retain the rights to that value: what the data is processed for, who can process it, and how it is retained or deleted once its useful life has expired.
  • Everyone has a part to play and training of staff & staff awareness are paramount. This, however, is no mean feat.
  • The process of data governance, and the education of that process throughout the organisation, will be the only way to fully comply with the regulation. How do the IET classify old & new data? How do they manage the lifecycle of the data? How do they make sure they are only obtaining, using and retaining the data they need and have consent for?
  • None of this, however, is possible without first knowing what personal data is within your organisational context, and where it lives.
  • Much of the thinking around GDPR will require a huge shift in the mindset of organisations today. Companies simply do not think about their data assets, and their responsibility for that data, in the spirit of the regulation at all.

Our next two speakers were from two of our technology partners, VMware and Palo Alto Networks. Things to remember here:

  • Technology, without a doubt, has a part to play in ensuring compliance. The regulators are far more savvy to what the art of the possible is in the security market, and they will be expecting organisations to leverage technologies within reach of budgets and according to exposure to best mitigate any risks to rights of individuals.
  • The ability to prevent, detect and report on the nature and extent of any breaches will be very important. Technologies will be needed to prove that organisations can do this effectively and efficiently, especially in the face of stringent reporting requirements.
  • State of the art will really mean state of the art. Regulators will be assessing how organisations are using the best possible mix of technologies to minimise both exposure to risk, and impact of any breach if/when they should occur.

The last presentation was from Ed Charvet and his guest star Ian De Freitas. Ian is part of the alliance we have with legal experts BLP. The joint value proposition Ed and Ian spoke about is what I believe makes us entirely unique in this space:

  • The first step towards compliance is data discovery – what is personal data from the perspectives of both the GDPR and the organisations’ context? Where is the data? How is data currently classified? How is it processed? How are permissions obtained? This is delivered through a mix of manual and automated processes to help customers understand where they stand today.
  • But this process takes time. The regulation comes into force on 25th May 2018, and as Ian made clear, the regulator is taking a “zero day” approach. This effectively means that if you’re not fully compliant on that day, you are non-compliant, and the regulator has every power to come after you. With fines of up to 4% of global group revenues or EUR20 million (whichever is the greater, of course), this is a regulation with teeth – and one with what seems like a very real political agenda. Watch out Facebook, Google, Amazon…
  • Being compliant with the likes of the DPA today, while impressive, would still mean that on day zero you do not comply with the new law.
  • Key questions to ask are: do you have a legitimate interest to process the data? What exactly are you planning to do with it? These will need to be made very clear even before the gathering of data has begun.
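
The discovery step above can be illustrated with a toy sketch: scanning free text for strings that look like personal data. The regexes below are deliberately simplified examples (real discovery tooling is far more sophisticated), but the principle is the same:

```python
import re

# Toy illustration of the data discovery step: flag strings that look like
# personal data. The patterns are deliberately simplified examples, not
# production-grade detectors.

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}"),
    "uk_phone": re.compile(r"\b(?:\+44\s?\d{4}|0\d{4})\s?\d{6}\b"),
}

def find_pii(text):
    """Return the PII categories detected in a block of text."""
    return {label: pattern.findall(text)
            for label, pattern in PII_PATTERNS.items()
            if pattern.search(text)}

record = "Contact Jane at jane.doe@example.com or on 07700 900123."
print(find_pii(record))
```

Run across file shares, mailboxes and databases, even a crude scan like this starts to answer the first GDPR question: what personal data do we hold, and where does it live?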

What became clear throughout the day is that time is tight to reach compliance, and the ICO in the UK seems to be recruiting in earnest to gear up for real enforcement of the law. This feels like something that is going to change how data – and in particular personal rights and data – is valued and protected by the organisations that benefit most from it. What organisations need to do as a matter of urgency is find out what personal data they hold and where they store it. They need to assess their current security infrastructures and find out what gaps exist that could pose a risk of, and ultimately a loss of, personal data. They need to put the right people and procedures in place to comply with new and enhanced rights and tighter reporting deadlines, and they should be working out what Data Protection Impact Assessments need to look like for their organisation to satisfy regulatory requirements.

As a next step, please reach out to either myself, Ed Charvet, Alastair Broom or Jorge Aguilera to discuss how Logicalis can help our customers get ready for GDPR. From the data discovery workshop, to engaging with BLP in legal matters, and technology assessments powered by tools from the likes of VMware and Palo Alto Networks; we really can help customers on the road towards compliance.
