Digitally Speaking
IT Leaders' Summit

Dean Mitchell
October 25, 2018

Last week, we sponsored Computing’s annual IT Leaders’ Summit, which took place at Carlton House, London. The summit provided an opportunity for senior IT executives from all industries to discuss how they can drive digital transformation in areas such as the cloud and Artificial Intelligence.

Richard Simmons, our Head of European AI & IoT Practice, presented to a packed house and hosted a round table; here’s a quick recap of our day.

The hype of AI

Artificial Intelligence has become the latest focal point in the conversation around data insights. Yet, contrary to the volume of noise surrounding it, according to industry studies only 4% of organisations have actually deployed AI*, with most of these businesses still in the early phases of AI adoption and facing unexpected challenges. Richard took to the stage to bust some myths and to provide the facts about AI that have been overlooked in the hype.

“AI is not the answer to everything. It needs more data, it requires more resources, you need the right foundations in place and your infrastructure has to be ready, otherwise there is a huge amount of inefficiency,” he said. “You cannot underestimate the time it will take to develop and build. It can take weeks, even months, to get an AI strategy up and running, and a massive 80% of an AI project’s time is spent in the data preparation phase.”

Richard also highlighted the extremely long training times required. This is partly due to the different skill sets needed to fine-tune and deploy AI – the skill set of the person managing an AI project will be vastly different to that of the person building it.

“After you have worked on a business strategy, then spent a long time preparing the data, you have to experience all that pain again, because the more data you give an AI project, the more accurately it performs. If you want to really drive value from the data you have, the project is never ending. AI is not a quick fix.”

Opening the discussion

Following Richard’s presentation, our over-subscribed (it’s almost like it’s a hot topic!) IT leaders’ round table began, where we discussed the key factors and approaches for delivering a scalable AI strategy.

The discussion started with the question ‘who around this table has deployed an AI project in their company?’. Out of 29 people, only five IT leaders said yes, and those who did stressed that their projects were in the early phases and were starting small before scaling up.

“It can take a lot of time with very little return at the start of an AI strategy. So, it can be hard to encourage the rest of the business to support the project when they can’t see the rewards. This is why starting modest and breaking it down into smaller projects can help. You don’t want to bite off more than you can chew,” agreed Richard.

Sharing is caring

We are currently seeing a great push for data sharing across businesses, a concern that was raised during the discussion. As Richard said, AI operates most efficiently when it has been fed a lot of data, so it would make sense for businesses to share already processed and interpreted data with others in a similar sector. So why aren’t businesses doing this?

“There isn’t always a desire to share what you’ve worked so hard on. If you do share your data, there is a huge risk that the person you’ve shared it with will implement it and use the data better. If you share your work, get ready for the competition to begin.”

Who owns the data?

One topic that dominated the round table was the ownership of data within a business. Many of those who sat at the table expressed the desire to be a data-run business, but getting to that stage wasn’t as easy as they hoped. According to Richard, if a company wants to be data led and to use AI efficiently, there needs to be a “new culture built internally. Every part of the business needs to work with data in mind, not just the IT department and those involved with the AI process.”

This is where data ownership is a necessity. It was mentioned that whilst employees may be really interested in deploying AI, no one was excited about the management, governance and upkeep of the data needed for the AI to work efficiently! To combat this, one person said that their company recently wrote a data strategy – from compliance, to governance, to how the company values and uses data – in order for every employee to be on the same page.

As a final note, Richard said: “Getting every part of the business on board is vital, but it will take work. This culture change to a data-driven enterprise will not happen overnight; it should be ongoing, just like the AI project itself.”

Is your business ready for AI? Is your infrastructure? Find out how Logicalis UK and IBM can help you overcome AI infrastructure limitations and access an IDC expert infrastructure readiness report – Finance or Retail and Manufacturing.

Logicalis UK would like to thank Computing for hosting us at the IT Leaders’ Summit and to those who joined us at our panel session and the round table.


Ismail El Kadiri
April 4, 2018

Over the past decade or so, numerous planning and analytics solutions have come out in an effort to keep up with an increasingly complex business environment. Most solutions compete on speed, scalability, visualisation capabilities, scenario modelling and Excel integration. Our recent Global CIO survey revealed that analytics is still considered ‘very important’ or ‘critical’ for driving innovation and decision-making across the business.

Traditionally, planning tools have been aimed at the department of Finance. Budgeting and forecasting needs, P&Ls, balance sheets and cash flows have been the bread and butter of planning and reporting solutions. However, this only scratches the surface of what can be achieved in the world of business planning. We are in an era when a truly successful planning practice is not solely based upon financial-focused analytics, but also includes customer, sales performance and workforce analytics.

Planning and analytics for the entire business

Although Finance is usually the right strategic area to begin implementing any planning solution, it should just serve as a starting point. A truly successful planning solution should incorporate your operational planning, giving you a more accurate and all-encompassing view of your business.

Apart from Finance, almost all business functions can benefit from agile planning processes and data analytics with payroll, sales and asset management at the top of the list.

Payroll analytics to decrease manual work

Payroll planning can be very complex, and it is frequently a manual task even for modern organisations. HR employees responsible for payroll face multiple components that add to the complexity of the payroll process, such as NI adjustments, complex bonus schemes, salary increases and benefits. Taking into consideration ongoing government changes, regulatory updates and HR-related modifications, payroll can prove to be a stressful and time-consuming process.

The most effective way to evolve a historically manual process and increase the speed and accuracy of payroll planning is through data. By taking advantage of analytics, you can create timely, reliable payroll plans that put employee and business insight into action. You can benefit from faster processes and a uniform view of the data, and simplify analytical processes that HR employees might not otherwise be able to execute.

Accurate sales forecasting with planning analytics

Sales is another department of an organisation that can greatly benefit from planning analytics. For most organisations, sales planning and forecasting is their life-blood – as it directs the efforts of each department and helps define the overall strategy. Therefore, it is crucial to set realistic and accurate targets based on existing data.

With agile planning and analytics, businesses today can forecast sales volumes and adjust cost and price centrally to see the bottom-line impact of the Sales department. More than any other part of the organisation, this is the ideal area to take advantage of seasonality forecasting, what-if scenario modelling and phasing. This will result in successfully steering sales activities, maintaining margins and delivering value, both to the client and the business.

Asset Management simplified through planning analytics

Often the biggest hurdle that companies face when managing their assets is the volume of data that needs to be collected, analysed and maintained. Increasing cost pressure, complex structures in supply chains and rising risks due to complex procurement mechanisms are just part of the challenge for modern businesses.

Effective and flexible networking of data is crucial in order to make fast and accurate decisions. With advanced planning and analytics, organisations can apply profiles to the assets to plan for depreciation and asset control.

At Logicalis, we have a holistic approach to planning analytics, moving beyond finance and helping you take data-driven decisions for the entire business.

Talk to our team of experts and discover how to start your planning journey.



Tim Wadey
January 28, 2018

Data Privacy in the spotlight!

Data Privacy Day may not be an official holiday for your IT department, but it definitely should remind you that you need to focus and do more to protect confidential data.

Data Privacy Day was first introduced in 2009 by the Online Trust Alliance (OTA) in response to the increasing number of cybersecurity attacks and data privacy breaches, emphasising the need for effective data protection regulations and transparent processes for collecting and using personally identifiable information (PII).

Examples of PII that fall under data protection regulations are:
• Name;
• Social Security number, full and truncated;
• Driver’s license and other government identification numbers;
• Citizenship, legal status, gender, race/ethnicity;
• Birth date, place of birth;
• Biometrics;
• Home and personal cell telephone numbers;
• Personal email address, mailing and home address;
• Security clearance;
• Financial information, medical information, disability information;
• Law enforcement information, employment information, educational information

If one considers the sources that PII can be collected from, and how many new ones are added on a daily basis – big data, the Internet of Things, wearable technology – it is easy to understand why data privacy has become increasingly challenging. And let’s not forget ransomware attacks, the latest major data privacy challenge.

Despite the scale of recent ransomware attacks, the majority of organisations still don’t have structured processes in place to prepare themselves and keep confidential data safe. Although there are effective steps for protecting against ransomware threats, the number of attacks has increased significantly, and companies often delay announcing breaches for fear of negative publicity.

In order to stop such actions and improve current data privacy practices, the European Union is introducing the General Data Protection Regulation (GDPR), which takes effect in May 2018. This is the biggest shake-up of data protection laws in the last 20 years.

What is GDPR?

GDPR is the latest data protection law framework across the EU. It aims to increase data privacy for individuals and gives regulatory authorities greater power to take action against businesses that breach the new data privacy laws. GDPR also introduces rules relating to the free movement of personal data within and outside the EU.
In particular, GDPR involves:
• Obtaining consent for processing personal data must be clear and must seek an affirmative response.

• Data subjects have the right to be forgotten and erased from records.

• Users may request a copy of their personal data in a portable format.

• Parental consent is required for the processing of personal data of children under the age of 16.

As a result, organisations need to be extremely aware of these changes, as they can face very strict fines for non-compliance. Can your organisation afford to be fined up to €20 million, or 4% of annual global revenue, whichever is greater, as required by the new General Data Protection Regulation?


24% are unaware of the incoming data protection legislation, while one in three companies believe that GDPR isn’t relevant to them.*

Get Started with a GDPR Readiness Assessment

In response to the fast approaching data protection regulation, Logicalis UK Advisory Services team have developed a GDPR Readiness Assessment that will allow us to help you understand and frame your thoughts on your journey to compliance.

The Logicalis GDPR Readiness Assessment will help you answer a key question – Where am I on my journey to data privacy compliance, today? By investigating elements of your organisational landscape, we will produce an ‘as is’ assessment, where we will be able to gauge where you are on a standardised maturity curve, considering all things around cybersecurity and data protection.

Get in touch with our Advisory Services to discuss how we can help you in your journey to GDPR Readiness.



*London Chamber of Commerce and Industry, 

Sara Sherwani
September 27, 2017

Throughout history, I don’t believe we’ve ever seen as much change as we see today in the world of technology! Just think: in the space of 10 years we’ve had more iPhone releases than Henry VIII had wives.

Taking a page out of some of the tech giants’ books, from Apple to Salesforce, it’s clear that innovation is at the centre of what enables the industry to move at the pace it does. It would be fair to say that three major trends currently dominate the industry:

1. Service, service, service – Many big players in the hardware product space recognise that hardware is fast becoming a vanilla commodity. Vendors such as Cisco, Oracle, Ericsson, Nokia and HP have spent the last few years scrambling to enable value-added services on top of the hardware to increase margins.

 “Services are enabled by the specific knowledge, skills and experience you bring to the table which often drives business value through improved margins.”

Sometimes when I think about how you can build your brand of service that you deliver to customers, I like to compare it to food (one of my favourite subjects).

What keeps you going back to your favourite restaurant? Let’s take McDonald’s, for instance. It could be the quality of the food, but ultimately you KNOW you will get a fast, efficient service and a smile when they ask ‘would you like fries with that?’. The point being, it’s the trusted customer experience that underpins successful services – remember this bit, as I’m going to come back to it later on.

2. Business process design driven by cost reduction, optimisation and automation – Ultimately, we use technology to make our lives simpler. Traditional IT has become entrenched in complexity, and with that has come high cost. Businesses of all sizes are scrutinising their balance sheets and seeking to use the benefits of IT innovation to gain a competitive advantage. The principles of globalisation, business process optimisation and automation are all relevant now as we transform traditional IT to achieve the ultimate goal of simplicity.

3. Data-driven customer experience as an investment for the future – Products in the world of data analytics are booming as businesses recognise the power of data in enabling intelligent business decisions. One proven example of boosting business value is how telcos are using customer location data to send pinpoint, relevant marketing text messages.

Imagine you’re at the airport, where intelligent systems pick up your location and send you a text asking if you want to purchase an international data plan while you’re away. Instead of sending you random marketing messages, geo-location marketing becomes targeted and relevant. Through this intelligent marketing, telcos have been able to generate 40% more revenue than expected in that portfolio.
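The trigger behind that kind of message is essentially a geofence check. As a rough sketch only (the coordinates, radius and function names below are illustrative, not taken from any telco’s system), it can be expressed in a few lines:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Illustrative geofence centre: Heathrow's approximate coordinates.
AIRPORT = (51.4700, -0.4543)

def should_offer_roaming(subscriber_lat, subscriber_lon, radius_km=2.0):
    # Send the targeted text only when the subscriber is inside the geofence.
    return haversine_km(subscriber_lat, subscriber_lon, *AIRPORT) <= radius_km
```

A subscriber at the terminal would trigger the offer; one in central London would not.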

Keeping up with the pace of change within the industry can be overwhelming, unless you harness the key themes that I mentioned earlier which will be sure to relate to business value. Contact Logicalis today to learn how you can implement an agile business model and use its benefits to increase your business value.

Anis Makeriya
August 21, 2017

It’s always the same scenario: someone giving me some data files that I just want to dive straight into and start exploring ways to visually depict them, but I can’t.

I’d fire up a reporting tool only to step right back, realising that for data to get into visual shapes, it needs to be in shape first! One pattern has appeared consistently over the years: the more time you spend up front on ETL/ELT (Extract, Transform and Load, in varying sequences), the less quickly you find yourself bounced out of the reporting layer and back into data prep.

Data preparation for the win

‘80% of time goes into data prep’ and ‘garbage in, garbage out (GIGO)’ are sayings that have existed for some time now, but they don’t actually hit you until you face them in practice and they suddenly translate into backward progress. Data quality issues vary from inconsistent date formats and multiple spellings of the same value to values not existing at all, in the form of nulls. So, how can they all be dealt with? A data prep layer is the answer.
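To make that concrete, here is a minimal sketch of what such a prep layer does to the three issues just mentioned; the records, field names and alias table are all made up for illustration:

```python
from datetime import datetime

# Hypothetical raw extract: inconsistent date formats, spelling
# variants of the same value, and outright nulls.
raw = [
    {"company": "ACME Ltd",  "joined": "2017-03-01", "region": "UK"},
    {"company": "Acme Ltd.", "joined": "01/03/2017", "region": None},
    {"company": "acme ltd",  "joined": None,         "region": "U.K."},
]

DATE_FORMATS = ("%Y-%m-%d", "%d/%m/%Y")
REGION_ALIASES = {"U.K.": "UK", "United Kingdom": "UK"}

def parse_date(value):
    """Try each known format; surface unparseable values as None."""
    if value is None:
        return None
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            continue
    return None

def clean(record):
    return {
        "company": record["company"].rstrip(".").lower(),  # one canonical spelling
        "joined": parse_date(record["joined"]),            # one date type
        "region": REGION_ALIASES.get(record["region"], record["region"]) or "UNKNOWN",
    }

cleaned = [clean(r) for r in raw]
```

After cleaning, all three rows agree that they describe the same company, the two date formats collapse into one, and the null region is flagged explicitly rather than left to break a report downstream.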

Often with complex transformations or large datasets, analysts find themselves turning to IT to perform the ETL process. Thankfully, over the years, vendors have recognised the need to include commonly used transformations in the reporting tools themselves. Tools such as Tableau and Power BI have successfully passed this power on to analysts, making time-to-analysis a flash. Features such as pivoting, editing aliases, and joining and unioning tables are available within a few clicks.

There may also be times when multiple data sources need joining, such as matching company names. Whilst Excel and SQL fuzzy look-ups have existed for some time, dedicated ETL tools such as Paxata have embedded further intelligence, enabling them to go a step further and recognise that the solution lies beyond just having similar spellings between the names.
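As a rough illustration of the idea only (this assumes nothing about Paxata’s internals), a basic fuzzy company-name match can be sketched with Python’s standard library, normalising away punctuation and legal suffixes before comparing spellings so the match rests on more than raw string similarity:

```python
from difflib import SequenceMatcher

def normalise(name):
    # Strip punctuation and common legal suffixes before comparing,
    # so "Acme Ltd." and "Acme Limited" reduce to the same core name.
    name = name.lower().replace(".", "").replace(",", "")
    for suffix in (" ltd", " limited", " plc", " inc"):
        name = name.removesuffix(suffix)
    return name.strip()

def best_match(name, candidates, threshold=0.8):
    """Return the closest candidate above the similarity threshold, else None."""
    scored = [
        (SequenceMatcher(None, normalise(name), normalise(c)).ratio(), c)
        for c in candidates
    ]
    score, match = max(scored)
    return match if score >= threshold else None
```

So `best_match("Acme Ltd.", ["Acme Limited", "Apex Systems"])` finds the right company even though the raw strings differ, while a name with no good candidate returns nothing rather than a bad guess.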

All the tasks mentioned above are for the ‘T’ (Transform) of ETL, which is only the second or third step in the ETL/ELT process! If data can’t be extracted as part of the ‘E’ in the first place, there is nothing to transform. When information lies in disparate silos, it often cannot be ‘merged’ unless the data is migrated or replicated across stores. Following the data explosion of the past decade, Cisco Data Virtualisation has gained traction for its core capability of creating a ‘merged virtual’ layer over multiple data sources, enabling quick time-to-access as well as the added benefits of data quality monitoring and a single version of the truth.

These capabilities are now even more useful with the rise of data services like Bloomberg and forex feeds, and APIs that can return weather info; if we also want to know how people feel about the weather, the Twitter API works too.

Is that it..?

Finally, after the extraction and transformation of the data, the load process is all that remains… but even that comes with its own challenges: load frequencies; load types (incremental vs. full loads) depending on data volumes; change data capture and slowly changing dimensions to give an accurate picture of events; and storage and query speeds at the source, to name a few.
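The incremental-vs-full distinction in particular comes down to keeping a watermark of what has already been loaded. A minimal sketch, with made-up row structures and a simple timestamp watermark (real pipelines would persist the state and handle deletes and updates too):

```python
def incremental_load(source_rows, target, state):
    """Load only rows newer than the last watermark.

    source_rows: iterable of dicts with an 'updated_at' key.
    target: list standing in for the warehouse table.
    state: dict holding the watermark between runs.
    """
    watermark = state.get("last_loaded", 0)
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)  # the load step itself
    if new_rows:
        state["last_loaded"] = max(r["updated_at"] for r in new_rows)
    return len(new_rows)

target, state = [], {}
source = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
incremental_load(source, target, state)           # first run behaves as a full load
source.append({"id": 3, "updated_at": 30})
loaded = incremental_load(source, target, state)  # later runs pick up only new rows
```

The first run is effectively a full load; every subsequent run moves only the delta, which is what keeps load windows short as data volumes grow.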

Whilst a capable analyst with best-practice knowledge will suffice for quick analysis, scalable, complex solutions need the right team from both the IT and non-IT sides, in addition to the tools and hardware to support them going forward smoothly. Contact us today to help you build a solid Data Virtualisation process customised to your particular needs.

Richard Simmons
June 20, 2017

I have a confession to make: I love to read. Not just an occasional book on holiday or a few minutes on the brief, or often not so brief, train journey into and out of London, but all the time. There has never been a better time for those with a love of reading! The rise of digital media means that not only can you consume it pretty much anywhere at any time but, more importantly, it is making it easier for more people to share their ideas and experience.

Recently I came across a book called “Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations” by Pulitzer Prize winner Thomas L. Friedman, which I not only found fascinating to read but which has also helped to shape and change the way I view many of the challenges we face, both in business and in our personal lives. The premise of the book is that Friedman would often arrange to meet people for breakfast early in the morning, to do interviews or research stories, and occasionally these people would be delayed. These moments, rather than being a source of frustration, became time he actually looked forward to, as they allowed him to simply sit and think. Looking at the world, he believes we are living through an age of acceleration, driven by constant technological evolution, globalisation and climate change, and he argues that these combined are the cause of many of the challenges we currently face.

The key point about this acceleration is that it is now reaching a level at which society and people are struggling to adapt. Within the technology world we talk about disruption a lot: a new business or technology arrives that can disrupt a sector or market, the competition struggles to adapt, and eventually a status quo is resumed. For example, Uber has undoubtedly caused a huge disruption in the world of transport, and governments are currently working through how they can better legislate for this new way of operating. The challenge is that new legislation can take 5-10 years to agree and implement, by which time Uber may well have been replaced by autonomous cars.

So what we are experiencing now is not just disruption but a sense of dislocation: the feeling that no matter how fast we try to change, it is never enough. In this environment it will be the people, businesses and societies that are able to learn and adapt the fastest that will be most successful. In business we are constantly shown how being more agile in this digital world can drive efficiency, generate new business models and allow us to succeed, but I feel what is often lacking is guidance on how to get there. We have a wealth of different technology which can support a business, but what is right for me? What should I invest in first? And how do I make sure that I maximise the value of that investment?

My experience with many of our customers is that they understand the challenges and also the opportunity, but simply do not have the time to think and plan. When they do have time, the amount of choice can be overwhelming and actually daunting. In a small way this is the same challenge I face when looking for new books to read: I can go online, but with so much to choose from, how will I know what I will enjoy? The opportunity that digital media provides, with more authors and content, can actually make finding and choosing something you think is valuable much harder.

At Logicalis, we understand the business challenges that you face and will discuss with you the different technology options that could support you, recommending those that can deliver the biggest value in the shortest time frame. Contact us to find out how we can help you keep up to speed with emerging technology and use it to your benefit.

Fanni Vig
April 20, 2017

Finally, it’s out!

With acquisitions like Composite, ParStream, Jasper and AppDynamics, we knew something was bubbling away in the background for Cisco with regards to edge analytics and IoT.

Edge Fog Fabric – EFF

The critical success factor for IoT and analytics solution deployments is to provide the right data, at the right time, to the right people (or machines).

With the exponential growth in the number of connected devices, the marketplace requires solutions that provide data generating devices, communication, data processing, and data leveraging capabilities, simultaneously.

To meet this need, Cisco recently launched a software solution (predicated on hardware devices) that encompasses all the above capabilities and named it Edge Fog Fabric aka EFF.

What is exciting about EFF?

To implement high performing IoT solutions that are cost effective and secure, a combination of capabilities need to be in place.

  • Multi-layered data processing, storing and analytics – given the rate of growth in the number of connected devices and the volume of data. Bringing data back from devices to a DV environment can be expensive. Processing information on the EFF makes this a lot more cost effective.
  • Micro services – Standardized framework for data processing and communication services that can be programmed in standard programming language like Python, Java etc.
  • Message routers – Effective communication between the various components and layers. Without state-of-the-art message brokerage, no IoT system could be secure and scalable in providing real-time information.
  • Data leveraging capabilities – Ad hoc, embedded or advanced analytics capabilities will support BI and reporting needs. With the acquisition of Composite and AppDynamics, EFF will enable an IoT platform to connect to IT systems and applications.
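To illustrate just the routing idea in that list – a toy sketch only, in no way Cisco’s EFF implementation, which adds security, persistence and scale – a minimal in-memory publish/subscribe broker looks like this:

```python
from collections import defaultdict

class MessageRouter:
    """Toy in-memory pub/sub broker illustrating topic-based routing."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback to receive every message on this topic.
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fan the message out to every subscriber of the topic.
        for handler in self.subscribers[topic]:
            handler(payload)

router = MessageRouter()
readings = []
# An edge analytics service subscribes to sensor messages...
router.subscribe("sensors/temperature", readings.append)
# ...and a device at the edge publishes into the fabric.
router.publish("sensors/temperature", {"device": "pump-7", "celsius": 81.5})
```

The point of the pattern is decoupling: the publishing device knows nothing about who consumes its data, which is what lets processing move between the edge and the data centre without rewiring every component.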

What’s next?

Deploying the above is no mean feat. According to Gartner’s view of the IoT landscape, no organisation has yet achieved the panacea of connecting devices to IT systems and vice versa, combined with the appropriate data management and governance capabilities embedded. So there is still a long road ahead.

However, with technology advancements such as the above, I have no doubt that companies and service providers will be able to accelerate progress and deliver further use cases sooner than we might think.

Based on this innovation, the two obvious next steps that one can see fairly easily are:

  • Further automation – automating communication, data management and analytics services including connection with IT/ERP systems
  • Machine made decisions – once all connections are established and the right information reaches the right destination, machines could react to information that is shared with ‘them’ and make automated decisions.

Scott Hodges
April 18, 2017

Attending a recent IBM Watson event, somebody in the crowd asked the speaker, “So, what is Watson?” It’s a good question – and one there isn’t really a straightforward answer to. Is it a brand? A supercomputer? A technology? Something else?

Essentially, it is an IBM technology that combines artificial intelligence and sophisticated analytics in a supercomputer named after IBM’s founder, Thomas J. Watson. While interesting enough, the real question, to my mind, is this: “What sort of cool stuff can businesses do with the very smart services and APIs provided by IBM Watson?”

IBM provides a variety of services, available through Application Programming Interfaces (APIs), that developers can use to take advantage of the cognitive elements and power of Watson. The biggest challenge to taking advantage of these capabilities is to “think cognitively” and imagine how they could benefit your business or industry to give you a competitive edge – or, for not-for-profit organisations, how they can help you make the world a better place.

I’ve taken a look at some of the APIs and services available to see some of the possibilities with Watson. It’s important to think of them collectively rather than individually, as while some use-cases may use one, many will use a variety of them, working together. We’ll jump into some use-cases later on to spark some thoughts on the possibilities.

Natural Language Understanding

Extract meta-data from content, including concepts, entities, keywords, categories, sentiment, emotion, relations and semantic roles.


Discovery

Identify useful patterns and insights in structured or unstructured data.


Conversation

Add natural language interfaces such as chat bots and virtual agents to your application to automate interactions with end users.

Language Translator

Automate the translation of documents from one language to another.

Natural Language Classifier

Classify text according to its intent.

Personality Insights

Extract personality characteristics from text, based on the writer’s style.

Text to Speech and Speech to Text

Process natural language text to generate synthesised audio, or render spoken words as written text.

Tone Analyser

Use linguistic analysis to detect the emotional (joy, sadness, etc.), linguistic (analytical, confident, etc.) and social (openness, extraversion, etc.) tone of a piece of text.

Trade-off Analytics

Make better choices when analysing multiple, even conflicting goals.

Visual Recognition

Analyse images for scenes, objects, faces, colours and other content.

All this is pretty cool stuff, but how can it be applied to work in your world? You could use the APIs to “train” your model to be more specific to your industry and business, and to help automate and add intelligence to various tasks.

Aerialtronics offers a nice example use-case of visual recognition in particular. The company develops, produces and services commercial unmanned aircraft systems, and it teams drones, an IoT platform and Watson’s Visual Recognition service to help identify corrosion, serial numbers, loose cables and misaligned antennas on wind turbines, oil rigs and mobile phone towers. This helps them automate the process of identifying faults and defects.

Further examples showing how Watson APIs can be combined to drive powerful, innovative services can be found on the IBM Watson website’s starter-kit page.

At this IBM event, a sample service was created, live in the workshop. This application would stream a video, convert the speech in the video to text, and then categorise that text, producing an overview of the content being discussed. The application used the speech-to-text and natural language classifier services.
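That workshop pipeline can be sketched as follows. Note this is purely illustrative: the functions below are stubs standing in for the real Watson Speech to Text and Natural Language Classifier services, not the Watson SDK, and the keyword table is made up:

```python
def speech_to_text(audio_stream):
    # Stub: a real implementation would call the Speech to Text service
    # with the audio; here we just hand back a pre-baked transcript.
    return audio_stream["transcript"]

def classify_text(text):
    # Stub keyword classifier; Watson's Natural Language Classifier
    # would instead be trained on labelled example sentences.
    topics = {"invoice": "finance", "outage": "support", "flight": "travel"}
    for keyword, label in topics.items():
        if keyword in text.lower():
            return label
    return "general"

def summarise_video(audio_stream):
    """Chain the two services: transcribe first, then categorise."""
    transcript = speech_to_text(audio_stream)
    return {"transcript": transcript, "category": classify_text(transcript)}

result = summarise_video({"transcript": "The flight was delayed by two hours"})
```

The value is in the chaining: each service does one narrow job, and the application composes them, which is exactly why adding a third service (such as translation) is a small step rather than a rewrite.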

Taking this example further with a spot of blue sky thinking, for a multi-lingual organisation, we could integrate the translation API, adding the resulting service to video conferencing. This could deliver near real-time multiple dialect video conferencing, complete with automatic transcription in the correct language for each delegate.

Customer and support service chat bots could be built on the Conversation service, with the Tone Analyser gauging customer sentiment. Processes such as flight booking could be fulfilled by a virtual agent using the Natural Language Classifier to derive the intent of the conversation. Visual recognition could be used to identify production line issues, spoiled products in inventory or product types in retail environments.

Identification of faded colours or specific patterns within scenes or on objects could trigger remedial services. Detection of human faces, their gender and approximate age could help enhance customer analysis. Language translation could support better communication with customers and others in their preferred languages. Trade-off Analytics could help optimise the balancing of multiple objectives in decision making.

This isn’t pipe-dreaming: the toolkit is available today. What extra dimensions and capabilities could you add to your organisation, and the way you operate? How might you refine your approach to difficult tasks, and the ways you interact with customers? Get in contact today to discuss the possibilities.

Scott Reynolds
December 12, 2016

In the year of Brexit and Trump, Scott Reynolds, Hybrid IT practice lead, predicts that digital crime will explode, data privacy breaches will claim scalps, automation will be 2017’s buzzword and the open source movement will challenge profit-making business models in his 2017 tech predictions.

It’s the time of year to engage in the oh-so risky game of making predictions for what is going to be hot for our customers in the coming year. Risky, because stunning twists and turns can take us off course at any point.

After the Brexit referendum and US election results confounded political and polling pundits, life’s certainties appear far less certain. Suddenly, identifying the big winners of 2017 seems a less straightforward affair. But as a person who doesn’t mind living life on the edge – I thought I’d take a punt anyway. Here are my top tech predictions for 2017.

Security Breaches – the worst is yet to come

Based on the number of high-profile data breaches, 2016 hasn’t been a great year for digital. The stable door has been left open by companies and government departments around the world, and armies of Terminator 2 cyborgs in the guise of home CCTV cameras are attacking the very infrastructure of the internet. I fear we’re only at the beginning of an escalation in digital crime.

2017 will test the nerve of governments, businesses, citizens and consumers and challenge the perception of digital as a safe and secure way of doing business, unless there’s a massive investment in Fort Knox equivalent defences and white hat skills.

Data Privacy

GDPR (General Data Protection Regulation), emanating from Europe, is going to hurt businesses that don’t take data privacy seriously. That is a problem, as evidence suggests companies are unaware of their obligations under this new punitive legislative regime and are taking too long to get to grips with it.

It’s highly possible that fines of up to 4% of global turnover will put some companies out of business in 2017, and beyond.

One Small Change for Mankind, One Giant Leap Forward for Automation

The IT industry is about to enter a time of mass automation – about time too. To our shame, we’ve lagged behind other industries: you can now buy a car that parks itself at the touch of a button, but you need 24 buttons to change the configuration of a router.

Increased levels of automation will manifest themselves in robotic decision making, in security systems that automatically guard against and respond to an avalanche of threats, and in the software-defined, automated provisioning of resources in the data centre and network.

Things Can Only Get Bigger

The Internet of Things is going to get bigger and more impactful. Analysts are still predicting that by 2020 there will be as many as 50 billion things connected to the internet – and that’s only three years away. In 2017, expect to see mass engagement by businesses in all sectors.

Hopefully we’ll move on from talking about the connected fridge that can order more lettuce when you run out, and recognise that IoT will fundamentally change how industries and organisations operate.

Open Source – Somebody wants to do everything you do for free

Somebody, somewhere, is trying to do what you charge for at much lower cost, or even for free. Open isn’t a thing, it’s a movement. We’re already seeing open source technologies impact our industry, with OpenStack becoming the operating system of choice for companies not wanting to pay for mainstream software. Open technologies in automation, such as Puppet and Chef, now have a groundswell of support and an almost evangelical community, keen to delight people rather than simply turn a profit.

We’ve also witnessed a growing willingness to embrace Open Computing technologies. Now, Open isn’t without its complications and ultimately nothing in life is free – operating an open environment is still a complicated affair. But I think we’ll see a lot more traction, with many of our customers taking Open Source seriously, over the next 12 months.

2017 Tech predictions – a risky game

So, those are my top five tech trends for 2017. Now you’re probably crying foul – how could I overlook analytics? I haven’t. I fully acknowledge that analytics and data are core to all of the above; they will need to be embedded in the very fabric of a business to bring my predictions to fruition. Otherwise, you can disregard everything I just said. As I said, making predictions is a risky game.

Alastair Broom
December 10, 2016

I was recently asked what I think will be three things making an impact on our world in 2017, with a few permutations of course:

  • A maximum of three technologies that will be significant for enterprises in terms of driving value and transforming business models and operations in 2017
  • The innovations most likely to disrupt industries and businesses

I’ve put my three below – it would be great to hear your thoughts and predictions in the comments!

Internet of Things

The Internet of Things is a big one for 2017. Organisations will move from exploring what IoT means for them in theory to rolling out sensors across key opportunity areas and starting to gather data from what were previously “dark assets”. The reason IoT is so important is the amount of data the things will generate, and the new insight this gives organisations – into things like physical asset utilisation and optimisation, and proactive maintenance. Those organisations that take the IoT seriously are going to see their customers, their data and their opportunities in completely new ways. Being able to add more and more data sources into the “intelligence stream” means decisions are backed by more facts. It’s Metcalfe’s Law – the value of a network is proportional to the square of the number of users. Data is the network, and each thing is another user.
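To see why each extra data source matters so much, it helps to put Metcalfe’s Law into numbers. A quick illustrative sketch – the value-per-link constant is arbitrary, the point is the quadratic growth:

```python
# Metcalfe-style value of the "intelligence stream": each pair of connected
# things is a potential correlation, so value grows with the number of
# distinct pairings, n * (n - 1) / 2, i.e. roughly n^2.

def network_value(n_things: int, value_per_link: float = 1.0) -> float:
    """Value proportional to the number of distinct pairings of things."""
    return value_per_link * n_things * (n_things - 1) / 2

# Ten times the sensors yields roughly a hundred times the pairings.
for n in (10, 100, 1000):
    print(n, network_value(n))
```

So going from 10 sensors to 100 doesn’t add ten times the value; it adds roughly a hundred times the potential correlations to mine.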

Being prepared to exploit the IoT opportunity, though, especially at scale, will take proper planning and investment. Organisations will need a strategy to address the IoT, one that identifies quick wins to help build the business case for further IoT initiatives. The correct platform – an infrastructure for things – is key. The platform connecting the things to the network will need to be robust, will likely be a mix of wired and wireless and, because it’s unlikely to be a separate infrastructure, will need the visibility and control required to ensure data is correctly identified, classified and prioritised.
Security, too, will be fundamental. Today the things are built for user convenience, with security a secondary concern. The IoT therefore represents a massively increased attack surface, one that is particularly vulnerable even to unsophisticated attack. The network will need to be an integral part of the security architecture.

Edge Analytics

Edge analytics is another one to look out for. As the amount of data we look to analyse grows exponentially, the issue becomes twofold. One, what does it cost to move that data from its point of generation to a point of analysis? Bandwidth doesn’t cost what it used to, but paying to transport terabytes, and potentially petabytes, of information to a centralised data processing facility (the data centre, that is) adds significant cost. Two, having to move the data, process it and then send an action back adds lag. The majority of data we have generated to this point has been for systems of record, where a lag to actionable insight may well be acceptable. But as our systems change to systems of experience, or indeed systems of action, lag is unacceptable.
Analytics at the edge equates to near real-time analytics. Being able to take data in real time, with its context, analyse it alongside potentially multiple other sources of data, and present back highly relevant, in-the-moment intelligence – that’s amazing. Organisations once again need to ensure the underlying platform is up to the task: able to capture the right data, maintain its integrity, conform to privacy regulations and manage the data throughout its lifecycle. Technology will be needed to analyse the data at its point of creation; essentially, you will need to bring compute to the data (and not the other way round, as is typical today).
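As a rough sketch of the idea, here is what “bringing compute to the data” can look like in miniature: aggregate a window of readings at the edge and forward only a compact summary, plus any out-of-band values that need immediate action. The field names, readings and threshold are invented for illustration, not taken from any real product.

```python
# Edge analytics sketch: instead of shipping every raw reading to the data
# centre, summarise at the point of generation and forward only the summary
# plus any reading that breaches a threshold.

from statistics import mean

def edge_summarise(readings, alert_threshold=90.0):
    """Reduce a window of raw sensor readings to a compact upstream payload."""
    alerts = [r for r in readings if r > alert_threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": alerts,  # only out-of-band values travel upstream at once
    }

window = [71.2, 69.8, 70.5, 94.1, 70.0]   # raw readings stay at the edge
payload = edge_summarise(window)
print(payload)                             # a few fields instead of the window
```

The bandwidth saving scales with the window size, and the alert still reaches the centre without waiting for a batch upload.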

Cognitive Systems

Lastly, cognitive systems. Computers to this point have been programmed by humans to perform fairly specific tasks. Cognitive systems will “learn” what to do not only from human interaction, but from the data they generate themselves, alongside the data from other machines. Cognitive systems will be continually reprogramming themselves, each time getting better at what they do. And what computers do is help us do things humans can do, but faster. Cognitive systems will expand our ability to make better decisions, and help us think better. They move us from computing systems built essentially to calculate really fast, to systems built to analyse data and draw insights from it. This extends to predicting outcomes based on current information, and the consequences of actions. And because it’s a computer, we can draw insight from a far greater base of information. We humans are really bad at holding a lot of information in mind at once, but computers (certainly for the short term) are constrained only by the amount of data we can hold in memory to present to a compute node for processing.

Alastair Broom
November 30, 2016

As you might have read back in October 2016, a huge Distributed Denial of Service (DDoS) attack against Dyn, a major domain name system (DNS) provider, broke large portions of the Internet, causing a significant outage to a tonne of websites and services, including Twitter, GitHub, PayPal, Amazon, Reddit, Netflix and Spotify.

How did the attack happen? What was the cause behind the attack?

Although exact details of the attack remain vague, Dyn reported that an army of hijacked internet-connected devices is thought to be responsible for the large-scale attack – similar to the method recently employed by hackers to carry out a record-breaking DDoS attack of over 1 Tbps against the French hosting provider OVH.

According to security intelligence firm Flashpoint, Mirai bots were detected driving much, but not necessarily all, of the traffic in the DDoS attacks against Dyn. Mirai is a piece of malware that targets Internet of Things (IoT) devices such as routers, security cameras and DVRs, and enslaves vast numbers of these compromised devices into a botnet, which is then used to conduct DDoS attacks.

This type of attack is notable and concerning because it largely relies on unsecured IoT devices, which are growing exponentially in number. Many of these devices are deployed in ways that make them difficult to update, and thus nearly impossible to secure.

Manufacturers focus mainly on the performance and usability of IoT devices but ignore security measures and encryption mechanisms, which is why the devices are routinely hacked, conscripted into DDoS botnets and used as weapons in cyber-attacks.
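To make the weakness concrete: Mirai famously spreads by simply trying a short list of factory-default usernames and passwords. Here is a deliberately naive sketch of auditing your own device inventory against such a list – every host name and credential below is made up for illustration.

```python
# Naive defensive audit: flag devices in an inventory whose credentials
# still match a known factory default, i.e. exactly the devices a
# Mirai-style bot can enslave without any sophistication.

DEFAULT_CREDS = {("admin", "admin"), ("root", "12345"), ("admin", "password")}

inventory = [
    {"host": "cam-01", "user": "admin", "password": "admin"},
    {"host": "dvr-02", "user": "root", "password": "s3cure-Unique!"},
]

def at_risk(devices):
    """Return the hosts whose credentials match a known factory default."""
    return [d["host"] for d in devices
            if (d["user"], d["password"]) in DEFAULT_CREDS]

print(at_risk(inventory))  # cam-01 still has its factory login
```

Changing a default password is trivial; the problem is that nobody is made to do it, at scale, across millions of devices.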

An online tracker of the Mirai botnet suggests there are more than 1.2 million Mirai-infected devices on the Internet, with over 166,000 devices active right now.

IoT botnets like Mirai are growing rapidly, and there is no easy way to stop them.

According to officials speaking to Reuters, the US Department of Homeland Security (DHS) and the FBI are both investigating the massive DDoS attacks against Dyn, but neither agency has yet speculated on who might be behind them.

At Logicalis UK, we take a threat-centric approach. We can help customers protect their applications and environments against DDoS attacks with on-premise, cloud-based or hybrid deployments based on solutions from our partner F5.

F5 provides seamless, flexible, and easy-to-deploy solutions that enable a fast response, no matter what type of DDoS attack you’re under. Together, Logicalis and F5 can:

  • Deliver multi-layered DDoS defense from a single box with a fast-acting, dual-mode appliance that supports both out-of-band processing and inline mitigation, while enabling SSL inspection and guarding against layer 7 app attacks.
  • Stop attacks on your data centre immediately with an in-depth DDoS defense that integrates appliance and cloud services for immediate cloud off-loading.
  • Defeat threats cloaked behind DDoS attacks with unique layer 7 application coverage, without impacting legitimate traffic.
  • Activate comprehensive DDoS defense with less complexity and greater attack coverage than most solutions.

If you would like to find out more about Logicalis’ advanced security practice, please get in touch. Our experts are primed and ready to support you.

Alastair Broom
November 29, 2016

Last week we hosted a number of our customers (30 people from 18 different organisations, in fact) at an event held at Sushisamba in London. From the 39th floor, overlooking most of London, I had the privilege of hosting some of our existing and potential clients for a discussion predicated on the upcoming General Data Protection Regulation (GDPR). Over an absolutely fantastic lunch, which thankfully included many tasty meat dishes – I’m no huge fan of raw fish – we talked about how organisations are going to have to rethink their strategy around data governance and security in the face of a very tough new law.

I just wanted to give you a few takeaways from the day, none of which are edible – I’m sorry…

The first of our guest speakers was Lesley Roe, Data Protection Officer from the IET. Lesley spoke about what the IET are doing to get ready for GDPR. They hold a vast amount of personal data, and given that they are advising their membership on all manner of related things, they need to lead by example. Key points from her presentation are:

  • GDPR is about giving people more control over their personal data. Every day we share an extraordinary amount of personal data with all manner of organisations, and this data is valuable. GDPR is about ensuring we retain the rights to that value: what the data is processed for, who can process it, and how it is retained or deleted once its useful life has expired.
  • Everyone has a part to play and training of staff & staff awareness are paramount. This, however, is no mean feat.
  • The process of data governance, and the education of that process throughout the organisation, will be the only way to fully comply with the regulation. How do the IET classify old & new data? How do they manage the lifecycle of the data? How do they make sure they are only obtaining, using and retaining the data they need and have consent for?
  • None of this, however, is possible without first knowing what personal data is within your organisational context, and where it lives.
  • Much of the thinking required by GDPR represents a huge shift in the mindset of organisations today. Companies simply do not think about their data assets, and their responsibility for that data, in the spirit of the regulation at all.
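That first step – knowing what personal data you hold and where it lives – can start surprisingly simply. The sketch below is a deliberately naive illustration of pattern-based data discovery; real tooling goes far further, and the two patterns shown are rough assumptions, not production-grade detectors.

```python
# Naive personal-data discovery: scan free text for values that look like
# personal data (here, email addresses and UK-style phone numbers).
# Real discovery tooling uses far richer detection than regex.

import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
UK_PHONE = re.compile(r"\b0\d{2,4}[\s-]?\d{3,4}[\s-]?\d{3,4}\b")

def find_personal_data(text: str) -> dict:
    """Return candidate personal-data values found in the text."""
    return {
        "emails": EMAIL.findall(text),
        "phones": UK_PHONE.findall(text),
    }

sample = "Contact j.smith@example.com or call 020 7946 0991 for details."
hits = find_personal_data(sample)
print(hits)
```

Even a crude scan like this, run across file shares and databases, starts to answer the question the IET had to ask first: where does our personal data actually live?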

Our next two speakers were from two of our technology partners, VMWare and Palo Alto Networks. Things to remember here:

  • Technology, without a doubt, has a part to play in ensuring compliance. The regulators are far more savvy about what is possible in the security market, and they will expect organisations to leverage technologies, within reach of their budgets and proportionate to their exposure, to best mitigate any risks to the rights of individuals.
  • The ability to prevent, detect and report on the nature and extent of any breaches will be very important. Technologies will be needed to prove that organisations can do this effectively and efficiently, especially in the face of stringent reporting requirements.
  • State of the art will really mean state of the art. Regulators will be assessing how organisations are using the best possible mix of technologies to minimise both exposure to risk, and impact of any breach if/when they should occur.

The last presentation was from Ed Charvet and his guest star Ian De Frietas. Ian is part of the alliance we have with legal experts BLP. The joint value proposition Ed and Ian spoke about is what I believe makes us entirely unique in this space:

  • The first step towards compliance is data discovery – what is personal data from the perspectives of both the GDPR and the organisations’ context? Where is the data? How is data currently classified? How is it processed? How are permissions obtained? This is delivered through a mix of manual and automated processes to help customers understand where they stand today.
  • But this process takes time. The regulation applies from the 25th May 2018, and as Ian made clear, the regulator is taking a “zero day” approach. This effectively means that if you’re not fully compliant on that day, you are non-compliant, and the regulator has every power to come after you. With fines of up to 4% of global group revenues, or EUR 20 million (whichever is the greater, of course), this is a regulation with teeth – and with what seems like a very real political agenda. Watch out Facebook, Google, Amazon…
  • Being compliant with the likes of the DPA today, while impressive, would still mean on day zero you do not comply with the new law.
  • Key questions to ask are: do you have a legitimate interest to process the data? What exactly are you planning to do with it? These will need to be made very clear even before the gathering of data has begun.
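The arithmetic of that fine ceiling is simple but sobering – the greater of EUR 20 million or 4% of global group revenue. A trivial calculation makes the scale vivid (the turnover figures are invented examples):

```python
# GDPR maximum administrative fine for the most serious infringements:
# the greater of EUR 20 million or 4% of global annual turnover.

def max_gdpr_fine(global_turnover_eur: float) -> float:
    """Upper bound of the fine for a given global turnover, in euros."""
    return max(20_000_000.0, 0.04 * global_turnover_eur)

print(max_gdpr_fine(10_000_000_000))  # a EUR 10bn group
print(max_gdpr_fine(50_000_000))      # a small firm still faces EUR 20m
```

Note the asymmetry: below EUR 500 million of turnover, the flat EUR 20 million floor dominates, so small organisations are proportionally the most exposed.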

What became clear throughout the day is that time is tight to reach compliance, and the ICO in the UK seems to be recruiting in earnest to gear up for real enforcement of the law. This feels like something that is going to change how data, and in particular the protection of personal rights and data, is valued and protected by the organisations that get the most benefit from it. What organisations need to do as a matter of urgency is find out what personal data they hold, and where they store it. They need to assess their current security infrastructure and find out what gaps exist that could pose a risk and ultimately lead to a loss of personal data. They need to be putting the right people and procedures in place to comply with new and enhanced rights and tighter reporting deadlines, and they should be working out what Data Protection Impact Assessments need to look like for their organisation to satisfy regulatory requirements.

As a next step, please reach out to me, Ed Charvet, Alastair Broom or Jorge Aguilera to discuss how Logicalis can help our customers get ready for GDPR. From the data discovery workshop, to engaging with BLP on legal matters, to technology assessments powered by tools from the likes of VMware and Palo Alto Networks, we really can help customers on the road towards compliance.
