Digitally Speaking

Chris Gabriel
September 24, 2019

Shared services should accelerate digital change, not just focus on consolidation and compromise.

Put two children at a table for lunch but provide only one lunch box, and research shows both will share the available food as equitably as possible. Sharing is a natural response to scarce resources. One child does not wish to see the other child go hungry.

However, as we grow older, and as these childlike instincts give way to more refined and less innocent behaviours, we start to think more strategically about sharing.

The Ultimatum Game[1] is a fabulous economic experiment that explores the decision-making process used during an open negotiation. Put two adults in a room to negotiate the sharing of a pot of money. Both know the amount on offer, but one is given the power of dividing the spoils between the two. They write an offer down and hand it over; the other side then decides whether to take the sum on offer or, if they feel they are being treated unfairly, that neither party will take away any money at all.

This is where it becomes interesting.

When a 50/50 split is offered most people take their half and shake hands. But what happens when the offer is lower, maybe forty-five, forty or say twenty percent? Neither party started with anything, so surely twenty percent of something is better than nothing? Surely you would take the money and run? This is where fairness takes over from simple monetary gain.

The experiment shows that even a small imbalance of perceived fairness can cause somebody who stands to gain something they never had to throw that benefit away, because the other person has gained more.

Welcome to the world of public sector shared services

The shared service model is a simple concept. Stakeholder organisations pool financial and people resources to build and operate IT platforms that serve their collective needs. Everybody contributes to create one thing that is suitable for all. Sounds simple. But agreeing shared IT services has always been more complicated than the concept, and over the last couple of decades this strategy has been tried, tested, succeeded, failed, aborted, and agonised over in equal measure.

Fundamentally, a shared service mandates some level of ceding control to others. And it’s much more than just what network or data centre technology will be chosen.  Who will run it? Who will decide on the service catalogue it will provide? Who will define and govern the service quality? Who will arbitrate the different needs of all organisations? Who will pay for what?

The word ‘share’ is loaded with the perception that sharing means compromising. I must give up what I have to share something with you; we are taking 1+1 and making 0.75 – we are consolidating cost, not constructing something compelling.

And that has always been the problem with shared services.  It usually feels like a conversation about consolidation and compromise in a world obsessed with digital innovation, digital experiences, digital choice.

Digital cannot thrive in a world driven by simple consolidation. That is a digital compromise.

And who wants to build a shared service that is ultimately DiNO; Digital in Name Only?

Thankfully, the world is changing.

With a new generation of adaptable digital platforms, powered by software-defined technologies, the progressive public sector CIO can now bring together the people, processes and technology of disparate organisations and create services where everybody wins and, more importantly, where no one perceives they lose. We are now in an age where shared doesn’t mean mediocre or the lowest common denominator. It doesn’t mean compromise through consolidation.

What organisation using AWS, Azure or Google believes it is downgrading its digital capability? Those massive shared cloud platforms have mastered the art of personalising the user experience, delivering exactly what users need when they need it. These aggregated platforms have mastered automation, intelligence and personalisation, and public sector shared services must adopt the same approach.

The CIO in charge of shared services designed for a digital future now has no choice but to do the same.

The trick now is to turn a financial, operational or service-driven need for a shared service platform into a re-imagining of what can be delivered: not just a like-for-like consolidated replacement of the technology, but a new adaptable platform that shares resources based on agreed need and provides every stakeholder organisation with an environment tuned to its specific requirements.

Yes, some shared decisions will need to be made.

Individual IT teams won’t be able to get their way on the choice of a vendor, or virtual machine type, or whether the hyper-converged infrastructure or network technology is one they have personally road tested and approved.  They will have to cede operational control of their data centre platform, wide area network, or networks to a shared service organisation.  But what they won’t have to do is compromise on the quality of services they provide to their users.

Because, with adaptable digital platforms, sharing is never about compromise.

[1] Güth, W., Schmittberger, R., & Schwarze, B. (1982). An Experimental Analysis of Ultimatum Bargaining. Journal of Economic Behavior & Organization, 3, 367–388.

Category: Automation

Digital Transformation

Tim Wadey
March 19, 2018

What is your approach to Digital Transformation and is your business structured for it?


All modern companies are looking at digital transformation, and the key decision they need to make is whether to “become digital” or to “do digital”. “Becoming digital” is deciding to turn the whole business or business unit digital, re-engineering from the ground up to take full advantage of the benefits of technology across the value chain. “Doing digital” implies taking specific processes, maybe a customer interaction or a B2B transaction process, and making it digital. Depending on which of these options a business chooses to take, the approach and qualities of the Digital Transformation function will change.

Digital transformation has grown as a concept over the last few years, but in general, is taken to mean building additional business benefit on the data and data processes that a business owns. This can mean finding efficiencies through process improvement and automation, new opportunities buried within the value of corporate data or new digital routes to market. A full transformation embraces all of these and more; the emergence of a connected environment, now known as IoT, is opening new opportunities with every technological development.

Becoming Digital: starts with a solid digital business culture

If a business has chosen to “become digital”, the leadership team needs to embrace the objective and fully support the change initiative. That said, the scale of investment and impact of the programme means that a single point of oversight is essential. In some businesses this might fall to a CIO, in others a Chief Digital Officer, however, these leaders will need support from a team with excellent project and technical skills. In addition, the cultural change will require consideration throughout the process. Probably the most critical attributes that the transformation leaders will need to have are a clear vision of what digital looks like, the skills to understand how it will be delivered and, most importantly, the drive to sustain a multi-year transformational programme.

In many ways, the Digital Transformation Officer will have to lead the senior team through this programme, and these qualities combined with the soft skills to enable this leadership will eventually determine the success of the programme. This role is well suited to an interventional style – enabling the business to focus on BAU while the digital programme is delivered in a defined manner. There have been well-publicised initiatives similar to this in major UK retail banks and across industries, like the airlines, where all aspects of customer interaction have become fully digital.

Doing Digital: requires greater focus on technical skills

Alternatively, if the choice is to “do digital”, the transformation challenge is much more bounded. In this case, the challenge is more to do with having the technical understanding and project management skills to deliver tightly defined digital projects. While these transform the particular process involved, they do not require wholesale change across the business. For most organisations, this will be the chosen option as there is less risk and disruption in such an iterative approach. We are seeing programs like this often linked to IoT initiatives across our customer base.

Clearly the CEO will take a close interest in any of these initiatives; however, in choosing to “become digital” they are betting the company, and as such will want the transformation leadership to be part of the senior team and empowered to drive the vision to a conclusion. In choosing to “do digital”, the CEO contains the risk to particular areas and should use the management team to direct these initiatives through a skilled and technically able programme manager. Whatever the approach, there will be a material cost, and benefits realisation after go-live needs to be driven and measured with similar control and vigour.

No matter what direction you choose to take, speak to one of our experts to help you through your digital transformation journey.

Justin Price
November 8, 2017

Year by year we are generating increasingly large volumes of data which require more complex and powerful tools to analyse in order to produce meaningful insights.

What is machine learning?

Anticipating the need for more efficient ways of spotting patterns in large datasets en masse, machine learning was developed to give computers the ability to learn without being explicitly programmed.

Today, it largely remains a human-supervised process, at least in the developmental stage. This consists of monitoring a computer’s progress as it works through a number of “observations” in a data set arranged to help train the computer to spot patterns between attributes as quickly and efficiently as possible. Once the computer has started to build a model to represent the patterns identified, it then goes through a looping process, seeking to develop a better model with each iteration.
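
To make that looping process concrete, here is a minimal sketch in Python. The data set and learning rate are invented for illustration: each pass over the observations nudges a simple linear model towards a better fit, exactly the iterative improvement described above.

```python
# A toy set of "observations": hours of study vs. exam score, with a
# roughly linear relationship the model must discover for itself.
data = [(1, 12), (2, 22), (3, 31), (4, 41), (5, 52)]

def loss(w, b):
    """Mean squared error of the current model y = w*x + b."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

def train(steps=2000, lr=0.01):
    """The looping process: each iteration nudges the model toward a better fit."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        # Gradients of the mean squared error with respect to w and b.
        gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
        gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = train()
```

After training, the model's error is a tiny fraction of where it started, with the slope settling close to the underlying trend in the data.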

How is it useful?

The aim of this is to allow computers to learn for themselves, knowing when to anticipate fluctuation between variables, which then helps us to forecast what may happen in future. With a computer model trained on a specific data problem or relationship, data professionals can produce reliable decisions and results, leading to the discovery of new insights which would have remained hidden without this new analytical technique.

Real-world Examples

Think this sounds like rocket science? Every time you’ve bought something from an online shop and had recommendations based on your purchase – that’s machine learning. Over thousands of purchases, the website has been able to aggregate the data, spot correlations in real users’ buying patterns, and then present the most relevant patterns back to you based on what you did or bought. You may see these as “recommended for you” or “this was frequently bought with that”. Amazon and eBay have been doing this for years, and more recently, Netflix.
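
A toy sketch of the idea, with invented baskets: count how often items are bought together, then recommend the strongest co-occurrences. Real recommenders are far more sophisticated, but the principle is the same.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; each inner list is one user's basket.
baskets = [
    ["laptop", "mouse", "bag"],
    ["laptop", "mouse"],
    ["laptop", "bag"],
    ["phone", "case"],
    ["phone", "case", "charger"],
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item, top_n=3):
    """Return the items most frequently bought together with `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]
```

Calling `recommend("phone")` here surfaces "case" first, because it co-occurs with "phone" most often: the "frequently bought together" effect in miniature.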

This sounds fantastic – but where can this help us going forward?

Deep learning

Deep learning is distinguished from other data science practices by the use of deep neural networks. Data passes through networks of nodes, in a structure which mimics the human brain. Structures like this are able to adapt to the data they are processing, in order to execute in the most efficient manner.
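
A minimal illustration of that node structure, with hand-picked rather than learned weights: each "node" weighs every input, adds a bias and applies a non-linear activation, and layers of nodes are stacked to form the network.

```python
def relu(x):
    """Non-linear activation: passes positives through, zeroes out negatives."""
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One layer of nodes: each node weighs every input, adds a bias
    and applies the activation, loosely mimicking a neuron."""
    return [relu(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# A tiny two-layer network with fixed, made-up weights; in a real deep
# network these values are learned from data, not written by hand.
def hidden(x):
    return layer(x, [[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1])

def output(h):
    return layer(h, [[1.0, -1.0]], [0.0])

result = output(hidden([1.0, 2.0]))
```

Stacking more such layers, with weights learned by training, is what turns this skeleton into the deep networks behind translation and image recognition.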

Using these leading techniques, some examples now look ready to have profound impacts on how we live and interact with each other. We are currently looking at the imminent launch of commercially available real-time language translation, which requires a speed of analysis and processing never available before. Similar innovations have evolved in handwriting-to-text conversion with “smartpads” such as the Bamboo Spark, which bridge the gap between technology and traditional note-taking.

Other applications mimic the human components of understanding: classify, recognise, detect and describe. This has now entered mainstream use with anti-spam measures on website contact forms, where the software knows which squares contain images of cars or street signs.

Particularly within the healthcare industry, huge leaps are being made: at Szechwan People’s Hospital in China, models have been “taught” to spot the early signs of lung cancer in CT scan images. This meets a great need, as there is a shortage of trained radiologists to examine patients.

In summary, there have been huge leaps in data analysis and science in the last couple of years. The future looks bright for the wide range of real-world issues to which we can apply ever more sophisticated techniques and tackle previously impossible challenges. Get in touch and let’s see what we can do for you.

Category: Analytics, Automation

Team Logicalis
September 27, 2017

Throughout history, I don’t believe we’ve ever seen as much change as we see today in the world of technology! Just to think that in 10 years we’ve had more iPhone releases than Henry VIII had wives.

Taking a page out of the tech giants’ books, from Apple to Salesforce, it’s clear that innovation is at the centre of what enables the industry to move at the pace it does. It would be fair to say that 3 major trends currently dominate the industry:

1. Service, service, service – Many big players in the hardware product space recognise hardware is fast becoming a vanilla commodity. A number of vendors, such as Cisco, Oracle, Ericsson, Nokia and HP, have been scrambling over a number of years to enable value-added services on top of the hardware to increase margins.

 “Services are enabled by the specific knowledge, skills and experience you bring to the table which often drives business value through improved margins.”

Sometimes when I think about how you can build your brand of service that you deliver to customers, I like to compare it to food (one of my favourite subjects).

What keeps you going back to your favourite restaurant? Let’s take, for instance, McDonald’s. It could be the quality of the food, but ultimately you KNOW you will get a fast, efficient service and a smile when they ask ‘would you like fries with that?’. The point being, it’s the trusted customer experience that underpins successful services; remember this bit, I’m going to allude to it later on.

2. Business process design driven by cost reduction, optimisation and automation – Ultimately, we use technology to make our lives simpler. Traditional IT has become entrenched in complexity, and with that has come high cost. Businesses of all sizes are scrutinising their balance sheets and seeking to utilise the benefits of IT innovation to gain a competitive advantage. The principles of globalisation, business process optimisation and automation are all relevant now as we transform traditional IT to achieve the ultimate goal of simplicity.

3. Data-driven customer experience as an investment for the future – Products in the world of data analytics are booming as businesses recognise the power of data in enabling intelligent business decisions. One proven example of boosting business value is how telcos use customer location data to send relevant, targeted marketing text messages.

Imagine you’re at the airport: an intelligent system picks up your location and sends you a text asking whether you want to purchase an international data plan while you’re away. Instead of random marketing messages, geo-location marketing becomes targeted and relevant. Through this intelligent marketing, telcos have been able to generate 40% more revenue than expected in that portfolio.

Keeping up with the pace of change within the industry can be overwhelming, unless you harness the key themes mentioned above, which all tie back to business value. Contact Logicalis today to learn how you can implement an agile business model and use its benefits to increase your business value.

Alastair Broom
July 12, 2017

£170m lost on the London Stock Market in just over a week, and untold damage to the “World’s Favourite Airline”. That’s the cost within the UK to the International Airlines Group, the owner of British Airways, after BA’s recent ‘power outage’ incident.

“It wasn’t an IT failure. It’s not to do with our IT or outsourcing our IT. What happened was in effect a power system failure or loss of electrical power at the data centre. And then that was compounded by an uncontrolled return of power that took out the IT system.” Willie Walsh (IAG Supremo) during a telephone interview with The Times.

Willie Walsh has since suggested that the outage was caused by the actions of an engineer who disconnected and then reconnected a power supply to the data centre in “an uncontrolled and un-commanded fashion”. Could this actually have something to do with the IT outsource after all? Did a staff member go rogue, or was it down to poor training and change control?

For me, what this highlights is the need to place greater emphasis on the availability and uptime of the systems that support critical parts of a business or organisation’s services and offering, along with robust processes and automation where possible to minimise the impact of an unplanned outage.

All businesses should expect their systems to fail. Sometimes it is a physical failure of the infrastructure supporting the data centre (power, UPSs, generators, cooling etc.). It can be the power supply itself. Computing, storage or network equipment can fail. Software and systems can suffer an outage. And it can also come down to ‘human error’ or poor maintenance of core systems or infrastructure.

Coping with a Power Failure

Even if you have two power feeds to your building, and even if they’re from two different power sub-stations and run through two different street routes, those sub-stations are still part of the same regional and national power grid. If the grid fails, so does your power; there is no way around it, except to make your own. Power surges are handled by monitoring the power across cabinet PDUs, critical PDUs, UPSs, generators and transformers, while assigning a maximum load to all cabinets to make sure that we do not overload our customers’ systems.

Recovering from a Disaster

Recovering from a disaster is something that all organisations plan for; however, not all have a Disaster Recovery (DR) plan, as some consider High Availability (HA) to be more than sufficient. Yet HA only provides localised failover, whereas DR is designed to cope with the failure of an entire site.

The challenge with DR for many of our customers is the cost:

  • First, you need to prioritise which application workloads you want to fail over in the event of a disaster.
  • Second, you need to purchase and manage infrastructure and licensing for these workloads, with continuous replication.
  • Third, you need a second location.
  • Fourth, you need a robust DR plan that allows you to recover your workloads at the second location.
  • Lastly (often the hardest part), you’ll need to fail these services back once the primary site has been recovered.

This can be an expensive option, but this is also where things like Cloud DR-as-a-Service can help minimise any expenditure, and the pain associated with owning and managing a DR environment.

Reducing the impact of an outage

Minimising the impact of any form of physical failure should take priority over recovering from an outage. Workflow automation can help a business maintain uptime of applications and services: a policy can be defined whereby services are moved to other systems locally, or re-provisioned to a DR location or platform, in the event of an outage caused by a power issue or human error, helping a business minimise both the risk and the impact of an outage.
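
As a rough sketch of such a policy (the service and site names are purely illustrative, not any specific product's configuration), the decision logic can be as simple as: if the primary site is unhealthy, place the service at its designated DR target.

```python
# Hypothetical failover policy: for each service, probe its primary site
# and, if that site has failed, re-provision at the named DR location.
FAILOVER_POLICY = {
    "booking-api": {"primary": "dc-london", "dr": "dc-manchester"},
    "check-in":    {"primary": "dc-london", "dr": "cloud-dr"},
}

def evaluate(policy, site_healthy):
    """Return the target site for each service given current site health.

    Sites missing from `site_healthy` are treated as failed, so an
    unreachable data centre also triggers failover.
    """
    placement = {}
    for service, sites in policy.items():
        if site_healthy.get(sites["primary"], False):
            placement[service] = sites["primary"]
        else:
            placement[service] = sites["dr"]  # automated failover
    return placement
```

The same structure can drive fail-back: once the primary site reports healthy again, the next evaluation returns services to their original home.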

I’ll let you come to your own conclusions as to whether British Airways should adopt robust change control, automation or DR policies. Logicalis can assist and provide you with a number of options tailored to your particular needs, so that you are not the next press headline.

Fanni Vig
January 16, 2017

A friend of mine recently introduced me to the idea of the ‘runaway brain’, a theory first published in 1993 outlining the uniqueness of human evolution. We take a look at how artificial intelligence is developing into something comparable to the human brain, and the potential caveats that concern us as human beings.

The theory considers how humans have created a complex culture by continually challenging their brains, leading to the development of more complex intellect throughout human evolution; a process which continues today and will no doubt continue for years to come. This is what theorists claim is driving human intelligence towards its ultimate best.

There are many ways in which we can define why ‘human intelligence’ is considered unique. In essence, it’s characterised by perception, consciousness, self-awareness, and desire.

It was while speaking to a friend that I wondered whether, with human intelligence now sitting alongside the emergence of artificial intelligence (AI), it is possible for the ‘runaway brain’ to reach a new milestone. After further research, I found some who say it already has.

They label it ‘runaway super intelligence‘.

Storage capacity of the human brain

Most neuroscientists estimate the human brain’s storage capacity to range between 10 and 100 terabytes, with some evaluations estimating closer to 2.5 petabytes. In fact, new research suggests the human brain could hold as much information as the entire internet.

As surprising as that sounds, it’s not necessarily impossible. It has long been said that the human brain can be like a sponge, absorbing as much information as we throw at it. Of course we forget a large amount of that information, but consider those with photographic memory, those who practise a combination of innate skills, learned tactics and mnemonic strategies, or those with an extraordinary knowledge base.

Why can machines still perform better?

Ponder this – if human brains have the capacity to store significant amounts of data, why do machines continue to outperform human decision making?

The human brain has a huge range: data analysis and pattern recognition alongside the ability to learn and retain information. A human needs only a glance to recognise a car they’ve seen before, but an AI may need to process hundreds or even thousands of samples before it can come to a conclusion. Call it human premeditative assumption, if you will: we save time by not analysing the finer details for an exact match. Conversely, while AI functions may be more complex and varied, the human brain cannot process the same volume of data as a computer.

It’s this efficiency of data processing that calls on leading researchers to believe that indeed AI will dominate our lives in the coming decades and eventually lead to what we call the ‘technology singularity’.

Technology singularity

Technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilisation.

According to this hypothesis, an upgradable intelligent agent, such as software-based artificial general intelligence, could enter a ‘runaway reaction’ cycle of self-learning and self-improvement, with each new and increasingly intelligent generation appearing more rapidly, causing an intelligence explosion resulting in a powerful super intelligence that would, qualitatively, far surpass human intelligence.

Ubiquitous AI

When it comes to our day-to-day lives, algorithms often save time and effort. Take online search tools, Internet shopping and smartphone apps using beacon technology to provide recommendations based upon our whereabouts.

Today, AI uses machine learning. Provide AI with an outcome-based scenario and, to put it simply, it will remember and learn. The computer is taught what to learn, how to learn, and how to make its own decisions.

What’s more fascinating is how new AIs are modelling the human mind, using techniques similar to our own learning processes.
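
A minimal sketch of that outcome-based learning, using a classic epsilon-greedy bandit rather than any specific product's algorithm: the program is never told which option is best, only the rewards it receives, yet it learns to favour the better one.

```python
import random

random.seed(42)

# Two options with hidden payout rates; the learner only observes rewards.
true_rates = [0.8, 0.3]

estimates = [0.0, 0.0]  # the learner's running estimate of each option
pulls = [0, 0]

for step in range(2000):
    # Mostly exploit the best-looking option, occasionally explore.
    if random.random() < 0.1:
        arm = random.randrange(2)
    else:
        arm = max(range(2), key=lambda a: estimates[a])
    reward = 1.0 if random.random() < true_rates[arm] else 0.0
    pulls[arm] += 1
    # Incrementally update the running average reward for this option.
    estimates[arm] += (reward - estimates[arm]) / pulls[arm]
```

After a few thousand trials the estimates converge towards the hidden rates, and the learner has spent most of its pulls on the better option, without ever being told which one that was.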

Do we need to be worried about the runaway artificial general intelligence?

We would do well to listen to the cautiously wise words of Stephen Hawking, who said “success in creating AI would be the biggest event in human history”, before adding, “unfortunately, it might also be the last, unless we learn how to avoid the risks”.

The answer to whether we should be worried all depends on too many variables for a definitive answer. However, it is difficult not to argue that AI will play a growing part in our lives and businesses.

Rest assured: 4 things that will always remain human

It’s inevitable that one might raise the question: is there anything that humans will always be better at?

  1. Unstructured problem solving. Solving problems in which the rules do not currently exist; such as creating a new web application.
  2. Acquiring and processing new information. Deciding what is relevant; like a reporter writing a story.
  3. Non-routine physical work. Performing complex tasks in a three-dimensional space requires a combination of skills #1 and #2, which is proving very difficult for computers to master. As a consequence, scientists like Frank Levy and Richard J. Murnane argue that we need to prepare children with an “increased emphasis on conceptual understanding and problem-solving”.
  4. And last but not least – being human. Expressing empathy, making people feel good, taking care of others, being artistic and creative for the sake of creativity, expressing emotions and vulnerability in a relatable way, and making people laugh.

Are you safe?

We all know that computers, machines and robots will have an impact (positive and/or negative) on our lives in one way or another. The rather ominous elephant in the room is whether or not your job can be done by a robot.

I am sure you will be glad to know there is an algorithm for it… A recent BBC article predicts that 35% of current jobs in the UK are at ‘high risk’ of computerisation in the coming 20 years (according to a study by Oxford University and Deloitte).

It remains the case that jobs relying on empathy, creativity and social intelligence are considerably less at risk of being computerised. In comparison, roles including retail assistants (37th), chartered accountants (21st) and legal secretaries (3rd) all rank among the top 50 jobs at risk.

Maybe it’s not too late to pick up that night course in computer science…

Alastair Broom
December 15, 2016

Last week I read that you can now hijack nearly any drone mid-flight just by using a tiny gadget.

The gadget responds to the name of Icarus and it can hijack a variety of popular drones mid-flight, allowing attackers to lock the owner out and give them complete control over the device.

Besides drones, the new gadget is capable of fully hijacking a wide variety of radio-controlled devices, including helicopters, cars, boats and other remote-controlled gear that runs over the most popular wireless transmission control protocol, DSMx.

Although this is not the first device we have seen that can hijack drones, it is the first to grant full control. Icarus works by exploiting the DSMx protocol, allowing attackers to steer, accelerate, brake and even crash target drones.

The attack relies on the fact that the DSMx protocol does not encrypt the ‘secret’ key that pairs a controller with the controlled device, so it is possible for an attacker to recover this key through brute-force attacks.
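
To see why an unprotected pairing key is fatal, here is a toy model in Python. It does not reproduce DSMx itself (the HMAC tag here is a stand-in for the real pairing check), but it shows how a small key space that an attacker can test guesses against falls to simple enumeration.

```python
import hashlib
import hmac

KEY_BITS = 16  # a deliberately tiny key space, for illustration only

def tag(key: int, packet: bytes) -> bytes:
    """Pair-check a packet with the shared key (a stand-in for the
    real protocol's pairing mechanism, not actual DSMx)."""
    return hmac.new(key.to_bytes(2, "big"), packet, hashlib.sha256).digest()

# The drone and controller share a secret key; the attacker observes traffic.
secret_key = 0x3A7F
observed_packet = b"throttle=50"
observed_tag = tag(secret_key, observed_packet)

def brute_force(packet: bytes, target_tag: bytes):
    """Enumerate every possible key until one matches the observed traffic."""
    for guess in range(2 ** KEY_BITS):
        if hmac.compare_digest(tag(guess, packet), target_tag):
            return guess
    return None

recovered = brute_force(observed_packet, observed_tag)
```

With only 65,536 candidates, the loop recovers the key in well under a second; a properly encrypted and authenticated pairing, with a large key space, would make this enumeration computationally infeasible.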

You can also watch the demonstration video to learn more about Icarus box.

There is no mitigation for this issue at the moment, other than to wait for affected manufacturers to release patches and update their hardware with encryption mechanisms to secure the communication between controller and device.

Having seen this video and the potential impact of this hijacking technique, my first thought was about the threat for Amazon’s new service coming soon, which will allow drones to safely deliver packages to people’s homes in under 30 minutes.

This is just another example of how important it is to define the right strategy around encryption as part of security in the digital era. Business data, and the way we want to access that data from any device, anywhere, at any time, only highlights the need for enhanced and clever security solutions.

There are different ways Logicalis can help our customers protect data located in data centres and on endpoints, with the help of an ecosystem of partners like Cisco and Intel Security.

An interesting offering to mention is the Logicalis Endpoint Encryption Managed Service. This service gives our customers’ devices, and the data within them, a level of protection that provides peace of mind should a device be lost or stolen, and Logicalis manages the service for them. The service is the market leader for data protection, providing the highest levels of confidentiality, integrity and availability, and is part of the global strategy adopted by Logicalis Group across EMEA.

Category: Automation, Security

Team Logicalis
December 12, 2016

In the year of Brexit and Trump, Scott Reynolds, Hybrid IT practice lead, predicts that electronic digital crime will explode, data privacy breaches will claim scalps, automation will be 2017’s buzzword and the open source movement will challenge profit-making business models in his 2017 tech predictions.

It’s the time of year to engage in the oh-so risky game of making predictions for what is going to be hot for our customers in the coming year. Risky, because stunning twists and turns can take us off course at any point.

After the Brexit referendum and US election results confounded political and polling pundits, life’s certainties appear far less certain, and identifying the big winners in 2017 suddenly seems a less straightforward affair. But as a person who doesn’t mind living life on the edge, I thought I’d take a punt anyway. Here are my top tech predictions for 2017.

Security Breaches – the worst is yet to come

Based on the number of high profile data breaches, 2016 hasn’t been a great year for digital. The stable door has been left open by companies and government departments around the world. Armies of Terminator 2 Cyborgs in the guise of home CCTV cameras are attacking the very infrastructure of the internet. I fear we’re only at the beginning of an escalation of electronic digital crime.

2017 will test the nerve of governments, businesses, citizens and consumers and challenge the perception of digital as a safe and secure way of doing business, unless there’s a massive investment in Fort Knox equivalent defences and white hat skills.

Data Privacy

GDPR (General Data Protection Regulation) emanating from Europe is going to hurt businesses who don’t take data privacy seriously. That is a problem, as evidence suggests companies are unaware of their obligations under this new punitive legislative regime and are taking too long to grab hold of the GDPR tail.

It’s highly possible fines of up to 4% of Global Turnover will take some companies out of business in 2017, and beyond.

One Small Change for Mankind, One Giant Leap Forward for Automation

The IT industry is about to enter a time of mass automation… about time. To our shame, we’ve lagged behind other industries. You can now buy a car that can park itself at the touch of a button, but you need 24 buttons to change the configuration of a router.

Increased levels of automation will manifest in robotic decision-making, the automation of security systems to guard against and respond to an avalanche of security threats, and the automated provisioning of resources in the data centre and network (software defined).

Things Can Only Get Bigger

The Internet of Things is going to get bigger and more impactful. Gartner Group is still predicting that by 2020 there will be 50 billion things connected to the internet – that’s only three years away. In 2017, expect to see mass engagement by businesses in all sectors.

Hopefully we’ll move on from talking about the connected fridge that can order more lettuce when you run out, and recognise that IoT will fundamentally change how industries and organisations operate.

Open Source – Somebody wants to do everything you do for free

Somebody, somewhere, is trying to do what you charge for at much lower cost, or even for free. Open isn’t a thing, it’s a movement. We’re already seeing Open Source technologies impact our industry, with OpenStack becoming the operating system of choice for companies not wanting to ‘pay’ for mainstream software. Open technologies in automation, such as Puppet and Chef, now have a groundswell of support, with communities evangelical about delighting people rather than turning a profit.

We’ve also witnessed a growing willingness to embrace Open Computing technologies. Now, Open isn’t without its complications and ultimately nothing in life is free – operating an open environment is still a complicated affair. But I think we’ll see a lot more traction, with many of our customers taking Open Source seriously, over the next 12 months.

2017 Tech predictions – a risky game

So, those are my top five tech trends for 2017. Now you’re probably decrying – how could I overlook analytics? I haven’t. I fully acknowledge that analytics and data are core to all the above. They will need to be embedded in the very fabric of a business, to bring my predictions to fruition. Otherwise, you can disregard everything I just said. As I said, making predictions is a risky game.
