Digitally Speaking

Richard Simmons
February 18, 2019

Peel away the shiny facade of any great business and you’ll no doubt see the same things. Data. People. Process. This combination, when correctly tuned, is the critical factor in the success of businesses today. So I’m always surprised when I work with clients who haven’t yet discovered this ‘secret sauce’ and who aren’t yet driving value from their data.

Reassuringly, up to 75 percent of CIOs are now harnessing their data effectively, according to Logicalis’ recent global survey of CIOs. In today’s challenging trading conditions, organisations that combine business intelligence (BI) and analytics to enhance customer service, delivery and supply are the ones that will thrive. Never has there been more pressure on CIOs to put data in the driving seat for real competitive advantage. In fact, if CIOs are not driving value from their business intelligence, they may struggle to stay in business.

Quality in, quality out

Successful use of BI and analytics starts with knowing what makes good data. You can provide the most stunning visualisation for a business but the output will only ever be as good as the data going in.

From the customer’s perspective, good data, or rather correct data, is critical. From personalised emails that label you the wrong gender to clumsy sales calls for services you don’t need, we’ve all been on the receiving end of misplaced marketing messages where the data held on us doesn’t quite match up. If customer data is wrong then, undoubtedly, the customer experience is not going to be a good one.

But changes are afoot, driven in part by data governance. GDPR has been a contributory factor in improving the way many businesses handle data. It now costs resources and money to store and secure customer data, so if it’s not adding value then there’s no point keeping it.

From a business perspective, data must be linked to business objectives. Our survey has revealed 25% of CIOs simply don’t know whether benefits are being derived from BI and analytics, suggesting that some CIOs are involved in delivering data projects without fully understanding how they will be used. Similarly, 41% of CIOs stressed that having no clear business brief is a barrier to moving forward with BI and analytics projects.

Manage what you mine

After quality, management is the next critical factor. I have worked with clients whose marketing teams spend as much as 80% of their time creating data sets and only 20% analysing them. With all their time tied up at the front end of the machine, they have little time left to analyse and act on the insights they’re getting. If you can’t make your data work for you, what’s the point in mining it?

In the last two years, 90% of the world’s data has been created. Knowing what to do with it is far from straightforward. I often go into businesses that have so much data they get overwhelmed and don’t know where to start. That’s where curation comes in.

A key challenge in maximising data is knowing which of it to use, and when. When you don’t know what you don’t know, how can you effectively explore the data you produce? Data curation is a relatively new function that, by combining the skills of data scientists and data analysts, allows businesses to determine what information is worth saving and for how long.

It’s the difference between looking for a book on Amazon and going into a Waterstones store. The former will present you with a million options. The latter uses expert sales staff to find out what you’re looking for and guide you to a range of options.

Businesses looking to enable self-service analytics will utilise data curation to give internal teams quicker access to the data they need to make commercial decisions. This could be in the form of an in-house data curator or support from an external team. But the outcome is what’s important.

Make your data project pay from the start

Getting tangible benefits from BI and analytics requires resources, and it requires ongoing investment. And where there is investment, there must be strategy. Joined-up thinking across the business and a laser focus on the return on investment can help businesses avoid common stumbling blocks when starting BI and analytics programmes for the first time.

It’s sadly quite common for organisations to jump into data projects without ensuring the groundwork is in place, with the digital wing of the business introducing new BI and analytics processes only to be tripped up by old business practices in other areas.

There can be a tendency to overlook the ROI of BI and analytics projects, which can be a huge risk to the business. My advice is to break the project down to start driving outputs and getting results you can use quickly. I have seen many organisations stall because they embark on huge, monolithic projects that take years to complete and can become outdated during the time of implementation. This can be easily avoided by adopting agile working practices – going through cycles and iterations that produce results you can start using immediately.

Is respect for data a cultural thing?

BI and analytics are essential for innovation, the pace of which is defined by the pace of valuable insight coming into the business. So developing a data-driven culture is a critical component.

We need to change the way we interact with data, from the customer data we store to the intelligence we mine and use to make better decisions. With a clearer focus on business outcomes, organisations can harvest their data more effectively or even decide whether or not to invest in large BI and analytics projects. I still go into businesses that want to introduce BI and analytics without truly understanding the business challenge. If your business is running effectively then why implement it?

So how do we build a data-driven culture? It must start from the top down. BI and analytics advancement can improve business practices and help make better decisions, but they must be part of, and driven by, strategy.

The cultural shift is already happening. Our understanding and use of data is changing as new blood enters the market. Today’s entrants have grown up with data, at home with functions like Apple Siri and Amazon Alexa, which allow us to get answers really easily. This is truly a data-driven generation.

We’re almost at the tipping point where, for every CIO used to the old world, there is a new data-savvy millennial joining the industry. It’s no longer just about the technology; it’s the people and processes behind the tech that are driving change and changing company culture.

Where we see BI and analytics really pulling their weight is where CIOs aren’t afraid to experiment with their data, using agile working to go through cycles and iterations rather than attempting large scale, monolithic BI. Getting results you can use straight away to improve the business is the secret to bringing the rest of the C-suite on-board for the journey.

Category: Analytics

Justin Price
February 7, 2019

It’s an undeniable fact that Artificial Intelligence (AI) is changing the way we live. From facial recognition and voice technology assistants to self-driving cars, AI has crept into our lives and as consumers we use it without a second thought.

But its impact across a wide range of business sectors is perhaps the hottest topic in tech right now. AI has developed and matured to the stage where, for some functions and operations, the levels of accuracy have overtaken human skills.

Yet with stories of mind-boggling complexity, escalating project timescales and spiralling costs, many business owners still regard the much-hyped technology with confusion, and as a risk. Justin Price, Data Scientist and AI lead here at Logicalis, recently led a webinar about the business reality of AI. One of his key points was that knowing what you want to achieve and setting realistic expectations are the best guarantees of a successful first AI adoption.

Choose the right AI tool for your business need

During my meetings with clients, and from talking to CIOs, it has become apparent that confusion reigns over the terminology used to describe Artificial Intelligence.

There are three key terms at play. All fall under the umbrella of AI and are often used interchangeably, but each has a different meaning. AI itself is technology that learns patterns and other specified behaviours to achieve a set goal. Crucially, it is about producing something which didn’t exist before.

The three terms beneath that umbrella are:

  • Robotic Process Automation (RPA) – software designed to reduce the burden of simple but repetitive tasks done by humans
  • Machine Learning – essentially probability mathematics used to spot patterns in very large samples of data
  • Deep Learning – a neural network approach which mimics the way the human brain works to examine large data sets, including unstructured formats such as HD images and video

Being aware of the subtle differences between these terms allows a greater understanding of which tool will best support your business’s data insight needs.

Make sure your data is up to the job

When delivering an AI project, around 80% of the total effort and time will go into making sure your data is correct. Underestimating the importance of top-quality data is a common pitfall for organisations because, just like any other IT tool, AI will perform poorly if you have low quality data.

Data must be well structured, and it must be in a format that’s consistent and compatible with the AI model. Don’t forget that AI is a process which must be regularly re-trained to ensure accuracy, so ongoing maintenance is essential.
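To make that concrete, here is a minimal sketch, in Python with pandas, of the kind of pre-flight checks you might run before training; the column names, allowed channel values and rules are illustrative assumptions, not a prescribed schema.

```python
# A minimal pre-flight data check before training; column names and the
# allowed 'channel' values are illustrative assumptions, not a real schema.
import pandas as pd

def check_training_data(df: pd.DataFrame) -> list:
    """Return a list of data quality problems found in the frame."""
    problems = []

    # Dates must parse into a single, consistent type.
    parsed = pd.to_datetime(df["order_date"], errors="coerce")
    if parsed.isna().any():
        problems.append(f"{int(parsed.isna().sum())} unparseable dates in 'order_date'")

    # Numeric features must actually be numeric and complete.
    for col in ("order_value", "items"):
        if not pd.api.types.is_numeric_dtype(df[col]):
            problems.append(f"'{col}' is not numeric")
        elif df[col].isna().any():
            problems.append(f"nulls found in '{col}'")

    # Categorical features should only contain known labels.
    allowed = {"online", "store", "phone"}
    unknown = set(df["channel"].dropna().unique()) - allowed
    if unknown:
        problems.append(f"unexpected channel values: {sorted(unknown)}")

    return problems

# Example usage:
# issues = check_training_data(pd.read_csv("orders.csv"))
# if issues:
#     raise ValueError("; ".join(issues))
```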

Brace yourself for complexity

Never underestimate the breadth and complexity of what is involved in building and delivering an AI project. For many CIOs this will be their first experience of AI and, even with the right data, there are many variables at play that can add to both the costs and timescale of implementation.

Working with the right partner is essential to guide you through the first project. We recommend undertaking an initial project with a fixed fee whereby you can deliver a functioning result while you establish trust and credibility with your solution provider.

As well as the importance of good data, another critical factor in delivering a successful AI project is finding a solution that is scalable. It’s one thing to write an AI model on a laptop, but it’s a completely different thing to write a model in a way that will survive deployment across a business. Getting expert advice will help you decide on the right foundation to support your project. This is where Logicalis can add real value. We have the skills and know-how to advise whether your AI initiatives can use existing infrastructure, or whether the AI applications will require new servers with new performance capabilities. And if new infrastructure is needed, we can guide your organisation towards an on-premises or hybrid-cloud expansion.

Know when AI is (and isn’t) the right tool for the job

With a plethora of impressive use cases available to businesses spanning most sectors, it’s no wonder AI is the tech tool of the moment.

But AI is just another technology and won’t be the right choice for every business looking to gain insight from their data. The importance of prioritising people, process and culture in any AI project has already been discussed by my colleague Fanni Vig in a previous blog, and is absolutely essential to ensure your business isn’t trying to use AI where a different tool could deliver the desired results.

At the highest level, AI allows you to work through far larger data sets than previously possible. It can be used to help automate your data workflows, redirecting low-difficulty but high-repetition tasks to bots, which allows the people previously engaged in these tasks to work more efficiently.

This process creates a new way of working that may have greater implications across the business as roles change and skills need to be channelled in different directions.

Introducing AI is a truly cross-business decision. And let’s not forget that, at the most fundamental level, using AI to harness your data is an investment that must show a return.

AI adoption steps

Finally, get advice from the experts

While the impetus to adopt AI may come from the IT department, the results generated can help drive cross-company productivity; help differentiate businesses from their competitors; and delight your customers through a more tailored service. The impact should not be underestimated. But neither should the complexity.

If you’re considering whether AI could help you get more from your data, let Logicalis guide you through your first successful deployment. We will collaborate with you every step of the way to:

  • Help you decide which area(s) of your business will benefit most from AI
  • Help you identify where the relevant data resides and help you access and structure it
  • Deliver a business-ready solution which is scalable to meet your needs
  • Advise on infrastructure requirements
  • Analyse the data to provide rich insights into your business

 

Category: Analytics

Richard Simmons
November 5, 2018

Nowadays, when it comes to digital, employee expectations are at an all-time high. With the industry-wide skills gap proving a difficult problem to solve, often the greatest challenge faced by organisations is not just sourcing the right people, but keeping them.

If your employees are motivated and committed, then individual targets are more likely to be met and business objectives are more likely to be achieved. If not, then your workforce, and available skill pool, is likely to dwindle. In the current climate, attracting and subsequently retaining talent is one of the biggest dilemmas faced by businesses of all sizes.

This is why user experience should be the driving force behind any changes to your digital landscape.

‘The Workplace’- A new definition

There’s no getting away from the fact that things are changing. ‘The Workplace’ used to be similar to an engine room where any and all work was completed. Employees came in at 9am and left at 5pm each day, completing all their assigned tasks whilst staring at the same four walls.

Then along came agile technologies, empowering employees to leave the office and enabling them to work with a level of flexibility that had never been achieved before. They can work at any time and from anywhere. ‘The Workplace’ is no longer a desk in an office. It can be anything an individual would like it to be; whether that is a cafe, the family living room or a park bench. And what’s more, time constraints are a thing of the past. Of course, the dreaded deadlines are still unavoidable but working hours are no longer dictated by a lack of efficient technology.

All in all, the ability to offer some form of flexible working has become an important part of attracting your workforce. In fact, a recent report conducted by global recruitment expert Hydrogen discovered that 81% of people look for flexible working options before joining a new company. The same report also found that 88% of technology professionals consider flexible working to be more important than other benefits, and 60% of those questioned would choose it over a 5% salary increase. It’s no longer optional for businesses – if you don’t invest in the technologies to support flexible working initiatives then talented potential candidates will look elsewhere.

And flexible working doesn’t just play a role in the hiring process; agile technologies and the new workplace can also help businesses retain talent.

If employees can work flexibly, they can better manage their work/life balance. They can choose to work longer – and at a more flexible rate – if they wish, meaning that more experienced generations can educate and pass on specialised skill sets to the younger, up-and-coming workforce. And, thanks to agile technologies, they no longer have to be there physically to do so.

In fact, with flexible working initiatives, if employees are coming into the office it will often be because they’ve chosen to, or because they would like to meet others there. As a result, ‘The Workplace’ is transformed into a social hub, where relationships are formed and maintained, rather than a space where deadlines have to be met. Better relationships with co-workers lead to happier employees who are more invested in the company culture and so less likely to leave.

Listening to the data

There are obvious benefits to flexible working initiatives for employees, but installing agile working technologies also enables businesses to capture informative data.

You see, in our online world, every single action generates data. Every website visited, every message sent, every document created has a record somewhere. For businesses, this data can be used to measure employee activity and ensure that any facilities and services are meeting the standards of today’s workforce.

After all, at the heart of this new digital workplace is user experience. By monitoring the data they are able to collect via agile technologies, businesses can essentially listen to their employees’ needs and ensure that they’re supporting them.

And, if you can work out which environments produce the best results in your workforce then you can look to emulate this elsewhere. For example, if there is a room in the building where people prefer to meet or are able to work more productively, data will reveal this. This intel can then be used to influence and optimise the whole office space so that your business can make the most out of its environment. So agile technologies are not only changing the ‘workplace’ as a concept, they can also be used to change it physically.

The re-definition of ‘the workplace’ has opened up a whole host of opportunities for businesses and employees alike. Empowering your workforce and granting your people an element of choice by enabling them to work how they would like to is important for both attracting and retaining talent. So is updating physical spaces to improve employee satisfaction and – as a direct result – productivity. After all, your workforce is your most valuable asset, and if you invest in them then your entire business will reap the rewards.

Category: Analytics

Justin Price
November 8, 2017

Year by year we are generating increasingly large volumes of data which require more complex and powerful tools to analyse in order to produce meaningful insights.

What is machine learning?

Anticipating the need for more efficient ways of spotting patterns in large datasets en masse, Machine Learning was developed to give computers the ability to learn without being explicitly programmed.

Today, it largely remains a human-supervised process, at least in the development stage. This consists of monitoring a computer’s progress as it works through a number of “observations” in a data set arranged to train it to spot patterns between attributes as quickly and efficiently as possible. Once the computer has started to build a model representing the patterns identified, it goes through a looping process, seeking to develop a better model with each iteration.
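As a rough illustration of that loop, here is a minimal sketch in Python (using numpy, with made-up data): a simple linear model fitted by gradient descent, where each pass over the observations nudges the parameters towards a smaller error.

```python
import numpy as np

# Toy supervised learning: observations (x) with known answers (y).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 7.0 + rng.normal(0, 1, size=100)   # the true pattern, plus noise

w, b = 0.0, 0.0            # the model starts off knowing nothing
learning_rate = 0.01

for iteration in range(5000):
    predictions = w * x + b
    error = predictions - y
    # Each loop nudges the parameters towards a smaller mean squared error,
    # i.e. a slightly better model than the previous iteration produced.
    w -= learning_rate * 2 * np.mean(error * x)
    b -= learning_rate * 2 * np.mean(error)

print(f"learned w={w:.2f}, b={b:.2f} (the data was generated with w=3, b=7)")
```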

How is it useful?

The aim of this is to allow computers to learn for themselves, anticipating fluctuations between variables, which in turn helps us to forecast what may happen in the future. With a computer model trained on a specific data problem or relationship, data professionals can produce reliable decisions and results, leading to the discovery of new insights which would have remained hidden without this analytical technique.

Real-world Examples

Think this sounds like rocket science? Every time you’ve bought something from an online shop and had recommendations based on your purchase – that’s machine learning. Over thousands of purchases, the website has been able to aggregate the data, spot correlations in real users’ buying patterns, and then present the most relevant patterns back to you based on what you viewed or bought. You may see these as “recommended for you” or “this was frequently bought with that”. Amazon and eBay have been doing this for years, and more recently, Netflix.
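A toy version of the idea, assuming a handful of made-up baskets rather than any real retailer’s data, can be sketched with simple co-occurrence counting:

```python
from collections import Counter
from itertools import combinations

# Toy purchase history: each basket is one customer's order.
baskets = [
    {"kettle", "toaster", "mugs"},
    {"kettle", "mugs"},
    {"toaster", "bread bin"},
    {"kettle", "mugs", "tea"},
]

# Count how often each pair of products appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

def frequently_bought_with(product, top_n=3):
    """Rank other products by how often they co-occur with `product`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if product == a:
            scores[b] += count
        elif product == b:
            scores[a] += count
    return scores.most_common(top_n)

print(frequently_bought_with("kettle"))   # e.g. [('mugs', 3), ('toaster', 1), ('tea', 1)]
```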

This sounds fantastic – but where can this help us going forward?

Deep learning

This is distinguished from other data science practices by the use of deep neural networks. This means that the data models pass through networks of nodes, in a structure which mimics the human brain. Structures like this are able to adapt to the data they are processing, in order to execute in the most efficient manner.
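As a very rough illustration of what a “network of nodes” means in practice, here is a minimal sketch in Python of data passing forward through a tiny two-layer network with arbitrary weights; real deep learning systems train many such layers, with millions of parameters, on large data sets.

```python
import numpy as np

def relu(z):
    """Each node only passes its signal on when it is positive."""
    return np.maximum(0, z)

# A tiny network: 4 inputs -> 8 hidden nodes -> 2 output classes.
rng = np.random.default_rng(42)
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    """Pass one observation through the network, layer by layer."""
    hidden = relu(x @ w1 + b1)            # first layer of nodes
    logits = hidden @ w2 + b2             # second layer combines the first
    exps = np.exp(logits - logits.max())  # softmax turns scores into probabilities
    return exps / exps.sum()

sample = np.array([0.2, -1.3, 0.7, 0.05])   # one observation with 4 features
print(forward(sample))                       # two class probabilities summing to 1
```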

Using these leading techniques, some of the examples now look ready to have a profound impact on how we live and interact with each other. We are currently looking at the imminent launch of commercially available real-time language translation, which requires a speed of analysis and processing never available before. Similar innovations have evolved in handwriting-to-text conversion with “smartpads” such as the Bamboo Spark, which bridge the gap between technology and traditional note taking.

Other applications mimic the human components of understanding: classify, recognise, detect and describe (according to SAS.com). This has now entered mainstream use with anti-spam measures on website contact forms, where the software knows which squares contain images of cars or street signs.

Huge leaps are being made in the healthcare industry in particular: at Szechwan People’s Hospital in China, models trained on CT scan images have been “taught” to spot the early signs of lung cancer. This meets a real need, as there is a shortage of trained radiologists to examine patients.

In summary, there have been huge leaps in data analysis and data science in the last couple of years. The future looks bright as increasingly sophisticated techniques are applied to a wider range of real-world issues, tackling previously impossible challenges. Get in touch and let’s see what we can do for you.

Category: Analytics, Automation

Anis Makeriya
August 21, 2017

It’s always the same scenario: someone gives me some data files that I want to dive straight into, exploring ways to visually depict them, but I can’t.

I’d fire up a reporting tool only to step right back, realising that for data to get into visual shapes, it needs to be in shape first! One pattern has appeared consistently over the years: the less time spent on ETL/ELT (Extract, Transform and Load, in varying sequences), the faster I find myself pushed back out of the reporting layer and into data prep.

Data preparation for the win

‘80% of time goes into data prep’ and ‘garbage in, garbage out (GIGO)’ are sayings that have been around for some time, but they don’t really hit you until you face them in practice and they suddenly translate into ‘backward progress’. Data quality issues range from inconsistent date formats and multiple spellings of the same value to values not existing at all, in the form of nulls. So how can they all be dealt with? A data prep layer is the answer.
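As a minimal illustration of what a data prep layer actually does, here is a sketch in Python using pandas; the column names and values are invented purely to show the classic problems being handled.

```python
import pandas as pd

# A toy extract with the classic problems: unparseable dates,
# several spellings of the same value, and nulls.
raw = pd.DataFrame({
    "signup_date": ["2017-03-01", "2017-04-01", "not a date", None],
    "country":     ["UK", "United Kingdom", "U.K.", "uk"],
    "spend":       [120.0, None, 85.5, 60.0],
})

clean = raw.copy()

# Dates: coerce to one canonical type; anything unparseable is surfaced as NaT.
clean["signup_date"] = pd.to_datetime(clean["signup_date"], errors="coerce")

# Spellings: collapse the variants onto a single value.
clean["country"] = (clean["country"]
                    .str.upper()
                    .str.replace(".", "", regex=False)
                    .replace({"UNITED KINGDOM": "UK"}))

# Nulls: decide explicitly what a missing value should mean.
clean["spend"] = clean["spend"].fillna(0.0)

print(clean)
```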

Often, with complex transformations or large datasets, analysts find themselves turning to IT to perform the ETL process. Thankfully, over the years, vendors have recognised the need to include commonly used transformations in the reporting tools themselves. Tools such as Tableau and Power BI, to name a few, have successfully passed this power on to the analysts, dramatically cutting the time to analysis. Features such as pivoting, editing aliases, and joining and unioning tables are available within a few clicks.
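The same building blocks are easy to picture in code. As a rough pandas equivalent of what Tableau and Power BI now expose through the interface (with invented data), unioning, joining and pivoting look like this:

```python
import pandas as pd

orders_uk = pd.DataFrame({"order_id": [1, 2], "product": ["kettle", "mugs"], "revenue": [25, 12]})
orders_de = pd.DataFrame({"order_id": [3], "product": ["kettle"], "revenue": [27]})
products  = pd.DataFrame({"product": ["kettle", "mugs"], "category": ["appliances", "kitchenware"]})

# Union: stack the two regional extracts into one table.
orders = pd.concat([orders_uk.assign(region="UK"),
                    orders_de.assign(region="DE")], ignore_index=True)

# Join: enrich each order with its product category.
enriched = orders.merge(products, on="product", how="left")

# Pivot: revenue by category and region, ready for the reporting layer.
summary = enriched.pivot_table(values="revenue", index="category",
                               columns="region", aggfunc="sum")
print(summary)
```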

There may also be times when multiple data sources need joining, such as matching company names. Whilst Excel and SQL fuzzy look-ups have existed for some time, dedicated ETL tools such as Paxata have embedded further intelligence, enabling them to go a step further and recognise that the solution lies beyond simply having similar spellings between the names.
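A simple flavour of fuzzy matching, using only Python’s standard library rather than a dedicated tool like Paxata, might look like the sketch below; the company names and the normalisation rules are illustrative assumptions.

```python
from difflib import get_close_matches

# Company names from two systems that refuse to join exactly.
crm_names = ["Logicalis UK Ltd", "Acme Widgets Limited", "Northwind Traders"]
finance_names = ["Logicalis U.K. Limited", "ACME Widgets Ltd.", "Northwind Trading Co"]

def normalise(name: str) -> str:
    """Strip the noise that defeats exact joins before comparing."""
    cleaned = name.lower().replace(".", "").replace(",", "")
    for suffix in (" limited", " ltd", " co"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return cleaned.strip()

lookup = {normalise(n): n for n in crm_names}

for name in finance_names:
    # cutoff sets how similar a candidate must be before we accept the match.
    match = get_close_matches(normalise(name), lookup.keys(), n=1, cutoff=0.8)
    print(name, "->", lookup[match[0]] if match else "no confident match")
```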

All the tasks mentioned above relate to the ‘T’ (Transform) of ETL, which is only the second or third step in the ETL/ELT process! If data can’t be extracted as part of the ‘E’ in the first place, there is nothing to transform. When information lies in disparate silos, it often cannot be ‘merged’ unless the data is migrated or replicated across stores. Following the data explosion of the past decade, Cisco Data Virtualisation has gained traction for its core capability of creating a ‘merged’ virtual layer over multiple data sources, enabling quick access as well as the added benefits of data quality monitoring and a single version of the truth.

These capabilities are now even more useful with the rise of data services like Bloomberg/forex feeds and APIs that can return weather information; and if we also want to know how people feel about the weather, the Twitter API works too.
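The pattern of blending an external data service with your own data is straightforward; in the sketch below the weather endpoint, its parameters and its response shape are hypothetical placeholders, standing in for whichever service you actually subscribe to.

```python
import requests
import pandas as pd

# Hypothetical endpoint and response shape, purely to show the pattern;
# substitute the data service you actually subscribe to.
WEATHER_URL = "https://api.example-weather.com/v1/daily"

def fetch_weather(city: str, date: str) -> dict:
    response = requests.get(WEATHER_URL, params={"city": city, "date": date}, timeout=10)
    response.raise_for_status()
    return response.json()   # assumed to look like {"temp_c": 14.2, "rain_mm": 3.1}

sales = pd.DataFrame({"date": ["2017-08-01", "2017-08-02"],
                      "city": ["London", "London"],
                      "units": [120, 95]})

# Enrich each sales row with the weather on that day, then analyse the blend.
weather = pd.DataFrame([fetch_weather(row.city, row.date) for row in sales.itertuples()])
enriched = pd.concat([sales, weather], axis=1)
print(enriched)
```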

Is that it..?

Finally, after the extraction and transformation of the data, the load process is all that remains… but even that comes with its own challenges: load frequencies; load types (incremental vs. full loads) depending on data volumes; data capture (changing dimensions) to give an accurate picture of events; and storage and query speeds from the source, to name a few.
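To make the incremental-versus-full distinction concrete, here is a minimal sketch in Python using SQLite as a stand-in for the source and target systems; the table, column names and high-water-mark approach are illustrative assumptions.

```python
import sqlite3

# Incremental load using a high-water mark on a timestamp column.
# Assumes a source database with an 'orders' table; names are illustrative.
source = sqlite3.connect("source.db")
target = sqlite3.connect("warehouse.db")

target.execute("""CREATE TABLE IF NOT EXISTS orders (
    order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL, updated_at TEXT)""")

# 1. Find how far the last load got.
watermark = target.execute("SELECT MAX(updated_at) FROM orders").fetchone()[0] \
            or "1970-01-01 00:00:00"

# 2. Pull only the rows that changed since then (incremental, not a full reload).
changed = source.execute(
    "SELECT order_id, customer, amount, updated_at FROM orders WHERE updated_at > ?",
    (watermark,),
).fetchall()

# 3. Upsert into the target so that re-running the load stays idempotent.
target.executemany(
    "INSERT OR REPLACE INTO orders (order_id, customer, amount, updated_at) VALUES (?, ?, ?, ?)",
    changed,
)
target.commit()
print(f"Loaded {len(changed)} changed rows since {watermark}")
```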

Whilst a capable analyst with best-practice knowledge will suffice for quick analysis, scalable, complex solutions need the right team from both the IT and non-IT sides, in addition to the tools and hardware to support them smoothly going forward. Contact us today to help you build a solid Data Virtualisation process customised to your particular needs.

Fanni Vig
April 20, 2017

Finally, it’s out!

With acquisitions like Composite, ParStream, Jasper and AppDynamics, we knew something was bubbling away in the background for Cisco with regards to edge analytics and IoT.

Edge Fog Fabric – EFF

The critical success factor for IoT and analytics solution deployments is providing the right data, at the right time, to the right people (or machines).

With the exponential growth in the number of connected devices, the marketplace requires solutions that simultaneously provide data-generating devices, communication, data processing and data-leveraging capabilities.

To meet this need, Cisco recently launched a software solution (predicated on hardware devices) that encompasses all the above capabilities and named it Edge Fog Fabric aka EFF.

What is exciting about EFF?

To implement high-performing IoT solutions that are cost effective and secure, a combination of capabilities needs to be in place.

  • Multi-layered data processing, storage and analytics – given the rate of growth in the number of connected devices and the volume of data, bringing data back from devices to a DV environment can be expensive. Processing information on the EFF makes this a lot more cost effective.
  • Micro services – a standardised framework for data processing and communication services that can be programmed in standard programming languages such as Python, Java, etc.
  • Message routers – effective communication between the various components and layers. Without state-of-the-art message brokerage, no IoT system can be secure and scalable in providing real-time information (a generic sketch of this pattern follows the list).
  • Data leveraging capabilities – ad hoc, embedded or advanced analytics capabilities will support BI and reporting needs. With the acquisitions of Composite and AppDynamics, EFF will enable an IoT platform to connect to IT systems and applications.
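As a generic illustration of the message-routing idea referenced above, and not of the EFF micro-service API itself, the sketch below shows an edge process publishing sensor readings to a broker over MQTT; it is written against the paho-mqtt 1.x client, and the broker address and topic are placeholders.

```python
# Generic edge-to-broker messaging over MQTT via the paho-mqtt library
# (1.x-style client constructor); broker address and topic are placeholders,
# and this illustrates the pattern rather than the EFF micro-service API.
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"     # your message router / broker
TOPIC = "plant1/line3/temperature"

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()

for _ in range(10):
    reading = {"sensor": "temp-42",
               "celsius": round(20 + random.random() * 5, 2),
               "ts": time.time()}
    # Publish a small, structured message; downstream services subscribe to the topic.
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(1)

client.loop_stop()
client.disconnect()
```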

What’s next?

Deploying the above is no mean feat. According to Gartner’s view of the IoT landscape, no organisation has yet achieved the panacea of connecting devices to IT systems and vice versa, combined with the appropriate data management and governance capabilities embedded. So there is still a long road ahead.

However, with technology advancements such as the above, I have no doubt that companies and service providers will be able to accelerate progress and deliver further use cases sooner than we might think.

Based on this innovation, two obvious next steps stand out:

  • Further automation – automating communication, data management and analytics services including connection with IT/ERP systems
  • Machine made decisions – once all connections are established and the right information reaches the right destination, machines could react to information that is shared with ‘them’ and make automated decisions.

Scott Hodges
April 18, 2017

Attending a recent IBM Watson event, somebody in the crowd asked the speaker, “So, what is Watson?” It’s a good question – and one there isn’t really a straightforward answer to. Is it a brand? A supercomputer? A technology? Something else?

Essentially, it is an IBM technology that combines artificial intelligence and sophisticated analytics to provide a supercomputer named after IBM’s founder, Thomas J. Watson. While interesting enough, the real question, to my mind, is this: “What sort of cool stuff can businesses do with the very smart services and APIs provided by IBM Watson?”

IBM provides a variety of services, available through Application Programming Interfaces (APIs), that developers can use to take advantage of the cognitive elements and power of Watson. The biggest challenge to taking advantage of these capabilities is to “think cognitively” and imagine how they could benefit your business or industry to give you a competitive edge – or, for not-for-profit organisations, how they can help you make the world a better place.

I’ve taken a look at some of the APIs and services available to see some of the possibilities with Watson. It’s important to think of them collectively rather than individually, as while some use-cases may use one, many will use a variety of them, working together. We’ll jump into some use-cases later on to spark some thoughts on the possibilities.

Natural Language Understanding

Extract meta-data from content, including concepts, entities, keywords, categories, sentiment, emotion, relations and semantic roles.

Discovery

Identify useful patterns and insights in structured or unstructured data.

Conversation

Add natural language interfaces such as chat bots and virtual agents to your application to automate interactions with end users.

Language Translator

Automate the translation of documents from one language to another.

Natural Language Classifier

Classify text according to its intent.

Personality Insights

Extract personality characteristics from text, based on the writer’s style.

Text to Speech and Speech to Text

Process natural language text to generate synthesised audio, or render spoken words as written text.

Tone Analyser

Use linguistic analysis to detect the emotional (joy, sadness, etc.), linguistic (analytical, confident, etc.) and social (openness, extraversion, etc.) tone of a piece of text.

Trade-off Analytics

Make better choices when analysing multiple, even conflicting goals.

Visual Recognition

Analyse images for scenes, objects, faces, colours and other content.

All this is pretty cool stuff, but how can it be applied to work in your world? You could use the APIs to “train” your model to be more specific to your industry and business, and to help automate and add intelligence to various tasks.

Aerialtronics, which develops, produces and services commercial unmanned aircraft systems, offers a nice example use case of visual recognition in particular. Essentially, the company teams drones, an IoT platform and Watson’s Visual Recognition service to help identify corrosion, serial numbers, loose cables and misaligned antennas on wind turbines, oil rigs and mobile phone towers. This helps it automate the process of identifying faults and defects.

Further examples showing how Watson APIs can be combined to drive powerful, innovative services can be found on the IBM Watson website’s starter-kit page.

At this IBM event, a sample service was created, live in the workshop. This application would stream a video, convert the speech in the video to text, and then categorise that text, producing an overview of the content being discussed. The application used the speech-to-text and natural language classifier services.
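A rough sketch of that pipeline, written against IBM’s current Python SDK (ibm-watson) rather than the tooling used in the workshop, might look like the following; the API key and service URLs are placeholders, and the Natural Language Understanding categories feature stands in for a trained Natural Language Classifier.

```python
# Sketch of the workshop pipeline with the ibm-watson Python SDK; the API key
# and service URLs below are placeholders, and Natural Language Understanding
# categories stand in for the classifier used on the day.
from ibm_watson import SpeechToTextV1, NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, CategoriesOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")

stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("https://api.eu-gb.speech-to-text.watson.cloud.ibm.com")

nlu = NaturalLanguageUnderstandingV1(version="2021-08-01", authenticator=authenticator)
nlu.set_service_url("https://api.eu-gb.natural-language-understanding.watson.cloud.ibm.com")

# 1. Turn the audio track of the video into text.
with open("conference_talk.flac", "rb") as audio:
    stt_result = stt.recognize(audio=audio, content_type="audio/flac").get_result()
transcript = " ".join(chunk["alternatives"][0]["transcript"]
                      for chunk in stt_result["results"])

# 2. Categorise the transcript to produce an overview of what was discussed.
nlu_result = nlu.analyze(
    text=transcript,
    features=Features(categories=CategoriesOptions(limit=3)),
).get_result()

for category in nlu_result["categories"]:
    print(category["label"], category["score"])
```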

Taking this example further with a spot of blue sky thinking, for a multi-lingual organisation, we could integrate the translation API, adding the resulting service to video conferencing. This could deliver near real-time multiple dialect video conferencing, complete with automatic transcription in the correct language for each delegate.

Customer and support service chat bots could use the Conversation service to analyse tone. Processes such as flight booking could be fulfilled by a virtual agent using the ‘Natural Language Classifier’ to derive the intent in the conversation. Visual recognition could be used to identify production line issues, spoiled products in inventory or product types in retail environments.

Identification of faded colours or specific patterns within scenes or on objects could trigger remedial services. Detection of human faces, their gender and approximate age could help enhance customer analysis. Language translation could support better communication with customers and others in their preferred languages. Trade-off Analytics could help optimise the balancing of multiple objectives in decision making.

This isn’t pipe-dreaming: the toolkit is available today. What extra dimensions and capabilities could you add to your organisation, and the way you operate? How might you refine your approach to difficult tasks, and the ways you interact with customers? Get in contact today to discuss the possibilities.

Alastair Broom
December 10, 2016

I was recently asked what I think will be three things making an impact on our world in 2017, with a few permutations of course:

Maximum of 3 technologies that will be significant for enterprises in terms of driving value and transforming business models and operations in 2017

Innovations that are most likely to disrupt industries and businesses

I’ve put my three below – it would be great to hear your thoughts and predictions in the comments!

Internet of Things

The Internet of Things is a big one for 2017. Organisations will move from exploring ideas around what IoT means for them in theory, to rolling out sensors across key opportunity areas and starting to gather data from what were previously “dark assets”. The reason IoT is so important is the amount of data the things will generate, and the new insight this gives to organisations, including physical asset utilisation and optimisation and proactive maintenance. Those organisations that take the IoT seriously are going to see their customers, their data and their opportunities in completely new ways. Being able to add more and more data sources into the “intelligence stream” means decisions are backed by more facts. It’s Metcalfe’s Law – the value of the network is proportional to the square of the number of users. Data is the network, and each thing is another user.

Being prepared to exploit the IoT opportunity though, especially at scale, will take proper planning and investment. Organisations will need a strategy to address the IoT, one that identifies quick wins that help build the business case for further IoT initiatives. The correct platform is key: an infrastructure for things. The platform that forms the basis for the connectivity of the things to the network will need to be robust, will likely be a mix of wired and wireless, and, because it is unlikely to be a separate infrastructure, it needs to have the required visibility and control to ensure data is correctly identified, classified and prioritised.

Security too will be fundamental. Today the things are built for user convenience, with security a secondary concern. What the IoT then represents is a massively increased attack surface, one that is particularly vulnerable to unsophisticated attack. The network will therefore need to be an integral part of the security architecture.

Edge Analytics

Edge analytics is another one to look out for. As the amount of data we look to analyse grows exponentially, the issue becomes twofold. One, what does it cost to move that data from its point of generation to a point of analysis? Bandwidth doesn’t cost what it used to, but paying to transport TB and potentially PB of information to a centralised data processing facility (the data centre, that is) is going to add significant cost to an organisation. Two, having to move the data, process it, and then send an action back adds lag. The majority of data we have generated to this point has been for systems of record, where a lag to actionable insight may very well be acceptable in many scenarios. But as our systems change to systems of experience, or indeed systems of action, lag is unacceptable.

Analytics at the edge equates to near real-time analytics. The value of being able to take data and its context in real time, analyse it alongside potentially multiple other sources of data, and then present back highly relevant, in-the-moment intelligence: that’s amazing. Organisations once again need to ensure the underlying platform is up to the task: able to capture the right data, maintain its integrity, conform to privacy regulations and manage the data throughout its lifecycle. Technology will be needed to analyse the data at its point of creation; essentially you will need to bring compute to the data (and not the other way round, as is typically done today).
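A minimal sketch of that edge-side logic, assuming a one-reading-per-second sensor and an invented alert threshold, might look like this: summaries and anomalies travel onwards, while the raw stream stays at the edge.

```python
from collections import deque
from statistics import mean

# Illustrative edge-side logic: keep a short rolling window of readings,
# forward a compact summary periodically, and only ship full detail when
# something looks abnormal. Threshold and payload shapes are assumptions.
WINDOW = deque(maxlen=60)          # the last 60 one-second readings
ALERT_THRESHOLD = 85.0             # e.g. bearing temperature in celsius

def on_new_reading(value, send):
    """Process each sensor reading where it is generated, not in the data centre."""
    WINDOW.append(value)

    if value > ALERT_THRESHOLD:
        # Near real-time action: ship the anomaly immediately, with its context.
        send({"type": "alert", "value": value, "recent": list(WINDOW)})
    elif len(WINDOW) == WINDOW.maxlen:
        # Otherwise send one summary line instead of 60 raw readings.
        send({"type": "summary", "mean": round(mean(WINDOW), 2),
              "max": max(WINDOW), "min": min(WINDOW)})
        WINDOW.clear()

# Example with a stand-in transport: a reading over the threshold is sent on immediately.
on_new_reading(91.7, send=print)
```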

Cognitive Systems

Lastly, cognitive systems. Computers to this point have been programmed by humans to perform pretty specific tasks. Cognitive systems will not only “learn” what to do from human interaction, but from the data they generate themselves, alongside the data from other machines. Cognitive systems will be continually reprogramming themselves, each time getting better and better at what they do. And what computers do is help us do things humans can do, but faster. Cognitive systems will expand our ability to make better decisions, to help us think better. They move us from computing systems that have essentially been built to calculate really fast, to systems that are built to analyse data and draw insights from it. This extends to being able to predict outcomes based on current information and the consequences of actions. And because it’s a computer, we can use a far greater base of information from which to draw insight. We humans are really bad at remembering a lot of information at the same time, but computers (certainly for the short term) are only constrained by the amount of data we can hold in memory to present to a compute node for processing.
