Digitally Speaking
Data lake benefits

Team Logicalis
August 28, 2019

When implemented correctly, a data lake can deliver real benefits, helping organisations efficiently extract business value from their data environment.

View our infographic to see how a governed data lake could help you gain control of your data.

Data lake benefits

  • Easier access to a broad range of data across the organisation
    With a governed data lake, users can access structured and unstructured data located both on premises and in the cloud. They can access what they need, when they need it, without sending time-consuming requests to IT.
  • Faster data preparation
    A data lake can accelerate data preparation in several ways. For example, implementing a data catalogue helps increase knowledge and understanding of the data, which in turn accelerates data preparation. In addition, building a data lake with a hybrid cloud infrastructure gives organisations the ability to store data on the platform most appropriate for its use. When data is placed in its ideal location, it takes less time to locate and access it, thereby speeding up data preparation and reuse efforts.
  • Enhanced agility
    Faster data preparation lets users explore more. Components of the data lake can be employed as a sandbox that enables users to build and test analytics models with greater agility. They can experiment with analytics and, in some cases, “fail fast” to move on to the most productive avenues more quickly.
  • More accurate insights, stronger decisions
    By providing access to more data, accelerating data preparation and letting users experiment with data, a data lake can help organisations generate more accurate insights. A well-constructed data lake can also track data lineage to help ensure data is trustworthy. All of these capabilities help organisations make better business decisions.

Read more: Five data lake myths

If you’re thinking it’s time to make waves with data in your business, register for a Logicalis Data Services Workshop.  We’ll carry out a robust gap analysis exercise to help you define the practical next steps towards a successful governed data approach – and help you avoid the dreaded data swamp!

Find out more 

Category: Security


Team Logicalis
August 23, 2019

Data has the ability to create new opportunities and revenue models for organisations, but IT leaders are still grappling with how to manage all the data being generated and drive useful insights from it. For good reason, data lakes continue to get a lot of attention; however, misconceptions remain. Read these five data lake myths before taking the plunge….

Myth one: A data lake is a product which you can buy

A data lake is a reference architecture that is independent of technology. It’s an approach an organisation can use to put data at the heart of its operation, encompassing governance, quality and management of data, and thereby enabling self-service analytics that empower all consumers of data.

As helpful as it would be, a data lake is not a product that you can just purchase. You can’t just buy any data warehouse solution and call it a data lake.

Myth two: There is only one data lake solution

A data lake can be built on many different relational database management systems – you’re not tied to the prominent names; there are plenty of vendors and systems available.

A data lake combines a variety of technologies to establish systems of insight to provide agile data exploration for data scientists to address business needs.

Myth three: Data lakes are for dumping data (and forgetting about governance)

While software and hardware are key components of a data lake solution, equally important is the cataloguing of data, quality of data, and data governance and management processes.

Just as some data warehouses have become massive black holes from which vast amounts of data never escape, a data lake can become a data swamp if good governance policies are not applied.

All data in a data lake must be catalogued, accessible, trusted, and usable; active governance, quality and information management are indispensable parts of the data lake.

Myth four: Delivering access to the data lake is a measure of success

Having data in a central location is not a true analytics solution. The goal is to run data analyses that produce meaningful business insights; to uncover new revenue streams, customer retention models or product extensions.

But that data must be trusted, relevant, and available for all consumers of data. A data lake needs an intelligent metadata catalogue that can relate cryptically coded data to business terminology, making it more understandable with context. It should also attribute the source and quality of data from both structured and unstructured information assets, with a governance fabric to ensure that information is protected, standardised, efficiently managed, and trustworthy.
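
To make that less abstract, here is a minimal, hypothetical sketch of what a single catalogue entry might hold. The field names, values and the small search helper are illustrative assumptions, not the schema of any particular catalogue product.

# Hypothetical catalogue entry: field names and values are illustrative only,
# not the schema of any specific metadata catalogue product.
catalogue_entry = {
    "technical_name": "CUST_TBL.C_STAT_CD",            # cryptic source column
    "business_term": "Customer account status",         # business-friendly label
    "definition": "Current lifecycle stage of the customer account",
    "source_system": "CRM (on-premises)",                # lineage: where it came from
    "quality_score": 0.97,                               # e.g. completeness/validity checks
    "steward": "customer-data-team@example.com",         # who curates and owns it
    "sensitivity": "Personal data - in scope for GDPR",  # governance classification
}

def search_catalogue(term, catalogue):
    """Return entries whose business term mentions the search word."""
    return [e for e in catalogue if term.lower() in e["business_term"].lower()]

print(search_catalogue("customer", [catalogue_entry]))

The value is not the dictionary itself but the discipline it represents: every dataset in the lake carries its business meaning, origin, quality and ownership with it.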

Myth five: The data lake is a replacement for a data warehouse

The data lake can incorporate multiple enterprise data warehouses (EDW), plus other data sources such as those from social media or IoT. These all come together in the data lake where governance can be embedded, simplifying trusted discovery of data for users throughout the organisation.

Therefore, a data lake augments EDW environments, empowering data scientists and analysts to easily explore their data, discover new perspectives and insights, and accelerate innovation and business growth.

Thanks to mobile devices, apps and IoT, the amount of unstructured data is growing exponentially, so it’s no wonder the demand for data storage is intensifying. According to IDC, data lake adoption is expected to rise from 30% of organisations worldwide to 90% within the next three years. To see if this approach could be a solution for your organisation, download our latest infographic. And if you’d like to know more and identify some practical next steps, why not register for one of our Data Services Workshops?

Find out more 


Team Logicalis
August 16, 2019


Data, data everywhere but not a drop to use, that’s how the poem goes, doesn’t it? Something like that anyway…. Is a data lake the answer to the challenge of making the most of ever-growing data volumes?  Organisations we’re working with are increasingly exploring this approach to address demands for an agile yet secure and well-governed data environment.

What is a data lake?

A data lake is a shared data environment that comprises multiple repositories and capitalises on big data technologies.  Unlike a data warehouse, a data lake uses a flat architecture that keeps data in its native format until it’s needed. It allows rapid landing and storage of data, and it provides ready, unfettered self-service access to data for analysis. Comprehensive governance capabilities help ensure data can be easily found, understood and stored without duplication.

For more information on the difference between a data lake and a data warehouse, we like this ‘super simple data lake explanation’ from Forbes.

How might a data lake address some of your data challenges?

A well-governed data lake will:

  • Enable swift access to a full range of data in a timely fashion for business users, analysts, data scientists and developers, so they can generate accurate, meaningful insights
  • Accelerate analytics and data preparation to keep up with the speed of business
  • Facilitate collaboration between knowledge workers, data scientists (when deeper analytic capabilities are required) and data engineers (when it’s time to deploy data lake–based applications to line-of-business users).
  • Ensure data quality, security and governance to provide users with trusted, understandable data while protecting privacy and maintaining compliance with regulations
  • Accommodate a rapidly growing collection of data with a scalable approach to storage that supports fast-growing data volumes cost-effectively while maximizing existing resources.

Download our infographic to see how a data lake could help your business

Considering taking the plunge into the data lake? Sign up to one of our Data Services Workshops where our data experts will get to grips with your current data environment and infrastructure, your business challenges and data goals, and then give you practical advice and achievable milestones to help you meet your data aspirations.

Find out more  

Team Logicalis
January 2, 2019

Business leaders around the world are discussing a common dilemma: how to create transformative and creative experiences and business models that improve their customers’ lives, drive growth and boost profitability.

What was once a predictable operating landscape, where five-year strategies punctuated by major IT investment were the norm, is now fast being replaced by a rapid series of digital business initiatives, platform redesigns and line-of-business-driven demands. The inexorable rise in customer expectations, characterised by a growing demand for seamless, on-demand experiences and real-time feedback when expectations aren’t met, is forcing businesses to fundamentally rethink the role IT plays in supporting staff, partners and customers in the pursuit of service improvement.

For most companies, the strategic imperative should not just be doing things right and more efficiently. Leaders must also determine the right things to do. Now is the time to reflect and consider the fundamentals of the value created by technology investment. Customer choice has never been more readily available; customers’ ability to select services at a global scale, and to switch to alternative solutions, is a reality every business needs to address.

New disruptive pure-play companies whose businesses have been conceived and built in a digital world have arrived. Faster, simpler and better optimised, these businesses are outpacing the legacy players. Category disruptors like AirBnB or Uber have successfully challenged existing business models and, by doing so, changed their sectors forever. Today’s competitive marketplace requires a different and deeper level of change: reinvention of core services, transformation of process and the wider integration of technology and frameworks across open platforms.

The broadly accepted premise that the Cloud could provide the new levels of innovation required to transform business is certainly true for those willing to go “all-in” and develop new service offerings born and built in the cloud. However, transforming an established business into a Cloud native, agile and inventive organisation, requires not only a deep-rooted commitment at board level, but also a solid foundation of legacy intelligence, securely linked at every level to ensure business continuity and overall customer service integrity.

The big questions

The process of digital transformation starts with a very honest and open appraisal of the business today; what do you do well? Which services could generate greater profits or improve customer retention, if only technology didn’t get in the way? How could you unlock new revenue streams by reducing the time to market for new services, and by doing so reach the target audience before the competition?

Cloud Computing has fuelled some of the most successful business transformations in recent decades, not because of its cost or scale, but rather its ability to break down traditional methods of service and application development into collaborative, visual and user-accessible environments. However, as with all things, this new-found agility comes with an increased exposure to the unknown: moving to cloud, developing on cloud and ultimately managing cloud requires a completely fresh approach to IT operations and service management.

As such, many early adopters are finding cloud management a challenge they hadn’t fully prepared for. But help is at hand if it feels like understanding and navigating your way through this new landscape is a full-time job. Instead of tackling this transformation by yourself, you can enlist the help of a company that can underpin and enable your journey so your move to the cloud is as stress free as possible.

Category: Hybrid IT

Team Logicalis
May 10, 2018

As the benefits of hybrid IT have become clear, it has evolved from a temporary state to the chosen environment for many organisations looking to thrive.

Every organisation, regardless of size or sector, has a digital strategy. In fact, it’s hard to believe that IT once lingered on the fringes of business operations and decisions when today it is front and centre – a driving force behind both individual projects and overall business objectives.

And for the vast majority of organisations, it’s difficult to speak about digital strategy without mentioning cloud.

In fact, cloud’s ever-growing role and potential benefits are so widely publicised that it can feel almost unavoidable. After all, if your competitors adopt ‘cloud first’ strategies and you choose not to, don’t you risk getting left behind?

But, cloud doesn’t have to be all or nothing…

Enter hybrid IT

With hybrid IT, organisations can bring in cloud-based services that will run in parallel with their existing on-premise hardware. This may not necessarily be a new concept. However, its full potential is rarely realised.

Instead, more often than not, hybrid IT is built into digital strategies as a stepping stone to cloud and, as such, considered the transitional phase on a much bigger journey. It’s useful, but it’s also temporary… Simply a vehicle to get you and your organisation where you need to be by enabling you to join the elite and become a ‘cloud first’ company.

And it’s true, hybrid IT is a very useful tool for organisations looking to make the first small steps into a new cloud-centric world. You can test the waters by investing in new cloud-based technologies, without being all-in.

But hybrid IT can also open up the door to a whole new world of possibilities, enabling businesses to operate in, and therefore reap the benefits associated with, both on-premise and cloud environments.

Many organisations have started to realise this and, as a result, the concept of hybrid IT is transitioning from a temporary phase to a preferred way to do business. In fact, according to Gartner, by 2020 90% of organisations will have adopted a hybrid infrastructure.

But what are the key drivers behind this shift?

The best of both worlds

Traditional IT or cloud technologies… it used to be an either-or choice that organisations had to make. And once you made it, all your application workloads and databases were assigned to one environment. You were effectively tied into that environment until you actively decided to change and, with significant effort and likely financial cost, took steps to convert.

But, by using hybrid IT, organisations no longer have to commit to a single environment. They can have the best of both worlds and benefit from aligning specific workloads and applications to specific platforms. Hybrid IT grants:

The scalability and cost efficiency of cloud technologies

There’s no doubt about it, scaling a traditional infrastructure can be very expensive. By making the most of hybrid IT, and utilising both public and private cloud environments, businesses can upscale IT operations quickly and at minimal cost – which is particularly useful for shorter-term projects. But it doesn’t stop there… with hybrid IT, organisations can also downscale their operations. In effect, everything can be driven to reflect the actual demands being placed upon the business, saving both resource and money.

And if organisations are saving resource in those areas, it leaves more room for innovation. The exciting new projects that often have to be pushed aside due to more pressing concerns, such as keeping the lights on, can become a reality.

The security and consistency of traditional IT

IDC recently discovered that the number one barrier to adopting cloud-based technologies continues to be security concerns – with 72% of business leaders agreeing it was a real issue. Whether these fears are well founded or not, the fact remains that, sometimes, public cloud may not meet an organisation’s strict security requirements. Hybrid IT grants peace of mind, enabling organisations to benefit from cloud technologies while keeping their most valuable and sensitive data on premise.

Hybrid IT is also often used in disaster recovery strategies. In our digital world, suffering an IT outage is every organisation’s worst nightmare. Why? Because the downtime that organisations suffer as a result can have a devastating and lasting impact, both financially and in terms of future reputation. By having primary data copied and stored in two different locations, organisations can recover faster while keeping downtime to a minimum.

Above all, hybrid IT gives organisations the freedom to make their own choices. It merges the best of old world technology with new world thinking. And, just as digital is no longer the sole territory of IT departments, it’s set to infiltrate the boardroom and play a key role in all future business decisions.

After all, hybrid IT is an enabler, allowing business leaders to make the right digital decision for their business, whether that is traditional IT, cloud-based technologies or a mixture of the two.

Contact us to find out more about Hybrid IT and how we can help you leverage it

Originally posted on Information Age, 24 April 2018

 

Category: Hybrid IT


Team Logicalis
February 20, 2018

Digital Workplace is not a one size fits all approach

“Workplace of the Future”, “Digital Workplace” and “Future Digital Employee Experience” are definitely the latest buzz phrases in the HR world! Today we work anywhere at almost any time. We work in cafeterias, airport waiting areas, cars, with mobile devices, desktops, laptops, tablets, and the list goes on.

Delivering a great digital workplace in today’s business environment is tricky, yet it’s one of the most important things a business can achieve – not to mention the need to support simultaneous communication across different devices, regardless of geographic location.

Nonetheless, not many businesses have identified the right tools and processes to create an effective and user-friendly modern workplace. Considering how diverse and geographically scattered organisations are, it is easy to understand why businesses need a tailored approach, one focused on retaining talent and ensuring employee wellbeing and engagement, all of which are greatly supported by the collaboration tools employees are given.

Digital Workplace in 2018 is all about Engagement, Experience, and Empowerment

Increasing Employee Engagement
Engagement means all employees can feel connected and part of the same team, no matter where they are located.

Engaged employees can increase a company’s performance by up to 202%
(Engage for Success, ‘The importance of employee engagement’, August 2016).

With advances in technology, from software-defined WAN to tailored applications, co-workers are able to stay in constant communication and feel part of the team, wherever they are based. The easy-to-use, natural interfaces of today’s UC application tools allow employers to maximise adoption and bring measurable engagement through built-in analytics capabilities.

Elevating Employee Experience
Gen X, Gen Y and now Gen Z talent represents a massive shift towards a more collaborative, connected and faster paced workplace, in which self-expression is encouraged, and autonomy, recognition and global awareness are core terms of employment.

The changing dynamic of a digital savvy workforce means organisations must address and tap into analytics and consider harnessing ‘as a service’ delivery models to raise the bar on talent acquisition, as well as to offer employees a productive, engaging and enjoyable work experience.

A recent study Deloitte completed with Facebook found that only 14 percent of companies believe their internal processes for collaboration and decision making are working well, and 77 percent believe email is no longer a viable tool for effective communication.

A great example of this is organisations starting to redesign their recruitment experience to resemble the consumer experience on e-commerce and social media platforms. They have examined how people search for common items, such as cars, music or major purchases, and are now looking to weave that into their online recruitment campaigns.

Digital Workspace Employee Empowerment
Employees feel trusted when they are given the power to work when and where they want. When non-desk workers feel included and part of the team, they’re more productive. Every business wants empowered employees, because they have been shown to be more satisfied in their roles, and thus more productive. Investing in your business’s digital workspace and enhancing your employees’ work experience is one way to make that happen.

Most organisations today are likely to have a ‘digital workspace’ of some sort, and it will only grow in the years to come. Competitive advantage increasingly comes from providing the right set of collaborative tools, letting employees use technology in the way they want, combined with a business culture that puts people first.

Exclusive roundtable: The Future of Work in 2018
Join us and Fuze on March 21st at the Sky Garden for an exclusive roundtable dinner to discuss the future of work in 2018. In this session we will look at how employees’ demands create a complex environment for IT leaders, but at the same time present an opportunity to drive innovation and bolster productivity.

In particular, we will go through:
– The role of technology as the enabler of the future of work
– How to increase user adoption as well as workforce mobility and productivity
– How to reduce application sprawl and shadow IT

So bring your questions to the table and let’s see how we can help you get ahead of the changing working landscape.

Category: Collaboration


Team Logicalis
January 18, 2018

In the previous blog post on Capacity Management, we explored why IT projects fail and what you can do to prevent it.  What is even more important is to understand why, when it comes to securing your business, product alone isn’t the answer.

Ransomware, data breaches, insider threats, phishing scams… we’ve all seen the headlines. And, although these words, once reserved for IT departments, are becoming part of everyday vocabulary, that doesn’t make them any less concerning. They have the power to derail your entire business, everything that you’ve built, within seconds.

Nowadays, cybercrime is big business, and you can guarantee that for every security solution churned out by vendors, someone, somewhere is creating brand new malicious code to target vulnerabilities you didn’t even know existed within your organisation. Add to that modern working habits: more and more businesses need to adopt cloud and IoT for day-to-day operations to keep up with their competitors, but in doing so they increase the potential attack surface. It soon becomes clear that organisations are under siege from all angles.

To state the obvious; cybersecurity is no longer optional.

And this is something that all CIOs are more than aware of. In fact, in our 2017 global CIO Survey, security was cited as the number one concern around the increasing use of cloud services, with 70% of respondents identifying it as a challenge.

So the problem is common knowledge, but what’s the solution?

Well, if your automatic answer is ‘by investing in security products’, then you’re not alone. Many business leaders define ‘security strategy’ as lots of different solutions coming together to work as one protective shield. Each solution is built to defend against a single threat vector, so various email, cloud and web products all become separate pieces of a much larger security puzzle.

Given the sheer volume of security products readily available, it’s no surprise that this puzzle doesn’t come cheap and that certain pieces aren’t as effective as others. But surely the more products you deploy, effective or otherwise, the more significant your overall security capabilities and the better the protection for your organisation. After all, it’s better to be safe, and slightly out of pocket, than sorry… right?

 It’s an easy trap to fall into.

In reality, a growing number of point solutions patched together is no longer an effective strategy. Instead, this method compounds complexity and creates the very vulnerabilities that it is meant to be mitigating against.

This is because security devices raise an alert for each threat that they detect; that’s how they work. And when you have multiple tools in place, each detecting multiple threats, the chances are that alerts will be going off almost constantly. This is fine; it shows that the solutions are working.

But it’s unlikely that a single organisation will have the manpower needed to deal with each alert simultaneously. Instead, overwhelmed and under-resourced IT teams will probably try to prioritise, and as a result many of the alerts, and therefore threats, are ignored, leaving your organisation vulnerable.

So, when it comes to protecting your business, spending thousands and thousands of pounds worth of your budget on product alone is futile. It’s clear that the ‘best of breed approach’ has had its day, with an increasing number of organisations coming to the realisation that it’s not about how many solutions you have in place, it’s about how you’re using them.

 A problem shared is a problem halved

To simplify things, you can strip your security strategy back to three key areas that all need to be done well: threat insight, vulnerability management and managed endpoint security.

Then, you need to make sure that the solutions you have within these areas are being used correctly. The easiest way to do this, and to make the most out of your resources, is to undertake a collaborative approach.

Take the heat off your own IT team and share the security burden with a partner who can help you to plug the gaps with managed solutions like:

– Managed SIEM/SIRM: A Security Information and Event Management (SIEM) service working in conjunction with a Security Incident Response Management (SIRM) service will provide optimal threat insight. It will solve the biggest and longest-running headache for your internal IT team: the one that began when you started installing security solutions. External engineering teams will analyse and, effectively, filter the never-ending stream of alerts so that, before they even reach your team, they are prioritised in terms of risk to your business and come with clear actions on how to stop them in their tracks.

– Patch management: By combining patch management services with existing vulnerability scanning in a single service, you can achieve optimal vulnerability management. Believe it or not, this service fills gaps in your security wall automatically: networks are regularly scanned for vulnerabilities, and any intelligence gathered is rolled into a patching programme. This significantly reduces the time between a vulnerability being identified and it being patched.

– Security device management: This incorporates all endpoint security, including antivirus solutions, firewalls and device control, as a single managed service. Delivered via software on laptops, desktops and servers, the service can also detect rogue devices attaching to the network and provide web filtering.

The bottom line is that cybersecurity is not about product. It’s about people, processes and technology working coherently to manage risk and protect your organisation. Often, working collaboratively with providers who can manage your security can be the better option. They will have the resourcing and the skillset to help you deal with any potential threats, while offering more peace of mind.

Today, a third of CIOs see security as the most prominent barrier towards digital transformation. Outsourcing can change that by granting your internal IT teams the gift of time… time that can be used to pursue other areas of your business’ IT strategy.

Talk to us to find out how we can help you.

 

Originally posted on CBR, 21 November 2017

 

Category: Security

Team Logicalis
December 5, 2017

Whether part of a large, international enterprise, a medium-sized organisation or a small startup, in this day and age, undertaking new IT projects is essential.

Businesses need to adopt new technologies in order to get all the benefits associated with new, innovative IT projects. However, it’s more than that… In our fast-moving, digitally competitive world, if you don’t adapt, you get left behind. For all organisations, it really is survival of the fittest, and the fittest are those who embrace new technologies and invest in innovative IT projects. Think about it: how can you stand up to your competitors if they’re constantly three digital steps ahead?

Undertaking new innovative IT projects has become the key focus of CIOs everywhere. You can spend weeks, even months pushing a project through the planning stages; going over the specific schedule and timings, working out the breakdown and total costings and redefining the objectives. But what happens if it then falls down?

Failure isn’t something that anyone wants to experience. When you’re a CIO who’s spent huge amounts of time and energy getting a proposed project through all the usual barriers to implementation, failure can be even more difficult to accept. And, taking the size of your organisation into consideration, the larger the project, the larger and more expensive the problem if it does fail.

A recent global study from Fujitsu found that on average organisations lose £483,690 for every cancelled digital project. That’s a lot of money for a single project, especially for an outcome that could have potentially been avoided.

Why do IT projects fail?

Well, it all comes back to resourcing. When times get tough, CIOs have to throw their efforts into ‘keeping the lights on’, rather than implementing the exciting and innovative new projects that are designed to give their organisations the upper hand against competitors.

Often this will mean that they are working with limited resources from the outset when it comes to new IT projects and, to try to combat this, projects will be run in series rather than in parallel.

However, the fact is that, a lot of the time, various projects rely on the same elements or components. Each project will have a benefit realisation target, which will be recognised upon completion.

If there is a slippage during the implementation of the first project in the series, then the benefit realisation target is not met on time. This then has a knock-on effect; it results in resources being tied up for longer than initially planned which, in turn, affects all the other projects in the series. How can you start a new project when all your assets are tied up somewhere else? Simply put: you can’t, and this is what leads to stalled, and even failed, projects.

But why are resources so few and far between?

Interestingly, a recent independent survey discovered that 22% of CIOs see a lack of skills as the biggest barrier to achieving their objectives. This came ahead of money, culture, alignment and even technology.

So, even if you have the right solutions and technologies in place to complete a project, often it’s the human skills needed to implement them that are tied up, stalling projects and leading to their failure.

Why? Well, both the business landscape and our working habits have changed dramatically over the last decade or so. Whereas previous generations might have secured a job in their 20s and stayed with the same company until retirement, now it’s more common to change jobs every 2 or 3 years. And when people leave, they take their specialist in-house knowledge and their skillset with them, creating a lag or gap.

Add to this the fact that technology is constantly changing at an ever-increasing speed and the problem only becomes more exacerbated. In order to keep up, often employees are more focused on, and therefore more skilled in, one sort of technology or in one area.

However, this means that when they leave the company, their absence is strongly felt. The cyber security skills gap is something that everyone has heard of; it’s well documented. But, the truth is that this skills gap is IT industry-wide.

In fact, according to figures released by Indeed in October, since 2014 demand for software developers and machine learning engineers has increased by 485% in the UK, with there now being an average of 2.3 jobs available for every qualified candidate. It’s no wonder that many organisations are feeling the pinch on the skills front!

All in all, resources are tight. There is very little wiggle room, especially when it comes to human expertise and technical talent.

You need to focus on keeping business operations running as usual before you even start thinking about additional projects. But you need these additional projects in order to avoid falling behind your competitors in the innovation stakes. And with the speed that technology is changing, you ideally need to be undertaking multiple new innovative projects simultaneously.

So what can be done?

Simply put, there just are not enough resources to do everything.

Or are there?…

It’s true, you can’t just pull extra time and technical know-how out of thin air, or magically create an immediately accessible pool of skills where there isn’t one. It’s clear that, this time, the answers aren’t going to be found within your organisation, so why not look somewhere else?

Talk to us to help you with all the extra resources you need to invest in innovation while ‘keeping the lights on.’ It no longer has to be a dreaded choice, with the need to keep the business running as usual, stifling any form of innovation. Instead, by collaborating, you can have it all.

 

Originally posted on Information Age, 14 November 2017

Team Logicalis
November 21, 2017

Mark Rogers, CEO Logicalis Group, digs into the Logicalis Global CIO Survey 2017-2018 to pick out some of the major topics arising from the survey of 890 CIOs in 23 countries.

The big themes emerging from this year’s survey break CIO priorities down across three areas that could be mistaken for business as usual: simplify, secure and engage. But, on the contrary, each has its part to play in a much loftier goal – digital transformation.

Indeed, the headline from the 2017-18 survey is this: CIOs say a massive infrastructure overhaul must be coupled with culture change if organisations are to unlock the benefits of digital transformation.

Digital ambition versus digital reality

That headline finding stems from CIOs’ assessment of their organisations’ digital footing.

The survey tells a story of real digital ambition amongst CIOs, but of limited progress in delivering digital transformation. To use tech adoption bell curve terminology, only 5% of respondents call their organisations digital innovators right now, while 49% characterise their organisations as part of an early majority.

That’s not a significant change on last year’s figures – and the reality is most CIOs see their organisations as partly digitally enabled at best.

Crucially, however, they contextualise those rather cautious views with a realistic and pragmatic assessment of the barriers to digital transformation – and it is their ambitious plans to overcome those barriers that give rise to the ‘simplify, secure, and engage’ triptych:

Simplify

For almost half (44%) of respondents to this year’s CIO survey, complex legacy technology is the chief barrier to digital transformation.

In simple terms, the job of maintaining and managing those complex environments – in the face of ever more severe security threats, and business demand for ever more open architecture – is huge. So, legacy complexity doesn’t just slow down or prevent digital projects, it also prevents a refocus on higher-level, strategic activity, like digital transformation.

That is clearly not lost on CIOs, who understand very well the urgent need to simplify existing systems – indeed, 51% said they planned to adapt or replace existing infrastructure as a means of accelerating digital transformation.

It’s not hard to envisage CIOs making greater use of cloud services and third party support as a means of both simplifying those systems and handing off some of the management burden associated with them.

Secure

It’s no great surprise to see security high on the CIO agenda given the nature of the cyber threat landscape – and no great surprise either to see ransomware top of the threat list for CIOs. Ransomware [http://cxounplugged.com/2017/01/what-is-ransomware-a-c-suite-quick-guide/] is the biggest threat according to 71% of CIOs surveyed.

More surprising though, is the fact that one in three CIOs admit security concerns have led to the curtailment or cancellation of IT projects – a fact that must surely amplify the impact of security issues on digital transformation.

With that in mind, it is small wonder that so many CIOs (31%) see increased security investment as crucial to digital transformation – and not just to weathering the next cyber threat storm.

I’m in little doubt that CIOs’ security focus will drive an increased demand for services like Cisco Umbrella as organisations adopt multi-layered security solutions capable of defending against an ever-evolving array of cyber threats.

Engage

Perhaps most interesting, CIOs see organisational culture as a key barrier to digital transformation. That is, legacy technology brings with it a legacy relationship between business and technology, a ‘separateness’ that is incompatible with a digital model that puts technology at the heart of every aspect of the business.

In response, CIOs want to engage with line of business (LOB) to drive culture change. They want to be the digital ambassadors who create a new relationship between business and technology, and who foster an environment in which digital transformation can thrive.

Analytics offers a case in point. Back in 2015, 63% of CIOs ranked analytics as ‘very important’ or ‘critical’ to driving business innovation.

Two years later, the main barriers to delivering those benefits remain complex systems and siloed data, but business engagement is also a factor: the lack of a clear brief from the business as to what is required from analytics is still an issue for 41% of CIOs.

Crucially, though, they are responding: 54% of CIOs are working with LOB colleagues to bottom out requirements and 38% are setting up working groups to unravel complexity.

Those plans to tackle analytics suggest that CIOs are successfully adapting to a changing environment for business IT, an issue we first highlighted in 2015 [http://cxounplugged.com/2015/01/power-shift-will-cios-respond/]. The big question is whether they will be successful in replicating the approach as they seek to unlock the benefits of wider digital transformation.

In my view, the CIOs that are successful in tackling these three big issues will be those looking outside for help. The majority still spend between 60% and 80% of their time on day to day IT management – an issue that, in itself, is a barrier to change.

That’s partly because so much IT remains in-house. Only 25% outsource 50% or more of their IT – a situation that must surely change quickly if CIOs are to free themselves from the everyday and be digital change makers, not change managers.

Read the full Logicalis Global CIO Survey 2017-2018 here.

Team Logicalis
October 24, 2017

Overspending on resources?

We can all agree, it’s nothing new. In fact, it’s an issue faced by business leaders almost every day. In our increasingly digital world, overspending on technical resources, alongside the human resources (or skills) to back them up, is common.

If you view over-provisioning as a necessary evil, you’re not alone. A recent independent study discovered that 90% of CIOs feel the same way, with the majority only using about half of the cloud capacity that they’ve paid for.

But, why pay for resources that you’re not going to use?

Well, it’s no secret that over-provisioning on IT resources is better than the alternative. Understandably, you’d rather pay over the odds for ‘too many’ functional digital systems than risk the outages associated with ‘too few’. A 2015 study by Populus discovered that almost a third of all outages on critical systems are still capacity related, proving that over-provisioning is not the only problem here.

It can seem as if organisations are stuck between a rock and a hard place: do you spend thousands and thousands of pounds from your (already) tight budget and over provision, or do you make an upfront saving and risk becoming one of the 29% of companies experiencing business disruption, downtime or worse when the demand on your services exceeds the resources you have in place? How do you optimise costs without risking future, potentially devastating, strain on your resources?

Enter IT Capacity Management…

In a nutshell, IT Capacity Management gives you a snapshot view of all your business resources against the demands placed upon them. This enables you to ‘right-size’ your resources and ensure that you can meet current requirements without over provisioning and over spending.
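
As a rough illustration of that ‘snapshot’ idea, the sketch below compares provisioned capacity against observed peak demand plus a safety margin, and flags what looks over- or under-provisioned. The resource names, figures and thresholds are made-up assumptions, not benchmarks.

# Minimal right-sizing sketch: provisioned capacity vs. observed peak demand.
# All figures and the 20% headroom threshold are illustrative assumptions.
resources = [
    {"name": "web-vm-cluster", "provisioned": 64, "peak_demand": 22},   # vCPUs
    {"name": "sql-storage",    "provisioned": 10, "peak_demand": 9.5},  # TB
    {"name": "backup-pool",    "provisioned": 5,  "peak_demand": 5.4},  # TB
]

HEADROOM = 1.2  # keep 20% spare capacity above the observed peak

for r in resources:
    required = r["peak_demand"] * HEADROOM
    if r["provisioned"] > required * 1.5:
        status = "over-provisioned: candidate for right-sizing"
    elif r["provisioned"] < required:
        status = "under-provisioned: risk of capacity-related outage"
    else:
        status = "about right"
    print(f"{r['name']}: provisioned {r['provisioned']}, needs ~{required:.1f} -> {status}")

A real Capacity Management service works from continuously collected performance data rather than a static list, but the underlying comparison is the same.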

The level of demand placed upon business resources is constantly fluctuating. That’s why Capacity Management models should run alongside your current operations as part of your ongoing business strategy. It’s one way to be proactive when it comes to resourcing.

However, it doesn’t stop there… Capacity Management also enables you to prepare your business for the future. It continually measures the performance and levels of use of your resources in order to make predictions, which will enable you to prepare for any future changes in terms of demand.

What can Capacity Management do for your business?

There are a number of benefits to having IT Capacity Management included in your company strategy. It gives you visibility of your entire IT infrastructure, including all physical, virtual and cloud environments. The importance of this should not be underestimated; it can enable you to:

● Optimise costs. It’s simple- if you have a clear view of all your resources, you can see where they’re not required, which means that you won’t feel the need to purchase them “just in case”. Capacity Management can be seen as a long-term investment- especially given its ability to predict future trends based on current performance.
● Easily adjust IT resources to meet service demands. With the ability to see exactly which of your services are being placed under the highest amount of pressure in terms of demand, you’ll be able to adjust your business plan accordingly to relieve some of that pressure- allowing you to even out the playing field by ensuring that one service area isn’t being drained whilst others are idle. You’ll be able to add, remove or adjust compute, storage, network and other IT resources as and when they are needed.
● Deploy applications on time. You’ll be able to reserve IT resources to be used for new applications when needed, resulting in a faster time to deployment.
● Reduce time and human resource spend. Imagine the hours spent by your employees planning and calculating capacity usage and availability. By implementing a real, ongoing plan that can run in the background, you free up more time for your employees to pursue higher-value tasks.

Capacity Management solves the age-old problem of optimising costs for today’s CIOs. While this has always been a priority for organisations, our new digital landscape has redefined its meaning and its importance. Working habits and IT business structures have evolved to include mobile working, shadow IT, unimaginable amounts of data and complex technological advancements that need a certain skillset to deploy. Therefore, it is impossible to view everything simultaneously and manage all resources accordingly, unless you deploy the correct tools and have the right strategy in place.

Capacity Management should be a key element of any business strategy. It’s a model built for your business’ resourcing needs, both today and in the future.

If you’d like to find out more about the Capacity Management and Cost Optimisation services that Logicalis provides, then contact us today.

 

Originally posted on Information Age, 18 October 2017.

Team Logicalis
September 27, 2017

Throughout history, I don’t believe we’ve ever seen as much change as we do in the world of technology! Just think: in 10 years we’ve had more iPhone releases than Henry VIII had wives.

Taking a page out of some of the tech giants’ books, be it Apple or Salesforce, it’s clear that innovation is at the centre of what enables the industry to move at the pace it does. It would be fair to say that three major trends currently dominate the industry:

1. Service, service, service – Many big players in the hardware product space recognise that hardware is fast becoming a vanilla commodity. A number of vendors, such as Cisco, Oracle, Ericsson, Nokia and HP, have been scrambling over the last few years to enable value-added services on top of the hardware to increase margins.

 “Services are enabled by the specific knowledge, skills and experience you bring to the table which often drives business value through improved margins.”

Sometimes when I think about how you can build your brand of service that you deliver to customers, I like to compare it to food (one of my favourite subjects).

What keeps you going back to your favourite restaurant? Let’s take McDonald’s, for instance. It could be the quality of the food, but ultimately you KNOW you will get fast, efficient service and a smile when they ask ‘would you like fries with that?’. The point is, it’s the trusted customer experience that underpins successful services – remember this bit, I’m going to come back to it later on.

2. Business process design driven by cost reduction, optimisation and automation – Ultimately, we use technology to make our lives simpler. Traditional IT has become so entrenched in complexity, and with that has come high cost. Businesses of all sizes are certainly scrutinising their balance sheets and seeking to use the benefits of IT innovation to gain a competitive advantage. The principles of globalisation, business process optimisation and automation are all relevant now as we transform traditional IT to achieve the ultimate goal of simplicity.

3. Data-driven customer experience as an investment for the future – Products in the world of data analytics are booming as businesses recognise the power of data in enabling intelligent business decisions. One proven example of boosting business value is how telcos are using customer location data to send relevant, targeted marketing text messages.

Imagine you’re at the airport: intelligent systems pick up your location and send you a text asking if you want to purchase an international data plan while you’re away. So instead of sending you random marketing messages, geo-location marketing becomes targeted and relevant. Through this intelligent marketing, telcos have been able to generate 40% more revenue than expected in that portfolio.

Keeping up with the pace of change within the industry can be overwhelming unless you harness the key themes I mentioned earlier, all of which relate back to business value. Contact Logicalis today to learn how you can implement an agile business model and use its benefits to increase your business value.

Team Logicalis
September 8, 2017

Shadow IT is not a new concept, but it is certainly a big issue for many organisations today. Companies of all sizes are seeing a significant increase in the use of devices and/or services outside the organisation’s approved IT infrastructure.

A Global CIO Survey found that IT leaders are under growing pressure from Shadow IT and are gradually losing the battle to retain the balance of power in IT decision-making. The threat from Shadow IT is forcing CIOs to re-align their IT strategy to better serve the needs of their line-of-business colleagues, and to transform IT so that it becomes the first choice for all IT service provision. However, Shadow IT continues to put pressure on the many CIOs and IT leaders who do not have clear visibility of its use within their organisations and therefore cannot quantify the risks or opportunities.

So is Shadow IT a threat to your organisation or does it improve productivity and drive innovation?

According to Gartner, Shadow IT will account for a third of the cyber-attacks experienced by enterprises by 2020. However, some customers have told us:

  • “Shadow IT is an opportunity for us to test devices or services before we go to IT for approval.”
  • “Shadow IT allows us to be agile and use services that IT doesn’t provide, so we can work more effectively.”

One of the most important aspects of Shadow IT is, of course, cost. What are the hidden costs to the business of a security breach or potential loss of data, and, for those with regulatory compliance requirements, the possibility of large fines and loss of reputation in their respective markets?

With an ever-changing and expanding IT landscape, and new regulations such as the General Data Protection Regulation (GDPR) coming into effect in May 2018, managing and controlling data whilst ensuring complete data security should be top of the priority list. Understanding the key challenges of Shadow IT is therefore fundamental to managing it effectively.

Shadow IT – The Key Challenges:

    • Identifying the use of Shadow IT
      Arguably the biggest challenge with Shadow IT is visibility within the organisation. How can IT leaders see who is using or consuming what, and for what purpose? If you can’t see it, or aren’t aware of it, how can you manage it?
    • Costs of Shadow IT
      Controlling Shadow IT costs is impossible if there is no visibility of what is being used. And it is not just the direct Shadow IT purchases that present a challenge, but also the consequences of a security breach resulting from the use of Shadow IT: fines, reputational damage and future loss of business.
    • Securing the threat to your business
      One of the biggest areas of concern, and quite rightly so, is the security threat to the business from the use of non-approved IT sources. Not only does this have the potential to add to the organisation’s costs, it could also result in the loss of data, again with the risk of considerable fines.
    • Managing Shadow IT without stifling innovation
      The wrong approach to managing Shadow IT, such as “total lock-down” messaging, can send signals to the organisation that IT is controlling, inflexible and unwilling to listen, with the possible result of driving Shadow IT underground and, in some cases, actually increasing its use, thus increasing risks and costs.

Shadow IT is a complicated issue, but your response to it doesn’t have to be. Contact us to find out how we can help you manage Shadow IT, be forward thinking and fill the gaps within the current IT infrastructure.

Team Logicalis
August 21, 2017

It’s always the same scenario: someone gives me some data files that I just want to dive straight into and start exploring ways to visually depict, but I can’t.

I’d fire up a reporting tool only to step right back, realising that for data to get into visual shapes, it needs to be in shape first! One correlation that has appeared consistently over the years: the more time spent up front on ETL/ELT (Extract, Transform and Load, in varying sequences), the less quickly you find yourself sent from the reporting layer back to data prep.

Data preparation for the win

‘80% of time goes into data prep’ and ‘garbage in, garbage out (GIGO)’ are sayings that have been around for some time, but they don’t really hit home until you face them in a practical situation and they suddenly translate into ‘backward progress’. Data quality issues range from inconsistent date formats and multiple spellings of the same value to values not existing at all, in the form of nulls. So, how can they all be dealt with? A data prep layer is the answer.
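
As a concrete (and deliberately tiny) example of what that prep layer does, the pandas sketch below tackles the three issues just mentioned: mixed date formats, multiple spellings of the same value, and nulls. The column names, values and fill rules are illustrative assumptions, not a recommendation for every dataset.

import pandas as pd

# Toy data showing the three issues above: mixed date formats, multiple
# spellings of the same value, and missing values (nulls).
df = pd.DataFrame({
    "order_date": ["2017-08-01", "01/08/2017", "1 Aug 2017"],
    "country":    ["UK", "United Kingdom", None],
    "amount":     [120.0, None, 95.5],
})

# Parse each date individually so mixed formats don't trip up the conversion.
df["order_date"] = df["order_date"].apply(pd.to_datetime, dayfirst=True)

# Collapse multiple spellings of the same value into one canonical form.
df["country"] = df["country"].replace({"United Kingdom": "UK"})

# Deal with nulls explicitly rather than letting them distort the visuals.
df["country"] = df["country"].fillna("Unknown")
df["amount"] = df["amount"].fillna(df["amount"].median())

print(df)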

Often, with complex transformations or large datasets, analysts find themselves turning to IT to perform the ETL process. Thankfully, over the years, vendors have recognised the need to include commonly used transformations in the reporting tools themselves. Tools such as Tableau and Power BI, to name a few, have successfully passed this power on to analysts, making time to analysis a flash. Features such as pivoting, editing aliases, and joining and unioning tables are available within a few clicks.

There may also be times when multiple data sources need joining, such as matching company names. Whilst Excel and SQL fuzzy look-ups have existed for some time, dedicated ETL tools such as Paxata have embedded further intelligence that enables them to go a step further and recognise that the solution lies beyond just having similar spellings between the names.
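
Paxata’s matching logic is its own, but the underlying idea of fuzzy matching can be sketched with Python’s standard-library difflib: normalise the names, score their similarity, and accept matches above a threshold. The company names, suffix list and 0.75 threshold below are illustrative assumptions.

from difflib import SequenceMatcher

# Company names as they might appear in two different source systems
# (illustrative values only).
crm_names = ["Logicalis UK Ltd", "Acme Holdings PLC", "Widget & Co"]
finance_names = ["LOGICALIS (UK) LIMITED", "ACME HOLDINGS", "Widgets Company"]

def normalise(name):
    """Lower-case, strip punctuation and drop common legal suffixes."""
    cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    cleaned = " ".join(cleaned.split())
    for suffix in (" limited", " ltd", " plc", " company", " co"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return cleaned

def best_match(name, candidates, threshold=0.75):
    """Return the closest candidate above the similarity threshold, if any."""
    scored = [(SequenceMatcher(None, normalise(name), normalise(c)).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return (match, round(score, 2)) if score >= threshold else (None, round(score, 2))

for name in crm_names:
    print(name, "->", best_match(name, finance_names))

Dedicated tools go further than string similarity (using context, addresses and learned rules), but the principle of scoring candidate matches is the same.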

All the tasks mentioned above relate to the ‘T’ (Transform) of ETL, which is only the second or third step in the ETL/ELT process! If data can’t be extracted, the ‘E’ in ETL, in the first place, there is nothing to transform. When information lies in disparate silos, it often cannot be ‘merged’ unless the data is migrated or replicated across stores. Following the data explosion of the past decade, Cisco Data Virtualisation has gained traction for its core capability of creating a ‘merged’ virtual layer over multiple data sources, enabling quick access as well as the added benefits of data quality monitoring and a single version of the truth.

These capabilities are now even more useful with the rise of data services such as Bloomberg and forex feeds, and APIs that can return weather information; and if we want to know how people feel about the weather, the Twitter API works too.

Is that it..?

Finally, after the extraction and transformation of the data, the load process is all that remains… but even that comes with its own challenges: load frequencies; load types (incremental vs. full loads) depending on data volumes; change data capture (changing dimensions) to give an accurate picture of events; and storage and query speeds at the source, to name a few.
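
To make the incremental-versus-full-load distinction concrete, here is a minimal high-water-mark sketch: each run copies only the rows changed since the previous run’s latest timestamp. The in-memory SQLite tables, column names and timestamps are illustrative assumptions, not a real schema.

import sqlite3

# Minimal high-water-mark (incremental) load sketch using in-memory SQLite
# as both source and target. Table/column names are illustrative only.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE sales (id INTEGER, amount REAL, updated_at TEXT)")
source.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    (1, 100.0, "2017-08-01T09:00:00"),
    (2, 250.0, "2017-08-02T14:30:00"),
    (3,  75.0, "2017-08-03T11:15:00"),
])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE sales (id INTEGER, amount REAL, updated_at TEXT)")

def incremental_load(last_watermark):
    """Copy only rows changed since the previous run's high-water mark."""
    rows = source.execute(
        "SELECT id, amount, updated_at FROM sales WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()
    target.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    target.commit()
    # New watermark = latest timestamp just loaded (or keep the old one).
    return max([r[2] for r in rows], default=last_watermark)

watermark = "2017-08-01T23:59:59"          # e.g. persisted from the last run
watermark = incremental_load(watermark)     # loads rows 2 and 3 only
print("rows in target:", target.execute("SELECT COUNT(*) FROM sales").fetchone()[0])
print("new watermark:", watermark)

A full load, by contrast, would simply truncate the target and reload everything, which is simpler but increasingly expensive as volumes grow.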

Whilst for quick analysis a capable analyst with best-practice knowledge will suffice, scalable, complex solutions need the right team from both the IT and non-IT sides, in addition to the tools and hardware to support them going forward. Contact us today to help you build a solid Data Virtualisation process customised to your particular needs.

Team Logicalis
July 25, 2017

The amount of data that businesses generate and manage continues to explode. IBM estimates that across the world, 2.3 trillion gigabytes of data are created each day and this will rise to 43 trillion gigabytes by 2020.

From transactions and customer records to email, social media and internal record keeping – today’s businesses create data at rates faster than ever before. And there’s no question that storing and accessing this data presents lots of challenges for business. How to keep up with fast growing storage needs, without fast growing budgets? How to increase storage capacity without increasing complexity? How to access critical data without impacting on the speed of business?

It’s increasingly obvious that traditional storage can’t overcome these challenges. By simply adding more capacity, costs go up for both storage and management. And manually working with data across different systems can become an administrative nightmare – adding complexity, and taking up valuable IT resource.

So, what can you do? It’s likely that you’ve already got an existing infrastructure and, for many, scrapping it and starting again just isn’t an option. This is where flash and software-defined storage (SDS) could be your saviour. By separating the software that provides the intelligence from the traditional hardware platform, you gain lots of advantages, including flexibility, scalability and improved agility.

So I could add to what I already have?

Yes. Flash and tape aren't mutually exclusive. Lots of businesses use a mix of the old and the new – what's important is how you structure it. Think of it like a well-organised wardrobe: you need your everyday staples close at hand, while the less frequently worn items – also known in the UK as the summer wardrobe (!) – are stored where you can reach them if you need them, but not in prime position.

Your data could, and should, work like this. Use flash for critical workloads that require real-time access, and use your older tape storage for lower-priority data or lower-performance applications.
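As a purely illustrative sketch of that placement decision (the tiers, thresholds and flags below are invented, not a description of any particular product):

```python
from datetime import datetime, timedelta

def choose_tier(last_accessed: datetime, business_critical: bool) -> str:
    """Illustrative placement rule: hot or critical data on flash, cold data on tape."""
    age = datetime.now() - last_accessed
    if business_critical or age < timedelta(days=30):
        return "flash"
    if age < timedelta(days=365):
        return "disk"
    return "tape"

print(choose_tier(datetime.now() - timedelta(days=3), business_critical=False))    # flash
print(choose_tier(datetime.now() - timedelta(days=400), business_critical=False))  # tape
```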

But won’t it blow my budget?

No. The cost of flash systems has come down over the last few years, and lower running costs deliver savings over the long term. Virtualisation of mixed environments has been shown to store up to five times more data, and analytics-driven hybrid cloud data management can reduce costs by up to 73%. In fact, we estimate that with automatic data placement and management across storage systems, media and cloud, it's possible to reduce costs by up to 90%!

So how do I know what system will work for me?

Well, that’s where we come in. At Logicalis we’ve got over 20 years of experience working with IBM systems. Our experts work with clients to help them scope out a storage solution that meets their needs today, and the needs they’ll have tomorrow.

We start with a Storage Workshop that looks at the existing infrastructure and what you’re hoping to achieve. We’ll look at how your data is currently structured and what changes you could make to improve what you already have – reducing duplication and using the right solution for the right workload. We’ll then work with you to add software and capacity that will protect your business and won’t blow your budget.

If you want to hear more about the solutions on offer, feel free to contact us.

Category: Hybrid IT

Team Logicalis
December 12, 2016

In the year of Brexit and Trump, Scott Reynolds, Hybrid IT practice lead, predicts that digital crime will explode, data privacy breaches will claim scalps, automation will be 2017's buzzword, and the open source movement will challenge profit-making business models. Here are his 2017 tech predictions.

It’s the time of year to engage in the oh-so risky game of making predictions for what is going to be hot for our customers in the coming year. Risky, because stunning twists and turns can take us off course at any point.

After the Brexit referendum and the US election results confounded political and polling pundits, life's certainties appear far less certain, and identifying the big winners of 2017 suddenly seems a less straightforward affair. But as a person who doesn't mind living life on the edge, I thought I'd take a punt anyway. Here are my top tech predictions for 2017.

Security Breaches – the worst is yet to come

Based on the number of high-profile data breaches, 2016 hasn't been a great year for digital. The stable door has been left open by companies and government departments around the world, and armies of Terminator 2 cyborgs in the guise of home CCTV cameras are attacking the very infrastructure of the internet. I fear we're only at the beginning of an escalation in digital crime.

2017 will test the nerve of governments, businesses, citizens and consumers and challenge the perception of digital as a safe and secure way of doing business, unless there’s a massive investment in Fort Knox equivalent defences and white hat skills.

Data Privacy

GDPR (the General Data Protection Regulation), emanating from Europe, is going to hurt businesses that don't take data privacy seriously. That is a problem, as evidence suggests companies are unaware of their obligations under this new, punitive legislative regime and are taking too long to grab hold of the GDPR tail.

It's highly possible that fines of up to 4% of global turnover will put some companies out of business in 2017 and beyond.

One Small Change for Mankind, One Giant Leap Forward for Automation

The IT industry is about to enter a time of mass automation – about time too. To our shame, we've lagged behind other industries: you can now buy a car that parks itself at the touch of a button, yet you need 24 buttons to change the configuration of a router.

Increased levels of automation will manifest themselves in robotic decision making, in security systems that automatically guard against, and respond to, an avalanche of threats, and in the automated, software-defined provisioning of resources in the data centre and network.
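As a loose illustration of what replacing those 24 buttons with a few lines can look like, here is a hypothetical template-driven sketch for generating a device configuration – real network automation would lean on tools such as Ansible or vendor APIs, and the interface names and parameters below are invented:

```python
# Hypothetical, minimal configuration generation from a template
CONFIG_TEMPLATE = """hostname {hostname}
interface {interface}
 ip address {ip} {netmask}
 no shutdown
"""

def render_config(hostname: str, interface: str, ip: str, netmask: str) -> str:
    """Fill the template so the same change can be rolled out to many devices consistently."""
    return CONFIG_TEMPLATE.format(hostname=hostname, interface=interface, ip=ip, netmask=netmask)

print(render_config("edge-router-01", "GigabitEthernet0/1", "10.0.0.1", "255.255.255.0"))
```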

Things Can Only Get Bigger

The Internet of Things is going to get bigger and more impactful. Industry analysts are still predicting that by 2020 there could be as many as 50 billion things connected to the internet – and that's only three years away. In 2017, expect to see mass engagement by businesses in all sectors.

Hopefully we’ll move on from talking about the connected fridge that can order more lettuce when you run out, and recognise that IoT will fundamentally change how industries and organisations operate.

Open Source – Somebody wants to do everything you do for free

Somebody, somewhere, is trying to do what you charge for at much lower cost, or even for free. Open isn't a thing, it's a movement. We're already seeing open source technologies impact our industry, with OpenStack becoming the cloud platform of choice for companies that don't want to pay for mainstream software. Open automation technologies such as Puppet and Chef now have a groundswell of support, with communities that are evangelical about delighting users rather than turning a profit.

We’ve also witnessed a growing willingness to embrace Open Computing technologies. Now, Open isn’t without its complications and ultimately nothing in life is free – operating an open environment is still a complicated affair. But I think we’ll see a lot more traction, with many of our customers taking Open Source seriously, over the next 12 months.

2017 Tech predictions – a risky game

So, those are my top five tech trends for 2017. Now you're probably asking: how could I overlook analytics? I haven't. I fully acknowledge that analytics and data are core to all of the above; they will need to be embedded in the very fabric of a business to bring my predictions to fruition. Otherwise, you can disregard everything I just said. As I said, making predictions is a risky game.

Team Logicalis
December 5, 2016

In the third of a nine-part series drawing on the Logicalis Global CIO study, Scott Reynolds explains why apps are central to digital transformation.

The statement ‘Every company is a software company’ has been on repeat over the last few years. When it was first uttered it was more of a future-gazing, stake-in-the-ground pronouncement – and its application to today’s world is probably still a bit premature. Not every business is a software business, yet – but our global CIO survey suggests that we’re getting there, with the help of a few shining lights along the way.

In 2013, Forbes noted that Ford sells computers-on-wheels and FedEx boasts a developer skunkworks (a loosely structured group of people who research and develop a project primarily for the sake of radical innovation). Both are great examples of the happy union between traditional industries and technology industries – and, today, they are not as isolated as you might think. Responses from over 700 CIOs tell us that 77% of firms are similarly developing apps, either in-house, with the help of third parties, or by drawing on a combination of internal and external skills.

In fact, not only is the number of companies getting up close and personal with application development starting to swell, but app development as a strategic activity is also attracting more attention. Rather than being relegated to the fringes, application development is increasingly taking centre stage. Today, less than a quarter of apps (23%) are purely promotional; the majority are being used to build new services and revenue (57%) or to streamline business processes (63%).

Developing for digital

We tend to associate apps with the Apple App Store or the Android marketplace, but they're so much more than website spin-offs for mobile users. Enterprise-grade applications are replacing 'big tech'. With the goal of putting automation at their core and providing frictionless self-service experiences, companies are bringing workloads up to the application level.

In the past, we've emphasised the benefits of instituting a DevOps strategy to develop code with fewer defects and support challenges once it's released into production. My message to the 64% of businesses developing apps in-house would be to take a digital performance readiness approach and embrace agile from the beginning. Allowing updates to be made quickly and regularly, for constant refinement, will create 'killer apps' with the punch to disrupt for the better.

Apps = Smart software

As the research attests, all sorts of companies are creating their own luck and doing some sort of app wizardry to get ahead.

Book publishers in the business of printing books are transforming themselves into software companies to offer digital content and branded applications. Airline companies are building equipment-tracking apps to provide engineers with a live view of the locations of each piece of airline maintenance equipment and pharmaceutical companies are creating medication temperature monitoring apps, which use sensors to ensure the best possible delivery of medical supplies.

Overall, apps are making firms a lot smarter. Their ability to gather tremendous amounts of data from sensors and other sources, and to apply machine learning algorithms and predictive analytics, makes them the brains behind a company's transformation and the driving force behind our respondents' digital transformation journeys. Channelling James Carville, Bill Clinton's campaign strategist: "it's the apps, stupid".

Category: SDN / Mobility
