2020 and beyond: transitions, disruptions and transformations

Technology is changing at the speed of light, yet this is the slowest pace of change we’ll ever see. The tempo of change and adoption keeps increasing, fueled by the democratization of innovation as technologies such as cloud computing put powerful capabilities within reach of smaller companies. While the scale and use of some of these technologies are still some way into the future, the rate of change we’re experiencing means that future is not as far off as we may think. Today’s trend is tomorrow’s mainstream adoption, and organizations need to consider how this is affecting their businesses and what they should prioritize.

Data becomes the center of the IT universe

Operational and digital transformation is all in the data: how you collect it, what you do with it, the platforms you have to manage it and how you make it available.

Greater volume, greater detail, greater insights

The ability to decouple data from applications allows organizations to fully embrace both operational and digital transformation.

Volumes of data collected at the most granular level from IoT devices and digital platforms – effectively any technology system – can now be made available through application programming interfaces (APIs) for analysis, business insights and the development of additional applications.

We’re not just collecting more data; we’re also collecting more granular data, allowing us to derive insights that were never possible before.

With enough data points, you can model behaviour and understand patterns – for example, the diet of someone’s biometric twin – and reach more accurate conclusions (such as the time before a health incident occurs) more quickly, and at a fraction of the cost of modern-day science.

Data is also critical for next-generation technologies: digital twins, spatial computing, artificial intelligence, deep analytics and new applied versions of technology are all dependent on data platforms. The data platform will even enable the next generation of computing architectures, such as serverless computing.

Data lake or data swamp?

Data truly is the center of it all and it’s therefore critical to get the data platform right. Data strategies must consider the architecture for capturing and managing all types of data – structured, semi-structured, unstructured and streaming – and making it accessible through APIs and a data service catalogue so a wide set of stakeholders can use it to create new value.
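
As a purely illustrative sketch of that idea, the snippet below registers datasets of different types behind a single catalogue interface that stakeholders can query; the dataset names, locations and fields are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DatasetEntry:
    """Metadata describing one dataset exposed through the data platform."""
    name: str
    kind: str            # 'structured', 'semi-structured', 'unstructured' or 'streaming'
    location: str        # URI of the underlying store or stream
    schema: List[str] = field(default_factory=list)

class DataServiceCatalogue:
    """A minimal, in-memory data service catalogue (illustrative only)."""
    def __init__(self) -> None:
        self._entries: Dict[str, DatasetEntry] = {}

    def register(self, entry: DatasetEntry) -> None:
        self._entries[entry.name] = entry

    def find(self, kind: str) -> List[DatasetEntry]:
        """Let a wide set of stakeholders discover every dataset of a given kind."""
        return [e for e in self._entries.values() if e.kind == kind]

# Hypothetical entries: a relational table and an IoT event stream.
catalogue = DataServiceCatalogue()
catalogue.register(DatasetEntry("sales_orders", "structured",
                                "warehouse://sales/orders", ["order_id", "amount"]))
catalogue.register(DatasetEntry("sensor_events", "streaming",
                                "kafka://iot/sensor-events", ["device_id", "ts", "value"]))
print([e.name for e in catalogue.find("streaming")])
```

A production catalogue would add access control, lineage and schema versioning, but the discovery pattern is the same.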

It is data that allows insights that transcend human observations or interpretation, and exposes truths or realities undetectable in the past. It is data that shows that it's not the fastest cyclist who wins the race, and it is data that changes a public transport company into a provider of consumer presence patterns for the retail industry.

Organizations that don’t build these disciplines around data will find themselves swamped by its scale rather than inspired by its potential.

NTT’s Digital Twin Computing Initiative uses data such as geographical and transport information to create virtual societies that can accurately predict the spread of an infection and help to control a real-world outbreak, in real time.

'We’ll see complete end-to-end computing come to the fore, bringing to life fully intelligent environments that are completely connected and will have a big impact on the world we live in.'


Ettienne Reinecke, Chief Technology Officer, NTT Ltd.

Massive changes on the edge will transform the technology landscape

Thanks to container-based architecture, we can run scaled-down stacks, process data and make decisions at the edge without even touching backhaul to the cloud or a data center. Edge processing will have a pervasive impact on the industry.
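
A minimal sketch of that pattern, assuming a simulated sensor and invented thresholds: readings are evaluated locally at the edge, and only a small aggregate ever leaves the site.

```python
import random
import statistics

THRESHOLD = 75.0          # act locally above this reading (hypothetical units)
BATCH = 60                # summarize one reading per second into a one-minute aggregate

def read_sensor() -> float:
    """Stand-in for a real device driver; returns a simulated reading."""
    return random.gauss(70.0, 5.0)

def act_locally(reading: float) -> None:
    """Edge-side decision: no round trip to the cloud or a data center."""
    print(f"local action triggered at {reading:.1f}")

def send_upstream(summary: dict) -> None:
    """Only a compact summary is backhauled; raw samples stay at the edge."""
    print("uplink:", summary)

readings = []
for _ in range(BATCH):
    r = read_sensor()
    if r > THRESHOLD:
        act_locally(r)        # millisecond-scale decision, no backhaul
    readings.append(r)

send_upstream({"count": len(readings),
               "mean": round(statistics.mean(readings), 2),
               "max": round(max(readings), 2)})
```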

5G – on the edge or the periphery?

5G is a major talking point in the industry but it’s important to distinguish where the promise of pervasive broadband really lies, and that’s in the higher frequency (hence, higher bandwidth) range of 24 to 28 GHz.

The challenge of this frequency range is that the short wavelength is susceptible to interference and can’t penetrate solid materials such as walls. It also requires at least ten times the density of cell towers and base stations needed in the lower range (below 6 GHz), which is where most pilots are currently taking place. This will require significant investment, so it could take years for mobile operators to find the business case to deploy high-frequency, high-impact 5G.
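
A back-of-the-envelope calculation illustrates the physics: as frequency rises, wavelength shrinks and free-space path loss grows, which is why millimetre-wave cells must be packed so much more densely. The figures below are idealized free-space values, not field measurements, and the example frequencies are chosen only as representative points.

```python
import math

C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

def free_space_path_loss_db(freq_hz: float, distance_m: float) -> float:
    """Friis free-space path loss in dB, assuming isotropic antennas."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

for label, f in [("2.4 GHz", 2.4e9), ("sub-6 GHz 5G (3.5 GHz)", 3.5e9), ("mmWave 5G (28 GHz)", 28e9)]:
    print(f"{label}: wavelength {wavelength_m(f) * 100:.1f} cm, "
          f"free-space loss at 100 m {free_space_path_loss_db(f, 100):.1f} dB")
```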

Wi-Fi 6, on the other hand, operates in the familiar 2.4 and 5 GHz bands and will likely have a greater impact on the edge for now. It’s more efficient than Wi-Fi 5, particularly in dense environments, where it can deliver up to four times the throughput per user. Wi-Fi 6 also offers greater penetration, more subchannels and better performance – and has fewer base-station problems. Built-in power-saving mechanisms, which schedule when client radios power down, help to optimize power usage and reduce energy consumption.

IoT: from myriad sensors to audiovisual sensors

IoT is changing at scale. We’re seeing audiovisual technologies used to gather the same input you’d otherwise get by wiring up numerous IoT sensors. Audio analysis can identify the sound of breaking glass, for example, and trigger an alert. High-definition video allows for object detection, the isolation and creation of pattern libraries, and pattern matching to monitor events in real time. Compared with individual sensors, these technologies provide feasible mass coverage at a lower cost.
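
As a toy illustration of the audio side, the sketch below flags frames that are both loud and dominated by high-frequency energy – a crude stand-in for the trained acoustic models a real system would use; the thresholds and the simulated ‘glass-break’ burst are invented for the example.

```python
import numpy as np

SAMPLE_RATE = 16_000
FRAME = 1024                      # ~64 ms analysis window

def high_band_energy(frame: np.ndarray) -> float:
    """Fraction of spectral energy above ~4 kHz; breaking glass is sharp and high-pitched."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    return spectrum[freqs > 4000].sum() / (spectrum.sum() + 1e-12)

def detect_glass_break(frame: np.ndarray, threshold: float = 0.6) -> bool:
    """Crude template match: a loud frame with most of its energy in the high band."""
    loud = np.sqrt((frame ** 2).mean()) > 0.1
    return loud and high_band_energy(frame) > threshold

# Simulated frames: quiet background noise vs. a sharp high-frequency burst.
rng = np.random.default_rng(0)
background = 0.01 * rng.standard_normal(FRAME)
t = np.arange(FRAME) / SAMPLE_RATE
burst = 0.8 * np.sin(2 * np.pi * 6000 * t) * np.exp(-t * 100) + 0.01 * rng.standard_normal(FRAME)

for name, frame in [("background", background), ("burst", burst)]:
    print(name, "alert" if detect_glass_break(frame) else "ok")
```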

There are several use cases for this audiovisual approach in smart buildings and smart cities, for example, where the cost of powering and maintaining myriad sensors is simply unfeasible. Audiovisual solutions have far greater possibilities to scale, making this a significant growth area.

Implications and impact of changes on the edge

These changes on the edge will have a profound impact on how we design and run everything from networks and data centers to applications and security, and how organizations get value out of new data they’re able to collect and process as a result.

In the innovation district of Las Vegas, high-definition video cameras, sound sensors and IoT devices deployed by NTT have improved situational awareness through video and sound data. The system can alert authorities to patterns that appear abnormal, helping to reduce response times for first responders.


Applied technology explodes to bridge the digital and physical

Using technology that’s applied with a specific outcome in mind will bring us to some interesting intersections between the physical and digital worlds. Here’s a brief look at what’s to come.

Seamlessly blending a world of realities

Applied technology will not just help us bridge the gap between our physical and digital worlds; it will allow us to blend them seamlessly. Spatial computing will introduce us to a whole realm of realities – virtual, augmented, enhanced, hyper, mixed, authentic – and digital interfaces will expand to include multiple senses: sight, sound, touch, smell. Immersive environments, where the physical and digital blend, will have a pervasive impact on the industry.

It might start as a dual-screen experience – where traditional formats such as high-definition images are complemented by a real-time second screen that uses augmented reality to enrich the viewing experience – and evolve into volumetric capture technology, such as NTT’s Kirari!®, which will bring this to life at scale with holographic telecasting.

Opening new worlds of possibility

Given the growth and granular collection of data at scale, data modeling will evolve to encompass all systems and processes, enabling the exploration of digital twins across multiple research areas. The scale and scope of what is possible are still in their infancy. The application of technology is finding its way into areas previously unthinkable – and at a speed few foresaw.

NTT and Major League Baseball will collaborate to create an exciting new baseball fan experience using NTT’s Ultra Reality Viewing, which is based on its Kirari!® technology. Audiences will be able to view sports content as if they were watching it live in the stadium.

'There is a huge opportunity to use any and every tool out there to support innovation initiatives in every field and truly transform our future world for the better. Successfully leveraging these opportunities requires intelligence.'


Andy Cocks, Chief Go-to-Market Practices Officer, NTT Ltd.

Computing model evolution and hybrid compute raise big questions

The computing model evolution and adoption of hybrid compute are having a profound impact on organizations, vendors and startups. It’s time to take a reality check on what this means for applications, workloads and the broader business.

Servers to serverless: where to from here?

Unlike physical computing, virtual and container-based computing make it easy to move applications, as they’re decoupled from the underlying compute infrastructure. The use cases for these models are becoming clearer and adoption is on the rise. Now there’s the promise of serverless computing, which offers even greater agility and cost savings because applications don’t have to be deployed on a server at all. Instead, functions run from a cloud provider’s platform, return outputs and immediately release the associated resources.
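
The shape of such a function is sketched below: an event arrives, a result is returned, and the platform releases the resources. The handler signature mirrors the pattern common to function-as-a-service platforms, and the event payload is hypothetical.

```python
import json

def handler(event: dict, context: object = None) -> dict:
    """A stateless function: invoked on demand, returns a result, and is then torn down.
    The event shape here (a simple order payload) is purely illustrative."""
    order = event.get("order", {})
    total = sum(item["qty"] * item["unit_price"] for item in order.get("items", []))
    return {"statusCode": 200,
            "body": json.dumps({"order_id": order.get("id"), "total": total})}

# Local invocation for testing; in production the cloud platform calls handler() directly.
print(handler({"order": {"id": "A-100",
                         "items": [{"qty": 2, "unit_price": 9.5},
                                   {"qty": 1, "unit_price": 20.0}]}}))
```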

In essence, we have four distinct computing models evolving at the same time: physical servers, virtual servers, container-based and serverless computing. This raises some big questions for organizations: What do we do with our applications? Which workloads should move to the cloud? What must stay on-premises? Can we replatform some applications to run on virtual machines? Can we rewrite others for the container or serverless world? Most importantly, how long will it take to get this done so we can unlock the benefits of these new models?

The computing architecture decision has a direct relationship with the future of each application – in fact, it may even dictate the future direction of each application.

The impact of hybrid compute

The realities of hybrid computing, with all these variations, are coming into play in a big way for organizations. Some applications are simply not worth rewriting and will remain on physical servers, on-premises, for many years to come. Some workloads are feasible for virtual machines and containers, but businesses will need to make choices about how they approach development.

Skills, architecture, budget, strategy and business outcomes will all play a role in deciding whether to ‘go it alone’ or to turn to compute-as-a-service, platform-as-a-service or software-as-a-service offerings rather than building the environment themselves. There is a watershed moment looming for most organizations as they are compelled to decide what their IT architecture models for the future should be.

Vendors, too, are feeling the impact and will have no choice but to develop software-based offerings. Everything they produce must be programmable – if it’s physical, it must be virtual, too, and available on all cloud marketplaces. We've reached a point where so many workloads are moving to the cloud that vendors who don’t have software offerings that are consistent with what was available on-premises will be left behind. The implications this has for scale, and deploying, managing, operating and reengineering entire systems, are game-changing.

Take, for example, vendors with a fully programmable firewall that can be deployed onto the edge of a virtual machine in a matter of seconds. Suddenly we have micro-perimeters at the most granular computing level, providing a different quantum of protection that’s possible only through programmable automation.

The answers to many of the questions raised by this new environment will inform important decisions about investment, operations, security, skills … but, for the most part, those answers are still in the making. What we’re likely to see for now is a hybrid world where everything – physical servers, virtual servers (on-premises or in the cloud), containers, serverless, cloud (public, private, hybrid, multi) and as-a-service models – is used in the context of each architecture’s merits and strengths to form a truly hybrid compute model.

52% of businesses surveyed intend to interoperate multiple cloud environments to deliver seamless business functions. However, 44% do not have an overarching strategy in place for hybrid cloud adoption. (Source: NTT Insights: Going Hybrid)


Programmability and predictive capabilities reshape cybersecurity

A fundamental shift is taking place in cybersecurity as new technologies enable greater predictive capabilities and security at an atomic-unit level.

Blockchain finding its use case

Blockchain is finding use cases beyond international settlement: its adoption is extending across the financial sector and into unexpected areas. One example is IT operations, where the control blockchain provides over data enables an entirely new approach to managing configuration provenance in IT systems.

With blockchain, agent software can run on every device in a physical or virtual environment so that all devices participate in a distributed ledger. Each device hashes its configuration file and records that hash in a block, so the ledger reflects the state of the overall configuration posture. If the hash changes, the configuration has changed – and suddenly you have an immutable mechanism to detect the change in real time, trace it through the chain of blocks, and identify the origin and the party responsible. There’s a lot of potential for this technology to have a great impact on cybersecurity, perhaps even to redefine cybersecurity as we know it today.
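
A minimal, single-node sketch of the mechanism follows; real deployments would distribute the ledger across devices and add a consensus protocol, both omitted here, and the device name and configuration text are invented.

```python
import hashlib
import json
import time

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class ConfigLedger:
    """A single-node stand-in for a distributed ledger of configuration hashes."""
    def __init__(self) -> None:
        self.blocks = [{"index": 0, "prev": "0" * 64, "device": None,
                        "config_hash": None, "ts": time.time()}]

    def record(self, device: str, config_text: str) -> dict:
        """Append a block linking this device's current config hash to the chain."""
        block = {"index": len(self.blocks),
                 "prev": sha256(json.dumps(self.blocks[-1], sort_keys=True).encode()),
                 "device": device,
                 "config_hash": sha256(config_text.encode()),
                 "ts": time.time()}
        self.blocks.append(block)
        return block

    def changed_since_last(self, device: str, config_text: str) -> bool:
        """True if the device's current config no longer matches its last recorded hash."""
        last = next((b for b in reversed(self.blocks) if b["device"] == device), None)
        return last is not None and last["config_hash"] != sha256(config_text.encode())

ledger = ConfigLedger()
ledger.record("fw-edge-01", "permit tcp any any eq 443\n")
print(ledger.changed_since_last("fw-edge-01", "permit tcp any any eq 443\n"))  # False: no drift
print(ledger.changed_since_last("fw-edge-01", "permit tcp any any\n"))         # True: change detected
```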

Programmability and security at an atomic level

As every element of infrastructure and architecture becomes programmable, we can start thinking about security posture as code. If we can put a firewall inside the software of every container and virtual machine in our architecture, and contain the flow of data between them within software, security vulnerabilities are greatly reduced – there are simply no doors in the wall whose locks can be picked. If the dataflow stays within software, it’s never on a network where it can be hacked, siphoned off or stolen. Some vulnerabilities will remain, but the granularity and the contained, software-defined dataflows dramatically shrink the vulnerability profile.
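
One possible shape of ‘posture as code’, sketched with hypothetical workload names: the permitted dataflows are declared as data, checked in software at each micro-perimeter, and compiled into per-workload rules.

```python
# Hypothetical workloads and the only dataflows the architecture permits between them.
ALLOWED_FLOWS = {
    ("web-frontend", "orders-api"): {"port": 8443, "protocol": "tcp"},
    ("orders-api", "orders-db"):    {"port": 5432, "protocol": "tcp"},
}

def is_flow_allowed(src: str, dst: str, port: int, protocol: str) -> bool:
    """Posture check applied in software at each workload's micro-perimeter."""
    rule = ALLOWED_FLOWS.get((src, dst))
    return rule is not None and rule["port"] == port and rule["protocol"] == protocol

def rules_for(workload: str) -> list:
    """Compile the declared posture into per-workload ingress rules (e.g. for an embedded firewall)."""
    return [{"from": src, "port": spec["port"], "protocol": spec["protocol"]}
            for (src, dst), spec in ALLOWED_FLOWS.items() if dst == workload]

print(is_flow_allowed("web-frontend", "orders-api", 8443, "tcp"))  # True: declared dataflow
print(is_flow_allowed("web-frontend", "orders-db", 5432, "tcp"))   # False: no door to pick
print(rules_for("orders-db"))
```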

In a way, this leverages the brute force of programmability to embed security technology at a granular level and orchestrate the outcome at a scale that was not feasible in the past. This is good news for the industry and we’ll continue to see this capability emerge as programmability evolves and improves.

Predictive power for detection and prevention

Cybersecurity is one of the first IT functions to mature in using large-scale data with machine learning, analytics and artificial intelligence to become more predictive. The more data sets we collect over time, and the richer that data is, the better we can detect and match patterns to both predict and shut down attack vectors. This is a major – and very positive – step.

The scale and granularity of measures we can take with programmability and the resulting data that is generated, coupled with machine learning, analytics and predictive artificial intelligence, allow us to combine powerful technologies to curtail security threats more effectively. We’re better able to detect and contain these threats before they manifest, leading to the phenomenon of negative-day attack prevention, as opposed to zero-day prevention.
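
As a toy illustration of the predictive idea (not a description of NTT’s Piper platform), the sketch below trains a standard anomaly detector on synthetic per-host flow features and flags hosts whose behaviour departs from the learned baseline; the feature set and numbers are invented.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic per-host flow features: [flows per hour, mean bytes per flow, distinct destination ports]
normal = np.column_stack([rng.normal(500, 50, 1000),
                          rng.normal(2000, 300, 1000),
                          rng.normal(20, 5, 1000)])

# Train on historical behaviour, then score newly observed hosts.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_hosts = np.array([
    [510, 1900, 22],     # resembles the learned baseline
    [4800, 90, 900],     # scanning-like pattern: many flows, tiny payloads, many ports
])
print(model.predict(new_hosts))  # 1 = consistent with baseline, -1 = flagged for investigation
```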

NTT’s Piper platform, a highly scalable machine learning pipeline, enables us to handle billions of flows hourly – generating over 300 unique statistical features (for example, global geolocation distribution) for each internet host – and track their locations.

'There are an increasing number of elements in the hybrid environments of enterprises, causing a major fragmentation of traditional concepts of perimeter, attack vectors and vulnerabilities.'


Kazu Yozawa, Chief Technology Officer, Security division, NTT Ltd.

Co-innovation for complete solutions

Most organizations simply will not have the skills or resources to keep pace with every aspect of change in technology. 

Co-innovation will become increasingly important as organizations look to partner with technology providers that can offer R&D resources to help them address and solve these business challenges, give guidance on the best way forward and demonstrate proven use cases for proposed solutions. Above all, they will expect business outcomes, not technology solutions – which means technology companies will need both depth and breadth in their offerings to meet coming demand.

Frameworks for sharing intellectual property will be redefined. Co-innovation partners will contribute intellectual property assets, evolve jointly to solve specific business challenges and review monetization models.

Ettienne Reinecke

Chief Technology Officer, NTT Ltd.

Ettienne is responsible for NTT’s technology strategy, enterprise architecture and innovation frameworks, positioning our capabilities with external stakeholders and providing input into the creation of new offers.
