Over the past two decades, IT organizations have fought a tough battle against complexity. Virtualization and cloud technologies have helped to contain the situation, but even these are limited by the inefficiencies of data centres, specifically by the lack of automation and orchestration between components. Data centres are no longer confined by four walls; they are a dynamic collection of resources, in the cloud and on premises, residing in multiple physical locations. They must have the automation and intelligence to move work to wherever applications can be processed most efficiently. This is where the next generation begins: an IT environment that is simplified, adaptable and responsive, allowing IT to shift time from maintaining systems to innovating business solutions. It is a combination of capabilities designed for a world in constant change.

A software-defined environment, in which IT resources are dynamically and comprehensively orchestrated and able to detect and respond to the demands of applications in real time, is just one of the new expressions of data-centre technology. Another is the hybrid environment, where private and public clouds work in harmony with traditional systems; yet another is the cognitive computing environment, where systems can learn and solve business problems using advanced analytics. The central idea of this transformation is the creation of a globally managed ecosystem that integrates the elements of IT and the physical infrastructure to provide uniform management through a single console.

IBM Vision

While virtualization and cloud-based delivery models have responded to the need for greater agility, complexity and management costs have also increased. Most IT provisioning and management tools are labour-intensive and increasingly unable to cope with the extreme performance demands of today’s application workloads. So, what makes a data centre next generation? In short, the ability to eliminate many of the barriers that have inhibited IT to date. The next generation offers a simpler, more adaptable infrastructure that can respond to disruptive change, remove technology barriers and integrate legacy and new architectures into a single managed system.

Software-defined Environment

It is safe to say that the future IT infrastructure will not be controlled manually by administrators making hardware decisions. It will be controlled by software programmed to make those decisions automatically. This software-defined environment (SDE) optimizes compute, storage, networking and services, allowing the infrastructure to adapt dynamically to the work required and transforming a static IT infrastructure into a resource-intelligent, workload-aware one.

Software-defined environments change the rules that govern how resources are deployed, capturing what IBM calls “experience patterns” to define how objectives will be met. These patterns are essentially best practices for workload deployment, configuration, integration and other complex IT tasks, captured from subject-matter experts and encoded into templates that can be reused over and over again. An experience pattern encapsulates everything needed to automate the processing of a workload, including the policies that govern it; when the workload is executed, the associated template is invoked.
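To make the idea concrete, here is a minimal sketch in Python of what an experience pattern might look like: a reusable template that bundles a workload’s topology with its governing policies and is invoked at deployment time. The names (Pattern, web_app_pattern, deploy) are hypothetical illustrations, not part of any IBM product API.

```python
# Illustrative sketch only: Pattern, web_app_pattern and deploy() are
# hypothetical names, not an actual product API.
from dataclasses import dataclass

@dataclass
class Pattern:
    """A reusable template capturing expert deployment practice."""
    name: str
    topology: dict   # components and how many of each to provision
    policies: dict   # e.g. performance targets and scaling rules

# A pattern encoding a three-tier web workload and its governing policies.
web_app_pattern = Pattern(
    name="three-tier-web",
    topology={"web": 2, "app": 2, "db": 1},              # instances per tier
    policies={"max_response_ms": 200, "scale_out_cpu": 0.75},
)

def deploy(pattern: Pattern) -> None:
    """Invoke the template: provision each tier, then apply its policies."""
    for tier, count in pattern.topology.items():
        print(f"provisioning {count} x {tier} per pattern '{pattern.name}'")
    print(f"applying policies: {pattern.policies}")

deploy(web_app_pattern)
```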

The SDE automatically orchestrates infrastructure resources to meet workload demands in near real time, scaling to meet changing demand and using predictive analytics to achieve the desired performance results. This allows the infrastructure to be exceptionally responsive to unpredictable changes in the market.
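As a rough illustration of this kind of orchestration decision, the sketch below sizes a workload’s capacity on the worse of the latest observed utilization and a naive one-step forecast. The 70% target and the linear extrapolation are assumptions made for the example, not a description of any specific product’s algorithm.

```python
# Sketch of demand-driven scaling with a predictive element; thresholds
# and the naive linear forecast are illustrative assumptions.
def forecast_next(samples: list) -> float:
    """Predict the next utilization sample by extrapolating the last trend."""
    if len(samples) < 2:
        return samples[-1]
    return samples[-1] + (samples[-1] - samples[-2])

def desired_replicas(current: int, cpu_history: list, target: float = 0.70) -> int:
    """Scale on the worse of observed and predicted utilization."""
    load = max(cpu_history[-1], forecast_next(cpu_history))
    return max(1, round(current * load / target))

# Rising load: 4 replicas at 80% (forecast 95%) against a 70% target -> 5.
print(desired_replicas(4, [0.55, 0.65, 0.80]))
```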

Open Standards and Support for Heterogeneous Infrastructures

The next-generation data centre is based on open standards. With support for platforms such as OpenStack, Linux/KVM and others, it allows organizations to achieve true interoperability between cloud and traditional models. It also facilitates the integration of today’s heterogeneous infrastructures, allowing organizations to bring legacy systems into the software-defined world.
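In practice, open APIs mean the same tooling can drive infrastructure from different vendors. The sketch below uses the openstacksdk Python client and assumes a cloud named "private" is already defined in the local clouds.yaml; the point is simply that instances are listed through a standard API regardless of the underlying hypervisor.

```python
# Sketch using the openstacksdk client; assumes a cloud named "private"
# is configured in clouds.yaml. Shown only to illustrate that an open,
# standard API decouples management tooling from the vendor underneath.
import openstack

conn = openstack.connect(cloud="private")

# Enumerate compute instances the same way whatever the backing
# hypervisor (e.g. KVM) happens to be.
for server in conn.compute.servers():
    print(server.name, server.status)
```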

By providing an open platform, these centres facilitate the exchange of information and services that is crucial for collaboration and comprehensive management. The IT infrastructure can be managed as a collective set of business resources, rather than as discrete compute, storage and network elements. An open design also lets organizations adopt new technologies more easily and avoid vendor lock-in, increasing the long-term viability of their investments.

ITIL-based Management and Results-based Metrics

With the organization served by multiple delivery models and a heterogeneous set of systems, service management is essential to deliver business benefit from IT at a controlled cost. The IT Infrastructure Library (ITIL) has long been associated with a service-focused approach to IT management, so it is reasonable that service management be based on ITIL, with accurate measurement and trend analysis, capacity management, and visibility into usage and cost. This should be an integral element of the next-generation data centre. IT metrics are another integral element: performance is measured at the component level (availability, utilization, recovery time) against service level agreements (SLAs).
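The arithmetic behind a component-level SLA check is straightforward, as the short sketch below shows for monthly availability. The 99.9% target and the 50 minutes of outage are assumed figures for illustration.

```python
# Illustrative arithmetic for one component-level SLA check; the 99.9%
# target and the sample outage figure are assumptions for the example.
MINUTES_PER_MONTH = 30 * 24 * 60

def availability(downtime_minutes: float) -> float:
    """Fraction of the month the component was up."""
    return 1.0 - downtime_minutes / MINUTES_PER_MONTH

sla_target = 0.999                      # "three nines" per month
observed = availability(downtime_minutes=50)
print(f"availability {observed:.4%}, SLA met: {observed >= sla_target}")
```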

What changes in the next generation is the emphasis on results-based indicators such as customer satisfaction, productivity and the quality of the user experience. Instead of measuring CPU, memory and disk utilization to evaluate the success of the systems, the services themselves are measured continuously, regardless of where the infrastructure runs.
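One widely used way to quantify the quality of the user experience is an Apdex-style score, which classifies response times as satisfied, tolerating or frustrated. The threshold and sample values below are illustrative assumptions.

```python
# Results-based indicator sketch: the standard Apdex formula applied to
# response times. Threshold and samples are illustrative assumptions.
def apdex(samples_ms: list, threshold_ms: float = 500.0) -> float:
    """Apdex = (satisfied + tolerating / 2) / total requests."""
    satisfied = sum(1 for t in samples_ms if t <= threshold_ms)
    tolerating = sum(1 for t in samples_ms
                     if threshold_ms < t <= 4 * threshold_ms)
    return (satisfied + tolerating / 2) / len(samples_ms)

print(apdex([120, 300, 800, 2300, 90]))  # user-experience score in [0, 1]
```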

Cognitive Computing

The concept of IT Operations Analytics, based on cognitive systems, represents the evolution of monitoring and operations, providing the ability to simulate the human thought process at extraordinary speed. It can process large volumes of fast-moving data, recognize patterns, detect anomalies and make complex decisions in a matter of seconds. Its ability to adapt and learn over time, and to process natural language, is what distinguishes cognitive systems from traditional analytical systems.

IT Operations Analytics is the evolution of monitoring: from real-time monitoring to a cognitive solution that analyzes the root cause of problems. Its preventive and predictive capabilities, which learn from what happens in the infrastructure to predict when it could happen again, may well be among the most innovative data technologies.
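One common building block of such analytics is statistical anomaly detection over a metric stream. The sketch below flags values that deviate more than three standard deviations from recent history; the window and cut-off are conventional assumptions rather than any particular product’s method.

```python
# Sketch of one common ITOA building block: flagging anomalies in a
# metric stream with a z-score over a sliding window. The 3-sigma
# cut-off is a conventional assumption.
import statistics

def is_anomaly(window: list, value: float, z_cutoff: float = 3.0) -> bool:
    """True if value deviates more than z_cutoff sigmas from the window."""
    mean = statistics.mean(window)
    stdev = statistics.stdev(window)
    return stdev > 0 and abs(value - mean) / stdev > z_cutoff

history = [41.0, 43.5, 42.2, 44.1, 42.8, 43.0, 41.9, 42.5]  # e.g. latency ms
print(is_anomaly(history, 42.6))   # False: within normal variation
print(is_anomaly(history, 55.0))   # True: candidate incident to investigate
```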

Becoming an innovative data centre requires change, not only in physical and operational terms but also at the organizational and cultural level. The evolution from hardware-centred thinking towards thinking focused on the optimization of services is critical in a fast-moving, data-driven reality.