Factors such as physical size and the degree of automation and virtualization of resources now make it increasingly necessary to distinguish between two classes of data centers. The traditional corporate data center is contrasted with a second category of infrastructure, the hyperscale data center, distinguished by its ability to scale virtually any load volume, both up and down. According to Synergy Research, this category of super data center shows a strong concentration phenomenon. At the end of 2018, the consulting firm counted 430 hyperscale-class facilities worldwide, 40% of them in the United States alone. The count covers the data centers built by the 20 main providers of cloud computing and Internet services, including the major operators of IaaS, PaaS and SaaS services, search, social networking and e-commerce. On average, according to Synergy, each of these twenty companies today manages 22 hyperscale data centers, a point of reference for all companies, large or small, that turn to approaches such as co-location, advanced hosting and housing services and, of course, public or hybrid cloud computing, in conjunction with their own on-premises resources.

According to IDC, the hyperscale data center club will be the real engine of innovation towards ever greater operational efficiency. Even public data centers of more conventional size (the more accurate term is perhaps “multi-tenant”) will continue to invest in sustainability, partly by leveraging agreements with their customers. As a result, the solutions market will grow, as will the number of acquisitions and partnerships between data center operators and companies capable of inventing new approaches to the efficiency and resilience of computing resources. Utilities themselves, Sergio Patano points out, will be involved in initiatives aimed at supporting customers in their strategies for sustainable and efficient IT. “Here too”, Patano continues, “a holistic view of energy use must prevail, embracing all offices, devices and work spaces and merging them into a single large, dynamically managed environment”. One foreseeable effect is ever greater investment in autonomous, small-scale energy architectures, the so-called microgrids: “infrastructure islands” capable of absorbing load variability by connecting intelligently, when necessary, with regional or national grids, as in the sketch below.
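
To illustrate how such a microgrid might coordinate local generation, storage and the public grid, here is a minimal control-loop sketch in Python. The thresholds, names and decision policy are illustrative assumptions, not drawn from the article or from any specific product.

```python
# Minimal sketch of a dispatch policy for a data center "infrastructure
# island" (microgrid). All thresholds and names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MicrogridState:
    local_generation_kw: float   # on-site solar, fuel cells, etc.
    it_load_kw: float            # current data center demand
    battery_charge_pct: float    # battery state of charge, 0-100

def dispatch(state: MicrogridState) -> str:
    """Decide where the next increment of power comes from (or goes to)."""
    surplus = state.local_generation_kw - state.it_load_kw
    if surplus > 0:
        # Store surplus locally first; export to the regional grid only
        # once the battery is nearly full.
        return "charge_battery" if state.battery_charge_pct < 90 else "export_to_grid"
    # Deficit: draw down the battery before importing from the grid,
    # keeping a reserve for resilience.
    if state.battery_charge_pct > 20:
        return "discharge_battery"
    return "import_from_grid"

# Example: a load spike beyond local generation, with a healthy battery.
print(dispatch(MicrogridState(local_generation_kw=800.0,
                              it_load_kw=1100.0,
                              battery_charge_pct=65.0)))
# -> "discharge_battery"
```

The point of the island model is exactly this kind of local decision-making: the grid connection becomes one option among several rather than the only source of power.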

The most important counterpart to sustainability is certainly simplification or, if you prefer, the surplus of intelligence and automation that anyone designing a data center today, for public or private purposes, must take seriously into consideration. This applies at every scale, whether the project concerns a hyperscale or high-capacity data center, or whether more limited volumes of resources must be concentrated, as in the case of peripheral data centers (edge data centers), which are beginning to take hold in multi-site or widely distributed IoT architectures. Another recent IDC “Techscape” study, “Worldwide Smarter Data Center Technologies, 2019”, addresses the topic of the smart data center by examining the technologies underlying the automation of these resources. “The parameters used to measure the intelligence level of a data center infrastructure”, Patano explains, “move along the axes of improving operational efficiency, reducing risk, accelerating the deployment of new workloads and the ability to monitor and manage all operations even remotely, an essential prerequisite for supporting edge IT”. Visibility and control are fundamental requirements for leveraging different types of resources, whether proprietary, co-located or simply rented as a service from a public provider, and for deploying them consistently both at the core and at the edge.
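
As a purely illustrative reading of those four axes, the sketch below computes a hypothetical weighted “smartness” score for an infrastructure. The axis weights, the 0-10 rating scale and the aggregation are assumptions made for the example, not IDC's actual methodology.

```python
# Hypothetical scoring sketch along the four axes the IDC study uses to
# gauge infrastructure intelligence. Weights, scale and aggregation are
# illustrative assumptions, not IDC methodology.

AXES = {
    "operational_efficiency": 0.30,
    "risk_reduction": 0.25,
    "deployment_speed": 0.25,
    "remote_monitoring": 0.20,   # prerequisite for edge IT support
}

def smartness_score(ratings: dict[str, float]) -> float:
    """Weighted average of per-axis ratings, each on a 0-10 scale."""
    if set(ratings) != set(AXES):
        raise ValueError(f"expected ratings for axes: {sorted(AXES)}")
    return sum(AXES[axis] * ratings[axis] for axis in AXES)

# Example: strong automation but limited remote visibility.
print(round(smartness_score({
    "operational_efficiency": 8.0,
    "risk_reduction": 7.0,
    "deployment_speed": 9.0,
    "remote_monitoring": 4.0,
}), 2))  # -> 7.2
```

Whatever the exact weighting, the low remote-monitoring rating in the example is the kind of gap that matters most in the scenario the article describes, since remote visibility is the prerequisite for managing edge resources at all.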