Recent news has been dominated by the boost that the worldwide energy and utilities sector has received from the exponential rise of Hyperscale datacenters (DCs). The focus is largely on renewable energy sources and distributed generation, driven by cost and environmental sustainability concerns.
On the flip side, while Hyperscale datacenters are architected for optimal utilization of resources, energy remains their largest cost driver. To rein in this cost, infrastructure and technology vendors are experimenting with innovative ways of cooling their racks.
Microsoft, in fact, submerged an entire datacenter in the ocean off the coast of Scotland in July this year, garnering worldwide media coverage. And while the power and telecom industries have been able to drive new revenue streams from Hyperscale DCs, these sectors stand to gain from Hyperscale architecture in many other ways.
Addressing cost and scalability challenges
The telecom sector is arguably the world’s largest consumer of network, storage and compute resources today. As the provider of the underlying infrastructure for the World Wide Web, it is also the most data- and storage-intensive sector in the world. With the explosion in consumer devices, apps and data, the telecom industry is facing unprecedented infrastructure challenges, even as a hyper-competitive market puts huge pressure on telco margins.
The power generation and distribution sector is also seeing a huge influx of consumer data. A sector that was earlier focused only on turbines and towers now also needs to manage and analyse terabytes of consumption data every day. The infrastructure needed to manage data at this scale adds significantly to cost pressures.
In this cost-sensitive scenario, Hyperscale DCs give telecom and power companies the opportunity to consolidate and virtualise their data, network and compute resources. This allows them to optimize infrastructure costs while absorbing huge day-to-day variations in load.
Making grid integration a reality
India is one of the most vibrant environments for renewable energy in the world, with more than $40 bn having been invested in alternative energy sources in the last four years. As renewable/alternative energy becomes a major focus area for organizations and governments, solutions need to be found to integrate distributed renewable energy resources (DERs). Examples of distributed energy resources include solar panels, local wind energy generators, power backups, etc.
Managing an integrated network of energy resources requires huge investments in network analytics and infrastructure maintenance. A recent example is the Green Energy Corridor (GEC) being set up by the Indian Government across Himachal Pradesh, Rajasthan, Gujarat, Madhya Pradesh, Maharashtra, Karnataka, Andhra Pradesh and Tamil Nadu. The project is expected to be completed by 2020 at a cost of nearly $6 bn.
The volume and complexity of data generated by devices and sensors across such an integrated, diverse, and democratized grid, is likely to be enormous. Managing and leveraging this data would require a parallel set of storage, compute and network infrastructure that maps to every node across the grid. Typical datacenters are not likely to have the scale or technology to handle loads of this nature. Hyperscale datacenters are likely to be the most optimal solution for such requirements.
Maximizing the value from IoT and consumer-generated data
With the rise of the IoT ecosystem, the generation and consumption of digital data has exploded across most industries. The telecom and power sectors, being metered utilities, stand to gain the most from connected devices or ‘things’. Both industries are realizing the power of consumer- and network-generated data to drive greater efficiencies and enhance customer experience.
The data-processing horsepower and storage needed will, of course, be orders of magnitude greater than a decade ago. The following areas provide immense opportunities for power and telecom companies to innovate and grow:
- Combine device-generated data (smart meters) and consumer-generated data (preferences, geography, demographics, frequency, etc.) to improve grid management and optimize network traffic.
- Adapt to the data and analytics needs of the emerging Home Automation (Smart Home) market. Energy and utility companies will need to build capabilities to integrate, standardize and process data coming from different types of Home Automation devices. Recent AI-driven entrants to this space include Google Assistant and Amazon’s Alexa.
- Adapt legacy application and infrastructure for cloud environments, to achieve unmatched scalability, extensibility and interoperability.
- Leverage data from sensors, devices, analytics apps, smart meters, etc. to forecast utilization, load, availability and performance challenges, leading to accurate assessment of future requirements. Next-generation forecasting technology is critical to success, and needs to replace the current, decades-old methodologies used in today’s power and utility companies.
- Use data and analytics to optimize distributed generation and make the best use of alternative energy sources.
- Analyse consumer data (including social data) to enhance customer satisfaction, offer innovative services and grow the top line.
- Use Software Defined Network (SDN) constructs, predictive analytics and statistical models to optimize network load and performance management.
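The forecasting opportunity above can be illustrated with a minimal sketch. This is a hypothetical example, not any vendor's actual system: it forecasts next-hour feeder load from aggregated smart-meter readings with a simple moving-average baseline and flags overload risk against a capacity threshold. Real deployments would use far richer models incorporating weather, seasonality and per-node telemetry; all function names and figures here are illustrative.

```python
def moving_average_forecast(readings, window=4):
    """Forecast the next value as the mean of the last `window` readings."""
    if len(readings) < window:
        raise ValueError("not enough readings for the chosen window")
    return sum(readings[-window:]) / window

def flag_overload(forecast_kw, capacity_kw, headroom=0.9):
    """Flag when forecast load exceeds a safety fraction of capacity."""
    return forecast_kw > capacity_kw * headroom

# Hypothetical hourly aggregate load (kW) reported by a feeder's smart meters
hourly_load = [120.0, 135.0, 150.0, 160.0, 170.0, 180.0]

forecast = moving_average_forecast(hourly_load, window=4)
print(f"next-hour forecast: {forecast:.1f} kW")          # mean of last 4 readings
print("overload risk:", flag_overload(forecast, capacity_kw=180.0))
```

Even a toy like this shows why the infrastructure question matters: run per feeder, per hour, across millions of meters, the storage and compute footprint quickly reaches the scale that Hyperscale architectures are built for.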
Adapting to technology changes
Infrastructure investments in the energy and utilities sectors have traditionally tended to be large-scale and long-term. As market dynamics – including consumer expectations, regulatory pressures, industry standards and cost structures – change quickly, companies in the telecom and power sectors can no longer rely on legacy infrastructure to handle future performance and scalability needs.
Today, we are seeing a spectrum of emerging technology trends such as Artificial Intelligence, Machine Learning, Deep Learning, Predictive Modelling, Blockchain, Virtual Reality, Augmented Reality, DevOps, API-driven architectures and in-sprint testing. Unless power and telecom companies adapt to these trends, they face the constant risk of technology obsolescence and consumer disengagement.
And while organizations need to build pilots and proofs-of-concept around emerging technologies, they need significant infrastructure (compute, storage and network) to test and operationalize these use cases. They also need an environment that is rapidly scalable and highly optimized in terms of resource utilization, traffic management and coverage across cloud services (including nodes, platforms and environments). The Hyperscale DC concept includes pre-configured, virtualized environments that address most of these needs and take business agility to the next level.
Finally, it is only a matter of time before the energy and utilities sectors adopt the Hyperscale notion extensively, to drive real value for organizations as well as consumers. However, these initiatives will need to extend far beyond consumer data and touch almost every part of both supply and demand. Companies in the energy and utilities sector have the opportunity to leverage Hyperscale architectures, build digital capabilities, deliver huge gains in productivity, lower costs, build business agility, and drive innovation to meet future market needs.
Vimal Kaw is Associate Vice President, Products & Services, Netmagic (An NTT Communications Company)
Disclaimer: This article is published as part of the IDG Contributor Network. The views expressed in this article are solely those of the contributing authors and not of IDG Media and its editor(s).