Blog

How is the Data Center Optimization Initiative (DCOI) Changing the Landscape? (Part 1)

Posted on May 8, 2017 by Gento  |  Comment (0)

Data centers are becoming more important with each passing day. One study estimates that data center construction worldwide will continue to grow at a rate of about 21% per year through 2018. Yet at the same time, the infrastructure that supports these mission-critical data centers is woefully inefficient, even on its best days. An estimated 30% of servers in the United States alone are considered comatose, meaning that expensive computing assets are going to waste. Worldwide, experts estimate that about 10 million servers (representing roughly $30 billion in capital) have neither performed valuable computing work nor delivered data in the past six months.

Simply put, things are growing too large, too fast. The complexity of these environments is changing all the time, making it difficult for data center operators to keep up with the issues that lead to downtime and lost productivity - let alone increase design and operational efficiency at the same time.

In many ways, these are the types of issues that the federal government's Data Center Optimization Initiative (DCOI) was designed to address. The United States government is one of the biggest data center users on the planet, owning approximately 2,000 data centers on its own, so federal officials have a vested interest in strengthening the efficiency of the existing infrastructure - both to better meet the needs of today and to prepare for the anticipated needs of tomorrow and beyond.

But just how is the DCOI changing the landscape of the modern-day data center? While a certain level of disruption is to be expected, and even welcomed, this is an understandably complicated topic that requires you to keep a few key things in mind.

The Data Center Optimization Initiative: Breaking It Down

At its core, the DCOI is a federal mandate that requires agencies to "develop and report on data center strategies" in an effort to consolidate aging and inefficient infrastructure, make better use of existing facilities, unlock new cost savings, and begin a lengthy but pivotal transition to a more efficient infrastructure - essentially all at the same time.

To that final point, as of March 2016, United States government agencies are no longer allowed to build new data centers or expand existing ones until they can prove that there are no other viable (read: more efficient) alternatives available. Acceptable alternatives include, but are not limited to:

  • Transitioning into cloud computing and related services.
  • Leasing colocation space.
  • Using services shared with other agencies.
  • Adopting DCIM (Data Center Infrastructure Management) software.

When you consider that the US government's total annual IT spend is estimated at about $80 billion, placing a high priority on more affordable, better options makes a great deal of sense. The government spent close to $5.4 billion on physical data centers alone in 2014.

What Has Actually Changed?

The DCOI was primarily intended to replace the rules and goals laid out in the Federal Data Center Consolidation Initiative (FDCCI), which debuted in 2010. A few of the major objectives of the FDCCI included:

  • Reducing the overall footprint of government data centers, both in terms of energy consumption and real estate requirements.
  • Reducing the total cost of hardware, software, and operations of each data center.
  • Strengthening the IT security capabilities of the federal government.
  • Shifting towards more efficient computing platforms and technologies wherever possible.

The DCOI, on the other hand, is much stricter in its primary goals of reducing both the government's data center inventory and the total cost required to maintain it. The DCOI places a heavy emphasis on the role of cloud computing moving forward, and if everything goes to plan, the government wants to reduce spending on physical data centers by as much as $1.36 billion over the next three years.

In Part 2, we will discuss what these changes mean for the consistently evolving data center landscape moving forward. To read Part 2, click here: http://www.raritan.com/blog/detail/how-is-the-data-center-optimization-initiative-dcoi-changing-the-landscape-




What Are the Top Concerns for Data Center Managers?

Posted on May 22, 2018 by Gento  |  Comments (4)

Top Four Concerns for Data Center Managers
Preparedness is often the first step toward resolving potential challenges and devising solutions. As such, in order to properly prepare for the future of data centers, it is important to first understand the top concerns of data center managers.

1. Climate Change. -- In a 2018 study, more than 50 percent of participating organizations expressed concern about the potential for climate change to disrupt existing data centers. Across the globe, organizations need to take into consideration the potentially negative impacts of rising temperatures, growing floodplains, and an increase in violent storms - all of which coincide with an increase in region-wide disasters. In order to prepare for this challenge, data centers need to incorporate disaster and emergency planning efforts into the broader business continuity plans of the entire organization.

2. Data Center Infrastructure Security Threats. -- Due to the sensitive business and personal information that they hold, data centers need to remain vigilant against potential infrastructure security threats. Recent studies show that these attacks are increasingly conducted over IP. As such, organizations need to effectively control which machines are connected to their data centers. Through private networks, a limited number of access points, and stringent monitoring systems (see the sketch after this list), data centers can remain prepared to effectively combat infrastructure security threats.

3. Emerging Edge Computing Capacity. -- Edge computing is set to be one of the emerging technologies that disrupt the data center sector. This type of technology is a direct response to the need to process data closer to where it is generated, consumed, integrated, and computed. As with any emerging data center technology, the concern for many managers lies in security and data sovereignty. However, as organizations require access to data at the "edge," these solutions will continue to be implemented for a variety of purposes. From "store and forward" to data consolidation and backup, self-contained micro-modular data centers will play a key role in deploying a viable solution for edge computing.

4. DCIM Strategies. -- Data center complexity is on the rise. In response, the requirements for control, management, and visibility from DCIM software have also grown. Fortunately, DCIM products have matured to offer rich, scalable, and stable management solutions that increase the forecasting ability, agility, and efficiency of data centers. While DCIM is still under-deployed, it is expected to become a more widely adopted solution as it continues to mature. The challenge for data center managers will be creating and implementing the operational changes needed to support DCIM software.
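
As a simple illustration of the "limited access points and stringent monitoring" idea from item 2, here is a minimal sketch that checks connection log entries against an allowlist of approved management subnets. It is a generic example only: the subnets, log format, and field names are illustrative assumptions, not part of any specific product.

# Minimal sketch, assuming a hypothetical connection log and allowlist:
# flag management-network connections that do not originate from an
# approved set of access points. Subnets and field names are illustrative.
import ipaddress

APPROVED_SOURCES = [
    ipaddress.ip_network("10.10.0.0/24"),   # hypothetical jump-host subnet
    ipaddress.ip_network("10.20.5.0/28"),   # hypothetical NOC workstations
]

def is_approved(source_ip: str) -> bool:
    """Return True if the connecting address falls inside an approved subnet."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in APPROVED_SOURCES)

def review_connections(log_entries):
    """Yield log entries whose source IP is outside the approved access points."""
    for entry in log_entries:
        if not is_approved(entry["src_ip"]):
            yield entry

# Example with two hypothetical log entries: one approved, one flagged.
sample_log = [
    {"src_ip": "10.10.0.42", "target": "pdu-rack-07"},
    {"src_ip": "203.0.113.9", "target": "pdu-rack-07"},
]
for suspicious in review_connections(sample_log):
    print("Unapproved access attempt:", suspicious)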

The Bottom Line: Be Prepared for Upcoming Challenges
Climate change, infrastructure security threats, edge computing, and DCIM strategies are all areas of concern for data center managers. As these emerging technologies continue to be adopted, data centers will need to take a proactive approach. With the right knowledge and preparation, data centers can more readily adopt the technologies needed to meet the growing needs of organizations, and can continue to grow and evolve as they address the concerns of their managers.

Find out how Raritan can help solve your data center power concerns. Visit our website here.


Why Easy to Use PDUs will Help Your Data Center

Posted on May 17, 2018 by Gento  |  Comments (4)

With the rapid expansion of data centers creating highly complex IT infrastructures, it’s becoming more important than ever to find ways to increase the efficiency of day-to-day operations. One of the most practical ways to achieve improved efficiency is through devices that are easy to use and easy to deploy.


Cost Savings with Micro Data Centers

Posted on May 4, 2018 by Gento  |  Comments (2)

Major changes in data center operations have historically involved the location of data processing. At one time, this function was moved off-site to mainframes, but the advent of microcomputers, now known as desktops or PCs, brought data processing back to the customer’s own data center. Cloud servers and colocated data centers resulted in data processing being performed off-site once again.

Today, some organizations are using micro data centers to process data on their own premises. This solution can provide performance improvements that justify the initial expense of a data center and has the potential to gain wide acceptance in the near future.


5 Reasons to Prioritize Rack-Level Management Now

Posted on April 25, 2018 by Gento  |  Comments (17)

If you’re an IT leader, you’ve probably made significant investments in data center management over the past few years. That’s because the success of the organization you serve depends heavily on the technical and economic performance of your data center. So the more digital your organization becomes, the smarter you have to be about how you manage your data center infrastructure.

Chances are, though, that you’ve focused on aggregate management of your data center as a whole. That’s good – but it will only get you so far. To fully optimize the value your business derives from its data center capex and opex, you must aggressively pursue operational excellence at the rack level.


What is an intelligent PDU?

Posted on April 25, 2018 by Gento  |  Comments (16)

An Intelligent Power Distribution Unit (iPDU) is a networked power distribution unit that increases the efficiency of data centers with real-time remote power monitoring, environmental monitoring, and data center infrastructure integration. Intelligent rack PDUs deliver technologies that enable a smarter IT infrastructure so you can stay ahead of problems before they occur. They help achieve the ultimate goal of any data center manager: maintaining uptime while reducing cost.
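
To make the idea of real-time remote power monitoring concrete, here is a minimal polling sketch that fetches per-outlet power readings and flags outlets drawing more than a set threshold. The endpoint URL, JSON field names, and threshold are hypothetical assumptions for illustration, not a documented Raritan API.

# Minimal monitoring-loop sketch. The endpoint path, response shape, and
# threshold below are illustrative assumptions, not a documented PDU API.
import time
import requests

PDU_URL = "https://pdu-rack-07.example.com/api/outlets"  # hypothetical endpoint
ALERT_WATTS = 1800  # hypothetical per-outlet alert threshold

def poll_outlet_power(session: requests.Session):
    """Fetch per-outlet active power readings and return any over the threshold."""
    response = session.get(PDU_URL, timeout=5)
    response.raise_for_status()
    outlets = response.json()  # assumed shape: [{"id": 1, "active_power_w": 230.5}, ...]
    return [o for o in outlets if o["active_power_w"] > ALERT_WATTS]

if __name__ == "__main__":
    with requests.Session() as session:
        while True:
            for outlet in poll_outlet_power(session):
                print(f"Outlet {outlet['id']} drawing {outlet['active_power_w']} W")
            time.sleep(60)  # poll once a minute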





Upcoming Events

Cisco Live 2018
March 6-9, 2018  •  Melbourne, Australia
DataCloud Asia 2018
March 22, 2018  •  Singapore
DCD Indonesia
Apr 5, 2018  •  Jakarta
Data Centre World Hong Kong
May 16 – 17, 2018  •  Hong Kong
CDCC China
May, June, July, Nov 2018  •  Wuhan, Hangzhou, Chengdu, Beijing

View all Events

Latest Raritan News

Legrand Makes 451 Research’s List of Largest Data Center Technology Suppliers
Posted on May 4, 2018
Finding Weak Links in Your Security Policy and How to Safeguard Data Centers Will Be Addressed by Raritan Speaker at AFCOM Conference
Posted on March 8, 2018
Raritan Introduces Secure Switch for KVM Access to Government and Military Computers
Posted on March 6, 2018
Ashley Fox of Raritan Inc. Recognized as 2018 CRN Channel Chief
Posted on February 27, 2018
Packet, Myriad Supply, Raritan, and Data Center Knowledge to Participate in ‘Managing the Edge’ Webinar
Posted on November 12, 2017

View all news