Data Center Knowledge | News and analysis for the data center industry
Wednesday, January 8th, 2014
12:30p
Data Center Jobs: Latisys
At the Data Center Jobs Board, we have two new job listings from Latisys, which is seeking a NOC Manager and a Director of Data Center Operations in Ashburn, Virginia.
The NOC Manager is responsible for overall customer satisfaction and experience; supporting Data Center Operations management in the support and maintenance of the facility; hiring, developing, and managing a 24×7 NOC team, including staffing, coaching, and training; facilitating the collection, reporting, and review of performance metrics; and monitoring workloads, recommending changes in staffing levels, and overseeing the team's staffing, utilization, and productivity, including maintenance of the daily shift/work schedule. To view full details and apply, see the job listing details.
The Director of Data Center Operations is responsible for ensuring that data center service levels and key business plan initiatives are met; directing, managing, and supervising data center operations to provide efficient and effective information systems, services, and support to customers; ensuring the cost-effective use of data center resources, timely response to data center issues, and maximum effectiveness in routine daily operations; department resource planning and budgeting; and overseeing construction management. To view full details and apply, see the job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.
1:30p
2014: The Year of DCIM
Gary Bunyan is Global DCIM Solutions Specialist at iTRACS, a CommScope Company and provider of a Data Center Infrastructure Management (DCIM) software suite. This is the first in a series of columns about the business value of DCIM as an open infrastructure management platform.
GARY BUNYAN, iTRACS
It’s that time of year again – when everyone makes predictions about everything from who’s winning the tablet wars to which app will be the “next Instagram.”
So here’s my prediction:
2014 is the year of DCIM – the year that DCIM will drive increasing business value for users in three key ways: reducing operational costs, deferring capital expenditures, and increasing business output from the IT assets on the data center floor.
Now let me take this prediction a step further:
2014 will be the year that truly OPEN DCIM solutions are the ones most widely embraced by data center owners, operators, and users, because these are the solutions with the best chance of achieving the highest ROI.
Open is the natural evolution of DCIM. By its very nature, being open is where DCIM can deliver its highest return on investment back to the business. By “open,” I mean a DCIM platform that:
- is purchased as a software-based suite solution (not a point product)
- uses industry standards and communications protocols to ensure seamless interoperability and the free bi-directional exchange of information with other systems
- offers a data exchange framework that allows for rapid, hassle-free integrations with third-party vendors both within and outside of the data center (short development cycles = rapid time-to-value at low customer cost)
I’m talking about a DCIM platform that invites, embraces, and empowers collaboration within a broad vendor ecosystem, opening doors to greater productivity and efficiency for data center owners, operators, and users. A collaborative environment that is inclusive enough to serve as a holistic “information hub” across both IT and Facilities infrastructure.
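To make “open” a bit more concrete: a platform like this speaks the same standard protocols the equipment on the floor already speaks. The sketch below is a minimal, purely illustrative example of polling a power reading over SNMP, one of the industry-standard protocols such integrations typically rely on. The hostname, community string, and OID are hypothetical placeholders, not references to any particular product, and the code uses the classic pysnmp (4.x) high-level API.

```python
# Minimal sketch: polling one power reading over SNMP (a common standard protocol).
# The host and OID below are hypothetical placeholders for illustration only.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

def read_power_watts(host: str, oid: str) -> float:
    """Fetch a single SNMP value and interpret it as a power reading in watts."""
    error_indication, error_status, _, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData("public", mpModel=1),   # SNMP v2c, default community
            UdpTransportTarget((host, 161)),
            ContextData(),
            ObjectType(ObjectIdentity(oid)),
        )
    )
    if error_indication or error_status:
        raise RuntimeError(f"SNMP poll failed: {error_indication or error_status}")
    return float(var_binds[0][1])

# Hypothetical rack PDU and a placeholder OID:
print(read_power_watts("pdu-rack-42.example.net", "1.3.6.1.4.1.99999.1.1.0"))
```

An open DCIM suite would layer normalization, storage, and analytics on top of thousands of reads like this one, plus a documented API for pushing the results to other systems.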
The alternative? It’s not pretty. I’m referring to proprietary and/or hardware-constrained tools that handcuff decision-makers with inherent limitations in functionality, technology dead-ends, burdensome hardware costs, unreliable access to information, fragmented visibility into operational or planning metrics, and integration roadblocks that discourage rapid collaboration (the opposite of empowering).
NO DCIM IS AN ISLAND. A DCIM suite that’s open offers deeper visibility and richer information-sharing with other systems. How else can it pull together so much data from so many disparate sources? (Image courtesy of iTRACS.)
“Open” is where the DCIM world will continue to go.
Why do I say this?
1. No vendor is an island. Being open optimizes a DCIM vendor’s ability to collaborate with other systems and data sources to meet the customer’s needs, providing the total “command and control” management platform that users are looking for. It takes a web of capabilities and information-sharing to deliver the holistic infrastructure management solution that today’s decision-makers demand – and that is best achieved by an open software platform that uses industry-standard interfaces and protocols to facilitate rapid, hassle-free integrations with other systems.
2. Opportunity is knocking. An open software platform offers expanded coverage across IT and Facilities, widening the web of assets that can be monitored and managed within the DCIM platform. When it comes to gathering and analyzing operational data like power, space, cooling, etc., the more, the merrier. This interoperability can also include outside enterprise systems like finance, ERP, and other management tools. Integration with enterprise systems can enhance how assets are sourced, purchased, and financed over their entire lifecycles, improving cost efficiency over the long haul.
3. More collaboration means more insight. DCIM isn’t about data. It’s about turning that data into insight that improves decision-making. It’s about turning guesswork – what you “think” is going on in the data center – into knowledge – what you KNOW is going on.
Turning data into insight isn’t just triggering an alarm or monitoring a device. It’s about providing information on the past, present and future of the entire physical ecosystem to drive efficiency and business planning. For example, DCIM can help you manage and reduce opex not just for the year but over the entire lifecycle of the data center. It can predict the future using “what if” scenarios and predictive modeling.
This means not just optimizing day-to-day operations, but using capacity planning to look ahead and optimize future utilization and capacity requirements across all four key resources – power, space, cooling, and network ports. This level of resource management requires accessing and analyzing data from multiple sources including operational, asset, power, and even financial (purchasing) systems. This is achievable today with the right DCIM software suite.
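As a hedged sketch of what such a four-resource “what if” check might look like in code (the data model and numbers are invented for illustration; a real DCIM suite works from live asset and sensor data):

```python
from dataclasses import dataclass

@dataclass
class Capacity:
    """The four key data center resources tracked in capacity planning."""
    power_kw: float
    space_u: int        # rack units
    cooling_kw: float
    network_ports: int

def headroom(total: Capacity, used: Capacity) -> Capacity:
    """Remaining capacity across all four resources."""
    return Capacity(
        total.power_kw - used.power_kw,
        total.space_u - used.space_u,
        total.cooling_kw - used.cooling_kw,
        total.network_ports - used.network_ports,
    )

def fits(head: Capacity, demand: Capacity) -> bool:
    """'What if' check: does a planned deployment fit the remaining capacity?"""
    return (demand.power_kw <= head.power_kw
            and demand.space_u <= head.space_u
            and demand.cooling_kw <= head.cooling_kw
            and demand.network_ports <= head.network_ports)

# Illustrative numbers only.
room = Capacity(power_kw=400.0, space_u=4200, cooling_kw=450.0, network_ports=2000)
in_use = Capacity(power_kw=310.0, space_u=3650, cooling_kw=340.0, network_ports=1720)
new_cluster = Capacity(power_kw=60.0, space_u=600, cooling_kw=65.0, network_ports=192)

print(fits(headroom(room, in_use), new_cluster))  # False: rack space is the constraint
```

The value of a what-if check like this is that it surfaces the binding constraint – here, rack space, even though power and cooling headroom remain – before the hardware is ordered.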
Open is the only way DCIM can go if it is to fulfill its true mission:
- Keeping the physical layer continuously aligned to the needs of the business
- Ensuring that the infrastructure investment delivers maximum business value to the enterprise
So... in 2014, you will see DCIM continue to spread its wings as an open technology.
And I believe you will see the benefits of this in three key areas:
1. Operational efficiency – reducing Opex and deferring Capex
2. Assets and connectivity – understanding the interrelationships and improving business output of IT assets
3. Holistic capacity planning around space, power, network, and cooling requirements – being able to plan for all four of these vital resources.
So do you feel it … the window of opportunity opening?
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
2:00p
Cosentry Enters St. Louis Market With Acquisition of XIOLINK
Omaha-based Cosentry is a big player in the Midwest, and it just got a little bigger. The company acquired St. Louis-based XIOLINK, moving it into a fourth market and strengthening its managed services portfolio. In addition to Omaha, and now St. Louis, Cosentry has data centers in Kansas City and Sioux Falls, S.D. Financial terms of the deal were not disclosed.
XIOLINK is an established player in the St. Louis market. It approached the market from a managed services perspective and later added data center space. Cosentry’s roots are in colocation, but the company has expanded into cloud services and hosting infrastructure. XIOLINK’s managed services not only bring Cosentry into a new market, they also fill out an already rich Cosentry portfolio of colocation, cloud and help desk services.
“The rationale for the acquisition is two-fold,” said Cosentry CEO Brad Hokamp. “As a company, we want to grow our business and establish a greater position in the Midwest. St. Louis is a priority market for Cosentry. Looking at the St. Louis market, it’s a great foundation to expand into. We’re looking to expand in the Midwest, into underserved markets. We looked at the market opportunities, and St. Louis had the business demographics.”
Growth for Cosentry
It was a busy 2013 for Cosentry. The company completed a refinancing of its existing credit facilities that provided the company with up to $100 million of capital for its continued expansion. It also named industry veteran Hokamp its new CEO, and doubled capacity in Omaha after rapidly selling out its first phase at its Midlands data center.
XIOLINK customers also gain from the acquisition, which brings a better network and access to Cosentry’s data center portfolio, which can be leveraged for disaster recovery services. XIOLINK spent around $25 million on renovations and expansion in 2013.
“We are excited about the newly combined company’s strength and scale, which will allow us to continue providing customers with exceptional service while investing in and expanding our operations in the Midwest region,” said Brad Pittenger, XIOLINK CEO. “We will integrate seamlessly into Cosentry and be the platform for the managed services operations of the new company. In addition, our customers will now benefit from multi-city disaster recovery services throughout the region.”
Cosentry will operate XIOLINK’s two data centers in St. Louis. Pittenger, who co-founded XIOLINK in 1999, will leave the company after a transition period of three to six months. Cosentry’s head of operations will relocate to St. Louis from Omaha.
Cosentry will now operate eight data centers across four Midwest markets: Omaha, Kansas City, St. Louis and Sioux Falls. In addition, the breadth of services now includes private and public cloud capabilities, hosting infrastructure solutions, extensive managed services, disaster recovery and help desk outsourcing.
Cosentry specializes in Midwest data center services, providing colocation, cloud, hosting and managed services out of facilities built with resiliency in mind. Private equity firm TA Associates acquired Cosentry in 2011.
2:00p
Understanding and Mitigating Risk From Data Center Threats
Distributed denial of service (DDoS) attacks are getting larger and more powerful.
Today’s business world is becoming ever more reliant on the data center, which has become the heart of any modern organization. With virtualization and cloud computing driving gains, many are saying that it’s a great time to be in the data center business. That may be the case from an infrastructure standpoint, but we can never forget that the more people move toward a platform, the bigger a target it becomes.
Here’s the real problem: attacks against cloud providers have not only increased, they’ve also become much more sophisticated. A recent Arbor Networks report indicates that in the first quarter of 2013, the previous record for the largest reported DDoS attack, around 100 Gbps, was shattered by a 300 Gbps DNS reflection/amplification attack targeting Spamhaus. Attackers have had the technical capability to generate attacks of this magnitude for some time, and this has now been demonstrated. The attack vector used in this case was not new: DNS reflection/amplification has been used to generate several of the largest attacks seen on the Internet in recent years. These attacks are actually relatively common, but usually at much lower traffic levels.
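To see how an attack reaches 300 Gbps, it helps to work through the reflection/amplification arithmetic. The figures below are illustrative assumptions about query and response sizes, not measurements from the Spamhaus incident:

```python
# Back-of-the-envelope DNS reflection/amplification arithmetic.
# All figures are illustrative assumptions, not measured values.
query_bytes = 64        # small DNS query with a spoofed source address
response_bytes = 3000   # large response (e.g. an ANY query returning DNSSEC records)

amplification = response_bytes / query_bytes    # ~47x
target_gbps = 300                               # observed flood size
attacker_gbps = target_gbps / amplification     # bandwidth the attacker must source

print(f"amplification factor: ~{amplification:.0f}x")
print(f"spoofed-query bandwidth needed for {target_gbps} Gbps: ~{attacker_gbps:.1f} Gbps")
```

The open resolvers multiply the attacker’s outbound bandwidth, which is why a comparatively modest amount of spoofed-query traffic can saturate even very large links.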
So, what do you do in these types of situations? A new term has been circulating in the industry: the next-generation security platform, which is much more than just a physical box sitting in the data center. Traditional means of securing a data center simply won’t cut it anymore. Standard perimeter defenses at the data center layer must be optimized to handle new types of security threats, and there has been a leap in security technologies, with advanced engines performing much deeper inspection than a regular firewall would.
- Virtual appliances. Many organizations are now deploying security appliances at numerous points in the data center. No longer bound by physical placement, virtual appliances can roam between physical hosts and have visibility into more parts of the network. Plus, they’re easier to manage from an agility perspective. Furthermore, administrators can dedicate a virtual security appliance to a specific function, meaning an appliance can reside departmentally, performing a certain type of service for that team. For example, Fortinet’s FortiGate virtual appliances create a new layer of security for physical and virtual infrastructure. By placing these virtual appliances inside a network, you’re able to monitor for intrusion attempts, malware, unauthorized devices, network viruses, and much more. Doing something similar with physical devices would prove much more expensive.
- New scanning engines. Advanced deep scanning engines like data-loss prevention (DLP), intrusion detection/prevention services (IDS/IPS), and even device interrogation help lock down an environment. Intelligent network monitoring algorithms allow administrators to control what data flows in and out of the environment. Furthermore, these new engines help control the various consumer devices trying to enter the environment. Security administrators can allow or deny access for certain types of devices; they can even geo-fence their environment and block devices arriving over an unsecured connection. These devices can then be interrogated, authorized and even provisioned, all through secure corporate communications.
- Creating layered security. The idea is to create intelligent security layers within your data center that secure the entire environment. The most important point to consider is that security layers can be logical, physical, or both, and security professionals can align resources with appropriate security measures as needed. For example, entire applications can be placed behind intelligent, heuristic, learning engines that monitor for anomalous activity. Not only do they protect internal resources, they continuously monitor these applications against signatures based on public vulnerability databases (e.g. Snort, CVE, Bugtraq). There’s also the physical side of security. Virtualization is certainly very cool, but what happens when you need raw throughput as well as next-generation security? Physical security appliances are now being integrated with 10GbE cards capable of even greater amounts of expansion. You can have the best core infrastructure out there, but if your edge appliances can’t handle the throughput, you’re going to have a serious bottleneck.
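The signature-based monitoring described above boils down to comparing traffic against a library of known-bad patterns. Here is a deliberately naive sketch of that idea; real engines consume maintained rule sets (such as Snort’s, keyed to CVE and Bugtraq entries) and use far more protocol context, and the patterns below are invented for illustration:

```python
import re

# Toy signature set, invented for illustration; production engines consume
# maintained rule databases rather than hand-written patterns like these.
SIGNATURES = {
    "sql-injection-probe": re.compile(rb"union\s+select", re.IGNORECASE),
    "path-traversal": re.compile(rb"\.\./\.\./"),
    "shellshock-probe": re.compile(rb"\(\)\s*\{\s*:;\s*\}"),
}

def match_signatures(payload: bytes) -> list:
    """Return the names of all signatures that match a captured payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(payload)]

print(match_signatures(b"GET /page?id=1 UNION SELECT password FROM users"))
# ['sql-injection-probe']
```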
Let’s face it, there are always going to be bad guys out there, and their attacks will continue to grow more sophisticated. However, by deploying intelligent technologies and good security practices, these attacks can be stopped at the border before they cause real damage. Cloud computing, IT consumerization and the influx of information housed within the data center have created new types of targets.
As more organizations increase their reliance on the modern data center, attack vectors will continue to evolve. Cloud and next-generation security is designed to mitigate risk above and beyond standard security measures. Remember, a good security solution not only protects your environments, it can increase your infrastructure agility as well. The future will leverage data center and cloud technologies even more, and with mobility and the evolution of the end-user, organizations must look at new ways to secure their environments.
2:25p
Cisco: 2014 Will Be Key Tipping Point for Internet of Everything
At his keynote at CES, Cisco CEO John Chambers said 2014 will be the “transitional, pivotal year” for the Internet of Everything.
At the CES Tech Titans keynote addresses Tuesday, Cisco (CSCO) CEO John Chambers enthusiastically ran onto the stage, delivering a vision of an age of unprecedented technology – set to eclipse even the first Internet boom – and stating that he believes 2014 will be the “transformational, pivotal point” for the Internet of Everything (IoE).
In this next massive wave of change, Cisco predicts the number of connected things will grow to between 15 and 25 billion by 2015, before exploding to 40 or 50 billion by 2020. The framework the company presents to foster this dynamic IoE environment is the intelligent connection of people, process, data and things, with ‘people’ being paramount.
“Cisco has led customers through every Internet transition over the last 30 years,” said Blair Christie, senior vice president and chief marketing officer, Cisco. “The Internet of Everything is perhaps the most promising of these, creating unprecedented opportunities for organizations, individuals, communities and countries to realize dramatically greater value from networked connections between people, processes, data and things. We are working to harness the IoE’s power and promise to connect the unconnected, and we look forward to showcasing several life-changing scenarios for the IoE, here at CES in Las Vegas.”
Cisco estimates that the intersection of these elements presents a $14.4 trillion opportunity for the private sector, which could increase global aggregate corporate profits by about 21 percent. Cisco’s extensive research on the IoE appears in its 2013 IoE Value Index, showing how firms from different industries in 12 countries are capturing the potential value made possible by the increased connections of the IoE.
TV in the Cloud
Cisco announced it has expanded its Videoscape TV services delivery platform to include a host of new cloud video capabilities, including a Videoscape “as-a-service” offering and open cloud software technologies based on OpenStack. These added capabilities will help service providers and media companies enhance agility, increase revenue and reduce operating expenses, as well as capitalize on the Internet of Everything.
New Videoscape cloud capabilities include cloud software, which decouples Videoscape from dedicated hardware and enables it to run on service provider and media company public and private clouds. To help service providers and media companies gain even more agility, Cisco is offering Videoscape Cloud Services. Utilizing the same software and APIs as the rest of Videoscape, Videoscape Cloud Services can be purchased “as a service” from Cisco on a consumption-based model. Cisco cloud fusion for Videoscape lets companies mix and match elements from all modes of deploying Videoscape, including performance-optimized hardware and software appliances, cloud software, and cloud services. The Videoscape Open UX foundation enhances the functionality and performance of gateways, set-top boxes and connected devices running HTML5 applications.
“Videoscape leads the industry as a platform for delivering exciting video services and experiences. With these new Videoscape cloud capabilities, our customers have two additional ways to deploy Videoscape, meaning they can get new services to market faster than ever before,” said Joe Cozzolino, senior vice president and general manager, Service Provider Video Infrastructure at Cisco. “Plus our unique Cisco Fusion strategy allows customers to mix-and-match deployment options to best grow revenue and reduce their overall costs to roll out video services, including exciting second screen, 4K video and IoE based Connected Life services.”
Winter Olympics Powered With Videoscape
Highlighting the success of the new Videoscape cloud solution, Cisco announced that it has been selected to provide video hardware and cloud software components from its Videoscape TV services delivery platform to NBC Olympics, a division of the NBC Sports Group, to support transcoding and content management during its production of the 2014 Olympic Winter Games in Sochi, Russia. Adding Cisco’s new cloud software components will provide NBC Olympics with a simple, agile, and elastic cloud architecture that supports the streaming of live and cloud-enabled on-demand Olympics sports content for on-site production in Sochi.
“Cisco is a trusted partner who we have marked many milestones with, collaborating on IP video contribution and multiscreen delivery, and now cloud-based infrastructure,” said Craig Lau, vice president, information technology, NBC Olympics. “We are excited about the benefits and options cloud-powered video services bring us, including added agility, portability, flexibility and scalability of our networks, to meet the demands, with much less engineering and prep time. As we approach our eighth consecutive Olympic Games together, we know we can rely on Cisco to bring it all together, and help us exceed our goals to deliver a seamless production on location in Sochi.”
Cisco is sponsoring the Internet of Everything track at CES in Las Vegas. The event conversation can be followed on Twitter via the hashtags #CES2014 and Cisco’s #CiscoCES.
4:25p
Survey: NSA Scandal Prompting Shift Away From U.S. Providers
About 25 percent of UK and Canadian businesses say they are looking to move data outside of the United States because of revelations about NSA surveillance, according to a survey by Peer 1.
Since the NSA privacy scandals broke eight months ago, there’s been plenty of speculation about how the revelations might impact U.S. hosting providers. Forrester made the bold prediction that the cloud market would take a $180 billion hit over the next three years as a result of new privacy concerns.
Peer 1 Hosting has conducted a survey of actual IT decision makers in Canada and the UK, and the results reflect the nuances of the situation. About 25 percent of UK and Canadian businesses say they are looking to move data outside of the United States following the scandals. But the concerns are not new: 81 percent of these IT decision makers said that the reports of NSA surveillance didn’t surprise them.
Many decision makers outside the United States have been wary of housing their data within U.S. borders ever since the passage of the Patriot Act, which leaves any U.S.-hosted data open to disclosure should the government request it. The NSA privacy scandal headlines might begin to exacerbate that problem.
The Peer 1 study indicates the NSA scandal continues to hurt general perceptions of both hosting providers and cloud services. Sixty-nine percent of decision makers said the NSA scandal has made them more skeptical about hosting providers in general, while 57 percent said they are less likely to use a public cloud as a result. The “outsourcing world” (a generic term for all things hosted off-premises) has grown by leaps and bounds, especially with the advent of the cloud, and negative headlines only serve to keep hurting the industry.
But it’s not a simple calculation. About 96 percent of these businesses consider security a top concern, while 82 percent cite data privacy as a factor in choosing where to host their data. More than 81 percent said it’s important to know precisely where their data is stored. Even so, the United States remains one of the most popular places for these companies to host data outside of their own countries.
Confusion About Privacy Laws
As a hosting provider with major operations in Canada, the United States and the UK, Peer 1 has an interesting vantage point on the NSA publicity and how it may shift customers’ focus. Concern about the NSA is highest among Canadian decision makers. A third of Canadian respondents say they are moving their company’s data away from the U.S. as a result of the NSA revelations, significantly more than the 21 percent in the UK.
Significantly, a whopping 69 percent of decision makers said they’d be willing to incur additional latency to ensure data sovereignty.
“With data privacy and security concerns top of mind after NSA, PRISM and other revelations around the world, businesses in the UK and Canada are taking real action,” said Robert Miggins, SVP business development, PEER 1 Hosting. “Many are moving data outside of the U.S., and even more are making security and privacy their top concerns when choosing where to host their company data. It’s clear that hosting and cloud providers need to take note and offer their customers true choice in terms of the locations and environments where they store their data, ensuring they can maintain security, compliance and privacy to the best extent possible.”
The survey also revealed that many respondents don’t fully understand current data laws: 60 percent said as much, and 44 percent said they are confused by privacy and security laws. This confusion only compounds the problem as headlines around the scandal multiply. The recent news that the NSA collects information on cell phone calls in the countries of its allies will impact the hosting decisions of a third of those surveyed.
The survey, performed by March Communications, polled 300 Canadian and UK businesses. It should be noted that Peer 1 stands to benefit from these kinds of results, as it’s a Canadian hosting provider that operates key space in both Canada and the UK, in addition to the USA. However, that benefit might be minimal when you consider the potential negative impact the scandal has on the industry in general. Data and privacy laws are playing an increasing role in hosting decisions. The USA remains the most popular hosting destination and the most popular location for data centers, and not everyone is going to move their data; it’s safe to say that the majority won’t. Still, this scandal should be of concern.
8:47p
Cologix Expands In Minnesota’s Most Connected Building
A row of cabinets in a Cologix data center. (Photo: Cologix)
Cologix has announced a new 28,000 square foot expansion in downtown Minneapolis at 511 11th Avenue. This doubles the company’s footprint in the “511 building,” the primary telecom hotel in Minneapolis and Minnesota’s largest Internet communications hub.
The expansion makes Cologix the largest tenant in the carrier hotel, meaning it has prime space available in a market that has drawn a lot of interest over the past few years. Examples of this interest include Stream Data Centers, ViaWest’s 150,000 square foot build, as well as DataBank, Compass Datacenters and Digital Realty Trust all making moves in the bustling Minnesota market.
Cologix acquired the Minnesota Gateway, located in the carrier hotel, in 2012. This gave the company a key footprint in the market, as the Minnesota Gateway operated the Meet-Me Room in the building. The company has enjoyed success in the area ever since.
Strong Demand
“We continue to see strong demand in Minneapolis across all customer segments and increasingly from financial institutions and large enterprises,” said Grant van Rooyen, President and Chief Executive Officer of Cologix. “Our new data center will provide all our customers with more of the highly robust infrastructure for which we are well known and the benefits of the broadest connectivity choice in the Upper Midwest, all within easy access from IT offices in and around downtown. The new capacity will also support content and cloud providers who are recognizing Minneapolis as an ideal peering market, and the new edge of the Internet.”
At full capacity, Cologix’s new facility is designed to support 600 incremental cabinets.
Traffic Surging
Traffic in Minnesota is growing, and the 511 building is emerging as a peering alternative to Chicago, according to Shaun Carlson, network engineering manager at Arvig and a board member of MICE (the Midwest Internet Cooperative Exchange).
“In the past year MICE’s average traffic has quadrupled and we are seeing increased interest from networks and media providers alike in Minneapolis as a peering alternative to Chicago,” said Carlson. “Cologix’s new downtown expansion is good news for the peering community as it will provide new capacity to attract additional peers to the budding community.”
The 511 Building, a 270,000 square foot structure adjacent to the Metrodome, was once in jeopardy: the Minnesota Vikings, the football team next door, wanted to tear it down for a new stadium. Losing it would have been worse than missing a 38-yard field goal in an NFC championship game, and the episode brought the building’s importance to light. Seventy different data networks converge on the deeply connected building.