Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Thursday, April 20th, 2017

    Time Event
    12:00p
    RagingWire Launches Its First Dallas Data Center

    RagingWire reinforced the saying “everything is bigger in Texas” today with the first-phase grand opening of its 1 million-square-foot, 80 MW Dallas TX1 data center campus, located in Garland.

    When complete, the facility will comprise five fully interconnected buildings sprawling across 42 acres in the Dallas-Fort Worth metroplex. This first phase puts 230,000 square feet of data center space and 16 MW of critical IT load up for lease by RagingWire and its parent company, the Japanese telco giant NTT Communications.

    Located near data center clusters in North Dallas, including Richardson, Plano, and Carrollton, this is RagingWire’s first foray into what is becoming one of the top and fastest-growing data center markets in the US. Other companies contributing to the boom there include Digital Realty, CenturyLink, CyrusOne, DataBank, Equinix, Internap, QTS, TierPoint, T5 Data Centers, and ViaWest.

    “We expect that TX1 will become a critical data center hub for large enterprises and cloud companies as part of their global IT deployments,” Douglas Adams, president and CEO of RagingWire, said in a statement.

    One reason Texas continues to attract more and more business is that it is the only state with its own power grid, run by ERCOT, the Electric Reliability Council of Texas. For a colocation provider like RagingWire, that means access to all three power grids in the US: its Ashburn facility connects to the Eastern Grid, its Sacramento facility connects to the Western Grid, and Dallas completes the triangle, according to a company blog post.

    While that’s certainly a plus, Texas—like most states—poses its share of unique challenges with respect to data centers. Not every site in the Dallas-Fort Worth metroplex has sufficient utility power for data centers.

    So, in order to make the new campus “Texas-ready,” RagingWire is working directly with Garland Power & Light to build a massive 108 MVA substation right next to the TX1 Data Center, according to a press release. GP&L is providing two new 138 kV lines connected to multiple transfer systems that can choose between two independent transmission lines connected to dedicated RagingWire transformers. Eventually, this new substation will connect to the new 345 kV transmission system, which will be the backbone for electrical power throughout Texas.

    The new campus is also being built to withstand whatever Mother Nature might toss at it. For drought conditions, RagingWire installed one of the largest water-free mechanical systems in the US, and the company uses DCIM software designed to take advantage of the 5,000-plus hours of free cooling per year expected in Dallas.

    Then there are the tornadoes, flash floods, and golf-ball-sized hail—all conditions RagingWire said it experienced during the build. Everything from roof to foundation was designed to withstand an EF3 tornado with winds of 136 mph.

    The data center is carrier-neutral, with a number of onsite carriers as well as dark fiber connections to the carrier hotels at the Dallas Infomart and 2323 Bryan Street, and direct connections to major cloud providers, including Amazon Web Services, Microsoft Azure, and Google Cloud Platform.

    The Dallas TX1 Data Center, combined with data centers in Ashburn, Virginia, and Sacramento, California, expands RagingWire’s data center capacity to 1.5 million square feet and 113 MW of power.

    3:00p
    Retiring Instor President Bob Hancock Picks Jack Vonich as Successor

    Perhaps one of the biggest compliments for a professional is to be hand-picked by a retiring company president as their own replacement.

    “Jack (Vonich) is one of the most enthusiastic people I’ve ever worked with. He’s a straight shooter. Very transparent,” Instor’s outgoing president Bob Hancock said about his choice for taking over the reins after he retires this month.

    The data center infrastructure vendor won’t lose the expertise of the 30-year veteran who led the data center infrastructure solutions team; Hancock will remain on as chairman of the Fremont, California-based company.

    After a 12-year stint at Server Technology, where Vonich led the sales efforts of the power distribution unit manufacturer, he joined Instor in 2014 as VP of sales and marketing. He has been the driving force behind the expansion of sales to locations in Virginia, Texas, Ireland, Amsterdam, and the UK.

    “When I came onboard, my goal was to bring in a new set of clients to diversify Instor’s revenue,” Vonich said in a company blog post. “We had a strategic partnership with Digital Realty, which really started growing quickly. With that, we recognized that we needed to expand our presence nationally.”

    In his new role, Vonich will focus on enhancing customer relations while expanding Instor’s deep partnerships with vendors, many of which date back to the company’s founding in the 1980s.

    “This new role allows me to focus more on our strategic relationships and on the long-term strategy to bring in more services to our customers,” Vonich said in a statement. “If there is something we can do for the customer that makes the solution more valuable to them, we’re always looking out to do that.”

    Under Vonich’s leadership, Instor recently launched a Fit Up Calculator, which lets data center operators estimate the cost of a new project, whether in a traditional data center or a colocation facility.

    Vonich and Hancock share similar philosophies when it comes to the company’s biggest asset and priority.

    “We both feel strongly that our strength is in the people we work with,” Hancock said. “In a lot of companies, the guys at the top tend to have large egos, but neither Jack nor I have that. We both feel we should create a successful environment for people with talent, then get out of the way and let them do it.”

    Instor has been in the data center infrastructure business since the ’80s. Its customers have included Oracle, Adobe, Digital Realty, and Cisco.

    4:44p
    Cybersecurity Firm Tanium’s CEO Apologizes for Being ‘Hard-Edged’

    Lizette Chapman and Sarah McBride (Bloomberg) — The head of Tanium Inc. apologized for being “hard-edged” and for exposing a hospital’s computer network during sales pitches — the executive’s first public statement following a Bloomberg News report last week of turmoil at the cybersecurity startup.

    Past and current employees described abusive behavior by Tanium’s Chief Executive Officer Orion Hindawi that led to an exodus of top executives, culminating with the departure last month of Chief Financial Officer Eric Brown.

    “It is true that I personally can be hard-edged, and that I’ve had to apologize to people at Tanium when I’ve gotten too sharp at times,” Hindawi wrote in a blog late Wednesday. “And it is true that as we’ve grown, we haven’t matured processes in some areas as quickly as we’ve added people, which is something we’re working hard to build faster. These are in fact all things we need to work on, and we’re doing so every day.”

    Last valued at $3.5 billion, Emeryville-based Tanium is one of venture capital firm Andreessen Horowitz’s largest bets. Hindawi, who succeeded his father and co-founder David Hindawi as CEO last year, is laying plans for an initial public offering.

    Tanium’s software sends a signal to devices connected to corporate networks, asking what software is running, the date of the last security patch, and other questions — a digital conversation that each device then relays to other devices on the network. The result is swift visibility into what is connected and what is most vulnerable. The company says it can get full network visibility in 15 seconds.
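    The query-relay idea described above can be illustrated with a small sketch. This is not Tanium's actual implementation; it is a hypothetical simulation in which a question travels along a chain of endpoints, each appending its own answer before handing the accumulated results to the next peer, so the server only needs one round trip to the chain rather than one per device. The `Endpoint` class, `last_patch` question, and host names are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Endpoint:
    """A hypothetical managed device that can answer queries about itself."""
    name: str
    patch_date: str

    def answer(self, question):
        # Each device reports only its own state for the question asked.
        if question == "last_patch":
            return (self.name, self.patch_date)
        raise ValueError(f"unknown question: {question}")

def chain_query(endpoints, question):
    """Relay the question down a linear chain of peers, each appending
    its own answer, so the caller hears back once, from the chain's end."""
    results = []
    for ep in endpoints:  # stands in for peer-to-peer hand-off on a LAN
        results.append(ep.answer(question))
    return results

fleet = [Endpoint("db-01", "2017-03-02"), Endpoint("web-07", "2016-11-19")]
print(chain_query(fleet, "last_patch"))
```

    The design choice this models is why such systems can answer fleet-wide questions quickly: aggregation happens between neighboring devices on the local network instead of each device reporting to a central server individually.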

    When pitching this technology to potential customers, Tanium salespeople used the internal corporate network of Silicon Valley-based El Camino Hospital for live demos. This was done without the hospital’s permission or knowledge, and the hospital’s identity was sometimes shared with the audience, according to people who presented or attended the demos. The Wall Street Journal earlier reported the practice.

    “We take responsibility for mistakes in the use of this particular customer’s demo environment,” Orion Hindawi wrote in Wednesday’s blog. He didn’t identify the customer by name. “We should have done better anonymizing that customer’s data.” He said viewers didn’t connect the demo environment to the customer for years, and that he does not believe Tanium put the customer at risk.

    While he noted that some customers have agreed to be used for demonstration purposes, he did not say whether El Camino Hospital had given its permission.

    El Camino Hospital said it neither authorized the activity nor knew that Tanium was exposing its network to outsiders, and that it was only recently made aware of the practice.

    “El Camino Hospital is thoroughly investigating this matter and takes the responsibility to maintain the integrity of its systems very seriously,” a spokeswoman said. “It is important to note that Tanium never had access to patient information and, based on our review to date, patient information remains secure.”

    During hundreds of live demos, the hospital was sometimes identified by name and sometimes referred to as an unnamed hospital, according to the people who presented or attended the demos. Audience members would sometimes ask Tanium sales reps to run a specific query, which would then return information identifying the hospital by name along with the computing device that was at that moment compromised, they added. They asked not to be identified discussing private presentations.

    The demos, which revealed the hospital’s network names “ECHO” and “ECHO1”, frequently took place at the offices of Andreessen Horowitz. The VC firm prides itself on introducing portfolio companies to prospective customers. It regularly brings executives from established companies to what it calls its executive briefing center to listen to themed presentations by promising startups on subjects like finance or health care. Andreessen Horowitz declined to comment. Bloomberg LP was one of the venture firm’s early investors.

    Hindawi would often present at such briefings, typically to chief information security officers and chief information officers, people who have attended the demos said. Tanium’s demos exposed the names of devices connected to the hospital’s network, along with closely guarded information, such as which computers were not patched with software upgrades, people who presented or sat in on the demos told Bloomberg.

    By revealing weaknesses in El Camino Hospital’s IT architecture, Tanium may have violated federal and California state laws, including the Computer Fraud and Abuse Act and the California Comprehensive Computer Data Access and Fraud Act, said Daniel Appelman, a partner at law firm Montgomery & Hansen LLP.

    “Certainly, it’s bad business practice,” Appelman said. “It sounds insane.”

    In addition, the hospital may have run afoul of laws that mandate adequate cybersecurity measures, he added. The Federal Trade Commission has investigated and sanctioned companies for weak cybersecurity, and on the state level, the California Attorney General can sue companies that don’t comply with state law, Appelman said.  El Camino Hospital didn’t immediately respond to questions about its potential legal liability.

    Tanium’s live demos typically began with a disclaimer that the hospital had given permission for its IT environment to be shared in exchange for free services from the startup.

    El Camino Hospital was used as a live case study from at least 2014, said several people familiar with the matter.

    Hindawi had a master account and personally resolved problems with the hospital’s network, according to the people familiar with the situation.

    5:30p
    IaaS Industry Consolidation Won’t Curb Oracle Cloud Plans, VP Says

    Brought to You by Talkin’ Cloud

    TORONTO — If you follow the Gartner Magic Quadrant for infrastructure as a service (IaaS) closely each year, you will have noticed that in the 2016 edition no new vendors were added, while five vendors were removed. According to Gartner, market consolidation is to blame for the drop-off of those providers, and “the increased dominance of just two providers has led to closer relative market share among the other providers in the market.”

    The dominance of two cloud vendors – Amazon Web Services (AWS) and Microsoft Azure – is well-documented and researched outside of Gartner as well. Synergy Research Group recently noted that AWS has more than 40 percent of the public cloud services market (including IaaS and PaaS), while a recent report by 451 Research showed Microsoft encroaching on AWS’ market share lead, at least in Europe.

    Missing from the 2016 Gartner Magic Quadrant for IaaS was Oracle. The research firm said that, at the time of its evaluation, Oracle Compute Cloud Service was not generally available and Oracle did not have enough market share to qualify for inclusion. Since then, Oracle has released its Oracle Cloud platform and has seen a favorable response from the Street; in Q3, Oracle posted revenue and profit that topped analysts’ estimates as sales from its cloud businesses gained 62 percent.

    See also: Oracle’s Cloud, Built by Former AWS, Microsoft Engineers, Comes Online

    The momentum seems to have given a boost of confidence to the company; both Oracle Executive Chairman Larry Ellison and co-CEO Mark Hurd have talked in recent months about how its cloud platform is poised to take on cloud leaders like Amazon. Most recently, Hurd suggested Oracle does not have to spend tens of billions of dollars on data centers annually to catch up to the three largest cloud providers.

    It’s a confidence that others, including Oracle vice president of development Deepak Patil, share.

    “When it’s all said and done I believe that the number of major cloud players in the world who are providing the entire stack of amazing SaaS applications, platform services, and foundational IaaS components, you will be able to count on one hand,” Patil told Talkin’ Cloud this week at the Oracle Code event in Toronto.

    “The types of investments Oracle is making, the commitment Oracle has from everybody [in senior leadership], and the phenomenal changes the company is undergoing, we are looking forward to being one of those,” Patil said.

    How the company gets there will be a significant effort, which will include tapping into the developer community. According to a presentation by Patil at Oracle Code on Tuesday, there are more than 10 million developers around the world, and that group in particular will benefit greatly from cloud innovations.

    See also: Top AWS Engineer Calls Hurd’s Cloud Data Center Bluff

    “Cloud truly uncovers unprecedented opportunities for the developer community,” Patil said.

    This week at the Oracle Code event in Toronto, part of a series of free events Oracle is hosting in local markets for developers, Patil announced Oracle’s acquisition of Wercker, a partnership with Pluralsight to provide cloud-based training solutions, and a new portal for developers to see everything Oracle is working on in one place.

    The conversations Patil has had with hundreds of developers in his year-plus at Oracle have helped the company gain “a much better understanding of what developers want,” he said, including local meetups and the ability to interact with Oracle at non-Oracle events like Velocity, KubeCon, and DockerCon.

    “Expectations for what the cloud should do for developers are changing on a monthly basis, so engagement with developers is really great insight into how fast the cloud is evolving,” he said.

    The changing demands from developers – and enterprises – will create a much different-looking cloud market from the one we know today, Patil suggests.

    “The playing field is shrinking really, really fast and consolidation is happening; mergers are happening; acquisitions are happening,” he said. “If you look at the Gartner Magic Quadrants for various aspects of cloud (IaaS, iPaaS, and data) five or six years ago, and you see them now, those provide phenomenal insights into just how dramatic the consolidation has been.”

    “There will be significant consolidation, no doubt about it, but at the same time, on the part of cloud providers, it would be naïve to expect that enterprises especially will take everything they’ve got and move it to one cloud provider,” he said. “It’s not savvy for enterprises, from both a negotiation point of view and a risk-management point of view, and also because different cloud providers’ platforms are going to have different strengths and different capabilities.”

    For Oracle, that means a few different things: building out homegrown cloud capabilities, making strategic acquisitions, and continuing to invest in partnerships.

    “The most foundational components of the cloud we will continue to build in-house,” he said. “When I was at Microsoft we didn’t outsource building the Windows kernel. Similar to that we are building the most foundational components of our storage engine, the most foundational components of our virtualization layer, and network layer in-house and we continue to do that.” To build out this team, Oracle hired engineers from Microsoft (where Patil spent over 15 years), Google and AWS.

    Acquisitions, including the most recent Wercker deal, as well as those of Ravello and Dyn, will help build out other cloud components, he said. He made it clear that this combination – in-house development and acquisitions – will be a critical part of what Oracle does in the cloud. “I think some who did not succeed in the cloud race tried to buy their way out of it, and we are not going to do that, we have not done that,” he said.

    Finally, partnerships will help Oracle amplify its cloud message, Patil said. “I think it is impossible to succeed in the cloud unless you have a rich network of global partners, technology partners as well as ISVs and managed service providers.”

    On the technology end, this week Oracle announced a partnership with Docker under which developers can access databases, middleware, and developer tools in the Docker Store marketplace via the Docker Certification Program. With this collaboration, developers can now pull images of Oracle products in Docker and quickly start developing, testing, and deploying modern applications, according to a statement.

    “Doing something for the developer community that makes development of containerized applications on Oracle Cloud easier was one of our top priorities,” Patil said.

    While some skeptics believe that Oracle started too late to capture enough IaaS market share, Oracle believes there is still a lot of runway left.

    “The good thing is while everyone talks about cloud, one of the most amazing things is where we’re at in the entire evolution. We are in the top of the first inning. We still are very early in our migration to the cloud,” he told attendees.

    This article originally appeared on Talkin’ Cloud.

    6:00p
    A Call for IoT Standards

    Joonho Park is Executive Director of The Open Connectivity Foundation.

    As anticipation for the Internet of Things has blossomed, so have misgivings and fears about its security vulnerabilities. Several high profile incidents, particularly the Mirai episode, have raised questions about the security risks posed by proliferating devices connected to the Internet. As companies and consumers continue their march into the brave new world of IoT, addressing these concerns will be essential.

    Industry-wide standards and certifications are a solution with many obvious benefits; vendors and IoT experts can craft them with an eye for security and allaying customer concerns. These conclusions are backed by a survey conducted by the Open Connectivity Foundation (OCF), which shows both widespread concerns over IoT security and clear support for an industry standardization approach. Indeed, respondents viewed standards implementation and vendor cooperation as an effective way to address ease of use, interoperability and security concerns. Sixty percent of respondents indicated that they were more likely to purchase a connected device with some form of a security certification, a clear sign that standards and certifications would effectively improve customer faith in IoT security.

    IoT security strode dramatically into the national spotlight last September with the arrival of Mirai. The now-notorious malware finds and infects various IoT devices, assembles them into a centrally controlled botnet, and launches their traffic at targeted websites in massive DDoS attacks. Mirai’s inaugural assault was aimed at Dyn, a DNS service provider essential to the running of a multitude of different websites. The resulting outages were so widespread that a common refrain heard in news coverage was that Mirai had “broken the internet,” heralding an “IoT-pocalypse.” Subsequent reporting and analysis have continued to highlight the security vulnerabilities of IoT.

    Trouble on the Horizon?

    These anxieties are going mainstream. Security concerns are considered the second-highest barrier to IoT adoption, and improvements to device security the second-most-desired product change from IoT vendors. As connected device usage becomes universal and high-profile security breaches continue to receive national coverage, these concerns could form the bedrock of a crisis in consumer confidence that would blow back on individual vendors and the industry as a whole.

    The development of industry standards and certifications is the most effective and relatively straightforward response to IoT security concerns. By cooperating, vendors can establish benchmarks for connected devices; these would cover everything from infrastructure to data protocols to security features. Such standardization would ensure that connected devices have baseline security protocols in place. Product certifications or security ratings would be the next step: vendors could signal to customers that the devices they purchase meet agreed-upon industry standards. Adopting these initiatives would be a simple yet effective way both to tackle security shortfalls and to allay the consumer concerns those shortfalls raise.

    Fortunately for IoT vendors, the news is not all negative. There is a widespread desire for connected devices; some 80 percent of respondents in the OCF study said that they planned to buy a connected device within six months. Fewer than 8 percent said that they currently had no connected devices. These responses are a clear sign of how pervasive IoT technology already is and how the market is set to continue its growth. However, an increase in the number of connected devices will exacerbate security problems if industry standards are not in place.

    The distinct characteristics of connected devices, especially infrequent interaction from users, make them uniquely vulnerable to infection and manipulation by malicious actors. Traditional targets, such as personal PCs, are commonly interacted with, and performance issues can tip off users that something is wrong. In contrast, many connected devices, such as routers, sensors, and cameras, are designed to operate without regular check-ins. Once attacked, they may not show any noticeable signs of infection and may sit for long periods without being repaired or replaced. Baseline security standards are clearly a necessary measure, as vendors can’t count on user intervention to identify potential problems. Taken in conjunction with the clear support shown in the survey, these technical considerations provide a compelling case for introducing common industry standards and associated certification programs. This is one of the more effective mechanisms available to address the security vulnerabilities of our IoT future and restore confidence in the industry.

    The development of IoT standards and certifications is not only desirable from a security standpoint. The most commonly cited barrier to IoT adoption is interoperability; common industry standards would make the goal of device compatibility much more realistic. As such, standards would be not only a reactive response to security worries but a springboard to developing features that customers want. However, vendors should consider security standards and certifications an immediate priority, necessary to plug security holes and buttress consumer confidence.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

     
