Data Center Knowledge | News and analysis for the data center industry
 

Monday, October 26th, 2015

    12:00p
    Vantage Brings 12MW to Supply-Constrained Silicon Valley Data Center Market

    Last week, Vantage Data Centers, which rents out large amounts of data center space to tech companies and cloud providers in Silicon Valley and Central Washington, suddenly brought an additional 12 MW of data center capacity to market in Silicon Valley. But it didn’t actually build a new 12 MW data center on its three-building Santa Clara campus.

    The capacity was freed up by an existing customer that had overestimated its future data center needs when it signed a lease with Vantage. “The customer, in hindsight, purchased more capacity than they ultimately needed,” Vantage CEO Sureel Choksi said in a recent interview at the company’s offices in Santa Clara. He declined to disclose who the customer was.

    While 12 MW is a lot of capacity – typical wholesale deals range between 1 MW and 6 MW – it was welcome news for Vantage, which had only 3 MW available in the supply-constrained Silicon Valley data center market. While several providers are in the midst of construction to add capacity, few have big chunks of space that are ready to go.

    Supply of data center space in Silicon Valley is at “historically low levels,” real estate brokers at Jones Lang LaSalle wrote in a recent report on the North American data center market. Demand this year is higher than it was in 2014, and JLL expects supply constraints to persist because of both high demand and a tight real estate market.

    Providers that can deliver “large contiguous space” will be able to rent it out at premium rates, the brokers wrote. This is good news for Vantage, since half of the capacity freed up by its customer is fully built-out and the other half is contiguous expansion space.

    Not surprisingly, Silicon Valley’s booming tech industry is driving the demand the same way it’s driving demand for office space and housing in the region.

    Cloud providers and enterprise software makers are driving demand for Vantage, Choksi said.

    Both Infrastructure-as-a-Service and Software-as-a-Service cloud companies are shopping for data center capacity in the area.

    As we reported earlier this year, Chinese internet and cloud services companies are contributing a significant portion of this demand. “Both domestic and Asian providers represent a big source of demand going forward,” Choksi said.

    Alibaba, China’s answer to Amazon, launched two Silicon Valley data centers this year. It’s unclear which data center provider or providers Alibaba is using in California.

    Server Farm Realty partnered with Chinese data center provider 21Vianet, which said it will use SFR’s data center space in Santa Clara to serve unspecified Chinese customers. Last year, CoreSite leased 3 MW of capacity in Santa Clara to China Telecom to serve an unnamed customer.

    CoreSite also said earlier this year it was building a massive 140,000-square-foot data center in Santa Clara for a single customer whose name it did not disclose. In addition to the build-to-suit project, CoreSite is building a 230,000-square-foot multi-tenant facility, expecting to “substantially complete” the first phase in the second quarter.

    Software companies comprise the other big category that’s driving demand in the Valley. “Enterprise software is a big driver of wholesale data center needs,” Choksi said.

    Vantage’s recent wins include multi-megawatt deals with the security software giant Symantec, the enterprise Hadoop company Cloudera, and MarkLogic, an enterprise NoSQL database company. Another enterprise software company signed a multi-megawatt deal this year, but its name was not disclosed.

    According to JLL, there was only about 14 MW of commissioned vacant data center capacity available in Silicon Valley when its report was published earlier this month. The report came out before Vantage announced its 6 MW of newly available commissioned capacity.

    Choksi believes Vantage is the only provider that can deliver that much capacity with contiguous expansion space in the market today, putting it in a position to benefit from the premium rates forecast by JLL.

    3:00p
    Five Factors Data Centers Must Contend with in the IoT Era

    Brian Lavallée is the Director of Solutions and Technology Marketing at Ciena.

    Whether it’s 25 billion or 75 billion “things” connected to the Internet by 2020, there’s no question that the Internet of Things (IoT) will generate huge amounts of data, changing the way business is conducted and how we go about our day-to-day tasks. IoT promises to improve the way we live, work, and play; but could its progress be hindered by data centers that are unable to cope with the data demands?

    Below are five key factors data center providers must consider:

    Data Center Interconnect (DCI)

    The flood of data from IoT will only exacerbate bandwidth challenges, requiring networks to support far more data traffic than ever before. For IoT to truly succeed, data must move across the network freely and securely. Consequently, DCI becomes increasingly important. Sufficient bandwidth must be available to send data from the “thing” to the data center, as well as between data centers, since distributed computing and storage architectures are needed to accomplish meaningful analysis. This is critically important for things like smart cars, in which data is constantly being received and analyzed by onboard systems that provide alerts for everything from faulty headlights to low tire pressure. Once driverless cars become practical, data will need to reach the data center in near real time to prevent collisions. Advances in coherent optics have made it possible to transmit data at rates of 100 Gb/s and much higher, over almost any distance, dramatically improving DCI performance.
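    To put those line rates in perspective, here is a back-of-the-envelope sketch (not from the article; the 100 TB dataset size and 80 percent link efficiency are assumptions) of how long it takes to replicate a dataset between data centers at different link speeds:

        # Rough transfer-time estimate for moving a dataset between data centers.
        # Illustrative only: real throughput is further reduced by protocol
        # overhead, congestion, and distance-dependent latency.

        def transfer_hours(dataset_tb: float, link_gbps: float,
                           efficiency: float = 0.8) -> float:
            bits = dataset_tb * 1e12 * 8               # dataset size in bits
            usable_bps = link_gbps * 1e9 * efficiency  # effective throughput
            return bits / usable_bps / 3600            # seconds -> hours

        for rate in (10, 100, 400):                    # line rates in Gb/s
            print(f"{rate:>4} Gb/s: {transfer_hours(100, rate):5.1f} h for 100 TB")

    Under these assumptions, 100 TB takes more than a day to move at 10 Gb/s but under three hours at 100 Gb/s, which is why higher coherent-optics rates matter so much for DCI.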

    Computation

    IoT will accelerate the adoption of big data analytics. An enormous amount of structured and unstructured data will be generated, and computationally intensive analytics are required to turn it into meaningful and actionable information. IoT data is ordinarily crunched on entry to the data center to create new, more valuable data streams. These, along with other streams, are sent to powerful analytics engines using big data techniques to generate meaningful results that can be acted upon. Processing the enormous amounts of data collected from a variety of sensors requires robust DCI network connectivity that supports the distributed compute and storage capabilities big data analytics demands.
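    As a minimal illustration of that crunch-on-entry idea (hypothetical code, not anything Ciena describes; the field names are invented), an ingest stage might drop malformed readings and attach derived fields before forwarding the stream to an analytics engine:

        # Toy ingest stage: filter out malformed sensor readings and enrich
        # the rest before forwarding them downstream. Field names are invented.

        from datetime import datetime, timezone

        def preprocess(raw_readings):
            for reading in raw_readings:
                if "sensor_id" not in reading or "value" not in reading:
                    continue                  # drop malformed records at the edge
                reading["ingested_at"] = datetime.now(timezone.utc).isoformat()
                reading["alert"] = reading["value"] > 90.0  # derived, more valuable field
                yield reading

        readings = [{"sensor_id": "tp-1", "value": 95.2}, {"value": 10.0}]
        for r in preprocess(readings):
            print(r)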

    Storage

    The impact of IoT on storage infrastructure will have to be addressed as this data becomes more prevalent. The focus today must be on centralized and distributed storage capacity for data center operators, as well as on whether IT can harvest and use IoT data in a cost-effective manner. Until now, the comparatively predictable nature of traffic made it simpler to store data, but with data coming from smartphones, watches, cars, and countless other devices, volumes are exploding. Legacy storage systems are on the verge of becoming obsolete, leaving room for organizations to adopt cloud-based storage services that support both structured and unstructured data in a highly secure manner. As a result, storage isn’t strictly local anymore. Instead, cloud-based storage provides flexibility, scalability, compliance, and a sophisticated architecture to support an essentially unlimited influx of data.
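    For a concrete sense of what cloud object storage looks like in practice, here is a sketch that writes one reading to an S3-compatible service using boto3 (the bucket name and key scheme are invented; credentials are assumed to come from the environment):

        # Store one IoT reading as an object in S3-compatible cloud storage.
        # Bucket name and key layout are hypothetical, for illustration only.

        import json
        import boto3

        s3 = boto3.client("s3")  # reads credentials from the environment

        reading = {"sensor_id": "tp-1", "value": 95.2, "ts": "2015-10-26T12:00:00Z"}
        s3.put_object(
            Bucket="iot-readings",                          # hypothetical bucket
            Key=f"raw/{reading['sensor_id']}/{reading['ts']}.json",
            Body=json.dumps(reading).encode("utf-8"),
            ServerSideEncryption="AES256",                  # encrypt at rest
        )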

    Availability

    If critical infrastructure systems like smart grids and emergency alert systems are going to be sending time-critical data over a network and acting upon it in a meaningful and timely manner, a high degree of network and service assurance is needed. A carrier-class network will help eliminate the network connectivity concerns of data center operators and ensure mission-critical traffic can be successfully transported over short to very long distances based on application requirements.

    Network latency and jitter will impact the performance of IoT applications. A patient whose vital signs are being constantly monitored by remote hospital staff could end up in a life-threatening situation if critical information cannot travel across the network in a reliable and timely manner. Situations like this show why highly reliable, low-latency networks optimized for IoT traffic will be necessary.
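    Jitter is commonly quantified as the variation in delay between packets. A small sketch (the sample round-trip times are invented) shows why jitter and average latency must be tracked separately:

        # Compute mean latency and jitter, here taken as the standard deviation
        # of measured round-trip times. Sample values are invented.

        import statistics

        rtts_ms = [12.1, 12.3, 11.9, 30.5, 12.2, 12.4]  # round-trip times in ms

        mean_latency = statistics.mean(rtts_ms)
        jitter = statistics.stdev(rtts_ms)

        print(f"mean latency: {mean_latency:.1f} ms, jitter: {jitter:.1f} ms")
        # One delayed packet (30.5 ms) inflates jitter far more than the mean,
        # which is why jitter-sensitive traffic needs more than a low average.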

    Security

    A recent FTC report warns that IoT can present a number of security risks that could be exploited to harm both businesses and consumers. Even though consumers produce sensor data, it doesn’t always belong to them; instead, the data is often owned by the collecting entity, which will create apprehension for many users. This information includes financial transactions, personal records, and corporate data that is often business critical and confidential, creating a requirement to ensure data center network connections are trusted, reliable, and secure – often requiring network encryption.

    While encryption and stringent rules for access to stored data are widely employed to protect against intrusions, advances in networking equipment can also deliver low-latency, in-flight data encryption. This provides increased protection for data from the moment it leaves one data center to the moment it enters another. Encrypting at the transport layer of the network offers wire-speed performance, ensuring the process does not reduce traffic throughput, increase latency, or modify the content. Data center operators must also stay compliant: different countries have varying codes and regulations governing how data can be stored, accessed, and analyzed.
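    The article doesn’t name a cipher, but authenticated encryption such as AES-GCM is a common choice for protecting data in flight. A minimal Python sketch using the cryptography package (key handling is simplified for illustration; a real deployment would use a key management system):

        # Encrypt a payload before it leaves one data center and decrypt it on
        # arrival at another. AES-GCM provides confidentiality and integrity.

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)  # shared between the two sites
        aesgcm = AESGCM(key)

        payload = b"sensor batch 42: 1,024 readings"
        nonce = os.urandom(12)                     # must be unique per message
        ciphertext = aesgcm.encrypt(nonce, payload, None)

        # At the receiving data center:
        assert aesgcm.decrypt(nonce, ciphertext, None) == payload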

    Once the stuff of imagination, the business and life enhancements that IoT can bring to society are now nearly limitless and right around the corner. However, without properly addressing these dynamics inside and outside the data center, the many benefits IoT offers will be disrupted and wide-scale adoption will slow.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:05p
    Former VMware CTO Joins CloudFlare as Head of Engineering


    This article originally appeared at The WHIR

    Former VMware chief technology officer Ben Fathi has joined CloudFlare as head of engineering, according to a blog post on Monday.

    Fathi left VMware in August after three and a half years with the company. He was not the only top VMware executive to depart that month; Chuck Hollis, VMware’s chief strategist of storage and availability, left to join Oracle around the same time.

    In a post on CloudFlare’s blog, Fathi describes why he chose to take a role at the company, an announcement which he said may leave some scratching their heads, since the job is “not even close” to his passion for operating systems.

    “We have an opportunity to build a fully distributed and resilient always-on Internet Operating System at the edge of the network — where it matters most — as far as end user experience is concerned,” Fathi said in the blog post. “I’m thrilled to be taking over the helm of engineering at CloudFlare. The question shouldn’t be why I joined, but rather, why you’re not here as well. Come join us and let’s make the Internet a better place together. If we happen to build a scalable secure cloud at the same time, even better. If we manage to build a distributed Internet-scale operating system to manage it all, you might call it serendipity.”

    Fathi has more than 30 years of experience in the tech industry, working on UNIX, Linux, Windows, embedded operating systems, and distributed operating systems, according to a report by TechRepublic. At VMware, Fathi ran the core engineering team for vSphere before taking the CTO role.

    In the blog post, Fathi also described the interview process and meeting the CloudFlare team, which no doubt serves some PR purpose, as the company is hiring to keep up with its tremendous growth. Last month, CloudFlare raised $110 million and announced plans to use the funding to acquire customers, grow product capabilities, and expand into international markets.

    This first ran at http://www.thewhir.com/web-hosting-news/former-vmware-cto-joins-cloudflare-as-head-of-engineering

    5:30p
    Survey: Enterprise IT Employees are Biggest Threat to Security


    This post originally appeared at The Var Guy

    Company managers might think their IT staff would be the most trusted not to make any security gaffes, but they may want to think again: a new survey found that these workers pose some of the biggest risks to IT security.

    The 2015 Insider Risk Report – commissioned by cloud business app provider Intermedia and conducted by independent research firm Precision Sample – found that the people with the greatest access to company data, who are tasked with keeping it secure, such as IT personnel, are more likely to engage in risky behaviors than the average employee.

    According to the survey – which polled more than 2,000 business professionals – 32 percent of IT staff admit to having given their login and password credentials to other employees, compared with 19 percent of other staff who responded to the survey.

    “It’s nearly always that technical people are the worst offenders,” said Richard Walters, vice president of Identity and Access Management at Intermedia, in a press release. “They know how to get around various controls that an IT team will put in place. It’s sometimes done with the best intent, but nevertheless with a complete lack of consideration for the risk or security implications.”

    Moreover, 28 percent of IT pros said they have accessed systems of previous employers even after leaving those companies, compared to 13 percent of other respondents, according to the survey. And 31 percent of IT professionals said they would take data from their company if they thought it would benefit them – nearly three times the rate of general business professionals, according to the survey.

    Overall, 93 percent of those polled admitted to insecure IT practices, which result in myriad issues for the enterprise, including lost data, regulatory compliance failures, data breaches and even blatant sabotage by a disgruntled current or former employee, according to Intermedia.

    Millennials, the newest entrants to the workforce, are also among the employees most likely to put enterprises at risk. The survey found that this group – who have been comfortable with technology most of their lives – is the most likely to install applications without company approval.

    They also engage in behaviors that breach the personal and professional divide by saving company files to personal cloud storage or other so-called “shadow IT” practices, according to the survey.

    This first ran at http://thevarguy.com/network-security-and-data-protection-software-solutions/102615/survey-enterprise-it-employees-are-biggest-t

