Data Center Knowledge | News and analysis for the data center industry
Wednesday, December 23rd, 2015
12:04a |
Done Deal: $600M Google Data Center Coming to Tennessee
Google has bought a defunct semiconductor plant in Clarksville, Tennessee, not far from Nashville, planning to convert it into a data center, state officials announced today.
The company expects to invest $600 million in the project. This will be the eighth Google data center in the US.
Hemlock Semiconductor built the $1.2 billion polysilicon plant in 2013 but did not launch it because of deteriorating market conditions for the material, which is used to make photovoltaic panels. The site has access to ample power and has much of the infrastructure already in place that Google can adapt for data center use.
The company has repurposed a massive paper mill in Finland as a data center and earlier this year announced a plan to turn a defunct coal power plant in Alabama into a Google data center.
The Tennessean reported that Google was in talks to buy the site Monday, when a local county board was expected to vote on whether to sell the property to the internet giant.
The site will be fully powered by renewable energy, the Tennessee Department of Economic and Community Development said in a statement. As part of the deal, Google will be able to scout new renewable energy projects and work with the local utility, the Tennessee Valley Authority, to bring renewable generation capacity to the grid.
1:00p |
Top 10 Data Center Stories of 2015
While “boring” is a good thing as far as data center operators are concerned, 2015 was not a boring year for the data center industry. From data center outages on both sides of the Atlantic, caused by lightning and an explosion, to one of the biggest data center providers thinking of offloading its data centers, to a big change to the way the industry’s most important reliability rating system works, the year delivered plenty of excitement for some and anxiety for others.
Here is a recap of the most popular stories on Data Center Knowledge this year:
Not even the mighty Google data centers are immune to acts of God, it turns out. In August, a series of lightning strikes in Belgium knocked some cloud storage systems offline briefly, causing errors for some users of Google’s cloud infrastructure services.
 Chillers and cooling towers of the Google data center campus in St. Ghislain, Belgium (Photo: Google)
Few data center providers build bigger than Switch. In January, the Las Vegas company unveiled plans for its largest project yet, a $1 billion, 3 million square foot SuperNAP data center campus on 1,000 acres of land near Reno, Nevada.
 Rendering of the planned Switch Tahoe Reno SuperNap data center campus (Image: Switch)
CenturyLink’s colocation business, whose seeds were sown primarily with the $2.5 billion acquisition of Savvis four years ago, is not doing well. Colo revenue is not growing, and the telecommunications giant is looking for ways to avoid investing more capital in the segment.
In Facebook data centers, the meaning of the words “six pack” no longer has anything to do with beer or abs.
 Facebook’s Six Pack switch is a 7RU chassis that includes eight of its Wedge switches and two fabric cards (Photo: Facebook)
The Tesla energy storage systems are based on the powertrain architecture and components of Tesla electric vehicles. They integrate batteries, power electronics, thermal management, and controls into a turn-key system.
The way the most important rating system for data center reliability works has been changed.
Taking a major step forward in its quest to drive a Linux container standard that’s not created and controlled by Docker or any other company, CoreOS spun off management of its App Container project into a stand-alone foundation.
 Alex Polvi, founder and CEO of CoreOS, wants to offer a full suite of tools that will enable enterprises to deploy ‘warehouse-scale’ computing infrastructure. (Photo: CoreOS)
The Nevada Attorney General’s office has asked the company to address accusations that it has been advertising its Las Vegas data center as a Tier IV facility when in fact it was not constructed to Tier IV standards.
Multiple companies operating data centers in the area affected by the blast in August reported loss of utility power and said they had switched to backup generators without power interruption to their customers.
 Downtown Los Angeles. April 2014 (Photo by David McNew/Getty Images)
In the cloud giant’s vision, the role of hybrid is simply to smooth the transition and let customers’ existing infrastructure investments reach their natural end of life.
 Werner Vogels, CTO, Amazon, speaking at AWS re:Invent 2015 in Las Vegas (Photo: AWS)
Stay current on data center news by subscribing to our daily email updates and RSS feed, or by following us on Twitter, Facebook, LinkedIn and Google+.
4:00p |
Change is Coming, and It’s IT-Fueled
Neil Jarvis is Chief Information Officer (CIO) for Fujitsu Americas.
“Clairvoyance required” is not something that’s likely to appear in a CIO’s job description, but it probably should. With the massive amount of information (and misinformation) available these days about coming trends in IT, a crystal ball would sure make our jobs easier. The good news is, we can gaze into the future by examining the present IT climate, and extrapolating what shifts are likely to occur.
One of the safer bets is that workplace mobility will continue to skyrocket. There are numerous data points to support this; one of my favorites is that an astounding 34 percent of the U.S. workforce – that’s 53 million people – are doing freelance work. As more and more millennials enter the workforce, that number is sure to climb, since many from that generation report that a “9-to-5” office job is not their ideal career trajectory.
Even those who aren’t freelancing don’t necessarily have to be tethered to a desk; enterprise technologies such as cloud services, collaborative work platforms, and video conferencing have progressed to a stage where employees are no longer restricted to the location of a company’s headquarters. The playing field has been leveled to such an extent that working offsite will become the norm, not the exception.
Which leads us to another, possibly more controversial, projection for the future: we all know that BYOD stands for “bring your own device,” but within 10 years I believe we’ll know that acronym by its new meaning: “bring your own data center.” But don’t take my word for it: Gartner predicts that by 2016, 30 percent of BYOD strategies will leverage personal applications and data for enterprise purposes.
What that means is that the line between personal and enterprise data usage will blur. Now we see an unmistakable trend emerging – millennials are redefining not only when and where they work, but also how they get their work done. This is no passing phase, either. It will present an ongoing challenge for IT device makers and service providers – not to mention CIOs trying to formulate a coherent BYOD strategy – for years to come. What are those challenges, exactly?
In my mind, the biggest one of all is security. With the proliferation of connected devices fueling the current and future BYOD movements, the inevitability of a robust Internet of Things (IoT) can simply no longer be ignored. Just what are we talking about here? Well, the current number of connected things – over 5 billion – will skyrocket to 25 billion by 2020. That’s just five years away.
No crystal ball is needed to tell us we must make security planning for IoT a top strategic priority. Fortunately, cyber-criminals can only steal raw data; turning that data into real-world criminal activity will require a deeper understanding on the part of would-be thieves. Make no mistake, though: that understanding will come sooner than we would like. As CIOs, we certainly will have our work cut out for us.
The shift in how and where employees are working will bring about a number of technological challenges in the short term. Looking ahead, however, the opportunities it will create for workers, companies and IT vendors will be unprecedented. If you want to hold me to any of my predictions, make it that one. After all, human ingenuity wins every time.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
6:47p |
IBM Opens Up Security Analytics Platform to Outsiders 
Article courtesy of The Var Guy
Companies generally agree that sharing threat intelligence helps improve everyone’s cybersecurity posture, but some are hesitant to do it for fear of giving away too much information.
That attitude is beginning to change, however, and IBM is the latest to adopt a more generous approach to sharing threat intelligence. Earlier this year the company opened up its 700 TB security threat database as part of a threat data sharing platform; this month it followed up by opening its security analytics platform for custom application development and launching an app exchange for creating and sharing apps based on IBM security technologies, it said in a press release.
IBM Security QRadar consolidates log source event data from thousands of devices, endpoints and applications distributed throughout a network and performs analytics on raw data to distinguish real threats from false positives, the company said. IBM customers, partners, and developers can now leverage the platform’s advanced security intelligence capabilities through new open APIs, the company said.
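IBM’s press release does not describe the new APIs in detail. Purely as an illustration of the kind of programmatic access involved, here is a minimal Python sketch written in the style of QRadar’s documented REST conventions; the console address, token, endpoint path, and field names are assumptions for illustration, not details from this announcement.

```python
# Minimal sketch: pulling recent open offenses from a QRadar console via its REST API.
# Hostname, token, endpoint path, and field names are assumptions for illustration only.
import requests

QRADAR_HOST = "https://qradar.example.com"   # hypothetical console address
API_TOKEN = "YOUR-AUTHORIZED-SERVICE-TOKEN"  # token created in the QRadar admin UI

def fetch_open_offenses(limit=10):
    """Return up to `limit` of the most recent open offenses as dicts."""
    resp = requests.get(
        f"{QRADAR_HOST}/api/siem/offenses",
        headers={
            "SEC": API_TOKEN,                  # QRadar's token header
            "Accept": "application/json",
            "Range": f"items=0-{limit - 1}",   # page size
        },
        params={"filter": "status=OPEN", "sort": "-start_time"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for offense in fetch_open_offenses():
        print(offense.get("id"), offense.get("description"))
```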
IBM also has launched IBM Security App Exchange, a marketplace for the security community to create and share apps based on these new QRadar APIs. IBM and partners including Bit9 + Carbon Black, BrightPoint Security, Exabeam, and Resilient Systems already have built a total of 14 new apps for the IBM Security App Exchange that extend QRadar security analytics in areas like user behavior, endpoint data, and incident visualization, according to IBM. Other partners such as STEALTHbits and iSIGHT Partners also have apps in development.
For example, Exabeam’s User Behavior Analytics app integrates user-level behavioral analytics and risk profiling directly into the QRadar dashboard, providing a real-time view of user risk that allows companies to detect small behavioral differences between a normal employee and an attacker using that same credential, according to IBM.
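Exabeam’s actual model is not described in the announcement. The snippet below is only a toy sketch of the general user-behavior-analytics idea it alludes to (compare an event against a user’s historical baseline and flag large deviations), not the vendor’s algorithm or its QRadar integration.

```python
# Toy user-behavior baseline check, illustrating the UBA idea only.
# Real products model many more signals (devices, peers, privilege changes, etc.).
from statistics import mean, stdev

# Hypothetical history: hour-of-day of a user's past logins.
login_hours = [9, 9, 10, 8, 9, 10, 9, 8, 9, 10, 9, 9]

def risk_score(event_hour, history):
    """Rough z-score of a new login hour against the user's baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0 if event_hour == mu else float("inf")
    return abs(event_hour - mu) / sigma

for hour in (9, 3):  # a typical 9 a.m. login vs. a 3 a.m. login on the same credential
    score = risk_score(hour, login_hours)
    label = "suspicious" if score > 3 else "normal"
    print(f"login at {hour:02d}:00 -> z={score:.1f} ({label})")
```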
IBM opened its massive database of security threat data through its IBM X-Force Exchange platform. Since then, more than 2,000 organizations have joined the program to share threat intelligence.
Marc van Zadelhoff, vice president of strategy and product management for IBM Security, said it’s imperative that industry leaders like IBM take the initiative in opening up security technologies and sharing threat intelligence to promote better cybersecurity globally, which suggests that stakeholders can expect similar moves from Big Blue in the future.
“With thousands of customers now standardizing on IBM’s security technologies, opening this platform for closer collaboration and development with partners and customers changes the economics of fighting cybercrime,” he said in the press release. “Sharing expertise across the security industry will allow us to innovate more quickly in order to help stay ahead of increasingly sophisticated attacks.”
This first ran at http://thevarguy.com/network-security-and-data-protection-software-solutions/ibm-shares-threat-intelligence-through-app-e
8:01p |
US Government Spends $9B on Software Yearly and Wants to Change That
Top White House technology and acquisition officials are making a push to reduce the federal government’s enormous annual software spend.
The government’s various agencies make more than 50,000 software purchase transactions a year, collectively spending about $9 billion. A lot of that money goes to unnecessary, redundant software licenses. To combat the inefficiency, US CIO Tony Scott and chief acquisition officer Anne Rung want to create a more centralized strategy for software purchases by government IT.
The Departments of Defense and the Interior, for example, are some of the largest buyers of geospatial software within the government. If and when a new government-wide system that’s in the works gets up and running, both will be able to take advantage of government-wide volume discounts and reduce duplication, saving millions of dollars as a result, Scott and Rung wrote in a joint blog post announcing the effort.
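The blog post does not show the underlying arithmetic, but a toy calculation (with entirely hypothetical license counts, prices, and discount rates) illustrates how eliminating duplicate licenses and pooling purchases into one volume-discounted agreement compound into savings:

```python
# Toy illustration of consolidated software purchasing savings.
# All numbers are hypothetical, not drawn from the OMB memo.

agencies = {
    "Agency A": 12_000,   # licenses bought independently
    "Agency B": 9_000,
    "Agency C": 6_500,
}
list_price = 400.0          # hypothetical per-license list price
duplicate_rate = 0.15       # share of licenses that are redundant copies
volume_discount = 0.25      # discount for one government-wide agreement

# Status quo: each agency buys at list price, duplicates included.
status_quo = sum(agencies.values()) * list_price

# Consolidated: drop duplicates, then apply the pooled volume discount.
needed = sum(agencies.values()) * (1 - duplicate_rate)
consolidated = needed * list_price * (1 - volume_discount)

print(f"Status quo spend:   ${status_quo:,.0f}")
print(f"Consolidated spend: ${consolidated:,.0f}")
print(f"Savings:            ${status_quo - consolidated:,.0f} "
      f"({1 - consolidated / status_quo:.0%})")
```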
The effort aims to reduce government IT spending on software in a similar way another recent change cut the cost of laptops and desktop computers. The Office of Management and Budget issued a directive prohibiting agencies from issuing new contracts for the machines, requiring them to use three existing contracts.
For at least one of those existing contracts, NASA’s Solutions for Enterprise-Wide Procurement, vendors dropped their prices for some of the standard-configuration computers by up to 50 percent one month later, according to the two chiefs.
The recent memo about proposed changes to software buying processes invites agency heads to comment on the changes. It is one of numerous new Category Management policies, issued to centralize purchasing of common commodities by the government.
Agencies struggle to create accurate inventories of the software they own, often buy capabilities they don’t need, and don’t share details about their purchases, such as pricing, terms, and conditions, that could help improve purchasing government-wide, the memo said. Most of them don’t have a centralized authority that manages software agreements, or employees who know how to properly negotiate and manage large software agreements.
The proposed policy aims to address those problems. In effect, this and other Category Management policies attempt to make the government act more like a private-sector company when buying products and services, leveraging scale to bring down the cost.
8:25p |
Oracle Buys Docker Container Startup StackEngine, Opens Cloud Campus in Austin 
Article courtesy of theWHIR
Oracle acquired Docker container operations management startup StackEngine on Friday and followed on Tuesday by announcing plans to build a cutting-edge cloud campus in Austin, Texas. A team of recent graduates and technical professionals will be employed on the campus, which is designed to attract top millennial talent to support Oracle’s rapidly growing cloud business.
StackEngine sells an application management solution called Container Application Center, which provides an administrative console to automate and simplify container operations management. The company emerged from stealth in October 2014, and TechCrunch reports it had accumulated $4.5 million in funding over multiple investment rounds prior to the sale. StackEngine’s full team will be integrated into Oracle as part of Oracle Public Cloud, which is the only detail the companies have provided about the acquisition.
The cloud campus announcement provides further clues, however, about Oracle’s plan and StackEngine’s role in it, as StackEngine is also based in Austin. With application development shifting steadily toward container-centric architectures, and Docker purchasing competing container management software company Tutum in October, Oracle seized the remaining opportunity to compete in the hot DevOps services market. Competing means offering enterprises a differentiated solution, and if StackEngine can be integrated with Oracle’s PaaS to deliver the security, resiliency, and speed it promises, easy container management may become a selling feature for Oracle.
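Neither Oracle nor StackEngine has published technical details of Container Application Center, so the snippet below is not their API. It is only a generic illustration, using the Docker SDK for Python, of the kind of routine container-operations task (find unhealthy containers and restart them) that management tools in this category automate behind a console.

```python
# Generic container-ops illustration using the Docker SDK for Python (docker-py).
# This is not StackEngine's or Oracle's API; it only shows the kind of task
# container management tools automate: spot unhealthy containers, restart them.
import docker

client = docker.from_env()  # talks to the local Docker daemon

for container in client.containers.list():
    state = container.attrs.get("State", {})
    health = state.get("Health", {}).get("Status")  # requires a HEALTHCHECK in the image

    if health == "unhealthy":
        print(f"restarting {container.name} ({container.short_id})")
        container.restart()
    else:
        print(f"{container.name}: {health or 'no healthcheck configured'}")
```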
Staff at the Austin cloud campus will increase Oracle’s local team “by more than 50 percent over the next few years.” Oracle will also purchase a 295-unit apartment building adjacent to the planned 560,000 square foot complex and parking development to house employees and provide work-life balance, the company said.
In an earnings report last week, Oracle revealed that while quarterly on-premise software revenue decreased 7 percent, its total cloud revenue rose 26 percent to $649 million, with PaaS making up three-quarters and growing by 34 percent. Oracle launched new IaaS offerings and expanded its Cloud Marketplace in October.
This first ran at http://www.thewhir.com/web-hosting-news/oracle-announces-cloud-growth-stackengine-acquisition-and-austin-cloud-campus