Data Center Knowledge | News and analysis for the data center industry
Friday, October 31st, 2014
From HAL to Johnny Depp’s California Data Center: The Evolution of Data Centers in Movies
As data centers continue to play a central role in our day-to-day lives, they make their way into pop culture. The data center has made several appearances on the silver screen over the years, at least as a concept. While many movies go the supercomputer route (perhaps a trend started by WarGames), the “data center” is often a central plot driver. Here’s a look at how the concept of a data center has evolved in the movies over the years.
Several films, such as Hackers, have presented data centers as futuristic boxes filled with blinking lights to convey the idea to the viewer. In more recent movies like Transcendence, however, they come much closer to what a real data center of the future might actually look like.
So, here it is, the Data Center Knowledge guide to the way Hollywood’s portrayal of the infrastructure that powers our high-tech world has evolved:
2001: A Space Odyssey (1968)
People either love or hate Stanley Kubrick’s 2001: A Space Odyssey; it can be described either as a deep meditation or as simply plodding. The antagonist is an artificial intelligence named HAL 9000.
Kubrick does a lot right in the movie. The source material by Arthur C. Clarke is visionary. There are no Star Warsy space noises (space is a vacuum, after all), and the movie predicts the importance of both computers and artificial intelligence.
HAL is perhaps the first and most important computer plot device ever used, and the villainous AI remains a popular trope.
The protagonist is forced to dismantle HAL’s brain room (the supercomputer or data center behind the AI) when HAL turns on him.
Terminator 1 and 2 (1984, 1991)
Perhaps it’s cheating to include so many instances of AI on a data center list, but a data center is ultimately powering these oft-used antagonists. Perhaps the most famous computer antagonist is Skynet in the Terminator series. Once again, self-awareness causes machines to turn against humanity.
2001 and Terminator have also set up IBM for unending HAL and Skynet jokes regarding its cognitive computing system Watson.
Live Free Or Die Hard (2007)
The hacker installment of the Die Hard series starring Bruce Willis. The plot revolves around “the Russians” hacking into critical infrastructure and wreaking havoc in the U.S. The hackers are able to tap into a plethora of critical systems from a single dashboard, which is pretty impressive, if implausible.
Also of note is “Warlock,” played by director Kevin Smith. Warlock has a “command center,” which is really a basement but can be considered a home data center of sorts in 2007 terms. Also look out for “mutating encryption algorithms,” solid state drives, and an emergency data dump of the country’s most critical information into a single backup data center rather than redundant locations. In a way, this is a horror movie.
For another home data center, see Demon Seed, whose Proteus IV controls all aspects of daily living in the movie and develops an unnatural attachment to its owner.
Entrapment’s Magical Patch Cables (1999)
Sean Connery and Catherine Zeta-Jones sneak into a server room, break open a cabinet and steal millions from a stock exchange by connecting a few patch cables.
The implausible scene is often cited as one of the worst “hacking” scenes in movies. Note that an alarm is triggered when the cable touches the laptop rather than the server.
The Matrix (1999): Human Powered Data Center
While the data center industry looks for alternative, renewable sources of energy, the machines in The Matrix discover that humans are the perfect battery. Somehow the energy output is greater than the energy required to keep humans alive and incubated. While the machines look organic, the sum of all the parts makes for a dystopian data center.
The data center powers a virtual simulation world to keep humans pacified, until a group wakes up and plots its attack, led by Neo, the One, played by Keanu Reeves.
Ocean’s Eleven (2001)
The heist movie features a more “classical” data center, with traditional racks as we know them. In the 2000s, the on-screen data center begins to move away from being an abstract concept.
An elaborate plot to steal dough from a casino includes hacking into security camera feeds. This is accomplished once again with some masterful patch cable work (see: Entrapment).
Skyfall (2012)
Finally, we begin to see data center scenes that resemble reality. The James Bond flick shows a neatly arranged data center owned by the villain. Compass Datacenters CEO Chris Crosby called the film the “data center movie of the year,” noting a distinct lack of data centers in films. So far, the list backs up this claim with the traditional data center being largely absent, despite often being a major contributor to the plot.
Iron Man 2 – Powered By Oracle (2010)
The Oracle name is slapped across this superhero follow-up, with CTO Larry Ellison even making a cameo appearance. Tony Stark, Iron Man, is perhaps the biggest technical visionary in the fictional Marvel Universe, and Oracle is sure to have paid a tidy sum to make sure that Iron Man leverages its tech.
Oracle held a screening for the movie and prominently features the trailer on its website. There’s even a fictional article about how Iron Man/Stark Industries uses Oracle Cloud data centers.
Oracle Exadata servers also make an appearance.
Tron: Legacy (2010)
Nothing legacy about Tron: Legacy’s data center
The update on the classic Tron also features a very futuristic data center. The server cabinets aren’t in rows, but spaced evenly apart. Some might call this a gaffe, but perhaps the servers of the future give off zero heat or have a heat profile that requires equidistant spacing between cabinets.
Transcendence (2014)
All it takes to build a dream data center is to transcend your physical body. Spoiler alert: Johnny Depp’s character transforms into an AI with human awareness and builds one of the most beautiful data centers ever to appear on the silver screen. We often talk of automation and robots in the industry, but we’re not quite at the point seen in the movie, where nanotechnology is able to control and build all aspects of the facility.
Transcendence features the ultimate future data center. In the movie, the data center is built at a cost of $38 million. Once again, the technically minded will twitch at a few inconsistencies, but the beauty of the data center is unquestionable.
The data center is located five stories below ground in the middle of nowhere, in a fictional town called Brightwood. It is built underground for security purposes and to control facility temperatures, and it is powered by a massive array of solar panels, which Depp’s character is able to build and fix on the fly.
The movie data center has come a long way from HAL in 2001: A Space Odyssey to Transcendence. As more of the general public becomes aware of the data center, we can expect an increasing number of data center cameos. Also expect more appearances of “the Cloud,” which was recently central to the plot of “Sex Tape” starring Cameron Diaz.
Joyent Raises $15M, Says it Used Containers Before They Were Cool
Joyent, the San Francisco-based public cloud provider, has raised $15 million in venture capital, which it says it will use to convince enterprise IT shops and developers to use application containers in the cloud.
The company is known for its high-performance Infrastructure-as-a-Service cloud as well as for being the sponsor and driving force behind Node.js, the popular open source platform for writing server-side applications in JavaScript.
A group of existing investors, including Intel Capital, El Dorado Ventures and EPIC Ventures, among others, participated in the latest funding round, meant to fuel integration between Joyent’s cloud and Docker container technology.
In the past, Joyent differentiated its IaaS from the likes of Amazon Web Services by saying it had a different architecture that made the cloud perform better. It’s still saying that, except it is now emphasizing that what makes its platform perform better is the use of application containers.
“Joyent has always been container-focused, and the way we’ve been high-performance is because of containers,” the company’s CTO Bryan Cantrill said. “We were way too early, and people didn’t see the same advantages that we saw necessarily.”
Now that Docker has made application containers the next big thing in IT infrastructure, Joyent is attempting to capitalize on the sudden popularity of the technology.
Marrying two different kinds of container technology
Docker, an open source software project and a company, has been around for less than two years. The first production-ready release came out in June of this year, but the technology already has support from a handful of big names in IT, including Google, IBM, Microsoft, and Red Hat, and runs inside data centers operated by companies like eBay and Gilt.
Application containers used by Joyent’s SmartOS are different from Docker’s Linux containers, but the basic concept is the same, Cantrill said. Instead of creating virtual machines and running a full operating system on each of them, the approach is to virtualize the OS itself, so that workloads use the resources of the underlying server directly, without the extra layer of hypervisor software in between, which, according to Cantrill, degrades performance.
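To make the container-versus-hypervisor distinction concrete, here is a minimal sketch using the Docker SDK for Python (the "docker" package) rather than Joyent’s SmartOS tooling; the image name and command are illustrative assumptions, not anything from Joyent or Cantrill.

```python
# Hypothetical example: launching a Linux container with the Docker SDK for Python.
# The container shares the host kernel, so no guest OS or hypervisor is involved.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a short-lived container and capture its output.
output = client.containers.run("alpine", "echo hello from a container", remove=True)
print(output.decode().strip())
```

The same workload in a virtual machine would require booting a full guest operating system under a hypervisor, which is the overhead Cantrill is referring to.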
While it uses different container technology, Joyent is working to make Docker containers available on its cloud – a bridge between the SmartOS foundation and Docker images. Cantrill was reluctant to describe in detail exactly how that bridge would work, promising to reveal more in the coming months. The integration is well under way, but the company is not ready to talk about a specific product roadmap just yet, he said.
The difference between SmartOS and Docker containers is fundamental, Cantrill said. The team that built Joyent’s cloud OS was focused on features like supporting multiple tenants on a single infrastructure and security. Linux containers were originally created by Google for its own use, so there wasn’t a concern about sharing servers with others.
Docker the company has been focused on making Docker containers palatable for enterprises.
Joyent’s cloud lives in colocation data centers (mostly Equinix facilities) in Northern Virginia, Las Vegas, the San Francisco Bay Area, and Amsterdam. Many users around the world also use the company’s software to operate private clouds in their own data centers, Cantrill said.
CoreOS Launches Stand-Alone Docker Container Registry for Private Deployments
CoreOS, the San Francisco-based startup that offers a version of Linux geared for large-scale compute clusters and optimized for Docker containers, has introduced a stand-alone version of its container registry that customers can run in their own data centers.
Both CoreOS and Docker are relative newcomers to the cloud infrastructure market but have enjoyed a lot of support from developers. Both companies are aiming to enable traditional enterprises to develop and run software the way Internet giants like Facebook or Google do – constantly creating new features and deploying them in production across their massive data centers.
CoreOS got the registry capability through the acquisition of a New York startup called Quay.io in August and until now has been offering it as a hosted service bundled with its “managed Linux” offering. Now, by popular demand, customers who like the registry but don’t necessarily want to pay for the full-fledged enterprise-grade service can buy the software separately.
Docker the company (as opposed to the open source software project) hosts users’ container images in its public registry called Docker Hub. Quay.io offered private registry hosting services to make it more enterprise-friendly.
CoreOS only started offering Quay.io’s service in August, after it bought the two-person startup. But the registry was so popular with its customers that it decided to make it a stand-alone product.
“Due to popular demand we decided to make CoreOS Enterprise Registry available to any company solely looking for a private Docker registry,” CoreOS CEO Alex Polvi said in a statement.
The Docker container registry has an interface to manage permissions, audit code commits, and deploy containers. It also automates deployment: after code has been checked into GitHub and put through tests, the registry pushes it into a user’s repository, where their servers can access it.
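As a rough illustration of the registry workflow, here is a hedged sketch of tagging and pushing an image to a private registry with the Docker SDK for Python; the registry hostname and image names are placeholders and not specific to CoreOS Enterprise Registry.

```python
# Hypothetical example: pushing a locally built image to a private registry
# using the Docker SDK for Python. "registry.example.com" is a placeholder.
import docker

client = docker.from_env()

image = client.images.get("myapp:latest")  # assumes the image was already built locally
image.tag("registry.example.com/team/myapp", tag="v1")

# Stream push progress messages from the registry.
for line in client.images.push("registry.example.com/team/myapp", tag="v1",
                               stream=True, decode=True):
    print(line)
```

In a continuous-deployment setup, a step like this would typically run after the code has passed its tests, so that servers can pull the new image from the registry.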
Friday Funny Caption Contest: Giant Pumpkin
Happy Halloween, goblins and ghouls! Let’s get into the spooky spirit. Help us complete our Kip and Gary cartoon by submitting your caption below.
Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.
Here’s what Diane had to say about this week’s cartoon: “What on earth could a giant pumpkin be doing in the middle of the data center?!”
Congratulations to the last cartoon winner, Darrell, who won with, “My new office is cramped, but it has great air conditioning!”
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website.
DuPont Fabros Customer Subleases Former Yahoo Data Center in Ashburn
DuPont Fabros Technology execs revealed on the company’s third-quarter earnings call Thursday that one of its existing customers subleased the entire 13 megawatts of capacity at its ACC4 data center in Ashburn, Virginia, that was recently vacated by Yahoo.
Since Yahoo’s lease has not yet expired, the sublease is a three-party agreement between the Internet company, the data center provider, and the customer. The customer will also have a direct agreement with DuPont Fabros on a lease extension for 9 megawatts in the former Yahoo data center.
The customer, whose name was not disclosed, chose the Yahoo sublease space over brand new space in the recently built ACC7. Company officials did say that it was an existing “super-wholesale” customer with space in ACC5 and ACC6.
DuPont Fabros CEO Hossein Fateh explained that ACC7 didn’t have enough space for the customer’s needs, however, which either means the customer is going through a stage of rapid growth or the data center space came at a deep discount. ACC4 was completed in 2007.
This is a sizable chunk of space that would have impacted the market at large had it not been taken. A big Internet company exiting a big data center to move into one of its own and subleasing it at a discount rate until its own lease with the provider expires has been a common theme in several high-profile data center markets.
If the space did go at a discount, the fact that the Yahoo sublease in Ashburn has now been absorbed is likely to be welcome news for DuPont Fabros competitors in Northern Virginia, including RagingWire, CyrusOne, Digital Realty Trust, and CoreSite: it’s difficult to compete in a market where some inventory is available at below-market rates.
CEO succession plans on fast track
DuPont Fabros announced it was looking for a replacement for Fateh in 2013, but the previous plan was to appoint someone as president and then, over time, have them transition into the CEO role, at which point Fateh would step down. But now the plans have changed.
DuPont Fabros is now looking for a CEO and president who will be ready to take over immediately, and Fateh will step down as soon as that person is found. He will remain on the company’s board.
“Finding the right person to lead DFT’s next chapter of growth is such a serious undertaking,” said Fateh. “It deserves careful time and attention, but frankly, our process has taken just too long. The board and I have committed to a new and re-prioritized approach.”
The company reported $105.6 million in revenue for the quarter, up 10 percent year over year. DuPont Fabros earnings per share were $0.60, up 18 percent year over year.
IBM Adds OpenStack Services, Launches Cloud Data Center in Mumbai
IBM has added OpenStack services to its cloud marketplace in yet another demonstration of a tech giant looking to enable OpenStack clouds in the enterprise.
The services will run on SoftLayer infrastructure. SoftLayer has offered OpenStack-compatible cloud services before, but this marks the introduction of IBM OpenStack cloud services into the enterprise-focused cloud marketplace.
IBM also opened a 31,000-square-foot data center in Mumbai in support of its global cloud offering. The expansion is part of an ongoing $1.2 billion investment by IBM to grow its SoftLayer data center infrastructure. The investment was committed in January, with 15 data center expansions planned, which will bring the portfolio to 40 data centers across five continents. The Mumbai announcement follows recent launches in Toronto, London, Hong Kong, Melbourne, and Paris.
The goal of the new IBM OpenStack cloud services is to make it easier to deploy OpenStack clouds on top of IBM’s cloud. Business and government customers can select from a range of underlying hardware configuration options based on open standards and deploy on IBM SoftLayer’s global footprint. Managed services are available depending on customer needs.
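The point of building on open standards is that the resulting cloud speaks the same APIs as any other OpenStack deployment. As a hedged sketch (using the community openstacksdk library, not an IBM-specific SDK, with placeholder cloud, image, flavor, and network names), provisioning a server could look like this:

```python
# Hypothetical example: booting a server through standard OpenStack APIs
# with the openstacksdk library. All names below are placeholders.
import openstack

conn = openstack.connect(cloud="my-openstack-cloud")  # credentials come from clouds.yaml

server = conn.compute.create_server(
    name="demo-server",
    image_id=conn.compute.find_image("ubuntu-20.04").id,
    flavor_id=conn.compute.find_flavor("m1.small").id,
    networks=[{"uuid": conn.network.find_network("private").id}],
)
server = conn.compute.wait_for_server(server)  # block until the server is active
print(server.status)
```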
Startups welcome OpenStack support from giants
Legacy IT vendors have based their cloud strategies on the popular open source cloud architecture, and countless startups have devised entire business models around it. End users are taking notice, and the community has matured tremendously with contributions from companies of all stripes.
VMware launched its own integrated OpenStack distribution in August. Oracle keeps adding OpenStack support to its own Linux distribution, as well as to other offerings, such as Network Fabric.
There is no shortage of backing for the open source project from the IT giants, which smaller OpenStack-based businesses welcome.
“It’s great to see more efforts towards widely deploying OpenStack,” said John Dickinson, director of technology at SwiftStack and OpenStack Swift project technical lead. “Data is growing, and business must choose open technology to have full ownership of everything that touches their data. With IBM providing OpenStack services, more companies have greater access to an open software-based infrastructure solution. I’m excited to see large, traditional companies bringing OpenStack services to their customers.”
“It helps everybody who’s involved in OpenStack,” said Wendy Cartee, vice president of marketing at PLUMgrid, a big contributor to the network technology side of OpenStack. “All the activity definitely helps a startup like us. All the legacy companies are heavily investing in OpenStack. We’re seeing enterprises and service providers move to [OpenStack] because they don’t want to be locked in.”
SoftLayer’s previous OpenStack efforts
IBM is a founding member and contributor to OpenStack, but once upon a time, OpenStack and SoftLayer seemed like an unlikely duo.
Initially, OpenStack was deeply tied to Rackspace (the company’s engineers played a major role in its creation), and SoftLayer and Rackspace were fierce competitors. For those who lived through the early days of that battle, it might appear strange to hear SoftLayer and OpenStack in the same sentence. But the project has evolved into an open source development effort with many contributors, and SoftLayer has successfully used and enabled customers to use OpenStack for a while.
SoftLayer’s object storage platform is built on OpenStack Swift, and its three-tiered network “integrates perfectly with OpenStack’s Compute and Network node architecture,” wrote SoftLayer CTO Marc Jones.
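Because the platform is built on OpenStack Swift, it can be driven with standard Swift tooling. Here is a hedged sketch using the python-swiftclient library; the auth endpoint and credentials are placeholders, not SoftLayer account details.

```python
# Hypothetical example: storing and retrieving an object in a Swift cluster
# with python-swiftclient. Endpoint and credentials are placeholders.
from swiftclient import client as swift

conn = swift.Connection(
    authurl="https://identity.example.com/auth/v1.0",
    user="account:user",
    key="api-key",
)

conn.put_container("backups")
conn.put_object("backups", "hello.txt", contents=b"hello object storage")

headers, body = conn.get_object("backups", "hello.txt")
print(body.decode())
```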
Object storage comes to Bluemix PaaS
IBM also recently introduced Object Storage as a service on Bluemix, its Platform-as-a-Service. Object storage is more scalable than traditional file storage. The PaaS continues to gain valuable services, such as Watson cognitive computing APIs and an Internet of Things (IoT)-focused service for data generated by connected devices.
Microsoft Azure Unveils Real-Time Tools for IoT and Big Data Application Developers 
This article originally appeared at The WHIR
Microsoft’s public cloud platform Azure is continuing to add functionality for Big Data and the Internet of Things (IoT) with the preview launch of Stream Analytics and Data Factory and the general availability launch of Azure Event Hubs.
According to a Wednesday blog post from Microsoft’s Corporate VP of Machine Learning, Joseph Sirosh, these new services help make Azure an ideal cloud platform on which to build big data solutions, as well as process, manage and orchestrate data from IoT devices and sensors.
Azure Stream Analytics is an event processing engine which helps uncover real-time insights from devices, sensors, infrastructure, applications and data. It could be used in IoT scenarios such as real-time fleet management or gaining insights from devices like mobile phones or connected cars.
Azure Data Factory helps orchestrate and manage diverse data sources, such as an on-premises SQL Server database, and cloud data like Azure SQL Database, blobs, and tables. Data Factory helps assess end-to-end data pipeline health across those sources and pinpoint and troubleshoot issues.
Industrial automation and information provider Rockwell Automation, for instance, uses Data Factory as part of its remote monitoring services to orchestrate critical data pipelines for time series sensor data.
Azure Event Hubs is capable of logging millions of events per second in near real time. The data is collected into “Event Hubs”, which can then be used with real-time analytics services such as Stream Analytics.
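For a sense of what feeding an Event Hub looks like in code, here is a minimal sketch using the azure-eventhub Python SDK (a later client library than was available when this article ran); the connection string and hub name are placeholders.

```python
# Hypothetical example: publishing telemetry events to Azure Event Hubs
# with the azure-eventhub SDK. Connection string and hub name are placeholders.
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    eventhub_name="telemetry",
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData('{"device": "sensor-1", "reading": 42}'))
    producer.send_batch(batch)  # events become available to consumers such as Stream Analytics
```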
Event Hubs and Stream Analytics can be used together to process massive amounts of real-time data and enable organizations to make more immediate decisions. Medical products company Aerocrine has already implemented Stream Analytics and Event Hubs to improve the management and care of patients with inflammatory airway diseases through the analysis of telematics data from clinics.
Microsoft has been improving its capabilities around Big Data and analytics recently, adding support for Apache Storm in Azure HDInsight earlier this month. It also recently announced new Azure SQL Database tiers that now scale as large as 500 GB, its fully managed NoSQL database service Azure DocumentDB, a sophisticated search service, and Azure Machine Learning.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/microsoft-azure-unveils-real-time-tools-iot-big-data-application-developers