Data Center Knowledge | News and analysis for the data center industry
Friday, November 1st, 2013
11:20a
Report: Google Mystery Barge A Party Boat, Not A Data Center  The mysterious barge berthed at Treasure Island in San Francisco reportedly will be a high-tech showcase for Google Glass, not a data center. (Photo: Jordan Novet)
That mysterious “Google Barge” in the San Francisco Bay? It’s apparently not a water-based data center, but a Google Glass promotion featuring high-tech showrooms and a “party deck.” That’s the word from San Francisco TV station KPIX, which reports that the barge will be an exclusive, invitation-only showcase for Google’s new wearable technology.
That’s consistent with recent discussions with data center experts, who say Google would be unable to procure enough power to operate a working data center in San Francisco, which appears to be where the barge will stay. Google has had discussions about docking the barge at Fort Mason, a former military facility across from Alcatraz, according to The Verge.
That’s a clear sign it’s not a production data center, according to Mark Bramfitt, a former PG&E executive who now works as a consultant on power procurement for data centers.
“There is no power capacity at Fort Mason to serve a data center of this size,” said Bramfitt. “I’m guessing, but this might be an ‘immersive’ experience site for Google Glass. You go in and are presented with a variety of scenarios that show the tech to high advantage.”
That’s what KPIX is reporting, citing “multiple sources” who say the barge will be filled with glittering showrooms, topped by a rooftop party deck offering bars and lanais for VIPs, who can enjoy scenic views of Alcatraz and the Golden Gate Bridge.
“The project, which has been in the planning stages for more than a year, was created at Google[x], the secret facility that Google reportedly runs near its corporate headquarters in Mountain View,” KPIX reports. “It is personally directed by Google co-founder Sergey Brin and is Google’s attempt to upstage rival Apple and its chain of popular retail stores, sources said.”
CNet initially reported that the barge, which is stacked high with shipping containers, may be a floating data center being built by Google. A nearly identical facility has appeared in a harbor in Portland, Maine.
For those just joining the story, see our previous coverage.
12:00p
A Public, Private, or Hybrid Cloud Debate? Not Really
Kris Bliesner is CEO and Co-Founder of 2nd Watch, Inc. Among other areas, he is responsible for the strategic development of 2nd Watch’s cloud-based software solutions.
KRIS BLIESNER
2nd Watch
Most IT professionals and market researchers contend that while the majority of businesses today are eyeing a hybrid cloud deployment, that’s really because they’re being conservative. When it comes to debating the merits of public, private and hybrid, there really is no debate at all. It all comes down to the public cloud adding the most value. Here is why:
1. Start-up companies need the public cloud. These companies are often involved in development with uncertain requirements. They don’t know what they might need day-to-day. And many can be on a very tight timeline to get their products to market. These situations mandate a public cloud deployment, like Amazon Web Services (AWS), where more (or fewer) resources can be configured and absorbed in a matter of minutes. While they might maintain a small infrastructure onsite, the majority of their infrastructure simply has to be in the public cloud.
2. Enterprises have a need for redundancy and disaster recovery. Taking AWS as an example, this cloud option can be incredibly redundant if you use one of its lesser-known features: region-to-region redundancy. This means that infrastructure is backed up in different data centers. Many AWS customers don’t consider this, and feel that multiple zones in the same region are enough. However, opting for region-to-region puts data and virtual infrastructure in two very different locations, and should anything happen to one, the odds are very small that anything will happen to the other (a minimal configuration sketch follows this list). This can be mirrored with a private cloud deployment, but the cost is colossal.
3. There are mandatory regulations around compliance and security. Audits are often years behind the industry, but their rules can be challenged. We’ve seen customers exceed auditors’ expectations, make a case for their architecture, and win, gaining all the benefits of a public cloud architecture with all the security needed by common regulatory requirements. This is hard to replicate with private clouds, because with internal data protection you’re going to have internal SLAs and internal compliance checklists, which require frequent upkeep, higher costs and a more complicated infrastructure.
4. Web and cloud security present challenges. This landscape changes quickly, and should be a primary concern for any cloud-based deployment. Some perceive a public cloud infrastructure to be more vulnerable than a private cloud, but that’s actually a misconception. A private cloud allows IT to control the perimeter, but it’s also responsible for staying on top of a rapidly shifting security landscape and making all required fixes, updates and upgrades. Public clouds take care of all that. Data is protected by managed security at both the software and physical levels, since large-scale data centers like those used by public cloud providers have state-of-the-art security.
5. Budget is, of course, a huge factor. Companies with large amounts of infrastructure already installed might find it cheaper to implement a private cloud, since in many cases they already have the hardware as well as the operating systems and management tools required to build a private cloud. However, hardware infrastructure and the demands made on it by software – especially operating systems – change about every 3-5 years. Public cloud deployments are 100 percent virtual, which means keeping the hardware that hosts those virtual machines current is the provider’s responsibility, and that brings significant long-term cost savings. Smaller companies that need to stretch their investment as far as it can go will see those benefits right away. For this reason, the application-level services offered by partners and other customers of providers like AWS are a very attractive draw.
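To make the region-to-region redundancy described in point 2 concrete, here is a minimal sketch of enabling S3 cross-region replication with boto3. It is an illustration only, not something from 2nd Watch: the bucket names, region choices and IAM role ARN are placeholders, and a real deployment would also need the replica bucket created in a second region plus an IAM role with replication permissions.

```python
# Illustrative only: replicate objects from one S3 bucket to a bucket in another
# region so that data survives a regional outage. All names below are placeholders.
import boto3

SOURCE_BUCKET = "example-prod-data"        # hypothetical bucket in the primary region
REPLICA_BUCKET = "example-prod-data-dr"    # hypothetical bucket in a second region
REPLICATION_ROLE = "arn:aws:iam::123456789012:role/example-s3-replication"

s3 = boto3.client("s3")

# Cross-region replication requires versioning on both buckets.
for bucket in (SOURCE_BUCKET, REPLICA_BUCKET):
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

# Replicate every object written to the source bucket into the replica bucket.
s3.put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE,
        "Rules": [
            {
                "ID": "replicate-all-to-dr-region",
                "Prefix": "",              # empty prefix = all objects
                "Status": "Enabled",
                "Destination": {"Bucket": f"arn:aws:s3:::{REPLICA_BUCKET}"},
            }
        ],
    },
)
```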
With all of the above said, private cloud is inefficient. It is built on a model that encourages over-provisioning. In fact, in order to get maximum benefit from private cloud – true elasticity – you have to over-provision. The public cloud, on the other hand, is the most widely applicable and delivers the most value to the majority of businesses. Is a successful private cloud deployment possible? Of course. Is it efficient? No.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

1:00p
Understanding Cloud Security, Certificates and Compliance  Cloud security, compliance, and the certificates that help support cloud communication have all come a long way
Now that we have a better understanding of the cloud and the capabilities it can provide, it’s time to examine the never-ending questions around security that always seem to drive the conversation. Next-generation security models aim to revolutionize the data center and how it facilitates the cloud. We have better scanning engines, improved cloud monitoring options, and even more granular user control features.
But what’s happening inside of the cloud? How have security concerns around core areas been addressed? Remember, there are still organizations out there which are heavily driven by compliance requirements such as SOX, HIPAA, PCI/DSS and even FISMA.
Cloud security, compliance, and the certificates that help support cloud communication have all come a long way. When it comes to compliance and regulatory-driven organizations, it’s important to understand how technologies like cloud and virtualization are now able to create a more robust environment. Furthermore, mobility solutions – when incorporated with your cloud platform – can even further enhance the data center cloud security model.
- Cloud security advancements. Cloud is certainly here and not going anywhere. In fact, organizations are finding ways to deploy even more of their IT environment into a cloud model. Between all of this sits the very real challenge around security. Standard UTM firewalls were simply not enough to stop some of the latest threats and attacks against the modern cloud and data center. This is where next-generation security came in to help. Although it’s a bit of a buzz-term, the idea behind next-gen security is very real. Instead of standard firewall services, we are now seeing direct integration into the application, data and even user layer. Entire applications can be placed behind intelligent, heuristic, learning engines which monitor for anomalous activity. Not only do they protect internal resources, they continuously monitor these applications against signatures based on public vulnerability databases (e.g. Snort, CVE, Bugtraq, etc.). Couple this with systems which are able to operate inside and outside of the network, and you’ve got a pretty robust security platform. Internal security services now include solutions like IPS/IDS as well as data loss prevention.
- Who owns the “keys” to your kingdom? Modern organizations have very new security demands around their data and applications. Data sharing and collaboration – primarily revolving around mobility – has been a very hot topic for IT managers. Many security administrators were asking for a better way to share, control and deliver data on a cloud-ready platform. Solutions which house data directly in the cloud aren’t a great fit for everyone. This is where solutions like ShareFile step in. Organizations bound by compliance regulations (HIPAA, for example) must take special precautions around their data sharing environment. With the ShareFile platform, administrators are able to house their data both on-site and in the cloud. Furthermore, they are able to own the encryption keys throughout the entire process. With purely cloud-hosted solutions, this is simply not possible, since the data resides on a vendor’s system. This new approach to file sharing and collaboration has allowed organizations that were stuck with compliance challenges to take that cloud leap while retaining complete control of their information; a generic sketch of the client-side idea follows.
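To make the key-ownership point concrete, below is a minimal, generic sketch of client-side encryption in Python, where the key is generated and held on-site and only ciphertext ever reaches a sharing service. This is not ShareFile’s actual mechanism or API, and the sample payload is made up.

```python
# Generic illustration of "owning the keys": encrypt locally, share only ciphertext.
from cryptography.fernet import Fernet

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt data with a locally held key before handing it to any cloud service."""
    return Fernet(key).encrypt(plaintext)

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Only holders of the on-site key can recover the original data."""
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    key = Fernet.generate_key()              # generated and stored on-site
    record = b"example regulated record"     # made-up payload
    blob = encrypt_for_upload(record, key)
    # `blob` is what gets uploaded and shared; the provider never sees the key.
    assert decrypt_after_download(blob, key) == record
```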
- The importance of certificate monitoring and control. Security certificates play a big role in the cloud world. There are some key aspects that are necessary to create a solid certificate management platform:
- Monitoring your SSL and security certificates is a must. Why? A single expired SSL certificate can affect not only the system it’s loaded on, but also other services that depend on it. Recently, a single SSL certificate caused a major outage, taking down or disabling 52 different services. For the Azure cloud, this was a tough lesson to learn. (A minimal expiry-check sketch appears after this list.)
- Proactively set up alerts. As part of the monitoring process, it’s important to know when certificates are expiring, whether there are issues, and how they are interacting with other services. Remember that application-level dependencies include certificates deployed on various servers. This means that if a single certificate expires or fails, it can potentially create a cascading scenario that negatively impacts a number of services.
- Know your certificate origins. Just because a certificate costs less doesn’t mean it’s as good as one from a well-known certificate authority. There are cases where certain certificates simply will not work or will not be accepted by some cloud services. This means understanding who is issuing your certificates and whether those certs are compatible with your cloud platform.
- SSL certs can take down your cloud, as happened with Azure. Create monitors and alerts. Ensure compatibility.
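As a minimal illustration of the monitoring and alerting advice above, the sketch below checks how many days remain on the TLS certificates of a few endpoints and prints an alert when one is close to expiry. The host names and the 30-day threshold are assumptions for the example; a production setup would feed this into whatever alerting system you already run.

```python
# Illustrative certificate-expiry check using only the Python standard library.
import socket
import ssl
from datetime import datetime, timezone

ENDPOINTS = ["example.com", "www.example.org"]   # hypothetical services to watch
WARN_DAYS = 30                                   # alert this many days before expiry

def cert_days_remaining(host: str, port: int = 443) -> int:
    """Return the number of days until the host's TLS certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days

for host in ENDPOINTS:
    remaining = cert_days_remaining(host)
    if remaining <= WARN_DAYS:
        print(f"ALERT: certificate for {host} expires in {remaining} days")
    else:
        print(f"OK: {host} has {remaining} days of certificate validity left")
```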
- Compliance and regulation in the cloud. Going to the cloud may not be as challenging as it once was, but staying compliant is still a challenge. Even now, there are only a handful of data center providers which can host PCI/DSS systems. Furthermore, creating a cloud platform for a compliance-driven organization creates numerous other security challenges. Where will the data be housed? Are virtual images certified? How are users accessing their workloads? How are you staying proactive about compliance? Aside from PCI/DSS, FISMA has created a new way for government entities to look at their cloud and IT model. Think they didn’t take this seriously? In 2008, federal agencies spent $6.2 billion to secure their infrastructure. The idea was to create a system around cybersecurity which emphasized a risk-based policy for cost-effective security. From a FISMA perspective, there are 7 key elements that an organization must meet:
- Inventory of all systems
- Categorization of those systems based on risk level
- Implementation of security controls
- Risk assessment audits
- A system security plan
- Security certification and system accreditation
- Continuous infrastructure monitoring
Written in 2002, this act was in dire need of an update and a refresh. With that came the Cybersecurity Act of 2012. After key conversations with government, public and private security experts, the concepts of FISMA were further defined in this act. This clarification describes how organizations should:
- Determine the Greatest Cyber Vulnerabilities
- Create a Public-Private Partnership to Combat Cyber Threats
- Incentivize the Adoption of Voluntary Cybersecurity Practices
- Improve Information Sharing While Protecting Privacy and Civil Liberties
- Improve the Security of the Federal Government’s Networks
- Strengthen the Cybersecurity Workforce
- Coordinate Cybersecurity Research and Development
The act further defines roles, responses and some of the new technological platforms currently available in the IT world. Still, some security experts argue that, although the act is set with good intentions, it is really just a checklist for organizations to work through. Ultimately, it will all come down to how well your organization is able to deploy a secure infrastructure around overall security best practices.
As cloud continues to integrate more closely with the modern business organization, security tactics will need to evolve. Typical UTM appliances are simply not enough to protect against new types of threats targeting cloud and data center environments. Compliance regulations aren’t going anywhere either. This means that if you want to go to the cloud – and are bound by governing policies – remember that there are options out there.
Because cloud has continued to progress, the security models that support cloud computing are allowing for more granular controls over key security components. This means more control over secure traffic transmission, file and data controls, and the ability to continue to deliver digital content to the end-user. There are a lot of great ways to deliver powerful cloud services – the key will be to do so in an efficient and secure manner.

1:00p
Colocation Providers, Customers Trade Tips on Energy Savings  During the Green Colocation panel at Tuesday’s Data Center Efficiency Summit, moderator KC Mares (far left) led a discussion among (from left to right) Nicole Peill-Moelter from Akamai, Salesforce.com’s Tom Fisher, Vantage Data Centers’ Jim Trout, Andy Broer of Box, and Jim Smith from Digital Realty Trust. (Photo: Jordan Novet)
PALO ALTO, Calif. – Customers of colocation facilities don’t always see eye to eye with the companies whose infrastructure and energy they depend on, but at times, their interests do align.
At the 2013 Data Center Efficiency Summit on Tuesday, representatives of both groups shared ideas about how they could improve energy efficiency and rely more on renewable power.
Nicole Peill-Moelter, director of environmental sustainability at Akamai, called on colocation providers to share data on power usage with customers.
“Right now we’re blind,” Peill-Moelter said. “To get real-time data on how we’re using our energy compared to how we think we are – that would help us drive greater efficiency on our side.”
Sharing Best Practices
Then again, a colocation customer could share best practices with other companies running infrastructure in the same facility. “I think that would break down some of the barriers that we face,” said Tom Fisher, a sustainability manager at Salesforce.com.
Colocation providers could also assist their customers by offering incentives for energy-efficient practices or helping to troubleshoot obvious inefficiencies that could worsen site-wide Power Usage Effectiveness (PUE). To some degree, that’s already happening.
Andy Broer, senior manager of data center operations at Box, said that earlier in the day he’d received a report from one of his company’s colocation providers containing photographs of hardware Box employees had installed incorrectly. “They’re doing containment in the data center, even for colocation,” Broer said. “The fact that they’re enforcing it, it’s the first time I’ve seen it.”
Opportunities in Design, Power Provisioning
But sometimes the building itself is the problem. Some colocation providers and their customers could benefit from constructing facilities that are more energy-efficient from the start than those that have been delivering services for decades.
“You have to build better facilities, and you have to plan together,” said Jim Trout, founder and chief technology officer of Vantage Data Centers. He’d like to run facilities with PUEs in the 1.1 to 1.4 range. “Everybody else is raising the bar as well,” he said.
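For readers new to the metric, PUE is simply total facility power divided by the power delivered to IT equipment, so 1.0 is the theoretical ideal. The tiny sketch below uses made-up meter readings, not figures from the panel.

```python
# PUE = total facility power / IT equipment power; the readings here are made up.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 means every watt goes to IT gear."""
    return total_facility_kw / it_equipment_kw

print(pue(total_facility_kw=2400.0, it_equipment_kw=2000.0))  # 1.2, within the 1.1-1.4 range cited above
```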
Then there’s the matter of generating green power for colocation facilities. Individual colocation users could try to band together and secure bulk purchasing deals on renewable energy the way Google or Apple can. Fisher suggested as much. Earlier this year his company committed to steadily using more renewable energy sources, with a goal of exclusively using renewables.
At least in one case, though, a similar approach to buying renewable energy didn’t pan out. Digital Realty once got a chance to bid on a deal offering low prices for renewable energy and contacted all of its larger customers in Texas about the opportunity.
“We got zero takers,” said the company’s chief technology officer, Jim Smith.
At times, Smith said, other interests have come before energy efficiency. One customer was told about an opportunity to save money if the data center could install aisle-containment equipment. While it would have been easy to implement the gear, the customer was reluctant, saying the company was too busy growing. Smith thought the argument made sense. “His growth is more valuable than his cash flow at that point,” he said.
Ultimately, Smith said, he’d like discussions on infrastructure changes to become more rational and data-driven.

1:00p
Top Ten Data Center Stories, Month of October  Will Googlers be singing “Anchors Aweigh” in a sea-going data center? Mysterious barges, complete with containers and linked by stairs, appear in San Francisco Bay and Portland, Me. harbor, and rumors abound about these “secret” vessels. (Photo by Jordan Novet.)
October has been a busy month here at DCK, with coverage of the Google mystery barges, Facebook’s cold storage system and how Dropbox stores “your stuff.” Here we present the top ten most popular stories published by Data Center Knowledge this month, ranked by page views. Enjoy!
- The Barge Mystery: Floating Data Centers or Google Store? – October 28, 2013
- First Look: Facebook’s Oregon Cold Storage Facility – October 16, 2013
- How Dropbox Stores Stuff for 200 Million Users – October 23, 2013
- NSA Data Center Plagued by Electrical Problems – October 8, 2013
- Terremark Data Center Outage Knocks HealthCare.gov Offline – October 27, 2013
- Chaos Kong is Coming: A Look At The Global Cloud and CDN Powering Netflix – October 17, 2013
- Apple Quietly Builds Its Prineville Data Center – October 16, 2013
- NTT Communications Acquires Controlling Interest in RagingWire – October 28, 2013
- Closer Look: The Mysterious Barge in the Bay – October 28, 2013
- Colocation Will Be a $10 Billion Market by 2017, Research Firm Says – October 1, 2013
A “shout-out” goes to Ron Vokoun of JE Dunn Construction, whose column almost made it to the “Top 10,” coming in at number 11 in page views. Many readers commented about the issue. Check it out, Top 7 Reasons Data Centers Don’t Raise Their Thermostats, and join the discussion, too.
Stay current on Data Center Knowledge’s data center news by subscribing to our RSS feed and daily e-mail updates, or by following us on Twitter or Facebook. DCK is now on Google+ and Pinterest.

2:00p
Friday Funny: Time to Vote on Batman, Arizona
Thank Goodness It’s Friday! Since it’s Friday, that means it’s time for our caption contest, with cartoons drawn by Diane Alber, our favorite data center cartoonist! Please visit Diane’s website Kip and Gary for more of her data center humor.
This week, we are voting on the last two cartoons, Is it hot enough for you? and Holy Batman!
Please vote below, and have a good weekend.
Here’s how it works. We provide the cartoon and you, our readers, submit the captions. We then choose finalists, and readers vote for the funniest suggestion. The winner receives a hard copy print with his or her caption included in the cartoon.
For the previous cartoons on DCK, see our Humor Channel.