Data Center Knowledge | News and analysis for the data center industry
Tuesday, April 15th, 2014
11:30a |
Westland, Evolve Team on 12 Megawatt Expansion of Houston Bunker
The Westland Bunker and Evolve Data Center Solutions are teaming to build a second data center for the Westland Campus, an ultra-secure data center campus outside of Houston. The companies broke ground Monday on an expansion that will add 105,000 square feet of data center space and 12 megawatts of power to the campus in Montgomery, Texas, which has a colorful history.
With the new space, Westland and Evolve are implementing a “ClearColo” concept, moving all supporting equipment outside of the server rooms into service halls. This approach, which has been adopted in many new data center builds, allows maintenance to be performed securely, with limited vendor access to the server rooms.
The Montgomery Westland property was initially built by Ling-Chieh Kung, a nephew of Chiang Kai-shek and founder of Westlin Oil. Fearing a nuclear war, the reclusive Kung built a 40,000 square foot underground facility from reinforced concrete, along with a 100,000 square foot office building. After Kung’s oil company collapsed, the facility stood vacant for years before it was converted to data center use in 2004.
That history ties into some of the design advantages for Westland, which include geothermal cooling and added protection. As with the initial data center, half of the new facility will be above ground, with the other half built into the side of the surrounding terrain.
12:30p |
Virtualization Security: The Goldilocks Zone
Tom Corn is VP of Security Strategy at VMware.
Throughout the history of IT, security has always been both important and challenging, but never more so than now. The worlds of cloud, mobile and social rely on a trusted digital world. And yet it appears the very promise of that trust is at risk. We are stuck in an escalating arms race, where every step forward yields two steps back.
This does not appear to be an issue of investment, innovation or priorities. Investments in security research and startups are at a record high. Security has been a board-level issue for a number of years. And enterprises are spending more on security than ever before. Growth in security spend outpaces growth in overall IT spend. The only thing outpacing security spend is security losses.
This is, at its core, an architectural issue, one that may be solvable through the technology at the very center of IT transformation: virtualization.
Security: A Set of Tradeoffs
When it comes to instrumenting IT infrastructure with security controls, we’ve had two main choices: network-based or host-based. But these choices force us to make a tradeoff between isolation and context.
If we place controls in the network, we’re in a separate trust domain, so we have isolation. The problem is we lack context. We see ports and protocols instead of applications. We see IP and MAC addresses instead of users. These physical identifiers were never good proxies for their logical counterparts to begin with, but in modern IT architectures such as cloud, where workloads are mobile and transient, they’re even worse. The development of next-generation firewalls was driven by this very issue.
If we place controls on the host, we get wonderful context about the application, processes, files and users. But we lack any meaningful isolation. We are placing security controls right in the middle of the attack zone. If the endpoint is compromised, so is the control.
And in both cases we lack ubiquity. That is, we lack a horizontal enforcement layer that places controls… everywhere. Endpoint controls provide little network visibility. Network controls provide little endpoint visibility, and cost and operational constraints stop us from deploying throughout the infrastructure.
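To make the context problem concrete, here is a minimal illustrative sketch, a toy model rather than VMware’s implementation, with all names hypothetical. It contrasts a network-style rule keyed on physical identifiers with a context-aware rule keyed on logical attributes:

```python
# Illustrative toy model only: physical identifiers vs. logical context.
# All field and rule names are hypothetical.

def match_network_rule(flow, rule):
    # Keyed on IP and port: breaks as soon as a mobile workload gets a new address.
    return flow["src_ip"] == rule["src_ip"] and flow["dst_port"] == rule["dst_port"]

def match_context_rule(flow, rule):
    # Keyed on user and application: follows the workload wherever it runs.
    return flow["user"] in rule["allowed_users"] and flow["app"] == rule["app"]

flow = {"src_ip": "10.0.0.5", "dst_port": 443, "user": "alice", "app": "payroll-web"}

network_rule = {"src_ip": "10.0.0.5", "dst_port": 443}
context_rule = {"allowed_users": {"alice", "bob"}, "app": "payroll-web"}

print(match_network_rule(flow, network_rule))  # True today, False after the VM moves
print(match_context_rule(flow, context_rule))  # True regardless of IP or port
```

The network rule stops matching the moment the workload migrates and its address changes; the context rule keeps matching because it is bound to the user and the application, not to where they happen to live.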
Enter the Goldilocks Zone
The term “Goldilocks Zone” was first coined by NASA researchers in the 1970s to describe a planetary location that exhibits characteristics that must be simultaneously present for a planet to support life. At VMware, we borrowed the term to describe the location for security controls that simultaneously provides context and isolation – key characteristics required to create a secure information infrastructure.
12:35p |
Big Data News: MapR Partners With Databricks
MapR adds Apache Spark technology to its distribution, TIBCO advances its ActiveMatrix BusinessWorks integration solution, and Cloudera adds resources to its Cloudera Connect Partner Program to provide deeper levels of engagement.
MapR adds Spark. MapR Technologies announced a strategic partnership with Databricks and the addition of the complete Apache Spark technology stack to the MapR Distribution. The Spark in-memory processing framework provides speed, programming ease and real-time processing advantages. With many organizations running Spark in production MapR environments, Spark-based applications benefit from the enterprise-grade dependability and performance of the MapR Distribution, and from the ability to process real-time operational data via MapR’s Direct Access NFS interface. With this, MapR customers can obtain 24×7 support for all projects in the Spark stack. In addition, MapR and Databricks are joining forces to drive the roadmap and accelerate innovation on these projects. This will benefit MapR customers and the broader Hadoop community over the coming years, starting with the upcoming release of Apache Spark 1.0. “It has become clear that Apache Spark offers a combination of high-performance, in-memory data processing and multiple computation models that is well suited to serving as the basis of next-generation data processing platforms,” commented Matt Aslett, research director, data platforms and analytics, 451 Research. “MapR’s support for the complete Spark stack, combined with its partnership with Databricks, should give Hadoop users the confidence to start developing applications to take advantage of Spark’s performance and flexibility.”
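For a rough flavor of Spark’s in-memory model, here is a generic PySpark sketch, not MapR-specific; the NFS-style file path is a hypothetical example of what a Direct Access NFS mount might look like:

```python
# Minimal PySpark sketch: count words, then cache the result in memory so
# repeated queries skip re-reading the source data. Path is hypothetical.
from pyspark import SparkContext

sc = SparkContext("local[2]", "WordCountSketch")
lines = sc.textFile("/mapr/my.cluster.com/data/events.log")  # assumed NFS-style path
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))
counts.cache()          # keep the RDD in memory across actions
print(counts.take(10))  # first action materializes and caches the data
print(counts.count())   # second action reuses the in-memory copy
sc.stop()
```

The cache() call is what distinguishes Spark’s approach from classic disk-bound MapReduce: subsequent actions run against memory rather than re-scanning storage.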
TIBCO advances ActiveMatrix BusinessWorks 6.0. TIBCO Software (TIBX) announced TIBCO ActiveMatrix BusinessWorks 6.0, the next generation of TIBCO’s flagship product and leading integration solution. The new release features a productive design environment that enables organizations to act in real time by making data available, reliable, and actionable. It provides a single platform to transform big data into fast data, and uses a model-driven development approach that lets users develop, debug, configure and deploy integration processes without writing any code. “More than ever, the foundation for business initiatives’ success is the ability to integrate applications together and fuel them with fast data—the ability to influence what’s about to happen, not what’s already happened,” said Matt Quinn, chief technology officer, TIBCO. “Integration is no longer just a matter of simple ‘plumbing’ – organizations need to be agile. ActiveMatrix BusinessWorks 6.0 dramatically increases the speed with which new solutions can be configured and deployed, enabling much quicker time-to-results.”
Cloudera enhances Connect Partner Program. Cloudera announced that it has added resources, training, certification and communications to its Cloudera Connect Partner Program. The program affords a wide variety of opportunities to partner with Cloudera and deliver integrated Big Data solutions to the market. The Cloudera technology certification program allows partners to work directly with Cloudera to integrate, test and certify their products against Cloudera Enterprise. By expanding resources available in the partner portal, partners and Cloudera will improve direct communications and have access to a single location for comprehensive sales content. Online access to Cloudera University, certifications and requirements, and more training resources will significantly enhance partners’ ability to support Cloudera customers’ data management objectives. “In a world where data continues to grow, becomes more valuable but increasingly unmanageable, our partners are looking for ways to make their products and services big data relevant,” said Tim Stevens, vice president, Business and Corporate Development, Cloudera. “Many do not, however, have the expertise, resources or brand recognition needed to achieve this goal. Partners that join our 900+ ecosystem are able to take advantage of deep collaboration, certification and association with the world’s leading enterprise analytic data management provider. This allows them to rise above the crowd and gain an audience with leading enterprise customers.”
1:00p |
Red Hat Summit Focuses on Cloud Integration
Celebrating its 10th annual Red Hat Summit in San Francisco this week, Red Hat (RHT) put the focus on developers, with announcements covering cloud, hybrid and Internet of Things integration via JBoss Fuse 6.1 and JBoss A-MQ 6.1, delivery on the promise of modern middleware, and a success story with IBM in helping Casio implement a virtualized storage infrastructure. The event conversation can be followed on the Twitter hashtag #rhsummit.
Offering a boost to its standards-based integration and messaging products, Red Hat announced the availability of JBoss Fuse 6.1 and JBoss A-MQ 6.1 to extend and simplify integration across all facets of the enterprise. The technologies ease the development and maintenance of integration solutions and provide a vast array of connectivity options. JBoss Fuse 6.1 and JBoss A-MQ 6.1 feature full support for AMQP 1.0 (Advanced Message Queuing Protocol), a vast library of connectors, the ability to manage integration processes, and improved high availability.
As part of its vision to establish a unified platform that encompasses the development and management of end-to-end applications that span the web, process orchestration and integration, Red Hat released JBoss Fuse 6.1 on OpenShift as a developer preview. The preview allows users to explore the messaging and integration capabilities of Apache ActiveMQ and Apache Camel running in a Platform-as-a-Service (PaaS) environment. Users can run integration in the cloud, allowing for message queueing, transformation, and routing, which are many of the same capabilities provided by integration PaaS (iPaaS) offerings.
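For a flavor of what AMQP 1.0 messaging looks like against a broker such as JBoss A-MQ, here is a minimal sketch using the Apache Qpid Proton Python bindings; the broker address and queue name are assumptions for illustration, so consult the A-MQ documentation for actual connection details:

```python
# Minimal AMQP 1.0 send/receive sketch with Apache Qpid Proton (Python).
# Broker endpoint and queue name below are assumed, not A-MQ defaults.
from proton import Message
from proton.utils import BlockingConnection

conn = BlockingConnection("localhost:5672")     # assumed broker endpoint
sender = conn.create_sender("example.queue")    # hypothetical queue name
sender.send(Message(body=u"hello from AMQP 1.0"))

receiver = conn.create_receiver("example.queue")
msg = receiver.receive(timeout=5)               # block until a message arrives
receiver.accept()                               # settle the delivery
print(msg.body)
conn.close()
```

Because AMQP 1.0 is a wire-level standard, the same client code should interoperate with any conforming broker, which is the point of Red Hat's full-support claim.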
“Integration can be a daunting task for organizations with IT assets spread across on-premise, hybrid, or cloud-based environments,” said Mike Piech, general manager, Middleware, Red Hat. “Our goal is to simplify integration and provide consistent, reliable, and highly interoperable connections to all facets of the enterprise, and in doing so, unlock the value of existing assets and enable faster innovation.”
Middleware meets cloud
After promising Red Hat JBoss xPaaS services for OpenShift last fall, Red Hat is ready to deliver the abstractions and capabilities of traditional middleware in a more immediate, unified, and powerful—yet simple—paradigm enabled by the cloud. It demonstrates this with integration, business process management and mobile services through four xPaaS services: Application Container Services (aPaaS), Integration Services (iPaaS), Business Process Services (bpmPaaS) and Mobile Services (mPaaS), the last based on the AeroGear project. Red Hat will now focus on hardening the integration and business process technologies for production use and then begin to bring the services together into a unified experience.
Red Hat and IBM help Casio with virtualized storage infrastructure
Red Hat and IBM announced that Casio Computer Company has achieved significant gains in IT efficiency, performance and cost savings with a solution comprising Red Hat Storage, Red Hat Enterprise Virtualization and IBM System x servers. By combining virtualized industry-standard servers with open software-defined storage, Casio now has a highly available and agile storage solution that can accommodate heavy data workloads by scaling to petabytes. IBM System x servers with Red Hat solutions provide the backbone for the virtualized environment, offering reliability and support for new digital capabilities focused on increasing sales and visibility of Casio’s clock, digital camera and tablet products.
Within two years, Casio slashed its storage costs in half by pooling the internal storage disks of multiple servers into one large storage pool accessible through the virtualized servers. The Red Hat Storage management console also provides a simplified and unified way to manage both the storage and virtualized server environments for optimized performance.
“By deploying Red Hat Storage with Red Hat Virtualization running on IBM x series servers, we were finally able to build a storage environment at a low cost while using commodity servers,” said Kazuyasu Yamazaki, group manager of the IT Infrastructure Group at Casio Information Service. “Our costs, including various procurement costs and operating fees, fell to less than half of what we had been spending before implementing Red Hat Storage. And, our IT usage was by no means optimal or efficient. We were locked in by vendors’ proprietary storage hardware technologies, so we couldn’t manipulate the system ourselves.”
1:44p |
Servers as Radiators: Can a Data Center Heat Stockholm?
Sweden’s Bahnhof is the master of offbeat data center design projects. Sexy sci-fi is its design aesthetic, with most of its data center designs looking like they came straight out of a movie from the future. There’s the “James Bond Villain” data center, and modular data centers that look like space stations.
One of the company’s most ambitious projects has taken a new direction. Bahnhof is searching for a new home for a unique data center that could also warm homes in downtown Stockholm, CEO Jon Karlung said this week.
The company’s original plan was to convert a huge former natural gas holding tank in Stockholm into a five-story data center. The gasometer is a cylindrical building erected in 1893, constructed of red bricks and topped by a spectacular wood and steel ceiling structure, which Bahnhof says contributes to the “sacral character of the space.” The project has been delayed by more mundane considerations.
“Unfortunately the politicians took a while before they could decide,” said Karlung. “It’s not scrapped, but in the meantime we’ve focused on another thing.”
Servers That Heat Urban Buildings
Due to the political delays facing the gasometer project, the company is now negotiating with the city for a different site. The idea behind the new project is to sell the excess heat from the data center to the local utility, which can use it to heat the surrounding homes.
“It’s in a central area in Stockholm,” said Karlung. “The solution we’re building today, the data center, is a big radiator, and if you live in a cold climate this makes a lot of sense. We pump the heat back and sell it to the utility company.”
“The financial and business model is quite valid, but I think it will not be at the gas meter at this moment,” said Karlung. “It’s locked in the political (process). In the meantime, you have to find another idea. The model for reheating is purely financial. If you get half the money back, it will be very hard for others to compete. If you have a city that’s cold in the winter, you have to heat it up. It’s strange it hasn’t been done.”
Similar Projects
In fact, the concept of using excess heat to warm nearby buildings has been discussed within the industry, and there are a handful of implementations. TELUS is a prime example, tapping waste heat from its data center in Vancouver to power the heating and cooling systems of its adjacent $750 million mixed-use Telus Garden development. In London, Telehouse began using excess heat from a Docklands data center to heat nearby homes and businesses in 2009. IBM has a data center in Switzerland that warms a nearby community swimming pool.
An unusual concept was also put forth by researchers from Microsoft and the University of Virginia in a paper published in 2011. It suggested that large cloud infrastructures could be distributed across offices and homes, which would use exhaust heat from cabinets of servers to supplement (or even replace) their on-site heating systems.
The primary challenge for Bahnhof is finding a location for the project within a densely populated urban area. “The reason why this has not been widespread is that very seldom do you have a district heating system built,” said Karlung. “You need a city, a densely populated area and you need a district heating system. Currently in Stockholm, heat is being generated by burning coal.”
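Some back-of-envelope arithmetic shows why the economics can be attractive. The figures below are illustrative assumptions, not Bahnhof’s numbers: nearly all IT power ends up as heat, some fraction of it is recoverable, and a typical home’s annual heating demand is on the order of 20,000 kWh:

```python
# Back-of-envelope: how many homes could one megawatt of IT load heat?
# All figures are illustrative assumptions, not Bahnhof's numbers.
it_load_kw = 1000            # assumed data center IT load (1 MW)
hours_per_year = 8760
capture_efficiency = 0.8     # assumed fraction of heat recoverable for reuse

heat_kwh_per_year = it_load_kw * hours_per_year * capture_efficiency
home_demand_kwh = 20000      # assumed annual heating demand per home

print(f"Recoverable heat: {heat_kwh_per_year / 1e6:.1f} GWh/year")   # 7.0 GWh/year
print(f"Homes heated: ~{heat_kwh_per_year / home_demand_kwh:.0f}")   # ~350 homes
```

Under these assumptions a single megawatt of IT load could heat a few hundred homes, which is why Karlung argues that a colocation provider recouping even half its energy cost would be hard to compete with.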
Modular Project Is Live
While one project has been put on hiatus, Bahnhof has successfully launched its modular “space station” data center, named Lajka. “The modular project went live a couple of weeks ago,” said Karlung. “It’s a containerized solution, but it’s not containers. It’s a high security facility in a modular installation.”
The design features a spacious double-wide module built with bullet-proof steel that will house servers, which attaches to “The Dome,” an inflatable central vestibule that houses security staff.
“For the moment, we will use it ourselves,” said Karlung. “But there are blueprints for it, and it’s possible to buy. The product line isn’t quite there yet but it’s possible to make if you want to make it. I assume if you’re a military organization, it would make sense. You can drive by with a truck and put it into place quickly.”
The company is known for the “James Bond Villain” data center, a high-tech server farm built in a former nuclear bunker beneath Stockholm. The subterranean lair is outfitted with waterfalls, a greenhouse-style NOC, and a glass-enclosed conference room “floating” above the colocation floor. The facility reflects Karlung’s belief that data centers shouldn’t just be cool – they should LOOK cool, too.
The “floating room” at the Pionen underground data center, a glass-enclosed conference room raised above the data hall within the subterranean facility. (Photo: Bahnhof)
2:27p |
Chef Announces Sales and Ecosystem Growth at #ChefConf
Automation is an essential enabling factor of the digital world. Chef, a leading provider in the mega-scale IT automation sector, is convening with its community in San Francisco this week to celebrate the expansion of the fast-growing ecosystem building around automation. (You can follow the conference at the #ChefConf Twitter hashtag.)
On the business side, Chef announced healthy sales results, with 2013 total sales up 188 percent year-over-year. Event attendance reflects the company’s top-tier customer portfolio, with keynotes from GE Capital, Nordstrom, Target, and Yahoo, and nearly 70 percent of Chef’s total sales coming from Fortune 1000 companies.
“Many of the world’s leading companies and technology providers are betting on Chef as the de facto standard for web-scale automation in today’s IT-powered economy,” said Barry Crist, chief executive officer, Chef. “Our accelerating sales and community growth are built on a strong foundation of customer and partner support, enabling us to deliver speed and scale for forward-leaning enterprises and web innovators alike.”
The Chef Community includes tens of thousands of registered users, with millions of Chef downloads, and thousands of contributors.
Ecosystem Expands, Platform Gets New Features
Chef announced an expanded ecosystem of companies including Amazon Web Services, Docker, Google, HP, IBM, Juniper Networks, Microsoft, Rackspace, VMware, and others, who are collaborating to help enterprises accelerate software delivery.
Chef is extending its open source and commercial automation platforms, adding new features to accelerate software delivery and simplify infrastructure management. Recent platform enhancements include:
- Download and Management: With the latest version of Chef’s commercial platform, a simple download installs and starts the Chef server, making it quicker than ever to access and make the most of Chef.
- Training: Chef’s always-growing #learnchef library features extensive resources for getting productive with Chef. Chef also offers a comprehensive suite of online training videos that guide novices and experts alike through all the operational skills needed to stir up delight with Chef.
- Chef DK: Chef DK – Developer Kit – consolidates open source components into an easily installed package that provides a best practice tool chain for building Chef workflows.
- Chef actions: Chef actions is a new feature of Enterprise Chef that provides users with visibility into all activity on the Chef server. Chef actions delivers notifications on who is changing what on the Chef server and allows administrators to track cookbook usage, roles, environments, and changes to infrastructure, all through a dashboard.
- Chef metal: Chef metal delivers policy-based provisioning that allows you to automate entire clusters of machines with the same approach you use to configure a single node; a toy sketch of this convergence model appears after this list. Chef metal automates the provisioning of infrastructure at any scale.
- Supermarket: Chef has open sourced its community site to provide the entire Chef Community with the means to build its own open source community resource. By opening all of Supermarket to the Community, any organization can leverage this collection of code and best practices in order to create its own community resource.
- Networking Automation: Chef recently released full integration with Juniper Junos OS for streamlining the configuration of networking infrastructure. In addition, Cisco’s 3000 and 9000 series switches feature full Chef integration for automating both network port and server configuration from the same platform. Chef also integrates with Arista and Plexxi’s networking platforms.
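The common thread in these features is Chef’s declarative, convergence-based model: you describe desired state, and the tool acts only when reality differs. Here is a toy Python analogue of an idempotent resource, purely illustrative and not Chef’s actual Ruby DSL:

```python
# Toy illustration of idempotent, policy-based configuration -- the model
# behind Chef resources and Chef metal. Not Chef's actual Ruby DSL.
import os

def file_resource(path, desired_content):
    """Converge a file toward desired content; act only if state differs."""
    if os.path.exists(path):
        with open(path) as f:
            if f.read() == desired_content:
                return "up to date"      # already converged: take no action
    with open(path, "w") as f:
        f.write(desired_content)
    return "converged"

# Applying the same policy twice is safe; the second run changes nothing.
print(file_resource("/tmp/motd", "welcome\n"))   # "converged"
print(file_resource("/tmp/motd", "welcome\n"))   # "up to date"
```

Scaling this idea from one file to whole machines, and with Chef metal to whole clusters, is what makes the same policy safe to apply repeatedly across an entire fleet.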
3:10p |
Top 5 Data Center and IT Lessons Learned from the Cloud Giants
Do you want to “stand on the shoulders of giants”? One way of doing this is to use others’ IT lessons learned. In this whitepaper from Forrester Research, Amazon, Salesforce.com, Microsoft and Rackspace contribute what they’ve learned while “living in the cloud.” The economics and efficiencies of the world’s largest cloud and hosting service providers can provide cloud lessons for all, whether you support a large or small-scale enterprise.
In this Nlyte-commissioned whitepaper, Forrester interviewed many of the leading web, cloud, and hosting providers and uncovered five key lessons enterprise IT can apply within its own environments.
Let’s face it: the cloud model is continuously evolving. One of the best ways to adapt to an ever-changing infrastructure is to learn what other leaders (and cloud giants) have done right. In this report, Forrester discusses several factors that impact modern cloud computing platforms and how cloud organizations learned to adapt. These concepts include:
- While Cloud And Hosting Service Providers Operate At Scale, There Are Differences
- Size Matters: Large Scale Provides Economic Leverage . . .
- . . . But The Benefits You Get At Size Aren’t As Unattainable As You Might Think
- How Do They Do It? Standardization, Optimization, And Automation
- Hardware Standardization And Simplification Sets The Stage For Operations At Scale
- Automation Is The Key Enabler To Productivity And Efficiency
- Aggressive Management Of Power And Cooling Is Universal
The proliferation of IT consumerization, new types of workloads and an explosion of data has placed new challenges on the modern data center. These cloud giants have battled outages, growing pains, legacy systems, and much more. Now, they’re here to deliver five very valuable lessons. Here’s a taste:
- Lesson No. 1: Bring IT Process Automation To The Facilities Level
- Lesson No. 2: Prioritize Speed-To-Market And Standardization Over Customization
- Lesson No. 3: Automate Basic IT Infrastructure Processes
- Lesson No. 4: Shift From Infrastructure Management To Infrastructure Service Delivery
- Lesson No. 5: Break Down Organizational Silos
As you embark on your greater cloud journey, remember there will always be a push to try new concepts and platforms. As for those parts of your portfolio where you suspect you will never be able to achieve operational efficiencies like those of the web giants, consider the following:
- If your data center can’t be made efficient, move to one that is.
- Go hybrid and embrace a mix of internal and external services.
Download this white paper today to learn where the cloud giants succeeded and how they learned to adapt to an ever-changing business environment.
5:21p |
DCK Survey: What’s Your Cooling Profile?
Cooling — ever a challenge in any IT environment — has become even more important as the data hall becomes more and more crowded with high-density servers and equipment.
Data Center Knowledge readers are on the front lines of the data center industry, and we’d like to gather your thoughts on cooling. What are you currently using to address your cooling needs? Are you dealing with legacy equipment? Do you plan to use more cutting-edge solutions in the near term or the longer term? Do you have some cooling equipment in one area and different equipment in another? Take the DCK Cooling Survey today!
We’d appreciate it if you would take a few minutes to complete our survey. We will compile and analyze the results and share them with you, our readers. (Please note: this survey is anonymous; no personally identifiable information will be sold, rented, or given away to outside parties.) Thank you! We appreciate your time.
About Data Center Knowledge
Data Center Knowledge (DCK) is a leading online source of daily news and analysis about the data center industry. We cover the latest developments and trends driving the powerful growth in demand for mission-critical facilities, the challenges and opportunities presented by high-density computing and its impact on power and cooling, and the evolution of the industry to include cloud computing and modular data centers. Our audience includes IT and operations professionals who build and manage data centers.
Stay updated! Follow Rich Miller, founder and editor, on Twitter @datacenter, where he provides real-time updates on the latest data center news. Follow Colleen Miller @cmiller237 on Twitter for DCK and other industry content. Or join your colleagues on DCK’s Facebook page for another way to stay updated on industry developments. Our LinkedIn Group provides a way to discuss the news and trends of the day. Either “Like” us on Facebook or ask for an invite to the LinkedIn Group. Google+ users can find us on our Data Center Knowledge Google+ Page. We’re on Pinterest, too.