Data Center Knowledge | News and analysis for the data center industry
 

Friday, January 11th, 2013

    Time Event
    2:00p
    Data Center Jobs: McKinstry

    At the Data Center Jobs Board, we have a new job listing from McKinstry, which is seeking a Critical Facility Engineer in Reno, Nevada.

    The Critical Facility Engineer is responsible for maintaining a positive and professional working relationship with internal and external clients, responding to customer service requests in a timely manner, responding to emergency calls, performing routine maintenance tasks in accordance with McKinstry Safety Policy and Procedures, inspecting buildings, grounds and equipment for unsafe or malfunctioning conditions, troubleshooting, evaluating and recommending system upgrades, ordering parts and supplies for maintenance and repairs, and soliciting proposals for outsourced work. To view full details and apply, see job listing details.

    Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.

    2:14p
    PCE Acquires Cooling Specialist Opengate Data Systems

    Plastic Companies Enterprises (PCE) continues to expand its presence in data center management with its acquisition of Opengate Data Systems. Opengate is a Massachusetts company providing data center infrastructure solutions. It develops intelligent rack- and row-based heat containment systems that minimize power consumption while maintaining an optimal temperature for the entire data center.

    In a case of “try before you buy,” Geist, a PCE company, had partnered with Opengate Data since 2007 to provide intelligent cooling solutions to the data center market. Opengate contracted Geist to develop and integrate web-based communication functions into a variety of Opengate solutions. The companies formed an alliance in 2011 in which Geist offered and promoted Opengate’s data center airflow management solutions to its customer base, and made a financial investment in Opengate. So this is a relationship long in the making.

    “We’ve seen the need for efficient cooling solutions continue to grow as the world’s demand for computing power increases,” said Sam Featherston, CEO of PCE. “This acquisition puts PCE in a position to capitalize on this demand with Opengate’s innovative cooling technology and solutions.”

    Under the umbrella of PCE, Opengate gains on several fronts. “This acquisition gives Opengate access to additional engineering and manufacturing resources that will allow us to meet the evolving cooling needs of the data center industry,” said Mark Germagian, President and Founder of Opengate.

    Focus on ‘Smart Containment’

    Opengate Data was founded in May 2007 by Germagian, a veteran cooling specialist who previously worked with APC and Wright Line. Opengate helps customers reduce their total cost of ownership by addressing three areas that impact data center profitability: energy consumption, real estate utilization and operational overhead.

    Opengate has focused on cabinet-level containment systems that effectively eliminate the hot aisle by removing waste heat from servers through a chimney system that brings the air directly into an overhead plenum (air chamber) that returns the hot air to the air handlers or CRACs (computer room air conditioners).

    Plastic Companies Enterprises, Inc. (PCE) has been in business for more than 18 years. The company is made up of two distinct groups—Data Center Solutions and Plastics Solutions. The Data Center Solutions Group includes Geist Manufacturing, Geist Intelligent Facilities, IT Watchdogs and now Opengate Data Systems. In 2010, the company acquired 50 percent of Data and Power Solutions (DPS) in the UK.  DPS is a manufacturer of data center power distribution and environmental monitoring solutions for Europe and other global markets.

    2:49p
    Extending the Data Center Into the Cloud


    At the heart of the cloud computing movement sits the data center infrastructure. The data center has long been the workhorse of the cloud world, providing bandwidth, connectivity, and of course resources to help the environment run continuously (or as continuously as possible).

    During the early days of cloud computing, there was a disconnect between cloud resources and the internal corporate environment. To access workloads or resources in the cloud, administrators had to jump through a few hoops to connect their environment with external components. And beyond an internal data center hosting some cloud options, there are other SaaS, PaaS and even IaaS environments out there that require connectivity.

    So the challenge was very clear: how does an administrator effectively scale their data center to directly connect with other components in the cloud?

    Although there are a few ways to accomplish this goal – software-defined technologies, virtualization, and advanced networking – identity federation has become an integral part of connecting a data center to the cloud. A user sitting in a corporate data center may at any point require access to the outside world or an externally hosted application.

    One of the great things about cloud computing is the flexibility that it brings to both the user and the administrator. This is where identity federation can help. To get a clear idea of the nature of cloud authentication and how it interacts with a corporate data center, we can look at federated identity in the following three ways:

    Connecting The User

    The user is probably the most important part of any environment. Considerations around the end-user can mean the difference between a successful deployment and one that has to go back to the drawing board. Organizations will often deploy applications or resources that are completely cloud-hosted, and in many of those situations the user will have to re-authenticate from their internal environment into the cloud resources.

    This is where identity federation can make the experience much more transparent for a user accessing a local data center. By publishing a special portal – sometimes referred to as a cloud gateway – administrators give the user a single view of their set of applications, which can include Word, Outlook, Excel, and other local resources. In the very same portal, administrators can now present external applications like Salesforce, LinkedIn, Ceridian, and others as well. To the user, they look as though they’re installed locally. Furthermore, the user only needs to authenticate once with their Active Directory credentials.

    From there, identity federation takes over and allows the user to connect to both internally hosted applications, and those hosted in the cloud. What about password changes and some other administrative functions for those cloud-hosted apps? No problem. Modern identity federation technologies work with a self-service portal that allows users to change their password as needed and even add/remove icons from their portals. Not only does this help the user connect to the cloud via an internal data center – it also eases the management process for data center administrators.

    Connecting The Applications

    One of the main goals in creating a good identity federation mechanism for the end-user is to make the experience as transparent as possible. One of the other goals is to simplify administration.

    Applications hosted outside of the internal data center may need to be connected. In the previous section, we discussed how this experience would look to the end user. Now, we have to look at how it’s accomplished. Administrators will set up an internal server or service that interfaces with external applications. From there, many identity federation solutions actually ship with pre-built connectors for popular cloud-facing applications. These connectors know the landing pages, can identify the authentication methods, and know how to tie them back into the internal environment.

    From there, administrators are able to set password reset metrics, control how users access external applications, and create a local data center capable of expanding into the cloud. The flexibility of using identity federation technologies to connect to the cloud doesn’t only cover “pre-approved” application lists. For custom apps, or even those not on the list, administrators are able to create custom HTTP or SAML connectors for the resource in need. This type of deployment creates real flexibility for applications that were once thought to be cloud-challenging.
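    At its core, a custom SAML connector has to produce a standards-compliant authentication request that the external application's identity endpoint can consume. The sketch below builds a minimal SAML 2.0 AuthnRequest and packs it for the HTTP-Redirect binding (deflate, then base64, then URL-encode); the entity IDs and endpoint URLs are placeholders for illustration, not any real deployment's values:

```python
import base64
import uuid
import zlib
from datetime import datetime, timezone
from urllib.parse import urlencode
from xml.etree import ElementTree as ET

SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

def build_authn_request(sp_entity_id: str, idp_sso_url: str, acs_url: str) -> str:
    """Build the redirect URL carrying a deflated, base64-encoded
    SAML 2.0 AuthnRequest -- the message a custom connector sends."""
    ET.register_namespace("samlp", SAMLP)
    ET.register_namespace("saml", SAML)
    req = ET.Element(f"{{{SAMLP}}}AuthnRequest", {
        "ID": f"_{uuid.uuid4().hex}",
        "Version": "2.0",
        "IssueInstant": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "Destination": idp_sso_url,
        "AssertionConsumerServiceURL": acs_url,
    })
    issuer = ET.SubElement(req, f"{{{SAML}}}Issuer")
    issuer.text = sp_entity_id
    xml = ET.tostring(req)
    # HTTP-Redirect binding: strip the zlib header/trailer to get raw
    # deflate, then base64-encode and URL-encode the result.
    deflated = zlib.compress(xml)[2:-4]
    return idp_sso_url + "?" + urlencode(
        {"SAMLRequest": base64.b64encode(deflated).decode()})

# Placeholder endpoints for illustration only.
url = build_authn_request(
    "https://apps.example.com/metadata",
    "https://idp.example.com/sso",
    "https://apps.example.com/acs",
)
print(url[:60])
```

    Production connectors additionally sign the request and validate the returned assertion, but the encode-and-redirect shape shown here is the part that is the same for every SAML-speaking application.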

    Connecting The Data Center

    As one of the final, yet very important, pieces in the connection process, connecting the data center goes a bit beyond just identity federation. In conjunction with software-defined technologies, connecting a data center with the cloud becomes much easier.

    From an authentication perspective, identity federation acts as the engine that facilitates user access to both internal, data center-hosted applications and external ones. Now, administrators are able to logically connect internal data centers with cloud-based environments for an even greater extension of their infrastructure. At this point, not only is the internal environment connected to cloud-facing data center resources, but users can seamlessly authenticate into the environment as well. A secure link can be created with a data center that hosts a specific set of applications. Beyond that, the identity federation solution can handle the authentication for users accessing those resources. And so, not only is there a private link going to a cloud-based data center, users can still use their AD credentials to access resources across the platform.

    Extending a data center into the cloud doesn’t have to be a complicated task. In fact, advanced security and connectivity methodologies make the process much safer now than it was before. Software-defined technologies and cloud APIs continue to make the modern data center more agile and capable of scaling with the needs of the organization. More companies are leveraging cloud-based resources and require a way to connect into that type of environment. This is where using solutions like identity federation can help connect local data center environments to the cloud.

    6:25p
    Friday Funny: Hitting the Links

    It’s finally Friday! And that means it’s time for our “before the weekend” Friday caption contest, with cartoons drawn by Diane Alber, our fav data center cartoonist! Please visit Diane’s website Kip and Gary for more of her data center humor.

    This time, Kip and Gary are hitting the links! Diane writes: “I know a lot of people in this industry that love to golf. So I thought, wouldn’t it be awesome if you could be golfing while you’re working? You could, if your modular data center was located right on the golf course!”

    Click to enlarge graphic.


    The caption contest works like this: We provide the cartoon and you, our readers, submit the captions. We then choose finalists and the readers vote for their favorite funniest suggestion. Scroll down and add your suggestion in the comments below.

    Hearty congratulations are extended to German Pacio for the winning caption — “Our PUE decreased to 1.05, Let’s celebrate !!!” — for “New Year’s Celebration.”

    For the previous cartoons on DCK, see our Humor Channel.

    7:11p
    GSA Hits the ‘Like’ Button on Open Compute

    Could these racks of Open Compute Project servers soon live in government data centers? Hyve Solutions, which made these servers (shown from the rear with the fans visible), has earned GSA Schedule 70 approval. (Photo: Hyve/Synnex Corp.)

    It just got easier for government agencies to use Facebook’s data center designs if they choose. Hyve Solutions, a division of Synnex which makes servers and storage solutions based on Open Compute designs, has qualified as a government vendor under General Services Administration (GSA) Schedule 70.

    Schedule 70 puts IT providers on an “approved list,” making it easier for these companies to win government contracts. Hyve’s Open Compute Project (OCP) suite of data center solutions is now available through the schedule, with Hyve managing OCP deployments for federal entities from concept to implementation. In other words, Hyve is basically opening the door for government agencies to use Facebook’s data center designs.

    The Open Compute Project started when Facebook set out to develop an innovative data center and server solution that was both energy and cost efficient. Facebook decided to share its designs in an effort to promote and encourage power efficiency and future innovation. The Open Compute Project (OCP) was born from this desire and officially launched in April 2011. Facebook’s Prineville, Oregon data center was the first OCP data center built from the ground up. Here’s a look at Facebook’s Open Compute Servers.

    Hyve was the first official OCP reseller. OCP servers are designed to be efficient, inexpensive, and easy to maintain. OCP data centers were built with the goal of maximizing mechanical performance and having efficient thermal and electrical properties. Here’s some more detail on the company’s website.

    “Hyve Solutions’ goal has been to use our design expertise, superior integration services, and logistics infrastructure roots to make OCP solutions available to a wider user base,” said Steve Ichinaga, Senior Vice President and General Manager of Hyve Solutions. “Federal civilian and military data center environments can join the ranks of the more efficient data centers in the world now that Hyve Solutions’ OCP suite of data center solutions has been added to GSA Schedule 70.”


    Hyve, a division of SYNNEX Corporation, is a leading player in providing customers with purpose-built data-center servers. Here’s the Hyve booth at last year’s Open Compute Summit in San Antonio (Photo: Colleen Miller)

    9:09p
    COPT Building New Data Center in Ashburn

    The exterior of the COPT 6 data center in Manassas, Virginia. COPT has bought land for a new data center in nearby Ashburn. (Photo: Rich Miller)

    The cloud-building continues in the busy northern Virginia market. Corporate Office Properties Trust (COPT) is developing a new data center on a 34-acre site in Ashburn, and reportedly has lined up Amazon as its anchor tenant. This is the latest in a series of new data center projects for Ashburn’s “Data Center Alley,” which continues to see robust activity.

    COPT is a real estate investment trust that focuses on properties for the federal government, including specialized IT facilities for the defense sector. It already operates a large data center in Manassas in Prince William County, but appears to see an opportunity in joining the crowd in Ashburn in Loudoun County.

    A COPT affiliate acquired the 34-acre Ashburn site from St. John Properties Inc this December for $14 million. St. John acquired the land for future development but decided to sell when it was approached with a prospective offer. “It was just a further assemblage of land in a market that we firmly believe in,” St. John Regional Partner Matt Holbrook told the Washington Business Journal, which was the first to report the deal and named Amazon as the prospective tenant. “We are rarely ever sellers of anything, but the compelling purchase price made us decide to sell in this case.”

    COPT can’t comment because of SEC regulations, and St. John won’t comment on the buyer, but said the site was attractive because of its location, its data center potential, and its connection to a reclaimed water system from Loudoun Water, which makes it easier for data centers to provision large volumes of non-potable water for use in cooling towers.

    The project will continue the data center building boom in Loudoun County, which is already home to more than 5 million square feet of data center space, with several more projects under way or in the development pipeline. This region of northern Virginia is perhaps the most connected piece of Internet real estate in the United States, housing servers for Facebook, Amazon, Rackspace, Google, Microsoft and hundreds of other companies.

    Is Ashburn getting too crowded? Data center industry watcher Daniel Golding noted active projects by RagingWire, Sabey and Equinix and the prospect of future inventory from DuPont Fabros Technology.

    “I think Ashburn is, at the moment, well supplied for data center space,” Golding writes at DataCenter Insight. “That doesn’t mean there is a glut, however. There are several large requirements on the street and same-store business is strong for Equinix, DLR (Digital Realty), and DFT.”

    Check out our Northern Virginia Channel for news about this market.

