Data Center Knowledge | News and analysis for the data center industry
Wednesday, April 9th, 2014
11:30a
Venyu Expands in Louisiana With New Baton Rouge Facility

Data center service provider Venyu has opened a second facility in Louisiana, investing $14 million to develop a 23,000 square foot data center in Baton Rouge. Venyu says the new facility will feature its “disaster proven” data recovery services, a reference to the company’s experience supporting clients through hurricanes Katrina and Rita.
With its exposure to hurricanes, high temperatures and humidity, Louisiana is not an obvious home for data centers. But plenty of local businesses need mission-critical IT space, and need it within the state. That is where Venyu steps in, providing data center space designed to exceed current industry standards and shield customers from the surrounding environment.
“Venyu is a company with its roots in Louisiana, and we’ve greatly benefited from the state’s increasingly strong economic climate,” said Scott Thompson, CEO of Venyu. “We’re fully committed to the region and excited to further stimulate the economy with such an important facet of our business.”
Venyu, founded in 1989, has approximately 75 employees and offers managed hosting, cloud, virtualization, and data protection solutions. The company was acquired last year by regional telco provider EATEL. Venyu has carved out a niche in backup and recovery services, a largely local business: national service providers have shown little appetite for the market given the environmental risks.
“In today’s highly competitive market, industry demand for crucial data center services such as colocation, cloud hosting, managed hosting and cloud backup are on the rise – as businesses seek more cost-effective IT solutions to drive efficiencies and generate revenue,” said Thompson. “This expanded footprint opens new doors for customers with heightened levels of elasticity and on-demand IT capacity – while enabling us to drive future innovation and services.”
The new facility features biometric and ID card authentication at all entrances, 24-hour surveillance, and redundant HVAC and fire suppression. The HVAC equipment is placed on an indoor mezzanine level so the company can perform upgrades and maintenance with no disruption to customers.

12:00p
T5 Reports Leasing Success at Dallas Campus

T5 Data Centers says it has now leased three of the four data centers on its T5@Dallas campus, and has commenced the interior build-out of the 70,000 square-foot final building of the development. T5 has now leased 15 megawatts of wholesale space at the Dallas development, which has room for future expansion. The company lined up $113 million in financing last year to fund its buildout of the remainder of the first phase.
The T5@Dallas campus currently houses four independent data centers, each offering between 4.5 and 6 megawatts of power capacity. There is no shared infrastructure between the four data centers, so clients have complete control of their environment, with dedicated utility feeds and electrical and mechanical plants for each data center.
“Today’s high density data center users are searching for robust and secure environments within purpose built facilities,” said Martin Peck, General Manager of T5’s Dallas operation. “Our T5@Dallas data center is a ‘battleship’ of a facility and given our Plano Legacy location, we believe we have the best data center facility in the best location for business in Texas.”
T5@Dallas is a wholesale data center facility within a 315,000 square foot building on a 20-acre site located in Legacy Business Park in Plano. Texas 4 is the remaining data center, and will feature 40,000 square feet of raised floor space and 4.5 MW of critical power available, expandable to 6.0 MW. The entire T5@Dallas facility is LEED Silver certified and built to withstand winds up to 221 miles per hour. It has dual primary power feeds connected to two separate substations, and is concurrently maintainable (mechanically and electrically) and fault tolerant (electrically).
T5 now has seven data centers across the United States. Earlier this year it entered the greater New York market, teaming with Lincoln Rackhouse to develop a property in Briarcliff Manor, N.Y. for data center use. T5 also has facilities in Atlanta, Los Angeles and Charlotte, with new projects under development in Oregon and Colorado.

12:30p
Five Questions on Mobile Collaboration That Will Support Your Business Continuity

Ryan Kalember is chief product officer, WatchDox. With 14 years of experience in a variety of roles in the U.S. and Europe, Ryan has an extensive background in information security.
The rise of mobile should be welcome news to IT teams tasked with maintaining business continuity and employee productivity through comprehensive disaster recovery plans.
In the face of transit strikes, severe weather, server or application outages and other events that keep employees from working on site, business operations can continue as usual, thanks to employees’ widespread adoption of smartphones and tablets.
However, the mobile enterprise doesn’t come without risk. That risk is magnified when companies fail to deploy secure mobile collaboration options and online workspaces that provide business continuity and can operate as a “light” data backup solution.
When catastrophic events force employees to work from home or remote locations, they’re likely to pick up mobile devices to collaborate with colleagues seamlessly by accessing, editing and sharing documents. The mobile productivity of a distributed workforce is a plus for the enterprise, but the inherent risk lies in the security of those devices and the information accessed on them. Companies do not want to implement business continuity and backup solutions that introduce the possibility of lost or compromised data.
There are five questions every IT team should ask about the role of mobility in business continuity and data backup plans:
1. How are employees collaborating when they work from mobile devices?
Unfortunately, most IT and security executives acknowledge that shadow IT is rampant and that corporate data on mobile devices is essentially uncontrolled and unprotected. Unless firms provide a solution that is easy to use, workers will embrace generic consumer file-sharing services simply because they are more accessible. However, the very act of uploading a file to an insecure box means the organization immediately loses control of the company information contained in that file.
2. What happens to sensitive data after it’s shared?
The security risk extends beyond the virtual box. Once an employee sends a document from his tablet to someone via email or a file-sharing service, does the company have any control over what happens next? Once that file is downloaded or opened in another mobile app, for example, too many companies lose control over its use, and risk trading security for business continuity and productivity. And firms in regulated industries, such as government, finance and healthcare, require an audit trail of who interacts with each document and how. The business continuity solution must meet these compliance requirements.
3. Which mobile tools are employees using to annotate and edit files?
Historically, editing a document from a mobile device has not been an easy process. Employees will typically go through several time-consuming, frustrating and insecure steps or use many apps to simply annotate or edit a document from a smartphone or tablet. First, the mobile worker has to download a file (too often from a personal file-sharing account), use her tablet to open the file in a mobile editing application, edit it, figure out how to save the new version back to the file-sharing account and then disseminate it to her co-workers. This scenario creates problems related to security, syncing, sharing and overall productivity.
4. Is there a better way to maintain mobile productivity AND security at the same time?
Enterprise file-sync-and-share (EFSS) solutions keep files safe wherever they go and on any device. At the same time, EFSS provides a user-friendly interface – an online workspace from which offsite employees can collaborate over files without creating high-risk situations around sensitive data. By delivering enterprise-worthy mobile productivity workflows to end users, IT can ensure that the next natural or man-made disaster doesn’t derail business continuity or data security.
5. How is mobile productivity affected when a device is stolen or damaged?
Often, if an employee’s mobile device is lost or stolen, so is the data contained on that device. Some solutions allow you to wipe the data remotely, but the data cannot always be recovered afterward. An EFSS solution that allows employees to save and sync during the collaboration process creates a backup of the most recent version of the data. This way, if a device goes missing or is damaged, work is not lost: employees can quickly log into the workspace from another device and get back to work. It also eliminates the need for a separate device backup solution, since restoring files from the EFSS server is a simple process, and even other data, such as Exchange email, can be recovered with minimal effort.
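The save-and-sync behavior described above amounts to continuous versioned replication to a server. Below is a minimal sketch of the idea in Python; the class and method names are hypothetical illustrations, not any particular EFSS product’s API.

```python
import hashlib
import time


class WorkspaceSync:
    """Toy model of EFSS save-and-sync: every edit is pushed into a
    server-side version history, so a lost device loses no work.
    (Hypothetical illustration, not a real product's API.)"""

    def __init__(self):
        # filename -> list of (timestamp, checksum, content) versions
        self.versions = {}

    def save(self, filename, content):
        # Each save becomes an immutable server-side version.
        checksum = hashlib.sha256(content.encode()).hexdigest()
        self.versions.setdefault(filename, []).append(
            (time.time(), checksum, content)
        )

    def restore_latest(self, filename):
        # A replacement device simply pulls the newest version.
        history = self.versions.get(filename)
        return history[-1][2] if history else None


server = WorkspaceSync()
server.save("proposal.docx", "draft v1")
server.save("proposal.docx", "draft v2, edited on a tablet")
# The tablet is lost; the employee logs in from another device:
print(server.restore_latest("proposal.docx"))  # -> draft v2, edited on a tablet
```

The key design point is that every save becomes an immutable server-side version, so losing any single device never loses work.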
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

1:00p
Nimbix Offers HPC Energy Cloud in Digital Realty’s Texas Sites

Nimbix is bringing its high-performance cloud computing infrastructure to two data centers operated by Digital Realty Trust, the companies said today. Nimbix and Digital Realty have formed an alliance to offer an enterprise high performance computing (HPC) cloud solution to customers in the oil and gas industry. The infrastructure will be housed in Digital Realty data centers in Dallas and Houston.
“The combination of high-performance computing and cloud technologies is a powerful solution, with significant client benefits,” said Steve Hebert, Chief Executive Officer of Nimbix. “Clients who need immediate, high-performing computing solutions for shorter time frames will realize significant efficiencies.”
Technology plays a critical role in finding oil and gas reserves, as computer modeling allows oil companies to analyze seismic data and produce 3D images that identify the best location and trajectory for drilling wells. These modeling applications can save millions of dollars by focusing the drilling of offshore wells on the most promising locations, but require enormous computing power.
With its JARVICE cloud platform, Nimbix provides a way for companies to shift their data-crunching HPC infrastructure from their own data centers to remote cloud environments.
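In practice, moving HPC work to a remote cloud means submitting batch jobs to an API instead of a local scheduler. A hedged sketch of what such a submission might look like in Python follows; the endpoint, payload fields and application name are invented for illustration and are not the actual JARVICE API.

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical endpoint, for illustration only -- not Nimbix's real API.
API = "https://hpc-cloud.example.com/api/v1"


def submit_seismic_job(token, dataset_url, node_count):
    """Submit a batch HPC job (e.g., seismic imaging) to a remote cloud."""
    job = {
        "application": "seismic-imaging",  # assumed application name
        "input": dataset_url,              # data already staged in cloud storage
        "resources": {"nodes": node_count, "gpus_per_node": 4},
    }
    resp = requests.post(
        f"{API}/jobs",
        json=job,
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()["job_id"]  # poll this ID for status and results
```

The economics follow directly: instead of owning a cluster sized for peak seismic-processing demand, an oil company pays only for the node-hours each job consumes.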
“Providing easy access to cloud services is a natural progression in our support of the oil and gas industry,” said Matt Miszewski, Digital Realty SVP of Sales and Marketing. “Our alliance with Nimbix facilitates the cost-effective generation, analysis, and sharing of geophysical data.”

1:30p
Data Foundry Breaking Ground in Booming Houston Market

Data Foundry is set to break ground on its second Houston data center, which will be known as Houston 2. The colocation and managed services provider is building a 350,000 square foot facility that will have the capability for 60 megawatts of power. A ceremony will be held on April 10, and the data center is expected to be fully commissioned and ready to go in the first quarter of next year.
The new data center is modeled after Texas 1, which the company built in Austin in 2011 and which has been a success. “A number of our Houston-based customers have been asking for us to construct a Texas 1 style facility in Houston,” said Edward Henigin, CTO at Data Foundry. “We are excited to mark the beginning of construction of Houston 2 and we expect to have this facility fully operational by the first quarter of next year.”
Houston 2 will reside on an 18-acre tract of land the company purchased in 2013. The facility will have a chilled water cooling system, a 185-mph wind rated structure and will offer more than 10 fiber carriers and 60,000 square feet of office space. Data Foundry will serve requirements ranging from single cabinet to multi-megawatt deployments out of Houston 2.
Data Foundry began life in 1994 as Texas.net. It opened its first facility in Houston in 1996. It owns and operates two data centers in Austin and offers a suite of colocation and managed services in data centers located in Ashburn, VA, Los Angeles, Amsterdam and Hong Kong.
The Houston market is booming, and Data Foundry is one of the home-grown providers driving this growth. The strong demand in Houston for data center space in recent years has been driven by the data processing needs of the energy industry. Other major players include CyrusOne, which has also been on a growth tear, and Stream Data Centers.
Design of the new data center was done by Corgan, and kW Engineering is providing engineering services for the project. The facility is being built by Holder Construction Company.

2:30p
Schneider Electric Reports Progress for Service Provider Initiative

Schneider Electric launched its Data Center Service Provider Team over a year ago, and the business is humming along nicely. The company recently completed work with long-time customer Internap on its new Secaucus facility. That contract was a $2.5 million hardware-centric deal.
Internap commissioned Schneider Electric’s Data Center Service Provider Team to deploy Symmetra Uninterruptible Power Supply (UPS) modules, Square D switchgear for protecting and controlling electrical currents, and Schneider Electric’s Building Management System (BMS) for reliable, precision control of HVAC systems. While hardware-centric, the Internap deal represents a growing relationship.
“While Internap has heard our whole story, they’ve chosen us as hardware provider,” said Mike Hagan, Vice President of the North America Data Center Service Provider Team. “Would I love to provide everything? Of course I would. As we worked with them, we learned their business. This is a proactive build, with a compressed time schedule to preserve their capital.” Internap is taking the approach many mid-size to large providers now favor: build the larger building (or shell) first, then expand within it incrementally by building out pods.
“They’ve done a good job at standardizing over time,” said Hagan. “They’re more granular at how much capacity they’re adding. They’ve done really well at establishing a repeatable design.”
Data Center Service Provider Team Progressing
While the Internap deal was a hardware deal, Schneider is well positioned to offer a broad spectrum of services. The company at large has made some big acquisitions in the form of AST Modular and Lee Technologies, diversifying its product mix. “The big picture is, we’ve had all these divisions as a potential product,” said Mike Hagan. “When we stood up the team, we’re now able to approach the customer with a combined single vendor approach.” The world is changing and so are the conversations; Hagan believes his team is ahead of the curve in addressing this shift.
“Our message overall, is to go to the customer with the capability to design, build, maintain and manage energy,” said Hagan.
Merging Schneider, AST, and Lee into one organization seems like a massive task, but all indications are that it’s going smoothly. “The best news is that culturally, the companies are very well aligned, which I believe is 80 percent of the battle,” said Hagan. “The corporate change, that is, new standards and new systems, is the only thing that is cumbersome. A lot of the products are technically grasped by sales, and we have subject matter experts in each of those industries.”
The company has targeted 12 percent annual growth for its Data Center Service Provider team. Last year, it exceeded that target and is well on pace for 2014, the company says.
In the past, Hagan says, the company would receive separate Requests for Proposal (RFPs) for individual systems; now, as with Internap, deals come in one big RFP. “Now we’ve become more of a business planner,” said Hagan. “It’s about lower cost to build, speed to build, and lower operating costs.”
“Whether it’s Schneider or others, the reality is in this niche, more of the decisions are being made by the C-suite because these are financially driven transactions,” said Hagan, “but with all the equity money in this space, the stakes are high. As it’s become more competitive, rates are dropping, but equity wants the same return. The C-suite is driving much harder.”

3:00p
Tiered Adaptive Storage from Cray Powers Petabytes for Research

With the ability to scale to more than 75 petabytes of storage, the Cray Tiered Adaptive Storage (TAS) solution has been selected by the North German Supercomputing Alliance (HLRN) under a recently announced contract. Cray TAS is an open storage and archiving solution for big data and high performance computing environments, and gives HLRN a long-term data management solution for its High Performance Computing Center (RRZN), located at Leibniz University in Hannover, Germany.
“It is important that our supercomputing infrastructure includes a flexible, scalable and simple storage archiving solution that supports the massive demand for supercomputing resources from across the northern states of Germany,” said PD Dr. Steffen Schulze-Kremer, head of HPC department at RRZN. “Cray’s Tiered Adaptive Storage solution seamlessly integrates with our existing supercomputing systems, and is expected to fulfill our storage needs now and into the future.”
The Cray TAS solution will provide RRZN’s users with a large-scale archiving system to actively access, manage and preserve important data resulting from the Center’s scientific research. It consists of over one petabyte of data storage and is upgradeable to more than 75 petabytes within the delivered architecture. For RRZN, Cray TAS provided a fast path to move from its existing Oracle SAM-QFS installation without a lengthy data migration period. It features the Versity Storage Manager, includes all software and hardware, and eliminates complexities associated with planning, designing and building large-scale archives. It provides transparent data migration across storage tiers, from fast scratch to primary and archive storage, and supports up to four flexible tiers mixing media types: solid state drive, disk or tape.
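Tiered archives like this typically rely on a policy engine that demotes files down the hierarchy as they go cold. Here is a simplified sketch of one such age-based policy in Python; the tier names and thresholds are illustrative assumptions, not Cray TAS configuration.

```python
import time

# Illustrative tier hierarchy, fastest to cheapest -- not actual Cray TAS tiers.
TIERS = ["scratch", "primary", "archive", "tape"]

# Hypothetical policy: demote files untouched longer than the threshold.
DEMOTION_AGE_SECONDS = {
    "scratch": 7 * 86400,    # a week of inactivity leaves fast scratch
    "primary": 90 * 86400,   # a quarter of inactivity goes to archive
    "archive": 365 * 86400,  # a year of inactivity lands on tape
}


def next_tier(current_tier, last_access_time, now=None):
    """Return the tier a file should migrate to, or None to stay put."""
    now = now if now is not None else time.time()
    threshold = DEMOTION_AGE_SECONDS.get(current_tier)
    if threshold is None:  # already on the final tier
        return None
    if now - last_access_time > threshold:
        return TIERS[TIERS.index(current_tier) + 1]
    return None


# A file on primary storage untouched for six months moves to archive:
print(next_tier("primary", time.time() - 180 * 86400))  # -> archive
```

The migration itself is transparent to users: a file’s path stays the same while the policy engine moves the bytes between tiers.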
“The supercomputing facilities at RRZN support a wide array of complex scientific research, and we are pleased to provide the Center with an end-to-end data management solution that can meet and grow with the big data needs of their scientists and researchers,” said Barry Bolding, Cray’s vice president of storage and data management. “Cray TAS allows for a seamless upgrade from RRZN’s existing archiving solution, including the ability to continue to use their current policy engines. With the explosion of big data, we believe that Cray TAS fills a gap in the marketplace for customers who need enterprise-class data management solutions with high performance and low total-cost-of-ownership. Cray is providing a compelling solution for customers like RRZN that need active access to big data for their scientific workflow and a strong roadmap for their future data-tiering needs.”

3:30p
Fundings Galore: Boundary, Ineda, IPO for MobileIron

IT infrastructure monitoring provider Boundary raises $22 million in a C round of funding, MobileIron files for an IPO hoping to raise as much as $100 million, and startup Ineda Systems raises $17 million to advance its low-power SoCs for the Internet of Things market.
Boundary raises $22 million. IT infrastructure monitoring provider Boundary announced a $22 million C-round funding, bringing total funding to date to $41 million. The round was led by new investor Adams Street Partners with participation from Triangle Peak Partners, as well as existing investors Lightspeed Venture Partners and Scale Venture Partners. Mountain View-based Boundary will use the funding to continue expanding development of its product and technology, as well as to significantly bolster its sales and marketing capabilities. As a public or private SaaS-delivered service, Boundary’s service is operational within seconds and completely non-intrusive. It is designed for modern application infrastructures such as public/private cloud or hybrid environments where resources are dynamic in nature. At its core is the ability to collect and process hundreds of millions of metrics and events every second using a highly scalable, low-latency streaming engine. This data is processed in the context of a real-time application map, which Boundary builds and updates immediately to reflect changes in the infrastructure or the application. Boundary processes up to 2 trillion metrics daily for clients that include Scripps Networks Interactive, Salesforce.com, The Weather Channel, Zendesk, Gilt, Expedia, Infor, Heroku, OneNeck, Websense, HCL Technologies, Rackspace and many others. “Modern application development methodologies combined with modern operations cultures such as DevOps give organizations the key advantages of rapid time to market and agility. But this can come at the cost of service availability as recent high profile outages have demonstrated. New applications and infrastructures require new monitoring solutions – ones that can embrace the cadence of continuous deployment and elasticity while dealing with the complexity of highly distributed architectures,” said Boundary CEO Gary Read. “Boundary is a new breed of application aware infrastructure monitoring solution where the key emphasis is how the infrastructure affects the application performance.”
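Processing hundreds of millions of metrics per second comes down to streaming aggregation: each event is folded into a fixed-size window summary on arrival rather than stored as a raw point. The toy Python sketch below illustrates the concept; it is not Boundary’s actual engine.

```python
from collections import defaultdict


class StreamingWindow:
    """Fold (metric, value) events into per-second buckets so memory use
    stays constant regardless of event volume. (Conceptual toy, not
    Boundary's implementation.)"""

    def __init__(self):
        # (metric, second) -> running [count, total, max]
        self.buckets = defaultdict(lambda: [0, 0.0, float("-inf")])

    def ingest(self, metric, value, timestamp):
        bucket = self.buckets[(metric, int(timestamp))]
        bucket[0] += 1
        bucket[1] += value
        bucket[2] = max(bucket[2], value)

    def summary(self, metric, second):
        count, total, peak = self.buckets[(metric, second)]
        return {"count": count, "avg": total / count if count else 0.0, "max": peak}


window = StreamingWindow()
for latency in (12.0, 48.5, 9.3):
    window.ingest("api.latency_ms", latency, timestamp=1397059200)
print(window.summary("api.latency_ms", 1397059200))
# -> {'count': 3, 'avg': 23.26..., 'max': 48.5}
```

Because only per-window aggregates are retained, throughput scales with ingest rate rather than with the number of raw points held in memory.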
MobileIron files for IPO. Mountain View-based mobile device management company MobileIron has filed a registration statement for a proposed initial public offering of its common stock, disclosing that it hopes to raise up to $100 million. MobileIron is led by former Cisco executive Robert Tinker and has seen revenue grow to $105.5 million. The company has raised around $145 million in previous funding rounds. The MobileIron platform is a purpose-built mobile IT platform for enterprises to secure and manage mobile applications, content and devices while providing their employees with device choice, privacy and a native user experience. The company believes the global mobile IT market will be worth $27 billion in 2014, growing to approximately $49 billion in 2017. Morgan Stanley & Co. LLC and Goldman, Sachs & Co. will act as lead joint book-running managers for the offering.
Ineda Systems secures $17 million. Ineda Systems, a developer of low-power SoCs (system on a chip) for use in the fast-growing wearables and Internet of Things (IoT) market segment, announced that it has raised $17 million in Series B funding. The investment was led by Walden-Riverwood Ventures and includes co-investors Samsung Catalyst Fund, Qualcomm Ventures, IndusAge Partners and others, along with existing investors. The funding will be used to further develop Ineda’s new class of highly integrated, ultra-low power semiconductor and software products aimed at the wearable device segment. These SoCs will be applicable to a multitude of devices, such as smartwatches, health and fitness trackers and other wearables, as well as IoT. “The market is primed for a new class of semiconductor architecture that is specifically designed to be ultra-low power and high performance for use in the rapidly growing wearable technology space,” said Ineda Systems CEO Dasaradha Gude.

4:00p
New From 3M: Boiling Liquid to Cool Your Servers

Will the high-density servers of the future live in blue-lit pools of boiling liquid? Making the case for a new approach to cooling, 3M this week unveiled a proof-of-concept for its immersion cooling solution in a Minneapolis lab. Instead of racks, ICE X supercomputing hardware from SGI is fully immersed in a tank of Novec, a dielectric fluid from 3M.
It’s the latest step forward for 3M in its effort to commercialize open bath immersion (OBI), a passive two-phase cooling technique which uses a boiling liquid to remove heat from a surface and then condenses the liquid for reuse, all without a pump.
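Why boiling? Vaporizing a liquid absorbs far more energy per kilogram than merely warming it, which is what makes a passive, pumpless cycle plausible. A rough worked example (the numbers are order-of-magnitude assumptions, not 3M specifications; Novec-family fluids have heats of vaporization on the order of 100 kJ/kg):

```latex
\dot{Q} = \dot{m}\, h_{fg}
\quad\Longrightarrow\quad
\dot{m} = \frac{\dot{Q}}{h_{fg}}
        \approx \frac{100\ \text{kW}}{100\ \text{kJ/kg}}
        = 1\ \text{kg/s}
```

Under those assumptions, a tank dissipating 100 kW needs only about one kilogram of fluid to boil off each second, with the vapor condensing on a cooled surface and returning to the bath by gravity, hence no pump.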
3M is positioning its Novec-based immersion tank as a technique to manage high-density workloads that are difficult to cool with traditional air-based HVAC systems. 3M says OBI can reduce cooling energy costs by 95 percent by eliminating the need for raised flooring and room-level air cooling. It uses a smaller footprint for the same workloads, and can reduce water consumption by eliminating municipal water usage for evaporative cooling.
This week’s demo is the next step in the company’s plan to commercialize OBI, a technology Data Center Knowledge has been tracking since 2012. It’s notable for the involvement of SGI, a leading player in high-performance computing, and chipmaker Intel, whose E5-2600 Xeon chips were used in the proof-of-concept.
“Through this collaboration with Intel and 3M, we are able to demonstrate a proof-of-concept showcasing an extremely innovative capability to reduce energy use in data centers, while optimizing performance,” said Jorge Titinger, president and CEO of SGI. “Built entirely on industry-standard hardware and software components, the SGI ICE X solution enables significant decreases in energy requirements for customers, lowering total cost of ownership and impact on the environment.”
Intel Optimizing Chips for Immersion
For Intel, the collaboration represents its latest step to support immersion cooling, where liquid comes into direct contact with components. In 2012, the chipmaker began optimizing its technology for servers immersed in oil. 3M and SGI are taking a slightly different approach with open bath immersion. But Intel clearly sees the value in preparing for a future in which denser HPC environments may prompt more users to consider immersion.
“Intel is continually innovating and improving microprocessor technologies to meet today’s datacenter demands and is working with companies like 3M and SGI to explore advanced cooling technologies that improve energy efficiency in data centers while also containing operating costs,” said Charles Wuischpard, vice president, Data Center Group and general manager, Workstation and High Performance Computing at Intel. “As the backbone of the data economy, modern data centers must increase the raw performance they deliver, but also do so efficiently by containing power consumption and operating costs.”
The companies are working with the Naval Research Laboratory, Lawrence Berkeley National Labs and Schneider Electric to deploy an identical system “with the intention to demonstrate the viability of the technology at any scale.” Meanwhile, cooling specialist Allied Control has deployed several OBI-based solutions for Bitcoin mining.
“These advancements are a significant stepping stone in accelerating industry-wide collaboration to optimize computer hardware design,” said Joe Koch, business director for 3M Electronics Markets Materials. “We are thrilled with the work that our collaboration with SGI and Intel has produced.”