Data Center Knowledge | News and analysis for the data center industry
Tuesday, February 25th, 2014
1:00p |
QTS Builds Federal Cloud Business in Richmond Mega-Center

One of the power rooms inside the QTS Richmond Data Center. (Photo: QTS)
There’s a saying in real estate that you make your money when you buy a property, not when you sell it. This has been a guiding principle for QTS Realty Trust, which is a firm believer in buying huge commercial properties at a discount and converting them into data center space.
The showcase for that strategy is the QTS data center outside Richmond, Virginia, where the company has just launched QTS Federal Cloud, an Infrastructure as a Service (IaaS) solution designed exclusively for United States government agencies.
The site in Sandston, Virginia, was previously a Qimonda semiconductor fabrication facility. In 2010, QTS bought the massive 1.3 million square foot plant and 210 acres of adjoining land in a bankruptcy sale for just $12 million. The campus included 400,000 square feet of existing raised-floor space, more than 22,000 tons of chiller capacity on site, and an existing power capacity of 100 megawatts.
“Growth for Many Years to Come”
Four years later, the company has built 84,000 square feet of data center space, which is 80 percent filled with customers. QTS has another 22,000 square feet under construction and ready to come online this spring. It has plenty of room for future growth, with another 450,000 square feet of shell space available for phased expansion.
“The Richmond market reflects our approach to low-cost, super-rich infrastructure,” said QTS chief executive Chad Williams. “We will be able to drive growth within that facility for many years to come. It also sets us up for our federal cloud offering. It’s a unique facility.”
QTS will follow a similar template in the Dallas market, where the company bought another former chip fab in Irving, Texas. This time the building was “only” 700,000 square feet. QTS plans to convert 292,000 square feet into data center space, starting with a 26,000 square foot first phase that will open in July.
Big Footprint, Lots of Runway
QTS Realty’s national footprint features 1.8 million square feet of powered shell space that can be used for data centers. Just 690,000 square feet of that has been finished, a utilization rate of 38 percent.
Some data center executives might worry about owning that much unused space. But acquiring space at a discount reduces the risk, so when Williams looks at the QTS footprint, he sees a long runway. “This gives us visibility for future growth at a known cost within our existing footprint,” he said.
QTS Federal Cloud is designed to meet the mandates faced by federal government agencies, including the Federal Data Center Consolidation Initiative (FDCCI), Cloud First policy and IT Shared Services strategy. Housed in QTS data centers in Richmond and Atlanta, the cloud offering is built on enterprise hardware from VMware, EMC and Cisco, making for simple integration with private VMware environments. The company says it maintains strict logical and physical security protocols and expects to achieve FedRAMP certification in mid-2014.
“QTS understands the IT needs and requirements of government agencies and our Federal Cloud solution augments the company’s strategic focus on the public sector,” said Jim Reinhart, chief operating officer, development and operations – QTS. “We are proud to add this solution to our existing federal data center services.”
Diversifying Beyond Atlanta
Growth in Richmond and Dallas will help QTS diversify its revenue beyond its core market of Atlanta, where it is the leading provider and operates huge facilities in downtown and in Suwanee. The Atlanta market currently accounts for about 70 percent of the company’s revenue.
QTS Realty (QTS) went public through an IPO in October at $21 a share. It gained 18 percent in the first three months of trading, making QTS the best-performing stock in the data center sector for 2013. The company has 10 data centers in seven states with 3.8 million square feet of data center infrastructure and supports more than 880 customers. QTS closed Monday at $25.59.
In Richmond, QTS offers custom data suites from 1,000 to 50,000 square feet, as well as cloud, colocation and managed service. Williams sees opportunities that extend beyond the federal market.
“We continue to see good growth in Richmond, not just from Virginia but also from New Jersey and North Carolina,” he said. “We see it as a regional market. We think we have a unique opportunity for those regional customers in disaster recovery.”
An aerial view of the former semiconductor plant near Richmond, Va., that has been converted into a data center by QTS Realty Trust.

1:20p |
A3Cube Launches, Promising Faster Networking for HPC and Big Data

The home page of networking startup A3Cube.
Silicon Valley startup A3Cube emerged from stealth mode today, looking to introduce a dramatic shift in network architecture. The company is backed by five years of research and development and has been in business for a little over a year.
A3Cube has announced a “brain inspired” PCI Express Network Interface Card (NIC) designed to eliminate the I/O performance gap between CPU power and data access performance for data center, HPC and big data applications.
Nanosecond Latency
A3Cube says its RONNIEE Express platform elevates PCI Express from a simple interconnect to an intelligent network fabric that solves performance bottlenecks inherent in PCIe. A3Cube’s In-Memory Network technology allows direct shared global memory across the entire network. This data plane enables storage that combines supercomputing’s massively parallel operational concepts with an I/O interface that eliminates central switching and network overhead.
The company says this In-Memory Network discards the protocol stack bottleneck and replaces it with a direct memory-to-memory mapped socket, producing disruptive performance enhancements while using commodity hardware.
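A3Cube implements this in hardware across PCIe, but the underlying idea, replacing a protocol stack with direct reads and writes against a shared memory region, can be loosely illustrated on a single machine. The sketch below is a conceptual analogy only, not A3Cube’s API; names and sizes are invented:

```python
# Conceptual analogy (not A3Cube's technology): move bytes through a shared
# memory mapping instead of pushing them through a socket protocol stack.
# Writer and reader touch the same mapped region directly, so there is no
# per-message protocol processing in between.
import mmap

SIZE = 4096

# Anonymous memory mapping; on POSIX systems it is shared with forked children.
buf = mmap.mmap(-1, SIZE)

# The "sender" writes straight into mapped memory...
msg = b"direct memory-to-memory transfer"
buf.seek(0)
buf.write(msg)

# ...and the "receiver" reads the same bytes back, with no network stack
# or intermediate copies.
buf.seek(0)
received = buf.read(len(msg))
print(received.decode())  # direct memory-to-memory transfer

buf.close()
```

A fabric like RONNIEE extends this picture across machines, mapping remote memory into the local address space over PCIe rather than over an anonymous local mapping.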
“A3CUBE’s In-Memory Network Fabric leverages an innovative approach to transforming HPC, Big Data and data center environments in order to drive greater performance and efficiencies in the network and storage systems,” said Bob Laliberte, senior analyst at ESG. “A3CUBE is extending PCIe capabilities in order to deliver a next-generation network that it claims will overcome traditional network bottlenecks utilizing a high-performance (nanosecond-latency) and massively scalable architecture.”
Touts “Radical” New Design Approach
“Today’s data center architectures were never designed to handle the extreme I/O and data access demands of HPC, Hadoop and other Big Data applications,” said Emilio Billi, founder and CTO of A3CUBE. “The scalability and performance limitations inherent in current network designs are too severe to be rescued by incremental enhancements. The only way to accommodate the next generation of high performance data applications is with a radical new design that delivers disruptive performance gains to eradicate the network bottlenecks and unlock true application potential.”
RONNIEE 2S is a PCIe-based intelligent NIC designed to maximize application performance using a combination of hardware and software. It provides multiple channels with sub-1-microsecond direct remote I/O connections. RONNIEE RIO is a general-purpose NIC supporting Ethernet and memory-to-memory transactions in a 3D torus topology that can plug into any server equipped with a PCIe slot. This enables a scalable interconnection fabric based on a patent-pending shared memory architecture that implements the concept of distributed non-transparent bridging.
RONNIEE 3 is a card designed to extend the capability of RONNIEE 2S, optimized for high-performance data environments. The In-Memory Network provides full support for memory-to-memory transactions without the usual software overhead.

1:30p |
Successfully Marrying IT Process Automation Tools with Cloud Technology

Gabby Nizri is the Founder & CEO of Ayehu Software Technologies Ltd., publishers of eyeShare, an enterprise-class, lightweight IT process automation tool.
As technology evolves, more and more businesses of every shape, size and industry are turning to the cloud for enhanced flexibility, improved efficiency and the opportunity for global advancement. IT process automation offers many of the same benefits, which makes the two a very attractive combination. Let’s take a look at how ITPA tools are currently working within the cloud and how to choose the ones that will most benefit your business.
Define Goals and Strategies
First and foremost, it’s important to understand that if you’re in the market for an automation product that you’d like to leverage along with cloud technology, you must clearly outline what your goals and strategies are ahead of time. There are plenty of options to choose from, and quite frankly, it can be mind-boggling figuring out which option makes the most sense for your particular business. You must start by clearly defining what you want to accomplish and then compare your options to find the product that is most closely aligned with your needs.
Integration
The next thing to consider is how the tool will integrate with your current systems. There’s no point in implementing a new IT process automation tool if it’s going to function as just another fragmented program. You’ll only be creating more work and a bigger headache, and it’s very unlikely you’ll accomplish the goals you’ve laid out. You need to find a product that not only works in the cloud, but also integrates easily with legacy systems. The goal is to make the transition as seamless as possible and to enhance, not hinder, the functionality of your IT operation and the organization as a whole.
Security
Another important factor to consider when marrying cloud with automation is security. These days, it’s absolutely critical that businesses do their due diligence when managing sensitive, proprietary and confidential data to avoid a potential security breach. The nature of the cloud makes it inherently riskier than systems housed entirely on premises. An ITPA product of good quality that integrates well with your existing systems can resolve this tension, giving you the best of both worlds: keeping confidential information safely in-house while still leveraging the benefits of cloud technology.
Holistic Approach
Finally, if you’re going to be successful at leveraging both cloud and IT automation, you must find a way to adjust your thinking (and that of your entire team) from the older, more opportunistic approach to one that is much more holistic. The old system of thought was that each department or function could use its own automation tool to make their job more efficient. The problem is, not only do you end up with way too many tools, but the end result is fragmented and inherently inefficient, ultimately defeating the purpose of automating altogether.
The only way to really bring the two pieces together successfully is to look at automation as a “big picture” solution – a tool that can and should become an inter-departmental bridge creating a more systematic approach to operational efficiency. Rather than many individually beneficial tools, automation can be the one big problem solver that brings everything together and improves the business as a whole. As such, the IT automation tool you select should be designed to work across all systems.
Cloud computing is most certainly the future paradigm of business IT. So is IT process automation. Combining the two can provide your organization with the best of both worlds, but you have to first know what to look for if you’re going to piece together a system that will truly boost your business to the next level. By following the guidelines provided here, you’ll be poised to create an environment in which cloud and automation will help streamline your operations and catapult your business into success, both now and well into the future.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

2:30p |
Managed Hoster Logicworks Surfing the Sea Change in Healthcare IT

Managed hosting provider Logicworks sees a massive opportunity in healthcare. The company was founded way back in 1993, but has been embracing cloud over the last seven or eight years as the technology hit the limelight. The bulk of its business is managing private clouds for customers running big, complex web-facing applications.
The company has a focus on security and regulatory compliance, so it tends to attract a lot of healthcare and legal customers. The healthcare vertical is its biggest market in terms of overall revenue. National initiatives focused on advancing healthcare technology have further boosted the opportunity.
“We’re really focused on the US healthcare market,” said Jason Deck, VP of Strategic Development at Logicworks, who cited several reasons the healthcare market is a great opportunity.
“Digitizing records is an easy thing to say but a complex thing to do,” said Deck. “It’s a sea-change kind of problem, and a challenge across all levels of their organization and operations. This is coupled with the government releasing an awful lot of money to make this happen. How do you deal with compliance and regulatory shifts? Being able to hold those conversations separates us from larger competitors. If you’re rolling out an electronic records application, let’s first understand the application, and we will design and build around your specific needs.”
Logicworks customers include 20 different US state Health Information Exchanges (HIEs). “We manage the whole kit and caboodle,” said Deck. “We manage the entire infrastructure, all mission-critical, up to the application. We don’t manage the software, but we do everything else.”
There is a big push to digitize healthcare records and establish health information exchanges. The current administration has been aggressive in moving healthcare forward into the age of technology (Obamacare web page launches aside), and Logicworks is in a unique position to capitalize on these sizable needs.
While it sees the healthcare market heating up big time, the company also notes more customers are inquiring about managed Amazon Web Services. “Cloud is really just a tool we use to deliver value to our client,” said Deck. “We’re not a cloud provider, we deliver compliant infrastructure.”
Customers Growing Comfortable with Public Cloud
Most of Logicworks’ customers are security- and compliance-focused businesses – the types you wouldn’t expect to embrace public cloud. However, the company notes that more and more customers are inquiring about managed Amazon Web Services.
Logicworks currently hosts infrastructure within data centers operated by Equinix (New York and Silicon Valley) and Digital Realty (Piscataway, N.J.). “We look at a provider that gives the ability for us to advance,” said Deck. “We look for people with critical mass around the world. We look for network density, physical security, and the attributes of tier III data center and up.”
While it will continue to grow within its existing data centers, a sizable chunk of Logicworks’ new business is being sent to AWS. Given the company’s compliance focus, that suggests a larger trend of companies growing comfortable with public cloud.
“This is an emerging part of our business,” said Deck. “The underlying infrastructure itself and the ownership of it is less and less valuable. So we’re finding that, while the majority is still hosted private cloud, we’ve embraced AWS as a partner. If we can deliver via AWS, we will. A lot of service providers look at them as the arch nemesis, we see them as a partner.”
Deck uses an analogy to describe AWS. “They’ve built an F-16 fighter jet, but it isn’t assembled,” said Deck. “What we do isn’t running Exchange on AWS. We can, but people call about mission-critical applications. We write our own code to automate on an application-specific basis.”
In this scenario, curiosity about the cloud is helping Logicworks, a traditional managed hoster that finds itself evolving with customer needs. That evolution has given birth to enterprise-class managed AWS services.
“Customers are coming to us with an appetite to understand whether Amazon is a fit for them, but there’s still a lot of figuring out,” said Deck. “We do not have a preference. Our preference is to do what’s right for the application. We don’t steer one way or another. Sometimes it makes sense to do both.”
The company predicts a very healthy mix of traditional managed private cloud on dedicated infrastructure as well as growth in public cloud usage. “Our prediction and our forecast on the market is that there is today, and for the foreseeable future, a need for hosted private cloud infrastructure,” said Deck. “The next round, which is a bit down the road, is around AWS Direct Connect.”

2:47p |
Help DCK: Take Our Reader Survey, Get a Starbucks Card

Kip and Gary took the DCK Reader Survey. They were so excited about the Starbucks gift card that they take their coffee with them everywhere.
Do you have a minute to help improve Data Center Knowledge? We invite you to complete our new 2014 Data Center Market Survey. It’s a few questions about the data center industry.
We value your insights on the data center industry. Your answers will be completely anonymous, and we’ll use the aggregated data to help us understand current industry trends and continue to improve the Data Center Knowledge site, which brings you the latest news on the topics you care about.
This year everyone who completes the reader survey will receive a $5 Starbucks card. Click here to take the survey. Feel free to share with your data center friends and colleagues.
Rich Miller
Editor in Chief, Data Center Knowledge

3:30p |
Citrix Offers Big Data Analytics for Mobile Carriers

At Mobile World Congress in Barcelona this week, Citrix (CTXS) launched a new big data analytics solution designed to help mobile operators monetize their businesses, and added enterprise mobility management support for Intel Device Protection Technology.
Citrix launched ByteMobile Insight, a big data analytics solution built on a carrier-grade platform. ByteMobile Insight collects usage, location and customer relationship management (CRM) data from a wide variety of sources. This enables new revenue streams by monetizing usage and location data with third parties. It also gives marketers a 360-degree view of data and location usage across the mobile network.
“Citrix is addressing a gap in the market for an analytics solution that can be quickly deployed to collect and correlate usage data from multiple sources and translate that into ready-to-use, well visualized, a la carte insights for any non-technical stakeholder in the organization to use,” said Ari Banerjee, senior analyst, Heavy Reading. “By optionally combining ByteMobile Insight with the ByteMobile Adaptive Traffic Management system as a network source, Citrix is able to both accelerate time-to-insight and enrich the data with comprehensive user experience metrics, making that data more valuable in the process.”
“More than 140 operators around the world have deployed ByteMobile web and video optimization solutions to improve subscriber quality of experience (QoE), increasing data consumption and subscriber loyalty in the process,” said Chris Koopmans, vice president and general manager, Service Provider Platforms at Citrix. “With ByteMobile Insight, we’re taking the logical next step, which is to enable operators to further enhance the customer experience and more directly increase revenues, by leveraging their own usage data, including the user experience insight we enable through our other products.”
Enterprise Mobility Management support for Intel Device Protection Technology
Citrix announced support for Intel Device Protection Technology (Intel DPT), enabling enterprises to secure and manage Intel-based Android mobile devices. Intel DPT combined with Citrix XenMobile will provide enhanced hardware and software security and management capabilities, empowering IT organizations with increased levels of control without compromising end-user experience. Intel-based devices with Intel DPT running Android OS have enhanced layers of security built in. XenMobile will fully leverage the manageability extensions for Android offered by Intel DPT. In doing so, XenMobile will provide IT with full administrative visibility and control.
“Our close collaboration with Intel around client virtualization goes back many years,” said Amit Pandey, Vice President and General Manager, Mobile Platforms Group at Citrix. “With this announcement, we are extending that relationship to the mobility space and providing XenMobile support for Intel DPT. Together, we are enabling enterprises to mobilize their employees by delivering security and manageability while providing a seamless user experience. This helps accelerate the adoption of Android devices in the enterprise.”

6:52p |
Data Scientists Predict Oscar Winners

The Academy Awards are Sunday. Can data crunching predict the winners? (Photo: Oscars.org)
By coupling statistics, computer science and business acumen, a data scientist can generate a multitude of insightful analyses from enterprise technology and big data. But what about a more pressing question: who’s going to win at the Oscars?
Data scientists at Farsite, the advanced analytics division of ICC, have done just that, using data to predict the winners in the major categories at this Sunday night’s Oscars.
“Our predictions are far more than lucky guesses,” says Ryan McClarren, Chief Science Officer at ICC. “Most people are surprised to hear that the same sophisticated, predictive modeling we use in industries like retail and healthcare can predict Oscar winners quite accurately. And while social media buzz may be high this week for Leo DiCaprio, sadly, our data shows he is not going home with a statue on Sunday.”
Farsite made Oscar predictions last year as well, and picked winners in five of the six major categories – including surprise best supporting actor winner Christoph Waltz and best picture winner Argo. Farsite uses a first-of-its-kind data-modeling tool to predict Oscar winners. The model analyzes more than 40 years of film industry and Academy Award-related information to forecast probabilities for the winners. This information includes real-time data and an array of variables, including total nominations, other Guild nominations and wins, buzz and nominees’ previous winning performances. For the 2014 Oscars, Farsite is predicting the following winners:
- Matthew McConaughey for best actor for Dallas Buyers Club
- Alfonso Cuaron for best director for the movie Gravity
- 12 Years a Slave for best picture
- Cate Blanchett for best actress for Blue Jasmine
- Jared Leto for best supporting actor for Dallas Buyers Club
According to Farsite’s data scientists, the first factor the model considers is that other awards handed out during the season can provide insight into likely Oscar winners. The second factor is the momentum, or buzz, behind particular nominees. The third is the history and prior performance of the nominees; some may have an edge given their past.
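Farsite’s model is proprietary, but the factor-weighting idea described above can be sketched in a few lines. Everything below – the weights, the normalized factor values and the nominee names – is invented for illustration, not Farsite’s data:

```python
# Toy factor-based award prediction (NOT Farsite's model). Scores nominees on
# the three factor families the article describes: precursor-award wins,
# buzz/momentum, and nominee history. Weights and values are made up.

WEIGHTS = {"precursor_wins": 0.5, "buzz": 0.3, "history": 0.2}

def score(nominee):
    """Weighted sum of normalized factor values (each in [0, 1])."""
    return sum(WEIGHTS[k] * nominee[k] for k in WEIGHTS)

nominees = [
    {"name": "Nominee A", "precursor_wins": 0.9, "buzz": 0.6, "history": 0.4},
    {"name": "Nominee B", "precursor_wins": 0.3, "buzz": 0.9, "history": 0.7},
    {"name": "Nominee C", "precursor_wins": 0.5, "buzz": 0.5, "history": 0.5},
]

# Predict the nominee with the highest weighted score.
predicted = max(nominees, key=score)
print(predicted["name"])  # Nominee A
```

A production model would learn the weights from historical ceremonies (40-plus years of data, in Farsite’s case) rather than setting them by hand, and would output win probabilities rather than a single pick.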
Follow the @FarsiteForecast Twitter account to learn the details behind the science of the predictions.
ICC (Information Control Company) is a Columbus, Ohio-based provider of enterprise technology solutions. Its Farsite advanced analytics division specializes in helping companies use big data and predictive analytics to empower smart business decisions and solve tough challenges.

9:07p |
DCK Webinar: The Ins and Outs of Data Center Migration

At a time when corporate consolidations and M&A activity are leading reasons for adding new data center capacity, the details of migrating a data center are rarely discussed. This webinar will present the key issues that must be addressed to ensure that your next data center migration is a successful one.
Join Chris Crosby, CEO of Compass Datacenters, and Craig Macfarlane, CTO and Co-Founder of Transitional Data Services, on Tuesday, March 11 for a special one-hour webinar in which they will discuss data center migrations.
In this webinar, you’ll learn about:
- Discovery and documentation of dependencies
- Definition of optimal relocation strategies
- Development of project plans and runbooks
- Execution of effective relocation priorities
Webinar Details
Title: The Ins and Outs of Data Center Migration
Date: Tuesday, March 11, 2014
Time: 2 pm Eastern/ 11 am Pacific (Duration 60 minutes, including time for Q&A)
Register: Sign up for the webinar.
Following the presentation, there will be a Q&A session with the speakers and your industry peers.
About the Speakers:
Chris Crosby is a recognized visionary and leader in the data center space, and the founder and CEO of Compass Datacenters. Chris has over 20 years of technology experience and over 10 years of real estate and investment experience. Previously, Chris served as a senior executive and founding member of Digital Realty Trust, where he was Senior Vice President of Corporate Development, responsible for growth initiatives including establishing the company’s presence in Asia. Chris received a B.S. degree in Computer Sciences from the University of Texas at Austin.
Craig Macfarlane is the Chief Technology Officer and Co-Founder of Transitional Data Services, and initiated the creation of TDS’s proprietary software, TransitionManager. Craig’s innovative insights around optimizing data center moves led him and a team of 10 developers to build the purpose-built tool. With more than 30 years of technology experience, Craig has worked on data center services, IT services, Internet systems and global networks. Previously, Craig worked at Student Advantage, iCast and Strategic Interactive Group (now Digitas), and led teams that designed and built websites and content management systems for IBM, AT&T, Kraft, L.L. Bean and other companies. Early in his career, he deployed and managed a variety of global computer networks at Bolt, Beranek and Newman, a Cambridge, Mass.-based technology firm; projects included developing networks for the U.S. government including ARPANET (the precursor to the Internet), NEARNET and TWBNet.
Sign up today and you will receive further instructions via e-mail about the webinar. We invite you to join the conversation.