Data Center Knowledge | News and analysis for the data center industry
Thursday, August 22nd, 2013
11:00a
Digital Realty Launching Open Internet Exchange Initiative
A Digital Realty Trust data center in Ashburn, Virginia, which will be one of the sites for its new open Internet exchange initiative. (Photo: Rich Miller)
Data center developer Digital Realty Trust is seeking to create a global open-interconnect environment that it says will provide more cost-effective exchange options. The Digital Open Internet Exchange will be an open-interconnect and peering environment, borrowing from an open model of neutral and member-governed Internet exchanges that’s proven successful in Europe.
The company will operate as an endorsed data center partner with the governing bodies and endorsed IXPs (Internet exchange providers) in each of the major exchange locations in North America, Europe, and the Asia Pacific region. The initial rollout for Digital Open Internet Exchange will take place in the New York metro area and Northern Virginia, followed by deployment in several other U.S. markets.
“Digital Open Internet Exchange is a game-changer for the entire IP and networking community, and for our customers,” said Michael Foust, Chief Executive Officer at Digital Realty. “By creating a truly open Internet exchange environment, we are supporting the community’s desire for neutral exchanges that are more efficient and cost-effective than those available today. The end result for our customers will be immediate access to enhanced interconnectivity and Internet peering capabilities across the 33 markets in our global portfolio.”
The Digital Realty ecosystem provides customers with a neutral, connectivity-rich environment to connect with carriers, business partners and service providers. It also provides infrastructure for carriers and service providers to deliver products and services to customers in any Digital Realty data center, without capital-intensive deployment costs.
Sees Opportunity in European Model
“This is great for the entire industry – from the large CDNs, content providers and the end users,” said John Sarkis, Vice President of Connectivity and Carrier Operations at Digital Realty. “The current state of affairs, as an end user, is that service isn’t that great. In Europe, things run more efficiently and effectively. Here in North America the exchanges are very expensive to participate in. The community wants to produce a better product. The Internet community here in North America wants to adopt the open structure. As a data center provider, we have endorsed this.”
Digital Realty Trust is the world’s largest operator of data centers and Internet gateways, with 122 properties spanning 22.7 million square feet of space in 32 markets throughout North America, Europe, Asia and Australia.
That scale could offer opportunities in an open exchange ecosystem. Major European carrier-neutral Internet exchanges like AMS-IX in Amsterdam and DE-CIX in Frankfurt operate in a distributed model, with points of presence spread across multiple data centers. In the U.S., interconnection activity tends to be more centralized in particular buildings or facilities.
“We understand that the costs and limitations associated with connecting to the existing Internet exchanges in the U.S., as well as peer-to-peer interconnecting, are simply too high,” said Sarkis. “This initiative establishes the optimal exchange environment, making it possible for Internet service providers (ISPs), content delivery networks (CDNs) and content producers to interconnect with their peers as well as with other exchanges from our data centers at a significantly lower cost than they pay today, while also improving the quality of service they can provide to customers.
“By empowering the community to take control of the way it operates the exchanges, we will enable organizations within business ecosystems – like-minded and vertical industries – to connect with each other in and across all of Digital Realty’s 120-plus global locations,” he added.
The open exchange idea has gathered considerable momentum in the last 12 months. The criteria have been set, and with this move Digital Realty has deliberately aligned itself with ongoing initiatives in the market.
“We at Digital Realty are focusing on providing what we do best – the environment,” said Sarkis. “The exchanges will be in data centers that help them grow. That’s our role.”
12:00p
New Programs Emerge to Train Big Data Scientists
Universities and corporations with a stake in big data are investing in programs and facilities to train a new generation of leaders.
The term “big data” has entered the spotlight. But as many have pointed out, big data alone is practically useless. It must be transformed into information and knowledge, with the help of analytics, innovation and data scientists.
But where will those data scientists come from? Universities and corporations with a stake in big data are investing in programs and facilities to train this new generation of technologists.
As a (somewhat) new profession, the data scientist is expected to combine a variety of attributes important to the big data field. Data analysis certainly is not new, but the modern data scientist must leverage today’s tools, work efficiently with the large data sets being generated and managed, and have the right mix of analytical skills and business acumen. That means skills in database query languages, statistics, predictive and advanced analytics, programming, business intelligence and cognitive science, grounded in a solid base of business and mathematics.
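To make that skill mix concrete, here is a minimal, purely illustrative sketch of the kind of task that blends them: querying a database, then summarizing the result statistically. The table and values are hypothetical, and only the Python standard library is used.

```python
import sqlite3
import statistics

# Build a small in-memory dataset (hypothetical sales records).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("east", 135.0), ("west", 80.0),
                  ("west", 95.0), ("west", 110.0)])

# Database query skills: aggregate revenue per region in SQL.
rows = conn.execute(
    "SELECT region, AVG(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# Statistical skills: summarize the overall revenue distribution.
revenues = [r[0] for r in conn.execute("SELECT revenue FROM sales")]
mean = statistics.mean(revenues)
stdev = statistics.stdev(revenues)

print(rows)           # per-region averages
print(mean, stdev)    # overall mean and sample spread
```

A real project would of course add predictive modeling and business framing on top, but even this toy example shows why the role straddles query languages and statistics rather than living in one camp.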
Universities have been essential in research and development for data science. Rensselaer Polytechnic Institute recently announced that it intends to build a $100 million center to pursue big data developments and research. The Rensselaer Institute for Data Exploration and Applications (IDEA) will operate as a centralized hub across the university’s five schools, allowing students and private investors to benefit from the big data advancements discovered within the state-of-the-art research facility.
“The Rensselaer IDEA will maximize the ability of our researchers to harness the expanding possibilities for discovery and innovation in a data-driven, supercomputer-powered, web-enabled, globally interconnected world,” said Rensselaer President Shirley Ann Jackson. “Educated in this context, with new approaches and analytical capabilities, our students—the next generation of discoverers, innovators, and entrepreneurs—will be better equipped to truly change the world.”
IBM Narrows the Big Data Skills Gap
Through the company’s Academic Initiative, IBM has launched new curricula focused on big data and analytics, adding nine new academic collaborations to its more than 1,000 partnerships with universities across the globe. IBM has partnered with Georgetown University, George Washington University, Rensselaer Polytechnic Institute and the University of Missouri, and has expanded its existing partnership with Northwestern University. IBM cites statistics from the U.S. Bureau of Labor Statistics predicting a 24 percent increase in demand for professionals with data analytics skills over the next eight years.
“Leaders in business, education and government must take action to foster a new generation of talent with the technical expertise and unique ideas to make the most of this tsunami of Big Data,” said Richard Rodts, Manager of Global Academic Programs, IBM. “To narrow this skills gap, IBM is committed to partnering with universities around the world to provide students with Big Data and analytics curriculum to make an impact in today’s data-driven marketplace.”
This past spring the University of Wisconsin–Milwaukee began a fully online program to deliver a graduate certificate in Business Analytics. After completing a gateway course in Analytic Models for Managers, students choose from remaining courses in business forecasting, web mining and analytics, marketing analytics, database marketing, or business intelligence technologies and solutions. Additionally, they will complete group data projects with hands-on exposure to software tools such as SAS, IBM SPSS and Python.
Numerous other universities have seen the benefit of offering degree programs in big data and analytics. DataInformed maintains a map of the various programs across the United States. Swami Chandrasekaran built a Metromap visualization of the data scientist curriculum – covering statistics, programming, machine learning, natural language processing, data visualization, big data, data ingestion, data munging, and the data science toolbox.
The Market for Data Scientists
Kaggle is a platform for data prediction competitions and a community of data scientists who meet and compete with each other to solve complex data science problems. Its clearinghouse of big data competitions matches big data challenges from big-name companies with a community of over 100,000 data scientists. For instance, GE’s Flight Quest challenge asks data scientists to use provided data to develop a usable and scalable algorithm that delivers a real-time flight profile to the pilot, helping pilots make flights more efficient and reliably on time. GE will hand out awards totaling $250,000 for the project.
Kaggle’s Chief Scientist Jeremy Howard told Fast Company that the predominant attribute of the data scientists competing in its challenges is not a PhD but creativity, with many sharpening their skills through resources such as Coursera, an online education site that partners with top universities around the world. These DIY data scientists are ranked according to the competitions they have won, with prizes ranging all the way up to $3 million.
As with any other job in technology, a key trait the data scientist must have is the ability to adapt to the ever-changing landscape of technology, tools and trends in the industry. No matter which angle the data scientist approaches from (business, technical, creative), there is no doubt that demand is present for the analytical skillset of those willing to take on big data challenges.
12:29p
Enterprise-Class Support for Converged Environments
Doug Schmitt serves as Vice President and General Manager of Dell’s Global Support & Deployment line of business. In this role, he leads an organization of over 40,000 direct and indirect team members delivering customer support, field deployment, operations and engineering readiness and capabilities in over 100 countries.
DOUG SCHMITT, Dell
Slow, inefficient and error-prone are often the words that businesses use to describe IT. That isn’t surprising, considering that recent studies have found that just 29 percent of business users consider IT to be distributed, agile and flexible [1]; that 75 percent of downtime is caused by human error [2]; and that an overwhelming share of IT budgets (72 percent) goes toward ongoing maintenance instead of innovation to drive positive business impact [3].
CIOs face a wide range of issues in trying to support their environments, including: meeting demands for mobility (with Corporate-Owned, Choose-Your-Own, and Bring-Your-Own Device [BYOD] management strategies to consider); implementing new trends; managing individually supported products; keeping compatibility aligned among connected components (such as ensuring firmware levels are compatible when doing updates); support agreements that expire at different intervals; knowing the right number to call for support of different types and versions of equipment; and running a mixture of legacy equipment and software. They do not typically have the budget, time, or expertise to solve all the issues that might arise in their environment.
The bad news – IT environments are only going to get more complex. And with all of the IT variation, it gets even worse as data continues to grow (25 percent by 2015) and mobile devices continue to multiply.
The good news – converged solutions can help CIOs rapidly deliver IT services, maximize data center efficiency, and strengthen IT service quality. In fact, Gartner estimates, “By 2015, one-third of all servers will ship as managed resources integrated in a converged infrastructure.” [4]
Let’s dive into the basics of converged solutions – from ‘why’ to planning to support services.
Planning for Transformation
You should keep in mind the following principles as you transform your IT environment.
- Open and Standard: Make sure you have an open and standard architecture that is flexible enough to handle both current and future demands.
- Intuitive: Look for reference architectures and pre-integrated systems that will help you deploy and manage solutions faster and more efficiently, so that IT can focus on more strategic projects.
- Automation: Automation is key to shifting from “keeping the lights on” to innovation. Without it, IT wastes valuable time focusing on repetitive and time consuming tasks. Look for a management approach that offers automation without compromising on quality and flexibility.
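As a small illustration of the kind of repetitive, error-prone check that automation can absorb – the firmware-compatibility verification mentioned earlier – here is a hedged Python sketch. The component names, version strings and "validated minimum" matrix are all hypothetical; a real tool would pull them from vendor inventory and support systems.

```python
# Hypothetical inventory: component name -> installed firmware version.
installed = {
    "storage-array": "2.1.4",
    "blade-chassis": "1.8.0",
    "top-of-rack-switch": "3.0.2",
}

# Minimum versions the converged stack has been validated against
# (illustrative values, not a real compatibility matrix).
required = {
    "storage-array": "2.1.0",
    "blade-chassis": "1.9.0",
    "top-of-rack-switch": "3.0.0",
}

def version_tuple(v):
    """Convert '2.1.4' into (2, 1, 4) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def find_outdated(installed, required):
    """Return components whose firmware is below the validated minimum."""
    return sorted(
        name for name, minimum in required.items()
        if version_tuple(installed[name]) < version_tuple(minimum)
    )

print(find_outdated(installed, required))  # components needing updates
```

Run nightly, a check like this turns a manual, per-device chore into a report IT can act on – exactly the shift from "keeping the lights on" to higher-value work that the principle above describes.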
- End-to-End: Look for end-to-end solutions and services paired with enterprise-class support that is not only easy to buy, deploy, and manage but that ensures the highest quality customer experience. Make sure that the vendor you choose is a trusted business partner that can not only help you implement, manage and monitor the technology but that can identify areas of opportunity.
Enterprise-Class Support
So, you have your converged solution – but how are you going to support it? Much like planning ahead for problems when buying insurance for a new house, you need to plan for the future needs of your converged solution and your business. Converged infrastructure requires a new level of enterprise-class support and expertise.
You should look for a seamless and comprehensive support solution that incorporates the four principles listed above. IDC recommends that businesses consider vendors with state-of-the-art offerings, deep domain expertise and the tools and automation to help address day-to-day operational issues.
What are the questions to ask? See the next page.
Notes:
1. InformationWeek, Oct. 2012
2. Advisory Board Q&A, Jun. 2011
3. Forrester Research, Apr. 2013
4. Gartner, “Is the Concept of the ‘Server’ Obsolete, or in Need of Redefining?” 29 March 2012
1:30p
Carter Validus, Server Farm Realty Team on NJ Acquisition
Carter Validus Mission Critical REIT and Server Farm Realty have partnered in the acquisition of a fully leased data center in Leonia, New Jersey. The 67,000 square foot facility is 100 percent leased on a long-term basis to Infocrossing, a subsidiary of Wipro.
Server Farm Realty acquired the property from Cole Realty Group and then sold it to Carter Validus for a final sale price of $14.76 million, according to Cushman & Wakefield, which represented Server Farm Realty in the transaction.
“We’re pleased to add an asset of this quality in an important data center market like New York City and continue to add to our portfolio of mission critical assets,” said John Carter, CEO of Carter Validus Mission Critical REIT.
The two-story building includes first-floor data center space with more than 25,000 square feet of raised floor white space, and office space on the second floor.
“This facility is on a great power grid, has great proximity to Manhattan and a good parking ratio,” said Sean Brady, senior director and co-founder of Cushman & Wakefield’s Global Data Center Advisory Group. “All of those reasons made the property a very attractive investment opportunity.”
2:05p
LSI Launches Cache-Coherent 16-core SMP ARM Processor
LSI launches a cache-coherent ARM processor, Marvell introduces its Prestera DX4200 series packet processors for communications, and QuantaBits launches new optical networking equipment for ISPs and telecommunications providers.
LSI launches SMP ARM processor. LSI announced it has started delivering the Axxia 5500 communication processor family to key OEM customers. Featuring 16 ARM cores, the new chip is designed to meet the performance, integration, cost and power demands of mobile and fixed networks. The new chip provides increased network performance through a unique combination of power-efficient ARM Cortex-A15 processors and CoreLink CCN-504 Cache Coherent Network interconnect. It has flexible connectivity provided through the 16 10GbE interfaces, which allows the AXM5500 product to support a variety of network configurations with a single SoC. “The delivery of Axxia 5500 means that our customers are now building Axxia processor performance and efficiency into systems that will represent over half of the mobile base station market,” said Jim Anderson, senior vice president and general manager, Networking Solutions Group, LSI. “The versatile Axxia processor family is also being applied in many other applications, such as datacenters and enterprise networking, and we are excited to be shipping this powerful new processor.”
Marvell introduces DX4200 for service delivery solutions. Marvell (MRVL) announced the Prestera DX4200 series of packet processors that enable highly differentiated service-delivery solutions in the access and aggregation layers for a new generation of converged fixed and mobile networks. The DX4200 family is designed to accelerate service provisioning and improve the deployment and management of these networks while maximizing service and application monetization. The 28nm System on Chip design delivers innovative integration of multi-core ARM CPUs, a carrier grade traffic manager and a flexible IPv6 packet processing pipeline enabling dynamic software defined networking and advanced service virtualization. “As demand for higher service density per watt increases, Marvell is uniquely positioned to offer platforms for the software defined storage, networking, mobile and compute clouds being designed today,” said Ramesh Sivakolundu, vice president for the Connectivity, Services and Infrastructure Business Unit (CSIBU) at Marvell Semiconductor, Inc. “We believe the Prestera DX provides the best platform for services-driven mobile backhaul and carrier Ethernet infrastructures along with application-driven secure access and aggregation layers in datacenter and campus networks.”
New optical equipment from QuantaBits. QuantaBits announced a family of products that includes a 10Gb GEPON OLT and a 10Gb ONU, a solution for Internet service providers and telecommunications providers. The new family offers an improved split rate (64 ONUs per PON port), ensuring a quick and efficient way to deploy networking equipment. The products also consume up to 70 percent less power and can transmit data up to 50 miles. “Our network equipment allows for higher utilization and a quicker return on investment. The availability of this technology is sure to have a disruptive impact on the communications industry,” said Dan Horan, CEO of QuantaBits Inc.
2:25p
CyrusOne Keeps Growing Houston Campus
An artist’s conception of CyrusOne’s new data center on its Houston West campus. (Image: CyrusOne)
CyrusOne has another major expansion underway at its Houston campus. A third data center is being built on the 45-acre parcel of land located along Beltway 8 in Houston’s energy corridor. The company is building on its strong market position with the oil and gas industry, and its continual expansion indicates that this position grows stronger by the day. The company anticipates breaking ground on the expansion early next year.
The company acquired 32 acres adjacent to its Houston West facility last April, and the 45-acre total campus is growing like wildfire. Upon completion, the Houston campus will have total power capacity approaching 100 megawatts, more than 1 million square feet of data center space and 200,000 square feet of Class A office space.
“CyrusOne’s Houston West campus is well known as the largest data center campus for seismic exploration computing in the oil and gas industry,” explained Kevin Timmons, chief technology officer at CyrusOne. “By continuing to apply our Massively Modular design/build approach and high-density compute expertise, the new facility will allow us to serve the growing number of oil and gas customers who are demanding best-in-class mission-critical infrastructure.
“The 200,000 square feet of Class A office building will enable us to expand the ecosystem for facilitating research and development of geophysical exploration data by providing office space for employees of the world’s leading oil and gas companies as well as academicians from the leading universities that are all involved in conducting petrochemical analytical research,” said Timmons.
Focus on Seismic Energy Exploration
This is the largest seismic exploration computing campus in the U.S. The company is building on its strong market position with the oil and gas industry, as CyrusOne does business with nearly all super-major and major oil and gas firms worldwide. The campus is attracting significant research and development investment to the Houston area as more companies and countries look to expand their exploration expertise, notably in the newer hydraulic fracturing (“fracking”) procedures.
The company also developed a high-performance computing (HPC) cloud solution specifically for the oil and gas industry. It’s been a whirlwind 2013 thus far for CyrusOne. There was a sizeable Houston land grab in April, and this new facility shows that the campus is growing gangbusters.
Also this year, the data center service provider completed its IPO and reported record sales and leasing in its first earnings report as a public company. The company, which was spun off from parent Cincinnati Bell, is looking at new markets and could expand through acquisition, according to comments from executives earlier this year.
Success in Texas
The company is also thriving elsewhere in Texas. In San Antonio it has leased all of the space in an existing facility to a single customer and is building out another 200,000 square foot data center there, which is also expected to sell out. It’s seeing massive growth in Dallas as well. Texans like to build big, and perhaps no company knows the Texas market better than CyrusOne.
The company’s ‘massively modular’ approach to building data centers means it can cost-effectively deliver space in record time, and the demand has it continuously building. Just look at the juggernaut that is Houston, which is about to cross the 100 megawatt mark. Business is big in Texas.
Facilities are built with the highest power redundancy (2N architecture). Customers also have access to the CyrusOne National Internet Exchange (National IX), which marries low-cost, robust connectivity with the massive-scale data centers the company is known for, linking a dozen CyrusOne facilities in five metropolitan markets (Dallas, Houston, Austin, San Antonio and, outside of Texas, Phoenix).
CyrusOne customers can go beyond the benefits of a single data center and architect a data center platform for their production or disaster recovery solution using multiple, connected data centers. This makes it possible for Fortune 500 companies to build data center platforms that meet the disaster recovery requirements of Sarbanes-Oxley, HIPAA, PCI, Nasdaq, NYSE and many other regulatory frameworks, because they can keep their data protected and accessible across multiple sites located throughout the country. This represents a true shift in the way companies manage the creation, access and sharing of their data.
Moreover, the CyrusOne National IX provides customers with the opportunity to drive additional revenue by expanding their immediately addressable market, to reduce expenses with a lower-cost option for point-to-point connectivity between data centers, and to improve service quality through higher levels of resiliency.
CyrusOne operates 25 carrier-neutral data center facilities across the United States, Europe, and Asia that give customers the flexibility and scale to match their specific growth needs. The company is renowned for exceptional service and for building enduring customer relationships and high customer satisfaction levels. Customers include nine of the global Fortune 20 companies and more than 100 of the Fortune 1000.
2:30p
Hybrid is Here – The Convergence of Data Center, Hosting and Cloud
Technological progression has introduced new types of solutions, platforms and ways to conduct business. Through these advancements, organizations have been able to leverage modern technologies capable of scalability and advanced connectivity. Still, many organizations feel that if they want to go global or move into the data center, they have to utilize cloud computing.
Remember, the data center is much more than just a cloud hub. For some organizations, going “all in” on cloud computing is just not necessary. This is why it’s important to understand the differences between:
- Cloud computing
- Managed hosting services
- Colocation services
- The hybrid option of all three
In working with the right type of hybrid infrastructure provider, your organization can leverage the most benefit from each of the above IT platforms. This white paper will examine your existing environment, outline the solutions typically offered by a modern Infrastructure-as-a-Service provider, and help you decide where your organization can benefit the most.
In creating your optimal infrastructure, there are core questions and considerations that need to be addressed:
- What types of workloads are being delivered?
- How scalable are you?
- What are your growth patterns — 6, 12, 18 months out?
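The growth-pattern question can be made concrete with a simple back-of-the-envelope projection across those 6-, 12- and 18-month horizons. The starting footprint and monthly growth rate below are purely illustrative assumptions, not figures from any provider.

```python
# Back-of-the-envelope capacity projection for the 6/12/18-month horizons.
# Starting footprint and monthly growth rate are illustrative assumptions.
current_racks = 40          # racks in use today (hypothetical)
monthly_growth = 0.03       # 3% compound growth per month (hypothetical)

def projected_racks(current, rate, months):
    """Compound the current footprint forward by the given number of months."""
    return current * (1 + rate) ** months

for horizon in (6, 12, 18):
    needed = projected_racks(current_racks, monthly_growth, horizon)
    print(f"{horizon:2d} months: ~{needed:.0f} racks")
```

Even a rough projection like this helps frame the colocation-versus-cloud decision: if the 18-month number far exceeds what a fixed footprint can hold, an elastic or hybrid option starts to look more attractive.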
Remember, the data center has become a key component for any organization. Almost all new technologies are being deployed within the modern data center, including big data, IT consumerization and, of course, cloud computing. The digitization of the business unit has created a greater dependency on the data center. This is why more organizations are moving toward a colocation or managed-services data center option, which can offer:
- Easier management
- Higher levels of uptime
- Greater flexibility and agility
- Scale on demand
- Better communication
For some companies and specific workloads, cloud computing can be the right model. But in other situations, a managed hosting or data center services deployment can provide even more value. And for many businesses, a hybrid mix is the best fit. Download this white paper today to see how, in this world of ever-evolving technological capabilities, it’s more important than ever to know your environment and understand which services are most relevant.