Data Center Knowledge | News and analysis for the data center industry
Tuesday, February 26th, 2013
12:30p
Data Center Jobs: Silver Linings Systems At the Data Center Jobs Board, we have a new job listing from Silver Linings Systems, which is seeking a Business Development Manager in Kenosha, Wisconsin.
The Business Development Manager is responsible for developing and executing a tactical sales plan that will start moving Silver Linings into the number one position among modular data center providers. To view full details and apply, see job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.
1:38p
Creating An Effective Data Warehouse Strategy Alan McMahon works for Dell. He has worked for Dell for the last 13 years in enterprise solution design, across a range of products from servers and storage to virtualization, and is based in Ireland.
ALAN McMAHON, Dell
Every company has a stockpile of data – loads and loads of data. That data may not be very useful, however, if you are unable to access it without dedicating copious amounts of time and effort to the endeavor. That’s where an effective data warehouse strategy comes in.
Contrary to what some companies may still believe, effective data warehouse solutions do not have to be costly. Nor do they have to be complex or limited to a single size and scope.
What No Longer Works
In the so-called olden days, which in the high-tech world can be as recent as last year, data warehousing was attempted using two fairly common methods.
One was relying on external resources to cobble together a system as the company went along. Such systems could contain any number and type of servers, storage arrays and software. Companies hoped that, when combined, such a collection would work as an effective data warehousing solution, although that has become less and less likely to be the case. Disparate units thrown together create an increasingly complex system that is difficult to monitor, track or manage effectively.
The do-it-yourself approach also runs into trouble at companies with limited internal IT resources to dedicate to creating and managing an effective warehousing system. The IT team may not be large enough, or have the availability, to focus on implementing or managing a sprawling warehousing system.
Another old-school method of data warehousing was buying a system based on proprietary technology. While this type of system may offer the capabilities and technology to meet the needs of many businesses, the cost was typically high. The initial outlay was substantial, as were the ongoing contracts required to keep the systems optimized and maintained. Reaching for proprietary systems could also result in over-provisioning for small and medium businesses, which did not necessarily need such an extensive system but were forced to pay for it anyway, believing it was the only available option.
The drawbacks of former data warehousing methods include high cost, low efficiency, and the simple inability to make any useful sense of the data being stored.
What You Can Do with a System That is Much More Effective
Instead of having vast amounts of unorganized and inaccessible data, an effective data warehouse strategy lets you access the data easily and rapidly for a number of uses. Reviewing various types of data allows you to track past and current trends while predicting future trends and issues – resulting in meaningful business intelligence reports.
Vast amounts of data stored in an inefficient manner can result in drastically reduced system performance. As data volume increases, so can the amount of time it takes for data to load for even the simplest routine operations. Throw in a few queries in an attempt to locate a specific item, and the system can lag even further as it attempts to sift through or process existing data. These time lags not only affect employees’ productivity, but they can also affect the company as a whole if downtime or bottlenecked traffic results.
Extensive and ever-expanding data collections are a major challenge for today’s businesses. Internal and external sources are constantly adding more data to the mix in a variety of formats and complexity levels. Duplicate and redundant data are neither uncommon nor of any practical use.
Online Analytical Processing, or OLAP, can be a very handy approach for mining data from different databases, but it places extreme workload and pressure on a system that may not be designed to handle anything so complex or large.
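As a concrete, if toy, illustration of the kind of work an OLAP query performs, here is a minimal sketch in Python using pandas. The fact table, column names and numbers are invented for the example and are not tied to any particular product; on a warehouse-scale table this style of scan-and-aggregate is exactly the workload that strains a system not built for it.

```python
import numpy as np
import pandas as pd

# Hypothetical fact table: one row per sale, with the dimensions an
# OLAP query would typically slice on (region, product, quarter).
rng = np.random.default_rng(0)
n = 100_000
sales = pd.DataFrame({
    "region":  rng.choice(["NA", "EMEA", "APAC"], size=n),
    "product": rng.choice(["server", "storage", "software"], size=n),
    "quarter": rng.choice(["Q1", "Q2", "Q3", "Q4"], size=n),
    "revenue": rng.gamma(2.0, 500.0, size=n),
})

# An OLAP-style "cube": total revenue for every combination of the three
# dimensions. The full table is scanned and grouped in one pass.
cube = (sales
        .groupby(["region", "product", "quarter"])["revenue"]
        .sum()
        .unstack("quarter"))
print(cube.round(0))
```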
Effective data warehousing can also eliminate archaic data storage systems that have long outlived any useful purpose or free up other devices that are too stocked with data to perform additional functions.
What to Look for in an Efficient Data Warehousing System
Capacity and performance are the two big factors to review when choosing a data warehouse strategy. The framework should be capable of supporting and balancing the hardware and software that comprise the system, and it should offer features that are vital to today’s enterprises. These include:
- Ability to handle extensive sequential scans
- Capabilities compatible with OLAP systems
- Configurations that implement next-generation servers and storage arrays
- Rapid installation with minimal impact on daily operations and operational systems
- Scalability to meet business needs without over-provisioning
- Ability to increase scale for business growth with cost-effective additions down the road
- Cost-effectiveness to fit a variety of price points and budgets
- Available upgrades and updates as technologies advance
Size Matters
The ability to choose the right size is a must, to keep processing speeds high and costs low. Small, medium and large data warehousing options should be available to meet the specific needs of your business. Small and medium businesses, for instance, may do well with a 5 terabyte (TB) platform consisting of a single server with internal storage. Slightly larger businesses may be able to create an effective strategy using a 10 TB platform with a larger server and storage array. The largest enterprises, by contrast, may want nothing less than a 20 TB platform based on a large server and Fibre Channel storage array that can handle the massive loads.
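As a rough illustration of the sizing logic described above, the sketch below maps a projected data volume onto the three platform tiers mentioned in this article. The growth assumptions, thresholds and selection rule are illustrative only, not Dell sizing guidance.

```python
# Illustrative tier selection based on the 5/10/20 TB platform sizes
# discussed above. Real sizing would also weigh query concurrency,
# growth rate and budget; these cut-offs are assumptions for the example.
TIERS = [
    (5,  "single server with internal storage"),
    (10, "larger server and storage array"),
    (20, "large server and Fibre Channel storage array"),
]

def recommend_tier(projected_tb: float, annual_growth: float = 0.25,
                   horizon_years: int = 3) -> str:
    """Pick the smallest tier that covers projected growth, to avoid
    over-provisioning while leaving room to scale."""
    needed = projected_tb * (1 + annual_growth) ** horizon_years
    for capacity_tb, description in TIERS:
        if needed <= capacity_tb:
            return f"{capacity_tb} TB platform ({description})"
    return "larger than 20 TB: consider a scale-out configuration"

print(recommend_tier(2))   # ~2 TB today fits the 5 TB platform
print(recommend_tier(9))   # ~9 TB today, growth pushes it to the 20 TB tier
```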
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
2:00p
Video: Jay Adelson on the Early Days of Equinix “It was a crazy time.” That comment sums up the experience of Jay Adelson, co-founder of Equinix, describing the effort to build the first major colocation provider. Adelson, known to many for his role in creating the news portal Digg, founded Equinix in 1998 with the late Al Avery. Adelson and Avery had worked at Digital Equipment Corp., and saw the need for carrier-neutral facilities where companies could interconnect their networks. The attractiveness of that business model, along with its early-mover advantage, made Equinix a key player in the data center sector. In this video from the Equinix Interconnections blog, Adelson discusses the company’s history and progress. “Equinix is the most powerful Internet infrastructure company in the world,” said Adelson. “It’s a powerful position. But you can use that position of power to make a difference, and to enable the Internet in places it’s never been before.” Adelson also cited the power of culture at Equinix, and the connections that resulted. “At the time I left, there had been 17 marriages of people who met at Equinix,” he says. This video runs
For additional background on the history of Equinix, see the Vision & History area of the company web site.
For more news about Equinix, please see our Equinix Channel. For additional video, check out our DCK video archive and the Data Center Videos channel on YouTube.
3:43p
Mobile World Congress News from the Hot Mobile Market Mobile World Congress kicked off Monday in Barcelona with a number of announcements from the red-hot mobile market. Here is conference news from Intel, IBM and Nokia Siemens Networks, and Mellanox and 6WIND. The Twitter conversation for the event can be followed via the hashtag #MWC13.
Intel launches mobile solutions.
Intel (INTC) announced several new products for the mobile market, led by a new dual-core Atom SoC platform for smartphones and Android tablets. “Today’s announcements build on Intel’s growing device portfolio across a range of mobile market segments,” said Hermann Eul, Intel vice president and co-general manager of the Mobile and Communications Group. “In less than a year’s time, we have worked closely with our customers to bring Intel-based smartphones to market in more than 20 countries around the world, and have also delivered an industry-leading low-power Atom SoC tablet solution running Windows 8, and shipping with leading OEM customers today.”
Intel announced a new Atom processor platform with 32nm dual core Z2580, Z2560 and Z2520 products, available in speeds up to 2.0GHz, 1.6GHz and 1.2GHz respectively. The platform also features support for Intel Hyper-Threading, and an Intel Graphics Media Accelerator engine. It includes advanced imaging capabilities, including support for two cameras, with a primary camera sensor up to 16 megapixels. The platform is also equipped with Intel Identity Protection Technology (Intel IPT), helping to enable strong, two-factor authentication for protecting cloud services such as remote banking, e-commerce, online gaming and social networking from unauthorized access.
“Our second-generation product delivers double the compute performance and up to three times the graphics capabilities, all while maintaining competitive low power,” Eul said. “As we transition to 22nm Atom SoCs later this year, we will take full advantage of the broad spectrum of capabilities enabled by our design, architecture, 22nm tri-gate transistor technology, and leading-edge manufacturing to further accelerate our position.”
An Intel Atom Z2420 was announced for smartphones in emerging markets, and Intel also discussed the Atom Z2760 and “Bay Trail,” its first quad-core Atom SoC. Intel also outlined a Long-Term Evolution (4G LTE) strategy built around a low-power, global modem solution that works across multiple bands, modes, regions and devices. The Intel XMM 7160 supports smartphones, tablets and Ultrabooks, with support for 15 LTE bands simultaneously.
IBM and Nokia Siemens announce Edge Computing Platform.
Nokia Siemens Networks and IBM announced a collaboration to deliver a mobile edge computing platform that can run applications directly within a mobile base station. The new platform allows mobile operators to create a truly unique mobile experience, relieve the ever-increasing strain on network infrastructure and bring completely new solutions to market. Nokia Siemens Networks’ Liquid Applications and IBM’s WebSphere Application Service Platform for Networks (ASPN) together provide an environment for operators to manage the many applications that will be deployed to the mobile edge.
“Pushing applications, processing and storage to the edge of the mobile network allows large complex problems to be distributed into many smaller and more manageable pieces and to be physically located at the source of the information it needs to work on,” said Phil Buckellew, vice president, IBM Mobile Enterprise. “This enables a huge amount of rich data to be processed in real time that would be prohibitively complex and costly to deliver on a traditional centralized cloud.”
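As a toy illustration of the edge principle Buckellew describes, the sketch below aggregates raw measurements locally, as an application co-located with a base station might, and forwards only a compact summary upstream. The data and function names are invented for the example and are not part of Liquid Applications or ASPN.

```python
from statistics import mean

# Raw per-device measurements generated at the edge of the network.
# In a real deployment this stream would be too large to backhaul to a
# central cloud in real time; the numbers here are made up.
raw_samples = [
    {"device": "handset-%04d" % i, "throughput_mbps": 3.5 + (i % 7) * 0.8}
    for i in range(10_000)
]

def summarize(samples):
    """Reduce raw measurements to a small summary record at the edge,
    so only a few fields, not the full stream, travel to the core."""
    values = [s["throughput_mbps"] for s in samples]
    return {
        "count": len(values),
        "mean_mbps": round(mean(values), 2),
        "max_mbps": max(values),
        "min_mbps": min(values),
    }

summary = summarize(raw_samples)
print(summary)   # this compact record is what gets sent upstream
```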
Mellanox and 6WIND partner to aid deployments of virtualized and non-virtualized networks.
Mellanox (MLNX) and 6WIND, a provider of data plane processing software for software-defined networks, announced a performance-optimized solution featuring the Mellanox ConnectX-3 Network Interface Controller (NIC) together with the 6WINDGate networking software. As a solution for Telecom Equipment Manufacturers (TEMs), it is combined with a Mellanox SwitchX 40GbE switch to aid deployments of virtualized and non-virtualized networks and SDN on standard high-volume servers.
“Because it solves critical network performance challenges for mobile infrastructure, 6WIND’s software has been selected by multiple TEMs who provide equipment deployed in commercial LTE networks worldwide,” said Eric Carmès, CEO of 6WIND. “The addition of Mellanox ConnectX-3 support within 6WIND’s enhanced Intel DPDK library as well as the 6WINDGate software, incorporating features such as SR-IOV and RDMA over Converged Ethernet (RoCE), enables improved CAPEX and OPEX in service provider networks through the virtualization of network functions and more efficient SDN management.”
4:15p
EMC Supercharges Hadoop At the RSA conference this week in San Francisco, EMC announced Pivotal HD, a new distribution of Apache Hadoop. Pivotal HD features native integration of EMC’s Greenplum massively parallel processing (MPP) database with Apache Hadoop.
The new EMC Greenplum-developed HAWQ technology brings ten years of large scale data management research and development to Hadoop and delivers more than 100X performance improvements when compared to existing SQL-like services on top of Hadoop. What makes Pivotal HD different is its ability to offer the full spectrum of the SQL interface and run reports without moving data between systems or using connectors that require users to store the data twice. It removes the complexity of using Hadoop, thus expanding the platform’s potential and productivity, and allowing customers to enjoy the benefits of the most cost-effective and flexible data processing platform ever developed.
Sam Grocott, vice president of marketing and product management, EMC Isilon, noted, “The introduction of Pivotal HD, combined with EMC Isilon’s native integration of the Hadoop Distributed File System (HDFS) protocol, continues the evolution of the industry’s first and only enterprise-proven Hadoop solution on a scale-out NAS architecture. This powerful combination succeeds in reducing the complexities traditionally associated with Hadoop deployments and allows enterprises to easily extract business value from unstructured data.”
Using the Greenplum MPP analytical database, Pivotal HD is a true parallel SQL database on top of the Hadoop Distributed File System (HDFS). Capabilities of note that HAWQ adds include Dynamic Pipelining, a world-class query optimizer, horizontal scaling, SQL compliance, interactive query, deep analytics, and support for common Hadoop formats. HAWQ unlocks the potential of Hadoop’s fault-tolerant storage by bringing the vast pool of “data worker” tools and languages into the Hadoop ecosystem.
“With Pivotal HD, we can check off many of the items on our Hadoop wish-list—things like plug-in support for the ecosystem of tools, improved data management and greater elasticity in terms of the storage and compute layer,” said Steven Hirsch, chief data officer, SVP Global Data Services, NYSE Euronext. “But above all, it provides true SQL query interfaces for data workers and tools—not a superficial implementation of the kind that’s so common today, but a native implementation that delivers the capability of real and true SQL processing and optimization. Having a single Hadoop infrastructure for Big Data investigation and analysis changes everything. Now add to all of this functionality the fact that the SQL performance is up to 100x faster than other offerings and you have an environment that we at NYSE Euronext are extremely excited about.”
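Because HAWQ descends from Greenplum, which is itself PostgreSQL-based, querying data that lives in HDFS can look like ordinary client-side SQL. The sketch below is a hedged illustration of that idea: the host, credentials, table and the choice of the psycopg2 driver are assumptions for the example, not details from EMC’s announcement.

```python
import psycopg2  # standard PostgreSQL driver; assumed usable against a
                 # Greenplum-derived SQL interface such as HAWQ's

# Hypothetical connection details -- substitute a real cluster endpoint.
conn = psycopg2.connect(host="hawq-master.example.com", port=5432,
                        dbname="analytics", user="analyst", password="secret")

# A plain SQL aggregation over data stored in HDFS. The point made in the
# article: no export step or connector, the query runs where the data is.
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT trade_date, symbol, SUM(volume) AS total_volume
        FROM   trades            -- hypothetical fact table stored in HDFS
        WHERE  trade_date >= %s
        GROUP  BY trade_date, symbol
        ORDER  BY total_volume DESC
        LIMIT  10;
    """, ("2013-01-01",))
    for row in cur.fetchall():
        print(row)
```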
4:30p
BTI Systems Launches SDN-Enabled Platform Canadian networking software and systems company BTI Systems announced Intelligent Cloud Connect, an open software-rich platform that combines network intelligence and application awareness with significantly expanded capacity and scale. It is an inter-data center networking solution developed specifically to meet the stringent performance, scale, economics and agility demands of the cloud.
Intelligent Cloud Connect combines the intelligence and flexibility of routing with the capacity and scale of optical transport, bolstered by the efficiencies and extensibility of applications integration. It is an SDN-enabled integrated platform that allows content and service providers to rapidly scale new services while reducing operational complexity and cost. It replaces the need to keep purchasing separate optical transport devices, switches, router ports, and network application appliances to handle and manage inter-data center traffic growth, while providing more control and flexibility to optimize services.
“Legacy WAN solutions are not designed to optimize the dramatic gains achieved in today’s highly virtualized data centers,” said BTI Systems President and CEO Steve Waszak. “Intelligent Cloud Connect enables providers to create highly programmable network fabrics for improved applications performance, operational efficiencies as well as service innovation demands. Customers can literally ‘cap and grow,’ retaining existing equipment investments, while decreasing additional cost outlays and efficiently supporting bandwidth and capacity demands. Large content providers are validating that we have effectively addressed their requirements while reducing significant operational headaches and costs.”
The new solution features a software-rich platform with open APIs that integrates a converged fabric with high-performance applications modules and 10G/100G forwarding modules. The open APIs enable content and service providers to integrate rich, high-performance network optimization applications developed by BTI and by third parties. Intelligent Cloud Connect benefits include large-scale capacity with low latency, SDN-enabled management and control, and increased differentiation, control and applications performance.
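BTI has not published API details in this announcement, so the following is purely a hypothetical sketch of how a content provider might drive an open, SDN-style REST API to request an inter-data center path; every endpoint, field and credential here is invented for illustration and is not part of the Intelligent Cloud Connect product.

```python
import json
import urllib.request

# Hypothetical controller endpoint and payload -- invented for illustration.
CONTROLLER = "https://icc-controller.example.net/api/v1/paths"

request_body = {
    "source_dc": "nyc-01",
    "dest_dc": "ash-02",
    "bandwidth_gbps": 10,
    "latency_class": "low",          # ask the fabric for a low-latency path
    "application": "storage-replication",
}

req = urllib.request.Request(
    CONTROLLER,
    data=json.dumps(request_body).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <api-token>"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:   # would fail without a real controller
    print(json.load(resp))
```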
“Data centers are becoming major confluence points for high rates of traffic growth: interconnecting data centers, and connecting data centers to access networks or peering points,” said Ovum Vice President and Practice Leader Dana Cooperson. “As data centers are becoming much more critical in support of public and private cloud services, they are becoming both larger and more widely distributed, adding pressure on the connecting networks to scale, deliver higher-performance and differentiated services, and improve economics. BTI’s Intelligent Cloud Connect, with its focus on high-availability, high-capacity, open, application-aware, and even applications-based networking, supports these goals.”
$10 Million Funding
Building on unparalleled global demand from content and service providers, BTI Systems announced $10 million in new growth capital. The Series C funding was led by Bain Capital Ventures and adds to the more than $33 million total that BTI Systems has raised since 2011. The funding will be used to scale BTI’s operations and accelerate the delivery of Intelligent Cloud Connect.
“BTI Systems continues to enjoy tremendous growth based on our ability to understand our customers’ networking challenges and opportunities – paired with our ability to consistently deliver solutions that greatly improve the way they run their networks and business models,” said BTI Systems President and CEO Steve Waszak. “We’re excited to be expanding our relationship with our investors and we’re using those funds to bring new and even more powerful innovations to content and service providers worldwide.”
6:31p
After Strong Quarter, Internap Preps Cloudy Colo  The exterior of an Internap data center. The company’s shares have gained in recent days on the strength of fourth-quarter earnings. (Photo: Internap)
Shares of Internap have surged after the company recorded a strong quarter, indicating it is striking the right chords with its diverse portfolio of colocation, managed services and cloud. The company has also been touting what it calls “Cloudy Colo,” a true hybrid solution available through a single portal. The strong finish to the year and the cloudy colo concept are signs that Internap has found its identity.
In two trading sessions since the release of its fourth quarter earnings, shares of Internap have gained 11.5 percent, rising from $7.91 at Thursday’s close to $8.81 at Monday’s close. The fourth quarter saw the highest quarterly revenue, segment profit and adjusted EBITDA in the company’s history.
Revenue for 2012 was $273.6 million, with fourth quarter revenue of $69.7 million. Internap’s revenue was up 2 percent from the previous quarter. The growth was attributed to its data center services segment, which includes Voxel, the hosting company Internap acquired in 2011. Data center services revenue hit $43.7 million, up 24 percent compared to the same period last year, and up 4 percent from the third quarter. For the full year, data center services generated revenue of $167.3 million, up 25 percent over the prior year. IP services revenue was flat, but slightly down year over year. Customer churn was down. The company counted 3,700 customers as of December 31, 2012.
Strong Finish to 2012
“We are pleased with the strong finish to 2012,” said Eric Cooney, President and Chief Executive Officer of Internap. “The continued execution of our growth strategy is reflected in full year revenue and Adjusted EBITDA growth of 12 percent and 20 percent, respectively. Successful integration of the Voxel business and focus on our organic colocation, hosting and cloud infrastructure businesses have delivered full-year growth in data center services revenue of 25 percent.
“As we look forward to 2013, the priority is simple – focus on continued execution of the strategy to deliver a platform of high-performance, hybridized IT Infrastructure services,” Cooney continued. “We remain confident that the opportunity for long-term profitable growth and stockholder value creation is significant in the market for outsourced IT Infrastructure services.”
Internap has shifted its focus over the years. The company was founded in 1996 on its expertise in IP services and route optimization. It later added colocation and content delivery services, but has had its stumbles, most notably the 2006 acquisition of VitalStream, which led to a $99 million write-off amid customer support problems. Cooney became CEO in 2009, and immediately focused on the company’s colocation business. Because Internap was realizing higher margins on its company-owned data centers, it began phasing out its use of third-party space and building its own data center space. The company rolled out 26,000 square feet of company-controlled data center space in 2012.
Sneak Peek: Cloudy Colo
The company is working on what it informally calls “Cloudy Colo.” It is an extension of its core data center OS platform, with some customers using the beta version.
“Our whole goal is to redefine what the limitations around colo are,” said Raj Dutt, Senior VP of Technology at Internap and former CEO of Voxel. “We’re going to start giving visibility and control into the obvious things that people don’t get from colo – reboot, bandwidth, inventory management, asset management, the ability to hybridize managed cloud in the same portal.
“Through software – DCIM-like software – customers can focus on stuff in the rack rather than outside of the rack,” said Dutt. “DCIM has barely started in terms of inside the rack. The roles of machines are changing, and DCIM falls short on this. This is where it starts to get interesting.”
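Internap had not released a “Cloudy Colo” product at the time of writing, so the sketch below only illustrates the kind of single-portal control Dutt describes: inventory and bandwidth visibility plus remote reboot for gear in a colo rack. The API base URL, paths and fields are hypothetical, invented for this example.

```python
import json
import urllib.request

BASE = "https://portal.example-colo.net/api"   # hypothetical portal endpoint
TOKEN = "Bearer <api-token>"

def api_get(path):
    """Fetch a JSON resource from the (hypothetical) unified colo/cloud portal."""
    req = urllib.request.Request(BASE + path, headers={"Authorization": TOKEN})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def api_post(path, body):
    """Send a control action, e.g. power-cycling a device in a colo rack."""
    req = urllib.request.Request(
        BASE + path,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": TOKEN, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Inventory and bandwidth visibility "inside the rack", then a remote reboot --
# the colo operations the article says are normally invisible to customers.
inventory = api_get("/racks/r-1042/devices")
usage = api_get("/racks/r-1042/bandwidth?window=24h")
api_post("/devices/sw-07/power", {"action": "reboot"})
```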
Offering up a variety of services through one portal, from colo to cloud, as well as giving DCIM-like insight into total infrastructure, will aid Internap in cross-selling its services.
“This makes colo great for colo customers,” said Dutt. “It also makes colo within reach for cloud guys. As cloud customers need colo, it’s an easier way to go about that. From an infrastructure standpoint, we don’t think the cloud is the be-all and end-all,” he said. “If you’re deploying any application, the best solution is a hybrid situation.”
Dutt noted that Internap offers everything from colocation to dedicated servers to cloud. “Very few people are offering all of these product sets as one infrastructure fabric,” he noted.
Dutt believes the economics of cloud are often misinterpreted, and cloud is not always the most cost-effective approach for the customer. “I’d rather sell 100 racks of cloud than 100 racks of colo any day,” said Dutt, stating that the margins for providers are simply better for cloud within the same footprint.
Dutt attributes Internap’s success with its diverse portfolio to one thing. “The market certainly got more educated,” he said. “More and more people are treating infrastructure as a competitive weapon more than a cost center.”
It’s still early, so there’s no formal “cloudy colo” product yet. The company is evaluating different models. However, all indications are that Internap is working on giving customers deeper control and analytics across the portfolio, from colo to cloud, with deeper DCIM-like functionality. A formal announcement is most likely coming within the next quarter.