Data Center Knowledge | News and analysis for the data center industry
 

Monday, April 28th, 2014

    11:00a
    IBM Launches Cloud Marketplace

    IBM is launching a Cloud Marketplace, making much of its software and services portfolio available with one click of a mouse and the swipe of a credit card. It’s a big way to lower the barrier to entry for customers, as well as to educate them about what’s available. It also greatly helps IBM’s BlueMix cloud platform, as it gives developers building cloud apps in BlueMix a marketplace to ultimately sell their wares.

    The addition of a cloud marketplace adds momentum to IBM’s aggressive push to assemble a comprehensive cloud portfolio.

    The marketplace was created to help enterprises ease the transition to cloud and speed the adoption of “hybrid” clouds. It will initially offer hundreds of services, ranging from Big Data analytics to security to mobile, from cloud app and solution providers such as Zend, SendGrid, MongoDB, New Relic, Redis Labs, Sonian, Flow Search Corp, Twilio, Ustream and many more.

    IBM is making serious investments in cloud computing. It is spending more than $1.2 billion on a global data center expansion for SoftLayer, while another $1 billion is going toward its BlueMix platform-as-a-service. Meanwhile, the company has spent over $7 billion on 17 acquisitions, including recent purchases of Aspera, Cloudant and Silverpop.

    While IBM missed its revenue expectations last quarter, it is in a period of transition toward becoming a cloud powerhouse. The revenue numbers for cloud show that this transition is working, with 50 percent cloud revenue growth in the first quarter of 2014, following a year in which its cloud revenue clocked in at an all-time high of $4.4 billion. The question is whether the cloud progress will offset struggles in other areas of IBM’s business.

    Accelerating BlueMix Push

    IBM is also accelerating its expansion of the BlueMix open cloud platform, which has already won a nice list of customers who have built cloud applications in the first 60 days of its open beta. The new cloud marketplace is strategically important to BlueMix, which offers an entry point for developers to build and test applications in a sandbox-like environment before potentially offering them for sale in the IBM Cloud marketplace. The marketplace gives these developers a place to sell what they build, and it will start with a healthy audience on day one.

    IBM is launching more than 30 new services in BlueMix around growth areas such as Big Data & Analytics, Cloud Integration, DevOps and Internet of Things.

    IBM is also working with partners such as Galvanize to provide a collaborative physical space for startups and developers to work together, build and test applications in an accelerated environment using agile development methods and best practices. IBM’s first BlueMix Garage Lab will be located in San Francisco at Galvanize where BlueMix members will work side by side with IBM consultants and partners such as Pivotal Labs to iterate quickly on new innovations.

    11:45a
    HP Updates CloudSystem, Integrity Servers

    Responding to demand for continuous application availability driven by cloud, mobility and big data trends, HP (HPQ) announced enhancements to the HP-UX operating environment and HP CloudSystem Matrix with HP-UX, as well as new HP Integrity NonStop entry-class servers. As a continuation of Project Odyssey, a project to redefine the future of mission-critical computing, HP is expanding its offerings for enterprises with new enhancements that deliver industry-leading availability and increased performance for mission-critical workloads.

    With the latest 11i v3 release of HP-UX, customers can perform zero-downtime platform upgrades of virtualized applications between HP Integrity i2 and HP Integrity i4 servers. The release also enables larger virtualized workloads, with double the virtual machine capacity of previous versions, now up to 32 processor cores and 256 GB of memory.

    With enhancements to HP CloudSystem Matrix with HP-UX, customers can improve efficiency and service level agreement (SLA) performance by leveraging larger cloud deployments with workloads running in vPars (virtual partitions) or with direct I/O networking. The cost of entry for HP-UX private cloud deployments is lower with new two-socket license bundles, and identifying and eliminating inefficient server processes helps increase overall data center processing capacity.

    “Given the increasing demands of mission critical applications, performance and uptime are critical for organizations of all sizes,” said Shannon Poulin, vice president and general manager, Enterprise Solutions Group, Intel. “HP’s new servers allow us to bring the proven capabilities of our latest Itanium processors to a full range of customers, as we continue to drive innovation and performance through the portfolio.”

    HP is delivering NonStop server technology to the small and midsize enterprise market with the new HP Integrity NonStop NS2300 and NS2400 servers. These fully fault-tolerant servers offer the lowest total cost of ownership of any server in their class.

    HP Integrity NonStop NS2300/NS2400 commercial systems and the telecommunications-specific configurations of the DC-powered NS2400T/NS2400ST are available worldwide immediately.

    12:30p
    Hybrid Cloud Computing for the Modern Economy

    Robert Jenkins is the chief executive officer of CloudSigma.

    Since the dawn of cloud computing, its appeal to businesses could be summed up in three words: scalability, availability and accessibility. But, cloud computing has been around for a while now, and like any other technology, we inevitably reach the point where people start asking what’s next. We live in an age where paradigm-shifting innovations are rolled out more often than at any other point in history. Consequently, businesses never stop looking for the next breakthrough that will launch the next generation of their product or service.

    Scalability, availability and accessibility will always be the pillars upon which the cloud was built, but where is the next cost-efficiency upgrade coming from? The answer is not something altogether new, but rather an innovative way of bringing two existing cloud technologies (public and private) together to form the next great innovation: a hybrid cloud operated as one environment using private patching technology.

    The hybrid cloud enables companies to develop public and private infrastructure strategies in concert, rather than in isolation. Private patching connects these separate cloud environments without the need for expensive public IP lines, offering a host of benefits, not the least of which are better data security and improved cost efficiencies. A hybrid cloud can combine the benefits of private and public clouds.

    When companies consider purchasing public cloud resources today, they often don’t simultaneously reconsider their existing and future private infrastructure purchasing plans. Those that do find that aligning private infrastructure purchasing and location plans with public cloud procurement unlocks hidden value, both in business agility and in cost savings in the neighborhood of 20 to 40 percent of their total previous costs.

    Making the Connection

    By eliminating the need for public IP lines, companies are able to connect their own private infrastructure directly to their private networking within their public cloud deployment and offer a totally private, IP-only solution at full-line speed with low latency.

    If both environments are hosted in the same data center, the latency will be so low that a company can run computing agnostically between its private and public infrastructures. That’s a fancy way of saying that virtual machines can run and access computing seamlessly between the two environments; a very efficient way of cloud bursting.

    If, on the other hand, the private and public infrastructures of a hybrid solution are in separate data centers, there will still be a significant improvement in latency over the public internet. Any and all data will be seamlessly and securely transferred between both environments, even in the case of backups if the company experiences unexpected downtime. It is this level of security that has been sorely lacking from traditional public cloud options, and has held at bay the countless businesses that would otherwise jump at the chance to enjoy the numerous benefits of public cloud environments.

    Both of these options offer a better alternative to maintaining completely separate public and private clouds and then investing in costly and less secure public IP lines. Companies stand to gain the most from housing their public and private clouds in the same data center, or at least in the same immediate area, as this will drastically reduce, or in some instances eliminate, costly latency. This direct proximity is what makes private patching possible, allowing companies to connect these separate cloud environments without public IP lines.

    In either scenario, the economics of hybrid cloud connections are pretty compelling. Companies pay a flat rate for the external private IP line or as low as $300 for a cross-connect within the same data center instead of burning up thousands of dollars in data transfer fees.

    Networking-as-a-Service through Hybrid Cloud

    Most private cloud deployments require external connectivity to serve any public-facing services and, usually, in a redundant fashion. Recent technology innovations around software-defined networking have made it possible for companies adopting a harmonized private-public colocated strategy to also transform their public connectivity costs. Instead of buying expensive and under-utilized public IP lines from carriers, companies can run a patch for their public IP connectivity to the public cloud and pay a simple per-GB cost while enjoying the benefits of the redundant 10Gbps speeds offered by most public cloud providers, not to mention services like DDoS protection.

    All of this typically costs about $300 for the cross-connect, plus $0.05 per GB transferred, compared to the several thousands of dollars a dedicated public IP line would cost. It is an order of magnitude improvement in efficiency compared with typical standalone private cloud deployments. The cost savings are even more significant when you account for the fact that utilization of such public IP lines is typically very low, while hybrid connections allow for much more diversified connectivity options than a standard public cloud provider can offer.
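
    As a rough sketch of that math: the $300 cross-connect and $0.05 per GB figures come from the article, while the dedicated-line price below is an illustrative assumption, not a quoted rate.

    ```python
    # Back-of-the-envelope comparison of the connectivity economics described
    # above. The $300 cross-connect and $0.05/GB figures come from the article;
    # the $3,000/month dedicated public IP line is an illustrative assumption.

    def hybrid_monthly_cost(gb_transferred, cross_connect=300.0, per_gb=0.05):
        """Cross-connect fee plus metered transfer for the month."""
        return cross_connect + per_gb * gb_transferred

    DEDICATED_LINE_MONTHLY = 3000.0  # assumed flat fee for a dedicated line

    for gb in (1_000, 10_000, 50_000):
        hybrid = hybrid_monthly_cost(gb)
        print(f"{gb:>6} GB/month: hybrid ${hybrid:,.0f} vs dedicated ${DEDICATED_LINE_MONTHLY:,.0f}")
    ```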

    Beyond Just Connectivity

    Harmonizing connectivity is clearly critical to creating a successful hybrid strategy, but harmonizing the running environment is just as important. As companies add public cloud procurement to their strategy going forward, adding another environment to test and manage can destroy many of the very benefits that the company is hoping to leverage. The solution is to use one of the next generation cloud providers that have the ability to mold to user requirements. The result is the ability for the customer to recreate their private environment within the public cloud and run one seamless deployment.

    In this way, harmonizing private and public cloud deployments allows companies to reap the full benefits of adding a public cloud whilst avoiding the negative impact of multiple environments and additional connectivity costs.

    Solving the Cloud Capacity Conundrum

    Hybrid cloud solutions, particularly ones with high flexibility and private patching technology, uncover hidden value beyond the elimination of public IP lines. Historically, businesses have invested in far more private infrastructure capacity than they actually need on a day-to-day basis because they have to service peak capacity requirements, not average capacity requirements.

    For example, an e-commerce business is likely to experience significant workload spikes around the holidays. So, in the past, it would over-invest in private infrastructure in order to be prepared for those spikes, leaving that capacity unused for the rest of the year. With a hybrid cloud, however, it can keep core capacity needs in-house and burst workloads directly to the public cloud as needed during those peak moments. The result is a higher utilization rate for the private infrastructure and another example of the cost efficiencies of the hybrid cloud.
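
    A minimal sketch of the capacity math makes the utilization argument concrete; all of the numbers below are invented for illustration.

    ```python
    # Illustrative sketch of the capacity math behind cloud bursting. The
    # numbers are made up; the point is the utilization gap between sizing
    # private infrastructure for peak load versus baseline load.

    baseline_load = 100   # capacity units needed most of the year
    peak_load = 400       # capacity units needed during holiday spikes
    peak_months = 2       # months per year the peak applies

    # Traditional approach: buy private capacity for the peak.
    traditional_capacity = peak_load
    demand_over_year = baseline_load * (12 - peak_months) + peak_load * peak_months
    traditional_utilization = demand_over_year / (traditional_capacity * 12)

    # Hybrid approach: buy private capacity for the baseline, burst the rest.
    burst_units = peak_load - baseline_load

    print(f"Traditional private utilization: {traditional_utilization:.0%}")
    print(f"Hybrid: private gear runs at baseline year-round, bursting "
          f"{burst_units} units to the public cloud for {peak_months} months")
    ```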

    Finally, public clouds are realizing the value of having customers colocated with them and are beginning to offer cross-subsidies on the hosting cost of private environments colocated with the public cloud. The fact that data center providers are also chipping in by offering low-cost access to ecosystems of customers and providers within their buildings is further strengthening this trend. The data centers themselves are doing what they can to help companies uncover long-hidden value through hybrid cloud solutions.

    The New Cloud Economy

    Next-generation hybrid solutions and private patching technology are taking the best of what cloud has to offer and making it better. They have made it more convenient and effective for users, not to mention more affordable. Companies will be able to better rationalize the planning of their public and private infrastructures, reducing total cost of ownership across their entire infrastructure budget and unleashing the agility benefits and elasticity that public cloud usage can bring.

    In the new cloud economy, companies will find themselves developing better purchasing strategies, eliminating waste and uncovering major value they didn’t even know existed.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    1:00p
    Yevgeniy Sverdlik Joins Data Center Knowledge as Editor in Chief

    Today we are pleased to welcome Yevgeniy Sverdlik as the new Editor in Chief of Data Center Knowledge. Yevgeniy is familiar to many of you from his work covering the industry for DataCenterDynamics, where he has headed the North American news coverage for the past five years. We are thrilled to have him join the Data Center Knowledge team.

    Yevgeniy knows the data center industry inside and out, including the companies and personalities that drive innovation in this business. He also understands government from his days as a city hall reporter for the Benicia Herald, and has a journalism degree from San Francisco State.

    I’ll be transitioning to a new role as Editor at Large, which will allow me to continue writing on a regular basis, while handing off the day-to-day management of our news operation. When I founded Data Center Knowledge in 2005, it was a one-man operation. DCK has since grown into a thriving business, becoming a part of iNET Interactive in 2012.

    My new role will allow me more time to focus on reporting and writing, uncovering some of the stories that go beyond a press release or briefing.

    Colleen Miller, our Director of Content, will also be taking on new duties, serving as a consultant to iNET on the fast-moving changes in social media. We’ll soon be introducing a new Director of Content.

    We have exciting plans ahead for Data Center Knowledge. The addition of Yevgeniy as our new Editor in Chief reinforces DCK’s role as the leading source for news about the data center industry.

    If you’re attending Data Center World, come meet Yevgeniy and me at a “Meet the DCK Team” session Wednesday afternoon from 12:45 to 2:45 p.m. at Booth #1100 in the expo hall at the Mirage. Colleen, Industry Analyst Jason Verge and other members of the DCK team will be available throughout the DCW event. Please stop by for a visit.

    2:30p
    Cisco Launches Managed Threat Defense

    Cisco launches a Managed Threat Defense security solution, A10 Networks unveils a new partner program, and Continuum Data Centers selects Ciena to enable high-performance connectivity in Chicago.

    Cisco launches Managed Threat Defense. Cisco (CSCO) announced Managed Threat Defense, a managed security solution that applies real-time, predictive analytics to detect attacks and protect against advanced malware across customers’ extended networks. Managed Threat Defense is an on-premises solution comprised of hardware, software and analytics designed to monitor, capture and analyze threats. Cisco’s worldwide network of expert-staffed security operations centers (SOCs) monitors the service and provides incident response analysis, escalation and remediation recommendations. The service protects against unknown attacks, uses Hadoop 2.0 to apply predictive analytics to detect anomalous patterns, provides incident tracking and reporting, and includes Cisco security software such as Cisco Advanced Malware Protection (AMP) and Sourcefire FirePOWER. “As data continues to move to the cloud, more people are accessing data via mobile devices, in addition to sharing data through social channels. Consequently, security has become our customers’ number one concern,” said Bryan Palma, SVP Cisco Security Solutions. “Managed Threat Defense lessens the worry associated with protecting against a breach and allows Cisco and its partners to add value where customers need it most.”
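
    For readers unfamiliar with the idea, “detecting anomalous patterns” can be illustrated with a generic toy example; this is only a sketch of the concept, not Cisco’s Managed Threat Defense implementation.

    ```python
    # Generic toy example of flagging anomalous activity with a z-score.
    # This illustrates anomaly detection in the abstract; it is not Cisco's
    # Managed Threat Defense analytics.

    from statistics import mean, stdev

    # Hypothetical outbound connection counts per minute for one host.
    history = [120, 115, 130, 125, 118, 122, 127, 119, 124, 121]
    current = 560

    mu, sigma = mean(history), stdev(history)
    z_score = (current - mu) / sigma

    THRESHOLD = 3.0  # flag readings more than three standard deviations from normal
    if abs(z_score) > THRESHOLD:
        print(f"Anomaly: {current} connections/min (z-score {z_score:.1f})")
    ```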

    A10 Networks unveils new partner program. A10 Networks (ATEN) launched the new A10 Affinity Partner Program in North America, a program designed to motivate and reward A10’s channel partners who sell the company’s application service gateway products. Affinity partners will gain access to a host of resources and enablement programs that will help them lead with A10 solutions and grow their business. Core benefits of the program include expanded program offerings, increased partner profitability, and new training and tools. The three-tier program has A10 making a significant investment in role-based training for partner sales and systems engineers, and features new and enhanced online tools, including a redesigned partner portal. An expanded marketing development funds (MDF) program extends partners’ resources and budgets to build awareness of and generate demand for A10 solutions. “Partners are an essential part of A10’s growth strategy and success,” said Ray Smets, A10 Networks vice president of worldwide sales. “Now is an ideal time for us to reward our partners by investing in this important channel program and growing our global footprint. A10 has the breadth and depth in our portfolio to really help our partners succeed.”

    Continuum selects Ciena. Ciena (CIEN) announced that Continuum Data Centers (CDC) has selected converged packet optical solutions from Ciena to connect its data center in Chicago’s western suburbs to key regional exchange points, including 350 Cermak in Chicago. With the Ciena platform, Continuum will provide high-bandwidth, low-latency 100G connectivity to support applications like cloud computing, storage networking and big data for service providers, financial services companies, research institutions and content delivery networks. Additionally, it will be able to offer carrier-grade connectivity to tier 2 and tier 3 networks as it builds out its peering and Internet exchange community. Continuum will deploy Ciena’s Packet-Optical platform equipped with WaveLogic Coherent Optical Processors and integrated switching capabilities. “With a growing portfolio that is increasingly driven by control and application software and a focus on addressing emerging customer segments in addition to service providers, we continue to lead the industry in the shift to virtualize the network for a vastly improved user experience,” said Duncan Puller, Vice President of Data Center and Cloud at Ciena. “This deployment with a leading data center operator like CDC is a prime example of our ability to make networks of all kinds more efficient and scalable through openness and programmability.”

    3:00p
    Public Cloud Market Gears Up for Hypergrowth Phase, Reaching $191B in Revenue by 2020

    This article originally appeared at The WHIR.

    The public cloud services market is expected to reach $191 billion in revenues by 2020, a huge jump from the $58 billion in revenue in 2013, according to a report released on Thursday by Forrester Research.

    Cloud applications will lead, accounting for a whopping $133 billion of revenue in 2020. Cloud platforms follow, generating $44 billion in revenue, and cloud business services will generate $14 billion in 2020.
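
    As a quick sanity check on the cited figures, the three segments do sum to the $191 billion headline number, and the implied compound annual growth rate from 2013’s $58 billion works out to roughly 18 to 19 percent per year:

    ```python
    # Quick check of the Forrester figures cited above: the three segments
    # should sum to the $191 billion headline, and the implied compound
    # annual growth rate from 2013's $58 billion follows directly.

    segments_2020 = {
        "cloud applications": 133,       # $ billions
        "cloud platforms": 44,
        "cloud business services": 14,
    }
    total_2020 = sum(segments_2020.values())   # 191

    revenue_2013 = 58                          # $ billions
    years = 2020 - 2013
    implied_cagr = (total_2020 / revenue_2013) ** (1 / years) - 1

    print(f"Segments sum to ${total_2020}B")
    print(f"Implied CAGR, 2013-2020: {implied_cagr:.1%}")   # roughly 18.6%
    ```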

    Forrester’s last forecast, released in 2011, predicted the public cloud services market would reach $160 billion by 2020. Its latest forecast is roughly a 20 percent jump from that number, and Forrester analyst Andrew Bartels tells Computerworld that the cloud is starting to shift from a complement to a replacement for existing technology, which accounts for this “hypergrowth.”

    While using public cloud services to complement existing technology is still the dominant trend, Bartels said that new companies and startups are going directly to the cloud. These companies will become a larger part of the economy, he says, and over time, will make up a larger portion of cloud business.

    “While not a one-for-one replacement for on-premise, hosting, or colocation, cloud platforms fit well as ideal deployment options for elastic and transient workloads built in modern application architectures,” Forrester analyst James Staten says in a blog post. “And as the seismic shift in application portfolios progresses, public clouds will capture a significantly larger addressable market.”

    Forrester notes that the cloud has a “delaying effect” on spending: rather than buying hardware upfront, companies increase long-term spending with SaaS vendors, for example, which may require a three- or five-year commitment.
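
    A simple way to picture that delaying effect is to compare cumulative spend over time; the dollar figures below are purely illustrative assumptions, not Forrester’s data.

    ```python
    # Illustrative cash-flow comparison of the "delaying effect": hardware is
    # paid for up front, while a multi-year SaaS commitment spreads (and can
    # extend) the spend. All dollar figures are made-up assumptions.

    hardware_upfront = 300_000   # hypothetical year-0 capital purchase
    saas_per_year = 120_000      # hypothetical annual subscription fee
    term_years = 5               # five-year SaaS commitment

    for year in range(1, term_years + 1):
        saas_cumulative = saas_per_year * year
        print(f"Year {year}: hardware ${hardware_upfront:,} (all up front) "
              f"vs SaaS ${saas_cumulative:,} cumulative")
    ```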

    As noted in other reports on cloud adoption, Forrester says that concerns around security, integration, performance and cost linger. Public cloud service providers have responded to the latter concern by slashing prices. A recent report showed that AWS led public cloud price reductions in 2013.

    “Still, inertia is difficult to overcome in the enterprise market, and there remain legitimate concerns about the security and performance of cloud solutions in certain parts of the world and for certain use cases and data types. As the past three years have shown, these will be overcome,” Staten writes. “The CIO’s biggest battle will be the internal cultural, psychological and financial accounting barriers born in a different age that will take decades to change. Start the fight now.”

    This article originally appeared at http://www.thewhir.com/web-hosting-news/public-cloud-market-gears-hypergrowth-phase-reaching-191b-revenue-2020

    3:30p
    Power Assure Updates EM/5 to Integrate with ServiceNow

    Power Assure has introduced EM/5, the latest iteration of its data center monitoring software, which features integration with IT automation solutions from ServiceNow. EM/5 is a complete, hosted data center monitoring solution combining real-time and historical metrics. It lets customers incorporate existing asset databases, cutting the time and expense traditionally involved in deploying a Data Center Infrastructure Management (DCIM) solution.

    “Using EM/5, ServiceNow customers can experience a working DCIM solution in a matter of hours, demonstrating value in just a few days without the usual expense of an extensive services engagement before seeing results, or any disruption to their ServiceNow application,” said Pete Malcolm, president and CEO, Power Assure.

    The integration with ServiceNow delivers advancements in ease of use, metric aggregation, mobility and navigation, including:

    • ServiceNow Integrated GUI – EM/5 presents information right in the ServiceNow user interface. For example, a simple click on a ServiceNow asset instantly displays the relevant live metrics, while an EM/5 menu section gives access to graphical map and plan views, with features such as drill-down navigation and status indication.
    • Aggregations – EM/5 allows collections of metrics from any source to be combined, displayed and stored as a stand-alone metric. For example, power metrics can be aggregated from each rack in a given row or from all the devices used for a particular application. Aggregations are unlimited, can be built in a hierarchy, and are ideal for chargebacks and analysis (a simple illustration of the idea follows this list).
    • Smartphone App – By scanning the asset tag barcode of a particular device, users can immediately see its live metrics both on the smartphone and in the ServiceNow GUI. This feature delivers instant asset confirmation between the data center floor, the network operations center and elsewhere.
    • Asset Synchronization – EM/5 automatically synchronizes with ServiceNow Configuration Items, avoiding the need for manual entry or updating. Synchronization is fully configurable, both in the selection of items to be replicated and in the transformation of individual fields. Customer-definable templates allow common fields to be pre-populated for a given asset configuration.
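
    The aggregation idea can be pictured with a small, hypothetical sketch; this is not Power Assure’s EM/5 API, just an illustration of how per-rack readings can roll up into row- or application-level metrics.

    ```python
    # Hypothetical illustration of hierarchical metric aggregation as described
    # in the list above. This is not Power Assure's EM/5 API.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Metric:
        name: str
        watts: float = 0.0
        children: List["Metric"] = field(default_factory=list)

        def total(self) -> float:
            """An aggregate is its own reading plus the totals of its children."""
            return self.watts + sum(child.total() for child in self.children)

    # Per-rack power readings rolled up into a row-level aggregate, which could
    # itself be a child of a room- or application-level metric.
    rack_a = Metric("rack-A", watts=4200.0)
    rack_b = Metric("rack-B", watts=3900.0)
    row_1 = Metric("row-1", children=[rack_a, rack_b])

    print(f"{row_1.name}: {row_1.total():.0f} W")   # 8100 W
    ```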

    The DCIM industry continues to consolidate and integrate in order to provide an all-in-one solution. Power Assure’s software-defined power monitoring is a key functionality sought by many.

    Headquartered in Santa Clara, California, the company is privately held, with funding from ABB Technology Ventures, Dominion Energy Technologies, Draper Fisher Jurvetson, Good Energies, Point Judith Capital, and a grant from the Department of Energy. Power Assure partners include ABB, Cisco, Dell, IBM, In-Q-Tel, PARC, Raritan, UL and VMware.

    3:30p
    Dimension Data Acquires Nexus, Plans to Quadruple Data Center Business

    Global IT firm Dimension Data announced that it has acquired Valencia, California-based IT company Nexus and plans to quadruple its data center business to $4 billion over the next five years.

    The acquisition of Nexus expands Dimension Data’s operations in the U.S. by 40 percent and significantly increases the company’s presence in the West, Southwest and Southeast regions of the country. Nexus has 19 offices in California, Nevada, Colorado, Arizona, Utah, Washington, Texas, Georgia, Florida and North Carolina.

    “The acquisition of Nexus is a significant strategic step in enhancing the group’s geographic coverage and depth of skills and capabilities to support our clients,” said Brett Dawson, CEO of Dimension Data plc. “For over three decades, Dimension Data has been building expertise and experience in ICT solutions and services that deliver real business value to our clients. Nexus increases our ability to support both our US-based and global clients with significant West Coast presence.”

    Quadruple Data Center Business

    While Dimension Data enjoys a $1 billion data center business in all major regions worldwide, it is looking to aggressively grow and scale those businesses both organically and through acquisition. It hopes to quadruple its data center business to $4 billion by 2018. In addition, Dimension Data believes its access to a significant set of data center assets across its parent company, the NTT Group, differentiates the business.

    “In all regions and with all clients, large and small, there is an urgent need to undergo the transformation process needed to not only achieve better data center performance and manage disruptive technologies, but to also become progressively greener, in terms of environmental custodianship,” said Steve Joubert, Group Executive for the Data Center Business Unit.

    Currently Dimension Data operates 12 public cloud locations around the world with further locations coming online in the next few quarters. Dimension Data significantly extends its cloud locations through its OneCloud partners, giving it one of the largest cloud footprints in the world.

    “Getting there requires an integrated approach in the secure delivery of workloads and applications across the traditional data center, cloud and the enterprise network, all of which make up the next-generation data center,” continued Joubert. “This calls for a level and range of capabilities that the average organization doesn’t have and shouldn’t need to build or acquire when all of the considerable benefits of cloud, networking, security, and systems integration experience, as well as economies of scale and a global footprint, are available through Dimension Data.”

    “Both organizations have a people-centric culture with a strong focus on delivering a great client experience,” said Mark Slaga, CEO of Dimension Data Americas. “I am very excited to welcome 657 talented Nexus employees to the Dimension Data family. Together, the assets of both companies position Dimension Data as one of the leaders in the provision of services and solutions in the ICT sector across the US. Nexus brings a rich set of services and solutions to the Group, particularly in the data center, collaboration, enterprise networks, security and cloud spaces. In addition, there’s minimal overlap of geographies and clients, which strengthens Dimension Data’s presence across the Americas.”

    4:41p
    Facebook Ready to Begin Second Data Center in Iowa

    The Iowa data center boom continues. Facebook hasn’t even finished its first massive data center building in Altoona, Iowa, but the social networking company is ready to begin construction on a second 476,000-square-foot building to house its growing armada of servers.

    “The City of Altoona Planning and Zoning Board will review our plans tomorrow evening, April 29,” Facebook said on its page for the Altoona Data Center. “From there, they’ll go before the Altoona City Council on May 5. Pending the Council’s approval, we’ll break ground on the new building shortly.”

    These are giddy days for the data center business in Iowa. The announcement comes just a week after Microsoft unveiled plans for a $1.1 billion expansion of its operations in West Des Moines. Meanwhile, Google has announced expansions to its campus in Council Bluffs that will bring its investments in Iowa to more than $1.5 billion.

    How does this translate into economic activity and jobs? Facebook has provided some metrics on its Altoona project, which has employed more than 460 construction workers logging more than 435,000 hours of work so far.

    “We couldn’t be more pleased by the progress we’ve made, and we’re grateful for the kindness we’ve received from our friends and neighbors here in Altoona,” the company said.

    The Facebook Altoona data center was announced in April of 2013, when the company began building the $299.5 million first phase of the campus. Plans call for a total of three data centers on its 200-acre Altoona campus, which is nestled alongside Route 80 and has access to significant supplies of fiber and power.

    One of the deciding factors in choosing Altoona was the opportunity to help develop a new wind project in the state. The new data center will be supported 100 percent by wind energy.

