Data Center Knowledge | News and analysis for the data center industry
Wednesday, May 22nd, 2013
12:29p | Capture Client Satisfaction: Seventh Key to Brokering IT Services Internally
Dick Benton, a principal consultant for GlassHouse Technologies, has worked with numerous Fortune 1000 clients in a wide range of industries to develop and execute business-aligned strategies for technology governance, cloud computing and disaster recovery.
DICK BENTON
GlassHouse
Last August, I outlined seven key tips IT departments should follow to build a better service strategy for their internal users. Since then, I’ve taken a deeper dive into each of these steps on the way to becoming an Internal Cloud Provider (ICP), an essential transformation if IT wants to align with company goals and user expectations. My last post addressed the sixth step, proving what you delivered; in other words, it’s important to show management and service consumers that you’ve met the established service level agreements (SLAs) and key performance indicators (KPIs) for your IT services.
Now, we’ve come to the seventh and final step in this process: capture client satisfaction.
So you now have your nascent cloud service offerings out there in consumer land. Your service offerings are being selected by the end user from your Web-based service catalog. They are intelligently choosing the service they really need, because you have provided service attributes in terms the consumer can understand, and you have also identified the cost of each service offering to assist in their selection.
You have provided a mechanism not only for auto selection, but also for auto deployment. Selected services are now provisioned automatically under appropriate policies agreed to by management. Mean time to provision is now a matter of minutes or hours instead of days and weeks. Each month, you produce your scorecard showing which groups, departments or divisions have consumed which service offerings and at what cost, and you have confirmed in formal reporting that all SLAs have been met or exceeded.
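To make that flow concrete, here is a minimal sketch of policy-gated auto-provisioning. The catalog entries, policy limits and spending figures are entirely hypothetical, not a reference to any particular provisioning product:

```python
# A minimal sketch of policy-gated auto-provisioning. All data is hypothetical.
CATALOG = {"vm-small": {"cpu": 1, "ram_gb": 2, "monthly_cost": 40.0}}
POLICIES = {"finance": {"max_monthly_spend": 500.0}}
ledger = {"finance": 480.0}  # spend already committed this month

def provision(department: str, offering: str) -> str:
    spec = CATALOG[offering]
    budget = POLICIES[department]["max_monthly_spend"]
    committed = ledger.get(department, 0.0)
    if committed + spec["monthly_cost"] > budget:
        # Over budget: fall back to a manual approval step instead of deploying.
        return f"{offering} for {department}: routed to manual approval"
    ledger[department] = committed + spec["monthly_cost"]
    return f"{offering} for {department}: provisioned automatically"

print(provision("finance", "vm-small"))  # 480 + 40 > 500 -> manual approval
```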
Determine Satisfaction and Look to Tomorrow
What more can IT do? To lock in your new understanding of consumer needs and to stay abreast of trends in those needs, it is essential to build a survey into your processes. This is not just a satisfaction survey vainly seeking confirmation that IT has indeed “done well”. Rather, it’s critical to use the opportunity to ask probing questions that get a handle on how needs are changing and how future offerings might be shaped.
The satisfaction survey process provides an opportunity to capture consumer needs, consumption behaviors and service offering usage, as well as satisfaction levels with your services. It is a tool you can use to better understand individual and overall service requirements. This is what we mean by aligning IT with business needs. Your metrics reporting should already have allowed you to distinguish frequent consumers from occasional ones. You should also have a good handle on who is consuming what, and be able to classify small, medium and large consumers. If you have offered services that lend themselves to being turned off as easily as they are turned on, you can also get a handle on the mean time-to-live of the various service offerings.
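As an illustration of what that reporting makes possible, here is a minimal sketch, assuming hypothetical usage records with provision and deprovision timestamps, that buckets consumers by order volume and computes the mean time-to-live of a service offering:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical usage records: (consumer, offering, provisioned, deprovisioned);
# deprovisioned is None while the service is still running.
usage = [
    ("finance",   "vm-small",  datetime(2013, 4, 1),  datetime(2013, 4, 15)),
    ("finance",   "vm-small",  datetime(2013, 5, 1),  None),
    ("marketing", "db-medium", datetime(2013, 4, 10), datetime(2013, 5, 10)),
]

def classify_consumers(records, small_max=5, medium_max=20):
    """Bucket consumers into small/medium/large by number of orders."""
    counts = defaultdict(int)
    for consumer, _, _, _ in records:
        counts[consumer] += 1
    return {
        c: "small" if n <= small_max else "medium" if n <= medium_max else "large"
        for c, n in counts.items()
    }

def mean_time_to_live(records, offering):
    """Mean lifetime in days of completed deployments of one offering."""
    lifetimes = [
        (end - start).days
        for _, off, start, end in records
        if off == offering and end is not None
    ]
    return sum(lifetimes) / len(lifetimes) if lifetimes else None

print(classify_consumers(usage))             # {'finance': 'small', 'marketing': 'small'}
print(mean_time_to_live(usage, "vm-small"))  # 14.0
```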
Survey Can Help Set Your Roadmap for the Future
The above information allows you to craft an intelligent survey that gathers what you need to plan new services for changing needs, modify existing services as business drivers shift, and even make end-of-life decisions for services no longer in demand. This knowledge is critical to keeping your service offerings relevant to the consumer and to staying competitive with outside public cloud providers. It’s always a good idea to formally review what your competition is up to. Just because you think your consumers are captive doesn’t mean they see themselves that way. Review the competition’s service offerings at least monthly, and follow their PR news feeds for service announcements. These new offerings may well be suitable grist for your survey, to identify whether there is a need for them within your organization.
Finally, there is room for the classic component of the satisfaction survey: how well did you do? A scale of one to five is usually sufficient; further levels of granularity add little. One through five provides a high, medium and low score, with something in between for those who want to be picky. Start by asking consumers to assess the service offerings themselves. Are the offerings meeting their needs? Would more offerings be helpful? Be sure to allow for written feedback as well. Then move to questions around ease of use. How easy is it to find the service offering they seek? How easy is it to select the offering and place an order? Are the terms and conditions of the offering clear? Do they think the costs of the offerings are reasonable and competitive?
Next, move on to the deployment process. The key question here is how consumers feel about the mean time to provision the service they ordered (self-selected). Did they get the service they asked for? Were any additional clerical steps required for approval? (This allows you to count such instances.) Was the service delivered as promised? Were SLAs met on each occasion? Was the monthly or weekly reporting adequate? Did they need to escalate any issue? (Capture, count and classify.) How do they feel about IT’s ability to respond to their issues? How do they feel about IT’s ability to resolve them? How do they rate the internal IT cloud against the competition, such as Amazon?
The creative mind can conjure any number of other questions to include; however, there is a risk of inducing boredom or even dissatisfaction once a survey grows beyond a certain size. Perhaps 10 to 15 questions should be sufficient to capture key information about the services you offer, your ability to respond to consumer demands, and trends in future service offering needs. There are quite a few Web-based survey tools, and many, like SurveyMonkey, are free.
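To make this concrete, here is a minimal sketch of such a compact survey and its aggregation. The question wording follows the article; the data structures and scoring logic are illustrative assumptions, not any particular survey tool’s API:

```python
# Illustrative only: a compact 1-to-5 scale survey and mean-score rollup.
QUESTIONS = [
    "Are the service offerings meeting your needs?",
    "How easy is it to find the service offering you seek?",
    "How easy is it to select an offering and place an order?",
    "Are the terms and conditions of each offering clear?",
    "Are the costs reasonable and competitive?",
    "How satisfied are you with the mean time to provision?",
    "Was the service delivered as promised?",
    "How do you rate IT's ability to resolve issues?",
]

def mean_scores(responses):
    """responses: list of {question: score} dicts, scores on a 1-5 scale."""
    totals = {q: [] for q in QUESTIONS}
    for response in responses:
        for question, score in response.items():
            totals[question].append(score)
    return {q: sum(s) / len(s) for q, s in totals.items() if s}

sample = [
    {QUESTIONS[0]: 4, QUESTIONS[1]: 3},
    {QUESTIONS[0]: 5, QUESTIONS[1]: 2},
]
print(mean_scores(sample))  # mean score per answered question
```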
By running regular surveys, and even embedding mini surveys in your selection, approval or quote process and in your provisioning and deployment process, a wealth of information becomes available to an IT organization dedicated to consumer satisfaction and continuous improvement. In summary, here are the top tips to keep in mind:
- Build surveys into your service order and service fulfillment procedures;
- Run a quarterly satisfaction survey on your clients;
- Differentiate between large clients and most frequent users, and small clients and least frequent users; and
- Use the information to ensure you are delivering the service offerings your consumers need, with attributes they value, at prices they can afford.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
1:00p | Roundup: VMware Launches Public Cloud
On Tuesday VMware (VMW) unveiled vCloud Hybrid Service, an infrastructure-as-a-service (IaaS) cloud built and operated by VMware. The virtualization pioneer’s entry into the public cloud arena prompted lots of analysis and commentary from around the web. Here’s a look at some of the most notable posts:
Gartner – Lydia Leong comments on her CloudPundit blog: “VMware has previously had a strategy of being an arms dealer to service providers who wanted to offer cloud IaaS. In addition to the substantial ecosystem of providers who use VMware virtualization as part of various types of IT outsourcing offerings, VMware also signed up a lot of vCloud Powered partners, each of which offered what was essentially vCloud Director (vCD) as a service. … In theory, this was a sound channel strategy. In practice, it didn’t work.”
The Register – From Timothy Prickett Morgan: “the vCloud Hybrid Service is not so much about competing against AWS, Rackspace Cloud, and other public clouds as it is about giving the now 500,000 customers using the ESXi hypervisor to virtualize their servers a place where they can burst their workloads and a reason to buy vCloud Director and other tools in the VMware box.”
Ars Technica – Here’s the “where” question answered: “VMware’s US-based services will be available to early access customers in June and will be generally available in the third quarter of this year from data centers in Santa Clara, CA; Dallas, TX; Las Vegas, NV; and Sterling, VA. Services will be offered from data centers in Asia and the EMEA (Europe, Middle East, and Africa) sometime in 2014. Customers outside the US aren’t prohibited from using the US-based services, but they would have to handle some latency.”
ITworld – Another reason the VMware cloud isn’t an Amazon killer: “Individual developers won’t be able to sign up quickly and easily to start using compute services from VMware. The vCloud Hybrid Services are sold either on an annual or monthly basis. I’m not exactly sure yet but that sounds like services will be sold the same way that VMware products are sold today – through sales people or partners. There won’t be any chance to visit a web site, plop down a credit card and get to work.”
GigaOm – Jordan Novet has details on pricing: “The vCloud Hybrid Service actually has two flavors: a Dedicated Cloud mode has ‘physically isolated and reserved compute resources’ for predictable workloads and a Virtual Private Cloud for seasonal workloads that require greater elasticity but are multitenant in nature. The former service will start at 13 cents an hour for a 1 GB virtual machine with a single processor on an annual basis, while the latter will start at 4.5 cents an hour on a monthly basis.”
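For a rough sense of what those published rates imply, here is a back-of-envelope comparison assuming a 1 GB, single-processor VM left running around the clock (roughly 730 hours a month); actual billing terms may differ:

```python
# Back-of-envelope monthly cost at the two published vCloud Hybrid Service
# starting rates; assumes an always-on VM at ~730 hours/month.
HOURS_PER_MONTH = 730

dedicated_cloud = 0.13 * HOURS_PER_MONTH         # annual-commitment rate
virtual_private_cloud = 0.045 * HOURS_PER_MONTH  # monthly rate

print(f"Dedicated Cloud:       ${dedicated_cloud:.2f}/month")        # $94.90
print(f"Virtual Private Cloud: ${virtual_private_cloud:.2f}/month")  # $32.85
```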
3:20p | TIBCO Launches Integration Platform as a Service
TIBCO Software (TIBX) announced the launch of TIBCO Cloud Bus, a new subscription-based Integration Platform as a Service (iPaaS) offering that leverages the company’s integration expertise to help users drastically shorten time to market and lower costs as they migrate applications and workloads to the cloud.
“With Cloud Bus, TIBCO is combining the deployment flexibility of the cloud with enterprise-class integration features in a single subscription service that customers can run anywhere – on-premise, in the cloud, in bare metal or virtualized environments,” said Matt Quinn, CTO for TIBCO Software. “TIBCO Cloud Bus provides ready-made integrations across popular SaaS and critical on-premise applications, while allowing subscribers the ability to identify, configure and extend integration templates for their own business context with ease. Finally, and as you would expect from TIBCO, Cloud Bus includes extensive capabilities for real-time integration, meaning changes are reflected in all connected applications as they happen, without waiting for the next batch update.”
Single Subscription Model
TIBCO Cloud Bus subscribers can deploy cloud applications and pay only for what they consume. The solution provides ready-made integrations for popular applications such as Salesforce.com, and its real-time connectivity means changes are reflected in all connected cloud applications as they happen, without waiting for the next batch update.
“If a company’s approach to integration is too fixed or locked-in, it could limit the company’s ability to leverage the cloud or switch SaaS providers once it’s there,” said Quinn. “TIBCO Cloud Bus offers a choice of deployment options, ready-made integrations, and real-time integration support that delivers the flexibility and time to value that enterprises are looking for in moving applications and workloads to the cloud.”
3:44p | EdgeCast Launches Dedicated CDN for eCommerce
CDN provider EdgeCast has built a completely separate, dedicated network to serve eCommerce, dubbed EdgeCast Transact. It’s PCI compliant, and incorporates device detection and dozens of commerce-specific optimizations. This “share nothing” approach to network architecture is unique in the content delivery industry, and might start a trend of offering exclusive footprints to specific sets of customers.
“Nowhere do speed and availability matter more than in eCommerce,” said Ted Middleton, EdgeCast VP of product management. “The Internet’s top retailers told us they wanted a discrete, secure, global network that they didn’t have to share with other types of content, so we spent the past year building exactly that.”
EdgeCast designed, tested and built the new network in major metropolitan centers around the world over the course of a year. It’s based on the same proven methods EdgeCast has been using for the past six years on its content delivery network.
The new solution offers an optimized communication path to serve content and handle transactions regardless of conditions on the broader Internet or on EdgeCast’s other CDN networks. eCommerce customers completely avoid competing for resources with customers in other segments.
With security a major concern, the network is architected with robust redundancy and failover, elastic provisioning for holiday-scale traffic spikes, and credit card detection and removal algorithms.
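EdgeCast hasn’t detailed how its credit card detection works; a common general technique pairs a digit-pattern scan with the Luhn checksum, sketched minimally below (the regex and masking policy are illustrative assumptions):

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum: double every second digit from the right."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Loose pattern for 13-16 digit runs with optional space/hyphen separators.
PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def scrub_pans(text: str) -> str:
    """Replace anything that looks like a valid card number with a mask."""
    def mask(match):
        digits = re.sub(r"[ -]", "", match.group())
        return "[REDACTED PAN]" if luhn_valid(digits) else match.group()
    return PAN_PATTERN.sub(mask, text)

print(scrub_pans("order note: card 4111 1111 1111 1111, ship fast"))
```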
Performance optimizations include secure pre-establishment of sessions between origin and end user. Mobile device detection and front end optimization is built in. Optimization is executed directly on edge servers for the best possible performance.
EdgeCast also aligns the network’s operating policies with eCommerce business cycles, conducting code freezes during the busiest shopping times to ensure 100 percent availability and stability.
4:10p | Cloudscaling Raises $10 Million for OpenStack Solutions
While OpenStack is once again in the spotlight following Dell’s exit from the public cloud market, more venture bets are quietly being placed on OpenStack in the background. The latest is a new $10 million Series B round for Cloudscaling. The company provides an OpenStack-powered cloud infrastructure system. The funding comes from Trinity Ventures and two big-name new tech investors: network equipment maker Juniper Networks (through its Junos Innovation Fund) and storage specialist Seagate.
“This financing round caps a tremendous year of momentum for the company,” said Michael Grant, CEO of Cloudscaling. “That momentum affirms the voice of the market, clearly stating that customers want more than OpenStack. They want an on-premise, OpenStack-based private or public cloud turnkey system solution that delivers architectural and behavioral fidelity with major public clouds like Amazon Web Services. Our Open Cloud System product delivers on that need to enable hybrid cloud application deployments that span private and public cloud services.”
In 2013 Cloudscaling has secured key customer wins with LivingSocial, EVault, Ubisoft and DataFort, launched a channel partner program, and announced support for OpenStack Grizzly in the third generation of its OCS technology, Open Cloud System 2.5.
Partnership With Juniper
When Cloudscaling announced Open Cloud System 2.5 in April, it also took the first step in its partnership with Juniper by integrating Juniper’s virtual network control (VNC) technology, JunosV Contrail, into Open Cloud System (OCS). OCS is a turnkey, OpenStack-powered cloud infrastructure system for enterprises, SaaS providers and cloud service providers. Juniper liked what it saw.
“Juniper Networks and Cloudscaling share a vision of how cloud infrastructure should be built and operated to support a new generation of cloud-aware workloads,” said Jeff Lipton, VP of venture and strategic investments at Juniper. “Collaborating to integrate our technology into OCS was just the first step. We are excited to continue our work with Cloudscaling and support the company as a strategic investor.”
Cloudscaling and Juniper plan to continue delivering innovative, joint networking solutions that are open and standards-based to support elastic cloud services and a new generation of enterprise workloads.
Cloudscaling also joined the Seagate Cloud Builder Alliance Partner program in April. Seagate liked what it saw.
“Seagate and Cloudscaling are working together on innovative solutions of jointly-optimized cloud systems supported by Seagate products,” said Rocky Pimentel, EVP and chief sales and marketing officer, Seagate. “We are pleased to deepen our relationship with them as an equity investor and be part of a collaborative effort to define and promote open source standards for cloud computing.”
Cloudscaling and Seagate are focused on the development of optimized storage solutions for OpenStack-powered cloud infrastructure.
Seagate and Juniper join Trinity Ventures, which so far seems happy with Cloudscaling’s prospects. “Cloudscaling has executed on a vision of elastic cloud infrastructure as a turnkey solution that many agree with but few have delivered,” said Dan Scholnick, general partner, Trinity Ventures. “The team has gained new customers and partners at an accelerating pace, highlighting their success at tapping an emerging, growing need among enterprise, SaaS and service provider segments.”
8:24p | WalmartLabs Acquires OneOps
Here’s our review of some of this week’s noteworthy links for big data:
@WalmartLabs acquires OneOps. @WalmartLabs, the technology arm of the retail giant’s Global eCommerce division, announced that it is acquiring OneOps, an application lifecycle management company. OneOps was founded by former eBay employees Kire Filipovski, Vitaliy Zinchenko and Mike Schwankl. With the acquisition, @WalmartLabs gains a Platform-as-a-Service capability that automates and accelerates many processes related to environment management, application deployment and the monitoring of data center operations. @WalmartLabs also acquired software development company TastyLabs. Founded by Nick Nguyen, Paul Schachter and Joshua Schachter, the Silicon Valley startup developed innovative social and mobile applications such as Jig, skills.to and human.io. Nick and Paul will join @WalmartLabs as full-time associates, while Joshua, who also created del.icio.us, will join as a consultant.
Hortonworks for Windows now available. Hadoop contributor Hortonworks announced the availability of Hortonworks Data Platform (HDP) for Windows, a 100 percent open source data platform powered by Apache Hadoop. HDP for Windows is the first production-ready Apache Hadoop-based distribution that runs on both Windows and Linux, providing a common user experience and interoperability across operating systems. HDP includes all of the necessary components to refine and explore new data sources, as well as extend existing investments in applications, tools and processes with Hadoop. “Microsoft is committed to offering the best solutions for big data, and Hortonworks Data Platform for Windows extends our efforts to provide big data customers with access to Apache Hadoop-based solutions,” said Eron Kelly, general manager of product marketing for SQL Server at Microsoft. “HDP for Windows not only enables organizations to deploy Hadoop projects on Windows Server but also allows for easy migration to Windows Azure HDInsight Service in the cloud. Customers gain great Hadoop-based solutions across whatever environment they choose.”
Tableau announces IPO. Tableau Software (DATA) announced the pricing of its initial public offering of 8,200,000 shares of its Class A common stock at a price to the public of $31.00 per share. Shares began trading last Friday under the NYSE symbol DATA. The Seattle-based software company transforms the way people see and understand data to solve problems. The company was founded by Stanford graduates and has helped business intelligence and data analytics efforts at LinkedIn and Facebook.
Concurrent launches Hadoop scoring engine. Big Data application platform company Concurrent announced Pattern, a free, open source, standards-based scoring engine that enables analysts and data scientists to quickly deploy machine-learning applications on Apache Hadoop. Pattern lowers the barrier to Hadoop adoption by enabling companies to leverage existing intellectual property in predictive models, existing investments in software tooling, and the core competencies of existing analytics staff to run Big Data applications from existing machine-learning models, using Predictive Model Markup Language (PMML) or a simple programming interface. “Concurrent is tearing down barriers for mass Hadoop adoption,” said Chris Wensel, CTO and founder, Concurrent. “With Pattern, we have cleared another path by enabling data scientists to more easily bring their work to production. When combined, Cascading, Lingual and Pattern close the modeling, development and production loop for all data-oriented applications. The combination of the three is the application ensemble for further enabling enterprises to drive differentiation through data.”
8:30p | Microsoft Launches Azure in China Via 21Vianet Group
Microsoft is the first major U.S. provider to launch a public cloud in China. Windows Azure is rolling out in China through partner 21Vianet Group, a large carrier-neutral internet data services provider. Windows Azure service in China will be available on June 6.
Microsoft CEO Steve Ballmer attended the launch event with 21Vianet CEO Josh Chen, US Ambassador to China Gary Locke and Shanghai Governor Jiang Liang. Also in attendance were CEOs from several of the platform’s initial and potential customers.
This is a big development for Microsoft, and huge news for 21Vianet. In November 2012, Microsoft, 21Vianet and the Shanghai Municipal Government announced a strategic partnership agreement in which Microsoft licensed the technology know-how and rights to operate and provide Office 365 and Windows Azure services in China to 21Vianet.
“21Vianet will act as an operation entity for Azure, hosting the service in its data centers and handling the customer relationship,” said 21Vianet CFO Shang Hsiao. “We also support the infrastructure. That’s one of the reasons Microsoft selected 21Vianet – we specialize in China internet infrastructure. We’re considered the biggest Internet data center services provider in China.
“In China at this moment, we don’t have open cloud services,” Hsiao continued. “This will be the first cloud partner outside of China to serve cloud customers. It’s very important.”
21Vianet already has several customers lined up for the service. Named in the press release are Pactera, RenRen Inc., PPTV (a leading online video company in China), Kingdee International Software, and QOROS Auto Co., an independent international car company. Many of these names will be unfamiliar to Western audiences, but that is precisely why this announcement is huge: China is a massive market whose potential hasn’t been tapped. Microsoft, through 21Vianet, is first in with an outside public cloud.
“We are extremely excited to officially launch Microsoft Windows Azure services in China and believe 21Vianet will provide great contributions to the growth of cloud infrastructure and services throughout China,” said Chen, Chairman and CEO of 21Vianet. “Our cooperation further enhances 21Vianet’s capabilities in helping to develop China’s cloud infrastructure services and strengthening our core competency for customers.
“As a cloud enabler, 21Vianet is pleased to offer Microsoft’s world-class cloud services for the first time to businesses in China,” Chen added. “By providing carrier-level services for better public cloud operations, including security and compliance, datacenter networking, maintenance, highly reliable engineering and customer services related to cloud operations, 21Vianet and Microsoft are committed to offering the best cloud services available throughout China.”
9:00p | The Robot-Driven Data Center of Tomorrow
Tape libraries, like this one at Google, provide an example of the use of robotics to manage data centers. Robotic arms (visible at the end of the aisle) can load and unload tapes. (Photo: Connie Zhou for Google)
There is an evolution happening within the modern data center. Huge data center operators like Google and Amazon are quietly redefining the future of the data center. This includes the integration of robotics to create a lights-out, fully automated data center environment.
Let’s draw some parallels. There’s a lot of similarity between the modern warehouse center and a state-of-the-art data center. There is an organized structure, a lot of automation, and the entire floor plan is built to be as efficient as possible. Large organizations like Amazon are already using highly advanced control technologies – which include robotics – to automate and control their warehouses.
So, doesn’t it make sense to logically carry over this technology to the data center?
Robotics in the Data Center
As reliance on the data center continues to grow, full software and hardware robotics automation is no longer a question of if but a matter of when, technologists predict. Robotics organizations like Chicago-based DevLinks LTD are already having conversations and creating initial designs for data center robotics automation.
Scott Jackson, Senior Robotics Programmer at DevLinks, says it’s becoming quite feasible to have a robot fetch a drive, blade or even a chassis and deliver it to a central bay for replacement.
“Simple RFID tags, laser and barcode identifiers can create true data center automation,” Jackson explains. “For example, you can tag drives with RFIDs and assign them to be wiped, destroyed and reused as needed.” Conveyor systems are able to run in parallel to robotics within the data center environment.
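As an illustration of the tag-and-track idea Jackson describes, here is a minimal sketch, with hypothetical states and RFID identifiers, of a drive-lifecycle registry that only allows reuse after a wipe:

```python
# Hypothetical drive-lifecycle state machine keyed by RFID tag.
ALLOWED = {
    "in_service": {"pending_wipe"},
    "pending_wipe": {"wiped", "destroyed"},
    "wiped": {"in_service", "destroyed"},  # reuse only after a wipe
}

class DriveRegistry:
    def __init__(self):
        self.state = {}  # rfid -> lifecycle state

    def register(self, rfid: str):
        self.state[rfid] = "in_service"

    def transition(self, rfid: str, new_state: str):
        current = self.state[rfid]
        if new_state not in ALLOWED.get(current, set()):
            raise ValueError(f"{rfid}: cannot go {current} -> {new_state}")
        self.state[rfid] = new_state

registry = DriveRegistry()
registry.register("RFID-00042")
registry.transition("RFID-00042", "pending_wipe")
registry.transition("RFID-00042", "wiped")
registry.transition("RFID-00042", "in_service")  # wiped drive returns to service
```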
There are already working examples of robotics in the data center. Tape archives seen at Google and high-performance computing data centers use robotic arms to locate and retrieve backup storage tapes. (For an example, see this video of a system in action at the NCAR data center).
What Will Be Different?
What might a robot-driven “lights-out” data center look like? There would be rail-based robotics capable of scaling the entire data center. Here’s an interesting wrinkle: the modern data center would no longer be limited by horizontal expansion space. When using robotics, data centers can literally scale upwards. Utilizing space in the best possible manner is always a challenge for data center providers, so having the ability to scale both horizontally and vertically becomes a huge advantage.
“These robotics can scale the entire rack, which can now be much taller because these intelligent robots can reach higher,” said Jackson. “Once a part is removed, a conveyor at the bottom can move the part to the appropriate floor space. Furthermore, detailed vision technology has progressed a long way as well. Solutions like Cognex are able to allow machines to take pictures of a device, barcode and many other variables to help identify the part’s destination or origin.”
Large organizations that invest heavily in their data center infrastructure are actively exploring robotics solutions to help them better control their data centers. IT shops such as Amazon and Google are looking at ways to create a fully automated, lights-out data center. AOL has taken a first step in that direction with an unmanned data center facility.
The Cost Equation
As with any technology, costs for custom data center robotics will start high and come down as time progresses and platforms become smarter. Smaller robotics are already becoming less expensive. Manufacturers like FANUC develop large machines, but they also create smaller, more agile robots. Models like the LR Mate and the M-1iA are paving the way for super-agile, fast robotics capable of granular part identification and distribution.
Data center, automation and robotics technologies have all come a very long way over the past decade. From the warehousing perspective, robots already know where everything is located and how to put things in order, and they can interact directly with human-created automation scenarios. Because of robotics, something very interesting has happened: instead of the human going to the warehouse, the warehouse comes to the human.
Soon it will be possible to do this at the data center level.
This would enable entirely new approaches to operations. Your data center will be able to run at a different temperature, you won’t need any lights, and you can integrate your new robotics platform directly into a modern-day automation and orchestration platform. From a central command center, the human operator can maintain visibility into the data center environment, the robotics infrastructure and the workloads being managed. This can all be done without the need for a single person on the data center floor.