Data Center Knowledge | News and analysis for the data center industry
Friday, October 9th, 2015
12:00p
What You Should Know About Using Third-Party Cloud Management Tools
While cloud providers usually offer powerful tool sets for managing your cloud environment, third-party tools can truly expand that environment’s capabilities. Oftentimes, highly dispersed cloud infrastructure requires a level of granular visibility that native hypervisor and cloud monitoring tools cannot provide. Since every environment is unique, it will be up to IT managers to decide which approach is best for them.
Third-party tools aim to provide a “single pane of glass” for cloud data center monitoring. These software packages look to unify cloud management tools to provide a global view of an entire infrastructure. Based on the location of the data center, administrators can drill down to see all the necessary components to make sure their environments remain healthy.
What to Look for in Third-Party Tools
The idea behind acquiring third-party cloud management tools is to offset what native tools can’t manage or don’t see. Prior to going with any new tool set, administrators must take stock of what their existing environment contains and what it is running. Based on those initial findings, IT managers can then make informed decisions about which features they’ll need. Here are some of the features to look for when exploring third-party tools:
- Distributed environment management: The biggest benefit of third-party tools is visibility into a distributed cloud environment. When an organization has multiple cloud points, administrators must be able to have granular visibility into the operations of all data center locations. Your tool must be able to see everything happening within each cloud location. This means monitoring and managing everything, from resources to application load-balancing, and even managing user count.
- SLA management: A big feature of third-party cloud management tools is their ability to monitor SLA requirements. This can be QoS monitoring or even server-specific uptime metrics. By making sure an SLA is being met, administrators can make their environment operate to its fullest capacity.
- Disaster recovery: One goal of obtaining a third-party tool set is to enhance disaster recovery functionality. As a pre-planned initiative, administrators must know what the tool set has to offer as far as DR features. By having a DR plan in place, administrators can make the right choices around their tool set, especially when high-availability and DR failover are concerned.
- Workflow automation: A nice feature to use is the ability to automate some processes within the cloud environment. For example, if a cloud data center sees a spike in user count on a specific server, a process can be in place to mitigate it: a software tool on the backend automatically spins up new VMs to absorb the additional load (see the sketch after this list).
- Global resource control: With multiple points, cloud environments can become difficult to manage. Carefully examining resources on a global scale can potentially exacerbate the management issue. This is where third-party tools can really help. Setting up alerts, monitoring protocols, and even automated recovery procedures are all potential functions of a tool set. The ability to carefully manage resources on a distributed level is a powerful feature that some administrators may want to leverage.
- Advanced alerting capabilities: Above and beyond sending out emails to the right administrator, some organizations are seeking even more from their alerting strategies. This means placing automated phone calls, sending a text message or even having an automated internal response system. Some third-party tools are built around this sort of advanced alerting mechanism and can really help out an IT environment seeking this type of detailed monitoring.
- Security auditing: When working with third-party tools, creating a security audit trail may be required by some organizations. Companies with strict standards and even compliance protocols may require a tool set capable of very granular security logging and monitoring. This is where administrators can take advantage of tool sets geared towards a security conscious environment.
- Application visibility: Some third-party cloud management tool sets have the ability to see how an application is performing, which is where many native tools fall short. This can be as granular as logging errors, securing access, and monitoring performance. Depending on the type of application being delivered, some administrators may find there to be a need for a software feature capable of monitoring over specific cloud-based applications.
- Chargeback: These features are great for organizations attempting to put a dollar figure on the various departments that consume specific cloud workloads. Using this feature, administrators can forecast department-based growth and work with budgeting to best fit the IT team’s needs.
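To make the workflow automation item above concrete, here is a minimal sketch of the kind of autoscaling loop a backend tool might run, written in Python. Everything in it is an assumption for illustration: the threshold, the server names, and the two helper functions stand in for whatever monitoring and provisioning APIs a real tool set would call.

```python
import random
import time

# Hypothetical policy values; real tools expose these as configurable thresholds.
USER_THRESHOLD = 500
POLL_SECONDS = 5

vm_counter = 0

def get_user_count(server: str) -> int:
    """Stand-in for a monitoring API call; here it just simulates a fluctuating user count."""
    return random.randint(300, 700)

def spin_up_vm(template: str) -> str:
    """Stand-in for a provisioning API call; returns the name of the newly created VM."""
    global vm_counter
    vm_counter += 1
    return f"{template}-{vm_counter:03d}"

def autoscale_once(servers):
    """Check each server and add capacity wherever the user count exceeds the threshold."""
    for server in servers:
        users = get_user_count(server)
        if users > USER_THRESHOLD:
            new_vm = spin_up_vm("web-frontend")
            print(f"{server}: {users} users > {USER_THRESHOLD}, started {new_vm}")
        else:
            print(f"{server}: {users} users, no action")

if __name__ == "__main__":
    for _ in range(3):  # a real tool would loop indefinitely as a service
        autoscale_once(["web-01", "web-02"])
        time.sleep(POLL_SECONDS)
```

The point of the sketch is the pattern of detecting a condition and automatically triggering remediation, rather than any particular API.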
Things to Look Out For
Although these tools come feature-packed, it’s easy to get lost in all the functionality. Just like native tools, these software packages are not all-encompassing and will have their drawbacks. However, with more planning and understanding of the environment, administrators can make better decisions and avoid purchasing tool sets with features they won’t use. Here are a few things to be cautious of:
- Training: As with any new tool set, third-party tools will require additional training. Remember, just because the feature is there does not mean administrators will know how to use it. Take the time and learn your new tools since that is the best way to gain the most benefit from that software package.
- Setup and configuration: As opposed to native tools, which usually install as part of a package, working with third-party tools will require additional configuration. Sometimes this can be as easy as running a wizard, while other times it will be a detailed process to integrate existing components into the third-party monitoring tool set. Skipping steps or misconfiguring a third-party tool set can be a waste of dollars and, even worse, can have detrimental effects on the cloud environment.
- Testing and maintenance: It will be up to the administrator to ensure their third-party tool set is operating properly. Since third-party tools are installed on top of an existing environment, it’s very important to occasionally verify the metrics and results that this tool is providing. This means ensuring optimal performance out of the tool set and testing the various vital elements that this tool was brought in to accomplish.
- Ongoing visibility: In a distributed cloud environment, maintaining ongoing visibility can become a challenge. This is where third-party tools can both help and hurt. By setting up administrative roles, IT managers can break up the duties that go into monitoring the sometimes expansive third-party tool set. If the team isn’t well prepared or trained, all of the information they are seeing or gathering may go unused. From there, faults and errors can start to take effect on an environment since data isn’t being correlated properly.
- Alerting: Even with advanced alerting capabilities, the first step is to get the setup right; the second is to maintain the system continuously. Just because a tool has the capabilities does not mean it will execute out of the box. Make sure all alerting is set up properly and regularly tested (a simple synthetic-alert test is sketched below).
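One way to act on the alerting caution above is to schedule a synthetic alert that exercises the notification path end to end. The short Python sketch below assumes an email-based path; the SMTP relay, addresses, and subject line are placeholders, so substitute whatever your alerting system actually uses (a paging or SMS gateway would follow the same pattern).

```python
import smtplib
from email.message import EmailMessage

# Hypothetical relay and addresses; point these at the path your real alerts travel.
SMTP_HOST = "mail.example.com"
FROM_ADDR = "noc-test@example.com"
TO_ADDR = "oncall@example.com"

def send_synthetic_alert() -> None:
    """Send a clearly labeled test alert so the on-call team can confirm delivery."""
    msg = EmailMessage()
    msg["Subject"] = "[TEST] Synthetic alert - please acknowledge"
    msg["From"] = FROM_ADDR
    msg["To"] = TO_ADDR
    msg.set_content(
        "This is a scheduled test of the alerting pipeline. "
        "No action is needed beyond acknowledging receipt."
    )
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    # Run from cron or another scheduler, and track whether each test was acknowledged.
    send_synthetic_alert()
```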
Your tools are the mechanisms that help your data center operate efficiently. Whether you’re working with native tool sets or utilizing third-party solutions, always ensure proper alignment with your data center ecosystem. This means knowing where your business demands are growing and which tools can be the drivers to get you there.
3:00p
Accenture and AWS Create New Business Group to Ease Cloud Adoption 
This article originally ran at Talkin’ Cloud
LAS VEGAS — Businesses are moving to the cloud, but they’re not doing it alone. As part of its partner push this week, AWS has teamed up with Accenture (ACN) to create a new business group designed to help businesses move to the cloud.
In an announcement at AWS re:Invent on Wednesday, Accenture COO Omar Abbosh said that initially the Accenture AWS Business Group will offer cloud migration and big data analysis, with a range of services in other areas being added over time, including exploring the development of services around the Internet of Things (IoT) and Security on the AWS Cloud.
Accenture recently acquired Cloud Sherpas in order to scale its cloud consulting capabilities, boosting its Accenture Cloud First Applications team.
The group brings together dedicated professionals from each company with expertise in cloud solutions architecture and development, marketing, sales, and business development. It will develop a suite of services that include application migration services, architecture design and application development for the AWS Cloud.
Abbosh told the audience at the AWS re:Invent keynote that AWS and Accenture will be putting together a set of capabilities and offerings born in the cloud, which complements Accenture’s cloud-first approach.
As part of the announcement, Accenture is training 1,000 employees and certifying 500 of them on AWS, Abbosh said.
Accenture and AWS have worked together for nearly 10 years, collaborating on end-to-end cloud migration and management services last year, counting Discovery Networks International and Wire and Wireless Co as customers.
This first ran at http://talkincloud.com/cloud-computing-events/accenture-and-aws-create-new-business-group-ease-cloud-adoption
4:47p
Weekly DCIM Software News Update: October 9
Nlyte adds a business intelligence angle to data center management, while ABB and Kepware expand their reach into the data center.
Nlyte announces integration with Tableau. Nlyte Software announced that it has integrated its data center service management (DCSM) solution with Tableau, a tool that enables visualization and sharing of complex data. Robert Neave, co-founder, CTO, and vice president of product management at Nlyte, said that “enterprises must have clear visibility into all aspects of their operations to ensure their data centers run with optimal efficiency. Enterprises can now leverage the power of both Nlyte DCSM and Tableau to improve oversight of IT services the data center provides.”
Kepware DCIM solution integrated with ABB Decathlon. Kepware Technologies announced that ABB has integrated KEPServerEX with its DCIM solution, Decathlon for Data Centers. By integrating KEPServerEX, the DCIM platform expands its reach to a multitude of devices and systems, enabling organizations to see a single data set from multiple sources within the data center.
5:02p
Alibaba Launching Second Silicon Valley Data Center
AliCloud, the cloud services arm of Chinese e-commerce giant Alibaba, is launching its second Silicon Valley data center, the company announced Friday, inviting potential customers to sign up for services in the new facility starting October 12.
Alibaba’s push into the US cloud services market, putting the company in direct competition with the likes of Amazon, Microsoft, and Google in their biggest market, started earlier this year with the announcement of the first Alibaba data center in Silicon Valley. The company has not elaborated whether it has been leasing space from data center providers in California or building its own facilities.
AliCloud also struck a number of partnerships with companies to integrate new products and services into its cloud platform. The new partners are Mesosphere, Bankware Global, Appcara, Appnovation, Cloud Comrade, and Panzura.
According to its statement issued Friday, the second Silicon Valley site will be its fourth in the US. Alibaba has not said where in the US the other two cloud data centers are.
One of the most recent AliCloud data centers that came online is in Zhejiang Province, China. Using a number of cutting-edge technologies, the facility is a showcase for modern cloud data center design.
The company now has nine cloud data center sites around the world, which is fewer than Amazon Web Services or Microsoft Azure have but more than Google. The Alphabet subsidiary has a massive global data center fleet to deliver its online services, but far from all of them host its cloud infrastructure services.
The number of cloud locations is an important attribute of a cloud provider. The more distributed a provider’s infrastructure is, the more attractive it is for customers that deliver services globally. Distributed infrastructure also gives customers more options for failover locations.
After the recent launch of a cloud region at its South Carolina data center, Google provides cloud services out of four locations. AWS has four general-public regions and one government cloud region in the Americas, plus two in Europe and four in Asia Pacific.
Microsoft Azure cloud lives in seven locations in the Americas, two in Europe, and nine in Asia Pacific, including the three recently launched India regions.
6:38p
Safe Harbor Ruling Leaves Data Center Operators in Ambiguity
Europe’s annulment of the framework that made it easy for companies to transfer data between data centers in Europe and the US while staying within the limits of European privacy laws has caused a lot of uncertainty for businesses that operate data centers on both sides of the Atlantic.
US cloud services giants have taken steps to make sure they continue to provide services legally using means other than the Safe Harbor framework, but actual consequences of the European Court of Justice ruling earlier this week remain unclear.
David Snead, an attorney and co-founder of the Internet Infrastructure Coalition, a US advocacy group whose members include Google, Amazon, and Equinix, among many others, said there were currently two “schools of thought” on the subject.
“One is that Safe Harbor is dead,” he said. “The other, which I think is actually the accurate answer, is that the European Union, and the European Commission in particular, need to figure out how to interpret the ruling.”
Internet businesses will continue to operate in ambiguity until the commission issues its interpretation.
“It is unrealistic to think that all transatlantic data is going to have to stop as a result of this decision,” Snead added. “The European Commission is likely to figure out a way to accommodate it, and the US is as well.”
Safe Harbor, created in 2000, is a uniform set of rules for handling personal data of citizens of member states of the European Union, including rules around moving that data to facilities in the US and storing it there. If a service provider complied with the rules, they could be confident that they were not breaking any European privacy laws.
Former NSA contractor Edward Snowden’s public disclosure of the US spy agency’s covert electronic surveillance practices, however, eroded trust in Safe Harbor. The court’s ruling this week that Safe Harbor led to privacy violations was a culmination of a process that started with a lawsuit by Austrian privacy advocate Max Schrems in Ireland against Facebook, charging that the social network was violating his privacy rights by complying with the NSA.
The court in Ireland sided with Facebook, citing Safe Harbor. Schrems’s appeal with the EU court resulted in this week’s ruling.
Model Clauses Not for Everyone
Choosing not to wait for the European Commission’s interpretation of the ruling, Amazon, which operates the world’s largest cloud services business with customers around the globe, has obtained approval from EU data protection authorities for a data protection agreement and so-called “model clauses,” which, according to the company, enable it to continue serving its European customers legally.
“With our EU-approved [Data Protection Agreement] and Model Clauses, AWS customers can continue to run their global operations using AWS in full compliance with EU law,” an AWS spokesperson said in an emailed statement. “The AWS DPA is available to all AWS customers who are processing personal data, whether they are established in Europe or a global company operating in the European Economic Area.”
Other US cloud giants, including Salesforce, Microsoft, and Google, have also taken the model-clause route.
Model clauses are model data privacy agreements individual EU members can make with companies to give them assurances that they’re operating within legal limits. While any company can use this approach, it is a cumbersome process, Snead said. They sometimes have to be negotiated, and not all EU members have model clauses approved.
Service Providers Left Fending for Themselves
Operating in Europe without Safe Harbor is going to be a lot more complicated. Model Clauses or not, privacy rules in countries like Germany, for example, are very strict, and now that there isn’t a blanket compliance framework, service providers are left fending for themselves in each European market.
“This is an unfortunate and costly ruling and undermines the long-standing commitment that infrastructure providers have used to implement data protection methods for customer data,” Andreas Gauger, chief marketing officer and co-founder at ProfitBricks, a German cloud services company, said in an emailed statement. “Quality IaaS providers provide customers with secure, cloud-based virtual infrastructure, and are flexible enough to … give customers control over their data, encryption methods, and data transfer methods.”
Service Providers Not the Only Ones Affected
While cloud service providers are the most obvious category of businesses affected by the ruling, it can be disruptive for any international organization that has some part of its operations in the EU, Cliff Moyce, with Data Art, a New York-based software development and consulting company, said in an email.
Such organizations, including banks, for example, “will need to review their business processes, systems, controls, and agreements (including customer, supplier and personnel agreements) to ensure compliance for any data sharing and data transfer activity that crosses borders,” Moyce said.
‘Data Transfer’ an Antiquated Concept?
The concept of “data transfer” is an old-fashioned one, Moyce added. Today, data is accessed rather than transferred. “Modern systems infrastructures mean that data can be accessed from anywhere,” he said. “The secret to compliance is control of access, not control of ‘transfer.’”
The Need for a Global Discussion on Surveillance
Fundamentally, the ruling should be a wakeup call to US Congress that “the world still cares about US surveillance activity, and that the US needs to continue to show that it respects the privacy of the world’s internet users,” Snead said.
The conversation shouldn’t be limited to the US, since government surveillance is an international issue, he added. “The reality is that you want to be safe from any governmental spying. No contract is going to keep the German government from spying on you or compelling your German data center to provide access to them without notifying you.”
It is important to stop saying internet surveillance is a US government problem, a German government problem, or a Chinese government problem, Snead said. “This is a global problem, where governments are seeking access to data in ways that users don’t know.”
7:48p
In Dell, EMC $50B Takeover Deal, Activist Elliott Could Win Again 
This post originally appeared at The Var Guy
A year ago, reports surfaced that EMC had huddled with HP sporadically for about a year to discuss merger options and done the same with Dell, which at the time was seen as a longshot in any takeover deal.
But a deal with Dell apparently is back on the table, and it’s potentially a doozie. According to reports, Dell and private-equity firm Silver Lake are nearing a conclusion to talks that would see the computer maker buy out the storage giant for some $50 billion.
Should a deal go through, it would be yet another victory for activist investor Elliott Management, which took a 2 percent stake in EMC valued at about $1 billion more than a year ago and has been pressuring EMC to sell off its VMware and Pivotal positions and alter its federation model to improve the company’s earnings performance.
Last January, EMC and Elliott agreed to a temporary standstill with a deal to add two mutually acceptable directors to the storage giant’s board.
Elliott’s history includes taking large positions in Blue Coat Systems, BMC Software, Compuware, NetApp, Novell, and Riverbed and, in some cases, paving the way for private equity firm Thoma Bravo to step in and buy out some of those companies and in others pressing for selloffs or management changes to boost shareholder value.
Elliott owns a 9 percent stake in Juniper, which late last February named two independent directors to its board supported by the activist investor.
With a $50 billion market value, more than $23 billion in annual sales and some 60,000 employees worldwide, EMC, whose federated business model spans data storage, virtualization and software development, would be a whale for any IT company to swallow whole.
Earlier talks between EMC and HP ultimately broke down over concerns on both companies’ part that shareholders would nix any deal. It will be interesting to see if that happens here with Dell considering how much influence Elliott now commands.
This first ran at http://thevarguy.com/information-technology-merger-and-acquistion-news/100815/dell-emc-50-billion-takeover-deal-activist-elliott
8:08p
AWS Launches Internet of Things Cloud Platform 
This article originally appeared at The WHIR
In a highly anticipated announcement at AWS re:Invent this week, AWS moved into the Internet of Things (IoT) space on Thursday with the launch of AWS IoT, a managed cloud platform that lets connected devices interact with AWS services. The platform is currently in beta.
According to the announcement, devices connect to AWS IoT’s Device Gateway, and manufacturers set rules for what AWS IoT does with the data they send.
AWS IoT provides an SDK to make it easy for developers to use the AWS IoT functionality from connected devices, as well as mobile and web applications.
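For a sense of what the device side of that flow can look like, the sketch below uses the open-source paho-mqtt library (not the AWS IoT Device SDK itself) to show roughly how a connected device might publish telemetry over MQTT with TLS client certificates. It illustrates the general pattern only, not AWS IoT’s actual interface; the endpoint, topic, certificate paths, and payload are placeholders, and the paho-mqtt 1.x API is assumed.

```python
import json
import ssl
import time

import paho.mqtt.client as mqtt  # third-party: pip install paho-mqtt (1.x API assumed)

# Placeholder broker endpoint, topic, and credential files; substitute the values
# issued for your device by whatever IoT platform or broker you actually use.
ENDPOINT = "broker.example.com"
PORT = 8883
TOPIC = "devices/thermostat-01/telemetry"

client = mqtt.Client(client_id="thermostat-01")
client.tls_set(ca_certs="root-ca.pem",
               certfile="device-cert.pem",
               keyfile="device-key.pem",
               tls_version=ssl.PROTOCOL_TLSv1_2)

client.connect(ENDPOINT, PORT)
client.loop_start()  # background network loop handles keepalives and acknowledgments

for reading in range(3):
    payload = json.dumps({"temperature_c": 21.5 + reading, "ts": int(time.time())})
    client.publish(TOPIC, payload, qos=1)  # QoS 1: broker acknowledges receipt
    time.sleep(5)

client.loop_stop()
client.disconnect()
```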
While it’s still relatively early days for IoT, AWS public cloud competitors like Microsoft Azure already have an IoT offering.
“The promise of the Internet of Things is to make everyday products smarter for consumers, and for businesses to enable better, data-driven offerings that weren’t possible before. World-leading organizations like Philips, NASA JPL, and Sonos already use AWS services to support the back-end of their IoT applications,” Marco Argenti, vice president of Mobile and IoT at AWS, said in a statement. “Now, AWS IoT enables a whole ecosystem of manufacturers, service providers, and application developers to easily connect their products to the cloud at scale, take action on the data they collect, and create a new class of applications that interact with the physical world.”
As part of the announcement, a number of semiconductor manufacturers also have Starter Kits powered by AWS IoT that embed the AWS IoT Device SDK.
AWS is once again tapping its partners to help customers use its platform. AWS Partner Network Partners offer operating systems, management platforms, analytics and services that work with AWS IoT. Partnerships will certainly be a key strategy for adoption of AWS as it will need buy-in from manufacturers to be successful.
This first ran at http://www.thewhir.com/web-hosting-news/aws-launches-internet-of-things-platform