Data Center Knowledge | News and analysis for the data center industry
Friday, March 21st, 2014
1:04p | Guide to Cloud Monitoring Tools and Best Practices

There is a clear boom in data center and cloud utilization. This trend continues as more organizations find value in moving part (or all) of their infrastructure into the cloud. New ways to deliver applications, process large workloads, and empower the end-user are all reasons to work with some type of cloud model.
Still, there have been growing pains throughout the process. Administrators have had to deal with security, growth, and resource challenges. How do we control dynamic user growth? How can we continue to provide excellent support across such a distributed environment? What can we do now to help support a healthier cloud in the future?
IT has always had monitoring and management tools, so it only makes sense that these tools are extending into the cloud. There are some amazing solutions out there that directly answer the above questions and help you optimize your entire cloud platform. Let’s take a look at a few cloud monitoring tools and management solutions designed to help create a more proactive cloud.
- BMC Cloud Operations Management. Here’s your chance to create a complete cloud life-cycle and cloud operations management solution. BMC’s cloud monitoring platform enables IT organizations to deliver the speed and service quality that users expect out of their cloud. This model helps IT organizations right-size capacity and optimize their monitoring and management processes. Furthermore, this monitoring solution integrates with OpenStack, CloudStack, and other cloud management platforms through an open API and metadata-driven user interface.
- AppDynamics. How about a layered approach to application intelligence? Starting with the infrastructure and moving all the way up the stack, AppDynamics is uniquely able to deliver rich performance data, learning, and analytics, combined with the flexibility to adapt to virtually any infrastructure or software environment. Some of those layers include behavioral learning and working with contextual data. Basically, you’ll be able to granularly see application models, servers, services, devices, as well as information around network and machine infrastructure.
- CA Nimsoft. This is one of those really broad cloud monitoring options. From a services perspective, Nimsoft Monitor offers pretty much every necessary monitoring capability. Application monitoring support includes Apache systems, Citrix, IBM, Microsoft, SAP and more. Plus, if you’re working with an existing cloud infrastructure or management platform, Nimsoft integrates with Citrix CloudPlatform, FlexPod, Vblock and even your own public/private cloud model. The list of supported monitoring targets spans servers, networks, storage, virtualization and more. Basically, pick the monitoring solution you need and make sure it can integrate with your existing platform.
- New Relic. Already used by folks like Comcast, Citrix, GitHub and EA, New Relic offers a complete SaaS-based model for very granular cloud monitoring capabilities. This monitoring solution looks at the most critical components that make up a cloud-ready application. This includes SQL query analysis, application health statistics, transactional tracing, thread profiling, complete application mapping, and even proactive alerting. Whether the app is web, mobile, or server-based, New Relic takes numerous performance and optimization considerations into account in its monitoring algorithm (a brief instrumentation sketch follows this list).
- Hyperic. Did you know this is a division of VMware? Did you know Hyperic 5.0 has some pretty amazing management and monitoring solutions? Ranging from web to virtualization, this solution looks at operational intelligence and creates a powerful monitoring platform for a variety of systems. Hyperic can monitor web servers, a plethora of operating systems, applications, databases, mail servers, network environments, distributed platforms and even middleware messaging. As a component of the VMware vCenter Operations Management Suite, Hyperic collects a vast range of performance data. This includes 50,000 metrics across 80 application technologies, and it can easily extend to monitor any component in your application stack.
- SolarWinds. Let’s pretend for a few minutes that you have a completely private cloud infrastructure. Sure, you have some minor data elements that may span into a different public data center, but for the most part, it’s a private cloud life. There are solutions out there that help monitor and really optimize your infrastructure. Virtualization Manager from SolarWinds offers a comprehensive monitoring solution that integrates with your VMware or Hyper-V environment. Its real-time dashboards simplify identification and troubleshooting of performance, capacity and configuration problems. Plus, you can integrate this solution with Server and Application Monitor to provide application stack management from app to datastore.
- Boundary. These guys are pretty cool. Currently being run by the former Nimsoft CEO, Boundary aims to monitor and integrate with major cloud vendors to help control and manage applications as well as data. By integrating with Puppet Labs, AWS, Splunk, New Relic, AppDynamics, CA, BMC and many others, Boundary is able to enrich your monitoring solution with application topology and per-second streaming analytics. The idea is to aggregate a lot of data through Boundary and create a consolidated view. From there, you can run performance analysis, understand contextual navigation, examine application topologies, and even control architecture and APIs.
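As a rough illustration of how an agent-based SaaS monitor like New Relic hooks into application code, here is a minimal sketch using New Relic’s Python agent. Treat it as a sketch only: the config path and task name are placeholders, and the exact API may differ across agent versions.

```python
# Minimal sketch: instrumenting a background job with New Relic's Python
# agent. The ini path and task name are placeholders.
import newrelic.agent

newrelic.agent.initialize("/etc/newrelic.ini")  # agent config + license key
application = newrelic.agent.register_application(timeout=10.0)

@newrelic.agent.background_task(name="nightly-report")
def generate_report():
    # Work done here is reported as a transaction, with timing and traces.
    ...

if __name__ == "__main__":
    generate_report()
    newrelic.agent.shutdown_agent(timeout=10.0)  # flush data before exit
```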
Here’s the reality – that’s a pretty short list. There are many more solutions out there that can monitor every aspect of your cloud platform. Similarly, there are solutions designed to control and monitor very specific elements in your cloud, like application performance, database health, and network data flow. Regardless of the type of solution you work with, there are some very good cloud monitoring, management and health maintenance considerations:
- Utilize automation and proactive remediation services wherever possible (a minimal sketch of this pattern follows this list).
- Never forget to set good access control policies and always monitor security access.
- “Who watches the watchmen?” Always ensure that your monitoring system is running optimally and that configurations are kept updated.
- Not all workloads, apps, or data sets are alike – make sure to create appropriate monitoring profiles as needed.
- Take the time to understand your own cloud and all its intricacies and dependencies before selecting a monitoring solution. The more you know, the better a monitoring tool can fit in.
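To make the automation point concrete, here is a minimal sketch of a health-check loop with proactive remediation. Everything in it – endpoint URLs, thresholds, and the restart command – is an illustrative assumption, not taken from any of the vendors above.

```python
# Hypothetical sketch: a minimal health-check loop with proactive remediation.
# Service endpoints, thresholds, and the restart command are placeholders.
import subprocess
import time
import urllib.request

SERVICES = {
    "web-frontend": "http://10.0.0.10/healthz",
    "billing-api": "http://10.0.0.11/healthz",
}
CHECK_INTERVAL_SECONDS = 30
MAX_FAILURES_BEFORE_RESTART = 3

failures = {name: 0 for name in SERVICES}

def is_healthy(url: str) -> bool:
    """Return True if the endpoint answers 200 within 5 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

while True:
    for name, url in SERVICES.items():
        if is_healthy(url):
            failures[name] = 0
            continue
        failures[name] += 1
        print(f"ALERT: {name} failed check {failures[name]} time(s)")
        if failures[name] >= MAX_FAILURES_BEFORE_RESTART:
            # Proactive remediation: restart the service before users notice.
            subprocess.run(["systemctl", "restart", name], check=False)
            failures[name] = 0
    time.sleep(CHECK_INTERVAL_SECONDS)
```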
Keeping a cloud environment proactively healthy is never an easy task. In fact, it’s an ongoing battle against security threats, outages, broken systems, and random acts of IT failure. Still, more organizations and users are flocking to cloud platforms that offer many new types of services and solutions. Remember, utilizing more than one cloud monitoring or management solution is not a bad thing. For your own cloud system, work with an appropriate set of monitoring tools to help ensure a healthier infrastructure.
1:34p | Friday Funny: Warning Signs

Thank Goodness It’s Friday! It’s been a long week around here at DCK, and we are looking forward to the weekend. But first, it’s time for some humor!
Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge our readers to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.
Here are Diane’s thoughts on this week’s cartoon: “Do you ever really look at warning signs in the data center? Usually there are a lot of them, but people seem to just glaze over when they read them. Maybe we should be reading them more carefully!”
Congrats to the last cartoon winner, Jim Leach from RagingWire, who submitted, “He’s our new disaster recovery consultant.”
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit the Kip and Gary website.
1:48p | Security, Hybrid Cloud Present Business Opportunities: Microsoft Study

This article originally appeared on The WHIR.
Hosting providers are often told that security is one of the biggest barriers to customers adopting cloud. Following that logic, it should come as no surprise that security could be the most lucrative cloud opportunity for web hosts over the next few years, a new report by Microsoft suggests.
According to Hosting and Cloud Go Mainstream: 2014, a study conducted by 451 Research and released on Wednesday, 7.1 percent of organizations still believe that security concerns and issues are their single biggest challenge over the next two years.
“While cloud environments are significantly changing the way businesses operate today, one thing that hasn’t changed is the importance of security. As a result, security has emerged as the primary, and potentially most lucrative, cloud opportunity for hosters,” said Michelle Bailey, senior vice president of Digital Infrastructure and Data Strategy at 451 Research. “Hosting is now the de facto solution for ‘trusted cloud’ implementations, and customers are willing to pay a premium for assurances. Our research shows that 60 percent of customers would pay their hosting service provider a 26 percent premium on average for security guarantees – and an additional 25 percent are already paying for such services.”
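Putting those two figures together: if 60 percent of a customer base pays a 26 percent premium, the expected uplift across all hosting revenue is roughly 15.6 percent. A quick back-of-the-envelope check, with an illustrative revenue figure:

```python
# Back-of-the-envelope check of the 451 Research figures cited above.
# The base revenue number is illustrative.
base_revenue = 1_000_000   # annual hosting revenue ($)
share_willing = 0.60       # customers willing to pay for security guarantees
premium = 0.26             # average premium they would pay

uplift = base_revenue * share_willing * premium
print(f"Expected uplift: ${uplift:,.0f} ({share_willing * premium:.1%} of base)")
# -> Expected uplift: $156,000 (15.6% of base)
```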
Aside from cloud security, the report looks into the phases and types of cloud deployment at more than 2,000 organizations around the world, in a variety of fields including manufacturing, finance and banking, science and tech, healthcare, government and education.
According to the report, on-premises private cloud adoption accounted for 26 percent of on-premises infrastructure spending last year. Hosted private cloud will account for 32 percent of hosted spending in the next 24 months.
In terms of hybrid cloud implementation, 51 percent of organizations surveyed said they had configured a hybrid cloud deployment. Combining an on-premises private cloud with a hosted private cloud was the most popular hybrid cloud configuration, with 60 percent of hybrid users having deployed this type.
Microsoft noted the trend towards hybrid cloud adoption in a study last year, called The New Era of Hosted Services. In an interview with the WHIR, Microsoft’s Marco Limena said the company was in a position to capitalize on this trend given its vast partner channel. Over the past two years, Microsoft has added around 9,500 hosting service providers.
Recently, Microsoft launched ExpressRoute, a new service that offers private connections between customer data centers and Windows Azure, enabling them to use the Azure public cloud as an extension of their private deployments.
Enterprise hybrid cloud adoption will help drive the market, which is expected to reach $79.54 billion by 2018, according to a MarketsandMarkets study.
The study finds that 45 percent of organizations are moving past the pilot phase of their cloud computing deployments, and more than 30 percent now have a formal cloud computing strategy in place.
This article was originally published at: http://www.thewhir.com/web-hosting-news/security-hybrid-cloud-present-lucrative-opportunities-hosting-service-providers-microsoft-study
2:00p | Hybrid Cloud: Creating a Roadmap for a Cloud-enabled Enterprise

Hybrid cloud, which draws from multiple in-house and external resources, is the hot trend in 2014. To date, organizations have taken a “lite” approach to in-house private cloud using commodity hardware, virtualization and dynamic provisioning technology. However, the increased availability of ‘cloud operating systems’ such as OpenStack makes it easier to replicate public cloud capabilities in-house. Thus, the concept of a hybrid cloud infrastructure has become popular.
Still, many IT shops are faced with challenging questions around their cloud environment. This white paper from Equinix examines the key factors to weigh when deciding on resource placement and utilization (a scoring sketch follows the list). These include:
- Bandwidth between the organization and the Internet
- Resourcing, technical skills and knowledge available in-house
- Security constraints and data compliance criteria
- Performance and availability stipulations
- Flexibility requirements and service level guarantees
- Available budgets and procurement constraints
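One way to operationalize such a checklist is to score each workload against the criteria and let the aggregate score suggest a placement. The sketch below is a hypothetical illustration – the criteria names, the 0-10 scale, and the threshold are assumptions, not part of the Equinix white paper.

```python
# Hypothetical sketch: scoring a workload against the placement checklist.
# Criteria names, the 0-10 scale, and the threshold are all assumptions.
CRITERIA = [
    "bandwidth", "in_house_skills", "security_compliance",
    "performance_availability", "flexibility_sla", "budget",
]

def placement_score(scores: dict) -> float:
    """Average the 0-10 criterion scores; higher favors public cloud."""
    return sum(scores[c] for c in CRITERIA) / len(CRITERIA)

web_tier = {
    "bandwidth": 9, "in_house_skills": 4, "security_compliance": 8,
    "performance_availability": 7, "flexibility_sla": 9, "budget": 8,
}

score = placement_score(web_tier)
suggestion = "public cloud" if score >= 6.0 else "private cloud / in-house"
print(f"{suggestion} (score: {score:.1f})")
```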
Such criteria can be used as a checklist for decision making. An important point to note is that ‘hybrid’ does not mean ‘hotchpotch’, in which anything goes. Rather, it means working proactively at the architectural level, ensuring workloads are in the right places and that the right (high bandwidth, low latency, high availability, high security) communications paths exist between them.
Download this white paper today to learn about the key steps to creating a cloud-enabled enterprise. In understanding the environment around cloud technologies, your organization will see the six main steps in a cloud deployment roadmap. These include:
- Legacy infrastructure – The starting point for most enterprises with in-house legacy systems, reflecting in-house technology delivery across the latter part of the last century.
- Dedicated connection – Private connection from a choice of carriers, bypassing the Internet to reach the Equinix ecosystem of best-of-breed cloud providers.
- Enterprise Cloud Gateway – Small footprint with core services that deliver Single Sign On (SSO) from legacy systems to public cloud services, even during in-house outages.
- Private cloud – Virtualization infrastructure that scales over time as business applications are migrated to it, minimizing risk and capital investment.
- Hybrid cloud – Seamlessly integrated public, private and legacy components, giving businesses access to the Platform Equinix ecosystem, enabling them to directly access their customers, suppliers and partners while offering maximum flexibility and agility.
- Public cloud – In the long term public cloud services displace legacy and private components, with Platform Equinix being the neutral point of interconnection.
Remember, cloud computing will only continue to expand and impact the modern organization. In creating your own environment, take the time to understand all of the critical cloud models and how they can help your business adapt to ever-changing industry demands.
2:11p | Yahoo: We’ll Save 4.6 Million Kilowatt Hours of Energy With SynapSense

Yahoo expects to save 4.6 million kilowatt hours of energy annually at its Quincy, Washington data center through a partnership with SynapSense, which provides wireless monitoring and cooling control solutions. The two companies are working together to reduce cooling costs, recover stranded cooling capacity and improve operational resiliency of the data center’s cooling infrastructure.
“Reducing the cooling energy in the Yahoo data center by 37 percent demonstrates the effectiveness of our partnership and the SynapSense Wireless Monitoring and Cooling Control solution,” said Bart Tichelman, President and Chief Executive Officer of SynapSense. “Working together, we achieve these objectives while increasing data center reliability, and pay for the system with the energy savings.”
The project consists of three phases. In phase one, wireless environmental monitoring is deployed. In phase two, the Yahoo team and SynapSense Professional Services fine-tune the deployment, using SynapSense’s advanced tools and metrics to optimize and balance airflow. In phase three, the SynapSense system takes over, automatically maintaining efficient operating conditions by dynamically matching cooling output to varying equipment loads and conditions in the data center.
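Phase three is essentially a closed feedback loop: measure inlet temperatures, compare against a target, and trim cooling output accordingly. For scale, saving 4.6 million kWh at a 37 percent reduction implies a cooling baseline of roughly 12.4 million kWh per year. The sketch below is a heavily simplified illustration of the control idea – the sensor reads, setpoints, and adjustment rule are assumptions, and SynapSense’s actual control logic is proprietary.

```python
# Heavily simplified closed-loop cooling control in the spirit of phase three.
# Target, deadband, and step size are assumed values; the sensor and CRAC
# functions are stubs standing in for the wireless monitoring network.
TARGET_INLET_TEMP_C = 24.0   # assumed rack-inlet target temperature
DEADBAND_C = 1.0             # ignore fluctuations inside this band
STEP_C = 0.5                 # setpoint adjustment per control cycle

def read_max_inlet_temp() -> float:
    """Stub: would poll the sensor network for the hottest rack inlet."""
    raise NotImplementedError

def adjust_crac_setpoint(delta_c: float) -> None:
    """Stub: would nudge the CRAC supply-air setpoint by delta_c degrees."""
    raise NotImplementedError

def control_cycle() -> None:
    hottest = read_max_inlet_temp()
    if hottest > TARGET_INLET_TEMP_C + DEADBAND_C:
        adjust_crac_setpoint(-STEP_C)   # too warm: supply cooler air
    elif hottest < TARGET_INLET_TEMP_C - DEADBAND_C:
        adjust_crac_setpoint(+STEP_C)   # over-cooled: ease off, save energy
```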
“At Yahoo, we’re focused on making the world’s daily habits inspiring and entertaining — and our state-of-the-art data centers power our users’ daily habits,” said Chris Page, Yahoo’s Global Director for Energy and Sustainability Strategy. “Our partnership with SynapSense will further our efforts to ensure our data centers are as energy efficient as possible. We look forward to working together with SynapSense to maximize our cooling efficiency and lower our overall carbon footprint.”
2:22p | Teradata and SAS Solution Accelerates Big Data Analytics

Teradata (TDC) has been selected by UniCredit Group and NTT Docomo for big data solutions. At the SAS Global Forum executive conference next week, Rick Andrews of the Centers for Medicare & Medicaid Services (CMS) will discuss how the Teradata platform summarizes billions of rows of data in seconds, and how that performance enables the SAS system to analyze data immediately.
Combined Teradata and SAS solution aids UniCredit Group
Italian financial company UniCredit Group worked with SAS and Teradata to run advanced analytics on processes directly connected to the Teradata Data Warehouse. UniCredit was the first organization to adopt and deploy the Teradata Appliance for SAS, Model 720, running SAS Visual Analytics to fully integrate data and analytics in a streamlined business process. The SAS and Teradata combination enables UniCredit to explore and identify trends and patterns, helping the organization seize emerging opportunities, manage risks and make the right choices. The solution gives UniCredit answers to complex business questions in near-real time, providing new insights to analysts and boosting their productivity as well. The Teradata BYNET interconnect adds analytic servers that bring rich new tools and algorithms to the Teradata analytic environment, enabling high-speed, fault-tolerant, warehouse-optimized messaging between nodes.
“Our customers asked for appliance-ready offerings with quick time to value and low total cost of ownership. This is exactly what UniCredit is reporting,” said Rick Lower, CA-AM, Teradata Alliance Director. “As our many shared customers realize increasing value to their organizations, we are confident that we represent the most formidable analytics alliance in the world.”
NTT Docomo uses Teradata for a marketing platform
Teradata announced that it was selected by Japanese mobile phone operator NTT Docomo for a new marketing operations platform in its growing consumer credit services business. NTT selected the Teradata Integrated Data Warehouse Appliance with the latest release of the Teradata Database to empower marketers to access and utilize the data from its marketing management application. By eliminating the need to wait for IT data extraction, the Teradata environment will enable agile marketing operations with incredibly fast, parallel-processing scalability to process massive volumes of data.
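The pattern described here – pushing summarization into the warehouse so that only summary rows travel back to the client, instead of extracting raw data first – looks roughly like this from application code. This sketch uses Teradata’s Python DB-API driver, teradatasql, which post-dates this article; the connection details and the table and column names are invented for illustration.

```python
# Sketch of in-database summarization: the aggregate runs inside Teradata's
# parallel engine; only the summary rows return to the client. Connection
# details and the table/column names are invented for illustration.
import teradatasql

con = teradatasql.connect(host="tdprod", user="marketing", password="...")
try:
    cur = con.cursor()
    cur.execute("""
        SELECT region, COUNT(*) AS txns, SUM(amount) AS total_spend
        FROM card_transactions
        GROUP BY region
    """)
    for region, txns, total_spend in cur.fetchall():
        print(region, txns, total_spend)
finally:
    con.close()
```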
“We are proud to play a role in the evolution of NTT DOCOMO’s credit card business marketing operations, providing quick visibility into data for more insight and value opportunities across conversation points,” said Scott Sobers, Director, Communications Industry Marketing & Strategy at Teradata. “Looking ahead, our approach is to help our telecommunications customers evolve beyond traditional analytics to the power of next-generation capabilities. Teradata is actively advising and guiding telecommunications customers forward with the innovation to deliver new products and services, and offer cutting-edge customer services.”