Data Center Knowledge | News and analysis for the data center industry

Thursday, August 13th, 2015

    12:00p
    Where DCIM Software and Big Data Meet

    After a couple of years of DCIM buzz, the hype has died down, and while analysts are forecasting healthy growth for the DCIM software market overall, they expect most of that growth to happen among a handful of the biggest players in the space.

    What may make DCIM software increasingly relevant is progress in Big Data analytics. As companies deploy more sophisticated analytics systems to help with operations (and beyond), DCIM can provide a set of operational data about infrastructure that can be very useful to those systems.

    It’s already happening today to a certain extent, but such scenarios aren’t yet widespread. The current trend in IT management is toward automating infrastructure to support any particular application, and this application-aware IT automation concept may soon spread to the underlying data center infrastructure – the domain of DCIM software – too.

    The increasing focus by DCIM vendors on integrating with IT service management solutions – some people even say DCIM is becoming a subset of ITSM – is a step in that direction. DCIM tools today can tell you things like rack density, temperature, power consumption, where a server is, or where to put it to use the capacity more efficiently, but they don’t go much beyond that.

    “This is where DCIM really stops, and there is a gap between the business and the data center; even IT and the data center,” Richard Jenkins, VP of worldwide marketing at RF Code, says. RF Code sells sensors for data center instrumentation as well as software solutions for data center management.

    The future, according to him, is DCIM feeding data into Big Data analytics systems that companies use to make business decisions. “DCIM is a small piece of the overall picture,” he says. “The next wave of true Big Data will be an analytics platform that will stick between the data center and the business itself.”

    Nlyte Software, one of the major DCIM software vendors, has put a lot of effort into integrating with ITSM platforms and has also always made sure to have an open API, so that the data its software collects can be pushed to whatever systems need it. In that future of DCIM working as part of a holistic analytics-based management system, open APIs will be crucial.

    Robert Neave, Nlyte CTO and co-founder, says one of the company’s biggest customers pushes the asset management data Nlyte’s software collects from its massive data center infrastructure into Vertica, HP’s Big Data analytics platform, along with energy usage data collected by its facilities management system, network monitoring data, and other types of infrastructure data. The goal is to understand and manage the infrastructure demand of its cloud-based applications.
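
    To make that integration pattern concrete, here is a minimal, purely illustrative sketch of the kind of pipeline an open DCIM API enables: pull asset and power readings over HTTP and stage them for bulk loading into an analytics platform. The endpoint, field names, and CSV staging step are hypothetical placeholders, not Nlyte’s or HP Vertica’s actual interfaces.

```python
"""Illustrative sketch: pull asset/power data from a DCIM REST API and
stage it for bulk load into an analytics platform. The endpoint path,
credential, and field names are hypothetical placeholders."""
import csv
import requests

DCIM_API = "https://dcim.example.com/api/v1/assets"   # hypothetical endpoint
API_TOKEN = "changeme"                                # hypothetical credential


def fetch_assets():
    """Fetch asset records (rack, power draw, temperature) from the DCIM API."""
    resp = requests.get(
        DCIM_API,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # assumed to be a list of dicts


def stage_for_analytics(assets, path="dcim_assets.csv"):
    """Write a flat CSV that an analytics platform can bulk-load."""
    fields = ["asset_id", "rack", "power_watts", "inlet_temp_c", "timestamp"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(assets)


if __name__ == "__main__":
    stage_for_analytics(fetch_assets())
```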

    Neave declined to say who the company was, citing confidentiality agreements with the big customer, but said it was a high-tech company with about 120,000 data center racks under management.

    There are applications for Big Data within DCIM itself too, particularly when DCIM software does predictive analytics. This is a capability the team behind Emerson’s Trellis DCIM platform is working on, said Steve Geffin, VP of strategic initiatives at Emerson’s Network Power unit.

    Predictive analytics is a Big Data problem, Geffin says. Trellis uses Big Data to create and refine operational models for some customers. Infrastructure data gets fed into a Hadoop cluster and analyzed to help the customer make operational decisions, he says.

    One example is predicting when a UPS battery is going to fail. Data center operators usually replace batteries on a defined cycle, and when a battery gets replaced it doesn’t necessarily mean it has reached its end of life. Very often a battery would remain perfectly fine for another year, but it gets replaced as a precautionary measure. By analyzing patterns across many batteries’ lifecycles, replacement can be deferred until it’s actually necessary, which can save the operator a lot of money. “Problems like this you can only solve using Big Data techniques,” Geffin says.
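
    As a purely illustrative sketch of that idea (not Emerson’s actual method), the snippet below fits a linear trend to one battery’s periodic capacity tests and estimates when it will cross an assumed end-of-life threshold; at fleet scale, the same logic would run over many batteries’ histories. The 80 percent threshold, field names, and simple linear model are assumptions made for illustration.

```python
"""Illustrative sketch of predictive battery replacement: fit a linear trend
to a UPS battery's measured capacity and estimate when it drops below an
end-of-life threshold. The 80% threshold and the linear model are
illustrative assumptions, not any vendor's actual analytics."""
import numpy as np

EOL_THRESHOLD = 0.80  # assumed end-of-life: 80% of rated capacity


def predicted_eol_day(days, capacity_fraction):
    """Return the estimated day on which capacity crosses the threshold."""
    slope, intercept = np.polyfit(days, capacity_fraction, 1)
    if slope >= 0:
        return None  # no measurable degradation trend yet
    return (EOL_THRESHOLD - intercept) / slope


# Example: quarterly test readings for one battery string (hypothetical data)
days = np.array([0, 90, 180, 270, 360])
capacity = np.array([1.00, 0.97, 0.95, 0.92, 0.90])

eol = predicted_eol_day(days, capacity)
print(f"Estimated end-of-life around day {eol:.0f}"
      if eol else "No degradation trend detected")
```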

    3:00p
    Prepping Your Data Center for the Internet of Things

    Pierre Frick is the Vice President of Product Marketing at EnterpriseDB.

    By now, most IT professionals and CIOs are aware of the astounding statistics surrounding the Internet of Things (IoT), especially as it relates to the amount of data being collected by the growing network of connected devices. In fact, according to Cisco, there will be well over 50 billion connected devices by 2020.

    This growth is driven in large part by adoption: more than 82 percent of businesses will be using IoT applications by 2017, according to Forrester. The rapidly growing interest in harnessing the power of the IoT is also driving massive increases in spending. A recent study from Tata Consultancy Services found that 26 global companies plan to spend at least $1 billion each on IoT in the coming year. Given the tremendous cost of capturing this exponentially growing volume of data, and the competitive advantage such data can provide, companies that cannot afford to invest in the technology risk being left behind.

    Enterprises are becoming keenly aware that remaining competitive in their market depends on their ability to collect, manage, and analyze data. But a key element many companies overlook in the cost-benefit equation is the investment required at the back end of an IoT infrastructure. For companies that rely on low-cost devices – ranging from smartphones to facility sensors – to collect information, it will become increasingly cost prohibitive to purchase expensive software licenses and hardware to manage the data once it’s acquired. Open source software offers an innovative way to help companies control these rising costs.

    Today, open source-based solutions have the functionality, scalability and reliability to enable businesses to successfully cope with the explosion of data associated with the IoT. As Gartner noted in its April 2015 report, State of Relational Open Source RDBMSs 2015, “Open-source RDBMSs have matured and today can be considered by information leaders, DBAs and application development management as a standard infrastructure choice for a large majority of new enterprise applications.” The report also stated: “Information leaders who opt for an open-source DBMS (OSDBMS) licensing model can benefit from much lower costs than using a commercial model, even with today’s hosted cloud and database platform as a service (dbPaaS) offerings.”

    With these benefits in mind, open source databases are playing an increasingly vital role in the IoT-enabled data center, by helping CIOs free up the resources and budget that will help support the IoT and its applications.

    Keeping the IT ‘Lights’ On Could be Destroying Your Business

    ABI Research estimates there may be as much as 44 zettabytes of data by 2020. To put that into perspective, through 2013 humanity had produced only 2.7 zettabytes. The exponentially growing volume of data that devices are capturing is staggering, and as a result it’s becoming increasingly difficult for businesses to store it. For many CIOs, an initial reaction to this challenge would be to buy more hardware and software, or to create clusters to manage and store the data. However, CIOs often overlook the costly investment requirements at the back end of an IoT infrastructure. They would do well to consider open source solutions, such as Postgres, which are much more cost-effective and can perform just as well as proprietary IT systems.

    By implementing the leading open source database Postgres, CIOs can divert budget away from operational expenses and use the resulting cost savings to fund new, more strategic opportunities that improve business performance. For example, companies could invest these newfound funds in the marketing department’s customer engagement initiatives or in new mobile technologies.

    Postgres and Integrating Structured and Unstructured Data

    Another consideration for CIOs implementing a data center infrastructure strategy with the IoT in mind is the variety of data types produced by new applications and devices. Integrating unstructured and semi-structured data into relational tables made up of structured data is a major challenge. Today, many organizations have a patchwork of applications, with developers turning to NoSQL-only solutions to address specific workloads. However, these systems can result in numerous data silos, which cause major headaches and considerable expense, and which make the collected data much harder to manage.

    Postgres has a solution for this. A unique feature called Foreign Data Wrappers (FDWs) allows for the seamless integration of data from disparate sources such as MongoDB, Hadoop, and MySQL into a common model under the Postgres database. FDWs link a Postgres database to these external data stores so DBAs can access, manage and, most importantly, manipulate data from foreign sources as if it were part of the native Postgres database. With FDWs in place, Postgres acts as the central hub for data, enabling IT teams to ensure the integrity of their data regardless of its type or where it originated.
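
    As a rough sketch of that pattern, with placeholder hosts, credentials, and table names, the example below uses Postgres’s built-in postgres_fdw wrapper to expose a table in a remote database as a foreign table and join it with local data. Third-party wrappers such as those for MongoDB or MySQL follow the same CREATE SERVER / CREATE FOREIGN TABLE pattern, with different connection options.

```python
"""Illustrative Foreign Data Wrapper setup using the built-in postgres_fdw.
Wrappers for MongoDB, Hadoop, or MySQL follow the same DDL pattern with
different connection options. Hosts, credentials, database and table names
are placeholders, not a real deployment."""
import psycopg2

SETUP_DDL = """
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER edge_metrics
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'edge-db.example.com', dbname 'telemetry', port '5432');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER edge_metrics
    OPTIONS (user 'readonly', password 'changeme');

-- The remote table now appears locally as a foreign table.
CREATE FOREIGN TABLE device_readings (
    device_id   text,
    reading_ts  timestamptz,
    temperature numeric
) SERVER edge_metrics OPTIONS (schema_name 'public', table_name 'readings');
"""

CROSS_SOURCE_QUERY = """
SELECT d.site, avg(r.temperature) AS avg_temp
FROM devices d                              -- hypothetical local table
JOIN device_readings r USING (device_id)    -- foreign (remote) table
GROUP BY d.site;
"""

with psycopg2.connect("dbname=hub") as conn:
    with conn.cursor() as cur:
        cur.execute(SETUP_DDL)        # one-time setup
        cur.execute(CROSS_SOURCE_QUERY)
        for site, avg_temp in cur.fetchall():
            print(site, avg_temp)
```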

    Developing Your Database Infrastructure Strategy

    As CIOs work to develop a strategy for organizing their organization’s IoT infrastructure, there are several important decisions to make. One is whether to collect and store data locally at the device for future use or to move it to a centralized management system. The argument for keeping data local is that it speeds up data access and can therefore deliver actionable insights faster. However, the economics of keeping data local could be burdensome, given the number of individual database instances required and the cost of integrating data from such a distributed architecture.

    Unlike many other SQL databases, Postgres was developed to be extensible, able to incorporate new data types, indexing schemes, and languages without compromising other features of the database. For CIOs, this extensibility makes it possible to centralize company data in a reliable and scalable way, at a much lower cost than with traditional commercial databases.
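
    One concrete example of that extensibility, shown here only as a sketch with a hypothetical table and payloads, is Postgres’s jsonb data type paired with GIN indexing, which lets semi-structured device payloads sit alongside ordinary relational columns in the same centralized database.

```python
"""Illustrative sketch: store semi-structured IoT payloads in a jsonb column
next to relational columns, with a GIN index for containment queries.
Database name, table, and sample payloads are hypothetical."""
import json
import psycopg2

SCHEMA = """
CREATE TABLE IF NOT EXISTS device_events (
    id        bigserial PRIMARY KEY,
    device_id text NOT NULL,
    recorded  timestamptz NOT NULL DEFAULT now(),
    payload   jsonb NOT NULL              -- semi-structured sensor data
);
CREATE INDEX device_events_payload_idx
    ON device_events USING gin (payload);
"""

with psycopg2.connect("dbname=hub") as conn:
    with conn.cursor() as cur:
        cur.execute(SCHEMA)  # one-time setup
        cur.execute(
            "INSERT INTO device_events (device_id, payload) VALUES (%s, %s)",
            ("sensor-42", json.dumps({"temp_c": 21.4, "door": "closed"})),
        )
        # The @> containment operator can use the GIN index.
        cur.execute(
            "SELECT device_id, recorded FROM device_events WHERE payload @> %s",
            (json.dumps({"door": "open"}),),
        )
        print(cur.fetchall())
```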

    The IoT promises to arm CIOs and IT professionals with a much deeper understanding of the business and customer environments by providing real-time data. However, in order to reap these benefits, executives need to understand the significant economic impact that storing and managing this data could have on their data center investments. Open source systems like Postgres provide a viable solution to the challenges presented by the IoT.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:49p
    BitMicro Says 8TB Server Flash Drive Can Shrink Your Data Center

    Looking to drive more adoption of server Flash as primary storage, BitMicro Networks at this week’s Flash Memory Summit unveiled solid-state drives that plug into a PCIe bus and can be configured with up to 8TB of storage.

    Based on Talino ASIC processors that offload much of the management overhead associated with server Flash storage, the MAXio Z-Series PCIe SSDs are intended to help IT organizations dramatically shrink the size of their data centers, Zophar Sante, VP of marketing for BitMicro, said.

    “We only need eight nodes to provide the current equivalent of performance that would today require 64 servers,” said Sante. “We can reduce two whole racks down into a quarter rack of space.”

    Sante added that the company’s ASICs reduce the overhead tax on servers for managing SSDs from about 12 percent on average down to one percent. The Talino architecture also does not require any on-board fans, with SSDs in the Z-Series generating as little as 50 watts, which equates to less than six watts per terabyte.

    What makes it possible to shrink the storage footprint in the data center is a split controller architecture that enables BitMicro to maintain high throughput across as much as 8TB of Flash storage using a single PCIe slot.

    The company also provides Network DriveLight Management Software to simplify MAXio SSD administration.

    Sante conceded that there is a fairly intense battle between server and storage administrators concerning where SSDs should be deployed. While deploying SSDs in servers reduces the data center footprint, it’s easier for more applications to access SSDs when they are configured in a storage array. For that very reason, BitMicro offers SSDs that plug into servers as well as into a storage area network. The degree to which IT organizations opt for one or the other will depend on how narrow the portfolio of applications trying to access Flash storage actually is inside the organization.

    Another factor, of course, is the degree to which IT organizations are willing to depend on a single point of failure, given that the amount of data residing on any one of these components at any given time continues to rise.

    5:30p
    Alibaba Shows Open Source Support by Joining The Linux Foundation and Xen Advisory Board

    This article originally appeared at The WHIR

    Online retailer and cloud computing provider Alibaba has joined the Linux Foundation as a silver member, and its cloud computing subsidiary, Aliyun, is a new advisory board member of the open-source Xen hypervisor project, which is hosted by the Linux Foundation.

    Xen is used by some of the largest clouds in production today. Aliyun joins other advisory board members, including Amazon Web Services, Rackspace, and Verizon. Members donate financial support, technical contributions, and high-level policy guidance.

    Aliyun uses Xen to power its cloud computing offering. According to its Thursday announcement, Aliyun had already been contributing vulnerability fixes to the Xen project for some time, and will continue to use Xen virtualization to run different operating systems and applications off the same physical infrastructure.

    “As an advisory board member, Aliyun is looking forward to deeper interaction and collaboration with the Xen Project board and community,” Aliyun CTO Wensong Zhang said in a statement. “We have been working with Linux for a long time, and Xen virtualization is increasingly important to enhancing our cloud and marketplace technology offerings in China and abroad.”

    Xen Project advisory board Chairman Lars Kurth said, “With cloud computing still in its infancy in China, Aliyun’s support is a gateway for major growth in Asia for Xen Project virtualization.”

    Also announced Thursday, Alibaba joined the Linux Foundation along with container-based solutions provider DCHQ, Taiwan-based chip supplier MediaTek, payment provider PayPal and Chinese Linux software provider Wuhan Deepin Technology.

    Wuhan Deepin Technology specializes in Linux R&D and develops software based on Linux technology, and produces Deepin, a Linux-based operating system. With nearly a million users across more than 50 countries, Deepin is the first Chinese OS with real overseas influence.

    Linux Foundation CMO Amanda McPherson remarked on how Linux is being used worldwide for a range of purposes, and that contributions also reflect its international scope. “There are more first-time contributors and paid developers than ever, contributing to how fast Linux is built,” she stated. “Our new members reflect just how important, significant and wide-reaching Linux is today. From central China and Taiwan, no corner of the world is untouched by Linux.”

    The Linux Foundation has been very active in recent months fostering collaboration on a range of technologies between a variety of stakeholders including independent developers and companies.

    Last year, the Linux Foundation announced it would be hosting Open Platform for NFV, a project aimed at standardizing a way of virtualizing entire networks. Last month, it began hosting the Cloud Native Computing Foundation whose task is to validate reference architectures for integrating various technologies built on top of Docker containers.

    The presence of more Asian organizations within open-source projects in general adds new perspectives and useful technical knowledge. It also helps add legitimacy to initiatives like the Linux Foundation, which aim to build bridges between stakeholders worldwide to produce better technology. Forming these connections between companies in the cloud space is especially remarkable given that many of the companies involved are, and continue to be, fierce competitors – even as they contribute to the same projects.

    This first ran at http://www.thewhir.com/web-hosting-news/alibaba-shows-open-source-support-by-joining-the-linux-foundation-and-xen-advisory-board

    6:03p
    Cisco Delivers Custom Web-Scale Platform to “Cloud Titan”

    In an apparent effort to demonstrate to analysts that Cisco is not losing the web-scale data center market to white-box or Taiwanese design manufacturers, its new CEO Chuck Robbins revealed that the company recently designed and delivered a custom high-end networking platform for one of the “cloud titans.”

    On his first quarterly earnings call as Cisco CEO Wednesday, Robbins said orders from web-scale data center operators were growing at a “significant” pace. “With our ability to build out scalable solutions with consideration to their unique needs I think we’re reasonably well positioned,” he said, answering a question about Cisco’s web-scale play from one of the analysts on the call.

    Like other big hardware vendors, the HPs and Dells of the world, Cisco is fighting to keep its business with customers that build and operate some of the world’s largest data centers. Also referred to as “web-scale” data center operators, companies like Google, Facebook, and Amazon have been designing their own hardware and going directly to manufacturers in Asia with massive orders of stripped-down boxes without extra features the “incumbent” vendors have been including in their products to increase their value.

    Google is known to have used this approach to sourcing networking hardware for years. Facebook has been transitioning to switches designed in-house recently.

    New companies have been formed to respond to this trend, companies like Arista Networks, a switch vendor that lists Facebook as one of its customers, and Cumulus Networks, which has designed a Linux-based network operating system for open bare-metal switches.

    Cisco’s incumbent competitors have responded too. HP announced plans to ship commodity switches with Cumulus software earlier this year, for example. HP also recently struck a partnership with Arista. Dell and Juniper have both cooked up switches that come with software other than their own.

    The web-scale market isn’t insignificant, but it is fairly small compared to the enterprise market that drives the bulk of the data center hardware revenue for Cisco and its rivals. Vendors that cater to the web-scale audience are attempting to expand into the enterprise market, and interest in the enterprise is rising, but the take-up of this entirely new approach to hardware sourcing has been slow.

    Cisco reported $12.8 billion in revenue for the fourth quarter of fiscal 2015 – up 4 percent year over year. Its earnings per share for the quarter were $0.45.

    The company’s switching business segment, the biggest one in terms of revenue, contributed about 30 percent of total revenue. Its data center business segment, which sells its Unified Computing System servers and converged infrastructure systems, contributed 7 percent of total revenue.

    Robbins was upbeat about the company’s future, saying he was focused on “accelerating what’s working and changing what’s not, simplifying our business, driving operational rigor, and investing in our talent and our culture.” He added, “Cisco’s best years are ahead of us.”

    6:28p
    Compliance, Security, and Cloud: Understanding your Data Center Options

    The proliferation of cloud computing has taken us to a new era when it comes to content and data delivery. More organizations than ever before are leveraging colocation providers and cloud services to enable their business and create a more productive workforce. We’re now able to house rich applications with delivery capabilities spanning regions, devices, various user groups, and many different verticals. Now, as organizations continue to look to cloud for even more services and offerings, some big questions have begun to arise. What happens to all of those cloud workloads that are bound by compliance? What if you require a colocation provider with very specific security metrics?

    Fortunately, when it comes to cloud, compliance, and security, there is good news. This interesting whitepaper approaches the cloud compliance and security conversation by looking at the barriers to adoption and how to overcome them!

    Here’s the reality: There are new kinds of services and offerings now available that are capable of hosting secure, multi-tenant workloads which are compliance-bound. The great part here is your options. Compliance workloads can include:

    • PCI DSS
    • HIPAA
    • SOC 1 and 2
    • FISMA
    • FedRAMP
    • EU Safe Harbor

    Download this whitepaper today to learn how providers like QTS break down the barriers when it comes to hosting specialized, compliance-bound workloads in the cloud. These powerful multi-tenant architectures are capable of bringing direct efficiencies to any organization while still creating a cost-effective cloud solution.

    7:14p
    Report: Arizona Wants to Hop on Data Center Tax Break Bandwagon

    Following in the footsteps of Missouri, Washington, and Oregon, the State of Arizona is inching closer to instating new 10- and 20-year tax breaks for high-tech data centers.

    If the state legislature adopts the tax program, data center providers would be exempt from paying state, county, and local sales taxes on equipment purchases. It would also provide tax incentives for related businesses looking to locate their operations in Arizona, the Phoenix Business Journal reported.

    Even without extensive data center tax breaks, Phoenix has emerged as a top US market for data centers and server farms. So, this could provide a very lucrative boost to the Grand Canyon State as well as the companies with data center operations already established there. According to the article, those that currently qualify are:

    • Aligned Data Centers
    • Apollo Education Group
    • Avnet Logistics
    • Charles Schwab
    • CyrusOne
    • Digital Realty Trust
    • Phoenix NAP
    • PayPal
    • GoDaddy
    • US Foods
    • IO

    Although Apple is missing from the list, it, too, will qualify for tax breaks once it completes its $2 billion project to convert a former 1.3 million square foot manufacturing plant in Mesa, which it bought early this year, into a data center. State officials expect the Apple data center to create 300 to 500 construction and trade jobs and 150 permanent Apple jobs over a committed 30-year period.

    Meanwhile, Microsoft plans to turn the former Phoenix-based home of a Honeywell International manufacturing plant into a data center. Although currently being leased by a private equity fund and other third-party companies, it is slated to house a Microsoft data center totaling up to 575,000 square feet.

    State and local governments have turned to tax breaks to compete for big data center projects in recent years. These incentives are often deal-makers when companies go through the data center site-selection process, ranking just as high as power cost, availability, and network infrastructure among the considerations.

    Earlier this year, Oregon officials approved a new package of tax breaks for a new data center Google is considering building—its third in the state—currently valued at $1.2 billion.

    Data center tax break legislation is also moving through the Pennsylvania Senate and House of Representatives. Both data center owners and tenants would be exempt from sales and use tax on equipment and software purchases if they satisfy a set of criteria.

    Other states that passed laws to create or extend data center tax breaks this year include Missouri and Washington, where the total data center investment has exploded to nearly $4 billion since 2006.

    7:21p
    More Than 600K Web-Facing Computers Still Run Windows Server 2003: Netcraft

    This article originally appeared at The WHIR

    Although it has been nearly a month since Windows Server 2003 extended support ended, Netcraft’s latest survey found 175 million websites still being served by the obsolete system. Netcraft’s July Web Server Survey shows that over 600,000 web-facing computers, serving a fifth of all websites, are still running Windows Server 2003 and are therefore exposed to elevated cybersecurity risk.

    The last major update to Windows Server 2003 was Service Pack 2, released eight years ago. Of the computers still running the OS, 73 percent serve websites with Microsoft Internet Information Services 6.0, the version that shipped with Windows Server 2003. The Server header for another 1.7 million sites served from other operating systems also indicated Microsoft IIS/6.0, suggesting more Windows Server 2003 machines in back-end use and further increasing the scope of the risk.

    [Chart: Netcraft Windows Server 2003 usage]

    Netcraft estimates that 609,000 computers, or just over 10 percent of all web-facing computers, are still running Windows Server 2003. Because software licenses are usually granted at a per-machine cost, the number of installations is the best indication of the total cost of migration to supported systems.

    Of those computers, the majority (55 percent) are in the US and China, though the two countries account for only 42 percent of other web-facing computers. Alibaba subsidiary HiChina, acquired in 2009, operates roughly 12,000 instances of Windows Server 2003, while the company’s cloud division Aliyun hosts 7,500. Aliyun, which just boosted its relational database suite with an EnterpriseDB partnership, still offers Windows Server 2003 VMs.

    Companies using Windows Server 2003 include the banks NatWest, ANZ, and Grupo Bancolombia. Other companies, such as LivePerson and ING Direct, serve their sites via F5 BIG-IP devices, so their Windows Server 2003 machines are not directly exposed to the internet.

    While migration for some sites may happen slowly, any business subject to the Payment Card Industry Data Security Standard (PCI DSS) has automatically failed to comply with the standard by using Windows Server 2003 anywhere in its environment. PCI DSS requires all software to be up to date with vendor-supplied security patches.

    A PCI compliance report released by Verizon in April showed that 80 percent of retailers failed interim PCI compliance assessment. Enterprises already fear data breaches in the cloud, so service providers still using server software with “2003” in the name are arguably putting the reputation of the technology at risk, and inarguably setting themselves up for a public relations disaster if (or more likely when) an effective vector for attacking the obsolete systems is developed by hackers.

    A March Microsoft report referred to the migration market opportunity that the end of Windows Server 2003 support created for service providers. Microsoft also offers a Migration Planning Assistant hosted on Azure.

    This first ran at http://www.thewhir.com/web-hosting-news/more-than-600k-web-facing-computers-still-run-windows-server-2003-netcraft

    10:45p
    Metal Rebel the Robot Opens Switch’s Latest Huge Las Vegas Data Center

    After driving a vehicle, climbing stairs, turning valves, using power tools, and navigating around debris without human assistance in the DARPA Robotics Challenge finals this past June, cutting a ribbon at a data center opening was probably peanuts for Metal Rebel, the humanoid robot from the University of Nevada, Las Vegas, which took eighth place in the finals, competing against robots from MIT, NASA, and Lockheed Martin, among others.

    Metal Rebel was on hand earlier this month to cut the ribbon at the opening of the latest massive Las Vegas data center by Switch, whose data center cluster in Sin City has now reached 215 MW of critical power capacity. The ceremony was more pleasure than business for the robot, which got to rub shoulders with Nevada Governor Brian Sandoval and state senators Michael Roberson and Aaron Ford, all of whom were present at the event.

    Switch’s latest Las Vegas facility, its ninth SuperNAP, came online recently with its nearly 120,000-square-foot first phase completely sold out, company spokesman Adam Kramer said. “The demand has been incredible,” he said. “We already started construction on SuperNAP 10 and 11.”

    Switch is the dominant data center service provider in Las Vegas, known for the huge scale of its facilities, their sleek futuristic interiors, and ex-military armed guards. The company is currently also building a huge $1 billion campus in Reno, Nevada, near the Apple data center campus, where eBay has signed as the anchor tenant.

    Entrance to Switch SuperNAP in Las Vegas (Photo courtesy of Switch)

    Switch provides retail colocation from as little as half a cabinet to as much as 1,000 cabinets per customer, such as its recent deal with Shutterfly. Its competitors in the Las Vegas data center market include ViaWest, Cobalt, and zColo, among others.

    SuperNAP 9 is about 500,000 square feet total, and Switch said it will ultimately support up to 50 MVA of power. Subsequent phases will be of similar size to the first one. Kramer said 20 percent of phase two has been pre-leased.

    SuperNAP 8 has received Tier IV Constructed Facility certification from the Uptime Institute, and the company expects to get the same certification for the new data center.

    And now on to the important stuff, which is a rather anticlimactic video of a robot cutting a ribbon:

