Data Center Knowledge | News and analysis for the data center industry
 

Thursday, April 13th, 2017

    1:00p
    Data Center Automation With AIM

    Thomas Wellinger is the global Market Manager for Data Centers at R&M.

    A data center is an ever-growing organism, interwoven at many levels, with an infrastructure that is largely hidden from view. Once installed, the entire structure needs to remain in place for many years. Under these conditions, data center managers have to cope with constant demand for new services and today’s tremendous growth in data volume, which calls for a steady stream of smart decisions about the infrastructure. To make those decisions well, data center managers must be able to review all of their data centers’ operations and processes. That requires complete transparency, right down to the level of the inventory, which is also referred to as asset-level visibility or network visibility.

    KPIs for Data Center Efficiency

    Besides transparency, several critical key performance indicators (KPIs) need to be taken into account. These provide information that is vital to achieving the desired efficiency and can help improve data center management when it comes to monitoring, analyzing and optimizing the infrastructure, as well as offering specific services. The KPIs include PUE (power usage effectiveness), total energy costs and delivery costs (price/kWh), but also the time required for documentation and the average delivery time. When these indicators are tied to efficiency goals, the time required to deliver infrastructure and services can be reduced, which in turn leads to greater internal and external customer satisfaction. Two further indicators are indispensable because they clearly lead to higher data center reliability: the accuracy of the documentation and the mean time to repair (MTTR). Documentation accuracy depends on the share of data points that have been correctly recorded and kept up to date.
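
    As a rough illustration (a sketch added here, not part of the original article), these headline KPIs reduce to simple ratios. The following Python snippet uses the standard definitions of PUE, documentation accuracy and MTTR, with made-up sample figures:

        # Illustrative sketch only: the formulas are the standard definitions,
        # the sample figures are hypothetical.

        def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
            """Power usage effectiveness: total facility energy / IT equipment energy."""
            return total_facility_kwh / it_equipment_kwh

        def documentation_accuracy(correct_points: int, total_points: int) -> float:
            """Share of inventory data points that are correctly recorded and up to date."""
            return correct_points / total_points

        def mttr(total_repair_hours: float, incidents: int) -> float:
            """Mean time to repair: total repair time divided by the number of incidents."""
            return total_repair_hours / incidents

        print(f"PUE: {pue(1_500_000, 1_000_000):.2f}")                                  # 1.50
        print(f"Documentation accuracy: {documentation_accuracy(9_420, 10_000):.1%}")   # 94.2%
        print(f"MTTR: {mttr(36.0, 12):.1f} hours per incident")                         # 3.0 hours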

    Several further indicators address infrastructure efficiency and density. Stranded or unused capacity can be identified by analyzing the relationship between rack space and limiting factors such as the available power supply. Space efficiency relates the floor and rack area actually in use to the total area available. By additionally examining personnel requirements, the total cost of ownership (TCO) can be improved correspondingly.
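
    To make the stranded-capacity idea concrete, here is a small, hypothetical Python sketch (the data model and the average-draw assumption are illustrative, not taken from the article): a rack whose power budget runs out before its rack units do leaves those units stranded.

        from dataclasses import dataclass

        @dataclass
        class Rack:
            name: str
            units_total: int        # rack units (U) available
            units_used: int
            power_budget_kw: float
            power_used_kw: float

        def stranded_units(rack: Rack) -> int:
            """Rack units that cannot be filled because the power budget is exhausted first."""
            free_units = rack.units_total - rack.units_used
            free_power_kw = rack.power_budget_kw - rack.power_used_kw
            # Hypothetical assumption: new equipment draws about as much per unit
            # as the equipment already installed in the rack.
            avg_kw_per_unit = rack.power_used_kw / max(rack.units_used, 1)
            placeable = int(free_power_kw / avg_kw_per_unit) if avg_kw_per_unit > 0 else free_units
            return max(free_units - placeable, 0)

        racks = [Rack("R01", 42, 30, 6.0, 5.8), Rack("R02", 42, 20, 8.0, 3.0)]
        for r in racks:
            print(f"{r.name}: {stranded_units(r)} U stranded by the power limit")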

    Reliability and Transparency for Data Center Management

    Automated infrastructure management (AIM) not only supports each of the aforementioned KPIs, but offers another great advantage: various operational aspects of data center management become more reliable and transparent. In today’s world, many interdependent aspects of data center operation are outsourced to one or more third-party companies. Company A might be in charge of network operation, for example, while company B may be responsible for moves, adds and changes (MACs).

    Until relatively recently, if something went wrong, you could speak to someone from your own organization to find out what happened and resolve the issue. What’s more, the entire legal responsibility for any occurrence that had an impact would be clearly defined.

    What happens if everything is outsourced, however? If you have no way of recording and tracking the individual tasks, what happens if something goes wrong? Each party will simply blame the others and in the end, data center management might be held accountable.

    Approaches to Data Center Management

    For data center infrastructures in particular, the potential to improve planning, forecasting, inventory creation and MAC processes, for example, is enormous. A smarter approach also leads to a highly standardized service catalog and to consistently high reliability and data integrity for documented actions. This is especially important for highly regulated industries such as the financial, pharmaceutical or chemical sectors.

    Improving visibility is the essential first step in moving towards infrastructure management maturity. With visibility in place, proactive planning based on predictions and forecasts can begin, eventually moving towards KPI-driven data center management.

    Deciding which type of AIM system best suits your needs, and ensuring that your hardware, software and processes support it, can be very complicated. Make sure you have a clear overview of your goals and requirements before specifying and selecting a solution; once a system is installed, the scope for quick fixes and changes can be limited. Of course, when in doubt, don’t hesitate to consult experts in this area.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
    3:53p
    DCK Exclusive – Infomart’s Big Plans for Iconic Dallas Landmark

    While many carrier hotels struggle to find space to expand, the Dallas Infomart is planning to spend as much as $500 million to fund an evolving master plan for the property.

    What more can be said about an iconic 1.6 million square foot building complete with a 7-story atrium, originally designed as an homage to London’s Crystal Palace, site of the first World’s Fair? Actually, quite a bit, since there is no building within hundreds of miles with similar carrier and network density, and existing tenant ecosystems.

    The Dallas Infomart, crown jewel of Infomart Data Centers.

    Each Tier 1 market has one or more data centers located at the aggregation points of the fiber networks that help form the backbone of the Internet. Tenants are willing to pay a premium for low-latency, secure access to these vibrant ecosystems. However, many urban carrier hotels are already bursting at the seams, with former elevator shafts maxed out as chases for mazes of conduit. Landlords are renting out every nook and cranny, including basement space in some markets, to meet demand.

    This is where Infomart Data Centers’ eponymous Dallas landmark is a bit of a different animal, according to CEO John Sheputis, who believes “no other Tier 1 market has a carrier hotel with the on-site expansion potential of the Infomart.”


    The Master Plan

    Sheputis recently met with Data Center Knowledge at Data Center World in Los Angeles, where, among other topics, we discussed progress in Dallas. We were also given a glimpse of a preliminary master plan that, at completion, could require an investment of $500 million.

    One of the unique features of this carrier hotel property is acres of grade level parking. This is an artifact from the original 1985 property design concept. Legendary developer Trammell Crow had originally envisioned this massive building as a trade center for the IT industry. The ample parking area provides some flexibility when it comes to planning a purpose-built carrier hotel annex to maximize the value of the property.

    The company has previously announced a plan to add as much as 500,000 square feet of data center space and more than 30 MW of additional capacity to the property over the next decade. However, the master plan for the Dallas Infomart is not yet set in stone; Sheputis is currently working with his design team and the city to formalize it.

    Dallas Infomart: Projects Underway

    Dallas Infomart still has about half of its space devoted to conventional office uses. The current plan includes the continued evolution of the existing building into additional data center space – the highest and best use for any carrier hotel site.

    In 2014, Fortune Data Centers and the ASB Allegiance Real Estate Fund (which had purchased the Dallas Infomart in 2006) combined to create Infomart Data Centers. Since then, activity has picked up at the Infomart, with over $50 million in upgrade and expansion projects completed or currently underway, including:

    • Commissioning of a 2.9 MW multi-tenant data hall operated by Infomart.
    • Construction of the landlord’s Building Meet-Me Room (currently home to a dozen carriers, including Level 3, Verizon, DE-CIX, InnerCity FiberNet and UPN), with no landlord charge for cross-connects.
    • Phase I and II of security upgrades: new SOC, access controls, perimeter fencing, security system, surveillance, increased staffing, new security/delivery station with queuing lanes, addition of secured storage area, etc.
    • Build-out of shared conference rooms as a tenant amenity.
    • Installation of a new freight elevator.
    • Commissioning of a 500 kW private data hall.
    • Demolition of first-floor areas that will soon be converted to data center and office space.

    While most of the spend is devoted to building amenities that benefit all of the tenants, the landlord is also building its own data center suites to accelerate leasing.

    Sheputis explained to Data Center Knowledge that the former cafeteria and commercial kitchen area on the first floor is being demolished to make way for a new data center hall.

    The upper floors of the building have 14-foot clearance and can support a load of 150 pounds per square foot without any column reinforcement. However, this first-floor space is unique because it has a 20-foot clear ceiling height and a thickened slab on grade, making it suitable for a high-density deployment such as a major network node, or “Super PoP,” according to Sheputis.

    Infomart is bringing in two new circuits from existing substations, with 13 MW of gross capacity, to provide power for the new suites under construction as well as additional projects that have been identified. Notably, of the 100 MW of capacity serving the Infomart today, only 30 MW are directly operated by the landlord.

    Equinix, the largest tenant in the building, has its own power capacity reserved with electric utility Oncor to support its expansion plans at the Infomart. There are over 75 network, cloud and colocation tenants in the building, including providers such as ViaWest, Cologix and NTT America that sub-lease colocation space to retail customers. These wholesale tenants continue to invest tens of millions of capex dollars each year to fund their own expansions.

    A Big Payday?

    Infomart Data Centers is owned by Washington, D.C.-based ASB Allegiance Real Estate Fund, a $7.2 billion real estate investment trust (REIT) that manages institutional capital. As of Dec. 31, 2016, ASB owned 168 commercial real estate assets totaling 12.3 million square feet, located in 15 US metropolitan markets.

    A little back-of-the-envelope math shows why the 1.56 million square foot Dallas Infomart must be the crown jewel of the entire portfolio: it alone accounts for roughly 13 percent of the 12.3 million square feet ASB owns. While another $500 million of investment would be a big bite of the apple for any organization, there is a big upside. In a follow-up discussion with Sheputis for this article, he shared that “our ambition was to double or triple the income of the Infomart” with these investments.

    Sheputis had previously told Data Center Knowledge that “looking to 2018 and beyond, Infomart could be coming up on a strategic crossroads. Decisions will be made regarding expansions into other markets, growth by acquisition, or other strategic alternatives for Infomart.”

    Read more: DCK Exclusive: Infomart President John Sheputis Talks Strategy.

    Essentially, ASB and the Infomart management team will have to decide whether to expand and stay the course, merge privately, or look toward Wall Street for other alternatives.

    5:59p
    IT Certifications: How Valuable Are They?

    There seem to be two schools of thought on the value of IT certifications. Some technology professionals say they can absolutely turbo-charge your salary and earning potential. Others claim that certifications may not be worth the paper on which they’re printed.

    Obviously, the companies that sell this type of training are its biggest proponents. For example, IT training company Global Knowledge last year published its list of the 15 top-paying certifications for 2016; thirteen of them boasted average salaries of $100,000 or more. Similarly, the SANS Institute reported that “proven certifications can provide up to a 5 percent increase in compensation for certified staff over non-certified staff.”

    IT certifications and continuing education, like university degrees, go a long way toward proving that professionals have the skills they claim. This validation often gives job candidates with the right certifications and experience an edge over comparable candidates who have only experience.

    “As an IT service provider, you need to know what’s going on. Certification is a way to learn higher levels and get in depth into a topic,” wrote Jerry Irvine, CIO of Chicago-based Prescient Solutions and a member of the National Cyber Security Task Force, in a blog post. Irvine holds multiple certifications (CISM, CISA, CISSP, MCSE, CCNA, CCNP, CCDA, CCDP, CNE, CBCP, CASP, CIPP/IT, IAPP/IT, ITIL and CGEIT) as well as at least six others. “IT is changing on a daily basis; you’ll be left behind if you don’t keep up.”

    As technology changes, some certifications (like Irvine’s 1991 NetWare Engineer certification) become irrelevant. Others, like the Cisco Certified Network Associate (CCNA) certification, have multiple versions – like software updates – to reflect changes in the technology and business environment. Some certifications fulfill the continuing education requirements for new certifications, he added, which enables professionals to concentrate on the more advanced certifications.

    Today, multiple certifications are required, spanning competing and complementary technologies and vendors. Certifications are also adding specializations. For example, facilities management certifications include special modules or completely separate certifications focused on sustainability, which is increasingly important as data centers incorporate new, more sustainable technologies.

    ‘Must-Have’ Certifications

    Today’s “must-have” certifications vary according to the job and the individual.  Microsoft certifications are king for technicians, and virtualization certifications from Citrix, Microsoft and VMware are extremely important as computing continues to turn to the cloud, according to Irvine.

    For management, Irvine recommends earning CISA (Certified Information Systems Auditor), CISM (Certified Information Security Manager) and CISSP (Certified Information Systems Security Professional) standing.

    VMware has developed expert-level certification around network virtualization.

    Entry-level certifications like CompTIA’s Security+ are based on testing and require little to no experience, but senior-level certifications generally require a combination of experience and testing. For example, the professional-level CISA certification requires three to five years’ experience and a minimum of three references from organizations for which you’ve performed those functions. These tests also cover several different domains of expertise.

    Facilities management is one of the fastest-growing industries globally, according to a recent Global Industries Analytics report. In the past decade, it has evolved from maintenance and repair to document handling and IT management, as well as other functions.

    Educational Opportunities

    Certification classes help ensure professionals are familiar with all the areas covered by the certification exams. Many of the exams include both written and hands-on testing. As professionals advance in their careers, experience and hands-on capabilities become increasingly important – even to pass the certification testing. “Classes help ensure you can answer the questions appropriately, but the professional designation tests are grounded in the real world,” Irvine points out.

    There are many options for training. In addition to face-to-face classroom training that, like boot camps, takes professionals out of the day-to-day work environment to concentrate on specific material, video and online training is available. These options help students learn on their own time. Many call them invaluable for augmenting or replacing classroom training.

    Professionals also need to look for educational opportunities outside traditional IT training to learn about business processes, regulatory compliance and such soft skills as leadership.

    While certification can be invaluable in imparting and validating knowledge, additional educational opportunities also have merit. For example, conferences sponsored by organizations like AFCOM and Black Hat have a wealth of information to keep attendees current with the industry, technologies and emerging threats. Vendor classes and conferences can also provide additional educational options. Writing articles and training others can also help advance one’s own education. Other options include joining online communities and spending time to read articles and posts.

    Hiring

    Certification alone doesn’t necessarily make someone a better candidate. Value in IT comes down to experience, but certification can validate that expertise. For a junior-level person trying to get in the door, certification helps on the resume. For people with eight to 10 years’ experience, certifications are often the cherries on top because they help in identifying qualified people who are deeply knowledgeable about specific technologies.

    Certification may be most important in demonstrating expertise with new technologies. With the current U.S. IT unemployment rate at about 3 percent, according to Dice, a career hub for technology and engineering professionals, employers say finding the right person for any job openings remains difficult. The problem, they say, is in finding individuals with all the skills needed in modern, virtualized data centers.

    Sometimes there’s a reluctance among professionals to gain certifications, but as the IDC Research report, “IT Staffing Strategies: Demand and Change in the Era of the 3rd Platform,” pointed out, “In the long run, training an under-skilled employee with potential is far less expensive than hiring a fully skilled/experienced worker.”

    Training budgets, however, aren’t always available, so you can’t always rely on company funds. Instead, IT professionals should expect to invest financially in education that enhances their careers. A look at costs shows a range from about $200 to $2,000 for certification classes and testing. Some of the classes and certifying organizations include, or offer at low cost, physical or virtual computer laboratories equipped with the relevant technology and applications.

    Certifications, like a college degree, show that individuals are capable of learning and proactively taking  the steps necessary to advance their own careers. Certified individuals not only say they have certain skills, they validate those claims through standard, industry-recognized certifications.

    For hiring organizations, certifications reduce risk in the hiring process. Certifications, particularly when paired with job experience, create a winning combination for experts eager to expand their opportunities and increase their value.

     

    6:27p
    MSPs Have Key Roles in Mainstream DevOps; U.S. Demands More Cybersecurity from Government

    By The VAR Guy

    DevOps has been around for a while. One might even say it’s going – as the kids call it – mainstream. Naturally, when this happens, we start to look at what’s next. What will be the new, shiny trend? Experts believe it may be NoOps, and that MSPs will be the ones to help deliver it.

    DevOps, which has been around since about the late 2000s, emphasizes constant collaboration between everyone in technology – developers, systems administrators, QA teams and so on. The primary goal of DevOps is to make the development and management of software more streamlined and efficient, so that it can scale and react quickly to changes in the software environment, the market or user needs.

    NoOps is the idea that manual IT ops processes, and the maintenance and management tasks that systems administrators have to perform by hand, can be cut out of software delivery entirely. Essentially, the NoOps model “brings agile to the next level”: admins can say a sweet goodbye to the “dirty work” required to keep software running. With manual management removed, systems can scale and evolve without limit.
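
    As a rough, hypothetical illustration of that idea (a sketch added here, not taken from the original piece), automated operations usually boil down to a reconciliation loop: operators declare a desired state, and software rather than an administrator closes the gap. A toy version in Python, with made-up service names and replica counts:

        # Toy reconciliation loop illustrating the pattern behind automated
        # ("NoOps"-style) operations. All names and numbers are hypothetical.

        desired_state = {"web": 4, "worker": 2}   # replicas the operator declares
        running_state = {"web": 2, "worker": 3}   # what is actually running

        def reconcile(desired: dict, running: dict) -> None:
            """Scale each service up or down until it matches the declared state."""
            for service, want in desired.items():
                have = running.get(service, 0)
                if have != want:
                    action = "scaling up" if have < want else "scaling down"
                    print(f"{action} {service}: {have} -> {want}")
                running[service] = want   # stand-in for a real provisioning call

        # In production such a loop runs continuously with no human in it;
        # here we run a single pass for illustration.
        reconcile(desired_state, running_state)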

    The sophisticated and advanced tools that DevOps has provided have brought us extremely close to the possibility of NoOps. Of course, there will always be something that has to be done by hand – absolutely no manual maintenance is still very much a pipe dream at this point. But here’s where managed services come in: MSPs can help organizations effectively achieve NoOps.

    “Today, developers are increasingly turning to managed services for toolsets and infrastructure requirements – tasks traditionally managed by DevOps teams,” says Andrey Akselrod, founder and CTO of Smartling. Utilizing MSPs to eliminate the need to do DevOps in-house will likely become more and more common, experts in the field predict. As DevOps becomes the norm, as all bright and shiny new software solutions eventually do, organizations will seek out approaches that offer a higher level of efficiency than DevOps itself. Keep a sharp eye out, MSPs: those who can deliver even more agility and automation than an in-house DevOps team will win the game.

    Our second story takes a look at a recent Accenture survey that sheds some light on how Americans think when it comes to cybersecurity and the government’s role. According to the study (as reported by Security Magazine), more than three quarters of people in the U.S. are concerned about the privacy and security of their personal online and digital data, and almost two thirds say they would feel more confident if government agencies had beefed-up data-privacy and security policies. Imagine that…

    The survey also laid bare some upsetting, but not surprising news. Almost 30 percent of participants stated that they had been a victim of cybercrime in some form. Additionally, 74 percent of citizens have little to no confidence in government’s ability to keep their data private and secure, and nearly 65 percent lack confidence in the ability of law enforcement to investigate and prosecute cybercrimes. Those are big numbers, folks. Again, upsetting but not surprising.

    “This survey confirms that ‘cyber insecurity’ is pervasive, with citizens feeling concerned and vulnerable,” states Lalit Ahluwalia, Accenture’s North America security lead. “All organizations must make cybersecurity a top priority and move to deploy end-to-end cyber defense solutions to combat threats to data, and to ensure citizen confidence when engaging with government agencies.”

    From the personal side, the research also revealed that 66 percent of respondents said they would be willing to sacrifice convenience for increased data security. More than half (60 percent) said that they would willingly jump through more hoops (such as answering additional login questions) if it meant upped security and protection. Even further, almost half (47 percent) are in favor of the use of biometric technologies to verify identity and secure access.

    From the government and agency side, citizens expressed support for new security services that the organizations could adopt to enhance their data privacy and security measures. Respondents overwhelmingly agreed that the availability of a secure digital identity (85 percent), the undertaking of regular security assessments (82 percent) and new cyber defense services (85 percent) would up their faith and confidence in the privacy and security of their precious data.

    “While government agencies face many cybersecurity challenges, the research found strong citizen support for government organizations to take steps to increase data security and protect citizen information,” states Peter Hutchinson, public service strategy lead at Accenture. “Government agencies that take a comprehensive end-to-end security approach by integrating cyber security deep into their organizations will not only secure their data, but also win the trust and confidence of the citizens they serve.”

    Our last story delves into the cybersecurity jobs arena. A lot of predictions and reports will tell you that IT and cybersecurity jobs are already increasing in demand, and will continue to do so in the coming years. Sounds about right… right? It may be a bit more complex than that. According to an article by SC Magazine, there was a net increase of 13,000 information technology jobs reported in February. Encouraging? It certainly seems so – a sign of strong, healthy growth. However, a comparison to older employment numbers from the Bureau of Labor Statistics (BLS) complicates things a bit.

    While the numbers prove that February was the best month since September of 2016 in terms of job growth for IT professionals, there is something a bit unsettling about performance over the last three months of the year, according to David Foote, CEO and chief research officer at Foote Partners, an IT analyst firm and research organization.

    “Only 7,533 jobs were added on average in this period compared to 11,533 jobs per month in the first nine months,” said Foote in a report. While he pointed out that a three-month span is insufficient for a true analysis of labor numbers, still, the February results indicated “volatility and uncertainty in the marketplace for U.S. tech jobs.”

    Foote also stated that companies are treading lightly and being cautious about hiring full-time staff for technology-enabled solutions. Instead, they are hiring consultants and contingent workers. The goal? Flexibility. Enterprises are able to stay unconstrained as they develop their security implementations.

    Enterprises must scale quickly and seamlessly in order to stay competitive, Foote goes on to say. This means positions are being added in areas that have proved hugely effective, such as cloud, Big Data, mobile or digital technology. Why? Because the outlook shows these professionals having an impact far into the future.

    “What will drive new job creation in 2017 will be hiring in niche areas – such as Big Data and advanced analytics, cybersecurity and certain areas of applications development and software engineering, like DevOps and digital product development,” says Foote.

    This post originally appeared at The VAR Guy.

    6:44p
    Toshiba Said to Put Temporary Hold on Memory Chip Sale Process

    Toshiba Corp. temporarily canceled all meetings and decisions related to the sale of its memory chip business to address concerns raised by an industry partner, according to people familiar with the matter.

    Toshiba is trying to sell the business to raise much-needed cash, and the company has been narrowing down the field of interested buyers. That hit a snag after joint-venture partner Western Digital Corp., based in San Jose, California, said a sale may violate the companies’ contract.

    Western Digital Chief Executive Officer Steve Milligan wrote a letter to Toshiba’s board members on April 9 advising them that they should negotiate exclusively with his company before any sale. He also argued that the rumored bidders were unsuitable and the reported prices offered were above the fair and supportable value of the chip business, according to a person familiar with the process, who asked not to be identified because the information is private.

    Toshiba and Western Digital are joint owners of certain chip business facilities.

    Western Digital’s contentions raise another potential roadblock in the troubled process. The Japanese company needs to shore up finances hurt by losses from its Westinghouse nuclear business and has warned that its very survival is at risk. Analysts cautioned that Western Digital does have legal rights that will bear on the sale process.

    ‘Consent to Approve’

    “We believe that WDC has rights surrounding the JV including the consent to approve/disapprove of any transaction involving the joint venture,” Amit Daryanani, an analyst at RBC Capital Markets, wrote in a research note. “WDC has the legal wherewithal to veto or approve a winning bid.”

    Toshiba disagrees with Western Digital’s assertion that a sale would violate the agreement between the two companies, Toshiba executives said when contacted by Bloomberg News.

    Last year Western Digital, one of the largest makers of computer hard drives, made a $15.8 billion bet on technology that’s making its core business obsolete, with its purchase of SanDisk Corp. SanDisk was a manufacturing partner of Toshiba, a role that Western Digital has assumed.

    That purchase piled debt onto its balance sheet and may restrict its ability to match some of the bids that other companies reportedly made for Toshiba’s chip unit. In January, Western Digital said it had cash and cash equivalents of $5.2 billion. The company had “liquidity available” totaling $6.2 billion and a net debt position of about $800 million, it said.

    Toshiba has narrowed the original group of contenders for the chip business after a first round of bidding. Taiwan’s Hon Hai Precision Industry Co. has indicated its willingness to pay as much as 3 trillion yen ($27 billion), Bloomberg has reported.

    Toshiba’s board is trying to balance the need for a quick sale with concerns that such a deal would mark the end of Japan’s chance of restoring its once-leading role in the $300 billion chip industry and potentially aid China’s push to enter that important market, Bloomberg News has reported.

    Milligan’s letter, which was earlier reported by the Nikkei Asian Review, cautioned in particular against accepting a bid from Broadcom Ltd., a company that has led the wave of consolidation in the chip industry over the past two years.

