Data Center Knowledge | News and analysis for the data center industry
 

Thursday, July 28th, 2016

    4:14p
    Creating an Enterprise Data Migration Plan

    Today’s businesses have a lot of storage and data options, and requirements around data control will continue to grow and evolve. With that in mind, let’s touch on one aspect of the IT and data center administrative process that some organizations hate to discuss: data migrations.

    What if you need to move a massive amount of data? What if it’s not as simple as re-mapping a storage repository? In some cases, you might be migrating to an entirely new storage vendor to align with specific business strategies. Either way, when dealing with critical corporate data, you need a plan. So here are eight steps to creating an enterprise data migration plan:

    Business Impact Analysis. To identify the business and operational requirements for a data migration project, a business impact analysis (BIA) should be conducted. The BIA process involves the different business stakeholders, who work to ensure that their requirements are factored into the migration plan.

    During this process, four elements will be identified:

    • The IT team will examine and define available network bandwidth, storage, CPU needs, allowable downtime, and the migration schedule.
    • As more teams become involved in the process, database and system administrators define application and database requirements.
    • Key business partners will work to define the importance of, and requirements for, specific applications and types of data.
    • Internal security and compliance groups define compliance requirements for a given infrastructure.

    This document isn’t created solely for the data migration plan; the BIA is also a critical input to disaster recovery planning and infrastructure management.
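
    As a simple illustration of how BIA numbers feed the plan, the back-of-the-envelope check below estimates whether a data set can be moved within an allowable downtime window given the usable network bandwidth. The figures (20 TB of data, 4 Gbps usable bandwidth, an 8-hour window) are hypothetical placeholders; this is only a planning-arithmetic sketch, not a sizing tool.

        # Hypothetical BIA feasibility check: can the data move inside the window?
        data_tb = 20.0          # data set size identified in discovery (TB)
        usable_gbps = 4.0       # usable bandwidth agreed with the IT team (Gbps)
        window_hours = 8.0      # allowable downtime from the BIA (hours)

        data_bits = data_tb * 1e12 * 8                        # terabytes -> bits
        transfer_hours = data_bits / (usable_gbps * 1e9) / 3600

        print(f"Estimated transfer time: {transfer_hours:.1f} h "
              f"({'fits' if transfer_hours <= window_hours else 'does not fit'} "
              f"the {window_hours:.0f} h window)")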

    Data Discovery and Requirement Planning. The discovery step helps define the migration hardware and software environment, as well as the overall requirements for the migration. Leave no feature unchecked in this process and ensure that you conduct a deep discovery of your current systems. This means understanding dependencies, permissions, user groups, how data is organized, network configurations, and more.
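
    To make the file-level part of discovery concrete, here is a minimal sketch, assuming a POSIX host and Python 3, that walks a source tree and records sizes, ownership, permissions, and modification times to a CSV so dependencies and permission models can be reviewed before mapping. Real discovery also has to cover databases, shares, and network configuration, which a filesystem walk alone will not capture.

        import csv, os, stat

        def inventory(root, out_csv):
            """Walk a source tree and record basic attributes for each file."""
            with open(out_csv, "w", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow(["path", "size_bytes", "uid", "gid", "mode", "mtime"])
                for dirpath, _dirnames, filenames in os.walk(root):
                    for name in filenames:
                        path = os.path.join(dirpath, name)
                        try:
                            st = os.lstat(path)
                        except OSError:
                            continue  # file vanished or unreadable; note and move on
                        writer.writerow([path, st.st_size, st.st_uid, st.st_gid,
                                         stat.filemode(st.st_mode), int(st.st_mtime)])

        # Example (hypothetical paths):
        # inventory("/mnt/source_volume", "discovery_inventory.csv")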

    Data Mapping and Storage Design. Once you wrap up the data discovery phase, you’ll use that information to create your mapping and design architecture for the new ecosystem. In some cases this will be straightforward, such as when staying with the same vendor. However, for heterogeneous migrations or when working with multiple vendors, this process is especially critical. Although there are migration tools which can help, there needs to be a manual verification process to ensure proper data, storage, and configuration mapping and design.
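
    As a hedged illustration of that verification idea, the sketch below checks a simple source-to-destination mapping: every source volume must have a destination, and each destination must have enough capacity for everything mapped to it. The volume and LUN names and sizes are made up; a real mapping would also carry permissions, policies, and protocol details.

        # Hypothetical mapping of source volumes to destination LUNs (sizes in GiB)
        source_volumes = {"vol_finance": 2048, "vol_hr": 512, "vol_archive": 8192}
        destination_luns = {"lun_01": 4096, "lun_02": 10240}
        mapping = {"vol_finance": "lun_01", "vol_hr": "lun_01", "vol_archive": "lun_02"}

        # Every source volume must be mapped somewhere
        unmapped = [v for v in source_volumes if v not in mapping]
        assert not unmapped, f"Unmapped source volumes: {unmapped}"

        # Sum the source data landing on each destination and compare to its capacity
        placed = {}
        for vol, lun in mapping.items():
            placed[lun] = placed.get(lun, 0) + source_volumes[vol]
        for lun, used in placed.items():
            assert used <= destination_luns[lun], f"{lun} over capacity: {used} GiB"
        print("Mapping verified:", placed)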

    Creating the Data Migration Plan. In many situations, you’ll be able to stand up your secondary storage or data control ecosystem in parallel to your existing environment. This allows for a seamless migration. Nonetheless, you still need to have a complete migration plan. You need to understand how this can impact users, your business, applications, and other critical workloads. In creating your plan – take these four points into consideration:

    • Constraints must be identified through business and operational requirements. Make sure to take your business operations into consideration when doing a data migration project.
    • Identify all of the data to be migrated and all of the necessary associated attributes. When creating the migration plan, use your mapping and design metrics to ensure that all data attributes are accounted for and configured.
    • Use vendor-specific migration tools if needed. You don’t have to do this alone. Aside from being able to work with a data migration or storage partner – you can leverage vendor-specific data migration tools. These can help you plan out the design, requirements, and even migration process.
    • Always incorporate environment storage and application best practices. Does the VM need to be shut down? Can the VM be migrated via hypervisor migration? Can you live-migrate storage repositories? When working with virtualization, next-gen storage appliances, and very critical workloads, ensuring best practices around apps and storage is absolutely necessary. Since every environment is unique in its requirements, creating a migration plan is often a challenge. Different types of data may require different migration tools and strategies. Furthermore, business and operational requirements (downtime, for example) may require creative ways of migrating the data.

    The key thing to remember is that the migration plan is truly a living document. At any time, variables in the environment can alter the course of the project, or execution can lead to unexpected results. All of this can change the plan as originally documented.
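
    One way to keep the plan “living” is to store it as versioned, structured data rather than in a static document. The minimal sketch below, with every field value hypothetical, shows the kind of record that captures the four points above and can be revised as variables change.

        # Hypothetical migration-plan record; in practice this might live in YAML or a CMDB
        migration_plan = {
            "version": 3,  # bump whenever a variable or decision changes
            "constraints": {"downtime_window_hours": 8, "blackout_dates": ["2016-08-15"]},
            "data_sets": [
                {"name": "vol_finance", "attributes": ["acl", "quotas"], "tool": "vendor_replicator"},
                {"name": "vol_archive", "attributes": ["worm_retention"], "tool": "host_copy"},
            ],
            "best_practices": {"vms_shut_down": False, "live_storage_migration": True},
            "rollback": "re-point clients to source volumes; keep source read-only for 14 days",
        }

    Treating the plan as data makes it easy to diff revisions as the project evolves and to keep stakeholders working from the same version.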

    Data and Storage Preparation and Provisioning. During provisioning, the destination storage environment is prepared for the data move. This is where you design your repositories, LUNs, volumes, specific policies, security rules, virtualization integration, and more. Remember, we’re working with a lot of different types of storage environments today. Understand differences between hybrid arrays, all-flash, and traditional spinning disk. Furthermore, know how to provision new features like caching and even dynamic provisioning/de-provisioning features. Also, when designing around a new storage system – ensure that you’re familiar with all of the new features in the environment.
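
    Provisioning details vary widely by platform, but one piece that generalizes is sizing the destination. The sketch below estimates the usable capacity to provision from the discovered data set, an assumed annual growth rate, and an assumed data-reduction ratio on a deduplicating array; all of the numbers are illustrative, not recommendations.

        logical_data_tb = 60.0       # from the discovery inventory
        annual_growth = 0.25         # assumed 25% growth per year
        planning_horizon_years = 3
        data_reduction_ratio = 2.5   # assumed dedupe/compression on the destination array
        headroom = 0.20              # keep 20% free for snapshots and burst

        projected_tb = logical_data_tb * (1 + annual_growth) ** planning_horizon_years
        usable_needed_tb = projected_tb / data_reduction_ratio * (1 + headroom)
        print(f"Provision at least {usable_needed_tb:.1f} TB usable on the destination")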

    Validation Testing. This is a critical step. You can do things like load-testing and verifying that policies and configurations transferred over properly. This is your chance to create a “production” test environment to ensure your new storage solution can handle user and business requirements.
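
    For a very rough load test of the new repository, a sketch like the one below writes and then reads back a test file on a mounted path and reports throughput. It is only a smoke test under stated assumptions (a POSIX mount at a hypothetical path); real validation should use the application’s own I/O patterns or a dedicated benchmarking tool, and read numbers can be inflated by the page cache.

        import os, time

        def measure_throughput(mount_path, size_mib=1024, block_mib=4):
            """Write then read a test file on the new storage and report MiB/s."""
            test_file = os.path.join(mount_path, "migration_loadtest.bin")
            block = os.urandom(block_mib * 1024 * 1024)
            start = time.time()
            with open(test_file, "wb") as f:
                for _ in range(size_mib // block_mib):
                    f.write(block)
                f.flush()
                os.fsync(f.fileno())
            write_s = time.time() - start
            start = time.time()
            with open(test_file, "rb") as f:
                while f.read(block_mib * 1024 * 1024):
                    pass  # note: reads may be served from cache
            read_s = time.time() - start
            os.remove(test_file)
            return size_mib / write_s, size_mib / read_s

        # Example (hypothetical mount point):
        # w, r = measure_throughput("/mnt/new_storage")
        # print(f"write {w:.0f} MiB/s, read {r:.0f} MiB/s")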

    Migration and Cutover. At this point your migration plan is ready to go, you’ve done all of your testing, and you’ve finalized configurations. Now you can begin the migration and cutover. There are vendor tools you can use from the destination storage technology; however, there are several factors which play into choosing the best migration methodology. These factors may include:

    • The type of source and destination storage systems
    • Infrastructure network and storage topology (NAS or SAN)
    • Physical server and equipment locations
    • Mapping requirements for specific applications
    • Application and database data usage
    • Specific project, customer business, technical, and operational requirements

    The truth is that there is no one right way to do an effective data migration. It will all depend on trade-offs based on the infrastructure of the environment, core business and operational requirements, and migration experience. This is why the initial planning phase in any data migration project is so important: it is what produces the right migration plan for the specific project at hand.

    Once the data has been moved, all clients must be redirected to the destination devices.
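
    As one hedged illustration of a host-level cutover for file data (array-based or vendor replication tools are often the better choice when available), the sketch below drives rsync from Python: a dry run first to review what would change, then the real pass preserving permissions, ownership, ACLs, and extended attributes. The paths are hypothetical and the flags assume a Linux host with rsync installed.

        import subprocess

        SRC = "/mnt/source_volume/"        # hypothetical source mount (trailing slash matters to rsync)
        DST = "/mnt/destination_volume/"   # hypothetical destination mount

        base_cmd = ["rsync", "-a", "--hard-links", "--acls", "--xattrs", "--delete", "--stats"]

        # 1. Dry run: show what would be transferred or removed without touching data
        subprocess.run(base_cmd + ["--dry-run", "--itemize-changes", SRC, DST], check=True)

        # 2. Real pass during the cutover window
        subprocess.run(base_cmd + [SRC, DST], check=True)

    A typical pattern is to run the synchronization several times while the source is still live, so the final pass inside the downtime window only has to move the remaining delta before clients are redirected.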

    Final Migration Testing and Validation. The final and certainly very critical step is to validate the post-migration environment. This final step will confirm that all business and technical expectations have been met. From a testing side – at a minimum – network access, file permissions, directory structure, and database/applications need to be validated in their new environment.
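
    A minimal sketch of the file-side check, assuming both the old and new trees are still mounted on one host, is below: it walks the source, confirms each file exists at the destination, and compares sizes, permission bits, and SHA-256 checksums for a sample of files. Database and application validation would be handled separately with their own tools.

        import hashlib, os, stat

        def sha256(path, chunk=1024 * 1024):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk), b""):
                    h.update(block)
            return h.hexdigest()

        def validate(src_root, dst_root, checksum_every=100):
            """Compare presence, size, mode, and sampled checksums between trees."""
            problems, count = [], 0
            for dirpath, _dirs, files in os.walk(src_root):
                for name in files:
                    src = os.path.join(dirpath, name)
                    dst = os.path.join(dst_root, os.path.relpath(src, src_root))
                    if not os.path.exists(dst):
                        problems.append(f"missing: {dst}")
                        continue
                    s, d = os.lstat(src), os.lstat(dst)
                    if s.st_size != d.st_size:
                        problems.append(f"size mismatch: {dst}")
                    if stat.S_IMODE(s.st_mode) != stat.S_IMODE(d.st_mode):
                        problems.append(f"permission mismatch: {dst}")
                    count += 1
                    if count % checksum_every == 0 and sha256(src) != sha256(dst):
                        problems.append(f"checksum mismatch: {dst}")
            return problems

        # Example (hypothetical mounts):
        # for issue in validate("/mnt/source_volume", "/mnt/destination_volume"):
        #     print(issue)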

    In planning your own data or storage migration process, make sure to work with storage engineers and architects who can help guide you throughout the process. Remember, you’re dealing with very critical data resources here. Spend the extra time planning effective migrations, make sure you conduct effective backups, and work to ensure the safe transfer of your information. In some cases, this means working directly with the vendor, in other cases a partner can help. Either way, take data and storage migrations seriously. Storage outages are never fun; good planning can help avoid the situation and keep an environment up and running.

    8:50p
    CoreSite Realty: Strong Q2 Overshadowed By CEO Tom Ray’s Departure

    CoreSite Realty’s outgoing CEO Tom Ray said on the Q2 earnings call that succession planning was an ongoing process — but the final chapter, which began in July, was swiftly completed in under 30 days.

    Ray will be stepping down on September 10 after 15 years of service, and will be replaced as president and CEO by CoreSite lead director Paul E. Szurek, who has served as a CoreSite board member since 2010.

    Ray will continue in a consulting role through June 30, 2017. He said on this morning’s Q2 earnings call that a 12-month non-compete clause was part of his separation agreement.

    Another Strong Quarter

    CoreSite continued its string of impressive quarterly results. Q2 highlights included:

    • Reported second-quarter funds from operations (“FFO”) of $0.89 per diluted share and unit, representing 30.9 percent growth year over year.
    • Reported second-quarter total operating revenues of $96.1 million, representing an 18.1 percent increase year over year.
    • Executed a record 171 new and expansion data center leases comprising 48,147 net rentable square feet (NRSF), representing $7.7 million of annualized GAAP rent at a rate of $159 per square foot.
    • CoreSite is increasing its 2016 guidance for FFO per diluted share and unit to a range of $3.56 to $3.64 from the previous range of $3.52 to $3.60.

    Notably, CoreSite’s ongoing leasing success creates challenges of its own, with an overall occupancy of about 92 percent. Given the record high facility utilization, it gets harder to fill in the remaining “nooks and crannies.”

    Read more: What’s Behind CoreSite’s Fortune 100 Channel Partner Deals

    CoreSite announced two channel partnerships in June, with Fortune 100 companies Avnet and Ingram Micro, which could help drive even higher occupancy across the portfolio from single-cabinet deployments.

    [Chart: MRR per cabinet. Source: CoreSite – Q2 2016 Supplemental]

    CoreSite’s revenue per cabinet continued to increase, and along with stronger interconnection revenues than forecast, helped to drive EBITDA margins up to a healthy 52.6 percent.

    Strategic Vision

    There is no way for anyone besides Ray and the board to know what really prompted the sudden departure.

    However, regardless of what prompted it, the leadership change does not alter CoreSite’s immediate challenge: purchasing land in order to continue along its sector-leading profitable growth trajectory.

    The company recently completed its final 48,000 SF phase in the red-hot Northern Virginia market. According to the Jones Lang LaSalle (JLL) July 2016 leasing update, 78 MW of wholesale space were absorbed during the first half of 2016. Meanwhile, there is plenty of dirt being moved, facilities being constructed, and land being banked by competitors looking to serve that wholesale market.

    In Santa Clara, CoreSite expects to complete its final 230,000 SF SV-7 facility during the next few months. This facility is already 59 percent pre-leased. Much of the remaining space may be needed to serve smaller colocation and interconnection customers in the Silicon Valley.

    Ray confirmed on the earnings call that CoreSite will be out of the wholesale data center leasing business for at least the next six to 12 months. He also discussed concerns regarding excess supply in the Bay Area next year, given the number of projects currently underway, permitted, or on the drawing boards.

    If wholesale leasing demand by the hyperscale cloud providers continues during the next 18 months, Ray said the Bay Area market condition could be “healthy.” However, it appears that cloud leasing roulette wasn’t a game that Ray was willing to play in either Northern Virginia or the Silicon Valley in 2017.

    Investor Takeaway

    Under Ray’s leadership, CoreSite delivered tremendous returns for shareholders. During 2015, CoreSite was the top-performing REIT in any sector, with a total return of ~50 percent.

    To put that into perspective, the MSCI REIT Index, which tracks equity REIT performance across all sectors, returned only 2.52 percent in 2015. CoreSite’s outperformance in 2015 helped to “unmask” the relative strength of data center REITs.

    The first half of 2016 has been exceptional for the entire data center REIT sector, with incredible gains primarily driven by large wholesale leases from public cloud providers.

    Read more: Another Huge Quarter for Data Center REITs: What’s Next?

    CoreSite again was the top performer, delivering total returns of over 60 percent through June 30, 2016. Frankly, that is a tough act to follow.

    CoreSite’s facilities are currently 92 percent occupied, up 140 basis points sequentially from Q1 2016. Filling up the final phases of large data center campuses is a low-risk way to deliver high margins and ROIC for shareholders. However, by always allocating capital to the existing footprint, CoreSite would be unable to maintain the growth which supports the high earnings multiples for the stock.

    It remains to be seen what direction the board will take with Paul Szurek at the helm. Chairman of CoreSite’s board of directors, Robert Stuckey, said in the announcement that “Paul, an active and valuable contributor to the CoreSite Board, has been instrumental in providing oversight of the company’s growth and strategic direction.”

    In my view, CoreSite is now behind the curve when it comes to banking land for expansion in existing US markets. This is a situation which must be dealt with swiftly in order to deliver the growth that shareholders have enjoyed under Ray’s rock-solid leadership.

    If CoreSite were to announce a significant M&A deal to expand internationally (InterXion immediately comes to mind), that would be an obvious change of direction. However, it would require morphing from a cautious posture to a higher-risk approach to achieving growth.

    On the other hand, data sovereignty is becoming a more pronounced trend, so there is a growing risk associated with not expanding overseas as well.

    9:39p
    The Windows 10 Anniversary Update for IT Pros
    By WindowsITPro


    Next week Microsoft will release the second major update for Windows 10, and there are some changes in this release to consider if you are an IT pro contemplating a migration to Microsoft’s year-old operating system.

    During the Microsoft Worldwide Partner Conference a couple of weeks ago, Yusuf Mehdi, the corporate VP for the Windows and Devices Group at the company, indicated that 96 percent of its corporate customers are currently trialing Windows 10 and considering their own migrations.

    A tremendous amount of testing has gone into the Anniversary Update, previously known as Redstone 1 when its development first began late last year, and that impact shows in the momentum numbers for Windows 10.

    • Over 350 million devices are now running Windows 10 as of June 29, 2016.
    • More than 135 billion hours have been spent by users on Windows 10 actively using the system.
    • We have been tracking Windows Insider builds and there have been a total of 50 builds released since last December for testing. That is split almost evenly at 27 PC builds and 23 for Mobile devices.
    • Windows Insiders have spent more than 50,000 years’ worth of time using these testing builds – that is over 18 million days if you do the math.
    • During that usage by Insiders, they have also submitted 75 million pieces of feedback and more than 5,000 features and fixes have made their way into Windows 10 as a result of that feedback.

    I did a full review of the Windows 10 Anniversary Update, which will begin its rollout to current Windows 10 users on August 2, over on the Supersite: Windows, and that review highlights the areas that are getting a lot of attention in this release.

    Admittedly, the new features covered in my review are heavily consumer-focused but may still be worthwhile for enterprise users to check out. However, there are a couple of pieces in the Windows 10 Anniversary Update that should be considered on the enterprise side of the house as well, and their focus is on security.

    Windows Defender Advanced Threat Protection (WDATP)

    This new feature helps IT Pros to detect, investigate and deal with malicious attacks on their networks. It does this by providing comprehensive threat intelligence and attack detection.

    WDATP is a post-breach feature and is built to help you remediate attacks and prevent them in the future.

    There are three key parts to this technology according to Microsoft:

    1. The Client – end-point behavioral sensor, built into Windows 10 (Windows 10 Anniversary update, Windows Insider Preview Build number 14332 and later) and activated upon service enrollment. The client logs relevant security events and behaviors from the endpoint.
    2. Cloud security analytics service – processing data from endpoints in combination with historical data and Microsoft’s wide data repository to detect anomalous behaviors, adversary techniques and similarity to known attacks. The service runs on the Microsoft scalable big data platform, and uses a combination of Indicators of Attacks (IOAs), generic analytics and machine learning rules, as well as Indicators of Compromises (IOCs) collected from past attacks.

    3. Microsoft and community intelligence – our hunters and researchers investigate the data, finding new behavioral patterns and correlating the data with existing knowledge from the security community.

    Windows Information Protection (WIP)

    The reality of work in a corporate environment is that personal and work-related files, emails, and other content are going to become intermingled on users’ devices. WIP implements a series of features that help protect critical corporate information from being shared with individuals who should not receive it.

    WIP works on four key information protection fundamentals:

    1. Device Protection: Making sure that the device and the information on it are protected if the hardware is stolen/lost.
    2. Data Separation: This element makes sure that personal and corporate info is kept separate on any device they are both stored on.
    3. Leak Protection: This will keep unauthorized users and apps from accessing or sharing your protected data.
    4. Sharing Protection: This step makes sure that data you share outside of your organization and control continues to be protected from unauthorized individuals.

    The first two fundamentals above happen on a Windows 10 device by using BitLocker and WIP. The final two areas are implemented through Azure Information Protection and Office 365 controls.

    Is your company/organization testing out Windows 10 right now? What is your biggest concern about making that migration?

    But wait… there’s probably more, so be sure to follow me on Twitter and Google+.

    This post first appeared at WindowsITPro.

     

    9:51p
    Microsoft Still Mum on Huge New Iowa Project

    Spokespersons from Microsoft declined to comment today on news that broke from the city of West Des Moines last week that the company would be building a tremendous new facility in the city. It may still be a secret in Redmond, but in Iowa, the governor has already rolled out the red carpet.

    “Nothing makes me more proud,” said Gov. Terry Branstad (R) during a press conference in West Des Moines last week, “than when a company with an existing presence in our state makes another significant investment in Iowa — especially one of this significance, and a company of this quality.”

    What Gov. Branstad was referring to was an agreement reached between his state and Microsoft to build a third major data center in West Des Moines in a tract spanning two counties adjacent to where its other two facilities there are located. The project would represent an estimated $3.5 billion capital investment from Azure’s parent, and according to Mayor Steven Gaer (R), a source of revenue for the city in the form of $23 million annually in collective tax payments, including $12 million per year in property taxes.

    The state did agree to grant Microsoft what Mayor Gaer described as “modest” tax benefits, though the company did commit to funding 100 percent of its infrastructure buildout without tax abatement.

    The word “Azure” was not uttered during the mayor’s press conference last week, though it can certainly be inferred.

    According to the mayor, Phase I of the four-phase project is scheduled to begin next year to build a 1.7 million square foot complex just south of the Dale Maffitt Reservoir, on 200 acres spanning the border between Warren and Madison counties. Completion is scheduled for 2022. The governor said Phase I would bring in some $417 million in capital investment to the state.

    “Companies look for locations that are safe from hurricanes, earthquakes, and rolling blackouts,” Gov. Branstad remarked. “But it’s also important that they have access to renewable energy. And Iowa just happens to lead the nation in the percentage of our electricity generated by wind.”

    The governor cited figures from MidAmerican Energy Co. last April stating that the state of Iowa currently generates some 31 percent of its total annual power production from wind — reportedly the highest percentage of any state. With the aid of federal renewable energy tax credits, the producer plans to boost that percentage to 40 percent by 2018, by way of a $3.6 billion investment toward building a 1,000-turbine, 2-gigawatt wind farm.

    MidAmerican’s parent company, by the way, is Berkshire Hathaway, which acquired the producer in 1999. Berkshire’s CEO, Warren Buffett, is a personal friend of Microsoft founder Bill Gates.

    When Microsoft committed to building its second data center in the West Des Moines area in May 2014, its $2 billion commitment to Iowa represented the single largest outlay by a private company in the state’s history at that time. But it almost didn’t happen, until the state legislature in Microsoft’s home state of Washington suspended its debate over whether to continue extending tax breaks to a corporation that evidently didn’t feel like building data centers in Washington State.

    The big draw for Iowa at that time was cheap energy, with costs per kilowatt-hour falling as much as 24 percent below the national average. But a perfect storm of climate change-triggered temperature extremes in the state (including a record low-temperature summer), coupled with the costs of building out the state’s renewable power, has threatened to make Iowa less attractive. Doug Shull, chairman of the Warren County Board of Supervisors, admitted that Microsoft’s big build came shortly after his county came in second in the race to attract another major data center builder — whose identity Shull refrained from revealing.

    The latest power data for May 2016 from the U.S. Energy Information Administration shows that the price for commercial power in Iowa has risen almost 16 percent since Microsoft’s 2014 announcement, to 9.44¢ / kWh — still below the national average of 10.7¢ / kWh, but only by 8.1 percent.

