Data Center Knowledge | News and analysis for the data center industry

Wednesday, April 5th, 2017

    2:34a
    Automation: Not Just for DevOps

    Automation is most closely associated with DevOps teams, but it can be applied to any number of functions in a traditional data center or cloud environment—especially those mundane, repetitive tasks that take up precious time and invite human error.

    However, “just don’t automate for the sake of automating,” said Joel Sprague, principal systems engineer for General Dynamics, during his Data Center World session, “Automation: Not Just for DevOps.”

    If automating a task won’t save time and money, it’s better left manual. To avoid that, focus on repetitive tasks that require doing the same thing multiple times in one sitting; tasks that involve doing the same thing every day, week, or month; and those that call for applying the same action to a large number of systems in a short period.

    Tasks can range from developing scripts and checking for software updates to handling more complex matters, such as rooting out security problems, including risks introduced by human error.
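
    As an illustration (not from the session), the sketch below shows the kind of repetitive check automation is suited to: the same update query run against many systems from a single script. The host names and the yum command are hypothetical placeholders, and real tooling would need proper error handling.

        # Minimal sketch: run the same update check across many hosts over SSH.
        # Hosts and the remote command are illustrative placeholders.
        import subprocess

        HOSTS = ["app01.example.com", "app02.example.com", "db01.example.com"]
        CHECK_CMD = "yum -q check-update"   # exits 100 when updates are pending

        def pending_updates(host):
            # Returns True if the host reports pending package updates.
            result = subprocess.run(["ssh", host, CHECK_CMD],
                                    capture_output=True, text=True, timeout=60)
            return result.returncode == 100

        for host in HOSTS:
            status = "updates pending" if pending_updates(host) else "up to date"
            print(f"{host}: {status}")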

    “Automation can let one person accomplish the work of 20,” added Sprague. “When something needs to be done many times, very quickly, look to automation.”

    Automation can go beyond rudimentary data center tasks. As recently as 10 years ago, it could take a company several weeks to deploy an additional server. IT would wait for the purchasing department to buy the new server and software licenses, then wait for the equipment, CDs and installation manuals to be shipped, then manually install the software, manually match the hardware, software, networking and access rights configurations with the existing implementation, and physically deploy the server onto the network.

    Today, thanks to process automation, it is possible to fully provision an application server in under an hour.
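
    The article does not show what that automation looks like in practice, but a rough sketch might drive every provisioning step from a single declarative spec. The spec fields and step functions below are hypothetical stand-ins for whatever an orchestration tool such as vRealize Orchestrator or PowerShell actually exposes; this is an illustration, not a working provisioning pipeline.

        # Illustrative sketch only: automated provisioning driven by a declarative spec.
        # All field names and step functions are hypothetical placeholders.
        SERVER_SPEC = {
            "name": "app-server-42",
            "cpu": 4,
            "memory_gb": 16,
            "image": "rhel-7-base",        # a pre-built template replaces CDs and manuals
            "network": "prod-vlan-110",
            "access_group": "app-admins",
        }

        def clone_template(spec):
            # Hypothetical step: clone a VM from the named template.
            print(f"Cloning {spec['image']} as {spec['name']}")

        def configure_network(spec):
            # Hypothetical step: attach the VM to the right network segment.
            print(f"Attaching {spec['name']} to {spec['network']}")

        def apply_access_rights(spec):
            # Hypothetical step: mirror the existing access-rights configuration.
            print(f"Granting {spec['access_group']} access to {spec['name']}")

        def provision(spec):
            # Run every step in order -- the work that once took weeks by hand.
            for step in (clone_template, configure_network, apply_access_rights):
                step(spec)

        provision(SERVER_SPEC)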

    Sprague recommended two free automation tools: VMware’s vRealize Orchestrator and Microsoft PowerShell.

    9:37a
    Finding the Sweet Spot for Your Data Center

    There’s certainly no shortage of options for expanding data center capacity these days. You can renovate an existing facility, add a modular unit onsite or offsite, build a new facility from scratch, lease data center space, move non-critical data and applications off your servers and into a cloud, or pursue just about any combination of the above.

    Which scenario is right for your company? Whatever makes the most sense for the business, said HPE’s Laura Cunningham during her Data Center World session, “Finding the Sweet Spot for Your Data Center.”

    So, it’s imperative to know the future direction and financial preferences of your company before meeting face-to-face with a CIO, CEO, or CFO to seek approval for any IT project.

    “What IT people don’t always get is that everyone in the company is asking for money, so you better make a good case,” she said.

    Cunningham offered examples that illustrate why Ford, Netflix, and most financial firms chose their particular strategies to meet core business needs.

    In anticipation of the deluge of data expected to be created by new connections between automobiles and technology, Ford is building a $200 million data center in Flat Rock, Michigan.

    The company has the luxury of embarking on a project that can take up to two years to complete. Most importantly, Ford is highly motivated to capture a bigger share of the transportation services market, which is estimated to generate $5.4 trillion in revenue annually.

    On the other hand, when Netflix moved from a snail mail-based DVD rental model to an online streaming service, it opted to get rid of most of its data centers and move 100 percent of its customer-facing business to the cloud as a customer of Amazon Web Services. The company clearly needed a faster turnaround time than Ford, said Cunningham. Plus, its servers are spread across the country, ensuring the shortest connections to customers.

    Finally, Cunningham addressed how financial companies—especially investment firms—are clamoring to lease or buy data center space as close as possible to the NYSE Euronext data center in Mahwah, New Jersey, because the ability to make split-second stock trades or get market-moving news before competitors is key to their success.
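
    The session coverage does not quantify that advantage, but a back-of-the-envelope sketch shows why distance matters: light moves through optical fiber at roughly 200 kilometers per millisecond, so every extra 100 kilometers of fiber adds about one millisecond of round-trip delay. The distances below are illustrative, not measurements of any real facility.

        # Rough sketch: round-trip fiber latency as a function of distance to the exchange.
        FIBER_KM_PER_MS = 200.0   # light in fiber travels at roughly 2/3 the speed of light

        def round_trip_ms(distance_km):
            # Propagation delay only; switching and processing delays are ignored.
            return 2 * distance_km / FIBER_KM_PER_MS

        for label, km in [("colocated nearby", 1),
                          ("across the metro area", 50),
                          ("a few states away", 1000)]:
            print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip")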

    “So, they’re willing to pay millions and millions of dollars. It’s that important to them,” she said.

    When it comes time to start building your case, keep in mind Cunningham’s advice: “What’s most important to your company is what’s most important to you.”

    Is time a factor? Will a certain location affect the business negatively or positively? What makes sense from a funding standpoint? Does your company treat data centers as an OpEx or CapEx expense? If latency is a key factor, how would you propose to keep it as low as possible?

    By knowing the answers to those questions before heading into the boardroom, then making decisions based on business needs, you’ll be that much closer to gaining approval from corporate.

    11:00p
    Google Invests in Submarine Cable to Speed Up Its Cloud in Asia Pacific

    Google has invested in yet another submarine cable project to boost bandwidth of the global network backbone that interconnects its data centers around the world.

    The 9,000-kilometer Indigo cable will land in Singapore, Jakarta, Perth, and Sydney, boosting bandwidth between Singapore, Indonesia, and Australia by 18 terabits per second, which the company said is enough for people in Singapore and Sydney to hold 8 million simultaneous high-definition video calls.
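
    The article does not break that figure down, but dividing the cable’s capacity by the number of calls shows what the claim implies: roughly 2.25 megabits per second for each call.

        # Quick check of the claim: 18 Tbps shared across 8 million HD video calls.
        cable_capacity_bps = 18e12    # 18 terabits per second
        simultaneous_calls = 8e6      # 8 million calls

        per_call_bps = cable_capacity_bps / simultaneous_calls
        print(f"Implied bandwidth per call: {per_call_bps / 1e6:.2f} Mbps")   # about 2.25 Mbps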

    “Many people are coming online across Asia, including businesses that depend on the cloud,” Brian Quigley, who oversees Google’s global network infrastructure, and Michael Francois, the company’s infrastructure partnership manager in Australia, wrote in a blog post. “That’s why it’s so important to enable better internet connectivity across the region.”

    In recent years cloud giants, including Google, Facebook, Microsoft, and Amazon, have been getting more involved in submarine cable construction projects by investing in them instead of simply paying their operators (traditionally big telco consortia) to use international bandwidth. As they add users and services around the world, these companies are eager to expand network bandwidth and have found that it makes financial sense to fund these projects, each of which can cost hundreds of millions of dollars.

    Google has invested in more submarine cable projects than any of the other cloud giants. Indigo, expected to come online by mid-2019, is its seventh cable investment overall and its fifth in the Asia-Pacific region.

    See also: Here are the Submarine Cables Funded by Cloud Giants

    Other investors in the project are AARNet, Indosat Ooredoo, Singtel, SubPartners, and Telstra. The cable will be built by Alcatel Submarine Networks.

    An Alcatel Submarine Networks cable-laying vessel (Photo: Alcatel)

