Data Center Knowledge | News and analysis for the data center industry
 

Monday, January 26th, 2015

    1:00p
    With Colocation Security, Never Assume Anything

    British Telecom has come a long way from the days Britain’s General Post Office sanctioned installation of the first telephone in the country in the late 1870s. Better known today as simply BT, it is a multinational giant of telecommunications and every flavor of IT infrastructure outsourcing services with about $27 billion in annual revenue.

    Naturally, BT has a massive global data center infrastructure and has been investing a lot of money into staying on top of current technology trends. We recently sat down with Jason Cook, CTO of BT Americas, to learn a little bit about the company’s data center strategy and some of the areas of technology it is currently interested in.

    Here are some highlights from our chat:

    Data Center Knowledge: Do you buy, build, or lease data centers?

    Jason Cook: All of the above. We have one of the most diverse IT estates out there. Different countries, different technologies.

    DCK: What drives those decisions?

    JC: It really depends on the location. It’s not always economical to build everything from scratch. In Colombia, we built because we see a longer-term strategic advantage of doing that. In each key region you’ll see that we physically own the data center, and then, in satellite areas, it depends on the business presence, the opportunity that is there.

    DCK: What do you look for in a colo provider?

    JC: Ability to scale; how fat the network is; how good the power is. You’d be surprised how often these colo sites go off the air. And more and more importantly now, physical security, as well as cybersecurity.

    DCK: How big of a role does the colo provider play in security of your customers?

    JC: We want to absolutely guarantee all of the obvious vulnerabilities [are addressed]. There’s no point in having sexy cybersecurity if someone can still walk in. We do random spot checks for all of our colos all the time.

    DCK: Have you been able to “penetrate” colocation security during random spot checks?

    JC: If we are looking for a colo partner, one of the things that we do is a random spot check. And yes, we have run into situations where we’re walking around on the third floor of the data center with the ability to touch stuff. And then we call the manager and say, “Excuse me, but we just walked in.”

    DCK: What’s the response usually?

    JC: I’ll leave it to your imagination.

    DCK: How big is the physical colocation security concern today?

    JC: Physical security is still one of the easiest ways to get access to data. With all of the sophistication in the current technology, what’s the point, if someone can walk in and open the door? And that is a pretty large issue. It’s not as if it’s a small factor. Anyone that’s trying to get access to data will try the most obvious things first.

    DCK: How does BT differentiate its data center services?

    JC: If it’s just a commodity-level piece, there’s nothing that differentiates us. We found with customers though, that perhaps they’re buying our other services, security, or voice services, or network services, and then they discover “Oh, you’re doing this as well?” As long as price-point-wise we’re OK. And we tend to find that we are quite competitive that way.

    DCK: What are some of the key current technology initiatives at BT?

    JC: We are very embedded [with] SDN. We are part of the NFV forum that kicked off in Europe two years ago. We spent [billions of pounds] on innovation in the last three to four years. A large focus of our investment is essentially acknowledging that the network and the data center are the same thing now.

    DCK: Is OpenStack part of those efforts?

    JC: OpenStack, you name it. We’re running at the moment a number of proof-of-concepts ourselves and with some key customers across the globe. We’ve got a number of test framework architectures, [including OpenStack]. You’d expect it to be.

    DCK: BT is part of Cisco’s One Cloud initiative. How is that going?

    JC: I firmly believe something like that will absolutely come out. Whether it will be completely Cisco-flavored, I don’t know. Cisco are no doubt going to be at the heart of it. It’s their future. It’s wait-and-see. There are a number of standards that Cisco said, “These are the standards.” To truly get to utopia of what their vision is, those standards have to be pretty much agreed to by everyone, and not everyone has bought into those standards yet.

    DCK: To participate in One Cloud, you have to use Cisco’s Application Centric Infrastructure technology. Has BT bought into the ACI vision?

    JC: Are we completely betting that that’s the only way forward? No. Are we engaged? Are we working with them? Absolutely. It’s standards that really drive things. Given the sheer scale and diversity of the customers that we work with, you have to be able to cover a number of technologies and standards.

    4:30p
    Shedding Some Light on Shadow IT Management

    Tom Bice is the Vice President of Product Marketing and Sales Enablement at Attachmate and Novell.

    Shadow IT has been lurking in the dark corners of organizations for years now, but as BYOD and public cloud computing gain traction in the workplace, more and more employees are stealthily adopting their own software and hardware without telling IT. When IT is left in the dark, it is nearly impossible to mitigate potential risks, and in light of the recent barrage of data breaches, executives are becoming increasingly concerned about the issue.

    So what can you do? First and foremost, accept that shadow IT is here to stay. IDC found that the majority of information workers share files via email and other unsecure methods while only a small group, about 10 percent, use a service provided by their company. Rather than trying to squash all instances of shadow IT that pop up in your organization, put a plan in place to help manage it. The following tips can help get you started.

    Monitor Your Network

    If you don’t know where shadow IT may be lurking in your organization, you cannot account for it or secure it. Make it a top priority to continuously monitor your network for new and unknown devices to help pinpoint where shadow IT is occurring and what kind of new devices are being introduced into your organization.
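    As a minimal sketch of what that monitoring might look like in practice, the snippet below compares devices observed on the network against a known-device inventory and flags anything unrecognized. The inventory, addresses, and hostnames here are illustrative placeholders, not from any real environment; a production tool would pull live ARP or DHCP data instead of a hard-coded scan.

```python
# Hypothetical inventory of MAC addresses IT has sanctioned.
KNOWN_DEVICES = {
    "aa:bb:cc:00:00:01": "web-server-01",
    "aa:bb:cc:00:00:02": "db-server-01",
}

def find_unknown_devices(arp_entries):
    """Return (ip, mac) pairs whose MAC is not in the known inventory."""
    return [
        (ip, mac)
        for ip, mac in arp_entries
        if mac.lower() not in KNOWN_DEVICES
    ]

# Example scan: two known hosts plus one device IT has never seen.
scan = [
    ("10.0.0.11", "AA:BB:CC:00:00:01"),
    ("10.0.0.12", "AA:BB:CC:00:00:02"),
    ("10.0.0.99", "DE:AD:BE:EF:00:99"),  # possible shadow IT
]

print(find_unknown_devices(scan))
```

Run continuously, a check like this pinpoints where new, unaccounted-for hardware is appearing so it can be assessed rather than discovered after an incident.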

    Assess the Risk

    When you identify a new public cloud application on your network, evaluate its risk level. Not all applications that haven’t been sanctioned by IT are bad. Allow employees to continue using apps that are low-risk or used for sharing materials that don’t contain sensitive information. For example, consumer-focused file sharing services like Dropbox are just fine for sharing public-facing marketing materials, but they should never be used to share private customer data.

    Although it is important to compromise with business users on which outside applications they can use, you should always prioritize security. Top-tier data security is the biggest priority when considering cloud applications and electronic file management solutions.

    In addition, it is important to take regulatory compliance support into consideration. Always ensure that new tools adhere to the evolving legal standards specific to your industry. The financial services and healthcare and life sciences industries face some of the strictest information security regulations. Healthcare organizations must mitigate risks of noncompliance with the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act. Make sure you have a thorough understanding of the regulations set forth for your industry before approving new applications.

    Determine its Manageability

    IT already has infrastructure and processes in place for managing applications and services, and it must be able to manage new cloud applications using its existing infrastructure. The solution needs to provide IT with provisioning and de-provisioning capabilities to ensure that all employees have the right level of access and it needs to utilize the existing identity management and security infrastructure.

    For example, does the solution work with Active Directory or LDAP, identity management, and single sign-on solutions? Can IT get the level of reporting and visibility required to manage the solution, troubleshoot, and provide compliance and audit reports? All of these elements must be taken into consideration when determining whether an application is manageable.
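    The idea of reusing existing identity infrastructure can be sketched as follows: access to a cloud app is derived entirely from directory group membership, so provisioning and de-provisioning happen through the same process as everything else. The group names, app names, and users below are hypothetical illustrations, not a real directory schema.

```python
# Hypothetical directory groups (what LDAP/Active Directory would hold).
DIRECTORY_GROUPS = {
    "alice": {"marketing", "all-staff"},
    "bob": {"engineering", "all-staff"},
}

# Which directory groups are entitled to each sanctioned cloud app.
APP_ENTITLEMENTS = {
    "file-sharing-app": {"marketing", "engineering"},
    "crm-app": {"marketing"},
}

def provisioned_apps(user):
    """Apps a user should have, derived from current group membership."""
    groups = DIRECTORY_GROUPS.get(user, set())
    return {app for app, allowed in APP_ENTITLEMENTS.items() if groups & allowed}

def deprovision_on_exit(user):
    """Removing a user from the directory revokes all app access at once."""
    DIRECTORY_GROUPS.pop(user, None)
    return provisioned_apps(user)
```

Because access is computed from the directory rather than configured per app, deleting one directory entry de-provisions every integrated service, which is exactly the manageability property the questions above are probing for.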

    Establish Guidelines

    Determine what your company’s policy is going to be for cloud computing. Outlining a policy will help employees understand why it is important to keep IT managers in the loop, underscore the security implications of using unapproved cloud computing services and establish expectations around using outside cloud applications. Be sure to include a detailed policy that leaves no questions unanswered. It is also helpful to identify a list of pre-approved cloud computing services, provide directions on how to use the services and specify what type of information is acceptable to share on each platform.

    Provide Effective Alternatives

    Although it is important to accept shadow IT, most businesses in regulated industries prefer on-site deployment. A study conducted by Hanover Research found that all of the respondents it interviewed from highly regulated industries were looking for on-site file sharing solutions due to the sensitive nature of their documents. Keep in mind, though, that when business users reject company services in favor of cloud-based apps, they aren’t trying to rebel; they are simply trying to do their job more efficiently.

    The most effective way to get employees to use company software is to provide an option that is just as easy to use and productivity enhancing as the outside applications employees are turning to. If you provide the right solution, you can give employees the experience they want while maintaining the security you require.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:57p
    Hortonworks Becomes Official Google Cloud Feature

    Hortonworks’ Hadoop Data Platform (HDP) was made available and supported on the Google Cloud Platform late last week in a major development for cloud Hadoop. Engineers from both companies have collaborated to make it easier to provision HDP clusters on Google’s cloud.

    Highlights of the engineering work include integrating “bdutil” (a command-line script used to manage Apache Hadoop instances on Google Compute Engine) with the Apache Ambari (Hadoop management project) plugin to provision and manage infrastructure, and a Google Cloud Storage connector for HDP, Ajay Singh, director of technical channels at Hortonworks, wrote in a blog post about the announcement.

    The companies have made source code for the integration available for use and open contribution on GitHub.
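    For a sense of what the storage integration involves, the Google Cloud Storage connector is typically registered with Hadoop through configuration along these lines (property names follow the connector’s documentation; the project ID is a placeholder):

```xml
<!-- core-site.xml fragment: register the Google Cloud Storage connector
     so Hadoop jobs can read and write gs:// paths directly. -->
<property>
  <name>fs.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
</property>
<property>
  <name>fs.gs.project.id</name>
  <value>my-gcp-project</value>
</property>
```

With the connector in place, HDP jobs can treat Cloud Storage buckets as a Hadoop filesystem rather than staging data onto HDFS first.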

    Spun out of Yahoo, Hortonworks was one of the first companies to go after the enterprise Hadoop market, turning the open source technology into a software business. The framework enables users to turn cheap commodity servers into powerful compute clusters that can crunch through a lot of data using parallel processing techniques. This partnership makes it easier to use Google’s cloud servers to do so.

    Open source Hadoop forms the foundation of several companies, and Hortonworks is one of the leaders in the space. Other major players include Cloudera and MapR.

    Google’s investment arm, Google Ventures, is a big investor in Cloudera, and so is Intel. Google is also a major backer of MapR.

    Cloudera and Google recently partnered to enable Google’s Dataflow system on Spark, a stream-processing framework for real-time big data analytics.

    Google has an interesting history when it comes to Hadoop. Last July, Google said it stopped using MapReduce, the model it itself created that served as the basis for Hadoop. This did not affect Hadoop’s momentum, however, as seen by the continuing interest and investment going into companies in the space.

    Hortonworks has been keeping busy, with a successful IPO and several partnerships, including one last month with a company called Talend.

    Its cloud Hadoop partnership with Google speaks to several trends: big data functions are moving to the cloud because of economics and flexibility, and enterprises are embracing open source technologies for big data.

    Other big cloud providers, such as Amazon Web Services, have a variety of easily digestible Hadoop setups. Amazon’s Elastic MapReduce (EMR) is a managed service that provides the Hadoop framework on EC2. MapR’s platform is also available on EMR.

    There are also startups like Xplenty providing easy-to-use Hadoop on AWS. It’s possible to deploy Hortonworks on AWS as well.

    “With Google Cloud Platform and Hortonworks Data Platform, enterprises benefit from limitless scalability and an enterprise-grade platform backed by community driven open source innovation,” Singh wrote.

    6:15p
    Expedient Completes Third Data Center in Hometown Cleveland

    Managed services-focused data center provider Expedient has finished the first phase of its third data center in Cleveland, where the company is based.

    The expansion adds to the company’s already sizable Ohio data center portfolio. Besides the three Cleveland facilities, the company has one in the Columbus metro, with another one currently under construction.

    Expedient goes after what are often referred to as secondary, or tier-two, data center markets. These metros are not the traditional data center hotbeds like New York City or Ashburn, Virginia, but there is a lot of demand for data center services there nonetheless.

    The provider has small and mid-size data centers in Boston, Baltimore, Indianapolis, Pittsburgh, and Memphis.

    The most recent Ohio data center expansion brings about 6,500 square feet of raised floor to the market in a 14,000 square foot facility.

    Expedient, whose legal name is Continental Broadband, is a wholly owned subsidiary of Landmark Media Enterprises, a Virginia-based broadcast TV, Internet publishing, and technology services company.

    6:53p
    Data Center Connectivity Briefing

    There have been many developments in global data center connectivity this month. Several U.S. data center providers that focus on secondary markets have signed major carrier deals. Internationally, Telstra now offers superfast connectivity on a group of submarine cables in Asia Pacific.

    Here’s a digest of the news:

    Telstra Launches 100G on APAC Submarine Cables

    Australian telco and IT services giant Telstra has launched 100 gigabit per second connectivity on multiple long-haul submarine cable routes in Asia Pacific. The new 100G wavelength service will cover Japan, Hong Kong, Taiwan, Korea, Australia, and the U.S.

    It will reach the U.S. via UNITY, a cable built by multiple investors, including Google, that connects Japan and the U.S.

    Zayo to Serve Eight EdgeConneX Sites

    EdgeConneX has signed Zayo Group to provide dark fiber, wavelength, and IP services for the data center provider’s customers in eight of its locations: Atlanta, Las Vegas, Memphis, Nashville, Portland, Salt Lake City, San Diego, and Richmond, Virginia.

    EdgeConneX specializes in catering to companies that need content caching at so-called edge locations. It strategically positions data centers near network provider aggregation points to ensure low latency.

    365 Signs TeliaSonera for Six Locations

    TeliaSonera International Carrier will provide 100Gbps data center connectivity services at six facilities operated by 365 Data Centers. The locations are Buffalo, New York, Tampa, Nashville, St. Louis, Cleveland, and Indianapolis.

    365 specializes in serving Tier II data center markets in the U.S. It has 17 locations total. TeliaSonera’s network spans North America, Europe, Asia, and the Middle East.

    H5 Adds Comcast in Denver

    H5 Data Centers has added Comcast Business to the list of network carriers available at its Denver data center campus. Other options there include AT&T, CenturyLink, Level 3, Time Warner Telecom, Verizon, and XO Communications.

    H5 is a secondary-market data center services player with facilities in Denver, Atlanta, Charlotte, Seattle, and San Jose and San Luis Obispo, California.

    7:30p
    Careless and Untrained Insiders Biggest Cybersecurity Threat to Federal Agencies: Report


    This article originally appeared at The WHIR

    Careless and untrained insiders are the biggest cybersecurity threat to federal agencies, outweighing threats by external sources including hackers, according to a report by SolarWinds on Monday.

    This particular finding is consistent with a report SolarWinds released in March, in which 29 percent of respondents said insider data leakage and theft was a top cybersecurity threat to their agency.

    The latest study, which SolarWinds conducted with government market research provider Market Connections, finds that 53 percent of federal IT professionals said “careless and untrained” insiders are the greatest source of IT security threats. This is up from 42 percent in March’s survey.

    More than half (57 percent) believe breaches caused by insiders are as damaging as, or more damaging than, those caused by malicious outsiders.

    A report earlier this month by the Cloud Security Alliance, focused on the private sector, found that 22 percent of organizations have a cloud security awareness training program, which can help prevent insider threats. The public sector may consider this approach as at least one component of a broader prevention and mitigation plan.

    Bring-your-own-device is also part of the internal security concern for federal agencies. According to the report, top causes of accidental insider breaches include phishing attacks (49 percent), data copied to insecure devices (44 percent), accidental deletion or modification of critical data (41 percent) and use of prohibited personal devices.

    Twenty-nine percent of federal IT pros surveyed said that budget constraints are the most significant obstacle to maintaining or improving IT security, down from 40 percent last year.

    While 69 percent of respondents said that they had increased investment over the past two years to prevent external threats, only 46 percent did so for internal threats. Nine percent of respondents even said they decreased investment in insider threat prevention.

    “Contrasting the prevalence of insider IT security threats against a general lack of threat prevention resources and inconsistently enforced security policies, federal IT Pros absolutely must gain visibility into insider actions to keep their agencies protected. However, given the unpredictability of human behavior, the ‘Why?’ of those actions is an elusive query,” SolarWinds VP of product management Chris LaPoint said. “Fortunately, there are IT management solutions that can help identify Who is doing What, and even point to Where and When, empowering federal IT Pros to isolate the threats and address them before the agency’s security is in peril.”

    While the study focuses on internal threats, external threats are not to be ignored.

    Earlier this month, as part of the preview to the State of the Union Address, US President Barack Obama spoke about his intention to introduce new legislation that would make it easier to prosecute cybercriminals and facilitate flow of information between the public and private sectors, something that the government has been trying to do for years.

    In December, the US Department of Justice announced a dedicated cybersecurity unit, which will provide guidance to law enforcement as well as cooperate with private sector partners.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/careless-untrained-insiders-biggest-cybersecurity-threats-federal-agencies-report

    7:40p
    University of Montana Deploys Modular Data Center by CommScope

    After years of being in dire need of a data center upgrade, the University of Montana has completed a $2.6 million modular data center, courtesy of CommScope. The prefabricated modular data center sits in a parking lot, with a slanted metal roof overhead.

    CommScope launched its modular data center product, called Data Center on Demand, in March 2014. It uses evaporative cooling and includes data center infrastructure management software by iTRACS, a DCIM vendor CommScope acquired in 2013.

    Tony Jablonski, associate CIO for IT central computing services at the university, said the school previously had “arguably the worst data center in public education,” according to the university’s IT newsletter. Some servers were moved to the new data center from a basement built in 1972, he said.

    Virtualization efforts that developed over the past few years while the new data center was being planned have reduced the space required from 12 racks to about seven.

    Wanting both to modernize and to anticipate future scalability needs, the university landed on the CommScope solution. No secondary cooling methods are employed.

    The unit also doesn’t have generator backup.


    8:18p
    New Player Building East Coast Data Center for Trans-Atlantic Connectivity

    A newly formed company called New Jersey Fiber Exchange is building an East Coast data center that will provide connectivity to submarine cable systems that link the U.S. to Europe and South America.

    The roughly 50,000 square foot two-story data center will provide colocation suites or cabinets to companies interested in connectivity options available there. Mumbai-based telecommunications giant Tata Communications will provide access to its submarine cables, but the data center will be carrier-neutral.

    NJFX is entering a busy market. There is no shortage of options for reaching submarine cable systems or data center providers that offer them on the East Coast. There are numerous cable landing stations in New York, New Jersey, and Rhode Island, as well as in Massachusetts and Florida, connecting to Europe, the Middle East, Africa, South America, and the Caribbean.

    Some of the ongoing new submarine cable construction projects include a U.S.-Brazil cable being built by Seaborn, which Microsoft recently invested in, and a Hibernia Networks project to build a trans-Atlantic system that connects North America and Ireland.

    The company expects to bring the data center online in 2016.

    “We are building a data center that will serve as a network interconnection point at the easternmost edge of the United States to enable companies to design and construct the most efficient network for their business and ensure the delivery of high-bandwidth applications to serve end-customers,” Gil Santaliz, founder and managing member of NJFX, said in a statement.

    8:31p
    The Fourth Annual Northern California Data Center Summit

    The Fourth Annual Northern California Data Center Summit will be held February 23-25, 2015 at St. Francis Yacht Club in San Francisco.

    Join data center colleagues and develop new relationships while hearing from 40+ speakers in 13 panel discussions and workshops.

    The summit will feature an opening night reception on February 23, full-day conference on February 24 and operations workshop on February 25.


    For more information about this event, visit The Fourth Annual Northern California Data Center Summit website.

    To view additional events, return to the Data Center Knowledge Events Calendar.

