Data Center Knowledge | News and analysis for the data center industry
Wednesday, May 3rd, 2017
3:30p
Dubai Cybersecurity Hub on Hiring Binge, Only Locals Need Apply
By Nour Al Ali (Bloomberg)
In its bid to make the United Arab Emirates city the “safest electronic city in the world,” the Dubai Electronic Security Center (DESC) is looking to expand its Emirati-only workforce by 50 percent this year.
The center, which has a multimillion-dollar budget, monitors and researches potential electronic incidents that may affect government institutions. To do that, the relatively new center, established in 2014, is looking to hire 30 more people, bringing its total workforce to about 90.
“We want to look for U.A.E. nationals because we believe in the national resource,” said Amer Sharaf, director of compliance support and alliances at DESC, in an interview. Foreigners will only work in a consultant capacity, he added. “It’s 100 percent U.A.E. nationals. We are proud of that and want to keep it like that.”
As cybersecurity concerns grow in the region, Dubai has been on high alert. DESC has been focusing on ensuring the security of the transport and energy sectors, especially following the “Shamoon 2” virus that targeted government agencies and companies in Saudi Arabia. Dubai has had no recorded incidents of the virus, Sharaf said.
Some Dubai government institutes have faced email phishing attempts and other small incidents in the last six months, Sharaf said, declining to elaborate further.
DESC is only just beginning to seek publicity, sponsoring conferences and participating in roundtables and career fairs. It is looking to hire individuals with a cybersecurity and forensics background, Sharaf explained. Women account for 35 percent of its workforce.
Finding the appropriate national workforce may prove difficult. “We feel we can do more,” Sharaf said. “There is kind of a shortage. Graduates are coming, but they don’t have hands-on experience,” he said, explaining that DESC is working alongside universities and Dubai’s Knowledge and Human Development Authority to put together a curriculum that familiarizes potential employees with cybersecurity concepts.
DESC has been researching technologies that currently lack regulation, such as cloud computing, autonomous vehicles, and blockchain. It also plans to introduce a cybersecurity strategy for Dubai sometime this month, Sharaf said. It will be rolled out to the center’s government stakeholders, he said, declining to provide further information.
4:30p
How to Make Cloud Integration Easy
By David Flynn, CTO, Primary Data
Many enterprises today are adopting a “cloud first” mentality that advises IT to evaluate whether cloud storage is a workable option for all requests they receive. This is understandable, as the cloud offers many benefits, including facilitating collaborative work, increasing flexibility and agility with elastic performance and capacity, and providing a cost-effective data archive. In fact, Sid Nag of Gartner reports that, “Growth of public cloud is supported by the fact that organizations are saving 14 percent of their budgets as an outcome of public cloud adoption.”
Yet, the implementation of “cloud first” policies remains slow, as Nag also noted, “…the aspiration for using cloud services outpaces actual adoption. There’s no question there is great appetite within organizations to use cloud services, but there are still challenges for organizations as they make the move to the cloud.”
Many obstacles hold enterprises back from the cloud, but with the right data management approach, cloud adoption can be accelerated far more easily. Let’s take a closer look at how.
See What Data Is Hot and What Is Not
Enterprises commonly begin their cloud initiatives with archival, as moving data that is no longer being used carries little risk. However, low risk does not mean cloud archival projects are easy. IT must conduct extensive research into which applications have been retired and where their data is located. They must then identify which storage resources co-host business-critical data and plan migrations around active application use. IT must then schedule and perform the migration to the cloud during off-hours to protect business continuity. This can take time, since data is typically migrated to the cloud over limited internet bandwidth. In fact, some enterprises have even resorted to sneakernets, physically transporting drives, to ensure data can be moved quickly and securely without interrupting business.
A metadata engine makes this process much simpler. As a data management software layer, it can enable enterprises to add cloud storage as another tier in a global namespace. Once the cloud storage is added, the metadata engine can automatically load balance cold data to the new cloud resource, according to policies set by admins. For example, a metadata engine can automatically identify data activity and archive any data that has not been active within a time window that IT defines, such as 30 days, six months, or three years. Data can move between on-premises storage and one or multiple clouds without disrupting an application’s access, even while the data is in-flight.
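To make the policy idea concrete, here is a minimal sketch of a cold-data scan in Python, assuming a POSIX file system that records access times. It is illustrative only: a real metadata engine tracks activity in its own metadata store rather than relying on file system atime (which many systems disable for performance), and the 180-day window and /mnt/primary-tier path are hypothetical.

    import os
    import time

    # Hypothetical policy: anything untouched for 180 days is "cold".
    COLD_AFTER_SECONDS = 180 * 24 * 3600

    def find_cold_files(root):
        """Yield paths under `root` whose last access time falls outside the policy window."""
        now = time.time()
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    last_access = os.stat(path).st_atime
                except OSError:
                    continue  # file vanished or is unreadable; skip it
                if now - last_access > COLD_AFTER_SECONDS:
                    yield path

    # List archive candidates without moving anything yet.
    for candidate in find_cold_files("/mnt/primary-tier"):
        print("cold:", candidate)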
Importantly, a metadata engine can help IT archive data to the cloud more intelligently than typical archival solutions. First, rather than base movement decisions on simple file creation dates, as is common with popular archival tools, a metadata engine can see whether data is being accessed at all (either by applications or users) and keep it on premises if so. Second, data is migrated to the cloud only when movement won’t impact other running applications. This protects business continuity, while allowing archive migration to occur around the clock, without IT intervention. Transfer times can be reduced through WAN optimization techniques that de-duplicate and compress data before it’s sent to the cloud, while security can be ensured through encryption for both in-flight data and data at rest.
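As a rough illustration of those WAN optimization steps, the sketch below hashes file contents to skip duplicates, gzip-compresses what remains, and requests at-rest encryption on upload; in-flight encryption comes from the HTTPS transport boto3 uses by default. It assumes an S3-compatible archive tier with configured credentials; the bucket name and the in-memory dedup index are hypothetical stand-ins for what a real product would persist.

    import gzip
    import hashlib
    import shutil

    import boto3  # assumes configured credentials for an S3-compatible archive tier

    s3 = boto3.client("s3")
    ARCHIVE_BUCKET = "example-archive-bucket"  # hypothetical bucket name
    seen_hashes = set()  # a real system would persist this dedup index

    def archive_file(path):
        """De-duplicate by content hash, compress, then upload with at-rest encryption."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        key = digest.hexdigest()
        if key in seen_hashes:
            return  # identical content already archived; nothing to transfer
        seen_hashes.add(key)

        compressed = path + ".gz"
        with open(path, "rb") as src, gzip.open(compressed, "wb") as dst:
            shutil.copyfileobj(src, dst)  # shrink the payload before it crosses the WAN

        s3.upload_file(
            compressed,
            ARCHIVE_BUCKET,
            key,  # content-addressed key keeps dedup checks cheap
            ExtraArgs={"ServerSideEncryption": "AES256"},  # encrypt at rest
        )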
Application Awareness Delivers Cloud Cost Savings for Active Applications
IBM has reported that, “approximately 75 percent of the data stored is typically inactive, rarely accessed by any user, process or application. An estimated 90 percent of all data access requests are serviced by new data—usually data that is less than a year old.” This means that most of the storage capacity used by applications is being wasted on data that is not being accessed.
Of course, most enterprises would love to extend cloud cost savings beyond retired applications, but it’s easy to understand why they don’t. Three key challenges make it difficult to move cold data associated with a live application to the cloud. First, if applications need the data again, IT must scramble to restore it to on-premises storage. Second, as public cloud providers typically charge for bandwidth to retrieve stored data, enterprises must consider when the cost to restore data outweighs the cost savings. Third, cloud archive data is typically stored as an object, which means that applications must be modified to use retrieved object data.
A metadata engine can resolve all these challenges. It ensures all data in its global namespace remains accessible and can automatically retrieve it should applications need it again. Bandwidth charges can be minimized since a metadata engine can offer the ability to retrieve single files. Conventionally, if a company needed to restore a single file from a backup, they would still need to pay the bandwidth charge to move the entire backup bundle on premises and then rehydrate the bundle to restore the file. These bandwidth charges can be substantial if video and audio files are contained within the dataset. The ability to keep data accessible as files also means that enterprises don’t have to modify applications to use object data.
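The difference is easy to see in code. In this hedged sketch, each file is assumed to land in the archive as its own object under a file-path-like key (the bucket and key names are hypothetical): restoring one file pulls back only that object’s bytes, with no bundle download and rehydration step.

    import boto3  # assumes an S3-compatible archive tier with configured credentials

    s3 = boto3.client("s3")
    ARCHIVE_BUCKET = "example-archive-bucket"  # hypothetical bucket name

    def restore_one_file(key, destination):
        """Fetch a single archived file by key; only this object's bytes incur egress charges."""
        s3.download_file(ARCHIVE_BUCKET, key, destination)

    # Restore one spreadsheet, not the entire backup bundle.
    restore_one_file("finance/q3-forecast.xlsx", "/mnt/primary-tier/q3-forecast.xlsx")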
In its 2017 Roadmap for Storage, Gartner predicts that “by 2021, more than 80 percent of enterprise unstructured data will be stored in scale-out file system and object storage systems in enterprise and cloud data centers, an increase from 30 percent today.” Using a metadata engine to manage data across the enterprise and into the cloud can make this transition simple. Petabyte-scale enterprises gain the ability to automate the movement of data from creation to archival across all storage types, including the integration of public clouds as an active archive. Many core management tasks can also be automated, making it easy for companies to maximize storage efficiency and cost savings with the cloud, while ensuring the performance and protection required to meet service levels.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
6:22p
Rackspace CEO Taylor Rhodes Leaving Company
Taylor Rhodes is leaving Rackspace after three years as CEO of the former hosting and cloud heavyweight that recently pivoted to providing managed cloud services for Amazon Web Services, Microsoft Azure, and other hyper-scale platforms.
Rhodes will be replaced by Rackspace president Jeff Cotten, who is stepping in as interim CEO but whom the company’s board considers “a strong candidate” for the chief executive role long-term.
In a blog post, Rhodes said he has taken the Windcrest, Texas-based company as far as his skill set has allowed, and it’s time to let someone with different knowledge and abilities push it further. He has taken a CEO role at a much smaller company, whose name he did not disclose, describing it this way:
“It’s using cloud technologies to disrupt what has been a very low-tech industry. The company is going through growing pains and needs a CEO who has been through those challenges before.”
Rhodes said his new company is about as big as Rackspace was when he joined 10 years ago.
He was appointed Rackspace’s chief executive in 2014, replacing the then publicly traded company’s co-founder and former CEO Graham Weston. The company made the CEO switch after declining several buyout and partnership offers.
Last August, however, Rackspace went private, bought out by investment management firm Apollo Global Management for $4.3 billion.
Here’s Rhodes on company performance since the buyout:
“I’m proud to have led Rackspace through a hinge in its history, as we seized the leadership of the young and fast-growing market for managed cloud services, and as we went private under the ownership of Apollo Global Management and its partners. We recently reported strong fourth-quarter results to our bond and debt holders. And 2017 is shaping up to be even stronger, as we’re exceeding almost all of the financial targets we established with Apollo and our board.”
In a follow-up blog post of his own, Cotten said Rhodes was leaving Rackspace in “solid condition.” Since the company is now private, it does not report the details of its financial performance, but according to Cotten, its managed cloud business is growing exceptionally well.
Managed AWS and Azure services have grown more than 1,400 percent year over year since they were launched two years ago, he said. The company also recently entered a partnership with Google, expecting to launch managed public cloud services for Google Cloud Platform in the near future. Rackspace also provides private cloud services, using VMware, Microsoft, and OpenStack platforms.
Cotten also said Rackspace is working to launch a data center in Germany. Its current footprint consists of data centers in Dallas, Chicago, Northern Virginia, London, Hong Kong, and Sydney.