Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Tuesday, September 10th, 2013

    Time Event
    12:30p
    Brave New World for Applications: Is it IT + LOB, or IT vs. LOB?

    Pauline Nist is GM Enterprise Software Strategy at Intel Corporation.


    At a recent tech conference, I got to sit on two distinct panels. The first was focused on how business could use analytics (Hadoop) to derive critical insights that drove business impact. The second was on how IT could leverage Hadoop to derive a more complete picture, and analysis of their data.

    I was intrigued as to how many people would attend both sessions and how the questions would vary, but mostly I was amazed that there were two different panel discussions at all. After all, shouldn’t IT AND business be working together? That’s not a rhetorical question. Rather, I think it is an indication of a major shift taking place in corporations today. There is a key enabler behind this, and it’s the public cloud. When you can start experimenting with Amazon Web Services for free, who doesn’t want to try it?

    Cloud: Enemy or Enabler?

    I have heard IT managers half seriously suggest that their corporations shut down the use of corporate credit cards to pay for cloud usage. They see it as a way to get control over data and security issues. But I would argue that’s the least of their problems. The challenge is that Lines of Business (LOBs) aren’t always getting timely responsive solutions out of IT, so they are hiring developers and data analysts and doing the work themselves. The credit card purchase at AWS is the end point, not the beginning.

    Even Gartner is predicting that by 2014 at least 25 percent of new business applications will be built by end users. As Gartner suggests, IT will be better served if it acknowledges this trend and manages the risks by educating and supporting programs that create a safe environment for end-user application developers. I must admit that I’m not seeing a lot of such support happening.

    Attracted to Hadoop

    Additionally, I see Hadoop as a real temptation that encourages many LOBs to connect with one of the companies providing a Hadoop distribution and start a project. In fact, one of my fellow panelists at the aforementioned conference (from one of the Hadoop distro companies) said that he told his sales guys to talk only to the LOBs and not to IT (though not quite as politely as I have paraphrased it).

    This is an age-old trend. Users want the compute cycles and control of the apps. That desire gave rise to minicomputers, Unix workstations, PCs, client/server computing, and now the cloud. IT would be wise to support the process, not ignore it.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    12:40p
    DuPont Fabros Seeks President, With Eye on Succession

    Data center developer DuPont Fabros Technology (DFT) is seeking to hire a successor to President and CEO Hossein Fateh, the company said today. The board has begun a search for an executive to serve as president, with the expectation that this individual will eventually succeed Fateh.

    Fateh has been the public face of DuPont Fabros, a real estate investment trust focused exclusively on the data center sector. He has been a key player in the data center real estate scene for nearly 15 years, steering DFT through significant growth and a successful IPO.

    “I have been thinking for some time and discussing with the Board about the need to expand and strengthen our senior management team and plan for succession,” said Fateh. “Given all that we have accomplished in the last six years, I believe that now is the right time to begin the process to identify a successor and integrate him or her into the company’s management team. The company’s Board fully supports the decision.”

    The new president will initially be focused on corporate strategy, acquisitions and other aspects of business development. Although the company has initiated the search process, the timetable for the succession has not yet been determined. If the candidate meets the board’s expectations, he or she will then succeed Fateh, who will shift to a new role as Executive Chairman. Lammot du Pont, who founded the company with Fateh and is now Chairman of the Board, will then become vice chairman.

    “In the six years since our initial public offering, the company has accomplished as much or more than we had thought possible in terms of development and leasing,” said du Pont. “Our data center portfolio is now 94 percent leased, and our data centers, tenant base and operations team are the envy of the industry, in no small part due to Hossein’s leadership.”

    DuPont Fabros also said that it is selling $600 million in senior notes, which will be used to retire $550 million of existing senior notes paying 8.5 percent. The new notes will mature in 2021.

    1:20p
    Report: T5 has Tenant for North Carolina Project
    The T5 data center at Kings Mountain, North Carolina. (Photo: T5 Data Centers)

    T5 Data Centers has found a tenant for its data center project in Kings Mountain, North Carolina, according to local media. The Charlotte Business Journal reports that T5 has lined up a tenant for a $70 million deal in which the unnamed company will lease half the space in a 150,000 square foot powered shell on the company’s Kings Mountain campus.

    Interim Cleveland County manager David Dear told the Business Journal that T5 has a client, which will request incentives at a county meeting next week.

    The T5 Kings Mountain campus is already home to data centers for AT&T, Walt Disney Co. and Wipro.

    T5 Data Centers has raised a combined $500 million over the past three years. T5 currently offers wholesale data center space in business-critical data center facilities in Atlanta, Los Angeles, Dallas, and Charlotte, with new projects announced in Portland and Colorado Springs.

    Check out the Business Journal story for additional details.

    1:46p
    Datapipe Acquires AWS Monitoring Specialist Newvem

    Managed hosting provider Datapipe has greatly enhanced its managed service for Amazon Web Services with the acquisition of Newvem, a company that specializes in monitoring AWS. The acquisition will result in Datapipe adding new features into its platform around analytics.

    The acquisition means Datapipe will be able to combine its managed IT services with an innovative cloud optimization platform. Datapipe has a security slant, targeting enterprises with high security needs. The addition of Newvem will further appeal to these customers as they look to hybrid cloud, and it brings an experienced team of cloud development and business professionals into the Datapipe fold.

    “Datapipe’s acquisition of Newvem is aligned with our commitment to be the industry leader in managed cloud services for the enterprise,” said Robb Allen, CEO of Datapipe. “Combining Datapipe’s proven enterprise IT and cloud services with Newvem’s cloud optimization platform provides our clients with 360 degree insight into cloud operations and optimization.”

    Newvem brings several capabilities into the Datapipe fold, including:

    • Clear and customizable visibility into AWS purchasing options, costs and usage, allowing customers to forecast and execute IT and cloud budgets with control and confidence.
    • Award-winning AWS optimization, management and support, with rapid response and resolution from Datapipe Managed AWS acting on Newvem insights.
    • A single-provider solution for specific enterprise governance needs, with clear visibility into security, compliance and cost. These tools can be customized to report on individual business units and development teams within an enterprise.

    While Newvem’s focus is on AWS, prior to the acquisition it also launched support for Windows Azure, Microsoft’s cloud platform. Datapipe will continue to support existing Newvem Azure customers, but the focus will remain on AWS.

    “Ensuring that business critical workloads are secure and cost effective over the public cloud is very important to us; added capabilities to forecast costs, as well as identify opportunities to improve usage, are a big part of making our cloud operations successful,” said Jim Mitchell, CEO of Fuhu, a Datapipe customer and creator of the award-winning nabi tablet. “Datapipe enables a 360 degree view of our cloud operations to provide the ability to immediately optimize when needed. No other managed IT or cloud provider can provide this.”

    2:30p
    DCK Webinar: How Big Data Analytics Will Transform Data Center Efficiency

    Data center efficiency is a topic that is always top of mind. Join us on Thursday, September 26 for a special webinar, “How Big Data Analytics Will Transform Data Center Efficiency,” presented by IO lead sustainability strategist Patrick Flynn.

    In this one-hour webinar, Flynn will make the case that the data center is the best and only place we can hope to meet the massive sustainability challenges confronting today’s enterprises and governments. Patrick will describe how data analysis holds the key to energy efficiency, business continuity and a new, better paradigm in performance measurement that will transform infrastructure and IT alike.

    Patrick Flynn is the Lead Sustainability Strategist at IO. He holds an MBA from the MIT Sloan School of Management as well as a BS in Mechanical Engineering from Stanford University. His work includes identifying, prioritizing and implementing a wide array of projects within IO’s operations and product platform. He is a Professional Engineer (HVAC) and a LEED Accredited Professional.

    Title: How Big Data Analytics Will Transform Data Center Efficiency
    Date: Thursday, September 26, 2013
    Time: 2 pm Eastern/ 11 am Pacific (Duration 60 minutes, including time for Q&A)
    Register: Sign up for the webinar

    Following the presentation, there will be a Q&A session with your industry peers and Patrick. Sign up today and you will receive further instructions via e-mail about the webinar. We invite you to join the conversation.

    3:17p
    C7 Opening Newest Utah Data Center

    C7 Data Centers has announced an October launch for Granite Point II, a new 95,000 square foot data center complex in Bluffdale, Utah. The site will have 70,000 square feet of raised floor space and is designed for 10 megawatts. The data center will also have 25,000 square feet of office space to accommodate the high proportion of out-of-state customers that the company attracts.

    While the Utah market is typically seen as a disaster recovery hot spot, C7 says it is unique in that it is attracting a high percentage of production customers.

    “Seventy-five to 80 percent of our customers are out of state, and 80 percent of that is production,” said Wes Swenson, CEO of C7. “Utah has a population of 2.8 million, so we can’t totally depend on the proximate market.”

    The company sees a bright future as the grip of server hugging lessens. “The pressure to reduce the cost to compute is going to increase, making customers open to remotely colocating their infrastructure,” said Swenson.

    C7 has multiple Utah data centers, providing colocation, disaster recovery, cloud and storage solutions. The company can accommodate anything from one rack to thousands of square feet. Its first data center on this campus is a 65,000 square foot facility built in 2010. “We happen to sit in the middle part of the two major population centers, and maybe 3 miles away ‘as the crow flies’ from the NSA data center,” said Swenson.

    Swenson gives several reasons why Utah should be at the top of the list for data center space, and not just for disaster recovery, but for production.

    “We get great operational efficiency,” he said. “The biggest differentiator is that we are in Utah, which has the lowest disaster rates in the U.S. It’s at a higher elevation, and it sits in what they call a cold desert. We’re able to use ambient air almost nine months out of the year. We also have some of the lowest power pricing at 3-5 cents per kilowatt-hour – tremendous power pricing. Utah is number 5 or 6 in natural gas and oil shale, so there’s no transportation cost for that energy. Utah can basically produce its own energy, meaning lower power pricing.”

    The company hopes to achieve a low power usage effectiveness (PUE) thanks to the ambient air the Utah climate provides and the type of equipment used; it anticipates a PUE of 1.2. The data center has a 24-inch plenum and uses Big Ass Fans (that’s a brand, not a descriptor. Okay, it’s a descriptor, too).
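    For reference, PUE is simply total facility power divided by IT equipment power, so a 1.2 target means roughly 20 percent overhead on top of the IT load. A minimal sketch of the calculation (the 12 MW / 10 MW figures below are hypothetical, chosen only to match the target; they are not numbers reported by C7):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical: a facility drawing 12,000 kW overall to run a 10,000 kW IT load.
print(pue(12_000, 10_000))  # -> 1.2
```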

    “The facility has a very specialized cold row containment system,” said Swenson. “Large fans on the ceiling push the hot air off faster. There’s a 36-inch raised cold floor. There’s no limit to the kilowattage that we can cool to the cabinet. We can cool virtually any space.”

    Because of this, Swenson says customers can use more of each rack and ultimately need less space, saving money. He credits this combination of Utah’s location and the facility’s design for the company’s appeal to out-of-state customers colocating production infrastructure.

    The design of the data center has been in development for the last two years, and construction began in February of 2013.

    “This data center is the culmination of years of listening to our customers, combined with our vision of what a modern data center should be,” said Swenson. “It is truly world class and state of the art in every respect with a vision for the future. In this day and age, businesses are experiencing rapid IT growth but their data centers are aging.  Granite Point II is truly a modern data center, built at the foot of the Wasatch Mountains, and will provide a terrific product for our demanding customers experiencing growth.”

    7:00p
    What Personal Cloud Means for Consumers And Enterprises
    The evolution of cloud computing may include a new concept, the personal cloud, which is correlated to a user rather than an enterprise.

    As IT infrastructure continues to evolve, a new type of data delivery platform will begin to rise. Driven by IT consumerization and the ever-growing number of personal end-points, the personal cloud will start to make its presence known.

    Here is the reality: with so much emphasis on the end user and service delivery, this is shaping up to be the year of the personal cloud. The name is not yet established, but users are already demanding that their data, files, personalization settings and even applications be able to travel with them.

    Cloud, virtualization and consumer technology vendors are working hard to find a way to make hardware, operating systems and even location a non-issue. Already, we have technologies like Citrix’s XenMobile, which aims to both lock down and empower the end-user. From the cloud’s perspective, we have large data centers and cloud providers like Amazon and Google which are quickly trying to become your “everything-as-a-service” host.

    We already have a lot of cloud models out there. In reality, the future of the cloud will heavily rely on good connectivity methods, open APIs, stack-based platforms and the ability to deliver services quickly. Still, within the overall cloud definition, we will have smaller subsets which focus on specific delivery methodologies. In the world of cloud technologies aimed at the end-user, there really are two major levels:

    Corporate Cloud

    From a management perspective, organizations may one day suffer from cloud sprawl. We’ve seen it with servers, virtual machines, and now it’s happening with the cloud. Moving forward, single-pane of glass management solutions will allow organizations to unify their cloud platform and even enhance delivery. So, what does this mean for the organization?

    • Controlled data delivery to various end-points.
    • Granular device interrogation abilities.
    • Advanced data loss prevention (DLP) engines.
    • Visibility into how data is flowing into and out of the environment.
    • Complete information centralization with managed delivery options.
    • Mobile/Enterprise Device Management (MDM/EDM) capabilities.

    We can’t stop IT consumerization. We also can’t stop the growing number of data points and information that is becoming available. Finally, organizations can no longer restrict users based on their devices or where they access their data from. The focus has shifted to empowering the end user through intelligent corporate cloud control methods. Unless the device is corporate-owned, why take ownership of it? The corporate cloud delivers data, applications and services via a secure app or connection down to the end-user device. When the user is no longer with the company, only the data is de-provisioned. The device once again becomes just a user end-point.

    Personal Cloud

    From a consumer perspective, cloud computing is a powerhouse when it comes to delivering data. Many users don’t really understand just how complex and secure some cloud solutions can be. Moving forward, vendors will be developing cloud solutions which will “virtualize” the user and upload their personality into the cloud. It’ll be secure, completely private, and create an extremely smooth computing experience. By creating that user layer, vendors can manage how the end-points process information and where it’s being sent. How does this benefit the typical user?

    • Enjoy the same experience regardless of hardware device or OS.
    • Deploy new devices in seconds by downloading all of your settings, files and personalization information.
    • Know that if a device is lost, remote wipe capabilities are available and that the data is safe.
    • Intelligent personal cloud security will scan the connection as well as the end-point, see if it’s secure and always deliver data over a secured connection.

    There needs to be a shift in thinking about how the typical user interacts with his or her corporate data. Many organizations are much more flexible with an employee’s work schedule, which means that work and life are more interspersed. This also means that a user is going to be accessing information from personal devices to stay productive. Instead of restricting him or her to only a few approved devices, organizations can now control their data and interface with personal devices and the personal cloud.

    There’s no doubt that these technologies will continue to grow. Even now, the average user may be utilizing 3-4 devices to access corporate data. This new type of cloud model can have benefits for organizations, users and cloud vendors. As more devices join the cloud, there will be requirements for more infrastructure, more management and of course – more security.

    Corporations will strive not only to control their cloud presence, but also to ensure that it’s delivering maximum ROI. For the user, it comes down to ease of use and a positive computing experience. Personal cloud technologies can already be seen in corporate-controlled platforms like DataNow and ShareFile. Now imagine these same platforms but with more user settings and greater delivery capabilities. This may be a simple resource like email or more complex workloads like entire desktops. Regardless of the delivery method, one thing is certainly clear: there are more devices, more data, and a greater need to access everything from anywhere.

    To keep up with Data Center Knowledge’s coverage of the growth of cloud computing, bookmark our Cloud Computing channel.

    7:23p
    Joyent Co-Founder and CTO Jason Hoffman Steps Down

    Brought to you by The WHIR.

    Joyent co-founder Jason Hoffman stepped down as chief technology officer of the cloud computing company this week.

    Hoffman co-founded Joyent 10 years ago, and since then, Joyent has become a leading cloud company, recently recognized in Gartner’s Magic Quadrant as “exceptionally innovative from a technology perspective.” Bryan Cantrill will take over as CTO, and Hoffman said he will stay “closely connected to Joyent” as an advisor.

    In June, Joyent launched its cloud object store platform Manta to help enterprises process big data faster and more securely. Hoffman says the launch of Manta puts Joyent in a strong position for growth.

    “I believe that Joyent’s Manta compute on storage innovation will disrupt the storage and big data analytics markets, even as it is just starting to get major traction. Over the past 60 days since launch I’ve watched as it has captured the imagination of our customers, partners, the media and the developer community at large,” Hoffman wrote in a blog post announcing his departure. “It’s been particularly gratifying to watch as the community explores the endless possibilities of Joyent Manta, often blogging and tweeting about the business benefits and use cases they’ve discovered. Now, more than ever, I’m confident that Joyent is ready to lead and define the future of cloud computing in support of the real-time web and mobile applications of the 21st century.”

    Last year, Joyent named Henry Wasik CEO in order to accelerate its cloud growth, bringing 20 years of senior leadership experience to Joyent.

    Original article published at: http://www.thewhir.com/web-hosting-news/joyent-co-founder-and-cto-jason-hoffman-steps-down

    8:00p
    IBM Targets High-Density, HPC Markets With NeXtScale Server Line

    IBM’s NeXtScale System is the newest addition to IBM’s x86 portfolio. The flexible computing platform provides three times as many cores as current one-unit rack servers. (Photo: IBM)

    High density data centers often feature racks filled with blue blinking lights. Are cloud builders ready to populate those racks with Big Blue servers?

    IBM today introduced the NeXtScale System, a new x86 computing platform designed to bring “the power of a supercomputer in any data center.” IBM says the new servers combine high density with improved power efficiency, and can operate in data center environments as warm as 104 degrees F. That’s a scenario seen most often in hyperscale server farms, which represent a growing chunk of server sales.

    “NeXtScale is designed to deliver raw throughput and performance, and is positioned well to handle HPC, cloud, grid, and managed hosted workloads,” said Kevin Rozynek, NASA Client Executive at IBM Business Partner Direct Systems Support. “In addition, this new system provides clients a great deal of flexibility in configuration and components, making it one platform that can do it all.”

    The NeXtScale servers leverage Intel’s new Xeon E5-2600 v2 processors, which were officially introduced today at the Intel Developer Forum. NeXtScale incorporates up to 84 x86-based systems and 2,016 processing cores in a standard 19-inch rack, and uses industry-standard components including I/O cards and top-of-rack networking switches.
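    A quick sanity check on those rack figures: 2,016 cores across 84 systems works out to 24 cores per system, which, assuming dual-socket nodes (a common configuration, not stated explicitly here), lines up with 12-core Xeon E5-2600 v2 parts:

```python
cores_per_rack = 2016
systems_per_rack = 84
sockets_per_system = 2  # assumed dual-socket nodes

cores_per_system = cores_per_rack // systems_per_rack
cores_per_socket = cores_per_system // sockets_per_system
print(cores_per_system, cores_per_socket)  # -> 24 12
```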

    Form Factors Include Rolling Racks

    NeXtScale is configured to be flexible to meet a range of data center requirements, and can be purchased as a single node, an empty or configured chassis, or in full racks as a complete pre-tested IBM Intelligent Cluster solution.

    IBM also provides a software stack to run atop of NeXtScale, including IBM General Parallel File System, GPFS Storage Server, xCAT, and Platform Computing, providing powerful scheduling, management and optimization tools.


    On the storage front, IBM today introduced the x3650 M4 HD, an enhancement of its 3650-class system featuring first-in-class 12 Gb/s RAID and a 60 percent higher spindle count for denser storage and higher I/O performance, making it ideal for applications such as big data and business-critical workloads.

    IBM NeXtScale and System x3650 M4 HD are part of a broad refresh of the entire System x core server portfolio of two-socket systems to incorporate the Xeon E5-2600 v2 product family, including System x racks and towers, Flex System, iDataPlex, and BladeCenter offerings.

