Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
Wednesday, February 26th, 2014
| Time | Event |
| 1:00p |
Forsythe Enters Data Center Development With Chicago Project
An illustration of Forsythe Data Centers’ plan for a new data center in Elk Grove Village, Ill. (Image: Forsythe)
IT infrastructure integrator Forsythe Technology is entering the data center development business. The company plans to build a 221,000 square foot facility in the suburban Chicago market in Elk Grove Village, Ill., where it will offer data center suites with dedicated infrastructure. Forsythe expects to break ground in March and have the facility ready for occupancy in early 2015.
Forsythe has a long history in IT consulting and data center engineering. With its new data center, the company seeks to blend elements of the colocation and wholesale data center markets. It’s offering 56 private client suites that will be 1,000 square feet each, compared to the 12,000 square feet seen in suite offerings from wholesale providers. Customers of that size have historically wound up in cages within colocation facilities. Each suite will have its own UPS support and a dedicated cooling system.
With this “Retail+” approach, Forsythe hopes to bring wholesale-style infrastructure to colo-sized requirements. It’s splitting the difference with projected lease terms of five years, compared to three years for colo and up to 10 years or more for wholesale.
A Flexible Approach
“Forsythe’s facility offers the flexibility and agility of the retail data center market, in terms of size and shorter contract length, with the privacy, control and density of large-scale, wholesale data centers,” said Albert Weiss, president of Forsythe Data Centers and executive vice president and CFO of Forsythe Technology.
The vice president of data center development for Forsythe Data Centers is Steve Harris, who has headed Forsythe’s data center engineering team since 2002, and also worked in the data center division at Comdisco.
“After many years of listening to the needs of our clients and helping them build data centers and find colocation data center space, we made sure this data center facility has everything they have been asking us for,” said Harris.
Continued Growth in Chicago Suburbs
The Forsythe project adds to the growing cluster of data centers in the western suburbs of Chicago. DuPont Fabros, Equinix and CenturyLink all have facilities in Elk Grove Village, while Digital Realty Trust has a major project in Franklin Park, Ascent has a site in Northlake, and Latisys and Server Farm Realty have projects in Oak Brook. Single-tenant facilities in “Chicagoland” include large data centers for Microsoft and the CME.
The Forsythe facility is being designed to comply with U.S. Green Building Council LEED certification standards for data centers and to obtain Tier III certification from the Uptime Institute. Forsythe will move its multi-vendor Technology Evaluation Center and Integration and Configuration Center from its corporate headquarters in Skokie to the new data center. The evaluation center will be about 2,500 square feet, while the integrated technology center will use about 14,000 square feet of the completed building.
Forsythe’s data suites will be designed with redundant UPS configurations and air conditioning systems, with the density choices ranging from 200 to 300 watts per square foot.
“Each suite will have a dedicated power infrastructure with 2N UPS and dedicated cooling infrastructure with N+1 precision air conditioning through Emerson Network Power,” said Harris. “Generator backup is building-based and shared across suites until a client reaches 4,000 square feet and then generator support is private.”
The 1,000 square foot data suite will have an adjacent 670 square foot equipment gallery housing the UPS and air conditioning units. Clients with larger requirements can either lease multiple suites, or Forsythe can remove walls to combine floor space up to 4,000 square feet.
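As a back-of-the-envelope check on the published figures, the 200-300 watts per square foot design density implies a critical IT load of roughly 200 to 300 kW per 1,000 square foot suite, and up to four times that for a combined 4,000 square foot space. A minimal sketch of that arithmetic (figures from the article; the function name is ours):

```python
# Back-of-the-envelope capacity check using figures from the article.
SUITE_SQFT = 1_000                 # one private client suite
MAX_COMBINED_SQFT = 4_000          # walls removed, four suites joined
DENSITY_W_PER_SQFT = (200, 300)    # published design density range

def critical_load_kw(sqft: int, w_per_sqft: int) -> float:
    """Critical IT load in kW for a given floor area and design density."""
    return sqft * w_per_sqft / 1_000

low = critical_load_kw(SUITE_SQFT, DENSITY_W_PER_SQFT[0])           # 200.0 kW
high = critical_load_kw(MAX_COMBINED_SQFT, DENSITY_W_PER_SQFT[1])   # 1200.0 kW
print(f"One suite: {low:.0f} kW; max combined space: {high:.0f} kW")
```

This is the raw IT load only; supporting UPS and cooling gear lives in the adjacent 670 square foot equipment gallery.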
Contractors working with Forsythe include Environmental Systems Design, Inc., Duke Realty, Turner Construction, Emerson Network Power, Anixter and Commonwealth Edison. | | 1:30p |
7 Tips for Successful Governance of a System Integration Project Glenn Johnson is the U.S. Senior Vice President at Magic Software Americas. He is the author of the blog “Integrate My JDE” on ittoolbox.com and contributor to the Business Week Guide to Multimedia Presentations. He has presented at Collaborate, Interop, COMMON, CIO Logistics Forum and dozens of other user groups and conferences.
 GLENN JOHNSON
Magic Software Americas
Application portfolio managers and business analysts in an IT environment are being asked to redesign business processes based on the new capabilities enabled by mobile, social and cloud computing. Despite pressure from line-of-business managers for new strategies and approaches to traditional business issues, many IT departments resist change due to risk aversion, financial restraints and short-term thinking. In many cases they lack the agility within their IT infrastructure needed to update systems efficiently, manage data securely and integrate processes effectively across the enterprise.
System integration projects that leverage new social, mobile and cloud capabilities risk becoming overly complex, lengthy and expensive. By adopting a reasoned approach to IT governance, you will dramatically increase your chances of success. Here are seven tips for successful governance of a system integration project:
1. Recruit executive sponsors and keep them on board
Every project needs a champion – someone who will advocate for the project, understands its benefits and can corral cooperation from change-weary managers and employees. An executive sponsor brings the vision and energy necessary to effectively instill corporate change. Fortunately, enthusiasm for mobile, social and cloud computing strategies remains high, and IT will often find a refreshing willingness among line-of-business and C-level executives in sales, marketing, manufacturing, finance and operations that was not always present in past IT initiatives aimed at more mundane needs, such as service-oriented architecture, whose benefits required a technical viewpoint to appreciate. Your executive sponsor or champion is your key partner in any systems integration project. Get their input from the outset and utilize their clout within the organization to get the cooperation you need.
2. Plan for incremental success
Regardless of whether you use agile or scrum project management methodologies, early and frequent successes are essential. Too many projects are cancelled midstream simply because their results are shrouded in mystery. You may want to have an initial proof-of-concept project that shows the viability of your underlying integration approach, so that you can keep the project moving forward to the final result without suffering waning enthusiasm. Plan for a series of small successes that put reasonable results in front of all interested parties to the project.
3. Never overlook security
While it will be tempting to strike quickly and beat the competition in delivering new customer experiences based on social, mobile and cloud computing, never put your or your customers’ data, privacy or security at risk. Never. There are several smart decisions you can make to improve your chances of a secure system integration project: keeping the integration server behind the firewall, encrypting all transport layers, and leveraging a strong mobile device management (MDM) platform are among the more obvious approaches. Follow the requirements of your security plan and adjust it as needed to anticipate new threats.
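On the “encrypt all transport layers” point, here is one minimal illustration of the client-side posture using Python’s standard ssl module: refuse unverified certificates, verify hostnames, and disallow legacy protocol versions. The helper name is ours, not from the article; it is a sketch of the principle, not a complete security plan.

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """A client-side TLS context that refuses unverified or downgraded
    connections (hypothetical helper illustrating the security tip)."""
    ctx = ssl.create_default_context()  # loads the system CA bundle
    ctx.check_hostname = True           # hostname must match the certificate
    ctx.verify_mode = ssl.CERT_REQUIRED # server certificate is mandatory
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3/early TLS
    return ctx

ctx = strict_client_context()
print("verify:", ctx.verify_mode, "min version:", ctx.minimum_version)
```

Any HTTP client or integration-server connector that accepts an `SSLContext` can then be handed this context so every outbound hop is encrypted and authenticated.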
4. Approach integration systematically
Too many enterprises today rely on manual, point-to-point programming to integrate enterprise systems. While this is manageable in the short term with a small number of systems involved, it becomes virtually impossible in the context of cloud, social and mobile integration. For one thing, the discrete nature of mobile business processes is leading to an explosion in the number of apps. This is compounded by an equivalent explosion in the number of APIs, all of which must be reconciled through data transformation and messaging approaches. Choose an integration platform with a good balance of application adapters and technology adapters that can manage the communications needed between all of the APIs in your enterprise IT environment. | | 1:40p |
CenturyLink Boosts Cloud Offerings With Flash, Beefier Instances CenturyLink is boosting its cloud portfolio with beefed-up server instances and an expanded data center footprint. The company is introducing Hyperscale, a high-performance compute and flash storage service for web-scale workloads, Big Data and application development. The company also announced it is expanding its public cloud data centers from nine locations to 13 in the first half of 2014.
Starting in March, new cloud data centers in Santa Clara, California and Sterling, Virginia will be online and ready for customer use. The company is expanding its European footprint, adding locations in Paris and the London metro market, expected to come online by the second quarter of 2014. All of CenturyLink’s cloud services, including the new Hyperscale, will be available in all of these new locations. The increased footprint helps customers geo-target their solutions better.
Hyperscale: Beefed-Up Server Instances
Hyperscale is a self-service offering that allows developers to create and run mission-critical applications in the public cloud on beefed-up server instances. Hyperscale addresses the new breed of web-scale applications and Big Data needs.
“New applications are crucial to delivering a competitive advantage for enterprises, and Hyperscale is the ideal service for these workloads,” said Jared Wray, CenturyLink Cloud chief technology officer, CenturyLink Technology Solutions. “CenturyLink continues to bring developers and IT together with this new capability. Developers get self-service and lightning-fast performance for popular NoSQL platforms, and IT can easily use our cloud management platform for governance and billing.”
Hyperscale uses 100 percent flash storage for increased IOPS (input/output operations per second) performance. IOPS is usually the point where cloud performance suffers, so many providers have been turning to flash storage to solve this problem. The company claims that users will consistently see performance at or above 15,000 input/output operations per second for a diverse range of workloads. Typical use cases for Hyperscale are intensive web-scale architectures built on Couchbase, MongoDB and other NoSQL technologies.
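For intuition about what an IOPS figure measures, the sketch below times a loop of random 4 KiB reads, the usual unit behind such claims. Note the caveat: this toy loop will mostly hit the OS page cache, so its numbers are wildly optimistic; real storage benchmarks use tools like fio with direct I/O against the actual device.

```python
import os
import random
import tempfile
import time

BLOCK = 4096          # 4 KiB, the typical unit behind quoted IOPS figures
FILE_BLOCKS = 2_048   # an 8 MiB scratch file
READS = 2_000

# Build a scratch file filled with random bytes to read from.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * FILE_BLOCKS))
    path = f.name

fd = os.open(path, os.O_RDONLY)
start = time.perf_counter()
for _ in range(READS):
    # Seek to a random block, then read one block: one "I/O operation".
    os.lseek(fd, random.randrange(FILE_BLOCKS) * BLOCK, os.SEEK_SET)
    os.read(fd, BLOCK)
elapsed = time.perf_counter() - start
os.close(fd)
os.remove(path)

iops = READS / elapsed
print(f"~{iops:,.0f} random 4 KiB read IOPS (page-cache inflated)")
```

The gap between a number like this and sustained uncached performance is exactly why all-flash backends, such as the one Hyperscale claims, matter for random-access NoSQL workloads.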
The high performance server instances are also ideal for Big Data applications, and are complementary to CenturyLink’s existing Big Data Foundation Services, which offer a series of options and managed services to address a broad set of enterprise use cases. Hyperscale is an important piece in a portfolio that includes public cloud, colocation, managed services and network solutions. | | 2:00p |
Intel Broadens Strategy for Mobile and Internet of Things At Mobile World Congress this week in Barcelona, Intel (INTC) expanded its portfolio of computing and communication assets for the smallest of devices to the most complex mobile networks. Intel President Renee James set the mobile vision in motion, detailing how its products are set to compete in today’s mobile ecosystem and shape the next era of computing – the Internet of Things (IoT).
“The continued growth of the mobile ecosystem depends on solving tough computing challenges – unlocking data’s potential while securely and reliably connecting billions of devices with leading edge computing and communications technologies,” said James. “Today we are announcing leading communications products as well as new computing platforms. As a result, Intel is well-positioned to shape the future of mobile computing and the Internet of Things.”
New Atom Processors and LTE-Advanced Communications Platform
Intel introduced the 2.13 GHz Atom processor Z3480, formerly code-named “Merrifield,” for the Android smartphone and tablet market. Based on a 22 nanometer Silvermont microarchitecture, the 64-bit ready SoC delivers best-in-class compute performance for the mainstream and performance segments, and solidly outperforms the competition in compute-intensive application, web application and light media editing performance. It is the first Intel Atom SoC to feature the new Intel Integrated Sensor Solution, which efficiently manages sensor data to keep applications smart and contextually aware even when the device is in a low-power state. The new processor also features a PowerVR Series 6 Graphics IP core from Imagination Technologies and is designed for simple pairing with the Intel XMM 7160 LTE platform.
“Sixty-four bit computing is moving from the desktop to the mobile device,” James said. “Intel knows 64-bit computing, and we’re the only company currently shipping 64-bit processors supporting multiple operating systems today, and capable of supporting 64-bit Android when it is available.”
Looking ahead to its next-generation 64-bit Intel Atom processor, code-named “Moorefield,” Intel said it will add an enhanced GPU and two additional Intel architecture (IA) cores for up to 2.3GHz of compute performance. It is expected to be available in the second half of 2014. Moorefield is optimized for Intel’s XMM 7260 LTE platform. Announced Monday, the new Intel XMM 7260 delivers competitive LTE-Advanced capabilities including carrier aggregation (supporting 23 CA combinations in a single chip), category 6 speeds and support for TDD LTE and TD-SCDMA, which expands the addressable market. Meanwhile the earlier XMM 7160, now certified to run on 70 percent of LTE networks worldwide, is expanding to connect a range of products spanning smartphones, tablets, 2 in 1s, Ultrabook systems and more. Customers currently shipping or planning to launch devices featuring Intel’s LTE platforms include Acer, ASUS, Dell, Lenovo and Samsung, among others.
“We are entering 2014 with a very competitive mobile portfolio spanning application processors and communications platforms that will only get stronger,” said Hermann Eul, vice president and general manager of Intel’s Mobile and Communications Group, during the press conference. “Our new Atom processors for Android smartphones and tablets offer leading 64-bit performance and battery life, and the new 7260 platform gives the ecosystem a compelling LTE-Advanced experience.”
Intel-Based Mobile Devices
Signaling the expanding availability of tablets and smartphones powered by Intel Atom processors and connected by Intel communications, James announced three new multiyear agreements with leading device manufacturers for Intel-based mobile devices. Intel announced plans with Lenovo for mobile devices, including incorporating Intel LTE connectivity into some Ultrabook and multimode designs. ASUS announced it will bring a full portfolio of Intel-based smartphones and tablets to market this year. At Mobile World Congress, ASUS unveiled the ASUS Fonepad 7 LTE (ME3762CL) featuring an Intel Atom processor and Intel LTE connectivity. Dell and Intel are expanding the long-standing collaboration between the two companies to include a range of innovative tablets that started with the introduction of the Dell Venue line in fall of last year. Intel-based products from Dell will span Android and Windows solutions.
Intel also announced expanded relationships with Alcatel-Lucent and Cisco to accelerate network function virtualization (NFV) and software defined network (SDN) technologies. Renee James also highlighted numerous Intel-based trials with global operators including China Mobile, SK Telecom and Telefonica that are demonstrating the benefits of NFV and SDN for enabling personalized and contextually aware services, improving asset utilization, and simplifying installations and upgrades. | | 2:30p |
Microsoft Azure Now Available in 2 Japanese Regions Microsoft’s Windows Azure cloud is now available in Japan in two regions: Japan East (Saitama Prefecture) and Japan West (Osaka Prefecture). Microsoft has invested more than $15 billion in its cloud infrastructure, providing more than 200 cloud services to more than 1 billion customers in 90 markets worldwide.
“These new regions will help fulfill the current and future needs of our cloud customers with secure and highly available services that help them grow their business,” wrote Takeshi Numoto, Corporate Vice President, Cloud and Enterprise Marketing, Microsoft. “In addition, they provide local customers the ability to achieve data residency and realize data recovery scenarios, as data will be replicated between the two regions.”
Japan is an interesting cloud market. It is considered a separate entity from the general Asia Pacific market because of its insular business culture: Japanese businesses want their cloud computing hosted within the borders of Japan, and companies serving Japan need to host in-country to be successful. IDC estimates the Japanese cloud market at around $1.6 billion (170 billion yen) for 2014. In Japan alone, storage usage for Windows Azure has grown 10-fold in the last 15 months.
According to Numoto’s blog post, demand for Windows Azure is increasing so significantly that the company is doubling capacity every six to nine months, with more than 1,000 customers a day signing up for Azure.
“As we continue at this growth rate, we will work with our customers and partners to ensure that we provide the value and support needed,” wrote Numoto. “We look forward to growing Japan’s cloud market, offering customers new options while helping push cloud adoption forward across the globe.” | | 3:00p |
Oracle Acquires BlueKai Marketing Cloud Oracle acquires marketing cloud provider BlueKai, Pivotal is joined by industry leaders to establish an open governance model for Cloud Foundry, and VMware launches its vCloud Hybrid Service in Europe.
Oracle acquires BlueKai. Oracle (ORCL) announced it has signed an agreement to acquire cloud-based marketing provider BlueKai. The BlueKai solution includes its Data Management Platform, which centrally organizes a company’s customer and audience data in the cloud to help implement personalized marketing campaigns across all channels and deliver better results and higher marketing ROI. Terms of the deal were not disclosed, but it was reported that Oracle paid more than $400 million. BlueKai also runs the world’s largest third party data marketplace to augment a company’s proprietary customer data with actionable information on more than 700 million profiles. Combining Oracle’s Marketing and Social solutions with BlueKai will give customers the ability to build the richest user profiles combining information from first party and third party sources including media, advertising, social, and mobile sources. BlueKai will be integrated with both Responsys for B2C marketing automation and Eloqua for B2B marketing automation. “Modern marketers require new ways of acquiring, centralizing, interpreting, and activating customer data across marketing channels so that they can enhance the customer experience and maximize the return on their marketing spend,” said Steve Miranda, Executive Vice President, Applications Development, Oracle. “The addition of BlueKai to the Oracle Marketing Cloud enables marketers to act on data across both known customers and new audiences and precisely target customers with a personalized message across all channels.”
Pivotal proposes open governance model for Cloud Foundry. Pivotal announced that it will move to establish a formal open governance model for the Cloud Foundry open source project. Founding Platinum sponsors of the independent non-profit foundation for the Cloud Foundry project include EMC, IBM, HP, Rackspace, SAP and VMware, with ActiveState and CenturyLink joining as Gold level founding sponsors. “As software continues to disrupt every aspect of business, enterprises want the ability to develop and deploy applications and have the freedom to seamlessly deploy those applications across a wide variety of cloud-based platforms,” said Paul Maritz, CEO, Pivotal. “The foundation for Cloud Foundry will bring together industry leaders committed to the growth of the open PaaS movement, working in concert toward the development of an open cloud architecture that will enable a broad, open ecosystem that will allow many to contribute and benefit, creating applications and services that have major impact on business and our everyday lives.” In a blog post Pivotal CEO Paul Maritz discusses how Cloud Foundry has evolved from an open source project into a true open platform ecosystem.
VMware vCloud Hybrid launched in Europe. VMware (VMW) announced the general availability of VMware vCloud Hybrid Service in Europe from a data center in Slough, UK. Complementing VMware’s existing US data centers, the Slough data center provides customers with a European location that addresses UK and EU compliance and data sovereignty demands. A recent survey of 200 VMware enterprise customers in the UK conducted by Vanson Bourne, an independent and specialist market research provider for the global technology sector, revealed that 86 percent of respondents believe that it is important to ensure their business critical data is stored with a UK-based cloud service provider. ”Customer response to the UK public beta of VMware vCloud® Hybrid Service™ has been tremendous,” said Bill Fathers, senior vice president and general manager, Hybrid Cloud Services Business Unit, VMware. “VMware is uniquely positioned to deliver the best of both worlds: the compelling economics and agility of a public cloud, yet fully compatible with a customer’s existing data center, applications, management tools, networking and security. As a result, VMware vCloud Hybrid Service solves key business issues allowing organizations to seamlessly extend their data centers to the cloud — linking both private and public cloud together to create a truly hybrid cloud.” | | 3:30p |
Tilera Unveils Many-Core Appliance for Security Apps Tilera has announced the availability of the TILEmpower-Gx72 FR, a high-availability platform powered by Tilera’s TILE-Gx72 processor and delivering 80Gbps of wire-speed Ethernet I/O. (Photo: Tilera)
At the RSA Conference in San Francisco this week Tilera launched several security products and solutions with partners to highlight the power and speed of its processor portfolio.
Aimed at large network operators, Tilera announced the availability of the TILEmpower-Gx72 FR, a high-availability platform powered by Tilera’s TILE-Gx72 processor and delivering 80Gbps of wire-speed Ethernet I/O. The new platform integrates high-availability features including redundant, hot-swap power supplies and redundant, field-replaceable fan trays, assuring reliable operation in mission-critical environments. In addition, the platform hardware design is FIPS 140-2 ready, dramatically accelerating time to market for companies deploying network security applications.
“Our customers are demanding a turnkey platform that accelerates their time to market for security-centric applications,” said Devesh Garg, CEO of Tilera. “The quantity and diversity of internet threats is exploding with increasing complexity, so time-to-deployment for both software and hardware is critical for our customers. With the combination of Tilera’s TILEmpower-Gx72 FR platform and the available IPsec, SSL and deep packet inspection (DPI) application toolkits from third party vendors, we dramatically accelerate our customers’ ability to deploy a high-performance, scalable solution.”
The TILEmpower-Gx72 FR platform includes redundant power supplies, FIPS 140-2 level 3 chassis design, up to 80 Gbps of network I/O and 32GB of DDR3 memory with ECC, two internal solid-state discs, and an integrated baseboard management controller for remote management.
Integrated SSL/TLS Security Solution on a Single Chip
Tilera launched a fully integrated secure sockets layer/transport layer security (SSL/TLS) security stack powered by INSIDE Secure’s (INSD) industry-proven MatrixSSL toolkit and tuned to run seamlessly on the entire family of TILE-Gx processors. Only a fraction of the TILE-Gx cores are consumed in the integrated solution, freeing the remainder for developers to build advanced networking functions such as DPI, intrusion detection and prevention (IDS/IPS), and virtual switching.
“We are seeing increasing demand from our customers for high-performance SSL processing,” said Bob Doud, director of marketing, Tilera Corporation. “The amount of SSL-protected traffic is exploding, and this presents a huge problem to network monitoring and analysis systems, as well as intrusion detection equipment, since they cannot inspect the encrypted payloads. With the ability to support line-rate SSL termination Tilera can solve those problems and preserve the performance and security of the network.”
Open Virtual Switch Solution (OVS)
Tilera announced the availability of TILE-OVS, an optimized Open vSwitch (OVS) offload solution for network functions virtualization (NFV) deployments in data center and telco networks. Powered by Tilera’s TILE-Gx manycore processors and deployed in a PCI Express form factor, the solution delivers up to 80Gbps of OVS processing with additional headroom to spare for many other sophisticated networking applications such as deep packet inspection (DPI), network analytics and cyber security processing. With TILE-OVS, high-throughput networking, switching, and security functions are handled by TILE-Gx processors with direct coupling to the Ethernet I/O, dramatically offloading the x86 server.
“Offloading Open vSwitch from x86 host CPUs provides several benefits for NFV,” said Matthew Mattina, CTO at Tilera. “OVS is computationally demanding on x86 machines, especially when dealing with high throughput, high packet rates, and tunneling protocols like NVGRE that leave fewer cycles to run the desired virtual functions. By offloading OVS to a Tilera data-plane adapter, our customers are able to free up more cycles for host-side applications resulting in NFV capable servers that are power and cost efficient, and support OpenFlow controllers.”
Tilera and Procera Networks Partner
Tilera and Procera Networks (PKT) announced a high performance DPI-on-a-NIC deep packet inspection solution for software defined networks (SDN) and network functions virtualization. Tilera’s TILEncore-Gx Intelligent Application Adapters combined with Procera’s Network Application Visibility Library (NAVL) software development kit delivers wire-speed deep Layer 7 application intelligence required by emerging network architectures. “By offloading the networking data plane to our intelligent application adapters, customers can scale to meet increased demands as their data and video traffic escalates,” said Bob Doud, director of marketing at Tilera. ”With the combination of Tilera’s TILEncore Intelligent Application Adapters and Procera’s NAVL software solutions, customers can implement a flexible and scalable DPI offload model using their existing servers.” | | 3:41p |
Piston Integrates OpenStack, Other Technologies for Private Clouds Brought to you by The WHIR.
For those looking to turn their individual servers into a private OpenStack cloud, the latest release of Piston OpenStack from Piston Cloud Computing provides a platform for complete private cloud environments on an enterprise’s existing commodity hardware.
Piston OpenStack 3.0 essentially integrates several technologies into the standard, open-source OpenStack framework to provide a more complete and easily deployable private cloud solution. And it installs directly onto commodity hardware from almost any major x86 vendor, virtualizing a data center’s compute, storage, and network capabilities. It can scale to tens of thousands of physical servers.
Version 3.0 includes multi-tier storage pooling, software-defined networking options (Juniper Contrail, PLUMgrid, and VMware NSX), and the Moxie Runtime Environment for orchestrating third party services. It also provides a redesigned customer dashboard, new management APIs and tools to monitor node and cluster health, and the ability to access 10,000 worldwide cloud servers via API.
Since Piston first launched a commercial OpenStack product two years ago, Piston OpenStack has been deployed by dozens of companies in the financial sector, medical research, global businesses and federal agencies.
It was recently chosen by Intelemage, a company that provides medical image sharing, for its expanding private cloud.
“We selected Piston for our global private cloud because of our need to maintain a secure and reliable environment,” Intelemage CTO and co-founder John Danner said in a statement. In particular, Piston OpenStack helps the company automate its build-out and deployment activities and streamline its server management.
Piston OpenStack 3.0 is available now for download. After a 90-day free trial period, Piston OpenStack is available through an annual license which includes updates and customer support. Piston also provides training and professional services to support enterprise-level integration for custom authentication, audit and compliance, or monitoring solutions.
This post originally appeared at: http://www.thewhir.com/web-hosting-news/piston-integrates-openstack-technologies-make-complete-private-clouds | | 4:00p |
Uptime Institute Symposium 2014 Uptime Institute Symposium will be held May 20-22, 2014, at the Santa Clara Convention Center in Santa Clara, California. This knowledge-sharing event, providing new expert insight from Uptime Institute staff, 451 Research analysts and industry experts, as well as peer experience, will focus on the theme “Empowering the Data Center Professional.”
Key Issues at Symposium 2014
The sessions at Uptime Institute Symposium 2014 will cover key topics, including:
- Data center outsourcing
- DCIM adoption and implementation
- Adherence to shifting codes and safety practices
- New data center infrastructure technologies
- The costs, benefits and tradeoffs of green certifications
- Cloud computing pricing methods, attributes and metrics
- Aligning your Digital Infrastructure strategy to business demands
- The influence of cutting-edge IT technologies on capacity planning
This multi-disciplinary event will bring together a diverse group of data center operators, designers and senior-level IT executives, and as such, the program will be organized into three tracks tailored to these job roles. For more information and registration see the Uptime Institute Symposium website.
Venue
Santa Clara Convention Center
5001 Great America Pkwy, Santa Clara, CA 95054
Ph: (408) 748-7000
For more events, please return to the Data Center Knowledge Events Calendar. | | 4:18p |
WHIR Networking Event: Chicago, IL The WHIR brings together professionals in the hosting industry for fun (and free!) networking events at different locales in the U.S. and internationally as well. The one-night event is an opportunity to meet like-minded industry executives and corporate decision makers face-to-face in a relaxed environment with complimentary drinks and appetizers.
The WHIR provides a great local venue, and you do the rest – do business, make new connections and learn more about those in the web hosting industry.
Gather with your colleagues from Chicago and meet new associates from your region.
Event Date: Thursday, March 13, 2014
Time: 6:00 pm to 9:00 pm
Place: ¡AY CHIWOWA!, 311 W Chicago Ave, Chicago, IL, 60654, USA
Learn more and RSVP!
YOU MUST BRING A BUSINESS CARD TO WIN A PRIZE
For more events, return to the Data Center Knowledge Events Calendar.
| 4:28p |
WHIR Networking Event: Los Angeles, CA The WHIR brings together professionals in the hosting industry for fun (and free!) networking events at locales across the U.S. and internationally. The one-night event is an opportunity to meet like-minded industry executives and corporate decision makers face-to-face in a relaxed environment with complimentary drinks and appetizers.
The WHIR provides a great local venue, and you do the rest – do business, make new connections and learn more about those in the web hosting industry.
Gather with your colleagues from LA and meet new associates from your region.
Event Date: Thursday, April 10, 2014
Time: 6:00 pm to 9:00 pm
Place: TBA, Los Angeles, CA
Learn more and RSVP!
YOU MUST BRING A BUSINESS CARD TO WIN A PRIZE
For more events, return to the Data Center Knowledge Events Calendar.
| 4:30p |
Alcatel-Lucent Expands Cloud Partnership With Intel At Mobile World Congress this week in Barcelona, Alcatel-Lucent (ALU) expanded its relationship with Intel to combine resources and expertise, launched its suite of virtualized network functions, and joined forces with Telefónica to accelerate NFV.
Intel Collaboration to drive industry to the cloud
Alcatel-Lucent announced it has expanded its collaboration with Intel (INTC) to help operators worldwide improve their time to market and operational efficiency, and to drive the creative development of new products and services for consumers and business customers using cloud technologies. Specifically, the expanded collaboration will focus on a virtualized Radio Access Network (RAN) portfolio, on developing and optimizing Alcatel-Lucent’s CloudBand NFV platform, and on high-performance packet processing for advanced IP/MPLS platforms and cabinets.
“There are billions of devices and machines making up the Internet of Things that will connect to the network in the next few years, consuming increasing amounts of bandwidth and network resources. However, the current economics and lack of flexibility in telecom networks are unsustainable,” said Renee James, President of Intel Corporation. “Intel is working with industry leaders such as Alcatel-Lucent to leverage server and virtualization technology in telecom networks, data centers and the cloud, with the goal of reducing costs and making it easier to deploy new services.”
Virtualized mobile network functions
Alcatel-Lucent also announced that it is delivering a portfolio of virtualized mobile network function applications – evolved packet core (EPC), IP Multimedia Subsystem (IMS) and radio access network (RAN) – and extending them to the cloud. Alcatel-Lucent and China Mobile are co-demonstrating voice and video delivered over a proof-of-concept virtualized LTE RAN Baseband Unit (BBU) and virtualized evolved packet core (vEPC), onboarded to the cloud by CloudBand 2.0. NFV enables network automation and efficient use of resources for rapid scaling of services up and down. Applied correctly, it will optimize delivery of services such as voice over LTE (VoLTE), Web real-time communications (WebRTC), secure mobile communications for enterprises and machine-to-machine (M2M) communications.
“Each mobile operator will have different priorities and want to take their own specific path to NFV, based on their business priorities and the state of their network and operations readiness,” said Marcus Weldon, Corporate Chief Technology Officer (CTO) of Alcatel-Lucent and Bell Labs President. “Building on our strong IP foundation, we have pioneered an open SDN and NFV architecture that combines the best of IP with the best of IT, to create a truly carrier-grade cloud network solution. Our accelerated investments in virtualized telecom applications, our CloudBand 2.0 platform and our Nuage Networks SDN venture are clear proof points that we have all the elements to help operators create an open, agile, efficient cloud environment at a speed that meets their individual needs.”
Alcatel-Lucent and Telefónica co-innovate for network virtualization infrastructure
Alcatel-Lucent announced that it has signed a co-innovation agreement with Telefónica to drive innovation in, and adoption of, Network Functions Virtualization (NFV) by the telecommunications industry, building on the strength of a shared vision. The partnership will use the CloudBand NFV platform to identify and develop process models that help service providers decide which elements within the network should be virtualized, and when. Virtualizing network functions decouples software-based functions from the physical infrastructure, enabling service providers such as Telefónica to greatly reduce operating costs through more efficient management of the physical infrastructure, which can be reused for different functions.
Using Alcatel-Lucent’s CloudBand platform, the co-development program with Telefónica will define NFV-evolved architectures and identify and test different NFV scenarios and environments. The companies will create a joint NFV research facility, invest in certification programs, and engage with other service providers. As The Shift Plan repositions Alcatel-Lucent as a specialist vendor of IP, cloud and ultra-broadband access, the agreement with Telefónica underlines the operator’s endorsement of this strategic direction and the value it will bring to operators and service providers.
“The intimacy required for a successful collaboration between large enterprises does not come easily,” said Enrique Blanco, Global CTO of Telefónica. “Fortunately, we have an excellent track record of successfully working with Alcatel-Lucent around the world. We know each other very well. I am confident that what we achieve over the course of the next year in the realm of virtualization will have major benefits for our two companies and the industry as a whole.”