AnandTech's Journal
 

Thursday, January 7th, 2016

    1:03a
    Oculus VR Reveals Retail Price of Its Virtual Reality Headset: $599

    Oculus VR on Wednesday revealed the price and launch date of its Oculus Rift virtual reality headset. The price of the VR hardware is considerably higher than gamers and industry analysts expected. The company attributes the high price to high production costs and the use of custom hardware, but such a price point may slow down adoption of virtual reality technologies by the masses.

    The Oculus Rift bundle includes the VR headset, an Xbox One gamepad, a sensor and the Oculus Remote controller, as well as two VR games, EVE: Valkyrie and Lucky's Tale. The initial bundle will not include the Oculus Touch controllers, which were recently delayed to the second half of the year. The headset is available for pre-order for $599 on the company's website and will ship starting March 28, 2016, to 20 countries. Select retailers will also sell Oculus Rift hardware in April. In addition, makers of gaming PCs plan to offer Oculus Ready PCs bundled with the headset next month, starting at $1499.

    Back in early October 2015, Palmer Luckey, the founder of Oculus VR, said in an interview that the price of one Oculus Rift headset was in the “$350 ballpark”, but that it was “going to cost more than that”. As it turns out, the virtual reality head-mounted display (HMD) costs nearly twice that. The $599 price point is yet another indicator that first-generation VR headsets are, in general, expensive to make. However, that price is too high for the mass market and for many gamers, according to Jon Peddie, head of Jon Peddie Research, a firm that tracks sales of graphics adapters and PC gaming hardware.

    A Lot of Custom Hardware

    While the virtual reality HMD is available for pre-order now, Oculus VR has yet to confirm its final technical specifications. Based on what the company revealed about six months ago, the Oculus Rift uses two custom AMOLED panels (one per eye) with a combined resolution of 2160×1200 (1080×1200 per eye) and a 90 Hz refresh rate. The AMOLED displays were architected for low persistence: they display each image for about 2 ms in a bid to minimize perceived delays and avoid effects like motion blur, which can cause nausea. The headset also features specially designed adjustable lenses that enable a wide field of view. Each headset has integrated headphones and a microphone. In addition, the Oculus Rift sports various sensors, including the company's own Constellation tracking system, which uses infrared sensors to track the position of the user's head.
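
    To put the low-persistence figure in perspective, here is a quick back-of-the-envelope calculation (a sketch in Python using only the numbers quoted above) showing how little of each refresh cycle the panel actually spends lit:

        # Rough timing math from the quoted specs (not official Oculus figures
        # beyond the 90 Hz refresh rate and ~2 ms persistence).
        REFRESH_HZ = 90
        PERSISTENCE_MS = 2.0                           # approximate per-image display time

        frame_budget_ms = 1000 / REFRESH_HZ            # ~11.1 ms to render and present a frame
        duty_cycle = PERSISTENCE_MS / frame_budget_ms  # ~18%: the panel is dark most of the time

        print(f"Frame budget: {frame_budget_ms:.1f} ms")
        print(f"Low-persistence duty cycle: {duty_cycle:.0%}")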

    To connect to a PC, the Oculus Rift and its accompanying devices (gamepad, sensor, remote, etc.) use one HDMI 1.3/1.4 connection, three USB 3.0 connections and one USB 2.0 connection.

    The Oculus Rift uses a lot of custom components designed specifically for this device. For example, the low-persistence AMOLED display panels were co-developed by Oculus and Samsung Electronics. Oculus VR says it wanted to build a device that offers the best virtual reality experience possible today, which is why it tried to avoid trade-offs and compromises. Because the headset makes extensive use of parts that are not yet mass-produced, each unit should be fairly expensive to build, which is one of the reasons the Oculus Rift is priced at $599.

    High-End PC Needed

    Since the Oculus Rift should run games at 2160×1200 and 90 Hz with minimal latency, it requires a rather powerful personal computer to offer a comfortable experience. Oculus VR recommends a PC with a quad-core Intel Core i5-4590 microprocessor, an AMD Radeon R9 290 or NVIDIA GeForce GTX 970 graphics adapter, and 8GB of RAM. The company notes that the more powerful your system is, the better your experience with the Oculus Rift will be.
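
    Raw pixel throughput explains part of that requirement. A rough comparison (a sketch; actual GPU load also depends on scene complexity, supersampling and lens-distortion correction):

        rift_pixels_per_s = 2160 * 1200 * 90      # ~233 million pixels per second
        full_hd_pixels_per_s = 1920 * 1080 * 60   # ~124 million pixels per second

        ratio = rift_pixels_per_s / full_hd_pixels_per_s
        print(f"The Rift pushes ~{ratio:.1f}x the pixels of a 1080p60 display")  # ~1.9x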

    GPU developers have repeatedly implied that the best VR experience today calls for a dual-GPU graphics subsystem. For example, AMD plans to align the release of its new dual-chip Fiji video card with the availability of VR headsets in the second quarter. In a dual-GPU subsystem, each graphics chip renders the scene for one eye, which can roughly double rendering throughput and lower latency. However, two GPUs also require a more powerful central processing unit as well as a high-end power supply.
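
    As a toy illustration of that split (a structural sketch only; real implementations go through vendor-specific multi-GPU APIs, and the renderer below is a stub):

        # One worker per "GPU", each drawing one eye's half of the 2160x1200 frame.
        from concurrent.futures import ThreadPoolExecutor

        EYE_W, EYE_H = 1080, 1200    # per-eye resolution quoted above
        HALF_IPD_M = 0.032           # assumed ~64 mm interpupillary distance, halved

        def render_eye(eye: str) -> bytes:
            # Each eye sees the same scene from a camera shifted by half the IPD.
            camera_x = -HALF_IPD_M if eye == "left" else +HALF_IPD_M
            # ... a real renderer would rasterize the scene from camera_x here ...
            return bytes(EYE_W * EYE_H)   # blank 1-byte-per-pixel buffer as a stand-in

        with ThreadPoolExecutor(max_workers=2) as pool:
            left = pool.submit(render_eye, "left")
            right = pool.submit(render_eye, "right")
            frame = left.result() + right.result()   # composite into one full frame

        print(f"{len(frame):,} pixels per composited frame")   # 2,592,000 = 2160 x 1200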

    For computer hardware makers, the launch of the first VR headset for gamers is a chance to improve sales of their higher-end products. Not only manufacturers of video cards and microprocessors stand to benefit from the Oculus Rift; producers of RAM, solid-state drives and motherboards can also take advantage as enthusiasts begin to build new systems. Unfortunately, the significant investment in hardware may slow down adoption of virtual reality HMDs by both gamers and the general public.

    Oculus VR: 100+ Virtual Reality Games to Be Available in 2016

    Oculus VR claims that more than 100 games designed for virtual reality and compatible with the Rift are set to be available by the end of 2016, including “dozens of full-length AAA” games. The company has not revealed many names, but in addition to the titles bundled with the VR headset, it mentions Rock Band VR by Harmonix, Edge of Nowhere by Insomniac, and The Climb by Crytek.

    While over a hundred VR-capable titles is a lot, only a handful of them will actually attract users to the platform. Since $599 is a significant investment for many gamers, the platform will need several compelling titles that not only demonstrate the technology itself but make people want to play.

    A Lot of Excitement

    There is a lot of excitement about virtual reality technologies not only among gamers, but also among hardware and software developers. While the technology itself has a lot of potential for video games and beyond, the very first Oculus Rift headset is designed primarily for games. The price of the HMD is high for many gamers, and for general users it is prohibitively expensive. Therefore, sales of the device will likely be rather limited. In fact, even Facebook, the owner of Oculus VR, does not expect to sell a lot of VR headsets this year.

    Sales of enthusiast-class graphics cards, which cost $399 and higher, total approximately three million units a year, according to Jon Peddie Research. There are many PC gamers nowadays, but only a fraction of them invest thousands of dollars in hardware. Analysts' predictions about first-generation VR gear sales range from optimistic to pessimistic. For example, according to a report released by Juniper Research several months ago, cumulative sales of VR headsets in their first year of availability (i.e., 2016) will be approximately three million units. Three major VR devices are set to be released this year: the Oculus Rift, the Vive from HTC and the PlayStation VR from Sony, and it is highly likely that the majority of hardcore enthusiast gamers will buy only one of them. Juniper predicts that cumulative sales of VR headsets will hit around 30 million units by 2020 as hardware and software evolve.

    It remains to be seen how many virtual reality head-mounted displays Oculus VR will sell this year. Palmer Luckey said in an interview that the first consumer version of the Oculus Rift was developed to offer a great experience and to show the potential of the technology to the world. Hopefully, it will deliver on that promise.

    6:00a
    Ambarella CES 2016 Tour

    Yesterday Josh and I met with Ambarella and went on a tour of their exhibits. The main topic was their new line of SoCs, along with the various products and projects that these SoCs and their video encoding and decoding capabilities enable.

    The high-end SoC in Ambarella's line of camera chips is the H2. The H2 is built on Samsung's 14nm process and incorporates a quad-core 1.2GHz Cortex-A53 cluster. It's capable of encoding 4K HEVC video at 60fps, or 4K AVC video at 120fps; the latter makes it capable of 4K slow-motion video by playing back the 120fps footage at 30fps. The H2 also supports capturing video with 10-bit color, as well as HDR, which has recently been integrated into the Blu-ray and UHD standards.
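
    The slow-motion arithmetic is straightforward; a quick sketch of conforming 120fps capture to 30fps playback:

        capture_fps, playback_fps = 120, 30
        slowdown = capture_fps / playback_fps   # each real second plays over 4 seconds
        clip_seconds = 10                       # e.g. a 10-second real-time capture
        print(f"{slowdown:.0f}x slow motion: a {clip_seconds}s capture "
              f"plays back over {clip_seconds * slowdown:.0f}s")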

    The next SoC in Ambarella's line is the H12, which is capable of encoding 4Kp30 video using AVC or HEVC. It uses a single 1GHz Cortex-A9 core and is built on a 28nm process.

    The last two SoCs are the A9SE and the A120. The A120 is an entry-level chip, while the A9SE offers some advanced functionality but is intended for devices priced lower than those that incorporate the H2. The A9SE offers 4Kp30 support and can record 1080p60 video with electronic image stabilization.

    One of the demos that Ambarella showed was an example of their electronic image stabilization (EIS) for 4K video. Part of the drive behind this is the fact that stabilization on drones has had to be implemented using a mechanical system that shifts the camera along each axis to keep the sensor in the same position. Such a system adds size, mass, and cost to the drone, so drone makers would be keen to eliminate it in order to reduce prices and improve battery life. The demo compared two real-time video feeds with EIS on and off; the difference is dramatic, and the level of stabilization the SoC can achieve is extremely impressive.
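
    For a feel of how software stabilization works in general, below is a minimal sketch of the classic feature-tracking approach in Python with OpenCV (a generic method, not Ambarella's hardware pipeline; "shaky.mp4" is a placeholder input):

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("shaky.mp4")   # placeholder input clip
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        transforms = []

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Track corner features from the previous frame into this one.
            pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                          qualityLevel=0.01, minDistance=30)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            # Fit a rigid (translation + rotation) motion model between frames.
            m, _ = cv2.estimateAffinePartial2D(pts[status == 1], nxt[status == 1])
            dx, dy = m[0, 2], m[1, 2]
            da = np.arctan2(m[1, 0], m[0, 0])
            transforms.append((dx, dy, da))
            prev_gray = gray

        # Smooth the cumulative camera path with a moving average; each frame is
        # then warped by (smoothed - raw) via cv2.warpAffine and cropped.
        trajectory = np.cumsum(transforms, axis=0)
        kernel = np.ones(31) / 31
        smoothed = np.column_stack([np.convolve(trajectory[:, i], kernel, mode="same")
                                    for i in range(3)])
        corrections = smoothed - trajectory
        print(f"Estimated corrections for {len(corrections)} frames")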

    Another exhibit showcased the ability to record 4Kp120 video. This is the first time I've seen any 4K footage recorded at a high enough frame rate to slow it down by an appreciable amount.

    Several of Ambarella's exhibits related to technology that will be used in self-driving cars, some of which builds on demos shown at last year's CES. The demo I found most interesting is the electronic mirror: essentially a mirror that integrates a display streaming footage from a rear-mounted camera on the car. The advantages of an electronic mirror include a wider field of view, no obstruction from passengers in the rear seats, and better visibility at night, since the SoC's HDR processing can make the car behind you visible without letting its headlights overpower the image. It's important to note that the mirror can act as a normal mirror in conditions where the camera is not necessary.

    Another car-related demo from Ambarella involved mapping the environment around a vehicle using cameras mounted on its various sides. This isn't exactly a new concept, but it ties in with their new SoCs. The demos included environment mapping for self-driving cars and using the cameras to survey the surroundings in order to implement features like automatic parking.

    The last demo that I found quite interesting demonstrated the image de-warping capabilities of the H2 and H12 SoCs. The demonstration gave the example of a fisheye lens used in a security camera mounted on a door, with de-warping applied to bring the image into a state that is easy to view.
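
    As a minimal sketch of the same de-warping idea using OpenCV's fisheye camera model (the intrinsics K and distortion coefficients D below are placeholders; real values come from calibrating the actual lens):

        import cv2
        import numpy as np

        img = cv2.imread("door_fisheye.jpg")   # placeholder fisheye frame
        h, w = img.shape[:2]

        K = np.array([[w / 3.0, 0, w / 2.0],   # placeholder camera intrinsics
                      [0, w / 3.0, h / 2.0],
                      [0, 0, 1.0]])
        D = np.array([0.1, 0.01, 0.0, 0.0])    # placeholder distortion coefficients

        # Build the de-warp lookup tables once, then remap every incoming frame.
        map1, map2 = cv2.fisheye.initUndistortRectifyMap(
            K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
        flat = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)
        cv2.imwrite("door_flat.jpg", flat)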

    As far as video encoding and decoding go, the tech being shown off by Ambarella definitely impresses. I haven't seen 4Kp120 recording in any other consumer-focused product, and the push for improved 4Kp60 HEVC encoding with 10-bit color and HDR support will be necessary as new standards for UltraHD video are adopted.

    7:00a
    Revisiting Keyssa: Commercial Availability, Products in Q1 2016

    While we talked about Keyssa at CES last year, details were rather sparse, as the technology was still in the early stages of getting off the ground. This year, however, Keyssa's connector technology is commercially available, and based on discussions with those at Keyssa, products using it could ship as early as Q1 2016.

    For those who haven't seen Keyssa in action before, it's hard to really grasp the potential of this technology. At a high level, it's like NFC in the sense that it is very short range: the signal disappears completely beyond roughly an inch, or a few centimeters. Within that range, however, you get 6 Gbps of bandwidth at relatively low power compared to something like 802.11ad/WiGig. Unlike 802.11ad WiFi, the connector and chip needed to enable this technology are almost absurdly tiny; the chip is no more than a few square millimeters. This is purely a physical-layer technology, which means that at the operating system level a Keyssa connector can appear to be USB, DisplayPort, HDMI, SATA, PCIe, or pretty much any point-to-point digital connection protocol you can name today.
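
    To give a rough sense of that bandwidth (raw link rate; real-world throughput depends on the tunneled protocol's encoding and overhead):

        link_gbps = 6                        # Keyssa's quoted link rate
        file_gb = 4                          # e.g. a 4 GB file
        seconds = file_gb * 8 / link_gbps    # gigabytes -> gigabits, then divide
        print(f"~{seconds:.1f} s to move a {file_gb} GB file at line rate")  # ~5.3 s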

    As a result of that protocol flexibility, Keyssa has the potential to completely do away with physical data ports in devices. Probably the most obvious example would be 2-in-1 hybrid devices like the Surface Book, which in theory could drop all of the wired connections that introduce additional engineering challenges when designing such a device.

    Keyssa has also discussed the potential to replace flex cables inside smartphones and other devices, which could reduce board area and/or z-height while simplifying design and reducing cost, as flex cables would no longer need to be laid out by hand.

    The connector can also use simple plastic shapes, such as tubes, to introduce directionality and make wire-like connections over a distance without the need for actual wires or physical connectors.

    Overall, Keyssa shows great potential, and judging by the discussions I've had, there's a significant amount of interest from OEMs and ODMs, with hints that devices using this technology are already in development. It's hard to say what the full potential of this technology is, but it's definitely going to be interesting to see how it develops.

    6:40p
    Dell Demonstrates 30-inch 4K OLED Display

    Dell does not produce its own display panels, but when it comes to unique “world's first” monitors, it is sometimes years ahead of its rivals. At the International CES 2016, Dell introduced its UltraSharp 30-inch OLED display, the company's first monitor to use an organic light-emitting diode panel. The product is designed for professionals and carries a rather extreme price tag, but it is going to be a dream display for years to come.

    The Dell UltraSharp UP3017Q is a 30-inch display with a 3840×2160 resolution, a 0.1 ms response time and an unannounced refresh rate (though it should be very high). According to Dell, the monitor can reproduce 1.07 billion colors and covers 100% of the Adobe RGB color space as well as 97.8% of the DCI-P3 color space (used for digital movie projection by the U.S. movie industry and expected to be adopted in televisions and home cinema). Only a few professional displays today cover 100% of Adobe RGB. The manufacturer declares a 400,000:1 dynamic contrast ratio, but admits the figure is limited only by what its testing equipment can measure.
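
    The 1.07 billion colors figure corresponds to a 10-bit-per-channel panel, which a quick check confirms:

        bits_per_channel = 10
        colors = (2 ** bits_per_channel) ** 3   # 10-bit red x green x blue
        print(f"{colors:,} colors")             # 1,073,741,824, i.e. ~1.07 billion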

    The UltraSharp UP3017Q ultra-high-definition display has very narrow bezels; the monitor itself is thin, though not as remarkably thin as OLED TVs, possibly because it houses an internal power supply as well as complex logic. The monitor features a mini DisplayPort (mDP) connector, an HDMI port and a USB Type-C port; the latter can be used for video and data connectivity as well as for power delivery (the monitor can be powered via a Type-C cable, or deliver power to another device).

    The emissive electroluminescent layer in an organic light-emitting diode is made of an organic compound that emits light in response to an electric current. This organic semiconductor layer sits between two electrodes and does not require a backlight. As a result, an OLED panel can display truly deep black levels, unlike liquid crystal display (LCD) panels, which rely on various kinds of backlighting. Moreover, since the emissive electroluminescent layer is very thin and can take different shapes, OLEDs make it possible to build ultra-thin and even curved monitors and TVs.

    While OLED technology can deliver deep blacks, a high contrast ratio and exceptional colors, it is not free of drawbacks. The organic layer degrades over a prolonged period of time, and colors can shift. To maximize the lifespan of the OLED panel inside the UltraSharp UP3017Q, Dell integrated a presence detector into the front panel of the display, which switches the monitor off when nobody is using it. Another disadvantage of OLEDs is the possibility of static image burn-in; to reduce that risk, the UP3017Q features a special pixel-shifting technology.
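
    The general idea behind pixel shifting is simple; the toy sketch below (Dell has not documented its exact implementation) cycles the whole image through small offsets so that no subpixel displays static content indefinitely:

        import itertools

        # Step the framebuffer through small (x, y) offsets, e.g. once every few
        # minutes, so static UI elements never sit on the same subpixels forever.
        offsets = itertools.cycle([(0, 0), (1, 0), (1, 1), (0, 1)])

        for _ in range(4):
            dx, dy = next(offsets)
            print(f"shift framebuffer by ({dx}, {dy}) px")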

    The Dell UltraSharp 30 OLED monitor will cost a whopping $4,999 when it becomes available on March 31, 2016, in the United States. At this point the display is aimed only at professionals who work in color-critical environments such as graphic arts and photography. However, thanks to its exceptional colors and contrast as well as its ultra-fast response time, the UltraSharp UP3017Q will be a dream display for gamers, prosumers and other users who value quality.

    OLED panels are considerably more expensive to produce than modern LCD panels, partly because of lower yields. Last year an executive from LG Electronics said that yields of OLED panels had reached 80% and would continue to grow. At the International CES 2016, Kwon Bong-suk, the head of LG's TV business, said that the company had cut prices of OLED TVs in the U.S. by 45% late in 2015; as a result, LG now expects sales of OLED televisions to triple this year. The price reductions indicate that production costs of organic light-emitting diode panels are going down. Perhaps, over time, the Dell UltraSharp UP3017Q will also become more affordable, or Dell will release an OLED display for a wider audience.

