AnandTech's Journal
 

Wednesday, August 7th, 2013

    1:32a
    IBM Offers POWER Technology for Licensing, Forms OpenPOWER Consortium

    The CPU wars are far from over, but the battlegrounds have shifted of late. Where once we looked primarily at high-end processing options, today we cover nearly as much in the ARM licensing world as we do in the x86 world. IBM is joining with Google, NVIDIA, Mellanox, and Tyan to create the OpenPOWER Consortium, with the intent of building advanced server, networking, storage, and GPU-accelerated technologies based on IBM’s POWER microprocessor architecture. High-performance computing clusters and cloud computing are other areas of focus for OpenPOWER.

    Along with the formation of the OpenPOWER Consortium, POWER hardware and software will be made available for open development for the first time, and POWER IP will be licensable to others. (While not stated explicitly in the news release, Ars Technica's Andrew Cunningham reports that licensing will begin with POWER8.) Steve Mills, senior vice president and group executive at IBM, states, “Combining our talents and assets around the POWER architecture can greatly increase the rate of innovation throughout the industry. Developers now have access to an expanded and open set of server technologies for the first time. This type of ‘collaborative development’ model will change the way data center hardware is designed and deployed.”

    The NVIDIA aspect is also interesting, considering how many systems on the Top 500 supercomputer list now use some form of GPU acceleration. Sumit Gupta from NVIDIA’s Tesla Accelerated Computing Business states, “The OpenPOWER Consortium brings together an ecosystem of hardware, system software, and enterprise applications that will provide powerful computing systems based on NVIDIA GPUs and POWER CPUs.” Given that NVIDIA has also announced its intent to license Kepler and future GPU IP to third parties, we could potentially see SoCs in the coming years with POWER-based CPU cores and NVIDIA-licensed GPU cores in place of the ARM and PowerVR combinations so prevalent today.

    This is clearly intended to slow and perhaps even reverse the exodus from the POWER architecture seen over the past decade. Apple switched from PowerPC to x86 back in the Core 2 Duo days (2006), and while IBM scored wins in both of the current generation consoles (Xbox 360 and PlayStation 3), the next generation Xbox One and PlayStation 4 will both be going with x86 designs. Many are likely to see this as vindication of the IP (Intellectual Property) licensing route taken by ARM, with NVIDIA and now IBM looking to license their IP as well (not to mention AMD and others licensing ARM IP). Given the decline in POWER use in recent years, this move should help give the architecture more relevance going forward.

    11:45a
    AMD's Radeon HD 7990 Gets an Official Price Cut: $799 and Below

    We don’t typically run pipeline stories on video card price cuts, but then again most price cuts are gradual affairs that even the manufacturers themselves rarely draw attention to. Today, however, we’re looking at a far more substantial price cut on a far more substantial product: AMD's Radeon HD 7990.

    For the launch of AMD’s frame pacing enhanced Catalyst 13.8 drivers earlier this month, AMD’s partners were able to get reference 7990 cards down to as low as $799. That was $200 below the 7990’s official list price, and still $100 cheaper than it was earlier in July. That alone is a fairly stiff price cut for a product that launched less than 3 months prior.

    However, after doing our weekly price checks and noticing that prices were lower still, we poked some contacts and have been told that AMD has since enacted a further official price cut, which in turn has already pushed street prices down again. Officially the price of the 7990 is being reduced to $799, the price we already saw it at last week. But as is usually the case, AMD is quoting the MSRP rather than the price their partners are actually paying for the 7990. By taking a hit on their own margins, AMD’s partners could hit $799 before this week’s price cut, and now with the new cut in effect those same partners have room to lower prices once again.

    The end result is that while the official MSRP on the 7990 is $799, street prices are consistently lower; much lower in fact. PowerColor and XFX have their respective reference models down to $669 and $699 after rebates, while HIS, Gigabyte, and VisionTek are all at $749 or lower without rebates. This gives the 7990 an effective price cut of somewhere between $250 and $330 since its launch 3 months ago, and makes it around $100 cheaper than it was just last week. $799 was already a good deal for the product, so it goes without saying that this puts the card in an even better position.

    AMD for their part isn’t in the business of giving away hardware, so significant price cuts like this are both a boon for buyers and a concern for the company. The timing of the 7990 launch – a product that ideally should have been released months earlier, rather than coming after the release of FCAT – was undeniably poor. Consequently, when we see this large of a price cut this quickly it hints at AMD sitting on a lot of unsold inventory, possibly a consequence of that weak launch, but in the end that’s a matter for AMD and their partners.

    Ultimately $669 is by no means cheap for a video card – we are after all still talking about a luxury class dual-GPU card – but it does represent a not so subtle shift in the market. At these prices the 7990 is no longer directly competing with NVIDIA’s GTX Titan and GTX 690; instead it’s priced a stone’s throw away from NVIDIA's lower-end GK110-based card, the GTX 780. The GTX 780 was itself something of a spoiler for the $1000 GTX Titan, so at these prices the 7990 serves much the same role.

    More important, however, is that AMD now has a direct counter for what’s technically NVIDIA’s fastest consumer card, no longer leaving NVIDIA unchallenged there. We won’t wax on about the performance of the two cards, but with AMD’s frame pacing improvements in play the 7990 is a very strong contender for this segment. The wildcard, as always, is whether AMD can keep quickly delivering performance-consistent Crossfire profiles for new games, a never-ending challenge for dual-GPU products.

    Summer 2013 GPU Pricing Comparison
    AMD                          Price   NVIDIA
    -                            $1000   GeForce GTX Titan / GTX 690
    Radeon HD 7990               $700    -
    -                            $650    GeForce GTX 780
    Radeon HD 7970 GHz Edition   $400    GeForce GTX 770

    11:50a
    Hands On with the LG G2 - LG's latest flagship

    Today LG is announcing the LG G2 – there’s no Optimus branding this time, it’s just the LG G2. The G2 is the successor to the Optimus G, the phone that also became the Nexus 4, and it makes a number of improvements above and beyond that device. The G2 is the product LG is putting all of its resources behind, and it takes the flagship throne from the G Pro.

    The G2 makes a number of interesting hardware changes in shape, size, and button placement compared to the competition. Rather than side-mounted power and volume buttons, LG has moved them to the back of the device just below the camera module, partly to minimize edge bezel. The volume rocker is one solid piece with a raised power button in the center. The edge around the power button is the notification LED, which glows white when powering on or when notifications arrive.

      LG G2
    SoC          Qualcomm Snapdragon 800 (MSM8974), 4x Krait 400 @ 2.3 GHz, Adreno 330 GPU
    Display      5.2-inch IPS LCD, 1920x1080 (Full HD)
    RAM          2GB LPDDR3 (800 MHz)
    WiFi         802.11a/b/g/n/ac, BT 4.0
    Storage      32 GB internal
    I/O          microUSB 2.0, 3.5mm headphone, NFC, Miracast, IR
    OS           Android 4.2.2
    Battery      3000 mAh, 3.8V (11.4 Wh), stacked
    Dimensions   138.5 x 70.9 x 9.14 mm
    Camera       13 MP with OIS and flash (rear), 2.1 MP Full HD (front)

    LG believes that as devices grow in size, hand positioning has changed and side-mounted buttons are no longer natural. I’ll admit I was initially confused about how to turn the G2 on, but after a few minutes of playing with the device, turning it on and off via the rear center power button or changing the volume seemed natural. The raised bump makes it easy to locate the buttons, and there’s another lip before your finger hits the camera’s front glass. Pressing the top or bottom button for three seconds launches the memo app or camera, respectively. Inside the camera application, volume also doubles as a camera button and triggers image capture. It takes a little getting used to, but putting the buttons on the back doesn’t feel anywhere near as awkward as I expected. I’ll have to spend more time with the G2 to tell how well this works in practice, but my initial subjective impressions are a lot more positive than I thought they would be.

    The G2 eschews hardware buttons for the on-screen Android kind, although LG has made a number of customization options available in another settings menu.

    The back of the G2 has a curved, rounded profile. LG has included a stacked battery inside the G2 that maximizes use of the internal volume. It’s a 3.8V, 3000 mAh (11.4 watt-hour) LG Chem battery. If you’ve been paying attention, the stacked construction is also something Motorola talked about for its Moto X; it turns out that LG Chem is indeed a supplier for Motorola. Of course the back of the G2 is non-removable and sealed, which isn’t a surprise anymore.

    The G2 comes in white and black models, both of polycarbonate construction. The materials choices aren’t anything revolutionary in a world where wood, metal, and composites seem to be the trend, but at least this time there’s no glass on the back to give people pause.


    LG G2 Touch Panel (Left) LG Optimus G Touch Panel (Right)

    The highlight of the G2 is of course its 5.2-inch 1920x1080 IPS display and thin bezel. Getting the bezel as thin as possible seems to have been LG’s main design direction for the G2, and again, moving the buttons to the back means less button intrusion into the edges and a thinner bezel. The other part is moving to top and bottom fanout for the touch traces – instead of routing everything to the top or the bottom, there’s both a top connector and a bottom connector, which means a thinner edge profile.

    The G2 display also includes built-in memory to enable panel self refresh. When the display contents aren’t being updated, the display GRAM holds the frame buffer and refreshes the panel itself, so the AP and display controller can go into an idle state. LG purports a 26 percent reduction in power consumption when displaying static content using this GRAM (Graphic RAM) panel self refresh functionality.

    Viewing angles and brightness on the G2 seemed great in the time I spent with a prototype model. LG Display always seems to do an awesome job with its panels, and I don’t think the G2 will stray far from that mark.

    The camera on the G2 is also a step forward from the Optimus G. There’s a 13 MP rear facing module with OIS (Optical Image Stabilization) this time, which means LG joins HTC and Nokia in the OIS party. The module for the G2 is considerably bigger and includes the on-package gyro you’d expect for OIS to work properly. LG tells me that the CMOS sensor still uses the same 1.1µm pixels and size as the original Optimus G, but is a newer, faster version that supports 1080p60 video capture. That’s right, the G2 can do Full HD video at 60 FPS. LG also does temporal oversampling (taking multiple frames and combining them into one image) for its digital zoom, instead of a simple resample. OIS definitely works on the G2 to help stabilize videos and to allow longer exposures in low light for still images.
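    LG hasn’t published the details of its oversampling pipeline, but the core idea is easy to illustrate. Below is a minimal Python sketch assuming the captured frames are already aligned (a real implementation would also register the frames to compensate for hand shake); the function names are illustrative, not LG’s.

    ```python
    # Minimal sketch of temporal oversampling for digital zoom.
    # Assumes frames are already aligned; names are illustrative, not LG's.
    import numpy as np

    def temporal_oversample(frames):
        """Average several aligned frames; noise falls by roughly sqrt(N)."""
        stack = np.stack([f.astype(np.float32) for f in frames])
        return stack.mean(axis=0)

    def digital_zoom(frame, factor=2.0):
        """Center-crop by `factor`, then upscale back (nearest neighbor)."""
        h, w = frame.shape[:2]
        ch, cw = int(h / factor), int(w / factor)
        top, left = (h - ch) // 2, (w - cw) // 2
        crop = frame[top:top + ch, left:left + cw]
        ys = np.arange(h) * ch // h   # row indices for the upscale
        xs = np.arange(w) * cw // w   # column indices for the upscale
        return crop[ys][:, xs]

    # Four noisy captures combined, then zoomed 2x: cleaner than
    # resampling a single frame, which is the point of the technique.
    frames = [np.random.randint(0, 256, (1080, 1920, 3)) for _ in range(4)]
    zoomed = digital_zoom(temporal_oversample(frames), factor=2.0)
    ```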

    The G2 also includes a sapphire crystal window on the back side to prevent scratching.

    LG has made audio in the line-out sense a priority for the G2. We’ve seen a lot of emphasis from other OEMs on speaker quality and stereo sound; with the G2, LG has put time into rewriting part of the ALSA stack and Android framework to support higher sampling rates and bit depths. The inability of the Android platform to support different sampling rates for different applications remains a big limitation for OEMs, one LG has written around. On the G2, up to 24-bit, 192 kHz FLAC/WAV playback is supported in the stock player, and LG says it will make an API available for other apps to take advantage of this higher definition audio support, fostering a better 24-bit ecosystem on Android.
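    To put that jump in perspective, raw PCM data rate scales linearly with bit depth and sample rate, so 24-bit/192 kHz stereo carries roughly 6.5x the data of 16-bit/44.1 kHz CD audio. A quick back-of-the-envelope sketch:

    ```python
    # Raw (uncompressed) PCM data rates for the audio modes discussed above.
    def pcm_kbps(bits, rate_hz, channels=2):
        """Data rate in kbps: bit depth x sample rate x channel count."""
        return bits * rate_hz * channels / 1000

    print(pcm_kbps(16, 44_100))   # CD quality: 1411.2 kbps
    print(pcm_kbps(24, 192_000))  # G2's top mode: 9216.0 kbps, ~6.5x CD
    ```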

    I asked what codec the G2 uses, and it turns out this is the latest Qualcomm WCD part, which I believe is the WCD9320 for the MSM8974 platform. LG says that although the previous WCD9310 device had limitations, the WCD9320 offers considerably better audio performance and quality, which enables them to expose these higher quality modes and get good output. The entire audio chain (software, hardware codec, and headphone amplifier) has been optimized for good quality and support for these higher bit depths, I’m told. I didn’t get a chance to listen to line-out audio, but hopefully this emphasis will bear out when we test it.

    The G2 is based on Qualcomm’s latest and greatest Snapdragon 800 SoC, MSM8974 at 2.3 GHz (the higher bin – Qualcomm is launching MSM8974 in two binned flavors at different costs, 2.2 and 2.3 GHz). This is of course the latest SoC built on TSMC’s 28nm HPm process, with 4 Krait 400 CPUs and an Adreno 330 GPU. Alongside that the G2 includes 2 GB of LPDDR3 RAM. LG wasn’t ready for us to run benchmarks, as the prototypes we played with were not yet running stable release software with final tuning, but UI performance felt very speedy just playing around on the device. Of course along with Snapdragon 800 comes LTE-A with carrier aggregation support – the band support for the international version I played with included LTE on bands 1, 3, 7, 8, and 20, and HSPA+ on bands 1, 2, 5, and 8, alongside quad-band EDGE.

    The software platform is Android 4.2.2, and atop that is LG’s skin. LG has added a bunch of new features to its skinned Android experience, although its visual theming remains essentially unchanged. Double tap to turn the phone on and off uses the built-in accelerometer – you just double tap quickly on the device when it’s in an off state to turn it on, and double tap quickly on a blank part of the display or status bar to turn it off. I don’t have a problem getting my index finger to the raised power button, but this is obviously an accommodation in case that’s difficult.

    LG is also including 8 different colors of Quick Window cases with the G2, which offer a small window for glanceable information like the time or notifications. LG was quick to point out that it debuted this feature with the LG Spectrum 2.

    The LG G2 looks like a big step forward from the original Optimus G and includes an impressive list of new features, and it may just be the place we see Snapdragon 800 first. The LG G2 will arrive internationally and on the four major carriers in the USA with the appropriate network band support. More on availability is coming soon, but I would suspect mid-September for at least the international model.

    12:55p
    John Carmack Joins Oculus Rift as CTO

    The Oculus Rift Kickstarter page (and various other places) announced today that John Carmack is joining Oculus as its new Chief Technology Officer. John is one of the biggest names in 3D gaming, having been at the forefront of technology with Wolfenstein 3D, Doom, Quake, and Rage. The fact that he’s interested in the Oculus Rift shouldn’t come as too much of a surprise; in fact, everyone I know who has had a chance to see the technology in action has been impressed. I wasn’t able to see it at CES 2013, but I know Brian stopped by – he mentioned that the transition from the virtual reality environment back to the real world was disorienting, in a good way (i.e. it was much better VR than we’ve seen in the past).

    Of course, this isn’t the first time John has had anything to do with Oculus Rift—he was the first developer to get the Oculus running with a 3D game (Doom 3). In a statement to the community he writes, “I have fond memories of the development work that led to a lot of great things in modern gaming – the intensity of the first person experience, LAN and internet play, game mods, and so on. Duct taping a strap and hot gluing sensors onto Palmer’s early prototype Rift and writing the code to drive it ranks right up there. Now is a special time. I believe that VR will have a huge impact in the coming years, but everyone working today is a pioneer. The paradigms that everyone will take for granted in the future are being figured out today; probably by people reading this message. It’s certainly not there yet. There is a lot more work to do, and there are problems we don’t even know about that will need to be solved, but I am eager to work on them. It’s going to be awesome!”

    Just to be clear, John isn’t leaving id Software for this new position; he will continue his work there, as well as with other companies/projects. It’s also interesting to look at the last id Software release, Rage, and think about what John might have to say regarding gaming performance on the Oculus Rift. Rage basically made itself useless as a benchmark by targeting a maximum frame rate of 60FPS, dynamically adjusting quality to hit 60FPS as best it could and generally succeeding even on relatively low-end hardware. For virtual reality, I can see a constant 60FPS stream of content being far more important than additional graphics quality, so hopefully John can help other developers realize that goal.
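    The feedback loop behind that kind of system is simple in principle: measure what the last frame cost, then trade resolution or detail for time until the 16.7 ms budget is met. Here’s a minimal sketch of the idea; the gains and limits are entirely illustrative, not id’s actual tuning.

    ```python
    # Minimal sketch of dynamic quality scaling toward a 60 FPS budget.
    # Gains and limits are illustrative, not id Software's actual controller.
    TARGET_MS = 1000.0 / 60.0  # 16.67 ms frame budget

    def adjust_quality(render_scale, frame_ms, lo=0.5, hi=1.0, gain=0.05):
        """Nudge the resolution scale up or down from the last frame time."""
        if frame_ms > TARGET_MS:           # over budget: render fewer pixels
            render_scale -= gain
        elif frame_ms < TARGET_MS * 0.9:   # comfortably under: add detail back
            render_scale += gain
        return min(hi, max(lo, render_scale))

    scale = 1.0
    for frame_ms in [14.0, 18.5, 19.0, 16.0, 13.0]:  # hypothetical timings
        scale = adjust_quality(scale, frame_ms)
        print(f"frame {frame_ms:.1f} ms -> render scale {scale:.2f}")
    ```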

    As for the Oculus Rift, with many (over 17,000!) development kits having now shipped to the community, as well as the 1080p HD version shown at E3 2013, we’re getting ever closer to the final hardware. The 0.2.3 SDK is also available, and besides the $2.5 million from the initial Kickstarter campaign, Oculus has brought in a significant amount of additional funding over the past year. There’s still no official release date, but given the progress of the last year I’d expect to see the first consumer release within the next year, and very likely before then. I’m sure they’d love to get on shelves before Christmas this year, but whether or not they can manage that remains to be seen.

    1:47p
    Understanding Panel Self Refresh

    Earlier today Brian spent some time with the G2, LG's 5.2-inch flagship smartphone based on the Qualcomm Snapdragon 800 (MSM8974) SoC. I'd recommend reading his excellent piece in order to get all of the details on the new phone, but there's one disclosure I'd like to call out here: the G2 supports Panel Self Refresh.

    To drive a 60Hz panel, your GPU must present the display with the contents of the frame buffer 60 times per second. Regardless of what's being displayed (static vs. active content), every second there are 60 updates pushed from the GPU to the display. When displaying fast moving content (e.g. video playback, games, scrolling), this update frequency is important and appreciated. When displaying static content however (e.g. staring at the home screen, reading a page of an eBook), the GPU is consuming power sending display updates when it doesn't need to. Panel Self Refresh (PSR) is designed to address the latter case.

    To be clear, PSR is an optimization to reduce SoC power, not to reduce display power. In the event that display content is static, the contents of the frame buffer (carved out of system RAM in the case of a smartphone) are copied to a small amount of memory tied to the display. In the case of LG's G2 we're likely looking at something around 8MB (1080p @ 32bpp). The refreshes then come from the panel's memory, allowing the GPU (and SoC) to drive down to an even lower power state. Chances are the panel's DRAM is also tied to a narrower bus and can be lower power than the system memory used by the SoC, making these refreshes even lower in power cost.
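    The arithmetic behind that 8MB figure, and the per-second traffic PSR removes from the SoC side when the screen is static, is straightforward. A quick sketch (the panel parameters below are this article's assumptions, not confirmed specs):

    ```python
    # Frame buffer size and static refresh traffic for a 1080p @ 32bpp panel.
    W, H, BYTES_PER_PX, HZ = 1920, 1080, 4, 60

    framebuffer_mb = W * H * BYTES_PER_PX / 1e6   # one frame, in MB
    refresh_mb_s = framebuffer_mb * HZ            # pushed even when static

    print(f"frame buffer: {framebuffer_mb:.1f} MB")            # ~8.3 MB
    print(f"static refresh traffic: {refresh_mb_s:.0f} MB/s")  # ~498 MB/s
    ```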

    LG claims a 26% reduction in power when displaying a still image with PSR enabled. I'm curious to see the impact on overall battery life. There are elements of our WiFi web browsing test that could benefit from PSR but it's unclear how much of an improvement we'll see. The added cost of introducing additional memory into a device is something that panel vendors have been hesitant to do, but as companies look to continue to reduce platform power it's a vector worth considering. LG's dual-role as a component supplier and device maker likely made the decision to enable PSR a lot easier.

    PSR potentially has bigger implications for notebook use where it's not uncommon to just stare at a desktop that's not animating at all. I feel like the more common use case in smartphones is to just lock your phone/display when you're not actively using it. 

