Seems like he should have been moved over to the CPU side. Or put above both CPU and GPU.

With the GPU side of things doing much better than the CPU side (at least from a design standpoint), it is disturbing that AMD would let this guy go. Hopefully this won't dull the wonderful competition that we consumers have enjoyed in the GPU industry in recent years.
This likely means fewer dedicated engineers working on high-end parts. When you have limited resources, you have to cut costs to fund your new strategy. Unfortunately, I see either AMD's next Bulldozer budget or AMD's graphics R&D budget being redirected towards low-power and more mainstream devices.
For instance, they can manufacture more versions of the Brazos CPU to better align with tablet/smartphone market needs. They can focus on creating low-powered server CPUs by massaging Bobcat.
The creation of all-in-one chips (something Steve Jobs described in his biography as what Intel didn't want to do) is likely the direction AMD wants to follow. So the focus might be shifting away from high-end GPUs (although I hope not).
It seems even Micron's CEO, Appleton, just announced that they too are moving towards manufacturing chips for mobile devices.
A lot of companies foresee that the next 5-year growth phase will come in the smartphone/tablet space. Of course, discrete GPUs will still have a market, but it won't grow anywhere near as fast.
AMD's GPU division also doesn't appear to be making that much money, given AMD's Q3 earnings report. AMD keeps competing on price for GPUs, which isn't helping their average selling price or profitability. For example, if the HD 7970 launches way ahead of Kepler, then AMD should price such a chip at $499, not $369. Their current GPU strategy is good for market share and mobile devices (i.e., making very efficient GPUs), but it's not helping them sell their GPUs for more money, since they aren't faster than NVIDIA's and hence can't command a premium.
I am sure Read is not happy about selling GPUs at low prices just to keep market share or please us gamers. It will suck for gamers if high-end GPUs are given less priority, but I feel that AMD must raise prices for their GPUs. I mean sure, gamers love it when you can buy an HD6950 2GB for $230 and unlock it into an HD6970, but that's not favourable for business profitability.
I have a feeling Dirk was ousted because he was strongly against the new direction the board wants to pursue, one that likely de-prioritizes the enthusiast server, CPU, and GPU markets in favour of low-priced, higher-volume designs for low-power devices and emerging markets.
The next gen consoles will probably define what GPUs look like going forward. As integrated graphics improve and laptops take over, the market for discrete graphics shrinks in the PC space.

I'm pretty sure the next gen consoles will look a lot like what AMD is going to put into laptops next year. Consoles built as huge, power-hungry beasts may produce better graphics, but even modest next-gen hardware is going to look so much better than the Xbox 360 and PS3 that chasing monster chips is probably not a good strategy for AMD.

Nvidia is probably going to focus on mobile and on turning their chips into CPUs (Project Denver) so they're not stuck selling video cards in a dying market.
The high-end GPU market is more likely to shrink than expand. It's really not a very profitable business, as you can see from AMD's GPU division not making much money this past quarter even though they sold lots of cards.

We're just not a big enough market anymore, and never will be like we were from the late '90s through the mid-2000s. Thus, when the next gen consoles come around, the masses (even including some of us) will surely abandon PC gaming even more, which will make the PC gaming world even less of an incentive. In the long run consoles cost less, are much simpler to program for, and are much easier to deal with when you have to try to keep up with GPU/CPU tech.

Don't be surprised if AMD truly starts to only care about the lower-end/mainstream CPU and GPU markets when it comes to non-server customers. Even the NVIDIA fanboys will have a cow about the cost of future video cards when AMD finally announces what their new strategy will be in the next year or two. (I'm obviously not 100% sure what their strategy will be. It just seems the writing is on the wall.)

It's all about money, period. You can be sure as well that the board is already very upset about selling their low-power IP to Qualcomm, especially as cheaply as they sold it. Something like $12 million, IIRC?
I agree, especially with next gen consoles about a year or so out. The AAA titles are mostly console... I don't have one, but I get the feeling playing a game on a 48-60 inch TV may pull you into the game more than a 20-24" monitor does. No games = no need for high-end GPUs.
I've got my PC connected to my 46 inch Samsung. Runs quite nicely and looks slightly better than my PS3 with only a Radeon 5670.
For one, the mid-range cards are quite powerful, so high-end card sales are weaker; two, the economy sucks, and people are doing without or making do with less.
By that logic you should just buy a long HDMI cable and a wireless keyboard and mouse and play your PC games on your home TV, and you'd save the PC gaming market.
I can't see how letting the high end go would be a good thing. AMD's APUs look good because they have the GPU architecture that their competition doesn't have. With CPU architecture so weak it would make sense to maintain a lead in high end graphics and HPC tech while the CPU guys try to get everything back together. On the other hand I don't know how many high end GPUs AMD sells, maybe being competitive with phone/tablet GPUs and making efficient APUs is just as profitable.
If I read this article right, then there's a chance that AMD will give up on high-end GPUs as well?
I'm no industry expert or anything, but as a hardware enthusiast (and thus de facto tech support guy to everyone I know) I've seen most of AMD's success with GPUs come from delivering a high-end part at a great price.
I guess the traditional wisdom is that high-end parts are for advertising, and mid-range is for making money, but I actually don't know *anyone* who has less than a 6870 in their desktop PC.
Of course I don't know anything about how business is going at AMD, just strikes me as absurd that they would can one of their handful of remaining (visibly) successful product brackets.
"I've seen most of AMD's success with GPUs come from delivering a high-end part at a great price."
And here is why AMD's GPU division has not been very profitable. They have essentially eroded the brand value of ATI graphics by selling them at lower prices. This is actually a recent development; historically, ATI was very successful while selling high-end parts at high prices.

ATI almost never sold its high-end GPUs for low prices to compete with NV, which is why ATI on its own was worth more than AMD is today. The firm's strategy was far more profitable.
Not sure if it's even confirmed that he's leaving (any other source besides Icrontic?), and if he is, is he getting fired or is he leaving on his own?

For example, Patrick Moorhead was on that list, yet AllThingsD reported that he "is leaving the company, and according to people familiar with his plans, will be launching a consumer-focused technology analyst and consulting firm around the time of the International Consumer Electronics Show in Las Vegas in January," so maybe he wasn't forced out.

In any case we'll (most likely) see what has changed in AMD's plans on Feb 2 next year.
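He was included in the layoff; I confirmed with Carrell personally last night.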
I am a solid AMD fan, and every PC in my home runs both AMD CPUs and GPUs. That said, it seems to me that since the Athlon 64 days the majority of AMD's choices have been bad, to the point that AMD is still in business despite its choices and not because of them. AMD should be heavily invested in the ARM SoC market right now, but of course they sold out of that market right when the smartphone market was hitting critical mass. An upgraded 32nm "Phenom 3" core would have made more sense than the turd that is Bulldozer on the desktop. I can't express how bummed I am that I won't be getting a new kick-a@@ AMD CPU and 7xxx-series GPU for Christmas this year. At least my 2.5-year-old Phenom II and overclocked 4850 can still run Skyrim well enough to get me in the game.
As in many other industries, product planners always need a crystal ball and a lot of luck to guess which direction will be best for the future. In technology fields the target can change in a heartbeat when some new technology catches on, be it the internet, social networking, PC appliances or vehicle entertainment systems, just to name a few.

AMD has to figure out where it wants to be in the technology game. Their resources and capabilities must be effectively directed to maximize profits from these areas while keeping their eye on the future and trying to make the best business decisions they can.

One of AMD's biggest historical challenges has been their fabs, and as we see with GloFo and even TSMC, the transition to smaller process geometries is exponentially more difficult to master. From my perspective, AMD's partners need to sort out their production issues as much as AMD needs to focus its energies on its core competence.
It should be interesting to see what direction Read and the board have planned for AMD.
A mobile strategy is going to be difficult. Intel seems like the only company that might be able to make x86 work in tablets and ultramobiles due to their manufacturing lead on the industry. NVIDIA, Qualcomm and TI have shown off working Windows on ARM tablets, so you know they have something in the pipeline that could work in these new form factors. I don't see how AMD is going to "cross the chasm" between 'big' computers and 'small' ones. They don't have an architecture or manufacturing process advantage over anyone.
I agree with everyone else here that pushing out the fellow in charge of product planning for their most successful product seems like a bad move. He might not be the long term future of the company, but they are going to need revenue from his products to invest in their next strategy.
I'm hoping that AMD hasn't decided to start hiring from the same pool of candidates that HP does. Letting go of leaders who execute plans, in a company that hasn't executed well in the CPU division, seems foolish to me. Letting go of huge chunks of your marketing and PR when your main product is weak doesn't make sense either. Who is going to try to persuade people to buy the letdown Bulldozer CPU line?

I was hoping BD would be just a little better so I could go back to buying AMD. I have a Core i5 760, and I am having a hard time making the switch to a BD chip given its power/thermals and not-quite-good-enough performance. I am also thinking they would have been better off with a die shrink and power gating on the Phenom; I could have rationalized that better. I see some promise in BD and am hoping for Trinity now. Same old story: hoping for the next AMD architecture right when they release a new product. I think Brazos and Llano have turned out well, and I look forward to the next iterations.
AMD's biggest strength is their refined (as in, not brute-force) GPU technology cascading down from their high-end parts to their low-profile and now APU-integrated offerings.

If they intend to let go of that strategy, then seeing the failure of BD to assault Intel's reign over the enthusiast market, they won't last that long after the 7xxx series is released.

BD is being somewhat successful in AMD's usual strong range, the budget one, so it could be true to some extent that they intend to shift their focus to the mid-to-bottom markets. If so, say goodbye to 8-core IBs.

Their PR and marketing department had it coming for a while now, but I agree that the only reason for Carrell's departure was some kind of collision with the new CEO's vision for the company.
There must have been a disagreement between Rory Read and Carrell Killebrew, because if there wasn't, then firing him would have been a huge mistake. Carrell proved that he could take the company in a new direction, which is exactly what AMD needs. If AMD can't beat Intel at its game, it needs to shift to beating Intel to the new market. Both AMD and Intel have become complacent, and Rory Read is just seeing this obvious fact. The market is shifting, has been shifting, and will continue to shift. I think part of the problem was that nobody saw the mobile market exploding like it has. We all like our high-end GPUs, but that day and age is coming to an end. As somebody involved in the games industry, all I have to say is: thank goodness (not about Carrell).
Seems quite obvious to me that Carrell and AMD's plans didn't mesh regarding future APUs. Carrell wants to create novelty items while AMD wants to make money. AMD just can't do both right now.
It seems like AMD is betting much harder on their CPU/GPU integration than on the discrete market. So a product planner who wants to push high-horsepower GPU systems with larger and more complex chips probably doesn't fit with what seems to be AMD's vision of a chip that's less CPU/GPU and more of a big integrated package. Intel is pushing in the same direction as more and more of the discrete systems become integrated into the CPU.
In all honesty I can't say that for the mass of the market they are wrong. How many work PCs (mine included) couldn't be handled more efficiently by such a system? Even most home users are the same. If a tablet can do 90% or more of what you want to do on a computer then will most people really need a monster desktop with huge power? If you design smart CPU/GPU combos that can handle the home use tasks well (video playback, transcoding, and power management), then most people will be happy.
I may like my gaming rig and its capabilities, but even I'll admit that the most used PC in my house is my home server, which does DVR duties as well as storage and delivery, and it's headless.
They do not make a profit on Radeons. While the Radeons are something to be proud of, the lack of profit is a shame. Designing GPUs at AMD is a purely idealistic exercise. It does not matter that they are first with GDDR5, or with DX11, or with a new silicon process. What matters is lots of clients and cash, and NV leads in those spaces. Additionally, NV is a more complete graphics company. They make lots of software (CUDA, PhysX, OptiX, lots of other tools) and have strong relationships with game developers, industry, movie studios, academia, etc. AMD has only a good, unprofitable GPU with some drivers and nothing more. Maybe Rory Read wants a new NV inside AMD rather than the old ATI.
Where do you think the GPU side of the APU came from, thin air? It comes from the Radeon side of things, and advancing the state of the GPU means long-term advances in the AMD APUs.
I wonder how Killebrew's departure and the ongoing restructuring will affect the development of next gen GPUs. While Southern Islands is unlikely to be affected by this, the generation after that might be completely different if they ditch the current GPU strategy explained in the slide. Maybe they'll go for an Nvidia-style big chip again?

It would be really sad (and bad) if AMD completely folded high-performance GPUs and CPUs and concentrated on Brazos-class products.
1. AMD's APU Strategy is just a sum of its parts (CPU+GPU)
From Wikipedia:
"Fusion was announced in 2006 and has been in development since then. The final design is the product of the merger between AMD and ATI, combining general processor execution as well as 3D geometry processing and other functions of modern GPUs (like GPGPU computation) into a single die."
The reality is, AMD has done very little in the mainstream to actually leverage GPGPU. Intel has done a better job here with QuickSync. The GPU folks have done a horrible job of letting people use their GPU for anything other than graphics. AMD's video converter offers a limited number of default targets, and there is an extremely limited number of configurable options. To top it off, I'm willing to bet most people with AMD GPUs don't even know how to access the video converter, or even know it's there. (A code sketch of this kind of GPGPU work follows after point 3.)
2. GPU design methods forced upon CPU
Nvidia and ATI use(d) more automated approaches to chip design. When they are releasing new architectures every 18 months, this is pretty much a requirement. This design approach won out between the CPU and GPU designers following the ATI acquisition, and was used for Bulldozer. The result is a chip that requires many more transistors to essentially just match the performance of the previous generation.
I'm guessing the debate between the CPU and GPU folks about the design strategy was an unpleasant one, and given the result, the GPU folks are taking much of the blame.
3. Failure in high-end GL products
Nvidia commands a huge lead in the workstation class GPU segment. AMD is pretty much failing to compete in this high profit area as well. This isn't completely disconnected from the first point. AMD needs to significantly improve the software and drivers associated with their graphics chips.
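To make point 1 concrete, here is a minimal GPGPU sketch in C against the OpenCL 1.x API (the route AMD exposed at the time via its APP SDK). It is illustrative only, not AMD's converter code; error checking is omitted, and the "brighten" kernel is an invented stand-in for real video work:

```c
#include <CL/cl.h>
#include <stdio.h>

/* Trivial per-pixel kernel: clamp-add a brightness delta on the GPU. */
static const char *src =
    "__kernel void brighten(__global uchar *px, uchar delta) {\n"
    "    size_t i = get_global_id(0);\n"
    "    px[i] = (uchar)min((uint)px[i] + (uint)delta, 255u);\n"
    "}\n";

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    unsigned char pixels[4096] = {0};   /* stand-in for one frame of video */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof pixels, pixels, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "brighten", NULL);

    cl_uchar delta = 16;                /* brightness bump applied per pixel */
    clSetKernelArg(k, 0, sizeof buf, &buf);
    clSetKernelArg(k, 1, sizeof delta, &delta);

    size_t n = sizeof pixels;           /* one work-item per byte */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof pixels, pixels, 0, NULL, NULL);

    printf("first pixel after GPU pass: %d\n", pixels[0]);   /* expect 16 */
    return 0;
}
```

The point of the complaint above is that none of this plumbing is exposed to ordinary users in a friendly way; the hardware can do far more than graphics, but the tooling never invited anyone to use it.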
It's sad, but not a bad thing per se. What lots of people seem to forget is what graphics cards were developed to do in the first place: display graphics! I remember playing Carmageddon 2 at <25 FPS and commenting to my friend that he needed a new graphics card because his Voodoo 1 wasn't cutting it anymore. Back then, software always outpaced the hardware.

In the current console age, though, the most money isn't made off gamers, it's made off everybody else. Most games are ported over from the console to the PC, and while the PC gets performance upgrades, the consoles don't, thus limiting what the games can actually do.

And we've seen news reports that the consoles are starting to lose revenue and die out in favor of social media games, which require even less graphical power.

I've only built a monster PC once in my life. It had a Radeon X1900 XT graphics card bought a day after it was released. The entire PC cost around $3500. Back then, the rule was "for every $1100 you spend, you get an extra year of use; at >$3300, you get one year of high settings, one year of medium settings and one year of low settings in games, then you need a new PC." This was about 5 years ago. Three years later, I needed a new PC as predicted. But the PC I bought then I still have now, with a GTX 275, still the most expensive component, but more than $200 cheaper than I paid for the X1900 XT. If this were 5 years ago, I should have had to stop setting everything to max within 6 months. Two years later, BF3 is the most taxing game released in a long time, and it's the first game I can't run on "ultra" anymore. But it runs at >60 FPS on medium, and some settings I can even set to high. In the current age, if I buy a $500 graphics card now it will still run games in a decade. Good for me, yes, but bad for AMD, who won't sell me another graphics card for an entire decade, but still has to develop new high-end parts that entire decade at the rate of two new generations per year!
The reason to have high end parts simply isn't there anymore.
But since everybody seems in agreement that the future of computing is mobile, be it smartphone, tablet or notebook, that would be the smart segment to focus on. It will still be about performance per watt, only at far fewer watts than we're used to. There's much more room to grow, as mobile graphics still suck (while PCs hardly have room left for improvement, really).

This doesn't mean high-end parts will disappear! Once they have a good running laptop design, there's nothing to prevent them from just upping the watts. Remember, Intel's Core CPU lineup came from a CPU designed for mobile, the Pentium M. But it will mean that the focus is no longer on getting the newest, fastest GPU... it will be on getting the one that does more with less.
Nobody is playing serious video games on mobile devices. Perhaps AMD wants to change that.

OK, but before we get to that point, AMD must have an SoC design. Even Intel is struggling in this space, but they have bought many small companies, each providing a piece of the puzzle. In 1-2 years, Intel will have an SoC with lots of features integrated into one single chip. AMD has not even started.

Who wants to partner with AMD to get into this space? I say, who else? Qualcomm, but they don't have to buy AMD. Just send a headhunter to the AMD campus.
You'll find more and more exec boards being taken over by the accountants and finance men. The creatives will be pushed out. It will happen to Nvidia/Intel/Apple etc. in time.

It's all about cutting costs, the bottom line, etc. Excitement and revolutionary products don't come into their way of thinking. It's the money that counts to them, not the 'vision'.

Eventually we will all end up being drip-fed a line of similar, dull, mediocre products from all the big corps.
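Seriously, this always happens. It's like the Walmart syndrome: goodbye to quality and care, hello to bottom line.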
One of the guys laid off is my local AMD rep, who deals mostly with server chips. A 20-year AMD vet, and they laid him off two weeks before Interlagos was released to consumers?
AMD has been making up the costs of the fabs it ditched and is actually doing well, so this layoff is more personal than practical, and I think they are heading in another direction; maybe we'll see the last of them in the mainstream CPU/GPU world. If this happens we can all sit nicely and cut our own throats, those of us, that is, who like cheap CPUs and GPUs.
I thought their mix of the two was a great direction, if they kept it up to where they merge middle-class CPUs with good GPUs, because each offsets the other's weaknesses.

If they merged Bulldozer with the next gen GPU they would have one killer product, if the software was there to take advantage of it. In non-GPU apps the GPU could double as the FPU, where BD is weak, and the CPU could help out with some of the physics that GPU games need. This is just fast guessing, but they had a good direction and are making money; this layoff isn't about profits, and I really regret that they changed CEOs. At least Meyer was dedicated full time to AMD, and I really wonder where the new one's dedications are grounded.

If we lose AMD CPUs (and the hardware sites, this one and others, have done their best to talk people out of buying them before software has been out to show how well they can really do; too many early reviews that don't mesh with that new a CPU structure), and since no one around seems to remember the days when Intel was alone in the 286 era, you will get a taste of it if AMD moves out of this market. Because this is a fact: you WILL be paying as much for a CPU as you're paying for your whole computer, and that isn't counting what Nvidia charges for discrete GPU cards if ATI goes down with AMD. Computers used to be pretty expensive before AMD and Cyrix came on the market, and most people ran Atari, Commodore, and Apple computers. Paying $2,000 or more for a 286 was common, and even not counting inflation that's more than a good one costs today. Wave bye-bye to cheap computers.

I'll forget about using computers, or use one for a very, very long time, because that one will be the last one affordable to buy.

AMD needs to sell good server CPUs and good high-end graphics cards for the high end: not our high end, but CAD servers and higher, as there is big money in those areas.

They have been falling short in the high-end GPU range lately, maybe thanks to the merger with AMD. I wish they never would have merged, because now they are all in one target range to really screw things up if they fail. Oh well, it was fun for a while.

It is way too obvious they cannot keep up with Intel, and they know they can't; they were trying for the mid-range market because an R&D budget is very high and getting a lot higher as process sizes shrink. AMD has worked with IBM and others to offset this somewhat, but will never be able to match Intel, so don't expect their CPUs to come close to Intel's top of the line; expect them more in the mid-range, where with a good production run they can do OK and focus on the long-term merge of the CPU/GPU market, where the future is bright for them and mainstream users. I just hope they have the time, and that this new CEO doesn't really screw things up like it sure sounds like he is doing.

If they can't sell enough or make enough to cover their R&D budget, the game is over.
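It seems like AMD is going to blaze a trail into budget chips and fade away into irrelevancy. I wonder what's about to happen to AMD's stock value? Could this be a good thing for investors?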
They may not fade into irrelevancy, but they might go the way of VIA... they still do x86, but not much is heard about them. They now do low-power CPUs, mostly for embedded systems.

I just hope they can do what they were supposed to do.

Ditch the automation, do a cycle every 18-24 months with high-specced CPUs (not just their FPUs, which used to be great), and then start competing with Intel again.

They should not have sold their ARM part off... but hey, if they really want to, they can get a licence and start again, and do their own design. NVIDIA is doing that already.

Hindsight is great... but getting into the game, and having the guts for tough decisions, is even tougher, especially when you have to stay afloat.

Personally, I would think of diversification... try to get most markets. They already have a good portfolio of products; they just need to optimise that portfolio.

Have at least 3-5 items going, with 2 of them out in front and the others in the back on a slightly more "slacker" time frame, so they can still be in the game, and merge them every 1.5-3 cycles.
An example would be:
Bulldozer + 7xxx series: merge the mid-series, and every 6-8 months create a new SKU with the same or a slightly more optimized version of the CPU, an updated graphics core, and an added suffix. When the GPU is not needed for gaming or BOINC or some other high-demand task, power gate / throttle down and use just a portion of its available power. Same with the CPU. When you browse or use social networking with Flash, you do not need the full power; in that state, the APU (CPU+GPU) can go into a very low state with a power usage of about 15-30 watts, then when you go into another task, rev up to 100+ watts. (A toy code sketch of this idea follows below.)
Southbridge etc.: Make it the most power-efficient it can be. Make the SKUs scalable; not all motherboards need 12-20 USB ports! Chop down lanes, not features! I.e., keep SATA, but not all applications need 6+ ports, or all the USB ports, or even all the PCIe/PCI lanes!
Example:

              Low End   Mid    High End
    PCIe*     8/4       16/8   32/8
    USB**     2/4       4/8    6/12
    SATA***   4/0       6/2    8/4

* PCIe = video lanes / remainder for other ports, like a 4-lane PCIe slot or 2 * 1x slots, etc. High end can be 2 * 16x lanes + 1 * 4x + 2 * 1x lanes.
** USB = 3.0 / 2.0. The 3.0 chips are still more expensive to make, and also more expensive for routing on the board.
*** SATA = internal / external.
Back burner: SoC = server on chip, and also system-on-chip for mobile/tablet. Nearly everything is on the chip, including LPDDR2! This is becoming common practice for ARM. It has everything on it; just peripherals are added. Ideal for "closed" systems like phones and tablets; no southbridge needed.
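A toy sketch in C of the power-state idea above; the thresholds, wattage labels, and interface are all hypothetical, purely to illustrate the throttle-down/rev-up logic being described (a real driver governor would sample utilization continuously and add hysteresis so the chip doesn't flap between states):

```c
#include <stdio.h>

/* Hypothetical APU power states, matching the 15-30 W idle and 100+ W
 * full-tilt figures suggested above. Not a real driver interface. */
typedef enum { STATE_IDLE_15W, STATE_LIGHT_30W, STATE_FULL_100W } apu_state;

/* Pick a power state from sampled utilization (0-100%). */
static apu_state pick_state(int cpu_busy, int gpu_busy) {
    if (gpu_busy > 60 || cpu_busy > 80)  /* gaming, BOINC, encoding */
        return STATE_FULL_100W;          /* rev everything up */
    if (gpu_busy > 5 || cpu_busy > 20)   /* browsing, Flash video */
        return STATE_LIGHT_30W;          /* throttle, clock-gate most blocks */
    return STATE_IDLE_15W;               /* power-gate the idle blocks */
}

int main(void) {
    printf("%d\n", pick_state(30, 90));  /* heavy GPU load -> 2 (full power) */
    printf("%d\n", pick_state(10, 2));   /* near idle      -> 0 (power-gated) */
    return 0;
}
```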
Give it another 2-3 years and most folks will be doing their computing on low-power, light devices, and probably mobile to a degree too.
10 years ago everyone wanted desktops because laptops were a compromise (slow/missing features and ports) and they cost far more.
Now none of my customers want desktops. Its laptops, laptops and more laptops. Yeah I still call them laptops, big whoop wanna fight about it?
Now I'm getting asked about tablets and smartphones. It's all about getting smaller, lighter for normal folks, not bigger and faster.
To 90% of the computing public, Bulldozer and anything over an i5 is pretty pointless.
Once we get smartphones and tablets up to a 2GHz dual core platform with a GPU built in (and it will happen) then that's when AMD/Intel need to worry. Especially if they haven't got anything in the cupboard that fits that bill.
What we have to remember is that the world of computing no longer hangs on the word of the PC enthusiast anymore (if it ever did). We are largely irrelevant to the guys at AMD/Intel etc.
As long as mom and pop can browse the web and look at their photos, and Joe Corporate can work on his spreadsheet in a hotel room or an airport lounge, that's as far as it needs to go.
It seems pretty obvious to me from the actions of game developers that if you want to play games then get a console.
I don't like it either but then I guess the blacksmiths and coach-makers didn't like it when the automobile came along.
I hate to say it, as the proud owner of a 6950 2GB card... but I don't need it anymore. No one will. The PC is dying.

Windows 8 will use the Metro-style interface. Big solid blocks of video crap. Windows 8 will NOT require any extra computing power. In fact, it will require LESS.
The game industry is pissing on PCs - giving us console port crap like Crysis 2. Why bother? Really?
Let's all just go out and buy an iPhone and play Angry Birds and FarmVille. Pull up the old rocker, get a hound dog, and stop innovation.
Everyone is working so hard to kill off the PC. We have no voice.
AMD never really seemed to have interest in high-end graphics. It seems they have just been riding out a leftover ATI design plan (which usually extends 3-5 years). The posted road map and the firing of Killebrew confirm this.
I disagree... I think AMD was really banking on the high-end discrete GPU market, since that's where they can make the most profit... but since the AMD/ATI merger they had to split resources and just could not keep up with Nvidia in the high-end market. It was a simple but hard choice: sell for a lower cost, or lose substantial market share to Nvidia.

I think AMD re-evaluated their approach when they saw the performance of the 8800 GTX and realized it was too hard to compete in that space. It's sad that AMD is pretty much fighting a war on two fronts: Intel for CPU and Nvidia for GPU. It's AMD's innovation that's going to keep them in the industry as a strong competitor.
Sounds more to me like the new CEO is just cleaning house of all the "top brass" and could possibly have cut his own company off at the knees. I mean, the new CEO used to work for Lenovo... aka IBM's now-defunct (and sold to a Chinese corporation) desktop and laptop division. So you really shouldn't expect a lot of common sense and/or advancement from someone in that position.
It does indeed seem the PC is on death's door, as Microsoft, Mozilla, AMD, etc. all seem to be focusing on portable devices now. This appeared to happen suddenly, as if they all changed direction on a whim.

I agree it's bad for gaming, as certain games like StarCraft would be awful on a console, and in addition I use my PC for media, movies, encoding, etc. It has multiple HDDs for tons of storage and so forth; a tiny laptop or tablet or whatever wouldn't cut it. To me they're just toys, a novelty.

Even the server market is going the same way, as cloud computing is the new fad, which means the hope of enterprise hardware being developed to save the day is also dwindling.

Windows 8 is awful, but of course I am using it with a mouse, etc., whilst it's primarily designed for touchscreen portable devices, with the PC an afterthought, a secondary use. The new Xbox interface will be primarily designed for Kinect, and as such won't be too good to use with the gamepad either. Also, the removal of GUI toolbars, buttons, etc. from the browsers, IE and Firefox, whilst not too useful on a PC, has all been done to make them better for small mobile screens.

One thing people forget, though, is office workers, call centres, etc. I can't see companies suddenly migrating to tablets; then again, these companies tend to rarely upgrade and usually use outdated low-end stuff anyway.
Simplify the market into three segments: maintain the high-end GPU market for enthusiasts and gamers, make a competitive mainstream GPU for HTPC/desktop, and create an integrated CPU + GPU for mobile/tablet.

If AMD cannot compete with Intel on processor speed, it should leave the high-end CPU market and create just integrated, energy-saving GPU + CPU parts for personal home servers. This is not the end of the PC, but the evolution of the PC into the early era of cloud computing.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
48 Comments
Back to Article
fic2 - Friday, November 4, 2011 - link
Seems like he should have been moved over to the CPU side. Or put above both CPU and GPU.deputc26 - Friday, November 4, 2011 - link
With the GPU side of things doing much better (at least from a design standpoint) than the CPU side it is disturbing that AMD would let this guy go. Hopefully this won't dull the wonderful competition that us consumers have enjoyed in the GPU industry in recent years.RussianSensation - Friday, November 4, 2011 - link
This likely means less dedicated engineers working on high-end parts. When you have limited resources, you have to cut costs to fund your new strategy. Unfortunately, I either see AMD's Bulldozer Next or AMD's Graphics R&D budget being redirected towards low-power and more mainstream devices.For instance, they can manufacture more versions of the Brazos CPU to better align with tablet/smartphone market needs. They can focus on creating low-powered server CPUs by massaging Bobcat.
The creation of all-in-one chips (Something that Steve Jobs described in his biography as what Intel didn't want to do) is likely the direction AMD wants to follow. So the focus might be shifting away from high-end GPUs (although I hope not).
It seems even Micron's CEO, Appleton, just announced that they too are moving towards manufacture of chips for mobile devices:
http://www.businessweek.com/news/2011-11-03/micron...
A lot of companies foresee that the next 5-year growth phase will come in smartphone/tablet space. Of course, discrete GPUs will still have a market, but it won't grow anywhere as fast.
AMD's GPU division also doesn't appear to be making that much $, given Q3 earnings report by AMD. AMD keeps competing on price for GPUs, which isn't helping their average selling price/profitability. For example, if HD7970 launches way ahead of Kepler, then AMD should price such as chip at $499, not $369. Their current GPU strategy is good for market share and mobile devices (i.e., manufacture of very efficient GPUs), but it's not helping them sell their GPUs for more $ since they aren't faster than NVs, and hence can't command a premium.
I am sure Read is not happy about selling GPUs at low prices just to keep market share or please us gamers. It will suck for gamers if high-end GPUs are given less priority, but I feel that AMD must raise prices for their GPUs. I mean sure, gamers love it when you can buy an HD6950 2GB for $230 and unlock it into an HD6970, but that's not favourable for business profitability.
I have a feeling Dirk was ousted because he was strongly against the new direction the board wants to pursue, because it likely de-prioritizes the enthusiast server, CPU and GPU markets in favour of low-priced, higher volume designs for low-power devices, and emerging markets.
mckirkus - Saturday, November 5, 2011 - link
The next gen consoles will probably define what GPUs look like going forward. As integrated graphics improve and laptops take over, the market for discreet graphics shrinks in the PC space.I'm pretty sure the next gen consoles will look a lot like what AMD is going to put into laptops next year. Consoles that are huge power hungry beasts may produce better graphics but they're all going to look so much better than XBox 360 and PS3 that it's probably not a good strategy for AMD.
Nvidia is probably going to focus on mobile and turning their chips into CPUs (project Denver) so they're not stuck selling video cards in a dying market.
formulav8 - Saturday, November 5, 2011 - link
The Higher-end gpu market will be more likely to shrink instead of expand. Its really not a very profitable business as you can see with AMD's GPU division not being very strong in making money the past 1/4 even though they sold lots of cards.Were just not big enough anymore, and never will be like the market was in the late 90's through the mid-2000's. Thus, when the next gen consoles come around, the masses (even including people from us) will surely abandon pc gaming even more. Which will make the pc gaming world even less of an incentive.In the long run consoles cost less, is much simplier to program for, and is much better to deal with when you have to try and keep up with gpu/cpu tech.
Don't be surprised if Amd TRULY starts to Only care about the Lower End/Mainstream CPU And GPU markets when it comes to non-server customers. Even the nVidia fannys will have a cow about the cost of future video cards when AMD finally announces what their new strategy will be in the next year or 2. (I'm obviouslly not 100% sure what their stategy will be. It just seems that the writing is on the wall).
Its all about money, period. You can be sure as well that the board is already VERY upset selling their low-powered IP to Qualcomm.Especially as cheap as they sold it. Something like $12 Million iirc?
Belard - Monday, November 7, 2011 - link
I agree. Especially with next gen consoles about A year or so out . The aaa titles are mostly console... I don't have one, but I get the feeling play a game on a 48-60 inch tv may pull u into the game more than a 20-24" monitor.No games = no need for high end gpus.
0ldman79 - Monday, November 7, 2011 - link
I've got my PC connected to my 46 inch Samsung. Runs quite nicely and looks slightly better than my PS3 with only a Radeon 5670.For one, the mid range cards are quite powerful, so the high end card sales are weaker, two, the economy sucks, people are doing without or doing with less.
shin0bi272 - Tuesday, November 8, 2011 - link
by that logic you should just buy a long hdmi cable and a wireless keyboard and mouse and play your pc games on your home tv and you'd save the pc gaming market.Zink - Friday, November 4, 2011 - link
I can't see how letting the high end go would be a good thing. AMD's APUs look good because they have the GPU architecture that their competition doesn't have. With CPU architecture so weak it would make sense to maintain a lead in high end graphics and HPC tech while the CPU guys try to get everything back together. On the other hand I don't know how many high end GPUs AMD sells, maybe being competitive with phone/tablet GPUs and making efficient APUs is just as profitable.sjael - Friday, November 4, 2011 - link
If I read this article right, then there's a chance that AMD will give up on high-end GPUs as well?I'm no industry expert or anything, but as a hardware enthusiast (and thus de facto tech support guy to everyone I know) I've seen most of AMD's success with GPUs come from delivering a high-end part at a great price.
I guess the traditional wisdom is that high-end parts are for advertising, and mid-range is for making money, but I actually don't know *anyone* who has less than a 6870 in their desktop PC.
Of course I don't know anything about how business is going at AMD, just strikes me as absurd that they would can one of their handful of remaining (visibly) successful product brackets.
RussianSensation - Saturday, November 5, 2011 - link
"I've seen most of AMD's success with GPUs come from delivering a high-end part at a great price."And here is why AMD's GPU division has not been very profitable. They have essentially eroded the brand value of ATI graphics by selling them at lower prices. This is actually a recent development. Historically, ATI has been very successful despite selling high-end parts at a high-price.
ATI almost never sold their high-end GPUs for low prices to compete with NV, which is why ATI on its own was worth more than AMD is today. The firm was far more profitable in its strategy.
jjj - Friday, November 4, 2011 - link
Not sure if it's even confirmed that he's leaving (any other source besides Icrontic?) and if he is,is he getting fired or he is leaving on his own?For example Patrick Moorhead was on that list,yet AllThingsD reported that he "is leaving the company, and according to people familiar with his plans, will be launching a consumer-focused technology analyst and consulting firm around the time of the International Consumer Electronics Show in Las Vegas in January." so maybe he wasn't forced out.
In any case we'll (most likely) see what changed in AMD's plans on feb 2 next year.
Anand Lal Shimpi - Friday, November 4, 2011 - link
He was included in the layoff, I confirmed with Carrell personally last night.Take care,
Anand
Kamen75 - Friday, November 4, 2011 - link
I am a solid AMD fan and every PC in my home runs both AMD CPU's and GPU's. That said, it seems to me that since the Athlon64 days the majority of AMD's choices have been bad to the point that AMD is still in business despite it's choices and not because of them. AMD should be heavily invested in the ARM SoC market right now but of course they sold out of that market right when the smartphone market was hitting critical mass. an upgraded 32nm "Phenom 3" core would have made more sense than the turd that is Bulldozer on the desktop. I can't express how bummed I am that I won't be getting a new kick-a@@ AMD cpu and 7xxx series gpu for Christmas this year. At least my 2 1/2 year old Phenom ll and overclocked 4850 can still run Skyrim well enough to get me in the game.Beenthere - Friday, November 4, 2011 - link
As in many other industries product planners always need a crystal ball and a lot of luck to guess at what direction will be the best for the future. In technology fields the target can change in a heartbeat when some new technology catches on be it the internet, social networking, PC appliances or vehicle entertainment systems - just to name a few.AMD has to figure out where it wants to be in the technology game. Their resources and capabilities must be effectively directed to maximize profits from these areas while keeping their eye on the future and trying to make the best Biz decisions they can.
One of AMD's biggest historical challenges has been their Fabs and as we see with GloFo and even TSMC the transition to smaller circuit designs is exponentially more difficult to master. From my perspective AMD's partners need to sort their production issues as much as AMD needs to focus their energies on their core competence.
It should be interesting to see what direction Read and the board have planned for AMD.
tspacie - Friday, November 4, 2011 - link
A mobile strategy is going to be difficult. Intel seems like the only company that might be able to make x86 work in tablets and ultramobiles due to their manufacturing lead on the industry. NVIDIA, Qualcomm and TI have shown off working Windows on ARM tablets, so you know they have something in the pipeline that could work in these new form factors. I don't see how AMD is going to "cross the chasm" between 'big' computers and 'small' ones. They don't have an architecture or manufacturing process advantage over anyone.I agree with everyone else here that pushing out the fellow in charge of product planning for their most successful product seems like a bad move. He might not be the long term future of the company, but they are going to need revenue from his products to invest in their next strategy.
eanazag - Friday, November 4, 2011 - link
I'm hoping that AMD hasn't decided to start hiring from the same pool of candidates that HP does. Letting go of leaders who execute plans in a company that hasn't executed well in the CPU division seems foolish to me. Letting go huge chunks of your marketing and PR when your main product is weak doesn't make sense either. Who is going to try to persuade people to buy the let down BDozer CPU line.I was hoping BD would be just a little better so I could go back to buying AMD. I have a Core i5 760 and I am having a hard time making the switch to a BD chip with the power thermals and not quite good enough performance. I am also thinking they would have been better with a die shrink and power gating on the Phenom. I could have reasoned that better. I see some promise with BD and am hoping on Trinity now. Same old story, hoping for the next AMD architecture when they just release a new product. I think Brazos and Llanos have turned out well and look forward to the next iterations.
arjuna1 - Saturday, November 5, 2011 - link
AMD's biggest strength is their refined (as in not brute force) gpu technology cascading down from their high end parts to their low profile and now apu integrated offerings.If they intend to let go of that strategy, and seeing the the failure of BD to assault Intel's reign on the enthusiast market, they won't last that long after the 7xxx series are released.
BD's is being somewhat successful in the usual strong range for AMD, the budget one, so it could true to some extent that they might be intending to put their focus from the mid to bottom markets. If so say goodbye to 8 core IBs
Their PR and marketing department had it coming for a while now, but I agree that the only reason for Carrel's departure was some kind of collision with the new CEO's view for the company.
Only time will tell.
allingm - Saturday, November 5, 2011 - link
There must have been a disagreement between Rory Reed and Carrell Killebrew because if there wasn't then firing him would have been a huge mistake. Carrell proved that he could take the company in a new direction which is exactly what AMD needs. If AMD can't beat Intel at its game it needs to shift to beating Intel to the new market. Both AMD and Intel have become complacent and Rory Reed is just seeing this obvious fact. The market is shifting, has been shifting, and will continue to shift. I think part of the problem was that nobody saw the mobile market explode like it has. We all like our hi end GPUs, but that day and age is coming to an end. As somebody involved in the games industry all I have to say is, thank goodness (not about Carrell).Iketh - Saturday, November 5, 2011 - link
Seems quite obvious to me that Carrell and AMD's plans didn't mesh regarding future APUs. Carrell wants to create novelty items while AMD wants to make money. AMD just can't do both right now.djc208 - Saturday, November 5, 2011 - link
It seems like AMD is betting much harder on their CPU/GPU integration than on the discrete market. So a product planner that wants to push the high horsepower GPU system with larger and more complex chips probably doesn't fit with what seems to be AMDs vision of a chip that's less CPU/GPU and more of a big integrated package. Intel is pushing the same direction as more and more of the discrete systems become integrated into the CPU.In all honesty I can't say that for the mass of the market they are wrong. How many work PCs (mine included) couldn't be handled more efficiently by such a system? Even most home users are the same. If a tablet can do 90% or more of what you want to do on a computer then will most people really need a monster desktop with huge power? If you design smart CPU/GPU combos that can handle the home use tasks well (video playback, transcoding, and power management), then most people will be happy.
I may like my gaming rig and it's capabilities but even I'll admit that the most used PC in my house is my home server which does DVR duties as well as storage and delivery, and it's headless.
TristanSDX - Saturday, November 5, 2011 - link
They do not make profit on Radeons. While the Radeons are proud, lack of profit is shame. Designing GPU at AMD is pure idea. It does not matter they are first with GDDR5, or with DX11 or with new silicon process. What matters is lot of clients and cash - NV leads in this spaces.Additionaly NV is more complete graphics company. They make lot of osftware (CUDA, Physx, Optix, lot of other tools), have strong relationships with game developers, industry, movies studios, academia, etc.
AMD have only good unprofitable GPU with some drivers and nothing more.
Maybe Rory Read want new NV inside AMD, than old ATI.
Targon - Monday, November 7, 2011 - link
Where do you think the GPU side of the APU came from, thin air? It comes from the Radeon side of things, and advancing the state of the GPU means long-term advances in the AMD APUs.george1976 - Monday, November 7, 2011 - link
Numbers, or i will call you a fanboy :)Pantsu - Saturday, November 5, 2011 - link
I wonder how Killebrew and restructuring is going will affect the development of next gen GPUs? While Southern Islands will be unlikely to be affected by this, the gen after that might be completely diffferent, if they ditch the current GPU strategy explained in the slide. Maybe they'll go for Nvidia style big chip again?It would be really sad (and bad) if AMD completely folded high performance GPUs and CPUs and concentrated on Brazos class products.
Tanclearas - Saturday, November 5, 2011 - link
1. AMD's APU Strategy is just a sum of its parts (CPU+GPU)From Wikipedia:
"Fusion was announced in 2006 and has been in development since then. The final design is the product of the merger between AMD and ATI, combining general processor execution as well as 3D geometry processing and other functions of modern GPUs (like GPGPU computation) into a single die."
The reality is, AMD has done very little in the mainstream to actually leverage GPGPU. Intel has done a better job here with QuickSync. The GPU folks have done a horrible job with letting people use their GPU for anything other than graphics. AMD's video converter offers a limited number of default targets, and there is an extremely limited number of configurable options. To top it off, I'm willing to bet most people with AMD GPU's don't even know how to access the video converter, or even know it's there.
2. GPU design methods forced upon CPU
Nvidia and ATI use(d) more automated approaches to chip design. When they are releasing new architectures every 18 months, this is pretty much a requirement. This design approach won out between the CPU and GPU designers following the ATI acquisition, and was used for Bulldozer. The result is a chip that requires many more transistors to essentially just match the performance of the previous generation.
I'm guessing the debate between the CPU and GPU folks about the design strategy was an unpleasant one, and given the result, the GPU folks are taking much of the blame.
3. Failure in high-end GL products
Nvidia commands a huge lead in the workstation class GPU segment. AMD is pretty much failing to compete in this high profit area as well. This isn't completely disconnected from the first point. AMD needs to significantly improve the software and drivers associated with their graphics chips.
TSS - Saturday, November 5, 2011 - link
It's sad, but not a bad thing per se. What lots of people seem to forget is what graphics cards where developped to do in the first place: Display graphics! I remember playing carmageddon 2 at <25FPS and commenting to my friend he needed a new graphics card because his voodoo 1 wasn't cutting it any more. back then software always outpaced the hardware.In the current console age though, the most money isn't made off gamers it's made off everybody else. Most games are ported over from the console to the PC and while the PC gets performance upgrades the consoles don't thus limiting what the games can actually do.
And we've seen news messages that the consoles are starting to lose revenue and die out in favor of social media games, which require even less graphical power.
I've only built a monster PC once in my life. It had a radeon x1900xt graphics card bought a day after it was released. The entire PC cost around $3500. Back then, It was "for every $1100 you spend, you get an extra year of use. at >$3300, you get 1 year of high settings, 1 year of medium settings and 1 year in low settings of games, then you need a new PC." This was about 5 years ago. 3 years later, i needed a new PC as predicted. But the PC i bought then i still have now, with a GTX275. Still the most expensive component, but more then $200 cheaper then i paid for the X1900xt. If it was 5 years ago, i should've had to drop setting everything to max settings within 6 months. 2 years later, BF3 is the most taxing game released in a long time, and it's the first game i can't run on "ultra" anymore. But it runs >60 FPS on medium. Some settings i can even set to high. In the current age if i buy a $500 graphics card now it will still run games in a decade. Good for me, yes, but bad for AMD who won't sell me another graphics card for an entire decade - but still has to develop new high end parts that entire decade at the rate of 2 new generations per year!
The reason to have high end parts simply isn't there anymore.
But since everybody seems in agreement that the future of computing is mobile, be is smartphone, tablet or notebook, that would be the smart segment to focus on. It will still be about performance per watt, only at much less watt's then where used to. There's much more room to grow, as mobile graphics still suck (while PC's hardly got room left for improvement really).
This doesn't mean high end parts will dissapear! once they have a good running laptop design, there's nothing to prevent them from just upping the watt's. Remember intel's Core CPU lineup came from a CPU designed for mobile stuff, the pentium M. But it will mean that the focus is no longer on getting the newest, fastest GPU.... It will be getting the one that does more with less.
nofumble62 - Saturday, November 5, 2011 - link
Nobody is playing serious video game on mobile device. Perhaps AMD wants to change that.OK, but before we get to that point, AMD must have SOC design. Even Intel is struggling in this space. But they have buy many small companies, each provides a piece of puzzle. In 1-2 years, Intel will have a SOC with lot of features integrated into one single chip. AMD has not even started.
Who want to partner with AMD to get into this space? I say who else? Qualcom but they don't have to buy AMD. Just send a head-hunter to AMD campus.
jabber - Saturday, November 5, 2011 - link
As you'll find more and more exec boards will be taken over by the accountants and finance men. The creatives will be pushed out. It will happen to Nvidia/Intel/Apple etc in time.It's all about cutting costs, the bottom line etc. Excitement or revolutionary products don't come into their way of thinking. Its the money that counts to them, not the 'Vision'.
Eventually we will all end up being drip fed a line of similar dull mediocre products from all the big corps.
softdrinkviking - Sunday, November 6, 2011 - link
seriously. this always happens.it's like the wallmart syndrome.
goodbye to quality and care, hello to bottom line.
rtfg - Saturday, November 5, 2011 - link
== ( http://www.hapous.com )====== ( http://www.hapous.com )====
== ( http://www.hapous.com )====
Casper42 - Saturday, November 5, 2011 - link
One of the guys laid off is my local AMD Rep who deals mostly with Server chips.20 year AMD Vet and they laid him off 2 weeks before Interlagos was released to consumers?
YOU SO STUPID
B.Wallach - Saturday, November 5, 2011 - link
AMD has been making up the costs of the fabs it ditched and is actually doing well, this layoff is more personal that practical and I think they are heading in another direction that may be we see the last of them in the main stream cpu/gpu world.If this happens we can all sit nicely and cut our own throats, the ones that is that like cheap cpu's and gpus.
I thought their mix of the two was a great direction if they kept it up to where they merge middle class cpu's with good gpu's because the both offest the other's weakness's.
If the merged the Bulldozer with the next gen GPU they would have one killer product if the software was there to take advantage of it. In non gpu apps it could double as the FPU where the BD is weak and use the cpu to help out with some of the physic the gpu games need. This is just fast guessing but they had a good direction and are making money and this layoff isn't about profits and I am really regretting they changed CEO's. At least Myer's was dedicated full time to AMD and I really wonder just where the new ones dedications really are grounded at.
If we lose AMD CPUs (which hardware sites like this one have done their best to talk people out of buying, with too many early reviews published before software was out to show how well such a new CPU structure can really do), you will get a taste of the days when Intel was alone in the 286 era, days no one around seems to remember. This is a fact: you WILL be paying as much for a CPU as you pay for your whole computer, and that isn't counting what Nvidia will charge for discrete GPU cards if ATI goes down with AMD. Computers used to be pretty expensive before AMD and Cyrix came on the market, and most people ran Atari, Commodore and Apple machines. Paying $2,000 or more for a 286 was common, and even without counting inflation that's more than a good computer costs today. Wave bye-bye to cheap computers.
If that happened, I'd forget about using computers, or keep using one for a very, very long time, because it would be the last affordable one to buy.
AMD needs to sell good server CPUs and good high-end graphics cards, not our high end but CAD workstations and up, as there is big money in those areas.
They have been falling short in the high-end GPU range lately, maybe thanks to the merger with AMD. I wish they had never merged, because now everything is concentrated in one target, ready to really screw things up if they fail.
Oh well, it was fun for a while.
It is way too obvious they cannot keep up with Intel, and they know it; that's why they were trying for the mid-range market, because an R&D budget is very high and gets a lot higher as process sizes shrink. AMD has worked with IBM and others to offset this somewhat, but will never be able to match Intel, so don't expect their CPUs to come close to Intel's top of the line. Expect them in the mid range instead, where with a good production run they can do OK while focusing on the long-term merging of the CPU and GPU, where the future is bright for them and for mainstream users. I just hope they have the time, and that this new CEO doesn't really screw things up like it sure sounds like he is doing.
If they can't sell enough, or make enough to cover their R&D budget, the game is over.
softdrinkviking - Sunday, November 6, 2011 - link
And scary that they would let him go. It seems like AMD is going to blaze a trail into budget chips and fade away into irrelevancy.
I wonder what's about to happen to AMD's stock value. Could this be a good thing for investors?
Aries1470 - Sunday, November 6, 2011 - link
They may not fade into irrelevancy, but they might go the way of VIA... VIA still does x86, but not much is heard about them; they now do low-power CPUs, mostly for embedded systems.
I just hope they can do what they were supposed to do.
Ditch the automated design flow, do a cycle every 18-24 months with high-specced CPUs, not just their FPUs (which used to be great), and then start competing with Intel again.
They should not have sold off their ARM business... but hey, if they really want to, they can get a licence, start again, and do their own design. Nvidia is doing that already.
Hindsight is great... but getting into the game, and having the guts for tough decisions, is even tougher, especially when you have to stay afloat.
Personally, I would think of diversification... try to get into most markets. They already have a good portfolio of products; they just need to optimise that portfolio.
Have at least 3-5 products going, with 2 of them out in front and the others in the back on a slightly slacker time frame, so they can still be in the game, merging them every 1.5-3 cycles.
An example would be:
Bulldozer + 7xxx series: merge the mid-series, and every 6-8 months create a new SKU with the same or a slightly more optimised version of the CPU, an updated graphics core, and a suffix added to the name.
When the GPU is not needed for gaming, BOINC or some other high-demand task, power gate / throttle it down and just use a portion of its available power. Same with the CPU. When you browse or use social networking (with Flash available) you do not need the full power; in this state the APU (CPU+GPU) can drop very low, to a power draw of about 15-30 watts, then rev back up to 100+ watts when you move to a heavier task.
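To make the idea concrete, here is a toy sketch of that kind of load-based power policy (plain Python; every threshold and wattage is invented for the example, not any real AMD mechanism):

# Illustrative only: a toy APU power-budget governor.
# All thresholds and wattages are invented for this sketch.
LOW_POWER_WATTS = 25     # light use: browsing, social networking
FULL_POWER_WATTS = 110   # heavy use: gaming, BOINC-style compute

def pick_power_budget(cpu_load, gpu_load):
    """Pick a target power budget from recent utilisation (0.0 to 1.0)."""
    if cpu_load < 0.25 and gpu_load < 0.10:
        return LOW_POWER_WATTS    # power-gate GPU blocks, drop clocks
    return FULL_POWER_WATTS       # rev everything back up

A real version would live in firmware or the driver and need some hysteresis so it doesn't flap between states, but the shape of the decision is the same.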
SouthBridge etc:
Make it the most power efficient it can be.
Make the SKUs scalable; not all motherboards need 12-20 USB ports!
Chop down lanes, not features! That is:
Keep SATA, but not every application needs 6+ ports, or all the USB ports, or even all the PCIe/PCI lanes!
Example:
          Low End   Mid     High End
PCIe*     8/4       16/8    32/8
USB**     2/4       4/8     6/12
SATA***   4/0       6/2     8/4
* PCIe = video / remainder for other slots, e.g. a 4x PCIe slot or 2 * 1x slots. The high end could be 2 * 16x + 1 * 4x + 2 * 1x lanes.
** USB = 3.0 / 2.0. The 3.0 chips are still more expensive to make, and also to route on the board.
*** SATA = internal / external.
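Purely as a sketch of what scalable SKUs could look like, the tiers above could be captured as one design with lanes fused off per tier (the numbers come from my table; the structure itself is just illustrative):

# Hypothetical lane/port budgets per SKU tier, mirroring the table above.
# One southbridge design; cheaper tiers simply enable fewer lanes/ports.
SKUS = {
    "low":  {"pcie_video": 8,  "pcie_other": 4, "usb3": 2, "usb2": 4,  "sata_int": 4, "sata_ext": 0},
    "mid":  {"pcie_video": 16, "pcie_other": 8, "usb3": 4, "usb2": 8,  "sata_int": 6, "sata_ext": 2},
    "high": {"pcie_video": 32, "pcie_other": 8, "usb3": 6, "usb2": 12, "sata_int": 8, "sata_ext": 4},
}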
Backburner:
SoC = make a server on a chip.
SoC = also a system-on-chip for mobile/tablet. Nearly everything is on the chip, including LPDDR2! This is becoming common practice for ARM; it has EVERYTHING on it, and just the peripherals are added. Ideal for "closed" systems like phones and tablets. No southbridge needed.
Anyway, these are just my thoughts.
jabber - Sunday, November 6, 2011 - link
It's just not going our way. Give it another 2-3 years and most folks will be doing their computing on low-power, light devices, and probably mobile ones to a degree too.
10 years ago everyone wanted desktops because laptops were a compromise (slow/missing features and ports) and they cost far more.
Now none of my customers want desktops. It's laptops, laptops and more laptops. Yeah, I still call them laptops; big whoop, wanna fight about it?
Now I'm getting asked about tablets and smartphones. For normal folks it's all about getting smaller and lighter, not bigger and faster.
To 90% of the computing public, Bulldozer and anything over an i5 is pretty pointless.
Once we get smartphones and tablets up to a 2GHz dual-core platform with a built-in GPU (and it will happen), that's when AMD/Intel need to worry. Especially if they haven't got anything in the cupboard that fits the bill.
What we have to remember is that the world of computing no longer hangs on the word of the PC enthusiast (if it ever did). We are largely irrelevant to the guys at AMD/Intel etc.
As long as mom and pop can browse the web and look at their photos, and Joe Corporate can work on his spreadsheet in a hotel room or an airport lounge, that's as far as it needs to go.
It seems pretty obvious to me from the actions of game developers that if you want to play games then get a console.
I don't like it either but then I guess the blacksmiths and coach-makers didn't like it when the automobile came along.
MadAd - Sunday, November 6, 2011 - link
What we need is another killer app for the PC, one that you just can't do on wind-up-toy PC tablets.
Shinobisan - Monday, November 7, 2011 - link
I hate to say it, as the proud owner of a 6950 2GB card... but I don't need it anymore. No one will.
The PC is dying.
Windows 8 will use the Metro style interface. Big solid blocks of video crap.
Windows 8 will NOT require any extra computing power.
In fact, it will require LESS.
The game industry is pissing on PCs - giving us console port crap like Crysis 2.
Why bother? Really?
Let's all just go out and buy an iPhone and play Angry Birds and FarmVille.
Pull up the old rocker, get a hound dog, and stop innovation.
Everyone is working so hard to kill off the PC.
We have no voice.
Wreckage - Monday, November 7, 2011 - link
AMD never really seemed to have much interest in high-end graphics. It seems they have just been riding out a leftover ATI design plan (which usually extends 3-5 years). The posted roadmap and the firing of Killebrew confirm this.
Glibous - Monday, November 7, 2011 - link
I disagree... I think AMD was really banking on the high-end discrete GPU market, since that's where they can make the most profit, but since the AMD/ATI merger they had to split resources and just could not keep up with Nvidia in the high-end market. It was a simple but hard choice: sell for a lower cost or lose substantial market share to Nvidia.
I think AMD re-evaluated their approach when they saw the performance of the 8800 GTX and realized it's too hard to compete in that space. It's sad that AMD is pretty much fighting a war on two fronts: Intel for CPUs and Nvidia for GPUs. It's AMD's innovation that's going to keep them in the industry as a strong competitor.
shin0bi272 - Tuesday, November 8, 2011 - link
Sounds more to me like the new CEO is just cleaning house of all the "top brass", and he may well have cut his own company off at the knees. I mean, the new CEO used to work for Lenovo... aka IBM's now-defunct desktop and laptop division (sold off to a Chinese corporation). So you really shouldn't expect a lot of common sense and/or advancement from someone in that position.
chrcoluk - Wednesday, November 9, 2011 - link
It does indeed seem the PC is at death's door, as Microsoft, Mozilla, AMD etc. all seem to be focusing on portable devices now. It appeared to happen suddenly, as if they all changed direction on a whim.
I agree it's bad for gaming, as certain games like StarCraft would be awful on a console. On top of that, I use my PC for media, movies, encoding etc.; it has multiple HDDs for tons of storage and so forth. A tiny laptop or tablet wouldn't cut it; to me they're just toys, a novelty.
Even the server market is going the same way, as cloud computing is the new fad, which means the hope that enterprise hardware development will save the day is also dwindling.
Windows 8 is awful, but of course I am using it with a mouse etc., whilst it's primarily designed for touchscreen portable devices; the PC will be an afterthought, a secondary use. The new Xbox interface will be designed primarily for Kinect, so it won't be too good to use with a gamepad either. Likewise, stripping GUI toolbars, buttons etc. out of IE and Firefox, whilst not much use on a PC, is all being done to make them better for small mobile screens.
One thing people forget, though, is office workers, call centres etc. I can't see companies suddenly migrating to tablets; then again, those companies tend to rarely upgrade and usually run outdated low-end stuff anyway.
mbryans - Tuesday, November 22, 2011 - link
Simplify the market into three segments: maintain the high-end GPU market for enthusiasts and gamers, make competitive mainstream GPUs for HTPCs/desktops, and create an integrated CPU + GPU for mobile/tablet.
If AMD cannot compete with Intel's processor speed, leave the high-end CPU market and create just the integrated, energy-saving CPU + GPU for personal home servers. This is not the end of the PC, but the evolution of PCs into the early era of cloud computing.