Promises "new and enhanced capabilities that will change how you interact with your PC".
Maybe most systems are "ripe for refresh", being 4+ years old, because Intel stopped making CPUs around that time and started making GPUs with a CPU bolted on the side.
I still live in hope that one day I'll be able to buy a new CPU without two-thirds of the silicon going to waste.
You can: it's called Socket 2011, or Xeon. Unless you're purely on about mobile CPUs.
Personally I can't wait to see what MS decides to stick in the Surface Pro 4; hopefully Iris, not HD Graphics. I still say the SP3 was designed with these in mind and Intel screwed them by being late to the party.
Isn't there a cheaper alternative to the Extreme CPUs? I thought that some of the Xeon processors were built on the same architecture as the Core i-series CPUs but without the iGPU, which means they won't have features like Quick Sync etc., but would perform the same in all other respects.
Quick Sync annoys me because I can't use it with Handbrake since I have a discrete video card installed. Wish they had a "disable onboard graphics except Quick Sync" option.
You say that like they stopped improving the CPUs. They constantly improve; they just include a GPU in the package too. It's not like you lost something by them including the GPU.
I'm about due for a new CPU; probably when the next ones come out I'll get an i7, as I wanna switch to 4K gaming and need a new motherboard anyway because this one is PCIe 2.0. I'm thinking of waiting for the 980 Ti, though, as the 980 seems to struggle with its 256-bit bus.
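For what it's worth, the 256-bit worry is easy to put numbers on. A back-of-envelope sketch: the 7 GT/s GDDR5 rate is the stock 980's spec, and the 384-bit figure is just an assumed wider-bus comparison, not a claim about any announced card.

```python
# Back-of-envelope GDDR5 memory bandwidth:
# bus width (bits) x effective data rate (GT/s) / 8 bits-per-byte = GB/s.
def mem_bandwidth_gbps(bus_bits, data_rate_gtps):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits * data_rate_gtps / 8

# GTX 980: 256-bit bus, 7 GT/s effective GDDR5
print(mem_bandwidth_gbps(256, 7.0))   # 224.0 GB/s
# A hypothetical card with a 384-bit bus at the same data rate
print(mem_bandwidth_gbps(384, 7.0))   # 336.0 GB/s
```

So a wider bus at the same memory clock buys you 50% more raw bandwidth, which is roughly what 4K framebuffers are hungry for.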
It's late, and crap.
These chips will NOT be socketed, so no upgrades: BGA format.
x4 PCIe is naff for GPU bandwidth, and it's also wired via the southbridge. Bandwidth? M25 rush hour, I'd say.
If you want it in a tablet or ultrabook, sure. Anything else? Forget it.
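To put the "M25 rush hour" jibe in numbers, here's a rough sketch of per-lane PCIe throughput. The per-lane figures are the usual approximations after encoding overhead (8b/10b for Gen 1/2, 128b/130b for Gen 3); treating the chipset link as Gen 2 x4 is an assumption.

```python
# Approximate usable one-direction bandwidth per PCIe lane, in GB/s,
# after line-encoding overhead (8b/10b for gen 1/2, 128b/130b for gen 3).
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985}

def link_bandwidth(gen, lanes):
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A GPU hanging off a Gen 2 x4 link via the chipset,
# versus a proper Gen 3 x16 slot off the CPU:
print(link_bandwidth(2, 4))   # 2.0 GB/s
print(link_bandwidth(3, 16))  # 15.76 GB/s
```

Roughly an eight-fold gap, before you even account for sharing that chipset link with storage and USB traffic.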
Hmmm. Originally Posted by Article: "new and enhanced capabilities that will change how you interact with your PC".
Well, based on the previews, I'm not seeing anything that'll "change how I interact" with my PC. Firstly, 4% productivity improvement? I don't believe for a nanosecond MY productivity will improve at all as a result. Why? Because the PC isn't the bottleneck now. Indeed, in the vast bulk of my work, the current processor's quad cores are idle about 90% of the time. Given that the bulk of my 'productivity' is fairly basic WP, a bit of spreadsheeting, accounts and a smidge of email, ANY processor in the last 10 years or more was capable of handling my needs standing on its head, both hands tied behind its back, whilst humming Yankee Doodle Dandy. A 4% improvement isn't threatening any underwear-wetting on my part.
3D graphics? Well, it MIGHT enhance my game-playing a bit, but the games industry has done such a stellar job of driving me away that most of my 3D game playing is done with an R/C helicopter these days, and my PC processor won't make any difference at all to that.
Video conversion? If I was doing that extensively, then great. But I rarely watch video on PC, bar the odd YouTube clip, and don't do much editing with video, and what I do do is on a machine that isn't even the fastest I have now, and I've felt no need to upgrade it, or even migrate to the faster system. It just .... works. Does what I need.
Battery life? Well, that sounds appealing, but "change how I interact"? That might be over-egging the cake a bit.
In other words, it sounds to me like the usual marketing hype, masking .... not really that much.
Quite a number of computing technologies have reached the point where, for just about ANYTHING I currently do, what I have now is simply good enough. It's like .... how many dots does my inkjet printer produce? If it's got to the point where I can't see them, and it's not making a visually obvious (or at least detectable) difference to the print, why would I care? Same goes for SLR camera pixel count. Other things, like low-noise operation, might matter, but frankly, 8-10 MP is good enough for me. Is 20MP, 50MP, a gazillion MP better? Probably. But am I forking out hard cash to upgrade? Unlikely.
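The printer/camera point is easy to sanity-check with arithmetic. A quick sketch, assuming (as is commonly cited) that around 300 DPI is about the limit the eye resolves at normal viewing distance:

```python
# Megapixels needed for a print at a given size and DPI.
# ~300 DPI is a commonly cited limit of what the eye resolves
# at arm's length; beyond that, extra pixels are invisible in print.
def megapixels_needed(width_in, height_in, dpi=300):
    """Pixel count (in millions) to print at the given size and DPI."""
    return width_in * dpi * height_in * dpi / 1e6

print(megapixels_needed(6, 4))    # 2.16 MP for a 6x4" snap
print(megapixels_needed(10, 8))   # 7.2 MP for a 10x8" enlargement
```

Which is exactly the "good enough" point: an 8-10 MP sensor already out-resolves anything short of poster-sized prints.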
So unless there's a lot to 5th Gen Core that isn't in 4th Gen Core, or frankly in a Q6600 Gen, then my reaction is ho-hum, yawn. What I've got is perfectly good enough, until/unless something comes along that really does offer something dramatically new and useful, that turns the cost/benefit analysis of upgrading into a truly justifiable benefit. And, so far at least, I'm just not seeing it from that announcement.
The days of every new generation of chip making real, useful performance improvements seem to me to be a thing of the past. My current PC platform is likely to get replaced/upgraded when it dies, but probably not before.
I am, as might be obvious, totally underwhelmed. And, unlike in some past times, I no longer suffer even from mild upgraditus.
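The underwhelm is easy to quantify: at 4% per generation, compounding takes a very long time to add up to anything that feels like an upgrade. A quick sketch (the "doubling feels like an upgrade" bar is my assumption, not anything from the article):

```python
import math

# How many generations of compounding gains until performance doubles?
# doubling condition: (1 + gain) ** n = 2  =>  n = ln 2 / ln(1 + gain)
def gens_to_double(gain_per_gen):
    """Generations needed to double performance at a per-generation gain."""
    return math.log(2) / math.log(1 + gain_per_gen)

print(round(gens_to_double(0.04), 1))   # ~17.7 generations at 4% a go
print(round(gens_to_double(0.40), 1))   # ~2.1 generations at 40% a go
```

At a generation every 12-18 months, 4% gains mean a couple of decades to double, versus a few years in the bad old days of big per-generation jumps.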
Yeah you can get xeons for desktop boards, I think the issue was aimed at wasted silicon though. You pay for a chip but you only want half of it and to get one where that "wasted" silicon is used for extra CPU power, you have to buy into the workstation and server platform.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
I understand why you people are so annoyed, but would you make a rectangular CPU every time? Most likely the CPU die is too small for the number of pins (it's not completely human-engineered).
So they have to occupy the remaining space somehow, and integrated GPUs will become more powerful with each generation: maybe an 8% increase in performance for each one (20% for the CPU and 28% with the GPU).
That's why AMD has better integrated graphics.
I'd rather have more cores and/or cache than integrated gfx.
------------------
Valar Morghulis
So either really expensive Extreme processors, or server-grade CPUs with more cores than your average Joe needs.
Sure, they have improved the CPU part, but are you seriously saying a 4% improvement is on par with the improvements we saw before the days Intel decided to become a GPU manufacturer?
Take another look at the die shot: two-thirds of the die is now GPU, just like on an AMD APU. It was largely wasted space for me when AMD did it (and they at least have decent drivers), but now Intel makes me buy silicon where only a third of it is functional to me, as there is no way I am relying on Intel graphics.
Or put another way: if they made the die smaller by cutting that out, and offered the non-GPU-infested chip for half the price, would you be interested? Half price, mind; you could put that towards a GPU. Or at least you could if you buy Haswell, as these parts aren't capable of driving an external GPU of any note thanks to the crippled PCIe.