Core i9 and Quadro at the helm.
Last edited by Tarinder; 14-11-2017 at 10:14 AM.
that is a monster
Those are some very very serious numbers :-)
Originally Posted by Advice Trinity by Knoxville
Oh I do love these sorts of 'biased' reviews.... Now, I may be a little more critical of these types of reviews because this is the sort of system I use daily for my work.
It would have been better to compare the GeForce system using the same CPU as the Quadro system, because some of those results are likely CPU-limited, let alone the difference that twice the RAM and a 'faster' SSD (scratch/temp files etc.) can make.
There's also a surprising lack of Autodesk software used in the review. Or is that because, apart from Maya, they use DirectX these days, and the Quadros are no better than their equivalent GeForce cards there....
I'm also surprised there are no V-Ray/V-Ray RT type benchmarks either, and I know the benchmark software available (here's the benchmark) is heavily used in 3D rendered graphics these days.
Looks pretty cool, but it would be better if the front panel were black.
TBH, I didn't make it past the first paragraph of the review for the sole reason of this:
Run in a uniprocessor environment, there really is no good reason for many workstation users to look past the Core X series or, for that matter, the latest performance Threadripper chips from rival AMD.
While this statement holds true for AMD's ThreadRipper (still hate this name), it in no way holds true for Intel's Core X. No professional in their right mind would be using a Core X series processor to do serious work, because it can't handle ECC memory. Validity of data is of paramount importance in business use.
Linux use wasn't even covered in the review; maybe that should have been covered too, because we all know how 'temperamental' Linux is with certain hardware combinations.... But it does depend on usage: all the ones I use run on Windows due to the software they use, not to mention the monopolistic aspect of some of the 'side software' I need to use, such as Adobe, which is Windows-only.
You'd think so, wouldn't you? But I've seen lots of serious workstation use done on HEDT machines. Most of the time it comes down to budget: In large enterprises you might be able to convince central IT to drop the necessary cash on Xeon/ECC/Quadro, but in your standard SME, or particularly in academia where the money is coming straight out of your research grant, the vast majority of "workstation" systems will be using a Core whatever and GeForce/Radeon.
Of course, a lot depends on what your "workstation" is doing. I get the feeling you're focusing very much on number-crunching and scientific calculations, and sure, that's a classic "workstation" workload that very much depends on accuracy and reproducibility. It could realistically take several days to produce a results set on that kind of workload, and you *really* don't want the risk of a single-bit memory error trashing days of work. ECC makes explicit sense.
But what if you're developing a system to allow VR visualisation of scanned archaeological finds? Your data isn't changing, there's no "analysis" involved as such, so that requirement for ECC really isn't as critical. If you're a 2D CAD technician producing transport plans, how much difference is ECC going to make to your workflow? What benefit is it really going to provide? Those are the kind of workloads that really blur the distinction between workstation and desktop (as the first line of the review says).
Sure, there are situations where you can make an explicit case for ECC memory being a requirement. But the less damage a random memory error will cause, the lower the risk of not using ECC. There's plenty of workflows where ECC simply isn't needed, so other considerations - cost, performance, etc. - come into play first.
Besides, people like having a workstation. It sounds good. So even if you think they've really got a glorified desktop, let them have their little glow - as long as the tool they're using is appropriate to the job, it doesn't really matter.
If your time is worth money, you should be using ECC RAM. I really think it is that simple. If my machine crashes whilst playing Elite, I will be slightly annoyed. If it crashes losing an hour of work due to a memory bit flip, then I am probably out of pocket compared to the small incremental cost of buying ECC RAM. If a program silently spits out misleading results due to a bit flip, then that could waste days before it gets spotted.
In my case, my Elite-playing machine even has ECC RAM and no overclock, as the risk of instability whilst working is not worth a few frames whilst gaming.
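To put rough numbers on that argument - a back-of-envelope sketch only, with all figures (rates, crash frequency, ECC premium) being hypothetical assumptions rather than real prices:

```python
# Back-of-envelope comparison: one-off ECC RAM premium vs the expected
# annual cost of time lost to undetected memory errors.
# All figures below are hypothetical assumptions for illustration.

hourly_rate = 40.0          # value of an hour of staff time (GBP)
ecc_premium = 50.0          # assumed extra up-front cost of ECC RAM (GBP)
crashes_per_year = 2        # crashes attributable to memory errors
hours_lost_per_crash = 1.0  # unsaved work lost per crash

expected_loss = crashes_per_year * hours_lost_per_crash * hourly_rate
print(f"Expected annual loss without ECC: £{expected_loss:.2f}")
print(f"One-off ECC premium:              £{ecc_premium:.2f}")
print("ECC pays for itself" if expected_loss > ecc_premium else "ECC costs more")
```

And note this only counts crash time - it puts no price at all on the silent-corruption case, which is the expensive one.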
It depends on where the choke points are in your budget, how long your processes take to run, how quick they are to re-run, and how good your staff are at remembering to save often!
Thing is most staffing budgets over-provision, so losing an hour to a PC crash is already budgeted for anyway (after all, it's nigh on impossible to build an uncrashable PC). In many cases, staffing and hardware purchases will come from separate budgets anyway, so the potential saving in staff time doesn't even factor into it.
Sure, there are good reasons for getting ECC. But there are also cases where its absence isn't critical. The issue isn't that there are non-ECC workstations available, it's making sure people a) understand the cases where that's not an issue, and b) understand how to mitigate the risks. I once kitted out an analysis team with non-ECC workstations, and I can't remember one instance in three years of any of them coming back to me having suffered a random crash. *shrug*
Horses for courses. Yes, ECC has benefits. There are plenty of situations where I would only recommend ECC. I just don't think the lack of it makes a workstation a complete non-starter; there are other factors to consider.
EDIT: to add some meat to my points.
Based on a quick play with HP's configurator, at 16GB, dropping to non-ECC RAM means you can add >10% clock speed *and* Hyper-Threading to your processor choice (Xeon E3-1245 v6 vs Xeon E3-1225 v6). That's a big difference in performance, which could easily save more time per year than you'd lose due to any issues experienced from not having ECC.
To be fair, that's comparing two Xeon processors (the lower-end Xeons are actually really good value), but I suspect at the higher core counts you'd be looking at similarly priced Xeons and Core X having not dissimilar performance deltas. As with all things it depends on your type of workload, but I think it's a valid point that having a faster processor instead of ECC could easily pay off beyond the simple up-front cost...
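That trade-off can be sketched roughly too. This is illustrative only: the compute hours and error-downtime figures are hypothetical assumptions, and it naively assumes runtime scales inversely with clock speed:

```python
# Rough illustration of the faster-CPU-vs-ECC trade-off described above:
# time saved over a year by a ~10% higher clock on CPU-bound work, against
# time lost to occasional memory-error crashes on a non-ECC box.
# All inputs are hypothetical assumptions, not measurements.

compute_hours_per_year = 1000.0   # hours spent in CPU-bound work per year
speedup = 0.10                    # assumed clock-speed advantage
hours_lost_to_errors = 2.0        # assumed annual loss to memory errors

# A job that took T hours now takes T / (1 + speedup) hours.
hours_saved = compute_hours_per_year * (1 - 1 / (1 + speedup))

print(f"Hours saved by the faster CPU: {hours_saved:.0f}")
print(f"Hours lost without ECC:        {hours_lost_to_errors:.0f}")
```

On those (made-up) numbers the faster chip wins by a wide margin, but flip the assumptions - short jobs, silent corruption wasting days - and the answer flips with them.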
Last edited by scaryjim; 14-11-2017 at 01:08 PM.
But saving often just means you saved your silent memory corruption into a file. That is the problem, it is the silent nature of memory corruption that makes it so damned nasty.
I guess we are back to naming as well; to me, if you choose a lower-end Xeon then it really isn't a workstation. The 160W Xeon under a couple of kilos of heatsink and a massive fan is a given; memory requirements start at 32GB, with some people I know requiring (not a nice-to-have or optional, but an absolute requirement) 128GB.
I think that might be the key point. Is a CAD workstation a "workstation"? Ask the CAD tech who uses it and they'll tell you it definitely is. The 3XS under test explicitly calls itself a Vis Workstation, and targets content creators. Is it a "workstation"? *shrug* Probably depends on who you ask....
Some years ago I used to use CAD workstations. They were high end Unix machines that cost as much as a house and a build quality way above any PC.
Overclocking a desktop doesn't make it a workstation, but hey, we get the same nonsense elsewhere in life. Lots of people with lukewarm hatchback cars think they are driving a sports car because they have never tried the real thing.
Those Unix machines you talk of were likely SGI machines, and I've had the (dis)pleasure of working with them too; by the time I got to them, x86 was considerably faster and, in my experience, just as stable...
In regards to the 3XS WI600, I think you need to look at its target market. This is an 'arch viz workstation' (so 3D rendering etc.) aimed more at the small business/sole trader than at large-scale companies. Those buyers can't afford the top-end 'workstation hardware', which can cost 5x the price of this (trust me, I checked the cost of my 'ideal workstation machine' and it was around 20k...), and something like this is enough for them to do their work without issues (although IMO you could get an equal or better machine for its target market for less than 4k).
Arch viz doesn't really need ECC RAM. It does, however, need processing speed for non-GPU rendering, where the more cores and the higher the clock speed the better. While I don't personally like highly overclocked rigs for 24/7 rendering, if you do the cooling etc. correctly you should have no issues, and in some cases a small CPU overclock can help with minor bottlenecks.
They were actually Intergraph. It was a long time ago; they did switch to Intel at the end and made dual-Pentium machines, which felt sluggish compared to the single Clipper core they used before, and then, for lots of reasons, it all went down the pan.
I just had a look on Scan: their cheapest 8GB DDR4-2400 RAM stick is £83.48, but their cheapest unbuffered ECC stick at the same speed is £82.99, so it's actually *cheaper* to have the extra RAM chip on there.
A quick history lesson for the youngsters here: The IBM PC originally had memory chips plugged in in banks of 9 bits, that gave 8 bits plus parity. That allowed the PC to detect that it had seen a memory error and tell the user about it, a "business class" feature that the likes of the Apple II, PET and TRS-80 couldn't do. Some years later memory became wide enough that someone realised if you added some logic you could use all those parity bits in parallel to do error correction as well as better error detection.
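The 9-bits-per-byte scheme described above is easy to demonstrate. A minimal sketch (the function names are mine, purely illustrative): store an even-parity bit alongside each byte, and any single-bit flip is detected - though not corrected, and a double flip slips through, which is exactly the gap ECC later closed.

```python
# Sketch of the original PC's 8-bits-plus-parity memory scheme: each byte
# is stored with one even-parity bit, allowing single-bit errors to be
# detected (but not corrected).

def parity(byte: int) -> int:
    """Even-parity bit: 1 if the byte has an odd number of set bits."""
    return bin(byte).count("1") % 2

def store(byte: int) -> tuple:
    """Return the byte together with its parity bit (the 9th bit)."""
    return byte, parity(byte)

def check(byte: int, stored_parity: int) -> bool:
    """True if the byte still agrees with its stored parity bit."""
    return parity(byte) == stored_parity

data, p = store(0b10110010)
assert check(data, p)               # intact memory passes the check
corrupted = data ^ 0b00000100       # flip a single bit
assert not check(corrupted, p)      # parity catches the single-bit error
double = data ^ 0b00000101          # flip two bits...
assert check(double, p)             # ...and parity is fooled
```

That last case is the motivation for the error-*correcting* codes (SECDED and friends) that the post goes on to mention: correct any single-bit error, detect any double-bit error.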
That is why this stuff annoys the heck out of me: Intel expects you to pay through the nose for memory protection when that was part of the original PC spec. Even the ECC support is now *really* old tech. I struggle to think of any other part of the PC which has gone backwards in spec.