Insiders can test it on their Surface Pro X or similar Snapdragon PC.
Could I theoretically install Win10 on my S20+ (Exynos) and then try to launch the League of Legends x86 client?
Seems like a significant first step at least, as far as x86/64 is concerned in the prosumer space?
Yes, it's technically possible provided you have an unlocked bootloader, a compatible build of Windows and the necessary drivers, but it's just not realistic. Samsung use a mix of ARM and PowerVR GPUs, which do not have good Windows support.
You can install ARM Windows on the Raspberry Pi 4; it works, but without GPU support it's not all that useful. https://www.reddit.com/r/WoRPi/
I want to see someone Boot Camp a new Mac with the M1/2 chip in there and see how Windows 10 runs on that, then install some x86 stuff and see how that works with a decent chip powering it.
Interesting idea. I suspect Apple's proprietary hardware will be the barrier without specific drivers. Why do you say a "decent" chip? The Apple M1 is very exciting and impressive, but it's not as if the competition is terrible. I'll be very interested to see where they go with it but, ultimately, I'd say I'd never touch them (partly) as it'll all be locked down and designed to not work properly unless used precisely as intended.
Imagine where ARM could be in the PC market if Microsoft had prioritized this for Windows 8 on ARM, and allowed general emulation of x86/x64 PC applications. Windows 8 on ARM flopped largely because you couldn't run traditional Windows applications, so there wasn't much point in buying it. So the users weren't there, which meant developers didn't bother to port much of anything to native Windows on ARM. If Microsoft had made emulation and legacy support a priority in the first place, they might have had users, solving the chicken-or-egg problem.
It's still worth something now, but it's a steeper climb with WoA already having the veneer of having failed once before.
They use some tricks, yes, but for many people other, better solutions exist. There are quite a few nice 6-core ARM SoCs out there from the likes of Amlogic, and some really nice SBCs now run Win 10 quite nicely, but GPU acceleration, as said above, is often the sticking point. Debian is getting there and Android is even better. Many SBCs now play YouTube 4K back really smoothly, which is one test often used.
Old puter - still good enuff till I save some pennies!
If Apple's M1 laptops prove popular going forward (and I see no reason why they won't, with more powerful unreleased models still to come) and more and more people see they can get such excellent battery life, then laptop makers might well start to move thin-and-light/ultralight x86 laptops towards, if not being ARM, at least having an ARM option.
Microsoft convincing developers to recompile for WoA is the key, though. If all they can offer is Windows + Office, then beyond business users it won't have much appeal.
Windows on the Mac M1 has already been done (using QEMU virtualisation, iirc). It was a beast: 60%+ faster than the Surface Pro X with the Qualcomm chips.
https://www.youtube.com/watch?v=YZtNyoqOlss
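For the curious, that kind of setup boils down to booting the ARM64 build of Windows under QEMU with Apple's Hypervisor.framework as the accelerator. A minimal sketch of such an invocation follows; the firmware and disk-image file names are placeholders I've assumed (not taken from the video), and exact flags vary by QEMU version:

```shell
# Hypothetical QEMU launch for a Windows 10 ARM64 guest on an M1 Mac.
# QEMU_EFI.fd (AArch64 UEFI firmware) and Windows10_ARM64.vhdx are
# placeholder file names.
qemu-system-aarch64 \
  -machine virt,highmem=off \
  -accel hvf \
  -cpu host \
  -smp 4 \
  -m 4096 \
  -bios QEMU_EFI.fd \
  -device ramfb \
  -device qemu-xhci \
  -device usb-kbd \
  -device usb-tablet \
  -drive file=Windows10_ARM64.vhdx,format=vhdx,if=none,id=disk0 \
  -device nvme,drive=disk0,serial=win10disk
```

The key bit is `-accel hvf` with `-cpu host`: the guest's ARM64 code runs natively on the M1 rather than being emulated, which is why it can outpace the Surface Pro X.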
A beast? It's a brand-new chip versus chips that are about two years old in design. The M1 throws transistors at the problem and incorporates DRAM on the SoC package. Nearly all of that 60% is right there in the better process, clock speed and the integrated DRAM. I'm not saying the M1 is a slouch, but nobody else has tried to do a chip like it, and only Apple are in the position, with their large sales volumes, to make it worthwhile. Plus it's the only chip available in volume on TSMC's 5nm process.
I keep seeing this, but to be fair you have to throw transistors at exactly the right part of the problem or it actually makes things worse.
You could make a core with 20 integer pipelines; the scheduler wouldn't issue enough instructions to fill them, and the heat and wiring implications would lower the overall clock rate. You could give each core a 128MB L1 cache, and hits would be so slow that performance would decrease. This stuff has to be carefully and holistically balanced, so whilst I still have close to zero chance of buying an Apple product, I have to acknowledge a nice bit of engineering there.
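The cache point can be put in rough numbers with the standard average-memory-access-time formula (AMAT = hit time + miss rate × miss penalty). The cycle counts and miss rates below are illustrative assumptions, not measurements of any real core:

```python
def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time in cycles: every access pays the hit
    time, and a miss additionally pays the miss penalty."""
    return hit_time + miss_rate * miss_penalty

# Small, fast L1: 4-cycle hits, missing 5% of the time to 100-cycle memory.
small_l1 = amat(4, 0.05, 100)     # 9.0 cycles on average

# Enormous 128MB "L1": almost never misses, but the structure is far too
# big to probe quickly, so assume every hit costs 30 cycles.
huge_l1 = amat(30, 0.001, 100)

print(f"small L1: {small_l1:.1f} cycles, huge L1: {huge_l1:.1f} cycles")
```

Even with a 50x lower miss rate, the slower hit time dominates and the giant cache loses; that is the "carefully and holistically balanced" problem in miniature.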
Yeah, I get that they're on 5nm and that gets them more available transistors. But let me put it this way: you can give me as much paint and ceiling as you want, and I will never create something like the Sistine Chapel. Pure lack of talent. All the paint in the world could be wasted on me, and you would be lucky to get one giant monkey Jesus at the end of it.
I might think that, as a design, Bulldozer got an overly rough ride and some more development could have knocked out a decent product; but if you look at the list of problems it had, I don't see a single one that would have been fixed by simply scaling up with more transistors. BD was already a big design, and yet not competitive.
Using billions of transistors might be an opportunity, but it is hard, and I think people are way too dismissive. At extremes of scale even GPUs have been shown to be hard, as the likes of Intel's Larrabee demonstrated, and those are probably the easiest cases of replicating lots of blocks; even that is useless if you can't feed them or get the basic blocks wrong.
Hahahaha, yes, but in this case they have removed a lot of things as well. The Mac Mini and M1 laptops have limited IO, no external RAM interface and limited PCIe lanes. They've removed the DRAM interface and integrated the DRAM onto the chip package. They have Thunderbolt lanes but, because of what's been chopped, not full Thunderbolt connectivity. They've removed a lot of ways to communicate with the outside world, and the USB implementation is slow. Now, if someone else tried that in, say, the PC ARM space, they'd be derided, but as it's Apple nobody looks at that. So the M1 is useless outside the niche market that Apple have created for themselves over the last couple of years. Heck, even the Pi has loads more IO available - and yes, I know that's because it has 40 pins available...
I like the M1, but it's only useful to Apple and what they are trying to do. For me, there aren't enough USB ports, for starters, for anything I'd probably want to do (the MacBook Pro has 2 x USB-C ports... and one is for charging).