Both, I guess. What if you suddenly add a folder with 50,000 objects to a drive? Is it going to take a long while to index? Either way, there are some improvements to be made in the performance area, e.g. Linux is faster with NTFS. Going from memory: indexing is not a core part of NTFS, so tighter integration could increase performance.
I guess it depends on whether you believe that indexing is a function of the OS or the filesystem, doesn't it? Not that it makes any real difference, as this thread is about OS features, but... how often do you add 50,000 objects to your disk in one fell swoop? I'm genuinely interested.
Thing is one is about project management.
When you get the error report in Windows, you get a mini-dump, giving basically the stack. This gets an ID attached to it; when submitted to MS, they use simple heuristics to pick out which problems are likely to be cheap to fix and have a big impact on the user experience.
All very good imo.
Unless we are talking about COM exceptions, which are all 0x80028012.
But COM is just a horrible horrible horrible horrible thing. Which is still miles better than any major non-managed alternative.
throw new ArgumentException (String, String, Exception)
Large image/audio/document/game/etc. collections... I think my drive is at around 500,000 files, and a lot of stuff isn't even extracted.
Cuffz,
WTF are you talking about?
You want to be able to add 50,000 objects quickly, yet have them fully indexed?
Do you want the moon on a stick whilst we're at it?
Yes, things like MLC SSDs are showing great possibilities; given the random nature of sector access, we could have to rethink the way we build data on HDDs.
But the fact remains that's not a normal operation, and if people want to do that sort of oddity they will have to weigh up the merits of indexing, and disable it.
This is NO different to working with a database, even the strange column store ones.
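The database parallel above can be made concrete. A common pattern for large bulk loads is to drop the index before inserting and rebuild it afterwards, rather than paying the per-row index maintenance cost. A minimal sketch using SQLite (table and index names are purely illustrative):

```python
import sqlite3

# Sketch of the bulk-load trade-off: drop the index, insert 50,000
# rows, then rebuild the index once at the end. This is usually far
# cheaper than maintaining the index row by row during the insert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT)")
conn.execute("CREATE INDEX idx_name ON files (name)")

rows = [(f"file_{i}.dat",) for i in range(50_000)]

conn.execute("DROP INDEX idx_name")                     # disable indexing
conn.executemany("INSERT INTO files VALUES (?)", rows)  # bulk insert
conn.execute("CREATE INDEX idx_name ON files (name)")   # rebuild once
conn.commit()
```

The same reasoning applies to filesystem search indexes: pause or disable the indexer for the bulk operation, then let it catch up in one pass.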
Out of interest I've just set off a complete re-index of my Vista box - that's nigh on 2.5 TB of data, showing something in the region of 160,000 files (including a USB disk, which I'd expect to slow performance). I'll let you know how long it takes while the machine is under normal load (nothing heavy, just regular surfing etc.)
It would also be good to know how much the machine's performance deteriorates whilst you're trying to surf.
I'm remoted into the machine in question over LogMeIn, so I can't hear the churn of the disks (though I'd expect it to be noisier than usual), but neither core seems to be taking a massive hit and just over half of my 8 GiB of RAM is being used. The CPU is actually Speedstepped down at its current level.
What happens when the drives are accessed by multiple machines on the same network? Does Windows have to index the drives more than once, i.e. does every machine build its own index of that drive?
Yes, and just a simple home network shared drive.
Well... unless I'm much mistaken, networked drives aren't indexed by default anyway, due to the crazy amount of traffic and server load it would generate. File indexing on my Vista box is coming along nicely, sitting at 35,682 files indexed so far (or an average of about 1,050/minute). It may just be me, but that really doesn't seem too bad. Add to that the fact that I have what I regard as a lot more data than the average user (who I guess the indexing will be targeted and balanced for) and... I dunno: I don't think there's much to grumble about.
I guess I just like all these things to work really fast; indexing is just one piece of the performance puzzle. Once there is a really nice abstraction layer in Windows (if ever), any performance needs will be met, I think. Perf and abstraction go hand in hand.
From http://technet.microsoft.com/en-us/l.../cc772446.aspx, regarding indexing networked locations:
So as long as the server has the shares indexed, then under Vista or Server 2008 you should be able to make use of the index. If it's not been indexed, then it falls back to traditional search. You learn something new every day, I guess.

Querying from Windows Vista or Windows Server 2008
To query a remote computer, users use Windows Explorer to browse the shared, indexed folder on another machine and enter their searches in Explorer’s search box. If the location is not indexed, then Vista falls back to a slower GREP search instead of WS4.
Querying from Windows XP or Windows Server 2003
To query a remote computer, users select the location from their All Locations menu and enter their search query as usual. First, of course, they must add the remote location to their search scope:
From the Windows Search UI, click the All Locations menu and select Add Location.
Enter the full path of the location, or browse to the location.
Once added, the new location appears at the bottom of the All Locations menu allowing users to select that location to search in. In the same way, users can remove a location by selecting Remove Location. If the remote location is not indexed, a message appears advising users that the location cannot be searched.