Windows Defender is defending advertisers.
That's a real sensationalist headline ("Windows 8 hosts won't block Doubleclick ads or Facebook"). It will block them fine; you just need to change a setting to allow it.
The hosts file should have been locked down a long time ago, given that redirects are put into it by any malware not written by an idiot.
Nice of Microsoft to give the user control over their browsing requirements
Am I the only one who reads the above and comes to the conclusion that Windows Defender is "broken"? Preventing malware from changing the hosts file seems like a darn good idea, but couldn't they have implemented some form of check-in/check-out system so you, as an informed user, could make your edits in peace and have WD accept them as genuine: "yes, I really mean that!"? As quoted above: "Even if you edit your hosts file and write protect it, once you open a web browser it will be restored to unblock the above sites. While you can get around this by turning off Windows Defender ... Windows Defender protects your hosts file to stop malware changing it. Malware often changes the hosts file to redirect your browsing to dodgy websites and to lock you out of helpful antivirus vendor sites."
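The restore behaviour described above is easy to verify yourself: fingerprint the hosts file right after editing it, then check the fingerprint again after opening a browser. This is a minimal sketch, not anyone's official tooling; the path shown is the standard Windows hosts location, and the function names are my own.

```python
import hashlib
from pathlib import Path

# Standard Windows hosts file location; adjust for other systems.
HOSTS_PATH = Path(r"C:\Windows\System32\drivers\etc\hosts")

def hosts_fingerprint(path: Path) -> str:
    """Return a SHA-256 hash of the hosts file contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def entries_survived(path: Path, before: str) -> bool:
    """Re-hash the file; False means something (e.g. Defender) rewrote it."""
    return hosts_fingerprint(path) == before
```

Take `snapshot = hosts_fingerprint(HOSTS_PATH)` after your edit, browse for a bit, then call `entries_survived(HOSTS_PATH, snapshot)` to see whether Defender quietly reverted you.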
A user at GHacks.net has suggested that Windows 8 users who want to keep Windows Defender yet be able to edit the hosts file can actually exclude the file from “protection”.
My suspicious mind also has a problem with the fact that advertisers seem to figure prominently amongst the list of "do not touch" sites...
99% of users don't know what a hosts file is. Anyone who does will know how to add an exclusion, or will Google it to find out. If an option did pop up, you'd just get most users hitting "yes".
The advertisement servers are probably being added because some of the less harmful malware adds redirects to different ad servers to earn its writers more money. Given that most pages also carry ads, the redirects can double as a way to check for updates to the malware.
Anything that gets high traffic ultimately is probably on the list. I doubt there is a sinister motive here.
Sounds like it's actually doing as intended and probably as it should for the vast majority of users - i.e. monitoring the hosts file and removing redirects which would be highly likely to impact your web experience or be used to replace common pages with malware replacements.
If you're a virus writer and you want to make sure the zombie is checking in then you could DNS spoof the most common ad provider URLs to send ads in pages to your command and control server, simples.
If anything, Defender is not doing ENOUGH (given the number of entries left untouched above). For 99% of users the hosts file should be devoid of custom entries, and Defender should enforce this, while possibly allowing hosts entries to be set via a secure mechanism buried in its own UI, and doing what some anti-spam tools do: adding localhost entries for known bad/dangerous URLs. Oh, and add a nice friendly comment to the file letting power users know what's going on...
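The enforcement suggested above is simple in principle: on a stock Windows install every line of the hosts file is a comment, so any active line is a custom entry a Defender-style monitor could flag or reset. A hedged sketch of that check (the function name and the all-comments-by-default assumption are mine, not Microsoft's):

```python
def custom_hosts_entries(hosts_text: str) -> list[str]:
    """Return active (non-comment, non-blank) lines from hosts file text.

    On a stock Windows hosts file every line is commented out, so
    anything returned here is a custom entry a monitor might flag.
    """
    entries = []
    for line in hosts_text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            entries.append(stripped)
    return entries
```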
Dangerous ground, this. Surely with UAC there should be a mechanism by which Windows Defender can know that a change has been intentionally made by the user. Maybe even a way of identifying that the entry is pointing to 127.0.0.1 and is therefore highly unlikely to have been done by malware.
A lot of security software monitors the hosts file, often warning the user if anything changes it. Upon reading the headline it does sound quite worrying, but it's nothing new; as others have said, it should be doing more really.
Just because it points to 127.0.0.1 doesn't mean it's legit, in theory the malware could host its own web server locally, especially if the writers knew localhost entries remain untouched.
I use SpybotSD to monitor my hosts file. I'm not a massive fan of Windows Defender, as getting a new version seems to require a new operating system.
Home/small business routers have the ability to block websites/IPs, but none that I'm aware of lets you upload a list; instead you have to sit there forever entering new sites to block one at a time. It would be really handy to be able to upload a text file to your router, because then you wouldn't need to worry about individual PCs' hosts files: everything behind the router would be blocked (this doesn't help with portable equipment, but for small businesses or multiple desktops at home it would be useful). Routers with space limitations could use USB/SD memory for large text files/databases.
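No router I know of defines a standard upload format, but generating one from an existing hosts-style blocklist is straightforward: strip the redirect addresses and keep only the hostnames. A sketch under that assumption (the one-domain-per-line output format is hypothetical, chosen for illustration):

```python
def hosts_to_domain_list(hosts_text: str) -> list[str]:
    """Extract blocked hostnames from hosts-style lines such as
    '127.0.0.1 ads.example.com', skipping comments and localhost.

    The result is a plain domain-per-line list a router firmware
    could, hypothetically, import as a blocklist.
    """
    domains = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        parts = line.split()
        if len(parts) >= 2 and parts[0] in ("127.0.0.1", "0.0.0.0"):
            for host in parts[1:]:
                if host != "localhost" and host not in domains:
                    domains.append(host)
    return domains
```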
Good point, watercooled. It may be a necessary evil then. I'm sure other methods of blocking such sites will fill in the gap left by this change.
ABP.
One more reason to use 0.0.0.0 "black hole" redirects instead of the 127.0.0.1 loop-back address on systems that support them. It's also a lot faster (since that address doesn't exist) and uses fewer resources, as the system won't try to establish a connection to localhost, firing all kinds of network-aware events and triggering locally installed software. Many users run web servers and/or update services listening on specific ports, and 127.0.0.1 redirects would try to establish connections with those services. Too many connections to localhost without a port number can create all kinds of problems, including extremely long log files and random system crashes if certain advanced SYN-flood or DDoS detectors are installed and block incoming ports on the loop-back address as a result of too many requests. Do check that your system supports 0.0.0.0 redirects, though, before using them for all the DNS targets you'd like to block in your hosts file! Cheers!
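Migrating an existing blocklist to the black-hole style described above is a mechanical rewrite. A hedged sketch (my own helper, not a standard tool) that swaps 127.0.0.1 blocking entries for 0.0.0.0 while leaving the real localhost entry and any comments untouched:

```python
def blackhole_hosts(hosts_text: str) -> str:
    """Rewrite 127.0.0.1 blocking entries to the 0.0.0.0 'black hole'
    address, leaving the localhost entry itself and comments alone."""
    out = []
    for line in hosts_text.splitlines():
        parts = line.split()
        if (len(parts) >= 2 and parts[0] == "127.0.0.1"
                and parts[1] != "localhost"):
            # Replace only the leading address, keep hostnames/comments.
            out.append(line.replace("127.0.0.1", "0.0.0.0", 1))
        else:
            out.append(line)
    return "\n".join(out)
```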
Last edited by howdee; 20-08-2012 at 04:42 PM.