
Thread: Windows - how to use it more securely

  1. #1
    Ex-MSFT Paul Adams's Avatar
    Join Date
    Jul 2003
    Location
    %systemroot%
    Posts
    1,926
    Thanks
    29
    Thanked
    77 times in 59 posts
    • Paul Adams's system
      • Motherboard:
      • Asus Maximus VIII
      • CPU:
      • Intel Core i7-6700K
      • Memory:
      • 16GB
      • Storage:
      • 2x250GB SSD / 500GB SSD / 2TB HDD
      • Graphics card(s):
      • nVidia GeForce GTX1080
      • Operating System:
      • Windows 10 x64 Pro
      • Monitor(s):
      • Philips 40" 4K
      • Internet:
      • 500Mbps fiber

    Windows - how to use it more securely

    Lock Your Doors & Secure Your Windows

    This is a quick intro to a subject I have dealt with professionally for a number of years, and one that is surrounded by rumours, myths & misinformation - IT security.

    As is my wont, I have focused on the most commonplace OS - Windows - although some of what I shall cover is also applicable in general terms.
    I shall begin by setting the scene and giving a background on "how things used to be" to illustrate how differently we use computers today.


    A Brief History Of Time
    Before the Internet really took a hold and became a daily presence in companies and households, there was little focus on security for anyone other than banks taking care of their "core systems".

    People had become used to the concept that computer networks and high speed interconnectivity were for universities, and their home computers were inviolate and entirely self-contained with a specific set of functions.

    Software packages were purchased to perform specific, rigid tasks and new features implied upgrades - "patches" were not commonplace.
    This may lead to the assertion that software written today is not as well written or tested, but the reality is that both the OS and applications did not have a fraction of the capabilities that they have (and are expected of them) now.

    Viruses were not rampant because if they had a destructive payload they invariably wiped themselves out along with the data, and they most commonly got transferred via floppy disks (people swapping freeware, shareware or pirated software).

    So what changed?
    When Windows 95 arrived it was obvious it was a huge improvement over "Windows For Workgroups 3.11" (WFW), not only in terms of the user interface (the right mouse button actually DOES something??) but also in the concepts it introduced, such as the registry and plug & play (PnP).

    Windows 95 improved upon WFW by also having much better networking and internetworking support - now it was possible for users to buy modems, sign up with an Internet Service Provider (ISP) and pay through the nose for per-minute Internet access in addition to their subscription fees.
    The days of free CDs offering Internet connectivity software mounted on magazine covers had arrived (thanks to AOL we had a never-ending supply of coasters, frisbees and things to melt for fun).

    Where previously we would use our standalone computers to do word processing, put our finances into spreadsheets, write programs or play single-player games, now we had the ability to connect with millions of computers around the world and enjoy pornography in every language.

    If Windows found a networking device then it would bind TCP/IP and file & printer sharing to it so that everyone in your "workgroup" could share files easily - very convenient.

    Life was good, the era of the "dot com" was upon us and the future was looking just peachy.

    But the convenience of home computing came with its risks - any user of a Windows machine was "God", and any process which launched was able to achieve anything the user could (and I am talking about permission rather than skill).


    The Times, They Are A-Changin'
    The "9x" family of Windows still lacks 3 major things:
    - users and privileges
    - a file system while allows for privacy and security of its contents
    - a protected kernel

    Fundamentally this branch of Windows was really a 32-bit shell on top of 16-bit DOS rather than an operating system in its own right (renaming win.com and booting the machine drops you at a command prompt, where "ver" reports MS-DOS v7.x).
    This meant it was subject to the stability of the underlying 16-bit OS and its legacy drivers.

    If the machine booted up you were left at the desktop without any request to log on and with full access to every file on the system, and every application you ran had access to all memory areas, so a program could easily "bluescreen" the system.
    Convenient, simple, but far from secure or allowing for privacy.

    I mentioned briefly before the automatic binding of "File and Printer Sharing for Microsoft Networks" to all networking interfaces, including modems.
    This convenience was very useful for local networking - as there was no concept of "users", everyone had the same level of permission.

    Unfortunately it also meant that anyone who could find your public IP address when you were connected to the Internet could also connect to any of the shared resources too, including some "administrative" ones built into the system.
    My introduction to computer security started with my discovery of the "File and Printer Sharing" issue - luckily it was a friend who demonstrated its exposure to me, by creating a folder on my desktop while we were chatting on IRC.

    While unbinding the service from dial-up adapters was trivially achieved, how many people knew that it needed doing, or how to do it?


    Experience Is Something You Get Just After You Need It
    The Internet was to provide the solution to faulty design implementations and buggy code, at the same time as creating the risks these things presented.
    "Patches" and "service packs" have been used for years in corporate environments - Windows NT 4.0 is now out of its support lifecycle but it peaked at "Service Pack 6a", and anything less than this service pack level is considered unstable.

    A service pack is, for the most part, simply a collection of patches bundled into a single deployment - though sometimes service packs can also introduce new features.

    A patch, in reference to Windows, is not actually a fix that needs applying to a file, but it is a replacement for the entire file - you install the "patched version" of 1 or more files.

    All operating systems are patched frequently; this is not something exclusive to Windows.

    "Windows Update" was introduced to take the responsibility away from the user to seek out patches that they need to apply by reading magazines, web pages or KB articles.

  2. #2
    Ex-MSFT Paul Adams's Avatar
    95 + NT = ?
    Windows 2000 was the combination of the 9x and NT lines.
    Windows 9x brought the user interface, PnP and DirectX, while NT brought the security model - the protected kernel, the concept of users and the NTFS file system.

    While they merged to form a common OS, there were 2 main flavours developed - one focusing on user workstations (Professional) and one on servers (er...Server).
    The roles of the different flavours are principally governed by configuration settings - workstations want to give priority to the interactive applications running on the desktop, while servers tend to focus on background activities.

    There are a few different versions of the Server product to fine-tune the capacity to meet the requirements at the right cost - whether the bias is on number of CPUs, amount of memory, disk space, network throughput, etc.

    Now we had an OS at home which had the ability to provide users with a more secure environment to work and play in, as well as extending the concept of domains into "Active Directory".

    For companies there was now an extensible environment with a corresponding set of tools to control workstations and users centrally - deployment of software, logon scripts, security hardening, removal of user control over specific functions.

    For home users there could be an Administrator account to install software and a collection of separate User accounts with their own separate profiles (desktop, file area, sound effects) which were inherently protected for privacy.

    Unfortunately, "could" is the key word.
    Convenience still remained foremost in so many users' minds - in order to install "Shiny New Software v4.5" they didn't want to have to log off, log on as an Administrator, install the software, log off, log on as a User - it was too much like hard work.
    To compound this, software developers were often lazy and wrote software which assumed or required administrative privileges - so your application would refuse to run unless you were logged in as an Administrator.

    You can take a horse to water, but you can't make it drink...


    The Day Today
    So we know the current workstation Windows OS is XP (Home or Professional depending on requirements), and the server OS is Windows 2003 (again in various shapes to suit the different environments customers have).
    Both are significant improvements in security and performance on Windows 2000, which peaked at Service Pack 4.

    XP received Service Pack 2 at the cost of delaying the next version of Windows (which we now know as Vista), due to the ever-increasing focus that computer anarchists were putting on "owning" Windows systems - even spammers were exploiting the legacy Messenger service intended for local network popup messages.
    The Messenger service was disabled, and Windows Firewall was given more publicity and enabled by default - so now programs which want to act as a "server" raise an alert and require the user's permission, while unsolicited connection attempts from untrusted networks are ignored.
    Also, "Automatic Updates" is turned on to keep the OS informed of the latest patches, so the user's involvement in checking for updates is even less of a burden.

    Windows 2003 may look similar to 2000 but there have been many tweaks under the surface, and it received its first Service Pack in March 2005, which also introduced more security-oriented features including more restrictions on Remote Procedure Calls (RPCs).
    Windows 2003 "R2" is also imminent; it will require 2003 SP1 as a base and will provide enhanced & revamped features again - including the much-improved File Replication Service (FRS).


    Why so many patches? Why can't it be done right first time, every time? Why do patches take so long to come out for known issues?
    Anyone involved in a large-scale project involving millions of lines of code, where modules have dependencies on one another and "impact analysis" is a process that takes hundreds of man-hours, will be able to tell you that it's not as simple as you think.
    Every patch created by Microsoft goes through intensive testing, and a fix for Windows could be across multiple versions, with multiple service pack levels, on multiple hardware platforms, in multiple languages.
    A patch that is rushed out and has a bug could prove disastrous on a global scale, and what if that patch rendered home PCs unbootable, or unable to connect to Windows Update to get an updated version of the patch?

    The Windows WMF vulnerability exploit received a lot of attention and was the first out-of-band patch released for Windows since Microsoft moved to the monthly patch cycle - the fix was to a kernel mode (GDI) process, and kernel mode code in particular has to be very, very stable.
    The idea that it was a deliberate ploy to implement a backdoor in every copy of Windows is a cry for attention by a well-known but not so well-respected self-acclaimed "security guru".
    It is what is known as spreading Fear, Uncertainty and Doubt (FUD). Or unbelievable bull****, take your pick.


    Drink, Damn You Horse, Drink!
    For going on 6 years now we have had the opportunity to practise safer computing by using the computer as a restricted "User" and only elevating our privileges to "Administrator" to install software or make system changes - hopefully not that frequent a task.

    How difficult is this task?
    When logged on as a "normal" user in Windows XP you can right-click an icon and select "Run As", then select an Administrator account and authenticate.
    That process (and any process it spawns) runs under the context of the Administrator, while every other process you launch has your security level.
    If more users did this then the impact to them and everyone else of viruses and worms would be greatly reduced.
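
    For anyone who prefers the command line, the same thing can be done with the built-in runas.exe. Here is a minimal sketch of wrapping it (assuming Python happens to be installed on the box; "Administrator" is just a placeholder account name, not a recommendation):

    Code:
    import subprocess

    def run_as_admin(command, account="Administrator"):
        # runas.exe prompts for the account's password, then starts the
        # command in that user's security context; everything it spawns
        # inherits that context, the rest of the desktop does not.
        return subprocess.call(["runas", "/user:" + account, command])

    if __name__ == "__main__":
        run_as_admin("cmd.exe")

    The Python wrapper is only there to show the principle - typing "runas /user:Administrator cmd.exe" into Start > Run achieves exactly the same thing.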

    Windows Vista is taking a different approach - if we can't convince users to run as users, and developers to write software that works correctly without administrative privileges, then we shall have the concept of "User Account Protection" (UAP) or "Limited User Account" (LUA).
    If you are interested in more detail then read here: http://www.microsoft.com/technet/win...at/uaprot.mspx
    You can also still right-click icons and select "Run Elevated" to manually specify an alternative user context.


    That was a very rapid run-through of how the way we design, implement and use computers (or specifically Windows) has changed over the last decade, but ultimately we need to change the way we think and work, and not assume that the solution is something we can just buy, install and forget about.

  3. #3
    Ex-MSFT Paul Adams's Avatar
    Security is a process, not a product
    There are several ways in which security can be improved upon a base OS and with online services:
    - log on as a restricted user for day-to-day activity
    - use one "on-access" anti-virus program
    - use a "software" personal firewall
    - use a router with Network Address Translation (NAT) and/or "hardware" firewall features
    - use complex passwords and do not use the same password on multiple websites

    This is known as "layered" security - picking 1 of the options is not going to give you security, you should try to adopt as many as you can.

    Using a user account with restricted privileges I have already covered. Anti-virus may seem like a "given", but you should use one that uses a filter driver to intercept I/O operations and prevent infected files from ever reaching the hard disk.
    In tests I ran, AVG Free (which I had been using for years) did not immediately detect a test virus signature when I downloaded it, even in executable form - it was saved to disk and I could move it around for a few minutes before it warned me, and even then it reported an old location for the file.
    Reliable real-time virus scanning is essential, and much more useful than relying on a regular full system scan.

    I specifically stated 1 "on-access" AV program, as they insert themselves as filter drivers, and if you have more than one you can run into performance issues or possibly even conflicts.
    There should be no issue with having multiple AV products installed that are "on demand" scanners - in fact they could complement each other well.

    Traditionally, firewalls have been dumb "follow the rules" devices, allowing network traffic only if it matches a rule created for the source & destination address and port - the contents of the traffic were irrelevant, so a rule to allow stateful outbound "destination port 80" traffic would permit any program to connect to any website on the usual port.
    A "software" firewall is able to control the programs themselves as it runs on the client machine - if you want to allow program X to communicate out on port Y but allow no other program, this can be done.
    The irony of people complaining that their firewall needs constant management and getting annoyed at the configuration options is not lost on me - what do you expect a firewall product to do?
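
    To make that distinction concrete, here is a toy sketch contrasting an address/port-only rule with a per-program rule - the rule shapes and names are invented for illustration, not any real product's format:

    Code:
    from dataclasses import dataclass

    @dataclass
    class Connection:
        program: str       # image name of the process making the request
        dest_port: int
        direction: str     # "out" or "in"

    # A traditional perimeter rule only sees addresses, ports and direction...
    def perimeter_allows(conn):
        return conn.direction == "out" and conn.dest_port == 80

    # ...a host-based (software) firewall can also check which program is asking.
    ALLOWED_PROGRAMS = {("firefox.exe", 80), ("firefox.exe", 443)}

    def host_allows(conn):
        return conn.direction == "out" and (conn.program, conn.dest_port) in ALLOWED_PROGRAMS

    probe = Connection(program="keylogger.exe", dest_port=80, direction="out")
    print(perimeter_allows(probe))  # True  - port 80 outbound satisfies the dumb rule
    print(host_allows(probe))       # False - an unknown program gets blocked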

    Network Address Translation (NAT, and specifically "hide mode") is a convenience to allow multiple client machines to share a single public IP address, as typically used in home broadband routers.
    By chance rather than design, this offers extra security as network traffic which originates on the untrusted public network cannot reach any client machine by default (without a port forwarding rule or DMZ set up).
    This is by its failure to route the packets rather than a conscious "your name's not down, you're not coming in" concept used by firewalls.

    Complex passwords are easy to create and keep unique, and can be easy to remember if you use a basic formula which you never disclose to anyone.
    Passwords should consist of letters (upper and lower case), numbers and non-alphanumeric characters such as !"#%&/(.

    As an example, you could take the title of a song, take the first letter and number of letters of each word and use a basic formula of having a symbol between each pairing, alternating upper and lower case.
    e.g.
    "Who Wants To Live Forever" = W3!w5"T2#l4¤F7%

    Where you are allowed password hints, the title of the song or a clue to it is enough to jog your memory but not make it any easier to crack the password.

    If creating passwords for online services, the formula could include some appended abbreviation of the site name.
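
    For anyone who wants to play with the idea, here is a rough sketch of that formula - the symbol sequence and the site tag are arbitrary examples, so pick (and never disclose) your own variation:

    Code:
    SYMBOLS = '!"#%&'

    def phrase_to_password(phrase, site_tag=""):
        parts = []
        for i, word in enumerate(phrase.split()):
            # first letter of each word, alternating upper/lower case,
            # followed by the word length and a symbol between pairings
            letter = word[0].upper() if i % 2 == 0 else word[0].lower()
            parts.append(letter + str(len(word)) + SYMBOLS[i % len(SYMBOLS)])
        return "".join(parts) + site_tag

    print(phrase_to_password("Who Wants To Live Forever"))        # W3!w5"T2#l4%F7&
    print(phrase_to_password("Who Wants To Live Forever", "hx"))  # same, with a site tag appended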

    Things that are NOT recommended:
    - using a real word found in a dictionary
    - basic replacement of letters with similar-looking numbers
    - simple rotation such as shifting every letter one place to the left so G=F, F=E, E=D, etc.

    Brute force password hacking programs will go for dictionary attacks and can do basic pattern matching to make w0rd5 lik3 thi5 g3t f0und, as well as rotation-replacements.
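
    To illustrate why those substitutions buy you so little, here is a short sketch of how a cracker's wordlist expands to cover them - the substitution table is just an example:

    Code:
    from itertools import product

    LEET = {"a": "a4@", "e": "e3", "i": "i1!", "o": "o0", "s": "s5$"}

    def leet_variants(word):
        # every character becomes a pool of look-alikes; the cross product
        # of those pools covers every "clever" spelling of the word
        pools = [LEET.get(ch, ch) for ch in word.lower()]
        return {"".join(combo) for combo in product(*pools)}

    variants = leet_variants("password")
    print(len(variants))           # 54 variants generated from a single dictionary word
    print("p4ssw0rd" in variants)  # True - the substituted spelling is covered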


    A Phased Plasma Rifle In a 40-Watt Range
    There are other tools aimed at ferreting out spyware & malware too, but these are reactive "cleanup" tools and in my experience they tend to create more questions than the genuine security risks they resolve.
    Some of the real-time protections offered can be useful, such as alerts when your homepage is altered, or "auto run" keys are added into the registry.

    Windows Defender is shaping up to be a very useful tool for this kind of protection, and Autoruns (from SysInternals) is a great tool to find out what processes are set to launch automatically when you log on.
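
    If you want a taste of the sort of thing Autoruns inspects, here is a minimal read-only sketch (Windows only; winreg is part of the Python standard library) that lists the classic "Run" keys:

    Code:
    import winreg

    RUN_KEYS = [
        (winreg.HKEY_CURRENT_USER,  r"Software\Microsoft\Windows\CurrentVersion\Run"),
        (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
    ]

    for hive, path in RUN_KEYS:
        try:
            with winreg.OpenKey(hive, path) as key:
                value_count = winreg.QueryInfoKey(key)[1]  # number of values under the key
                for i in range(value_count):
                    name, command, _ = winreg.EnumValue(key, i)
                    print(path, "-", name, "->", command)
        except OSError:
            pass  # key absent, or access denied under a limited account

    Autoruns checks dozens more locations (services, drivers, shell extensions and so on), but these two keys are the usual starting point.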

    If you run as an Administrator all the time, or you have a service or component with a flaw which can be exploited to elevate its permissions, there is the possibility that a "rootkit" can get into your system - and your AV & spyware tools won't necessarily even see it, even on the hard disk.
    Rootkits hook the low-level system calls, and if they see a request to display their processes or files they remove them from view - the guys who write these things are way beyond virus writers (VX'ers) and are more organised (check out http://www.rootkit.com).

    The current rootkits that have been written for 32-bit Windows, including the infamous Sony music CD rootkit, don't work in 64-bit versions of Windows.
    This is because when updating the kernel code for the 64-bit version, Microsoft programmers took the opportunity to include a "patch guard" - code that is part of the kernel and makes it impossible to patch the running kernel (which kernel mode rootkits do on 32-bit systems).
    In addition to this, 64-bit Vista will not allow any unsigned kernel mode code to be loaded - even if you have Administrator privileges.

    If you are running 32-bit Windows, SysInternals have a "Rootkit Revealer" tool which uses equally cunning methods to highlight the anomalies between what the registry reports through the API and what is in the raw file on disk.

    Going back to the concept of patching the OS itself, Windows Update should be visited at least monthly if it is not automated - the second Tuesday of every month is the release schedule for patches for Windows.
    Critical patches should always be applied to remain secure, and I would recommend "important" patches too - remember the patches may take until Wednesday to appear for Europeans, as the Americans are a few hours behind.
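
    If you want to work the release date out for yourself, the "second Tuesday" rule is easy to compute - a small sketch:

    Code:
    import calendar
    from datetime import date

    def patch_tuesday(year, month):
        # all Tuesdays that fall inside the given month, then take the second one
        cal = calendar.Calendar()
        tuesdays = [d for d in cal.itermonthdates(year, month)
                    if d.month == month and d.weekday() == calendar.TUESDAY]
        return tuesdays[1]

    print(patch_tuesday(2006, 1))  # 2006-01-10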
    Last edited by Paul Adams; 28-01-2006 at 03:01 PM.

  4. #4
    Ex-MSFT Paul Adams's Avatar
    Things You Thought You Thought You Knew
    "I don't need anti-virus, I know how to use a computer and control what I download."
    or
    "I've never used anti-virus in X years and I've never had a virus."
    Because of the potential for vulnerabilities in base OS components or commonly-used applications such as web browsers, you do not have to explicitly download and execute an infected file for your system to be at risk.
    Look at the WMF vulnerability as an example - simply viewing a page with a specially-crafted embedded file could execute custom code on your machine, and at the time of discovery there was already an exploit in the wild, without any kind of disclosure having taken place.
    People were actively putting links to these files in their signatures on forums - so you don't have to visit "Ub3r D3wd's Hou5e Of War3z & XXX P4ssw0rdz" to have been at risk.


    "I have anti-virus, so I'm protected."
    Anti-virus products are reactive - they use 2 principal methods to analyse files for infections:
    - pattern matching: high accuracy, based on a known library of infection signatures and an explicit match
    - heuristics: lower accuracy, where the engine tries to analyse what the code is doing and make a decision if it looks suspicious
    If you don't keep AV signatures up to date, or receive an infected file for which there is no signature yet, then there is little chance it can protect you.
    If you are logged in as an Administrator there is also the possibility that a program could neuter your anti-virus program by modifying, removing or bypassing its filter driver.
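
    A toy sketch of the contrast between the two methods - the "signature" is a deliberately fake placeholder and the heuristic markers are invented; real engines match byte patterns within files and emulate code, so this only shows the principle:

    Code:
    import hashlib

    KNOWN_BAD_SHA1 = {"0" * 40}  # placeholder; real products ship huge signature libraries
    SUSPICIOUS_MARKERS = [b"WriteProcessMemory", b"autorun.inf", b"keylog"]

    def scan(path):
        data = open(path, "rb").read()
        if hashlib.sha1(data).hexdigest() in KNOWN_BAD_SHA1:
            return "infected (signature match - high accuracy)"
        hits = sum(marker in data for marker in SUSPICIOUS_MARKERS)
        if hits >= 2:
            return "suspicious (heuristic match - lower accuracy, possible false positive)"
        return "no detection"

    print(scan(r"C:\temp\download.exe"))  # placeholder path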


    "I don't need a software firewall, I have a hardware firewall."
    Hardware firewalls in the home typically prevent connection attempts from the Internet reaching your machine, but they offer little or no protection for traffic leaving your machine - so if you get a keylogger, trojan or zombie then that hardware firewall does nothing.
    Software (or "Personal") firewalls run on the OS itself and have the ability to prevent inbound connections as a hardware firewall as well as governing which programs are allowed to make outbound connections (and what type of connections are permitted).
    The drawback with a software firewall is that it is still just a process running on your computer and it can potentially be reconfigured or crashed by malware if executed with high enough privileges.
    Software firewalls are "host protection", hardware firewalls are "perimeter protection" - the 2 are complementary and one does not preclude the necessity of the other.


    "I don't need to update Windows unless I actually have a problem."
    Windows Update is where you get critical and high priority updates for your OS.
    If Microsoft rate a discovered issue as critical then it stands a chance of being exploited to expose data, run malicious code or reduce system stability.
    To assert that code might be deliberately left with flaws to encourage "repeat customers" is ludicrous - you don't pay for patches, while the Internet bandwidth to deliver them does cost money, and there is no advertising on the Windows Update sites.


    "I have a NAT firewall."
    or
    "My NAT router is a firewall."
    Network Address Translation is simply a method of having multiple machines on a private network use 1 public IP address on the Internet.
    The side-effect of this design is that traffic originating from the Internet which is not a direct response to a request from a client does not get routed to a machine by default - only because the router does not know which machine it should send it to.
    (The creation of port forwarding rules or specifying 1 machine to be a "DMZ" changes this behaviour.)
    A firewall is actively looking at the traffic and blocking or allowing it based on a set of rules, and in some cases uses "Stateful Packet Inspection" (SPI) to look inside the packets in more detail as a rudimentary check against spoofing or denials of service.


    "Anything less than stealth for a network port is bad."
    "Stealthing" your network interface is in some ways good, in some ways bad, but overall probably not as much of a security bonus as people often think.
    The principle is that someone probes a specific network port to see if you respond and are running a service that can be exploited; there are 3 possible reactions:
    - network port is open, let's do business
    - network port is closed, sorry I don't have a service running for you
    - (nothing)
    The 3rd reaction, or rather the complete lack of one, is "stealthing" - a potential intruder gets no response to their probes, so moves on and checks the next address in their list.
    It is argued that a "port closed" response is giving away your presence, even if you don't have a service to exploit.
    While this is true for the most part, the complete absence of any reply tells a potential intruder exactly the same thing - the response they should have got if there really was no one there is a "destination unreachable" report from an upstream router.
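
    The three reactions are easy to see with a plain TCP connect - a sketch (the address is a documentation placeholder; only probe machines you own):

    Code:
    import socket

    def probe(host, port, timeout=3.0):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            s.connect((host, port))
            return "open - a service answered"
        except ConnectionRefusedError:
            return "closed - the host replied 'nothing running here'"
        except socket.timeout:
            return "no response - dropped ('stealthed'), or nothing routable there at all"
        finally:
            s.close()

    for port in (80, 139, 12345):
        print(port, probe("192.0.2.10", port))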

    How can stealth be a bad thing?
    If you have firewalls on internal networks or incorrectly configured host firewalls then you might have clients trying to talk to servers and getting no response, so they have to time out before continuing or reporting an error.
    In a workgroup this can show up as an extremely long wait before an "an error occurred" message appears when trying to view files on another computer.
    In a corporate or WAN environment this can be an extremely long logon time delay while the computer tries in vain to check for group policies, or maybe a complete inability for replication to work between domain controllers.

    Internal firewalls should be set to reject packets, not silently drop them - this speeds up the process of troubleshooting and makes clients & servers alike much more responsive.
    The result is the same - communication is not allowed, but the performance of the computers can be greatly increased.

    Stealthing external ports on perimeter firewalls is good, but don't assume it is actually a great deal better than reporting them "closed".
    Stealthing ports on internal networks is pointless, reduces performance and can make troubleshooting network problems that much harder.


    The jobs performed by the various security products do not overlap much, and where that does happen they do not have the same vulnerability issues - hence the "layered approach" being the recommendation from anyone who knows what they are talking about.
    Last edited by Paul Adams; 28-01-2006 at 03:09 PM.

  5. #5
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    • Stoo's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2*Xeon 5450 @ 2.8GHz, 12MB Cache
      • Memory:
      • 32GB 1600MHz FBDIMM
      • Storage:
      • ~ 2.5TB + 4TB external array
      • Graphics card(s):
      • ATI Radeon HD 4870
      • Case:
      • Mac Pro
      • Operating System:
      • OS X 10.7
      • Monitor(s):
      • 24" Samsung 244T Black
      • Internet:
      • Zen Max Pro
    Nice post Paul
    (\__/)
    (='.'=)
    (")_(")

  6. #6
    Senior Member
    Join Date
    Aug 2003
    Posts
    326
    Thanks
    0
    Thanked
    0 times in 0 posts
    • Curly's system
      • Motherboard:
      • Asrock 939-Sata 2
      • CPU:
      • Opteron 165 @ 2.52
      • Memory:
      • 2gb Geil Value Ram
      • Storage:
      • 900gb in various HD's
      • Graphics card(s):
      • 7800 gtx
      • PSU:
      • Hiper Type-R 580W
      • Case:
      • Globalwin
      • Monitor(s):
      • Samsung syncMaster 205BW
      • Internet:
      • 4mb Virgin Media Cable
    interesting read

    Curly

  7. #7
    Network|Geek kidzer's Avatar
    Join Date
    Jul 2005
    Location
    Aberdeenshire
    Posts
    1,732
    Thanks
    91
    Thanked
    46 times in 41 posts
    • kidzer's system
      • Motherboard:
      • $motherboard
      • CPU:
      • Intel Q6600
      • Memory:
      • 4GB
      • Storage:
      • 1TiB Samsung
      • Graphics card(s):
      • BFG 8800GTS OC
      • PSU:
      • Antec Truepower
      • Case:
      • Antec P160
      • Operating System:
      • Windows 7
      • Monitor(s):
      • 20" Viewsonic
      • Internet:
      • ~3Mbps ADSL (TalkTalk Business)
    Yay! more excellent reading!

    Good job
    "If you're not on the edge, you're taking up too much room!"
    - me, 2005

  8. #8
    Splash
    Guest
    Paul - though I'm a *nix fan through and through I think your guides are wonderful. Thanks for proving that not everything to do with Microsoft is nasty!

  9. #9
    Senior Member
    Join Date
    Mar 2005
    Location
    North East
    Posts
    400
    Thanks
    5
    Thanked
    12 times in 12 posts
    I disagree with what he says about stealthing.

    Stealthing means that when somebody sends a DDoS attack tying up all of your inbound bandwidth, you don't reply to it, tying up all of your outbound bandwidth too. If the return addresses in the packets sent to you are spoofed, stealthing also stops you from CONTRIBUTING to a DDoS attack on somebody else.

    Dropped packets don't always indicate that something is stealthed, and not every router on the 'net is set up to follow RFCs either. Dropping packets is a very common thing to do - it saves on network bandwidth and it makes the software people are using to probe your computer less responsive too. Stealthing your ports won't do any harm to your own network performance because you don't connect to yourself; it only harms people who try to connect to you, and if you're stealthing ports that's because you DON'T WANT ANYBODY to connect to them.

    Talk to anybody who has experience of running any big public network (e.g. an ISP) and they'll tell you a similar story to what I do.

  10. #10
    Ex-MSFT Paul Adams's Avatar
    Quote Originally Posted by KowShak
    Stealthing means that when somebody sends a DDoS attack tying up all of your inbound bandwidth, you don't reply to it, tying up all of your outbound bandwidth too.
    A DoS has most likely been achieved if either your upstream or downstream line is saturated, and I think most DoS attempts are directed at servers that provide a public service so are aimed at ports that are open (not closed or stealthed).

    Quote Originally Posted by KowShak
    If the return addresses in the packets sent to you are spoofed, stealthing also stops you from CONTRIBUTING to a DDoS attack on somebody else.
    Fair point, though I think most DDoS attacks aim to saturate the server (or an upstream router or firewall) with half-open connection requests - an unsolicited packet would be filtered by a firewall or ignored by the TCP stack on the server.

    While SYN flood protection is trivial, if you have thousands of clients establishing a session with a published service then you can bring the server itself to its knees (and once the server is in a state where it has no more endpoints it is not difficult to keep them in the stuck state with a fraction of the traffic).


    Please note, I didn't advocate turning stealthing off for public-facing firewalls, as was clear with the line "Stealthing external ports on perimeter firewalls is good", it was meant as an education for people unaware that i) it's not as good as most people think and ii) it can be detrimental for internal networks.

    If ISPs took more responsibility for routing traffic correctly, spoofing would also be a lot less of a problem.
    Same goes for spam relays - it would be trivial for ISPs to block all well-known service ports by default and open them on request from their customers.

    Thanks for taking the time to read it and your comments though
    ~ I have CDO. It's like OCD except the letters are in alphabetical order, as they should be. ~
    PC: Win10 x64 | Asus Maximus VIII | Core i7-6700K | 16GB DDR3 | 2x250GB SSD | 500GB SSD | 2TB SATA-300 | GeForce GTX1080
    Camera: Canon 60D | Sigma 10-20/4.0-5.6 | Canon 100/2.8 | Tamron 18-270/3.5-6.3

  11. #11
    smtkr
    Guest
    There is this patch for Windows called Linux. It works like a charm

    Seriously, nice guide PA. Thanks for sharing.

  12. #12
    Network|Geek kidzer's Avatar
    Can we add this to the sticky with the rest of Paul's excellent guides?
    "If you're not on the edge, you're taking up too much room!"
    - me, 2005

  13. #13
    Senior Member kasavien's Avatar
    Join Date
    Aug 2005
    Location
    St. Albans
    Posts
    1,829
    Thanks
    145
    Thanked
    104 times in 49 posts
    Nice post - put an hour on at work for me. I'll be taking some of the tips and putting them into action too.

    Thanks

    Andy

    p.s. i'm a slow reader
